Choose your pal – a Honda 3E-A18, left, or a 3E-C18, both concept robots for now./AFP

Nice big hug from a robot

Lifestyle | January 14, 2018 01:00

By Agence France-Presse
Las Vegas, Nevada

The new ‘emotional’ bots aim to read your feelings and console you as needed



THE ROBOT called Forpheus does more than play a mean game of table tennis. It can read body language to gauge its opponent’s ability and offer advice and encouragement.

“It will try to understand your mood and your playing ability and predict a bit about your next shot,” says Keith Kersten of Japan-based Omron Automation, which developed Forpheus to showcase its technology.

“We don’t sell ping-pong robots, but we are using Forpheus to show how technology works with people,” he adds.

Forpheus was among several devices shown at last week’s Consumer Electronics Show, which highlighted how robots can become more humanlike by acquiring “emotional intelligence” and empathy.

Although this specialisation is still emerging, the notion of robotic empathy appeared to be a strong theme at the huge gathering of technology professionals in Las Vegas.

Honda, the Japanese auto giant, launched a new robotics programme called “Empower, Experience, Empathy”, including its new 3E-A18 robot, which “shows compassion to humans with a variety of facial expressions”, according to a statement.

The Omron Forpheus robot thrashes a mere human at table tennis during the Las Vegas convention./AFP

Although empathy and emotional intelligence don’t necessarily require a humanoid form, some robot-makers have been working on form as well as function.

“We’ve been working very hard to have an emotional robot,” says Jean-Michel Mourier of France-based Blue Frog Robotics, which makes the companion and social robot called Buddy, set to be released later this year.

“He has a complex brain. He will ask for a caress or get mad if you poke him in the eye.”

Other robots, such as Qihan Technology’s Sanbot and SoftBank Robotics’ Pepper, are being “humanised” by teaching them to read and react to people’s emotional states.

Pepper is “capable of interpreting a smile, a frown, your tone of voice, as well as the lexical field you use and non-verbal language such as the angle of your head”, according to SoftBank.

Developing emotional intelligence in robots is a difficult task, melding computer “vision” that interprets objects and people with software that can respond accordingly.

“Empathy is the goal – the robot is putting itself in the shoes of the human, and that’s about as hard as it gets,” says Patrick Moorhead, a technology analyst with Moor Insights & Strategy.

“It’s not just about technology – it’s about psychology and trust.”

Choose your pal – a Honda 3E-A18, left, or a 3E-C18, both concept robots for now./AFP

Moorhead says this technology is still in the early stages, but holds promise in some areas, noting that there is strong interest in Japan amid a lack of caretakers for the elderly population.

“In some ways it can be a bit creepy if you’re crying and the robot is trying to console you,” he says.

“If you have no friends, the next best thing is a friend robot, and introverts might feel more comfortable talking to a robot.”

One CES exhibitor promises to go further than current devices by developing an “emotion chip” that allows robots to process emotions in a manner similar to humans.

“There’s been a lot of research on detecting human emotions. We do the opposite. We synthesise emotions for the machine,” says Patrick Levy-Rosenthal, founder of New York-based Emoshape, which is producing its chip for partners in gaming, virtual and augmented reality and other sectors.

It could be used to power a humanoid robot or other devices. For example, an e-reader could better understand a text to infuse more emotion in storytelling.

As for Forpheus, Kersten says the robot’s ability to help people improve their table-tennis skills could have numerous applications for sports, businesses and more.

“You could sense how people are feeling, if they are attentive or in a good state to drive.”

Another key application could be in healthcare, he says. “In an elderly-patient facility, you can determine if someone is in distress and needs help.”