Affectiva, SoftBank Robotics Team Up to Broaden Pepper’s Emotional Intelligence
SoftBank Robotics’ Pepper humanoid robot, already adept at recognizing human emotions such as joy, anger, or surprise, will soon expand its emotional intelligence to understand more complex cognitive states and expressions, thanks to a partnership announced today by SoftBank and Affectiva.
Affectiva’s Emotion AI will be integrated into Pepper, giving the robot the ability to “detect highly nuanced states based on people’s facial and vocal expressions” in real time, the companies said. Pepper can already detect basic emotions through its cameras and microphones; the addition of the Emotion AI technology will allow it to recognize more advanced states.
For example, Pepper will be able to identify cognitive states such as distraction and drowsiness, and to differentiate between a smile and a smirk. Affectiva said understanding these more complex states will make Pepper’s interactions with people more meaningful, allowing it to adapt its behavior “to better reflect the way people interact with one another.”
Human-machine interaction: The next generation
Pepper, already seen working in customer service roles in banks, retail stores, museums, and hotels around the world, will be able to expand its companion and concierge abilities through this partnership, Affectiva said.
“But this is only the beginning, especially as Pepper continues to evolve and learns to relate to people in increasingly meaningful ways,” stated Marine Chamoux, affective computing roboticist at SoftBank Robotics. “The partnership really signifies the next generation of human-machine interaction, as we approach a point where our interactions with devices and robots like Pepper more closely mirror how people interact with one another.”
Dr. Rana el Kaliouby, co-founder and CEO of Affectiva, said there would be a need for deeper understanding and mutual trust between humans and robots as the machines take on more interactive roles in healthcare, homes, and retail environments.
“Just as people interact with one another based on social and emotional cues, robots need to have that same social awareness in order to truly be effective as coworkers or companions,” el Kaliouby said.
Affectiva, which spun out of the MIT Media Lab, utilizes machine learning, deep learning, computer vision, and speech science for its emotional intelligence technology, dubbed Emotion AI. The company said it has the “world’s largest emotion data repository,” with more than 7 million faces analyzed across 87 countries. The company also works with the automotive industry to provide multi-modal driver state monitoring and in-cabin mood sensing.
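Combining facial and vocal signals as described above is commonly done with late fusion: each modality produces per-state confidence scores, which are then merged. The following sketch is purely illustrative and assumes hypothetical score dictionaries and weights; it is not Affectiva’s actual API.

```python
# Hypothetical sketch of late-fusion multi-modal emotion estimation.
# All state names, scores, and weights are illustrative assumptions,
# not Affectiva's real interface.
from typing import Dict

STATES = ["joy", "anger", "surprise", "distraction", "drowsiness"]

def fuse_scores(face: Dict[str, float], voice: Dict[str, float],
                face_weight: float = 0.6) -> Dict[str, float]:
    """Merge per-state confidences from two modalities by weighted average."""
    voice_weight = 1.0 - face_weight
    return {s: face_weight * face.get(s, 0.0) + voice_weight * voice.get(s, 0.0)
            for s in STATES}

def top_state(scores: Dict[str, float]) -> str:
    """Return the state with the highest fused confidence."""
    return max(scores, key=scores.get)

# Example: facial analysis leans toward joy, vocal analysis is more ambiguous.
face_scores = {"joy": 0.7, "distraction": 0.2}
voice_scores = {"joy": 0.4, "drowsiness": 0.5}
fused = fuse_scores(face_scores, voice_scores)
print(top_state(fused))  # joy (0.6 * 0.7 + 0.4 * 0.4 = 0.58)
```

A real system would replace the hand-written score dictionaries with outputs from trained computer-vision and speech models, but the fusion step would follow the same shape.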
The two companies will be discussing the partnership and their vision for social robotics at Affectiva’s Emotion AI Summit, to be held on Thursday, Sept. 6, in Boston.