In the movie Big Hero 6, Baymax seems more intelligent (and more loveable) than other robots because he can easily empathize with his friend Hiro. Could he really? Yes, it is possible. In fact, judging by the leaps and bounds robotic technology has achieved, it is happening now. By maximizing the potential of Internet of Things technology, robotics designers have been able to develop emotion-sensing technology that allows robots to be sensitive to human emotion.
At Vanderbilt University in Nashville, Tennessee, psychologists work hand in hand with engineers to find ways to make robots sensitive to human moods, temperament and feelings. Several breakthrough technologies have been developed as a result of this research, which is also being carried out across the world in Japan. At Tokyo Metropolitan University, researchers are helping develop a machine-learning device that can detect the emotion of its listener and then produce a new type of song or rhythm based on the feelings it detected.
The robotics designers at Vanderbilt University wanted to determine whether a robot could be intelligent enough to sense what a person is feeling, and thus change the way it interacts with human beings.
Assistant professor Nilanjan Sarkar says they want to make robots that can communicate naturally with humans by making them sensitive to human emotion. To achieve this, they had to carefully study the unique patterns of behavior that indicate various types of human emotion. These patterns are then fed into the robot’s system by converting the information into actuator-style commands. Building on the technology used for voice recognition and handwriting recognition that connects machines to the IoT, the robotics designers installed sensors on robots so that they can monitor heart rate, anxiety level, sweating and other expressions of feeling. Experiments were conducted in which human subjects were repeatedly placed in anxiety-producing situations. The researchers then acquired electrocardiogram profiles to determine the specific mental states of the subjects. This information was enough to allow scientists to find patterns in the variations and make it possible for robots to respond according to the mental state of the person.
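The pipeline described above can be sketched in a few lines of code. Everything here is an illustrative assumption, not the Vanderbilt team's actual model: the feature names, thresholds, and commands are hypothetical, chosen only to show how physiological readings might be mapped to actuator-style commands.

```python
# Hypothetical sketch: inferring a rough stress level from simple
# physiological features (heart rate, heart-rate variability, and skin
# conductance) and mapping it to an actuator-style command.
# All thresholds and names are illustrative assumptions.

def stress_score(heart_rate_bpm, hrv_rmssd_ms, skin_conductance_us):
    """Combine normalized physiological signals into a stress score in [0, 1]."""
    # Higher heart rate, lower variability, and higher skin conductance
    # all suggest greater stress; each component is clamped to [0, 1].
    hr_component = min(max((heart_rate_bpm - 60) / 60, 0.0), 1.0)
    hrv_component = min(max((50 - hrv_rmssd_ms) / 50, 0.0), 1.0)
    sc_component = min(max(skin_conductance_us / 20, 0.0), 1.0)
    return (hr_component + hrv_component + sc_component) / 3

def robot_response(score):
    """Map the stress score to a (hypothetical) actuator-style command."""
    if score > 0.6:
        return "SLOW_DOWN_AND_REASSURE"
    elif score > 0.3:
        return "MONITOR_CLOSELY"
    return "CONTINUE_NORMAL_INTERACTION"

print(robot_response(stress_score(120, 10, 15)))  # elevated readings
```

In a real system the score would come from a classifier trained on the kind of labeled ECG profiles the researchers collected, rather than from fixed thresholds.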
To take IoT even further, Sarkar is using biosensors to analyze changes in stress and mood and feed this information into the robot. Among the biosensors he mentions are skin conductance, which indicates whether a person is sweating and thus under stress, and facial muscle activity, such as jaw clenching and furrowing of the brow.
Researchers at Tokyo Metropolitan University have been able to develop remarkable technology for detecting the emotional state of a person. Working alongside scientists at Osaka University, a research institute in Belgium and Crimson Technology, they developed a machine that works by monitoring a person’s brain waves.
Using wireless headphones that capture brain waves, the machine was able to tailor music to the feelings of the listener.
What the researchers wanted was not to make robots emotional or overly sensitive, but to enhance the interactive experience between the machine and the person. This is done through music, the universal language.
“We programmed the robots with songs, but added the brain waves of the listener to make new music,” says Masayuki Numao of Osaka University.
How does this work? Sensors installed on a headset that plays the music detect electroencephalogram (EEG) readings, which are fed into the robot. The robot was preprogrammed with songs, and the designers then added the brain waves of the person listening so that the robot could make new music.
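A toy version of this feedback loop might look like the sketch below. The band names, thresholds, and song library are all assumptions for illustration; the actual system composes new music from the listener's brain waves rather than picking from a fixed list.

```python
# Hypothetical sketch of an EEG-driven music selection loop.
# Band-power values, the 0.4/0.6 thresholds, and SONG_LIBRARY are
# illustrative assumptions, not the researchers' actual design.

SONG_LIBRARY = {
    "calm":      {"tempo_bpm": 70,  "mode": "major"},
    "energetic": {"tempo_bpm": 128, "mode": "major"},
    "soothing":  {"tempo_bpm": 60,  "mode": "minor"},
}

def estimate_mood(alpha_power, beta_power):
    """Crude mood estimate from relative EEG band power (an assumption)."""
    ratio = beta_power / (alpha_power + beta_power)
    if ratio > 0.6:    # beta-dominant: aroused or stressed
        return "soothing"
    elif ratio < 0.4:  # alpha-dominant: relaxed
        return "calm"
    return "energetic"

def pick_song(alpha_power, beta_power):
    """Choose a preprogrammed song matching the estimated mood."""
    return SONG_LIBRARY[estimate_mood(alpha_power, beta_power)]

print(pick_song(2.0, 8.0))  # beta-dominant reading selects the soothing song
```

The real headset streams EEG continuously, so the selection (or composition) step would run repeatedly, letting the music track the listener's changing state.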
The results of these experiments could remarkably change the human-robot relationship as we know it. Scientists also believe it could have several social benefits, such as in health care. In the future, emotional applications for human-machine interaction may be as normal and natural as human-to-human interaction.