Can AI Detect Your Emotion Just By How You Walk?
Artificial intelligence systems are being employed for a wide range of tasks, from image and speech recognition to autonomous operation, pattern and anomaly detection, predictive analytics, and conversational systems. Recognition in particular is an area where AI has shown strong capability. Many have applied AI to facial recognition and text-based sentiment analysis, but one researcher has taken machine learning recognition a step further: is it possible to determine a person's emotional state simply by the way they walk?
Aniket Bera is an Assistant Research Professor at the University of Maryland Institute for Advanced Computer Studies (UMIACS) and the Department of Computer Science, where he pursues research as part of the Geometric Algorithms for Modeling, Motion, and Animation (GAMMA) group. The GAMMA group, based at the University of Maryland at College Park, works on a range of research problems, including multi-modal behavior learning using AI, robot navigation, autonomous vehicles, and physically-based simulation. Professor Bera's current research focuses on the social perception of intelligent agents, combining methods from computer graphics, physically-based simulation, statistical analysis, and machine learning to build real-time computational models that classify human behaviors and to validate their performance.
Specifically, one of the most intriguing aspects of his current research is into “socially-intelligent robots”. Professor Bera predicts that humans will soon be in close proximity to autonomous robots in homes, offices, and public places such as sidewalks and buildings. These robots are being given an increasing range of tasks, including surveillance, delivery, and warehousing applications. Indeed, the continuing challenge of the global pandemic is motivating hospitals and healthcare facilities to introduce growing numbers of autonomous robotic systems into highly critical healthcare operations.
As such, Professor Bera believes that in these close-contact situations, it is critical that robots work collaboratively in a socially-intuitive way. They need to understand human emotions, feelings, intent, social boundaries, and expectations. Whereas most current robotic applications focus primarily on accomplishing tasks efficiently or quickly, socially-intelligent robotics adds the dimension of human emotional and social interaction. This addition helps humans feel safer and more comfortable in close-quarters interactions with robots.
One of the most interesting research applications of socially-intelligent robots is their ability to read body language. Research into determining emotion from facial expressions is already fairly well established. However, there are many instances where facial expressions can be unreliable, such as situations where facial data is only partially available or facial cues are challenging to obtain; for example, a person may not be directly facing the robot or may be far away from it. In addition, social psychology research shows that people can alter their expressions to signal false emotions, and people do not always spontaneously make facial expressions. Furthermore, it is unclear how closely facial expressions are tied to actual behavior.
To address these issues, as well as to give systems the ability to understand emotional state from a distance, researchers in the GAMMA lab are investigating AI systems that detect emotion based on gait. The lab combines facial expressions with body motion to improve prediction of a person's emotional state. According to the researchers, walking style, unlike facial expression, is not easily manipulated. Identifying people by their gait using AI systems has already been widely publicized and shown to achieve a certain degree of accuracy. Gait detection and identification of walking trajectories, combined with facial expressions in a given pedestrian setting, can thus predict a person's future walking behavior and emotional state. Professor Bera also notes that gait-based research avoids many of the privacy issues that plague facial recognition systems: the lab's algorithm extracts the gait and discards all identifiable features, since all pixel-level data is removed and only the joint positions are stored.
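To make the joint-position idea concrete, here is a minimal, illustrative sketch of how gait descriptors might be computed from anonymized joint coordinates alone. This is not the GAMMA lab's actual algorithm; the joint names and the three features (speed, stride, lean) are assumptions chosen for illustration, and a real pose estimator would supply many more joints per frame.

```python
import math

def gait_features(frames):
    """Compute simple gait descriptors from a sequence of pose frames.

    Each frame maps a joint name -> (x, y) position. Only joint
    coordinates are used -- no pixel data -- mirroring the
    privacy-preserving idea of storing joint positions alone.
    Joint names ("hip", "head", "left_ankle", "right_ankle") are
    hypothetical and chosen for this sketch.
    """
    # Walking speed proxy: average hip displacement between frames.
    steps = [
        math.hypot(cur["hip"][0] - prev["hip"][0],
                   cur["hip"][1] - prev["hip"][1])
        for prev, cur in zip(frames, frames[1:])
    ]
    speed = sum(steps) / len(steps)

    # Stride proxy: maximum ankle separation over the sequence.
    stride = max(
        math.hypot(f["left_ankle"][0] - f["right_ankle"][0],
                   f["left_ankle"][1] - f["right_ankle"][1])
        for f in frames
    )

    # Posture proxy: mean horizontal offset of head relative to hip
    # (body-language research associates a slumped, leaning posture
    # with sadness and an upright, brisk gait with anger or joy).
    lean = sum(f["head"][0] - f["hip"][0] for f in frames) / len(frames)

    return {"speed": speed, "stride": stride, "lean": lean}
```

A downstream classifier would then map feature vectors like these to emotion labels; the point of the sketch is simply that every feature is derived from joint geometry, with no identifiable pixel data retained.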
Professor Bera and his fellow researchers and authors (Venkatraman Narayanan, Bala Murali Manoghar, Vishnu Sashank Dorbala, Dinesh Manocha, Tanmay Randhavane, Kurt Gray and Kyra Kapsaskis) built the ProxEmo robot to illustrate how such emotional gait detection works. The researchers believe that socially intelligent systems can be applied in many different applications such as developing automated techniques for perceiving human emotions in therapy, rehabilitation, anomaly detection and surveillance, audience understanding, character generation for animation and movies, and other applications.
Professor Bera also shares a number of other practical applications of socially-intelligent robots. If a robot can learn from the body motions and gaits of pedestrians, using their emotions as predictive measures, we can design more effective evacuation plans that help the public escape dangerous situations more efficiently and safely. Robots could predict whether someone walking on a bridge is in a distressed mental state, or assess mental health signals in general. They could predict whether a pedestrian is likely to jaywalk, putting themselves and drivers in danger, or potentially spot criminal actions and threatening situations such as robberies, hijackings, thefts, or assaults. With the current global pandemic in mind, socially-intelligent robots could better understand human needs, detect cough-like motions or body expressions related to pain, or enforce social distancing in public places.
The Challenges of Social Robots
While we’ve long wanted robots in our daily lives, if science fiction and movies are any indication, the reality is that the robotics industry has long struggled to make social robots a reality. Robotics companies such as Anki, Jibo, and Rethink Robotics shut their doors after failing to sustain operations. Professor Bera has an answer for this. He notes that some robotics industry researchers attribute these failures to the lack of a real use, or a real application, for “social robots”. Furthermore, he states that in many cases the “social” aspect of these robots was overpromised and underdelivered, making it more a novelty than anything else.
He believes that while these bots imitated emotional affect, they did not actually capture the user’s mood or emotion. Many of our emotions are non-verbal: what we say and how we say it are two different things. Many of these platforms missed the vital two-way social aspect, and in this regard, additional emotional cues from gait and body language might make these sorts of social robots significantly more useful and lifelike.
That being said, Professor Bera has many years of research experience in this field, with over 40 published papers to his name, and he believes the future of robotics and AI is bright. Over the past few years, an increasing number of roboticists have been using AI to solve real problems, from the warehouse floor to the operating room. From his perspective, he wants to use AI to understand the world better and, in turn, to make robots smarter and more efficient. He sees a future for robots beyond the home and factory floor, extending to space exploration and to collaboration in military settings for continuous monitoring of supply chains and transportation. These sorts of cooperative, decentralized human-robot teams are well suited to large-scale exploration, mapping, and other team tasks, because the work can often be subdivided into several robot goals. Of course, as he says, the real challenge is making sure that these robots perform well in the real world and not just in laboratory settings.