
Affective Computing: How AI can become compassionate

Imagine a day when Alexa and Siri can understand and interact with you not just through plain words but also according to your emotions. If it perceives that you are happy, your intelligent assistant might ask what good things have happened and help you cherish the moment; if you are bored, it might crack a joke or two to lighten the mood; if you are sad, it might offer you words of comfort and healing suggestions. From fictional characters like WALL-E and Baymax, we have all seen how friendly, lovable, and human robots and AIs can become, and one major ingredient in making them so is the ability to intelligently interpret, process, and act on others’ emotions, a distinct and crucial social ability possessed by humans.

Fortunately, AI may offer us just that in real life through the area of study and development known as affective computing. One approach uses image data of human facial expressions: such emotionally intelligent systems predict a person’s emotional state from patterns revealed by tracking and analyzing the positions and movements of facial points known as “landmarks.” Other affective computing systems draw instead on written or spoken text, “speech patterns, pulse rate, and other biometrics”, applying similar pattern analysis to infer the emotions on display (Kompella 1).
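
To make the landmark idea concrete, here is a minimal, purely illustrative Python sketch. The landmark names, coordinates, and thresholds are all hypothetical; a real system would obtain landmarks from a face-tracking model (such as dlib or MediaPipe) and learn the mapping from geometric features to emotions from labeled data rather than using hand-picked rules.

import math

def distance(p, q):
    """Euclidean distance between two (x, y) landmark points."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def classify_expression(landmarks):
    """Guess a coarse emotional state from a few geometric features.

    `landmarks` is a dict of named (x, y) points, normalized so the
    face fits in a unit square. All names/thresholds are hypothetical.
    """
    # Feature 1: mouth width relative to mouth height (smiling widens it).
    mouth_ratio = distance(landmarks["mouth_left"], landmarks["mouth_right"]) / \
                  max(distance(landmarks["mouth_top"], landmarks["mouth_bottom"]), 1e-6)
    # Feature 2: brow height above the eye (raised brows suggest surprise;
    # y grows downward, so a smaller brow y means a higher brow).
    brow_raise = landmarks["eye_left_top"][1] - landmarks["brow_left"][1]

    if mouth_ratio > 3.0:
        return "happy"
    if brow_raise > 0.15:
        return "surprised"
    return "neutral"

# Hypothetical normalized landmarks for a smiling face.
sample = {
    "mouth_left": (0.35, 0.75), "mouth_right": (0.65, 0.75),
    "mouth_top": (0.50, 0.72), "mouth_bottom": (0.50, 0.80),
    "brow_left": (0.30, 0.30), "eye_left_top": (0.32, 0.40),
}
print(classify_expression(sample))  # -> "happy"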

What, then, sets these emotional AIs apart from traditional ones? Once machines can interpret human emotions, they can act on those perceptions just as a real human would in everyday social situations, in effect enabling better cooperation and more natural relationships between machines and humans. In fact, recent research by computer scientists concluded that machines that understand and communicate with emotional expressions are more likely to be identified by a human as members of their social group than machines without emotional recognition (de Melo et al. 6). With machines perceived in this friendlier light, we can expect many creative applications across technology-integrated industries that bring the relationship of humans and machines steps closer.

Currently, many AI tech companies are already working to bring affective computing into our daily lives. Affectiva, an MIT Media Lab spin-off, has pioneered a human perception AI system applied in the automotive and media-analytics fields. The company claims its AI can “detect nuanced emotions as well as complex cognitive states, activities, interactions, and objects people use”. It can also monitor a driver’s state of mind in real time, detecting whether they are fatigued or distracted, and initiate corresponding actions such as alerting the driver or even letting the AI take control of the vehicle until the driver’s attention returns, thus greatly improving road safety (“Affectiva”; “Affectiva Automotive”). Meanwhile, Affectiva’s Affdex system monitors viewers’ reactions to advertising by recognizing facial cues. It gathers feedback from ad viewers around the world who have agreed to have their emotions observed while watching ads, a far more scientific and cost-efficient way to survey advertising (Joyner).
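
As a rough illustration of the kind of detect-and-escalate logic such a driver-monitoring system might implement, consider the sketch below. The state fields, thresholds, and action names are invented for illustration; this is not Affectiva’s actual product or API.

from dataclasses import dataclass

@dataclass
class DriverState:
    drowsiness: float   # 0.0 (alert) .. 1.0 (asleep); hypothetical score
    distraction: float  # 0.0 (eyes on road) .. 1.0 (looking away)

def choose_intervention(state: DriverState) -> str:
    """Map a perceived driver state to an escalating response."""
    if state.drowsiness > 0.8 or state.distraction > 0.9:
        # Most severe case: hand control to a driver-assistance system.
        return "engage_autonomous_assist"
    if state.drowsiness > 0.5:
        return "sound_alert"          # audible warning to rouse the driver
    if state.distraction > 0.5:
        return "visual_reminder"      # dashboard prompt to refocus
    return "no_action"

# Example: a moderately drowsy but attentive driver triggers an alert.
print(choose_intervention(DriverState(drowsiness=0.6, distraction=0.2)))
# -> "sound_alert"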

As the once formidable emotion barrier between humans and AI crumbles, a distinct new realm of possibilities opens up for future cooperation between humans and machines. Affective computing may well be another great step toward forging the utopian vision of an advanced, efficient society where humans and AI can harmoniously interact, collaborate, and even befriend each other.



Works Cited

“Affectiva Raises $26M for Human Perception AI Platform.” The Robot Report, 5 Apr. 2019, www.therobotreport.com/affectiva-raises-26m-human-perception-ai/. Accessed 16 Jan. 2020.

“Affectiva Automotive AI for Driver Monitoring Systems.” Affectiva, www.affectiva.com/product/affectiva-automotive-ai-for-driver-monitoring-solutions/. Accessed 14 June 2020.

de Melo, Celso M., et al. “Cooperation with Autonomous Machines through Culture and Emotion.” PLoS ONE, vol. 14, no. 11, 2019, p. e0224758. Gale In Context: Opposing Viewpoints.

Joyner, A. “How to Take the Guesswork Out of Your Advertising.”

Kompella, Kashyap. “A Guide to Emotional AI for Business.” EContent, Summer 2019, p. 36+. Gale In Context: High School.
