
Smile! Your feelings have been spotted!

26th November 2015
Enaie Azambuja

A project led by Daniel McDuff at the Massachusetts Institute of Technology's Media Lab is developing emotion-reading computers that could not only recognise faces but also understand the thoughts of the person behind them. It could eventually lead to machines that have emotional intelligence, or even everyday objects that are able to empathise with our moods.

Imagine a mirror that knows how you feel about the way you look, a fridge that can offer you food according to your state of mind, or a car that detects how stressed you are. Now imagine a computer that knows your true feelings even when you attempt to mask them. Dr. McDuff is developing a system that works with a basic webcam, which detects a range of different facial movements and then translates them into seven of the most commonly recognised emotional states: sadness, amusement, surprise, fear, joy, disgust and contempt.
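
To picture the pipeline described above, a minimal sketch in Python follows: grab a webcam frame, find a face, then map it to one of the seven states. It assumes the freely available OpenCV library for the camera and face detection; the classify_emotion function is a hypothetical placeholder, not Affectiva's actual model.

import cv2

# The seven emotional states the article lists
EMOTIONS = ["sadness", "amusement", "surprise", "fear", "joy", "disgust", "contempt"]

# Standard Haar-cascade face detector that ships with opencv-python
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def classify_emotion(face_pixels):
    # Placeholder: a real system would extract facial-movement features
    # (brow raise, lip-corner pull, etc.) and feed them to a trained model.
    # The returned label here is purely illustrative.
    return "joy"

cap = cv2.VideoCapture(0)          # open the basic webcam
ret, frame = cap.read()            # grab a single frame
if ret:
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        label = classify_emotion(gray[y:y+h, x:x+w])
        print("Detected face ->", label)
cap.release()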

The computer learns from a huge database of four million videos, gathered from volunteers and paid-for market research, of people in various emotional states, and the algorithms are constantly updated and tested against real-world scenarios. The next step is to integrate voice analysis and other physical measures such as heart rate and hand gestures. The data collected has already revealed big differences in emotional responses between men and women, between different age groups, and across demographics.
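
The learning step itself is, at heart, supervised classification: labelled examples in, a model that predicts one of the seven emotions out. The sketch below illustrates that idea only, using scikit-learn and randomly generated stand-in data; the feature count, labels and choice of logistic regression are assumptions for illustration, not the project's actual algorithm.

import numpy as np
from sklearn.linear_model import LogisticRegression

EMOTIONS = ["sadness", "amusement", "surprise", "fear", "joy", "disgust", "contempt"]

# Hypothetical training set: each row is a vector of facial-movement
# measurements from one labelled video frame; each label is the emotion
# annotators assigned to that frame.
rng = np.random.default_rng(0)
X_train = rng.random((1000, 20))                       # 1000 frames, 20 features
y_train = rng.integers(0, len(EMOTIONS), size=1000)    # stand-in labels

model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)

# As new labelled videos arrive, the model is refitted and checked
# against held-out, real-world examples before being used.
X_new = rng.random((5, 20))
print([EMOTIONS[i] for i in model.predict(X_new)])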

Affectiva's system in action

Among the varied applications of such a technology, Dr. McDuff intends to use it in the mental health arena, in partnership with Affectiva, an MIT spin-off for which he is research director. "Previously there has been no way for a clinician to monitor patients between appointments. They can prescribe drugs but they don't know how the patient is doing. With this system they could start to track how people respond to treatment."

The researcher recognises the fear that this technology induces in people: "It is scary to think that someone could measure my emotions without me realising it, and so it is crucial that we think about the social impact of such technology," he commented. "It is vital that everyone actively opts in to sharing their data." He also believes people should be allowed to hide their feelings when they want to, and that these technologies should let people "control what the computer sees, records and shares."

Ethical implications aside, it would be pretty cool if machines could "be funny, get the joke and understand human emotion", as Ray Kurzweil, Google's director of engineering, hopes; he predicts that computers will have emotional intelligence by 2029.
