Day 043 - Augmented emotional reality

Submitted by Sam on 3 July, 2011 - 00:13

Whilst emotional contagion is an effective means of transmitting information and engendering synchronized emotions in a group, the effects are only beneficial if the emotions being transmitted are correctly recognized and interpreted. Misreading emotion is surprisingly common in human social interaction: the spectrum of human emotion manifests itself through facial and bodily indicators so subtle and numerous that they are often only partially observed and partially classified, leading to miscommunication and potentially negative social consequences.

In the pursuit of improving our ability to correctly identify these signals (often referred to as our emotional intelligence), various technologies are being developed to artificially classify emotional expression with an above-human level of accuracy, aiming to help us better understand each other by removing ambiguity from the interpretation of physically expressed emotions.

Rosalind Picard from the MIT Media Lab has developed a prototype of one such technology to boost our emotional intelligence – a pair of glasses containing a camera connected to a computer which interprets facial expression. The camera sends a feed to software which analyzes faces for the thousands of tiny muscle movements which constitute expressions, interpreting them and relaying them back to the wearer either through earphones or a computer screen. By tracking twenty-four “feature points” on the face, the software analyzes micro-expressions and compares them with its database of six known expressions – “thinking”, “agreeing”, “concentrating”, “interested”, “confused” and “disagreeing” – correctly identifying them 64% of the time, compared to the human accuracy of 54%. These figures indicate that this model could aid not only people with impaired emotional intelligence (such as those with autism), but could in fact assist the majority of people in sensing the mood of the people they are talking to.
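The pipeline described above – reduce a face to a set of feature-point measurements, then match them against stored templates for the six expression categories – could in principle be sketched as a simple nearest-neighbour lookup. This is purely an illustrative toy, not Picard's actual algorithm; the prototype vectors and three-dimensional features below are invented stand-ins for the real twenty-four feature points:

```python
import math

# The six expression categories named in the article.
EXPRESSIONS = ["thinking", "agreeing", "concentrating",
               "interested", "confused", "disagreeing"]

def euclidean(a, b):
    """Straight-line distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def classify(features, prototypes):
    """Return the expression label whose prototype vector is nearest."""
    return min(prototypes, key=lambda label: euclidean(features, prototypes[label]))

# Toy prototypes: 3-dimensional stand-ins for the real feature points.
prototypes = {
    "thinking":      [0.1, 0.8, 0.2],
    "agreeing":      [0.9, 0.1, 0.3],
    "concentrating": [0.2, 0.9, 0.9],
    "interested":    [0.7, 0.7, 0.1],
    "confused":      [0.3, 0.2, 0.8],
    "disagreeing":   [0.8, 0.3, 0.9],
}

print(classify([0.85, 0.15, 0.25], prototypes))  # prints "agreeing"
```

A real system would of course use a trained statistical classifier over tracked facial landmarks rather than hand-set prototypes, but the structure – measure, compare, pick the closest known expression – is the same.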

The prototype consequently carries commercial value, and the team behind the glasses have established a company called Affectiva, selling their expression-recognition software to companies that want to measure, for instance, how people feel about their adverts or products.

However, the recognition of emotional states from facial expressions is still far from foolproof, and can be subject to the same manipulations and misreadings that affect human interpretation. The goal of complete emotional knowledge and perfect brain-state sharing remains the domain of brain-mapping efforts like The Blue Brain Project.

Attribution Noncommercial Share Alike
This text, Day 043 - Augmented emotional reality, by Sam Haskell is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike license.