Monday, May 23, 2011

Paper Summary - The Emotional Economy for the Augmented Human

Notable Quote: Commercial Off-The-Shelf (COTS) Brain-Computer Interfaces (BCIs) can be used to provide real-time happiness feedback as people live their lives.
This paper by Jean-Marc Seigneur investigates the capture and use of real-time happiness measures. Previous and current methods for evaluating happiness work a posteriori, which leaves room for subjects to change their opinions and be influenced by current emotions while being expected to relive the past events in question. Using a BCI, the author proposes detecting happiness in the moment, and at a more basic level. Seigneur describes this as the Emotional Economy (EE) Model. Given a user and a service, the model checks which emotions reach a threshold to be registered (the example here being happiness). This information can then be used to make decisions or catalog emotions at desired checkpoints. The proof-of-concepts afford measures of engagement, frustration, meditation, instantaneous excitement, long-term excitement, and happiness. Note that these measures are easily accommodated with the Emotiv EPOC headset used.
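The core of the model is simple enough to sketch: a reading for some emotion either reaches a trigger threshold or it doesn't. This is my own illustrative sketch, not the paper's code; the names (`EmotionReading`, `crossed_threshold`) and the normalized 0-1 scale are assumptions on my part:

```python
from dataclasses import dataclass

# Assumed threshold on a normalized 0-1 intensity scale (hypothetical value)
HAPPINESS_THRESHOLD = 0.7

@dataclass
class EmotionReading:
    emotion: str      # e.g. "happiness", "frustration", "engagement"
    level: float      # normalized intensity, 0.0-1.0
    timestamp: float  # seconds since epoch

def crossed_threshold(reading: EmotionReading,
                      emotion: str = "happiness",
                      threshold: float = HAPPINESS_THRESHOLD) -> bool:
    """Return True when the named emotion reaches the trigger threshold."""
    return reading.emotion == emotion and reading.level >= threshold
```

A service would poll readings from the headset and fire an action (like a video, log a checkpoint) whenever this check passes.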

The Facebook scenario. Taken from the author's paper.
As an example, the author created two use-scenarios. In the first, a user watches a video via Facebook, and if happiness is detected, then the video is automatically liked. The second scenario incorporates location-based computing. The wearer is given a backpack containing a portable computer and a GPS unit. While moving along an outdoor tour, the GPS and emotion readings can be synced via their timestamps to determine at which geographic point the wearer felt different emotions, and thus whether and when they enjoyed themselves. As stated by the author, this scenario could allow for automatic tourist reviews and, when combined with written testimonials, lend credence to positive and negative reviews overall.
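The timestamp sync in the second scenario amounts to pairing each emotion reading with the nearest-in-time GPS fix. A minimal sketch of that join, assuming both streams carry epoch timestamps and the GPS track is sorted by time (the function names and tuple layouts are mine, not the paper's):

```python
import bisect

def nearest_fix(gps_track, t):
    """Return the GPS fix (timestamp, lat, lon) closest in time to t.
    gps_track must be sorted by timestamp."""
    times = [fix[0] for fix in gps_track]
    i = bisect.bisect_left(times, t)
    # The closest fix is either just before or just after the insertion point
    candidates = [c for c in (i - 1, i) if 0 <= c < len(gps_track)]
    return min((gps_track[c] for c in candidates),
               key=lambda fix: abs(fix[0] - t))

def geotag_emotions(gps_track, emotion_log):
    """Pair each (timestamp, emotion, level) reading with its nearest GPS fix."""
    return [(reading, nearest_fix(gps_track, reading[0]))
            for reading in emotion_log]
```

With the two logs geotagged this way, plotting where on the tour happiness peaked becomes a straightforward map overlay.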

I highly recommend this short paper to those interested in ubiquitous computing and/or emotional measures. The thing that really sticks out is that as BCIs become more functional and effective at a higher level, they can become just another input device or sensor, albeit one with a wider gamut of potential uses. In essence, this is what the author demonstrates in his Facebook scenario: the code needed was for interacting with Facebook, not for manipulating the headset sensors themselves. This work is also open to quick extensions. For instance, measuring frustration could indicate disliking the video in the Facebook scenario, which could cause it to be removed from the viewer's news feed. The second scenario opens up a wider range of applications via the incorporation of positioning and timestamps. Such a scenario could be extended for testing amusement parks and rides, or movies and commercials without the need for the GPS data. In fact, a similar article can be found here on engadget.

Very engaging. For some reason the extension I thought of was relationship evaluation. Are you really happy with the person you're with? Who makes you happier? Brain signals are all an online dating site needs to further prove the connections between paying customers' romantic matches. Or therapists, for that matter! Pop on a headset and really see what you feel given a stimulus. With the potential perfection of emotional classification in the future, who knows what can be learned about others. This raises an ethical concern as well. If I CAN see how everything makes you feel, SHOULD I? Could BCIs be used to weed out unfit Soldiers or track down criminals? Could they be used to detect biases, fallacies, and overall ethnocentric beliefs? To me that is very heavy stuff worth much further consideration.

Full Reference:
Jean-Marc Seigneur. 2011. The emotional economy for the augmented human. In Proceedings of the 2nd Augmented Human International Conference (AH '11). ACM, New York, NY, USA, Article 24, 4 pages. DOI=10.1145/1959826.1959850
