Towards an affect-sensitive multimodal human-computer interaction

title: Towards an affect-sensitive multimodal human-computer interaction
author(s): Maja Pantic and Leon J.M. Rothkrantz
published: September 2003
appeared in: Proceedings of the IEEE, Vol. 91, No. 9
pages: 1370-1390
publisher: IEEE Press

Abstract

The ability to recognize the affective states of a person we are communicating with is the core of emotional intelligence. Emotional intelligence is a facet of human intelligence that has been argued to be indispensable, and perhaps the most important, for successful interpersonal social interaction. This paper argues that next-generation human–computer interaction (HCI) designs need to include the essence of emotional intelligence (the ability to recognize a user's affective states) in order to become more human-like, more effective, and more efficient. Affective arousal modulates all nonverbal communicative cues (facial expressions, body movements, and vocal and physiological reactions). In face-to-face interaction, humans detect and interpret such signals from their interlocutor with little or no effort. Yet the design and development of an automated system that accomplishes the same tasks is rather difficult. This paper surveys past work on solving these problems by computer and provides a set of recommendations for developing the first part of an intelligent multimodal HCI: an automatic, personalized analyzer of a user's nonverbal affective feedback.

 