Affect-sensitive multimodal monitoring in ubiquitous computing: Advances and challenges

title: Affect-sensitive multimodal monitoring in ubiquitous computing: Advances and challenges
author(s): Maja Pantic and Leon J.M. Rothkrantz
published in: July 2001
appeared in: Proc of the AAAI/IEEE Int. Conf. on Enterprise Information Systems, Setubal, Portugal
pages: 466-474
publisher: AAAI/IEEE Press

Abstract

Automatic interpretation of human communicative behaviour, that is, giving machines the ability to detect, identify, and understand human interactive cues, has become a central topic in machine vision research, natural language processing research, and AI research in general. The catalyst behind this recent ‘human-centred computing hoopla’ is the fact that automated monitoring and interpretation of human communicative behaviour are essential for the design of future smart environments, next-generation perceptual user interfaces, and ubiquitous computing in general. The key technical goals concern determining the context in which the user acts, that is, disclosing in an automatic way where the user is, what he is doing, and how he is feeling, so that the computer can act appropriately. This paper is concerned with the last of these issues, that is, with providing machines with the ability to detect and interpret the user’s affective states. It surveys past work on this problem, provides a taxonomy of the problem domain, and discusses the research challenges and opportunities.
