Multimodal Human-Computer Interaction

title: Multimodal Human-Computer Interaction: the design and implementation of a prototype
author: Maurits J.A. André
published in: August 1998
appeared as: Master of Science thesis
Delft University of Technology
pages: 99
PostScript (1.310 KB)


The influence of computers on people's daily lives is increasing, and with it the need for simpler ways to interact with them. Current human-machine communication systems rely predominantly on keyboard and mouse input, which only poorly approximates the human capacity for communication. More natural communication technologies can free computer users from the keyboard and mouse.

This work presents a prototype of a multimodal interface that fuses multiple modalities for human-computer interaction. Three modalities are integrated: speech (a recognizer and synthesizer), touch (a tactile glove), and sight (a gaze tracker). The system is demonstrated on a collaborative whiteboard application extended into a military mission planning system. The design and implementation of the whole system and the methods applied are described, and preliminary results of the real-time multimodal fusion are analyzed.
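To give a flavor of what fusing such modalities can involve, the sketch below pairs a spoken command with a concurrent gaze or glove target using a simple time window. The event structure, window size, and confidence scores are illustrative assumptions for this sketch, not the method actually implemented in the thesis.

```python
from dataclasses import dataclass

@dataclass
class ModalityEvent:
    """A timestamped, scored observation from one input modality (hypothetical type)."""
    modality: str   # "speech", "glove", or "gaze"
    value: str      # recognized token or target, e.g. "select" or "unit_3"
    time: float     # seconds since session start
    score: float    # recognizer confidence in [0, 1]

def fuse(events, window=1.0):
    """Late-fusion sketch: for each spoken command, pick the most confident
    gaze/glove event whose timestamp falls within `window` seconds of it."""
    speech = [e for e in events if e.modality == "speech"]
    others = [e for e in events if e.modality != "speech"]
    commands = []
    for cmd in speech:
        nearby = [e for e in others if abs(e.time - cmd.time) <= window]
        if nearby:
            target = max(nearby, key=lambda e: e.score)
            commands.append((cmd.value, target.value))
    return commands
```

For example, a "select" utterance at t=1.0 s combined with a gaze fixation on `unit_3` at t=1.2 s would yield the fused command `("select", "unit_3")`, while a glove gesture several seconds later would be ignored.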

It can be concluded that this work offers a glimpse of a future in which computers are controlled by modalities in the dimensions of sight, sound and touch. The input devices used in this project differ in accuracy, speed and efficiency; together they form an appropriate means of communicating with the system.

(This project was carried out at the Center for Computer Aids for Industrial Productivity (CAIP) at Rutgers University in New Brunswick, New Jersey.)
