Title: My_Eliza: a multimodal communication system
Published: August 2002
Master of Science thesis
Delft University of Technology
The field of Natural Language Processing (NLP) is an active area of research. One possible application of NLP is the question answering (QA) system: a program that attempts to simulate typed conversation in natural human language. In contrast, only a small amount of research has addressed nonverbal communication, even though it has been shown that, beyond speech, bodily activity gives real substance to face-to-face interaction in real-life conversation.
Motivated by this gap, this report investigates the design and implementation of a multimodal communication system called my_Eliza. The design of the system is based on Weizenbaum's Eliza. A user can communicate with the system in typed natural language, and the system replies with text prompts and appropriate facial expressions. To communicate through a nonverbal facial display, the system must be able both to process natural language and to perform emotional reasoning.
A first prototype has been developed as a proof of concept, consisting of a dialog box, an emotional recognizer based on stimulus response, and a facial display generator. The dialog box implementation takes Richard Wallace's A.L.I.C.E. system as a starting point. My_Eliza contains a rule engine that determines the system's current affective state as a reaction to the user's input string and the conversation content. The report presents the result of a conversation between my_Eliza and a human user.
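The stimulus-response scheme described above can be illustrated with a minimal sketch: an A.L.I.C.E.-style pattern table maps the user's typed input to a text reply, while a simple keyword rule engine selects the affective state that would drive the facial display generator. All rule tables, names, and thresholds here are hypothetical illustrations, not the thesis's actual implementation.

```python
# Illustrative sketch of a stimulus-response dialog step with affect tagging.
# The pattern table and affect keywords are invented for this example.

RESPONSE_RULES = [
    ("HELLO", "Hello there. How are you feeling today?"),
    ("I AM SAD", "I am sorry to hear that. What is troubling you?"),
    ("I AM HAPPY", "That is wonderful to hear!"),
    ("*", "Please tell me more."),  # fallback pattern, matches anything
]

AFFECT_RULES = [
    ({"sad", "sorry", "cry"}, "sympathy"),
    ({"happy", "great", "wonderful"}, "joy"),
]

def match_pattern(pattern: str, text: str) -> bool:
    """Very reduced AIML-like matching: '*' matches anything;
    otherwise the pattern must occur in the normalized input."""
    return pattern == "*" or pattern in text

def respond(user_input: str) -> tuple[str, str]:
    """Return (text reply, affective state) for one user utterance."""
    text = user_input.upper().strip()
    reply = next(r for p, r in RESPONSE_RULES if match_pattern(p, text))
    words = set(user_input.lower().split())
    affect = next((a for kws, a in AFFECT_RULES if kws & words), "neutral")
    return reply, affect
```

In a full system, the returned affect label would be passed on to the facial display generator to select an expression to accompany the text prompt.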