KEYWORDS:
ABSTRACT:
One way to assess facial expressions is to extract thirty vectors defined on the facial features: the eyes, the eyebrows and the mouth. The distances between these vectors are input to an artificial neural network, which classifies them into the basic emotions. This method is semi-automatic. This report describes a fully automated method for extracting the distances between the vectors. This fully automated method, which uses the facial feature shapes, has been implemented for the mouth.
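As an illustration of such distance features, the following minimal sketch computes Euclidean distances between mouth landmark points; the landmark names and coordinates are hypothetical placeholders, not the report's actual vector definitions.

    # A minimal sketch of turning feature points into distance features.
    # Landmark names and coordinates are hypothetical placeholders.
    import numpy as np

    # Hypothetical mouth landmarks in image (x, y) coordinates.
    landmarks = {
        "left_corner": np.array([120.0, 200.0]),
        "right_corner": np.array([180.0, 202.0]),
        "upper_lip": np.array([150.0, 190.0]),
        "lower_lip": np.array([150.0, 214.0]),
    }

    def distance(a, b):
        """Euclidean distance between two feature points."""
        return float(np.linalg.norm(a - b))

    # Distances such as mouth width and mouth openness become the
    # inputs of the neural network.
    features = [
        distance(landmarks["left_corner"], landmarks["right_corner"]),
        distance(landmarks["upper_lip"], landmarks["lower_lip"]),
    ]
    print(features)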
Two attempts were made to extract the mouth shape. First, a contour follower was used. Since this yielded no usable results, a second attempt was made. This resulted in an automatic facial feature extraction system, SEEM, which is based on vision techniques. The system uses a priori knowledge (anthropometric facial dimensions) and integral projection to locate the facial features and to extract the mouth contour at real-time speed.
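The integral-projection step can be sketched as follows: intensities are summed along the rows and columns of a grayscale region, and dark features such as the line between the lips appear as minima in the row projection. The function names and the synthetic test region below are assumptions for illustration, not the SEEM implementation.

    # A minimal sketch of integral projection, assuming a grayscale face
    # region as a 2-D NumPy array. Names are illustrative, not SEEM's own.
    import numpy as np

    def integral_projections(region):
        """Sum intensities along rows and columns of the region."""
        rows = region.sum(axis=1)  # one value per row
        cols = region.sum(axis=0)  # one value per column
        return rows, cols

    def locate_mouth_line(mouth_region):
        """Row index of the darkest band: a crude lip-line estimate."""
        rows, _ = integral_projections(mouth_region)
        return int(np.argmin(rows))

    # Example on a synthetic region with a dark horizontal stripe.
    region = np.full((40, 60), 200.0)
    region[22:25, 10:50] = 30.0
    print(locate_mouth_line(region))  # prints 22, the top of the dark band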
A small image sequence was generated to test the
extraction of distances between the mouth vectors. These distances were
used to train a backpropagation neural network for emotion classification.
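A minimal sketch of such a backpropagation network is given below, assuming the distances have already been arranged into a feature matrix; the synthetic data, the layer sizes and the learning rate are placeholders, not the report's actual configuration.

    # A minimal backpropagation sketch: one hidden layer, sigmoid units,
    # squared-error loss. Data and sizes are hypothetical placeholders.
    # Biases are omitted to keep the sketch short.
    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def train(X, Y, hidden=8, lr=0.5, epochs=5000, seed=0):
        """Train on distance features X (n, d) and one-hot labels Y (n, k)."""
        rng = np.random.default_rng(seed)
        W1 = rng.normal(scale=0.5, size=(X.shape[1], hidden))
        W2 = rng.normal(scale=0.5, size=(hidden, Y.shape[1]))
        for _ in range(epochs):
            H = sigmoid(X @ W1)             # forward: hidden activations
            O = sigmoid(H @ W2)             # forward: output activations
            dO = (O - Y) * O * (1 - O)      # backward: output delta
            dH = (dO @ W2.T) * H * (1 - H)  # backward: hidden delta
            W2 -= lr * H.T @ dO             # gradient descent updates
            W1 -= lr * X.T @ dH
        return W1, W2

    # Hypothetical training set: 60 frames, 4 mouth distances, 6 emotions.
    rng = np.random.default_rng(1)
    X = rng.random((60, 4))
    Y = np.eye(6)[rng.integers(0, 6, size=60)]
    W1, W2 = train(X, Y)
    pred = np.argmax(sigmoid(sigmoid(X @ W1) @ W2), axis=1)
    print(pred[:5])  # predicted emotion indices for the first five frames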
It can be concluded that the SEEM system is able to extract the distances and that this data can be used to recognise facial expressions.