Abstract
The AUBADE system can be trained to classify a subject's feelings into six emotional classes derived from three of the basic emotions (happiness, disgust and fear). Biosignals were recorded from 24 healthy subjects who viewed pictures designed to evoke different emotional responses, and a psychologist assessed each subject's emotional status from their facial expressions. During the training stage, data from 15 subjects were used to teach the system to discriminate a subject's emotional status from the biosignals provided as input. A subset of the data was used to compare the performance of four different classifiers, evaluated with three metrics: sensitivity, positive predictive accuracy and accuracy. Using the SVM classifier, the AUBADE system achieved sensitivities of 63-81%, positive predictive accuracies of 71-95% and accuracies of 63-83%, depending on the emotional class considered. The work paves the way for remote telemonitoring of patients suffering from neurological diseases.
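The per-class figures quoted above (sensitivity, positive predictive accuracy and accuracy) can all be read off a multi-class confusion matrix. The snippet below is a minimal sketch, not the authors' implementation: it assumes an SVM trained on hypothetical biosignal feature vectors using scikit-learn, with illustrative data shapes and six emotional-class labels, and shows how each metric is computed per class.

```python
# Sketch only: hypothetical feature vectors and labels stand in for the
# real biosignal data, which is not part of the published abstract.
import numpy as np
from sklearn.svm import SVC
from sklearn.metrics import confusion_matrix

rng = np.random.default_rng(0)
X_train, y_train = rng.normal(size=(90, 12)), rng.integers(0, 6, size=90)
X_test, y_test = rng.normal(size=(30, 12)), rng.integers(0, 6, size=30)

clf = SVC(kernel="rbf").fit(X_train, y_train)   # SVM classifier, as in the abstract
y_pred = clf.predict(X_test)

cm = confusion_matrix(y_test, y_pred, labels=range(6))
for c in range(6):
    tp = cm[c, c]                    # class c predicted as class c
    fn = cm[c, :].sum() - tp         # class c predicted as something else
    fp = cm[:, c].sum() - tp         # other classes predicted as class c
    tn = cm.sum() - tp - fn - fp
    sensitivity = tp / (tp + fn) if (tp + fn) else 0.0
    ppv = tp / (tp + fp) if (tp + fp) else 0.0   # positive predictive accuracy
    accuracy = (tp + tn) / cm.sum()              # per-class accuracy
    print(f"class {c}: Se={sensitivity:.2f}  PPV={ppv:.2f}  Acc={accuracy:.2f}")
```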
Original language | English |
---|---|
Journal | Journal of Telemedicine and Telecare |
Volume | 14 |
Issue number | 3 |
Pages (from-to) | 152-154 |
Number of pages | 3 |
ISSN | 1357-633X |
DOIs | |
Publication status | Published - 2008 |
Keywords
- Adult
- Artificial Intelligence
- Biosensing Techniques
- Emotions/physiology
- Facial Expression
- Female
- Humans
- Huntington Disease/psychology
- Male
- Pattern Recognition, Automated/methods
- Visual Perception/physiology