Khan, Masood Mehmood, Ward, Robert D. and Ingleby, Michael (2009) Classifying pretended and evoked facial expressions of positive and negative affective states using infrared measurement of skin temperature. ACM Transactions on Applied Perception, 6 (1). pp. 1-22. ISSN 1544-3558
Earlier researchers were able to extract transient facial thermal features from thermal infrared images (TIRIs) to make binary distinctions between the expressions of affective states. Effective human-computer interaction, however, would require machines to distinguish between subtle facial expressions of affective states. This work, for the first time, attempts to use transient facial thermal features for recognising a much wider range of facial expressions. A database of 324 time-sequential, visible-spectrum, and thermal facial images was developed, representing different facial expressions from 23 participants in different situations. A novel facial thermal feature extraction, selection, and classification approach was developed and invoked on various Gaussian mixture models constructed using: neutral and pretended happy and sad faces; faces with multiple positive and negative facial expressions; faces with neutral and six (pretended) basic facial expressions; and faces with evoked happiness, sadness, disgust, and anger. This work demonstrates that (1) infrared imaging can be used to observe affective-state-specific facial thermal variations, (2) pixel grey-level analysis of TIRIs can help localise significant facial thermal feature points along the major facial muscles, and (3) cluster-analytic classification of transient thermal features can help distinguish between the facial expressions of affective states in an optimised eigenspace of input thermal feature vectors. The observed classification results showed that the structure of the Gaussian mixture model influenced classifier performance. The work also revealed some pertinent directions for future research on the use of facial thermal features in automated facial expression classification and affect recognition.
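The abstract's classification pipeline, projecting thermal feature vectors into an optimised eigenspace and then applying mixture-model-based, cluster-analytic classification, can be sketched as follows. This is a minimal illustration only, not the authors' implementation: it assumes synthetic random vectors in place of the paper's thermal features, uses PCA for the eigenspace projection, and fits one scikit-learn `GaussianMixture` per affective class, classifying by maximum per-class log-likelihood.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Synthetic stand-ins for thermal feature vectors: two affective classes,
# 75 samples each, 50-dimensional. (The paper's real features are derived
# from TIRI grey levels at points along the major facial muscles.)
n_per_class, n_features = 75, 50
class_a = rng.normal(loc=0.0, scale=1.0, size=(n_per_class, n_features))
class_b = rng.normal(loc=3.0, scale=1.0, size=(n_per_class, n_features))
X = np.vstack([class_a, class_b])
y = np.repeat([0, 1], n_per_class)

# Project the feature vectors into a low-dimensional eigenspace (PCA).
pca = PCA(n_components=5)
X_eig = pca.fit_transform(X)

# Fit one Gaussian mixture per affective class on its eigenspace features.
models = [GaussianMixture(n_components=1, random_state=0).fit(X_eig[y == c])
          for c in (0, 1)]

# Classify each sample by the class whose mixture assigns it the
# highest log-likelihood.
scores = np.column_stack([m.score_samples(X_eig) for m in models])
pred = scores.argmax(axis=1)
accuracy = (pred == y).mean()
```

On well-separated synthetic data like this the scheme classifies perfectly; the paper's finding that mixture-model structure affects classifier performance corresponds here to the choice of `n_components` and covariance type in each `GaussianMixture`.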
Subjects: Q Science > Q Science (General)
Schools: School of Human and Health Sciences; School of Human and Health Sciences > Centre for Applied Psychological Research
Depositing User: Sharon Beastall
Date Deposited: 26 Mar 2010 12:53
Last Modified: 22 Dec 2010 16:30