
Automated facial expression classification and affect interpretation using infrared measurement of facial skin temperature

Khan, Masood Mehmood, Ingleby, Michael and Ward, Robert D. (2006) Automated facial expression classification and affect interpretation using infrared measurement of facial skin temperature. ACM Transactions on Autonomous and Adaptive Systems, 1 (1). pp. 91-113. ISSN 1556-4665



Machines would require the ability to perceive and adapt to affect in order to achieve artificial sociability. Most autonomous systems use Automated Facial Expression Classification (AFEC) and Automated Affect Interpretation (AAI) to achieve sociability. Varying lighting conditions, occlusion, and deliberate control over physiognomy can degrade the real-life performance of vision-based AFEC systems. Physiological signals provide complementary information for AFEC and AAI. We employed transient facial thermal features for AFEC and AAI. Infrared thermal images were captured of participants showing their normal expression and intentional expressions of happiness, sadness, disgust, and fear. Facial points that undergo significant thermal changes with a change in expression, termed Facial Thermal Feature Points (FTFPs), were identified. Discriminant analysis was invoked on principal components derived from the Thermal Intensity Values (TIVs) recorded at the FTFPs. Cross-validation and person-independent classification resulted in success rates of 66.28% and 56.0% respectively. Classification significance tests suggest that (1) like other physiological cues, facial skin temperature provides useful information about affective states and their facial expression; (2) patterns of facial skin temperature variation can complement other cues for AFEC and AAI; and (3) infrared thermal imaging may help achieve artificial sociability in robots and autonomous systems.
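The classification pipeline summarised above (principal component analysis of the TIVs recorded at the FTFPs, followed by discriminant analysis evaluated under cross-validation) can be sketched as follows. This is a minimal illustration, not the authors' implementation: it uses scikit-learn and randomly generated stand-in data, and the number of samples, FTFPs, and retained components are assumptions rather than the paper's actual settings.

# Minimal sketch (not the authors' code): PCA on thermal intensity values
# followed by linear discriminant analysis, with cross-validated scoring.
# The feature matrix X and labels y below are hypothetical placeholders.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)

# Hypothetical data: 5 expression classes (neutral, happiness, sadness,
# disgust, fear) and 16 thermal intensity values per face (one per FTFP).
n_samples, n_ftfps, n_classes = 150, 16, 5
X = rng.normal(size=(n_samples, n_ftfps))        # TIVs at the FTFPs
y = rng.integers(0, n_classes, size=n_samples)   # expression labels

# Reduce the TIVs to principal components, then classify the components
# with a linear discriminant function.
clf = make_pipeline(PCA(n_components=8), LinearDiscriminantAnalysis())

# 5-fold cross-validated success rate.
scores = cross_val_score(clf, X, y, cv=5)
print(f"Mean cross-validated accuracy: {scores.mean():.2%}")

With the real thermal data, the paper reports cross-validated and person-independent success rates of 66.28% and 56.0%; with the random placeholder data above, the score will of course hover near chance.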

Item Type: Article
Additional Information: UoA 23 (Computer Science and Informatics) © ACM Press New York, NY, USA
Uncontrolled Keywords: socially intelligent machines, infrared thermal imaging
Subjects: Q Science > QA Mathematics > QA75 Electronic computers. Computer science
Q Science > QP Physiology
Schools: School of Computing and Engineering


Depositing User: Sara Taylor
Date Deposited: 12 Jul 2007
Last Modified: 28 Jul 2010 18:20

