
An analysis of facial movement tracking in ordinary human-computer interaction

Ward, Robert D. (2004) An analysis of facial movement tracking in ordinary human-computer interaction. Interacting With Computers, 16 (5). pp. 879-896. ISSN 0953-5438



Automatic tracking of facial movement is potentially important as a non-invasive source of physiological data in Affective Computing applications. Facial movement tracking software is becoming commercially available and affordable. This paper explores the association between facial and physiological responses to computer-based events, and the viability of facial movement tracking in detecting and distinguishing qualitative differences in users' facial movements under normal conditions of computer use.

Fifteen participants took a web-based quiz. The quiz contained two relatively ordinary HCI events as stimuli: an alert intended to evoke surprise, and questions with high affective content intended to evoke amusement. From previous findings, the alert was expected to be the stronger of the two stimuli. Participants' physiological arousal was recorded and their faces videoed. The videos for the periods around each event were analysed by commercially available facial movement tracking software.
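The analysis step described above — examining tracker output for the period around each stimulus — can be illustrated with a minimal sketch. This is a hypothetical example, not the paper's actual software: it assumes the tracker emits per-frame facial landmark coordinates, and scores facial movement in a window around an event as the mean inter-frame landmark displacement.

```python
# Hypothetical sketch (not the study's tracking software): score facial
# movement around an event as mean per-frame landmark displacement.
import math

def movement_score(frames, event_frame, window=5):
    """Mean landmark displacement per frame pair within
    [event_frame - window, event_frame + window]."""
    start = max(event_frame - window, 1)
    end = min(event_frame + window, len(frames) - 1)
    total, count = 0.0, 0
    for i in range(start, end + 1):
        for (x0, y0), (x1, y1) in zip(frames[i - 1], frames[i]):
            total += math.hypot(x1 - x0, y1 - y0)
            count += 1
    return total / count if count else 0.0

# Toy data: 10 frames of two landmarks; landmark 0 jumps at frame 5,
# standing in for a sudden expression change (e.g. a brow raise).
frames = [[(100.0, 100.0), (150.0, 100.0)] for _ in range(10)]
for f in frames[5:]:
    f[0] = (f[0][0], f[0][1] - 8.0)

baseline = movement_score(frames, 2, window=2)   # quiet period
reaction = movement_score(frames, 5, window=2)   # around the event
print(baseline, reaction)
```

Comparing a baseline window against an event window in this way is one simple route to the kind of event-locked comparison the study performs with commercial tracking output.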

Human judges considered participants' faces to have responded to both stimuli, but more strongly to the stronger stimulus. Facial response did not always concur with physiological arousal. The tracker detected reactions to the stronger stimulus but had mixed success with the weaker one. It also generated different data profiles for two different facial expressions. These findings support the supposition that users' facial expressions can and do respond to ordinary computer-based events, and indicate that facial movement tracking is becoming a viable technique that is accessible to non-specialists in computer vision.

Item Type: Article
Additional Information: UoA 23 (Computer Science and Informatics) Copyright © 2004 Elsevier B.V.
Subjects: Q Science > QA Mathematics > QA75 Electronic computers. Computer science
Q Science > QP Physiology
Schools: School of Computing and Engineering
Depositing User: Sara Taylor
Date Deposited: 07 Jan 2008
Last Modified: 28 Aug 2021 23:33




