Abidi, B., Huq, S., and Abidi, M. 2004. “Fusion of visual, thermal, and range as a
solution to illumination and pose restrictions in face recognition,” In the Proceedings of
the IEEE 38th Annual 2004 International Carnahan Conference on Security Technology,
pp. 325-330.
Acharya, T., and Ray, A.K. 2005. Image Processing: Principles and Applications,
London: Wiley Interscience.
Allanson, J., and Fairclough, S.H. 2004. “A research agenda for physiological
computing,” Interacting with Computers, vol. 16, pp. 857-878.
Boulic, R., Emering, L., and Thalmann, D. 1998. “Interacting with virtual humans
through body actions,” IEEE Computer Graphics and Applications, January/February
1998, pp. 8-11.
Bozionelos, N. 2001. “The relationship of instrumental and expressive traits with
computer anxiety,” Personality and Individual Differences, vol. 31, no. 6, pp.
Bradley, M.M., Sabatinelli, D., Lang, P.J., Fitzsimmons, J.R., King, W., Desai, P. 2003.
“Activation of the visual cortex in motivated attention,” Behavioral Neuroscience, vol.
117, pp. 369–380.
Breiman, L., Friedman, J.H., Olshen, R.A., and Stone, C.J. 1984. Classification and
Regression Trees, California: Wadsworth Publishing.
Briese, E. 1995. “Emotional hyperthermia and performance in humans,” Physiological
Behavior, vol. 58, no. 3, pp. 615-618.
Brooks, R.A. 2002. Flesh and Machines: How Robots Will Change Us, New York:
Pantheon Books.
Brosnan, M.J. 1998. “The impact of computer anxiety and self-efficacy upon
performance,” Journal of Computer Assisted Learning, vol. 14, no. 3, pp. 223-234.
Busso, C., Deng, Z., Yildirim, S., Bulut, M., Lee, C.-M., Kazemzadeh, A., Lee, S.,
Neumann, U., Narayanan, S. 2004. “Analysis of emotion recognition using facial
expressions, speech and multimodal information,” In the Proceedings of 6th
International Conference on Multimodal Interface, ICMI’04, PA, pp. 205-211.
Cabanac, A.J., and Guillemette, M. 2001. “Temperature and heart rate as stress
indicators of handled common eider,” Physiological Behavior, vol. 74, nos. 4-5, pp.
Cacioppo, J.T., Bush, L.K., and Tassinary, L.G. 1990. “Microexpressive facial actions
as a function of affective stimuli: replication and extension,” Personality and Social
Psychology Bulletin, vol. 18, pp. 515-526.
Calder, A.J., Burton, A.M., Miller, P., Young, A.W., and Akamatsu, S. 2001. “A principal
component analysis of facial expressions,” Vision Research, vol. 41, pp. 1179-1208.
CANTRONIC Systems Inc. 2001. IR 860 User Manual, Coquitlam, BC, Canada:
Cantronic Systems Inc.
CANTRONIC Systems Inc. 2001a. CMView Plus Imaging Software Manual,
Coquitlam, BC, Canada: Cantronic Systems Inc.
CANTRONIC Systems Inc. 2002. IR 860 Product Data Sheet, available at
http://www.cantronix.com/ir860_spec.html, last visited 18 February 2002.
Chang, W.C. 1983. “On using principal components before separating a mixture of two
multivariate normal distributions,” Applied Statistics, vol. 32, pp. 267-275.
Chatfield, C., and Collins, A.J. 1995. Introduction to Multivariate Analysis, London:
Chapman & Hall.
Chellappa, R. 1998. “Discriminant analysis for face recognition,” in Face recognition:
From theory to applications, Wechsler, H., Phillips, J., Bruce, V., Fogelman-Soulie, F.,
and Huang, T., (Eds.), Springer-Verlag, London, pp. 564-571.
Chen, X., and Huang, T. 2003. “Facial expression recognition: A clustering-based
approach,” Pattern Recognition Letters, vol. 24, pp. 1295-1302.
Choi, S.C. 1972. “Classification of multiply observed data,” Biometrical Journal, vol.
14, pp. 8-11.
Christie, I. C. and Friedman, B. H. 2004. “Autonomic specificity of discrete emotion
and dimensions of affective space: a multivariate approach,” International Journal of
Psychophysiology, vol. 51, pp. 143-153.
Coakes, S.J., and Steed, L.G. 1999. SPSS: Analysis without Anguish, New York: John
Wiley & Sons.
Cohen, I., Sebe, N., Garg, A., Chen, L.S., and Huang, T.S. 2003. “Facial expression
recognition from video sequences: temporal and static modeling,” Journal of Computer
Vision and Image Understanding, no. 91, pp. 160-187.
Cohen, J. 1977. Statistical Power Analysis for the Behavioral Sciences, New York:
Academic Press.
Cohn, J., Zlochower, A., Lien, J., and Kanade, T. 1999. “Automated face analysis by
feature point tracking has high concurrent validity with manual FACS coding,”
Psychophysiology, vol. 36, pp. 35-43.
Collet, C., Vernet-Maury, E., Delhomme, G., and Dittmar, A. 1997. “Autonomic
nervous system response patterns specificity to basic emotions,” Journal of Autonomic
Nervous System, vol. 62, pp. 45-57.
Costanza, M.C., and Afifi, A.A. 1979. “Comparison of stopping rules in forward
stepwise discriminant analysis,” Journal of the American Statistical Association, vol. 74,
Cottrell, G., and Metcalfe, J. 1991. “EMPATH: Face, gender and emotion recognition
using holons,” in Advances in Neural Information Processing Systems, Lippman, R.,
Moody, J., and Touretzky, D., (Eds.), Morgan Kaufmann Pub., CA, vol. 3, pp. 564-571.
Critchley, H.D., Daly, E.M., Bullmore, E.T., Williams, S.C.R., Amelsvoort, T.V.,
Robertson, D.M., Rowe, A., Phillips, M., McAlonan, G., Howlin, P., and Murphy,
D.G.M. 2000. “The functional neuroanatomy of social behavior: changes in cerebral
blood flow when people with autistic disorder process facial expressions,” Brain, no.
123, pp. 2203-2212.
Darabi, A., and Maldague, X. 2002. “Neural network based defect detection and depth
estimation in TNDE,” NDT&E International, no. 35, pp. 165-175.
Dautenhahn, K., and Billard, A. 1999. “Bringing up robots or psychology of socially
intelligent robots: from theory to implementation,” In the Proceedings of the 3rd
International Conference on Autonomous Agents, Seattle, WA, pp. 366-367.
DeCarlo, D., and Metaxas, D. 2000. “Optical flow constraints on deformable models
with applications to face tracking,” International Journal of Computer Vision, vol. 38,
no. 2, pp. 99-127.
DeSilva, L.C., Miyasato, T., and Nakatsu, R. 1997. “Facial emotion recognition using
multimodal information,” In the Proceedings of the 1997 IEEE International Conference
on Information, Communications and Signal Processing, pp. 397-401.
DHEW. 1979. Department of Health, Education and Welfare. Publication Number (OS)
78-0014. US Government Printing Office, Washington, D.C.
Diakides, N.A. 1998. “Infrared Imaging: An emerging technology in medicine,” IEEE
Engineering in Medicine and Biology, pp. 17-18.
Dimberg, U., and Petterson, M. 2000. “Facial reactions to happy and angry facial
expressions: evidence for right hemisphere dominance,” Psychophysiology, vol. 37, no.
5, pp. 693-696.
Dimberg, U. 1990a. “Facial electromyography and emotional reactions,”
Psychophysiology, vol. 27, no. 5, pp. 481-494.
Dimberg, U. 1990b. “Facial reactions to auditory stimuli: sex differences,”
Scandinavian Journal of Psychology, vol. 31, no. 3, pp. 228-233.
Dimberg, U., Thunberg, M., and Elmehed, K. 2000. “Unconscious facial reactions to
emotional facial expressions,” Psychological Science, vol. 11, no. 1, pp. 86-89.
Donato, G., Bartlett, M.S., Hager, J.C., Ekman, P., and Sejnowski, T.J. 1999.
“Classifying Facial Expressions,” IEEE Transactions on Pattern Analysis and Machine
Intelligence, vol. 21, no. 10, pp. 974-989.
Dror, I.E., Péron, A.E., Hind, S.L., and Charlton, D. 2005. “When emotions get the
better of us: The effect of contextual top-down processing on matching fingerprints,”
Applied Cognitive Psychology, vol. 19, pp. 799-809.
Drummond, P.D., and Lance, J.W. 1987. “Facial flushing and sweating mediated by the
sympathetic nervous system,” Brain, vol. 110, pp. 793-803.
Du, Y. and Lin, X. 2003. “Emotional facial expression model building,” Pattern
Recognition Letters, vol. 24, pp. 2923-2934.
Dubuisson, S., Davoine, F., and Masson, M. 2002. “A solution for facial expression
representation and recognition,” Signal Processing: Image Communication, vol. 17,
Duda, R.O., Hart, P.E., and Stork, D.G. 2001. Pattern Classification, New York: Wiley.
Ekman, P. 1992. “An argument for basic emotions,” Cognition and Emotion, vol. 6, pp.
Ekman, P., and Friesen, W.V. 1971. “Constants across cultures in the face and
emotion,” Journal of Personality and Social Psychology, vol. 17, pp. 124-129.
Ekman, P., and Friesen, W.V. 1978. Facial Action Coding System: A technique for the
measurement of facial movement, Palo Alto, CA: Consulting Psychologists Press.
Ekman, P., Davidson, R.J. and Friesen, W.V. 2000. “Duchenne’s smiles: Emotional
expression and their brain physiology II,” Journal of Personality and Social
Psychology, vol. 58, pp. 342-353.
Ekman, P. 1982. Emotion in the Human Face, England: Cambridge University Press.
Ekman, P., Friesen, W.V., and O’Sullivan, M. 1988. “Smiles when lying,” Journal of
Personality and Social Psychology, vol. 54, pp. 414-420.
Ekman, P., Huang, T.S., Sejnowski, T.J., and Hager, J.C. 1993. “Final report to NSF of
the planning workshop on facial expression understanding,” Human-Computer
Interaction Laboratory, University of California, San Francisco, (also available on
http://face-emotion.com/dataface/nsfrept/nsf_contents.html, last visited 13 December
2004).
Ekman, P., Levenson, R.W., and Friesen, W.V. 1983. “Autonomic nervous system
activity distinguishes among emotions,” Science, vol. 221, pp. 1208-1210.
Essa, I., and Pentland, A. 1997. “Coding, analysis, interpretation and recognition of facial
expressions,” IEEE Transactions on Pattern Analysis, Machine Intelligence, vol. 19,
no. 7, pp. 757-763.
Eveland, C.K., Socolinsky, D.A., and Wolff, L.B. 2003. “Tracking human faces in infrared
video,” Image and Vision Computing, vol. 21, no. 7, pp. 579-590.
Everitt, B.S., and Dunn, G. 1991. Applied Multivariate Data Analysis, London: John
Wiley and Sons.
Fasel, B., and Luettin, J. 2003. “Automatic facial expression analysis: a survey,” Pattern
Recognition, vol. 36, pp. 259-275.
Field, A. 2000. Discovering Statistics using SPSS for Windows, London: Sage.
Fried, L.A. 1976. Anatomy of the head, neck, face and jaws, Philadelphia: Lea and
Febiger.
Friedman, S.M. 1970. Visual Anatomy, vol. I, Head and Neck, New York: Harper and
Row.
Fujimasa, I. 1998. “Pathophysiological expression and analysis of infrared thermal
images,” IEEE Engineering in Medicine and Biology, vol. 17, no. 4, pp. 34-42.
Fujimasa, I., Chinzei, T., and Saito, I. 2000. “Converting far infrared image information
to other physiological data,” IEEE Engineering in Medicine and Biology, vol. 19, no. 3,
Fukunaga, K. 1990. Statistical Pattern Recognition, London: Academic Press.
Gao, Y., Leung, M.K.H., Hui, S.C., and Tananda, M.W. 2003. “Facial expression
recognition from line-based caricatures,” IEEE Transactions on Systems, Man, and
Cybernetics, vol. 33, no. 3, pp. 407-412.
Garbey, M., Merla, A., and Pavlidis, I. 2004. “Estimation of blood flow speed and vessel
location from thermal video,” In the Proceedings of CVPR 2004, IEEE 2004
Conference on Computer Vision and Pattern Recognition, vol. 1, pp. 1356-1363.
Gavhed, D., Makinen, T., Holmer, I., and Rintamaki, H. 2000. “Face temperature and
cardiorespiratory responses to wind in thermoneutral and cool subjects exposed to –10
°C,” European Journal of Applied Physiology, vol. 83, pp. 449-456.
George, D., and Mallery, P. 1995. SPSS/PC+ Step by Step: A simple guide and
reference. London: Wadsworth Publishing Co.
Gong, S., McKenna, S.J., and Psarrou, A. 2000. Dynamic Vision: From Images to Face
Recognition, London: Imperial College Press.
Gonzalez, R.C., and Woods, R.E. 2002. Digital Image Processing. New York:
Addison-Wesley Publishing Inc.
Gottumukkal, R., and Asari, V.K. 2004. “An improved face recognition technique based
on modular PCA approach,” Pattern Recognition Letters, vol. 25, pp. 429-436.
Gupta, A.K., and Logan, T.P. 1990. “On a multiple observations model in discriminant
analysis,” Journal of Statistics and Computer Simulation, vol. 34, pp. 119–132.
Gur, R.C., Sara, R., Hagendoorn, M., Maron, O., Hughett, P., Macy, L., Turner, T.,
Bajcsy, R., Posner, A., and Gur, R.E. 2002. “A method of obtaining 3-dimensional
facial expressions and its standardization for use in neurocognitive studies,” Journal of
Neuroscience Methods, no. 115, pp. 137-143.
Hager, J.C. 1985. “A comparison of units for visually measuring facial actions,”
Behavior Research Methods, Instruments and Computers, vol. 17, pp. 450-
468.
Hara, F., and Kobayashi, H. 1997. “State of the art in component development for
interactive communications with humans,” Advanced Robotics, vol. 11, no. 6, pp. 585-
Haussecker, H.W., and Fleet, D.J. 2000. “Computing optical flow with physical models
of brightness variation,” In the Proceedings of the IEEE Computer Society Conference
on Computer Vision and Pattern Recognition, 2000, vol. 2, IEEE Computer Society,
Haykin, S. 1994. Neural Networks: A Comprehensive Foundation, New York:
Macmillan.
Head, M., and Dyson, S. 2001. “Talking temperature of equine thermography,” The
Veterinary Journal, no. 162, pp. 166-167.
Heijden, F., Duin, R.P.W., Ridder, D., and Tax, D.M.J. 2004. Classification, Parameter
Estimation and State Estimation, Western Sussex, England: John Wiley & Sons.
Henderson, R., Podd, J., Smith, M., and Varela-Alvarez, H. 1995. “An examination of
four user-based software evaluation methods,” Interacting with Computers, vol. 7, no.
4, pp. 412-432.
Herry, C.L., and Frize, M. 2002. “Digital processing techniques for the assessment of
pain with infrared thermal imaging,” In the Proceedings of the 2002 IEEE International
Conference on Engineering in Medicine and Biology, IEEE EMBS 2002, pp. 1157-
1158, Houston, October 23-26.
Hess, U., Kappas, A., McHugo, G.J., Lanzetta, J.T., and Kleck, R.E. 1992. “The
facilitative effect of facial expression on self-generation of emotion,”
International Journal of Psychophysiology, vol. 12, no. 3, pp. 251-265.
Hirsch, C. and Mathews, A. 1997. “Interpretative inferences when reading about
emotional events,” Behavioral Research Therapy, vol. 35, no. 12, pp. 1123-1132.
Holt, R.J., Huang, T.S., Netravali, A.N., and Qian, R.J. 1997. “Determining articulated
motion from perspective views,” Pattern Recognition, vol. 30, no. 6, pp. 585-604.
Hosseini, H.G., and Krechowec, Z. 2004. “Facial expression analysis for estimating
patients’ emotional states in RPMS,” In the Proceedings of the 2004 IEEE International
Conference on Engineering in Medicine and Biology, IEEE EMBC 2004, vol. 2, pp.
1517-1520, San Francisco, September 1-5.
Huang, D., and Yan, H. 2001. “Modeling and animation of human expressions using
NURBS curves based on facial anatomy,” Signal Processing: Image Communication,
vol. 17, pp. 457-465.
Huang, C., and Huang, Y. 1997. “Facial expression recognition using model-based
feature extraction and action parameters classification,” Journal of Visual
Communication and Image Representation, vol. 8, no. 3, pp. 278-290.
Huberty, C.J. 1984. “Issues in the use and interpretation of discriminant analysis,”
Psychological Bulletin, vol. 95, no. 1, pp. 156-171.
Huberty, C.J. 1994. Applied Discriminant Analysis, New York: Wiley.
Hussein, S.E., and Granat, M.H. 2002. “Intention detection using a neuro-fuzzy EMG
classifier,” IEEE Engineering in Medicine and Biology, vol. 21, no. 6, pp. 123-129.
IAPS. 1997. “International affective picture system technical manual and affective
ratings,” available online at http://www.unifesp.br/adap/instructions.pdf.
Iwase, M., Ouchi, Y., Okada, H., Yokohama, C., Nobezawa, S., Yoshikawa, E.,
Tsukada, H., Takeda, M., Yamaguti, K., Kuratsune, H., Shimizu, A., and Watanabe, Y.
2002. “Neural substrates of human facial expression of pleasant emotion induced by
comic films: a PET study,” Neuroimage, vol. 17, no. 2, pp. 758-768.
Izard, C.E. 1979. “The maximally discriminative facial movement coding system
(MAX),” unpublished manuscript, available at the University of Delaware Library
through the Instructional Resource Center inter-library loan services, Delaware.
Jolliffe, I.T. 2002. Principal Component Analysis, New York: Springer-Verlag.
Jones, B.F., and Plassmann, P. 2002. “Digital infrared thermal imaging of human skin,”
IEEE Engineering in Medicine and Biology