Integrating facial expressions and skin texture in face recognition
Face recognition is the most common biometric modality in use today. In this thesis, we have investigated two new aspects of face recognition: the use of skin marks as an additional biometric trait, and the use of facial expressions both as a biometric and as a behavior indicator.

Most current face recognition systems, such as Eigenfaces, Fisherfaces, and Active Appearance Models, process faces holistically and therefore cannot adequately model local features such as moles and scars. We have developed an algorithm for facial skin mark detection that models local skin patches using Principal Component Analysis (PCA), and we have explored different ways of computing the distance between an original patch and its PCA model for this purpose.

Most behavioral science research on facial expression analysis operates within the framework of the Facial Action Coding System (FACS). FACS is a quantitative system for measuring all visible facial muscle movements, coded as Action Units (AUs), and is thus well suited to identifying movement patterns unique to an individual. We have developed a system that extracts AUs automatically and have tested the hypothesis that facial expressions can be used as features for person identification, thereby augmenting face biometrics. We use the displacement of facial landmark points between the neutral face and the expression at its 'emotional apex' (instead of a video sequence, as in previous work) to construct a single feature vector representing that expression. Genuine and impostor score distributions are constructed from pairs of images of the same person and of different people, with the same and with different expressions. We have conducted experiments on the publicly available Cohn-Kanade Facial Expression Database and used the Wilcoxon rank-sum test to verify whether the genuine and impostor match-score distributions differ.
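To illustrate the patch-modeling idea, the following minimal sketch (our own illustration, not the thesis implementation) fits a PCA model to flattened skin patches and uses the reconstruction residual as one possible patch-to-model distance; patches with large residuals would be flagged as candidate skin marks.

```python
import numpy as np

def pca_model(patches, k):
    """Fit a k-component PCA model to an (n, d) array of flattened skin patches."""
    mean = patches.mean(axis=0)
    centered = patches - mean
    # Rows of vt are the principal axes of the patch population.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return mean, vt[:k]

def reconstruction_distance(patch, mean, components):
    """Distance between a patch and its projection onto the PCA subspace.

    A large residual means the patch is poorly explained by ordinary skin
    texture, which makes it a candidate skin mark (mole, scar).
    """
    centered = patch - mean
    coeffs = components @ centered            # project into the subspace
    residual = centered - components.T @ coeffs
    return np.linalg.norm(residual)
```

In practice one would slide this detector over skin regions of the face and threshold the residuals; the thesis also compares other patch-to-model distances, which could be swapped in for the residual norm here.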
We have also developed methods that automatically classify facial expressions (sad, happy, anger, etc.) from the extracted AUs and that distinguish expressions arising involuntarily from natural emotion from those that are "posed". This latter classification is based on behavioral science guidelines stating that "posed" expressions can be recognized by examining characteristics such as expression symmetry and the presence of the right combination of AUs.
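To illustrate the rule-based flavor of such a classifier, here is a hypothetical sketch. The AU-to-emotion mappings and the asymmetry threshold below are illustrative assumptions loosely modeled on common FACS-based guidelines, not the thesis's actual rules.

```python
# Hypothetical AU-combination rules (illustrative only; the real FACS-based
# guidelines are richer and involve AU intensities and timing).
EMOTION_RULES = {
    "happy": {6, 12},    # cheek raiser + lip corner puller
    "sad":   {1, 4, 15}, # inner brow raiser + brow lowerer + lip corner depressor
    "anger": {4, 5, 23}, # brow lowerer + upper lid raiser + lip tightener
}

def classify_expression(active_aus):
    """Return the first emotion whose required AUs are all active, else None."""
    for emotion, required in EMOTION_RULES.items():
        if required <= active_aus:
            return emotion
    return None

def looks_posed(left_intensities, right_intensities, threshold=0.3):
    """Flag an expression as likely posed when it is strongly asymmetric.

    Spontaneous expressions tend to be more symmetric than deliberate ones,
    so a large left/right difference in any AU's intensity is suspicious.
    The threshold here is an arbitrary assumption for illustration.
    """
    asymmetry = max(abs(l - r) for l, r in zip(left_intensities, right_intensities))
    return asymmetry > threshold
```

A real system would combine both signals: an expression showing an unusual AU combination, or strong asymmetry, would be labeled "posed" rather than spontaneous.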