Nonverbal behavior is multimodal and interpersonal. In several studies, I have investigated the dynamics of facial expression and head movement for emotion communication, social interaction, and clinical applications. By modeling multimodal and interpersonal communication, my work seeks to inform affective computing and behavioral health informatics. In this talk, I will present recent work on computational methods for affect communication in children with facial abnormalities, automatic measurement of pain intensity, and assessment of depression severity. I will conclude by sketching future directions for moving from the lab to the real world.
Zakia Hammal is a senior project scientist at the Robotics Institute at Carnegie Mellon University. Her areas of expertise are affective computing (also known as Emotion AI), multimodal human behavior modeling in social interaction, and behavioral health informatics. She has organized successful workshops on Interpersonal Synchrony and Influence (INTERPERSONAL at ICMI 2015) and on Face and Gesture Analysis for Health Informatics (FGAHI at CVPR 2019 and FG 2018). To promote the critical importance of context in affect recognition, she has led a series of six successful Context-Based Affect Recognition workshops at premier IEEE conferences in computer vision, affective computing, social communication, and multimedia (CBAR at FG 2019, ACII 2017, CVPR 2016, FG 2015, ACII 2013, and SocialCom 2012). Her honors include an Outstanding Paper Award at ACM ICMI 2012, a Best Paper Award at IEEE ACII 2015, and an Outstanding Reviewer Award at IEEE FG 2015.