Student Theses and Dissertations

Author

Yanis Tazi

Date of Award

2024

Document Type

Thesis

Degree Name

Doctor of Philosophy (PhD)

RU Laboratory

Freiwald Laboratory

Abstract

Facial movements are the primary medium for non-verbal communication and involve a complex orchestration of muscles controlled by the brain. The ability to interpret and produce these movements enables the expression of a wide range of emotions and social cues, all essential for social interaction. Despite their significance, the neural mechanisms governing facial movements remain poorly understood, hindered by the complexity of muscle coordination and the limitations of traditional analysis methods, which are subjective and labor-intensive. To understand facial motor control objectively, this research adopts a three-pronged approach: 1) developing and interpreting computational models within a novel multi-task training framework that simultaneously performs facial expression and identity recognition; 2) introducing a novel self-supervised Person-Specific Model (PSM) framework that extracts person-specific facial movements independently of other facial characteristics, improving the characterization of facial muscle actions by leveraging individual differences; and 3) using data-driven computational models to analyze a unique dataset of single-cell recordings from sensorimotor cortical regions paired with behavioral video recordings of spontaneous, unconstrained, naturalistic facial movements in macaques.

This research first focused on developing computational models capable of separating facial movements from other characteristics such as identity, a challenge for computational models because facial movements and shapes vary across individuals. Using Convolutional Neural Networks in a novel training framework, I trained networks simultaneously for facial identity and expression recognition, mirroring the multi-tasking capability of face processing in the human visual system.
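The multi-task training idea can be sketched as a shared representation feeding two task-specific heads whose losses are optimized together. The sketch below is a minimal, hypothetical illustration only: the feature dimension, class counts, loss weights, and function names are assumptions for exposition, not the thesis's actual architecture.

```python
import math
import random

random.seed(0)

FEAT_DIM = 8       # size of the shared feature vector (hypothetical)
N_IDENTITIES = 3   # number of identity classes (hypothetical)
N_EXPRESSIONS = 4  # number of expression classes (hypothetical)

def linear_head(features, weights):
    """One linear classification head over the shared features."""
    return [sum(f * w for f, w in zip(features, row)) for row in weights]

def cross_entropy(logits, target):
    """Softmax cross-entropy loss for a single example."""
    m = max(logits)
    exps = [math.exp(l - m) for l in logits]
    z = sum(exps)
    return -math.log(exps[target] / z)

# Stand-in for the CNN backbone's output for one face image.
features = [random.uniform(-1, 1) for _ in range(FEAT_DIM)]

# Two task-specific heads share the same backbone features.
w_id = [[random.uniform(-1, 1) for _ in range(FEAT_DIM)] for _ in range(N_IDENTITIES)]
w_ex = [[random.uniform(-1, 1) for _ in range(FEAT_DIM)] for _ in range(N_EXPRESSIONS)]

loss_identity = cross_entropy(linear_head(features, w_id), target=1)
loss_expression = cross_entropy(linear_head(features, w_ex), target=2)

# Multi-task objective: both tasks are trained simultaneously, so the
# shared features must support identity and expression recognition at once.
total_loss = loss_identity + loss_expression
print(f"identity={loss_identity:.3f} expression={loss_expression:.3f} total={total_loss:.3f}")
```

In a full implementation the backbone would be a deep CNN and the combined loss would drive gradient updates; the point here is only the shared-features-plus-two-heads structure.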
This approach revealed functional segregation within the network: it differentiated between a person's identity and their expressions by dedicating different facial zones to each task, and task-specific facial features emerged in the network's final layers. Building on this, I introduced the Person-Specific Model (PSM), an innovative self-supervised learning approach that successfully extracts individual-specific facial movements independently of other facial characteristics. PSM stands out by leveraging individual differences, improving the characterization of facial muscle actions. Its dual learning approach uniquely reveals a "repertoire" of facial movement primitives, capturing both universal patterns shared across individuals and more complex, nuanced movements unique to each individual that traditional methods miss.

In parallel, this research explored the neural basis of facial movements in monkeys, from larger movements such as threats and chewing to subtle, spontaneous movements, using naturalistic facial video recordings and single-cell neural data from sensorimotor cortical regions. A flexible computational framework was developed to analyze unconstrained, continuous behavior. The analysis revealed specialized neural patterns linked to particular movements: for example, sensory area S1 was active during lipsmacking, and primary motor cortex M1 was involved in actions such as chewing and lipsmacking. Distinct neural subspaces and neurons were associated with different social behaviors. The findings also revealed parallels between the neural dynamics of facial movements and those of well-studied arm movements; for example, the neural dynamics of threat expressions resembled those of reaching, and sensory cortical areas such as S1 exhibited unique dynamics during these expressions.
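One simplified way to read the idea of separating person-specific movement from static facial characteristics is as a residual against a per-individual baseline. The sketch below illustrates only that intuition; the toy data, the mean-embedding baseline, and the subtraction scheme are assumptions for exposition, not the PSM's actual self-supervised objective.

```python
# Toy per-frame face embeddings, keyed by individual (hypothetical data).
frames = {
    "A": [[1.0, 2.0], [1.2, 2.4], [0.8, 1.6]],
    "B": [[5.0, 0.0], [5.3, 0.3], [4.7, -0.3]],
}

def per_individual_mean(vectors):
    """Mean embedding for one individual across all of their frames."""
    n = len(vectors)
    return [sum(v[d] for v in vectors) / n for d in range(len(vectors[0]))]

# Static, person-specific component: each individual's mean embedding.
baselines = {who: per_individual_mean(vs) for who, vs in frames.items()}

# Movement component: the residual of each frame against that baseline,
# so identity-related offsets cancel and only frame-to-frame deviations
# (the movements) remain, comparable across individuals.
movements = {
    who: [[f[d] - baselines[who][d] for d in range(len(f))] for f in vs]
    for who, vs in frames.items()
}

print(movements["A"])
```

Note how individuals A and B occupy very different regions of the toy embedding space, yet their residual "movement" vectors live on a common scale around zero, which is the property that lets individual differences be factored out.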
Transitioning to continuous, unconstrained behavior, this framework confirmed the roles of S1 and M1 in lower-face movements and, importantly, detected subtle neural-behavioral temporal patterns, such as the involvement of anterior primary motor area F4 in nose movements and of ventral premotor cortex PMV in eye movements, which traditional techniques failed to capture because movement variance at these facial locations is minimal. In particular, distinct neural control strategies were identified, with regions such as S1 and M1 more active during larger expressive movements and PMV involved in more subtle movements, highlighting a sophisticated level of neural segregation between these movement scales. Taken together, these results unveil a comprehensive picture of the intricate cortical control underlying facial movements, distinguishing between larger expressive motions and smaller, subtler actions. Crucially, this work underscores the need for standardized tools capable of analyzing spontaneous, unconstrained behaviors beyond labeled expressions. This analysis addresses that challenge, promising a deeper understanding of natural behavior and its neural underpinnings.
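At its simplest, linking continuous neural activity to continuous behavior amounts to asking which region's activity trace covaries with a movement trace over time. The sketch below shows that bare idea with a Pearson correlation over toy traces; the region names, traces, and scoring scheme are illustrative assumptions, not the thesis's actual analysis pipeline.

```python
import math

def pearson(x, y):
    """Pearson correlation between two equal-length time series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical movement trace (e.g. lower-face motion energy over time)...
movement = [0.0, 0.2, 0.9, 1.0, 0.3, 0.1, 0.0, 0.4]

# ...and toy firing-rate traces for two regions (invented numbers).
regions = {
    "S1": [0.1, 0.3, 0.8, 1.1, 0.4, 0.2, 0.1, 0.5],   # tracks the movement
    "PMV": [0.5, 0.4, 0.5, 0.6, 0.5, 0.4, 0.6, 0.5],  # largely unrelated
}

# Score each region by how strongly its activity follows the movement.
scores = {name: pearson(trace, movement) for name, trace in regions.items()}
best = max(scores, key=scores.get)
print(scores, best)
```

Real analyses of unconstrained behavior would of course use richer models (e.g. population subspaces and temporal lags) rather than a single correlation, but the correlational question of which region covaries with which movement is the common starting point.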

Comments

A Thesis Presented to the Faculty of The Rockefeller University in Partial Fulfillment of the Requirements for the degree of Doctor of Philosophy

Available for download on Wednesday, May 06, 2026

Included in

Life Sciences Commons
