Multivoxel Patterns in Face-Sensitive Temporal Regions Reveal an Encoding Schema Based on Detecting Life in a Face
Social Cognitive and Affective Neuroscience
Department of Psychological and Brain Sciences
More than a decade of research has demonstrated that faces evoke prioritized processing in a ‘core face network’ of three brain regions. However, whether these regions prioritize the detection of global facial form (shared by humans and mannequins) or the detection of life in a face has remained unclear. Here, we dissociate form-based and animacy-based encoding of faces by using animate and inanimate faces with human form (humans, mannequins) and dog form (real dogs, toy dogs). We used multivariate pattern analysis of BOLD responses to uncover the representational similarity space for each area in the core face network. We show that only responses in the inferior occipital gyrus are organized by global facial form alone (human vs. dog), while animacy becomes an additional organizational priority in later face-processing regions: the lateral fusiform gyri (latFG) and right superior temporal sulcus. Additionally, patterns evoked by human faces were maximally distinct from all other face categories in the latFG and parts of the extended face perception system. These results suggest that once a face configuration is perceived, faces are further scrutinized for whether the face is alive and worthy of social cognitive resources.
Looser CE, Guntupalli JS, Wheatley T. Multivoxel patterns in face-sensitive temporal regions reveal an encoding schema based on detecting life in a face. Soc Cogn Affect Neurosci. 2013 Oct;8(7):799-805. doi: 10.1093/scan/nss078. Epub 2012 Jul 13. PMID: 22798395; PMCID: PMC3791074.
Dartmouth Digital Commons Citation
Looser, Christine E.; Guntupalli, Jyothi S.; and Wheatley, Thalia, "Multivoxel Patterns in Face-Sensitive Temporal Regions Reveal an Encoding Schema Based on Detecting Life in a Face" (2012). Dartmouth Scholarship. 3800.