Document Type

Article

Publication Date

7-13-2012

Publication Title

Social Cognitive and Affective Neuroscience

Department

Department of Psychological and Brain Sciences

Abstract

More than a decade of research has demonstrated that faces evoke prioritized processing in a ‘core face network’ of three brain regions. However, whether these regions prioritize the detection of global facial form (shared by humans and mannequins) or the detection of life in a face has remained unclear. Here, we dissociate form-based and animacy-based encoding of faces by using animate and inanimate faces with human form (humans, mannequins) and dog form (real dogs, toy dogs). We used multivariate pattern analysis of BOLD responses to uncover the representational similarity space for each area in the core face network. We show that only responses in the inferior occipital gyrus are organized by global facial form alone (human vs dog), while animacy becomes an additional organizational priority in later face-processing regions: the lateral fusiform gyri (latFG) and right superior temporal sulcus. Additionally, patterns evoked by human faces were maximally distinct from all other face categories in the latFG and parts of the extended face perception system. These results suggest that once a face configuration is perceived, faces are further scrutinized for whether the face is alive and worthy of social cognitive resources.
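
For readers who want to see the general logic of the analysis the abstract names, the sketch below illustrates a representational similarity analysis (RSA) over the four face categories: a neural dissimilarity matrix computed from ROI voxel patterns is compared against a form-based and an animacy-based model matrix. It is illustrative only; the ROI patterns, array shapes, dissimilarity metric, and model matrices are assumptions standing in for the authors' actual preprocessing and pipeline.

```python
# Minimal RSA sketch (illustrative; not the authors' actual pipeline).
import numpy as np
from scipy.spatial.distance import pdist, squareform
from scipy.stats import spearmanr

# Assume one mean BOLD pattern per face category within an ROI
# (e.g. inferior occipital gyrus), shape: (n_categories, n_voxels).
categories = ["human", "mannequin", "real_dog", "toy_dog"]
rng = np.random.default_rng(0)
roi_patterns = rng.random((4, 500))  # placeholder voxel patterns

# Neural representational dissimilarity matrix (RDM): 1 - Pearson correlation.
neural_rdm = squareform(pdist(roi_patterns, metric="correlation"))

# Model RDMs encoding the two candidate organizational schemes.
# Form model: human-shaped faces (human, mannequin) vs dog-shaped faces.
form_labels = np.array([0, 0, 1, 1])
# Animacy model: animate faces (human, real dog) vs inanimate faces.
animacy_labels = np.array([0, 1, 0, 1])

def model_rdm(labels):
    """1 where two categories differ on the dimension, 0 where they match."""
    return (labels[:, None] != labels[None, :]).astype(float)

# Compare neural and model RDMs over the off-diagonal entries
# (Spearman rank correlation is a common choice in RSA).
triu = np.triu_indices(len(categories), k=1)
for name, rdm in [("form", model_rdm(form_labels)),
                  ("animacy", model_rdm(animacy_labels))]:
    rho, _ = spearmanr(neural_rdm[triu], rdm[triu])
    print(f"{name} model vs neural RDM: rho = {rho:.2f}")
```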

DOI

10.1093/scan/nss078

Original Citation

Looser CE, Guntupalli JS, Wheatley T. Multivoxel patterns in face-sensitive temporal regions reveal an encoding schema based on detecting life in a face. Soc Cogn Affect Neurosci. 2013 Oct;8(7):799-805. doi: 10.1093/scan/nss078. Epub 2012 Jul 13. PMID: 22798395; PMCID: PMC3791074.