Date of Award

Winter 2020

Document Type

Thesis (Undergraduate)

Department or Program

Cognitive Science

First Advisor

Caroline Robertson

Abstract

Autism is a multifaceted neurodevelopmental condition. Around 90% of individuals with autism experience sensory sensitivities, which particularly impact visual perception. Despite this high prevalence, previous studies of visual perception in autism have been severely limited: many rely on unnaturalistic stimuli and experimental methods, and have produced conflicting, hard-to-replicate results. In this study, we investigate real-world visual experience in autism with a cutting-edge experimental approach. First, we use virtual reality headsets with embedded eye-trackers to measure gaze behavior while individuals freely explore real-world, everyday scenes. Then, we compare their gaze behavior to the representations within convolutional neural networks (CNNs), a class of computational models that resemble the primate visual system. This allows us to model which stages of the visual processing hierarchy could account for differences in visual processing between individuals with and without autism. To our knowledge, this is the first fully unbiased, data-driven approach to studying naturalistic visual behavior in autism. In brief, we found that convolutional neural networks, regardless of the task on which they were trained, better predict gaze behavior in typically developing controls than in individuals with autism. This suggests that differences in gaze behavior between the two groups are not principally driven by the semantically meaningful features within a scene, but rather emerge from differences earlier in visual processing.
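To make the modeling approach concrete, the following is a minimal sketch (not the thesis code) of one way to score how well a pretrained CNN layer predicts human gaze on a scene: extract a feature-energy map from an intermediate layer of a pretrained ResNet-50, smooth recorded fixations into a gaze-density map, and correlate the two. The file names, the choice of layer3, and the smoothing bandwidth are illustrative assumptions, not details from the study.

```python
# Minimal sketch: correlate a CNN feature-energy map with a gaze-density map.
# Assumes PyTorch/torchvision and SciPy; "scene.jpg" and "fixations.npy"
# (an N x 2 array of x, y pixel coordinates) are hypothetical inputs.
import numpy as np
import torch
import torch.nn.functional as F
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image
from scipy.ndimage import gaussian_filter
from scipy.stats import spearmanr

def cnn_salience_map(img, layer="layer3"):
    """L2 norm of activations at one ResNet-50 layer, upsampled to
    image size: a crude proxy for mid-level feature salience."""
    model = models.resnet50(weights=models.ResNet50_Weights.DEFAULT).eval()
    feats = {}
    getattr(model, layer).register_forward_hook(
        lambda m, i, o: feats.update(a=o.detach()))
    x = T.Compose([
        T.Resize((224, 224)), T.ToTensor(),
        T.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
    ])(img).unsqueeze(0)
    with torch.no_grad():
        model(x)
    act = feats["a"].squeeze(0)                 # (channels, H, W)
    sal = act.norm(dim=0)[None, None]           # channel-wise feature energy
    h, w = img.size[1], img.size[0]             # PIL size is (W, H)
    return F.interpolate(sal, size=(h, w), mode="bilinear",
                         align_corners=False)[0, 0].numpy()

def gaze_density_map(fixations, shape, sigma=30):
    """Smooth discrete fixation points into a continuous density map."""
    dens = np.zeros(shape)
    for x, y in fixations.astype(int):
        if 0 <= y < shape[0] and 0 <= x < shape[1]:
            dens[y, x] += 1
    return gaussian_filter(dens, sigma=sigma)

img = Image.open("scene.jpg").convert("RGB")
fix = np.load("fixations.npy")
sal = cnn_salience_map(img)
gaze = gaze_density_map(fix, sal.shape)
rho, _ = spearmanr(sal.ravel(), gaze.ravel())
print(f"layer3 feature energy vs. gaze density: Spearman rho = {rho:.3f}")
```

Repeating such a comparison across layers, and across participant groups, is one way to localize where in the processing hierarchy CNN predictions of gaze begin to diverge between groups.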
