A new study being conducted by researchers in the McPartland Lab at the Yale Child Study Center is using technological advancements to bring biomarker discovery research to minimally verbal and cognitively impaired autistic individuals. These individuals have historically been underrepresented in neuroscience research because participation typically requires understanding and complying with complicated verbal instructions. In this study, however, computer vision tools watch what a person is doing and respond by changing what is shown on a computer screen to engage the participant and guide them through the experiment. Using this approach, it is now possible to invite participants with significant cognitive impairment into research studies that, in the past, would have been too demanding for them to complete.
The field of autism spectrum disorder (ASD) research lacks biomarkers, i.e., objective, sensitive measures of symptoms suitable for use in clinical research. Currently, only two biomarkers are being considered for use in autism as part of the FDA's Biomarker Qualification Program. Both putative biomarkers measure how people process visual social information: (1) an electroencephalographic (EEG) index of face processing efficiency, the N170; and (2) the Oculomotor Index (OMI), a quantification of how much someone looks at the social regions of visual scenes, such as faces. Evidence in support of these biomarkers was collected as part of the Autism Biomarkers Consortium for Clinical Trials (ABC-CT), led by Dr. McPartland at the Yale Child Study Center. This consortium is the largest study of its kind in the United States and assessed 399 children with cognitive ability in the normal range. However, despite the promise of these biomarkers, it is unknown whether they are practical to collect or appropriate for individuals with cognitive impairment.
Despite advances in early intervention and increased access to services, many people with autism have significant cognitive impairment. This impairment is associated with difficulties in most avenues of life and a reduced likelihood that these individuals can live independently. Although these individuals are in significant need of support and represent up to 30% of the ASD population, they make up far less than 30% of participants enrolled in research studies. Consequently, research findings in the field may fail to represent cognitively impaired individuals. This is particularly meaningful with regard to neuroscience research. Recent meta-analyses indicated that, of 23 studies investigating the N170 biomarker in ASD, none focused on a sample with intellectual disability (ID) [1], and, of 5,033 participants across 122 studies examining eye-tracking in ASD, only 416 had ID [2]. Importantly, only two studies, with fewer than 30 children, included children with ASD who both had an intellectual impairment and were older than six years old, suggesting that as children age, ID becomes an increasing obstacle to research participation.
One of the most significant reasons for the underrepresentation of cognitively impaired individuals in neuroscience research is the need to follow spoken or written instructions and maintain engagement with a task. These demands are characteristic of neuroscience research because the processes under investigation, such as how the brain processes images of other people, require that a participant sit still and look at pictures of people. If you cannot or will not sit still, then it is impossible to accurately measure brain activity, and if you cannot or will not look at pictures on a screen, then it is impossible to know what is happening in your brain when you do look at people. While these requirements are simple, they create a hurdle to research participation for many cognitively impaired autistic people.
To address this hurdle, researchers at the Yale Child Study Center have developed an interactive experimental delivery system that helps people participate in research without needing to understand complex instructions. We call this program PELICAN (Participant Empowering Learning Infrastructure for Characterization and Neuroscience). PELICAN uses high-speed computer vision systems to track a person's eye and head movements. These measures of movement support an experiment that rewards a participant for sitting still and attending to the experimental task. Specifically, rather than being asked to sit quietly and attend to a computer screen, as is typical in these experiments, participants are greeted by a computer playing their favorite video. As long as they look at the screen and remain relatively still, the movie plays. In this way, they are rewarded for sitting calmly and attending. What is unique about this approach, compared to an observer turning a movie on or off, is that the computer responds almost instantaneously and consistently. This instantaneous reactivity creates an environment of easily learned rules for how the experiment works, without the need for verbal instructions. Once the system has determined that the participant has learned how the experiment works, it incorporates brief segments of ABC-CT experiments, allowing us to collect data on the two promising biomarkers, the N170 and the OMI.
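The gaze-contingent logic described above can be illustrated with a minimal sketch. This is not the PELICAN implementation; it is a simplified illustration, in Python, of the core rule the article describes: the video plays only while recent gaze samples mostly land on the screen, so the contingency itself teaches the participant how the experiment works. The threshold value and function names here are illustrative assumptions.

```python
# Illustrative sketch of a gaze-contingent reward loop (not the actual
# PELICAN code). Each gaze sample is True if the eye-tracking / computer
# vision system judged the participant to be looking at the screen.

ATTENTION_THRESHOLD = 0.5  # assumed fraction of on-screen samples required


def is_attending(recent_gaze, threshold=ATTENTION_THRESHOLD):
    """Return True if enough recent gaze samples land on the screen."""
    if not recent_gaze:
        return False
    on_screen_fraction = sum(1 for sample in recent_gaze if sample) / len(recent_gaze)
    return on_screen_fraction >= threshold


class GazeContingentPlayer:
    """Plays the video only while the participant attends; pauses otherwise."""

    def __init__(self):
        self.playing = False

    def update(self, recent_gaze):
        # Called on every frame: the near-instantaneous response is what
        # lets participants learn the rule without verbal instructions.
        self.playing = is_attending(recent_gaze)
        return "playing" if self.playing else "paused"


player = GazeContingentPlayer()
print(player.update([True, True, False, True]))  # mostly on-screen -> "playing"
print(player.update([False, False, True]))       # mostly off-screen -> "paused"
```

In practice the gaze samples would come from an eye tracker or camera-based computer vision pipeline, and a head-movement check would be combined with the gaze check, but the contingency shown here is the core of the design.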
The benefits of this approach are threefold. First, it blazes a trail for developing inclusive experimental paradigms that maintain the rigor necessary for cognitive neuroscience research. Second, the biomarker data will provide unique insight into social perception in severely impaired individuals with ASD. Finally, by incorporating experimental assays from the ABC-CT, we are pioneering biomarker discovery in cognitively impaired individuals. This advance sets the stage for increasing the diversity of individuals with ASD participating in clinical trials and cognitive neuroscience overall.
Dr. Adam Naples is Associate Research Scientist and Dr. James McPartland is Associate Professor of Child Psychiatry and Psychology at the Yale Child Study Center.
These advancements would not have been possible without the partnership and support of individuals with autism and their families. If you would like to find out more about this exciting new study, visit www.mcp-lab.org or contact Erin MacDonnell at (203) 737-3439 or firstname.lastname@example.org.
1. Kang E, Keifer CM, Levy EJ, Foss-Feig JH, McPartland JC, Lerner MD. Atypicality of the N170 event-related potential in autism spectrum disorder: a meta-analysis. Biol Psychiatry Cogn Neurosci Neuroimaging. 2018;3(8):657-666.
2. Frazier TW, Strauss M, Klingemier EW, et al. A meta-analysis of gaze differences to social and nonsocial information between individuals with and without autism. J Am Acad Child Adolesc Psychiatry. 2017;56(7):546-555.