The findings suggest that the app could one day screen infants and toddlers for ASD and refer them for early intervention, when chances for treatment success are greatest.
The study appears in JAMA Pediatrics and was conducted by Geraldine Dawson, Ph.D., director of the NIH Autism Center of Excellence at Duke University, and colleagues. Funding was provided by NIH’s Eunice Kennedy Shriver National Institute of Child Health and Human Development (NICHD) and National Institute of Mental Health.
Studies have found that the human brain is hard-wired for social cues, with a person’s gaze automatically focusing on social signals. In ASD, attention to social stimuli is reduced, and researchers have sought to screen for ASD in young children by tracking their eye movements while they view social stimuli. However, equipment used for visual tracking is expensive and requires specially trained personnel, limiting its use outside of laboratory settings.
The current study enrolled 933 toddlers ages 16 to 38 months during a well-child primary care visit. Of these children, 40 were later diagnosed with ASD. The children watched short videos on a mobile device showing people smiling and making eye contact or engaging in conversation. Researchers recorded the children’s gaze patterns with the device’s camera and analyzed them using computer vision and machine learning. Children with ASD were much less likely than typically developing children to focus on social cues and to visually track the conversations in the videos.
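The study’s actual models are not described here, but the general idea of quantifying attention to social stimuli from gaze data can be sketched as follows. This is a minimal illustration only: the region coordinates, function names, and screening cutoff are all assumptions for demonstration, not details from the paper.

```python
# Hypothetical sketch: given per-frame (x, y) gaze estimates, compute the
# fraction of gaze samples landing inside a "social" region of interest
# (e.g., the speaker's face) and flag unusually low social attention.
# The region, data, and 0.5 cutoff below are illustrative, not from the study.

from dataclasses import dataclass


@dataclass
class Region:
    """Axis-aligned rectangle marking a social region of interest,
    in normalized screen coordinates (0.0 to 1.0)."""
    x0: float
    y0: float
    x1: float
    y1: float

    def contains(self, x: float, y: float) -> bool:
        return self.x0 <= x <= self.x1 and self.y0 <= y <= self.y1


def social_attention_fraction(gaze_points, region):
    """Fraction of gaze samples that fall inside the social region."""
    if not gaze_points:
        return 0.0
    hits = sum(region.contains(x, y) for x, y in gaze_points)
    return hits / len(gaze_points)


# Example: a face region and four gaze samples from one video.
face = Region(0.3, 0.2, 0.7, 0.8)
gaze = [(0.5, 0.5), (0.1, 0.1), (0.6, 0.4), (0.9, 0.9)]

score = social_attention_fraction(gaze, face)   # 2 of 4 samples hit -> 0.5
flagged = score < 0.5                           # illustrative screening cutoff
```

In practice, a screening tool would aggregate such metrics across many videos and gaze features before any classification, and the thresholds would be set from validation data rather than chosen by hand.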
The authors concluded that, pending confirmation by larger studies, the eye-tracking app, which pairs specially designed videos with computer vision analysis, could be a viable method for identifying young children with ASD.