Imagine using a handheld device to help you determine if a child has autism. The momentum for such a future is building, some application developers say.
In October, a team of scientists debuted a tablet-based app to identify signs of autism in a child in just 10 minutes. That app, called SenseToKnow, uses the tablet’s camera to monitor children as they watch short movies and to track their motor skills during a bubble-popping video game. It then uses artificial intelligence to analyze their eye movements, blinks and other physical responses.
SenseToKnow correctly identifies children with autism (a measure called sensitivity) 87.8 percent of the time and correctly returns negative results (referred to as specificity) 80.8 percent of the time, its makers reported in Nature Medicine. “The ability to detect early signs using only a brief app delivered on a tablet in real settings is exciting,” says Geraldine Dawson, professor of psychiatry and behavioral sciences at Duke University in Durham, North Carolina, who co-led the research.
Another digital tool, called EarliPoint, was developed by researchers at the Emory University School of Medicine in Atlanta, Georgia, and described in two studies in September. It relies on eye tracking to spot signs of autism in toddlers. Cleared in 2022 by the U.S. Food and Drug Administration (FDA), EarliPoint showed 78 percent sensitivity and 85.4 percent specificity in a clinical trial the team conducted.
And more devices and mobile applications are on the way. “People love this tablet tech because it [offers] the promise of accessibility,” says Frederick Shic, professor of general pediatrics at the University of Washington in Seattle, who studies eye-tracking technology but did not contribute to either project. “It’s also just cool as hell.”
But the technology is not without its detractors. Among the most pressing questions critics raise is whether these tools can actually solve the problem they promise to address: delays in diagnosis and care for autistic children.
In the positive column, tablet-based apps promise to flag more children for diagnosis than standard questionnaire-based screeners do.
For example, among 475 toddlers screened by SenseToKnow during pediatric well-child visits, 49 were subsequently diagnosed with autism and 98 with developmental delay without autism. (That fraction may be higher than would be expected in the general population, Dawson notes, because parents who already had concerns may have been more inclined to participate in the study.) Children who screened positive had a 40.6 percent likelihood of subsequently being diagnosed with the condition, compared with only about 15 percent for those screened using the standard parent questionnaire. Combining the app and the questionnaire boosted those chances to 63.4 percent.
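Figures like these combine in a standard way: the chance that a child who screens positive actually has the condition (the positive predictive value) depends on the test's sensitivity, its specificity, and how common the condition is in the screened group. The sketch below is illustrative only; it applies the textbook formula to the headline numbers reported for SenseToKnow and counts just the 49 autism diagnoses as positives, so it approximates rather than reproduces the 40.6 percent figure, which came from the study's actual follow-up diagnoses. The `ppv` function name is ours, not the researchers'.

```python
def ppv(sensitivity: float, specificity: float, prevalence: float) -> float:
    """Positive predictive value: P(condition | positive screen)."""
    true_positives = sensitivity * prevalence
    false_positives = (1 - specificity) * (1 - prevalence)
    return true_positives / (true_positives + false_positives)

# Reported for SenseToKnow: 87.8% sensitivity, 80.8% specificity,
# and 49 autism diagnoses among 475 toddlers screened.
print(round(ppv(0.878, 0.808, 49 / 475), 3))  # → 0.345
```

The gap between this rough 34.5 percent and the reported 40.6 percent reflects the study's richer outcome data (including the 98 children diagnosed with developmental delay without autism), but the broader point holds either way: because autism is relatively uncommon even in this enriched sample, false positives weigh heavily, which is why combining the app with the parent questionnaire lifts the predictive value so sharply.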
“This is a huge advance for the use of computer vision and AI-related methods to improve the state of the art in autism screening technologies,” Shic says. “[It] pioneers … the next generation of machine-learning methods for remote diagnostics not just for autism, but broadly into child health and development.”
The EarliPoint tool, Shic notes, is “sitting in a different class” of technology and presents different advantages. Whereas SenseToKnow is an app designed to run on a standard tablet, EarliPoint, which also evaluates toddlers, deploys technology that can be used with either a desktop computer or a custom-designed tablet from the same team, cleared by the FDA this past July.
“Our system has a built-in eye tracker, especially installed for this purpose,” says Ami Klin, director of the Marcus Autism Center and division chief of autism and developmental disabilities at the Emory University School of Medicine and co-lead investigator on the EarliPoint research.
“Because the administration of eye tracking does not require any specialized training, it can be implemented immediately in any medical office,” says Karen Pierce, professor of neurosciences at the University of California, San Diego, who is developing a tool that combines six different eye-tracking tests for ages 12 to 48 months to help clinicians detect autism early in childhood.
And that accessibility could help to address the delays many families face when seeking help, she says. “The waiting list to obtain a diagnosis of autism and receive autism spectrum disorder-related therapies can be extremely long, usually months, or even years in some cases.”
These tools could not only cut down on the time and human skill required to detect autism but also — in principle, at least — offer reliable, objective measures of autism-linked behaviors, “rather than relying on subjective, qualitative assessments that require a high level of training, time and expertise,” Dawson adds. “The latter approach has created barriers to access to care for many children.”
But how these devices should be deployed divides opinion. Pierce and Klin say they feel strongly that their tools should be optimized for use in a clinical setting — though not necessarily administered by clinicians — so that children who screen positive are in a position to find expert support. For instance, EarliPoint is designed for use in clinics by technicians who have received an hour of training.
Dawson, however, presents a different vision, in which SenseToKnow serves as a form of at-home screening. She says she anticipates multiple ways to use the app, including as a tool for parents to monitor a child’s behavior over time or a way for caregivers to gather information for health-care providers.
Yet another autism screening app, ASDetect, runs on smartphones and tablets and is intended for use by families at home. “Almost everyone has a mobile phone and access to the internet,” says its creator, Josephine Barbaro, associate professor of psychology at La Trobe University in Melbourne, Australia. The aim is “to find ways that we can reach as many children as possible.”
ASDetect extends the Social Attention and Communication Surveillance tools that Barbaro and her colleagues developed in 2010 to identify signs of autism in children between the ages of 12 and 24 months. Instead of eye tracking, it offers what is essentially an enhanced parent questionnaire: Short, narrated videos and verbal descriptions of child behavior precede questions and activities for parents — from which it calculates a high or low likelihood of autism, but not a diagnosis.
The ASDetect app also gives parents suggested next steps for their children. In a study of 745 caregivers who had prior concerns that their children had autism, the app showed 77 percent sensitivity and 91 percent specificity when compared with clinical assessments of those children.
Hannah Waddington, autism clinic lead and researcher at Victoria University of Wellington in New Zealand, who did not take part in the ASDetect research, praises the app for being free, easy to use and highly predictive.
But “alternative approaches, such as parents leveraging a tool using their smartphone, [have] more potential for error and the introduction of noise into the data,” Pierce cautions.
More broadly, some scientists are skeptical whether any of these technologies can make a significant difference for screening and care delays anyway. “I don’t think they’ll relieve the bottleneck surrounding autism diagnosis at all,” says Catherine Lord, distinguished professor of psychiatry at the University of California, Los Angeles, who did not take part in any of these studies and is creator of the Autism Diagnostic Observation Schedule, the gold-standard method clinicians use to diagnose autism.
One of Lord’s concerns, she says, is that technology cannot substitute for working with a clinician. “This is a diagnosis of a lifelong condition, and if you look at the level of disinformation on the internet, you want a clinician there to help families figure out what they are learning.”
And even with easier access to screening tools, Lord predicts that the bottleneck would simply shift from obtaining a diagnosis to accessing treatment. She points out that there are not enough developmental pediatricians or child psychiatrists to serve children with an autism diagnosis.
The ultimate value of earlier detection is also debatable, says Connie Kasari, distinguished professor of human development and psychology at the University of California, Los Angeles, who did not take part in any of the device research. “To date, we do not have strong evidence of very early intervention approaches for under 12 months,” she says. “There are more options during the toddler years, but effects on child development are also mixed.”
Developers will also need to address several questions about the reliability of these tools in providing measurements, Lord says. She expresses concern about how replicable scores are in individual children and whether a tool’s performance will change when used across races and genders.
The tool creators say they agree with many of these concerns — though each device requires different next steps. Tests of these devices should include toddlers with a wide range of developmental conditions, not just autism, Pierce says. This can help to probe how specific their findings are to autism and avoid confusing one group with another, she says.
This kind of investigation is already in the works for SenseToKnow, Dawson says. Her team is examining how co-occurring conditions such as attention-deficit/hyperactivity disorder influence behaviors the app measures. They are also assessing the tool’s ability to detect autism in infants as young as 6 months.
If device developers can answer more of these questions, they might win over some critics — at least in part. Lord, for example, says she can imagine tools that someday help experts standardize measures of behavior related to autism. But, she adds, only provided they are rigorously tested first.