Robots come to the rescue in sensory processing studies

Robots that help children with autism become more socially engaged may also increase understanding of sensory processing in the disorder, suggests unpublished research presented today at the 2014 Society for Neuroscience annual meeting in Washington, D.C.

By Sarah DeWeerdt
20 November 2014 | 3 min read

Friendly face: Researchers are investigating why robots ease social interactions for some children with autism. (University of Hertfordshire)


Children who find interacting with their peers overwhelming sometimes find interacting with a so-called ‘social robot’ more enjoyable. Researchers aren’t sure why this is, but the new study offers a method to explore the question.

Researchers showed 10 adults a series of video clips featuring a head shake from either a robot or a young woman, accompanied by a soundtrack of a robot or human voice saying “no.”

In one version of each video, the soundtrack is precisely synchronized with the head shake. Other versions have the “no” begin up to 400 milliseconds before or after the head shake.

Participants watched 600 of these video clips, pressing a button on a computer keyboard each time to indicate whether the clip was in sync. They can generally spot when a video is out of sync unless the head shake and the “no” begin within about 150 milliseconds of each other, the study found.

That’s roughly consistent with previous studies. This period within which people perceive two separate events as synchronous is known as the temporal binding window.
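To make the idea concrete, here is a minimal sketch of how a temporal binding window might be estimated from synchrony judgments like these. The data and the threshold-crossing method are illustrative assumptions, not the study’s actual analysis, which the researchers did not describe in detail.

```python
# Hypothetical synchrony-judgment data: audio-visual offset in ms
# (negative = the "no" begins before the head shake) -> fraction of
# trials on which viewers reported the clip as "in sync".
judgments = {
    -400: 0.05, -300: 0.10, -200: 0.25, -100: 0.75,
       0: 0.95,  100: 0.75,  200: 0.25,  300: 0.10,  400: 0.05,
}

def binding_window(data, threshold=0.5):
    """Width of the offset range judged synchronous more often than
    `threshold`, using linear interpolation between tested offsets."""
    offsets = sorted(data)
    crossings = []
    for a, b in zip(offsets, offsets[1:]):
        pa, pb = data[a], data[b]
        if (pa - threshold) * (pb - threshold) < 0:  # threshold crossed
            # interpolate the offset at which judgments hit threshold
            crossings.append(a + (threshold - pa) * (b - a) / (pb - pa))
    return max(crossings) - min(crossings)

print(round(binding_window(judgments)))  # full window width in ms: 300
```

With these made-up numbers the window spans about 150 milliseconds on either side of true synchrony, in line with the figure reported above; a wider window, as hypothesized for people with autism, would show up as a broader plateau in the judgment curve.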

Robot speech and human speech sound different to the ear, and the brain processes them differently as well. The participants have a slightly wider temporal binding window for robot speech than they do for human speech, though this difference is not statistically significant. A larger sample size may be necessary to show a clearer difference, the researchers say.

The real question is how the temporal binding windows for human and robot speech will differ for people with autism.

When it comes to human speech, people with autism are known to have a wider temporal binding window than controls do, meaning that there is a broader span of time within which they perceive events as happening simultaneously.

This larger window may contribute to the sensory sensitivities in autism, says study leader Diana Sarko, course director of neuroscience at Edward Via College of Osteopathic Medicine in Spartanburg, South Carolina. “Things are coming at them and they’re binding them together when they shouldn’t be bound,” she says.

Sarko hypothesizes that individuals with autism will show a pattern of temporal binding that is opposite to that of controls: a smaller temporal binding window for robotic speech than for human speech. This might be one reason why some children with autism feel more at ease with robots than they do with their peers.

It might also provide an opportunity for intervention. The researchers aim to use social robots to train children to narrow their temporal binding window, which could in turn improve their social interactions with people.

The team plans to begin a study of temporal binding windows in 30 to 50 adolescents with autism within the next couple of months.
