Crawling chimera: A nematode worm brain can control a fruit fly body, a new preprint finds.
Brunton and Abe et al.

‘Digital sphinx’ raises questions about connectome models

The sphinx, with a worm’s brain and a fly’s body, illustrates the potential pitfalls of using deep-learning techniques to model biological processes.

By Natalia Mesa
2 April 2026 | 5 min read

A neural network based on a nematode worm’s connectome can puppeteer a digital fruit fly’s body, a new preprint shows. The work comes just two weeks after Eon Systems, a neurotechnology company based in San Francisco, announced that it had “uploaded” a fly brain and released a video of that brain controlling a biomechanical fly model in a virtual world.

“We need to be really careful in interpreting this kind of work,” says Bing Wen Brunton, professor of biology at the University of Washington, who posted the new preprint on bioRxiv last week in response to Eon Systems’ announcement. Working with her team, Brunton coupled a biophysical model of a Drosophila body to a simulation of the Caenorhabditis elegans connectome and trained that “digital sphinx,” a term coined in the preprint, to walk using deep reinforcement learning—all with a “brain” that wasn’t a fly brain at all.

Brunton’s work points to an important control for other researchers looking to combine deep learning and connectomics to simulate fly behavior, says Benjamin Cowley, assistant professor at Cold Spring Harbor Laboratory, who was not involved in the preprint. It enables them to ask, “If I just created a randomly connected connectome, could it also do the same behaviors?” he says.

The problem with connectome models is that they do not capture the biophysical properties of neurons or the pools of neurotransmitters that modulate neural communication, and they exist without a body, says Srinivas Turaga, group leader at the Howard Hughes Medical Institute’s Janelia Research Campus, who was not involved in the work.

To circumvent these shortcomings, researchers are starting to use deep reinforcement learning to relate the connectomes to behavior. But using these techniques to model biological processes also has pitfalls, as illustrated by the model Brunton and her team created. 

Deep reinforcement learning is a process of optimization, Brunton says, and biological systems don’t always work optimally. The approach can capture fly behavior “really well,” she says, even when the model isn’t biologically realistic.

Eon assembled its fly from three previously published datasets: a biophysical model of the fly body called NeuroMechFly, a fly brain connectome, and part of the fly visual system. The company also used deep reinforcement learning to stitch the pieces together, training the networks to emulate a walking fly. But the Eon video quickly received pushback, and Turaga says that any random network connected to the NeuroMechFly model in this way might generate a walking fly. Because Eon hasn’t published the specifics of how it built its fly, it’s unclear if the model is any more accurate than a random network, Turaga says.

Philip Shiu, head of engineering at Eon Systems, doesn’t entirely disagree. “I think it’s fair to say that this is not a full-blown copy of a fly. My personal preference might be to say maybe we ought to call this a digital twin or an embodied model,” he says. “Obviously, part of the intention of the company was to say: This is something that’s really exciting and cool and not science fiction as it has been in the past.”

Even a “small network of 300 neurons” contains enough information for deep learning to extract patterns that drive realistic behaviors, Cowley adds. “There’s enough randomness in this network that you can map it to fly legs and make them move in a reasonable way.” 

Brunton was familiar with Eon’s work before it was released, she says; early last year, the company approached her and her colleague, John Tuthill, professor of neurobiology and biophysics at the University of Washington and an investigator on the preprint, in hopes of collaborating, though that never came to fruition. Brunton says she also saw Eon present a poster on its virtual fly at last year’s Society for Neuroscience conference.

Of two minds: The virtual fly body sends sensory information to the worm connectome, and deep reinforcement learning trains an artificial neural network to map worm motor neuron activations to the fly’s leg muscles.
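The pipeline the caption describes can be sketched in a few lines: body sensors feed a fixed "connectome" network, and a trainable readout maps neuron activity to leg muscles. Everything below is a hypothetical stand-in, not the preprint's actual code: the sizes, names, and a random sparse matrix in place of the real C. elegans wiring diagram (which, as the article notes, is exactly the control condition at issue).

```python
import numpy as np

rng = np.random.default_rng(0)

N_NEURONS = 300   # roughly the size of the C. elegans nervous system
N_SENSORS = 20    # hypothetical number of fly body sensory channels
N_MUSCLES = 18    # hypothetical number of leg actuators (6 legs x 3)

# Stand-in "connectome": a fixed sparse random weight matrix. In the
# preprint's setup this would come from the worm's wiring diagram.
W = rng.normal(0, 1 / np.sqrt(N_NEURONS), (N_NEURONS, N_NEURONS))
W *= rng.random((N_NEURONS, N_NEURONS)) < 0.1   # ~10% connectivity

W_in = rng.normal(0, 1.0, (N_NEURONS, N_SENSORS))    # sensors -> neurons
W_out = rng.normal(0, 0.1, (N_MUSCLES, N_NEURONS))   # trainable readout

def step(state, sensory):
    """One timestep: rate dynamics through the fixed connectome,
    then a linear readout to muscle activations."""
    state = np.tanh(W @ state + W_in @ sensory)
    muscles = W_out @ state   # reinforcement learning would tune W_out
    return state, muscles

state = np.zeros(N_NEURONS)
sensory = rng.normal(size=N_SENSORS)
state, muscles = step(state, sensory)
print(muscles.shape)   # (18,)
```

In this sketch only the readout is trainable and the network weights stay fixed, which mirrors the worry Cowley raises above: a sufficiently rich random network, plus an optimizer, can be mapped onto fly legs and made to move them in a reasonable way.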

She had also reviewed several grant applications that proposed similar projects, so she was aware that “lots of people have had this idea,” she says. Brunton and her team kick-started the digital sphinx project only a few months ago, in December. But when Eon’s video came out, Brunton rushed to include her team’s digital sphinx in a presentation she gave at the Computational and Systems Neuroscience (COSYNE) annual meeting last month, and spent the flight to the conference working on the preprint.

All of this activity “just means that it’s a field where there’s a lot happening and people are excited about it,” Tuthill says. “But we see our role as calming the waters so that we can stay grounded.”

Brunton agrees with that assessment and adds that their paper is a caution flag for the growing number of people building and interpreting these models. “For lack of a better term,” she says, “there’s so much BS out there.” 
