Evan Schaffer is assistant professor of neuroscience at the Friedman Brain Institute at the Icahn School of Medicine at Mount Sinai. His lab uses mathematical tools to understand distributed computations in the brain, how these computations change with learning and how feedback from the body shapes cognition. Schaffer received his Ph.D. from the Center for Theoretical Neuroscience at Columbia University, working in Larry Abbott’s lab, and completed his postdoctoral work in Richard Axel’s lab, also at Columbia University.

Evan Schaffer
Assistant professor of neuroscience
Icahn School of Medicine at Mount Sinai
Selected articles
- “Inhibitory stabilization of the cortical network underlies visual surround suppression” | Neuron
- “A complex-valued firing-rate model that approximates the dynamics of spiking networks” | PLoS Computational Biology
- “Odor perception on the two sides of the brain: Consistency despite randomness” | Neuron
- “The spatial and temporal structure of neural activity across the fly brain” | Nature Communications
- “Behavioral fingerprinting of the naked mole-rat uncovers signatures of eusociality and social touch” | bioRxiv
Explore more from The Transmitter
Sharing Africa’s brain data: Q&A with Amadi Ihunwo
These data are “virtually mandatory” to advance neuroscience, says Ihunwo, a co-investigator of the Brain Research International Data Governance & Exchange (BRIDGE) initiative, which seeks to develop a global framework for sharing, using and protecting neuroscience data.

Cortical structures in infants linked to future language skills; and more
Here is a roundup of autism-related news and research spotted around the web for the week of 19 May.

The BabyLM Challenge: In search of more efficient learning algorithms, researchers look to infants
A competition that trains language models on relatively small datasets of words, closer in size to what a child hears up to age 13, seeks solutions to some of the major challenges of today’s large language models.
