Earl K. Miller is Picower Professor of Neuroscience at the Massachusetts Institute of Technology, with faculty roles in the Picower Institute for Learning and Memory and the Department of Brain and Cognitive Sciences. His lab focuses on neural mechanisms of cognition, especially working memory, attention and executive control, using both experimental and computational methods. He holds a B.A. from Kent State University and an M.A. and Ph.D. from Princeton University. In 2020, he received an honorary Doctor of Science degree from Kent State University.

Earl K. Miller
Professor of neuroscience
Massachusetts Institute of Technology
Selected articles
- “An integrative theory of prefrontal cortex function” | Annual Review of Neuroscience
- “Top-down versus bottom-up control of attention in the prefrontal and posterior parietal cortices” | Science
- “The importance of mixed selectivity in complex cognitive tasks” | Nature
- “Gamma and beta bursts during working memory readout suggest roles in its volitional control” | Nature Communications
Explore more from The Transmitter
Sharing Africa’s brain data: Q&A with Amadi Ihunwo
These data are “virtually mandatory” to advance neuroscience, says Ihunwo, a co-investigator of the Brain Research International Data Governance & Exchange (BRIDGE) initiative, which seeks to develop a global framework for sharing, using and protecting neuroscience data.
Cortical structures in infants linked to future language skills; and more
Here is a roundup of autism-related news and research spotted around the web for the week of 19 May.
The BabyLM Challenge: In search of more efficient learning algorithms, researchers look to infants
A competition that trains language models on relatively small datasets of words, closer in size to what a child hears up to age 13, seeks solutions to some of the major challenges of today’s large language models.