Theoretical neuroscience
Recent articles
Thinking about thinking: AI offers theoretical insights into human memory
We need a new conceptual framework for understanding cognitive functions—particularly how globally distributed brain states are formed and maintained for hours.

Breaking the barrier between theorists and experimentalists
Many neuroscience students are steeped in an experiment-first style of thinking that leads to “random walk science.” Let’s not forget how theory can guide experiments toward deeper insights.

Future watch: What should neuroscience prioritize during the next 10 to 20 years?
For The Transmitter’s first annual book, five contributing editors reflect on what subfields demand greater focus in the near future—from dynamical systems and computation to technologies for studying the human brain.

What are recurrent networks doing in the brain?
The cortex is filled with excitatory local synapses, but we know little about their role in brain function. New experimental tools, along with ideas from artificial intelligence, are poised to change that.

Computational and systems neuroscience needs development
Embracing recent advances in developmental biology can drive a new wave of innovation.

Must a theory be falsifiable to contribute to good science?
Four researchers debate the role that non-testable theories play in neuroscience.

Explore more from The Transmitter
Sharing Africa’s brain data: Q&A with Amadi Ihunwo
These data are “virtually mandatory” to advance neuroscience, says Ihunwo, a co-investigator of the Brain Research International Data Governance & Exchange (BRIDGE) initiative, which seeks to develop a global framework for sharing, using and protecting neuroscience data.

Cortical structures in infants linked to future language skills; and more
Here is a roundup of autism-related news and research spotted around the web for the week of 19 May.

The BabyLM Challenge: In search of more efficient learning algorithms, researchers look to infants
A competition that trains language models on relatively small datasets of words—closer in size to what a child hears by age 13—seeks solutions to some of the major challenges facing today’s large language models.
