Large language models
Recent articles
The BabyLM Challenge: In search of more efficient learning algorithms, researchers look to infants
A competition that trains language models on relatively small datasets of words, closer in size to what a child hears up to age 13, seeks solutions to some of the major challenges of today’s large language models.

‘Digital humans’ in a virtual world
By combining large language models with a modular cognitive-control architecture, Robert Yang and his collaborators have built agents capable of grounded reasoning at a linguistic level. Striking collective behaviors have emerged.
Are brains and AI converging?—an excerpt from ‘ChatGPT and the Future of AI: The Deep Language Revolution’
In his new book, to be published next week, computational neuroscience pioneer Terrence Sejnowski tackles debates about AI’s capacity to mirror cognitive processes.

Explore more from The Transmitter
Null and Noteworthy: Learning theory validated 20 years later
The first published paper from the EEGManyLabs replication project nullifies a null result that had complicated a famous reinforcement learning theory.

Neuroscientist Gerry Fischbach, in his own words
In 2023, I had the privilege of sitting down with Gerry over the course of several days and listening as he told the story of his life and career—including stints as dean or director of such leading institutions as Columbia University and NINDS—so that we could record it for posterity.

Amina Abubakar translates autism research and care for Kenya
First an educator and now an internationally recognized researcher, the Kenyan psychologist is changing autism science and services in sub-Saharan Africa.
