Elizabeth Preston is a science writer and editor in the Boston area. She has written for The Atlantic, Wired, Jezebel and the Boston Globe, among other publications. Her blog, Inkfish, is published by Discover.

Elizabeth Preston
From this contributor
Test paints quick picture of intelligence in autism
A picture-based test is a fast and flexible way to assess intelligence in large studies of people with autism.

New atlases chart early brain growth in monkeys
A collection of brain scans from monkeys aged 2 weeks to 12 months reveals how their brain structures and nerve tracts develop over time.

Work in progress: An inside look at autism’s job boom
Splashy corporate initiatives aim to hire people with autism, but finding and keeping work is still a struggle for those on the spectrum. Can virtual avatars and for-profit startups help?

Explore more from The Transmitter
Sharing Africa’s brain data: Q&A with Amadi Ihunwo
These data are “virtually mandatory” to advance neuroscience, says Ihunwo, a co-investigator of the Brain Research International Data Governance & Exchange (BRIDGE) initiative, which seeks to develop a global framework for sharing, using and protecting neuroscience data.

Cortical structures in infants linked to future language skills; and more
Here is a roundup of autism-related news and research spotted around the web for the week of 19 May.

The BabyLM Challenge: In search of more efficient learning algorithms, researchers look to infants
A competition that trains language models on relatively small datasets of words, closer in size to what a child hears up to age 13, seeks solutions to some of the major challenges of today’s large language models.
