Matthew Judson is a research associate in the UNC Neuroscience Center at the University of North Carolina at Chapel Hill.

Matthew Judson
Research associate
University of North Carolina at Chapel Hill
From this contributor
Angelman syndrome: Bellwether for genetic therapy in autism
It is not a matter of whether there will be clinical trials of genetic therapy for Angelman syndrome, but when.

Insights for autism from Angelman syndrome
Deletions and duplications of the UBE3A gene lead to Angelman syndrome and to some cases of autism, respectively. Studying the effects of altered gene dosage in this region will provide insights into brain defects and suggest therapeutic targets for both disorders, says expert Benjamin Philpot.

Explore more from The Transmitter
Sharing Africa’s brain data: Q&A with Amadi Ihunwo
These data are “virtually mandatory” to advance neuroscience, says Ihunwo, a co-investigator of the Brain Research International Data Governance & Exchange (BRIDGE) initiative, which seeks to develop a global framework for sharing, using and protecting neuroscience data.

Cortical structures in infants linked to future language skills; and more
Here is a roundup of autism-related news and research spotted around the web for the week of 19 May.

The BabyLM Challenge: In search of more efficient learning algorithms, researchers look to infants
A competition that trains language models on relatively small datasets of words, closer in size to what a child hears up to age 13, seeks solutions to some of the major challenges of today’s large language models.
