Nico Dosenbach is associate professor of neurology at Washington University School of Medicine. As a systems neuroscientist, his research focuses on pushing resting-state functional connectivity MRI (RSFC), functional MRI (fMRI) and diffusion tensor imaging (DTI) to the level of individual patients. To create and annotate the connectomes of individuals, he is working to improve the signal-to-noise ratio, spatial resolution and replicability of RSFC, DTI and fMRI data.

Nico Dosenbach
Associate professor of neurology
Washington University School of Medicine
From this contributor
Breaking down the winner’s curse: Lessons from brain-wide association studies
We found an issue with a specific type of brain imaging study and tried to share it with the field. Then the backlash began.
Explore more from The Transmitter
Sharing Africa’s brain data: Q&A with Amadi Ihunwo
These data are “virtually mandatory” to advance neuroscience, says Ihunwo, a co-investigator of the Brain Research International Data Governance & Exchange (BRIDGE) initiative, which seeks to develop a global framework for sharing, using and protecting neuroscience data.
Cortical structures in infants linked to future language skills; and more
Here is a roundup of autism-related news and research spotted around the web for the week of 19 May.
The BabyLM Challenge: In search of more efficient learning algorithms, researchers look to infants
A competition that trains language models on relatively small datasets of words, closer in size to what a child hears up to age 13, seeks solutions to some of the major challenges of today’s large language models.