Tychele Turner is an assistant professor of genetics at the Washington University School of Medicine in St. Louis, Missouri, where her lab studies noncoding variation in autism, precision genomics in 9p deletion syndrome, optimization of genomic workflows and the application of long-read sequencing to human genetics.

From this contributor
How long-read sequencing will transform neuroscience
New technology that delivers much more than a simple DNA sequence could have a major impact on brain research, enabling researchers to study transcript diversity, imprinting and more.

Focus on function may help unravel autism’s complex genetics
To find the pathogenic mutations in complex disorders such as autism, researchers may need to conduct sophisticated analyses of the genetic functions that are disrupted, says geneticist Aravinda Chakravarti.

Explore more from The Transmitter
Sharing Africa’s brain data: Q&A with Amadi Ihunwo
These data are “virtually mandatory” to advance neuroscience, says Ihunwo, a co-investigator of the Brain Research International Data Governance & Exchange (BRIDGE) initiative, which seeks to develop a global framework for sharing, using and protecting neuroscience data.

Cortical structures in infants linked to future language skills; and more
Here is a roundup of autism-related news and research spotted around the web for the week of 19 May.

The BabyLM Challenge: In search of more efficient learning algorithms, researchers look to infants
A competition that trains language models on relatively small datasets of words, closer in size to what a child hears by age 13, seeks solutions to some of the major challenges of today’s large language models.
