Xujun Duan is a professor of biomedical engineering at the University of Electronic Science and Technology of China, Chengdu, China. She received her PhD in biomedical engineering from the University of Electronic Science and Technology of China and completed a joint PhD program at Stanford University under the supervision of Dr. Vinod Menon. Her long-term research goal is to understand how brain anatomy, function and connectivity are altered in autism spectrum disorder (ASD), and how they vary across the population, using multimodal brain-imaging techniques and computational methods.
Xujun Duan
Professor of biomedical engineering
University of Electronic Science and Technology of China
From this contributor
Magnetic stimulation for autism: Q&A with Xujun Duan
A new individualized approach to transcranial magnetic stimulation may one day be an effective treatment for social and communication difficulties, if the results from Duan’s small preliminary trial pan out.
Explore more from The Transmitter
Sharing Africa’s brain data: Q&A with Amadi Ihunwo
These data are “virtually mandatory” to advance neuroscience, says Ihunwo, a co-investigator of the Brain Research International Data Governance & Exchange (BRIDGE) initiative, which seeks to develop a global framework for sharing, using and protecting neuroscience data.
Cortical structures in infants linked to future language skills; and more
Here is a roundup of autism-related news and research spotted around the web for the week of 19 May.
The BabyLM Challenge: In search of more efficient learning algorithms, researchers look to infants
A competition that trains language models on relatively small datasets of words, closer in size to what a child hears up to age 13, seeks solutions to some of the major challenges of today’s large language models.