Headshot of Ashley Juavinett.

Ashley Juavinett

Associate teaching professor of neurobiology
University of California, San Diego

Ashley Juavinett is an associate teaching professor of neurobiology at the University of California, San Diego, where she also co-directs STARTneuro, a program funded through the National Institutes of Health's Blueprint ENDURE (Enhancing Neuroscience Diversity through Undergraduate Research Education Experiences) initiative. Through her work and writing, she seeks to understand the best ways to train the next generation of neuroscientists. A significant part of this effort is building resources to make such training more accessible and effective.

Juavinett completed her Ph.D. with Edward Callaway at the Salk Institute for Biological Studies in San Diego, California, investigating the cell types and circuits underlying visual perception in mice. She then conducted postdoctoral research with Anne Churchland at Cold Spring Harbor Laboratory in New York, advancing ethological approaches to understanding behavior as well as cutting-edge ways of recording from freely moving animals.

Juavinett is the author of “So You Want to Be a Neuroscientist?,” an accessible guide to the field for aspiring researchers, and she has previously written for the Simons Collaboration on the Global Brain.
