Anna Devor

Professor of biomedical engineering
Boston University

Anna Devor is a professor of biomedical engineering at Boston University (BU), associate director of the BU Neurophotonics Center, and editor-in-chief of the journal Neurophotonics, published by the optical engineering society SPIE.

Devor’s lab, the Neurovascular Imaging Laboratory, specializes in imaging neuronal, glial, vascular and metabolic activity in the brains of living, behaving experimental animals. Her research focuses on understanding fundamental neurovascular and neurometabolic principles of brain activity and the mechanistic underpinnings of noninvasive brain imaging signals. She also works on imaging stem-cell-derived human neuronal networks.

Devor received her Ph.D. in neuroscience from the Hebrew University of Jerusalem. After completing her postdoctoral training in neuroimaging at the Athinoula A. Martinos Center for Biomedical Imaging, she established her own lab at the University of California, San Diego, before moving it to BU in 2020. She maintains a wide network of collaborators around the world and has extensive experience leading large, multidisciplinary teams.
