Adrien Peyrache

Associate professor of neurology and neurosurgery
McGill University’s Montreal Neurological Institute

Adrien Peyrache is an associate professor of neurology and neurosurgery at McGill University’s Montreal Neurological Institute, which he joined in 2016. He holds the Canada Research Chair in Systems Neuroscience and has contributed significantly to the understanding of memory and spatial navigation, particularly how neurons coordinate their activity during sleep to support memory formation.

As an active proponent of open science, Peyrache co-founded the selection committee of The Neuro – Irv and Helga Cooper Foundation Open Science Prize and chaired it until 2022. He serves on the reviewing editorial board of eLife, and in 2024 he co-founded the Quebec Sleep Research Network, which he co-directs.

Peyrache completed his undergraduate studies at ESPCI–Paris Sciences et Lettres University and obtained a master’s degree in cognitive science at the École Normale Supérieure, followed by a Ph.D. in neuroscience at the Collège de France. For his postdoctoral training, he first worked with Alain Destexhe at the CNRS and then joined the lab of György Buzsáki at New York University.
