Timothy O’Leary

Professor of information engineering and neuroscience
University of Cambridge

Timothy O’Leary is professor of information engineering and neuroscience at the University of Cambridge. His research lies at the intersection of physiology, computation and control engineering. His goal is to understand how nervous systems self-organize, adapt and fail, and to connect these processes to the diversity and variability found in nervous system properties.

Originally trained as a pure mathematician, O’Leary dropped out of a Ph.D. on hyperbolic geometry to study the brain. After retraining as an experimental physiologist, he obtained his doctorate in experimental and computational neuroscience from the University of Edinburgh in 2009.

He has worked as both an experimentalist and a theoretician, on systems spanning scales from single ion channel dynamics to whole-brain activity and behavior, and across invertebrate and vertebrate species. His group works closely with experimentalists to study neuromodulation, neural dynamics and how sensorimotor information is represented in the brain, more recently focusing on how neural representations evolve over time. He approaches these problems from an unusual perspective, citing engineering principles as key to understanding the brain—and biology more widely.
