Terrence Sejnowski

Francis Crick Chair
Salk Institute for Biological Studies

Terrence Sejnowski holds the Francis Crick Chair at the Salk Institute for Biological Studies. He is also professor of biology at the University of California, San Diego, where he co-directs the Institute for Neural Computation and the NSF Temporal Dynamics of Learning Center. He is president of the Neural Information Processing Systems Foundation, which organizes an annual conference that draws more than 1,000 researchers in machine learning and neural computation, and he is founding editor-in-chief of Neural Computation, published by the MIT Press.

A pioneer in computational neuroscience, Sejnowski seeks to understand the principles that link brain to behavior. His laboratory uses both experimental and modeling techniques to study the biophysical properties of synapses and neurons and the population dynamics of large networks of neurons.

He received his Ph.D. in physics from Princeton University and was a postdoctoral fellow at Harvard Medical School. He was on the faculty at Johns Hopkins University before moving to the University of California, San Diego. He has published more than 300 scientific papers and 12 books, including “The Computational Brain,” co-authored with Patricia Churchland.
