
Robert Froemke

Skirball Foundation Professor of Genetics
NYU Grossman School of Medicine

Robert Froemke is the Skirball Foundation Professor of Genetics in the Neuroscience Institute and the departments of otolaryngology and neuroscience at NYU Grossman School of Medicine. His lab studies neuromodulation, plasticity and behavior in rodents and humans. Froemke has a background in systems neuroscience, having performed his Ph.D. work with Yang Dan at the University of California, Berkeley, on spike-timing-dependent plasticity induced by natural spike trains in cortical networks. His postdoctoral research with Christoph Schreiner at the University of California, San Francisco, focused on synaptic plasticity in vivo as it relates to auditory perception and behavior.

Froemke started his faculty position at NYU Grossman School of Medicine in 2010. He studies the synaptic mechanisms by which sounds acquire meaning, with a focus on oxytocin, maternal behavior and the use of neuroprosthetic devices such as cochlear implants. For this work, he has been awarded Sloan and Klingenstein Fellowships and Pew and McKnight Scholarships. In 2021, Froemke received a Landis Award for Outstanding Mentorship from the National Institute of Neurological Disorders and Stroke.
