David Sassoon is the founder and publisher of InsideClimate News, the nonpartisan, nonprofit news organization that won the Pulitzer Prize for National Reporting in 2013. He has been a writer, editor and publisher for 25 years, working on public interest issues including human rights, cultural preservation, healthcare, education and the environment. In 2003, he began researching the business case for climate action for the Rockefeller Brothers Fund. As an outgrowth of that research, Sassoon founded a blog in 2007 that grew and evolved into InsideClimate News. He earned his undergraduate degree from Harvard University and a master’s degree from Columbia University’s Graduate School of Journalism. He is the author of “Tiny Specks in a Hurry: The Story of a Journey to Mustang.”
Explore more from The Transmitter
Sharing Africa’s brain data: Q&A with Amadi Ihunwo
These data are “virtually mandatory” to advance neuroscience, says Ihunwo, a co-investigator of the Brain Research International Data Governance & Exchange (BRIDGE) initiative, which seeks to develop a global framework for sharing, using and protecting neuroscience data.
Cortical structures in infants linked to future language skills; and more
Here is a roundup of autism-related news and research spotted around the web for the week of 19 May.
The BabyLM Challenge: In search of more efficient learning algorithms, researchers look to infants
A competition that trains language models on relatively small datasets of words, closer in size to what a child hears up to age 13, seeks solutions to some of the major challenges of today’s large language models.