Mike Hawrylycz joined the Allen Institute for Brain Science in Seattle, Washington, in 2003 as director of informatics and one of the institute’s first staff members. His group develops algorithms and computational approaches for building multimodal brain atlases and for data analysis and annotation. Hawrylycz has worked in a variety of applied mathematics and computer science areas, addressing challenges in consumer and investment finance, electrical engineering and image processing, and computational biology and genomics. He received his Ph.D. in applied mathematics from the Massachusetts Institute of Technology and subsequently was a postdoctoral researcher at the Center for Nonlinear Studies at the Los Alamos National Laboratory in New Mexico.

Michael Hawrylycz
Investigator
Allen Institute for Brain Science
From this contributor
Knowledge graphs can help make sense of the flood of cell-type data
These tools, widely used in the technology industry, could provide a foundation for the study of brain circuits.
Explore more from The Transmitter
Sharing Africa’s brain data: Q&A with Amadi Ihunwo
These data are “virtually mandatory” to advance neuroscience, says Ihunwo, a co-investigator of the Brain Research International Data Governance & Exchange (BRIDGE) initiative, which seeks to develop a global framework for sharing, using and protecting neuroscience data.
Cortical structures in infants linked to future language skills; and more
Here is a roundup of autism-related news and research spotted around the web for the week of 19 May.
The BabyLM Challenge: In search of more efficient learning algorithms, researchers look to infants
A competition that trains language models on relatively small datasets of words, closer in size to what a child hears up to age 13, seeks solutions to some of the major challenges of today’s large language models.