
$28M challenge to figure out why brains are so good at learning


IARPA grant could bring artificial intelligence closer to reality

The U.S. federal government has awarded more than $28 million to Harvard’s John A. Paulson School of Engineering and Applied Sciences (SEAS), Center for Brain Science (CBS), and Department of Molecular and Cellular Biology to develop advanced machine learning algorithms by pushing the frontiers of neuroscience.

The Intelligence Advanced Research Projects Activity (IARPA) funds large-scale research programs that address the most difficult challenges facing the intelligence community.

Intelligence agencies today are inundated with data — more than they are able to analyze in a reasonable amount of time. Humans, naturally good at recognizing patterns, can’t keep pace with the influx of new information. The pattern-recognition and learning abilities of machines, meanwhile, still pale in comparison to even the simplest mammalian brains.

IARPA’s challenge: figure out why brains are so good at learning, and use that information to design computer systems that can interpret, analyze, and learn information as successfully as humans. To tackle this, Harvard researchers will record activity in the brain’s visual cortex in unprecedented detail, map its connections at a scale never before attempted, and reverse-engineer the data to inspire better computer algorithms for learning.

“This is a moonshot challenge, akin to the Human Genome Project in scope,” said project leader David Cox, assistant professor of molecular and cellular biology and computer science. “The scientific value of recording the activity of so many neurons and mapping their connections alone is enormous, but that is only the first half of the project. As we figure out the fundamental principles governing how the brain learns, it’s not hard to imagine that we’ll eventually be able to design computer systems that can match, or even outperform, humans.”


These systems could be designed to detect network invasions, read MRI images, drive cars, or anything in between.

The research team tackling this challenge includes Jeff Lichtman, the Jeremy R. Knowles Professor of Molecular and Cellular Biology; Hanspeter Pfister, the An Wang Professor of Computer Science; Haim Sompolinsky, the William N. Skirball Professor of Neuroscience; and Ryan Adams, assistant professor of computer science; as well as collaborators from MIT, Notre Dame, New York University, the University of Chicago, and Rockefeller University.

The multi-stage effort begins in Cox’s lab, where rats will be trained to visually recognize various objects on a computer screen. As the animals are learning, Cox’s team will record the activity of visual neurons using next-generation laser microscopes built for this project with collaborators at Rockefeller, to see how brain activity changes. Then, a substantial portion of the rat’s brain — 1 cubic millimeter in size — will be sent down the hall to Lichtman’s lab, where it will be sliced ultra-thin and imaged under the world’s first multi-beam scanning electron microscope, housed in the Center for Brain Science.

“This is an amazing opportunity to see all the intricate details of a full piece of cerebral cortex,” said Lichtman. “We are very excited to get started but have no illusions that this will be easy.”

This difficult process will generate over a petabyte of data — equivalent to about 1.6 million CDs’ worth of information. This vast trove of data will then be sent to Pfister, whose algorithms will reconstruct cell boundaries, synapses, and connections, and visualize them in three dimensions.
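As a rough sanity check on that comparison (assuming a binary petabyte of 2^50 bytes and a standard 700 MB data CD, figures the article itself does not specify), a few lines of Python reproduce the estimate:

```python
# Quick check of the CD comparison: how many 700 MB CDs hold one petabyte?
PETABYTE_BYTES = 2**50        # binary petabyte (pebibyte), about 1.13e15 bytes
CD_BYTES = 700 * 10**6        # nominal capacity of a standard data CD

cds = PETABYTE_BYTES / CD_BYTES
print(f"about {cds / 1e6:.1f} million CDs")   # prints: about 1.6 million CDs
```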

“This project is not only pushing the boundaries of brain science, it is also pushing the boundaries of what is possible in computer science,” said Pfister. “We will reconstruct neural circuits at an unprecedented scale from petabytes of structural and functional data. This requires us to make new advances in data management, high-performance computing, computer vision, and network analysis.”
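To give a flavor of what that reconstruction involves computationally, the toy sketch below labels connected clumps of foreground voxels in a small synthetic volume; the team’s actual pipeline must do far more (segmenting membranes, detecting synapses, stitching petabyte-scale image stacks), so this is only an illustration of the basic idea.

```python
import numpy as np
from scipy import ndimage

# Toy stand-in for an electron-microscopy volume: a small 3-D grid in which
# voxels above a threshold are treated as stained cell material.
rng = np.random.default_rng(0)
volume = rng.random((64, 64, 64)) > 0.7

# Reconstruction at its very simplest: group touching foreground voxels into
# connected components, each a candidate structure to trace, proofread, and
# eventually render in three dimensions.
labels, num_structures = ndimage.label(volume)
sizes = ndimage.sum(volume, labels, index=range(1, num_structures + 1))

print(f"{num_structures} candidate structures; "
      f"largest spans {int(sizes.max())} voxels")
```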

If the work stopped here, its scientific impact would already be enormous — but it doesn’t. Once researchers know how visual cortex neurons are connected to each other in three dimensions, the next task will be to figure out how the brain uses those connections to quickly process information and infer patterns from new stimuli. Today, one of the biggest challenges in computer science is the amount of training data that deep-learning systems require. For example, to learn to recognize a car, a computer system needs to see hundreds of thousands of cars. But humans and other mammals don’t need to see an object thousands of times to recognize it — they only need to see it a few times.
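One common way to make that few-examples point concrete is a prototype (nearest-class-mean) classifier, sketched below on made-up two-dimensional features with five examples per class; it is an illustration of the few-shot setting, not a description of the project’s methods.

```python
import numpy as np

rng = np.random.default_rng(1)

def make_class(center, n, noise=0.5):
    # Draw n noisy 2-D feature vectors around a class center (hypothetical data).
    return center + noise * rng.standard_normal((n, 2))

# Only five labeled examples ("shots") per class.
support = {
    "car":  make_class(np.array([0.0, 0.0]), n=5),
    "bike": make_class(np.array([3.0, 3.0]), n=5),
}

# A class prototype is simply the mean of its handful of examples.
prototypes = {name: pts.mean(axis=0) for name, pts in support.items()}

def classify(x):
    # Assign x to the class whose prototype is nearest in feature space.
    return min(prototypes, key=lambda name: np.linalg.norm(x - prototypes[name]))

print(classify(np.array([2.8, 3.2])))  # prints: bike
```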

In subsequent phases of the project, researchers at Harvard and their collaborators will build computer algorithms for learning and pattern recognition that are inspired and constrained by the connectomics data. The goal is for these biologically inspired algorithms to outperform current systems at recognizing patterns and making inferences from limited data. Among other things, this research could improve the performance of computer vision systems that help robots see and navigate new environments.
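The article does not say how the connectomics data will constrain those algorithms, but one hypothetical illustration of the idea is to use a measured wiring diagram as a sparsity mask on a network’s weights, so that only anatomically present connections can carry signal:

```python
import numpy as np

rng = np.random.default_rng(2)
n_in, n_out = 100, 50

# Hypothetical binary wiring diagram: True where an anatomical connection
# exists between an input neuron and an output neuron (about 10% of pairs).
connectome_mask = rng.random((n_in, n_out)) < 0.1

# Learnable weights live only where the connectome allows a connection.
weights = rng.standard_normal((n_in, n_out)) * connectome_mask

def layer(x):
    # Forward pass through the connectome-constrained layer; any training
    # update would likewise be multiplied by the mask so that connections
    # absent from the wiring diagram stay exactly zero.
    return np.tanh(x @ (weights * connectome_mask))

activity = layer(rng.standard_normal(n_in))
print(f"{connectome_mask.mean():.0%} of possible connections are used; "
      f"output has {activity.shape[0]} units")
```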

“We have a huge task ahead of us in this project, but at the end of the day, this research will help us understand what is special about our brains,” Cox said. “One of the most exciting things about this project is that we are working on one of the great remaining achievements for human knowledge — understanding how the brain works at a fundamental level.”

The Institute for Applied Computational Science (IACS) symposium on the Future of Computation in Science and Engineering will be held from 9 a.m. to 5 p.m. Friday in Science Center Hall B, 1 Oxford St., Cambridge. It will focus on the converging fields of neuroscience, computer science, and machine learning. For more information, visit IACS.