Jacob Zavatone-Veth awarded NIH High-Risk, High-Reward grant

Jacob Zavatone-Veth, a Junior Fellow in the Harvard Society of Fellows, has received a Director’s Early Independence Award from the National Institutes of Health. The award is part of the NIH’s High-Risk, High-Reward Research program that supports scientists who propose “visionary and broadly impactful” behavioral and biomedical research projects. 

The Early Independence Award gives exceptional early-career scientists the opportunity to skip traditional postdoctoral training and move immediately into independent research positions. Zavatone-Veth was one of 12 awardees nationwide this year. 

An affiliate of Harvard’s Center for Brain Science, Zavatone-Veth is broadly focused on the theory of neural computation, with particular emphasis on how representations and dynamics are learned. 

Recent technical advances have enabled scientists to record the simultaneous activity of hundreds of thousands of neurons. Extracting insight from this deluge of data has required computational neuroscientists like Zavatone-Veth to use data-driven modeling approaches, including the use of artificial recurrent neural networks.
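As a toy illustration of the data-driven modeling idea (a minimal sketch, not the specific methods used by Zavatone-Veth), one can fit the recurrent weights of a simple linear rate network directly from recorded activity using least squares:

```python
import numpy as np

# Hypothetical example: recover the recurrent weight matrix of a linear
# network, x_{t+1} = W x_t + noise, from observed activity snapshots alone.
rng = np.random.default_rng(0)
n = 20  # number of recorded "neurons" (toy scale; real recordings are far larger)
W_true = rng.normal(scale=0.9 / np.sqrt(n), size=(n, n))  # stable dynamics

# Simulate the network to generate synthetic "recordings".
T = 500
X = np.zeros((T, n))
X[0] = rng.normal(size=n)
for t in range(T - 1):
    X[t + 1] = W_true @ X[t] + 0.01 * rng.normal(size=n)

# Data-driven fit: least-squares regression of each state on its predecessor.
# lstsq solves X[:-1] @ B = X[1:], so the estimated weight matrix is B.T.
W_hat = np.linalg.lstsq(X[:-1], X[1:], rcond=None)[0].T

err = np.max(np.abs(W_hat - W_true))  # recovery error is small for this toy
```

Real analyses use nonlinear trained recurrent networks and far noisier data, which is precisely where, as the next paragraph notes, theoretical understanding is still lacking.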

The NIH award will support Zavatone-Veth’s work on uncovering the dynamics of computation in trained recurrent neural networks, an area in which the field still lacks a firm theoretical understanding. His goal is to build a complete theory of how network architecture and training procedures interact to bias how a recurrent neural network functions. 

“In total, this research will elucidate the limitations of one of the most popular approaches for extracting understanding from large-scale neural data and advance our basic understanding of how recurrent computations are learned,” wrote Zavatone-Veth in his project proposal. 

Zavatone-Veth was first introduced to neuroscience during his undergraduate work in physics at Yale, where he studied fruit fly visual motion detection and locomotor coordination with Damon Clark. He came to Harvard for his Ph.D.; his doctoral work with Cengiz Pehlevan applied tools from statistical physics to investigate the structure of learned representations in natural and artificial neural networks. 

More information on this year’s award recipients.