Scientists who study animal movement have long wanted a motion-capture method like the one Hollywood animators use to create spectacular big-screen villains (think Thanos in the “Avengers” films).
Now a Harvard-led team of scientists has made a breakthrough, assembling a new system that combines motion capture and deep learning to continuously track the 3D movements of freely behaving animals. The work, which probes how the brain controls behavior, has the potential to help combat human disease and to advance artificial intelligence.
The system, called continuous appendicular and postural tracking using retroreflector embedding — CAPTURE, for short — delivers what is believed to be an unprecedented look at how animals move and behave naturally, and could one day lead to new understandings of how the brain functions.
“It’s really a technique that will end up informing a lot of neuroscience, psychology, drug discovery, [and disciplines] where questions of behavioral characterization and phenotyping are important,” said Bence Ölveczky, a professor in the Harvard Department of Organismic and Evolutionary Biology.
Until now, no technology has captured the intricate details of an animal’s natural behavior over extended periods. Scientists can record precise movements during simple tasks, like pressing a lever, but the limited range of movements and behaviors they have been able to explore makes it unclear whether the insights gained can lead to a general understanding of brain function.
CAPTURE starts the push beyond those limitations. It uses custom retroreflective markers, attached to an animal like tiny earrings, and a 12-camera array to track the position of the animal’s whole body nonstop. This lets researchers digitally reconstruct the animal’s skeletal pose and measure its normal movements for weeks at a time. From that data, the scientists develop new algorithms that build a foundational map of the animal’s normal behaviors. These behavioral maps can then be compared with maps made when the animal is in an altered state, giving researchers an exact look at even the subtlest differences.
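To make that pipeline concrete, here is a minimal, hypothetical sketch in Python of two steps a CAPTURE-like system might perform: triangulating a marker’s 3D position from several calibrated cameras, and clustering per-frame pose features into a coarse behavioral map. The camera matrices, marker data, and clustering choices below are invented for illustration; this is not the published CAPTURE code.

```python
# Hypothetical sketch of two steps in a CAPTURE-like pipeline:
# (1) triangulate a marker's 3D position from multiple calibrated cameras,
# (2) cluster per-frame pose features into a coarse "behavioral map".
# All cameras, data, and cluster counts are fabricated for illustration.
import numpy as np
from sklearn.cluster import KMeans

def triangulate_marker(proj_mats, pixels):
    """Linear (DLT) triangulation of one marker seen by several cameras.

    proj_mats: list of 3x4 camera projection matrices
    pixels:    list of (u, v) image coordinates, one per camera
    Returns the 3D point that best satisfies all projections.
    """
    rows = []
    for P, (u, v) in zip(proj_mats, pixels):
        rows.append(u * P[2] - P[0])  # each view contributes two
        rows.append(v * P[2] - P[1])  # linear constraints on X
    A = np.vstack(rows)
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]                        # null vector of A (least squares)
    return X[:3] / X[3]               # dehomogenize

def behavioral_map(pose_features, n_behaviors=8):
    """Cluster per-frame pose feature vectors into discrete behavior labels."""
    return KMeans(n_clusters=n_behaviors, n_init=10).fit_predict(pose_features)

# Toy triangulation: two synthetic cameras viewing a point at (1, 2, 5).
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])                 # camera at origin
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])]) # shifted camera
X_true = np.array([1.0, 2.0, 5.0, 1.0])
px = lambda P: (P @ X_true)[:2] / (P @ X_true)[2]             # project to pixels
print(triangulate_marker([P1, P2], [px(P1), px(P2)]))         # ~ [1. 2. 5.]

# Toy behavioral map: random "poses" stand in for weeks of tracked data.
frames = np.random.rand(1000, 60)   # e.g., 20 joints x 3 coordinates per frame
labels = behavioral_map(frames)
print(np.bincount(labels))          # how often each "behavior" occurs
```

In practice a system like this would use many more cameras and far richer pose features, and the clustering step would be replaced by purpose-built behavioral-mapping algorithms; the sketch only illustrates the overall shape of the computation.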
CAPTURE is described in a study recently published in Neuron. Harvard postdoctoral fellow Jesse Marshall led the project, working with Ölveczky, William Wang ’20, and Diego E. Aldarondo, a student at the Graduate School of Arts and Sciences. For now, CAPTURE is designed to work only on rats, but the team plans to extend it to other animals.