{"id":249509,"date":"2018-08-30T12:50:25","date_gmt":"2018-08-30T16:50:25","guid":{"rendered":"https:\/\/news.harvard.edu\/gazette\/?p=249509"},"modified":"2023-11-08T20:46:45","modified_gmt":"2023-11-09T01:46:45","slug":"an-open-source-ai-tool-available-to-study-movement-across-behaviors-and-species","status":"publish","type":"post","link":"https:\/\/news.harvard.edu\/gazette\/story\/2018\/08\/an-open-source-ai-tool-available-to-study-movement-across-behaviors-and-species\/","title":{"rendered":"Movement monitor"},"content":{"rendered":"<header\n\tclass=\"wp-block-harvard-gazette-article-header alignfull article-header is-style-full-width-text-below centered-image\"\n\tstyle=\" \"\n>\n\t<figure class=\"wp-block-image\"><img fetchpriority=\"high\" decoding=\"async\" alt=\"Rendering of lab animals moving.\" height=\"1667\" loading=\"eager\" src=\"https:\/\/news.harvard.edu\/gazette\/wp-content\/uploads\/2018\/08\/mouserendering.jpg\" width=\"2500\"\/><figcaption class=\"wp-element-caption\"><p class=\"wp-element-caption--caption\">A team of researchers uses artificial intelligence technology to make it far easier than ever before to track animals\u2019 movements in the lab. 
<\/p><p class=\"wp-element-caption--credit\">Images courtesy of DeepLabCut<\/p><\/figcaption><\/figure>\n\n\t<div class=\"article-header__content\">\n\t\t\t<a\n\t\t\tclass=\"article-header__category\"\n\t\t\thref=\"https:\/\/news.harvard.edu\/gazette\/section\/science-technology\/\"\n\t\t>\n\t\t\tScience &amp; Tech\t\t<\/a>\n\t\t\n\t\t<h1 class=\"article-header__title wp-block-heading \">\n\t\tMovement monitor\t<\/h1>\n\n\t\n\t\t\t<\/div>\n\t\t\n\t<div class=\"article-header__meta\">\n\t\t<div class=\"wp-block-post-author\">\n\t\t\t<address class=\"wp-block-post-author__content\">\n\t\t\t\t\t<p class=\"author wp-block-post-author__name\">\n\t\tIsabel Suditsch and Peter Reuell\t<\/p>\n\t\t\t<p class=\"wp-block-post-author__byline\">\n\t\t\tHarvard Correspondent, Harvard Staff Writer\t\t<\/p>\n\t\t\t\t\t<\/address>\n\t\t<\/div>\n\n\t\t<time class=\"article-header__date\" datetime=\"2018-08-30\">\n\t\t\tAugust 30, 2018\t\t<\/time>\n\n\t\t<span class=\"article-header__reading-time\">\n\t\t\t4 min read\t\t<\/span>\n\t<\/div>\n\n\t\n\t\t\t<h2 class=\"article-header__subheading wp-block-heading\">\n\t\t\tAn open-source AI tool for studying movement across behaviors and species\t\t<\/h2>\n\t\t\n<\/header>\n\n\n\n<div class=\"wp-block-group alignwide has-global-padding is-content-justification-center is-layout-constrained wp-block-group-is-layout-constrained\">\n\n\n\t\t<p>Understanding the brain, in part, means understanding how behavior is created.<\/p>\n<p>Reverse-engineering how neural circuits drive behavior requires accurate and rigorous tracking of that behavior, yet the increasingly complex tasks animals perform in the laboratory have made such tracking challenging.<\/p>\n<p>Now, a team of researchers from the Rowland Institute at Harvard, Harvard University, and the University of T\u00fcbingen is turning to artificial intelligence technology to solve the problem.<\/p>\n<p>The software they developed, dubbed DeepLabCut, harnesses new learning techniques to track features from the 
digits of mice, to egg-laying behavior in Drosophila, and beyond. The work is described in an Aug. 20 <a href=\"https:\/\/www.nature.com\/articles\/s41593-018-0209-y\">paper published in Nature Neuroscience<\/a>.<\/p>\n<p>The software is the brainchild of Mackenzie Mathis, a Rowland Fellow at the Rowland Institute at Harvard; Alexander Mathis, a postdoctoral fellow working in the lab of Venkatesh N. Murthy, professor of molecular and cellular biology and chair of the Department of Molecular and Cellular Biology; and Matthias Bethge, a professor at the University of T\u00fcbingen and chair of the Bernstein Center for Computational Neuroscience T\u00fcbingen.<\/p>\n<p>The notion of using software to track animal movements was born partly of necessity. Both Mackenzie and Alexander Mathis had tried using traditional techniques, which typically involve placing tracking markers on animals and using heuristics such as object segmentation, with mixed success.<\/p>\n<p>Such techniques are often sensitive to the choice of analysis parameters, and markers or tattoos are invasive and can hinder natural behaviors, or may be impossible to place on very small or wild animals, they said.<\/p>\n<p>Luckily, international competitions in recent years have driven advances in computer vision and the development of new algorithms capable of human pose-estimation (automatically tracking human body parts).<\/p>\n<p>Such algorithms, however, are widely seen as data-hungry, requiring thousands of labeled examples for the artificial neural network to learn. 
This is prohibitively large for typical laboratory experiments, and would require days of manual labeling for each behavior.<\/p>\n<p>The solution came in what is called \u201ctransfer learning,\u201d or applying an already-trained network to a different problem, similar to the way scientists believe biological systems learn.<\/p>\n<p>Using a state-of-the-art algorithm for tracking human movement called DeeperCut, the Mathises were able to show that deep learning could be highly data-efficient. The new software\u2019s name is a nod to DeeperCut\u2019s authors.<\/p>\n\r\n<figure class=\"wp-block-group wp-block-table alignwide is-layout-flow wp-block-group-is-layout-flow\">\n<div class=\"wp-block-columns alignwide are-vertically-aligned-top media-cluster is-layout-flex wp-container-core-columns-is-layout-28f84493 wp-block-columns-is-layout-flex\">\n\t\t\t\t\n\t\t\t<div class=\"wp-block-column is-vertically-aligned-top is-layout-flow wp-block-column-is-layout-flow\">\n\t\t\t<figcaption class=\"wp-block-group wp-element-caption is-layout-flow wp-block-group-is-layout-flow\"><\/figcaption>\r\n\t\t\t<\/div>\n\t\t\t\n\t\t\t\t\t\n\t\t\t<div class=\"wp-block-column is-vertically-aligned-top is-layout-flow wp-block-column-is-layout-flow\">\n\t\t\t\t\n\n\t<figure class=\"wp-block-image alignnone  size-full is-resized\"><img loading=\"lazy\" decoding=\"async\" width=\"600\" height=\"475\" src=\"https:\/\/news.harvard.edu\/gazette\/wp-content\/uploads\/2018\/08\/ezgif.com-optimize-1.gif\" alt=\"\" class=\"wp-image-261768\"><\/figure>\n\t\n\t\r\n\t\t\t<\/div>\n\t\t\t\n\t\t<\/div>\n\n<\/figure>\r\n\n\r\n<figure class=\"wp-block-group wp-block-table alignwide is-layout-flow wp-block-group-is-layout-flow\">\n<div class=\"wp-block-columns alignwide are-vertically-aligned-top media-cluster is-layout-flex wp-container-core-columns-is-layout-28f84493 wp-block-columns-is-layout-flex\">\n\t\t\t\t\n\t\t\t<div class=\"wp-block-column is-vertically-aligned-top is-layout-flow 
wp-block-column-is-layout-flow\">\n\t\t\t<figcaption class=\"wp-block-group wp-element-caption is-layout-flow wp-block-group-is-layout-flow\"><p class=\"wp-element-caption--caption\">The software tracks the movements of a fly laying eggs and the digits of a mouse.<\/p><\/figcaption>\r\n\t\t\t<\/div>\n\t\t\t\n\t\t\t\t\t\n\t\t\t<div class=\"wp-block-column is-vertically-aligned-top is-layout-flow wp-block-column-is-layout-flow\">\n\t\t\t\t\n\n\t<figure class=\"wp-block-image alignnone  size-full is-resized\"><img loading=\"lazy\" decoding=\"async\" width=\"489\" height=\"280\" src=\"https:\/\/news.harvard.edu\/gazette\/wp-content\/uploads\/2018\/08\/MousereachGIF1.gif\" alt=\"\" class=\"wp-image-249512\"><\/figure>\n\t\n\t\r\n\t\t\t<\/div>\n\t\t\t\n\t\t<\/div>\n\n<\/figure>\r\n\n<p>Just as a child does not need to develop a visual system from scratch to recognize a novel object, but instead draws on thousands of hours of visual experience and adapts it to new objects, DeepLabCut is pretrained on thousands of images of natural objects: hammers, cats, dogs, foods, and more.<\/p>\n<p>With that pretraining in place, the software needed only 100 examples of mice performing an odor-guided navigation experiment to recognize specific mouse body parts as well as humans could.<\/p>\n<p>The team was also able to apply the technology to mice making reaching movements, and, in collaboration with Kevin Cury, a neuroscientist from Columbia University, to flies laying eggs in a 3-D chamber.<\/p>\n<p>\u201cWe were very impressed by the success of the transfer-learning approach and the versatility of DeepLabCut,\u201d Mackenzie Mathis said. 
\u201cWith only a few hundred frames of training data, we were able to get accurate and robust tracking across a myriad of experimental conditions, animals, and behaviors.\u201d<\/p>\n<p>\u201cExperimentalists have very good intuitions about what body parts should be analyzed to study a particular behavior, but traditionally extracting limb coordinates from videos has been very challenging \u2014 DeepLabCut does just that based on a few examples,\u201d Alexander Mathis said. \u201cSince the program is designed as a user-friendly, \u2018plug-and-play\u2019 solution, and does not require any coding skills, it can be widely used.\u201d<\/p>\n<p>\u201cWe want as many researchers as possible to benefit from our work,\u201d said Bethge. \u201cDeepLabCut was created as an open software, as sharing results, data, and also algorithms is essential for scientific progress.\u201d<\/p>\n<p>Even before the paper describing the software was published, the technology had been used by more than 50 labs to study everything from the gait of horses to bacteria dynamics to the movement of surgery robots.<\/p>\n<p>The software toolbox can be used with minimal to no coding experience and is freely available at <a href=\"http:\/\/mousemotorlab.org\/deeplabcut\">mousemotorlab.org\/deeplabcut<\/a>.<\/p>\n<p><em>This study was supported with funding from the Marie Sklodowska-Curie International Fellowship, the Rowland Institute at Harvard, Project ALS Women &amp; the Brain Neuroscience Fellowship, the German Science Foundation (DFG) CRC 1233 on Robust Vision, and IARPA through the MICrONS program.<\/em><\/p>\n\n\n<\/div>\n\n\t\t","protected":false},"excerpt":{"rendered":"<p>A team of researchers from the Rowland Institute at Harvard, Harvard University, and the University of T\u00fcbingen is turning to artificial intelligence technology to make it far easier than ever before to track animals\u2019 movements in the lab. 
<\/p>\n","protected":false},"author":122429419,"featured_media":249516,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"gz_ga_pageviews":16,"gz_ga_lastupdated":"2021-12-16 12:22","document_color_palette":"crimson","author":"Isabel Suditsch and Peter Reuell","affiliation":"Harvard Correspondent, Harvard Staff Writer","_category_override":"","_yoast_wpseo_primary_category":"","_jetpack_memberships_contains_paid_content":false,"footnotes":""},"categories":[1387],"tags":[36780,41094,41090,41091,4836,10484,41089,12941,13050,15359,21061,21079,41093,41092,39956,27327,29235,30024,35144],"gazette-formats":[],"series":[],"class_list":["post-249509","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-science-technology","tag-ai","tag-alexander-mathis","tag-animal","tag-animal-movement","tag-artificial-intelligence","tag-deep-learning","tag-deeplabcut","tag-faculty-of-arts-and-sciences","tag-fas","tag-harvard","tag-lab","tag-laboratory","tag-mackenzie-mathis","tag-mathis","tag-movement","tag-peter-reuell","tag-reuell","tag-rowland-institute","tag-venkatesh-murthy"],"yoast_head":"<!-- This site is optimized with the Yoast SEO Premium plugin v23.0 (Yoast SEO v27.1.1) - https:\/\/yoast.com\/product\/yoast-seo-premium-wordpress\/ -->\n<title>An open-source AI tool available to study movement across behaviors and species &#8212; Harvard Gazette<\/title>\n<meta name=\"description\" content=\"A team of researchers from the Rowland Institute at Harvard, Harvard University, and the University of T\u00fcbingen is turning to artificial intelligence technology to make it far easier than ever before to track animals\u2019 movements in the lab.\" \/>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" 
href=\"https:\/\/news.harvard.edu\/gazette\/story\/2018\/08\/an-open-source-ai-tool-available-to-study-movement-across-behaviors-and-species\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"An open-source AI tool available to study movement across behaviors and species\" \/>\n<meta property=\"og:description\" content=\"A team of researchers from the Rowland Institute at Harvard, Harvard University, and the University of T\u00fcbingen is turning to artificial intelligence technology to make it far easier than ever before to track animals\u2019 movements in the lab.\" \/>\n<meta property=\"og:url\" content=\"https:\/\/news.harvard.edu\/gazette\/story\/2018\/08\/an-open-source-ai-tool-available-to-study-movement-across-behaviors-and-species\/\" \/>\n<meta property=\"og:site_name\" content=\"Harvard Gazette\" \/>\n<meta property=\"article:published_time\" content=\"2018-08-30T16:50:25+00:00\" \/>\n<meta property=\"article:modified_time\" content=\"2023-11-09T01:46:45+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/news.harvard.edu\/gazette\/wp-content\/uploads\/2018\/08\/mouserendering.jpg\" \/>\n\t<meta property=\"og:image:width\" content=\"2500\" \/>\n\t<meta property=\"og:image:height\" content=\"1667\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/jpeg\" \/>\n<meta name=\"author\" content=\"gazettebeckycoleman\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:title\" content=\"An open-source AI tool available to study movement across behaviors and species\" \/>\n<script type=\"application\/ld+json\" 
class=\"yoast-schema-graph\">{\"@context\":\"https:\/\/schema.org\",\"@graph\":[{\"@type\":\"Article\",\"@id\":\"https:\/\/news.harvard.edu\/gazette\/story\/2018\/08\/an-open-source-ai-tool-available-to-study-movement-across-behaviors-and-species\/#article\",\"isPartOf\":{\"@id\":\"https:\/\/news.harvard.edu\/gazette\/story\/2018\/08\/an-open-source-ai-tool-available-to-study-movement-across-behaviors-and-species\/\"},\"author\":{\"name\":\"gazettebeckycoleman\",\"@id\":\"https:\/\/news.harvard.edu\/gazette\/#\/schema\/person\/c6c859c924528563b44146bb17e8949f\"},\"headline\":\"Movement monitor\",\"datePublished\":\"2018-08-30T16:50:25+00:00\",\"dateModified\":\"2023-11-09T01:46:45+00:00\",\"mainEntityOfPage\":{\"@id\":\"https:\/\/news.harvard.edu\/gazette\/story\/2018\/08\/an-open-source-ai-tool-available-to-study-movement-across-behaviors-and-species\/\"},\"wordCount\":774,\"publisher\":{\"@id\":\"https:\/\/news.harvard.edu\/gazette\/#organization\"},\"image\":{\"@id\":\"https:\/\/news.harvard.edu\/gazette\/story\/2018\/08\/an-open-source-ai-tool-available-to-study-movement-across-behaviors-and-species\/#primaryimage\"},\"thumbnailUrl\":\"https:\/\/news.harvard.edu\/wp-content\/uploads\/2018\/08\/mouserendering.jpg\",\"keywords\":[\"A.I.\",\"Alexander Mathis\",\"animal\",\"animal movement\",\"Artificial Intelligence\",\"deep learning\",\"DeepLabCut\",\"Faculty of Arts and Sciences\",\"FAS\",\"Harvard\",\"lab\",\"laboratory\",\"Mackenzie Mathis\",\"Mathis\",\"movement\",\"Peter Reuell\",\"Reuell\",\"Rowland Institute\",\"Venkatesh Murthy\"],\"articleSection\":[\"Science &amp; 
Tech\"],\"inLanguage\":\"en-US\",\"copyrightYear\":\"2018\",\"copyrightHolder\":{\"@id\":\"https:\/\/news.harvard.edu\/gazette\/#organization\"}},{\"@type\":\"WebPage\",\"@id\":\"https:\/\/news.harvard.edu\/gazette\/story\/2018\/08\/an-open-source-ai-tool-available-to-study-movement-across-behaviors-and-species\/\",\"url\":\"https:\/\/news.harvard.edu\/gazette\/story\/2018\/08\/an-open-source-ai-tool-available-to-study-movement-across-behaviors-and-species\/\",\"name\":\"An open-source AI tool available to study movement across behaviors and species &#8212; Harvard Gazette\",\"isPartOf\":{\"@id\":\"https:\/\/news.harvard.edu\/gazette\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\/\/news.harvard.edu\/gazette\/story\/2018\/08\/an-open-source-ai-tool-available-to-study-movement-across-behaviors-and-species\/#primaryimage\"},\"image\":{\"@id\":\"https:\/\/news.harvard.edu\/gazette\/story\/2018\/08\/an-open-source-ai-tool-available-to-study-movement-across-behaviors-and-species\/#primaryimage\"},\"thumbnailUrl\":\"https:\/\/news.harvard.edu\/wp-content\/uploads\/2018\/08\/mouserendering.jpg\",\"datePublished\":\"2018-08-30T16:50:25+00:00\",\"dateModified\":\"2023-11-09T01:46:45+00:00\",\"description\":\"A team of researchers from the Rowland Institute at Harvard, Harvard University, and the University of T\u00fcbingen is turning to artificial intelligence technology to make it far easier than ever before to track animals\u2019 movements in the 
lab.\",\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\/\/news.harvard.edu\/gazette\/story\/2018\/08\/an-open-source-ai-tool-available-to-study-movement-across-behaviors-and-species\/\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/news.harvard.edu\/gazette\/story\/2018\/08\/an-open-source-ai-tool-available-to-study-movement-across-behaviors-and-species\/#primaryimage\",\"url\":\"https:\/\/news.harvard.edu\/wp-content\/uploads\/2018\/08\/mouserendering.jpg\",\"contentUrl\":\"https:\/\/news.harvard.edu\/wp-content\/uploads\/2018\/08\/mouserendering.jpg\",\"width\":2500,\"height\":1667,\"caption\":\"Rendering of lab animals moving.\"},{\"@type\":\"WebSite\",\"@id\":\"https:\/\/news.harvard.edu\/gazette\/#website\",\"url\":\"https:\/\/news.harvard.edu\/gazette\/\",\"name\":\"Harvard Gazette\",\"description\":\"Official news from Harvard University covering innovation in teaching, learning, and research\",\"publisher\":{\"@id\":\"https:\/\/news.harvard.edu\/gazette\/#organization\"},\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\/\/news.harvard.edu\/gazette\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"en-US\"},{\"@type\":\"Organization\",\"@id\":\"https:\/\/news.harvard.edu\/gazette\/#organization\",\"name\":\"The Harvard Gazette\",\"url\":\"https:\/\/news.harvard.edu\/gazette\/\",\"logo\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/news.harvard.edu\/gazette\/#\/schema\/logo\/image\/\",\"url\":\"https:\/\/news.harvard.edu\/wp-content\/uploads\/2023\/12\/Harvard_Gazette_logo.svg\",\"contentUrl\":\"https:\/\/news.harvard.edu\/wp-content\/uploads\/2023\/12\/Harvard_Gazette_logo.svg\",\"width\":164,\"height\":64,\"caption\":\"The Harvard 
Gazette\"},\"image\":{\"@id\":\"https:\/\/news.harvard.edu\/gazette\/#\/schema\/logo\/image\/\"}},{\"@type\":\"Person\",\"@id\":\"https:\/\/news.harvard.edu\/gazette\/#\/schema\/person\/c6c859c924528563b44146bb17e8949f\",\"name\":\"gazettebeckycoleman\"}]}<\/script>\n<!-- \/ Yoast SEO Premium plugin. -->","yoast_head_json":{"title":"An open-source AI tool available to study movement across behaviors and species &#8212; Harvard Gazette","description":"A team of researchers from the Rowland Institute at Harvard, Harvard University, and the University of T\u00fcbingen is turning to artificial intelligence technology to make it far easier than ever before to track animals\u2019 movements in the lab.","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/news.harvard.edu\/gazette\/story\/2018\/08\/an-open-source-ai-tool-available-to-study-movement-across-behaviors-and-species\/","og_locale":"en_US","og_type":"article","og_title":"An open-source AI tool available to study movement across behaviors and species","og_description":"A team of researchers from the Rowland Institute at Harvard, Harvard University, and the University of T\u00fcbingen is turning to artificial intelligence technology to make it far easier than ever before to track animals\u2019 movements in the lab.","og_url":"https:\/\/news.harvard.edu\/gazette\/story\/2018\/08\/an-open-source-ai-tool-available-to-study-movement-across-behaviors-and-species\/","og_site_name":"Harvard Gazette","article_published_time":"2018-08-30T16:50:25+00:00","article_modified_time":"2023-11-09T01:46:45+00:00","og_image":[{"width":2500,"height":1667,"url":"https:\/\/news.harvard.edu\/gazette\/wp-content\/uploads\/2018\/08\/mouserendering.jpg","type":"image\/jpeg"}],"author":"gazettebeckycoleman","twitter_card":"summary_large_image","twitter_title":"An open-source AI tool available to study 
movement across behaviors and species","schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"Article","@id":"https:\/\/news.harvard.edu\/gazette\/story\/2018\/08\/an-open-source-ai-tool-available-to-study-movement-across-behaviors-and-species\/#article","isPartOf":{"@id":"https:\/\/news.harvard.edu\/gazette\/story\/2018\/08\/an-open-source-ai-tool-available-to-study-movement-across-behaviors-and-species\/"},"author":{"name":"gazettebeckycoleman","@id":"https:\/\/news.harvard.edu\/gazette\/#\/schema\/person\/c6c859c924528563b44146bb17e8949f"},"headline":"Movement monitor","datePublished":"2018-08-30T16:50:25+00:00","dateModified":"2023-11-09T01:46:45+00:00","mainEntityOfPage":{"@id":"https:\/\/news.harvard.edu\/gazette\/story\/2018\/08\/an-open-source-ai-tool-available-to-study-movement-across-behaviors-and-species\/"},"wordCount":774,"publisher":{"@id":"https:\/\/news.harvard.edu\/gazette\/#organization"},"image":{"@id":"https:\/\/news.harvard.edu\/gazette\/story\/2018\/08\/an-open-source-ai-tool-available-to-study-movement-across-behaviors-and-species\/#primaryimage"},"thumbnailUrl":"https:\/\/news.harvard.edu\/wp-content\/uploads\/2018\/08\/mouserendering.jpg","keywords":["A.I.","Alexander Mathis","animal","animal movement","Artificial Intelligence","deep learning","DeepLabCut","Faculty of Arts and Sciences","FAS","Harvard","lab","laboratory","Mackenzie Mathis","Mathis","movement","Peter Reuell","Reuell","Rowland Institute","Venkatesh Murthy"],"articleSection":["Science &amp; Tech"],"inLanguage":"en-US","copyrightYear":"2018","copyrightHolder":{"@id":"https:\/\/news.harvard.edu\/gazette\/#organization"}},{"@type":"WebPage","@id":"https:\/\/news.harvard.edu\/gazette\/story\/2018\/08\/an-open-source-ai-tool-available-to-study-movement-across-behaviors-and-species\/","url":"https:\/\/news.harvard.edu\/gazette\/story\/2018\/08\/an-open-source-ai-tool-available-to-study-movement-across-behaviors-and-species\/","name":"An open-source AI tool available to 
study movement across behaviors and species &#8212; Harvard Gazette","isPartOf":{"@id":"https:\/\/news.harvard.edu\/gazette\/#website"},"primaryImageOfPage":{"@id":"https:\/\/news.harvard.edu\/gazette\/story\/2018\/08\/an-open-source-ai-tool-available-to-study-movement-across-behaviors-and-species\/#primaryimage"},"image":{"@id":"https:\/\/news.harvard.edu\/gazette\/story\/2018\/08\/an-open-source-ai-tool-available-to-study-movement-across-behaviors-and-species\/#primaryimage"},"thumbnailUrl":"https:\/\/news.harvard.edu\/wp-content\/uploads\/2018\/08\/mouserendering.jpg","datePublished":"2018-08-30T16:50:25+00:00","dateModified":"2023-11-09T01:46:45+00:00","description":"A team of researchers from the Rowland Institute at Harvard, Harvard University, and the University of T\u00fcbingen is turning to artificial intelligence technology to make it far easier than ever before to track animals\u2019 movements in the lab.","inLanguage":"en-US","potentialAction":[{"@type":"ReadAction","target":["https:\/\/news.harvard.edu\/gazette\/story\/2018\/08\/an-open-source-ai-tool-available-to-study-movement-across-behaviors-and-species\/"]}]},{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/news.harvard.edu\/gazette\/story\/2018\/08\/an-open-source-ai-tool-available-to-study-movement-across-behaviors-and-species\/#primaryimage","url":"https:\/\/news.harvard.edu\/wp-content\/uploads\/2018\/08\/mouserendering.jpg","contentUrl":"https:\/\/news.harvard.edu\/wp-content\/uploads\/2018\/08\/mouserendering.jpg","width":2500,"height":1667,"caption":"Rendering of lab animals moving."},{"@type":"WebSite","@id":"https:\/\/news.harvard.edu\/gazette\/#website","url":"https:\/\/news.harvard.edu\/gazette\/","name":"Harvard Gazette","description":"Official news from Harvard University covering innovation in teaching, learning, and 
research","publisher":{"@id":"https:\/\/news.harvard.edu\/gazette\/#organization"},"potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/news.harvard.edu\/gazette\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"en-US"},{"@type":"Organization","@id":"https:\/\/news.harvard.edu\/gazette\/#organization","name":"The Harvard Gazette","url":"https:\/\/news.harvard.edu\/gazette\/","logo":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/news.harvard.edu\/gazette\/#\/schema\/logo\/image\/","url":"https:\/\/news.harvard.edu\/wp-content\/uploads\/2023\/12\/Harvard_Gazette_logo.svg","contentUrl":"https:\/\/news.harvard.edu\/wp-content\/uploads\/2023\/12\/Harvard_Gazette_logo.svg","width":164,"height":64,"caption":"The Harvard Gazette"},"image":{"@id":"https:\/\/news.harvard.edu\/gazette\/#\/schema\/logo\/image\/"}},{"@type":"Person","@id":"https:\/\/news.harvard.edu\/gazette\/#\/schema\/person\/c6c859c924528563b44146bb17e8949f","name":"gazettebeckycoleman"}]}},"parsely":{"version":"1.1.0","canonical_url":"https:\/\/news.harvard.edu\/gazette\/story\/2018\/08\/an-open-source-ai-tool-available-to-study-movement-across-behaviors-and-species\/","smart_links":{"inbound":0,"outbound":0},"traffic_boost_suggestions_count":0,"meta":{"@context":"https:\/\/schema.org","@type":"NewsArticle","headline":"Movement 
monitor","url":"https:\/\/news.harvard.edu\/gazette\/story\/2018\/08\/an-open-source-ai-tool-available-to-study-movement-across-behaviors-and-species\/","mainEntityOfPage":{"@type":"WebPage","@id":"https:\/\/news.harvard.edu\/gazette\/story\/2018\/08\/an-open-source-ai-tool-available-to-study-movement-across-behaviors-and-species\/"},"thumbnailUrl":"https:\/\/news.harvard.edu\/wp-content\/uploads\/2018\/08\/mouserendering.jpg?w=150","image":{"@type":"ImageObject","url":"https:\/\/news.harvard.edu\/wp-content\/uploads\/2018\/08\/mouserendering.jpg"},"articleSection":"Science &amp; Tech","author":[{"@type":"Person","name":"gazettebeckycoleman"}],"creator":["gazettebeckycoleman"],"publisher":{"@type":"Organization","name":"Harvard Gazette","logo":"https:\/\/news.harvard.edu\/gazette\/wp-content\/uploads\/2023\/12\/Harvard_Gazette_logo.svg"},"keywords":["a.i.","alexander mathis","animal","animal movement","artificial intelligence","deep learning","deeplabcut","faculty of arts and sciences","fas","harvard","lab","laboratory","mackenzie mathis","mathis","movement","peter reuell","reuell","rowland institute","venkatesh murthy"],"dateCreated":"2018-08-30T16:50:25Z","datePublished":"2018-08-30T16:50:25Z","dateModified":"2023-11-09T01:46:45Z"},"rendered":"<script type=\"application\/ld+json\" class=\"wp-parsely-metadata\">{\"@context\":\"https:\\\/\\\/schema.org\",\"@type\":\"NewsArticle\",\"headline\":\"Movement 
monitor\",\"url\":\"https:\\\/\\\/news.harvard.edu\\\/gazette\\\/story\\\/2018\\\/08\\\/an-open-source-ai-tool-available-to-study-movement-across-behaviors-and-species\\\/\",\"mainEntityOfPage\":{\"@type\":\"WebPage\",\"@id\":\"https:\\\/\\\/news.harvard.edu\\\/gazette\\\/story\\\/2018\\\/08\\\/an-open-source-ai-tool-available-to-study-movement-across-behaviors-and-species\\\/\"},\"thumbnailUrl\":\"https:\\\/\\\/news.harvard.edu\\\/wp-content\\\/uploads\\\/2018\\\/08\\\/mouserendering.jpg?w=150\",\"image\":{\"@type\":\"ImageObject\",\"url\":\"https:\\\/\\\/news.harvard.edu\\\/wp-content\\\/uploads\\\/2018\\\/08\\\/mouserendering.jpg\"},\"articleSection\":\"Science &amp; Tech\",\"author\":[{\"@type\":\"Person\",\"name\":\"gazettebeckycoleman\"}],\"creator\":[\"gazettebeckycoleman\"],\"publisher\":{\"@type\":\"Organization\",\"name\":\"Harvard Gazette\",\"logo\":\"https:\\\/\\\/news.harvard.edu\\\/gazette\\\/wp-content\\\/uploads\\\/2023\\\/12\\\/Harvard_Gazette_logo.svg\"},\"keywords\":[\"a.i.\",\"alexander mathis\",\"animal\",\"animal movement\",\"artificial intelligence\",\"deep learning\",\"deeplabcut\",\"faculty of arts and sciences\",\"fas\",\"harvard\",\"lab\",\"laboratory\",\"mackenzie mathis\",\"mathis\",\"movement\",\"peter reuell\",\"reuell\",\"rowland institute\",\"venkatesh murthy\"],\"dateCreated\":\"2018-08-30T16:50:25Z\",\"datePublished\":\"2018-08-30T16:50:25Z\",\"dateModified\":\"2023-11-09T01:46:45Z\"}<\/script>","tracker_url":"https:\/\/cdn.parsely.com\/keys\/news.harvard.edu\/p.js"},"jetpack_featured_media_url":"https:\/\/news.harvard.edu\/wp-content\/uploads\/2018\/08\/mouserendering.jpg","has_blocks":true,"block_data":{"0":{"blockName":"harvard-gazette\/article-header","attrs":{"blockColorPalette":"","coloredHeading":"","creditText":"Images courtesy of DeepLabCut","displayDetails":"","displayTitle":"","categoryId":1387,"mediaAlt":"Rendering of lab animals moving.","mediaCaption":"A team of researchers uses artificial intelligence technology to 
make it far easier than ever before to track animals\u2019 movements in the lab. ","mediaId":249516,"mediaSize":"full","mediaType":"image","mediaUrl":"https:\/\/news.harvard.edu\/gazette\/wp-content\/uploads\/2018\/08\/mouserendering.jpg","poster":"","title":"Movement monitor","subheading":"An open-source AI tool for studying movement across behaviors and species","centeredImage":true,"className":"is-style-full-width-text-below","mediaHeight":1667,"mediaWidth":2500,"backgroundFixed":false,"backgroundTone":"light","coloredBackground":false,"displayOverlay":true,"fadeInText":false,"isAmbient":false,"mediaLength":"","mediaPosition":"","posterText":"","titleAbove":false,"useUncroppedImage":false,"lock":[],"metadata":[]},"innerBlocks":[],"innerHTML":"<figure class=\"wp-block-image\"><img alt=\"Rendering of lab animals moving.\" height=\"1667\" loading=\"eager\" src=\"https:\/\/news.harvard.edu\/gazette\/wp-content\/uploads\/2018\/08\/mouserendering.jpg\" width=\"2500\"\/><figcaption class=\"wp-element-caption\"><p class=\"wp-element-caption--caption\">A team of researchers uses artificial intelligence technology to make it far easier than ever before to track animals\u2019 movements in the lab. <\/p><p class=\"wp-element-caption--credit\">Images courtesy of DeepLabCut<\/p><\/figcaption><\/figure>\n","innerContent":["<figure class=\"wp-block-image\"><img alt=\"Rendering of lab animals moving.\" height=\"1667\" loading=\"eager\" src=\"https:\/\/news.harvard.edu\/gazette\/wp-content\/uploads\/2018\/08\/mouserendering.jpg\" width=\"2500\"\/><figcaption class=\"wp-element-caption\"><p class=\"wp-element-caption--caption\">A team of researchers uses artificial intelligence technology to make it far easier than ever before to track animals\u2019 movements in the lab. 
<\/p><p class=\"wp-element-caption--credit\">Images courtesy of DeepLabCut<\/p><\/figcaption><\/figure>\n"],"rendered":"<header\n\tclass=\"wp-block-harvard-gazette-article-header alignfull article-header is-style-full-width-text-below centered-image\"\n\tstyle=\" \"\n>\n\t<figure class=\"wp-block-image\"><img alt=\"Rendering of lab animals moving.\" height=\"1667\" loading=\"eager\" src=\"https:\/\/news.harvard.edu\/gazette\/wp-content\/uploads\/2018\/08\/mouserendering.jpg\" width=\"2500\"\/><figcaption class=\"wp-element-caption\"><p class=\"wp-element-caption--caption\">A team of researchers uses artificial intelligence technology to make it far easier than ever before to track animals\u2019 movements in the lab. <\/p><p class=\"wp-element-caption--credit\">Images courtesy of DeepLabCut<\/p><\/figcaption><\/figure>\n\n\t<div class=\"article-header__content\">\n\t\t\t<a\n\t\t\tclass=\"article-header__category\"\n\t\t\thref=\"https:\/\/news.harvard.edu\/gazette\/section\/science-technology\/\"\n\t\t>\n\t\t\tScience &amp; Tech\t\t<\/a>\n\t\t\n\t\t<h1 class=\"article-header__title wp-block-heading \">\n\t\tMovement monitor\t<\/h1>\n\n\t\n\t\t\t<\/div>\n\t\t\n\t<div class=\"article-header__meta\">\n\t\t<div class=\"wp-block-post-author\">\n\t\t\t<address class=\"wp-block-post-author__content\">\n\t\t\t\t\t<p class=\"author wp-block-post-author__name\">\n\t\tIsabel Suditsch and Peter Reuell\t<\/p>\n\t\t\t<p class=\"wp-block-post-author__byline\">\n\t\t\tHarvard Correspondent, Harvard Staff Writer\t\t<\/p>\n\t\t\t\t\t<\/address>\n\t\t<\/div>\n\n\t\t<time class=\"article-header__date\" datetime=\"2018-08-30\">\n\t\t\tAugust 30, 2018\t\t<\/time>\n\n\t\t<span class=\"article-header__reading-time\">\n\t\t\t4 min read\t\t<\/span>\n\t<\/div>\n\n\t\n\t\t\t<h2 class=\"article-header__subheading wp-block-heading\">\n\t\t\tAn open-source AI tool for studying movement across behaviors and 
Understanding the brain, in part, means understanding how behavior is created.

Reverse-engineering how neural circuits drive behavior requires accurate and rigorous tracking of that behavior, yet the increasingly complex tasks animals perform in the laboratory have made such tracking challenging.

Now, a team of researchers from the Rowland Institute at Harvard, Harvard University, and the University of Tübingen is turning to artificial intelligence to solve the problem.

The software they developed, dubbed DeepLabCut, harnesses deep-learning techniques to track features ranging from the digits of mice to egg-laying behavior in Drosophila, and beyond. The work is described in an Aug. 20 paper published in Nature Neuroscience (https://www.nature.com/articles/s41593-018-0209-y).

The software is the brainchild of Mackenzie Mathis, a Rowland Fellow at the Rowland Institute at Harvard; Alexander Mathis, a postdoctoral fellow in the lab of Venkatesh N. Murthy, professor of molecular and cellular biology and chair of the Department of Molecular and Cellular Biology; and Matthias Bethge, a professor at the University of Tübingen and chair of the Bernstein Center for Computational Neuroscience Tübingen.

The notion of using software to track animal movements was born partly of necessity. Both Mackenzie and Alexander Mathis had tried traditional techniques, which typically involve placing tracking markers on animals and applying heuristics such as object segmentation, with mixed success.

Such techniques are often sensitive to the choice of analysis parameters, and markers or tattoos are invasive, can hinder natural behaviors, and may be impossible to place on very small or wild animals, they said.

Luckily, international competitions in recent years have driven advances in computer vision, including new algorithms capable of human pose estimation (automatically tracking human body parts).

Such algorithms, however, are notoriously data-hungry, requiring thousands of labeled examples for the artificial neural network to learn. That is prohibitively many for typical laboratory experiments, which would demand days of manual labeling for each behavior.

The solution came in what is called "transfer learning": applying an already-trained network to a different problem, similar to the way scientists believe biological systems learn.

Building on a state-of-the-art algorithm for tracking human movement called DeeperCut, the Mathises were able to show that deep learning could be highly data-efficient. The new software's name is a nod to DeeperCut's authors.
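The data-efficiency of transfer learning can be sketched in miniature. The toy below is purely illustrative and is not DeepLabCut's actual code (DeepLabCut fine-tunes a deep residual network built on DeeperCut's feature detectors): a "pretrained" feature extractor is kept frozen, and only a small readout is fit on a handful of labeled examples.

```python
# Toy transfer-learning sketch: freeze the feature extractor, train the readout.

def pretrained_features(x):
    """Stand-in for a frozen, pretrained network: maps an input to features."""
    return [1.0, x, x * x]

def fit_readout(examples, lr=0.01, steps=5000):
    """Fit only the readout weights on top of the frozen features (plain SGD)."""
    w = [0.0, 0.0, 0.0]
    for _ in range(steps):
        for x, y in examples:
            feats = pretrained_features(x)
            err = sum(wi * fi for wi, fi in zip(w, feats)) - y
            w = [wi - lr * err * fi for wi, fi in zip(w, feats)]
    return w

def predict(w, x):
    return sum(wi * fi for wi, fi in zip(w, pretrained_features(x)))

# Only four labeled examples of the target mapping y = 2x^2 - 3x + 1 are needed,
# because most of the "knowledge" already lives in the frozen features.
examples = [(-1.0, 6.0), (0.0, 1.0), (1.0, 0.0), (2.0, 3.0)]
w = fit_readout(examples)
```

Because the frozen features already span the target, a few labeled examples suffice to recover it — the same reason DeepLabCut needs only on the order of a hundred labeled frames rather than thousands.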
[Animated figures: the software tracks the movements of a fly laying eggs and the digits of a mouse. (ezgif.com-optimize-1.gif, MousereachGIF1.gif)]
Just as a child does not need to develop a new visual system from scratch to recognize a novel object, but instead draws on thousands of hours of visual experience, DeepLabCut is pretrained on thousands of images of natural objects: hammers, cats, dogs, foods, and more.

With that pretraining in place, the software needed only 100 examples of mice performing an odor-guided navigation experiment to recognize specific mouse body parts as well as humans could.

The team was also able to apply the technology to mice making reaching movements and, in collaboration with Kevin Cury, a neuroscientist at Columbia University, to flies laying eggs in a 3-D chamber.

"We were very impressed by the success of the transfer-learning approach and the versatility of DeepLabCut," Mackenzie Mathis said. "With only a few hundred frames of training data, we were able to get accurate and robust tracking across a myriad of experimental conditions, animals, and behaviors."

"Experimentalists have very good intuitions about what body parts should be analyzed to study a particular behavior, but traditionally extracting limb coordinates from videos has been very challenging — DeepLabCut does just that based on a few examples," Alexander Mathis said. "Since the program is designed as a user-friendly, 'plug-and-play' solution, and does not require any coding skills, it can be widely used."

"We want as many researchers as possible to benefit from our work," said Bethge. "DeepLabCut was created as open software, as sharing results, data, and algorithms is essential for scientific progress."

Even as the paper describing the software was published, the technology had already been adopted by more than 50 labs to study everything from the gait of horses to bacterial dynamics to the movements of surgical robots.

The software toolbox can be used with minimal to no coding experience and is freely available at mousemotorlab.org/deeplabcut.

This study was supported with funding from the Marie Skłodowska-Curie International Fellowship, the Rowland Institute at Harvard, the Project ALS Women & the Brain Neuroscience Fellowship, the German Science Foundation (DFG) CRC 1233 on Robust Vision, and IARPA through the MICrONS program.
src=\"https:\/\/news.harvard.edu\/gazette\/wp-content\/uploads\/2018\/08\/MousereachGIF1.gif\" alt=\"\" class=\"wp-image-249512\"><\/figure>\n\t\n\t\r\n\t\t\t<\/div>\n\t\t\t\n\t\t<\/div>\n\n<\/figure>\r\n\n<p>Just as a child does not need to develop another visual system from scratch in order to recognize a novel object, but relies on thousands of hours of experience and adapts them to recognize new objects, DeepLabCut is pretrained on thousands of images containing natural objects, images of hammers, cats, dogs, foods, and more.<\/p>\n<p>With that pretraining in place, the software needed only 100 examples of mice performing an odor-guided navigation experiment to recognize specific mouse body parts as well as humans could.<\/p>\n<p>The team was also able to apply the technology to mice making reaching movements, and, in collaboration with Kevin Cury, a neuroscientist from Columbia University, to flies laying eggs in a 3-D chamber.<\/p>\n<p>\u201cWe were very impressed by the success of the transfer-learning approach and the versatility of DeepLabCut,\u201d Mackenzie Mathis said. \u201cWith only a few hundred frames of training data, we were able to get accurate and robust tracking across a myriad of experimental conditions, animals, and behaviors.\u201d<\/p>\n<p>\u201cExperimentalists have very good intuitions about what body parts should be analyzed to study a particular behavior, but traditionally extracting limb coordinates from videos has been very challenging \u2014 DeepLabCut does just that based on a few examples,\u201d Alexander Mathis said. \u201cSince the program is designed as a user-friendly, \u2018plug-and-play\u2019 solution, and does not require any coding skills, it can be widely used.\u201d<\/p>\n<p>\u201cWe want as many researchers as possible to benefit from our work,\u201d said Bethge. 
\u201cDeepLabCut was created as an open software, as sharing results, data, and also algorithms is essential for scientific progress.\u201d<\/p>\n<p>Even as the paper describing the software was published, the technology had been used by more than 50 labs to study everything from the gait of horses to bacteria dynamics to the movement of surgery robots.<\/p>\n<p>The software toolbox can be used with minimal to no coding experience and is freely available at <a href=\"http:\/\/mousemotorlab.org\/deeplabcut\">mousemotorlab.org\/deeplabcut<\/a>.<\/p>\n<p><em>This study was supported with funding from the Marie Sklodowska-Curie International Fellowship, the Rowland Institute at Harvard, Project ALS Women &amp; the Brain Neuroscience Fellowship, the German Science <\/em>foundation<em> (DFG) CRC 1233 on Robust Vision, and IARPA through the MICrONS program.<\/em><\/p>\n\n\n<\/div>\n"}},"jetpack-related-posts":[{"id":366120,"url":"https:\/\/news.harvard.edu\/gazette\/story\/2023\/11\/new-study-explains-how-exercise-reduces-chronic-inflammation\/","url_meta":{"origin":249509,"position":0},"title":"Research shows working out gets inflammation-fighting T cells moving","author":"harvardgazette","date":"November 3, 2023","format":false,"excerpt":"Activated by regular exercise, immune cells in muscles found to fend off inflammation, enhance endurance in mice","rel":"","context":"In &quot;Health&quot;","block_context":{"text":"Health","link":"https:\/\/news.harvard.edu\/gazette\/section\/health\/"},"img":{"alt_text":"People running.","src":"https:\/\/news.harvard.edu\/wp-content\/uploads\/2023\/11\/1406-runners.jpg?resize=350%2C200","width":350,"height":200,"srcset":"https:\/\/news.harvard.edu\/wp-content\/uploads\/2023\/11\/1406-runners.jpg?resize=350%2C200 1x, https:\/\/news.harvard.edu\/wp-content\/uploads\/2023\/11\/1406-runners.jpg?resize=525%2C300 1.5x, https:\/\/news.harvard.edu\/wp-content\/uploads\/2023\/11\/1406-runners.jpg?resize=700%2C400 
2x"},"classes":[]},{"id":354784,"url":"https:\/\/news.harvard.edu\/gazette\/story\/2023\/03\/gut-microbes-found-to-help-mend-damaged-muscles-in-mice\/","url_meta":{"origin":249509,"position":1},"title":"Torn muscle? Send in the gut microbes for rapid repair","author":"harvardgazette","date":"March 6, 2023","format":false,"excerpt":"A Harvard-led study shows that the gut microbiota acts as the training camp for a class of immune cells that are recruited to heal muscle injury.","rel":"","context":"In &quot;Health&quot;","block_context":{"text":"Health","link":"https:\/\/news.harvard.edu\/gazette\/section\/health\/"},"img":{"alt_text":"Injured muscle.","src":"https:\/\/news.harvard.edu\/wp-content\/uploads\/2023\/03\/Muscle-pain.jpg?resize=350%2C200","width":350,"height":200,"srcset":"https:\/\/news.harvard.edu\/wp-content\/uploads\/2023\/03\/Muscle-pain.jpg?resize=350%2C200 1x, https:\/\/news.harvard.edu\/wp-content\/uploads\/2023\/03\/Muscle-pain.jpg?resize=525%2C300 1.5x, https:\/\/news.harvard.edu\/wp-content\/uploads\/2023\/03\/Muscle-pain.jpg?resize=700%2C400 2x"},"classes":[]},{"id":390877,"url":"https:\/\/news.harvard.edu\/gazette\/story\/2024\/08\/hey-quantum-youre-home\/","url_meta":{"origin":249509,"position":2},"title":"Hey Quantum, you\u2019re home\u00a0","author":"Anne Mannning","date":"August 23, 2024","format":false,"excerpt":"David E. and Stacey L. Goel Quantum Science and Engineering Building opens","rel":"","context":"In &quot;Campus &amp; Community&quot;","block_context":{"text":"Campus &amp; Community","link":"https:\/\/news.harvard.edu\/gazette\/section\/campus-community\/"},"img":{"alt_text":"Exterior of the David E. and Stacey L. 
Goel Quantum Science and Engineering Building.","src":"https:\/\/news.harvard.edu\/wp-content\/uploads\/2024\/08\/061427_quantumbuilding_002.jpeg?resize=350%2C200","width":350,"height":200,"srcset":"https:\/\/news.harvard.edu\/wp-content\/uploads\/2024\/08\/061427_quantumbuilding_002.jpeg?resize=350%2C200 1x, https:\/\/news.harvard.edu\/wp-content\/uploads\/2024\/08\/061427_quantumbuilding_002.jpeg?resize=525%2C300 1.5x, https:\/\/news.harvard.edu\/wp-content\/uploads\/2024\/08\/061427_quantumbuilding_002.jpeg?resize=700%2C400 2x"},"classes":[]},{"id":346347,"url":"https:\/\/news.harvard.edu\/gazette\/story\/2022\/08\/cutting-edge-science-at-the-harvards-rowland-institute\/","url_meta":{"origin":249509,"position":3},"title":"Using designs by Mother Nature, guiding flies, making things glow","author":"harvardgazette","date":"August 10, 2022","format":false,"excerpt":"Rowland Fellows at the cutting edge of science.","rel":"","context":"In &quot;Campus &amp; Community&quot;","block_context":{"text":"Campus &amp; Community","link":"https:\/\/news.harvard.edu\/gazette\/section\/campus-community\/"},"img":{"alt_text":"Rowland Fellows.","src":"https:\/\/news.harvard.edu\/wp-content\/uploads\/2022\/08\/202208_rowland_tyc_1407.jpg?resize=350%2C200","width":350,"height":200,"srcset":"https:\/\/news.harvard.edu\/wp-content\/uploads\/2022\/08\/202208_rowland_tyc_1407.jpg?resize=350%2C200 1x, https:\/\/news.harvard.edu\/wp-content\/uploads\/2022\/08\/202208_rowland_tyc_1407.jpg?resize=525%2C300 1.5x, https:\/\/news.harvard.edu\/wp-content\/uploads\/2022\/08\/202208_rowland_tyc_1407.jpg?resize=700%2C400 2x"},"classes":[]},{"id":157879,"url":"https:\/\/news.harvard.edu\/gazette\/story\/2014\/07\/harvesting-energy-from-devices\/","url_meta":{"origin":249509,"position":4},"title":"Harvesting energy from devices","author":"harvardgazette","date":"July 3, 2014","format":false,"excerpt":"Heat is a byproduct of nearly all electronic devices, yet most of it goes wasted. 
In an effort to recapture some of that energy and transform it into electricity, a team of Harvard and University of Sannio researchers have developed computer simulations to control the flow of heat and electrical\u2026","rel":"","context":"In &quot;Science &amp; Tech&quot;","block_context":{"text":"Science &amp; Tech","link":"https:\/\/news.harvard.edu\/gazette\/section\/science-technology\/"},"img":{"alt_text":"","src":"https:\/\/news.harvard.edu\/wp-content\/uploads\/2014\/06\/heat-conductivity_figure3_605.jpg?resize=350%2C200","width":350,"height":200,"srcset":"https:\/\/news.harvard.edu\/wp-content\/uploads\/2014\/06\/heat-conductivity_figure3_605.jpg?resize=350%2C200 1x, https:\/\/news.harvard.edu\/wp-content\/uploads\/2014\/06\/heat-conductivity_figure3_605.jpg?resize=525%2C300 1.5x"},"classes":[]},{"id":361142,"url":"https:\/\/news.harvard.edu\/gazette\/story\/2023\/06\/researchers-use-light-to-make-spins-more-efficient-easier-to-manipulate\/","url_meta":{"origin":249509,"position":5},"title":"Using light to make electrons even more energy efficient","author":"harvardgazette","date":"June 26, 2023","format":false,"excerpt":"A team of researchers was able to generate electron spin domains without the need of magnetic fields on perfectly ordered materials at extremely low temperatures.","rel":"","context":"In &quot;Science &amp; Tech&quot;","block_context":{"text":"Science &amp; Tech","link":"https:\/\/news.harvard.edu\/gazette\/section\/science-technology\/"},"img":{"alt_text":"Sascha Feldmann.","src":"https:\/\/news.harvard.edu\/wp-content\/uploads\/2023\/06\/060923_Sasha_102.jpg?resize=350%2C200","width":350,"height":200,"srcset":"https:\/\/news.harvard.edu\/wp-content\/uploads\/2023\/06\/060923_Sasha_102.jpg?resize=350%2C200 1x, https:\/\/news.harvard.edu\/wp-content\/uploads\/2023\/06\/060923_Sasha_102.jpg?resize=525%2C300 1.5x, https:\/\/news.harvard.edu\/wp-content\/uploads\/2023\/06\/060923_Sasha_102.jpg?resize=700%2C400 
2x"},"classes":[]}],"jetpack_sharing_enabled":true,"_links":{"self":[{"href":"https:\/\/news.harvard.edu\/gazette\/wp-json\/wp\/v2\/posts\/249509","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/news.harvard.edu\/gazette\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/news.harvard.edu\/gazette\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/news.harvard.edu\/gazette\/wp-json\/wp\/v2\/users\/122429419"}],"replies":[{"embeddable":true,"href":"https:\/\/news.harvard.edu\/gazette\/wp-json\/wp\/v2\/comments?post=249509"}],"version-history":[{"count":17,"href":"https:\/\/news.harvard.edu\/gazette\/wp-json\/wp\/v2\/posts\/249509\/revisions"}],"predecessor-version":[{"id":261769,"href":"https:\/\/news.harvard.edu\/gazette\/wp-json\/wp\/v2\/posts\/249509\/revisions\/261769"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/news.harvard.edu\/gazette\/wp-json\/wp\/v2\/media\/249516"}],"wp:attachment":[{"href":"https:\/\/news.harvard.edu\/gazette\/wp-json\/wp\/v2\/media?parent=249509"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/news.harvard.edu\/gazette\/wp-json\/wp\/v2\/categories?post=249509"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/news.harvard.edu\/gazette\/wp-json\/wp\/v2\/tags?post=249509"},{"taxonomy":"format","embeddable":true,"href":"https:\/\/news.harvard.edu\/gazette\/wp-json\/wp\/v2\/gazette-formats?post=249509"},{"taxonomy":"series","embeddable":true,"href":"https:\/\/news.harvard.edu\/gazette\/wp-json\/wp\/v2\/series?post=249509"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}