Chia Shen is thinking about you … and your computer.
Shen, director of the Scientists Discovery Room Lab at the School of Engineering and Applied Sciences (SEAS), is helping develop a new angle on Harvard’s computer science research.
Together with Gordon McKay Professor of the Practice of Computer Science Hanspeter Pfister, Shen for the past year has been devising new ways for people — whether researchers or the general public — to interact with and explore the enormous data sets that are increasingly being created and made available on computers.
“Human-computer interaction and visual computing, scientific visualization, and information visualization have not been part of the focus at SEAS,” Shen said. “I’m always focused on how people can collaborate, how they can interact [with their data].”
Shen came to Harvard from Mitsubishi Electric Research Labs in Cambridge, where, beginning in 1999, she helped develop early tabletop user interfaces and applications. These applications let several users interact with data on tables that incorporate computers, sensors, and graphics into their design.
Shen was recruited when the Harvard Initiative in Innovative Computing (IIC) began to explore visual computing.
“Interaction is a big ingredient of visualization. You’re not just looking at information, you’re interacting with it,” Shen said. “We’ve been working on some applications with respect to science, applications for education, for communicating science to the public. Hopefully, we’re building up ideas that scientists can use in spaces like this to do discovery.”
Shen’s base of operations is the Scientists Discovery Room (SDR) Lab, located in Maxwell-Dworkin. The room is home to a display wall and two Microsoft “Surface” tables, the latest generation of the collaborative tabletop computing that Shen has worked on for years. The Surfaces are driven by ordinary personal computers equipped with heavy-duty graphics cards. Input from the users — in this case, the movements their hands make on the tabletop — is recorded by several cameras positioned under the tabletop. The users’ hand movements are fed to the computer, whose software projects a responsive image onto the tabletop from below.
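The pipeline described above — cameras under the table detect hand positions, and software maps them into the projected image — can be illustrated with a minimal sketch. This is not the Surface’s actual software; the class and calibration are hypothetical, showing only the core step of converting camera coordinates into screen coordinates for the projected display.

```python
# Illustrative sketch (not Microsoft's actual Surface code): turning a
# fingertip position detected by an under-table camera into a touch
# event in the projected display's pixel coordinates.

from dataclasses import dataclass

@dataclass
class TouchEvent:
    x: float  # screen coordinates, in pixels
    y: float

class TabletopInput:
    """Maps raw camera coordinates to the projected display's pixels."""

    def __init__(self, cam_size, screen_size):
        self.cam_w, self.cam_h = cam_size
        self.scr_w, self.scr_h = screen_size

    def to_screen(self, cam_x, cam_y):
        # Simple linear calibration; a real tabletop would also correct
        # for lens distortion and projector keystone.
        return TouchEvent(
            x=cam_x / self.cam_w * self.scr_w,
            y=cam_y / self.cam_h * self.scr_h,
        )

tabletop = TabletopInput(cam_size=(640, 480), screen_size=(1024, 768))
event = tabletop.to_screen(320, 240)  # a touch at the camera's center
print(event.x, event.y)               # 512.0 384.0 — the screen's center
```

In a real system this mapping runs continuously on every camera frame, so the projected image responds to hands as they move.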
“The Discovery Room is more about interaction, about interfaces, and not about processing large amounts of data,” Pfister said. “We want to really help the scientists use that technology.”
Shen said the work, while experimental, is also meant to be practical. One of the lab’s major projects is called INVOLV. It was started in June 2008 by Shen and Michael Horn, who recently joined the faculty at Northwestern University and who was then a fellow in Shen’s lab. INVOLV allows users to visualize and interact with large sets of hierarchical data — data organized in tree-like structures, such as family trees. Shen showed a recent visitor to the lab two ways INVOLV is being applied, both in education. One combines the enormous amounts of information in the Encyclopedia of Life project, an online catalog of all the known life on Earth, with the Tree of Life, which charts the evolutionary relationships of different creatures and organisms, into a single, searchable, visual database.
A second, related INVOLV effort is already ready for prime time. Called IMA, for Interactive Multi-touch Arthropod, the project focuses on the world’s arthropods and, using a Microsoft Surface, gives visitors to the Harvard Museum of Natural History another way to explore the museum’s arthropods exhibit. By touching the tabletop and expanding polygons labeled with the names of families and species, users can explore the world’s arthropods, zeroing in on a specific species and finding images, video, and data on it.
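The expand-to-explore interaction at the heart of INVOLV and IMA can be sketched as a tree of named taxa, where touching a group reveals its children and the whole hierarchy is searchable. The class and the taxon names below are illustrative stand-ins, not the actual INVOLV code or Encyclopedia of Life data.

```python
# Hypothetical sketch of a zoomable taxonomy browser: each node is a
# taxon; "expanding" it (as a touch would on the tabletop) reveals its
# children, and the tree can be searched by name.

class TaxonNode:
    def __init__(self, name, children=None):
        self.name = name
        self.children = children or []
        self.expanded = False

    def expand(self):
        # Reveal this group's children, as touching its polygon would.
        self.expanded = True
        return [c.name for c in self.children]

    def find(self, name):
        # Depth-first search through the hierarchy.
        if self.name == name:
            return self
        for child in self.children:
            hit = child.find(name)
            if hit:
                return hit
        return None

arthropods = TaxonNode("Arthropoda", [
    TaxonNode("Insecta", [TaxonNode("Coleoptera"), TaxonNode("Lepidoptera")]),
    TaxonNode("Arachnida", [TaxonNode("Araneae")]),
])

print(arthropods.expand())              # ['Insecta', 'Arachnida']
print(arthropods.find("Araneae").name)  # Araneae
```

In the museum installation, each expanded node would also carry the images, video, and data mentioned above, attached to the node the visitor has zoomed in on.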
Another major new project is the Connectome Touch, a collaboration with the IIC’s Connectome, an ambitious effort led by Pfister, Professor of Molecular and Cellular Biology in the Faculty of Arts and Sciences Jeff Lichtman, and Professor of Neurobiology at Harvard Medical School Clay Reid. The Connectome aims to completely map the nervous systems of specific organisms. The major hurdle for the Connectome is the enormous amount of data researchers need to analyze and process. Connectome researchers are imaging extremely thin slices of brain tissue at ultrahigh resolution. The resulting images make up enormous data files that have to be manipulated as researchers manually trace and track nerve cells from one slice to another.
The Connectome Touch, which also uses a Microsoft Surface, aims to allow researchers to search an image, mark structures within it, and track them from slice to slice.
“One of the challenges when trying to annotate data is you have a large set of images that essentially look very similar,” said Meekal Bajaj, a graduate student from the Georgia Institute of Technology working with Shen on the project. “You want to be able to do a lot of different operations simultaneously: navigate through the image set, go from one slice to the other, … mark it, zoom in and out, and pan, all while maintaining an understanding of where you are in the slice stack so you know exactly what you’re looking at. Doing this with traditional interfaces is really limiting.”
Though the equipment used in the Discovery Room is relatively new and still expensive, Shen said she expects prices to come down soon, putting the technology within reach of individual researchers whose work would benefit from exploring data in nontraditional ways.
“The ultimate goal is to be able to evaluate the new visualization and human-computer interaction techniques that we invent and, ultimately, offer more … principles as guidelines for future design of human-computer interaction systems,” Shen said.