In a preview of what is likely now playing out in a closed-door meeting of the World Health Organization, a cadre of experts on infectious disease gathered Wednesday at the Harvard School of Public Health (HSPH) to debate whether efforts to combat a deadly form of flu have actually increased the risk to public health.
Panelists at The Forum at Harvard School of Public Health examined “Bird Flu Research: Dangerous Information on a Deadly Virus,” in a sober discussion about H5N1, an avian flu that has infected about 600 people worldwide since 2003, and killed approximately 60 percent of them.
Under the direction of moderator Sharon Begley, the senior U.S. health and science correspondent for Reuters, the panelists agreed that the creation in laboratories of a different form of H5N1, one that could now pass from mammal to mammal, was done with the best intentions: to determine whether the virus could mutate into a form more deadly to humans. Such research could then prepare public health groups for a possible pandemic. Thus far, most infections have occurred among people exposed to poultry.
But the panelists also raised troubling questions about unintended consequences, ranging from bioterrorism to an accidental release to blanket secrecy that could stymie collaboration among scientists and prevent crucial information sharing. Recently, a federal advisory board recommended that details of the bird flu research be stricken from upcoming journal articles.
“I think this H5N1 experience gives us an opportunity to reset the way we think about infectious disease, infectious disease research, the tools and the knowledge of biotechnology that have become so powerful, and responsibility in life sciences research,” said David R. Franz, a former commander in the U.S. Army Medical Research Institute of Infectious Diseases. Such research has been used for good, “but we can’t ignore the small possibility it may be used for harm, even accidentally or intentionally.”
Marc Lipsitch, professor of epidemiology at HSPH and director of the Center for Communicable Disease Dynamics, laid out the issue: Labs in Wisconsin and the Netherlands, using genetic modification and other methods, generated an H5N1 strain that can be passed airborne from ferret to ferret. The work was done to study flu evolution and transmission. Ferrets, Lipsitch noted, are not a perfect reflection of how flu viruses work in humans, but “they are the best animal model we have.” If H5N1 could evolve to spread airborne among people, “it would be a very dangerous strain for humans.”
Some parties have taken an extreme position that this research is so potentially dangerous, it should not be allowed at all, said Barry R. Bloom, the former dean of HSPH and professor in the Department of Immunology and Infectious Disease. The other extreme position is to make the research completely open, as “we don’t know if transmission in ferrets is applicable to humans.” H1N1 (swine flu) kills ferrets but turned out to be not that dangerous in humans, for example. The argument goes: “The Internet makes it very unlikely you can keep anything in science secret for very long.”
But could published research be used by terrorists to create a bioweapon? Al-Qaida reportedly has put out a call for “brothers” with degrees in microbiology and chemistry to develop weapons of mass destruction.
Ever since 9/11, there has been a “twinning” of the bioterrorism threat and the infectious disease threat, said Jeanne Guillemin, a senior adviser in the MIT Security Studies Program. The funding emphasis has gradually shifted from bioweaponry, in subjects like anthrax, to infectious diseases, she said. But the two camps have different mentalities.
The aim of anti-terrorism efforts is to get ahead of the enemy, to “do it before they do,” she said. In contrast, public health efforts, such as combating SARS, require transparency. “The idea is protecting the public, and secrecy is the last thing you want,” she said.
Franz said he was “probably more concerned about safety than … about security. There will certainly be more scientists working in legitimate laboratories on bugs like this than there will be terrorists working in caves.”
Going forward, “my sense is that education, consensus, norms, [and] behavioral responses will be more useful to deal with this kind of problem in the future than regulatory schemes,” he said.
Asked whether the research should have been done, Lipsitch said yes, but that it should continue only under the highest containment, not in “hundreds of labs around the world.”
Bloom recommended separating the issues of what should be published and who should be allowed to do this kind of research. He had no problem with redacting technical details; no one wants to “write a cookbook on how to make a transmissible virus.”
“It’s not trivial to re-create this virus in a cave. (But) not impossible,” Bloom said.
Still, “my personal view is it’s very hard to squelch information in a highly connected world,” he said. “The most important thing the scientific community can do is to be as open and transparent as possible.”
A webcast of the forum is available on its site.