The Deadliest Virus

How likely is the virus to escape the laboratory?

On May 21, 1997, a three-year-old boy died in Hong Kong from a viral infection that turned out to be influenza. The death was not unusual: flu viruses kill hundreds of thousands of people every year. Hong Kong is among the world’s most densely populated cities, and pandemics have a long history of first appearing there or in nearby regions of southern China, and then spreading rapidly around the globe.

This strain, however, was unusual, and it took an international team of virologists three months to identify it as H5N1—“bird flu,” as it has come to be called. Avian influenza had been responsible for the deaths of hundreds of millions of chickens, but there had never been a report of an infected person, even among poultry workers.

By the end of the year, eighteen people in Hong Kong had become sick, and six had died. That’s a remarkably high mortality rate: if seasonal flu were as virulent, it would kill twenty million Americans a year. Hong Kong health officials, fearing that the virus was on the verge of becoming extremely contagious, acted forcefully to build a moat around the outbreak: during the last week of December, they destroyed every chicken in the city.

The tactic worked. Bird flu disappeared, at least for a while. “We felt we had dodged a bullet,” Keiji Fukuda told me earlier this year, when I visited him in his office at the World Health Organization’s headquarters, in Geneva. Fukuda, as the assistant director-general for health, security, and environment, oversees influenza planning. At the end of 1997, when he was the chief influenza epidemiologist at the Centers for Disease Control and Prevention, in Atlanta, he spent a few tense weeks in Hong Kong, searching for clues to how the virus was transmitted from chickens to humans and whether it would set off a global pandemic. “It was a very scary time,” he said, “and we were bracing ourselves for the worst. But by the end of the month nobody else got sick, so we crossed our fingers and went back to Atlanta.”

Then, in 2003, the virus reëmerged, in Thailand; it has since killed three hundred and forty-six of the five hundred and eighty-seven people it is known to have infected—nearly sixty per cent. The true percentage is undoubtedly lower, since many cases go unreported. Even so, the Spanish-flu epidemic of 1918, which killed at least fifty million people, had a mortality rate of between two and three per cent. Influenza normally kills far fewer than one-tenth of one per cent of those infected. This makes H5N1 one of the deadliest microbes known to medical science.

To ignite a pandemic, even the most lethal virus would need to meet three conditions: it would have to be one that humans hadn’t confronted before, so that they lacked antibodies; it would have to kill them; and it would have to spread easily—through a cough, for instance, or a handshake. Bird flu meets the first two criteria but not the third. Virologists regard cyclical pandemics as inevitable; as with earthquakes, though, it is impossible to predict when they will occur. Flu viruses mutate rapidly, but over time they tend to weaken, and researchers hoped that this would be the case with H5N1. Nonetheless, for the past decade the threat of an airborne bird flu lingered ominously in the dark imaginings of scientists around the world. Then, last September, the threat became real.

At the annual meeting of the European Scientific Working Group on Influenza, in Malta, several hundred astonished scientists sat in silence as Ron Fouchier, a Dutch virologist at the Erasmus Medical Center, in Rotterdam, reported that simply transferring avian influenza from one ferret to another had made it highly contagious. Fouchier explained that he and his colleagues “mutated the hell out of H5N1”—meaning that they had altered the genetic sequence of the virus in a variety of ways. That had no effect. Then, as Fouchier later put it, “someone finally convinced me to do something really, really stupid.” He spread the virus the old-fashioned way, by squirting the mutated H5N1 into the nose of a ferret and then implanting nasal fluid from that ferret into the nose of another. After ten such manipulations, the virus began to spread around the ferret cages in his lab. Ferrets that received high doses of H5N1 died within days, but several survived exposure to lower doses.

When Fouchier examined the mutated viruses closely, however, he became alarmed. There were only five genetic changes in two of the viruses’ eight genes. And each mutation had already been found circulating naturally in influenza viruses. Fouchier’s achievement was to place all five mutations together in one virus, which meant that nature could do precisely what he had done in the lab. Another team of researchers, led by Yoshihiro Kawaoka, at the University of Wisconsin, created a slightly different form of the virus, which, while not as virulent, was also highly contagious. One of the world’s most persistent horror fantasies, expressed everywhere from Mary Shelley’s “Frankenstein” to “Jurassic Park,” had suddenly come to pass: a lethal form of life, manipulated and enhanced by man, had been made contagious.

Fouchier’s report caused a sensation. Scientists harbored new fears of a natural pandemic, and biological-weapons experts maintained that Fouchier’s bird flu posed a threat to hundreds of millions of people. The most important question about the continued use of the virus, and the hardest to answer, is how likely it is to escape the laboratory. “I am not nearly as worried about terrorists as I am about an incredibly smart, smug kid at Harvard, or a lone crazy employee with access to these sequences,” Michael T. Osterholm, the director of the Center for Infectious Disease Research and Policy at the University of Minnesota, told me. Osterholm is one of the nation’s leading experts on influenza and bioterrorism. “We have seen many times that accidental releases of dangerous microbes are not rare,” he said.

Osterholm’s anxiety was based in recent history. The last person known to have died of smallpox, in 1978, was a medical photographer in England named Janet Parker, who worked in the anatomy department of the University of Birmingham Medical School. Parker became fatally ill after she was accidentally exposed to smallpox grown in a research lab on the floor below her office. In the late nineteen-seventies, a strain of H1N1—the “Russian flu”—was isolated in northern China, near the Russian border, and it later spread throughout the world. Most virologists familiar with the outbreak are convinced that it came from a sample that was frozen in a lab and then released accidentally. In 2003 and 2004, laboratory workers in Singapore, Taiwan, and Beijing were accidentally infected with the SARS virus. In 2004, a Russian scientist died after mistakenly infecting herself with the Ebola virus.

Biological labs are given four possible biosafety-level security grades, ranging from BSL-1 to BSL-4. Research on the most lethal and contagious organisms is carried out at BSL-4 laboratories. Under U.S. guidelines, BSL-3 facilities contain microbes that cause “serious or potentially lethal diseases” but do not easily pass among people, or for which there are easily accessible preventives. BSL-4 laboratories house agents that have no preventives or treatments. The labs in Rotterdam and in Wisconsin where the H5N1 ferret work was conducted were both BSL-3 facilities that had been enhanced with additional security measures. In such laboratories, scientists are typically subjected to security checks; they wear protective suits and breathe through special respirators. Although no safeguards are absolute, negative air pressure and HEPA filtration help insure that no particles accidentally escape from the lab.


Last December, the National Science Advisory Board for Biosecurity, a panel of science, defense, and public-health experts, was asked by the Department of Health and Human Services to evaluate Fouchier’s research. The panel recommended that the two principal scientific journals, Science and Nature, reconsider plans to publish information about the methods used to create the H5N1 virus. It was the first time that the Advisory Board, which was formed after the anthrax attacks of 2001 to provide guidance on “dual use” scientific research, which could both harm and protect the public, had issued such a request. “We are in the midst of a revolutionary period in the life sciences,” the advisers wrote. “With this has come unprecedented potential for better control of infectious diseases and significant societal benefit. However, there is also a growing risk that the same science will be deliberately misused and that the consequences could be catastrophic.” The Times published an editorial that echoed the Advisory Board’s concern, and even questioned the purpose of the experiments: “We believe in robust research and almost always oppose censorship. But in this case the risks—of doing the work and publishing the results—far outweigh the benefits.” The journal New Scientist agreed: “ONE MISTAKE AWAY FROM A WORLDWIDE FLU PANDEMIC.” Television talk shows and the Internet pulsated with anxiety.

The widespread alarm led Science and Nature to agree to postpone publication. Fouchier’s virus, which now sits in a vault within his securely guarded underground laboratory in Rotterdam, has fundamentally altered the scope of the biological sciences. Like the research that led to splitting the atom and the creation of nuclear energy, the knowledge that his experiment has provided could be used to attack the public as well as to protect it.

“Terror is not an unjustified reaction to knowing this virus exists,” Osterholm, who serves on the Advisory Board, told me. “We have no room to be wrong about this. None. We can be wrong about other things. If smallpox got out, it would be unfortunate, but it has a fourteen-day incubation period, it’s easy to recognize, and we would stop it. Much the same is true with SARS. But with flu you are infectious before you even know you are sick. And when it gets out it is gone. Those researchers have all of our lives at the ends of their fingers.”

Fouchier, a lanky forty-five-year-old man with intense blue eyes, works at one of the most highly regarded virological laboratories in Europe. “I have spent many years and this institution has paid millions of dollars to insure that this research was carried out in the safest possible manner,” he told me when we met in a conference room in the grim research facility that houses his laboratories at the Erasmus Medical Center. The center devoted several years to constructing a special lab for Fouchier’s research. From the windows, one can see barges and hulking gray cranes; Rotterdam is Europe’s busiest port. It is an industrial cityscape whose bleakness, on the day I visited, seemed to match Fouchier’s mood. As he spoke, he stared at his hands, which he clenched nervously. “People are acting like I am some mad scientist,” he said.

Fouchier spent much of his career working on the structure of the AIDS virus. In 1997, he abruptly turned to bird flu, both because he was fascinated by its molecular structure and because he quickly grasped its pandemic potential. He has published scores of scientific articles on how influenza viruses move between species. Since December, however, when the Advisory Board recommended postponing publication of the bird-flu research, and some of his colleagues called for stopping it entirely, he has felt, he says, like the focus of “an international witch hunt.” He was incensed. “To attempt to prevent this research from reaching the largest number of scientists is bullshit,” he told me. “The more people who have access to it, the more likely we are to get answers to the many questions we still need to ask. Everyone who knows anything about virology can get hold of the recipe.” There were nearly a thousand people at the Malta meeting where he first announced his findings. “This moratorium serves some fake sense of security,” he said. “It does not serve the public health.”

Fouchier, as well as Kawaoka and other researchers, had been trying for years to learn whether H5N1 could trigger a worldwide pandemic. He wondered why the virus had destroyed so many poultry flocks in the United States, Europe, and Asia but had infected so few people. Fouchier hoped to characterize the properties that make the virus so much deadlier than others. The only way to answer these questions was to create a variant that would cling to human cells in the nose and throat. Fouchier’s research was hardly the work of a furtive renegade. Several international review committees oversaw his experiments, and he received funding from the National Institutes of Health. Despite the risks, most people in his field believed that the experiments were necessary. Moreover, they were not without precedent. In 2002, Eckard Wimmer, at Stony Brook University, stitched together hundreds of DNA fragments, mostly acquired via the Internet, and then used them to create a fully functional polio virus. In the fall of 2005, several published academic papers described the genomic sequence of the 1918 Spanish flu, which caused the world’s deadliest influenza pandemic. In each case, the publications were initially denounced but were eventually accepted as valuable.

“In this profession, you always do it wrong,” Ab Osterhaus, a leading infectious-disease expert who runs the virology department at Erasmus, said. “Either you give too much warning or not enough. Either you take things too seriously or not seriously enough. Fouchier’s work is essential, and the questions it raises must be addressed.”

There have been many hypotheses about how bird flu could become epidemic. Most researchers had believed that the avian virus would have to swap genes with a human influenza virus in pigs. Pigs usually serve as a mixing vessel for influenza viruses that make the transition from poultry to humans. (This is how the global pandemic starts in Steven Soderbergh’s recent film “Contagion”: Gwyneth Paltrow is exposed to a pig that’s been infected by a bat, and soon much of the world is dead or dying.) Other scientists believed that the H5 protein, because of its molecular structure, could not easily infect human cells. (Strains of influenza are named for two proteins on their surface that latch on to respiratory cells and make it possible for them to invade our lungs.) “There has been a lot of speculation that this virus cannot be transmitted easily or through the air,” Fouchier told me. “That speculation has been wrong.”

Although no animal study can predict with certainty what will happen in humans, ferrets get flu pretty much the way we do. Their lung physiology is similar to humans’, and avian-influenza viruses bind to the same receptors in their respiratory tracts. Still, there has been sharp debate among scientists about whether results in ferrets can predict how humans will react to similar infections, with some researchers discounting the data entirely.

“The mutations . . . could cause the viruses to be more transmissible between humans,” Peter Palese, a prominent microbiologist at Mount Sinai School of Medicine, wrote recently. “But this is simply unknowable from available data.” Palese argues that the virus may be better adapted to ferrets than to other mammals.

“You cannot say, ‘Just forget about it, because it happened in a ferret,’ ” Fouchier said. “This is our best model. But you also can’t say, ‘Because it happened in a ferret, it will happen in a human.’ So it becomes a question of whether it’s worth the risk of finding out. This is one of the most dangerous viruses you can imagine. It’s not my virus—it’s our virus. And it’s out there. We need to deal with that. And, if we focus on what matters, we can.”

Once you create a virus that could kill millions of people, what should you do with it? And how should you handle the knowledge that made it possible?

There have been angry calls for Fouchier’s virus to be destroyed, for it to be transferred to a military-level bioweapons facility, and for research to be stopped entirely. “It’s just a bad idea for scientists to turn a lethal virus into a lethal and highly contagious virus,” Dr. Thomas Inglesby, a bioterrorism expert and the director of the Center for Biosecurity, at the University of Pittsburgh Medical Center, said. “And it’s a second bad idea for them to publish how they did it, so others can copy it.”

Still, most scientists who work with viruses insist that the value of this research outweighs the risks. Anthony S. Fauci, the longtime director of the National Institute of Allergy and Infectious Diseases, told me, “Those data could help scientists determine rapidly whether existing vaccines or drugs are effective against such a virus, as well as help in the development of new medications. It’s hard to stop something if you don’t know what it’s made of. Naturally, if epidemiologists in countries where pandemics most often arise know what they are looking for, they will be able to move with greater urgency to contain the spread.”


How likely is it that publishing the genetic sequence could help a terrorist, a rogue, or a legitimate researcher who might develop a novel vaccine or drug? “Most of us are unequivocal about the value of the research,” Fauci said. “But deciding what to do with these types of studies is complicated. At the moment, there are no official governing bodies to regulate such decisions. They rely on the good will of researchers.” Fauci and others have noted that, precisely because flu is so hard to control, the virus would be difficult to use as a weapon.

In this case, as in most other cases, the work was supported heavily by the National Institutes of Health, and it seems unlikely to proceed without U.S. government support. Scientists bicker as vigorously as any other group, but rarely about the right to share and publish the data on which their research depends. Even the National Science Advisory Board for Biosecurity has made clear its general support for open investigation and full publication. The scientific method and the entire edifice of institutional research depend on such openness; without it, progress would slow dramatically. As biology has become more accessible, the balance between freedom and protection has become harder to maintain. This will certainly not be the last time that preventing the wide dissemination of information seems necessary. But who should make those decisions, and how? Scientists fear that any regulatory body will stifle research. In 1975, when biologists met at Asilomar, California, to discuss the potential hazards of the new field of recombinant DNA technology, the group drew up voluntary guidelines to govern their research. Those guidelines have worked well, and that meeting is often regarded as a model of coöperative regulation.

We live in a very different world now. Secretary of State Hillary Clinton recently gave a speech at a biological-weapons conference in Geneva in which she stressed that the threat of biological terror can no longer be ignored. “There are warning signs,” Clinton said, including “evidence in Afghanistan that . . . Al Qaeda in the Arabian Peninsula made a call to arms for—and I quote—‘brothers with degrees in microbiology or chemistry to develop a weapon of mass destruction.’ ”

While scientists disagree sharply about whether it would be easy to replicate such a virus in a laboratory, and whether it would be worth the effort, there is no question that we are moving toward a time when work like this, and even more complex biology, will be accessible to anyone with the will to use it, a few basic chemicals, and a relatively small amount of money.

Those realities have compelled many scientists to reconsider their unconditional support of the principle of open research. “I can tell you that when I began this journey I was certainly of the view that everything should be out and science should not be interfered with,” Arturo Casadevall, the chief of infectious disease at the Albert Einstein College of Medicine and a member of the Advisory Board, said at a recent forum on the issue sponsored by the New York Academy of Sciences. “And as the result of hundreds of hours of the deliberative process I changed my mind.” Others are even more emphatic, arguing that although the information is bound to become available, any delay is better than none. Many countries lack proper surveillance capacities, and existing vaccines are not good enough to stop influenza viruses from taking hold in the human population. By the time that public-health officials were fully aware of the swine-flu virus that originated in Mexico in 2009, for instance, it had spread across the globe.

In January, a few days before we met in Rotterdam, Fouchier had agreed to a sixty-day moratorium on the project, but only after he received a long, late-night phone call from Fauci, who convinced him that a worldwide time-out—the first since the beginning of the era of molecular biology—would allow people to cool off and enable them to explain the value of such research to the public. In mid-February, a committee of specialists, including Fouchier, met in Geneva at the W.H.O. headquarters and announced that the papers would eventually be published in full, but that a sixty-day moratorium was probably not long enough. It is not clear when or where the research will continue.

Attempts to control information or to prohibit research rarely succeed for long. As the physicist and synthetic biologist Rob Carlson has written, most notably in his 2010 book, “Biology Is Technology,” in the case of crystal methamphetamine, both prohibition and efforts by the federal government to shut down production labs have failed, and in similar ways. In each case, success in cracking down on small-time dealers led to failure on a larger scale. Carlson believes that cutting the flow of H5N1 data will have the same effect. “Any attempt to secure the data would have to start with an assessment of how widely it is already distributed,” he wrote recently on his blog, Synthesis. “I have yet to meet an academic who regularly encrypts e-mail, and my suspicion is that few avail themselves of the built-in encryption on their laptops.” Carlson noted that, in addition to university computers and e-mail servers in facilities where the science originated, the information is probably stored in the computers of reviewers, on servers at Nature and Science, at the Advisory Board, and, depending on how the papers were distributed and discussed by the board’s members, possibly on their various e-mail servers and individual computers as well. “And,” Carlson wrote, “let’s not forget the various unencrypted phones and tablets all of those reviewers now carry around.”

Carlson and others argue that restricting publication would retard the progress of the research without increasing safety. With influenza viruses, speed matters. Vaccine-production methods have not changed substantially in sixty years, and it was months before a useful vaccine was widely available for the H1N1 pandemic of 2009. That virus infected more than a billion people. Future bird-flu research could help scientists learn how it is transmitted through the air, why it makes the leap from animal to man, and how specifically it binds to human cell receptors. By placing the virus into tissue culture, scientists could discover more about how it destroys cells and make a better assessment of whether current vaccines would protect us—and, if they wouldn’t, the research could guide us toward making more effective vaccines. None of these experiments are without risk, but one must also consider the risk of not carrying them out.

“We can learn a great deal about transmission of influenza virus through the air from this work, and it’s something we know very little about,” Ab Osterhaus, the leader of the Erasmus team, said. “Nobody was going to make this virus in his garage. There are so many better ways to create terror. You have to compare the risk posed by nature with the theoretical risk that a human might use this virus for harm. I take the bioterror threat very seriously. But we have to address the problems logically. And nature is much more sophisticated than anyone in any lab. Nature is going to manufacture this virus or something like it. We know that. Bioterrorists might, but nature will. Look at the past century: the 1918 flu, H.I.V., Ebola, and H1N1. The Spanish flu took months. SARS maybe a couple of weeks. This is happening all the time, and we have ways to fight it. So where is the greatest risk? Is it in someone’s garage or in nature? Because you cannot prevent scientists from getting the information they need to address that risk. I understand politics and publicity. But I also understand that viruses do not care about any of that.” ♦