Containing the Risks of Bioengineered Super Viruses

The H5N1 strain of avian influenza, better known as bird flu, is a deadly virus that kills more than half of the people known to have contracted it.

Fortunately, it’s not easily spread from person to person, and is usually contracted through close contact with infected birds.

But scientists in the Netherlands have genetically engineered a far more contagious, airborne version of the virus that spread quickly among ferrets, the animals they use as an experimental model for how the disease might be transmitted among humans.

And researchers from the University of Wisconsin-Madison used samples from the corpses of birds frozen in the Arctic to recreate a version of the virus similar to the one that killed an estimated 40 million people in the 1918 flu pandemic.

It’s experiments like these that lead David Relman, a Stanford microbiologist and co-director of the Center for International Security and Cooperation, to say it’s time to create a better system of oversight for risky research before a man-made super virus escapes from a lab and causes the next global pandemic.

“The stakes are the health and welfare of much of the earth’s ecosystem,” said Relman.

“We need greater awareness of risk and a greater number of different kinds of tools for regulating the few experiments that are going to pose major risks to large populations of humans and animals and plants.”

Terrorists, rogue states or conventional military powers could also use the published results of experiments like these to create a deadly bioweapon.

“This is an issue of biosecurity, not just biosafety,” he said.

“It’s not simply the production of a new infectious agent, it’s the production of a blueprint for a new infectious agent that’s just as risky as the agent itself.”

H5N1 bird flu seen under an electron microscope. The virus is colored gold. Photo credit: CDC

Scientists who conduct this kind of research argue that their labs, which follow a set of safety procedures known as Biosafety Level 3, are highly secure, and that the chances of a genetically engineered virus escaping into the general population are almost zero.

But Relman cited a series of recent lapses at laboratories in the United States as evidence that accidents can and do happen.

“There have been a frightening number of accidents at the best laboratories in the United States with mishandling and escape of dangerous pathogens,” Relman said.

“There is no laboratory, there is no investigator, there is no system that is foolproof, and our best laboratories are not as safe as one would have thought.”

The Centers for Disease Control and Prevention (CDC) admitted last year that it had mishandled samples of Ebola during the recent outbreak, potentially exposing lab workers to the deadly disease.

In the same year, a CDC lab accidentally contaminated a mild strain of the bird flu virus with deadly H5N1 and mailed it to unsuspecting researchers.

And a 60-year-old vial of smallpox (the contagious virus that was effectively eradicated by a worldwide vaccination program) was discovered sitting in an unused storage room at a U.S. Food and Drug Administration lab.

Earlier this year, the U.S. Army accidentally shipped samples of live anthrax to hundreds of labs around the world.

Similar problems have been reported abroad. The United Kingdom alone has had more than 100 mishaps in its high-containment labs in recent years.

It’s difficult to judge the full scope of the problem, because many lab accidents go unreported.

Studying viruses in the lab does offer important potential benefits, such as the promise of universal vaccines and cheaper, more effective ways of developing new drugs and other defenses against naturally occurring diseases.

“It’s a very tricky balancing act,” Relman said.

“We don’t want to simply shut down the work or impede it unnecessarily.”

However, there are safer ways to conduct this research, such as using harmless “avirulent” versions of a virus that would not cause widespread death and illness if they infected the general public, Relman said.

Developing better tools for risk-benefit analysis, to identify and mitigate potential dangers in the early stages of research, would be another important step toward making biological experiments safer.

Closer cooperation among diverse stakeholders (including domain experts, government agencies, funding groups, governing organizations of scientists and the general public) is also needed in order to develop effective rules for oversight and regulation of dangerous experiments, both domestically and abroad.

“We believe that the solutions are going to have to involve a diverse group of actors that has not yet been brought together,” Relman said.

“We need new approaches for governance in the life sciences that allow for these kinds of considerations across the science community and the policy community.”

You can read more about Relman’s views on how to limit the risks of biological engineering in this article he wrote for Foreign Affairs with co-author Marc Lipsitch, director of Harvard’s Center for Communicable Disease Dynamics.