New U.S. Rules for Dangerous Pathogen Research, Explained
When the next pandemic hits, we may be better prepared—in no small part thanks to a recent win for science policy.
On May 6, the White House Office of Science and Technology Policy announced new guidelines for research on dangerous pathogens and toxins, including microorganisms that can cause highly transmissible, high-mortality diseases such as H5N1 avian influenza, anthrax, and Ebola. The regulations become effective in May 2025.
The guidelines were more than four years in the making and replaced a policy patchwork cobbled together since the mid-2000s. They update and expand the list of high-risk human and animal viruses and bacteria subject to oversight and broaden the types of highest-risk experiments that require review.
And they update rules on so-called gain-of-function research—“research that seeks to alter the functional characteristics” of a pathogen, according to the Centers for Disease Control and Prevention. Revelations of gain-of-function research conducted on coronaviruses in Wuhan, China, not far from the first documented outbreaks of COVID-19, may have added an even greater sense of urgency to the policymaking task.
A careful weighing of risks.
The debate behind the new policy centered on a difficult risk trade-off. Research on pathogens is inherently risky: mishandled pathogens can escape the lab, for example, and researchers may unintentionally make a pathogen more lethal by modifying it. Yet if research is too tightly restricted, we risk being unprepared for the next pandemic.
Ideally, these risks should be balanced. But judging the balance is very hard, not just because the most dangerous pathogens are, well, dangerous, but also because of the very provisional nature of scientific inquiry. Not even the experts themselves—scientists and other researchers—know in advance what lines of scientific inquiry will be the most promising to follow.
That was the challenge for science policymakers. And they appear to have met it with the new guidelines, “The United States Government Policy for Oversight of Dual Use Research of Concern and Pathogens with Enhanced Pandemic Potential.” Dual-use research of concern goes by the field’s handy acronym, DURC, pronounced “dirk.”
Did they strike the right balance? “I don’t think there’s such a thing as right,” says Dr. Nicole Lurie, executive director of preparedness and response and U.S. director at the Coalition for Epidemic Preparedness Innovations. She also served an eight-year term as assistant secretary for preparedness and response at the Department of Health and Human Services, where she led national responses to public health emergencies, including the 2009 H1N1 influenza pandemic and other infectious disease outbreaks.
“I think they got it better,” she told The Dispatch. “And it’s critical that these policies won’t impede life-saving research.”
Dr. Larry Schlesinger, CEO of the Texas Biomedical Research Institute, agrees. The institute, an independent nonprofit for infectious disease research based in San Antonio, operates eight high-containment labs, including North America’s only privately owned and operated lab equipped with the highest level of biocontainment.
He believes the new policy turns a fragmented and less-than-clear set of guidelines into a cogent, streamlined process. “It will allow for more transparency, more clarity, and responsible practices,” he said. Nor does he see it as over-regulation, “which makes me happy.”
Research for good versus bioterror threats.
The dual-use research that the policy governs is any scientific research that can either benefit or threaten human society. State actors have conducted dual-use research since at least the beginning of the modern industrial era; the classic example is the atomic research that led to both nuclear medicine and nuclear bombs.
By the 1980s and 1990s, incidents such as the Rajneeshee bioterror attack in Oregon, where a cult contaminated salad bars with salmonella, and the sarin attacks in Japan made clear the other side of dual use: deadly agents could be used by terrorists if not more tightly regulated. Meanwhile, scientists were increasingly working on high-risk pathogen research, publishing papers that “could provide a ‘roadmap’ for terrorists seeking to weaponize biological agents,” according to a 2017 report by the National Academies of Sciences, Engineering, and Medicine.
The anthrax mailings across the U.S. in the weeks after the September 11 attacks led to the formation in 2005 of the National Science Advisory Board for Biosecurity (NSABB), a federal advisory committee within the National Institutes of Health.
Over the next 15 years, the NSABB issued a set of three separate guidelines that still govern dangerous pathogen research. But their shortcomings were widely recognized not long after their creation: the guidelines are not always clear, and they may not address all current and emerging research.
In January 2020 the NSABB began work on a comprehensive update, which was quickly sidetracked by—you guessed it—the coronavirus pandemic. The board resumed work in early 2022, but a set of recommendations issued in 2023 was viewed as too restrictive. According to Science, many researchers believed the proposed guidelines might sweep in studies of even low-risk pathogens, such as the common cold and herpesviruses.
A policy that may have gotten it right.
The new research policy simplifies and expands existing oversight by replacing the old roster of dangerous pathogens with two categories that cover a broader range of pathogens and the experiments involving them.
The first category adds more high-risk human pathogens, such as West Nile virus, to the regulated list, along with plant and animal pathogens that pose risks to crops and livestock. It also expands the number of experiment types that are regulated. To the relief of many researchers, low-risk pathogens are not included.
The second category addresses all gain-of-function (GOF) research on any pathogen that, when manipulated, could become a “pathogen with enhanced pandemic potential”—one that poses a significant threat to public health. For example, a lab that wants to better understand coronavirus transmissibility might introduce mutations into an actual coronavirus sample to see how the changes affect the virus’s replication and transmission. If the mutations make the sample more transmissible, that could constitute a gain of function.
The new policy also includes in this category any research on extinct or eradicated viruses, such as the 1918 flu strain. But GOF research aimed at developing countermeasures—for example, to test antiviral drugs—isn’t subject to regulatory review.
Reporting is straightforward. Researchers who determine that their project falls into either category must notify the federal agency funding the project, which then works with the researchers and their institutions to develop a risk mitigation plan for final agency approval. Approved projects continue to be assessed through completion.
Scientists who don’t abide by the rules could become ineligible for federal funds, and their institutions could be cut off from funding for any life science research. Although the policy applies only to federally funded research, whether conducted in the U.S. or abroad, the reach of federal funding is expected to influence the entire field of infectious disease research. And because the policy must be reviewed for updates at least every four years, and its implementation guide every two, it is also future-proofed.
Crucially, the new policy combines two potentially competing interests—open scientific inquiry and careful scientific oversight—in a workable arrangement that aims to provide the safest outcome for public health. Scientists and institutions keep their freedom to research, but they risk losing their funding or worse if they don’t use reasonable judgment in seeking government oversight.
Schlesinger thinks giving reporting responsibility to the research community in this way isn’t a potential loophole, but an appropriate way to manage risk. “I consider it actually a very careful level of regulation,” he said.
For Lurie, the vulnerability of the DURC policy, like that of any policy, is human behavior. “No matter what you build, some people will find ways around it,” she said. “This is about managing and mitigating risk.”
Although the NSABB did not call for the creation of an independent agency to oversee dangerous pathogen research, the new policy will place an administrative burden on research institutions. Texas Biomed has exceeded current standards for some time, said Schlesinger, and while he may have to add staff to comply with the new regulations, he doesn’t believe meeting the new requirements will be a problem. “We’re perfectly primed, and perhaps ahead,” he said. “I don’t believe the new guidelines are going to impede our science at all.”
The new policy lets him get on with what he views as work that is critical to national security, such as developing countermeasures for the next contagion that jumps from animals to humans. He would like to see a transformation in infectious disease research and development that uses the time between pandemics to develop vaccines and other solutions that can get us through the next one. “And there will be a next outbreak,” he said. “We’re not prepared if we don’t do the work before that.”