Redirecting Dual-Use Research Regulations

Working with pathogenic and toxic materials puts researchers, and their working environments, at considerable risk of accidental exposure. Beyond that immediate hazard, however, a pressing and still unresolved national policy issue has emerged among political decision makers and scientists: how to govern research whose products and findings could be misused. The risks posed by the potential misuse of life science research are expanding rapidly as new advances in biotechnology, genetics, and related sciences outpace the regulations governing that research. Although life science research is essential to continued international improvements in the health and safety of human life, farm animals, agricultural products, and the environment, the security challenges associated with such research are growing as well.

Throughout history, humans have exploited the dual-use nature of biological agents and toxins to cause harm. Aboriginal peoples in various parts of the world used amphibian-derived toxins in poison arrows; Hannibal struck fear into the enemies of Carthage with the release of live, venomous serpents; and the Mongols used plague-ridden corpses to kill thousands of enemy soldiers and private citizens alike. Unlike those predecessors, today’s scientists use their knowledge and skills primarily to save human lives, not destroy them. However, the fast-paced global revolution in the life sciences has made it extremely difficult, and sometimes impossible, to prevent the abuse of important research breakthroughs. A comprehensive and coordinated solution is needed to preserve the benefits of life science research while minimizing the risk that the knowledge, products, and technologies generated by that research will be used to threaten public health and safety.

Defining Dual-Use Biotechnology

In its 2004 report, “Biotechnology Research in an Age of Terrorism,” the U.S. National Research Council (NRC) pointed out that dual-use biotechnology “could be misapplied to cause substantial damage to human health, agriculture, the environment, the economy, or national security.” Today, of course, most technology and research has both civilian and military applications, which has led to a new acronym, DURC (dual-use research of concern), to describe research with potentially harmful applications. The processes and equipment involved in the development of biological weapons (BW), for example, are inherently dual-use. The materials, methods, and technologies used for growing, recovering, concentrating, and stabilizing biological agents are also used to produce vaccines, pharmaceuticals, and a broad spectrum of food products.

In response to concerns about the growing national security and public health implications of DURC projects, many members of the global scientific community have stressed important but intangible values such as “academic freedom,” the “open exchange of information,” and the rights and privileges of “self-governance.” Experience shows, however, that although some private and public institutions have the infrastructure and motivation to self-regulate, most have so far failed to comply with voluntary DURC guidelines. One prominent example was a 2008 evaluation of institutional oversight under the U.S. National Institutes of Health (NIH) “Guidelines for Research Involving Recombinant DNA (rDNA) Molecules,” which revealed, among other things: (a) a lack of transparency; (b) the avoidance of due-process rules and regulations; and (c) noncompliance at an overwhelming majority of high-containment (biosafety level 3 and 4) laboratories. The noncompliance problem will almost surely expand as the number of high-containment facilities grows to accommodate the expected worldwide increase in dual-use life science research.

Existing DURC Regulations

To address the dual-use dilemma as a whole, the U.S. government has issued a number of additional policy statements since the release of the NIH Guidelines on rDNA. The most prominent and overarching of those statements – both of which were issued in 2012 – are the United States Government Policy for Oversight of Life Sciences Dual Use Research of Concern (also known as the March 29 Policy), and the United States Government Policy for Institutional Oversight of Life Sciences Dual Use Research of Concern.

The March 29 Policy formalized a process of regular federal review of U.S. government-funded (and/or conducted) research on new treatments and diagnostics, improvements in public health and surveillance, and the enhancement of emergency preparedness and response efforts. If a project raises concern, researchers are now required to provide risk-mitigation plans. If a plan still is not adequate to justify the research, the March 29 Policy allows federal departments and agencies to pursue one of three options: “(a) request voluntary redaction of the research publications or communications; (b) identify the research … [or] (c) not provide or terminate research funding.” The U.S. government objective, of course, is to identify potential DURC problems at the project proposal stage and, more specifically, to control the dissemination of potentially harmful information before authorizing any research.

The March 29 Policy covers only research funded by the U.S. government; the follow-on policy mentioned above addresses institutional DURC without government affiliation. The second policy establishes uniform requirements governing the institutional oversight of certain research. Consistent with the March 29 Policy, the scope of oversight is limited to seven categories of experiments and 15 agents and toxins, as defined in the 2004 “Biotechnology Research in an Age of Terrorism” report. The same policy emphasizes that a DURC designation should not carry a negative connotation, but should indicate that the research may warrant additional oversight to mitigate intentional or unintentional risks to public safety. Both policies nonetheless put the researcher in a subordinate position, defending the potential value and security of research results before those results have even been achieved.

Advancing Science & DURC Oversight

The 2012 DURC policies are well intentioned but seem unlikely to deliver a substantially more secure environment for life science research. Operating under the assumption that the proliferation of dual-use research and technology poses a direct threat to U.S. national security, the regulations seek to prevent scientists from unintentionally transferring information or other material that could be dangerous in the hands of potential enemies. This approach begins to address the problem of technology and information transfer; however, the policies issued to date do not present a truly comprehensive solution to the dual-use dilemma.

Part of the problem is that the policies were developed reactively, in response to the avian influenza (H5N1) transmission studies conducted in 2011 by research scientists Yoshihiro Kawaoka and Ron Fouchier. Their controversial work identified mutations that allowed H5N1 to become transmissible between mammals and, in doing so, brought other biosecurity weaknesses to light. The controversy over publication of the study results revealed significant gaps in DURC oversight and the absence of guidance and clear decision-making authority regarding publication. Largely for that reason, the March 29 Policy focuses heavily on U.S. government evaluation and review of research and of the value of sharing its results. The policy seems to assume that, as the overseers of dual-use research, government reviewers can keep scientists, presumed naïve about the implications of their own research, from letting dangerous information fall into the wrong hands. This bureaucratic framing may explain the reluctance and widespread disapproval among members of the scientific community directly affected.

Instead of dictating the conduct of some of the country’s most capable innovators, DURC policies should place responsibility – and, therefore, the “ownership” of potential consequences – in their hands. Scientists have the expertise needed to create socially and medically beneficial technologies; conversely, they also have the expertise to turn otherwise benign technologies into threatening ones. Updated DURC policies should reflect the importance of scientists’ leadership and engagement in regulating their own work. Dual-use research is not inherently dangerous; rather, it is the convergence of technologies and innovation that determines whether results are harmful or beneficial. Under this definition, no matter how quickly technology advances, scientists are, and will remain, in control not only of research results but also of the many ways in which those results are applied.

The Practicality of DURC Regulations

As demonstrated by the H5N1 transmission studies, researchers sometimes exhibit overconfidence in assessing the risks posed by their work. In August 2013, Fouchier and Erasmus Medical Center (MC), his Rotterdam-based research institution, went to court to challenge the Netherlands government’s ruling that required him to obtain an export permit before submitting his research results for publication in the U.S.-based journal Science. Fouchier argued that his work constituted “basic research,” building on information already available in the public domain, and therefore should be exempt from the 2009 E.U. regulations under which the Netherlands had originally classified the papers as an export. In September 2013, the court upheld the government’s position, ruling that Fouchier’s study, which pursued airborne transmission of a deadly virus strain as a “practical goal,” went beyond the field of basic research.

Targeted international regulations might have better directed Fouchier’s research from the outset. Policy guidance should require researchers to pursue only socially or medically beneficial research objectives. Instead of creating a blueprint for a highly transmissible H5N1 virus, the guidance provided to Fouchier could have steered him toward a more thorough study. The goals of that study could have been: (a) protecting mammals from identified virus mutations with the potential to increase virulence or transmissibility; and/or (b) identifying the likelihood of naturally occurring mutations and their potential to affect mammals. By taking the research one major step further, the individual researcher would have achieved the same (and possibly additional) results, society might have benefitted from those results, and the publisher could have shared the information without creating a new and potentially very harmful risk to national security.

Of course, in dual-use research, unintended and unanticipated results are not uncommon. With regulations designed to put the onus on the researcher, he or she is responsible not only for the achievements of a DURC project but also for any harmful consequences. Comprehensive guidance and training on how to address risky or potentially dangerous results should spell out, in significant detail, the responsibilities of the researcher, including but not limited to: reporting requirements; mandatory security and review protocols; and the applicable publishing and dissemination restrictions.

Courtney Gavitt

Courtney Gavitt, MS, is an analyst at Gryphon Scientific where she focuses on chemical, biological, radiological, nuclear, and high-yield explosive (CBRNE) consequence management in support of the Federal Emergency Management Agency. As a Nonproliferation Graduate Fellow at the Department of Energy (DOE), National Nuclear Security Administration (NNSA), she contributed to U.S. interagency export control and interdiction efforts designed to curb proliferation of CBRNE weapons and dual-use materials. She served as part of the U.S. delegation to the Proliferation Security Initiative and supported DOE’s Australia Group representative. Before working at NNSA, she was a contractor at the U.S. Department of Homeland Security Customs & Border Protection. She holds an MS in biodefense from George Mason University.
