Citation

Rocco Casagrande. "Federal Funding for Biosafety Research Is Critically Needed." CSIS Commission on Strengthening America's Health Security, Center for Strategic and International Studies, August 06, 2019. Accessed December 21, 2023. https://healthsecurity.csis.org/articles/federal-funding-for-biosafety-research-is-critically-needed/

Our rapidly expanding ability to understand and manipulate life is outpacing biosafety, the practice of preventing accidental exposure of people, animals or the environment to dangerous microbes. This brief proposes targeted research to address this health security threat.

Photo Credit: PAVEL GOLOVKIN/POOL/AFP via Getty Images

This photo, taken on April 16, 2013, shows a doctor at the Beijing Center of Disease Control working in their laboratory in Beijing. Source: STR/AFP/Getty Images

The Issue

The advent of powerful new tools in biotechnology promises to open a new era in the battle against infectious disease. This research will undoubtedly lead to better capability to predict and prevent global outbreaks, and will support the development of new vaccines and treatments to reduce the burden of outbreaks we cannot prevent. The benefits afforded by these powerful tools are not without attendant risks.

The threat of biological laboratory accidents is not commonly understood to be a serious health security concern similar in significance to the threats of emerging infectious diseases and biological attacks. However, scientists are just now creating viruses that exceed the transmissibility or pathogenicity of naturally occurring strains. Also contributing to the rising risk of accidents is the entry into the life sciences of scientists from other fields and hobbyists, who may be accustomed to weaker accountability measures, have less training in and weaker knowledge of safety practices and the consequences of accidents, and yet are drawn to the tools of biology because of the expansive power they afford.

Unlike accidents in transportation, chemical production, or even nuclear power, biosafety accidents can result in the unforeseen and uncontrolled infection of lab personnel or their local communities, the release of a pathogen into the environment, or even the initiation of a global pandemic that could reach millions. There is abundant concern among health security experts and U.S. policymakers over the threat of pathogens being deliberately released into communities by malevolent actors or in terrorist attacks. While these concerns are valid, we are not paying enough attention to the costly and real risks of biological laboratory accidents that threaten similarly dangerous outcomes.

The United States is underinvesting in the science of biosafety. The resulting lack of data and active research in the field leaves the biosafety community with little understanding of how accidents are likely to occur and impedes the identification of cost-effective measures to prevent accidents. This gap in knowledge could be addressed with a modest budget of $10 million per year given to the National Institute for Occupational Safety and Health (NIOSH) to fund a dedicated U.S. biosafety research program. The funding is modest because the first step is to develop a research community that focuses on these issues; currently, the lack of funding has prevented focused attention on research in biosafety. With adequate funding, this research will lead to the development of cost-effective training programs to reduce human error in the laboratory, the redesign of risky experiments to prevent outbreaks before they occur, and the identification of cost-effective investments in laboratory safety equipment, which could make the conduct of life sciences research more efficient.

Overview

Currently, we lack the evidence base to take new, needed measures to prevent accidents in biological laboratories. As mankind continues to expand its capabilities to manipulate life (including the viruses and bacteria that cause disease), this gap leaves us more vulnerable to the accidental initiation of disease outbreaks with potentially dangerous consequences locally, regionally, and beyond. New biotechnologies are enabling scientists to design or modify life in ways not previously possible. These biotechnologies enable professional and amateur researchers to use simple life forms (e.g., bacteria and yeast) to create simple sensors and produce industrial chemicals, materials, and pharmaceuticals cheaply and from commonplace reagents. The manipulation of pathogens (the microbes that cause disease) fosters a better understanding of how these agents evolve and interact with the body, enabling the development of next-generation cures.

Despite the significant U.S. and global investment in biotechnology, concern has been voiced by scientists, policy experts, and members of the community that scientists may be ill-equipped to handle novel, manipulated microbes safely, potentially resulting in accidental infection of themselves or their local communities, accidental release into the environment, or even the initiation of a global pandemic. For example, in 2016, a laboratory worker was infected while working with a strain of HIV at a low safety level because the strain was supposedly crippled. The worker became infected because two changes had been made to the strain: one by the worker herself and another when the virus serendipitously picked up genetic material in the laboratory. In another example, a laboratory accident at the Beijing Institute of Virology in 2004 led to a SARS outbreak with one death, roughly a dozen serious illnesses, and hundreds quarantined. Had the outbreak not been recognized early, travelers from the region could have spread it worldwide, as happened with the SARS outbreak of 2003. These lapses illustrate that our lack of knowledge impedes our ability to effectively contain the worst diseases that nature can create. Worse still, our ability to understand and manipulate life is outpacing our understanding and knowledge about biosafety, the practice of preventing accidental exposure of people, animals, or the environment to dangerous microbes.

Since the turn of the twenty-first century, scientists have published several discoveries that, while pushing the boundaries of knowledge, have also sent shudders through the global biosafety community. In the early 2000s, researchers in Australia added a component of the immune system to a virus related to the smallpox virus, finding that the altered virus was deadlier and able to overcome protective vaccination. Over the past decade, researchers in China, the United States, and the Netherlands manipulated strains of influenza, creating strains that were more deadly, more transmissible, or, in one case, both. These experiments are particularly worrisome because influenza pandemics, which occur about every twenty years, arise when similar modifications evolve in the virus, and these outbreaks can kill tens of millions of people worldwide.

In October 2014, the U.S. government recognized the potential implications of an accident in laboratories manipulating the most dangerous pathogens and imposed the first-ever moratorium on life-science research funding in the United States based upon safety concerns. Soon after, Gryphon Scientific was asked to assess the risks and benefits of the research so that the government could determine whether the work was too risky to fund. While we performed the most extensive study to date on accidents in the laboratory and came to conclusions that were acted upon by the government, we found a shocking lack of data in a field where an accident could have unprecedented global reach and severity.

We found that we lack the fundamental research that illuminates how and why accidents happen in a biological laboratory and the evidence necessary to create evidence-based mitigation measures to prevent them.[^1] Data on how frequently various mishaps occur and on the effectiveness of systems to prevent or mitigate accidents are missing in the life sciences. The life sciences stand alone in this lack of safety data, whereas safety has been the subject of decades of investigation in other endeavors that can cause lesser (albeit more frequent) loss of life, like transportation, nuclear power, and the chemical industry.

This lack of data has persisted for decades, but the consequences of the data gap are today more perilous than before. We are now witnessing a revolution in the life sciences arising from (a) the creation of tools that provide unprecedented power to manipulate life, (b) the unprecedented collection and analysis of knowledge of how these changes lead to desired traits in manipulated organisms, (c) the entry of researchers from other disciplines into biology, (d) the increase of commercial activity in biology due to its applications in various industrial and medical fields, and (e) the entry into the life sciences by hobbyists. The recent advent of synthetic biology (which is the application of engineering principles to the life sciences) has vastly expanded the type and extent of alterations that can be made to microbes and reduced the technical barriers to executing these changes. This additional power in the hands of traditional scientists has been used to make vast collections of modified pathogens to understand how their properties (infectiousness, pathogenicity, and transmissibility) change. Some of these modified pathogens have properties that exceed those of their natural counterparts and, as illustrated above, have led to the creation of disease-causing strains that nature itself has not yet devised. The government has recently begun to evaluate these experiments to determine if their biosafety risks outweigh their potential benefit to science and medicine, and many in the scientific community are uncomfortable that the framework for this evaluation is not public. Worse still, the lack of evidence underpinning studies in biosafety means that these decisions are being made without a solid evidence base for judging what would make these experiments pose an unacceptable risk or what changes could make them less risky.

Reduced technical barriers have enabled researchers who are not formally trained in the life sciences (or in biological safety norms) to manipulate microbes. While some of these researchers are not trained scientists at all, others are scientists and engineers from other disciplines who are now drawn to biology because microbes can be manipulated to produce materials, chemicals, and structures at a size scale, production volume, or cost point that is difficult to match via traditional approaches in chemistry, materials science, or engineering. This power has drawn many well-financed researchers, who lack formal training in biosafety or even standard biological techniques, into the life sciences to engineer microbes with the hope of revolutionizing many industries.

Although the data needed to inform biosafety are lacking, little research is being conducted because there is nearly no federal funding for empirical biosafety research and no government agency has the mission to fund the generation of data on the effectiveness of current or proposed biosafety practices, to support research to understand how and why accidents happen in a biological laboratory, or to compile and disseminate best practices for biosafety. The little research that is ongoing is often funded by the private sector to promote the effectiveness of its own products.[^2] In our conversations with biosafety personnel, nearly all argue, some stridently, that more funding for research is critically needed. Without funding for this research (and an agency champion), a vicious cycle hinders improvements in biosafety: little research occurs, little data are generated, and few researchers are drawn to investigate these critical questions.

This critical need has finally been recognized at the highest levels of government. The call for applied biosafety research was recently featured in the president's National Biodefense Strategy, which listed the need to "strengthen biosafety and biosecurity practices and oversight to mitigate risks of bioincidents" as a key objective, reinforcing that the need has yet to be addressed. Perhaps the clearest call for more funding was featured in the HHS assistant secretary for preparedness and response's National Health Security Strategy, which included the need for "Strengthening biosafety and biosecurity by improving programs…conducting basic and applied biosafety research and encouraging pre-incident response planning." This recent statement was presaged more than a decade before. In 2009, the Trans-Federal Task Force on Optimizing Biosafety and Biocontainment Oversight recommended obtaining and analyzing data on biosafety incidents, sharing lessons learned with the scientific community, and supporting research for applied biosafety and biocontainment. Despite these pleas for funding for applied biosafety, research into the effectiveness of biosafety measures and the compilation and sharing of best practices have not materialized in any comprehensive fashion.

Benefits of Biosafety Research

Unlike fundamental biological research, where often decades pass before the benefits to health or medicine arising from the discoveries are realized, discoveries resulting from biosafety research could be applied immediately. The knowledge generated could lead to a change in practices for the prevention of laboratory accidents (as the community generates real data on which practices are safe and which are risky—and under what conditions). These practices could involve a subtle change to a protocol, the movement of equipment within a laboratory, or a redesign of a risky experimental approach.

For example, several mishaps have occurred because researchers shared microbes that they thought were inactivated but actually were infectious. Within the laboratory, improper inactivation increases the risk that workers infect themselves after a cleanup or when reacting to a spill. Studies that determine the minimum conditions for inactivating infectious microbes and define minimal protocols to validate that samples were inactivated could immediately reduce the chances that infections occur within the laboratory or that infectious materials are accidentally shipped.
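As a rough illustration of the kind of quantitative guidance such studies could yield, the sketch below applies a simple log-linear (D-value) inactivation model to estimate how long a treatment must run to reach a validation target. It is only a sketch: the D-value, starting titer, and target are hypothetical placeholders, and real protocols would be grounded in measured kill curves for each agent, treatment, and sample matrix.

```python
import math

def surviving_titer(initial_titer: float, minutes: float, d_value: float) -> float:
    """Titer remaining after treatment, assuming log-linear kill:
    each D-value's worth of exposure cuts the population tenfold."""
    return initial_titer * 10 ** (-minutes / d_value)

def required_minutes(initial_titer: float, target_titer: float, d_value: float) -> float:
    """Treatment time needed to drive the titer down to the validation target."""
    log_reduction = math.log10(initial_titer / target_titer)
    return log_reduction * d_value

# Hypothetical numbers: a 1e8 PFU/mL sample, a D-value of 1.5 minutes for the
# chosen treatment, and a validation target of no more than 1 PFU/mL.
print(required_minutes(1e8, 1.0, d_value=1.5))          # 12.0 minutes (an 8-log reduction)
print(surviving_titer(1e8, minutes=12.0, d_value=1.5))  # 1.0 PFU/mL remaining
```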

In clinical settings, accidental infections often occur when protective gear (gloves, masks, coats, etc.) is removed. It is suspected—but unproven—that similar mishaps are a main driver of laboratory accidents. Studies that determine how frequently laboratory workers contaminate their hands when taking off gloves (or breach their gloves during experiments) can help to determine if wearing two overlapping pairs of gloves would meaningfully reduce infections. Similar studies could determine when respiratory protection should be worn and what type of protection is needed under what conditions. If the protective gear works well but is compromised when careless or ignorant workers remove the gear, additional investments in training workers on how to wear and remove the gear would be warranted.
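A study of this kind would come down to comparing contamination rates across conditions. The sketch below shows one minimal way such a comparison might be analyzed, using a two-proportion z-test; the counts are invented placeholders, and an actual study would use real observations and a properly powered, pre-registered design.

```python
from math import sqrt, erf

def two_proportion_z(x1: int, n1: int, x2: int, n2: int) -> tuple[float, float]:
    """Two-sided z-test for a difference in contamination proportions."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # normal approximation
    return z, p_value

# Placeholder counts: 18 of 120 doffing trials resulted in hand contamination with
# single gloves, versus 7 of 120 with double gloves.
z, p = two_proportion_z(18, 120, 7, 120)
print(f"z = {z:.2f}, p = {p:.3f}")
```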

A scientist examining cells in a 96-well plate. Source: Dan Kitwood/Getty Images/Cancer Research UK

We lack the basic data on how workers in a biology laboratory are exposed to infectious material through spills, splashes, and contamination. A lab simulator in which workers are observed manipulating small volumes of fluid and running mock assays could be used to compile data on how often spills and splashes occur, how contamination gets onto the body of a worker, and how it may leave a laboratory. These data would also help inform estimates of how often accidents could be expected to occur, which informs other key studies.
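For instance, observation sessions in such a simulator could be reduced to incident rates with uncertainty bounds, which would then feed the risk estimates described above. The sketch below, with invented totals, shows one simple way to summarize the observations; a real analysis would stratify by task, protocol, and experience level.

```python
from math import sqrt

def rate_per_1000_hours(events: int, hours_observed: float) -> tuple[float, float, float]:
    """Point estimate and approximate 95% interval for an incident rate,
    treating the event count as Poisson and using a normal approximation."""
    rate = events / hours_observed
    half_width = 1.96 * sqrt(events) / hours_observed
    return 1000 * rate, 1000 * max(0.0, rate - half_width), 1000 * (rate + half_width)

# Placeholder totals: 23 spills or splashes observed over 1,800 hours of mock assays.
estimate, low, high = rate_per_1000_hours(23, 1800.0)
print(f"{estimate:.1f} incidents per 1,000 hours (95% interval {low:.1f}-{high:.1f})")
```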

Data are also needed to improve human practices in the laboratory. Unlike in other industries, where mechanical failures alone can cause catastrophe, most accidents in the biological laboratory are initiated by a worker (infectious material is spilled, for example) or are exacerbated because workers respond inappropriately (violate quarantine or contaminate themselves during cleanup). Also unlike in other industries, the most dangerous incidents involve pathogens that can spread from person to person, so the behavior of those who may be initially exposed is critical in preventing both a tragedy (one that infects a handful of laboratory workers) and a global catastrophe (one that infects millions of people worldwide). Research can identify the types of mistakes most commonly made by workers to help determine how training could immediately reduce the chance that accidents occur and minimize consequences should an accident happen. Best practices that are identified, compiled, and shared would enable all labs in the United States and the developed world to implement measures with demonstrated value.

Variability in the reliability of human workers can be immediately leveraged to improve biosafety. Perhaps the largest single improvement in laboratory safety could be obtained by determining whether the Pareto Principle applies to biological laboratories. The Pareto Principle states that 80 percent of effects are generated by 20 percent of causes or, in this context, that 80 percent of the consequences of accidents in the lab are generated by 20 percent of the workers. This principle holds across many human endeavors and, given that many of those working in a biological laboratory know of colleagues who are sloppy, it likely applies here as well. If it is proven that 20 percent of scientists are responsible for 80 percent of accidents, identifying these workers and retraining them, or barring them from working on the most dangerous pathogens, would immediately reduce the risk of a serious accident by 80 percent. Research would not only determine whether the principle applies to the biology laboratory but would also identify indicators that could help flag those individuals and support the development of systems to continually monitor for problematic workers in the laboratory.
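Checking whether the principle holds would be straightforward once per-worker incident data exist. The sketch below, using invented counts, shows the basic calculation: what share of all recorded mishaps is attributable to the top 20 percent of workers.

```python
def top_share(incidents_per_worker: list[int], top_fraction: float = 0.2) -> float:
    """Fraction of all incidents attributable to the top `top_fraction` of workers."""
    counts = sorted(incidents_per_worker, reverse=True)
    k = max(1, round(top_fraction * len(counts)))
    return sum(counts[:k]) / sum(counts)

# Placeholder data: recorded mishaps for 20 workers over an observation period.
sample = [9, 7, 6, 5, 2, 2, 1, 1, 1, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0]
print(f"Top 20% of workers account for {top_share(sample):.0%} of incidents")  # 75%
```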

Just as it can inform human practices, biosafety research can be immediately used to inform investments. Currently, European laboratories conducting research on transmissible viruses use different containment equipment than those in the United States. No data exist to show whether this difference results in a measurable improvement in safety or whether limited resources are being wasted on unnecessary (and costly) equipment. Biosafety data could inform cost-benefit studies to determine exactly how laboratories that study the world's most dangerous pathogens should be equipped without unnecessarily diverting funding that could be invested in the research itself.
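A heavily simplified sketch of such a cost-benefit comparison follows. It assumes biosafety research has produced estimates of annual accident probability with and without a given piece of containment equipment; every number is a made-up placeholder, and a real analysis would also weigh accident consequences, maintenance costs, and effects on research throughput.

```python
def cost_per_accident_averted(annualized_equipment_cost: float,
                              baseline_accident_prob: float,
                              equipped_accident_prob: float) -> float:
    """Annualized cost divided by the expected number of accidents averted per year."""
    averted = baseline_accident_prob - equipped_accident_prob
    if averted <= 0:
        return float("inf")  # the equipment buys no measurable risk reduction
    return annualized_equipment_cost / averted

# Placeholder comparison of two containment options for the same laboratory.
print(cost_per_accident_averted(250_000, 0.010, 0.002))   # option A
print(cost_per_accident_averted(900_000, 0.010, 0.0015))  # option B: far costlier per accident averted
```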

Because biosafety data are lacking, we do not know how often accidents may occur or what their consequences are likely to be, and key decisions cannot be made. In fact, this work would inform the deliberations currently happening in the federal government regarding whether experiments with modified pathogens are too risky to fund and, if funded, which changes should be recommended to reduce risks. Moreover, biosafety research could help inform where new laboratories of various types should be built.

In addition to the immediate benefits discussed above, funding biosafety research is sustainable over the long term because it helps build a culture of biosafety and fosters the natural integration of safe practices into the laboratory. Currently, researchers often view biosafety as merely a necessary precondition for conducting the experiments upon which their contribution to the field and their advancement within their institution are judged. Thought leadership in biosafety is often driven by dedicated biosafety professionals who sit outside the laboratories conducting the research. However, if biosafety research is funded, scientific publications (the currency of academic science and the primary means of career advancement) would be generated from this research, and researchers themselves may come to see thought leadership in biosafety and the advancement of their careers as aligned.

Addressing the Gap

To support the research needed to revolutionize biosafety, we propose that NIOSH be allocated funding dedicated to the empirical study of safety in biological laboratories. The funding needed to make a significant change in biosafety is not large. The empirical research needed involves the observation of workers in real and mock laboratories using equipment already in place in training laboratories. This research would be well funded with just $10 million a year. That amount is less than the year-on-year increase in the budget of the National Institute of Allergy and Infectious Diseases (NIAID, the part of the NIH that funds most research on pathogens), or just shy of 0.2 percent of NIAID's $5.3 billion budget, and it would be enough to fund several innovative efforts each year. When considering the millions of dollars in pathogen research conducted at the USDA, Department of Defense, and Centers for Disease Control and Prevention, this ask seems even more modest. The initial funding level is modest because the present lack of funding has prevented the development of a community of researchers focusing on this problem. As funding draws researchers into the field, the funding level may need to increase, but so will the magnitude of the benefit.

The proposed biosafety budget of $10 million is just shy of 0.2 percent of NIAID's $5.3 billion budget. Source: CSIS

Primarily, this funding is critical to ensure our enormous investment in research to understand disease isn’t forever undermined by a single, tragic accident. Additionally, this research would help increase cost efficiency in the laboratory by identifying the equipment and supplies truly necessary to increase safety as well as those that are expensive but not beneficial. Similarly, the evaluation of human practices will identify the training protocols that have the most value in reducing the risk of accidents, ensuring that the time of researchers nationwide is well spent.

This modest funding level is but a fraction of the current investments made by the federal government in other industries that are associated with a risk of loss of human life (albeit on a more frequent, less consequential basis than laboratory accidents). More than $150 million is spent annually on transportation safety research by the National Highway Traffic Safety Administration. The budgets of the Chemical Safety Board, the National Transportation Safety Board, and the Nuclear Regulatory Commission (NRC) exceed $1 billion annually, although much of this money pays for regulatory measures and incident investigation. The largest player in this field, the NRC, is funded by the federal government and by fees that generators pay (90 percent of the NRC's budget comes from user fees).[^3] Although the biotech industry is large and economically vibrant in the United States, much of the research involving the manipulation of pathogens occurs at nonprofit research institutions and is funded by the federal government. For this reason, a fee applied to biological researchers would simply take a little money out of one government-funded pot and distribute it to another, creating an unnecessary bureaucracy to collect the payments. Instead, biosafety research funding should resemble that of the National Highway Traffic Safety Administration or the Chemical Safety Board, which are directly funded by the government.

The growth of nuclear power in association with the birth of the atomic bomb clearly helped fuel unjustified fears of nuclear power and the public outcry to ensure that the plants were as safe and secure as possible. Nuclear power safety research began under the Atomic Energy Commission (the forerunner to the NRC). In conducting its risk assessments for the siting of new power plants, the commission recognized that commercial generating plants were likely to suffer accidents caused by loss of coolant, unlike the primary causes of accidents in research reactors. Recognition of this new accident pathway led to research on the causes of, and means to mitigate, a loss of coolant, which is perhaps a major reason why the United States has never suffered a major nuclear power accident. In this way, the birth of nuclear power safety research mirrors today's conditions in the life sciences. As a community, we are just now coming to appreciate that the science has advanced to a stage where researchers can create pathogens that surpass the threat posed by natural strains, and the power of biology is drawing in new researchers from other fields while simultaneously leading to the integration of biological systems into research and development programs in new fields. Each of these recent changes can create new risks that have not been properly assessed. However, the potential consequences of a biological accident are difficult to grasp. The public does not have a good handle on the economic, human, and societal costs of a pandemic and, because a biological accident of this scale may be less likely than a major chemical accident, the comparable risk posed by catastrophic chemical, nuclear, and biological accidents may be difficult to appreciate.

The combination of regulatory authority, accident investigations, and research budgets in these other fields necessitated the creation of novel bureaucracies to manage the research. However, the management for a mission focused solely on biosafety research could easily be housed inside an existing research institution as a dedicated biosafety research program. The National Institute for Occupational Safety and Health (NIOSH) is a rational entity to manage this funding of biosafety research because its mission is “To develop new knowledge in the field of occupational safety and health and to transfer that knowledge into practice.” Given that almost all harm generated by accidents in a biology laboratory has been suffered by workers, the mission of NIOSH is clearly aligned with improving biosafety through research. Moreover, the Public Health Security and Bioterrorism Preparedness and Response Act of 2002 discusses laboratory safety as an occupational health issue [29 USC §669a]. With a budget of more than $300 million, NIOSH could easily absorb a biosafety research program of this size.

Moreover, this effort would not be impeded by an inability or unwillingness to share data. Much of the needed data would be developed de novo in controlled settings (such as training laboratories), to produce critical information on rates of mishaps and consequences. For this reason, this effort can still reach all these objectives even if institutions are not willing to share sensitive data on accidents and near misses that happened in the past. Several national and international bodies exist (such as the Association for Biosafety and Biosecurity) to facilitate the sharing and dissemination of this information, and several journals have a mission to publish these data (such as the journal Applied Biosafety).

The recent increase in power to manipulate life, combined with an influx of new researchers to the life sciences, demands that biosafety be modernized. A cornerstone of this modernization is the generation of data, without which biosafety decisions cannot be made wisely. This research would provide the data to identify which aspects of worker training need the most emphasis, help inform investments in the outfitting of laboratories, and immediately and cost-effectively improve safety in the laboratory. Federal funding is needed to support the research to generate these data and a modest investment (approximately $10 million a year) overseen by NIOSH would soon translate to real improvements in safety in the laboratory and may prevent a global pandemic. If no action is taken, a fatal laboratory accident may undermine the billions of dollars of annual investments in the life sciences and reverse decades of biomedical research.

Rocco Casagrande is a biochemist by training who has focused on minimizing the risks of advancing biotechnologies while capturing the benefits these technologies offer. He led government-funded efforts to examine risks stemming from research on contagious animal pathogens on the U.S. mainland, the creation of modified contagious human viruses, and the misuse of synthetic biology. He is a founder and managing director of Gryphon Scientific.

This brief is a product of the CSIS Commission on Strengthening America’s Health Security, generously supported by the Bill & Melinda Gates Foundation.

[^1]: Ryan Ritterson and Rocco Casagrande, "Basic Scholarship in Biosafety Is Critically Needed to Reduce Risk of Laboratory Accidents," mSphere, March 29, 2017, https://msphere.asm.org/content/2/2/e00010-17.

[^2]: Lorna K. P. Suen et al., “Self-contamination during doffing of personal protective equipment by healthcare workers to prevent Ebola transmission,” Antimicrobial Resistance & Infection Control, 2018.

[^3]: United States Nuclear Regulatory Commission, Congressional Budget Justification, Fiscal Year 2019 (Washington, DC: U.S. NRC, 2019), https://www.nrc.gov/docs/ML1802/ML18023B460.pdf.
