Any time you have explosive growth, you run the risk of losing some control over the quality of your work.
That’s the problem underlying the debate over regulating high-containment laboratories, such as the Galveston National Laboratory, where scientists study deadly pathogens.
In 2004, the United States had 400 such labs, the U.S. Government Accountability Office reported. It now has 1,500.
The growth has been so fast that officials with the Government Accountability Office recently testified they were not exactly sure how many such labs there are.
At some point, there has to be a tipping point, where the risks of having so many labs from which something deadly could escape outweigh the benefits of studying those pathogens with a view toward reducing the threat.
With that kind of growth, you’d have to wonder whether that tipping point has been reached.
You’d also have to wonder who’s minding the store, given there are so many stores.
At a recent congressional hearing, a senior government official described the system of oversight as fragmented and largely self-policing.
The U.S. Centers for Disease Control and Prevention reported that the number of accidents in secure labs increased from 16 in 2004 to 269 in 2010.
The national debate about whether more regulation is needed was prompted by reports of problems at laboratories in other states.
The argument that those problems were human errors, and thus could not have been prevented by stricter regulations, just isn’t persuasive.
People who operate in positions of enormous responsibility and trust — military folks who handle nuclear weapons, for example — operate under severe regulation and discipline.
Where the consequences of a mistake are terrible, you generally find systems with strict rules and taut discipline, enforced by stiff consequences for failure.
You don’t hear much talk about self-regulation in those environments.
The case for sterner regulation is even stronger at laboratories where so-called “gain of function” research is done. That’s research in which scientists try to manipulate a pathogen to see whether it’s possible to make it even deadlier or more contagious.
Researchers at the University of Wisconsin, for example, genetically modified a virus to create something similar to the influenza pathogen that killed tens of millions of people in 1918.
Institutions that do that research ought to be willing to accept extraordinary regulation. They also ought to be extraordinarily open about what they’re doing.