A Russian lab containing smallpox and Ebola exploded
Russia’s State Research Center of Virology and Biotechnology, known as the Vector Institute, in the city of Koltsovo in Siberia, has one of the largest collections of dangerous pathogens anywhere in the world. During the Cold War, the lab developed biological weapons and defenses against them, and it reportedly stored dangerous strains of smallpox, anthrax, and Ebola, among other pathogens.
So lots of people were concerned when an explosion ripped through the facility on Monday.
According to independent Russian media, the laboratory was undergoing repairs when a gas cylinder exploded, sparking a 30-square-meter fire that left one worker severely burned. The blast reportedly blew out glass throughout the building, and the fire spread through the building’s ventilation system.
The lab is one of only two in the world known to still have samples of smallpox, which was eradicated from the wild in 1977. The other is in the United States.
Experts say that under certain circumstances, an explosion could lead to the release of deadly pathogens. “Part of the wave of the force of the explosion would carry it away from the site where it was first stored,” Joseph Kam, an associate professor at the Stanley Ho Centre for Emerging Infectious Diseases at the Chinese University of Hong Kong, told CNN.
That said, storage procedures for deadly pathogens like smallpox are extremely strict. The city’s mayor has stated that there is no threat to the general population, and a spokesperson for the center has said that no hazardous pathogens were stored in the room where the blast occurred. (Of course, Russian public reports on safety incidents are not always accurate.)
Will dangerous diseases escape the lab and infect the general population? Almost certainly not; the vast majority of lab accidents, even serious lab accidents, don’t sicken anyone, and none yet has sparked a pandemic in humans.
But that doesn’t mean such accidents shouldn’t give us pause. Outright explosions are relatively rare, but accidents that could release dangerous pathogens are shockingly common — and not just in Russia, but in the United States and Europe as well. From accidental smallpox and anthrax exposures to the mistaken shipment of deadly flu strains, slip-ups with some of the world’s most dangerous substances occur hundreds of times every year.
What should we do about that? The answer certainly isn’t that we should cut back on virology and pathogen research — research that has saved countless lives. It’s by studying the Ebola virus, for example, that researchers were able to develop the current cocktail of Ebola treatments, which may turn the disease from a death sentence into a mild, treatable illness.
But our track record of disasters such as what just happened in Russia suggests that certain kinds of research — into making pathogens deadlier, say — might not be worth the risks. As long as viruses keep escaping the lab — in freak accidents, fires, explosions, equipment malfunctions, and human mistakes — we run a risk of catastrophe. And we could reduce that risk without significantly impeding critical science.
Deadly accidents
In 1977, the last case of smallpox was diagnosed in the wild. That moment came at the end of a decades-long campaign to eradicate smallpox — a deadly infectious disease that killed about 30 percent of those who contracted it — from the face of the earth. By some estimates, 300 million to 500 million people died of smallpox in the century before it was eradicated.
But in 1978, the disease cropped back up — in Birmingham, in the United Kingdom. Janet Parker was a medical photographer at the University of Birmingham Medical School. When she developed a horrifying rash, doctors initially brushed it off as chickenpox. But Parker got worse and was admitted to the hospital, where testing determined that she had smallpox. She died of it a few weeks later.
How did she get a disease that was supposed to have been eradicated?
It turned out that the building that Parker worked in also contained a research laboratory, one of a handful where smallpox was studied by scientists who were trying to contribute to the eradication effort. Somehow, smallpox escaped the lab to infect an employee elsewhere in the building. Through sheer luck and a rapid response from health authorities, including a quarantine of more than 300 people, the deadly error didn’t turn into an outright pandemic.
Could something like that happen today?
All over the world, bio research labs handle deadly pathogens, some with the potential to cause a pandemic. Sometimes, researchers make pathogens even deadlier in the course of their research (as Science Magazine reported this spring, the US government recently approved two such experiments after years of keeping them on hold).
In 2004, the same Russian virology lab that just suffered an explosion was the site of another incident: a scientist died after accidentally infecting herself with Ebola. Several weeks passed before Russia acknowledged the event had occurred.
Research into viruses can help us develop cures and understand disease progression. We can’t do without this research. And there are lots of safety precautions in place to ensure that it doesn’t endanger the public. But as a long series of incidents stretching from 1978 all the way to Monday’s explosion in Russia shows, containment has sometimes gone dangerously wrong.
How pathogens can find their way out of the lab
The US government controls research into “select agents and toxins” that pose a serious threat to human health, from bubonic plague to anthrax. There are 66 select agents and toxins regulated under the program and nearly 300 labs approved to work with them.
Researching pathogens and toxins allows us to develop vaccines, diagnostic tests, and treatments. New biology techniques also allow for more controversial forms of research, including making diseases more transmissible or more deadly in order to anticipate how they might mutate in the wild.
So this research can be really important, and a critical part of public health efforts. Unfortunately, the facilities that do such work can also be plagued by a serious problem: human error.
The 1978 smallpox death was, most analyses found, caused by carelessness — poor lab safety procedures and badly designed ventilation. Most people would like to think that we’re not so careless today. But scary accidents — caused by human error, software failures, maintenance problems, and combinations of all of the above — are hardly a thing of the past, as the incident in Russia shows.
In 2014, as the Food and Drug Administration (FDA) cleaned up ahead of a planned move to a new office, staff found hundreds of unclaimed vials of virus samples in a cardboard box in the corner of a cold storage room. Six of them, it turned out, were vials of smallpox. No one had been keeping track of them; no one knew they were there. They may have been there since the 1960s.
Panicked scientists put the materials in a box, sealed it with clear packaging tape, and carried it to a supervisor’s office. (This is not approved handling of dangerous biological materials.) It was later found that the integrity of one vial was compromised — luckily, not one containing a deadly virus.
The 1978 and 2014 incidents, like the disaster in Russia, grabbed attention because they involved smallpox, but incidents of unintended exposure to controlled biological agents are actually quite common. Hundreds of incidents occur every year, though not all involve potentially pandemic pathogens.
In 2014, a researcher accidentally contaminated a vial of a fairly harmless bird flu with a far deadlier strain. The deadlier bird flu was then shipped across the country to a lab that didn’t have authorization to handle such a dangerous virus, where it was used for research on chickens.
The mistake was discovered only when the Centers for Disease Control and Prevention (CDC) conducted an extensive investigation in the aftermath of a different mistake — the potential exposure of 75 federal employees to live anthrax, after a lab that was supposed to inactivate the anthrax samples accidentally prepared activated ones.
The CDC’s Select Agents and Toxins program requires that “theft, loss, release causing an occupational exposure, or release outside of primary biocontainment barriers” of agents on its watchlist be immediately reported. Between 2005 and 2012, the agency got 1,059 release reports — an average of an incident every few days.
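A quick back-of-the-envelope check bears out that pace, if we treat 2005 through 2012 as roughly an eight-year window (an assumption, since the reporting period isn’t broken out by exact dates):

$$\frac{1{,}059 \text{ reports}}{8 \text{ years} \times 365 \text{ days/year}} \approx 0.36 \text{ reports per day} \approx \text{one report every } 2.8 \text{ days}$$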
Now, the vast majority of these mistakes never infect anyone. And while 1,059 is an eye-popping number, it actually reflects a fairly low accident rate — working in a lab that handles controlled biological agents is safe compared to many occupations, like trucking or fishing.
But a trucking or fishing accident will, at worst, kill a few dozen people, while a pandemic pathogen accident could potentially kill a few million. Considering the stakes and worst-case scenarios involved, it’s hard to look at those numbers and conclude that our precautions against disaster are sufficient.
The challenges of safe handling of pathogens
Why is it so hard to run these labs without such errors?
A look at the CDC’s records of Select Agent containment failures helps answer that question. Errors come from many directions. With worrying frequency, people handle live viruses thinking they’ve been given deactivated ones.
Technology that’s a critical part of the containment process can fail unexpectedly. It’s not that any single piece of technology is the problem — it’s that so many pieces are involved in containment, and each of them carries some small risk of failing.
These problems don’t just occur in the US. In the United Kingdom, a recent investigation found:
more than 40 mishaps at specialist laboratories between June 2015 and July 2017, amounting to one every two to three weeks. Beyond the breaches that spread infections were blunders that led to dengue virus — which kills 20,000 people worldwide each year — being posted by mistake; staff handling potentially lethal bacteria and fungi with inadequate protection; and one occasion where students at the University of the West of England unwittingly studied live meningitis-causing germs which they thought had been killed by heat treatment.
Severe acute respiratory syndrome, or SARS, broke out in 2003. It hasn’t recurred in the wild since, but there have been six separate incidents of the virus escaping a lab: once in Singapore, once in Taiwan, and four times at one lab in Beijing.
“These narratives of escaped pathogens have common themes,” argued an analysis of containment failures by medical historian Martin Furmanski in the Bulletin of the Atomic Scientists. “There are unrecognized technical flaws in standard biocontainment, as demonstrated in the UK smallpox [case]. ... The first infection, or index case, happens in a person not working directly with the pathogen that infects him or her, as in the smallpox and SARS escapes. Poor training of personnel and slack oversight of laboratory procedures negate policy efforts by national and international bodies to achieve biosecurity, as shown in the SARS and smallpox escapes.”
It’s easy to see why these problems are hard to address. Adding more rules for those handling pathogens won’t help if the people who become infected are usually not the ones handling the pathogens. Adding more federal and international regulations won’t help if the regulations aren’t consistently followed. And if there are still unrecognized technical flaws in the standards for containment, how would we know until an incident made those flaws apparent?
This is a worry that’s recently back in the news because the US government has approved research aimed at making certain deadly influenza viruses more transmissible — that is, easier for them to spread from person to person. The researchers involved want to learn more about transmissibility and virulence, in order to better equip us to combat these diseases. The labs conducting such research have taken unusual steps to ensure their safety and to reduce the risk of an outbreak.
But have they reduced it enough? “We imagine that when there’s an accident, it’s because a ventilation system fails or someone just forgets to do something, or that it’s sort of avoidable mechanical or human error,” Marc Lipsitch, a professor of epidemiology at Harvard, told me.
Yet many of the recent failures don’t fit that pattern. “Rather, it was people doing something that they thought was the right thing and was neutralizing a dangerous pathogen by killing it, and in fact they still had some dangerous pathogen or contamination with a dangerous pathogen,” he said. “My concern is not really that one of these people will do something that’s foolish or reflects poor training. My concern is that there’ll be human error of the kind that’s not really avoidable.”
Lipsitch does not think we should tighten standards for most research. He argues that our current approach, while its error rate will never be zero, strikes a good balance between safety and the demands of science and global health, at least for most of the pathogens biologists study. But for the most dangerous pathogens, the ones with the potential to spark a global pandemic, he says that calculus doesn’t hold.
So far, too much biosecurity policy has been reactive — tightening standards after something goes wrong. Given how badly things can go wrong, that’s not good enough. It’ll be exceptionally challenging to make our labs safer, but when it comes to the riskiest pathogens, we simply have to be up to the challenge.