
Synthetic Biological Weapons May Be Coming. Here’s How To Fight Them.


Manahel Thabet

Just about anyone can be a biohacker now. Gene editing technology is cheaper and simpler than ever, and the Department of Defense (DoD) is paying close attention. After all, scientists have already used the tech to build a strain of horsepox, a virus not too genetically distant from smallpox, from scratch, largely to show that it could be done. It may only be a matter of time before the products of synthetic biology, such as weaponized pathogens, find their way into a military’s or terrorist’s arsenal.

In order to stay prepared, the DoD commissioned the National Academy of Sciences to produce a comprehensive report on the state of American biodefense. As the report suggests, it’s not great.

In the report, a team of scientists ranked the potential threats from gene editing and other bioengineering techniques, and examined what could be done to protect against them, taking into account how advances in bioengineering would make it more difficult to combat these threats. As MIT Technology Review reported, the DoD is most concerned that people might recreate known infectious viruses or enhance bacteria to become even more dangerous.

Many of the DoD’s official recommendations, such as investing in public health and advanced vaccines, run directly contrary to the priorities set by those currently running the federal government, which pours money into defense while slashing public health infrastructure. Investing in that infrastructure would do a whole lot more to improve national security.

The portion of the report dedicated to “mitigating concerns” that synthetic bioweapons may be used against people in or outside the United States focuses on three key areas: deterring attacks, recognizing them, and minimizing the damage done after the fact.

The solution that stood out the most is also perhaps the least concrete: invest in and develop a robust public health system that could identify, prevent or counter, and respond to an epidemic or the use of a bioweapon. Because, as The Atlantic demonstrated at length, America is absolutely not prepared for a major outbreak or epidemic of any sort; the country’s healthcare system can barely handle a rough flu season. Preparing treatments for a major viral or bacterial outbreak like the flu or Ebola relies on a large and fragile international supply chain, and hospitals are mostly independent entities that may balk at the cost of preparing for such an epidemic. In short, it is not a system built to absorb a bioweapon attack.

With a solid, well-funded public health system in place, the DoD and National Academy of Sciences researchers argue, the country would be more resilient to an attack, even if there’s no immediate cure for whatever bioweapon its citizens are struck with. And even though the U.S. leads the world in public health funding through the Centers for Disease Control and Prevention (CDC), the Atlantic article argues, funding and readiness initiatives still mostly react to health problems as they arise rather than preparing for them in advance. It’s hard to keep politicians interested in preparedness when there’s no immediate threat.

Even so, a better public health system could help catch such an attack before it spreads out of control. If a doctor notes a bizarre case or symptom, a strong national network could flag it, perhaps even identify a patient zero, and then help develop treatments more quickly while quarantining that patient so others don’t get sick.

This type of investment would improve many parts of American life, especially for the most vulnerable and underserved among us. If “national security” is the angle that makes it happen, then, well, fine.

In the report, the DoD also called for increased research into vaccines, perhaps including some that can prevent broad swaths of infections, and for improved programs to get them to people. No one, the report argues, would attack a population that’s already impervious to the weapon. Once more, what should be an obvious solution seems groundbreaking in a country where the dangerous, repeatedly debunked rhetoric of the anti-vaxxer movement is going strong and has the president’s support.

The report also listed some areas of concern made more pressing by cheap, democratized gene editing and biohacking tools. For one, DNA synthesis companies don’t screen small DNA sequences, those under 200 base pairs long, for potential misuse. Normally, these companies check ordered sequences against those known to be derived from pathogens and against the Federal Select Agent Program’s Select Agents and Toxins List maintained by the CDC. But short sequences can be purchased unscreened by researchers or hobbyist gene editors, and a buyer who was so inclined could, in principle, stitch them together to build a pathogen from scratch or make an existing one more dangerous.
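To make that gap concrete, here is a deliberately simplified sketch, in Python, of the kind of order-screening rule described above. The 200 base pair threshold comes from the report, but the `FLAGGED_SEQUENCES` set and the `screen_order` function are hypothetical stand-ins; real synthesis companies use curated pathogen databases and similarity search rather than exact substring checks, so treat this as an illustration of the logic, not a description of any vendor’s actual pipeline.

```python
# Illustrative sketch only: a toy version of the order-screening logic the
# report describes. The flagged-sequence set and the matching rule are
# simplified assumptions, not a real vendor's screening process.

SCREENING_THRESHOLD_BP = 200          # orders shorter than this are not screened
FLAGGED_SEQUENCES = {                  # stand-in for pathogen-derived reference sequences
    "ATGGCGTACGTTAGCCTGAAC",           # hypothetical flagged fragment
    "TTGACCGGATCCAGTCAAGGT",
}

def screen_order(sequence: str) -> str:
    """Return a screening decision for a single synthesis order."""
    seq = sequence.upper()
    if len(seq) < SCREENING_THRESHOLD_BP:
        # This is the gap the report highlights: short orders skip review entirely.
        return "not screened (below length threshold)"
    for flagged in FLAGGED_SEQUENCES:
        if flagged in seq:
            return "flagged for manual review"
    return "cleared"

if __name__ == "__main__":
    print(screen_order("ATG" * 10))    # 30 bp order: slips through unscreened
    print(screen_order("ATG" * 100))   # 300 bp order, no flagged match: cleared
```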

In the past, these short strands haven’t been screened because there was no need: anyone trying to create a dangerous virus or bacterium would need a huge number of them, which would, presumably, tip someone off. Still, a determined buyer could feasibly gather them without raising any alarm, as long as they never purchase anything longer than 200 base pairs at a time and no one looks too closely.

The report suggests creating a separate screening process for short strands of DNA, one that analyzes the likelihood that a particular genetic sequence could be used in a weapon rather than catching well-meaning researchers in the crossfire. No such algorithm exists yet, but the authors suggest that a well-trained machine learning system would be well suited to the task.
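The report doesn’t specify an algorithm, but to give a sense of the general shape such a system might take, here is a minimal, hedged sketch in Python using scikit-learn. The training sequences and labels are made-up placeholders, the 4-mer count features are a simplistic choice, and the `concern_score` function is purely hypothetical; a real system would need expert-curated data and far more careful validation, so this is a shape-of-the-idea demo, not a biosecurity tool.

```python
# A minimal sketch of the kind of machine-learning screen the report envisions:
# score a short DNA fragment for follow-up review instead of ignoring it.
# Training data and labels below are arbitrary placeholders.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Placeholder training set: sequences labelled 1 ("of concern") or 0 ("benign").
train_sequences = [
    "ATGGCGTACGTTAGCCTGAACGGA",   # hypothetical examples only
    "TTGACCGGATCCAGTCAAGGTCCA",
    "GCGCGCGCGCGCATATATATATAT",
    "AATTCCGGAATTCCGGAATTCCGG",
]
train_labels = [1, 1, 0, 0]

# 4-mer count features feeding a simple logistic-regression scorer.
model = make_pipeline(
    CountVectorizer(analyzer="char", ngram_range=(4, 4)),
    LogisticRegression(),
)
model.fit(train_sequences, train_labels)

def concern_score(fragment: str) -> float:
    """Probability-style score used to decide whether a human should take a look."""
    return float(model.predict_proba([fragment.upper()])[0, 1])

print(concern_score("ATGGCGTACGTTAGCC"))   # higher score -> route to manual review
```

Even at toy scale, the design point carries over: the output is a score that routes an order to human review, not an automatic block, which is what keeps well-meaning researchers out of the crossfire.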

In fact, much of the existing deterrence against synthetic bioweapons rests on such good-faith efforts: the researchers cite a long list of agreements among biologists, geneticists, and others who hold the tools to create these weapons, in which they pledge not to do so. If you’re looking to prevent an epidemic of our own making, relying on social norms seems weak; with data and privacy scandals coming out of Silicon Valley just about every day, one might begin to wonder whether “oh, those guys know what they’re doing” is a strong enough stance.

But the report cites decades of agreements and accords among researchers designed to prevent anyone who might want to synthesize a bioweapon from doing so. And, so far, it’s worked.

The problem, the report notes, comes from the recent and massive diversification of the genetics community. No longer are gene editing tools restricted to reputable scientists working in prestigious government or university labs; D.I.Y. CRISPR-Cas9 kits and other means of genetic manipulation are available to hobbyists, independent biotech companies, and citizen scientists. Proliferating the means of scientific research and experimentation could lead to amazing breakthroughs (in April, Gizmodo noted that a lab had developed a CRISPR tool that could enable highly specific cancer diagnoses), but it could also lead to catastrophic accidents or the deliberate weaponization of infectious agents by people who don’t adhere to long-standing (and perhaps outdated) codes of conduct.

As a result, the DoD calls for regulation and monitoring of synthetic biology research, whether it happens in a prestigious university or a hobbyist’s garage. But it specifically calls for such regulations to be written in a way that will “safeguard science.” Any new rules would need to allow scientists, and hobbyists, to conduct their research; if everyone has to seek approval for the studies they want to conduct and the genetic manipulations they want to attempt, bad actors can be weeded out. The key is to make sure any such regulations are motivated by safety, not the political leanings of whoever is in charge, now or in the future.

It’s not yet clear what effect this report will actually have. It’s not the kind of thing that goes to a vote or requires legislators to spring into action; it merely recommends strategies to address specific points that worry biologists and security experts. But at least it’s an indication that the DoD has begun to think about how to move science forward in a safe and responsible way, without hampering research in the process.

Source: Futurism
