Monday, August 5, 2013

Censorship in science won't go viral

The BBC radio program, Inside Science, recently considered the issue of research into how viruses become transmissible between humans and whether it should be allowed.  What enables viruses to, well, go viral--to be able to spread rapidly among the population with devastating consequences?  We think of war as devastating but until recently more, or far more, combatants died from sickness than from battle injuries.  The 1918 flu pandemic is iconic for this, but millions have died in miserable obscurity in army camps or just at home, over the years.

From time to time viral diseases erupt, often from animal reservoirs, to be transmissible to, and most dangerously among, humans.  As we try to combat pathogens, we face both antibiotic resistance in bacteria and the demography of human population, crowding, and travel that contributes to viral epidemics.

Many of the worst viruses primarily live in nonhuman animal hosts.  There, they can transfer from individual to individual and may not even be very virulent.  But if particular mutations arise that allow them not only to infect a human host but also to be transmitted between humans, then we have an epidemic.

While some viral genes have been identified in strains that lead to human epidemics, no gene has been identified that enables human-to-human transfer.  Preventing that transfer could stop an epidemic in its tracks, and so could be very important to global health.  So about a year ago, investigators in the Netherlands announced research findings identifying some genetic basis for such transfer.

This raised a furor.  The idea was to use genomic techniques to introduce mutations and screen for ones that enabled a flu virus to be transmitted from human to human (well, experimentally it was among ferrets, which are a good model for human flu transmission).  That this work was even being done terrified many, because enabling a virus to become transmissible in that way could, if the virus were to escape the lab, conceivably cause a global pandemic.  Indeed, terrorists could use such a thing to do great harm.

A yearlong moratorium against publishing or even doing this kind of research was suggested and at least partly followed.  But the moratorium has ended, with little resolution about what kind of work should be done, or how, where, or whether it should be done at all.  Scientists doing the work defend it, of course, even while acknowledging some slight risk that the engineered virus could escape.  But the justification is that the risk is small, that viral pandemics are inevitably in our future just because of mutation and selection in nature, and that this research could teach us enough to somehow anticipate or prevent such outbreaks.

Should we worry that a terrorist might learn the techniques from that public-spirited scientific literature and create devastation in his basement lab? One of the guests on the show said that the risk is so slight that it shouldn't be a consideration.  That isn't even the right question, I don't believe.  It isn't wild-catting individuals, but nation states, or groups funded by them, who would be most likely to engineer viral mayhem for political or military purposes.

The Snowden factor
Given the Snowden-like leaks of secrets, or of the very existence of secrets, in recent years, this isn't far-fetched.  Many people would be involved in viral engineering experiments, from students to faculty to who knows who else, and many more scientists with government funds read Nature and could learn what you need to start up a lab of this type.  Does anyone think it all that unlikely that some disgruntled worker, or one whose politics differ from the host lab director's, would take the secrets to the highest bidder, if other countries didn't get them first?  The risks should not be minimized.  And, of course, if one thinks of the Snowden factor, which hostile governments (who can also read Nature) might already be up to this kind of thing, for military, geopolitical, or even defensive purposes?

The Pandora's box argument
A common rationale scientists give for not having their hands tied is that science is about nature, and if we don't investigate some subject somebody else will, so let us just carry on.  It's government's job to prevent abuse, isn't it?  This is the selfish argument, whether or not there is anything malicious behind it in an individual case.  But society does have the right, and the precedent, to keep Pandora's box closed.  To some, if highly inadequate, extent, our institutional review boards (IRBs) have to approve all university and government research before it can be done.  And while torture is in fact visited on countless laboratory animals with the IRBs' blessing, they do at least prohibit some things.  You can't put toxin in dining hall food to see how many people get sick, not even if you were to get students' permission first.  Science is, in some minimal sense at least, already censored.  We are not morally required to hold that, because the world is out there, anything some scientist wants to know about it is fair game.

Yet the problems are real, and maybe science can help
Greedy and self-interested as we scientists are, it is only proper to recognize that many of us are not in it just for the money.  We have a sometimes compulsive and relentless desire to understand and ameliorate the problems of the world.  Viral epidemics that could be lethal to millions or more certainly qualify.  Scientists aren't just venal for grants; we need them to pay for our staff and equipment and so on.  We are imperfect and follow fads, but our methods are not just faddish: they often do work in the ways we hope.  We have enough understanding of genes, pathogen-host evolution, and epidemiology to grasp problems like the ones we're discussing, and often we can, indeed, do something about them.

So if we don't let scientists do this work, we forgo what they might achieve.  We should not stifle their work for trivial reasons.
