Monday, July 7, 2014

IRBs: Insider control can't do what's expected. Part II: Loss of control going viral

The virus that might roar
A story published in The Independent last week reported the controversial work of virologist Dr Yoshihiro Kawaoka at the University of Wisconsin-Madison.  Kawaoka was in the news several years ago for manipulating the H5N1 bird flu virus to see what it would take for it to spread among mammals; his newly reported work involves altering the H1N1 strain that was pandemic in 2009, killing hundreds of thousands of people, so that it can evade the immune defenses much of the world has since developed (the earlier controversy was covered at the time by ScienceInsider).  That H5N1 work was the subject of intense debate and scrutiny, and a moratorium was imposed while it underwent review.  The moratorium was lifted last year, and the work was eventually cleared for publication.

According to a recent piece in the Wisconsin State Journal, during the moratorium Kawaoka began the same kind of work with the virus that killed so many people globally in 1918.  The results of that project were recently published in Cell Host & Microbe.  Kawaoka's goal is to understand the kinds of genetic changes that would let these viruses circumvent human immunity and become even more infectious or more lethal.  The rationale, according to The Independent, is that this knowledge would help in developing vaccines should such genetic changes ever arise in the wild.

The problem, as many see it, is that there is no guarantee that these virulent strains won't escape from the lab and do great harm.  While Kawaoka says this won't happen, other lethal experimental organisms have escaped, and for new technologies like this such risk is always a concern.  It is a concern for old technologies, too: the debate about whether to keep smallpox virus in labs has been going on for decades.  Kawaoka's work was approved by the university's institutional review board, although, according to The Independent, at least one member of the board was not willing to approve his current project.

Does the fact that Kawaoka is a star on the faculty of the University of Wisconsin, where he has been treated extremely well, influence the IRB?  He is well-known in the field of influenza research and has been involved in much recent work on emerging viruses.  No doubt his track record should count when his work is evaluated, but it's also possible, as always when power is at issue, that Kawaoka's proposals have an easier time passing review than, say, a new researcher's would.

But what is the IRB's role here?  Is it the board's job to decide what kind of risk society should be subjected to when academics do their work?  Or is that the job of an inter-institutional or governmental agency, such as the U.S. National Science Advisory Board for Biosecurity (NSABB), which reviewed, and approved, Kawaoka's earlier work?

The risk of an inadvertent epidemic or even pandemic from this research may be very small, but the consequences would be so huge that the risk-benefit balance has to be questioned.  The importance of the discovery, should the research succeed, could be very great as well.  So there is no easy answer.

But that the University of Wisconsin allowed one of its most favored, well-funded faculty members to develop a modified pathogenic virus to which humans would no longer have resistance sounds like something out of Dr Strangelove.  How often is this sort of thing being done in a university near you, with or without its noble IRB being aware of it?

As we noted above, the previous work that got Kawaoka and Dutch investigators into hot water involved tinkering with the H5N1 flu virus to see what genetic changes it would take for the virus to spread among mammals.  Their approach was essentially to test genomic modifications of the virus in ferrets, whose response to flu infection is in many ways similar to ours.  The work was allowed to proceed after review, but how can anyone guarantee that an accident won't happen?  We don't know the conditions under which the moratorium was lifted, but no matter how extensive the review, or how cautious the scientists promise to be, no one can be absolutely certain that an accidental release of these viruses won't occur.  It reminds me of the time a little boy was getting on his bike to ride down the hill in front of our house.  His father reminded him to put on his helmet before he went, in case he fell off the bike.  "But Dad," he protested, "I'm not going to fall off!"

Similar concerns about recombinant DNA were raised a generation ago; over time adequate protections were worked out, and no disaster has occurred that we know of.  But recombinant DNA doesn't pose the kinds of dangers that virulent viruses do.  And we have seen with other things, like stem cells, that scientists will do their best to find ways to do what they want to do.  Scientists and the private sector are both eager to find new cures and also, one must acknowledge, to capture the major profits to be made.  The stem cell issue is more complicated because the objections were largely religious.  Scientists may sneer at such things as ignorance standing in the way of progress, but religious people are citizens and taxpayers, and if they are the majority and aren't in favor of such a project, then in a democracy perhaps their view should rule, whether frustrated scientists like it or not.

And there are other issues.  If a rock-star scientist threatens to leave and work elsewhere, an institution that treats faculty members like celebrities--particularly those who bring in big grant money--has an incentive to compromise its standards.

And should a properly independent system, with no vested interests, be allowed, or instructed, to impose research bans of some number of years, appropriate to the offense, on investigators shown to have misled the IRB, or to have done things that were never approved, or were even explicitly disapproved?

It is, as in most similar kinds of situations, difficult to see how policy should be formed and implemented.  After all, even amoral scientists are still scientists and citizens, and if they think something should be done, they have their votes, too.  And a major public good may often entail risks as well.

IRBs were started in the wake of abuses by Nazi and other scientists, including the most respected pillars of their societies, and including abuses in our own country, as we mentioned last week.  That history showed that scientists can't automatically be trusted not to do harm, intentionally or even inadvertently.  But many of us feel that the tenor of the committees has drifted from that proper gate-keeping job to a primary function of protecting the institution against lawsuits, part of a general trend in universities that is stifling in many ways, as well as costly in time and resources.

Our mistaken mixing of messages
Making decisions is not easy, but there should be a balance of power.  However, in Thursday's post on IRBs, we mixed two aspects of bioethics.  One was the treatment of research subjects, human or otherwise.  The other was priorities for spending society's resources (both are involved in our discussion here as well).  The issues overlap somewhat, but we probably should have kept them separate.  IRBs are not mandated to deal with research priorities or broader societal concerns, though they do have to judge whether a project violates such concerns, and whether doing some procedure on mice or other animals is warranted for the stated purpose of a project.

Peer review and the policies of funders are what deal with research priorities.  My view, as stated in Part I and elsewhere, is that our priorities too often depend on vested interests.  That is because agencies like NIH ask scientists what the next research priority should be.  Indeed, as I have seen directly several times, a body like the National Academy of Sciences, entrusted with advising the government, can be paid by an NIH agency to hold a meeting about priorities, attended by the agency's funded clients and its administrators.  This is, essentially, insider trading, and the NAS should not accept such contracts.  Still, how to set priorities is not an easy thing to decide: asking scientists their view is begging for self-interest to be at play, yet scientists know better than the public what the issues are.

In this sense, humans and animals are subjected to research conditions that are allowed because of the social politics of the funding and academic-career apparatus.  Are we out of proper alignment with what most would agree are appropriate societal priorities?  The payoff in actual public or scientific good is often, I think, far below what is promised.  This is of course a value judgment, but so are all IRB decisions and policies.

In any case, the Wisconsin issue that triggered these comments is more closely related to IRBs and their degree of real control over research ethics than to whether funds should be spent on this type of project rather than some other.  Here, in fact, the story as written suggests a serious breach of what IRBs should rightly be policing.  One can argue that the knowledge being sought would properly have very high societal priority (because it deals with dangerous infectious disease), but that's a separate question.

More generally, the funding-priority issue may often be even more important than the safer, local IRB protections.  Billions of dollars go to feed the established research system, making it self-aggrandizing and far less innovative than it might be if funding commitments, mega long-term projects, and the like were not so entrenched.  Instead of spending mega-bucks on more Big Data surveys, we might focus funding on problems well-posed enough to be soluble.  This is again a societal issue about how resources are used, or captured, which does, of course, go beyond the local IRB concerns we mainly intended to comment on.

So, while the ethical issues are not entirely separate, it confuses things to mix them as I did in our previous post.
