Censoring Science is Detrimental to Security

Restricting access to science and technology in the name of security is historically a losing proposition.  Censorship of information that is known to exist incentivizes innovation and rediscovery. 

As most readers of this blog know, there has been quite a furor over new results demonstrating mutations in H5N1 influenza strains that are both deadly and highly contagious in mammals.  Two groups, led by Ron Fouchier in the Netherlands and Yoshihiro Kawaoka at The University of Wisconsin, have submitted papers to Nature and Science describing the results.  The National Science Advisory Board for Biosecurity (NSABB) has requested that some details, such as sequence information, be omitted from publication.  According to Nature, both journals are "reserving judgement about whether to censor the papers until the US government provides details of how it will allow genuine researchers to obtain redacted information".

For those looking for more details about what happened, I suggest starting with Doreen Carvajal's interview with Fouchier in the New York Times, "Security in Flu Study Was Paramount, Scientist Says"; Katherine Harmon's firsthand account of what actually happened when the study was announced; and Heidi Ledford's post at Nature News about the NSABB's concerns.

If you want to go further, there is more good commentary, especially the conversation in the comments (including from a member of the NSABB), in "A bad day for science" by Vincent Racaniello.  See also Michael Eisen's post "Stop the presses! H5N1 Frankenflu is going to kill us all!", keeping in mind that Eisen used to work on the flu.

Writing at Foreign Policy, Laurie Garrett has done some nice reporting on these events in two posts, "The Bioterrorist Next Door" and "Flu Season".  She suggests that attempts to censor the results would be futile: "The genie is out of the bottle: Eager graduate students in virology departments from Boston to Bangkok have convened journal-review debates reckoning exactly how these viral Frankenstein efforts were carried out."

There is much I agree with in Ms. Garrett's posts.  However, I must object to her assertion that the work done by Fouchier and Kawaoka can be repeated easily using the tools of synthetic biology.  She writes, "The Fouchier episode laid bare the emptiness of biological-weapons prevention programs on the global, national, and local levels.  Along with several older studies that are now garnering fresh attention, it has revealed that the political world is completely unprepared for the synthetic-biology revolution."   I have already written a book that discusses this confusion (here is an excerpt about synthetic biology and the influenza virus), so it is not actually what I want to write about today.  But I have to get this issue out of the way first.

As far as I understand from the press accounts, both groups used various means to create mutations in the flu genome and then selected viruses with the properties they wanted to study.  To be clear: from what I have been able to glean from the sparse accounts thus far, DNA synthesis was not used in this work.  And as far as I understand from reading the literature and talking to people who build viruses for a living, it is still very hard to assemble a functioning, infectious influenza virus from scratch.

If it were easy to write pathogen genomes -- particularly flu genomes -- from scratch, we would quite frankly be in deep shit. But, for the time being, it is hard.  And that is important.  Labs that do use synthetic biology to build influenza viruses, such as those that reconstructed the 1918 H1N1 influenza virus, fail most of the time despite great skill and funding.  Synthesizing flu viruses is simply not a garage activity.  And with that, I'll move on.

Regardless of how the results might be reproduced, many have suggested that the particular experiments described by Fouchier and Kawaoka should not have been allowed.  Fouchier himself acknowledges that selecting for airborne viruses was not the wisest experiment he could have done; it was, he says, "really, really stupid".  But the work is done, and people know about it.  So the question of whether this work should have been done in the first place is beside the point.  If, as Michael Eisen suggests, "any decent molecular biologist" could repeat the work, then it was too late to censor the details as soon as the initial report came out.

I am more interested in the consequences of trying to contain the results while somehow allowing access to vetted individuals.  Containing the results is as much a matter of information security as of biological security.  Once such information is created, the challenge is to protect and secure it.  Unfortunately, the proposal to allow secure access only by particular individuals is at least a decade (if not three decades) out of date.

Any attempt to secure the data would have to start with an assessment of how widely it is already distributed.  I have yet to meet an academic who regularly encrypts email, and my suspicion is that few avail themselves of the built-in encryption on their laptops.  So, in addition to the university computers and email servers where the science originated, the information is sitting in the computers of reviewers, on servers at Nature and Science, at the NSABB, and, depending on how the papers were distributed and discussed by members of the NSABB, possibly on their various email servers and individual computers as well.  And let's not forget the various unencrypted phones and tablets all of those reviewers now carry around.

But never mind that for a moment.  Let's assume that all these repositories of the relevant data are actually secure.  The next step is to arrange access for selected researchers.  That access would inevitably be electronic, requiring secure networks, passwords, etc.  In just the last few days the news has brought word that the intelligence firm Stratfor and the computer security firm Symantec have evidently been hacked.  Such attacks are not uncommon.  Think back over the last couple of years: hacks at Google, various government agencies, universities.  Credit card numbers, identities, and supposedly secret DoD documents are all for sale on the web.  To that valuable information we can now add a certain list of influenza mutations.  If those mutations are truly a critical biosecurity risk -- as asserted publicly by various members of the NSABB -- then the data has value far beyond its utility in virology and vaccinology.

The behavior of various hackers (governments, individuals, and others) over the last few years makes clear that the discussion thus far has done nothing so much as stick a giant "HACK HERE" sign on the data.  Moreover, if Ms. Garrett is correct that students across the planet are busy reverse engineering the experiments because they don't have access to the original methods and data, then censorship is creating a perverse incentive for innovation.  Given today's widespread communication, restriction of access to data is an invitation, not a proscription.

This same fate awaits any concentration of valuable data.  It obviously isn't a problem limited to collections of sensitive genetic sequences or laboratory methods.  And there is certainly a case to be made for attempting to maintain confidential or secret caches of data, whether in the public or private interest.  In such instances, compartmentalization and encryption must be implemented at the earliest stages of communication in order to have any hope of maintaining security. 
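To make the point concrete, here is a minimal sketch of what encrypting a document at the earliest stage of communication might look like.  This is purely illustrative: it assumes Python's third-party cryptography package and hypothetical file names, and it is not a description of how any of the parties involved actually handled the data.

```python
# Illustrative sketch: symmetric encryption of a manuscript before it is
# ever attached to an email or uploaded to a review system.
# Assumes the third-party "cryptography" package (pip install cryptography).
from cryptography.fernet import Fernet

# Generate a key once; share it with reviewers out of band
# (in person, or over a separately secured channel).
key = Fernet.generate_key()
cipher = Fernet(key)

# Encrypt the manuscript (hypothetical file name) before it leaves the lab.
with open("manuscript.pdf", "rb") as f:
    ciphertext = cipher.encrypt(f.read())

with open("manuscript.pdf.enc", "wb") as f:
    f.write(ciphertext)

# A reviewer holding the key recovers the plaintext:
# plaintext = Fernet(key).decrypt(ciphertext)
```

The particular tool matters less than the stage at which it is applied: once plaintext copies are scattered across reviewers' laptops and journal servers, no amount of downstream access control can pull them back.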

However, in this case, if it is true that reverse engineering the results is straightforward, then restricting access serves only to slow down the general process of science.  Moreover, censorship will slow the development of countermeasures.  It is unlikely that any collection of scientists identified by the NSABB or the government will be sufficient to develop all the technology we need to respond to natural pathogens, let alone any artificial ones.

As with most other examples of prohibition, these restrictions are doomed before they are even implemented.  Censorship of information that is known to exist incentivizes innovation and rediscovery.  As I explored in my book, prohibition in the name of security is historically a losing proposition.  Moreover, science is inherently a networked human activity that is fundamentally incompatible with constraints on communication, particularly of results that are already disclosed.  Any endeavor that relies upon science, including the development of technologies to defend against natural and artificial pathogens, is therefore also fundamentally incompatible with constraints on communication.  Censorship threatens not just science but also our security.


2 Comments

Good thoughts all around. I agree not only that censorship of scientific knowledge is detrimental, but also that it is nearly impossible. By limiting access to data, to findings, to research itself, we limit the capacity of the scientific community to understand epidemics and thus to respond to, inhibit, or even prevent them.

Open science, and open biology in particular, is scary to people because the presumption is that an understanding of life and its many wondrous and intricate mechanisms will enable foul play on the scale of bioterrorism. What most don't realize, though, is that the quest for this knowledge will not and cannot be stopped, and that if it isn't allowed to take place in an open environment, it will continue in the shadows - precisely where the potential for hazardous or malicious outcomes is greatest.

Earlier this month I was speaking with Oliver Medvedik about this instance and we both wondered about a model (something quantifiable) to represent the risk/cost-to-benefit relationship linked to open vs. closed science. Dr. Medvedik and I were calling it "The Good Guy Model" - thinking that the impact of "good guys" with access to knowledge outweighs the risks associated with the "bad guys" who would also have access to it.
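For concreteness, here is one minimal sketch of what such a model might look like. Every number below is a hypothetical placeholder rather than an empirical estimate; the point is only to show the shape of the calculation, not to settle it.

```python
# Toy "Good Guy Model": expected net benefit of access to knowledge.
# All parameters are hypothetical placeholders, not empirical estimates.

def net_benefit(n_good, n_bad, p_good_use, p_bad_use, value, harm):
    """Expected benefit from good actors minus expected harm from bad actors."""
    return n_good * p_good_use * value - n_bad * p_bad_use * harm

# Open access: many researchers see the data; a handful of bad actors do too.
open_access = net_benefit(n_good=10_000, n_bad=10,
                          p_good_use=0.05, p_bad_use=0.01,
                          value=1.0, harm=100.0)

# Closed access: far fewer good guys get the data, while determined bad
# actors may still obtain it (leaks, rediscovery), so n_bad barely drops.
closed_access = net_benefit(n_good=100, n_bad=8,
                            p_good_use=0.05, p_bad_use=0.01,
                            value=1.0, harm=100.0)

print(f"open access:   {open_access:+.1f}")
print(f"closed access: {closed_access:+.1f}")
```

On placeholder numbers like these, openness wins simply because the population of good actors dwarfs the population of bad ones, and because restricting access shrinks the former far more than the latter. The hard part, of course, is defending any particular choice of parameters.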

There are precedents for this sort of thing in fields outside of biology, most notably nuclear technology. Nuclear technology is likewise double-edged, but the current development of biotechnology (which is making the tools and techniques for modifying organisms, for better or worse, less costly and more accessible) brings the power to tinker with organisms to a far broader base of people than will ever be able to tinker with uranium.

Do you know of a model that could help to settle the matter of proving the net benefit of open access to knowledge?


Alex,

I have been arguing for years that our only course is to embrace and engage the "Good Guys". There is no practical way to stop the "Bad Guys" when it comes to the use of new technologies, and history is full of examples in which attempting to prohibit "bad uses" only makes things worse.

Regarding your question about a model that might "settle the matter", I have two thoughts. The first is that, unfortunately, no model can "settle" it, because arguments about biosecurity are never entirely rational. Fear of Frankenstein and other monsters simply cannot be overcome by explaining that they are myths.

My second thought is that western culture itself is a demonstration that open access to knowledge is the way to go. There has been a constant struggle to control both the storage and transmission of knowledge (via printing and books, then radio, then the net), and control has generally lost out to freedom. Our society has its flaws, but innovation is generally faster, and the quality of life higher, here than in societies in which information is controlled.

The history of piracy gives an interesting perspective on the future of biotechnology. There has been an ongoing ideological and economic struggle for 500 years between those who want to control the presses and those who want information to flow freely. Neither side is likely to "win" anytime soon. And I have been convinced by the argument that piracy is better thought of as a pricing and access problem rather than a crime and law enforcement problem. We can already see this dynamic popping up in biotech, something I am writing a paper about now as a follow-up to my 2003 paper. Attempts to control access to technologies that have economic value simply incentivize piracy, work-arounds, and black markets that are by definition less secure than the world we live in now.

