Restricting access to science and technology in the name of security is historically a losing proposition. Censorship of information that is known to exist incentivizes innovation and rediscovery.
As most readers of this blog know, there has been quite a furor over new results demonstrating mutations in H5N1 influenza strains that are both deadly and highly contagious in mammals. Two groups, led by Ron Fouchier in the Netherlands and Yoshihiro Kawaoka at the University of Wisconsin, have submitted papers to Nature and Science describing the results. The National Science Advisory Board for Biosecurity (NSABB) has requested that some details, such as sequence information, be omitted from publication. According to Nature, both journals are "reserving judgement about whether to censor the papers until the US government provides details of how it will allow genuine researchers to obtain redacted information".
For those looking for more details about what happened, I suggest starting with Doreen Carvajal's interview with Fouchier in the New York Times, "Security in Flu Study Was Paramount, Scientist Says"; Katherine Harmon's firsthand account of what actually happened when the study was announced; and Heidi Ledford's post at Nature News about the NSABB's concerns.
If you want to go further, there is more good commentary, especially the conversation in the comments (including from a member of the NSABB), in "A bad day for science" by Vincent Racaniello. See also Michael Eisen's post "Stop the presses! H5N1 Frankenflu is going to kill us all!", keeping in mind that Eisen used to work on the flu.
Writing at Foreign Policy, Laurie Garrett has done some nice reporting on these events in two posts, "The Bioterrorist Next Door" and "Flu Season". She suggests that attempts to censor the results would be futile: "The genie is out of the bottle: Eager graduate students in virology departments from Boston to Bangkok have convened journal-review debates reckoning exactly how these viral Frankenstein efforts were carried out."
There is much I agree with in Ms. Garrett's posts. However, I must object to her assertion that the work done by Fouchier and Kawaoka can be repeated easily using the tools of synthetic biology. She writes: "The Fouchier episode laid bare the emptiness of biological-weapons prevention programs on the global, national, and local levels. Along with several older studies that are now garnering fresh attention, it has revealed that the political world is completely unprepared for the synthetic-biology revolution." I have already written a book that discusses this confusion (here is an excerpt about synthetic biology and the influenza virus), so it is not actually what I want to write about today. But I have to get this issue out of the way first.
As far as I understand from reading the press accounts, both groups used various means to create mutations in the flu genome and then selected viruses with properties they wanted to study. To clarify, from what I have been able to glean from the sparse accounts thus far, DNA synthesis was not used in the work. And as far as I understand from reading the literature and talking to people who build viruses for a living, it is still very hard to assemble a functioning, infectious influenza virus from scratch.
If it were easy to write pathogen genomes -- particularly flu genomes -- from scratch, we would quite frankly be in deep shit. But, for the time being, it is hard. And that is important. Labs that do use synthetic biology to build influenza viruses, such as those that reconstructed the 1918 H1N1 influenza virus, fail most of the time despite great skill and funding. Synthesizing flu viruses is simply not a garage activity. And with that, I'll move on.
Regardless of how the results might be reproduced, many have suggested that the particular experiments described by Fouchier and Kawaoka should not have been allowed. Fouchier himself acknowledges that selecting for airborne viruses was not the wisest experiment he could have done; it was, he says, "really, really stupid". But the work is done, and people do know about it. So the question of whether this work should have been done in the first place is beside the point. If, as suggested by Michael Eisen, "any decent molecular biologist" could repeat the work, then it was too late to censor the details as soon as the initial report came out.
I am more interested in the consequences of trying to contain the results while somehow allowing access to vetted individuals. Containing the results is as much a matter of information security as of biological security. Once such information is created, the challenge is to protect it, to secure it. Unfortunately, the proposal to allow secure access only by particular individuals is at least a decade (if not three decades) out of date.
Any attempt to secure the data would have to start with an assessment of how widely it is already distributed. I have yet to meet an academic who regularly encrypts email, and my suspicion is that few avail themselves of the built-in encryption on their laptops. So, in addition to the university computers and email servers where the science originated, the information is sitting in the computers of reviewers, on servers at Nature and Science, at the NSABB, and, depending on how the papers were distributed and discussed by members of the NSABB, possibly on their various email servers and individual computers as well. And let's not forget the various unencrypted phones and tablets all of those reviewers now carry around.
But never mind that for a moment. Let's assume that all these repositories of the relevant data are actually secure. The next step is to arrange access for selected researchers. That access would inevitably be electronic, requiring secure networks, passwords, etc. In the last few days the news has brought word that the intelligence firm Stratfor and the computer security firm Symantec have evidently been hacked. Such attacks are not uncommon. Think back over the last couple of years: hacks at Google, various government agencies, universities. Credit card numbers, identities, and supposedly secret DoD documents are all for sale on the web. To that valuable information we can now add a certain list of influenza mutations. If those mutations are truly a critical biosecurity risk -- as asserted publicly by various members of the NSABB -- then that data has value far beyond its utility in virology and vaccinology.
The behavior of various hackers (governments, individuals, and others) over the last few years makes clear that what the discussion thus far has done is to stick a giant "HACK HERE" sign on the data. Moreover, if Ms. Garrett is correct that students across the planet are busy reverse engineering the experiments because they don't have access to the original methods and data, then censorship is creating a perverse incentive for innovation. Given today's widespread communication, restriction of access to data is an invitation, not a proscription.
This same fate awaits any concentration of valuable data. It obviously isn't a problem limited to collections of sensitive genetic sequences or laboratory methods. And there is certainly a case to be made for attempting to maintain confidential or secret caches of data, whether in the public or private interest. In such instances, compartmentalization and encryption must be implemented at the earliest stages of communication in order to have any hope of maintaining security.
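To make that last point concrete, here is a minimal sketch of what "encryption at the earliest stages of communication" means in practice. It uses a one-time pad built from Python's standard library purely as an illustration; real workflows would use vetted tooling such as GPG or an audited cryptography library, and the manuscript text below is a hypothetical stand-in, not anything from the actual papers.

```python
import secrets

def encrypt(plaintext: bytes) -> tuple[bytes, bytes]:
    # One-time pad: XOR the message with a random key of equal length.
    # Toy illustration only; use vetted tools (e.g., GPG) for real data.
    key = secrets.token_bytes(len(plaintext))
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    return ciphertext, key  # the key must travel by a separate channel

def decrypt(ciphertext: bytes, key: bytes) -> bytes:
    return bytes(c ^ k for c, k in zip(ciphertext, key))

# Hypothetical stand-in for sensitive methods text:
manuscript = b"mutation list and passaging protocol"
ciphertext, key = encrypt(manuscript)
assert decrypt(ciphertext, key) == manuscript  # round-trip recovers the original
```

The design point is compartmentalization: the ciphertext and the key take separate channels, so compromising any single server, laptop, or inbox along the way yields nothing readable. That discipline only works if it starts before the first email is sent, not after the data has already propagated.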
However, in this case, if it is true that reverse engineering the results is straightforward, then restriction of access serves only to slow down the general process of science. Moreover, censorship will slow the development of countermeasures. It is unlikely that any collection of scientists identified by the NSABB or the government will be sufficient to develop all the technology we need to respond to natural pathogens, let alone any artificial ones.
As with most other examples of prohibition, these restrictions are doomed before they are even implemented. Censorship of information that is known to exist incentivizes innovation and rediscovery. As I explored in my book, prohibition in the name of security is historically a losing proposition. Moreover, science is inherently a networked human activity that is fundamentally incompatible with constraints on communication, particularly of results that have already been disclosed. Any endeavor that relies upon science, including the development of technologies to defend against natural and artificial pathogens, is therefore equally incompatible with such constraints. Censorship threatens not just science but also our security.