Results tagged “synthetic biology”

Late Night, Unedited Musings on Synthesizing Secret Genomes

By now you have probably heard that a meeting took place this past week at Harvard to discuss large scale genome synthesis. The headline large genome to synthesize is, of course, that of humans. All 6 billion (duplex) bases, wrapped up in 23 pairs of chromosomes that display incredible architectural and functional complexity that we really don't understand very well just yet. So no one is going to be running off to the lab to crank out synthetic humans. That 6 billion bases, by the way, just for one genome, exceeds the total present global demand for synthetic DNA. This isn't happening tomorrow. In fact, synthesizing a human genome isn't going to happen for a long time.

But, if you believe the press coverage, nefarious scientists are planning to pull a Frankenstein and "fabricate" a human genome in secret. Oh, shit! Burn some late night oil! Burn some books! Wait, better — burn some scientists! Not so much, actually. There are several important points here. I'll take them in no particular order.

First, it's true, the meeting was held behind closed doors. It wasn't intended to be so, originally. The rationale given by the organizers for the change is that a manuscript on the topic is presently under review, and the editor of the journal considering the manuscript made it clear that the journal considers the entire topic under embargo until the paper is published. This put the organizers in a bit of a pickle. They decided the easiest way to comply with the editor's wishes (which were communicated to the authors well after the attendees had made travel plans) was to hold the meeting under rules even more strict than Chatham House until the paper is published. At that point, they plan to make a full record of the meeting available. It just isn't a big deal. If it sounds boring and stupid so far, it is. The word "secret" was only introduced into the conversation by a notable critic who, as best I can tell, perhaps misconstrued the language around the editor's requirement to respect the embargo. A requirement that is also boring and stupid. But, still, we are now stuck with "secret", and all the press and bloggers who weren't there are seeing Watergate headlines and fame. Still boring and stupid.

Next, it has been reported that there were no press at the meeting. However, I understand that there were several reporters present. It has also been suggested that the press present were muzzled. This is a ridiculous claim if you know anything about reporters. They've simply been asked to respect the embargo, which so far they are doing, just like they do with every other embargo. (Note to self, and to readers: do not piss off reporters. Do not accuse them of being simpletons or shills. Avoid this at all costs. All reporters are brilliant and write like Hemingway and/or Shakespeare and/or Oliver Morton / Helen Branswell / Philip Ball / Carl Zimmer / Erica Check-Hayden. Especially that one over there. You know who I mean. Just sayin'.)

How do I know all this? You can take a guess, but my response is also covered by the embargo. 

Moving on: I was invited to the meeting in question, but could not attend. I've checked the various associated correspondence, and there's nothing about keeping it "secret". In fact, the whole frickin' point of coupling the meeting to a serious, peer-reviewed paper on the topic was to open up the conversation with the public as broadly as possible. (How do you miss that unsubtle point, except by trying?) The paper was supposed to come out before, or, at the latest, at the same time as the meeting. Or, um, maybe just a little bit after? But, whoops. Surprise! Academic publishing can be slow and/or manipulated/politicized. Not that this happened here. Anyway, get over it. (Also: Editors! And, reviewers! And, how many times will I say "this is the last time!")

(Psst: an aside. Science should be open. Biology, in particular, should be done in the public view and should be discussed in the open. I've said and written this in public on many occasions. I won't bore you with the references. [Hint: right here.] But that doesn't mean that every conversation you have should be subject to review by the peanut gallery right now. Think of it like a marriage/domestic partnership. You are part of society; you have a role and a responsibility, especially if you have children. But that doesn't mean you publicize your pillow talk. That would be deeply foolish and would inevitably prevent you from having honest conversations with your spouse. You need privacy to work on your thinking and relationships. Science: same thing. Critics: fuck off back to that sewery rag in — wait, what was I saying about not pissing off reporters?)

Is this really a controversy? Or is it merely a controversy because somebody said it is? Plenty of people are weighing in who weren't there or, undoubtedly worse from their perspective, weren't invited and didn't know it was happening. So I wonder if this is more about drawing attention to those doing the shouting. That is probably unfair, this being an academic discussion, full of academics.

Secondly (am I just on secondly?), the supposed ethical issues. Despite what you may read, there is no rush. No human genome, nor any human chromosome, will be synthesized for some time to come. Make no mistake about how hard a technical challenge this is. While we have some success in hand at synthesizing yeast chromosomes, and while that project certainly serves as some sort of model for other genomes, the chromatin in multicellular organisms has proven more challenging to understand or build. Consequently, any near-term progress made in synthesizing human chromosomes is going to teach us a great deal about biology, about disease, and about what makes humans different from other animals. It is still going to take a long time. There isn't any real pressing ethical issue to be had here, yet. Building the ubermensch comes later. You can be sure, however, that any federally funded project to build the ubermensch will come with a ~2% set aside to pay for plenty of bioethics studies. And that's a good thing. It will happen.

There is, however, an ethical concern here that needs discussing. I care very deeply about getting this right, and about not screwing up the future of biology. As someone who has done multiple tours on bioethics projects in the U.S. and Europe, served as a scientific advisor to various other bioethics projects, and testified before the Presidential Commission on Bioethical Concerns (whew!), I find that many of these conversations are more about the ethicists than the bio. Sure, we need to have public conversations about how we use biology as a technology. It is a very powerful technology. I wrote a book about that. If only we had such involved and thorough ethical conversations about other powerful technologies. Then we would have more conversations about stuff. We would converse and say things, all democratic-like, and it would feel good. And there would be stuff, always more stuff to discuss. We would say the same things about that new stuff. That would be awesome, that stuff, those words. <dreamy sigh> You can quote me on that. <another dreamy sigh>

But on to the technical issues. As I wrote last month, I estimate the global demand for synthetic DNA (sDNA) to be 4.8 billion bases worth of short oligos and ~1 billion bases worth of longer double-stranded DNA (dsDNA), for not quite 6 Gigabases total. That, obviously, is the equivalent of a single human duplex genome. Most of that demand is from commercial projects that must return value within a few quarters, which biotech is now doing at eye-popping rates. Any synthetic human genome project is going to take many years, if not decades, and any commercial return is way, way off in the future. Even if the annual growth in commercial use of sDNA were 20% — which it isn't — this tells you, dear reader, that the commercial biotech use of synthetic DNA is never, ever, going to provide sufficient demand to scale up production to build many synthetic human genomes. Or possibly even a single human genome. The government might step in to provide a market to drive technology, just as it did for the human genome sequencing project, but my judgement is that the scale mismatch is so large as to be insurmountable. Even while sDNA is already a commodity, it has far more value in reprogramming crops and microbes with relatively small tweaks than it has in building synthetic human genomes. So if this story were only about existing use of biology as technology, you could go back to sleep.
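The comparison above is simple arithmetic, and it is worth seeing just how simple. A minimal sketch, using only the demand estimates quoted in this paragraph (which are my estimates, not audited industry figures):

```python
# Back-of-envelope: annual global synthetic-DNA demand vs. one human genome.
# All figures are the estimates quoted in the text, not authoritative data.

oligo_demand_bases = 4.8e9   # short oligos, bases per year (estimate)
dsdna_demand_bases = 1.0e9   # longer dsDNA, bases per year (estimate)
total_demand = oligo_demand_bases + dsdna_demand_bases

human_duplex_genome = 6.0e9  # ~6 Gbases across both copies of 23 chromosome pairs

print(f"Annual global sDNA demand: {total_demand / 1e9:.1f} Gbases")
print(f"One duplex human genome:   {human_duplex_genome / 1e9:.1f} Gbases")
print(f"Demand / genome ratio:     {total_demand / human_duplex_genome:.2f}")
```

In other words, a single year of the entire world's synthetic DNA output is slightly less than one genome's worth, which is the whole point.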

But there is a use of DNA that might change this story, which is why we should be paying attention, even at this late hour on a Friday night.

DNA is, by far, the most sophisticated and densest information storage medium humans have ever come across. DNA can be used to store orders of magnitude more bits per gram than anything else humans have come up with. Moreover, the internet is expanding so rapidly that our need to archive data will soon outstrip existing technologies. If we continue down our current path, in coming decades we would need not only exponentially more magnetic tape, disk drives, or flash memory, but exponentially more factories to produce these storage media, and exponentially more warehouses to store them. Even if this is technically feasible it is economically implausible. But biology can provide a solution. DNA exceeds by many times even the theoretical capacity of magnetic tape or solid state storage.
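The density claim above can be checked from first principles. This sketch assumes the idealized ceiling of 2 bits per base pair and an average base-pair mass of roughly 650 daltons; real encoding schemes carry redundancy and error correction, so achievable densities are lower, but the order of magnitude stands:

```python
# First-principles estimate of DNA's theoretical information density.
# Assumptions: 2 bits per base pair (A/C/G/T), ~650 g/mol per duplex base pair.

AVOGADRO = 6.022e23        # base pairs per mole
BP_MASS_G_PER_MOL = 650.0  # approximate molar mass of one duplex base pair
BITS_PER_BP = 2.0          # theoretical maximum

bp_per_gram = AVOGADRO / BP_MASS_G_PER_MOL
bits_per_gram = bp_per_gram * BITS_PER_BP
exabytes_per_gram = bits_per_gram / 8 / 1e18

print(f"~{exabytes_per_gram:.0f} exabytes per gram")  # roughly a couple hundred EB/g
```

A couple hundred exabytes per gram is many orders of magnitude beyond tape or flash, which is why the warehouse-to-sugar-cube comparison below is not hyperbole.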

A massive warehouse full of magnetic tapes might be replaced by an amount of DNA the size of a sugar cube. Moreover, while tape might last decades, and paper might last millennia, we have found intact DNA in animal carcasses that have spent three-quarters of a million years frozen in the Canadian tundra. Consequently, there is a push to combine our ability to read and write DNA with our accelerating need for more long-term information storage. Encoding and retrieval of text, photos, and video in DNA has already been demonstrated. (Yes, I am working on one of these projects, but I can't talk about it just yet. We're not even to the embargo stage.) 

Governments and corporations alike have recognized the opportunity. Both are funding research to support the scaling up of infrastructure to synthesize and sequence DNA at sufficient rates.

For a “DNA drive” to compete with an archival tape drive today, it needs to be able to write ~2 Gbits/sec, which is about 2 Gbases/sec. That is the equivalent of ~20 synthetic human genomes/min, or ~30K sHumans/day, if I must coin a unit of DNA synthesis to capture the magnitude of the change. Obviously this is likely to be in the form of either short ssDNA, or possibly medium-length ss- or dsDNA if enzymatic synthesis becomes a factor. If this sDNA were to be used to assemble genomes, it would first have to be assembled into genes, and then into synthetic chromosomes, a non-trivial task. While this would be hard, and would take a great deal of effort and PhD theses, it certainly isn't science fiction.
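A quick sanity check of that arithmetic, assuming roughly one base written per bit (per the ~2 Gbit/s to ~2 Gbase/s equivalence above) and round-the-clock operation:

```python
# "DNA drive" write-rate arithmetic: matching an archival tape drive.
# Assumes ~1 base per bit written and continuous (24/7) operation.

write_rate_bases_per_sec = 2e9   # ~2 Gbases/sec, to match ~2 Gbit/s tape
duplex_genome_bases = 6e9        # one human genome, both strands

genomes_per_min = write_rate_bases_per_sec * 60 / duplex_genome_bases
genomes_per_day = genomes_per_min * 60 * 24

print(f"{genomes_per_min:.0f} synthetic genomes per minute")  # 20
print(f"{genomes_per_day:,.0f} 'sHumans' per day")            # 28,800
```

For comparison, recall from above that today's entire annual global demand for synthetic DNA is about one genome's worth; a single competitive DNA drive would consume that every three seconds.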

But here, finally, is the interesting bit: the volume of sDNA necessary to make DNA information storage work, and the necessary price point, would make possible any number of synthetic genome projects. That, dear reader, is definitely something that needs careful consideration by publics. And here I do not mean "the public", the 'them' opposed to scientists and engineers in the know and in the do (and in the doo-doo, just now), but rather the Latiny, rootier sense of "the people". There is no them, here, just us, all together. This is important.

The scale of the demand for DNA storage, and the price at which it must operate, will completely alter the economics of reading and writing genetic information, in the process marginalizing the use by existing multibillion-dollar biotech markets while at the same time massively expanding capabilities to reprogram life. This sort of pull on biotechnology from non-traditional applications will only increase with time. That means whatever conversation we think we are having about the calm and ethical development of biological technologies is about to be completely inundated and overwhelmed by the relentless pull of global capitalism, beyond borders, probably beyond any control. Note that all the hullabaloo so far about synthetic human genomes, and even about CRISPR editing of embryos, etc., has been written by Western commentators, in Western press. But not everybody lives in the West, and vast resources are pushing development of biotechnology outside of the West. And that is worth an extended public conversation.

So, to sum up, have fun with all the talk of secret genome synthesis. That's boring. I am going off the grid for the rest of the weekend to pester littoral invertebrates with my daughter. You are on your own for a couple of days. Reporters, you are all awesome, make of the above what you will. Also: you are all awesome. When I get back to the lab on Monday I will get right on with fabricating the ubermensch for fun and profit. But — shhh — that's a secret.

On DNA and Transistors

Here is a short post to clarify some important differences between the economics of markets for DNA and for transistors. I keep getting asked related questions, so I decided to elaborate here.

But first, new cost curves for reading and writing DNA. The occasion is some new data gleaned from a somewhat out of the way source, the Genscript IPO Prospectus. It turns out that, while preparing their IPO docs, Genscript hired Frost & Sullivan to do a market survey across much of the life sciences. The Prospectus then puts Genscript's revenues in the context of the global market for synthetic DNA, which together provide some nice anchors for discussing how things are changing (or not).

So, with no further ado: Frost & Sullivan found that the 2014 global market for oligos was $241 million, and the global market for genes was $137 million. (Note that I tweeted out larger estimates a few weeks ago when I had not yet read the whole document.) Genscript reports that they received $35 million in 2014 for gene synthesis, for 25.6% of the market, which they claim puts them in the pole position globally. Genscript further reports that the price for genes in 2014 was $.34 per base pair. This sounds much too high to me, so it must be based on duplex synthesis, which would bring the linear per-base cost down to $.17 per base, a figure more consistent with what I hear on the street. (It may be that Gen9 is shipping genes at $.07 per base, but I don't know anyone outside of academia who is paying that low a rate.) If you combine the price per base and the size of the market, you get about 1 billion bases worth of genes shipped in 2014 (so a million genes, give or take). This is consistent with Ginkgo's assertion that their 100 million base deal with Twist was the equivalent of 10% of the global gene market in 2015. For oligos, combining Genscript's reported average price per base, $.05, with the market size gives about 4.8 billion bases worth of oligos shipped in 2014. Frost & Sullivan projects that from 2015 to 2019 the oligo market CAGR will be 6.6% and the gene synthesis market will come in at 14.7%.
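The volume figures above fall straight out of dividing market size by price. A sketch of that calculation, using the Frost & Sullivan and Genscript numbers quoted in this paragraph:

```python
# Recovering shipped volume from market size and price.
# Inputs are the 2014 figures quoted from the Genscript IPO Prospectus.

gene_market_usd = 137e6
gene_price_per_base = 0.17   # linear per-base price inferred from $0.34/bp duplex
oligo_market_usd = 241e6
oligo_price_per_base = 0.05

gene_bases_shipped = gene_market_usd / gene_price_per_base
oligo_bases_shipped = oligo_market_usd / oligo_price_per_base

print(f"Genes:  ~{gene_bases_shipped / 1e9:.1f} Gbases shipped in 2014")
print(f"Oligos: ~{oligo_bases_shipped / 1e9:.1f} Gbases shipped in 2014")
# At ~1 kb per gene, ~0.8 Gbases is on the order of a million genes.
```

Note the gene figure works out to ~0.8 Gbases, hence the "about 1 billion bases, give or take" hedge in the text.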

For the sequencing, I have capitulated and put the NextSeq $1000 human genome price point on the plot. This instrument is optimized to sequence human DNA, and I can testify personally that sequencing arbitrary DNA is more expensive because you have to work up your own processes and software. But I am tired of arguing with people. So use the plot with those caveats in mind.


What is most remarkable about these numbers is how small they are. The way I usually gather data for these curves is to chat with people in the industry, mine publications, and spot check price lists. All that led me to estimate that the gene synthesis market was about $350 million (and has been for years) and the oligo market was in the neighborhood of $700 million (and has been for years).

If the gene synthesis market is really only $137 million, with four or five companies vying for market share, then that is quite an eye opener. Even if that is off by a factor of two or three, getting closer to my estimate of $350 million, that just isn't a very big market to play in. A ~15% CAGR is nothing to sneeze at, usually, and that is a doubling rate of about 5 years. But the price of genes is now falling by 15% every 3-4 years (or only about 5% annually). So, for the overall dollar size of the market to grow at 15%, the number of genes shipped every year has to grow at close to 20% annually. That's about 200 million additional bases (or ~200,000 more genes) ordered in 2016 compared to 2015. That seems quite large to me. How many users can you think of who are ramping up their ability to design or use synthetic genes by 20% a year? Obviously Ginkgo, for one. As it happens, I do know of a small number of other such users, but added together they do not come close to constituting that 20% overall increase. All this suggests to me that the dollar value of the gene synthesis market will be hard pressed to keep up with Frost & Sullivan's estimate of 14.7% CAGR, at least in the near term. As usual, I will be happy to be wrong about this, and happy to celebrate faster growth in the industry. But bring me data.
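The revenue-versus-volume logic above is worth making explicit: revenue is price times volume, so required volume growth follows directly from the revenue target and the price trend. A sketch, using the growth and price-decline figures from this paragraph:

```python
# If the dollar market grows ~15%/yr while prices fall ~5%/yr, how fast
# must shipped volume grow? Revenue = price * volume, so:
# (1 + revenue_growth) = (1 + price_change) * (1 + volume_growth)

revenue_growth = 0.15    # Frost & Sullivan's ~15% CAGR for gene synthesis
price_change = -0.05     # ~5% annual price decline (15% every 3-4 years)
base_volume = 1e9        # ~1 Gbases of genes shipped in 2014

volume_growth = (1 + revenue_growth) / (1 + price_change) - 1
extra_bases = base_volume * volume_growth

print(f"Required volume growth: {volume_growth:.1%}")  # ~21%, i.e. "close to 20%"
print(f"Extra bases next year:  ~{extra_bases / 1e6:.0f} million")
```

That is where the "close to 20% annually" and "~200 million additional bases" figures come from.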

People in the industry keep insisting that once the price of genes falls far enough, the ~$3 billion market for cloning will open up to synthetic DNA. I have been hearing that story for a decade. And price isn't the only factor. To play in the cloning market, synthesis companies would actually have to be able to deliver genes and plasmids faster than cloning. Given that I'm hearing delivery times for synthetic genes are running at weeks, to months, to "we're working on it", I don't see people switching en masse to synthetic genes until the performance improves. If it costs more to have your staff waiting for genes to show up by FedEx than to have them bash the DNA by hand, they aren't going to order synthetic DNA.

And then what happens if the price of genes starts falling rapidly again? Or, forget rapidly, what about modestly? What if a new technology comes in and outcompetes standard phosphoramidite chemistry? The demand for synthetic DNA could accelerate and the total market size still might be stagnant, or even fall. It doesn't take much to turn this into a race to the bottom. For these and other reasons, I just don't see the gene synthesis market growing very quickly over the next 5 or so years.

Which brings me to transistors. The market for DNA is very unlike the market for transistors, because the role of DNA in product development and manufacturing is very unlike the role of transistors. Analogies are tremendously useful in thinking about the future of technologies, but only to a point; the unwary may miss differences that are just as important as the similarities.

For example, the computer in your pocket fits there because it contains orders of magnitude more transistors than a desktop machine did fifteen years ago. Next year, you will want even more transistors in your pocket, or on your wrist, which will give you access to even greater computational power in the cloud. Those transistors are manufactured in facilities now costing billions of dollars apiece, a trend driven by our evidently insatiable demand for more and more computational power and bandwidth access embedded in every product that we buy. Here is the important bit: the total market value for transistors has grown for decades precisely because the total number of transistors shipped has climbed even faster than the cost per transistor has fallen.

In contrast, biological manufacturing requires only one copy of the correct DNA sequence to produce billions in value. That DNA may code for just one protein used as a pharmaceutical, or it may code for an entire enzymatic pathway that can produce any molecule now derived from a barrel of petroleum. Prototyping that pathway will require many experiments, and therefore many different versions of genes and genetic pathways. Yet once the final sequence is identified and embedded within a production organism, that sequence will be copied as the organism grows and reproduces, terminating the need for synthetic DNA in manufacturing any given product. The industrial scaling of gene synthesis is completely different than that of semiconductors.

Brewing Bad Biosecurity Policy

Last week brought news of a truly interesting advance in porting opioid production to yeast. This is pretty cool science, because it involves combining enzymes from several different organisms to produce a complex and valuable chemical, although no one has yet managed to integrate the whole synthetic pathway in microbes. It is also potentially pretty cool economics, because implementing opiate production in yeast should dramatically lower the price of a class of important pain medications to a point that developing countries might finally be able to afford.

Alongside the scientific article was a Commentary – with images of drug dens and home beer brewing – explicitly suggesting that high doses of morphine and other addictive narcotics would soon be brewed at home in the garage. The text advertised “Home-brew opiates” – wow, just like beer! The authors of the Commentary used this imagery to argue for immediate regulation of 1) yeast strains that can make opioids (even though no such strains exist yet), and 2) the DNA sequences that code for the opioid synthesis pathways. This is a step backward for biosecurity policy, by more than a decade, because the proposal embraces measures known to be counterproductive for security.

The wrong recipe.

I'll be very frank here – proposals like this are deep failures of the science policy enterprise. The logic that leads to “must regulate now!” is 1) methodologically flawed and 2) ignores data we have in hand about the impacts of restricting access to technology and markets. In what follows, I will deal in due course with both kinds of failures, as well as looking at the predilection to assume regulation and restriction should be the primary policy response to any perceived threat.

There are some reading this who will now jump to "Carlson is yet again saying that we should have no regulation; he wants everything to be available to anyone." This is not my position, and never has been. Rather, I insist that our policies be grounded in data from the real world. And the real world data we have demonstrates that regulation and restriction often cause more harm than good. Moreover, harm is precisely the impact we should expect by restricting access to democratized biological technologies. What if even simple analyses suggest that proposed actions are likely to make things worse? What if the specific policy actions recommended in response to a threat have already been shown to exacerbate damage from the threat? That is precisely the case here. I am constantly confronted with people saying, "That's all very well and good, but what do you propose we do instead?" The answer is simple: I don't know. Maybe nothing. Maybe there isn't anything we can do. But for now, I just want us to not make things worse. In particular I want to make sure we don't screw up the emerging bioeconomy by building in perverse incentives for black markets, which would be the worst possible development for biosecurity.

Policy conversations at all levels regularly make these same mistakes, and the arguments are nearly uniform in structure. “Here is something we don't know about, or are uncertain about, and it might be bad – really, really bad – so we should most certainly prepare policy options to prevent the hypothetical worst!” Exclamation points are usually just implied throughout, but they are there nonetheless. The policy options almost always involve regulation and restriction of a technology or process that can be construed as threatening, usually with little or no consideration of what that threatening thing might plausibly grow into, nor of how similar regulatory efforts have fared historically.

This brings me to the set up. Several news pieces (e.g., the NYT, Buzzfeed) succinctly pointed out that the “home-brew” language was completely overblown and inflammatory, and that the Commentary largely ignored both the complicated rationale for producing opioids in yeast and the complicated benefits of doing so. The Economist managed to avoid getting caught up in discussing the Commentary, remaining mostly focussed on the science, while in the last paragraph touching on the larger market issues and the potential for “home brew opium” to pull the economic rug out from under heroin cartels. (Maybe so. It's an interesting hypothesis, but I won't have much to say about it here.) Piers Millet, formerly of the Biological Weapons Convention Implementation Support Unit, calmly responded to the Commentary by observing that, for policy inspiration, the authors look backward rather than forward, and that the science itself demonstrates that the world we are entering requires developing completely new policy tools to deal with new technical and economic realities.

Stanford's Christina Smolke, who knows a thing or two about opioid production in yeast, observed in multiple news outlets that getting yeast to produce anything industrially at high yields is finicky to get going and then hard to maintain as a production process. It's relatively easy to produce trace amounts of lots of interesting things in microbes (ask any iGEM team); it is very hard and very expensive to scale up to produce interesting amounts of interesting things in microbes (ask any iGEM team). Note that we are swimming in data about how hard this is to do, which is an important part of this story. In addition to the many academic examples of challenges in scaling up production, the last ten years are littered with startups that failed at scale up. The next ten years, alas, will see many more.

Even with an engineered microbial strain in hand, it can be extraordinarily hard to make a microbe jump through the metabolic and fermentation hoops to produce interesting/useful quantities of a compound. And then transferring that process elsewhere is very frequently its own expensive and difficult effort. It is not true that you can just mail a strain and a recipe from one place to another and automatically get the same result. However, it is true that all this will get easier over time, and many people are working on reproducible process control for biological production.

That future looks amazing. I've written many times about how the future of the economy looks like beer and cows; in other words, that our economy will inevitably be based on distributed biological manufacturing. But that is the future: i.e., not the present. Nor is it imminent. I truly wish it were imminent, but it is not. Whole industries exist to solve these problems, and much more money and effort will be spent before we get there. The economic drivers are huge. Some of the investments made by Bioeconomy Capital are, in fact, aimed at eventually facilitating distributed biological manufacturing. But, if nothing else, these investments have taught me just how much effort is required to reach that goal. If anybody out there has a credible plan to build the Cowborg or to microbrew chemicals and pharmaceuticals as suggested by the Commentary, I will be your first investor. (I said “credible”! Don't bother me otherwise.) But I think any sort of credible plan is years away. For the time being, the only thing we can expect to brew like beer is beer.

FBI Supervisory Special Agent Ed You makes great use of the “brewing bad” and “baking bad” memes, mentioned in the Commentary, in talking to students and professionals alike about the future of drug production. But this is in the context of taking personal responsibility for your own science and for speaking up when you see something dangerous. I've never heard Ed say anything about increasing surveillance and enforcement efforts as the way forward. In fact, in the Times piece, Ed specifically says, “We’ve learned that the top-down approach doesn’t work.” I can't say exactly why Ed chose that turn of phrase, but I can speculate that it is based on 1) his own experience as a professional bench molecular biologist, 2) the catastrophically bad impacts of the FBI's earlier arrests and prosecutions of scientists and artists for doing things that were legal, and 3) the official change in policy from the White House and National Security Council away from suppression and toward embracing and encouraging garage biology. The standing order at the FBI is now engagement. In fact, Ed You's arrival on the scene was coincident with any number of positive policy changes in DC, and I am happy to give him all the credit I can. Moreover, I completely agree with Ed and the Commentary authors that we should be discussing early on the implications of new technologies, an approach I have been advocating for 15 years. But I completely disagree with the authors that the current or future state of the technology serves as an indicator of the need to prepare some sort of regulatory response. We tried regulating fermentation once before; that didn't work out so well [1]. 

Badly baked regulatory policy.

So now we're caught up to about the middle of the Commentary. At this point, the story is like other such policy stories. “Assume hypothetical thing is inevitable: discuss and prepare regulation.” And like other such stories, here is where it runs off the rails with a non sequitur common in policy work. Even if the assumption of the thing's inevitability is correct (which is almost always debatable), the next step should be to assess the impact of the thing. Is it good, or is it bad? (By a particular definition of good and bad, of course, but never mind that for now.) Usually, this question is actually skipped and the thing is just assumed to be bad and in need of a policy remedy, but the assumption of badness, breaking or otherwise, isn't fatal for the analysis.

Let's say it looks bad – bad, bad, bad – and the goal of your policy is to try to either head it off or fix it. First you have to have some metric to judge how bad it is. How many people are addicted, or how many people die, or how is the crime rate affected? Just how bad is it breaking? Next – and this is the part the vast majority of policy exercises miss – you have to try to understand what happens in the absence of a policy change. What is the cost of doing nothing, of taking no remediating action? Call this the null hypothesis. Maybe there is even a benefit to doing nothing. But only now, after evaluating the null hypothesis, are you in a position to propose remedies, because only now do you have a metric to compare costs and benefits. If you leap directly to “the impacts of doing nothing are terrible, so we must do something, anything, because otherwise we are doing nothing”, then you have already lost. To be sure, policy makers and politicians feel that their job is to do something, to take action, and that if they are doing nothing then they aren't doing their jobs. That is just a recipe for bad policy. Without the null hypothesis, your policy development is a waste of time and, potentially, could make matters worse. This happens time and time again. Prohibition, for example, was exactly this sort of failure, and cost much more than it benefited, which is why it was considered a failure [2].
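The structure of that argument can be sketched in a few lines. This is a toy illustration only; every number below is invented for the sake of the example, and the point is purely the shape of the comparison, not the values:

```python
# Toy cost comparison of policy options. You cannot rank the options
# without the first row: the null hypothesis of doing nothing.
# All numbers are hypothetical, invented for illustration.

options = {
    "do nothing (null hypothesis)": {"harm": 100, "enforcement": 0},
    "light-touch regulation":       {"harm": 90,  "enforcement": 20},
    "heavy restriction":            {"harm": 120, "enforcement": 60},  # black markets can raise harm
}

for name, cost in options.items():
    total = cost["harm"] + cost["enforcement"]
    print(f"{name:30s} total cost = {total}")

# Without the baseline row, "heavy restriction" looks like action and
# "do nothing" looks like negligence, even in a case where doing nothing wins.
```

In this (invented) case the null hypothesis has the lowest total cost, which is exactly the outcome a do-something-at-all-costs process never checks for.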

We keep making the same mistake. We have plenty of data and reporting, courtesy of the DEA, that the ongoing crackdown on methamphetamine production has created bigger and blacker markets, as well as mayhem and violence in Mexico, all without much impact on domestic drug use. Here is the DEA Statistics & Facts page – have a look and then make up your own mind.

I started writing about the potential negative impacts of restricting access to biological technologies in 2003 (PDF), including the likely emergence of black markets in the event of overregulation. I looked around for any data I could find on the impacts of regulating democratized technologies. In particular, I happened upon the DEA's first reporting of the impacts of the then newly instituted crackdown on domestic methamphetamine production and distribution. Even in 2003, the DEA was already observing that it had created bigger, blacker markets – that are by definition harder to surveil and disrupt – without impacting meth use. The same story has since played out in cocaine production and distribution, and more recently in the markets for “bath salts”, aka “legal highs”.

That is, we have multiple, clear demonstrations that, rather than improving the world, restricting access to distributed production can instead cause harm. But, really, when has this ever worked? And why do people think going down the same path in the future will lead anywhere else? I am still looking for data – any data at all – that supports the assertion that regulating biological technologies will have any different result. If you have such data, bring it. Let's see it. In the absence of that data, policy proposals that lead with regulation and restriction are doomed to repeat the failures of the past. It has always seemed to me like a terrible idea to transfer such policies over to biosecurity. Yet that is exactly what the Commentary proposes.

Brewing black markets.

The fundamental problem with the approach advocated in the Commentary is that security policies, unlike beer brewing, do not work equally well across all technical and economic scales. What works in one context will not work in another. Nuclear weapons can be secured by guns, gates, and guards because they are expensive to build and the raw materials are hard to come by, so heavy touch regulation works just fine. There are some industries – as it happens, beer brewing – where only light touch regulation works. In the U.S., we tried heavy touch regulation in the form of Prohibition, and it failed miserably, creating many more problems than it solved. There are other industries, for example DNA and gene synthesis, in which even light touch regulations are a bad idea. Indeed, light touch regulation has already created the problem it was supposed to prevent, namely the existence of DNA synthesis providers that 1) intentionally do not screen their orders and 2) ship to countries and customers that are on unofficial black lists.

For those who don't know this story: In early 2013, the International Council for the Life Sciences (ICLS) convened a meeting in Hong Kong to discuss "Codes of Conduct" for the DNA synthesis industry, namely screening orders and paying attention to who is doing the ordering. According to various codes and guidelines promulgated by industry associations and the NIH, DNA synthesis providers are supposed to reject orders that are similar to sequences that code for pathogens, or genes from pathogens, and it is suggested that they do not ship DNA to certain countries or customers (the unofficial black list). Here is a PDF of the meeting report; be sure to read through Appendix A.

The report is fairly anodyne in describing what emerged in discussions. But people who attended have since described in public the Chinese DNA synthesis market as follows. There are 3 tiers of DNA providers. The first tier is populated with companies that comply with the various guidelines and codes promulgated internationally because this tier serves international markets. There is a second tier that appears to similarly comply, because while they serve primarily the large internal market these companies have aspirations of also serving the international market. There is a third tier that exists specifically to serve orders from customers seeking ways around the guidelines and codes. (One company in this tier was described to me as a "DNA shanty", with the employees living over the lab.) Thus the relatively light touch guidelines (which are not laws) have directly incentivized exactly the behavior they were supposed to prevent. This is not a black market, per se, and cannot accurately be described as illegal, so let's call it a "grey market".

I should say here that this is entirely consistent with my understanding of biotech in China. In 2010, I attended a warm up meeting for the last round of BWC negotiations. After that meeting, I chatted with one of the Chinese representatives present, hoping to gain a little bit of insight into the size of the Chinese bioeconomy and the state of the industry. My query was met with frank acknowledgment that the Chinese government isn't able to keep track of the industry, doesn't know how many companies are active, or how many employees they have, or what they are up to, and so doesn't hold out much hope of controlling the industry. I covered this a bit in my 2012 Biodefense Net Assessment report for DHS. (If anyone has any new insight into the Chinese biotech industry, I am all ears.) Not that the U.S. or Europe is any better in this regard, as our mechanisms for tracking the biotech industry are completely dysfunctional, too. There could very well be DNA synthesis providers operating elsewhere that don't comply with the recommended codes of conduct: we have no real means of broadly surveying for this behavior. There are no physical means either to track it remotely or to control it.

I am a little bit sensitive about the apparent emergence of the DNA synthesis grey market, because I warned for years in print and in person that DNA screening would create exactly this outcome. I was condescendingly told on many occasions that it was foolish to imagine a black market for DNA. And anyway, we have to do something, right? But it was never very complicated to think this through. DNA is cheap, and getting cheaper. You need this cheap DNA as code to build more complicated, more valuable things. Ergo, restrictions on DNA synthesis will incentivize people to seek, and to provide, DNA outside any control mechanism. The logic is pretty straightforward, and denying it is simply willful self-deception. Regulation of DNA synthesis will never work. In the vernacular of the day: because economics. To make it even simpler: because humans.

So the idea that people are still suggesting proscription of certain DNA sequences is a viable route to security just rankles. And it is demonstrably counterproductive. The restrictions incentivize the bad behavior they are supposed to prevent, probably much earlier than might have happened otherwise. The take home message here is that not all industries are the same, because not all technologies are the same, and that our policy approaches should take into account these differences rather than papering over them. In particular, restricting access to information in our modern economy is a losing game. 

Where do we go from here?

We are still at the beginning of biotech. This is the most important time to get it right. This is the most important time not to screw up and make things worse. And it is important that we are at the beginning, because things are not yet screwed up.

Conversely, we are well down the road in developing and deploying drug policies, with much damage done. To be sure, despite the accumulated and ongoing costs, you have to acknowledge that it is not at all clear that suddenly legalizing drugs such as meth or cocaine would be a positive step. I am not in any way making that argument. But it is abundantly clear that drug enforcement activities have created the world we live in today. Was there an alternative? If the DEA had been able to do cost/benefit analysis of the impacts of its actions – that is, predict the emergence of DTOs and their role in production, trafficking, and violence – would the policy response 15 years ago have been any different? If Nixon had more thoughtfully considered even what was known 50 years ago about the impacts of proscription, would he have launched the war on drugs? That is a hard question, because drug policy is clearly driven more by stories and personal politics than by facts. I am inclined to think the present drug policy mess was inevitable. Even with the DEA's self-diagnosed role in creating and sustaining DTOs, the national conversation is still largely dominated by “the war on drugs”. And thus the first reaction to the prospect of microbial narcotics production is to employ strategies and tactics that have already failed elsewhere. I would hate to think we are in for a war on microbes, because that is doomed to failure.

But we haven't yet made all those mistakes with biological technologies. I continue to hope that, if nothing else, we will avoid making things worse by rejecting policies we already know won't work. 


[1] Pause here to note that even this early in the set up, the Commentary conflates via words and images the use of yeast in home brew narcotics with centralized brewing of narcotics by cartels. These are two quite different, and perhaps mutually exclusive, technoeconomic futures. Drug cartels very clearly have the resources to develop technology. Depending on whether you listen to the U.S. Navy or the U.S. Coast Guard, either 30% or 80% of the cocaine delivered to the U.S. is transported at some point in semisubmersible cargo vessels or in fully submersible cargo submarines. These 'smugglerines', if you will, are the result of specific technology development efforts directly incentivized by governmental interdiction efforts. Similarly, if cartels decide that developing biological technologies suits their business needs, they are likely to do so. And cartels certainly have incentives to develop opioid-producing yeast, because fermentation usually lowers the cost of goods between 50% and 90% compared to production in plants. Again, cartels have the resources, and they aren't stupid. If cartels do develop these yeast strains, for competitive reasons they certainly won't want anyone else to have them. Home brew narcotics would further undermine their monopoly.

[2] Prohibition was obviously the result of a complex socio-political situation, just as was its repeal. If you want a light touch look at the interaction of the teetotaler movement, the suffragette movement, and the utility of Prohibition in continued repression of freed slaves after the Civil War, check out Ken Burns's “Prohibition” on Netflix. But after all that, it was still a dismal failure that created more problems than it solved. Oh, and Prohibition didn't accomplish its intended aims. Anheuser-Busch thrived during those years. Its best selling products at the time were yeast and kettles (see William Knoedleseder's Bitter Brew)...

The most important paragraph of The Gene Factory

The most important paragraph of Michael Specter's story about BGI:

"In the United States and in the West, you have a certain way," [BGI President Jian Wang] continued, smiling and waving his arms merrily. "You feel you are advanced and you are the best. Blah, blah, blah. You follow all these rules and have all these protocols and laws and regulations. You need somebody to change it. To blow it up. For the last five hundred years, you have been leading the way with innovation. We are no longer interested in following."
[Given the mix-up in the publication date of 2015, I have now deleted the original post. I have appended the comments from the original post to the bottom of this post.]

It's time once again to see how quickly the world of biological technologies is changing. The story is mixed, in part because it is getting harder to find useful data, and in part because it is getting harder to generate appropriate metrics. 

Sequencing and synthesis productivity

I'll start with the productivity plot, as this one isn't new. For a discussion of the substantial performance increase in sequencing compared to Moore's Law, as well as the difficulty of finding this data, please see this post. If nothing else, keep two features of the plot in mind: 1) the consistency of the pace of Moore's Law and 2) the inconsistency of the pace of sequencing productivity. Illumina appears to be the primary driver, and beneficiary, of improvements in productivity at the moment, especially if you are looking at share prices. It looks like the recently announced NextSeq and HiSeq instruments will provide substantially higher productivities (hand waving, I would say the next datum will come in another order of magnitude higher), but I think I need a bit more data before officially putting another point on the plot. Based on Erika Check Hayden's coverage at Nature, it seems that the new instruments should also provide substantial price improvements, which I get into below.
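To make the Moore's Law comparison concrete, the underlying arithmetic is just doubling time. Here is a minimal sketch; the growth numbers below are illustrative assumptions, not data points pulled from the plot:

```python
from math import log

def doubling_time_months(value_start, value_end, months_elapsed):
    """Doubling time implied by exponential growth between two measurements."""
    return months_elapsed * log(2) / log(value_end / value_start)

# Moore's Law: transistor counts roughly double every ~24 months.
moore = doubling_time_months(1.0, 2.0, 24)

# Hypothetical sequencing datum: a 10x productivity jump over 24 months
# (illustrative numbers only).
seq = doubling_time_months(1.0, 10.0, 24)

print(f"Moore's Law doubling time: {moore:.1f} months")
print(f"Hypothetical sequencing doubling time: {seq:.1f} months")
```

A 10x jump in two years corresponds to a doubling time of roughly 7 months, which is why sequencing productivity, when it moves, moves so much faster than semiconductors.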

As for synthesis productivity, there have been no new commercially available instruments released for many years. sDNA providers are clearly pushing productivity gains in house, but no one outside those companies has access to productivity data.
DNA sequencing and synthesis prices

The most important thing to notice about the plots below is that prices have stopped falling precipitously. If you hear or read anyone asserting that costs are falling exponentially, you can politely refer them to the data (modulo the putative performance of the new Illumina instruments). We might again see exponential price decreases, but that will depend on a combination of technical innovation, demand, and competition, and I refer the reader to previous posts on the subject. Note that prices not falling isn't necessarily bad and doesn't mean the industry is somehow stagnant. Instead, it means that revenues in these sectors are probably not falling, which will certainly be welcomed by the companies involved. As I described a couple of weeks ago, and in a Congressional briefing in November, revenues in biotech continue to climb steeply.

The second important thing to notice about these plots is that I have changed the name of the metric from "cost" to "price". Previously, I had decided that this distinction amounted to no real difference for my purposes. Now, however, the world has changed, and cost and price are very different concepts for anyone thinking about the future of DNA. Previously, there was at times an order of magnitude change in cost from year to year, and keeping track of the difference between cost and price didn't matter. In a period when change is much slower, that difference becomes much more important. Moreover, as the industry becomes larger, more established, and generally more important for the economy, we should all take more care in distinguishing between concepts like cost to whom and price for whom.

In the plot that follows, the price is for finished, not raw, sequencing.
And here is a plot only of oligo and gene-length DNA:
What does all this mean? Illumina's instruments are now responsible for such a high percentage of sequencing output that the company is effectively setting prices for the entire industry. Illumina is being pushed by competition to increase performance, but this does not necessarily translate into lower prices. It doesn't behoove Illumina to drop prices at this point, and we won't see any substantial decrease until a serious competitor shows up and starts threatening Illumina's market share. The absence of real competition is the primary reason sequencing prices have flattened out over the last couple of data points.

I pulled the final datum on the sequencing curve from the NIH; the title on the NIH curve is "cost", but as it includes indirect academic costs I am going to fudge and call it "price". I notice that the NIH is now publishing two sequencing cost curves, and I'll bet that the important differences between them are too subtle for most viewers. One curve shows cost per megabase of raw sequence - that is, data straight off the instruments - and the other curve shows cost per finished human genome (assuming ~30X coverage of 3x10^9 bases). The cost per base of that finished sequencing is a couple of orders of magnitude higher than the cost of the raw data. On the HiSeq X data sheet, Illumina has boldly put a point on the cost per human genome curve at $1000. But I have left it off the above plot for the time being; the performance and cost claimed by Illumina are just for human genomes rather than for arbitrary de novo sequencing. Mick Watson dug into this, and his sources inside Illumina claim that this limitation is in the software, rather than the hardware or the wetware, in which case a relatively simple upgrade could dramatically expand the utility of the instrument. Or perhaps the "de novo sequencing level" automatically unlocks after you spend $20 million in reagents. (Mick also has some strong opinions about the role of competition in pushing the development of these instruments, which I got into a few months ago.) 
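The raw-versus-finished distinction is just arithmetic on coverage and genome size. A minimal sketch, using a hypothetical raw-sequencing price (the $0.05 per megabase figure is an assumption for illustration, not a value from the NIH curves):

```python
def finished_genome_cost(cost_per_raw_mb, coverage=30, genome_gb=3.0, overhead=1.0):
    """Cost of a finished genome given a raw sequencing price.

    Raw bases sequenced = coverage x genome size; `overhead` is a
    multiplier for assembly/finishing costs beyond raw sequencing.
    """
    raw_megabases = coverage * genome_gb * 1000  # Gb -> Mb
    return raw_megabases * cost_per_raw_mb * overhead

# At a hypothetical $0.05 per raw megabase, a 30X human genome
# requires 90,000 Mb of raw data:
print(finished_genome_cost(0.05))  # 4500.0
```

The point of the sketch is simply that a per-megabase headline number understates the cost of a finished genome by the coverage factor, before any finishing overhead is counted.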

Synthesis prices have slowed for entirely different reasons. Again, I have covered this ground in many other posts, so I won't belabor it here. 

Note that the oligo prices above are for column-based synthesis, and that oligos synthesized on arrays are much less expensive. However, array synthesis comes with the usual caveat that the quality is generally lower, unless you are getting your DNA from Agilent, which probably means you are getting your dsDNA from Gen9.

Note also that the distinction between the price of oligos and the price of double-stranded sDNA is becoming less useful. Whether you are ordering from Life/Thermo or from your local academic facility, the cost of producing oligos is now, in most cases, independent of their length. That's because the cost of capital (including rent, insurance, labor, etc) is now more significant than the cost of goods. Consequently, the price reflects the cost of capital rather than the cost of goods. Moreover, the cost of the columns, reagents, and shipping tubes is certainly more than the cost of the atoms in the sDNA you are ostensibly paying for. Once you get into longer oligos (substantially larger than 50-mers) this relationship breaks down and the sDNA is more expensive. But, at this point in time, most people aren't going to use longer oligos to assemble genes unless they have a tricky job that doesn't work using short oligos.

Looking forward, I suspect oligos aren't going to get much cheaper unless someone sorts out how to either 1) replace the requisite human labor and thereby reduce the cost of capital, or 2) finally replace the phosphoramidite chemistry that the industry relies upon. I know people have been talking about new synthesis chemistries for many years, but I have not seen anything close to market.

Even the cost of double-stranded sDNA depends less strongly on length than it used to. For example, IDT's gBlocks come at prices that are constant across quite substantial ranges in length. Moreover, part of the decrease in price for these products is embedded in the fact that you are buying smaller chunks of DNA that you then must assemble and integrate into your organism of choice. The longer gBlocks come in as low as ~$0.15/base, but you have to spend time and labor in house in order to do anything with them. Finally, so far as I know, we are still waiting for Gen9 and Cambrian Genomics to ship DNA at the prices they have suggested are possible. 
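To see how the per-base arithmetic plays out, here is a minimal sketch comparing two hypothetical routes to a gene-length construct. Aside from the ~$0.15/base gBlock figure quoted above, the prices and the oligo-tiling scheme are illustrative assumptions, not vendor quotes:

```python
def construct_price(length_bp, price_per_base):
    """Price of a DNA construct at a flat per-base rate."""
    return length_bp * price_per_base

gene_len = 1500  # bp, a typical gene-length construct

# Route 1: order it as a dsDNA fragment at ~$0.15/base (gBlock-style pricing).
fragment_cost = construct_price(gene_len, 0.15)

# Route 2: assemble in house from overlapping 60-mer column oligos
# ($0.10/base is an assumed oligo price; labor is not counted here).
n_oligos = 2 * gene_len // 60  # rough two-strand tiling of the construct
oligo_cost = n_oligos * 60 * 0.10

print(f"dsDNA fragment: ${fragment_cost:.2f}")
print(f"oligo assembly: ${oligo_cost:.2f} + assembly labor")
```

Under these assumed prices the reagent costs are comparable; the real difference is the in-house time and labor, which is exactly the point about the cost of capital dominating the cost of goods.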

How much should we care about the price of sDNA?

I recently had a chat with someone who has purchased and assembled an absolutely enormous amount of sDNA over the last decade. He suggested that if prices fell by another order of magnitude, he could switch completely to outsourced assembly. This is an interesting claim, and potentially an interesting "tipping point". However, what this person really needs is not just sDNA, but sDNA integrated in a particular way into a particular genome operating in a particular host. And, of course, the integration and testing of the new genome in the host organism is where most of the cost is. Given the wide variety of emerging applications, and the growing array of hosts/chassis, it isn't clear that any given technology or firm will be able to provide arbitrary synthetic sequences incorporated into arbitrary hosts.

Consequently, as I have described before, I suspect that we aren't going to see a huge uptake in orders for sDNA until cheaper genes and circuits are coupled to reductions in cost for the rest of the build, test, and measurement cycle. Despite all the talk recently about organism fabs and outsourced testing, I suggest that what will really make a difference is providing every lab and innovator with adequate tools and infrastructure to do their own complete test and measurement. We should look to progress in pushing all facets of engineering capacity for biological systems, not just on reducing the cost of reading old instructions and writing new ones.


Comments from original post follow.

George Church:

"the performance and cost claimed by Illumina are just for human genomes rather than for arbitrary de novo sequencing."  --Rob
But most of the curve has been based on human genome sequencing until now.  Why exclude human, rather than having a separate curve for "de novo"?  Human genomes constitute a huge and compelling market.    -- George  

"oligos synthesized on arrays are much less expensive. However, array synthesis comes with the usual caveat that the quality is generally lower, unless you are getting your DNA from Agilent"  -- Rob
So why exclude Agilent from the curve? -- George

"we aren't going to see a huge uptake in orders for sDNA until cheaper genes and circuits are coupled to reductions in cost for the rest of the build, test, and measurement cycle." --Rob
Is this the sort of enabling technology needed?:

My response to George:


Thanks for your comments. I am not sure what you might mean by "most of the curve has been based on human genome sequencing". From my first efforts in 2000 (published initially in 2003), I have tried to use data that is more general. It is true that human genomes constitute a large market, but they aren't the only market. By definition, if you are interested in sequencing or building any other organism, then instruments that specialize in sequencing humans are of limited relevance. It may also be true that the development of new sequencing technologies and instruments has been driven by human sequencing, but that is beside the point. It may even be true that the new Illumina systems can be just as easily used to sequence mammoths, but that isn't happening yet. I have been doing my best to track the cost, and now the price, of de novo sequencing.

As I mention in the post, it is time that everyone, including me, started being more careful about the difference between cost and price. This brings me to oligos.

Agilent oligos are a special case. So far as I know, only Gen9 is using Agilent oligos as raw material to build genes. As you know, Cambrian Genomics is using arrays produced using technology developed at Combimatrix, and in any event isn't yet on the market. It is my understanding that, aside from Gen9, Agilent's arrays are primarily used for analysis rather than for building anything. Therefore, the market *price* of Agilent oligos is irrelevant to anyone except Gen9.

If the *cost* of Agilent oligos to Gen9 were reflected in the *price* of the genes sold by Gen9, or if those oligos were more broadly used, then I would be more interested in including them on the price curve. So far as I am aware, the price for the average customer at Gen9 is in the neighborhood of $.15-.18 per base. I've heard Drew Endy speak of a "friends and family" (all academics?) price of ~$.09 from Gen9, but that does not appear to be available to all customers, so I will leave it off the plot for now.

All this comes down to the obvious fact that, as the industry matures and becomes supported more by business-to-business sales rather than being subsidized by government grants and artificially cheap academic labor, the actual cost and actual price start to matter a great deal. De-extinction, in particular, might be an example where an academic/non-profit project might benefit from low cost (primarily cost of goods and cost of labor) that would be unachievable on the broader market, where the price would be set by 1) keeping the doors of a business open, 2) return on capital, and 3) competition, not necessarily in that order. The academic cost of developing, demonstrating, and even using technologies is almost always very different from the eventual market price of those technologies.

The bottom line is that, from day one, I have been trying to understand the impact of biological technologies on the economy. This impact is most directly felt, and tracked, via the price that most customers pay for goods and services. I am always looking to improve the metrics I use, and if you have suggestions about how to do this better I am all ears.

Finally, yes, the papers you cite (above and on the De-extinction mailing list) describe the sort of thing that could help reduce engineering costs. Ultimately technologies like those will reduce the market price of products resulting from that engineering process. I look forward to seeing more, and also to seeing this technology utilized in the market.

Thanks again for your thoughtful questions.

BAS: From national security to natural security

Here is my recent essay in Bulletin of the Atomic Scientists: "From national security to natural security".

The first few paragraphs:

From 10,000 meters up, the impact of humans on the Earth is clear. Cities spanning kilometers are connected by roadways stretching to the horizon. Crowding the spaces in between are fields that supply food and industrial feedstock to satisfy a variety of human hungers. These fields feed humanity. Through stewardship we maintain their productivity and thus sustain societies that extend around the globe; if these fields fall into ill health, or if we push them into sickness, we risk the fate of those same societies.

Humans have a long history of modifying the living systems they rely on. Forests in Europe and North America have been felled for timber and have regrown, while other large tracts of land around the world have been completely cleared for use in agriculture. The animals and plants we humans eat on a regular basis have been selected and bred over millennia to suit the human palate and digestive tract. All these crops and products are shipped and consumed globally to satisfy burgeoning demand.

Our technology and trade thereby support a previously unimaginable quality of life for a previously impossible number of people. Humanity has kept Malthus at bay through centuries of growth. Yet our increasing numbers impose a load that is now impacting nature's capacity to support human societies. This stress comes at a time when ever-larger numbers of humans demand more: more food, more clean water, more energy, more education, more entertainment, more more.

Increasing human demand raises the question of supply, and of the costs of meeting that supply. How we choose to spend or to conserve land, water, and air to meet our needs clearly impacts other organisms that also depend on these resources. Nature has intrinsic value for many people in the form of individual species, ecosystems, and wilderness; nature also constitutes critical infrastructure in the form of ecosystems that keep us alive. That infrastructure has quantifiable economic value. Consequently, nature, and the change we induce in it, is clearly interwoven with our economy. That is, the security and health of human societies depends explicitly upon the security and health of natural systems. Therefore, as economic security is now officially considered as part and parcel of national security strategy, it is time to expand our understanding of national security to include natural security.

Here is the main page for the 5 November, 2013 Congressional Briefing in the U.S. Senate, Tooling the U.S. Bioeconomy: Synthetic Biology. Speakers included Mary Maxon, Darlene Solomon, and Chris Voigt. Here is my contribution:


Rob Carlson, Components and Potential of the Growing Bioeconomy from ACS Science & the Congress on Vimeo.

And here is the Q&A following the presentations, during which we got into issues of risk, security, public acceptance, etc:

Harry Potter and The Future of Nature

How will Synthetic Biology and Conservation Shape the Future of Nature?  Last month I was privileged to take part in a meeting organized by The Wildlife Conservation Society to consider that question.  Here is the framing paper (PDF), of which I am a co-author.  There will be a follow-up paper in the coming months.  I am still mulling over what I think happened during the meeting, and below are a few observations that I have managed to settle on so far.  Others have written their own accounts.  Here is a summary from Julie Gould, riffing on an offer that Paul Freemont made to conservation biologists at the close of the meeting, "The Open Door".  Ed Gillespie has a lovely, must-read take on Pandora's Box, cane toads, and Emily Dickinson, "Hope is the thing with feathers".  Cristian Samper, the new head of the Wildlife Conservation Society, was ultimately quite enthusiastic; Jim Thomas of ETC, unsurprisingly, not so much.

The meeting venue was movie set-like Cambridge.  My journey took me through King's Cross, with its requisite mock-up of a luggage trolley passing through the wall at platform nine and three-quarters.  So I am tempted to style parts of the meeting as a confrontation between a boyish protagonist trying to save the world and He Who Must Not Be Named.  But my experience at the meeting was that not everyone was able to laugh at a little tension-relieving humor, or even to recognize that humor.  Thus the title of this post is as much as I will give in to temptation.

How Can SB and CB Collaborate?

I'll start with an opportunity that emerged during the week, exactly the sort of thing you hope would come from introducing two disciplines to each other.  What if synthetic biology could be used as a tool to aid in conservation efforts, say to buttress biodiversity against threats?  If the ongoing, astonishing loss of species were an insufficient motivation to think about this possibility, now some species that humans explicitly rely upon economically are under threat.  Synthetic biology might - might! - be able to offer help in the form of engineering species to be more robust in the face of a changing environment, such as enabling corals to cope with increases in water temperature and acidity, or perhaps via intervening in a host-pathogen relationship, such as that between bats and white-nose disease or between bees and their mites and viruses.

The first thing to say here is that if the plight of various species can be improved through changes in human behavior then we should by all means work toward that end.  The simpler solution is usually the better solution.  For example, it might be a good idea to stop using those pesticides and antibiotics that appear to create more problems than they solve when introduced into the environment.  Moreover, at the level of the environment and the economy, technological fixes are probably best reserved until we try changes in human behavior.  After all, we've mucked up such fixes quite a few times already.  (All together now: "Cane Toad Blues".)  But what if the damage is too far along and cannot be addressed by changes in behavior?  We should at least consider the possibility that a technological fix might be worth a go, if for no other reason than to figure out how to create a back up plan.  Given the time scales involved in manipulating complex organisms, exploring the option of a back-up plan means getting started early.  It also means thoughtfully considering which interventions would be most appropriate and urgent, where part of the evaluation should probably involve asking whether changes in human behavior are likely to have any effect.  In some cases, a technical solution is likely to be our only chance.

First up: corals.

We heard from Stanford's Steve Palumbi on work to understand the effects of climate change on corals in the South Pacific.  Temperature and acidity - two parameters already set on long-term trajectories of change - are affecting coral health around the globe.  But it turns out that in the lab some corals can handle remarkably difficult environmental conditions.  What if we could isolate the relevant genetic circuits and, if necessary, transplant them into other species, or turn them on if they are already widespread?  My understanding of Professor Palumbi's talk is that it is not yet clear why some corals have the pathway turned on and some do not.  So, first up, a bunch of genetics, molecular biology, and field biology to figure out why the corals do what they do.  After that, if necessary, it seems that it would be worth exploring whether other coral species can be modified to use the relevant pathways.  Corals are immensely important for the health of both natural ecosystems and human economies; we should have a back-up plan, and synthetic biology could certainly contribute.

Next up: bats.

Bats are unsung partners of human agriculture, and they contribute an estimated $23 billion annually to U.S. farmers by eating insects and pollinating various plants.  Here is a nice summary article from The Atlantic by Stephanie Gruner Buckley on the impact upon North American bats of white-nose syndrome.  The syndrome, caused by a fungus evidently imported from Europe, has already killed so many bats that we may see an impact on agriculture as soon as this year.  European bats are resistant to the fungus, so one option would be to try to introduce the appropriate genes into North American bats via standard breeding.  However, bats breed very slowly, usually only having one pup a year, and only 5 or so pups in a lifetime.  Given the mortality rate due to white-nose syndrome, this suggests breeding is probably too slow to be useful in conservation efforts.  What if synthetic biology could be used to intervene in some way, either to directly attack the non-native fungus or to interfere with its attack on bats?  Obviously this would be a hard problem to take on, but both biodiversity and human welfare would be improved by making progress here.
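To put toy numbers on why breeding is too slow, here is a minimal sketch.  The mortality figure is my assumption for illustration only; white-nose syndrome mortality varies widely by colony, and real bat demography is far more complicated than this.

```python
# Toy population model for the breeding argument above.  Assumptions
# (mine, for illustration): 70% annual mortality in an infected colony,
# versus roughly one pup per female per year (~0.5 births per individual).
pop = 1_000_000       # hypothetical starting colony size
birth_rate = 0.5      # ~one pup per female per year
wns_mortality = 0.7   # assumed annual mortality from white-nose syndrome

for year in range(5):
    pop = pop * (1 + birth_rate) * (1 - wns_mortality)

print(f"population after 5 years: {pop:,.0f}")
```

The per-year growth factor is 1.5 x 0.3 = 0.45, so the colony collapses to a few percent of its original size within five years -- far faster than resistance genes could plausibly spread through a population that produces one pup per female per year.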

And now: bees.

If you eat, you rely on honeybees.  Due to a variety of causes, bee populations have fallen to the point where food crops are in jeopardy.  Entomologist Dennis vanEngelsdorp, quoted in Wired, warns "We're getting closer and closer to the point where we don't have enough bees in this country to meet pollination demands.  If we want to grow fruits and nuts and berries, this is important.  One in every three bites [of food consumed in the U.S.] is directly or indirectly pollinated by bees."  Have a look at the Wired article for a summary of the constellation of causes of Colony Collapse Disorder, or CCD -- they are multifold and interlocking.  Obviously, the first thing to do is to stop making the problem worse; Europe has banned a class of pesticide that is exceptionally hard on honeybees, though the various sides in this debate continue to argue about whether that will make any difference.  This change in human behavior may have some impact, but most experts agree we need to do more.  Efforts are underway to breed bees that are resistant both to pesticides and to particular mites that prey on bees and that transmit viruses between bees.  Applying synthetic biology here might be the hardest task of all, given the complexity of the problem.  Should synthetic biologists focus on boosting apian immune systems?  Should they focus on the mite?  Apian viruses?  It sounds very difficult.  But with such a large fraction of our food supply dependent upon healthy bees, it also seems pretty clear that we should be working on all fronts to sort out potential solutions.

A Bit of Good News

Finally, a problem synthetic biologists are already working to solve: malaria.  The meeting was fortunate to hear directly from Jay Keasling.  Keasling presented progress on a variety of fronts, but the most striking was his announcement that Sanofi-Aventis has produced substantially more artemisinin this year than planned, marking real progress in producing the best malaria drug extant using synthetic biology rather than by purifying it from plants.  Moreover, he announced that Sanofi and OneWorldHealth are likely to take over the entire world production of artemisinin.  The original funding deal between The Gates Foundation, OneWorldHealth, Amyris, and Sanofi required selling at cost.  The collaboration has worked very hard at bringing the price down, and now it appears that they can simply outcompete the for-profit pricing monopoly.

The stated goal of this effort is to reduce the cost of malaria drugs and provide inexpensive cures to the many millions of people who suffer from malaria annually.  Currently, the global supply fluctuates, as, consequently, do prices, which are often well above what those afflicted can pay.  A stable, high volume source of the drug would reduce prices and also reduce the ability of middle-men to sell doctored, diluted, or mis-formulated artemisinin, all of which are contributing to a rise of resistant pathogens.

There is a potential downside to this project.  If Sanofi and OneWorldHealth do corner the market on artemisinin, then farmers who currently grow artemisia will no longer have that option, at least for supplying the artemisinin market.  That might be a bad thing, so we should at least ask the question of whether the world is a better place with artemisinin production done in vats or derived from plants.  This question can be broken into two pieces: 1) what is best for the farmers? and 2) what is best for malaria sufferers?  It turns out these questions have the same answer.

There is no question that people who suffer from malaria will be better off with artemisinin produced in yeast by Sanofi.  Malaria is a debilitating disease that causes pain, potentially death, and economic hardship.  The best estimates are that countries in which malaria is endemic suffer a hit to GDP growth of 1.3% annually compared to non-malarious countries.  Over just a few years this yearly penalty swamps all the foreign aid those countries receive; I've previously argued that eliminating malaria would be the biggest humanitarian achievement in history and would make the world a much safer place.  Farmers in malarious countries are the worst hit, because the disease prevents them from getting into the fields to work.  I clashed in public over this with Jim Thomas around our respective testimonies in front of the Presidential Bioethics Commission a couple of years ago.  Quoting myself briefly from the relevant blog post,

The human cost of not producing inexpensive artemisinin in vats is astronomical.  If reducing the burden of malaria around the world on almost 2 billion people might harm "a few thousand" farmers, then we should make sure those farmers can make a living growing some other crop.  We can solve both problems.  ...Just one year of 1.3% GDP growth recovered by reducing (eliminating?) the impact of malaria would more than offset paying wormwood farmers to grow something else.  There is really no argument to do anything else.
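The compounding behind that claim can be sketched in a few lines.  The economy size and time horizon below are arbitrary assumptions for illustration; the 1.3% growth penalty is the figure cited above.

```python
# Back-of-the-envelope sketch of the GDP argument.  The economy size and
# horizon are arbitrary; the 1.3% annual growth shortfall is the estimate
# cited in the post for malaria-endemic countries.
gdp = 10e9       # hypothetical national GDP, dollars
penalty = 0.013  # annual growth shortfall attributed to endemic malaria
years = 10

without_malaria = gdp
for _ in range(years):
    without_malaria *= 1 + penalty  # growth recovered if malaria were eliminated

lost = without_malaria - gdp
print(f"Cumulative GDP forgone over {years} years: ${lost / 1e9:.2f} billion")
```

Even for this modest hypothetical economy, the forgone growth runs over a billion dollars in a decade, which is the sense in which the penalty "swamps" foreign aid and dwarfs the cost of paying wormwood farmers to switch crops.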

For a bit more background on artemisinin supply and pricing, and upon the apparent cartel in control of pricing both the drug and the crop, see this piece in Nature last month by Mark Peplow.  I was surprised to learn that the price of artemisia is set by a small group that controls production of the drug.  This group, unsurprisingly, is unhappy that they may lose control of the market for artemisinin to a non-profit coalition whose goal is to eliminate the disease.  Have a look at the chart titled "The Cost of Progress", which reveals substantial price fluctuations, to which I will return below.

Mr. Thomas responded to Keasling's announcement in Cambridge with a broadside in the Guardian UK against Keasling and synthetic biology more generally.  Mr. Thomas is always quick to shout "What about the farmers?"  Yet he is rather less apt to offer any analysis of what farmers actually gain, or lose, by planting artemisia.

The core of the problem for farmers is in that chart from Nature, which shows that artemisinin has fluctuated in price by a factor of 3 over the last decade.  Those fluctuations are bad for both farmers and malaria sufferers; farmers have a hard time knowing whether it makes economic sense to plant artemisia, which subsequently means shortages if farmers don't plant enough.  Shortages mean price spikes, which cause more farmers to plant, which results in oversupply, which causes the price to plunge, etc.  You'll notice that Mr. Thomas asserts that farmers know best, but he never gets down to looking at the actual numbers to ask whether farmers benefit by growing artemisia.  The numbers are quite revealing.
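The spike-plant-glut cycle just described is the classic cobweb dynamic: farmers plant in response to today's price, but the harvest sells at next season's price.  A toy sketch, with an assumed equilibrium price and supply elasticity that are my invented parameters, not data:

```python
# Toy cobweb model of the artemisia boom-bust cycle.  The $400/kg
# equilibrium and the 0.5 exponent (supply responsiveness) are invented
# for illustration; real planting lags run a year or more.
eq = 400.0      # assumed long-run equilibrium price, $/kg
price = 1000.0  # start during a price spike

prices = [price]
for season in range(6):
    # High prices trigger over-planting, pushing the next season's price
    # below equilibrium; low prices do the reverse.
    price = eq * (eq / price) ** 0.5
    prices.append(price)

print(", ".join(f"${p:.0f}" for p in prices))
```

The price overshoots below equilibrium, rebounds above it, and only slowly settles -- which is exactly why a farmer deciding whether to plant artemisia based on today's spike is likely to harvest into a glut.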

Eyeballing "The Cost of Progress" chart, it looks like artemisia has been below the $400/kg level for about half the last 10 years.  To be honest, there isn't enough data on the chart to make firm conclusions, but it does look like the most stable price level is around $350/kg, with rapid and large price spikes up to about $1000/kg.  Farmers who time their planting right will probably do well; those who are less lucky will make much less on the crop.  So it goes with all farming, unfortunately, as I am sure Mr. Thomas would agree.

During his talk, Keasling put up a chart I hadn't seen before, which showed predicted farmer revenues for a variety of crops.  The chart is below; it makes clear that farmers will have substantially higher revenues planting crops other than artemisia at prices at or below $400/kg.

The Strange Arguments Against Microbial Production of Malaria Drugs

Mr. Thomas' response in the Guardian to rational arguments and actual data was a glib accusation that Keasling is dismissing the welfare of farmers with "Let them plant potatoes".  This is actually quite clever and witty, but not funny in the slightest when you look at the numbers.  Thomas worries that farmers in Africa and Asia will suffer unduly from a shift away from artemisia to yeast.  But here is the problem: those farmers are already suffering -- from malaria.  Digging deeper, it becomes clear that Mr. Thomas is bafflingly joining the pricing cartel in arguing against the farmers' best interests.

A brief examination of the latest world malaria map shows that the most intense malaria hot spots are in Africa and Asia, with South America not far behind (here is the interactive CDC version).  Artemisia is primarily grown in Africa and Asia.  That is, farmers most at risk of contracting malaria only benefit economically when there is a shortage of artemisinin, the risk of which is maintained by leaving artemisia production in the hands of farmers.  Planting sufficient quantities of artemisia to meet demand means prices that are not economically viable for the farmer.  There are some time lags here due to growing and processing the crop into the drug, but the upshot is that the only way farmers make more money planting artemisia than other crops is when there is a shortage.  This is a deadly paradox, and its existence has only one beneficiary: the artemisinin pricing cartel.  But we can now eliminate the paradox.  It is imperative for us to do so.

Once you look at the numbers there is no argument Mr. Thomas, or anyone else, can make that we should do anything but brew artemisinin in vats and bring the price as low as possible.

I had previously made the macro-scale economic arguments about humanitarian impacts and economic growth.  Malarious countries, and all the farmers in them, would benefit tremendously by a 1.3% annual increase in GDP.  But I only realized while writing this post that the micro-scale argument gives the same answer: the farmers most at risk from malaria only make money growing artemisia when there is a shortage of the drug, which is when they are most likely to be affected by the disease.

I get along quite well in person with Mr. Thomas, but I have long been baffled by his arguments about artemisinin.  I heartily support his aims of protecting the rights of farmers and taking care of the land.  We should strive to do the right thing, except when analysis reveals it to be the wrong thing.  Since I only just understood the inverse relationship between artemisinin pricing and the availability of the drug to the very farmers growing artemisia, I am certain Mr. Thomas has not had the opportunity to consider the facts and think through the problem so that he might come to the same conclusion.  I invite him to do so.
Here are updated cost and productivity curves for DNA sequencing and synthesis.  Reading and writing DNA is becoming ever cheaper and easier.  The Economist and others call these "Carlson Curves", a name I am ambivalent about but have come to accept if only for the good advertising.  I've been meaning to post updates for a few weeks; the appearance today of an opinion piece at Wired about Moore's Law serves as a catalyst to launch them into the world.  In particular, two points need some attention, the  notions that Moore's Law 1) is unplanned and unpredictable, and 2) somehow represents the maximum pace of technological innovation.

DNA Sequencing Productivity is Skyrocketing

First up: the productivity curve.  Readers new to these metrics might want to have a look at my first paper on the subject, "The Pace and Proliferation of Biological Technologies" (PDF) from 2003, which describes why I chose to compare the productivity enabled by commercially available sequencing and synthesis instruments to Moore's Law.  (Briefly, Moore's Law is a proxy for productivity; more transistors putatively means more stuff gets done.)  You have to choose some sort of metric when making comparisons across such widely different technologies, and, however much I hunt around for something better, productivity always emerges at the top.
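One way to make the comparison across technologies concrete is to convert any measured improvement into a doubling time and set it next to Moore's Law.  The sequencing figures below are purely illustrative placeholders, not values read from my chart:

```python
# Doubling-time comparison sketch.  The 10,000x-over-8-years figure for
# sequencing productivity is a hypothetical placeholder, not data from
# the figure; the ~1.5-year Moore's Law doubling time is the canonical one.
import math

def doubling_time(fold_change: float, years: float) -> float:
    """Years required to double, given a fold_change improvement over `years`."""
    return years * math.log(2) / math.log(fold_change)

seq = doubling_time(10_000, 8)  # hypothetical sequencing improvement
moore = 1.5                     # canonical transistor doubling time, years

print(f"sequencing doubling time ~ {seq:.2f} yr vs Moore's Law ~ {moore} yr")
```

On numbers like these, sequencing productivity doubles several times faster than transistor counts, which is the sense in which the curves have pulled away from Moore's Law.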

It's been a few years since I updated this chart.  The primary reason for the delay is that, with the profusion of different sequencing platforms, it became somewhat difficult to compare productivity [bases/person/day] across platforms.  Fortunately, a number of papers have come out recently that either directly make that calculation or provide enough information for me to make an estimate.  (I will publish a full bibliography in a paper later this year.  For now, this blog post serves as the primary citation for the figure below.)

Visual inspection reveals a number of interesting things.  First, the DNA synthesis productivity line stops in about 2008 because there have been no new instruments released publicly since then.  New synthesis and assembly technologies are under development by at least two firms, which have announced they will run centralized foundries and not sell instruments.  More on this later.

Second, it is clear that DNA sequencing platforms are improving very rapidly, now much faster than Moore's Law.  This is interesting in itself, but I point it out here because of the post today at Wired by Pixar co-founder Alvy Ray Smith, "How Pixar Used Moore's Law to Predict the Future".  Smith suggests that "Moore's Law reflects the top rate at which humans can innovate. If we could proceed faster, we would," and that "Hardly anyone can see across even the next crank of the Moore's Law clock."

Moore's Law is a Business Model and is All About Planning -- Theirs and Yours

As I have written previously, early on at Intel it was recognized that Moore's Law is a business model (see the Pace and Proliferation paper, my book, and in a previous post, "The Origin of Moore's Law").  Moore's Law was always about economics and planning in a multi-billion dollar industry.  When I started writing about all this in 2000, a new chip fab cost about $1 billion.  Now, according to The Economist, Intel estimates a new chip fab costs about $10 billion.  (There is probably another Law to be named here, something about exponential increases in cost of semiconductor processing as an inverse function of feature size.  Update: This turns out to be Rock's Law.)  Nobody spends $10 billion without a great deal of planning, and in particular nobody borrows that much from banks or other financial institutions without demonstrating a long-term plan to pay off the loan.  Moreover, Intel has had to coordinate the manufacturing and delivery of very expensive, very complex semiconductor processing instruments made by other companies.  Thus Intel's planning cycle explicitly extends many years into the future; the company sees not just the next crank of the Moore's Law clock, but several cranks.  New technology has certainly been required to achieve these planning goals, but that is just part of the research, development, and design process for Intel.  What is clear from comments by Carver Mead and others is that even if the path was unclear at times, the industry was confident that they could get to the next crank of the clock.
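The two fab-cost figures quoted above imply a growth rate that lines up with Rock's Law.  A quick check (taking "now" as 2012, an assumption on my part):

```python
# Implied annual growth in chip-fab cost from the two figures quoted in
# the post: ~$1 billion in 2000, ~$10 billion now (assumed to be 2012).
c_2000, c_now = 1e9, 10e9
years = 12

rate = (c_now / c_2000) ** (1 / years) - 1
print(f"implied fab-cost growth: {rate:.1%} per year")
# For comparison, Rock's Law (fab cost doubling every ~4 years) implies
# 2 ** (1/4) - 1, roughly 19% per year -- the same ballpark.
```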

Moore's Law served a second purpose for Intel, and one that is less well recognized but arguably more important; Moore's Law was a pace selected to enable Intel to win.  That is why Andy Grove ran around Intel pushing for financial scale (see "The Origin of Moore's Law").  I have more historical work to do here, but it is pretty clear that Intel successfully organized an entire industry to move at a pace only it could survive.  And only Intel did survive.  Yes, there are competitors in specialty chips and in memory or GPUs, but as far as high volume, general CPUs go, Intel is the last man standing.  Finally, and alas I don't have a source anywhere for this other than hearsay, Intel could have in fact gone faster than Moore's Law.  Here is the hearsay: Gordon Moore told Danny Hillis who told me that Intel could have gone faster.  (If anybody has a better source for that particular point, give me a yell on Twitter.)  The inescapable conclusion from all this is that the management of Intel made a very careful calculation.  They evaluated product roll-outs to consumers, the rate of new product adoption, the rate of semiconductor processing improvements, and the financial requirements for building the next chip fab line, and then set a pace that nobody else could match but that left Intel plenty of headroom for future products.  It was all about planning.

The reason I bother to point all this out is that Pixar was able to use Moore's Law to "predict the future" precisely because Intel meticulously planned that future.  (Calling Alan Kay: "The best way to predict the future is to invent it.")  Which brings us back to biology.  Whereas Moore's Law is all about Intel and photolithography, the reason that productivity in DNA sequencing is going through the roof is competition among not just companies but among technologies.  And we are only just getting started.  As Smith writes in his Wired piece, Moore's Law tells you that "Everything good about computers gets an order of magnitude better every five years."  Which is great: it enabled other industries and companies to plan in the same way Pixar did.  But Moore's Law doesn't tell you anything about any other technology, because Moore's Law was about building a monopoly atop an extremely narrow technology base.  In contrast, there are many different DNA sequencing technologies emerging because many different entrepreneurs and companies are inventing the future.

The first consequence of all this competition and invention is that it makes my job of predicting the future very difficult.  This emphasizes the difference between Moore's Law and Carlson Curves (it still feels so weird to write my own name like that): whereas Intel and the semiconductor industry were meeting planning goals, I am simply keeping track of data.  There is no real industry-wide planning in DNA synthesis or sequencing, other than a race to get to the "$1000 genome" before the next guy.  (Yes, there is a vague road-mappy thing promoted by the NIH that accompanied some of its grant programs, but there is little if any coordination because there is intense competition.)

Biological Technologies are Hard to Predict in Part Because They Are Cheaper than Chips

Compared to other industries, the barrier to entry in biological technologies is pretty low.  Unlike chip fabs, there is nothing in biology that costs $10 billion commercially, nor even $1 billion.  (I have come to mostly disbelieve pharma industry claims that developing drugs is actually that expensive, but that is another story for another time.)  The Boeing 787 reportedly cost $32 billion to develop as of 2011, and that is on top of a century of multi-billion dollar aviation projects that had to come before the 787.

There are two kinds of costs that are important to distinguish here.  The first is the cost of developing and commercializing a particular product.  Based on the money reportedly raised and spent by Life, Illumina, Ion Torrent (before acquisition), Pacific Biosciences, Complete Genomics (before acquisition), and others, it looks like developing and marketing second-generation sequencing technology can cost upwards of about $100 million.  Even more money gets spent, and lost, in operations before anybody is in the black.  My intuition says that the development costs are probably falling as sequencing starts to rely more on other technology bases, for example semiconductor processing and sensor technology, but I don't know of any real data.  I would also guess that nanopore sequencing, should it actually become a commercial product this year, will have cost less to develop than other technologies, but, again, that is my intuition based on my time in clean rooms and at the wet bench.  I don't think there is great information yet here, so I will suspend discussion for the time being.

The second kind of cost to keep in mind is the use of new technologies to get something done.  Which brings in the cost curve.  Again, the forthcoming paper will contain appropriate references.
[Figure: Carlson cost-per-base curves for DNA sequencing and synthesis, October 2012]
The cost per base of DNA sequencing has clearly plummeted lately.  I don't think there is much to be made of the apparent slow-down in the last couple of years.  The NIH version of this plot has more fine grained data, and it also directly compares the cost of sequencing with the cost per megabyte for memory, another form of Moore's Law.  Both my productivity plot above and the NIH plot show that sequencing has at times improved much faster than Moore's Law, and generally no slower.

If you ponder the various wiggles, it may be true that the fall in sequencing cost is returning to a slower pace after a period in which new technologies dramatically changed the market.  Time will tell.  (The wiggles certainly make prediction difficult.)  One feature of the rapid fall in sequencing costs is that it makes the slow-down in synthesis look smaller; see this earlier post for different scale plots and a discussion of the evaporating maximum profit margin for long, double-stranded synthetic DNA (the difference between the orange and yellow lines above).

Whereas competition among companies and technologies is driving down sequencing costs, the lack of competition among synthesis companies has contributed to a stagnation in price decreases.  I've covered this in previous posts (and in this Nature Biotech article), but it boils down to the fact that synthetic DNA has become a commodity produced using relatively old technology.

Where Are We Headed?

Now, after concluding that the structure of the industry makes it hard to prognosticate, I must of course prognosticate.  In DNA sequencing, all hell is breaking loose, and that is great for the user.  Whether instrument developers thrive is another matter entirely.  As usual with start-ups and disruptive technologies, surviving first contact with the market is all about execution.  I'll have an additional post soon on how DNA sequencing performance has changed over the years, and what the launch of nanopore sequencing might mean.

DNA synthesis may also see some change soon.  The industry as it exists today is based on chemistry that is several decades old.  The common implementation of that chemistry has heretofore set a floor on the cost of short and long synthetic DNA, and in particular the cost of synthetic genes.  However, at least two companies are claiming to have technology that facilitates busting through that cost floor by enabling the use of smaller amounts of poorer quality, and thus less expensive, synthetic DNA to build synthetic genes and chromosomes.

Gen9 is already on the market with synthetic genes selling for something like $.07 per base.  I am not aware of published cost estimates for production, other than the CEO claiming it will soon drop by orders of magnitude.  Cambrian Genomics has a related technology and its CEO suggests costs will immediately fall by 5 orders of magnitude.  Of course, neither company is likely to drop prices so far at the beginning, but rather will set prices to undercut existing companies and grab market share.  Assuming Gen9 and Cambrian don't collude on pricing, and assuming the technologies work as they expect, the existence of competition should lead to substantially lower prices on genes and chromosomes within the year.  We will have to see how things actually work in the market.  Finally, Synthetic Genomics has announced it will collaborate with IDT to sell synthetic genes, but as far as I am aware nothing new is actually shipping yet, nor have they announced pricing.

So, supposedly we are soon going to have lots more, lots cheaper DNA.  But you have to ask yourself who is going to use all this DNA, and for what.  The important business point here is that both Gen9 and Cambrian Genomics are working on the hypothesis that demand will increase markedly (by orders of magnitude) as the price falls.  Yet nobody can design a synthetic genetic circuit with more than a handful of components at the moment, which is something of a bottleneck on demand.  Another option is that customers will do less up-front predictive design and instead do more screening of variants.  This is how Amyris works -- despite their other difficulties, Amyris does have a truly impressive metabolic screening operation -- and there are several start-ups planning to provide similar (or even improved) high-throughput screening services for libraries of metabolic pathways.  I infer this is the strategy at Synthetic Genomics as well.  This all may work out well for both customers and DNA synthesis providers.  Again, I think people are working on an implicit hypothesis of radically increased demand, and it would be better to make the hypothesis explicit in part to identify the risk of getting it wrong.  As Naveen Jain says, successful entrepreneurs are good at eliminating risk, and I worry a bit that the new DNA synthesis companies are not paying enough attention to this point.

There are relatively simple scaling calculations that will determine the health of the industry.  Intel knew that it could grow financially in the context of exponentially falling transistor costs by shipping exponentially more transistors every quarter -- that is the business model of Moore's Law.  Customers and developers could plan product capabilities, just as Pixar did, knowing that Moore's Law was likely to hold for years to come.  But that was in the context of an effective pricing monopoly.  The question for synthetic gene companies is whether the market will grow fast enough to provide adequate revenues when prices fall due to competition.  To keep revenues up, they will then have to ship lots of bases, probably orders of magnitude more bases.  If prices don't fall, then something screwy is happening.  If prices do fall, they are likely to fall quickly as companies battle for market share.  It seems like another inevitable race to the bottom.  Probably good for the consumer; probably bad for the producer.
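The scaling calculation itself fits in a few lines: revenue is price times volume, so if price falls by a factor F while shipped volume grows by a factor V, revenue scales by V/F.  The specific factors below are hypothetical.

```python
# The "simple scaling calculation" for a gene synthesis business:
# revenue scales as (volume growth) / (price fall).  Factors are
# hypothetical, chosen to match the orders of magnitude in the post.
def revenue_factor(price_fall: float, volume_growth: float) -> float:
    """Multiple applied to revenue when price falls and volume grows."""
    return volume_growth / price_fall

# Price falls 10x: merely holding revenue flat requires shipping 10x more bases.
flat = revenue_factor(10, 10)
# Price falls 100x with only 10x more demand: revenue drops to a tenth.
squeezed = revenue_factor(100, 10)

print(flat, squeezed)
```

This is why the health of the industry hinges on whether demand really does grow by orders of magnitude as prices fall, rather than on the price drop alone.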

(Updated)  Ultimately, for a new wave of DNA synthesis companies to be successful, they have to provide the customer something of value.  I suspect there will be plenty of academic customers for cheaper genes.  However, I am not so sure about commercial uptake.  Here's why: DNA is always going to be a small cost of developing a product, and it isn't obvious making that small cost even cheaper helps your average corporate lab.

In general, the R part of R&D only accounts for 1-10% of the cost of the final product.  The vast majority of development costs are in polishing up the product into something customers will actually buy.  If those costs are in the neighborhood of $50-100 million, then reducing the cost of synthetic DNA from $50,000 to $500 is nice, but the corporate scientist-customer is more worried about knocking a factor of two, or an order of magnitude, off the $50 million.  This means that in order to make a big impact (and presumably to increase demand adequately) radically cheaper DNA must be coupled to innovations that reduce the rest of the product development costs.  As suggested above, forward design of complex circuits is not going to be adequate innovation any time soon.  The way out here may be high-throughput screening operations that enable testing many variant pathways simultaneously.  But note that this is not just another hypothesis about how the immediate future of engineering biology will change, but another unacknowledged hypothesis.  It might turn out to be wrong.
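The arithmetic behind that point, using the post's round numbers:

```python
# Share of a development budget recovered by radically cheaper DNA,
# using the round numbers in the post: a $50M development budget and
# synthetic DNA falling from $50,000 to $500.
dev_cost = 50e6                 # total product development cost, dollars
dna_before, dna_after = 50_000, 500

savings = dna_before - dna_after
share = savings / dev_cost
print(f"DNA savings as share of development cost: {share:.3%}")
```

The savings come to roughly a tenth of a percent of the development budget -- real money, but not the factor-of-two or order-of-magnitude reduction the corporate customer is actually hunting for.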

The upshot, just as I wrote in 2003, is that the market dynamics of biological technologies will  remain difficult to predict precisely because of the diversity of technology and the difficulty of the tasks at hand.  We can plan on prices going down; how much, I wouldn't want to predict.

Meeting on Conservation and Synthetic Biology, April 9-11

How will synthetic biology and conservation shape the future of nature?

April 9-11, 2013
Clare College
Cambridge, England

Sponsored by the Wildlife Conservation Society.  More info, including the agenda, here.

Scheduled speakers include: Dick Kitney, Georgina Mace, Kent Redford, Karen Esler, Rob Carlson, Sofia Alendra Valenzuela Aguila, Jay Keasling, Mildred Cho, Oliver Morton, Bertina Ceccarelli, Stewart Brand, and many others.


The New Biofactories

Robert Carlson, 2009
For What's Next, a Special Edition of the McKinsey Quarterly


Humans have been modifying biological systems for our own economic benefit for millennia. Improvements in crop yields and overall farming productivity have come from a continuing alteration of the genetic makeup--through selection and breeding--of the plants and animals upon which we rely. Now we find ourselves at the dawn of a new age of direct genetic modification. While the term "artificial life form" conjures up images of cyborgs or other creations of science fiction, the first such "artificial" creatures will actually be single-celled microorganisms. Even though these human-engineered life forms will be extremely simple, they will have an enormous impact on our world. Their biggest potential: the creation of biofuels and biomaterials, which have the promise to transform our entire economy.

The first explicitly artificial organisms emerged from recombinant DNA technology in the mid-1970s; this technology was commercialized with lightning speed. As of 2006, biotech drugs accounted for about $65 billion in sales worldwide. Just one drug, Epogen, has generated $10 billion in revenues since its creation. A molecular biologist--particularly when receiving stock options in a biotech start-up--would have to conclude that life forms that become "artificial" simply by the addition of one gene can be quite commercially significant.

Revenues from genetically modified "stuff" now exceed 1 percent of US GDP and are generated in three areas: drugs, agriculture, and industrial products like enzymes and plastics. These areas are growing at 10 to 20 percent per year, and together they are making a sizeable and growing contribution to the economy.

The biotech sector is also extremely productive. Between 2000 and 2007, biotech revenues added more than $100 billion to the economy, representing 2.5 percent of US GDP growth. This was accomplished by a biotech workforce of only about 250,000 people, less than one-sixth of one percent of the national workforce.
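Those productivity numbers are easy to sanity-check with back-of-the-envelope arithmetic; the workforce denominator below is my assumption (roughly the US labor force at the time), not a figure from the text:

```python
# Back-of-the-envelope check of the biotech productivity figures.
# Inputs from the text: ~$100B revenue added 2000-2007, ~250,000 workers.
# The total US workforce (~150 million) is an assumed round number.

revenue_added = 100e9          # dollars added to the economy, 2000-2007
biotech_workers = 250_000
us_workforce = 150e6           # assumed approximate US labor force

revenue_per_worker = revenue_added / biotech_workers
workforce_share = biotech_workers / us_workforce

print(f"Revenue added per biotech worker: ${revenue_per_worker:,.0f}")
print(f"Biotech share of the workforce: {workforce_share:.2%}")
```

The second line comes out near 0.17 percent, consistent with "less than one-sixth of one percent".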

Yet the underlying technology is immature compared with that in other sectors of the economy. The majority of biotech products that have reached the market are the result of just a handful of genetic modifications and insertions. The commercial significance of the biotech sector will grow as its ability to engineer new biological systems expands.

Until recently, the complexity of these systems was limited in large part by the cost of development. The labor required to build and test a complex genetic circuit was prohibitive. But since the mid-1990s, productivity in reading and writing genes has been improving exponentially, while costs have plunged. Now relatively large pieces of DNA can be designed electronically, sent to a gene "foundry," constructed, and returned via express mail in just a few weeks. It is already technically possible to build stretches of DNA as long as those of small bacterial genomes (about 400 genes).
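To get a feel for what "improving exponentially" means in practice, here is a toy calculation; the doubling rate is an illustrative assumption, not a measured value:

```python
# Illustrative only: if productivity in reading and writing DNA doubles
# every year (the exact rate is an assumption here), the cumulative
# improvement over a decade is roughly a thousandfold.

doublings_per_year = 1.0   # assumed improvement rate
years = 10
improvement = 2 ** (doublings_per_year * years)
print(f"Improvement over {years} years: {improvement:,.0f}x")  # 1,024x
```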

However, this is not the fastest road to commercially significant organisms. This is because the simpler the engineering task is, the greater the near-term economic impact will be. For example, aeronautical engineers do not attempt to build new aircraft with the complexity of a hawk, a hummingbird, or even a moth. They instead succeed by reducing complexity. Even the simplest cell contains far more bells and whistles than we can presently understand. Consequently, no biological engineer will succeed in building a system from scratch until most of that complexity is whittled away, leaving only the bare essentials. Real progress will come by adding to existing organisms just a few new genes--probably no more than 15.

Companies are already making substantive progress. Amyris Biotechnologies has modified yeast to transform sugar into useful compounds, including malaria drugs and biofuels that can substitute for today's jet fuel, diesel, and gasoline. The company will begin production of these fuels next year in converted ethanol fermentation plants in Brazil.

As the underlying technology matures, biofuels and bioplastics produced this way will become easier and cheaper to make than ethanol or traditional plastics, and they will perform better than even petroleum-based products. Their manufacture and use will also reduce the carbon emissions that cause climate change.

Such artificial life forms will fundamentally change how we power the economy, bringing about a switch from fossil fuels to biological feedstocks like sugar, starch, and cellulose. Biomanufacturing is less likely to be centralized, like petroleum refineries and ethanol plants, and will instead be more evenly distributed, like beer breweries.

Cars themselves might actually become the producers of the very fuels they consume. In the spring of 2007, researchers reported the successful construction of a synthetic pathway consisting of 13 enzymes from different organisms that can turn starch into hydrogen. This suggests a future in which sugar or starch--substances available at any grocery store--will go into our fuel tanks instead of gasoline. A fuel cell will use the hydrogen produced by engineered microbes in the tank to provide electric power for the car. Such a car would then become something of a cyborg, relying on living organisms to provide power to an inorganic shell. As one oil executive observed at a recent oil industry meeting, in this model "the car is the refinery".
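As a sketch of the chemistry, the overall reaction usually quoted for that starch-to-hydrogen pathway is C6H10O5 + 7 H2O -> 12 H2 + 6 CO2 per glucose unit of starch. I am quoting that stoichiometry from memory of the 2007 report, so treat it as an assumption; the snippet below just verifies that the equation is atom-balanced:

```python
# Atom-balance check for the assumed overall reaction:
#   C6H10O5 (starch unit) + 7 H2O  ->  12 H2 + 6 CO2

reactants = {"C": 6, "H": 10 + 7 * 2, "O": 5 + 7 * 1}
products  = {"C": 6, "H": 12 * 2,     "O": 6 * 2}

assert reactants == products
print("Balanced:", reactants)  # Balanced: {'C': 6, 'H': 24, 'O': 12}
```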

If this innovation comes to pass, a very different marketplace is likely to arise. The infrastructure for shipping and refining petroleum overseen by that self-same executive might become less relevant in a new biotech world. Moreover, if distributed biological processing of simple feedstocks can compete in low-margin markets like liquid transportation fuels, then it will also make significant inroads with higher-margin products like fibers, plastics, flavorings, and scents.

It will soon be possible to devise enzymes and organisms that "eat" a diverse array of feedstocks. One good example is municipal sewage. Now mostly treated and disposed of as waste, this resource will initially be used to grow unmodified algae. The algae will in turn be fed to synthetic systems--think of these as "artificial cows", a fusion of robot and biology that is beyond even the "cyborg" car--engineered to make materials and fuels. Eventually, the algae itself will be engineered to directly convert sewage into products. And inevitably, these artificial cows will move out into the fields, closer to large-volume agriculture. Modern harvesting equipment is already often driven by autonomous, satellite-guided control systems. Imagine robotic harvesters equipped with bioprocessing modules slowly wandering around farmland, consuming a variety of feedstocks, processing that material into higher-value products like fuels and plastics, and delivering it to distribution centers. These hybrid "cowborgs" would thereby become autonomous, distributed biomanufacturing platforms, engineered to supply us with the fuels and materials that we need.

Very few organisms on our planet are larger than about one meter across. Most of the biomass production, and therefore most of the biological processing, occurs at scales of microns to centimeters. While organisms produced by nature face different constraints than those designed by humans, we may find ever more inspiration in microbes, insects, and cows for our future production infrastructure. We have barely begun to tap the promise of biotech.

Upcoming Talks in New York Area

I'm headed to the New York area this week and will be giving three talks (two of which are open to the public).

May 4th, Noon, Princeton University: "Biology is Technology: Garage Biology, Microbrewing and the Economic Drivers of Distributed Biological Production"

May 5th, 1 pm, Genspace (33 Flatbush Avenue, Brooklyn): "Biology Is Technology: The Implications of Global Biotechnology"

May 7th-8th, The Hastings Center, "Progress and Prospects for Microbial Biofuels" for the next round of conversations on ethics, synthetic biology, and public policy.  The previous round of conversations is captured in this set of essays, which includes my contribution, "Staying Sober About Science" (free after registration).

Synthetic biology and "green" explosives

Here is my article with Dan Grushkin for Slate and Future Tense on "The Military's Push to Green Our Explosives", about using synthetic biology to make things go boom.  We had way more material than space, and we should probably write something else on the topic.

Here are the first three 'graphs:

Last year, when the United States military debuted footage of an iridescent drone the size and shape of a hummingbird buzzing around a parking lot, the media throated a collective hooah! Time magazine even devoted a cover to it. Meanwhile, with no fanfare at all--despite the enormous potential to reshape modern warfare--the military issued a request for scientists to find ways to design microbes that could produce explosives for weapons. Imagine a vat of genetically engineered yeast that produces chemicals for bombs and missiles instead of beer.

The request takes advantage of new research in synthetic biology, a science that applies engineering principles to genetics. To its humanitarian credit, in the field's short existence, scientists have genetically programmed bacteria and yeast to cheaply produce green jet fuels (now being tested by major airplane makers) and malaria medicines (scheduled for market in 2013). It's an auspicious beginning for a science that portends to revolutionize how we make things. In the future, we may harness cells to self-assemble into far more complex objects like cell phone batteries or behave like tiny programmable computers. The promise, however, comes yoked with risks.

The techniques that make synthetic biology such a powerful tool for positive innovation may also be used for destruction. The military's new search for biologically brewed explosives threatens to reopen an avenue of research that has been closed for 37 years: biotechnology developed for use in war.

Playing God, from BBC Horizon

Here is the full video of the BBC Horizon show on synthetic biology that aired earlier this year.  My bit starts at 38:30, but you would do well to watch the whole thing.  Oh, and spider goats!

Note: The video keeps getting yanked, but here is a link that works as of 30 Dec 2013:

Censoring Science is Detrimental to Security

Restricting access to science and technology in the name of security is historically a losing proposition.  Censorship of information that is known to exist incentivizes innovation and rediscovery. 

As most readers of this blog know, there has been quite a furor over new results demonstrating mutations in H5N1 influenza strains that are both deadly and highly contagious in mammals.  Two groups, led by Ron Fouchier in the Netherlands and Yoshihiro Kawaoka at The University of Wisconsin, have submitted papers to Nature and Science describing the results.  The National Science Advisory Board for Biosecurity (NSABB) has requested that some details, such as sequence information, be omitted from publication.  According to Nature, both journals are "reserving judgement about whether to censor the papers until the US government provides details of how it will allow genuine researchers to obtain redacted information".

For those looking to find more details about what happened, I suggest starting with Doreen Carvajal's interview with Fouchier in the New York Times, "Security in Flu Study Was Paramount, Scientist Says"; Katherine Harmon's firsthand account of what actually happened when the study was announced; and Heidi Ledford's post at Nature News about the NSABB's concerns.

If you want to go further, there is more good commentary, especially the conversation in the comments (including from a member of the NSABB), in "A bad day for science" by Vincent Racaniello.  See also Michael Eisen's post "Stop the presses! H5N1 Frankenflu is going to kill us all!", keeping in mind that Eisen used to work on the flu.

Writing at Foreign Policy, Laurie Garrett has done some nice reporting on these events in two posts, "The Bioterrorist Next Door" and "Flu Season".  She suggests that attempts to censor the results would be futile: "The genie is out of the bottle: Eager graduate students in virology departments from Boston to Bangkok have convened journal-review debates reckoning exactly how these viral Frankenstein efforts were carried out."

There is much I agree with in Ms. Garrett's posts.  However, I must object to her assertion that the work done by Fouchier and Kawaoka can be repeated easily using the tools of synthetic biology.  She writes "The Fouchier episode laid bare the emptiness of biological-weapons prevention programs on the global, national, and local levels.  Along with several older studies that are now garnering fresh attention, it has revealed that the political world is completely unprepared for the synthetic-biology revolution."   As I have already written a book that discusses this confusion (here is an excerpt about synthetic biology and the influenza virus), it is not actually what I want to write about today.  But I have to get this issue out of the way first.

As far as I understand from reading the press accounts, both groups used various means to create mutations in the flu genome and then selected viruses with properties they wanted to study.  To clarify, from what I have been able to glean from the sparse accounts thus far, DNA synthesis was not used in the work.  And as far as I understand from reading the literature and talking to people who build viruses for a living, it is still very hard to assemble a functioning, infectious influenza virus from scratch.   

If it were easy to write pathogen genomes -- particularly flu genomes -- from scratch, we would quite frankly be in deep shit. But, for the time being, it is hard.  And that is important.  Labs that do use synthetic biology to build influenza viruses, as did those who reconstructed the 1918 H1N1 influenza virus, fail most of the time despite great skill and funding.  Synthesizing flu viruses is simply not a garage activity.  And with that, I'll move on.

Regardless of how the results might be reproduced, many have suggested that the particular experiments described by Fouchier and Kawaoka should not have been allowed.  Fouchier himself acknowledges that selecting for airborne viruses was not the wisest experiment he could have done; it was, he says, "really, really stupid".  But the work is done, and people do know about it.  So the question of whether this work should have been done in the first place is beside the point.  If, as Michael Eisen suggests, "any decent molecular biologist" could repeat the work, then it was too late to censor the details as soon as the initial report came out.

I am more interested in the consequences of trying to contain the results while somehow allowing access to vetted individuals.  Containing the results is as much about information security as it is biological security.  Once such information is created, the challenge is to protect it, to secure it.  Unfortunately, the proposal to allow secure access only by particular individuals is at least a decade (if not three decades) out of date.

Any attempt to secure the data would have to start with an assessment of how widely it is already distributed.  I have yet to meet an academic who regularly encrypts email, and my suspicion is that few avail themselves of the built-in encryption on their laptops.  So, in addition to the university computers and email servers where the science originated, the information is sitting in the computers of reviewers, on servers at Nature and Science, at the NSABB, and, depending on how the papers were distributed and discussed by members of the NSABB, possibly on their various email servers and individual computers as well.  And let's not forget the various unencrypted phones and tablets all of those reviewers now carry around.

But never mind that for a moment.  Let's assume that all these repositories of the relevant data are actually secure.  The next step is to arrange access for selected researchers.  That access would inevitably be electronic, requiring secure networks, passwords, etc.  In the last few days the news has brought word that the security firms Stratfor and Symantec have evidently been hacked.  Such attacks are not uncommon.  Think back over the last couple of years: hacks at Google, various government agencies, universities.  Credit card numbers, identities, and supposedly secret DoD documents are all for sale on the web.  To that valuable information we can now add a certain list of influenza mutations.  If those mutations are truly a critical biosecurity risk -- as asserted publicly by various members of the NSABB -- then that data has value far beyond its utility in virology and vaccinology.

The behavior of various hackers (governments, individuals, and others) over the last few years makes clear that the discussion thus far has done nothing so much as stick a giant "HACK HERE" sign on the data.  Moreover, if Ms. Garrett is correct that students across the planet are busy reverse engineering the experiments because they don't have access to the original methods and data, then censorship is creating a perverse incentive for innovation.  Given today's widespread communication, restriction of access to data is an invitation, not a proscription.

This same fate awaits any concentration of valuable data.  It obviously isn't a problem limited to collections of sensitive genetic sequences or laboratory methods.  And there is certainly a case to be made for attempting to maintain confidential or secret caches of data, whether in the public or private interest.  In such instances, compartmentalization and encryption must be implemented at the earliest stages of communication in order to have any hope of maintaining security. 

However, in this case, if it is true that reverse engineering the results is straightforward, then restriction of access serves only to slow down the general process of science.  Moreover, censorship will slow the development of countermeasures.  It is unlikely that any collection of scientists identified by the NSABB or the government will be sufficient to develop all the technology we need to respond to natural pathogens, let alone any artificial ones.

As with most other examples of prohibition, these restrictions are doomed before they are even implemented.  Censorship of information that is known to exist incentivizes innovation and rediscovery.  As I explored in my book, prohibition in the name of security is historically a losing proposition.  Moreover, science is inherently a networked human activity that is fundamentally incompatible with constraints on communication, particularly of results that are already disclosed.  Any endeavor that relies upon science -- notably, developing technologies to defend against natural and artificial pathogens -- is therefore also fundamentally incompatible with those constraints.  Censorship threatens not just science but also our security.

Further Thoughts on iGEM 2011

Following up on my post of several weeks ago (iGEM 2011: First Thoughts), here is a bit more on last year's Jamboree.  I remain very, very impressed by what the teams did this year.  And I think that watching iGEM from here on out will provide a sneak peek of the future of biological technologies.

I think the biggest change from last year is the choice of applications, which I will describe below.  Related to that choice is a change of approach, toward a more complete design philosophy.  I'll get to the shift in design sensibility further on in the post.

The University of Washington: Make it or Break it

I described previously the nuts and bolts of the University of Washington's Grand Prize winning projects.  But, to understand the change in approach (or perhaps change in scope?) this project represents, you also have to understand a few details about problems in the real world.  And that is really the crux of the matter -- teams this year took on real world problems as never before, and may have produced real world solutions.

Recall that one of the UW projects was the design of an enzyme that digests gluten, with the goal of using that enzyme to treat gluten intolerance.  Candidate enzymes were identified through examining the literature, with the aim of finding something that works at low pH.  The team chose a particular starter molecule, and then used the "video game" Foldit to re-design the active site in silico so that it would chew up gluten (here is a very nice Youtube video on the Foldit story from Nature).  They then experimentally tested many of the potential improvements.  The team wound up with an enzyme that in a test tube is ~800 times better than one already in clinical trials.  While the new enzyme would of course itself face lengthy clinical trials, the team's achievement could have an enormous impact on people who suffer from celiac disease, among many other ailments.

From a story in last week's NYT Magazine ("Should We All Go Gluten-Free?"), here are some eye-opening stats on celiac disease, which can cause symptoms ranging from diarrhea to dramatic weight loss:

  • Prior to 2003, prevalence in the US was thought to be just 1 in 10,000: widespread testing revealed the actual rate was 1 in 133.
  • Current estimates are that 18 million Americans have some sort of gluten intolerance, which is about 5.8% of the population.
  • Based on analysis of old blood samples, young people in the 1990s were 5x more likely to have the disease than young people in the 1950s.
  • Prevalence is increasing not just in the US but worldwide.
In other words, celiac disease is a serious metabolic issue that for some reason is affecting ever larger parts of the global population.  And as a summer project a team of undergraduates may have produced a (partial) treatment for the disease.  That eventual treatment would probably require tens of millions of dollars of further investment and testing before it reaches the market.  However, the market for gluten-free foods, as estimated in the Times, is north of $6 billion and growing rapidly.  So there is plenty of market potential to drive investment based on the iGEM project.
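For the curious, the prevalence figures above hang together arithmetically; the US population number below is my assumption (approximately correct for the period), not a figure from the article:

```python
# Rough consistency check on the prevalence figures quoted above.
us_population = 310e6            # assumed approximate US population

celiac_rate = 1 / 133            # post-2003 celiac prevalence estimate
gluten_intolerant = 18e6         # broader gluten-intolerance estimate

print(f"Celiac prevalence: {celiac_rate:.2%}")
print(f"Gluten intolerance share: {gluten_intolerant / us_population:.1%}")
```

The second figure comes out at about 5.8 percent, matching the number quoted in the list.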

The other UW project is a demonstration of using E. coli to directly produce diesel fuel from sugar.  The undergraduates first reproduced work published last year from LS9 in which E. coli was modified to produce alkanes (components of diesel fuel -- here is the Science paper by Schirmer et al).  Briefly, the UW team produced biobricks -- the standard format used in iGEM -- of two genes that turn fatty acids into alkanes.  Those genes were assembled into a functional "Petrobrick".  The team then identified and added a novel gene to E. coli that builds fatty acids from 3 carbon seeds (rather than the native coli system that builds on 2 carbon seeds).  The resulting fatty acids then served as substrates for the Petrobrick, resulting in what appears to be the first report anywhere of even-chain alkane synthesis.  All three genes were packaged up into the "FabBrick", which contains all the components needed to let E. coli process sugar into a facsimile of diesel fuel.

The undergraduates managed to substantially increase the alkane yield by massaging the culture conditions, but the final yield is a long way from being useful to produce fuel at volume.  But again, not bad for a summer project.  This is a nice step toward turning first sugar, then eventually cellulose, directly into liquid fuels with little or no purification or post-processing required.  It is, potentially, also a step toward "Microbrewing the Bioeconomy".  For the skeptics in the peanut gallery, I will be the first to acknowledge that we are probably a long way from seeing people economically brew up diesel in their garage from sugar.  But, really, we are just getting started.  Just a couple of years ago people thought I was all wet forecasting that iGEM teams would contribute to technology useful for distributed biological manufacturing of fuels.  Now they are doing it.  For their summer projects.  Just wait a few more years.

Finally -- yes, there's more -- the UW team worked out ways to improve the cloning efficiency of so-called Gibson cloning.  They also packaged up as biobricks all the components necessary to produce magnetosomes in E. coli.  The last two projects didn't make it quite as far as the first two, but still made it further than many others I have seen in the last 5 years.

Before moving on, here is a thought about the mechanics of participating in iGEM.  I think the UW wiki is about the best I have seen.  I very much like the straightforward presentation of hypothesis, experiments, and results.  It was very easy to understand what they wanted to do, and how far they got.  Here is the "Advice to Future iGEM Teams" I posted a few years ago.  Aspiring iGEM teams should take note of the 2011 UW wiki -- clarity of communication is part of your job.

Lyon-INSA-ENS: Cobalt Buster

The team from Lyon took on a very small problem: cleaning up cooling water from nuclear reactors using genetically modified bacteria.  This was a nicely conceived project that involved identifying a problem, talking to stakeholders, and trying to provide a solution.  As I understand it, there are ongoing discussions with various sponsors about funding a start-up to build prototypes.  It isn't obvious that the approach is truly workable as a real world solution -- many questions remain -- but the progress already demonstrated indicates that dismissing this project would be premature.

Before continuing, I pause to reflect on the scope of Cobalt Buster.  One does wonder about the eventual pitch to regulators and the public: "Dear Europe, we are going to combine genetically modified organisms and radiation to solve a nuclear waste disposal problem!"  As the team writes on its Human Practices page: "In one project, we succeed to gather Nuclear Energy and GMOs. (emphasis in original)"  They then acknowledge the need to "focus on communication".  Indeed.

Here is the problem they were trying to solve: radioactive cobalt (Co) is a contaminant released during maintenance of nuclear reactors.  The Co is typically cleaned up with ion exchange resins, which are expensive and which, once spent, must be disposed of as nuclear waste.  By inserting a Co importer pump into E. coli, the Lyon team hopes to use bacteria to concentrate the Co and thereby clean up reactor cooling water.  That sounds cool, but the bonus here is that modelling of the system suggests that using E. coli as a biofilter in this way would produce substantially less waste.  The team reports that 8000 kg of ion exchange resins could be replaced with 4 kg of modified bacteria.  That factor of 2000 in volume reduction would have a serious impact on disposal costs.  And the modified bug appears to work in the lab (with nonradioactive cobalt), so this story is not just marketing.
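The claimed waste reduction is just the ratio of the two masses the team quotes, but it is worth seeing on the page:

```python
# Mass of material requiring nuclear disposal, per the Lyon team's estimate.
resin_kg = 8000       # spent ion-exchange resin (current practice)
bacteria_kg = 4       # engineered E. coli biofilter (proposed)

reduction = resin_kg / bacteria_kg
print(f"Reduction factor: {reduction:.0f}x")             # 2000x
print(f"Remaining waste: {bacteria_kg / resin_kg:.3%}")  # 0.050%
```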

The Lyon team also inserted a Co sensor into their E. coli strain.  The sensor drove expression of a protein that forms amyloid fibers, causing the coli in turn to form a biofilm.  This biofilm would stabilize the biofilter in the presence of Co.  The filter would only be used for a few hours before being replaced, which would not give the strain enough time to lose this circuit via selection.

Imperial College London: Auxin

Last, but certainly not least, is the very well thought through Imperial College project to combat soil erosion by encouraging plant root growth.  I saved this one for last because, for me, the project beautifully reflects the team's intent to carefully consider the real-world implications of their work.  There are certainly skeptics out there who will frown on the extension of iGEM into plants, and who feel the project would never make it into the field due to the many regulatory barriers in Europe.  I think the skeptics are completely missing the point.

To begin, a summary of the project: the Imperial team's idea was to use bacteria as a soil treatment, applied in any number of ways, that would be a cost-effective means of boosting soil stability through root growth.  The team designed a system in which genetically modified bacteria would be attracted to plant roots, would then take up residence in those roots, and would subsequently produce a hormone that encourages root growth.

The Auxin system was conceived to combine existing components in very interesting ways.  Naturally-occurring bacteria have already been shown to infiltrate plant roots, and other soil-dwelling bacteria produce the same growth hormone that encourages root proliferation.

Finally, the team designed and built a novel (and very clever) system for preventing leakage of transgenes through horizontal gene transfer.  On the plasmid containing the root growth genes, the team also included genes that produce proteins toxic to bacteria.  But in the chromosome, they included an anti-toxin gene.  Thus if the plasmid were to leak out and be taken up by a bacterium without the anti-toxin gene, any gene expression from the plasmid would kill the recipient cell.
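The containment logic is simple enough to sketch as a toy truth table; the function and scenario names here are illustrative, not the team's actual implementation:

```python
# Toy model of the toxin/anti-toxin containment scheme described above.
# A cell dies if it carries the plasmid (which encodes the toxin) without
# the chromosomal anti-toxin gene.

def cell_survives(has_plasmid: bool, has_antitoxin: bool) -> bool:
    """Survival rule: toxin expressed from the plasmid is lethal
    unless neutralized by the chromosomal anti-toxin."""
    if has_plasmid and not has_antitoxin:
        return False   # horizontal transfer to a naive cell: toxin kills it
    return True

# Engineered strain: plasmid plus chromosomal anti-toxin -> survives
assert cell_survives(True, True)
# Wild bacterium that picks up the leaked plasmid -> dies
assert not cell_survives(True, False)
# Wild bacterium that never sees the plasmid -> unaffected
assert cell_survives(False, False)
print("Containment logic holds for all three scenarios")
```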

The team got many of these pieces working independently, but didn't quite get the whole system working together in time for the international finals.  I encourage those interested to have a look at the wiki, which is really very good.

The Shift to Thinking About Design

As impressive as Imperial's technical results were, I was also struck by the integration of "human practices" into the design process.  The team spoke to farmers, economists, Greenpeace -- the list goes on -- as part of both defining the problem and attempting to finesse a solution given the difficulty of fielding GMOs throughout the UK and Europe.  And these conversations very clearly impacted the rest of the team's activities.

One of the frustrations felt by iGEM teams and judges alike is that "human practices" has often felt like something tacked on to the science for the sake of placating potential critics.  There is something to that, as the Ethical, Legal, and Social Implications (ELSI) components of large federal projects such as The Human Genome Project and SynBERC appear to have been tacked on for just that reason.  Turning "human practices" into an appendix on the body of science is certainly not the wisest way to go forward, for reasons I'll get to in a moment, nor is it politically savvy in the long term.  But if the community is honest about it, tacking on ELSI to get funding has been a successful short-term political hack.

The Auxin project, along with a few other events during the finals, helped crystallize for me the disconnect between thinking about "human practices" as a mere appendix while spouting off about how synthetic biology will be the core of a new industrial revolution, as some of us tend to do.  Previous technological revolutions have taught us the importance of design, of thinking the whole project through at the outset in order to get as much right as possible, and to minimize the stuff we get wrong.  We should be bringing that focus on design to synthetic biology now.

I got started down this line of thought during a very thought-provoking conversation with Dr. Megan Palmer, the Deputy Director for Practices at SynBERC.  (Apologies to you, Megan, if I step on your toes in what follows -- I just wanted to get these thoughts on the page before heading out the door for the holidays.)  The gist of my chat with Megan was that the focus on safety and security as something else, as an activity separate from the engineering work of synthetic biology, is leading us astray.  The next morning, I happened to pass Pete Carr and Mac Cowell having a chat just as one of them was saying, "The name human practices sucks. We should really change the name."  And then my brain finally -- amidst the jet lag and 2.5 days of frenetic activity serving as a judge for iGEM -- put the pieces together.  The name does suck.  And the reason it sucks is that it doesn't really mean anything.

What the names "human practices" and "ELSI" are trying to get at is the notion that we shouldn't stumble into developing and using a powerful technology without considering the consequences.  In other fields, whether you are thinking about building a chair, a shoe, a building, an airplane, or a car, in addition to the shape you usually spend a great deal of time thinking about where the materials come from, how much the object costs to make, how it will be used, who will use it, and increasingly how it will be recycled at end of use.  That process is called design, and we should be practicing it as an integral part of manipulating biological systems.

When I first started as a judge for iGEM, I was confused by the kind of projects that wound up receiving the most recognition.  The prizes were going to nice projects, sure, but those projects were missing something from my perspective.  I seem to recall protesting at some point in that first year that "there is an E in iGEM, and it stands for Engineering."  I think part of that frustration was that the pool of judges was for many years dominated by professors funded by the NIH, NRC, or the Wellcome Trust, for example -- scientists who were looking for scientific results they liked to grace the pages of Science or Nature -- rather than engineers, hackers, or designers who were looking for examples of, you know, engineering.

My point is not that the process of science is deficient, nor that all lessons from engineering are good -- especially as for years my own work has fallen somewhere in between science and engineering.  Rather, I want to suggest that, given the potential impact of all the science and engineering effort going into manipulating biological systems, everyone involved should be engaging in design.  It isn't just about the data, nor just about shiny objects.  We are engaged in sorting out how to improve the human condition, which includes everything from uncovering nature's secrets to producing better fuels and drugs.  And it is imperative that as we improve the human condition we do not diminish the condition of the rest of the life on this planet, as we require that life to thrive in order that we may thrive.

Which brings me back to design.  It is clear that not every experiment in every lab that might move a gene from one organism to another must consider the fate of the planet as part of the experimental design.  Many such experiments have no chance of impacting anything outside the test tube in which they are performed.  But the practice of manipulating biological systems should be done in the context of thinking carefully about what we are doing -- much more carefully than we have been, generally speaking.  Many fields of human endeavor can contribute to this practice.  There is a good reason that ELSI has "ethical", "legal", and "social" in it.

There have been a few other steps toward the inclusion of design in iGEM over the years.  Perhaps the best example is the work designers James King and Daisy Ginsberg did with the 2009 Grand Prize Winning team from Cambridge (see iGEM 2009: Got Poo?).  That was lovely work, and was cleverly presented in the "Scatalog".  You might argue that the winners over the years have had increasingly polished presentations, and you might worry that style is edging out substance.  But I don't think that is happening.  The steps taken this year by Imperial, Lyon, and Washington toward solving real-world problems were quite substantive, even if those steps are just the beginning of a long path to get solutions into people's hands.  That is the way innovation works in the real world.

iGEM 2011: First Thoughts

Congratulations to the 2011 University of Washington iGEM team for being the first US team ever to win the Grand Prize.  The team also shared top honors for Best Poster (with Imperial College London) and for Best Food/Energy Project (with Yale).  The team also had (in my opinion) the clearest, and perhaps best overall, wiki describing the project that I have seen in 5 years as an iGEM judge.  I only have a few minutes in the airport to post this, but I will get back to it later in the week.

The UW team had an embarrassment of riches this year.  One of the team's projects demonstrated production of both odd and even chain alkanes in E. coli directly from sugar.  The odd-chain work reproduces the efforts of a Science paper published by LS9 last year, but the team also added an enzyme from B. subtilis to the pathway that builds alkanes starting from a 3-carbon seed rather than the normal 2-carbon seed in E. coli.  This latter step allowed them to make even-chain alkanes via a synthetic biological pathway, which has not been reported elsewhere.  So they wound up directly making diesel fuel from sugar.  The yields aren't all there yet to roll out this sort of thing more widely, but it's not so bad for a summer project.

And that's not all.

The other main project was an effort to produce an enzyme to digest gluten.  There is one such enzyme in clinical trials at the moment, intended for use as a therapeutic for gluten intolerance, which afflicts about 1% of the population.  However, that enzyme is not thermostable and has an optimum pH of 7.

The UW team found an enzyme in the literature that was not known to digest gluten, but which works at pH 4 (close to the human stomach) and is from a thermophilic organism.  They used Foldit to redesign the enzyme to process gluten, and then built a library of about 100 variants of that design.  One of those variants wound up working ~800 times better than the enzyme that is currently in clinical trials.  And the team thinks they can do even better by combining some of the mutants from the library.

Nice work.

I could go on and on about the competition this year.  The teams are all clearly working at a new level.  I recall that a couple of years ago at iGEM Drew Endy asked me, somewhat out of frustration, "Is this it?  Is this all there is?"  The answer: No.  There is a hell of a lot more.  And the students are just getting started.

Plenty of other teams deserve attention in this space, in particular Imperial College London, the runner up.  They built a system (called Auxin) in E. coli to encourage plant root growth, with the aim of stopping desertification.  And their project was an extremely good example of design, from the technical side through to conversations with customers (industry) and other stakeholders (Greenpeace) about what deployment would really be like.

More here later in the week.  Gotta run for the plane.

Biological Technology in 2050

A version of this essay was also published in IEEE Spectrum in 2002 as "Open-Source Biology and its Impact on Industry".

Biological Technology in 2050
Robert Carlson, 2001
Silver Award Winner, The Economist/Shell World in 2050 Essay Competition


In fifty years, you may be reading The Economist on a leaf. The page will not look like a leaf, but it will be grown like a leaf. It will be designed for its function, and it will be alive. The leaf will be the product of intentional biological design and manufacturing.

Rather than being constantly green, the cells on its surface will contain pigments controlled by the action of something akin to a nervous system. Like the skin of a cuttlefish, the cells will turn color to form words and images as directed by a connection to the internet of the day. Given the speed with which the cuttlefish changes its pigment, these pages may not be fast enough to display moving images, but they will be fine for the written word. Each page will be slightly thicker than the paper The Economist is now printed on, providing room for control elements (the nervous system) and circulation of nutrients. When a page ages, or is damaged, it will be easily recycled. It will be fueled by sugar and light. Many of the artifacts produced in 50 years and used in daily living will have a similar appearance, and have similar origin. The consequences of mature biological design and manufacturing are widespread, and will affect all aspects of the economy including energy and resource usage, transportation, and labor. Today, electronic paper and similar display technologies are just around the corner, but in the long run they will not be able to compete with the products of inexpensive, distributed biological manufacturing.

Growing engineered leaves for display devices may seem a complex biological engineering feat, but foundations for the technology are already being laid. Structurally simple replacement human tissues are currently being grown in the laboratory on frameworks of suture material[1]. Projects to grow functional human heart tissue, and eventually a whole heart, are underway, with a timeline for completion of ten years[2].

Within those ten years, the genomes of many organisms will be sequenced, providing a parts list for the proteins forming the structural and control elements in those organisms. Biologists, engineers, and physicists are already collaborating to build models that help us understand how those parts work and fit together. The goal for these models is quantitative prediction of the behavior of biological systems, which will have profound implications for the understanding of basic biology and for improving human health.

Beyond initial biomedical consequences, models that can be used to predict the effects of perturbations to existing biological systems will become de facto design tools, providing an infrastructure for creating new technologies based on biology. When we can successfully predict the behavior of designed biological systems, then an intentional biology will exist. With an explicit engineering component, intentional biology is the opposite of the current, very nearly random applications of biology as technology. For instance, the present debate over genetically modified foods is more indicative of the poorly planned use of an immature technology rather than a failure of the technology itself. At present we simply can't predict the effects of tinkering with a system as complex as crops and their pests. But as with the progression of every other human technology, from fire, to bridges, to computers, biological engineering will improve with time. Quantitative models for simple systems like viral infections of bacteria and yeast signal transduction pathways are already being tested[3]. Computational methods developed in those efforts will soon be applied to higher plants and animals. It is a short step from successful prediction to design and the beginning of industrial applications.

Yet even before the advent of true biological design, more general lessons from biology are already transforming our economy. The potential impact on industrial practices of learning from biology is enormous and is explored in the book Natural Capitalism by Paul Hawken and Amory and L. Hunter Lovins[4]. The authors point out that structuring business practices along biological lines can significantly improve the bottom line. The human circulatory system, for instance, is optimized to minimize the work required to pump blood throughout the body. The majority of industrial pumping systems, however, are optimized to minimize the cost of the pipes during construction. This means smaller pipes are used, requiring large pumps that use vastly more energy than necessary. Similarly, in the human pumping system, the heart has to work too hard when arteriosclerosis leads to a reduction in the diameter of blood vessels. These vessels then require maintenance in the form of an angioplasty. Industrial pumping systems are designed with built-in arteriosclerosis, and fixing them requires rebuilding from the ground up. Paying careful attention to several hundred million years of nature's trial and error design experience will provide considerable savings in energy and resources to human industry.
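To see why pipe diameter dominates pumping cost, consider a rough back-of-the-envelope sketch (my own illustration, not taken from Natural Capitalism), assuming laminar flow and the Hagen-Poiseuille relation; the pipe length, viscosity, and flow rate below are arbitrary numbers chosen only to show the scaling:

```python
# Back-of-the-envelope: pumping power vs. pipe radius, assuming laminar
# flow.  Hagen-Poiseuille gives the pressure drop along a smooth pipe:
#   dP = 8 * mu * L * Q / (pi * r**4)
# and the power needed to maintain flow rate Q is P = dP * Q.  So at a
# fixed flow rate, power scales as 1/r**4: halving the pipe radius costs
# sixteen times the pumping energy.
import math

def pumping_power(radius_m, flow_m3_s, length_m=100.0, viscosity_pa_s=1.0e-3):
    """Watts required to push flow_m3_s through a pipe of given radius."""
    dp = 8 * viscosity_pa_s * length_m * flow_m3_s / (math.pi * radius_m**4)
    return dp * flow_m3_s

q = 1.0e-4                          # 0.1 liter per second
p_large = pumping_power(0.05, q)    # 5 cm radius pipe
p_small = pumping_power(0.025, q)   # 2.5 cm radius pipe
print(p_small / p_large)            # ratio is (0.05/0.025)**4 = 16
```

The modest up-front savings on pipe therefore buys a large, permanent energy penalty, which is the trade-off the Lovins argument turns on.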

Borrowing a design aesthetic for industrial function from nature is just the beginning. The living world will also become part of our industrial infrastructure. Nature has already discovered how to fabricate materials and finesse chemistry in ways that are the envy of human engineers and chemists. Many companies, both established and start-up, are now focusing on harvesting enzymes from organisms in the environment for use in industrial processes[5]. Popular examples of high strength materials fabricated by biology at low temperature, pressure, and energy cost are spider silk and abalone shell[6]. Yet increased resource efficiency and biomaterials are only the first steps in a revolution in manufacturing. Beyond using biology as a model for the structure and function of industrial production, the year 2050 will see humans utilizing biology as the means of production itself.

Whereas most manufacturing today is highly centralized and materials are transported considerable distances throughout the assembly process, in the year 2050 human industry will use distributed and renewable manufacturing based upon biology. Renewable manufacturing means that biology will be used to produce many of the physical things we use every day. In early implementation, the organism of choice will likely be yeast or a bacterium. The physical infrastructure for this type of manufacturing is inherently flexible: it is essentially the vats, pumps, and fluid handling capacity found in any brewery. Production runs for different products would merely involve seeding a vat with a yeast strain containing the appropriate genetic instructions and then providing raw materials. To be sure, there will always be applications and environments where biological fabrication is not the best option, and it is not clear how complex the fabrication task can be, but biology is capable of fabrication feats that no current or envisioned human technology can emulate. In some ways, this scheme sounds a bit like Eric Drexler's nanotechnological assemblers[7], except that we already have functional nanotechnology - it's called biology.

The transformation to an economy based on biological manufacturing will occur as technical manipulations become easier with practice and through a proliferation of workers with the appropriate skills. Biological engineering will proceed from profession, to vocation, to avocation, because the availability of inexpensive, quality DNA sequencing and synthesis equipment will allow participation by anyone who wants to learn the details. In 2050, following the fine tradition of hacking automobiles and computers, garage biology hacking will be well underway.

Considerable information is already available on how to manipulate and analyze DNA in the kitchen. A recent Scientific American Amateur Scientist column provided instructions for amplifying DNA through the polymerase chain reaction (PCR)[8], and a previous column concerned analyzing DNA samples using homemade electrophoresis equipment. The discussion was immediately picked up in a thread where participants provided tips for improving the yield of the PCR process[9]. More detailed, technical information can be found in any university biology library in Current Protocols in Molecular Biology[10], which contains instructions on how to perform virtually every task needed in modern molecular biology. This printed compendium has recently joined the myriad resources online[11] maintained by universities and government agencies, thereby becoming all the more accessible. Open-source biology is already becoming a reality.

As the "coding" infrastructure for understanding, troubleshooting, and, ultimately, designing biology develops, DNA sequencers and synthesizers will become less expensive, faster, and ever simpler to use. These critical technologies will first move from academic labs and large biotechnology companies to small businesses, and eventually to the home garage and kitchen. Many standard laboratory techniques that once required a doctorate's worth of knowledge and experience to execute correctly are now used by undergraduates in a research setting with kits containing color-coded bottles of reagents. The recipes are easy to follow. This change in technology represents a democratization of sorts, and it illustrates the likely changes in labor structure that will accompany the blossoming of biological technology.

The course of labor in biological technology can be charted by looking at the experience of the computer and internet industries. Many start-up companies in Silicon Valley have become contract engineering efforts, funded by venture capital, where workers sign on with the expectation that the company will be sold within a few years, whereupon they will find a new assignment. The leading edge of the biological technology revolution could soon look the same. However, unlike today's integrated circuits, where manufacturing infrastructure costs have now reached upwards of 1 billion dollars per facility, the infrastructure costs for renewable biological manufacturing will continue to decline. Life, and all the evolutionarily developed technology it utilizes, operates at essentially room temperature, fueled by sugars. Renewable, biological manufacturing will take place anywhere someone wants to set up a vat or plant a seed.

Distributed biological manufacturing will be all the more flexible because the commodity in biotechnology is today becoming information rather than things. While it is still often necessary to exchange samples through the mail, the genomics industry has already begun to derive income from selling solely information about gene expression. In a few decades it will be the genomic sequence that is sent between labs, there to be re-synthesized and expressed as needed. It is already possible to synthesize sufficient DNA to build a bacterial genome from scratch in a few weeks using chemical means. Over the coming decades that time will be reduced to days, and then to hours, eventually via the development of directed, template-free, enzymatic synthesis - a DNA "synthase."

It is possible that the evolution of open-source biology will be delayed by retrenchment on the part of corporations trying to protect intellectual property. However, the future model of biology as a technological instrument of any corporation can be found by simply looking at the way life currently makes use of biological technology. Only very rarely is it the case that advantage is conferred on an organism via a biochemically unique enzyme or pathway. The toolbox of biochemistry, the parts list - the "kernel," to stretch the software analogy - is shared by all organisms on the planet. In general, organisms are different from one another because of the order of gene expression or because of relatively subtle perturbations to protein structures common to all forms of terrestrial life. That is, innovation in the natural world in some sense has always followed the idea of a service and flow economy. If the environment is static, only when an organism figures out how to provide itself, or another organism, with a new service using the old toolbox is advantage conferred.

The analogy to future industrial applications of biology is clear: When molecular biologists figure out the kernel of biology, innovation by humans will consist of tweaking the parts to provide new services. Because of the sheer amount of information, it is unlikely that a single corporate entity could maintain a monopoly on the kernel. Eventually, as design tasks increase in number and sophistication, corporations will have to share techniques and this information will inevitably spread widely, reaching all levels of technical ability - the currency of the day will be innovation and design. As with every other technology developed by humans, biological technology will be broadly disseminated.

As open-source biological manufacturing spreads, it will be adopted quickly in less developed economies to bypass the first world's investment in industrial infrastructure. Given the already stressed state of natural resources throughout much of the developing world, it will not be possible for many of those countries to attain first-world standards of living with industrial infrastructure as wasteful as that of the United States. The developing world simply cannot afford industrial and energy inefficiency. A short cut is to follow the example of the growing wireless-only communications infrastructure in Africa and skip building systems to transport power and goods. It is already clear that distributed power generation will soon become more efficient than centralized systems. Distributed manufacturing based upon local resources will save transportation costs, provide for simpler customization, require less infrastructure investment, and as a result will likely cost less than centralized manufacturing.

Distributed biological manufacturing is the future of the global economy. With design and fabrication power spread throughout the world to the extent suggested here, it is necessary to consider possible dangers. The simple answer is that those dangers are real and considerable. This technology enables the creation of new organisms potentially pathogenic to humans, or to animals and plants upon which we rely. It is already clear that the social and biological consequences of extending human life span and human germline engineering will consume considerable public debate time over the next few decades. Moreover, the underlying infrastructure and methods are already so widespread that no one country will be able to manipulate the development of biological technology by controlling the research within its borders. But fear of potential hazards should be met with increased research and education rather than closing the door on the profound positive impacts distributed biological technology will have on human health, human impacts on the environment, and on increasing standards of living around the world. Technology based on intentional, open-source biology is on its way, whether we like it or not, and the opportunity it represents will just begin to emerge in the next fifty years.

3.    Endy, D., et al., Computation, prediction, and experimental tests of fitness for bacteriophage T7 mutants with permuted genomes. Proceedings of the National Academy of Sciences, 2000. 97: p. 5375-5380.
4.    Hawken, P., A. Lovins, and L.H. Lovins, Natural Capitalism. 1999: Little Brown.
6.    Sarikaya, M., Biomimetics: Materials fabrication through biology. Proceedings of the National Academy of Sciences, 1999: p. 14183-14185.
10.    Current Protocols in Molecular Biology. 1999, Wiley: New York.

Staying Sober about Science

The latest issue of The Hastings Center Report carries an essay of mine, "Staying Sober about Science" (free access after registration), about my thoughts on New Directions: The Ethics of Synthetic Biology and Emerging Technologies (PDF) from The Presidential Commission for the Study of Bioethical Issues.

Here is the first paragraph:

Biology, we are frequently told, is the science of the twenty-first century. Authority informs us that moving genes from one organism to another will provide new drugs, extend both the quantity and quality of life, and feed and fuel the world while reducing water consumption and greenhouse gas emissions. Authority also informs that novel genes will escape from genetically modified crops, thereby leading to herbicide-resistant weeds; that genetically modified crops are an evil privatization of the gene pool that will with certainty lead to the economic ruin of small farmers around the world; and that economic growth derived from biological technologies will cause more harm than good. In other words, we are told that biological technologies will provide benefits and will come with costs--with tales of both costs and benefits occasionally inflated--like every other technology humans have developed and deployed over all of recorded history.

And here are a couple of other selected bits:

Overall, in my opinion, the report is well considered. One must commend President Obama for showing leadership in so rapidly addressing what is seen in some quarters as a highly contentious issue. However, as noted by the commission itself, much of the hubbub is due to hype by both the press and certain parties interested in amplifying the importance of the Venter Institute's accomplishments. Certain scientists want to drive a stake into the heart of vitalism, and perhaps to undermine religious positions concerning the origin of life, while "civil society" groups stoke fears about Frankenstein and want a moratorium on research in synthetic biology. Notably, even when invited to comment by the commission, religious groups had little to say on the matter.

The commission avoided the trap of proscribing from on high the future course of a technology still emerging from the muck. Yet I cannot help the feeling that the report implicitly assumes that the technology can be guided or somehow controlled, as does most of the public discourse on synthetic biology. The broader history of technology, and of its regulation or restriction, suggests that directing its development would be no easy task.8 Often technologies that are encouraged and supported are also stunted, while technologies that face restriction or prohibition become widespread and indispensable.

...The commission's stance favors continued research in synthetic biology precisely because the threats of enormous societal and economic costs are vague and unsubstantiated. Moreover, there are practical implications of continued research that are critical to preparing for future challenges. The commission notes that "undue restriction may not only inhibit the distribution of new benefits, but it may also be counterproductive to security and safety by preventing researchers from developing effective safeguards."12 Continued pursuit of knowledge and capability is critical to our physical and economic security, an argument I have been attempting to inject into the conversation in Washington, D.C., for a decade. The commission firmly embraced a concept woven into the founding fabric of the United States. In the inaugural State of the Union Address in 1790, George Washington told Congress "there is nothing which can better deserve your patronage than the promotion of science and literature. Knowledge is in every country the surest basis of publick happiness."13

The pursuit of knowledge is every bit as important a foundation of the republic as explicit acknowledgment of the unalienable rights of life, liberty, and the pursuit of happiness. Science, literature, art, and technology have played obvious roles in the cultural, economic, and political development of the United States. More broadly, science and engineering are inextricably linked with human progress from a history of living in dirt, disease, and hunger to . . . today. One must of course acknowledge that today's world is imperfect; dirt, disease, and hunger remain part of the human experience. But these ills will always be part of the human experience. Overall, the pursuit of knowledge has vastly improved the human condition. Without scientific inquiry, technological development, and the economic incentive to refine innovations into useful and desirable products, we would still be scrabbling in the dirt, beset by countless diseases, often hungry, slowly losing our teeth.

There's more here.


8. R. Carlson, Biology Is Technology: The Promise, Peril, and New Business of Engineering Life (Cambridge, Mass.: Harvard University Press, 2010).

12. Presidential Commission for the Study of Bioethical Issues, New Directions, 5.

13. G. Washington, "The First State of the Union Address," January 8, 1790,

Synthetic Biology 5.0

Synthetic Biology 5.0 has come and gone.  I expected, as in previous years, to be busy liveblogging amid the excitement.  I tweeted some during the proceedings (here is Eric Ma's summary of #synbio5 tweets), but this is my first post about the meeting, and probably the last one.  I mostly just listened, took a few notes, and was delighted to see the progress being made.  I was not nearly as amped up about the proceedings as in previous years, and I am still trying to figure out why.

Here are a couple of reasons I have sorted out so far.  It was the end of the beginning of synthetic biology.  The meeting was full of science and engineering.  And that's about all.  There were a few VCs and other investors sniffing around, but not nearly so many as in previous years; those who did show up kept a lower profile.  There were also fewer obvious government officials, no obvious spooks, no obvious law enforcement officers, nor any self-identified Weapons of Mass Destruction Coordinators.  And I only encountered a couple of reporters, though there must have been more.  I skipped 3.0 in Zurich, but at 1.0 at MIT, 2.0 at Berkeley (parts 1, 2, 3, 4, 5), and 4.0 in Hong Kong (part 1), there was much more buzz.  Synthetic Biology 5.0 was much shorter on hype than prior gatherings.

There was substantially more data this year than previously.  And there was substantially less modeling.  All in all, Synthetic Biology is substantially more ... substantial.  It was like a normal scientific meeting.  About science.  No stunts from "civil society" groups looking for their next fear bullet point for fundraising.  No government officials proclaiming SB as the economic future of their city/state/country.  Just science.

What a relief.

And that science was nothing to sneeze at.  There were great talks for 3 days.  Here are a couple of things that caught my eye.

Jef Boeke from Johns Hopkins presented his plans to build synthetic yeast chromosomes.  I first heard this idea more than ten years ago from Ron Davis at Stanford, so it isn't brand new.  I did notice, however, that Boeke is having all his synthetic chromosomes made in China.  Over the longer term this means China is getting a boost in building out future biomanufacturing platforms.  If the project works, that is.

As tweeted, Jack Newman from Amyris gave an update on commercialization of artemisinin; it should be on the market by the end of the year, which should be in time to help avert an expected shortfall in production from wormwood.  Fantastic.

Pam Silver and her various students and post-docs showed off a variety of interesting results.  First, Faisal Aldaye showed in vivo DNA scaffolds used to channel metabolic reactions, resulting in substantial increases in yield.  Second, Pam Silver showed the use of those scaffolds to generate twice as much sucrose from hacked cyanobacteria per unit of biomass as from sugar cane.  If that result holds up, and if the various issues related to the cost of bioreactors used to culture photosynthetic organisms are worked out, then Pam's lab has just made an enormous step forward in bringing about distributed biological manufacturing.

This is the sort of advance that makes me feel more sanguine about the future of Microbrewing the Bioeconomy.  It will take some years before the volume of Amyris' Biofene, or Gevo's bio-PET, or Blue Marble's bio-butyric acid begins to impact the oil industry.  But it is clear to me now as never before that the petroleum industry is vulnerable from the top of the barrel -- the high value, low volume compounds that are used to build the world around us in the form of petrochemicals.  Biology can now be used to make all those compounds, too, directly from sugar, cellulose, and sunlight, without the tens of billions of dollars in capital required to run an oil company (see The New Biofactories).

So SB 5.0 was the end of the world as we know it.  Synthetic biology is now just another field of human endeavor, thankfully producing results and also thankfully suffering reduced hype.  I can see how the pieces are starting to fit together to provide for sustainable manufacturing and energy production, though it will be some years before biological technologies are used this way at scale.  Perhaps this is less in-your-face exciting for the attendees, the press, and the public, and that may be part of the reason for my ambivalence.  I fell asleep several times during the proceedings, which has never happened to me at SB X.0, even when overseas and jetlagged.  I have never before thought of achieving boredom as constituting progress.