Last year, when the United States military debuted footage of an iridescent drone the size and shape of a hummingbird buzzing around a parking lot, the media throated a collective hooah! Time magazine even devoted a cover to it. Meanwhile, with no fanfare at all--despite the enormous potential to reshape modern warfare--the military issued a request for scientists to find ways to design microbes that could produce explosives for weapons. Imagine a vat of genetically engineered yeast that produces chemicals for bombs and missiles instead of beer.
Robert Carlson, 2009
For What's Next, a Special Edition of the McKinsey Quarterly
Humans have been modifying biological systems for our own economic benefit for millennia. Improvements in crop yields and overall farming productivity have come from a continuing alteration of the genetic makeup--through selection and breeding--of the plants and animals upon which we rely. Now we find ourselves at the dawn of a new age of direct genetic modification. While the term "artificial life form" conjures up images of cyborgs or other creations of science fiction, the first such "artificial" creatures will actually be single-celled microorganisms. Even though these human-engineered life forms will be extremely simple, they will have an enormous impact on our world. Their biggest potential: the creation of biofuels and biomaterials, which have the promise to transform our entire economy.
The first explicitly artificial organisms emerged from recombinant DNA technology in the mid-1970s; this technology was commercialized with lightning speed. As of 2006, biotech drugs accounted for about $65 billion in sales worldwide. Just one drug, Epogen, has generated $10 billion in revenues since its creation. A molecular biologist--particularly when receiving stock options in a biotech start-up--would have to conclude that life forms that become "artificial" simply by the addition of one gene can be quite commercially significant.
Revenues from genetically modified "stuff" now exceed 1 percent of US GDP and are generated in three areas: drugs, agriculture, and industrial products like enzymes and plastics. These areas are growing at 10 to 20 percent per year, and together they are making a sizeable and growing contribution to the economy.
The biotech sector is also extremely productive. Between 2000 and 2007, biotech revenues added more than $100 billion to the economy, representing 2.5 percent of US GDP growth. This was accomplished by a biotech workforce of only about 250,000 people, less than one-sixth of one percent of the national workforce.
Yet the underlying technology is immature compared with that in other sectors of the economy. The majority of biotech products that have reached the market are the result of just a handful of genetic modifications and insertions. The commercial significance of the biotech sector will grow as its ability to engineer new biological systems expands.
Until recently, the complexity of these systems was limited in large part by the cost of development. The labor required to build and test a complex genetic circuit was prohibitive. But since the mid-1990s, productivity in reading and writing genes has been improving exponentially, while costs have plunged. Now relatively large pieces of DNA can be designed electronically, sent to a gene "foundry," constructed, and returned via express mail in just a few weeks. It is already technically possible to build stretches of DNA as long as those of small bacterial genomes (about 400 genes).
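The shape of that cost curve is easy to sketch. As a toy model (the starting price and halving time below are illustrative assumptions, not figures from this post), exponential improvement means the cost per base falls geometrically, and with it the cost of whole constructs:

```python
# Toy model of exponentially falling gene-synthesis costs.
# The starting cost and halving time are illustrative assumptions only.
def synthesis_cost_per_base(year, start_year=1995, start_cost=10.0, halving_time=2.0):
    """Cost per base, assuming cost halves every `halving_time` years."""
    return start_cost * 0.5 ** ((year - start_year) / halving_time)

# Cost to synthesize a small bacterial genome,
# assuming ~400 genes of ~1000 bp each.
bases = 400 * 1000
for year in (1995, 2000, 2005, 2009):
    print(year, round(synthesis_cost_per_base(year) * bases))
```

Whatever the exact parameters, any curve of this form turns a project that was once prohibitive into a routine purchase order within a decade or two.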
However, this is not the fastest road to commercially significant organisms. This is because the simpler the engineering task is, the greater the near-term economic impact will be. For example, aeronautical engineers do not attempt to build new aircraft with the complexity of a hawk, a hummingbird, or even a moth. They instead succeed by reducing complexity. Even the simplest cell contains far more bells and whistles than we can presently understand. Consequently, no biological engineer will succeed in building a system from scratch until most of that complexity is whittled away, leaving only the bare essentials. Real progress will come by adding to existing organisms just a few new genes--probably no more than 15.
Companies are already making substantive progress. Amyris Biotechnologies has modified yeast to transform sugar into useful compounds, including malaria drugs and biofuels that can substitute for today's jet fuel, diesel, and gasoline. The company will begin production of these fuels next year in converted ethanol fermentation plants in Brazil.
As the technology develops, biofuels and bioplastics produced this way will be easier and cheaper to make than ethanol or traditional plastics, and they will perform better than even petroleum-based products. Their manufacture and use will also reduce the carbon emissions that cause climate change.
Such artificial life forms will fundamentally change how we power the economy, bringing about a switch from fossil fuels to biological feedstocks like sugar, starch, and cellulose. Biomanufacturing is less likely to be centralized, like petroleum refineries and ethanol plants, and will instead be more evenly distributed, like beer breweries.
Cars themselves might actually become the producers of the very fuels they consume. In the spring of 2007, researchers reported the successful construction of a synthetic pathway consisting of 13 enzymes from different organisms that can turn starch into hydrogen. This suggests a future in which sugar or starch--substances available at any grocery store--will go into our fuel tanks instead of gasoline. A fuel cell will use the hydrogen produced by engineered microbes in the tank to provide electric power for the car. Such a car would then become something of a cyborg, relying on living organisms to provide power to an inorganic shell. As one oil executive observed at a recent oil industry meeting, in this model "the car is the refinery".
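The yield from that pathway is worth a back-of-the-envelope check. Assuming the reported stoichiometry of one glucan unit of starch plus water yielding 12 molecules of hydrogen (C6H10O5 + 7 H2O -> 12 H2 + 6 CO2) -- a sketch of the chemistry, not a process design -- the theoretical yield works out as follows:

```python
# Back-of-the-envelope yield for the starch-to-hydrogen pathway.
# Assumes the stoichiometry C6H10O5 + 7 H2O -> 12 H2 + 6 CO2
# (one glucan unit of starch yields 12 H2).
M_GLUCAN = 6 * 12.011 + 10 * 1.008 + 5 * 15.999   # g/mol, C6H10O5
M_H2 = 2 * 1.008                                   # g/mol

def h2_per_kg_starch(kg_starch=1.0):
    """Theoretical hydrogen mass (kg) obtainable from a mass of starch."""
    mol_glucan = kg_starch * 1000 / M_GLUCAN
    return mol_glucan * 12 * M_H2 / 1000

print(round(h2_per_kg_starch(), 3))  # ~0.149 kg H2 per kg starch
```

Roughly 150 grams of hydrogen per kilogram of starch at theoretical maximum -- enough to make the grocery-store fuel tank more than a thought experiment.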
If this innovation comes to pass, a very different marketplace is likely to arise. The infrastructure for shipping and refining petroleum overseen by that self-same executive might become less relevant in a new biotech world. Moreover, if distributed biological processing of simple feedstocks can compete in low-margin markets like liquid transportation fuels, then it will also make significant inroads with higher-margin products like fibers, plastics, flavorings, and scents.
It will soon be possible to devise enzymes and organisms that "eat" a diverse array of feedstocks. One good example is municipal sewage. Now mostly treated and disposed of as waste, this resource will initially be used to grow unmodified algae. The algae will in turn be fed to synthetic systems--think of these as "artificial cows", a fusion of robot and biology that is beyond even the "cyborg" car--engineered to make materials and fuels. Eventually, the algae itself will be engineered to directly convert sewage into products. And inevitably, these artificial cows will move out into the fields, closer to large-volume agriculture. Modern harvesting equipment is already often driven by autonomous, satellite-guided control systems. Imagine robotic harvesters equipped with bioprocessing modules slowly wandering around farmland, consuming a variety of feedstocks, processing that material into higher-value products like fuels and plastics, and delivering it to distribution centers. These hybrid "cowborgs" would thereby become autonomous, distributed biomanufacturing platforms, engineered to supply us with the fuels and materials that we need.
Very few organisms on our planet are larger than about one meter across. Most of the biomass production, and therefore most of the biological processing, occurs at scales of microns to centimeters. While organisms produced by nature face different constraints than those designed by humans, we may find ever more inspiration in microbes, insects, and cows for our future production infrastructure. We have barely begun to tap the promise of biotech.
May 4th, Noon, Princeton University: "Biology is Technology: Garage Biology, Microbrewing and the Economic Drivers of Distributed Biological Production"
May 5th, 1 pm, Genspace (33 Flatbush Avenue, Brooklyn): "Biology Is Technology: The Implications of Global Biotechnology"
May 7th-8th, The Hastings Center, "Progress and Prospects for Microbial Biofuels" for the next round of conversations on ethics, synthetic biology, and public policy. The previous round of conversations is captured in this set of essays, which includes my contribution, "Staying Sober About Science" (free after registration).
Here are the first three 'graphs:
The request takes advantage of new research in synthetic biology, a science that applies engineering principles to genetics. To its humanitarian credit, in the field's short existence, scientists have genetically programmed bacteria and yeast to cheaply produce green jet fuels (now being tested by major airplane makers) and malaria medicines (scheduled for market in 2013). It's an auspicious beginning for a science that promises to revolutionize how we make things. In the future, we may harness cells to self-assemble into far more complex objects like cell phone batteries or behave like tiny programmable computers. The promise, however, comes yoked with risks.
The techniques that make synthetic biology such a powerful tool for positive innovation may also be used for destruction. The military's new search for biologically brewed explosives threatens to reopen an avenue of research that has been closed for 37 years: biotechnology developed for use in war.
As most readers of this blog know, there has been quite a furor over new results demonstrating mutations in H5N1 influenza strains that are both deadly and highly contagious in mammals. Two groups, led by Ron Fouchier in the Netherlands and Yoshihiro Kawaoka at The University of Wisconsin, have submitted papers to Nature and Science describing the results. The National Science Advisory Board for Biosecurity (NSABB) has requested that some details, such as sequence information, be omitted from publication. According to Nature, both journals are "reserving judgement about whether to censor the papers until the US government provides details of how it will allow genuine researchers to obtain redacted information".
For those looking to find more details about what happened, I suggest starting with Dorveen Caraval's interview with Fouchier in the New York Times, "Security in Flu Study Was Paramount, Scientist Says"; Kathleen Harmon's firsthand account of what actually happened when the study was announced; and Heidi Ledford's post at Nature News about the NSABB's concerns.
If you want to go further, there is more good commentary, especially the conversation in the comments (including from a member of the NSABB), in "A bad day for science" by Vincent Racaniello. See also Michael Eisen's post "Stop the presses! H5N1 Frankenflu is going to kill us all!", keeping in mind that Eisen used to work on the flu.
Writing at Foreign Policy, Laurie Garrett has done some nice reporting on these events in two posts, "The Bioterrorist Next Door" and "Flu Season". She suggests that attempts to censor the results would be futile: "The genie is out of the bottle: Eager graduate students in virology departments from Boston to Bangkok have convened journal-review debates reckoning exactly how these viral Frankenstein efforts were carried out."
There is much I agree with in Ms. Garrett's posts. However, I must object to her assertion that the work done by Fouchier and Kawaoka can be repeated easily using the tools of synthetic biology. She writes "The Fouchier episode laid bare the emptiness of biological-weapons prevention programs on the global, national, and local levels. Along with several older studies that are now garnering fresh attention, it has revealed that the political world is completely unprepared for the synthetic-biology revolution." As I have already written a book that discusses this confusion (here is an excerpt about synthetic biology and the influenza virus), it is not actually what I want to write about today. But I have to get this issue out of the way first.
As far as I understand from reading the press accounts, both groups used various means to create mutations in the flu genome and then selected viruses with properties they wanted to study. To clarify, from what I have been able to glean from the sparse accounts thus far, DNA synthesis was not used in the work. And as far as I understand from reading the literature and talking to people who build viruses for a living, it is still very hard to assemble a functioning, infectious influenza virus from scratch.
If it were easy to write pathogen genomes -- particularly flu genomes -- from scratch, we would quite frankly be in deep shit. But, for the time being, it is hard. And that is important. Labs who do use synthetic biology to build influenza viruses, as with those who reconstructed the 1918 H1N1 influenza virus, fail most of the time despite great skill and funding. Synthesizing flu viruses is simply not a garage activity. And with that, I'll move on.
Regardless of how the results might be reproduced, many have suggested that the particular experiments described by Fouchier and Kawaoka should not have been allowed. Fouchier himself acknowledges that selecting for airborne viruses was not the wisest experiment he could have done; it was, he says, "really, really stupid". But the work is done, and people do know about it. So the question of whether this work should have been done in the first place is beside the point. If, as Michael Eisen suggests, "any decent molecular biologist" could repeat the work, then it was too late to censor the details as soon as the initial report came out.
I am more interested in the consequences of trying to contain the results while somehow allowing access to vetted individuals. Containing the results is as much about information security as it is biological security. Once such information is created, the challenge is to protect it, to secure it. Unfortunately, the proposal to allow secure access only by particular individuals is at least a decade (if not three decades) out of date.
Any attempt to secure the data would have to start with an assessment of how widely it is already distributed. I have yet to meet an academic who regularly encrypts email, and my suspicion is that few avail themselves of the built-in encryption on their laptops. So, in addition to the university computers and email servers where the science originated, the information is sitting in the computers of reviewers, on servers at Nature and Science, at the NSABB, and, depending on how the papers were distributed and discussed by members of the NSABB, possibly on their various email servers and individual computers as well. And let's not forget the various unencrypted phones and tablets all of those reviewers now carry around.
But never mind that for a moment. Let's assume that all these repositories of the relevant data are actually secure. The next step is to arrange access for selected researchers. That access would inevitably be electronic, requiring secure networks, passwords, etc. In the last few days the news has brought word that the intelligence firm Stratfor and the computer security firm Symantec have evidently been hacked. Such attacks are not uncommon. Think back over the last couple of years: hacks at Google, various government agencies, universities. Credit card numbers, identities, and supposedly secret DoD documents are all for sale on the web. To that valuable information we can now add a certain list of influenza mutations. If those mutations are truly a critical biosecurity risk -- as asserted publicly by various members of the NSABB -- then that data has value far beyond its utility in virology and vaccinology.
The behavior of various hackers (governments, individuals, and others) over the last few years makes clear that the discussion thus far has stuck a giant "HACK HERE" sign on the data. Moreover, if Ms. Garrett is correct that students across the planet are busy reverse engineering the experiments because they don't have access to the original methods and data, then censorship is creating a perverse incentive for innovation. Given today's widespread communication, restriction of access to data is an invitation, not a proscription.
This same fate awaits any concentration of valuable data. It obviously isn't a problem limited to collections of sensitive genetic sequences or laboratory methods. And there is certainly a case to be made for attempting to maintain confidential or secret caches of data, whether in the public or private interest. In such instances, compartmentalization and encryption must be implemented at the earliest stages of communication in order to have any hope of maintaining security.
However, in this case, if it is true that reverse engineering the results is straightforward, then restriction of access serves only to slow down the general process of science. Moreover, censorship will slow the development of countermeasures. It is unlikely that any collection of scientists identified by the NSABB or the government will be sufficient to develop all the technology we need to respond to natural pathogens, let alone any artificial ones.
As with most other examples of prohibition, these restrictions are doomed before they are even implemented. Censorship of information that is known to exist incentivizes innovation and rediscovery. As I explored in my book, prohibition in the name of security is historically a losing proposition. Moreover, science is inherently a networked human activity that is fundamentally incompatible with constraints on communication, particularly of results that are already disclosed. Any endeavor that relies upon science -- including developing technologies to defend against natural and artificial pathogens -- is therefore also fundamentally incompatible with constraints on communication. Censorship threatens not just science but also our security.
I think the biggest change from last year is the choice of applications, which I will describe below. And related to the choice of applications is a change of approach to follow a more complete design philosophy. I'll get to the shift in design sensibility further on in the post.
The University of Washington: Make it or Break it
I described previously the nuts and bolts of the University of Washington's Grand Prize winning projects. But, to understand the change in approach (or perhaps change in scope?) this project represents, you also have to understand a few details about problems in the real world. And that is really the crux of the matter -- teams this year took on real world problems as never before, and may have produced real world solutions.
Recall that one of the UW projects was the design of an enzyme that digests gluten, with the goal of using that enzyme to treat gluten intolerance. Candidate enzymes were identified through examining the literature, with the aim of finding something that works at low pH. The team chose a particular starter molecule, and then used the "video game" Foldit to re-design the active site in silico so that it would chew up gluten (here is a very nice Youtube video on the Foldit story from Nature). They then experimentally tested many of the potential improvements. The team wound up with an enzyme that in a test tube is ~800 times better than one already in clinical trials. While the new enzyme would of course itself face lengthy clinical trials, the team's achievement could have an enormous impact on people who suffer from celiac disease, among many other ailments.
From a story in last week's NYT Magazine ("Should We All Go Gluten-Free?"), here are some eye-opening stats on celiac disease, which can cause symptoms ranging from diarrhea to dramatic weight loss:
- Prior to 2003, prevalence in the US was thought to be just 1 in 10,000: widespread testing revealed the actual rate was 1 in 133.
- Current estimates are that 18 million Americans have some sort of gluten intolerance, which is about 5.8% of the population.
- Based on analysis of stored blood samples, young people in the 1990s were five times more likely to have the disease than young people in the 1950s.
- Prevalence is increasing not just in the US but worldwide.
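A quick arithmetic check on those numbers (the US population figure is my assumption, roughly the 2010 census count, not a number from the article):

```python
# Sanity-check the celiac statistics above.
us_population = 309_000_000  # assumed, roughly the 2010 US census
gluten_intolerant = 18_000_000
print(round(gluten_intolerant / us_population * 100, 1))  # ~5.8 percent

# And the revised prevalence: 1 in 133 versus the old 1-in-10,000 estimate.
print(round((1 / 133) / (1 / 10_000)))  # ~75x higher than once thought
```

The numbers hang together: 18 million is indeed about 5.8 percent of the US population, and the revised prevalence is roughly 75 times the pre-2003 estimate.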
The other UW project is a demonstration of using E. coli to directly produce diesel fuel from sugar. The undergraduates first reproduced work published last year from LS9 in which E. coli was modified to produce alkanes (components of diesel fuel -- here is the Science paper by Schirmer et al). Briefly, the UW team produced biobricks -- the standard format used in iGEM -- of two genes that turn fatty acids into alkanes. Those genes were assembled into a functional "Petrobrick". The team then identified and added a novel gene to E. coli that builds fatty acids from 3 carbon seeds (rather than the native coli system that builds on 2 carbon seeds). The resulting fatty acids then served as substrates for the Petrobrick, resulting in what appears to be the first report anywhere of even-chain alkane synthesis. All three genes were packaged up into the "FabBrick", which contains all the components needed to let E. coli process sugar into a facsimile of diesel fuel.
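The chain-length logic here is worth spelling out. Fatty-acid synthesis extends the growing chain two carbons per cycle, and the LS9-style pathway converts a Cn fatty acid into a C(n-1) alkane by removing one carbon, so the size of the starter "seed" sets the parity of the final alkanes. A sketch (the cycle counts are illustrative, not the team's actual product distribution):

```python
# Why the seed size sets alkane chain parity. Fatty-acid synthesis
# extends the chain by two carbons per elongation cycle; the alkane-
# producing enzymes then remove one carbon from each fatty acid.
def alkane_lengths(seed_carbons, cycles):
    """Alkane chain lengths produced from a given starter ('seed')."""
    fatty_acids = [seed_carbons + 2 * c for c in range(1, cycles + 1)]
    return [n - 1 for n in fatty_acids]  # one carbon lost per alkane

print(alkane_lengths(2, 8))  # native 2-carbon seed -> odd-chain alkanes
print(alkane_lengths(3, 8))  # engineered 3-carbon seed -> even-chain alkanes
```

Swapping a two-carbon seed for a three-carbon seed flips every product from odd to even chain length, which is exactly why adding that one gene opened up a product class no one had reported before.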
The undergraduates managed to substantially increase the alkane yield by massaging the culture conditions, but the final yield is a long way from being useful to produce fuel at volume. But again, not bad for a summer project. This is a nice step toward turning first sugar, then eventually cellulose, directly into liquid fuels with little or no purification or post-processing required. It is, potentially, also a step toward "Microbrewing the Bioeconomy". For the skeptics in the peanut gallery, I will be the first to acknowledge that we are probably a long way from seeing people economically brew up diesel in their garage from sugar. But, really, we are just getting started. Just a couple of years ago people thought I was all wet forecasting that iGEM teams would contribute to technology useful for distributed biological manufacturing of fuels. Now they are doing it. For their summer projects. Just wait a few more years.
Finally -- yes, there's more -- the UW team worked out ways to improve the cloning efficiency of so-called Gibson cloning. They also packaged up as biobricks all the components necessary to produce magnetosomes in E. coli. The last two projects didn't make it quite as far as the first two, but still made it further than many others I have seen in the last 5 years.
Before moving on, here is a thought about the mechanics of participating in iGEM. I think the UW wiki is about the best I have seen. I like very much the straightforward presentation of hypothesis, experiments, and results. It was very easy to understand what they wanted to do, and how far they got. Here is the "Advice to Future iGEM Teams" I posted a few years ago. Aspiring iGEM teams should take note of the 2011 UW wiki -- clarity of communication is part of your job.
Lyon-INSA-ENS: Cobalt Buster
The team from Lyon took on a very small problem: cleaning up cooling water from nuclear reactors using genetically modified bacteria. This was a nicely conceived project that involved identifying a problem, talking to stakeholders, and trying to provide a solution. As I understand it, there are ongoing discussions with various sponsors about funding a start-up to build prototypes. It isn't obvious that the approach is truly workable as a real world solution -- many questions remain -- but the progress already demonstrated indicates that dismissing this project would be premature.
Before continuing, I pause to reflect on the scope of Cobalt Buster. One does wonder about the eventual pitch to regulators and the public: "Dear Europe, we are going to combine genetically modified organisms and radiation to solve a nuclear waste disposal problem!" As the team writes on its Human Practices page: "In one project, we succeed to gather Nuclear Energy and GMOs. (emphasis in original)" They then acknowledge the need to "focus on communication". Indeed.
Here is the problem they were trying to solve: radioactive cobalt (Co) is a contaminant emitted during maintenance of nuclear reactors. The Co is typically cleaned up with ion exchange resins, which are expensive and, once spent, must be appropriately disposed of as nuclear waste. By inserting a Co importer pump into E. coli, the Lyon team hopes to use bacteria to concentrate the Co and thereby clean up reactor cooling water. That sounds cool, but the bonus here is that modeling of the system suggests that using E. coli as a biofilter in this way would result in substantially less waste. The team reports that they expect 8,000 kg of ion exchange resins could be replaced with 4 kg of modified bacteria. That factor of 2,000 in volume reduction would have a serious impact on disposal costs. And the modified bug appears to work in the lab (with nonradioactive cobalt), so this story is not just marketing.
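To make the disposal argument concrete, here are the team's headline numbers, plus an illustrative cost comparison; the per-kilogram disposal cost is purely my assumption for the sake of the sketch, not a figure from the team:

```python
# The Lyon team's reported numbers for replacing ion-exchange resin
# with a bacterial biofilter.
resin_kg = 8000
bacteria_kg = 4
print(int(resin_kg / bacteria_kg))  # 2000-fold less waste mass per cycle

# Illustrative savings, assuming a nominal nuclear-waste disposal cost.
disposal_cost_per_kg = 100.0  # assumed for illustration only
print(int((resin_kg - bacteria_kg) * disposal_cost_per_kg))
```

Whatever the real disposal cost per kilogram, a three-orders-of-magnitude reduction in waste mass dominates the economics.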
The Lyon team also inserted a Co sensor into their E. coli strain. The sensor then drove expression of a protein that forms amyloid fibers, causing the coli in turn to form a biofilm. This biofilm would stabilize the biofilter in the presence of Co. The filter would only be used for a few hours before being replaced, which would not give the strain enough time to lose this circuit via selection.
Imperial College London: Auxin
Last, but certainly not least, is the very well thought through Imperial College project to combat soil erosion by encouraging plant root growth. I saved this one for last because, for me, the project beautifully reflects the team's intent to carefully consider the real-world implications of their work. There are certainly skeptics out there who will frown on the extension of iGEM into plants, and who feel the project would never make it into the field due to the many regulatory barriers in Europe. I think the skeptics are completely missing the point.
To begin, a summary of the project: the Imperial team's idea was to use bacteria as a soil treatment, applied in any number of ways, that would be a cost-effective means of boosting soil stability through root growth. The team designed a system in which genetically modified bacteria would be attracted to plant roots, would then take up residence in those roots, and would subsequently produce a hormone that encourages root growth.
The Auxin system was conceived to combine existing components in very interesting ways. Naturally-occurring bacteria have already been shown to infiltrate plant roots, and other soil-dwelling bacteria produce the same growth hormone that encourages root proliferation.
Finally, the team designed and built a novel (and very clever) system for preventing leakage of transgenes through horizontal gene transfer. On the plasmid containing the root growth genes, the team also included genes that produce proteins toxic to bacteria. But in the chromosome, they included an anti-toxin gene. Thus if the plasmid were to leak out and be taken up by a bacterium without the anti-toxin gene, any gene expression from the plasmid would kill the recipient cell.
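The containment logic reduces to a small truth table: the toxin genes travel with the plasmid, while the antitoxin stays behind on the host chromosome. A minimal sketch of that circuit (function and variable names are mine, for illustration):

```python
# Truth-table sketch of the toxin/antitoxin containment circuit:
# toxin genes ride on the plasmid, the antitoxin sits on the host
# chromosome, so the plasmid is lethal to any cell but the intended host.
def cell_survives(has_plasmid, has_antitoxin):
    """A cell expressing the plasmid's toxin dies unless it also
    carries the chromosomal antitoxin gene."""
    toxin_expressed = has_plasmid
    return (not toxin_expressed) or has_antitoxin

print(cell_survives(True, True))    # engineered host: lives
print(cell_survives(True, False))   # wild bacterium that took up the plasmid: dies
print(cell_survives(False, False))  # unrelated bystander: unaffected
```

The elegance is that the lethal payload and its antidote are physically separated, so horizontal transfer of the plasmid alone is self-punishing.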
The team got many of these pieces working independently, but didn't quite get the whole system working together in time for the international finals. I encourage those interested to have a look at the wiki, which is really very good.
The Shift to Thinking About Design
As impressive as Imperial's technical results were, I was also struck by the integration of "human practices" into the design process. The team spoke to farmers, economists, Greenpeace -- the list goes on -- as part of both defining the problem and attempting to finesse a solution given the difficulty of fielding GMOs throughout the UK and Europe. And these conversations very clearly impacted the rest of the team's activities.
One of the frustrations felt by iGEM teams and judges alike is that "human practices" has often felt like something tacked on to the science for the sake of placating potential critics. There is something to that, as the Ethical, Legal, and Social Implications (ELSI) components of large federal projects such as The Human Genome Project and SynBERC appear to have been tacked on for just that reason. Turning "human practices" into an appendix on the body of science is certainly not the wisest way to go forward, for reasons I'll get to in a moment, nor is it politically savvy in the long term. But if the community is honest about it, tacking on ELSI to get funding has been a successful short-term political hack.
The Auxin project, along with a few other events during the finals, helped crystallize for me the disconnect between thinking about "human practices" as a mere appendix while spouting off about how synthetic biology will be the core of a new industrial revolution, as some of us tend to do. Previous technological revolutions have taught us the importance of design, of thinking the whole project through at the outset in order to get as much right as possible, and to minimize the stuff we get wrong. We should be bringing that focus on design to synthetic biology now.
I got started down this line of thought during a very thought-provoking conversation with Dr. Megan Palmer, the Deputy Director for Practices at SynBERC. (Apologies to you, Megan, if I step on your toes in what follows -- I just wanted to get these thoughts on the page before heading out the door for the holidays.) The gist of my chat with Megan was that the focus on safety and security as something else, as an activity separate from the engineering work of SB, is leading us astray. The next morning, I happened to pass Pete Carr and Mac Cowell having a chat just as one of them was saying, "The name human practices sucks. We should really change the name." And then my brain finally -- amidst the jet lag and 2.5 days of frenetic activity serving as a judge for iGEM -- put the pieces together. The name does suck. And the reason it sucks is that it doesn't really mean anything.
What the names "human practices" and "ELSI" are trying to get at is the notion that we shouldn't stumble into developing and using a powerful technology without considering the consequences. In other fields, whether you are thinking about building a chair, a shoe, a building, an airplane, or a car, in addition to the shape you usually spend a great deal of time thinking about where the materials come from, how much the object costs to make, how it will be used, who will use it, and increasingly how it will be recycled at end of use. That process is called design, and we should be practicing it as an integral part of manipulating biological systems.
When I first started as a judge for iGEM, I was confused by the kind of projects that wound up receiving the most recognition. The prizes were going to nice projects, sure, but those projects were missing something from my perspective. I seem to recall protesting at some point in that first year that "there is an E in iGEM, and it stands for Engineering." I think part of that frustration was that the pool of judges was dominated for many years by professors funded by the NIH, NRC, or the Wellcome Trust, for example -- scientists who were looking for scientific results they liked to grace the pages of Science or Nature -- rather than engineers, hackers, or designers who were looking for examples of, you know, engineering.
My point is not that the process of science is deficient, nor that all lessons from engineering are good -- especially as for years my own work has fallen somewhere in between science and engineering. Rather, I want to suggest that, given the potential impact of all the science and engineering effort going into manipulating biological systems, everyone involved should be engaging in design. It isn't just about the data, nor just about shiny objects. We are engaged in sorting out how to improve the human condition, which includes everything from uncovering nature's secrets to producing better fuels and drugs. And it is imperative that as we improve the human condition we do not diminish the condition of the rest of the life on this planet, as we require that life to thrive in order that we may thrive.
Which brings me back to design. It is clear that not every experiment in every lab that might move a gene from one organism to another must consider the fate of the planet as part of the experimental design. Many such experiments have no chance of impacting anything outside the test tube in which they are performed. But the practice of manipulating biological systems should be done in the context of thinking carefully about what we are doing -- much more carefully than we have been, generally speaking. Many fields of human endeavor can contribute to this practice. There is a good reason that ELSI has "ethical", "legal", and "social" in it.
There have been a few other steps toward the inclusion of design in iGEM over the years. Perhaps the best example is the work designers James King and Daisy Ginsberg did with the 2009 Grand Prize Winning team from Cambridge (see iGEM 2009: Got Poo?). That was lovely work, and was cleverly presented in the "Scatalog". You might argue that the winners over the years have had increasingly polished presentations, and you might worry that style is edging out substance. But I don't think that is happening. The steps taken this year by Imperial, Lyon, and Washington toward solving real-world problems were quite substantive, even if those steps are just the beginning of a long path to get solutions into people's hands. That is the way innovation works in the real world.
The UW team had an embarrassment of riches this year. One of the team's projects demonstrated production of both odd and even chain alkanes in E. coli directly from sugar. The odd-chain work reproduces the efforts of a Science paper published by LS9 last year, but the team also added an enzyme from B. subtilis to the pathway that builds alkanes starting from a 3-carbon seed rather than the normal 2-carbon seed in E. coli. This latter step allowed them to make even-chain alkanes via a synthetic biological pathway, which has not been reported elsewhere. So they wound up directly making diesel fuel from sugar. The yields aren't yet high enough to roll this sort of thing out more widely, but it's not bad for a summer project.
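The odd/even chain logic is simple carbon arithmetic. A hedged sketch, assuming the textbook picture of the pathway (fatty acid synthesis extends a seed by two carbons per elongation cycle, and the final decarbonylation step removes one carbon from the fatty aldehyde); the function name and numbers here are mine, not the team's:

```python
# Chain-length arithmetic behind odd- vs even-chain alkane production.
# Assumes: elongation adds 2 carbons per cycle; decarbonylation of the
# fatty aldehyde removes 1 carbon to yield the alkane.

def alkane_carbons(seed_carbons, elongation_cycles):
    """Carbons in the final alkane after elongation and decarbonylation."""
    fatty_acyl = seed_carbons + 2 * elongation_cycles
    return fatty_acyl - 1  # decarbonylation drops one carbon

# 2-carbon (acetyl) seed -> odd-chain alkanes
print([alkane_carbons(2, n) for n in range(5, 8)])  # [11, 13, 15]
# 3-carbon seed -> even-chain alkanes (diesel-range)
print([alkane_carbons(3, n) for n in range(5, 8)])  # [12, 14, 16]
```

Starting from a 2-carbon seed, the acyl chain is always even and the alkane always odd; swapping in a 3-carbon seed flips the parity, which is why the B. subtilis enzyme opens up the even-chain products.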
And that's not all.
The other main project was an effort to produce an enzyme to digest gluten. There is one such enzyme in clinical trials at the moment, intended for use as a therapeutic for gluten intolerance, which afflicts about 1% of the population. However, that enzyme is not thermostable and has an optimum pH of 7.
The UW team found an enzyme in the literature that was not known to digest gluten, but which works at pH 4 (close to the pH of the human stomach) and is from a thermophilic organism. They used Foldit to redesign the enzyme to process gluten, and then built a library of about 100 variants of that design. One of those variants wound up working ~800 times better than the enzyme currently in clinical trials. And the team thinks they can do even better by combining some of the mutants from the library.
I could go on and on about the competition this year. The teams are all clearly working at a new level. I recall that a couple of years ago at iGEM Drew Endy asked me, somewhat out of frustration, "Is this it? Is this all there is?" The answer: No. There is a hell of a lot more. And the students are just getting started.
Plenty of other teams deserve attention in this space, in particular Imperial College London, the runner-up. They built a system (called Auxin) in E. coli to encourage plant root growth, with the aim of stopping desertification. And their project was an extremely good example of design, from the technical side through to conversations with customers (industry) and other stakeholders (Greenpeace) about what deployment would really be like.
More here later in the week. Gotta run for the plane.
Biological Technology in 2050
Silver Award Winner, The Economist/Shell World in 2050 Essay Competition
In fifty years, you may be reading The Economist on a leaf. The page will not look like a leaf, but it will be grown like a leaf. It will be designed for its function, and it will be alive. The leaf will be the product of intentional biological design and manufacturing.
Rather than being constantly green, the cells on its surface will contain pigments controlled by the action of something akin to a nervous system. Like the skin of a cuttlefish, the cells will turn color to form words and images as directed by a connection to the internet of the day. Given the speed with which the cuttlefish changes its pigment, these pages may not be fast enough to display moving images, but they will be fine for the written word. Each page will be slightly thicker than the paper The Economist is now printed on, providing room for control elements (the nervous system) and circulation of nutrients. When a page ages, or is damaged, it will be easily recycled. It will be fueled by sugar and light. Many of the artifacts produced in 50 years and used in daily living will have a similar appearance, and a similar origin. The consequences of mature biological design and manufacturing are widespread, and will affect all aspects of the economy including energy and resource usage, transportation, and labor. Today, electronic paper and similar display technologies are just around the corner, but in the long run they will not be able to compete with the products of inexpensive, distributed biological manufacturing.
Growing engineered leaves for display devices may seem a complex biological engineering feat, but foundations for the technology are already being laid. Structurally simple replacement human tissues are currently being grown in the laboratory on frameworks of suture material. Projects to grow functional human heart tissue, and eventually a whole heart, are underway, with a timeline for completion of ten years.
Within those ten years, the genomes of many organisms will be sequenced, providing a parts list for the proteins forming the structural and control elements in those organisms. Biologists, engineers, and physicists are already collaborating to build models that help us understand how those parts work and fit together. The goal for these models is quantitative prediction of the behavior of biological systems, which will have profound implications for the understanding of basic biology and for improving human health.
Beyond initial biomedical consequences, models that can be used to predict the effects of perturbations to existing biological systems will become de facto design tools, providing an infrastructure for creating new technologies based on biology. When we can successfully predict the behavior of designed biological systems, then an intentional biology will exist. With an explicit engineering component, intentional biology is the opposite of the current, very nearly random applications of biology as technology. For instance, the present debate over genetically modified foods is more indicative of the poorly planned use of an immature technology rather than a failure of the technology itself. At present we simply can't predict the effects of tinkering with a system as complex as crops and their pests. But as with the progression of every other human technology, from fire, to bridges, to computers, biological engineering will improve with time. Quantitative models for simple systems like viral infections of bacteria and yeast signal transduction pathways are already being tested. Computational methods developed in those efforts will soon be applied to higher plants and animals. It is a short step from successful prediction to design and the beginning of industrial applications.
Yet even before the advent of true biological design, more general lessons from biology are already transforming our economy. The potential impact on industrial practices of learning from biology is enormous and is explored in the book Natural Capitalism by Paul Hawken and Amory and L. Hunter Lovins. The authors point out that structuring business practices along biological lines can significantly improve the bottom line. The human circulatory system, for instance, is optimized to minimize the work required to pump blood throughout the body. The majority of industrial pumping systems, however, are optimized to minimize the cost of the pipes during construction. This means smaller pipes are used, requiring large pumps that use vastly more energy than necessary. Similarly, in the human pumping system, the heart has to work too hard when arteriosclerosis leads to a reduction in the diameter of blood vessels. These vessels then require maintenance in the form of an angioplasty. Industrial pumping systems are designed with built-in arteriosclerosis, and fixing them requires rebuilding from the ground up. Paying careful attention to several hundred million years of nature's trial and error design experience will provide considerable savings in energy and resources to human industry.
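The pipe-diameter argument can be made quantitative with a back-of-envelope calculation. A minimal sketch, assuming laminar flow so the Hagen-Poiseuille relation applies; the dimensions and fluid properties below are illustrative, not drawn from the book:

```python
import math

# In laminar flow, pressure drop dp = 128 * mu * L * Q / (pi * d**4),
# so at fixed flow rate Q the hydraulic pumping power P = Q * dp
# scales as 1/d**4: small savings on pipe cost multiply pump energy.

def pump_power(Q, d, L=100.0, mu=1.0e-3):
    """Hydraulic power (W) to push flow Q (m^3/s) through a pipe of
    diameter d (m) and length L (m); mu is the fluid viscosity (Pa*s)."""
    dp = 128 * mu * L * Q / (math.pi * d**4)
    return Q * dp

Q = 1.0e-3  # one liter per second
small = pump_power(Q, 0.025)  # 25 mm pipe
large = pump_power(Q, 0.05)   # 50 mm pipe
print(round(small / large, 1))  # halving the diameter costs 16x the power
```

In turbulent flow, typical of real industrial systems, the penalty is even steeper (roughly 1/d**5), which is the Lovins' point: skimping on pipe diameter at construction time locks in a large, permanent energy cost.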
Borrowing a design aesthetic for industrial function from nature is just the beginning. The living world will also become part of our industrial infrastructure. Nature has already discovered how to fabricate materials and finesse chemistry in ways that are the envy of human engineers and chemists. Many companies, both established and start-up, are now focusing on harvesting enzymes from organisms in the environment for use in industrial processes. Popular examples of high strength materials fabricated by biology at low temperature, pressure, and energy cost are spider silk and abalone shell. Yet increased resource efficiency and biomaterials are only the first steps in a revolution in manufacturing. Beyond using biology as a model for the structure and function of industrial production, the year 2050 will see humans utilizing biology as the means of production itself.
Whereas most manufacturing today is highly centralized and materials are transported considerable distances throughout the assembly process, in the year 2050 human industry will use distributed and renewable manufacturing based upon biology. Renewable manufacturing means that biology will be used to produce many of the physical things we use every day. In early implementation, the organism of choice will likely be yeast or a bacterium. The physical infrastructure for this type of manufacturing is inherently flexible: it is essentially the vats, pumps, and fluid handling capacity found in any brewery. Production runs for different products would merely involve seeding a vat with a yeast strain containing the appropriate genetic instructions and then providing raw materials. To be sure, there will always be applications and environments where biological fabrication is not the best option, and it is not clear how complex the fabrication task can be, but biology is capable of fabrication feats that cannot be emulated by any current or envisioned human technology. In some ways, this scheme sounds a bit like Eric Drexler's nanotechnological assemblers, except that we already have functional nanotechnology - it's called biology.
The transformation to an economy based on biological manufacturing will occur as technical manipulations become easier with practice and through a proliferation of workers with the appropriate skills. Biological engineering will proceed from profession, to vocation, to avocation, because the availability of inexpensive, quality DNA sequencing and synthesis equipment will allow participation by anyone who wants to learn the details. In 2050, following the fine tradition of hacking automobiles and computers, garage biology hacking will be well underway.
Considerable information is already available on how to manipulate and analyze DNA in the kitchen. A recent Scientific American Amateur Scientist column provided instructions for amplifying DNA through the polymerase chain reaction (PCR), and a previous column concerned analyzing DNA samples using homemade electrophoresis equipment. The discussion was immediately picked up in a slashdot.org thread where participants provided tips for improving the yield of the PCR process. More detailed, technical information can be found in any university biology library in Current Protocols in Molecular Biology, which contains instructions on how to perform virtually every task needed in modern molecular biology. This printed compendium has recently joined the myriad resources online maintained by universities and government agencies, thereby becoming all the more accessible. Open-source biology is already becoming a reality.
As the "coding" infrastructure for understanding, troubleshooting, and, ultimately, designing biology develops, DNA sequencers and synthesizers will become less expensive, faster, and ever simpler to use. These critical technologies will first move from academic labs and large biotechnology companies to small businesses, and eventually to the home garage and kitchen. Many standard laboratory techniques that once required a doctorate's worth of knowledge and experience to execute correctly are now used by undergraduates in a research setting with kits containing color-coded bottles of reagents. The recipes are easy to follow. This change in technology represents a democratization of sorts, and it illustrates the likely changes in labor structure that will accompany the blossoming of biological technology.
The course of labor in biological technology can be charted by looking at the experience of the computer and internet industries. Many start-up companies in Silicon Valley have become contract engineering efforts, funded by venture capital, where workers sign on with the expectation that the company will be sold within a few years, whereupon they will find a new assignment. The leading edge of the biological technology revolution could soon look the same. However, unlike today's integrated circuits, where manufacturing infrastructure costs have now reached upwards of 1 billion dollars per facility, the infrastructure costs for renewable biological manufacturing will continue to decline. Life, and all the evolutionarily developed technology it utilizes, operates at essentially room temperature, fueled by sugars. Renewable, biological manufacturing will take place anywhere someone wants to set up a vat or plant a seed.
Distributed biological manufacturing will be all the more flexible because the commodity in biotechnology is today becoming information rather than things. While it is still often necessary to exchange samples through the mail, the genomics industry has already begun to derive income from selling solely information about gene expression. In a few decades it will be the genomic sequence that is sent between labs, there to be re-synthesized and expressed as needed. It is already possible to synthesize sufficient DNA to build a bacterial genome from scratch in a few weeks using chemical means. Over the coming decades that time will be reduced to days, and then to hours, eventually via the development of directed, template-free, enzymatic synthesis - a DNA "synthase."
It is possible that the evolution of open-source biology will be delayed by retrenchment on the part of corporations trying to protect intellectual property. However, the future model of biology as a technological instrument of any corporation can be found by simply looking at the way life currently makes use of biological technology. Only very rarely is it the case that advantage is conferred on an organism via a biochemically unique enzyme or pathway. The toolbox of biochemistry, the parts list - the "kernel," to stretch the software analogy - is shared by all organisms on the planet. In general, organisms are different from one another because of the order of gene expression or because of relatively subtle perturbations to protein structures common to all forms of terrestrial life. That is, innovation in the natural world in some sense has always followed the idea of a service and flow economy. If the environment is static, only when an organism figures out how to provide itself, or another organism, with a new service using the old toolbox is advantage conferred.
The analogy to future industrial applications of biology is clear: When molecular biologists figure out the kernel of biology, innovation by humans will consist of tweaking the parts to provide new services. Because of the sheer amount of information, it is unlikely that a single corporate entity could maintain a monopoly on the kernel. Eventually, as design tasks increase in number and sophistication, corporations will have to share techniques and this information will inevitably spread widely, reaching all levels of technical ability - the currency of the day will be innovation and design. As with every other technology developed by humans, biological technology will be broadly disseminated.
As open-source biological manufacturing spreads, it will be adopted quickly in less developed economies to bypass the first world's investment in industrial infrastructure. Given the already stressed state of natural resources throughout much of the developing world, it will not be possible for many of those countries to attain first-world standards of living with industrial infrastructure as wasteful as that of the United States. The developing world simply cannot afford industrial and energy inefficiency. A short cut is to follow the example of the growing wireless-only communications infrastructure in Africa and skip building systems to transport power and goods. It is already clear that distributed power generation will soon become more efficient than centralized systems. Distributed manufacturing based upon local resources will save transportation costs, provide for simpler customization, require less infrastructure investment, and as a result will likely cost less than centralized manufacturing.
Distributed biological manufacturing is the future of the global economy. With design and fabrication power spread throughout the world to the extent suggested here, it is necessary to consider possible dangers. The simple answer is that those dangers are real and considerable. This technology enables the creation of new organisms potentially pathogenic to humans, or to animals and plants upon which we rely. It is already clear that the social and biological consequences of extending human life span and human germline engineering will consume considerable public debate time over the next few decades. Moreover, the underlying infrastructure and methods are already so widespread that no one country will be able to manipulate the development of biological technology by controlling the research within its borders. But fear of potential hazards should be met with increased research and education rather than closing the door on the profound positive impacts distributed biological technology will have on human health, human impacts on the environment, and on increasing standards of living around the world. Technology based on intentional, open-source biology is on its way, whether we like it or not, and the opportunity it represents will just begin to emerge in the next fifty years.
3. Endy, D., et al., Computation, prediction, and experimental tests of fitness for bacteriophage T7 mutants with permuted genomes. Proceedings of the National Academy of Sciences, 2000. 97: p. 5375-5380.
4. Hawken, P., A. Lovins, and L.H. Lovins, Natural Capitalism. 1999: Little Brown.
6. Sarikaya, M., Biomimetics: Materials fabrication through biology. Proceedings of the National Academy of Sciences, 1999: p. 14183-14185.
10. Current Protocols in Molecular Biology. 1999, Wiley: New York.
Here is the first paragraph:
Biology, we are frequently told, is the science of the twenty-first century. Authority informs us that moving genes from one organism to another will provide new drugs, extend both the quantity and quality of life, and feed and fuel the world while reducing water consumption and greenhouse gas emissions. Authority also informs us that novel genes will escape from genetically modified crops, thereby leading to herbicide-resistant weeds; that genetically modified crops are an evil privatization of the gene pool that will with certainty lead to the economic ruin of small farmers around the world; and that economic growth derived from biological technologies will cause more harm than good. In other words, we are told that biological technologies will provide benefits and will come with costs--with tales of both costs and benefits occasionally inflated--like every other technology humans have developed and deployed over all of recorded history.
And here are a couple of other selected bits:
Overall, in my opinion, the report is well considered. One must commend President Obama for showing leadership in so rapidly addressing what is seen in some quarters as a highly contentious issue. However, as noted by the commission itself, much of the hubbub is due to hype by both the press and certain parties interested in amplifying the importance of the Venter Institute's accomplishments. Certain scientists want to drive a stake into the heart of vitalism, and perhaps to undermine religious positions concerning the origin of life, while "civil society" groups stoke fears about Frankenstein and want a moratorium on research in synthetic biology. Notably, even when invited to comment by the commission, religious groups had little to say on the matter.
The commission avoided the trap of proscribing from on high the future course of a technology still emerging from the muck. Yet I cannot help the feeling that the report implicitly assumes that the technology can be guided or somehow controlled, as does most of the public discourse on synthetic biology. The broader history of technology, and of its regulation or restriction, suggests that directing its development would be no easy task.8 Often technologies that are encouraged and supported are also stunted, while technologies that face restriction or prohibition become widespread and indispensable.
...The commission's stance favors continued research in synthetic biology precisely because the threats of enormous societal and economic costs are vague and unsubstantiated. Moreover, there are practical implications of continued research that are critical to preparing for future challenges. The commission notes that "undue restriction may not only inhibit the distribution of new benefits, but it may also be counterproductive to security and safety by preventing researchers from developing effective safeguards."12 Continued pursuit of knowledge and capability is critical to our physical and economic security, an argument I have been attempting to inject into the conversation in Washington, D.C., for a decade. The commission firmly embraced a concept woven into the founding fabric of the United States. In the inaugural State of the Union Address in 1790, George Washington told Congress "there is nothing which can better deserve your patronage than the promotion of science and literature. Knowledge is in every country the surest basis of publick happiness."13
The pursuit of knowledge is every bit as important a foundation of the republic as explicit acknowledgment of the unalienable rights of life, liberty, and the pursuit of happiness. Science, literature, art, and technology have played obvious roles in the cultural, economic, and political development of the United States. More broadly, science and engineering are inextricably linked with human progress from a history of living in dirt, disease, and hunger to . . . today. One must of course acknowledge that today's world is imperfect; dirt, disease, and hunger remain part of the human experience. But these ills will always be part of the human experience. Overall, the pursuit of knowledge has vastly improved the human condition. Without scientific inquiry, technological development, and the economic incentive to refine innovations into useful and desirable products, we would still be scrabbling in the dirt, beset by countless diseases, often hungry, slowly losing our teeth.
There's more here.
8. R. Carlson, Biology Is Technology: The Promise, Peril, and New Business of Engineering Life (Cambridge, Mass.: Harvard University Press, 2010).
12. Presidential Commission for the Study of Bioethical Issues, New Directions, 5.
13. G. Washington, "The First State of the Union Address," January 8, 1790, http://ahp.gatech.edu/first_state_union_1790.html.
Here are a couple of reasons I have sorted out so far. It was the end of the beginning of synthetic biology. The meeting was full of science and engineering. And that's about all. There were a few VCs and other investors sniffing around, but not nearly so many as in previous years; those who did show up kept a lower profile. There were also fewer obvious government officials, no obvious spooks, no obvious law enforcement officers, nor any self-identified Weapons of Mass Destruction Coordinators. And I only encountered a couple of reporters, though there must have been more. I skipped 3.0 in Zurich, but at 1.0 at MIT, 2.0 at Berkeley (parts 1, 2, 3, 4, 5), and 4.0 in Hong Kong (part 1), there was much more buzz. Synthetic Biology 5.0 was much shorter on hype than prior gatherings.
There was substantially more data this year than previously. And there was substantially less modeling. All in all, Synthetic Biology is substantially more ... substantial. It was like a normal scientific meeting. About science. No stunts from "civil society" groups looking for their next fear bullet point for fundraising. No government officials proclaiming SB as the economic future of their city/state/country. Just science.
What a relief.
And that science was nothing to sneeze at. There were great talks for 3 days. Here are a couple of things that caught my eye.
Jef Boeke from Johns Hopkins presented his plans to build synthetic yeast chromosomes. I first heard this idea more than ten years ago from Ron Davis at Stanford, so it isn't brand new. I did notice, however, that Boeke is having all his synthetic chromosomes made in China. Over the longer term this means China is getting a boost in building out future biomanufacturing platforms. If the project works, that is.
As tweeted, Jack Newman from Amyris gave an update on commercialization of artemisinin; it should be on the market by the end of the year, which should be in time to help avert an expected shortfall in production from wormwood. Fantastic.
Pam Silver and her various students and post-docs showed off a variety of interesting results. First, Faisal Aldaye showed in vivo DNA scaffolds used to channel metabolic reactions, resulting in substantial increases in yield. Second, Pam Silver showed the use of those scaffolds to generate twice as much sucrose from hacked cyanobacteria per unit of biomass as from sugar cane. If that result holds up, and if the various issues related to the cost of bioreactors used to culture photosynthetic organisms are worked out, then Pam's lab has just made an enormous step forward in bringing about distributed biological manufacturing.
This is the sort of advance that makes me feel more sanguine about the future of Microbrewing the Bioeconomy. It will take some years before the volume of Amyris' Biofene, or Gevo's bio-PET, or Blue Marble's bio-butyric acid begins to impact the oil industry. But it is clear to me now as never before that the petroleum industry is vulnerable from the top of the barrel -- the high value, low volume compounds that are used to build the world around us in the form of petrochemicals. Biology can now be used to make all those compounds, too, directly from sugar, cellulose, and sunlight, without the tens of billions of dollars in capital required to run an oil company (see The New Biofactories).
So SB 5.0 was the end of the world as we know it. Synthetic biology is now just another field of human endeavor, thankfully producing results and also thankfully suffering reduced hype. I can see how the pieces are starting to fit together to provide for sustainable manufacturing and energy production, though it will be some years before biological technologies are used this way at scale. Perhaps this is less in-your-face exciting for the attendees, the press, and the public, and that may be part of the reason for my ambivalence. I fell asleep several times during the proceedings, which has never happened to me at SB X.0, even when overseas and jetlagged. I have never before thought of achieving boredom as constituting progress.
Browser warning: When I ran it, something about the combination of Flash and the slide viewer caused Safari to freeze; Firefox was just fine.
Now sitting in the audience, I've just heard Jim Thomas of ETC once again egregiously distort the Keasling-Amyris-malaria-artemisinin story. As usual he is quite well-spoken and reasonable sounding, and uses rhetoric well to his ends.
It may be true, as Thomas asserts, that switching artemisinin production to fermentation will harm the economic livelihood of "a few thousand" (his words) farmers in China and Africa. But he has left out of his calculation the 40% of the world's population that is at risk of malaria every year. He has left out the millions of children who die annually from malaria.
Quoting from my book (pg. 98 -- I've left out the references as I am liveblogging from the meeting):
The cost burden of the disease on individual families is highly regressive. The average cost per household for treating malaria may be in the range of only 3-7 percent of income, but total and indirect costs to poor households can amount to one-third of annual income. The disease also disproportionately affects the young. Approximately 90percent of those who are killed by the parasite are African children under the age of ﬁve; according to the World Health Organization (WHO), a child dies from malaria roughly every thirty seconds.So, Mr. Thomas, what about all the people who will benefit from inexpensive malaria drugs? It is, frankly, unconscionable and indefensible for you to continue beating this drum as you do. The human cost of not producing inexpensive artemisinin in vats is astronomical. If reducing the burden of malaria around the world on almost 2 billion people might harm "a few thousand" farmers, then we should make sure those farmers can make a living growing some other crop. We can solve both problems. Your ideological opposition to synthetic biology is is blinding you to the opportunities, and your version of reality would ignore the health and welfare of children around the world.
In addition to staggering personal costs, the disease harms whole societies by severely inhibiting economic development. In affected countries, malaria reduces GDP growth by about 1.3 percent per year. These countries, moreover, contain about 40percent of the world's population. Over the past forty years, the growth penalty has created a difference in GDP that substantially exceeds the billions in annual foreign aid they receive. In 2000 the World Health Organization estimated that eliminating this growth penalty in 1965 would have resulted in "up to $100 billion added to sub-Saharan Africa's  GDP of $300 billion. This extra $100 billion would be, by comparison, nearly ﬁve times greater than all development aid provided to Africa [in 1999]."
Because there was no technical means to eliminate the parasite in the middle of the twentieth century, this is clearly a number calculated to impress or shock, but the point is that the growth penalty continues to balloon. As of 2008, the GDPs of countries in sub-Saharan Africa would be approximately 35 percent higher than they are today had malaria been eliminated in 1965. The World Health Organization reckons that malaria-free countries have a per capita GDP on average three times larger than malarious countries. The productivity of farmers in malarious countries is cut by as much as 50 percent because of workdays lost to the disease. The impact of producing an effective and inexpensive antimalarial drug would therefore be profound.
Improving access to other technologies, such as bed nets treated with insecticides, would also substantially reduce the rate of infection. Yet infected victims will still need access to cures. Prevention might be found in a vaccine, which the Gates Foundation also funds. However, even the most promising malaria vaccine candidates are only partially effective and cost even more than artemisinin. Microbial production of artemisinin would completely change the impact of malaria on billions of people worldwide. Artemisinin is presently derived from the sweet wormwood plant and has been used as an herbal remedy for at least two thousand years. Its antimalarial activity was first described by Chinese scientists in 1971. The existence of the drug and its physiochemical properties were announced to the world in 1979, although its precise molecular mechanism of action is still not understood. A method for chemical synthesis was published in 1983, but it remains "long, arduous, and economically nonviable."
Because natural artemisinin is an agricultural product, it competes for arable land with food crops, is subject to seasonal variations in supply, and its production costs are determined in part by the costs of fertilizer and fuel. As a result of the work of Keasling and his collaborators, it appears that, within just a few years, biological technology may provide a more flexible and less expensive supply of drugs than now exists. Commercial production of artemisinin should commence in 2010, with a continuous annual production sufficient to treat the 500 million malaria cases per year.
How's that for rhetoric?
Update: One other thought. Just one year of 1.3% GDP growth recovered by reducing (eliminating?) the impact of malaria would more than offset paying wormwood farmers to grow something else. There is really no argument for doing anything else.
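To put some rough numbers behind that update, here is a minimal back-of-envelope sketch using the WHO figures quoted above (a ~1.3 percent annual growth penalty against a sub-Saharan GDP of roughly $300 billion). The number of farmers and the size of the transition payment are purely illustrative assumptions, not figures from any source:

```python
# Back-of-envelope check using the WHO figures quoted above.
# The farmer count and per-farmer payment are illustrative assumptions.

gdp_sub_saharan = 300e9   # sub-Saharan Africa GDP in USD (WHO's 2000 figure)
growth_penalty = 0.013    # annual GDP growth lost to malaria (~1.3%)

one_year_recovered = gdp_sub_saharan * growth_penalty
print(f"One year of recovered growth: ${one_year_recovered / 1e9:.1f} billion")

# Hypothetical compensation: pay each of "a few thousand" wormwood
# farmers a generous one-time transition payment.
farmers = 5_000               # assumed number of affected farmers
payment_per_farmer = 10_000   # assumed one-time payment, USD

total_compensation = farmers * payment_per_farmer
print(f"Illustrative compensation cost: ${total_compensation / 1e6:.0f} million")
print(f"Ratio: {one_year_recovered / total_compensation:.0f}x")
```

Even with deliberately generous assumptions on the compensation side, a single year of recovered growth dwarfs the cost of helping farmers switch crops by well over an order of magnitude.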
For a "Civil Society" organization, ETC is being decidedly uncivil on this issue.
The topic is Synthetic Biology. Here is the agenda.
Location: The Ritz-Carlton Washington, DC 1150 22nd Street, NW Washington, DC 20037
The nuts and bolts (or bases and methylases?) of the story are this: Gibson et al ordered a whole mess of pieces of relatively short, synthetic DNA from Blue Heron and stitched that DNA together into the full-length genome of Bug B, which they then transplanted into a related microbial species, Bug A. The transplanted genome B was shown to be fully functional and to change the species from old to new, from A to B. Cool.
Yet my general reaction to this is the same as it was the last time the Venter team claimed they were creating artificial life. (How many times can one make this claim?) The assembly and boot-up are really fantastic technical achievements. (If only we all had the reported $40 million to throw at a project like this.) But creating life, and even the claim of creating a "synthetic cell"? Meh.
(See my earlier posts, "Publication of the Venter Institute's synthetic bacterial chromosome", January 2008, and "Updated Longest Synthetic DNA Plot", December 2007.)
I am going to agree with my friends at The Economist (see main story) that the announcement is "not unexpected", and disagree strongly that "The announcement is momentous." DNA is DNA. We have known that for, oh, a long time now. Synthetic DNA that is biologically indistinguishable from "natural DNA" is, well, biologically indistinguishable from natural DNA. This result is at least thirty years old, dating to when synthetic DNA was first used to cause an organism to do something new. There are plenty of other people saying this in print, so I won't belabor the point; see, for example, the comments in the NYT article.
One less-than-interesting outcome of this paper is that we are once again going to read all about the death of vitalism (see the Nature opinion pieces). Here are the first two paragraphs from Chapter 4 of my book:
"I must tell you that I can prepare urea without requiring a kidney of an animal, either man or dog." With these words, in 1828 Friedrich Wöhler claimed he had irreversibly changed the world. In a letter to his former teacher Jöns Jacob Berzelius, Wöhler wrote that he had witnessed "the great tragedy of science, the slaying of a beautiful hypothesis by an ugly fact." The beautiful idea to which he referred was vitalism, the notion that organic matter, exemplified in this case by urea, was animated and created by a vital force and that it could not be synthesized from inorganic components. The ugly fact was a dish of urea crystals on his laboratory bench, produced by heating inorganic salts. Thus, many textbooks announce, was born the field of synthetic organic chemistry.

Care to guess where the nucleotides came from that went into the Gibson et al synthetic genome? Probably purified and reprocessed from sugarcane. Less probably, salmon sperm. In other words, the nucleotides came from living systems, and are thus tainted for those who care about such things. So much for another nail in the vitalist coffin.
As is often the case, however, events were somewhat more complicated than the textbook story. Wöhler had used salts prepared from tannery wastes, which adherents to vitalism claimed contaminated his reaction with a vital component. Wöhler's achievement took many years to permeate the mind-set of the day, and nearly two decades passed before a student of his, Hermann Kolbe, first used the word "synthesis" in a paper to describe a set of reactions that produced acetic acid from its inorganic elements.
Somewhat more intriguing will be the debate around whether it is the atoms in the genome that are interesting, or instead the information conveyed by the arrangement of those atoms that we should care about. Clearly, if nothing else, this paper demonstrates that the informational code determines species. This isn't really news to anyone who has thought about it (except, perhaps, to IP lawyers -- see my recent post on the breast cancer gene lawsuit), but it might get a broader range of people thinking more about life as information. What, then, does "creating life" mean? Creating information? Creating sequence? And what sort of design tools do we need to truly control these creations? Are we just talking about much better computer simulations, or is there more physics to learn, or is it all just too complicated? Will we be forever chasing away ghosts of vitalism?
That's all I have for deep meaning at the moment. I've only just gotten off one set of airplanes (New York-DC-LA) and have to get on another for Brazil in the morning.
I would, however, point out that the recent paper describes what may be a species-specific processing hack. From the paper:
...Initial attempts to extract the M. mycoides genome from yeast and transplant it into M. capricolum failed. We discovered that the donor and recipient mycoplasmas share a common restriction system. The donor genome was methylated in the native M. mycoides cells and was therefore protected against restriction during the transplantation from a native donor cell. However, the bacterial genomes grown in yeast are unmethylated and so are not protected from the single restriction system of the recipient cell. We were able to overcome this restriction barrier by methylating the donor DNA with purified methylases or crude M. mycoides or M. capricolum extracts, or by simply disrupting the recipient cell's restriction system.

This methylation trick will probably -- probably -- work just fine for other microbes, but I just want to point out that it isn't necessarily generalizable and that the JCVI team didn't demonstrate any such thing. The team got this one bug working, and who knows what surprises wait in store for the next team working on the next bug.
Since Gibson et al have in fact built an impressive bit of DNA, here is an updated "Longest Synthetic DNA Plot" (here is the previous version with refs.); alas, the one I published just a few months ago in Nature Biotech is already obsolete (hmph, they have evidently now stuck it behind a pay wall).
A couple of thoughts: As I noted in DNA Synthesis "Learning Curve": Thoughts on the Future of Building Genes and Organisms (July 2008), it isn't really clear to me that this game can go on for much longer. Once you hit a megabase (1,000,000 bases, or 1 Mb) in length, you are basically at a medium-long microbial genome. Another order of magnitude or so gets you to eukaryotic chromosomes, and why would anyone bother building a contiguous chunk of DNA longer than that? Eventually you get into all the same problems that the artificial chromosome community has been dealing with for decades -- namely, that chromatin structure is complex and nobody really knows how to build something like it from scratch. There is progress, yes, and as soon as we get a real mammalian artificial chromosome all sorts of interesting therapies should become possible (note to self: dig into the state of the art here -- it has been a few years since I looked into artificial chromosomes). But with the 1 Mb milestone I suspect people will begin to look elsewhere and the typical technology development S-curve will kick in. Maybe the curve has already started to roll over, as I predicted (sketched in, really) with the Learning Curve.
Finally, I have to point out that the ~1,000 genes in the synthetic genome are vastly more than anybody knows how to deal with in a design framework. I doubt very much that the JCVI team, or the team at Synthetic Genomics, will be using this or any other genome in any economically interesting bug any time soon. As I note in Chapter 8 of Biology is Technology, Jay Keasling's lab and the folks at Amyris are playing with only about 15 genes. And getting the isoprenoid pathway working (small by the Gibson et al standard but big by the everyone-else standard) took tens of person-years and about as much investment (roughly $50 million in total from the Gates Foundation and investors) as Venter spent on synthetic DNA alone. And then, is Synthetic Genomics going to start doing metabolic engineering in a microbe that they only just sequenced and about which relatively little is known (at least compared with E. coli, yeast, and other favorite lab animals)? Or are they going to redo this same genome synthesis project in a bug that is better understood and will serve as a platform or chassis? Either way, really? The company has hundreds of millions of dollars in the bank to spend on this sort of thing, but I simply don't understand what the present publication has to do with making any money.
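For a rough sense of where the money goes in a project like this, compare the raw cost of the DNA itself at commodity synthesis prices with the reported project spend. This is a back-of-envelope sketch: the per-base price is an assumption in the range of market quotes discussed elsewhere in these posts, and the genome length is rounded to a megabase:

```python
# Rough cost comparison: commodity gene synthesis vs. the reported
# project cost. Genome length and prices are approximations.

genome_length = 1_000_000      # ~1 Mb synthetic genome (rounded)
price_per_base = 0.39          # USD/base, assumed commodity market price

raw_synthesis_cost = genome_length * price_per_base
reported_project_cost = 40e6   # reported total project cost, USD

print(f"Raw synthesis at market price: ${raw_synthesis_cost:,.0f}")
print(f"Reported project cost:         ${reported_project_cost:,.0f}")
print(f"Overhead factor: ~{reported_project_cost / raw_synthesis_cost:.0f}x")
```

The point of the comparison is that the bases themselves are a rounding error; roughly two orders of magnitude of cost sits in assembly, verification, transplantation, and everything else around the synthesis, which is exactly where the design-framework problem bites.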
So, in summary: a very cool, big chunk of synthetic DNA being used to run a cell. Not artificial life, and neither an artificial cell nor a synthetic cell. Probably not going to show up in a product, or be used to make a product, for many years. If ever. Confusing from the standpoint of project management, profit, and economic viability.
But I rather hope somebody proves me wrong about that and surprises me soon with something large, synthetic, and valuable. That way lies truly world changing biological technologies.
Cambridge's James Brown gets the honor of introducing the Beeb's audience to synthetic biology, biobricks, and engineering methods for biological systems. The 3D-printed DremelFuge gets a photo and a significant mention. I explicitly pointed to this sort of application of 3D printing in my book, though it is happening even faster than I had imagined. Shapeways is now printing all sorts of interesting materials, though the resolution of most 3D printers and processes still doesn't make them useful for the sorts of objects I want to print. That said, there is clear improvement over time.
It will be interesting to see how long it takes before you can print mixed media functional objects, say something like a zero-dead volume, positive displacement membrane pump. Or better yet an entire pump block. (Which is usually milled from a piece of stainless steel -- see where this is going?) That gets you the most annoying bit of kit needed for a DNA synthesizer. At which point you can forget any regulations limiting access to DNA of any sequence.
What should we make of this?
First, Life is led by Gregory Lucier, who used to be way high up at GE and is a former protégé of Jack Welch. In my observation, and in my experience, Life is trying to be the GE of biology. What does that mean? GE is obviously a conglomerate, and it operates not so much as a maker and seller of things but as a finance operation that seeks growth through return on capital. As such, GE buys other companies aggressively -- this is, vastly oversimplified, the Jack Welch strategy. Life is operating the same way. The company is aggressively acquiring biotech companies of all sizes. The web was full of rumors earlier in 2010 that GE, seeing something it liked and a familiar strategy, was trying to buy Life Tech. Who knows -- that may be real and may still happen. It would be another interesting indication of a certain kind of maturity in the market for biological technologies.
Second, Life obviously sells lots of cloning reagents -- a market that is threatened by synthesis -- so the move could be somewhat defensive in nature. Life is getting reputation, market share, and expertise in an area that they do not yet dominate.
Third, while GeneART is big, they are a European shop paying German wages to a bunch of people running around with plates and pipettors. GeneART gets some cash and a big marketing arm, and Life gets ... hummm ... an operation that may have difficulty competing with Chinese labor (GenScript) and automation (Blue Heron). Presumably, Life looked at the balance sheet and the marketing forecasts and decided the deal makes sense. But it might be a complex calculation involving not just return on capital, but also access to IP, expertise, and factors that nobody outside Life can do more than guess at, like balancing sales of cloning reagents against sales of synthetic genes.
Now, what might be the implications for the synthetic biology community? Probably not much. Prices for synthetic DNA continue to fall. The $.39 per base price established last autumn as a "special" is now, no surprise, the industry standard. We will probably see additional consolidation and shifting around as margins get squeezed. The industry is expecting prices of $.05 to $.15 per base within five years -- though even within the same conversation you might hear $.10 to $.25 per base, thereby managing consumer expectations, which makes me wonder if people are starting to quail a bit at the exponential and its implications for their business. You will still have the option to pay more for rush jobs or for genes that are tricky to synthesize.
As I have observed previously (most recently in Nature Biotechnology, here), the maximum profit margin on synthetic genes is evaporating exponentially. That is not hyperbole, but rather a quantitative observation based on market prices over more than ten years; it is data. That said, even as prices fall it will still be possible for some companies to increase their revenues as competitors leave the market or go out of business. But I would be surprised if the market dynamics that enabled Intel to exploit Moore's Law for many decades reemerged in synthetic genes. Intel knew it could ship exponentially more transistors every quarter -- which meant it could rapidly grow even in the face of falling prices -- but I do not have any evidence that the total market for synthetic genes is expanding much faster than the price is falling. Conversations with industry executives lead me to believe the total dollar value in the market is continuing to rise, if somewhat slowly. The rate of increase is hard to pin down, however, given the hiccup that was 2009. This year's volume and revenues should be bigger, but it isn't clear that one should attribute this to more than the broader economic recovery.
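The industry expectations quoted above imply a specific annual rate of price decline. A minimal sketch, taking the ~$.39 per base figure and the five-year projections at face value and assuming a constant compounding rate:

```python
# Implied annual rate of price decline for synthetic genes, using the
# figures quoted above: ~$0.39/base today, $0.05-$0.15/base in five years.
# Assumes a constant compounding rate over the whole period.

def annual_decline(p_now, p_future, years):
    """Constant annual rate r such that p_now * (1 - r)**years == p_future."""
    return 1 - (p_future / p_now) ** (1 / years)

p_now, years = 0.39, 5
for p_future in (0.05, 0.10, 0.15):
    r = annual_decline(p_now, p_future, years)
    print(f"${p_now:.2f} -> ${p_future:.2f}/base implies ~{r:.0%} decline per year")
```

Those projections work out to prices falling roughly 17 to 34 percent per year. For total revenues merely to hold steady, unit volume across the industry would have to grow at least that fast, every year, which is precisely the Intel-style dynamic I see no evidence for.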
All in all, this seems like business as usual for an industry that is experiencing a rapid transition to commodity status while simultaneously suffering from globalization and lowered barriers to entry. It probably isn't so different in overall impact from the demise of Codon Devices. This is just another step towards maturity in an area that will have much more impact on our lives in the future than it has thus far.