Re-Inventing The Food Chain (or "On Food Prices, In Vitro Meat, and GM Livestock Feed")

Given the price of grain, and a dislike of genetically modified crops, Europe might soon be eating meat grown in a vat.  Stay with me:

The press is full of noise about the price of food.  Whatever the real impact of biofuels production on food prices -- which is probably very hard to pin down quantitatively -- the grain surplus we have enjoyed for decades is now over and demand exceeds supply.  This condition is probably permanent, and in order to keep the economy running we need to figure out how to get more production out of limited arable land.  This in turn raises the issue of improving yields and overall harvest through the use of genetically modified crops.  GM crops are widely grown and consumed in the Americas, but have met with governmental and consumer resistance elsewhere.

The general embrace by U.S. farmers of GM crops, and contemporaneous rejection of those crops by European consumers, produces interesting complexities within markets.  While the European region is presently a net food exporter, much of the feed for European livestock and poultry comes from the Americas.  Yet the strict safety testing and labeling requirements for food or feed containing GM plants amounts to a European zero-tolerance policy for importation of GM products.  While GM sugar beets and some varieties of GM corn may be officially approved for sale in Europe, consumers appear to avoid products with the GM label.

This policy has fascinating secondary consequences, namely that it is on track to force dramatic reductions in European livestock production as GM grain makes up an ever larger fraction of the available feed.  In an article in the October 2007 issue of Nature Biotechnology, "Europe's anti-GM stance to presage animal feed shortage?" (the full text of which you can find online here, PDF here), Peter Mitchell writes:

...If a solution isn't found, European farmers will be forced into wholesale slaughter of their livestock rather than have the animals starve. Europe will then have to import huge quantities of animal products from elsewhere—ironically, most of it from animals raised on the very same GM feeds that Europe has not approved.

Mitchell cites a report from the European Commission that production of meat could fall by between 1 and 44 percent over the next two years, depending on actual supplies of non-GM feed.  Changes in attitude that produce a marketing environment friendlier to GM products may alleviate this problem.  Yet consumer resistance to GM products in Europe is both deep and broad.  Even in the face of economic hardship, brought on by reduced food exports and increased domestic prices, consumers and interest groups may take many years to change their minds.

The New York Times reports that pressure is growing in Europe to change policies on GM crops.  According to an article by Andrew Pollack in the 21 April NYT:

In Britain, the National Beef Association, which represents cattle farmers, issued a statement this month demanding that “all resistance” to such crops “be abandoned immediately in response to shifts in world demand for food, the growing danger of global food shortages and the prospect of declining domestic animal production.”

Despite these pressures, Pollack writes that, "Since the beginning of the year France has banned the planting of genetically modified corn while Germany has enacted a law allowing for foods to be labeled as “G.M. free.”"

So, in a world with declining GM-free feedstocks, where is Europe going to get GM-free meat?  The science fiction vision of meat grown in vats could be economically relevant sooner than one might think.

Earlier this month at Wired News, Alexis Madrigal wrote about the recent In Vitro Meat Symposium in Norway.  A report presented there claimed, "Meat grown in giant tanks known as bioreactors would cost between $5,200-$5,500 a ton (3,300 to 3,500 euros)" -- more or less competitive with current European beef prices.  Madrigal reports that according to Jason Matheny at Johns Hopkins University, "The general consensus is that minced meat or ground meat products -- sausage, chicken nuggets, hamburgers -- those are within technical reach.  We have the technology to make those things at scale with existing technology."  Matheny is the founder of New Harvest, a non-profit working on producing meat substitutes.
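For a rough sense of scale, the quoted figures convert to per-kilogram prices as follows (a minimal Python sketch; it assumes "a ton" means a metric ton, which the report does not specify):

```python
# Convert the quoted bioreactor-meat prices from per-ton to per-kilogram.
# Assumption: "ton" means a metric ton (1,000 kg); the report does not say.

def per_kg(price_per_ton, kg_per_ton=1000.0):
    """Convert a price per ton into a price per kilogram."""
    return price_per_ton / kg_per_ton

for low, high, currency in [(5200, 5500, "USD"), (3300, 3500, "EUR")]:
    print(f"{currency}: {per_kg(low):.2f} to {per_kg(high):.2f} per kg")

# Prints:
# USD: 5.20 to 5.50 per kg
# EUR: 3.30 to 3.50 per kg
```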

Madrigal's story carried a skeptical tone, and suggested that commercialized in vitro meat was probably many years away.  I have been wondering whether the market would, um, serve up cultured meat sooner than that, and this week brought an interesting surprise.

People for the Ethical Treatment of Animals (PETA) just announced a US$ 1 million prize for, "The first person to come up with a method to produce commercially viable quantities of in vitro meat at competitive prices."  It may not be long before PETA writes that check.

If it is already possible to produce "meat-like products" at prices competitive with those in Europe today, then continued increases in the price of products grown on the hoof or claw should make in vitro meat even more attractive economically.  The feedstocks for meat cells grown in culture would be fairly basic, just sugars and amino acids, and possibly some lipids.  These in turn can be produced by plants, yeast, and bacteria.  In principle, a co-culture of non-GM animal cells and hacked/engineered microbes that serve as feeder cells could provide a fairly high-efficiency conversion of sunlight to meat.  That might not pass muster in Europe, but it would probably sell like hotcakes, er, Big Macs, in other countries.  And how could you tell the difference?

Pushing further down this road, how long will it be before an iGEM team produces a circuit that facilitates the differentiation and culture of stem cells from fowl, fish, and mammals to produce a better burger?  I suppose an intermediary step is to hack filamentous E. coli so that it grows to have the texture of muscle tissue for minced meat.  Those clever undergrads have already made coli smell like bananas and mint, so why not add a few more metabolic products: "Yum! Tastes just like chicken!"  Or lamb.  Or yak.  Yuck. You could even enjoy a nice "coliburger" ("bactoburger"?) that intentionally contained bacteria and not have to worry about kidney and liver damage.  (Oh, yes, this is waaaay more fun than finishing the last chapter of my book.)

Just as long as our in vitro meat isn't actually made of people (see 1:34:29).  Can't wait for the t-shirt.

Catching up on the News - Big Biotech Investment by India

Now that my book is nearing completion, I can start looking around a bit more.  I missed this last November -- according to a news brief in Nature, India is planning to seriously boost its expenditures on biotech:

For the first time, India has appointed a biologist as head of its largest research agency. The announcement coincides with the unveiling of a national strategy for biotechnology, supported by a 65-billion-rupee (US$1.6-billion) commitment over the next 5 years.

Samir Brahmachari, former director of the Institute of Genomics and Integrative Biology in Delhi, is the new chief of the Council of Scientific and Industrial Research (CSIR). The CSIR manages 41 labs with a staff of more than 18,000 scientists and has been without a permanent director since December 2006.

The strategy approved by the cabinet on 14 November calls for one-third of the government's research budget to be spent on biotechnology — a 450% increase over the previous 5 years — in partnership with private-sector funding. The plan will create 50 biotech 'centres of excellence' by 2012.


"Laying the foundations for a bio-economy"

My new commentary, "Laying the foundations for a bio-economy", will be appearing in an upcoming issue of Systems and Synthetic Biology.  The piece is freely available online as both text and PDF.  Thanks to Springer for supporting the Open Access option.  Here are the abstract, the first two paragraphs, and the last two paragraphs:

Abstract  Biological technologies are becoming an important part of the economy. Biotechnology already contributes at least 1% of US GDP, with revenues growing as much as 20% annually. The introduction of composable biological parts will enable an engineering discipline similar to the ones that resulted in modern aviation and information technology. As the sophistication of biological engineering increases, it will provide new goods and services at lower costs and higher efficiencies. Broad access to foundational engineering technologies is seen by some as a threat to physical and economic security. However, regulation of access will serve to suppress the innovation required to produce new vaccines and other countermeasures as well as limiting general economic growth.


Welcome to the Paleobiotic Age. Just as today we look back somewhat wistfully on our quaint Paleolithic--literally "old stone"--ancestors, so will our descendants see the present age as that of "old biology", inhabited by Paleobiotic Man. The technologies we use to manipulate biological systems are experiencing dramatic improvement, and as a result are driving change throughout human economies.       

In order to understand the impact of our growing economic dependence on biological technologies it is worth taking a moment to consider the meaning of economy. "Economy" is variously thought of as "the management of the resources of a country, especially with a view to its productivity" and "the disposition or regulation of the parts or functions of any organic whole; an organized system or method."  Amid a constantly increasing demand for resources, we look to technology to improve the productivity of labor, to improve the efficiency of industrial processes and energy production, and to improve the yield of agriculture. Very tritely, we look to technological innovation within our economy to provide more stuff at lower cost. Biological technologies are increasingly playing that role.

...

In this, the Paleobiotic Age, our society is only just beginning to struggle with all the social and technical questions that arise from a fundamental transformation of the economy. History holds many lessons for those of us involved in creating new tools and new organisms and in trying to safely integrate these new technologies into an already complex socio-economic system. Alas, history also fails to provide examples of any technological system as powerful as rational engineering of biology. We have precious little guidance concerning how our socio-economic system might be changed in the Neobiotic Age to come. We can only attempt to minimize our mistakes and rapidly correct those we and others do make.

The coming bio-economy will be based on fundamentally less expensive and more distributed technologies than those that shaped the course of the 20th Century. Our choices about how to structure the system around biological technologies will determine the pace and effectiveness of innovation. As with the rest of the natural and human built world, the development of this system is decidedly in human hands. To paraphrase Stewart Brand: We are as engineers, and we'd better get good at it in a hurry.          

Massive Technology Failure

If anybody reading this has emailed me within the last 14 days, I haven't received it.  After 5 years, and many, many miles, my trusty 12" Powerbook had a serious hardware failure a couple of nights ago.  Took all my email with it.  Then, when I was trying to rescue some data, the failing Powerbook took out my backup drive.  Ouch.  I seem to have a talent for finding really good technology problems...

I have normal email access now, but if you don't receive a reply from me on something sent before this weekend, please send again.

And remember, kids, back up, back up, and back up again.

Publication of the Venter Institute's synthetic bacterial chromosome

Craig Venter and his crew have just published a paper in Science demonstrating synthesis of a complete bacterial chromosome.  Venter let the cat out of the bag late last year in an interview with The Guardian, which I wrote about a few weeks ago, here: "Updated Longest Synthetic DNA Plot".

As a technical achievement, the paper, by Gibson, et al., is actually quite nice.  The authors ordered ~5kB gene cassettes from Blue Heron, DNA 2.0, and GENEART, and then used a parallel method to assemble those cassettes into the ~580kB full genome in just a few steps.  They contrast their method, which may be generalizable to any sequence, with previous research:

All [the previous] methods used sequential stepwise addition of segments to reconstruct a donor genome within a recipient bacterium. The sequential nature of these constructions makes such methods slower than the purely hierarchical scheme that we employed.

The Itaya and Holt groups found that the bacterial recipient strains were unable to tolerate some portions of the donor genome to be cloned, for example ribosomal RNA operons. In contrast, we found that the M. genitalium ribosomal RNA genes could be stably cloned in E. coli BACs. We were able to clone the entire M. genitalium genome, and also to assemble the four quarter genomes in a single step, using yeast as a recipient host. However, we do not yet know how generally useful yeast will be as a recipient for bacterial genome sequences.
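To see why the hierarchical scheme matters, here is a back-of-the-envelope comparison of sequential versus pairwise assembly (a toy Python sketch; the cassette count below is just the genome size divided by the cassette size, not the actual number of pieces used in the paper):

```python
import math

# Toy comparison: adding one ~5 kB cassette at a time versus joining pieces
# pairwise in parallel rounds. Illustrative only; the real protocol used
# uneven pool sizes and several different cloning hosts.

genome_size = 580_000    # ~580 kB target
cassette_size = 5_000    # ~5 kB synthetic cassettes

n_cassettes = genome_size // cassette_size               # ~116 pieces (rough)
sequential_additions = n_cassettes - 1                   # one join per new cassette
hierarchical_rounds = math.ceil(math.log2(n_cassettes))  # pairwise joins per round

print(f"cassettes:            {n_cassettes}")
print(f"sequential additions: {sequential_additions}")
print(f"hierarchical rounds:  {hierarchical_rounds}")

# cassettes:            116
# sequential additions: 115
# hierarchical rounds:  7
```

The point is simply that pairwise joins can run in parallel, so the number of assembly rounds grows with the logarithm of the number of pieces rather than linearly.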

The team was evidently unable to successfully use the synthetic chromosome to boot up a new organism.  It turns out that one of the techniques they developed in fact gets in the way of finishing this final step.  There is an interesting note, added in proof, at the end of the paper:

While this paper was in press, we realized that the TARBAC vector in our sMgTARBAC37 clone interrupts the gene for the RNA subunit of RNase P (rnpB). This confirms our speculation that the vector might not be at a suitable site for subsequent transplantation experiments.

So, Gibson, et al., made really interesting technical progress in developing a method to assemble large, seemingly arbitrary sequences.  However, their goal of booting up a synthetic chromosome using the assembly technique is presently stymied by one of the technologies they are relying on to propagate the large construct in yeast.  As for the goal of "synthetic life" as defined by constructing a working genome from raw materials, they are close, but not quite there.  Given the many different ways of manipulating large pieces of DNA within microbes, it won't be long until the Venter Institute team gets there.

Andrew Pollack of the NYT quotes Venter as saying, “What we are doing with the synthetic chromosome is going to be the design process of the future."  This is a bit of a stretch, because no one in their right mind is going to synthesize an entire microbial genome for a real engineering project, with real costs, anytime soon.  Any design process that involves writing whole genomes is going to be WAY in the future.  As I wrote in the "Longest Synthetic DNA" post:

The more interesting numbers are, say, 10-50 genes and 10,000-50,000 bases.  This is the size of a genetic program or circuit that will have interesting economic value for many decades to come.  But while assembling synthetic constructs (plasmids) this size is still not trivial, it is definitely old news.  The question is how will the cost for constructs of this size fall, and when can I have that DNA in days or hours instead of weeks?  And how soon before I can have a desktop box that prints synthetic DNA of this length?  As I have previously noted in this space, there is clear demand for this sort of box, which means that it will happen sooner or later.  Probably sooner.

The Gibson, et al., Science paper doesn't say how many person-hours the project took, nor does it say exactly how much they spent on their synthetic construct (presumably they got a nice volume discount).  The fact that the project isn't actually finished demonstrates that this is hardly yet a practical engineering capability that will find a role in the economy anytime soon.

That said, I could certainly be wrong about this assertion, particularly if other technical approaches crop up, as may well happen.  In the NYT story Venter is quoted as saying that, "I will be equally surprised and disappointed if we can’t do it in 2008.”  And they probably will, but what is the real impact of that success? 

The NYT story, by Andrew Pollack, carries the unfortunate title, "Scientists Take New Step Toward Man-Made Life".  Not so much.  Even if Venter and colleagues do get their chromosome working, they will have demonstrated not "man-made" life, but rather a synthetic instruction set running in a pre-existing soup of proteins and metabolites in a pre-existing cell.  It's really no different than getting a synthetic viral genome working in cell culture, which is old news.  Show me a bacterial cell, or something else obviously alive, from an updated Miller-Urey experiment and then I will be really impressed.  Thus the Gibson paper represents a nice technical advance, and a good recipe for doing more science, but not much in the way of a philosophical earthquake.

Without the ability to easily -- very easily -- print genomes and get them into host cells at high efficiency and low cost, building synthetic genomes will remain just interesting science.

The New York Times gets a story title backwards

The story itself is right on the money, mind you -- I highly recommend reading it -- but the title, "An Oil Quandary: Costly Fuel Means Costly Calories", is bass-ackwards.  That title, probably courtesy of an editor rather than the reporters, would be accurate for ethanol but puts the effect before the cause for vegetable oil-based biodiesel.

Indeed, the story is the same as the one Bio-era has been telling for the last year.  "Chomp! Chomp! Fueling a new agribusiness", written (mostly by Jim Newcomb) for CLSA, nailed all the trends early on: rising income, rising meat consumption, grain use for food and feed, water supply issues, carbon emissions, and government mandates for biofuel use.  It all adds up to a big mess, for the time being.

As I wrote last year while in Hong Kong (see "Asia Biofuels Travelblog, Pt. 2"), after having just been on the ground in Malaysia and Singapore, food use has driven the price of palm and other vegetable oils well above the wholesale price for finished petrodiesel.  Planting more oil palms, even if done on land that has already been cleared (i.e., not on virgin jungle or on drained peat bogs), is unlikely to ease price pressures because demand is climbing much faster than supply can keep up (see the "Travelblog" post for some rough numbers).  In other words, there is plenty of price pressure to keep cutting down forests and draining peat bogs, carbon emissions be damned.  Prices are probably going to stay high for quite a while.

As the NYT story notes, biodiesel refineries are sitting idle all over the place because the feedstock is way too expensive to turn into fuel.  Far better, and more profitable, to eat it.  The heart of the matter is that, as the Times says, "Huge demand for biofuels has created tension between using land to produce fuel and using it for food."  Arable land is the key issue, and the only way the ongoing collision between food and fuel is going to be resolved is by using non-food feedstocks to make fuel, growing those feedstocks on land that cannot be used to produce food at market prices, and producing biofuels using new technologies.  Synthetic biology, various grasses, and sugar from Brazil seem to be the way to go (see my earlier posts "The Need for Fuels Produced Using Synthetic Biology" and "The Intersection of Biofuels and Synthetic Biology").  Hmmm...I still need to post something about switchgrass, miscanthus, and prairies.  Maybe next week.

I'm headed to Houston on Monday for a Roundtable on biofuels run by Bio-era, "Biotech Biofuels & the Future of the Oil Industry".  Companies in the oil industry, agbiotech, and synthetic biology will all be there.  Should be interesting.

High Yield Ethanol Fermentation from Synthesis Gas

The New York Times is reporting that GM has directly invested in a waste-to-ethanol company in order to help supply biofuels.  Coskata (another Khosla-funded company) has a proprietary combined industrial-biological process for using synthesis gas (CO and H2) to produce ethanol.  Here is the NYT story, by Matthew Wald.

This announcement is interesting to me for several reasons.  First, it turns out I was told all about the Coskata process late last year (though not the GM investment), but I was so busy I didn't tune in sufficiently and so completely missed the significance.  Oops.

Second, in about 2002, I suggested to GM's upper management that they should start thinking of themselves as a "transportation solutions" company rather than just a company that sells cars, and that they invest in providing alternative fuels to ensure that their advanced technology cars would have something to burn. (As the NDA has long since expired, I will connect the dots and point interested readers to an earlier post of mine on producing hydrogen from waste.)  Think W. Edwards Deming and buggy whip manufacturers -- over the next two decades, selling cars by itself is rapidly going to become a losing business model in developed countries as manufacturing practices change and as carbon becomes a bigger issue.  I don't claim that my suggestion five years ago is what got GM started down this road, but I am certainly interested to see that they have made the decision.

The NYT story quotes a number of people commenting on GM's investment, and I think this is the most interesting one, because it is so wrong:

“I don’t really see the logic of it,” said Christopher Flavin, president of the Worldwatch Institute, a Washington environmental group. “It’s not particularly an industry they know well, or have expertise in.” Companies like G.M., he said, could be more effective by concentrating on the fuel efficiency of their products.

GM is now facing enormous pressure to reduce the carbon emissions from its vehicles, in part by increasing fuel efficiencies.  But that isn't the whole story.  Carbon emissions can fall much faster by switching to new fuels, but the extra cost that goes into building engines able to burn those fuels is wasted without access to the fuel.  My earlier suggestion to GM was in the context of using hydrogen as that fuel, but the argument is the same for any other fuel.  Without a sufficient supply of the fuel, why would anyone bother to pay extra for a vehicle that could have lower emissions if only the fuel were available? 

The Coskata website is rather thin on details, but basically they describe a microbe that can convert CO and H2 to ethanol on the fly.  I am absolutely certain the NDA covering the conversation in which I learned about this is still in effect, which limits my ability to say more than what has been published elsewhere.
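What has been published about syngas fermentation more generally is the textbook acetogen chemistry, so here is a rough sketch of the yield that stoichiometry implies (the reactions below are the generic ones from the literature, an assumption on my part, and not a description of Coskata's proprietary process):

```python
# Generic acetogen stoichiometry for converting synthesis gas to ethanol
# (textbook reactions, not Coskata's numbers):
#   6 CO + 3 H2O -> C2H5OH + 4 CO2
#   6 H2 + 2 CO2 -> C2H5OH + 3 H2O

MW_CO = 28.01     # g/mol, carbon monoxide
MW_ETOH = 46.07   # g/mol, ethanol

# Maximum mass yield of ethanol per gram of CO, using the first reaction
ethanol_per_co = MW_ETOH / (6 * MW_CO)
print(f"~{ethanol_per_co:.2f} g ethanol per g CO at stoichiometric yield")

# Prints: ~0.27 g ethanol per g CO at stoichiometric yield
```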

What I can say is that, if the technology proves to be as efficient and versatile as is claimed, this strategy makes a great deal of sense.  From the NYT story:

If it can be done economically, the Coskata process has three large advantages over corn-based ethanol, according to General Motors. First, it uses a cheaper feedstock that would not compete with food production. Second, the feedstock is available all over the country, a crucial point since ethanol cannot be shipped from the corn belt to areas of high gasoline demand in existing pipelines.

As I have written in this space many times (see, for example, "The Need for Fuels Produced Using Synthetic Biology"), getting away from competition with food is the most important next step in increasing biofuel production.  Diversifying feedstocks to include waste products is critical.

Finally, it is interesting to speculate about the possibility of combining Coskata's synthesis-gas-eating microbe with the non-fermentative biofuel synthesis I wrote about last week.  Fermentation produces lots of stuff besides ethanol, and ethanol is toxic to most microbes above minimal concentrations.  Besides, ethanol sucks as a biofuel.  So if you could patch the biosynthesis technology that Gevo (another Khosla-funded company, hmmm...) just licensed from UCLA into a bug that eats synthesis gas, you would have a generalized method for taking any organic trash and converting it via synthesis gas into many useful materials, starting with fuels.  Put it all together and what do you get?

Say it all together now: "Distributed Biological Manufacturing" (PDF).

High yield biofuels production using engineered "non-fermentative" pathways in microbes.

A paper in last week's Nature demonstrated a combination of genetic modifications that allowed E. coli to produce isobutanol from glucose at 86% of the theoretical maximum yield.  Please people, slow down!  How am I supposed to finish writing my book if you keep innovating at this rate?

I jest, of course.  Mostly.

Atsumi, et al., exploit non-fermentative synthesis to maximize the production of molecules that could be used as biofuels, while minimizing parasitic side reactions that serve to "distract" their microbial work horse (here is the abstract in Nature).  The authors deleted 7 native genes, added several more from yeast and other microbes, and also added a plasmid containing what looks like another 6 or so genes and regulatory elements.  The plasmid was used to overexpress genes in a native E. coli synthesis pathway.  So call it ~15 total changes.
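To put that 86% figure in mass terms, here is a quick back-of-the-envelope calculation (the one-isobutanol-per-glucose stoichiometry is the standard redox-balanced assumption for this pathway, not a number taken from the paper itself):

```python
# Theoretical maximum yield of isobutanol from glucose, assuming the
# redox-balanced stoichiometry C6H12O6 -> C4H10O + 2 CO2 + H2O
# (one isobutanol per glucose).

MW_GLUCOSE = 180.16      # g/mol
MW_ISOBUTANOL = 74.12    # g/mol

theoretical_max = MW_ISOBUTANOL / MW_GLUCOSE   # g isobutanol per g glucose
reported_fraction = 0.86                       # fraction of maximum, per the paper

print(f"theoretical maximum: {theoretical_max:.2f} g/g glucose")
print(f"86% of maximum:      {reported_fraction * theoretical_max:.2f} g/g glucose")

# Prints:
# theoretical maximum: 0.41 g/g glucose
# 86% of maximum:      0.35 g/g glucose
```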

While the various genetic changes were made using traditional cloning techniques, rather than by synthesis, I would still put this project squarely in the category of synthetic biology.  True, there is no evident quantitative modeling, but it is still a great story.  I am impressed by the flavor of the article, which makes it sound like the project was cooked up by staring at a map of biochemical pathways (here is a good one at ExPASy -- you can click on the map for expanded views) and saying, "Hmmm... if we rewired this bit over here, and deleted that bit over there, and then brought in another bit from this other bug, then we might have something."  Molecular Legos, in other words.

As far as utility in the economy goes, the general method of engineering a biosynthesis pathway to produce fuels has, according to the press release from UCLA, been licensed to Gevo.  Gevo was founded by Frances Arnold, Matthew Peters, and Peter Meinhold of the California Institute of Technology and was originally funded by Vinod Khosla.

It is not clear how much of the new technology can be successfully claimed in a patent.  DuPont had an application published last spring that claims bugs engineered to produce fuels via the Ehrlich pathway, and it appears to be very similar to what is in the Atsumi paper described above.  Here is the DuPont application at the USPTO, oddly entitled "Fermentive production of four carbon alcohols".  The "four-carbon" bit might be the out for the UCLA team and Gevo, as they demonstrate ways to build molecules with four and more carbons.  Time, and litigation, will tell who has the better claims.  And then both groups probably have to worry about patents held by Amyris, which is probably also claiming the use of engineered metabolic synthesis for biofuels.  Ah, the joys of highly competitive capitalism.  But, really, it is all good news because all the parties above are trying to move rapidly beyond ethanol.

I am no fan of ethanol as a biofuel, as it has substantially lower energy density than gasoline and soaks up water even better than a sponge.  If ethanol were the only biofuel around, then I suppose we would have to settle for it despite the disadvantages.  But, obviously, new technologies are rapidly being demonstrated that produce other, better, biofuels.  The Atsumi paper serves as yet more evidence that biological technologies will prove a substantial resource in weaning ourselves from fossil fuels (see  my earlier posts "The Need for Fuels Produced Using Synthetic Biology" and "The Intersection of Biofuels and Synthetic Biology").

New method for "bottom-up genome assembly"

Itaya, et al., have published a new method for assembling ~5kB DNA fragments into genome-sized pieces in this month's Nature Methods (PubMed).  Jason Kelly has launched a blog, Free Genes, where he describes the new method.  Welcome to the blogosphere, Jason.

I won't add anything to Jason's post, other than to note that because Itaya's method exploits a recombination mechanism present in a microbe, there is no need to manipulate large pieces of DNA "by hand".  This is a significant advantage over methods that require lots of pipetting between PCR steps, which exposes the growing DNA to fluid shear.  The reliance upon natural mechanisms for assembly might mean the method is better suited to the garage than something that uses fluid transfer.

Finally, building ~5kB segments doesn't appear to be such a big deal at this point.  While Itaya's method isn't completely general, and as described may be a bit slow, it should be widely useful to anyone who has an in-house method for making gene-sized pieces of DNA and who doesn't want to pay a foundry to assemble even larger pieces.

(Update: Oops.  I forgot to add that this sort of thing is just what I suggested in my previous post, when I observed that while Venter may have made excellent progress in building an artificial chromosome he certainly doesn't have a lock on building new organisms.)

Updated "Longest Synthetic DNA" Plot

[Figure: longest published synthetic DNA, updated November 2007]

With the reported completion of a 580 kB piece of DNA by Venter and colleagues, it is time to update another metric of progress in biological technologies.  Assuming the report is true, it provides evidence that the technological ability to assemble large pieces of DNA from the short oligonucleotides produced by DNA synthesizers is keeping up with the productivity enhancements enabled by those synthesizers (see my prior post "Updated, Um, Carlson Curve for DNA Synthesis Productivity").  That said, this is an accomplishment of art and science, not of commerce and engineering.  The methods are esoteric, and neither widespread nor yet cheap enough to become so.

The news report itself is a couple of months old now.  It has yet to be confirmed by scientific publication of results, so I am breaking my habit of waiting until I can see the details of the paper before including another point on the plot.  Perhaps I just need something to do as a break from writing my book.

In any event, in the 6 October, 2007 edition of The Guardian, Ed Pilkington reported, "I am creating artificial life, declares US gene pioneer":

The Guardian can reveal that a team of 20 top scientists assembled by Mr Venter, led by the Nobel laureate Hamilton Smith, has already constructed a synthetic chromosome, a feat of virtuoso bio-engineering never previously achieved. Using lab-made chemicals, they have painstakingly stitched together a chromosome that is 381 genes long and contains 580,000 base pairs of genetic code.

It does not appear, from Mr. Pilkington's story, that Venter et al have yet inserted this mammoth piece of DNA into a cell.  Craig Venter is, however, supposedly "100% confident" they can accomplish this, and as a result will boot up a wholly artificial genome running a semi-artificial organism: "The new life form will depend for its ability to replicate itself and metabolise on the molecular machinery of the cell into which it has been injected, and in that sense it will not be a wholly synthetic life form."

The Guardian story includes a comment from the dependably well-spoken Pat Mooney, director of the ETC Group.  Says Mooney,  "Governments, and society in general, is way behind the ball. This is a wake-up call - what does it mean to create new life forms in a test-tube?"

Here is an open letter to Mr. Mooney:

Dear Pat,

It doesn't mean a damn thing.  Except that it helps you raise more money by scaring more people unnecessarily, so that you can go on to scare yet more people.  Have fun with that.

Best Regards,

Rob Carlson

PS Great business model. 

I just can't get really excited about 580 kB of synthetic DNA.  First, while interesting technically, the result is entirely expected.  People keep saying to me that it is really hard to manipulate large pieces of DNA in the lab, and to this I say many things we do are really hard.  Besides, nature has been manipulating large pieces of DNA very successfully for a while now.  Say, three billion years, give or take.  It was inevitable we would learn how to do it. 

Second, I know of a few individuals who are concerned that, because there is insufficient funding for this sort of work, Venter and his crew will now have some sort of lock on the IP for building new organisms.  But it is so very early in this technological game that putting money on the first demonstrated methodology is just silly.  Someone else, probably many different someones, will soon demonstrate alternatives.  Besides, how many times are we going to need to assemble 580,000 bases and 381 genes from scratch?  The capability isn't really that useful, and I don't see that it will become useful anytime soon.

The more interesting numbers are, say, 10-50 genes and 10,000-50,000 bases.  This is the size of a genetic program or circuit that will have interesting economic value for many decades to come.  But while assembling synthetic constructs (plasmids) this size is still not trivial, it is definitely old news.  The question is how will the cost for constructs of this size fall, and when can I have that DNA in days or hours instead of weeks?  And how soon before I can have a desktop box that prints synthetic DNA of this length?  As I have previously noted in this space, there is clear demand for this sort of box, which means that it will happen sooner or later.  Probably sooner.

Third, the philosophical implications of constructing an artificial genome are overblown, in my humble opinion.   It is interesting to see that it works, to be sure.  But the notion that this demonstrates a blow against vitalism, or against other religious conceptions of life is, for me, just overexcitement.  Venter and crew have managed to chemically synthesize a long polymer, a polymer biologically indistinguishable from naturally occurring DNA; so what?  If that polymer runs a cell the same way natural DNA does, as we already knew that it would, so what?  Over the last several millennia religious doctrine has shown itself to be an extremely flexible meme, accommodating dramatic changes in human understanding of natural phenomena.  The earth is flat!  Oh, wait, no problem.  The earth is at the center of the universe!  No?  Okay, we can deal with that.  Evolution is just another Theory!  Bacteria evolve to escape antibiotics?  Okay, God's will.  No problem. I can't imagine it will be any different this time around.

Finally, it is worth asking what, if any, implications there are for the regulatory environment.  The Guardian suggests, "Mr Venter believes designer genomes have enormous positive potential if properly regulated."  This is interesting, especially given Venter's comments last winter at the initial public discussion of "Synthetic Genomics: Options for Governance".  I don't know if his comments are on record anywhere, or whether my own public comments are for that matter, but Venter basically said "Good luck with regulation," and "Fear is no basis for public policy."  In this context, I think it is interesting that Venter is not among the authors of the report.

I just finished writing my own response to "Options for Governance" for my book.  I can't say I am enthusiastic about the authors' conclusions.  The authors purport to only present "options".  But because they examine only additional regulation, and do not examine the policy or economic implications of maintaining the status quo, they in effect recommend regulation.  One of the authors responded to my concerns about the implicit recommendation of regulation with, "This was an oversight."  Pretty damn big oversight.

Today's news provides yet another example of the futility of regulating technologies to putatively improve security.  Despite all the economic sanctions against Iran, despite export restrictions on computer hardware, scientists and engineers in Iran report that they have constructed a modest supercomputer using electronic components sold by AMD.  Here is the story at ITNews (originally via Slashdot).  Okay, so the Iranians only have the ability to run relatively simple weather forecasting software, and it may (may!) be true that export restrictions have kept them from assembling more sophisticated, faster supercomputers.  (I have to ask at this point, why would they bother?  They are rolling in dollars.  Why not just pay somebody who has a faster machine to do the weather forecasting for you?  It suggests to me that they have pulled the curtain not from their best machine, but rather from one that used to be used for weapons design and is now gathering dust because they have already built a faster one.)  Extending this security model to biological technologies will be even less successful.

Export restrictions for biological components are already completely full of holes, as anyone who has applied for an account at a company selling reagents will know.  Step 1: Get a business license.  Step 2: Apply for account.  Step 3: Receive reagents in mail.  (If you are in a hurry, skip Step 1; there is always someone who doesn't bother to ask for it anyway.)  This particular security measure is just laughable, and all the more so because any attempt to really enforce the legal restrictions on reselling or shipping reagents would involve intrusive and insanely expensive physical measures that would also completely crimp legitimate domestic sales.  I can only imagine that the Iranians exploited a similar loophole to get their AMD processors, and whatever other hardware they needed.

Well, enough of that.  I have one more chapter to write before I send the book off to reviewers.  Best get to it.