Dispelling a Climate Change Skeptic's "Deception"

(Updated: Friday 5 Oct 19:15 PST)

A few weeks ago I heard a presentation from someone (hereafter person "A", to remain anonymous) who claimed that increasing CO2 concentrations won't cause significant global warming.  The highly technical argument sounded extremely implausible to me, but it has taken me a while to sort out the details.  This is worth commenting on because the argument is slated to appear next year in a high profile book from a very well known publisher.

I don't fault person A for falling for the "deception", but he could have been more critical given the sources he used to build up his argument.

The anti-warming argument was based on a figure from a non-peer-reviewed "paper" available on the web.  The figure, in turn, was generated by a fellow named David Archibald using the "modtran" model server hosted by The University of Chicago.  The modtran model server is run by Professor David Archer, in the Department of Geophysical Sciences, to help his students with coursework.  I wrote to Professor Archer to clarify both the intended use of the model and the interpretation of the data.

The model is evidently reasonably well accepted in its description of infrared radiation absorption by the atmosphere as a function of CO2 concentration, otherwise known as radiative forcing.  But it turns out that to estimate the resulting warming, you have to multiply the radiative forcing by the 'climate sensitivity parameter', which tells you how the atmosphere and oceans respond to added heat.  The climate sensitivity parameter is actually a distribution of values, and models of climate change are usually evaluated using several different values of the parameter.  David Archibald conveniently chose a value that is 40 times smaller than the most likely value in the distribution used by the IPCC.  The value is in the distribution describing the climate sensitivity parameter, to be sure, but it is way the hell out to the left, and very improbable.  Thus one can accurately say that Archibald used the correct radiative forcing numbers but intentionally chose an estimate of climate sensitivity that nobody else believes is physically likely.
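To make the arithmetic concrete, here is a minimal sketch in Python.  The relation is just delta_T = lambda * delta_F; the specific numbers are my own illustrative assumptions (a canonical forcing of roughly 3.7 W/m^2 for doubled CO2 and a most-likely sensitivity parameter of roughly 0.8 K per W/m^2), not figures taken from the talk or from Archer's model server.

    # Warming estimate: multiply radiative forcing by the climate
    # sensitivity parameter.  Values below are illustrative assumptions.
    FORCING_2X_CO2 = 3.7   # W/m^2, canonical forcing from doubled CO2
    LAMBDA_LIKELY = 0.8    # K per (W/m^2), a commonly cited most-likely value

    def warming(forcing_w_m2, sensitivity_k_per_w_m2):
        """Equilibrium temperature change: delta_T = lambda * delta_F."""
        return sensitivity_k_per_w_m2 * forcing_w_m2

    print(warming(FORCING_2X_CO2, LAMBDA_LIKELY))       # ~3.0 K, the consensus ballpark
    print(warming(FORCING_2X_CO2, LAMBDA_LIKELY / 40))  # ~0.07 K, with a value 40x smaller

Run with those assumptions, the very same forcing yields warming of about 3 K or about 0.07 K depending entirely on which sensitivity you pick, which is the whole trick.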

Professor Archer posted to RealClimate.org with the title, "My model, used for deception".  He is relatively circumspect, though still damning, in his criticism of Archibald.  The comments that follow his post, however, are ruthless.  It seems I set loose the hounds.

I take the time to write this because I have become more aware of late that many climate change skeptics seem to think that anthropogenic climate change (in particular, warming caused by CO2 emissions) is simply a political ploy with no basis in physical reality.  That kind of thinking denies not just climate change, but virtually all of the science our technological economy is built on.  (I will certainly admit some of the rhetoric surrounding climate change bothers me, and I am not comfortable with the idea of brainwashing children to harass their parents about buying hybrid cars.  See the 29 September WSJ, "Inconvenient Youths", or even the recent Daily Show segment on absurdly over-the-top children's books from wingnuts on both the left and the right.)

I couldn't care less at this point about the political side of the argument, and why people do or don't like Al Gore.  Physics is physics.  Science always wins.  Science is self-correcting, and over the long term there ain't no politics about it.  The U.S. was founded on the Enlightenment notions of tolerance and rational decision making.  Alas, those words aren't in the Constitution anywhere, and they are seldom uttered inside the Beltway these days.  But if we don't base our policy decisions on science, then we can just forget the U.S. as a viable economic entity, and thus as an entity capable of being the standard bearer of the ideals that make this country worth living in and defending.

On the use of the word "Biobrick"

A couple of months ago, Drew Endy admonished me via email for using "Biobricks" as a noun.  The trademark, as held by the Biobricks Foundation (BBF), describes a brand, or marque.  The word "Biobricks" is an adjective describing a particular set of things conforming to a particular standard.

I finally had a chance to catch up with Drew via phone yesterday, and he clarified why this is important.  All the groups contributing to the MIT Registry of Standard Biological Parts, mainly via the International Genetically Engineered Machines Competition (iGEM), are working hard to make sure all those parts conform to a set of rules for physical and functional assembly.  That means, amongst many other requirements, that the ends of the genes have appropriate sequences for manipulation and are sequence-optimized for the assembly protocols.  For example, all the EcoRI restriction enzyme sites need to be at the ends of the part and not in the middle.
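As a toy illustration of what that rule means in practice, here is a sketch in Python (my own, not the Registry's actual tooling) that flags a candidate part carrying an EcoRI site anywhere other than its flanking ends:

    # Check a part for internal EcoRI sites (GAATTC).  The flank length
    # and the example sequence are hypothetical, for illustration only.
    ECORI_SITE = "GAATTC"

    def has_internal_ecori(part_sequence, flank=6):
        """Return True if an EcoRI site occurs anywhere except the ends."""
        interior = part_sequence.upper()[flank:-flank]
        return ECORI_SITE in interior

    print(has_internal_ecori("GAATTCAAACCCGGGAATTCTTTGAATTC"))  # True: internal site

A real compliance check would of course cover all the restriction sites and sequence requirements in the assembly standard, not just this one.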

It turns out that Drew is seeing lots of "parts" show up in papers and talks, described as "biobricks", that won't be compatible with the growing list of parts in the Registry refrigerators.  Thus the need for a differentiable marque.  From the BBF FAQ:  "The BBF maintains the "biobrick(s)" trademarks in order to enable and defend the set of BioBrick™ standard biological parts as an open and free-to-use collection of standard biological parts."  Thus it seems the BBF will both assert a standard and curate and license a library of parts.

There will be a BBF open workshop 4-6 November at MIT to define technical and legal standards for Biobricks Biobrick parts (it's just awkward, no?), following iGEM 2007 on 2-4 November at MIT.

Which gets me to wondering what other examples there might be of standards being defined and maintained by a foundation, protected with a trademark.  As far as I know, "transistor-transistor logic" (TTL) became a standard simply because Texas Instruments put a bunch of products out and everybody else jumped on board (see Wikipedia).  But nobody protected the marque "TTL", and no one organization curated and licensed a library of TTL parts.  Similarly, if I have got this right, the IEEE discusses and approves standards for hardware and software that manufacturers and programmers can use, but the IEEE does not itself play a role in building or licensing anything.  (Comments?  Randy?  TK?)

So I wonder if the BBF isn't heading out into some unknown territory.  Obviously, the idea of Biobricks Biobrick parts (Argh!) is itself new and interesting, but I wonder what the effect on innovation will be under an apparently new kind of IP regime if one organization is in a position to "defend" not just a standard but also the parts that conform to the standard.  What happens if the leadership (or control) of the BBF changes and suddenly the "open and free-to-use collection" becomes not so open?  And am I free to build/identify a new part as a Biobrick part (!) without submitting it to the Registry or the BBF?  Can I even advertise something as being compatible with the standard on my own, or do I have to have permission from the BBF to even suggest in public that I have something other people might want to use/buy that works with all the other Biobrick™ parts?  And who exactly controls the Registry?  (The "About the Registry" page doesn't appear to answer this question, even though I believe I have heard Drew and Randy Rettberg say in the past that MIT presently controls the IP.  There was also, I believe, some question as to whether some parts in the Registry are actually owned by other organizations.)

So many questions.  It is clear that there is lots of work to do...

Metabolic rate determines mutation rate

ScienceDaily has a story describing a new paper showing that the rate of protein evolution is subject to allometric scaling.  Actually, now that I have written that, I remember that allometric scaling describes a specific mathematical relationship between metabolism and body mass, but the paper in question doesn't appear to be online yet so I can't say for sure allometric scaling is the appropriate mechanism to cite.
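For reference, the mass and temperature dependence of metabolic rate from Gillooly and colleagues' earlier work on the metabolic theory of ecology takes the form

    B = b0 * M^(3/4) * exp(-E / kT)

where B is whole-organism metabolic rate, M is body mass, T is absolute temperature, E is an average activation energy for the underlying biochemistry, k is Boltzmann's constant, and b0 is a normalization constant.  Whether the new paper builds on exactly this framework is an assumption on my part until the paper itself is available.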

At any rate, ScienceDaily reports that James Gillooly and colleagues have shown that: "...A 10-degree increase in temperature across species leads to about a 300 percent increase in the evolutionary rate of proteins, while a tenfold decrease in body size leads to about a 200 percent increase in evolutionary rates."

"Generally, there are two schools of thought about what affects evolution," said Andrew P. Allen, Ph.D., a researcher with the National Center for Ecological Analysis and Synthesis in Santa Barbara, Calif. "One says the environment dictates changes that occur in the genome and phenotype of a species, and the other says the DNA mutation rate drives these changes. Our findings suggest physiological processes that drive mutation rates are important."

That is pretty interesting.  Warm, small animals evidently experience a greater rate of protein evolution than large, cold ones.  This suggests to me that warm-blooded, smaller animals have an evolutionary advantage because they are better able to produce physiological variation in the context of a changing environment, and thus better able to compete at the species level in the face of natural selection.  The ScienceDaily story doesn't make that point, but I would assume the paper in Biology Letters, when it is published, will.

Here is the press release from the University of Florida.

Amyris Raises Additional US$ 70 Million for Microbial Biofuels Production

Amyris Biotechnologies today announced the first portion of their B round financing for US$ 70 million.  This brings the total company financing for microbial production of biofuels to just under US$ 100 million in the last year.  The press release also notes Amyris already has bugs in the lab producing "bio-jet", "bio-diesel", and "bio-gasoline".  The latter is interesting because previous announcements had suggested butanol as a target product rather than a hydrocarbon.  Immiscible hydrocarbons will be much easier (read "less expensive") to separate from the fermentation broth than water-soluble alcohols.

In any event, the company is clearly moving faster than even my earlier optimistic estimates (see "The Need for Fuels Produced Using Synthetic Biology").  While the speed of engineering efforts is still an issue (see "The Intersection of Biofuels and Synthetic Biology"), and will be for some time to come, I have been spending more time lately trying to understand the issue of scale.  The petroleum industry is absolutely enormous, and replacing any significant amount of petro-fuels with bio-fuels will require feedstocks in abundance.  It is by no means clear that the U.S. can meet the demand with domestic biomass production.  More on this as the topic develops.

Sony's Enzyme-Powered, Sugar-Fueled Power Supply

Sony has apparently demonstrated a power supply for consumer electronics that uses enzymes to convert sugar to useful electrons (via Gizmodo).  Not many details are available (to non-Japanese speakers, anyway), but it looks like each "module" generates ~50 mW from an unspecified amount of sugar.  It is evidently just an engineering demonstration, but it's pretty cool nonetheless.  No word on how the digested sugar is converted to electrical power.

The Intersection of Biofuels and Synthetic Biology

New players are appearing every day in the rush to produce biofuels using synthetic biology.  I just noticed an announcement that Codon Devices has signed an agreement with Agrivida for:

The discovery, development, and commercialization of engineered proteins for use in so-called 'third generation' biofuel applications. Under the terms of this agreement, Codon Devices will deliver to Agrivida optimized enzymes to be embedded in crops for biofuels production.

...Agrivida, an agricultural biotechnology company, is developing such third generation biofuels by creating corn varieties optimized for producing ethanol. First generation methods for manufacturing ethanol make use of the corn grain only, leaving the remaining plant material, such as the corn leaves, stalks, and husks in the field. Central to Agrivida’s ethanol-optimized corn technology are engineered cellulase enzymes that are incorporated into the corn plants themselves. These enzymes will efficiently degrade the entire mass of plant material into small sugars that can then be readily converted to ethanol.

The step of putting some of the biofuel processing into crops was inevitable, but I can't say I am particularly thrilled about it.  I am not opposed in principle to open planting of GM crops, but, because many GM plants do not behave as predicted once placed in a complex ecosystem (i.e., nature), I wonder if we shouldn't be more circumspect about this particular engineering advance.

In other interesting developments at Codon, they also recently announced a deal with Open Biosystems wherein the latter will:

Sell and distribute Codon Devices’ gene synthesis offering to researchers with needs that fall below Codon’s minimum order threshold.  The partnership will enable a wide range of new customers to utilize high-quality, low-cost gene synthesis in their research, and will greatly strengthen Codon Devices’ presence within academic, government and other non-profit institutions.

I also notice Codon is now advertising gene synthesis for $.69 per base for constructs between 50 and 2000 bases in length, with "typical delivery" in 10-15 days.  2001-5000 bases will cost you $.84 per base and 15-20 days.  Last year at SB 2.0, Brian Baynes suggested they would be at about $.50 per base within a year, so costs continue to fall pretty much apace.  But delivery times are holding at around two weeks or more, and this is now becoming a problem for some of Codon's customers.  I am not at liberty to divulge names, but some synthetic biology companies that rely on outside gene synthesis are starting to chafe at having to wait two weeks before trying out new designs.  This is something we predicted would happen in the "Genome Synthesis and Design Futures" report from Bio-era, though I am a bit surprised it is happening so soon.  This may be another indication of how quickly SB is becoming an important technology in the economy.  Engineers trying to turn around products aren't satisfied with the NIH/academic model of trading off time for money -- the market, to first order, only cares about products that are actually for sale, which means those that make it through R&D quickly and generate revenues in what will become an increasingly crowded field.
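For concreteness, here is a trivial helper (my own sketch, using only the advertised tiers quoted above) showing what those prices imply for a given construct:

    # Estimate gene synthesis cost from Codon's advertised per-base tiers.
    def synthesis_cost(length_bases):
        """Return cost in USD for a construct of the given length."""
        if 50 <= length_bases <= 2000:
            return 0.69 * length_bases   # $0.69/base, "typical delivery" 10-15 days
        if 2001 <= length_bases <= 5000:
            return 0.84 * length_bases   # $0.84/base, 15-20 days
        raise ValueError("length outside advertised tiers")

    print(synthesis_cost(1500))   # 1035.0 -- a 1.5 kb gene runs about $1000
    print(synthesis_cost(3000))   # 2520.0 -- a 3 kb construct, about $2500

At those prices a typical gene is a modest line item for a company, which is exactly why the two-week wait, rather than the cost, is becoming the pain point.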

Concerns about delays in the R&D cycle due to outsourced gene synthesis are also becoming compounded by IP issues.  Personally, I am certainly not thrilled about sending my protein designs around via email, and I know of another SB company (which again I am not at liberty to name) that is becoming less and less comfortable with sending sequences for new genetic circuits out the door in electronic form.  This can only be exacerbated by the deal Codon Devices has just signed with Agrivida, an explicit competitor to anybody trying to produce anything in hacked/engineered organisms.  A couple of months ago, I had a conversation with Brian Baynes (which I will post here sometime soon) in which he outlined Codon's plans for participating in markets beyond gene synthesis.  I suspect Codon Devices will have to start paying more and more attention to conflict of interest issues generated by its simultaneous role as a fabrication house and provider of design services.

I'll argue again that the two trends of IP concerns and R&D time scales will drive the emergence of a market in desktop gene synthesis machines, whether you call them "desktop gene printers" or something else.  This weekend at SciFoo, Drew Endy suggested such instruments are a long ways off.  Drew has been paying more attention to the specific engineering details of this than I have, if for no other reason than his involvement in Codon, but, in addition to my own work, I think there are enough technological bits and pieces already demonstrated in the literature that we could see a desktop instrument sooner rather than later; that is, if a market truly exists.

Environmental Effects of Growing Energy Crops

News this week that the dead zone in the Gulf of Mexico, caused by agricultural run-off from the Midwest, is again going to be quite large this year.

There is some disagreement about exactly how large.  An article from Minnesota Public Radio leads off with: "A scientist with the National Oceanic and Atmospheric Administration, NOAA, says this summer's dead zone could be as large as 8500 square miles. That's 77 percent larger than the average size of the dead zone over the last two decades."

The article continues:

The issue of nitrogen is especially important this year because it's the main fertilizer used on the nation's corn crop.

U.S. farmers this spring planted one of their largest corn crops ever, up almost 20 percent from a year ago. Much of the increase will go to meet the demands of the ethanol industry.

Runoff from farm fields carries nitrogen into streams and rivers and eventually the Gulf of Mexico. NOAA's David Whitall says the corn-biofuels-dead zone link is one area researchers will examine as they search for answers.

...One federal study says if ethanol production continues to expand, nitrogen loads to the Gulf could increase another 30 percent.

At CNN, on the other hand, the size of the dead zone is portrayed somewhat differently:

The oxygen-poor "dead zone" off the Louisiana and Texas coasts isn't quite as big as predicted this year, but it is still the third-largest ever mapped, a scientist said Saturday.

...The 7,900-square-mile area with almost no oxygen, a condition called hypoxia, is about the size of Connecticut and Delaware together. The Louisiana-Texas dead zone is the world's second-largest hypoxic area, she said.

This year's is about 7.5 percent smaller than [had been] predicted, judging by nitrogen content in the Mississippi River watershed.

[Previous predictions were] about 8,540 square miles, which would have made it the largest measured in at least 22 years. More storms than normal may have reduced hypoxia by keeping the waters roiled.

No mention at CNN of any role of biofuels in the dead zone.  The difference between the numbers cited by the two sources is roughly 7 percent, which probably isn't a big deal, especially given the fact that neither article cites error bars.  But there is a difference in focus.  On the one hand, the dead zone is bigger than ever; on the other, not so bad.  Corn acres are certainly up in the U.S., and the effects of the consequent increase in irrigation and fertilizer use are something to keep an eye on.