How It Works: Directly Reading DNA
The basic idea is not new: as a long string of DNA passes through a small hole, its components -- the bases A, T, G, and C -- plug that hole to varying degrees. As they pass through the hole, in this case an engineered pore protein derived from one found in nature, each base interacts slightly differently with the walls of the pore. As a result, each base lets a different number of salt ions through while passing through the pore, which allows one to distinguish between the bases by measuring changes in electrical current. Because this method is a direct physical interrogation of the chemical structure of each base, it is in principle much, much faster than any of the indirect sequencing technologies that have come before.
There have been a variety of hurdles to clear to get nanopore sequencing working. First, you have to use a pore that is small enough to produce measurable changes in current. Next, the speed of the DNA must be carefully controlled so that the signal-to-noise ratio is high enough. The pore must also sit in an insulating membrane of some sort, surrounded by the necessary electrical circuitry, and to become a useful product the whole thing must be easy to assemble in an industrial manner and mechanically stable through shipping and use.
Oxford Nanopore claims to have solved all those problems. They recently showed off a disposable version of their technology -- called the MinIon -- with 512 pores built into a USB stick. This puts to shame the Lava Amp, my own experiment with building a USB peripheral for molecular biology. Here is one part I find extremely impressive -- so impressive it is almost hard to believe: Oxford claims they have reduced the sample handling to a single (?) pipetting step. Clive Brown, Oxford CTO, says "Your fluidics is a Gilson." (A "Gilson" would be a brand of pipetter.) That would be quite something.
I've spent a good deal of my career trying to develop simple ways of putting biological samples into microfluidic doo-dads of one kind or another. It's never trivial, it's usually a pain in the ass, and sometimes it's a showstopper. Blood, in particular, is very hard to work with. If Oxford has made this part of the operation simple, then they have a winning technology just based on everyday ease of use -- what sometimes goes by the labels of "user experience" or "human factors". Compared to the complexity of many other laboratory protocols, it would be like suddenly switching from MS DOS to OS X in one step.
How Well Does it Work?
The challenge for fast sequencing is to combine throughput (bases per hour) with read length (the number of contiguous bases read in one go). Existing instruments have throughputs in the range of 10-55,000 megabases/day and read lengths from tens of bases to about 800 bases. (See chart below.) Nick Loman reports that using the MinIon, Oxford has already run DNA of 5,000 to 100,000 bases (5 kb to 100 kb) at speeds of 120-1000 bases per minute per pore, though accuracy suffers above 500 bases per minute. So a single USB stick can easily run at 150 megabases (Mb) per hour, which basically means you can sequence full-length eukaryotic chromosomes in about an hour. Over the next year or so, Oxford will release the GridIon instrument, which will have 4 and then 16 times as many pores. Presumably that means it will be up to 16 times as fast. The long read lengths mean that processing the resulting sequence data, which usually takes longer than the actual sequencing itself, will be much, much faster.
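For the back-of-the-envelope crowd, here is a quick sanity check on that chromosome-in-an-hour claim, assuming the ~150 Mb/hour per-stick figure above holds and using rough, round chromosome sizes:

```python
# Back-of-the-envelope check of the "chromosome in about an hour" claim.
# Assumes the ~150 megabase/hour per-MinIon figure quoted above; chromosome
# sizes are rough approximations. Illustration only, not a benchmark.

MINION_MB_PER_HOUR = 150  # megabases per hour, as quoted above

targets_mb = {
    "yeast genome (~12 Mb)": 12,
    "human chr21, smallest autosome (~47 Mb)": 47,
    "human chr7, mid-sized (~159 Mb)": 159,
    "human chr1, largest (~249 Mb)": 249,
}

for name, size_mb in targets_mb.items():
    hours = size_mb / MINION_MB_PER_HOUR
    print(f"{name}: ~{hours:.1f} hours on one stick")
```

At those rates, even the largest human chromosome comes in at under two hours on a single stick.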
This is so far beyond existing commercial instruments that it sounds like magic. Writing in Forbes, Matthew Herper quotes Jonathan Rothberg, of sequencing competitor Ion Torrent, as saying "With no data release how do you know this is not cold fusion? ... I don't believe it." Oxford CTO Clive Brown responded to Rothberg in the comments to Herper's post in a very reasonable fashion -- have a look.
Of course I want to see data as much as the next fellow, and I will have to hold one of those USB sequencers in my own hands before I truly believe it. Rothberg would probably complain that I have already put Oxford on the "performance tradeoffs" chart before they've shipped any instruments. But given what I know about building instruments, I think immediately putting Oxford in the same bin as cold fusion is unnecessary.
Below is a performance comparison of sequencing instruments originally published by Bio-era in Genome Synthesis and Design Futures in 2007. (Click on it for a bigger version.) I've hacked it up to include the approximate performance range of 2nd generation sequencers from Life, Illumina, etc., as well as a single MinIon. That's one USB stick, with what we're told is a few minutes' worth of sample prep. How many can you run at once? Notice the scale on the x-axis, and the units on the y-axis. If it works as promised, the MinIon is so vastly better than existing machines that the comparison is hard to make. If I replotted that data with a log axis along the bottom, all the other technologies would be cramped together way off to the left. (The data comes from my 2003 paper, The Pace and Proliferation of Biological Technologies (PDF), and from Service, 2006, The Race for the $1000 Genome.)
The Broader Impact
Later this week I will try to add the new technologies to the productivity curve published in the 2003 paper. Here's what it will show: biological technologies are improving at exceptional paces, leaving Moore's Law behind. This is no surprise, because while biology is getting cheaper and faster, the density of transistors on chips is set by very long term trends in finance and by SEMATECH; designing and fabricating new semiconductors is crazy expensive and requires coordination across an entire industry. (See The Origin of Moore's Law and What it May (Not) Teach Us About Biological Technologies.) In fact, we should expect biology to move much faster than semiconductors.
Here are a few graphs from the 2003 paper:
...The long term distribution and development of biological technology is likely to be largely unconstrained by economic considerations. While Moore's Law is a forecast based on understandable large capital costs and projected improvements in existing technologies, which to a great extent determined its remarkably constant behavior, current progress in biology is exemplified by successive shifts to new technologies. These technologies share the common scientific inheritance of molecular biology, but in general their implementations as tools emerge independently and have independent scientific and economic impacts. For example, the advent of gene expression chips spawned a new industrial segment with significant market value. Recombinant DNA, gel and capillary sequencing, and monoclonal antibodies have produced similar results. And while the cost of chip fabs has reached upwards of one billion dollars per facility and is expected to increase [2012 update: it's now north of $6 billion], there is good reason to expect that the cost of biological manufacturing and sequencing will only decrease. [Update 2012: See "New Cost Curves" for DNA synthesis and sequencing.]

Cue nanopore sequencing.
These trends--successive shifts to new technologies and increased capability at decreased cost--are likely to continue. In the fifteen years that commercial sequencers have been available, the technology has progressed ... from labor intensive gel slab based instruments, through highly automated capillary electrophoresis based machines, to the partially enzymatic Pyrosequencing process. These techniques are based on chemical analysis of many copies of a given sequence. New technologies under development are aimed at directly reading one copy at a time by directly measuring physical properties of molecules, with a goal of rapidly reading genomes of individual cells. While physically-based sequencing techniques have historically faced technical difficulties inherent in working with individual molecules, an expanding variety of measurement techniques applied to biological systems will likely yield methods capable of rapid direct sequencing.
A few months ago I tweeted that I had seen single strand DNA sequence data generated using a nanopore -- it wasn't from Oxford. (Drat, can't find the tweet now.) I am certain there are other labs out there making similar progress. On the commercial front, Illumina is an investor in Oxford, and Life has invested in Genia. As best I can tell, once you get past the original pore sequencing IP, which it appears is being licensed broadly, there appear to be many measurement approaches, many pores, and many membranes that could be integrated into a device. In other words, money and time will be the primary barriers to entry.
(For the instrumentation geeks out there, because the pore is larger than a single base, the instrument actually measures the current as three bases pass through the pore. Thus you need to be able to distinguish 4^3=64 levels of current, which Oxford claims they can do. The pore set-up I saw in person worked the same way, so I certainly believe this is feasible. Better pores and better electronics might reduce the physical sampling to 1 or 2 bases eventually, which should result in faster instruments.)
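To make the bookkeeping concrete, here is a toy sketch of the three-base-window idea. The current levels are entirely made up -- real base-calling works from noisy analog traces, not a clean lookup table -- but it shows why a pore that sees three bases at once implies 4^3 = 64 distinguishable states, and how a series of those states maps back onto a sequence:

```python
# Toy illustration of the 3-base window: 4**3 = 64 distinguishable states.
# The "current levels" here are arbitrary integers, not measured values;
# this is bookkeeping only, not real nanopore signal processing.
from itertools import product

BASES = "ACGT"

# Assign each 3-mer a hypothetical current level, 0..63.
kmer_to_level = {"".join(k): i for i, k in enumerate(product(BASES, repeat=3))}
level_to_kmer = {v: k for k, v in kmer_to_level.items()}

def levels_for_sequence(seq):
    """Sliding 3-base window -> the series of levels an idealized pore would report."""
    return [kmer_to_level[seq[i:i + 3]] for i in range(len(seq) - 2)]

def sequence_from_levels(levels):
    """Invert the toy signal: each successive level contributes one new base."""
    kmers = [level_to_kmer[lvl] for lvl in levels]
    return kmers[0] + "".join(k[-1] for k in kmers[1:])

dna = "GATTACAGATTACA"
trace = levels_for_sequence(dna)
assert sequence_from_levels(trace) == dna
print(len(kmer_to_level), "distinct levels; trace:", trace)
```

Shrinking the physical sampling window to one or two bases would cut that to 4 or 16 levels, which is part of why better pores and better electronics should eventually mean simpler, faster instruments.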
It may be that Oxford will have a first mover advantage for nanopore instruments, and it may be that they have amassed sufficient additional IP to make it rough for competitors. But, given the power of the technology, the size of the market, and the number of academic competitors, I can't see that over the long term this remains a one-company game.
Not every sequencing task has the same technical requirements, so instruments like the Ion Torrent won't be put to the curbside. And other technologies will undoubtedly come along that perform better in some crucial way than Oxford's nanopores. We really are just at the beginning of the revolution in biological technologies. Recombinant DNA isn't even 40 years old, and the electronics necessary for nanopore measurements only became inexpensive and commonplace in the last few years. However impressive nanopore sequencing seems today, the greatest change is yet to come.
A couple of weeks ago I flew into LAX from the east coast and got another perspective on water use there. My first glimpse of the basin was the smog lapping up against the rim of the San Gabriel Mountains. I managed to snap a quick photo after we had flown over the ridge (the smog is on the lower left, though the contrast was more impressive when we were looking from the east side).
Even in May it looks a little dry 'round those parts.
A few minutes later, I noticed large green patches covering the sides (usually the west side) of hills. This continued all the way to downtown LA, and we were high enough for most of that time that I couldn't figure out why the locals were spending so much of their precious water keeping the sunset sides of hills green. Then, finally, we passed over one low enough that the purpose jumped out at me.
Even in death, Los Angelenos maintain their homage to William Mulholland by keeping him eternally damp. And in death, Los Angelenos continue to contribute to the smog shown above -- the grass covering the land of the dead is trimmed quite short. Many, many square miles of it. A cushy life, have those dead people. And to be fair to Los Angeles (which, admittedly, is hard for me), Seattle, too, uses a great deal of water and hydrocarbons to keep our decaying ancestors covered with a trim layer of green. It happens everywhere here. Welcome to America.
Even the way the US irrigates land to feed the living represents a profligate use of water. According to the USDA, 80% of the water consumed in this country goes to agriculture. (Note that "use" and "consumption" are often confused. Agriculture and thermoelectric power generation both "use" about 40% of the nation's freshwater, but while almost 100% of the water used for power generation is returned to where it was taken from -- albeit somewhat warmer than when it was taken -- much of the water put on crops either does not reach the roots or is evaporated and lost to the atmosphere.) Notice that I did not use the word "waste", because some of the leakage winds up back in groundwater, or otherwise finds its way into the environment in a way that might be classified as "beneficial".
And pondering water use here in the US, and the impact on our economy, my thoughts turn to water use in Asia. Much ado was made in the last couple of years about the IPCC report of anomalous melting of Asian glaciers, followed by the discovery that there was no actual data behind the assertion.
A recent paper in Science adds some much needed analysis to the story. Walter Immerzeel and colleagues set out to understand the relative importance of meltwater and rainwater to river flows in Asia. It is interesting to me that this sort of analysis wasn't done before now: "Earlier studies have addressed the importance of glacial and snow melt and the potential effects of climate change on downstream hydrology, but these are mostly qualitative or local in nature."
For five large river basins, the authors used a combination of precipitation data, snow melt models, and evaporation rates to calculate the Normalized Melt Index (NMI). The NMI is the ratio of snow and glacier discharge to downstream discharge. If all the water in a river downstream is from melting, then this ratio is obviously one; if the ratio is less than one, rainfall contributes more than meltwater; and if it is larger than one, more water is lost through evaporation or other processes (like agriculture) and meltwater is more important for total flow.
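In code, the definition as I read it looks like the sketch below. This is just the ratio and its interpretation, not the authors' model, and the basin numbers are invented placeholders:

```python
# Sketch of the Normalized Melt Index described above: the ratio of meltwater
# (snow + glacier) discharge to downstream discharge. The example values are
# invented, not numbers from Immerzeel et al.

def normalized_melt_index(melt_discharge, downstream_discharge):
    """NMI = (snow + glacier melt discharge) / (downstream discharge)."""
    return melt_discharge / downstream_discharge

def interpret(nmi):
    if nmi > 1:
        return "meltwater dominates; losses downstream (evaporation, agriculture) are large"
    if nmi < 1:
        return "rainfall contributes more to downstream flow than meltwater"
    return "downstream flow is all from melting"

for basin, melt, downstream in [("hypothetical basin A", 80.0, 55.0),
                                ("hypothetical basin B", 20.0, 120.0)]:
    nmi = normalized_melt_index(melt, downstream)
    print(f"{basin}: NMI = {nmi:.2f} -> {interpret(nmi)}")
```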
Here are the results. For each of the rivers, the authors calculated the percentage of the total discharge generated by snow and glacial melt:
In other words, water supplies in the Indus river valley are largely dependent on meltwater, whereas the large river systems in China appear to be less dependent on meltwater. That is a very interesting result, because the story told by lots of people (including myself) about the future of water in China is that they are in big trouble due to glacial melting in the Himalayas. Assuming this result holds up, China may be better off in a warmer world than I had anticipated.
The authors also used various projections of snow and rainfall to estimate what water supplies would look like in these rivers in 2050. As you might expect, a warmer world leads to less snowfall, more melting, and lower river flows. But as the warmer world brings increased rainfall, the impact is smaller than has been widely assumed. I am not going to bother putting any of the numbers in here, because, as the authors note, "Results should be treated with caution, because most climate models have difficulty simulating mean monsoon and the interannual precipitation variation, despite recent progress in improving the resolution of anticipated spatial and temporal changes in precipitation."
But they went one step further and tried to estimate the effects of potential decreased water supply on local food supplies. Couched in terms of crop yields, etc., Immerzeel et al estimate that the Brahmaputra will support about 35 million fewer people, the Indus will support about 26 million fewer people -- that's food for 60 million fewer people in India and Pakistan, if you are counting -- and the Yellow about 3 million more people. Finishing up, they write:
We conclude that Asia's water towers are threatened by climate change, but that the effects of climate change on water availability and food security in Asia differ substantially among basins and cannot be generalized. The effects in the Indus and Brahmaputra basins are likely to be severe owing to the large population and the high dependence on irrigated agriculture and meltwater. In the Yellow River, climate change may even yield a positive effect as the dependence on meltwater is low and a projected increased upstream precipitation, when retained in reservoirs, would enhance water availability for irrigated agriculture and food security.

I am perplexed by the take on these results over at Nature News by Richard Lovett. His piece carries the title, "Global warming's impact on Asia's rivers overblown". I'll give Lovett the out that he may not have written the actual headline (Editors!), but nonetheless he sets up the Immerzeel paper as a big blow to some unnamed group of doomsayers. Perhaps he imagines that Immerzeel completely undermines the IPCC report? This is hardly the case. As I wrote last January, sorting out the mistake over Himalayan melting rates is an example of science working through a blunder. Instead of overturning some sort of vague conspiracy, as best I can tell the Immerzeel paper is simply the first real effort to make quantitative assessments of something to which much more attention should have been paid, much earlier than it was.
And even Lovett appears to acknowledge that reducing the human carrying capacity of the Brahmaputra and Indus river valleys by 60 million people is something to be concerned about. From Lovett:
The findings are important for policy-makers, says Jeffrey Kargel, a glaciologist at the University of Arizona in Tucson. "This paper adds to mounting evidence that the Indus Basin [between India and Pakistan] is particularly vulnerable to climate change," says Kargel. "This is a matter that obviously concerns India and Pakistan very much."

Indeed. As they should concern us all.
When the document came out, there was just a little bit of coverage in the press. Notably, Wired's Threat Level, which usually does a commendable job on security issues, gave the document a haphazard swipe, asserting that "Obama's Biodefense Strategy is a Lot Like Bush's". As described in that post, various commentators were unhappy with the language that Under Secretary of State Ellen Tauscher used when announcing the Strategy at a BWC meeting in Geneva. According to Threat Level, "Sources tell this reporter that the National Security Council had some Bush administration holdovers in charge of editing the National Strategy and preparing Ms. Tauscher's script, and these individuals basically bulldozed the final draft through Defense and State officials with very little interagency input and with a very short suspense." Threat Level also asserts that "Most are disappointed in the language, which doesn't appear to be significantly different than the previous administration." It is unclear who "Most" are.
In contrast to all of this, in my view the Strategy is a clear departure from the muddled thinking that dominated earlier discussions. By muddled, I mean security discussions and policy that, paraphrasing just a little, went like this: "Biology Bad! Hacking Bad! Must Contain!"
The new National Strategy document, however, takes a very different line. Sources tell this reporter, if you will, that the document resulted from a careful review that involved multiple agencies, over many months, with an aim to develop the future biosecurity strategy of the United States in a realistic context of rapidly spreading infectious diseases and international technological proliferation driven by economic and technical needs. To wit, here are the first two paragraphs from the first page (emphasis added, of course):
We are experiencing an unparalleled period of advancement and innovation in the life sciences globally that continues to transform our way of life. Whether augmenting our ability to provide health care and protect the environment, or expanding our capacity for energy and agricultural production towards global sustainability, continued research and development in the life sciences is essential to a brighter future for all people.

Recall that this document carries the signature of the President of the United States. I'll pause to let that sink in for a moment.
The beneficial nature of life science research is reflected in the widespread manner in which it occurs. From cutting-edge academic institutes, to industrial research centers, to private laboratories in basements and garages, progress is increasingly driven by innovation and open access to the insights and materials needed to advance individual initiatives.
And now to drive home the point: the new Strategy for Countering Biological Threats explicitly points to garage biotech innovation and open access as crucial components of our physical and economic security. I will note that this is a definite change in perspective, and one that has not fully permeated all levels of the Federal bureaucracy and contractor-aucracy. Recently, during a conversation about locked doors, buddy systems, security cameras, and armed guards, I found myself reminding a room full of biosecurity professionals of the phrase emphasized above. I also found myself reminding them -- with sincere apologies to all who might take offense -- that not all the brains, not all the money, and not all the ideas in the United States are found within the Beltway. Fortunately, the assembled great minds took this as intended and some laughter ensued, because they realized this was the point of including garage labs in the National Strategy, even if not everyone is comfortable with it. And there are definitely very influential people who are not comfortable with it. But, hey, the President signed it (forgive me, did I mention that part already?), so everyone is on board, right?
Anyway, I think the new National Strategy is a big step forward in that it also acknowledges that improving public health infrastructure and countering infectious diseases are explicitly part of countering artificial threats. Additionally, the Strategy is clear on the need to establish networks that both promulgate behavioral norms and that help disseminate information. And the new document clearly recognizes that these are international challenges (p.3):
Our Strategy is targeted to reduce biological threats by: (1) improving global access to the life sciences to combat infectious disease regardless of its cause; (2) establishing and reinforcing norms against the misuse of the life sciences; and (3) instituting a suite of coordinated activities that collectively will help influence, identify, inhibit, and/or interdict those who seek to misuse the life sciences.

Norms, open biology, better technology, better public health infrastructure, and better intelligence: all are themes I have been pushing for a decade now. So, 'nuff said on those points, I suppose.
...This Strategy reflects the fact that the challenges presented by biological threats cannot be addressed by the Federal Government alone, and that planning and participation must include the full range of domestic and international partners.
Implementation is, of course, another matter entirely. The Strategy leaves much up to federal, state, and local agencies, not all of whom have the funding, expertise, or inclination to follow along. I don't have much to say about that part of the Strategy, for now. But I am definitely not disappointed with the rest of it. It is, you might say, the least bad thing I have read out of DC in a long time.
First up, a paper from this week's PNAS by Breecker et al. at UT Austin, "Atmospheric CO2 concentrations during ancient greenhouse climates were similar to those predicted for A.D. 2100". Already from the title you can see where this is going.
The problem Breecker and colleagues address is the following: how do you correlate the carbon content of fossil soils with prevailing atmospheric carbon dioxide concentrations? Well established methods exist for measuring the carbon content of compounds in fossil soil, but less certain were the conditions under which chemical reactions produce those particular compounds. It turns out that the model used to infer atmospheric CO2 contained an error. Breecker determined that the primary compound assayed when determining soil carbon content forms at much lower atmospheric CO2 concentrations than had been assumed.
Prior attempts to correlate soil carbon (and by proxy atmospheric CO2) with greenhouse periods in Earth's climate had concluded that warm periods experienced CO2 concentrations much greater than ~1000 parts per million (ppm). Therefore, one might conclude that only when average atmospheric CO2 spiked above this level would we be in danger of experiencing greenhouse gas warming that threatened glaciers. The correction supplied by Breecker substantially lowers estimates of the average CO2 concentration that is correlated with continental glacial melting. Eyeballing the main figure in the paper, it looks to me like we could be in real trouble above 450 ppm -- today we are just shy of 390 ppm and there is no sign we will be slowing down anytime soon, particularly if India and China keep up their pace of development and emissions.
Looking forward to 2100, things get a touch squiffy because Breecker relies on an estimate of CO2 concentrations that comes out of a model of global economic activity. So the title of the paper might be a tad alarmist, simply because 2100 is a long way out for any model to be taken too seriously. But the correction of the paleodata is a big story because at minimum it reduces the uncertainty of atmospheric CO2 levels, and it appears to clarify the connection between CO2 levels and continental glaciation. More work is needed on the latter point, obviously, or this paper would have been on the cover of Science or Nature.
Now on to a serious screw-up at the IPCC. Elisabeth Rosenthal at the NYT is reporting that "A much-publicized estimate from a United Nations panel about the rapid melting of Himalayan glaciers from climate change is coming under fire as a gross exaggeration." Here is Andrew Revkin's take on DotEarth, and anyone interested in this story should read through his post. The comments are worth perusing because some of the contributors actually seem to have additional useful knowledge, though, of course, nut jobs aplenty show up from both sides of the debate over climate change.
In a nutshell, the issue is that the most recent IPCC chapter on glaciers contained a conclusion, advertised as real analysis, that was in fact a speculation by one scientist promulgated through the popular press. The authors of that section of the IPCC report may have been warned about the unsubstantiated claim. Contradictory data and analysis seems to have been ignored.
So, to be frank, this is a giant, inexcusable fuck-up. The IPCC is composed of so many factions and interest groups that this may be a case of simple blundering or of blatant politicization of science. But here is the beautiful thing about science -- it is self-correcting. It may take a while, but science always wins. (See also my post of a couple of years ago, Dispelling a Climate Change Skeptic's "Deception".) Every newspaper story I have seen about this particular IPCC screw-up notes that it was brought to light by...wait for it...a climate scientist. It is an excellent public airing of dirty laundry by the community of science. So while this episode demonstrates that the last official IPCC report on glacial melting in the Himalayas should not be used for any sort of scientific policy recommendation or economic forecast, you can bet that the next report will do a damn fine job on this topic.
Finally, whether or not the IPCC gets its act together, there are plenty of good data out there on the state of the planet. Eventually, Science -- with a capital S -- will get the right answer. The same methodical process that has resulted in computers, airplanes, and non-stick fry pans will inevitably explain what is really going on with our climate. And if you use computers, fly on airplanes, or eat scrambled eggs then you are implicitly acknowledging, whatever your political or religious persuasion, that you believe in science. And you better, 'cause science always wins.
I have a couple of general thoughts about the event, colored by another meeting full of economists, bankers, and traders that I attended in the last week of December. I met a number of fantastically accomplished and interesting people in just a few hours, many of whom I hope will remain lifelong friends.
First, I have to extend my thanks to The Economist -- they have been very good to me over the last 10 years, beginning in 2000 by co-sponsoring (with Shell) the inaugural World in 2050 writing competition. (Here is my essay from the competition (PDF). It seems to be holding up pretty well, these 10 years later, save the part about building a heart. But at least I wasn't the only one who got that wrong.)
Here is a paraphrased conversation over drinks between myself and Daniel Franklin, the Executive Editor of the newspaper.
Me: I wanted to thank you for including me. The Economist has been very kind to me over the past decade.
Franklin: Well, keep doing interesting things.
Me: Umm, right. (And then to myself: Shit, I have a lot of work to do.)
On to the World in 2010 Festival. The professional economists and journalists present all seemed to agree that we have seen the worst of the downturn, that the stimulus package clipped the bottom off of whatever we were falling into, and that employment gains going forward could be a long time in coming. Unsurprisingly, the Democratic politicians and operatives who turned up crowed about the effects of the stimulus, while the Republicans who spoke pooh-poohed any potential bright spots in, well, just about everything.
At the other meeting I attended, last week in Charleston, SC, one panel of 10 people, composed of Federal Reserve and private bankers, traders, and journalists, couldn't agree on anything. The recovery would be V-shaped. No, no, W-shaped. No, no, no, reverse-square-root-shaped (which was the consensus at The World in 2010 Festival). No, no, no, no, L-shaped. But even those who agreed on the shape did not agree on anything else, such as the availability of credit, employment, etc.
Basically, as far as I can tell, nobody has the slightest idea what the future of the US economy looks like. And I certainly don't have anything to add to that. Except, of course, that the future is biology.
Here is John Oliver's opening monologue from the Festival. He was absolutely hilarious. Unfortunately you can't hear the audience cracking up continuously. I nearly pissed myself. Several times. (Maybe the cocktails earlier in the evening contributed to both reactions.)
Back to Innovation in 2010. Dean Kamen had this nice bit in response to a question about whether the imperative to invent and innovate has increased in recent years (see 36:20 in the C-Span video): "7 billion people can't be recipients, they have to be part of the solution. And that is going to require advanced technologies to be properly developed and properly deployed more rapidly than ever before."
To this I can only add that we are now seeing more power to innovate put into the hands of individuals than has ever occurred in the history of humanity. Let's hope we don't screw up.
A few other tidbits from the article: sugar beets now supply about half the US sugar demand, and it seems that GM sugar beets account for about 95% of the US crop (I cannot find any data on the USDA site to support the latter claim). A spokesman for the nation's largest sugar beet processor claims that food companies, and consumers, have completely accepted sugar from the modified beets -- as they should, because it's the same old sugar molecule.
I got lured into spending most of my day on this because I noticed that the Sierra Club was one of the plaintiffs. This surprised me, because the Sierra Club is less of a noisemaker on biotech crops than some of the co-plaintiffs, and usually focuses more on climate issues. Though there is as yet no press release, digging around the Sierra Club site suggests that the organization wants all GM crops to be tested and evaluated with an impact statement before approval. But my surprise also comes in part because the best review I can find of GM crops suggests that their growing use is coincident with a substantial reduction in soil loss, carbon emissions, energy use, water use, and overall climate impact -- precisely the sort of technological improvement you might expect the Sierra Club to support. The reductions in environmental impact -- which range from 20% to 70%, depending on the crop -- come from "From Field to Market" (PDF) published earlier this year by the Keystone Alliance, a diverse collection of environmental groups and companies. Recall that according to USDA data GM crops now account for about 90% of cotton, soy, and corn. While the Keystone report does not directly attribute the reduction in climate impacts to genetic modification, a VP at Monsanto recently made the connection explicit (PDF of Kevin Eblen's slides at the 2009 International Farm Management Congress). Here is some additional reporting/commentary.
So I find myself being pulled into exploring the cost/benefit analysis of biotech crops sooner than I had wanted. I dealt with this issue in Biology is Technology by punting in the afterword:
Obviously we will all be talking about biotech crops for years to come. I don't see how we are going to address the combination of 1) the need for more biomass for fuel and materials, 2) the mandatory increase in crop yields necessary to feed human populations, and 3) the need to reduce our climatic impacts, without deploying biotech crops at even larger scales than we have so far. But I am also very aware that nobody, but nobody, truly understands how a GM organism will behave when released into the wild.
The broader message in this book is that biological technologies are beginning to change both our economy and our interaction with nature in new ways. The global acreage of genetically modified (GM) crops continues to grow at a very steady rate, and those crops are put to new uses in the economy every day. One critical question I avoided in the discussion of these crops is the extent to which GM provides an advantage over unmodified plants. With more than ten years of field and market experience with these crops in Asia and North and South America, the answer would appear to be yes. Farmers who have the choice to plant GM crops often do so, and presumably they make that choice because it provides them a benefit. But public debate remains highly polarized. The Union of Concerned Scientists recently released a review of published studies of GM crop yields in which the author claimed to "debunk" the idea that genetic modification will "play a significant role in increasing food production." The Biotechnology Industry Organization responded with a press release claiming to "debunk" the original debunking. The debate continues.
We do live in interesting times.
Biodesic evaluated systems biology investments for a large organization about 18 months ago, and Schadt's approach makes more sense to me -- by far -- than anything else we looked at. I sat in on the pitch that Schadt and Stephen Friend made to that same organization, and it was crystal clear to me that Sage -- now residing at the Hutch here in Seattle -- should be on the receiving end of piles of money. The stacks of Nature Group publications Schadt is accumulating suggest he is on to something, and it appears that his methods can be used to make predictions about the behaviors of complex networks. Time and experimentation will tell, of course. The open source aspect is a huge bonus.
Schadt's move to Pacific Biosciences is interesting because during his talk he suggested that genome sequencing provides enough information about variation to fuel his statistical methods for predicting interactions not just between genes but between tissues -- he is working at the level of describing the behavior of networks of networks. It seems he will now have access to plenty of data.
Total synthesis of a gene
H. G. Khorana
Science, 16 February 1979, Vol. 203, No. 4381, pp. 614-625
A totally synthetic plasmid for general cloning, gene expression and mutagenesis in Escherichia coli
Wlodek Mandecki, Mark A. Hayden, Mary Ann Shallcross, and Elizabeth Stotland
Gene, Vol. 94, Issue 1, 28 September 1990, pp. 103-107

Single-step assembly of a gene and entire plasmid from large numbers of oligodeoxyribonucleotides
Willem P. C. Stemmer, Andreas Crameri, Kim D. Ha, Thomas M. Brennan, and Herbert L. Heyneker
Gene, Vol. 164, Issue 1, 16 October 1995, pp. 49-53

Chemical Synthesis of Poliovirus cDNA: Generation of Infectious Virus in the Absence of Natural Template
Jeronimo Cello, Aniko V. Paul, and Eckard Wimmer
Science, 9 August 2002, Vol. 297, No. 5583, pp. 1016-1018

Accurate multiplex gene synthesis from programmable DNA microchips
Jingdong Tian, Hui Gong, Nijing Sheng, Xiaochuan Zhou, Erdogan Gulari, Xiaolian Gao, and George Church
Nature, 432, 1050-1054, 23 December 2004

Total synthesis of long DNA sequences: Synthesis of a contiguous 32-kb polyketide synthase gene cluster
Sarah J. Kodumal, Kedar G. Patel, Ralph Reid, Hugo G. Menzella, Mark Welch, and Daniel V. Santi
PNAS, 2 November 2004, Vol. 101, No. 44, pp. 15573-15578

Complete Chemical Synthesis, Assembly, and Cloning of a Mycoplasma genitalium Genome
Daniel G. Gibson, Gwynedd A. Benders, Cynthia Andrews-Pfannkoch, Evgeniya A. Denisova, Holly Baden-Tillson, Jayshree Zaveri, Timothy B. Stockwell, Anushka Brownley, David W. Thomas, Mikkel A. Algire, Chuck Merryman, Lei Young, Vladimir N. Noskov, John I. Glass, J. Craig Venter, Clyde A. Hutchison III, and Hamilton O. Smith
Science, 29 February 2008, Vol. 319, No. 5867, pp. 1215-1220