Chip Fab Now Costs US$2.5 Billion

Chip fabs just keep getting more expensive.  The AP, via Wired, reports that Intel is investing US$2.5 billion in a new factory in China.  The facility will churn out chips only for the Chinese market, evidently, and using old technology.  U.S. export rules require that Intel restrict the fab to using 90-nm processing, whereas chips made and sold in the U.S. will soon use a 45-nm process.

And biology just keeps getting cheaper.

Update on Public Access to "Genome Synthesis and Design Futures"

Due to confusion about access to "Genome Synthesis and Design Futures", I would like to clarify matters.  The original order page was not as clear as it could have been.  The report is publicly available, either as a free PDF or via a print-on-demand service for $95.  There are no restrictions on obtaining a copy, unless you are shy or are obviously misrepresenting yourself.  While the report does not contain sensitive material, Bio-era requires registration to receive a copy, both to track interest and to be a responsible public citizen.  I think it is rather ironic that the decision to require registration has been the target of public criticism by people who have made a business of making noise about restricting access to, and progress in, biological technologies.

Here is the new, clearer, web page.

Thoughts on Open Biology

A story at LinuxDevices last year on a report from the Committee for Economic Development (CED), recommending government use of "open source" and "open research", prompted me to collect the following thoughts on Open Biology.

I've changed the entry in my category list for this blog from "Open Source Biology" to "Open Biology".  Despite unleashing the phrase "Open Source Biology" on the world six years ago, at this point I no longer know what Open Source Biology might be.  Perhaps Drew Endy still has a useful definition in mind, but as I try to understand how to maintain progress, improve safety, and keep the door open for economic growth, I think the analogy between software and biology just doesn't go far enough.  Biology isn't software, and DNA isn't code.  As I study the historical development of railroads, electricity, aviation, computer hardware, computer software, and of the availability of computation itself (distributed, to desktop, and back to distributed; or ARPANet to Microsoft Office to Google Apps), I am still trying to sort out what lessons can be applied to biological technologies.  I have only limited conclusions about how any such lessons will help us plan for the future of biology.

When I first heard Drew Endy utter the phrase "Open Source Biology", it was within the broader context of living in Berkeley, trying to understand the future of biology as technology, and working in an environment (the then embryonic Molecular Sciences Institute) that encouraged thinking anything was possible.  It was also within the context of Microsoft's domination of the OS market, the general technology boom in the San Francisco Bay area, the skyrocketing cost of drug development coupled to a stagnation of investment return on those dollars, and the obvious gap in our capabilities in designing and building biological systems.  OSB seemed the right strategy to get to where I thought we ought to be in the future, which is to create the ability to tinker effectively,  perhaps someday even to engineer biology, and to employ biology as technology for solving some of the many problems humans face, and that humans have created.

As in 2000, I remain today most interested in maintaining, and enhancing, the ability to innovate.  In particular, I feel that safe and secure innovation is likely to be best achieved through distributed research and through distributed biological manufacturing.  By "Open Biology" I mean access to the tools and skills necessary to participate in that innovation and distributed economy.

"Open source biology" and "open source biotechnology" are catchy phrases, but they have little if any content for the moment.  As various non-profits get up and running (e.g., CAMBIA and the BioBricks Foundation), some of the vagueness will be resolved, and at least we will have some structure to talk about and test in the real world.  When there is a real license a la the GPL, or the Lesser GPL, and when it is finally tested in court, we will have some sense of how this will all work out.

I am by no means saying work should stop on OSB, or on figuring out the licenses, just that I don't understand how it fits into helping innovation at the moment.  A great deal of the innovation we need to see will not come from academia or existing corporations, but from people noodling around in their garages or in start-ups yet to be founded.  These are the customers for Biobricks, these are the people who want the ability to build biological systems without needing an NIH grant.

But Drew Endy (Biobricks) and Richard Jefferson (CAMBIA) have as primary customers not corporations, hobbyists, or tinkerers, but large foundations and governments.  The marketplace in which Biobricks and CAMBIA compete for funding values innovation and the promise of changing the world.  At present, they do not derive the majority of their funding from actually selling parts or licenses on the open market, and thus do not rely on sales to fund their work.  Nor should they.  But the rest of our economy operates on exchanges of money for goods and services.  Synthetic Biology will get there some day, too, but the transition is still a bit murky for me.  The Bio-era research report, "Genome Synthesis and Design Futures: Implications for the U.S. Economy", of which I am a co-author, points to the utility of Synthetic Biology and Biobricks in producing biofuels, vaccines, and new materials.  However, the implementation of the new technological framework of genome design, enabled by large scale gene synthesis and composable parts with defined properties, is still in the offing.

Janet Hope has made an initial study of the state of Open Source Biotechnology in her Ph.D. dissertation at the Australian National University.  Janet gives the following definition for her project:

"Open Source Biotechnology" refers to the possibility of extending the principles of commerce-friendly, commons-based peer production exemplified by Open Source software development to the development of research tools in biomedical and agricultural biotechnology.

This project examines the feasibility of Open Source Biotechnology in the current industry environment. In particular, it explores:       

1. Whether it would be possible to run a viable biotechnology business on Open Source principles, and

2. What such a business might look like, including the application of specific Open Source-style licences to particular classes of biotechnology research tools.

Janet's book on the subject is due out later this year from Harvard University Press.  My book on all of this stuff is, um, not finished.

The CED report "concludes that openness should be promoted as a matter of public policy, in order to foster innovation and economic growth in the U.S. and world economies."  I think this bit, in particular, is very interesting (quoting from the LinuxDevices story):

  • Open Innovation (such as 'peer production' systems like WikiPedia and eBay user ratings)

    • To foster open innovation, federally funded, non-classified research should be widely disseminated, following the example of the NIH (National Institutes of Health)
    • "Any legislation or regulation regarding intellectual property rights [should be] weighed with a presumption against the granting of new rights ... because of the benefits to society of further innovation through greater access to technology."
    • The NSF (National Science Foundation) should fund research into "alternative compensation methods, similar to those created to facilitate the growth of radio, to reward creators of digital information products"

The first point is a bit off, since most NIH-sponsored research is, as a practical matter, available only through subscriptions to the journals in which it is published.  This will slowly get fixed, however, with increasing publication via the Public Library of Science and similar efforts.  The second point, embodied in patent reform, will probably take forever and will be hobbled by vested interests.  The third may not produce useful results for many years.

So here we sit, needing much fast innovation in biological technologies in order to produce carbon neutral fuels, improve human health, and deal with emerging threats such as SARS and pandemic influenza.  Open Biology is part of that, somehow, but I still don't see a clear path to implementing the ideas within the context of the real economic system we live in every day.

Stewart Brand -- “Where are the green biotech hackers?”

Tomorrow's New York Times has a great article on Stewart Brand.  In it, he asks the question, “Where are the green biotech hackers?”  We're coming, Stewart.  It's just that we're still on the slow part of the curves.

It's an interesting question, actually -- when do we get to the fast part?  When does biology start to go really fast?  And what does fast mean?

One answer to the question is the speed and the cost at which we can presently sequence or synthesize an interesting genetic circuit or organism.  Costs for reading genes are halving every 18 months or so, and if the rumors are true, we will hit the Thousand Dollar Genome sooner than my original estimate.  Sequencing is pretty easy at this point, as long as you already have a map to work with, which is the case for an increasing number of organisms.  And if you build the organism yourself, or pay someone else to do it, then you already know both the basic structure of the genome (the map) and the specific sequence.

At the moment, synthesis of a long gene takes about four weeks at a commercial DNA foundry, with a bacterial genome still requiring many months at best, though the longest reported contiguous synthesis job to date is still less than 50 kilobases.  And at a buck a base, hacking any kind of interesting new circuit is still expensive.  As I reported from SB 2.0, the synthesis companies are evidently now using my cost estimates as planning devices, even though that's not why I made those estimates in the first place.  They project costs to continue falling by a factor of 2 approximately every year, which means it will be more like another dozen years before synthesizing something the size of E. coli from scratch will cost less than US$1000, or 1 kilobuck.
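For concreteness, here is a back-of-envelope sketch of that projection.  The genome size, starting price, and halving time are the round numbers used above, not measured values, so treat the output as an order-of-magnitude estimate:

```python
import math

def years_to_threshold(cost_per_base, genome_bases, target_cost, factor_per_year=2.0):
    """Whole years until synthesizing genome_bases drops below target_cost,
    assuming the per-base price falls by factor_per_year annually."""
    start = cost_per_base * genome_bases
    if start <= target_cost:
        return 0
    return math.ceil(math.log(start / target_cost, factor_per_year))

# E. coli K-12 is roughly 4.6 million base pairs; synthesis runs ~US$1/base today.
print(years_to_threshold(cost_per_base=1.0, genome_bases=4.6e6, target_cost=1000))
# → 13
```

At a factor of 2 per year, going from US$4.6 million down to US$1000 is a factor of ~4600, or a bit over twelve doublings.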

The bigger problem, though, is the turnaround time.  No engineer or hacker wants to wait four weeks to see if a program works.  Hit compile, wait for four weeks, no "Hello World."  Start trying to debug the bug, with no debugging tools.  No thanks.  (I've actually had discussions with geneticists/molecular biologists who think even waiting a few days for a synthesis job isn't a big deal.  But what can you say -- biology just hasn't been a hacker culture.  And we are the poorer for it.)

So, Mr. Brand, it will be a few years before green hackers, at least those who aren't supported by Vinod Khosla or Kleiner Perkins, really start to have an impact.  The hackers who are lucky enough to have that kind of support, such as the blokes at Amyris Biotechnologies if their past accomplishments are anything to go by, will probably have something to show for themselves pretty soon.

The article ends with a couple of great paragraphs, which, along with "Science is the only news", are all you need to live by:

“I get bored easily — on purpose,” he said, recalling advice from the co-discoverer of DNA’s double helix. “Jim Watson said he looks for young scientists with low thresholds of boredom, because otherwise you get researchers who just keep on gilding their own lilies. You have to keep on trying new things.”

That’s a good strategy, whether you’re trying to build a sustainable career or a sustainable civilization. Ultimately, there’s no safety in clinging to a romanticized past or trying to plan a risk-free future. You have to keep looking for better tools and learning from mistakes. You have to keep on hacking.

"Genome Synthesis and Design Futures: Implications for the U.S. Economy"

After many, many months of work, Bio Economic Research Associates (Bio-era) today released "Genome Synthesis and Design Futures: Implications for the U.S. Economy".  Sponsored largely by Bio-era and the U.S. Department of Energy, with assistance from Dupont and the Berkeley Nanosciences and Nanoengineering Initiative, the report examines the present state of biological technologies, their applications to genome design, and potential impacts on the biomanufacturing of biofuels, vaccines, and chemicals.  The report also employs scenario planning to develop four initial scenarios exploring the effects of technological development and governmental policy.  Here is a link to the press release; over on the right side of the page are links to a short Podcast with me and Jim Newcomb describing some of the findings.

It is a giant topic, and even at 180 pages we have really just barely scratched the surface.  The changes we've already witnessed will pale in comparison to what's coming down the pike.  The report deals mostly with science, technology, economics, markets, and policy, and only starts to explore the social and ethical aspects of forthcoming decisions.  Future work will refine the technological and economic analyses, will flesh out the security aspects of the ferment in biological technologies, and will delve into what all this means for our society.  In the preface, Jim Newcomb and Steve Aldrich note:

In presenting this analysis, we are mindful of the limitations of its scope. The arrival of new technologies for engineering biological systems for human purposes raises complex questions that lie at the intersection of many different disciplines. As the historian Arthur M. Schlesinger has written, “science and technology revolutionize our lives, but memory, tradition and myth frame our response.” Because this report is focused on potential economic implications of genome engineering and design technologies for the U.S. economy, there are many important questions that are not addressed here. In particular, we have not attempted to address questions of safety and biosecurity; the likelihood or possible impact of unintended consequences, such as environmental damage from the use of these technologies; or the ethical, legal, and social questions that arise. The need for thoughtful answers to these and related questions is urgent, but beyond the scope of this work. We hope to have the opportunity to investigate these questions in subsequent research.

We had a lot of help along the way, and for my part I would like to thank Drew Endy, Brian Arthur, George Church, Tom Kalil, Craig Venter, Gerald Epstein, Jay Keasling, Brad Smith, Erdogan Gulari, John Beadle, Roger Brent, John Mulligan, Michele Garfinkel, Ralph Baric, Stephen Johnston, and Todd Harrington.

Here is the web page to buy a hard copy and/or download the PDF.  Just fill out the form (we're trying to track interest), and you will be sent a link to the PDF.

A Few Thoughts on Rapid Genome Sequencing and The Archon Prize

The December 2006 issue of The Scientist has an interesting article on new sequencing technologies.  "The Human Genome Project +5", by Victor McElheny, contains a few choice quotes.  Phil Sharp, from MIT, says he "would bet on it without a question that we will be at a $1,000 genome in a five-year window."  Presently we are at about US$10 million per genome, so we have a ways to go.  It's interesting to see just how much technology has to change before we get there.

The Archon X-Prize for Genomics specifies sequencing 100 duplex genomes in 10 days, at a cost of no more than US$10,000 per genome.  In other words, that is roughly 600 billion bases at a cost of microdollars per base.  Looking at it yet another way, winning requires 6000 person-days at present productivity numbers for commercially available instruments, whereas 10 days only provides 30 person-days of round-the-clock productivity.
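The arithmetic behind those figures is worth spelling out.  Note that the 10^8 bases-per-person-day productivity is the figure implied by the 6000 person-day estimate above, not an independent measurement:

```python
genomes = 100
bases_per_genome = 6e9                    # diploid human genome, ~2 x 3 Gb
total_bases = genomes * bases_per_genome  # 6e11, i.e. 600 billion bases
cost_per_base = 10_000 / bases_per_genome # ~1.7 microdollars per base

# productivity implied by the 6000 person-day figure for current instruments
bases_per_person_day = 1e8
person_days_needed = total_bases / bases_per_person_day  # 6000
person_days_available = 3 * 10  # three round-the-clock shifts over 10 days
shortfall = person_days_needed / person_days_available   # factor of 200
```

In other words, at present productivity a winning team would be short by a factor of 200 in person-days alone.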

I tried to find a breakdown of genome sequencing costs on the web, and all I could come up with is an estimate for the maize genome published in 2001.  I'll use that as a cost model for state of the art sequencing of eukaryotes (using Sanger sequencing on capillary based instruments).  Bennetzen, et al., recount the "National Science Foundation-Sponsored Workshop Report: Maize Genome Sequencing Project" in the journal Plant Physiology, and report:

The participants concurred that the goal of sequencing all of the genes in the maize genome and placing these on the integrated physical and genetic map could be pursued by a combination of technologies that would cost about $52 million. The breakdown of estimated costs would be:

  • Library construction and evaluation, $3 million
  • BAC-end sequencing, $4 million
  • 10-Fold redundant sequencing of the gene-rich and low-copy-number regions, $34 million
  • Locating all of the genes on an integrated physical-genetic map, $8 million
  • Establishing a comprehensive database system, $3 million.

From the text, it seems that decreases in costs are built into the estimate.  If we chuck out the database system, since this is already built for humans and other species, we are down to direct costs of something like $49 million for approximately 2.5 gigabases (Gb).  The Archon prize doesn't specify whether competitors can use existing chromosomal maps to assemble sequence data, so presumably all the information is fair game.  That lets us toss out another $8 million in cost.  The 10-fold redundant sequencing is probably overkill at this point, but I will keep all those costs because the Archon prize requires an error rate of no more than 1 in 100,000 bases; you have to beat down the error regardless of the sequencing method.  Rounding down to $40 million for charity's sake, it looks like the labor and processing associated with producing the short overlapping sequences necessary for Sanger sequencing account for about 17.5 percent of the total.  These costs are probably fixed for approaches that employ shotgun sequencing.
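As a sanity check on that accounting, here is the breakdown in code (dollar figures in millions, straight from the workshop report; the rounding to $40 million follows the text):

```python
costs_musd = {
    "library construction and evaluation": 3,
    "BAC-end sequencing": 4,
    "10x sequencing of gene-rich and low-copy regions": 34,
    "integrated physical-genetic map": 8,
    "comprehensive database system": 3,
}

total = sum(costs_musd.values())  # 52
# drop the database (already built) and the mapping (fair game for the prize)
direct = total - costs_musd["comprehensive database system"] \
               - costs_musd["integrated physical-genetic map"]  # 41; call it 40

prep = costs_musd["library construction and evaluation"] + \
       costs_musd["BAC-end sequencing"]  # 7
prep_share = prep / 40                   # 0.175, i.e. 17.5 percent
```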

Again using the Archon prize as a simple comparison, that's US$1.75 million just to spend on labor for getting ready to do the actual sequencing.  In 1998, the FTE (full time equivalent) for sequencing labor was US$135,000.  If you assume the dominant cost for preparing the library and verifying the BACs is labor, you can hire about 13 people.  This looks like a lot of work for 13 people, and, given the amount of time required to do all the cloning and wait for bacteria to grow, not something they can accomplish even within the 10 days allotted for the whole project.

The other 82.5 percent of the $10 million you can spend on the actual sequencing.  The prize guidelines say you don't have to include the price of the instruments in the cost, but just for the sake of argument I'll do that here.  And I'll mix and match the cost estimates from the maize project for Sanger sequencing with other technologies.  The most promising commercial instrument appears to be the 454 pyrosequencer, at $500,000 a pop, looking at its combination of read length and throughput, even if they don't yet have the read length quite high enough.  If you buy 16 of those beasties, it appears you can sequence about 1.6 GB a day, about a factor of 40 below what's required to win the Archon prize.  Let's say 454 gets the read length up to 500 bases; then they are still an order of magnitude shy just on the sequencing rate, forgetting the sample prep.

Alternatively, you could simply buy 600 of the 454 instruments, and then you'd be set, at least for throughput.  Might blow your budget, though, with the $300 million retail cost.  But you could take solace in how happy you'd make all the investors in 454.
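The throughput gap above works out as follows.  The 1.6 Gb/day figure for a 16-machine fleet is the rough estimate from the previous paragraph, so the derived numbers are equally rough:

```python
prize_bases_per_day = 600e9 / 10  # 60 Gb/day needed to finish in 10 days
fleet_bases_per_day = 1.6e9       # 16 machines, so ~100 Mb/day per machine
per_machine_per_day = fleet_bases_per_day / 16

shortfall = prize_bases_per_day / fleet_bases_per_day        # 37.5, "about 40"
machines_needed = prize_bases_per_day / per_machine_per_day  # 600 instruments
retail_cost = machines_needed * 500_000                      # US$300 million
```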

Microsoft Supports Biobricks

Last weekend at the 2006 International Genetically Engineered Machines Competition (iGEM 2006), Microsoft announced a Request For Proposals related to Synthetic Biology.  According to the RFP page:

Microsoft invites proposals to identify and address computational challenges in two areas of synthetic biology. The first relates to the re-engineering of natural biological pathways to produce interoperable, composable, standard biological parts. Examples of research topics include, but are not limited to, the specification, simulation, construction, and dissemination of biological components or systems of interacting components. The second area for proposals focuses on tools and information repositories relating to the use of DNA in the fabrication of nanostructures and nanodevices. In both cases, proposals combining computational methods with biological experimentation are seen as particularly valuable.

The total amount to be awarded is $500,000. 

"Smallpox Law Needs Fix"

ScienceNOW Daily News is carrying a short piece on the recommendation by the National Science Advisory Board on Biosecurity (NSABB) to repeal a law that criminalizes synthesis of genomes 85% similar to smallpox.

The original law, which surprised everyone I have ever talked to about this topic, was passed in late 2004 and wasn't written about by the scientific press until March of '05:

The new provision, part of the Intelligence Reform and Terrorism Prevention Act that President George W. Bush signed into law on 17 December 2004, had gone unnoticed even by many bioweapons experts. "It's a fascinating development," says smallpox expert Jonathan Tucker of the Monterey Institute's Center for Nonproliferation Studies in Washington, D.C.

...Virologists zooming in on the bill's small print, meanwhile, cannot agree on what exactly it outlaws. The text defines variola as "a virus that can cause human smallpox or any derivative of the variola major virus that contains more than 85 percent of the gene sequence" of variola major or minor, the two types of smallpox virus. Many poxviruses, including a vaccine strain called vaccinia, have genomes more than 85% identical to variola major, notes Peter Jahrling, who worked with variola at the U.S. Army Medical Research Institute of Infectious Diseases in Fort Detrick, Maryland; an overzealous interpretation "would put a lot of poxvirologists in jail," he says.

According to the news report at ScienceNOW:

Stanford biologist David Relman, who heads NSABB's working group on synthetic genomics, told the board that "the language of the [amendment] allows for multiple interpretations of what is actually covered" and that the 85% sequence stipulation is "arbitrary." Therefore, he said, "we recommend repealing" the amendment.

Relman's group also recommended that the government revamp its select agents list in light of advances in synthetic genomics. These advances make it possible to engineer biological agents that are functionally lethal but genomically different from pathogens on the list. The group's recommendations, which were approved unanimously by the board, are among several that the board will pass on to the U.S. government to help develop policies for the conduct and oversight of biological research that could potentially be misused by terrorists.

Oh Goody -- Prizes for Genomes!

But seriously folks...it's good news that prizes are being posted for biological technologies.  A couple of weeks ago, the X Prize Foundation announced a $10 million prize for demonstration of "technology that can successfully map 100 human genomes in 10 days."  This is not the first such offer; Nicholas Wade notes in the New York Times that Craig Venter set up a $500,000 prize in 2003 for achieving the Thousand Dollar Genome.  Venter is now on the board of the X Prize Foundation and it appears his original prize has been expanded into the subject of the current announcement.  We definitely need new ways to fund development of biological technologies.

Here's more coverage, by Antonio Regalado in the Wall Street Journal.  It will be interesting to see if anyone can come up with a way to make a profit on the $10 million prize.

The prize requires sequencing roughly 500 billion bases in 10 days.  It isn't possible to directly compare the prize specs with my published numbers since there is no specification on the number of people involved in the project.  If you throw a million lab monkeys running a million low tech sequencers at the problem, you're set.  Except, of course, for all the repeats, inversions, and rearrangements that require expertise to map and sort out.

According to a news story by Erika Check in Nature, the performance numbers cited by 454 Life Sciences appear to be encouraging: "Using the 454 technique, one person using one machine could easily sequence the 3 billion base pairs in the human genome in a hundred days, [Founder and CEO Jonathan Rothberg] says," which is about 30 million bases per person per day.  And he is optimistic about progress in reducing costs: "As the process gets faster, it gets less expensive.  It's clear that we'll be able to do this much cheaper," predicts Rothberg, who says that in the next few years scientists will be able to assemble a human genome for US$10,000.  At the present pace of improvement, this looks to be about 2015, though new technology could always get there sooner.
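Dividing out Rothberg's quoted figures and comparing against the prize pace gives a rough sense of the gap (this sketch ignores sample prep and assembly entirely):

```python
rothberg_rate = 3e9 / 100      # ~30 million bases per machine-day
prize_rate = (100 * 6e9) / 10  # 60 Gb/day: 100 diploid genomes in 10 days
machines_needed = prize_rate / rothberg_rate  # 2000 machines at that pace
```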

There seems to be some divergence of expert opinion about where a winning technology will come from.  Writing in Science, Elizabeth Pennisi notes:

Charles Cantor, chief scientific officer of SEQUENOM Inc. in San Diego, California, predicts only groups already versed in sequencing DNA will have a chance at the prize. Others disagree. "I think it is unlikely" that the winner will come from the genome-sequencing community, says Leroy Hood, who invented the first automated DNA sequencer. And Venter predicts that the chance that someone will come out of the woodwork to scoop up the $10 million is "close to 100%." The starting gun has sounded. 

Indeed.  I had sworn off thinking about new sequencing technologies, but the prize has got even me to thinking...

Vaccine Development as Foreign Policy

I was fortunate to attend Sci Foo Camp last month, run by O'Reilly and Nature at the Googleplex in Mountain View.  The camp was full of remarkable people; I definitely felt like a small fish.  (I have a brief contribution to the Nature Podcast from Sci Foo; text, mp3.)  There were a great many big, new ideas floating around during the weekend.  Alas, because the meeting was held under the Chatham House Rule, I cannot share all the cool conversations I had.

However, at the airport on the way to San Jose I bumped into Greg Bear, who also attended Sci Foo, and our chat reminded me of an idea I've been meaning to write about.

In an essay published last year, Synthetic Biology 1.0, I touched briefly on the economic costs of disease as a motivation for developing cheaper drugs.  Building synthetic biological systems to produce those drugs is an excellent example of the potential rewards of improved biological technologies.

But a drug is a response to disease, whereas vaccines are widely recognized as "the most effective medical intervention" for preventing disease and reducing the cost and impacts of pathogens.  While an inexpensive drug for a disease like malaria would, of course, be a boon to affected countries, drugs do not provide lasting protection.  In contrast, immunization requires less contact with the population to suppress a disease.  Inexpensive and effective vaccines, therefore, would provide even greater human and economic benefit.

How much benefit?  It is extremely hard to measure this sort of thing, because to calculate the economic effect of a disease on any given country you have to find a similar country free of the disease to use as a control.  A report released in 2000 by Harvard and the WHO found that, "malaria slows economic growth in Africa by up to 1.3% each year."  The cumulative effect of that hit to GDP growth is mind-blowing:

...Sub-Saharan Africa's GDP would be up to 32% greater this year if malaria had been eliminated 35 years ago. This would represent up to $100 billion added to sub-Saharan Africa's current GDP of $300 billion. This extra $100 billion would be, by comparison, nearly five times greater than all development aid provided to Africa last year.

The last sentence tells us all we need to know about the value of a malaria vaccine; it could advance the state of the population and economy so far as to swamp the effects of existing foreign aid.  And it would provide a lasting improvement to be built upon by future generations of healthy children.
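Note that mechanically compounding the full 1.3-point growth drag for 35 years yields an even bigger number than the report's "up to 32%"; the report's figure presumably reflects the "up to" qualifier and more careful modeling, so the quoted numbers are, if anything, conservative:

```python
drag = 0.013  # upper bound on annual GDP growth lost to malaria
years = 35
premium = (1 + drag) ** years - 1  # counterfactual GDP premium
print(round(premium * 100))        # → 57 (percent)
```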

The economic valuation of vaccines is fraught with uncertainty, but Rappuoli, et al., suggest in Science that if "policymakers were to include in the calculation the appropriate factors for avoiding disease altogether, the value currently attributed to vaccines would be seen to underestimate their contribution by a factor of 10 to 100."  This is, admittedly, a big uncertainty, but it all lies on the side of underestimation.  And the point is that some $20 billion is spent annually on aid, a fraction of which might be better directed towards western vaccine manufacturers to produce long-term solutions.

Vaccine incentives are usually discussed in terms of guaranteeing a certain purchase volume (PDF warning for a long paper here discussing the relevant economics).  But I wonder if we shouldn't re-think government-sponsored prizes.  This strategy was recently used in the private sector to great effect and publicity with the X-Prize, and its success has led to consideration of other applications of the prize incentive structure.

Alas, this isn't generally considered the best way to incentivize vaccine manufacturers.  The Wikipedia entry for "Vaccine" makes only passing reference to prizes for vaccine development.  A 2001 paper in the Bulletin of the World Health Organization, for which a number of experts and pharmaceutical companies were interviewed about ways to improve AIDS vaccine development, concluded, "It was felt that a prize for the development of an AIDS vaccine would have little impact. Pharmaceutical firms were in business to develop and sell products, not to win prizes."

But perhaps the problem is not that prizes are the wrong way to entice Big Pharma, but rather that Big Pharma may not be the right way to develop vaccines.  Perhaps we should find a way to encourage a business model that aims to produce a working, safe vaccine at a cost that maximizes profit given the prize value.

So how much would developing a vaccine cost?  According to a recent short article in Nature, funds devoted to developing a malaria vaccine amounted to a measly $65 million in 2003.  The authors go on to note that, "At current levels, however, if a candidate in phase II clinical trials demonstrated sufficient efficacy, there would be insufficient funding available to proceed to phase III trials."

It may be that The Gates Foundation, a major funder of the malaria work, would step in to provide sufficient funds, but this dependency doesn't strike me as a viable long-term strategy for developing vaccines.  (The Gates Foundation may not be around forever, but we can be certain that infectious disease will.)  Instead, governments, and perhaps large foundations like The Gates, should set aside funds to be paid as a prize.  What size prize?  Of the ~$1-1.5 billion it supposedly costs to develop a new drug, ~$250 million goes to marketing.  Eliminating the need for marketing with a prize value of $1.5 billion would provide a reasonable one-time windfall, with continued sales providing more profit down the road.

Setting aside as much as $200 million a year would be a small fraction of the U.S. foreign aid budget and would rapidly accumulate into a large cash payout.  Alternatively, we could set it up as a yearly payment to the winning organization.  Spread the $200 million over multiple governments (Europe, Japan, perhaps China), and suddenly it doesn't look so expensive.  In any event, we're talking about a big payoff in both saving lives and improving general quality of life, so a sizable prize is warranted.  I expect $2 Billion is probably the minimum to get international collaborations to seriously compete for the prize.
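The accumulation arithmetic is trivial but worth spelling out.  The contributor count here is illustrative, and the sketch ignores any interest earned on the escrowed funds:

```python
annual = 200e6  # US$200 million set aside per year
target = 2e9    # ~US$2 billion minimum prize

years_alone = target / annual  # 10 years for one government
contributors = 4               # e.g., U.S., Europe, Japan, China (illustrative)
years_shared = target / (contributors * annual)  # 2.5 years
```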

The foreign policy aspects of this strategy fit perfectly with the goals of the U.S. Department of State to improve national security by reducing poverty abroad.  Here is Gen. Colin Powell, reprinted from Foreign Policy Magazine in 2005 ("No Country Left Behind"):

We see development, democracy, and security as inextricably linked. We recognize that poverty alleviation cannot succeed without sustained economic growth, which requires that policymakers take seriously the challenge of good governance. At the same time, new and often fragile democracies cannot be reliably sustained, and democratic values cannot be spread further, unless we work hard and wisely at economic development. And no nation, no matter how powerful, can assure the safety of its people as long as economic desperation and injustice can mingle with tyranny and fanaticism.

Development is not a "soft" policy issue, but a core national security issue. [emphasis added]  Although we see a link between terrorism and poverty, we do not believe that poverty directly causes terrorism. Few terrorists are poor. The leaders of the September 11 group were all well-educated men, far from the bottom rungs of their societies. Poverty breeds frustration and resentment, which ideological entrepreneurs can turn into support for--or acquiescence to--terrorism, particularly in those countries in which poverty is coupled with a lack of political rights and basic freedoms.

Dr. Condoleezza Rice, in opening remarks to the Senate Foreign Relations Committee (PDF warning) during her confirmation hearings, plainly stated, "...We will strengthen the community of democracies to fight the threats to our common security and alleviate the hopelessness that feeds terror."

Over any time period you might care to examine, it will probably cost vastly less to produce a working malaria vaccine than to continue dribbling out foreign aid.  Even just promoting the prize would bolster the U.S. image abroad in exactly those countries where we are hurting the most, and successful development would have profound consequences for national security through the elimination of human suffering.  Seems like a good bargain.  The longer we wait, the worse it gets.