Visual inspection reveals a number of interesting things. First, the DNA synthesis productivity line stops in about 2008 because there have been no new instruments released publicly since then. New synthesis and assembly technologies are under development by at least two firms, which have announced they will run centralized foundries and not sell instruments. More on this later.
Second, it is clear that DNA sequencing platforms are improving very rapidly, now much faster than Moore's Law. This is interesting in itself, but I point it out here because of the post today at Wired by Pixar co-founder Alvy Ray Smith, "How Pixar Used Moore's Law to Predict the Future". Smith suggests that "Moore's Law reflects the top rate at which humans can innovate. If we could proceed faster, we would," and that "Hardly anyone can see across even the next crank of the Moore's Law clock."
Moore's Law is a Business Model and is All About Planning -- Theirs and Yours
As I have written previously, it was recognized early on at Intel that Moore's Law is a business model (see the Pace and Proliferation paper, my book, and a previous post, "The Origin of Moore's Law"). Moore's Law was always about economics and planning in a multi-billion dollar industry. When I started writing about all this in 2000, a new chip fab cost about $1 billion. Now, according to The Economist, Intel estimates a new chip fab costs about $10 billion. (There is probably another Law to be named here, something about exponential increases in the cost of semiconductor processing as an inverse function of feature size. Update: This turns out to be Rock's Law.) Nobody spends $10 billion without a great deal of planning, and in particular nobody borrows that much from banks or other financial institutions without demonstrating a long-term plan to pay off the loan. Moreover, Intel has had to coordinate the manufacturing and delivery of very expensive, very complex semiconductor processing instruments made by other companies. Thus Intel's planning cycle explicitly extends many years into the future; the company sees not just the next crank of the Moore's Law clock, but several cranks. New technology has certainly been required to achieve these planning goals, but that is just part of the research, development, and design process for Intel. What is clear from comments by Carver Mead and others is that even if the path was unclear at times, the industry was confident that it could get to the next crank of the clock.
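As a rough sanity check on the exponential-cost point, Rock's Law is usually stated as fab cost doubling about every four years. The two figures above ($1 billion around 2000, $10 billion now) are from the text; the four-year doubling period and the twelve-year interval are my assumptions for this back-of-the-envelope sketch:

```python
# Back-of-the-envelope check of Rock's Law against the fab-cost figures
# cited above. Assumption: fab cost doubles every ~4 years (the usual
# statement of Rock's Law), projected over the ~12 years from the
# $1 billion figure for 2000.

def fab_cost(base_cost, years_elapsed, doubling_period=4.0):
    """Project fab cost forward assuming a fixed doubling period (years)."""
    return base_cost * 2 ** (years_elapsed / doubling_period)

projected = fab_cost(1e9, 12)  # 2000 -> roughly now
print(f"Projected fab cost after 12 years: ${projected / 1e9:.0f} billion")
# -> Projected fab cost after 12 years: $8 billion
```

Three doublings over twelve years gives about $8 billion, which is in the same ballpark as the ~$10 billion Intel estimate — consistent with the exponential-cost observation, though this is a curve fit to two data points, not evidence.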
Moore's Law served a second purpose for Intel, one that is less well recognized but arguably more important: Moore's Law was a pace selected to enable Intel to win. That is why Andy Grove ran around Intel pushing for financial scale (see "The Origin of Moore's Law"). I have more historical work to do here, but it is pretty clear that Intel successfully organized an entire industry to move at a pace only it could survive. And only Intel did survive. Yes, there are competitors in specialty chips and in memory or GPUs, but as far as high-volume, general-purpose CPUs go, Intel is the last man standing. Finally, and alas I don't have a source anywhere for this other than hearsay, Intel could in fact have gone faster than Moore's Law. Here is the hearsay: Gordon Moore told Danny Hillis, who told me, that Intel could have gone faster. (If anybody has a better source for that particular point, give me a yell on Twitter.) The inescapable conclusion from all this is that the management of Intel made a very careful calculation. They evaluated product roll-outs to consumers, the rate of new product adoption, the rate of semiconductor processing improvements, and the financial requirements for building the next chip fab line, and then set a pace that nobody else could match but that left Intel plenty of headroom for future products. It was all about planning.
The reason I bother to point all this out is that Pixar was able to use Moore's Law to "predict the future" precisely because Intel meticulously planned that future. (Calling Alan Kay: "The best way to predict the future is to invent it.") Which brings us back to biology. Whereas Moore's Law is all about Intel and photolithography, the reason that productivity in DNA sequencing is going through the roof is competition not just among companies but among technologies. And we are only just getting started. As Smith writes in his Wired piece, Moore's Law tells you that "Everything good about computers gets an order of magnitude better every five years." Which is great: it enabled other industries and companies to plan in the same way Pixar did. But Moore's Law doesn't tell you anything about any other technology, because Moore's Law was about building a monopoly atop an extremely narrow technology base. In contrast, there are many different DNA sequencing technologies emerging because many different entrepreneurs and companies are inventing the future.
The first consequence of all this competition and invention is that it makes my job of predicting the future very difficult. This emphasizes the difference between Moore's Law and Carlson Curves (it still feels so weird to write my own name like that): whereas Intel and the semiconductor industry were meeting planning goals, I am simply keeping track of data. There is no real industry-wide planning in DNA synthesis or sequencing, other than a race to get to the "$1000 genome" before the next guy. (Yes, there is a vague road-mappy thing promoted by the NIH that accompanied some of its grant programs, but there is little if any coordination because there is intense competition.)
Biological Technologies are Hard to Predict in Part Because They Are Cheaper than Chips
Compared to other industries, the barrier to entry in biological technologies is pretty low. Unlike chip fabs, there is nothing in biology that costs $10 billion commercially, nor even $1 billion. (I have come to mostly disbelieve pharma industry claims that developing drugs is actually that expensive, but that is another story for another time.) The Boeing 787 reportedly cost $32 billion to develop as of 2011, and that is on top of a century of multi-billion dollar aviation projects that had to come before the 787.
There are two kinds of costs that are important to distinguish here. The first is the cost of developing and commercializing a particular product. Based on the money reportedly raised and spent by Life, Illumina, Ion Torrent (before acquisition), Pacific Biosciences, Complete Genomics (before acquisition), and others, it looks like developing and marketing second-generation sequencing technology can cost upwards of about $100 million. Even more money gets spent, and lost, in operations before anybody is in the black. My intuition says that the development costs are probably falling as sequencing starts to rely more on other technology bases, for example semiconductor processing and sensor technology, but I don't know of any real data. I would also guess that nanopore sequencing, should it actually become a commercial product this year, will have cost less to develop than other technologies, but, again, that is my intuition based on my time in clean rooms and at the wet bench. I don't think there is great information yet here, so I will suspend discussion for the time being.
The second kind of cost to keep in mind is the use of new technologies to get something done. Which brings in the cost curve. Again, the forthcoming paper will contain appropriate references.