Moore's Law and Beyond
Where a calculator on the ENIAC is equipped with 18,000 vacuum tubes and weighs 30 tons, computers in
the future may have only 1,000 vacuum tubes and perhaps weigh 1.5 tons.
—Popular Mechanics, 1949
Computer Science is no more about computers than astronomy is about telescopes.
—E. W. Dijkstra
Before considering further the implications of the Singularity, let's examine the wide range of technologies that are
subject to the law of accelerating returns. The exponential trend that has gained the greatest public recognition has
become known as Moore's Law. In the mid-1970s, Gordon Moore, a leading inventor of integrated circuits and later
chairman of Intel, observed that we could squeeze twice as many transistors onto an integrated circuit every twenty-four months (in the mid-1960s, he had estimated twelve months). Given that the electrons would consequently have
less distance to travel, circuits would also run faster, providing an additional boost to overall computational power.
The result is exponential growth in the price-performance of computation. This doubling rate—about twelve months—
is much faster than the doubling rate for paradigm shift that I spoke about earlier, which is about ten years. Typically,
we find that the doubling time for different measures—price-performance, bandwidth, capacity—of the capability of
information technology is about one year.
The primary driving force of Moore's Law is a reduction of semiconductor feature sizes, which shrink by half
every 5.4 years in each dimension. (See the figure below.) Since chips are functionally two-dimensional, this means
doubling the number of elements per square millimeter every 2.7 years.[22]
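The step from a linear halving time to an areal doubling time can be checked with a few lines of arithmetic. The sketch below (illustrative, using only the figures quoted above) shows why halving feature size in each dimension every 5.4 years doubles the number of elements per unit area every 2.7 years:

```python
import math

# Feature size halves in each of the chip's two dimensions every 5.4 years,
# so the area per element shrinks 2 x 2 = 4x per period: density quadruples.
# That is two doublings per 5.4 years, i.e. one doubling every 2.7 years.
linear_halving_years = 5.4
density_gain_per_period = 2 * 2                            # 4x per period
doublings_per_period = math.log2(density_gain_per_period)  # = 2.0
density_doubling_years = linear_halving_years / doublings_per_period
print(density_doubling_years)  # 2.7
```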
The following charts combine historical data with the semiconductor-industry road map (International Technology
Roadmap for Semiconductors [ITRS] from Sematech), which projects through 2018.
The cost of DRAM (dynamic random access memory) per square millimeter has also been coming down. The
doubling time for bits of DRAM per dollar has been only 1.5 years.[23]
A similar trend can be seen with transistors. You could buy one transistor for a dollar in 1968; in 2002 a dollar
purchased about ten million transistors. Since DRAM is a specialized field that has seen its own innovation, the
halving time for average transistor price is slightly slower than for DRAM, about 1.6 years (see the figure below).[24]
This remarkably smooth acceleration in price-performance of semiconductors has progressed through a series of stages
of process technologies (defined by feature sizes) at ever smaller dimensions. The key feature size is now dipping
below one hundred nanometers, which is considered the threshold of "nanotechnology."[25]
Unlike Gertrude Stein's rose, it is not the case that a transistor is a transistor is a transistor. As they have become
smaller and less expensive, transistors have also become faster by a factor of about one thousand over the course of the
past thirty years (see the figure below)—again, because the electrons have less distance to travel.[26]
If we combine the exponential trends toward less-expensive transistors and faster cycle times, we find a halving
time of only 1.1 years in the cost per transistor cycle (see the figure below).[27]
The cost per transistor cycle is a more
accurate overall measure of price-performance because it takes into account both speed and capacity. But the cost per
transistor cycle still does not take into account innovation at higher levels of design (such as microprocessor design)
that improves computational efficiency.
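The 1.1-year figure follows from adding the two exponential rates quoted above. A rough sketch, using the text's own numbers (cost halving every 1.6 years; roughly a thousandfold speedup over thirty years):

```python
import math

# Cost per transistor cycle improves at the sum of the two rates,
# measured in doublings (or halvings) per year.
cost_halvings_per_year = 1 / 1.6               # price trend, ~0.63/yr
speed_doublings_per_year = math.log2(1000) / 30  # speed trend, ~0.33/yr
combined_rate = cost_halvings_per_year + speed_doublings_per_year
halving_time_years = 1 / combined_rate
print(round(halving_time_years, 2))  # ~1.04, close to the quoted 1.1 years
```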
The number of transistors in Intel processors has doubled every two years (see the figure below). Several other
factors have boosted price-performance, including clock speed, reduction in cost per microprocessor, and processor
design innovations.[28]
Processor performance in MIPS has doubled every 1.8 years per processor (see the figure below). Again, note that the
cost per processor has also declined through this period.[29]
If I examine my own four-plus decades of experience in this industry, I can compare the MIT computer I used as a
student in the late 1960s to a recent notebook. In 1967 I had access to a multimillion-dollar IBM 7094 with 32K (36-
bit) words of memory and a quarter of a MIPS processor speed. In 2004 I used a $2,000 personal computer with a half-
billion bytes of RAM and a processor speed of about 2,000 MIPS. The MIT computer was about one thousand times
more expensive, so the ratio of cost per MIPS is about eight million to one.
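The "eight million to one" ratio can be reproduced directly from the figures given, assuming a $2 million price for the "multimillion-dollar" IBM 7094 (consistent with its being about one thousand times the $2,000 PC):

```python
# Assumed: $2,000,000 for the IBM 7094 (the text says only "multimillion-
# dollar" and "about one thousand times more expensive" than the $2,000 PC).
ibm_7094_cost, ibm_7094_mips = 2_000_000, 0.25
pc_cost, pc_mips = 2_000, 2_000

ratio = (ibm_7094_cost / ibm_7094_mips) / (pc_cost / pc_mips)
print(f"{ratio:,.0f}")  # 8,000,000 -- the "eight million to one" ratio
```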
My recent computer provides 2,000 MIPS of processing at a cost that is about 2^24 times lower than that of the computer
I used in 1967. That's 24 doublings in 37 years, or about 18.5 months per doubling. If we factor in the increased value
of the approximately 2,000 times greater RAM, vast increases in disk storage, and the more powerful instruction set of
my circa 2004 computer, as well as vast improvements in communication speeds, more powerful software, and other
factors, the doubling time comes down even further.
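The doubling-time arithmetic above checks out exactly:

```python
# 24 doublings (a factor of 2^24) spread over 1967-2004.
doublings = 24
years = 2004 - 1967                # 37 years
months_per_doubling = years * 12 / doublings
print(months_per_doubling)  # 18.5
```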
Despite this massive deflation in the cost of information technologies, demand has more than kept up. The number
of bits shipped has doubled every 1.1 years, faster than the halving time in cost per bit, which is 1.5 years.[30] As a
result, the semiconductor industry enjoyed 18 percent annual growth in total revenue from 1958 to 2002.[31]
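The 18 percent figure is roughly what the two quoted trends imply: revenue is bits shipped times cost per bit, so it grows at the difference of the two exponential rates. A sketch using only the numbers above:

```python
# Bits shipped double every 1.1 years; cost per bit halves every 1.5 years.
# Annual revenue growth is the product of the two annual factors.
bit_growth = 2 ** (1 / 1.1)        # ~1.88x more bits per year
cost_decline = 2 ** (-1 / 1.5)     # each bit costs ~0.63x as much per year
revenue_growth = bit_growth * cost_decline
print(f"{(revenue_growth - 1) * 100:.0f}%")  # ~18% annual revenue growth
```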
The entire
information-technology (IT) industry has grown from 4.2 percent of the gross domestic product in 1977 to 8.2 percent
in 1998.[32]
IT has become increasingly influential in all economic sectors. The share of value contributed by
information technology for most categories of products and services is rapidly increasing. Even common manufactured
products such as tables and chairs have an information content, represented by their computerized designs and the
programming of the inventory-procurement systems and automated-fabrication systems used in their assembly.