If a most efficient supercomputer works all day to compute a weather simulation problem, what is the minimum amount of energy that must be dissipated according to the laws of physics? The answer is actually very simple to calculate, since it is unrelated to the amount of computation. The answer is always equal to zero.

—Edward Fredkin, physicist[45]
We've already had five paradigms (electromechanical calculators, relay-based computing, vacuum tubes, discrete
transistors, and integrated circuits) that have provided exponential growth to the price-performance and capabilities of
computation. Each time a paradigm reached its limits, another paradigm took its place. We can already see the outlines
of the sixth paradigm, which will bring computing into the molecular third dimension. Because computation underlies the foundations of everything we care about, from the economy to human intellect and creativity, we might well
wonder: are there ultimate limits to the capacity of matter and energy to perform computation? If so, what are these
limits, and how long will it take to reach them?
Our human intelligence is based on computational processes that we are learning to understand. We will
ultimately multiply our intellectual powers by applying and extending the methods of human intelligence using the
vastly greater capacity of nonbiological computation. So to consider the ultimate limits of computation is really to ask:
what is the destiny of our civilization?
A common challenge to the ideas presented in this book is that these exponential trends must reach a limit, as
exponential trends commonly do. When a species happens upon a new habitat, as in the famous example of rabbits in
Australia, its numbers grow exponentially for a while. But it eventually reaches the limits of that environment's ability
to support it. Surely the processing of information must have similar constraints. It turns out that, yes, there are limits
to computation based on the laws of physics. But these still allow for a continuation of exponential growth until
nonbiological intelligence is trillions of trillions of times more powerful than all of human civilization today,
contemporary computers included.
A major factor in considering computational limits is the energy requirement. The energy required per MIPS for
computing devices has been falling exponentially, as shown in the following figure.[46]
However, we also know that the number of MIPS in computing devices has been growing exponentially. The
extent to which improvements in power usage have kept pace with processor speed depends on the extent to which we
use parallel processing. A larger number of less-powerful computers can inherently run cooler because the
computation is spread out over a larger area. Processor speed is related to voltage, and the power required is
proportional to the square of the voltage. So running a processor at a slower speed significantly reduces power
consumption. If we invest in more parallel processing rather than faster single processors, it is feasible for energy
consumption and heat dissipation to keep pace with the growing MIPS per dollar, as the figure "Reduction in Watts per MIPS" shows.
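To make the power argument concrete, here is a minimal sketch in Python, assuming the standard CMOS dynamic-power relation (power proportional to capacitance times voltage squared times clock frequency) and that clock frequency scales roughly with voltage; all constants are illustrative, not measurements of any real chip.

    # A back-of-the-envelope comparison, assuming the standard CMOS
    # dynamic-power model P = C * V^2 * f, with clock frequency scaling
    # roughly in proportion to voltage. All constants are illustrative.

    def dynamic_power(voltage, frequency, capacitance=1.0):
        """Dynamic power of one processor, P = C * V^2 * f (arbitrary units)."""
        return capacitance * voltage ** 2 * frequency

    # One fast processor at full voltage and full clock: throughput 1.0.
    one_fast = dynamic_power(voltage=1.0, frequency=1.0)

    # Two parallel processors at half speed: halving the clock lets the
    # voltage drop to roughly half as well. Combined throughput is still
    # 2 * 0.5 = 1.0, but total power falls sharply.
    two_slow = 2 * dynamic_power(voltage=0.5, frequency=0.5)

    print(f"one fast core:        {one_fast:.3f}")  # -> 1.000
    print(f"two half-speed cores: {two_slow:.3f}")  # -> 0.250

Under these assumptions the parallel design delivers the same aggregate throughput for a quarter of the power, which is exactly why spreading computation across more, slower processors keeps heat dissipation in check.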
This is essentially the same solution that biological evolution developed in the design of animal brains. Human
brains use about one hundred trillion computers (the interneuronal connections, where most of the processing takes
place). But these processors are very low in computational power and therefore run relatively cool.
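Rough arithmetic makes the point. The figures below are illustrative assumptions in the spirit of the book's estimates (about 10^14 interneuronal connections, each performing on the order of 200 calculations per second, within the brain's roughly 20-watt power budget), not measured values:

    # Illustrative arithmetic (all three figures are assumptions, chosen
    # to be consistent with estimates used elsewhere in the book):
    connections = 1e14   # interneuronal connections ("computers")
    rate = 200           # calculations per second per connection
    brain_watts = 20     # approximate total power budget of the brain

    total_cps = connections * rate
    watts_each = brain_watts / connections

    print(f"aggregate rate:       {total_cps:.0e} calculations/second")  # ~2e16
    print(f"power per connection: {watts_each:.0e} watts")               # ~2e-13

Each individual processor dissipates a vanishingly small amount of power; the enormous aggregate capacity comes from parallelism, not from speed.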
Until just recently Intel emphasized the development of faster and faster single-chip processors, which have been
running at increasingly high temperatures. Intel is gradually changing its strategy toward parallelization by putting
multiple processors on a single chip. We will see chip technology move in this direction as a way of keeping power
requirements and heat dissipation in check.[47]