Sunday, July 1, 2007

COMPUTATION

The second side of the organizational square is computation. For most of human
history, computation has been done within the speed and capacity limits of the
individual human brain. Computation acceleration beyond human input was not
possible. While the knowledge base of the fundamentals of mathematics was advancing
throughout the ages, means of providing computational leverage for the human
mind lagged. For those interested in a detailed history of computation, Stephen
White provides a most interesting account.

The early abacus served as humankind’s only computational lever for five millennia.
It magnified human capacity by counting, carrying, and serving as a memory
device. The discovery of the logarithm made the slide rule possible in the first quarter of the seventeenth century (William Oughtred, c. 1625). The next major advance was the mechanical calculator, which used gears and mechanisms to perform addition, subtraction, multiplication, and division (Charles Xavier Thomas de Colmar's Arithmometer, patented in 1820 and later the first mass-produced calculator). Although human input was still required
to initiate each calculation, some speed advance over unaided human computation
was achieved.

Charles Babbage (1791 to 1871) provided a big step forward when he conceived the fundamentals of computing machines: machines that could be programmed to execute many computations without further human intervention. Babbage designed numerous mechanical computers with innovations including memory, punch-card programming, and conditional jumps. His mechanical designs provided the conceptual foundation for the electronic computers that followed a century later.

Observing the computational speed of these methods throughout history, one
realizes that the computational lever for the mind was very limited until systems
were developed that used electricity as the basis of operation. One ancillary
techonomic observation is that mechanical development of a process often precedes
electronic development — the physical implementation precedes the virtual. In a
way, things are understood in the tangible world before they are implemented in the
electronic world. The invention of the triode vacuum tube by Lee de Forest (1906) was the gateway to electronic computing. Once mathematical operations could be represented electrically, the speed of computation increased by orders of magnitude, and eventually the cost per computation decreased even more dramatically.

One of the first digital computers, the ENIAC (Electronic Numerical Integrator And Computer), could perform roughly 5,000 simple additions or subtractions per second.

The ENIAC was developed during World War II, representing another example of
military demands driving rapid technological development due to the absence of economic constraints. Early uses of ENIAC included calculating projectile trajectories and supporting calculations for the hydrogen bomb. While ENIAC was unreliable
due to the thousands of vacuum tubes required for it to operate, it ushered in
the era of electronic computation, and the lever for the mind was unleashed. By the
way, ever wonder where the word “debug” came from? To modern computer programmers,
it means fixing a computer problem by correcting errors in computer
code. It originated with the early, mammoth-sized, relay-driven computers. Moths
would get into the circuitry and cause malfunctions, so computers had to be cleaned
out to function properly: debugged.

The speed and miniaturization of computation have been on a rapid rise ever since
the days of these early computers.

Physically, ENIAC was a monster: it contained 17,468 vacuum tubes, 7,200 crystal diodes, 1,500 relays, 70,000 resistors, 10,000 capacitors, and around 5 million hand-soldered joints. It weighed 30 short tons (27 t), was roughly 2.4 m by 0.9 m by 30 m, took up 167 m² of floor space, and consumed 160 kW of power. As of 2004, a chip of silicon measuring 0.02 inches (0.5 mm) square holds the same capacity as the ENIAC.
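
To put this miniaturization into perspective, here is a small illustrative calculation in Python, a sketch that uses only the figures quoted above (ENIAC's 167 m² footprint versus a 0.5 mm square chip of equivalent capacity):

```python
# Illustrative comparison of ENIAC's floor area with the 2004 silicon chip
# of equivalent capacity, using only the figures quoted in the text above.

eniac_area_m2 = 167.0            # floor area occupied by ENIAC
chip_side_m = 0.5 / 1000.0       # 0.5 mm chip side, converted to meters
chip_area_m2 = chip_side_m ** 2  # area of the 2004 chip

reduction_factor = eniac_area_m2 / chip_area_m2
print(f"Footprint reduction factor: {reduction_factor:.1e}")  # about 7 x 10^8
```

The same capacity that once demanded a room now fits in an area roughly 700 million times smaller.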

The development sequence and interactions are worthy of note. For eons,
advances in computational capabilities were nonexistent, but the body of knowledge
of mathematics and processes continued growing. The harnessing of electricity led
to many discoveries and products, including the vacuum tube in the early twentieth
century. Advances in vacuum tube and computer technology were slow at first; the
boost came via a wartime effort requiring significant computation. The war prompted
focused research and development in practical and applied electronic computation.
Analog computers, which required hardwired programming for each problem, first appeared at the turn of the twentieth century. In response to military needs, a new category of digital computers emerged, exemplified by ENIAC, and computational capabilities increased by orders of magnitude.

Next, the slow, steady road to commercial applications began. Commercial
requirements demanded higher reliability and reduced cost for general applications.
At mid-century another military race began: the space race. The need for miniaturized, lightweight, and reliable computing for space vehicle control fueled further advancement. Related technologies emerged (the transistor and the integrated circuit) that directly addressed size (miniaturization), operating environment (reduced power consumption and heat load, increased component reliability), and programming flexibility (high-level software languages). With each related technology
improvement, more commercial applications became economically viable.

In a nonintuitive twist, the Soviet Union had superior rockets, which reduced its need to limit the weight and size of the electronic control payload. As a result, the Soviets had less motivation to miniaturize electronic circuitry, a need that drove the U.S. program and subsequently resulted in the birth of the U.S. microelectronics industry.
These advances found their way into the commercial mass market a decade later
with the introduction of the personal computer.

The journey of computational progress accelerated, as the technology was now
in the hands of the masses, creating economic opportunities for innovation in both hardware and software. With more commercially viable applications, economies
of scale reduced hardware manufacturing costs dramatically, and a growing body of
knowledge related to software solutions became available. The geometric expansion
of worldwide computational capacity, initiated by the military-backed development
of ENIAC, continues into the twenty-first century with little sign of slowing.

Before the dawn of electronic computing in the 1940s, the speed of human input
constrained the computational performance of the limited methods available. After
the 1940s, the invention of the transistor and the integrated circuit drove ever-increasing computational performance. Even in the absence of an economic
component, one can easily observe the significant increase in computing power
afforded by the emerging technologies.

A second metric, combining the impact of improving computational speed with
the cost reductions that have accompanied these advances, provides a techonomic
perspective on computational advance. While the speed of various technologies has
increased exponentially over the past 50 years, implementation costs have simultaneously
decreased. This has been due to the economics of implementation (aided
by mass production), reduced material requirements, reduced energy requirements,
and improved reliability. Combining the advance in technological performance with
reduced implementation cost yields a complete techonomic metric of the advance
of computational capabilities available for commercial applications. This metric is
closely related to Moore's Law, the observation that transistor density doubles roughly every two years, which is often restated as the cost of equivalent computing performance halving every 18 to 24 months.
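
As a rough numerical illustration of this metric, the short Python sketch below compounds an assumed halving period; the normalized starting cost, the 18-month interval, and the time spans are illustrative assumptions rather than figures from the text:

```python
# Sketch of how cost per unit of computing performance falls over time
# under an assumed fixed halving period (the 18-month figure cited above).
# The starting cost is normalized to 1.0; all inputs here are illustrative.

def cost_per_performance(initial_cost: float, years: float,
                         halving_months: float = 18.0) -> float:
    """Cost of equivalent computing performance after `years`,
    assuming it halves every `halving_months` months."""
    halvings = (years * 12.0) / halving_months
    return initial_cost * (0.5 ** halvings)

for years in (5, 10, 25, 50):
    cost = cost_per_performance(1.0, years)
    print(f"After {years:2d} years, cost falls to {cost:.2e} of the original")
```

Over 50 years this compounds to a reduction of roughly ten orders of magnitude, consistent with the dramatic cost decreases described above.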
