Yesterday's klutzy machines have become today's micromarvels
The first electronic digital computer in the U.S., unveiled at the University of Pennsylvania in 1946, was a collection of 18,000 vacuum tubes, 70,000 resistors, 10,000 capacitors and 6,000 switches, and occupied the space of a two-car garage. Yet ENIAC (for Electronic Numerical Integrator and Calculator) was, in retrospect, a dimwit. When it worked, it did so only in short bursts because its tubes kept burning out. Built to calculate artillery firing tables, the half-million-dollar ENIAC could perform 5,000 additions or subtractions per second. Today almost any home computer, costing only a few hundred dollars, can outperform poor old ENIAC as a "number cruncher."
Computer designers have obviously come a long way. But behind their spectacular achievements is a colorful history, one involving so many characters, so many innovations and such wrenching efforts that no single person or even country can claim authorship of the computer.
In a sense, humans have been computing (manipulating and comparing numbers, or anything that they may represent) since they first learned how to count, probably with pebbles (the word calculus stems from the Latin for stone). At least 2,500 years ago, the Chinese, among others, discovered that they could handle numbers more easily by sliding little beads on strings. Their invention, the abacus, is still in use.
In 1642, perhaps pained by the long hours his tax-collector father spent doing sums, a 19-year-old French prodigy named Blaise Pascal made an automatic device that could add or subtract with the turning of little wheels. But the clerks who spent their lives doing calculations in those days viewed Pascal's gadget as a job threat, and it never caught on. A short time later, the German mathematician Gottfried Wilhelm Leibniz added the power of multiplication and division. Said he: "It is unworthy of excellent men to lose hours like slaves in the labor of calculations..."
But such mechanical contrivances were no more than calculators. They could only do arithmetic, and very clumsily at that. The first man to conceptualize a true computer, one that would be able to do math and much more, was an irascible 19th-century English mathematician named Charles Babbage. Incensed by the inaccuracies he found in the mathematical tables of his time, the ingenious Babbage (father of the speedometer, the cowcatcher for locomotives and the first reliable life-expectancy tables) turned his fertile brain to creating an automaton that could rapidly and accurately calculate long tables of functions like logarithms. The result was an intricate system of gears and cogs called the Difference Engine.
