Such computational prowess seems dazzlingly unreal, and reinforces the popular image of computers as electronic brains with infinite intelligence. Yet most scientists regard computers, including those on chips, as dumb brutes. "They do only what they are told," insists Louis Robinson, director of scientific computing at IBM's data-processing division, "and not an iota more." What all computers, large and small, do extremely well is "number crunching"; they can perform prodigious feats of arithmetic, handling millions of numbers a second. Equally important, they can store, compare and arrange data at blinding speed. That combination lets the computer handle a broad range of problems, from designing a complex new telescopic lens to sending TV images across the solar system.
Humans have been calculating since the dawn of history and before. Stone Age man, making scratches on animal bones, tried to keep track of the phases of the moon. Other prehistoric people reckoned with pebbles. Indeed, the Latin word calculus means a stone used for counting. Perhaps the most enduring calculating device is the abacus, which was used in China as early as the 6th century B.C. But the first really serious efforts to make mechanical calculators, in which some of the tallying was done automatically, did not come until the 17th century.
By then numbers had become especially important because of great advances in astronomy, navigation and other scientific disciplines. More than ever before, it was necessary to rely on long tables of such elementary mathematical functions as logarithms, sines and cosines. Yet compiling these essential tools often required years of slavish toil.
Still, mathematical illiteracy continued to plague Europe.
In the early 19th century, Charles Babbage, an idiosyncratic mathematician and inventor of the railroad cowcatcher and the first tachometer, was becoming increasingly incensed by the errors he found in insurance records, logarithm tables and other data. His fetish for accuracy was so great, in fact, that after reading Lord Tennyson's noted line "Every moment dies a man/ Every moment one is born," he wrote the poet: "It must be manifest that if this were true, the population of the world would be at a standstill." Babbage's recommended change: "Every moment dies a man/ Every moment 1 1/16 is born."
In 1822, Babbage began work on a machine, called the difference engine, that could help solve polynomial equations to six places. The Chancellor of the Exchequer was so impressed by the machine's potential for compiling accurate navigational and artillery tables that he subsidized construction of a still larger difference engine that could compute to 20 places. Unfortunately, the metalworkers of Babbage's day were not up to making the precision parts required, and the machine was never completed. But Babbage had a bolder dream: he wanted to build a machine, which he dubbed the analytical engine, that could perform any arithmetical and logical operations asked of it. In effect, it would have been programmable, that is, a true computer instead of a mere calculator.
