The novel idea of using strings of 1s and 0s to solve complex problems traces back to another gifted Englishman, George Boole. A contemporary of Babbage's, he developed a system of mathematical logic that allows problems to be solved by reducing them to a series of questions requiring only an answer of true or false. Just three logical functions, called AND, OR and NOT, are needed to process Boole's "trues" and "falses," or 1s and 0s. In computers these operations are performed by simple combinations of on-off switches, called logic gates. They pass on information, that is, pulses of electricity, only according to the Boolean rules built into them. Even a small home computer has thousands of such gates, each opening and closing more than a million times a second, sending bits and bytes of information coursing through the circuitry at nearly the speed of light (electricity travels about a foot in a billionth of a second).
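The claim that AND, OR and NOT suffice can be sketched in a few lines of code. This is my own illustration, not something from the article: an XOR gate, and from it a half-adder that adds two bits, are composed entirely from the three primitives.

```python
# The three Boolean primitives the article names, acting on bits (0 or 1).
def AND(a, b): return a & b
def OR(a, b): return a | b
def NOT(a): return 1 - a

# XOR built only from AND, OR and NOT: true when exactly one input is 1.
def XOR(a, b):
    return AND(OR(a, b), NOT(AND(a, b)))

# A half-adder: adds two bits, yielding a sum bit and a carry bit.
# Chains of such circuits are how computers add whole numbers.
def half_adder(a, b):
    return XOR(a, b), AND(a, b)

for a in (0, 1):
    for b in (0, 1):
        s, c = half_adder(a, b)
        print(f"{a} + {b} -> carry {c}, sum {s}")
```

Everything a computer does with numbers ultimately reduces to cascades of gates like these.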
The earliest digital computers were much more plodding. They relied on electromechanical on-off switches called relays, which physically opened and closed like the old Morse code keys. Physicist-Author Jeremy Bernstein recalls that Mark I, IBM's first large computer, assembled at Harvard during World War II, sounded "like a roomful of ladies knitting." It could multiply two 23-digit numbers in about five seconds. Even some hand-held calculators can now do the same job in a fraction of the time.
ENIAC vastly increased computer speed by using vacuum tubes rather than electromechanical relays as its switches, but it still had a major shortcoming. To perform different operations, it had to be manually rewired, like an old wire-and-plug telephone switchboard, a task that could take several days. The Hungarian-born mathematical genius John von Neumann saw a solution. He suggested storing the machine's operating instructions, or program, in the same memory as the data to be processed, and writing them in the same binary language. The computer could thus be programmed through the same input devices used to feed in data, such as a keyboard or a reel of tape. The first commercial computer to have such capability was Sperry Rand's UNIVAC I, which appeared in 1951 and, much to IBM's chagrin at being beaten, was promptly delivered to the Census Bureau.
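The stored-program idea can be illustrated with a miniature machine of my own invention (the opcodes and memory layout here are hypothetical, not any historical design): instructions and data sit side by side in one memory array, so a program is loaded through exactly the same channel as the numbers it works on.

```python
# A hypothetical stored-program machine: one memory array holds both
# the program (opcode/argument pairs) and the data it operates on.
def run(memory):
    acc, pc = 0, 0                 # accumulator and program counter
    while True:
        op, arg = memory[pc], memory[pc + 1]
        pc += 2
        if op == 0:                # HALT: return the accumulator
            return acc
        elif op == 1:              # LOAD: acc = value at address arg
            acc = memory[arg]
        elif op == 2:              # ADD: acc += value at address arg
            acc += memory[arg]
        elif op == 3:              # STORE: write acc to address arg
            memory[arg] = acc

# Program (addresses 0-7): load mem[8], add mem[9], store to mem[10], halt.
# Data (addresses 8-10): the two operands and a slot for the result.
mem = [1, 8, 2, 9, 3, 10, 0, 0, 20, 22, 0]
print(run(mem), mem[10])  # both are 42
```

Changing the program means changing memory cells, not rewiring the machine, which is exactly the advance von Neumann proposed over ENIAC's plugboards.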
Yet even while journalists were hailing the new "electronic superbrains," the machines were already becoming obsolete. In 1947 three scientists at Bell Labs invented a tiny, deceptively simple device called the transistor (short for transfer resistance). It was nothing more than a sandwich of semiconducting materials, mostly crystals of germanium; silicon became popular later. The crystals were arranged so that a tiny current entering one part of the sandwich could control a larger current in another. Hence, they could be used as switches, controlling the ebb and flow of electrons. Even the earliest transistors were much smaller than vacuum tubes, worked faster and had fewer failures. They gave off so little heat that they could be packed closely together. Above all, they were quite cheap to make.
