Big Dimwits and Little Geniuses

Within a few years, the wizards at Bell Labs built the first fully transistorized (or solid-state) computer, a machine called Leprechaun. But by then Ma Bell, eager to avoid the wrath of the Justice Department's trustbusters, had sold licenses for only $25,000 to anyone who wanted to make transistors, and the scramble was on to profit from them. William Shockley, one of the transistor's three inventors, returned to his California home town, Palo Alto, to form his own company in the heart of what would become known as Silicon Valley. In Dallas, a young, aggressive maker of exploration gear for the oil industry, Texas Instruments, had already hired away another Bell Labs star, Gordon Teal, and was churning out the little gadgets. So were old-line tube makers such as General Electric, RCA, Sylvania and Raytheon. Much of their production went to the Pentagon, which found transistors ideal for a special computing task: the guidance of missiles.

The first computers, even those built with transistors, were put together like early radios, with tangles of wires connecting each component. But soon electronics manufacturers realized that the wiring could be "printed" directly on a board, eliminating much of the hand-wiring. Then came another quantum leap into the miniworld. In the late 1950s, Texas Instruments' Jack Kilby and Fairchild Semiconductor's Robert Noyce (one of the eight defectors from Shockley's firm whom Shockley scathingly called the "traitorous eight") had the same brainstorm: almost simultaneously, they realized that any number of transistors could be etched directly on a single piece of silicon, along with the connections between them. Such integrated circuits (ICs) contained entire sections of a computer: a logic circuit, for example, or a memory register. The microchip was born.

Designers kept cramming in more and more transistors, and the chips began incorporating ever more circuits; today, hundreds of thousands of transistors can be etched on a tiny silicon chip. But even such so-called large-scale integration had a drawback. With the circuits rigidly fixed in the silicon, the chips performed only the duties for which they were designed; they were "hardwired," as engineers say. That changed dramatically in 1971, when Intel Corp., a Silicon Valley company founded by Noyce after yet another "defection," unveiled the microprocessor. Designed by a young Intel engineer named Ted Hoff, it contained the entire central processing unit (CPU) of a simple computer on one chip. It was Babbage's mighty mill in microcosm.

With the microprocessor, a single chip could be programmed to do any number of tasks, from running a watch to steering a spacecraft. It could also serve as the soul of a new machine: the personal computer. By 1975 the first of the new breed of computers had appeared, a hobbyist machine called the Altair 8800 (cost: $395 in kit form, $621 assembled). The Altair soon vanished from the marketplace. But already there were other young and imaginative tinkerers out in Silicon Valley getting ready to produce personal computers, including one bearing an odd symbol: an apple with a bite taken out of it. Suddenly, the future was now. —By Frederic Golden
