The Intel chip and one developed at about the same time at Texas Instruments (the question of priority is still widely debated in the industry) were the natural culmination of a revolution in electronics that began in 1948 with Bell Telephone Laboratories' announcement of the transistor. Small, extremely reliable, and capable of operating with only a fraction of the electricity needed by the vacuum tube, the "solid-state" device proved ideal for making not only inexpensive portable radios and tape recorders but computers as well. Indeed, without the transistor, the computer might never have advanced much beyond the bulky and fickle ENIAC, which was burdened with thousands of large vacuum tubes that consumed great amounts of power, generated tremendous quantities of heat, and frequently burned out. In an industry striving for miniaturization, the transistor, too, soon began to shrink. By 1960, engineers had devised photolithographic and other processes (see box) that enabled them to crowd many transistors, as well as other electronic components, onto a tiny silicon square.
The advent of such integrated circuits (ICs) drastically reduced the size, cost and electrical drain of any equipment in which they were used. One immediate byproduct: a new generation of small, desk-size minicomputers as well as larger, high-speed machines. Their speed hinged on the rate at which electric current travels through wire: about one foot per billionth of a second, close to the velocity of light. Even so, an electrical pulse required a significant fraction of a second to move through the miles of wiring in the early, large computers. Now even circuitous routes through IC chips could be measured in inches, and traversed by signals in an electronic blink. Computers with ICs not only were faster but were in a sense much smarter. Crammed with more memory and logic circuitry, they could take on far more difficult workloads.
Like the tracks in a railroad yard, ICs were really complex switching systems, shuttling electrical pulses hither and yon at the computer's bidding. Still, ICs could not function by themselves; other electronic parts had to keep the switches opening and closing in proper order. Then came the next quantum leap in miniaturization: the development in the late 1960s of large-scale integration (LSI). Unlike their single-circuit predecessors, which were designed to do only one specific job, LSIs integrated a number of circuits with separate functions on individual chips. These in turn were soldered together on circuit boards. Out of such modules, entire computers could be assembled like Erector sets.
But the new LSIs had an innate drawback. Because they were made in rigid patterns and served only particular purposes (they were, as engineers say, "hard-wired"), they lacked flexibility. That limitation was ingeniously solved by the work of Hoff and others on microprogramming: storing control instructions on a memory-like chip. For the first time, computer designers could produce circuitry usable for any number of purposes. In theory, the same basic chip could do everything from guiding a missile to switching on a roast.
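The idea can be illustrated loosely in software: a fixed interpreter loop stands in for the general-purpose circuitry, and a stored list of instructions stands in for the control instructions kept on a memory-like chip. This is a conceptual sketch only; the operation names and the single-accumulator layout below are invented for illustration and do not describe any actual chip.

```python
# Conceptual sketch of stored control instructions driving fixed hardware.
# The run() loop never changes; only the program stored "in memory" does.

def run(program, data):
    """Interpret a stored instruction list against a fixed set of primitive operations."""
    acc = 0                      # a single accumulator "register"
    output = []
    for op, arg in program:
        if op == "LOAD":         # copy a data value into the accumulator
            acc = data[arg]
        elif op == "ADD":        # add a data value to the accumulator
            acc += data[arg]
        elif op == "MUL":        # multiply the accumulator by a constant
            acc *= arg
        elif op == "OUT":        # emit the accumulator's current value
            output.append(acc)
    return output

# The same "circuitry" (the run() loop) does two different jobs,
# depending only on which instruction sequence has been stored.
sum_two   = [("LOAD", 0), ("ADD", 1), ("OUT", None)]
double_it = [("LOAD", 0), ("MUL", 2), ("OUT", None)]

print(run(sum_two,   [3, 4]))    # [7]
print(run(double_it, [3, 4]))    # [6]
```

Changing the job means changing the stored instructions, not rewiring the circuit, which is the flexibility the hard-wired LSIs lacked.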
