Babbage's dream of a true computer, one that could solve any number of problems, was not realized until the 1930s. In Hitler's Germany, an obscure young engineer named Konrad Zuse, using the German equivalent of an Erector set for parts and his parents' living room as his workshop, built a simple computer that could perform a variety of tasks; its descendants calculated wing designs for the German aircraft industry during World War II. At Bell Telephone Laboratories in the U.S., the research arm of AT&T, a mathematician named George Stibitz built a similar device in 1939 and even showed how it could do calculations over telephone wires. This was the first display of remote data processing. During the war a British group, putting into practice some of the ideas of their brilliant countryman Alan Turing, built a computer called Colossus 1 that helped break German military codes. The British, German and U.S. machines all shared a common characteristic: they were the first computers to use the binary system of numbers, the standard internal language of today's digital computers.
In this they departed from Babbage's "engine." The engine was designed to count by tens, in the decimal system. Employing ten digits (0 to 9), the decimal system probably dates from the time when humans realized they had ten fingers and ten toes. (Digit comes from the Latin for finger or toe.) But there are other ways of counting as well: by twelves, say, as in the hours of the day or months of the year (the duodecimal system). In the binary system, only two digits are used, 0 and 1. To create a 2, you simply move a column to the left, just as you do to create a 10 in the decimal system. Thus if zero is represented by 0 and one by 1, then two is 10, three 11, four 100, five 101, six 110, seven 111, eight 1000, and so forth.
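For readers inclined to trace the pattern themselves, here is a minimal sketch in Python, a modern programming language used here purely for illustration, that prints the numbers zero through eight in both systems:

```python
# Print the decimal numbers 0 through 8 alongside their binary equivalents,
# following the counting scheme described above.

def to_binary(n):
    """Write a non-negative integer as a string of binary digits."""
    if n == 0:
        return "0"
    digits = []
    while n > 0:
        digits.append(str(n % 2))  # the remainder, 0 or 1, becomes the next digit
        n //= 2                    # move one binary column to the right
    return "".join(reversed(digits))

for number in range(9):
    print(number, "->", to_binary(number))
# 0 -> 0, 1 -> 1, 2 -> 10, 3 -> 11, 4 -> 100, ..., 8 -> 1000
```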
The binary system is enormously cumbersome. Although any number can be represented, it requires exasperatingly long strings of 0s and 1s. But putting such a system to work is a snap for digital computers. At their most fundamental level, computers are little more than a complex maze of on-off switches that reduce all information within the machine to one of two states: yes (1) or no (0), represented by the presence of an electrical charge at a particular site or the absence of one. Accordingly, in a row of three switches, if the first two are on (11) and the third is off (0), the row represents the number six (110).
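The switch analogy can be made concrete in a few lines, again a Python sketch rather than anything a real machine runs: a row of on-off switches, read left to right, combines into a single number.

```python
# Interpret a row of on-off switches as a binary number.
# True stands for a switch that is on (1); False for one that is off (0).

def switches_to_number(switches):
    """Combine a row of switches, leftmost first, into a decimal value."""
    value = 0
    for on in switches:
        value = value * 2 + (1 if on else 0)  # each new switch shifts the columns left
    return value

print(switches_to_number([True, True, False]))  # on, on, off = 110 in binary = 6
```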
In the world of digital computers, each of these pieces of information is called a bit (for binary digit). In most personal computers, bits are shuttled about within the machine eight at a time, although some faster 16-bit machines are already on the small-computer market and even speedier 32-bit machines are in the offing. Clusters of eight bits, forming the equivalent of a single letter in ordinary language, are called bytes. A typical personal computer offers users anywhere from about 16,000 bytes of memory (16K) to 64,000 (64K). But that figure is climbing fast. A few years ago, the standard memory chip, a quarter-inch square of silicon, was 16K. Today it is rapidly becoming 64K, and the industry is already talking of mass-producing 256K chips.
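To make the arithmetic of bits, bytes and "K" concrete, here is one last Python sketch (the letter code shown is the modern ASCII convention, assumed here for illustration): a single letter occupies one eight-bit byte, and a "K" of memory is 1,024 bytes, which is why a 16K machine holds roughly 16,000 of them.

```python
# A byte is eight bits -- enough to hold one letter of ordinary text.
letter = "A"
code = ord(letter)          # the letter's numeric code (65 for "A" in ASCII)
bits = format(code, "08b")  # the same value written as eight binary digits
print(letter, "->", code, "->", bits)  # A -> 65 -> 01000001

# Memory sizes quoted in "K" are multiples of 1,024 bytes.
for k in (16, 64):
    print(f"{k}K = {k * 1024:,} bytes")  # 16K = 16,384 bytes; 64K = 65,536 bytes
```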
