THE MAN WHO INVENTED THE WEB

    You might think that someone who invented a giant electronic brain for Planet Earth would have a pretty impressive brain of his own. And Tim Berners-Lee, 41, the creator of the World Wide Web, no doubt does. But his brain also has one shortcoming, and, by his own account, this neural glitch may have been the key to the Web's inception.

    Berners-Lee isn't good at "random connections," he says. "I'm certainly terrible at names and faces." (No kidding. He asked me my name twice during our first two hours of conversation.) Back in 1980 he wrote some software to help keep track of such links--"a memory substitute." The rest is history. This prosthetic extension of his mind took a vast evolutionary leap a decade later, and then grew to encompass the world. It is the reason that today you can be online looking at a photo, then mouse-click on the photographer's name to learn about her, then click on "Nikon" to see the camera she uses--traveling from computers at one end of the world to those at the other with no sense of motion.

    Berners-Lee is the unsung--or at least undersung--hero of the information age. Even by some of the less breathless accounts, the World Wide Web could prove as important as the printing press. That would make Berners-Lee comparable to, well, Gutenberg, more or less. Yet so far, most of the wealth and fame emanating from the Web have gone to people other than him. Marc Andreessen, co-founder of Netscape, drives a Mercedes-Benz and has graced the cover of several major magazines. Berners-Lee has graced the cover of none, and he drives a 13-year-old Volkswagen Rabbit. He has a smallish, barren office at M.I.T., where his nonprofit group, the World Wide Web Consortium, helps set technical standards for the Web, guarding its coherence against the potentially deranging forces of the market.

    Is Berners-Lee's Volkswagen poisoning his brain with carbon monoxide? He wonders about this by way of apologizing for the diffuseness of his answers. "I'm not good at sound bites," he observes. True, alas. But what he lacks in snappiness he makes up in peppiness. Spouting acronyms while standing at a blackboard, he approaches the energy level of Robin Williams. He is British (an Oxford physics major), but to watch only his hands as he talks, you'd guess Italian. Five, six years ago, during his "evangelizing" phase, this relentless enthusiasm was what pushed the Web beyond critical mass.

    The breathtaking growth of the Web has been "an incredibly good feeling," he says, and is "a lesson for all dreamers ... that you can have a dream and it can come true." But Berners-Lee's story has more facets than simple triumph. It is in part a story about the road not taken--in this case the road to riches, which in 1992 he pondered taking, and which he still speaks of with seemingly mixed emotions. His is also a story about the difficulty of controlling our progeny, about the risky business of creating momentous things, unleashing epic social forces. For Berners-Lee isn't altogether happy with how the World Wide Web has turned out.

    He says he'd give the Web a B-plus, even an A-minus, that on balance it is a force for good. Yet an "accident of fate" has compromised its goodness. And that accident is intertwined with--perhaps, perversely, even caused by--his decision back in 1992 to take the road less traveled. The question that fascinates people who have heard of Berners-Lee--Why isn't he rich?--may turn out to have the same answer as the question that fascinates him: Why isn't the World Wide Web better than it is?

    Berners-Lee comes by his vocation naturally. His parents helped design the world's first commercially available computer, the Ferranti Mark I. "The family level of excitement about mathematics was high," he says, recalling the breakfast-table teasing of his younger brother, then in primary school, who was having trouble fathoming the square root of negative four.

    In adolescence Berners-Lee read science fiction, including Arthur C. Clarke's short story Dial F for Frankenstein. It is, he recalls, about "crossing the critical threshold of number of neurons," about "the point where enough computers get connected together" that the whole system "started to breathe, think, react autonomously." Could the World Wide Web actually realize Clarke's prophecy? No--and yes. Berners-Lee warns against thinking of the Web as truly alive, as a literal global brain, but he does expect it to evince "emergent properties" that will transform society. Such as? Well, if he could tell you, they wouldn't be emergent, would they?

    But making them as benign as possible is what gives his current job meaning. Even if the Web's most epic effects can't be anticipated or controlled, maybe they can be given some minimal degree of order. As director of the Web consortium, he brings together its members--Microsoft, Netscape, Sun, Apple, IBM and 155 others--and tries to broker agreement on technical standards even as the software underlying the Web rapidly evolves. His nightmare is a Web that "becomes more than one Web, so that you need 16 different browsers, depending on what you're looking at." He especially loathes those BEST VIEWED WITH ACME BROWSER signs on Websites.

    Most of the consortium's achievements to date are, if important, arcane. (You probably don't care that HTML 3.2 is a widely respected standard, even though that fact greatly eases your travel on the Web.) But some are more high profile. PICS, the Platform for Internet Content Selection, is a proposed standard that would let parents filter out offending Websites. It's a kind of V-chip, except with no government involvement; you subscribe to the rating service of your choice.
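
    The mechanics are simple enough to sketch. A page carries a machine-readable label naming a rating service and that service's scores for the page; a PICS-aware browser compares the scores against the settings the parents have chosen. The example below is only an illustration--the rating-service URL and the category names are invented--but it follows the general shape of the PICS proposal, which lets a label ride inside a page's HTML header:

        <html>
          <head>
            <!-- A hypothetical PICS label.  The rating service at the
                 quoted URL defines what "violence 0" and "language 1"
                 mean; a PICS-aware browser reads this label and decides
                 whether to display the page. -->
            <meta http-equiv="PICS-Label" content='(PICS-1.1
                  "http://ratings.example.org/v1.0"
                  labels for "http://example.org/page.html"
                  ratings (violence 0 language 1))'>
            <title>A labeled page</title>
          </head>
          <body>A page whose rating travels with it.</body>
        </html>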

    It is for "random reasons" that Berners-Lee is known as the inventor of the World Wide Web, he says. "I happened to be in the right place at the right time, and I happened to have the right combination of background." The place was CERN, the European physics laboratory that straddles the Swiss-French border, and he was there twice. The first time, in 1980, he had to master its labyrinthine information system in the course of a six-month consultancy. That was when he created his personal memory substitute, a program called Enquire. It allowed him to fill a document with words that, when clicked, would lead to other documents for elaboration.

    This is "hypertext," and it was hardly new. The idea was outlined by Vannevar Bush in 1945 and envisioned as an appendage to the brain. Berners-Lee explains the brainlike structure of hypertext by reference to his cup of coffee. "If instead of coffee I'd brought in lilac," he says, sitting in a conference room in M.I.T.'s computer-science lab, "you'd have a strong association between the laboratory for computer science and lilac. You could walk by a lilac bush and be brought back to the laboratory." My brain would do this transporting via interlinked neurons, and hypertext does it via interlinked documents. A single click from lilacs to lab.
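
    In today's terms, that lilac-to-lab jump is nothing more than an anchor in an HTML document--a sketch, with filenames and wording invented for illustration:

        <!-- lilac.html: clicking the word "laboratory" below follows the
             link, the hypertext equivalent of the neural association. -->
        <html>
          <body>
            <p>The scent of <b>lilac</b> brings to mind a certain
               <a href="lab.html">laboratory</a> for computer science.</p>
          </body>
        </html>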

    The trouble with most hypertext systems, as of the late 1980s, was that they were in one sense unlike the brain. They had a centralized database that kept track of all the links, so that if a document was deleted, all links to it from other documents could be erased; that way there would be no "dangling links"--no arrows pointing to nothing, no mouse-clicks leading nowhere. When Berners-Lee attended hypertext exhibits and asked designers whether they could make their systems worldwide, they often said no, citing this need for a clearinghouse. Finally, "I realized that this dangling-link thing may be a problem, but you have to accept it." You have to let somebody in Tokyo remove a document without informing everyone in the world whose documents may point to it. So those frustrating WEBSITE NOT FOUND messages delivered by today's browsers are the price we pay for having a Web in the first place.
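
    The consequence is visible in the markup itself. A Web link is a one-way pointer; nothing in the system verifies that its target still exists, and the failure surfaces only when someone clicks. (The address below uses a reserved example domain; it is illustrative, not real.)

        <!-- Perfectly valid HTML, whether or not the target survives.
             If old-page.html has been deleted, the click simply comes
             back "404 Not Found"--a dangling link, discovered late. -->
        <a href="http://example.org/old-page.html">a page that may be gone</a>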

    In between the birth of Enquire and the birth of the Web a decade later, the world had changed. The Internet, though still unknown to the public, was now firmly rooted. It was essentially a bare-bones infrastructure, a trellis of empty pipes. There were ways to retrieve data, but no really easy ways, and certainly nothing with the intuitive, neural structure of hypertext.

    To Berners-Lee, now back at CERN, one attraction of the Internet was that it encompassed not just CERN but CERN's far-flung collaborators at labs around the world. "In 1989, I thought, look, it would be so much easier if everybody asking me questions all the time could just read my database, and it would be so much nicer if I could find out what these guys are doing by just jumping into a similar database of information for them." In other words: give everyone the power to Enquire.

    Berners-Lee wrote a proposal to link CERN's resources by hypertext. He noted that in principle, these resources could be text, graphics, video, anything--a "hypermedia" system--and that eventually the system could go global. "This initial document didn't go down well," says Berners-Lee. But he persisted and won the indulgence of his boss, who okayed the purchase of a NeXT computer. Sitting on Berners-Lee's desk, it would become the first Web content "server," the first node in this global brain. In collaboration with colleagues, Berners-Lee developed the three technical keystones of the Web: the language for encoding documents (HTML, hypertext markup language); the protocol for transferring them between computers (HTTP, hypertext transfer protocol); and the www.whatever scheme for addressing them (URL, universal resource locator).
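
    The three fit together simply. A URL names a document (scheme, host, path); HTTP is the brief conversation a browser has with the server at that host to fetch it; HTML is what comes back. A minimal page, in the spirit of the very first one at info.cern.ch (the page body here is invented for illustration):

        <!-- Suppose this document is published at the historic URL
             http://info.cern.ch/hypertext/WWW/TheProject.html
             A browser retrieves it over HTTP (roughly, by sending
             "GET /hypertext/WWW/TheProject.html" to info.cern.ch)
             and renders the HTML below, including the clickable link. -->
        <html>
          <head><title>A minimal Web page</title></head>
          <body>
            <p>This page lives at one URL and
               <a href="http://info.cern.ch/">links</a> to another.</p>
          </body>
        </html>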

    Berners-Lee also wrote the first server software. And, contrary to the mythology surrounding Netscape, it was he, not Andreessen, who wrote the first "graphical user interface" Web browser. (Nor was Andreessen's browser the first to feature pictures; but it was the first to put pictures and text in the same window, a key innovation.)

    The idea of a global hypertext system had been championed since the 1960s by a visionary named Ted Nelson, who had pursued it as the "Xanadu" project. But Nelson wanted Xanadu to make a profit, and this vastly complicated the system, which never got off the ground. Berners-Lee, in contrast, persuaded CERN to let go of intellectual property to get the Web airborne. A no-frills browser was put in the public domain--downloadable to all comers, who could use it, love it, send it to friends and even improve on it.

    But what should he name his creation? Infomesh? No, that sounded like Infomess. The Information Mine? No, the acronym--TIM--would seem "egocentric." How about World Wide Web, or "www" for short? Hmm. He discussed it with his wife and colleagues and was informed that it was "really stupid," since "www" takes longer to say than "the World Wide Web."

    There was no single moment when the magnitude of Berners-Lee's creation hit home with thunderous force. But there have been moments of sudden reckoning. Two years ago, Berners-Lee still had pictures of his two young children on his Website. Then someone pointed out that there were enough data there for "strange people" to locate them, and that there were strange people on the Web. "You have to think like that more as the thing scales up," he acknowledges.

    The Web's growing lack of intimacy, in a way, symbolizes his one big disappointment with it. It was meant to be a social place. "The original goal was working together with others," he says. "The Web was supposed to be a creative tool, an expressive tool." He had imagined, say, a worker posting a memo on a Website accessible only to colleagues and having them react by embedding hyperlinks that led to their comments or to other relevant documents; or a bicoastal family similarly planning its annual reunion on the family site.

    But the Web turned out otherwise. Robert Cailliau of CERN, Berners-Lee's earliest collaborator on the project, describes the Web's prevailing top-down structure: "There's one point that puts the data out, and you're just a consumer." He finds this model--whose zenith is the coming wave of so-called push technology--an "absolute, utter disaster."

    Berners-Lee is more diplomatic. He has no gripe about commerce on the Web. (He buys CDs there.) And it was inevitable, in retrospect, that much Web activity would be, well, passive, with people absorbing content from high-volume sites. But he'd hoped the ratio of active to passive would be higher. It irks him that most Website-editing software is so cumbersome. Even the software that spares you the drudgery of actually looking at HTML code calls for some heavy lifting. You chisel your text in granite and then upload the slab, after which changes are difficult. "The Web," he complains, "is this thing where you click around to read," but if you want to write, "you have to go through this procedure." As Cailliau puts it, people have come to view the Web as "just another publishing medium. That was definitely not our intention." Berners-Lee, it turns out, is a kind of accidental Gutenberg.

    Berners-Lee considers the Web an example of how early, random forces are amplified through time. "It was an accident of fate that all the first [commercially successful] programs were browsers and not editors," he says. To see how different things might have been, you have to watch him gleefully wield his original browser--a browser and editor--at his desk. He's working on one document and--flash--in a few user-friendly keystrokes, it is linked to another document. One document can be on his computer "desktop"--for his eyes only--another can be accessible to his colleagues or his family, and another can be public. A seamless neural connection between his brain and the social brain.

    What if the "accident of fate" hadn't happened? What if Berners-Lee's browser-editor, or some further evolution of it, had become the Web tool that first reached the masses? The world almost found out. In 1992, two years after he created his browser, and before Andreessen's Mosaic browser existed, he and Cailliau consulted a lawyer about starting a company called Websoft (the name has since been taken). But the project held risks, and besides, Berners-Lee envisioned competitors springing up, creating incompatible browsers and balkanizing the Web. He thought it better to stay above the fray and try to bring technical harmony. "Tim's not after the money," says Cailliau in a tone of admiration perhaps tinged with regret. "He accepts a much wider range of hotel-room facilities than a CEO would."

    Berners-Lee admits to no regrets at having taken the high-minded, low-profit route. He says he is grateful that Andreessen co-authored a user-friendly browser and thus brought the Web to the public, even if in non-ideal form. Yet it can't have been easy watching Andreessen become the darling of the media after writing a third-generation browser that lacked basic editing capabilities. When I ask, "So there was a moment when you might have been Marc Andreessen?" Berners-Lee says, "I suppose so," and then smiles in a slightly stiff, even frosty, way. "The world is full of moments when one might be other things," he says. "One is the decisions one's taken." File closed.

    Berners-Lee is not easy to read, not prone to self-disclosure. Ask him if he's a sociable guy, and he tells you that on the Myers-Briggs test, "I rate pretty much in the middle on introversion vs. extroversion." Ask about his wife, and he'll tell you that she is an American he met in Europe while she was working for the World Health Organization, after which details get sketchy. "Work is work, and home is home," he says. And when you cross the border between them, his turbocharged gesticulation subsides.

    Other sources volunteer that Berners-Lee met his wife Nancy Carlson at an acting workshop; he turns out to have an artistic, piano-playing, festive side. "He is both British and the life of the party, and that's not a contradiction," says Rohit Khare, who recently left the Web consortium. "He can be the life of the party without making the party about him."

    Berners-Lee, standing at a blackboard, draws a graph, as he's prone to do. It arrays social groups by size. Families, workplace groups, schools, towns, companies, the nation, the planet. The Web could in theory make things work smoothly at all of these levels, as well as between them. That, indeed, was the original idea--an organic expanse of collaboration. But the Web can pull the other way. And Berners-Lee worries about whether it will "allow cranks and nut cases to find in the world 20 or 30 other cranks and nut cases who are absolutely convinced of the same things. Allow them to set up filters around themselves ... and develop a pothole of culture out of which they can't climb." Will we "end up with a world which is full of very, very disparate cultures which don't talk to each other?"

    Berners-Lee doesn't kid himself. Even if the Web had followed the technological lines he envisioned (which it is finally starting to do, as software evolves), it couldn't force people to nurture the global interest, or even their neighborhood's interest. Technology can't make us good. "At the end of the day, it's up to us: how we actually react, and how we teach our children, and the values we instill." He points back to the graph. "I believe we and our children should be active at all points along this."

    On Sundays Berners-Lee packs his family into the car and heads for a Unitarian-Universalist church. As a teenager he rejected the Anglican teachings of his parents; he can't bring himself to worship a particular prophet, a particular book. But "I do in fact believe that people's spiritual side is very important," and that it's "more than just biology."

    He likes the minimalist Unitarian dogma--theologically vague but believing in "the inherent dignity of people and in working together to achieve harmony and understanding." He can accept the notion of divinity so long as it is couched abstractly--as the "asymptote" of goodness that we strive toward--and doesn't involve "characters with beards." He hopes the Web will move the world closer to the divine asymptote.

    Berners-Lee is sitting at his desk, in front of bookshelves that are bare, devoid of books and other old-fashioned forms of data. A few sheet-metal bookends stand there with nothing to do, and nearby are pictures of his family. He concentrates, trying to put a finer point on his notion of divinity. A verse he's heard in church comes to mind, but all he can remember are fragments. "All souls may..."--his voice trails off--"...to seek the truth in love..." He is silent for a moment. His brain has failed him. Then inspiration strikes. "Maybe I can pick it up from the Web." In a single motion, he swivels his chair 180 degrees and makes fluid contact with his IBM ThinkPad.