In 1945, a young naval radar operator was waiting to be shipped home in the slack days after victory in WWII. He read a magazine article in his Philippine jungle base that proposed a new kind of information system, based on a fabulous desk called a Memex. Its two side-by-side microfilm readers and a host of hidden machinery would let you browse and create links between spools on any subject. The idea was to use the power of machines to make the whole of human knowledge accessible to all, and to let people add to and refine that knowledge in a virtuous circle.
Some years later that sailor, Douglas Engelbart, now a thoughtful and restless engineer at a Mountain View, California aerospace company, had an epiphany. Perhaps the new digital computer – not microfilm – could form the heart of a system like the one he’d read about. He imagined moving through information space the way a radar screen let you navigate through physical space.
The article he’d read was “As We May Think”, by leading U.S. scientist Vannevar Bush, a polymath who had built analog computers as well as played a major role in the development of the atomic bomb. Bush’s article mirrored some of the ideas of early 20th century pioneers including Paul Otlet and writer H.G. Wells about using the power of machines to assemble all knowledge in a kind of “world brain.” To Engelbart, the flexibility of the computer opened up a whole new set of possibilities. He decided that building such a system would be his life’s work.
But as I wrote in my piece on CHM Fellow Bob Taylor, the man who funded Douglas Engelbart through many of his most productive years, the idea of using digital computers to share information wasn’t exactly an easy sell in the 1950s and early ‘60s. Why would you waste these fabulously expensive data crunchers on something as quotidian as communication, in a world that already had telephones, printing, telegraphs, photography, TV, and radio? Just as wild was Engelbart’s idea that each person would sit in front of their own keyboard and costly radar-style video screen, interacting in real time with the computer and, through it, with each other.
Engelbart was not completely alone; a few others had begun to see the computer as the ultimate information machine. A brilliantly precocious college student named Ted Nelson came up with an independent concept of using associative links to navigate and organize all the world’s knowledge into a new kind of multimedia literature, and he coined the term hypertext.
Two other fellow travelers were in a position to offer Engelbart extraordinarily concrete help. At the Defense Department’s Advanced Research Projects Agency (ARPA), J.C.R. Licklider and his protégé Bob Taylor would later co-author a paper called “The Computer as a Communications Device.” With funding from Taylor first at NASA and then at ARPA, as well as from several others, Engelbart began to turn his vision into reality.
His goal was nothing less than to augment human intellect – to harness people’s ability to collaboratively solve the world’s important problems. He believed that properly trained, and with the right computer tools, we could raise our “collective IQ.” By putting knowledge at the fingertips of those who needed it, and letting them share their refinements and insights with others, he hoped to start a feed-forward process he called “bootstrapping.” Each improvement would help accelerate further advances in method, and so on. The concept of bootstrapping also went far beyond computers. Much of his work, and that of his group, was aimed at improving the organizational processes that can help lead to innovation.
This vision was in stark contrast to his Artificial Intelligence contemporaries, who wanted to create an alternate intelligence on computers rather than help turbo-charge the human kind. This early fork in the road still leaves its mark on computing today.
Engelbart started a laboratory at SRI (then known as the Stanford Research Institute). He grandly named it the Augmented Human Intellect Research Center (AHIRC), later shortened to the Augmentation Research Center (ARC). At its peak he would have 50 people working for him.
Doug Engelbart had a thoughtful, gentle manner, and a wonderfully open smile. When he met people he was charming and often funny. But he also gave the sense that he was considering things really, really deeply; that there was some serious purpose to everything he did. With prematurely grey hair and deep-set eyes framed by his large nose and prominent brows, he had the perfect presence for a visionary, or a guru.
As a manager he was often hands-off when it came to operational details, but concerned with communicating his vision so that others could help build it. He wasn’t terribly interested in technical details either. But he was brilliant at inspiring some of the best programmers and engineers of the time to come and work with him.
In a sense, Engelbart and his teams only built one big thing in his long career, the oNLine System (NLS), later repurposed as Augment. The mouse was merely Doug’s idea for a convenient input device which hardware wizard Bill English developed as one of several ergonomic accessories to that system; the chord keyset was another.
But if you tried to map the features of NLS to the computing world we know today, you would have to include pretty much all the core features of the Web, as well as word processing, spell checkers, online collaboration in forms like wikis and Google Docs, videoconferencing tools, personal information software for things like grocery lists, a full-featured email system, archiving software for saving documents with permanent identifiers, and some features of databases. Other features wouldn’t map at all, since they still haven’t reached wide use. These include documents that are editable by multiple applications rather than belonging to a single one, and a whole host of specialized hypertext features.
How could one system do so much? When Engelbart and his few peers imagined the future of computer communication in the early 1960s, the power of the machine was already clear to them, as was the fact that this power would get exponentially cheaper and faster (later memorialized as Moore’s Law).
The rest was gloriously wide open; a blank frontier in which to build not just castles but whole cities made of sand and imagination. There were no standards to support, no established players to consider in business strategies, no relevant conventional wisdom from advisors and investors. The result? By the mid 1960s Engelbart and his team had actually prototyped many of the core features of the computing world that would unfold over the next 40 years, plus others that may come.
Similarly, Ted Nelson independently conceived a number of these features plus his own vision of new kinds of electronic literature and multimedia, and built out some of them with help from his former schoolmate Andy van Dam. J.C.R. Licklider and Bob Taylor laid out quite different, but also sweeping visions of the future of computing.
By contrast, an example of an ambitious and lavishly funded computing project today might be launching a new social network within the ecosystem of established precedents.
Partly as a result of their lofty aspirations, Engelbart and his researchers forged close connections with many key figures of the 1960s counterculture. There was Stewart Brand of the Whole Earth Catalog, Ken Kesey and his Merry Pranksters, and many others. Like the ARPANET community that would follow, the ARC lab represented an uneasy intersection of two very different flavors of open-ended exploration; that of military-funded research, and the sometimes idealistic, sometimes just for kicks questing of an emerging caste of hippie hackers. This intersection is beautifully explored in John Markoff’s book What the Dormouse Said, and Fred Turner’s From Counterculture to Cyberculture.
In December 1968, Engelbart and his staff put on the so-called “mother of all demos” at the Fall Joint Computer Conference in San Francisco, showing off all the features they had developed over the years. For ninety minutes, the stunned audience of over 1,000 computer professionals witnessed many of the features of modern computing for the first time: live videoconferencing, document sharing, word processing, windows, and a strange pointing device jokingly referred to as ‘the mouse’. Elements on the screen linked to other elements using associative links – or ‘hypertext’.
In the late 1960s NLS was a timesharing program, meaning that it ran on a single computer shared by a community of perhaps a couple of hundred users who logged in from their own terminals. General purpose computer-to-computer networking promised to create far larger communities, but it was still in the process of being invented. Engelbart and his lab played a significant role in that process.
Bob Taylor of ARPA had asked Engelbart to have his ARC lab host one of three centers on the experimental ARPAnet; the Network Information Center, or NIC. This would act as a central library and card catalog for all of the information on the growing network, with the archives of the ARC group itself as a foundation. It would also host the central directory for all of the computers on the ARPAnet, a function which later evolved into the familiar Domain Name System (.com, .org, etc.).
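The directory function the NIC performed can be pictured as a single shared lookup table: before DNS, the NIC maintained a master host table (the HOSTS.TXT file) mapping every machine’s name to its numeric address, which each site periodically downloaded in full. The following is a minimal conceptual sketch in Python, not the actual NIC software, and the host names and addresses in it are made up for illustration:

```python
# Conceptual sketch of a pre-DNS central host table (entries are invented).
# Every site on the network worked from one shared name-to-address mapping,
# maintained in a single place -- the function the NIC performed.

HOSTS = {
    "SRI-ARC": "10.0.0.2",   # hypothetical entry for SRI's machine
    "UCLA-NMC": "10.0.0.1",  # hypothetical entry for another site
}

def resolve(name: str) -> str:
    """Look a host up in the central table, as pre-DNS software did."""
    try:
        return HOSTS[name]
    except KeyError:
        raise LookupError(f"unknown host: {name}")

print(resolve("SRI-ARC"))  # -> 10.0.0.2
```

DNS replaced this single flat file with a distributed, hierarchical database (.com, .org, and so on), precisely because one central table could not keep scaling with the network.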
Engelbart enthusiastically agreed; he saw the chance to expand the reach of NLS from hundreds of users on timesharing systems to thousands all over the country and beyond. His team even made plans to add multimedia, foreshadowing features of the Web a quarter century hence. He hoped the NIC could be the seed of a truly online world.
At the end of 1969, ARC programmer Bill Duvall became one of the first two users on the ARPAnet, the world’s first major general-purpose computer network. For nearly two decades thereafter, the SRI NIC would play a pivotal role in the expansion of the ARPAnet and later the Internet. The ARC/NIC archives are a foundational collection at the Computer History Museum.
But the fortunes of the ARC lab itself began to falter. In 1969 Bob Taylor left ARPA, and ARPA itself also changed its funding policies as part of a general government belt-tightening. Grants began to dry up, and SRI management, always wary of Engelbart’s freewheeling group of renegades in colorfully patched jeans, started to make more demands. Engelbart, who was more of a visionary leader than a hands-on manager, felt things slipping away.
The NIC and the ARPANET did indeed bring NLS to a broader spectrum of users, but many of them found it hard to use, with a steep learning curve and arcane functions. It was also a resource-heavy program for the low-bandwidth, newly built networks of the era. Many users started to access the NIC’s information with dumber but faster tools.
Another blow came when Bob Taylor became the leader of the Computer Systems Laboratory at Xerox’s newly created and lavishly funded Palo Alto Research Center, or PARC. The ARC lab’s former benefactor and his colleagues hired away more and more ARC team members to build PARC’s own “Office of the Future,” eventually including some of Engelbart’s closest lieutenants, like Bill English, Jeff Rulifson, and Bill Duvall. The bitter joke ran that ARC was a training program for PARC.
The ARC alums brought many of the baseline concepts pioneered in NLS to PARC, and thus into the stream of development that eventually led to much of modern computing. But after the internal failure of the POLOS project, which was meant to be a PARC version of NLS, much got left out as well – from hypertext links to the overall emphasis on collaboration and augmenting human intellect.
In 1977, SRI sold the ARC project to Tymshare, later a subsidiary of McDonnell Douglas. There, Engelbart and his remaining team turned NLS into Augment and pioneered several new features. But the momentum was gone, and Tymshare had little interest in pursuing Engelbart’s main goals. He retired from Tymshare in 1986 and continued to pursue his vision in offices provided by a grateful mouse-maker, Logitech.
He continued to speak widely, and in 1988 he founded the Bootstrap Institute with his daughter Christina, one of his four children, to perpetuate his work. He won the National Medal of Technology, the Lemelson-MIT Prize, and the Turing Award, and was a Fellow of the Computer History Museum. Widowed in 1997, he later remarried; he and his second wife, Karen, attended public events into the last year of his life.
Douglas Engelbart died on July 2, 2013, at his home in Atherton, California. He was 88.
What was the impact of Engelbart’s work? The irony is that so far, it has been largely in inverse to the parts he himself considered important. The mouse, a neat but fairly trivial accessory to NLS, became a household item for billions pretty much exactly as Bill English designed it for him. The once-radical idea of using a personal keyboard and screen for reading, and writing, and tracking our own personal information did indeed spread through many channels including the ’68 demo, the ARC alumni recruited by Xerox PARC, and then the larger PC revolution.
But from the late ‘70s into the mid ‘90s, the great bulk of those keyboards and screens began to be attached to standalone PCs, the polar opposite of Engelbart’s vision of connectivity. The price of computing power had dropped so radically that a literally personal, standalone computer became affordable, and for over 20 years the attention of the computing world shifted to what an individual could do on his or her own.
Unlike their indirect ancestor the Xerox Alto, relatively few PCs got connected to networks or even dial-up services. The whole notion of multi-user systems and online collaboration began to fall out of fashion. It became an insider thing for researchers, professionals, and geeks. The closest the average person got to a “shared” repository of knowledge was buying a commercial CD-ROM.
Even computer hypertext itself, which Engelbart and Nelson had both independently invented as a tool for collaboratively sharing and building on associative links, first reached the mass market as a single-user program (HyperCard). Its author, Bill Atkinson, wasn’t directly aware of the work of the 1960s hypertext pioneers. There were some fuller-featured hypertext systems around, like Intermedia by Andy van Dam’s former student Norm Meyrowitz, but they were specialty products.
By the end of the 1980s the connectivity pendulum slowly began to swing the other way. Down at the network level, the Internet and rival standards were growing exponentially in business, research, and higher education. Commercial online systems for ordinary folks – like CompuServe in the U.S. and Minitel in France – remained a niche but an expanding one.
A few people began to build experimental online systems specifically for the Internet, the next phase of the original ARPAnet whose second node had been at Engelbart’s ARC lab, and on which he’d pinned such high hopes for NLS. Several of those experimental systems had hypertext features, including one ambitiously called “WorldWideWeb.”
But while its creators had well-intentioned plans to add more, the Web launched with only the most paltry minimum of hypertext and collaborative features from the point of view of Engelbart and other 1960s pioneers. It had simple associative links like those you see on this Web page today, i.e., “click here to see something related,” and that was it.
There were no typed links, like the kind in NLS that told you what sort of thing you were about to link to. There were no links to links, or to annotations, or to many other kinds of targets. There weren’t multiple views of the same information, like the collapsible outline view in Microsoft Word; NLS had even let you customize views for different levels of users. The Web also lacked any way to properly classify items and manage categories of information, or even to update broken links: if a target changed, a user simply got the familiar “404” error message. There was no built-in way to categorize pages by subject.
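To make the contrast concrete, here is a minimal sketch (in Python, with invented document names and link types; this is not NLS’s actual data model) of what a typed link adds over a bare hyperlink. A plain Web link carries only a target; a typed link also tells the reader – and any software – what kind of relationship it represents, so links can be filtered, grouped, or displayed differently by type:

```python
# Conceptual sketch of "typed" links vs. the Web's plain untyped hyperlink.
# Document names and link types below are invented for illustration.

from dataclasses import dataclass

@dataclass
class Link:
    source: str
    target: str
    kind: str  # the relationship, e.g. "cites", "annotates", "refutes"

links = [
    Link("doc-A", "doc-B", "cites"),
    Link("doc-A", "note-1", "annotates"),
    Link("doc-C", "doc-A", "refutes"),
]

def links_of_kind(kind: str) -> list[Link]:
    """With typed links, a reader or tool can select by relationship."""
    return [l for l in links if l.kind == kind]

print([l.target for l in links_of_kind("annotates")])  # -> ['note-1']
```

A bare `<a href>`, by comparison, is a list of `Link` records with the `kind` field missing: you can follow it, but you cannot ask the system to show, say, only the annotations.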
Of course, many of the more sophisticated features of NLS and other full-blown hypertext systems were easiest to implement in a controlled environment, a system built from the ground up, as one could in the wide-open frontier days of the 1960s. The Web, by contrast, was a guerrilla application designed to run in a huge variety of mutually incompatible environments; to spread virally and be installed casually by any sympathetic system administrator. It was a child of the cluttered, feudal, and feuding computing world of the late 1980s.
The Web was also dead simple to use, another survival strategy. New users were up and surfing like champs in minutes. This was a contrast to the formidable learning curve of NLS and a number of other systems.
For a few visionary Web pioneers including its main inventor Sir Tim Berners-Lee, NLS and other full-featured visions like Nelson’s Xanadu became a kind of aspirational template. They represented a set of features to be implemented and explored just as soon as there was breathing space from the sheer crush of immediate tasks. Some, like typed links and fields for classifying pages by category, even made it into early specs. But in the Web’s explosive growth the needed breathing space never came, and the original pioneers gradually lost direct control over the Web’s direction.
That loss of control also killed a feature that both the Web team and Engelbart considered a foundation of any meaningful online system: Authoring. While Berners-Lee’s first browser was also an editor, like a word processor connected straight to the Internet, the ones that made the Web famous left out that difficult-to-program function. Only with wikis and blogs a decade later did users regain some limited ability to easily contribute to the online world.
Berners-Lee himself later moved toward his own vision of the Semantic Web, which is compatible with a larger hypertext vision but emphasizes different aspects – pre-digesting information by making it readable by machines, rather than further building out tools for direct manipulation by people.
In 1990, hypertext and future Web pioneer Dan Connolly was inspired by Engelbart’s writings at a major conference on computer collaboration. Engelbart’s works were also an occasional presence at subsequent hypertext and Web conferences. But the majority of developers jumping on the Web bandwagon knew little or nothing about the long history of their newly chosen field.
In 1997, my colleague Kevin Hughes and I were deeply proud to feature Doug Engelbart as the keynote speaker at the first Web History Day and exhibit, which we organized for the International World Wide Web conference that year. The goal was to introduce his work to a wider Web community; we also had a live demo of NLS in the exhibit area, and the day opened with a breakfast “Wake Up Call” from Ted Nelson. I moderated a closing panel on future visions with Engelbart, Tim Berners-Lee, Brewster Kahle, and Pei Wei. Perhaps it’s time to look back and see what they said.
Today, the world is getting more and more connected. As a young Doug Engelbart could only imagine sixty years ago, much of the world’s population does the bulk of its reading, writing, and research online, whether on the Internet or on mobile phone networks.
But when it comes to the kind of knowledge navigation and collaboration tools that were the tangible features of his vision, we’ve climbed only the first rung of the ladder. If there’s ever a time that his ideas can be fully tested, it lies ahead.
Of course, obstacles may be increasing too, as the online world continues to clutter itself with fissioning standards and proprietary services and apps. Take the simple act of sharing a link to a page, i.e., marking a “trail” in the parlance of Vannevar Bush’s article on his magical Memex desk. There are now literally dozens of proprietary “social bookmarking” services vying with each other to handle the task, from Digg, to Reddit, to Squidoo. Instead of refining a truly shared body of knowledge, such a multiplicity of trails becomes just another form of ephemeral content.
There are many other examples. It’s as if each tiny feature from the grand unified visions of people like Engelbart, Otlet, Bush and Nelson now has competing constituencies and sometimes IPOs behind it.
A century ago, Paul Otlet began building perhaps the first system featuring automation and hyperlinks to try to organize and refine all the world’s information. A core goal was to fight the increasing fragmentation of knowledge, which had grown more and more specialized amid what contemporaries called the “information flood.” H.G. Wells and Vannevar Bush (of the Memex desk) championed similar quests in the pre-computer era, as did Engelbart and several others I’ve mentioned here in the youth of the computer age.
The computer attracted Engelbart because of its infinite flexibility, as compared with the kludgey microfilm and library cards of earlier systems. But an Achilles heel of that flexibility is how easy it makes creating incompatible systems. It would be ironic but unsurprising if the century-old quest to unify knowledge were to founder on specialization of a different kind: another profusion of incompatible standards on the “universal machine.” If so, we can hope that the computer’s very flexibility, and another generation of visionaries, can put it back on course.
Errors: Please let me know of any errors in the piece, at email@example.com.
Memories: If you have memories related to Doug and the ARC/NIC you would like to share, please use the commenting feature in this blog and we will archive them.
Historical Materials: If you have or know of historical materials related to the topics of this piece that you think should be preserved with us or elsewhere, please contact me as above. You can offer materials directly to the Museum.
The ARC/NIC records are part of the Museum’s permanent collection and comprise over 300 boxes of documents and hundreds of backup tapes, the latter now transferred to modern media. Networking pioneer and Internet Hall of Fame member Elizabeth “Jake” Feinler, the former director of the NIC, brought these archives to the Museum and is a core advisor to the CHM Internet History Program. The bulk of the existing archives of Engelbart’s work are split between the ARC/NIC records and other materials at CHM and the holdings at Stanford Libraries, listed below.
For other oral histories, see info below.
See the document “Networking Resources at CHM” for a guide to networking materials in general. A number of not-yet-posted oral histories related to the work of the ARC lab and the NIC are listed there. They are currently available to researchers, and we hope to have them posted soon. Note also the networking oral histories in the Pelkey Collection, listed there.
Within our permanent Revolution exhibition, Engelbart’s work can be found in the HCI, Networking, and Web galleries. You can also search on his name.
John Markoff, New York Times, “Computer Visionary Who Invented the Mouse”