Some Random Nerd

"Augmenting Human Intellect"

Earlier this month, on the 2nd of July, Douglas Engelbart died, aged 88.

I started putting together a brief note about his achievements as a kind of personal tribute/obituary, but it didn't seem quite right. Nailing down what he did by talking about the technologies he pioneered felt like it was somehow falling short, and I think that was because his 'big idea' was so big that I was missing the wood for the trees.

So instead, I'm going to try to explain why I believe he is a rare example of a true genius.

Inventing the future

If you try to imagine what the future will look like, the chances are that you will take what you know about the past and current trends, and extrapolate to create some sort of 'future vision'. In other words, things that are getting faster will be faster, things that are getting smaller will be smaller, and things that are getting cheaper might even become free.

That is one way to get to a 'future vision'. There are plenty of people making a living out of that kind of thing (especially in areas like market sizing), and it works reasonably well on a short-term basis, where you are talking about things like how many widgets are likely to be sold over the next 3-5 years. Tomorrow will be the same as today, but a little bit different; repeat to fade…

The thing is, that isn't really about the future. It's about understanding the present – where we are, how we got here – and assuming that we stay on the same trajectory.

What that kind of 'vision' misses are actual events: the 'game changers' that disrupt or transform businesses and industries. (For example, the effect that MP3s had on music sales in the late 1990s/early 2000s.) Usually, those kinds of events are clear in hindsight – often a continuation or convergence of technological trends, brought together and executed well to create a new product. They come from a deeper understanding of the present – of how the current trajectory is going to be changed by developments that aren't immediately obvious. (To continue with the MP3 example: hard drives and embedded computers were getting smaller, so it was possible to put much more music into a much smaller music-playing device.)

But occasionally, someone has a new idea, or an insight, which changes the future.

"Genius is a talent for producing something for which no determinate rule can be given, not a predisposition consisting of a skill for something that can be learned by following some rule or other" Immanuel Kant

"Talent hits a target no one else can hit. Genius hits a target that no one else can see." Arthur Schopenhauer

It is very rare that something happens which genuinely changes the trajectory of history – especially when it comes from a vision: a possible or potential future that someone imagines. More rarely still, it comes from a vision that someone is able not just to articulate, but to execute.

For example, some might call the concept of the iPhone 'genius' – it changed the smartphone industry, and what we think of as "portable" or "mobile" computers, but the development of a pocket-sized computer built around a touch screen wasn't new. Smartphones weren't new. Although it was a very well executed idea, it is hard to say that what Apple achieved with the iPhone was significantly different to the target that Palm, Nokia, RIM, Microsoft and others were aiming for with their phones, smartphones and pocket PCs.

I find it hard to think of a better example of what 'genius' really means than Douglas Engelbart.

Recently, I wrote a blog post about "inventing the future", in which I said:

[…] if you want to see the real meaning of "inventing the future", then you could do far worse than looking at [Alan] Kay and the work that was going on at Xerox PARC (Palo Alto Research Centre, where he worked in the 1970s). Because PARC was basically where most of the ideas behind what we think of as 'computers' were put together into a coherent product. At a point in time when the science fiction future of computers involved banks of switches, blinking lights and beeping computers, the guys at PARC were putting together computers with graphical interfaces (ie. WIMP - the idea of a user interface using Windows, Icons, Mouse and Pointer), the Paper Paradigm (the idea that the computer interface would be made up of analogues to the traditional desktop – so, the "desktop", files and folders, the trash can), WYSIWYG ("What You See Is What You Get" – beforehand, what you would type in a word processor wouldn't really give you any clear idea of what it would look like when you printed it out on paper.)

What I failed to mention (because I was focussing on the idea of "inventing the future" that Kay articulated, rather than the actual "inventing" part) was that while the work being done at Xerox PARC brought together the ideas behind what we think of as "computers", the PARC researchers were very much standing on the shoulders of Douglas Engelbart and his team, who had come up with those ideas at SRI. (The PARC team also included several of Engelbart's best researchers.)

Speaking of Alan Kay, he is quoted in Wired's obituary of Engelbart:

"The jury is still out on how long -- and whether -- people are actually going to understand," he said, what Engelbart created. But at least we have started to.

Ultimately, Douglas Engelbart's big idea was simply too big to fit into a simple blog post or article. Most of the obituaries I have read summarise his life's work as 'inventing the mouse.'

For example, this piece on the BBC website illustrates the shortcomings of over-simplifying what he did:

Douglas Engelbar [sic], the man who invented the computer mouse, has died. He passed away aged 88 and did not become rich through his invention, because he created the mouse before there was much use for it. Bill English explained: "The only money Doug ever got from it (the mouse) was $50,000 licence from Xerox when Xerox Parks [sic – actually Xerox PARC] started using the mouse. Apple never paid any money from it, and it took off from there."

Aside from the transcription errors, that brief summary puts this particular achievement into nice, simple, concrete terms that anyone who has used a computer would understand. But in doing so, it pulls it out of context and massively over-simplifies what he did. (With the added distraction of dwelling on how much money he failed to make from his invention – while failing to mention the $500,000 Lemelson-MIT Prize he was awarded in 1997, for example.)

To put this particular invention into context: imagine using a computer without a mouse. I would guess that you're probably imagining the same kind of computer, but with a different kind of device to move a pointer around the screen (such as a laptop's trackpad, a trackball, or a joystick). If so, then you're missing the point of what he actually invented – not just the mouse, but the basic concept of a computer interface built around a "pointing" device.

So, try again to imagine a computer interface that doesn't use a mouse or a pointer. Now, I would guess that you are thinking about the kind of modern applications that don't involve a lot of mouse/pointer work (so, no Photoshop, no Powerpoint etc.), and maybe something more like a word processor. In other words, different ways of using a graphical user interface to operate a computer – which, again, was part of Engelbart's creation.

Hopefully, you're starting to get an idea of how much of the 2013 idea of a "computer" owes to what Engelbart was imagining half a century ago.

Robert X. Cringely sums it up:

In addition to the mouse and the accompanying chord keyboard, Doug invented computer time sharing, network computing, graphical computing, the graphical user interface and (with apologies to Ted Nelson) hypertext links. And he invented all these things — if by inventing we mean envisioning how they would work and work together to create the computing environments we know today — while driving to work one day in 1950.

Incidentally, that article closes with this beautiful quote:

I once asked Doug what he’d want if he could have anything. “I’d like to be younger,” he said. “When I was younger I could get so much more done. But I wouldn’t want to be any less than 50. That would be ideal.”

He has been widely credited with creating many of the basic concepts of modern computing, demonstrating them to the world for the first time at what has since been dubbed "the mother of all demos". But the impact of what he envisioned was much greater than the sum of its parts.

Augmenting Human Intellect

Even then, it's still an over-simplification; it still hasn't got to the bottom of Engelbart's vision. The mouse, the GUI, videoconferencing and networked computing are all just details of execution – they don't explain why he was developing those ideas, or what they were for.

His vision of the potential of the computer went beyond what computers did, or how a user would interact with them. It was – in an age of hugely expensive room-sized computers, punch-cards and teletype terminals – about the role that a computer would have in people's lives.

In a blog post on Understanding Douglas Engelbart, John Naughton has this to say:

But reading through the obituaries, I was struck by the fact that many of them got it wrong. Not in the fact-checking sense, but in terms of failing to understand why Engelbart was such an important figure in the evolution of computing. Many of the obits did indeed mention the famous “mother of all demonstrations” he gave to the Fall Joint Computer Conference in San Francisco in 1968, but in a way they failed to understand its wider significance. They thought it was about bit-mapped, windowing screens, the mouse, etc. (which of course it demonstrated) whereas in fact what Engelbart was on about was the potential the technology offered for augmenting human intellect through collaborative working at a distance. Harnessing the collective intelligence of the network, in other words. Stuff we now take for granted. The trouble was that, in 1968, there was no network (the ARPAnet was just being built) and the personal computer movement was about to get under way. The only way Engelbart could realise his dream was by harnessing the power of the time-shared mainframe — the technology that Apple & Co were determined to supplant. So while people understood the originality and power of the WIMPS interface that Doug created (and that Xerox PARC and, later, Apple and then Microsoft implemented), they missed the significance of networking. This also helps to explain, incidentally, why after the personal computer bandwagon began to roll, poor Doug was sidelined.

To put it another way, before Engelbart, a "computer" was a device to process information – you put data in, it ran the numbers and gave you data out. In other words, a computer was something you gave some 'work' to, and it did it for you. (For example, you would use a keyboard to punch some information into a card, then put the card into the computer.)

Engelbart's vision was of computers as devices to augment human intellect – partly by doing the "computing" work for you, partly by doing the work with you (for example, through a live interface that gave the user feedback in real time), but above all through networking and sharing resources: by connecting people so that they could work together, becoming a team greater than the sum of its parts.

If you think of his achievement as being the tools he created to make the first part of this vision a reality – the mouse, the GUI and the desktop computing environment – then you could be forgiven for thinking that as we move to mobile devices and touch screens, we are leaving what he did behind.

I think that couldn't be further from the truth. As we move forward to always-on, mobile, networked computing – with permanent access to resources like Wikipedia, to social networks, to Dropbox and Maps and so on – the role of the device for "augmenting human intellect" becomes clearer than ever.

The Power of the Network

The system that Engelbart designed and helped to build was NLS – the "oN-Line System" – which enabled several users to work on the same computer at the same time. (This was the system that was shown off at 'the mother of all demos'.)

In 1969, the beginnings of ARPANET – one of the first packet-switching computer networks – were put into place. The second computer on the network (and so the first network connection) was up and running in October, connecting a machine at UCLA to Douglas Engelbart's NLS system at the Stanford Research Institute. As this network developed, it became the first to use TCP/IP – the protocol designed to allow computer networks to connect to one another, letting machines on any of the connected networks communicate with one another directly. The public 'network of networks' built on this protocol is the internet.
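It is easy to forget how radical that idea was: any machine on any connected network talking directly to any other, over one shared protocol. As a purely illustrative sketch – the address, port and messages below are arbitrary assumptions of mine, nothing to do with NLS or ARPANET – this is roughly what that direct machine-to-machine conversation looks like today over TCP/IP, using nothing but Python's standard library:

```python
# A minimal sketch of the core TCP/IP idea: any machine on a connected
# network can open a direct, reliable byte stream to any other.
# The address and port (127.0.0.1:9090) are arbitrary examples; a real
# exchange would use a remote machine's address instead of loopback.
import socket
import threading

HOST, PORT = "127.0.0.1", 9090
ready = threading.Event()

def server():
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind((HOST, PORT))
        srv.listen(1)
        ready.set()                 # signal that we are accepting connections
        conn, _addr = srv.accept()  # wait for one client to connect
        with conn:
            data = conn.recv(1024)          # read the client's message
            conn.sendall(b"ACK: " + data)   # reply directly over the same stream

threading.Thread(target=server, daemon=True).start()
ready.wait()  # don't try to connect until the server is listening

with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as client:
    client.connect((HOST, PORT))
    client.sendall(b"hello across the network")
    print(client.recv(1024).decode())  # prints: ACK: hello across the network
```

Every shared document, email and web page ultimately rests on exchanges like this one – which is exactly why Engelbart saw the network, rather than the box on the desk, as the point.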

There is a great anecdote in this article from 1999 about a meeting between Engelbart and Steve Jobs, which I think illustrates the friction between Engelbart's vision of the power of the network being the key to the computer, and the similar but competing vision of the personal, desktop computer as a self-contained box with 'all the computing power you need':

Apple Computer Inc.'s hot-shot founder touted the Macintosh's capabilities to Engelbart. But instead of applauding Jobs, who was delivering to the masses Engelbart's new way to work, the father of personal computing was annoyed. In his opinion, Jobs had missed the most important piece of his vision: networking. Engelbart's 1968 system introduced the idea of networking personal computer workstations so people could solve problems collaboratively. This was the whole point of the revolution. "I said, 'It [the Macintosh] is terribly limited. It has no access to anyone else's documents, to e-mail, to common repositories of information,'" recalls Engelbart. "Steve said, 'All the computing power you need will be on your desktop.'" "I told him, 'But that's like having an exotic office without a telephone or door.'" Jobs ignored Engelbart. And Engelbart was baffled. "We'd been using electronic mail since 1970 [over the government-backed ARPA network, predecessor to the Internet]. But both Apple and Microsoft Corp. ignored the network. You have to ask 'Why?'" He shrugs his shoulders, a practiced gesture after 30 frustrating years, then recounts the story of Galileo, who dared to theorize that the Earth circles the sun, not vice versa. "Galileo was excommunicated," notes Engelbart. "Later, people said Galileo was right." He barely pauses before adding, "I know I am right."

Apple's vision for the Mac in 2001 was the "digital hub", which would connect to all of your other electronic devices. It wasn't until just two years ago that the focus of Apple's vision shifted from the desktop computer to the network – specifically, iCloud – as the "digital hub" that everything would connect to. Arguably, Apple's reputation for online services, and specific examples like the iWork for iCloud web applications announced last month (which work through a web browser, but still offer no way for people to work collaboratively on the same document at the same time), indicate that they still don't get it.

So – yes, he invented the mouse. And the idea of the computer interface that the mouse works within. But his greater idea was one that I think we are still getting to grips with: the idea of the computer as a tool for extending ourselves, for bringing people together, connecting them across countries and continents so that they can work together, share their things, and talk and write to one another.

All of this was sparked by a man in 1950, driving to work the day after getting engaged, realising that he needed to set himself some professional goals to keep himself interested once he had got married and was 'living happily ever after':

I finally said, "Well, let's just put as a requirement I'll get enough out of it to live reasonably well." Then I said, "Well, why don't I try maximizing how much good I can do for mankind, as the primary goal, and with the proviso that I pick something in there that will make enough livable income." So that was very clear, very simple as that.

Realising that the complexity of human problems was growing, and that they were becoming more urgent, and that computers could provide a way to solve them, he made it his mission (or 'crusade', as he later called it) to turn that into a reality.

From that vision sprang the ideas behind what we think of as the computer, in terms of its graphical user interface, and the tools we use to connect with that interface. From his own computer system came the first network connection, to the network that later became the internet. But the vision he was putting together in the 1960s is only just now becoming clear to those of us who have moved into a world of ubiquitous, always-on, always-connected computers – as it moves past the desktop-bound paradigm that he saw and into a pocket-sized, portable and wireless world.

Whether he will forever be 'the man who invented the mouse', or will eventually get wider recognition for the scope of that original vision, remains to be seen; no history of either the personal computer or the internet could be complete without mentioning his work. But the fact is that thanks to Douglas Engelbart's vision, pretty much anyone today with even a passing interest in where the ideas of the personal computer, the networked computer or the internet came from can pull out their pocket-sized personal, networked computer and quickly trace them back to him.