"Augmenting Human Intellect"

Earlier this month, on the 2nd of July, Douglas Engelbart died aged 88.

I started putting together a brief note about his achievements as a kind of personal tribute/obituary, but it didn't seem quite right. Nailing down what he did by talking about the technologies he pioneered felt like it was somehow falling short, and I think the reason was that his 'big idea' was so big that I was missing the wood for the trees.

So instead, I'm going to try to explain why I believe he is a rare example of a true genius.

Inventing the future

If you try to imagine what the future will look like, the chances are that you will take things that you know about the past and current trends and extrapolate those to create some sort of 'future vision'. In other words, things that are getting faster will be faster, things that are getting smaller will be smaller, and things that are getting cheaper might even become free.

That is one way to get to a 'future vision'. There are plenty of people making a living out of that kind of thing (especially in areas like market sizing) and it works reasonably well on a short term basis, where you are talking about things like how many widgets are likely to be sold over the next 3-5 years. Tomorrow will be the same as today, but a little bit different; repeat to fade…

The thing is, that isn't really about the future. It's about understanding the present – where we are, how we got here, and assuming that we stay on the same trajectory.

What that kind of 'vision' misses are actual events; the 'game changers' that disrupt or transform businesses and industries. (For example, the effect that MP3s had on music sales in the late 1990s/early 2000s.) Usually, those kinds of events are clear in hindsight; often a continuation or convergence of technological trends, brought together and executed well to create a new product. They come from a deeper understanding of the present – how the current trajectory is going to be changed by different events from the past that aren't immediately obvious. (To continue with the MP3 example; hard drives and embedded computers were getting smaller, so it was possible to put much more music into a much smaller music-playing device.)

But occasionally, someone has a new idea, or an insight, which changes the future.

"Genius is a talent for producing something for which no determinate rule can be given, not a predisposition consisting of a skill for something that can be learned by following some rule or other" Immanuel Kant

"Talent hits a target no one else can hit. Genius hits a target that no one else can see." Arthur Schopenhauer

It is very rare that something happens which genuinely changes the trajectory of history; especially when it comes from a vision – a possible or potential future that someone imagines. More rarely still, it comes from a vision that someone is able to not just articulate, but execute.

For example, some might call the concept of the iPhone 'genius' – it changed the smartphone industry, and what we think of as "portable" or "mobile" computers, but the development of a pocket-sized computer built around a touch screen wasn't new. Smartphones weren't new. Although it was a very well executed idea, it is hard to say that what Apple achieved with the iPhone was significantly different to the target that Palm, Nokia, RIM, Microsoft and others were aiming for with their phones, smartphones and pocket PCs.

I find it hard to think of a better example of what 'genius' really means than Douglas Engelbart.

Recently, I wrote a blog post about "inventing the future" where I said that;

[…] if you want to see the real meaning of "inventing the future", then you could do far worse than looking at [Alan] Kay and the work that was going on at Xerox PARC (Palo Alto Research Centre, where he worked in the 1970s). Because PARC was basically where most of the ideas behind what we think of as 'computers' were put together into a coherent product. At a point in time when the science fiction future of computers involved banks of switches, blinking lights and beeping computers, the guys at PARC were putting together computers with graphical interfaces (ie. WIMP - the idea of a user interface using Windows, Icons, Mouse and Pointer), the Paper Paradigm (the idea that the computer interface would be made up of analogues to the traditional desktop – so, the "desktop", files and folders, the trash can), WYSIWYG ("What You See Is What You Get" – beforehand, what you would type in a word processor wouldn't really give you any clear idea of what it would look like when you printed it out on paper.)

What I failed to mention (because I was focussing on the idea of "inventing the future" that Kay articulated, rather than the actual "inventing" part) was that while the work that was being done at Xerox PARC was putting together the ideas behind what we think of as "computers", they were very much standing on the shoulders of what Douglas Engelbart and his research team had achieved at SRI in coming up with those ideas. (The PARC team also included several of Engelbart's best researchers.)

Speaking of Alan Kay, he is quoted in Wired's obituary of Engelbart;

"The jury is still out on how long -- and whether -- people are actually going to understand," he said, what Engelbart created. But at least we have started to.

Ultimately, Douglas Engelbart's big idea was simply too big to fit into a simple blog post or article. The general theme of most of the obituaries I have read is to summarise his life's work as 'inventing the mouse.'

For example, this piece on the BBC website illustrates the shortcomings of over-simplifying what he did;

Douglas Engelbar [sic], the man who invented the computer mouse, has died. He passed away aged 88 and did not become rich through his invention, because he created the mouse before there was much use for it. Bill English explained: "The only money Doug ever got from it (the mouse) was $50,000 licence from Xerox when Xerox Parks [sic – actually Xerox PARC] started using the mouse. Apple never paid any money from it, and it took off from there."

Aside from the transcription errors, that brief summary puts this particular achievement into nice, simple, concrete terms that anyone who has used a computer would understand. But in doing so, it pulls it out of context and massively over-simplifies what he did. (With the added distraction of how much money he failed to make from his invention – while failing to mention the $500,000 Lemelson-MIT prize he was awarded in 1997, for example.)

To put this particular invention into context; imagine using a computer without a mouse. I would guess that you're probably imagining the same kind of computer, but with a different kind of device to move a pointer around the screen. (Such as a laptop's trackpad, or a trackball, or a joystick.) If so, then you're missing the point of what he actually invented – not just the mouse, but the basic concept of the computer interface using a "pointing" device.

So, try again to imagine a computer interface that doesn't use a mouse, or a pointer. Now, I would guess that you are thinking about the kind of modern applications that don't involve a lot of mouse/pointer work (so, no Photoshop, no Powerpoint etc.) and maybe something more like a word processor. In other words, different ways of using a graphical user interface to operate a computer – which again was part of Engelbart's creation.

Hopefully, you're starting to get an idea of how much of the 2013 idea of a "computer" owes to what Engelbart was imagining half a century ago.

Robert X. Cringely sums it up;

In addition to the mouse and the accompanying chord keyboard, Doug invented computer time sharing, network computing, graphical computing, the graphical user interface and (with apologies to Ted Nelson) hypertext links. And he invented all these things — if by inventing we mean envisioning how they would work and work together to create the computing environments we know today — while driving to work one day in 1950.

Incidentally, that article closes with this beautiful quote;

I once asked Doug what he’d want if he could have anything. “I’d like to be younger,” he said. “When I was younger I could get so much more done. But I wouldn’t want to be any less than 50. That would be ideal.”

He has been widely credited with creating many of the basic concepts of modern computers, demonstrating many of them to the world for the first time at what has since been dubbed "the mother of all demos". But the impact of what he envisioned was much greater than the sum of its parts.

Augmenting Human Intellect

Even then, it's still an over-simplification. It still hasn't got to the bottom of Engelbart's vision. The mouse, the GUI, videoconferencing and networked computing are all just details of execution – they don't get to the bottom of why he was developing those ideas, and what they were for.

His vision of the potential of the computer went beyond what computers did, or how a user would interact with them. It was – in an age of hugely expensive room-sized workstations, punch-cards and teletype terminals – about the role that a computer would have in people's lives.

In a blog post on Understanding Douglas Engelbart, John Naughton has this to say;

But reading through the obituaries, I was struck by the fact that many of them got it wrong. Not in the fact-checking sense, but in terms of failing to understand why Engelbart was such an important figure in the evolution of computing. Many of the obits did indeed mention the famous “mother of all demonstrations” he gave to the Fall Joint Computer Conference in San Francisco in 1968, but in a way they failed to understand its wider significance. They thought it was about bit-mapped, windowing screens, the mouse, etc. (which of course it demonstrated) whereas in fact that Engelbart was on about was the potential the technology offered for augmenting human intellect through collaborative working at a distance. Harnessing the collective intelligence of the network, in other words. Stuff we now take for granted. The trouble was that, in 1968, there was no network (the ARPAnet was just being built) and the personal computer movement was about to get under way. The only way Engelbart could realise his dream was by harnessing the power of the time-shared mainframe — the technology that Apple & Co were determined to supplant. So while people understood the originality and power of the WIMPS interface that Doug created (and that Xerox PARC and, later, Apple and then Microsoft implemented), they missed the significance of networking. This also helps to explain, incidentally, why after the personal computer bandwagon began to roll, poor Doug was sidelined.

To put it another way, before Engelbart, a "computer" was a device to process information – you put data in, it ran the numbers and gave you data out. In other words, a computer was something you gave some 'work' to, and it did it for you. (For example, you would use a keyboard to punch some information into a card, then put the card into the computer.)

Engelbart's vision was of computers as devices to augment human intellect – partly by doing the "computing" work for you, partly by doing the work with you (for example, through an interface that was live, giving the user feedback in real time), but above all by networking and sharing resources – connecting people so that, working together, they could become a team greater than the sum of its parts.

If you focus on his achievement as the tools he created to make the first part of this vision a reality — the mouse, the GUI and the desktop computing environment — then you could be forgiven for thinking that as we move to mobile devices and touch screens, we are leaving what he did behind.

I think that couldn't be further from the truth. As we move forwards to always-on, mobile, networked computing, with permanent availability of resources like Wikipedia, to social networks, to Dropbox and Maps and so on, the role of the device for "augmenting human intellect" becomes clearer than ever.

The Power of the Network

The system that Engelbart designed and helped to build was NLS – the "oN-Line System" – which enabled several users to work on the same computer at the same time. (This was the system that was shown off at 'the mother of all demos'.)

In 1969, the beginnings of ARPANET – one of the first packet-switching computer networks – were put into place. The second computer on the network (and so the first network connection) was up and running in October, connecting a machine at UCLA to Douglas Engelbart's NLS system at the Stanford Research Institute. As this network developed, it was the first to use the TCP/IP protocol designed to allow computer networks to connect to one another, allowing machines on any of the connected networks to communicate with one another directly. The public 'network of networks' built on this protocol is the internet.

There is a great anecdote in this article from 1999 about a meeting between Engelbart and Steve Jobs which I think illustrates this friction between Engelbart's vision of the power of the network being the key to the computer, and the similar but competing vision of the personal, desktop computer as a self-contained box with 'all the computing power you need';

Apple Computer Inc.'s hot-shot founder touted the Macintosh's capabilities to Engelbart. But instead of applauding Jobs, who was delivering to the masses Engelbart's new way to work, the father of personal computing was annoyed. In his opinion, Jobs had missed the most important piece of his vision: networking. Engelbart's 1968 system introduced the idea of networking personal computer workstations so people could solve problems collaboratively. This was the whole point of the revolution. "I said, 'It [the Macintosh] is terribly limited. It has no access to anyone else's documents, to e-mail, to common repositories of information, "' recalls Engelbart. "Steve said, 'All the computing power you need will be on your desk top."' "I told him, 'But that's like having an exotic office without a telephone or door."' Jobs ignored Engelbart. And Engelbart was baffled. We'd been using electronic mail since 1970 [over the government-backed ARPA network, predecessor to the Internet]. But both Apple and Microsoft Corp. ignored the network. You have to ask 'Why?"' He shrugs his shoulders, a practiced gesture after 30 frustrating years, then recounts the story of Galileo, who dared to theorize that the Earth circles the sun, not vice versa. "Galileo was excommunicated, " notes Engelbart. "Later, people said Galileo was right." He barely pauses before adding, "I know I am right."

Apple's vision for the Mac in 2001 was the "digital hub", which would connect to all of your other electronic devices. It wasn't until just 2 years ago that the focus of Apple's vision shifted from the desktop computer to the network – specifically iCloud – as the "digital hub" which everything would connect to. Arguably, Apple's reputation for online services, and specific examples like the iWork for web applications announced last month (which work through a web browser, but still offer no way for people to work collaboratively on the same document at the same time) indicate that they still don't get it.

So – yes, he invented the mouse. And the idea of the computer interface that the mouse works within. But his greater idea was one that I think we are still getting to grips with; the idea of the computer as a tool for extending ourselves, for bringing people together, connecting them across countries and continents so that they can work together, share their things, talk, write and speak to one another.

All of this sparked by a man in 1950, driving on his way to work the day after getting engaged and realising that he needed to set himself some professional goals to keep himself interested once he had got married and was 'living happily ever after';

I finally said, "Well, let's just put as a requirement I'll get enough out of it to live reasonably well." Then I said, "Well, why don't I try maximizing how much good I can do for mankind, as the primary goal, and with the proviso that I pick something in there that will make enough livable income." So that was very clear, very simple as that.

Realising that the complexity of human problems was growing, as well as becoming more urgent, and realising that computers could provide a way to solve those problems, his mission (or 'crusade', as he later called it) was to turn that into a reality.

From that vision sprang the ideas behind what we think of as the computer, in terms of its graphical user interface, and the tools we use to connect with that interface. From his own computer system came the first network connection, to the network that later became the internet. But the vision he was putting together in the 1960s is only just now becoming clear to those of us who have moved into a world of ubiquitous, always-on, always-connected computers – as it moves past the desktop-bound paradigm that he saw and into a pocket-sized, portable and wireless world.

Whether he will forever be 'the man who invented the mouse', or eventually get wider recognition for the scope of that original vision, remains to be seen; no history of either the personal computer or the internet could be complete without mentioning his work. But the fact is that, thanks to Douglas Engelbart's vision, pretty much anyone today with even a passing interest in where the ideas of the personal computer, the networked computer or the internet came from will be able to pull out their pocket-sized personal, networked computer and quickly trace them back to him.

A Day One bookmarklet for iOS

There is a promotion in the Apple App Store at the moment, giving away 10 apps for free to mark the fifth anniversary of the App Store's launch.

One of the apps there is Day One, which I had heard some good things about, so decided to give it a whirl. And I like it.

One thing that I thought would be useful was a bookmarklet to send web pages from Safari into Day One entries. I had a quick look, but couldn't find anything. So I had a stab at building one myself.

This is the Day One URL scheme;

Command – URL
Open Day One – dayone://
Start an entry – dayone://post?entry="entry body"
Open Entries list – dayone://entries
Open Calendar – dayone://calendar
Open Starred – dayone://starred
Edit Entry – dayone://edit?entryId=[UUID]
Preferences – dayone://preferences

And this is a bookmarklet I had previously made to work with the Drafts app;

javascript:window.location='drafts://x-callback-url/create?text='+encodeURIComponent(document.title+'\n')+encodeURIComponent(location.href)

To start with, I put the Day One URL into a very simple JavaScript bookmarklet;

javascript:window.location='dayone://post?entry="entry body"'

And it works – good start!

Taking the code from my Drafts bookmarklet to get the URL and page title gave me this;

javascript:window.location='dayone://post?entry="'+encodeURIComponent(document.title+'\n')+encodeURIComponent(location.href)+'"'

Which also worked. So this is basically the same as my Drafts bookmarklet (but without the Actions to trigger).

I had a more complex Drafts bookmarklet which checks for selected text (this only works on the iPad when the Bookmarks bar is visible - otherwise any text is deselected when you pull up the bookmarks menu) - switching the base URL gave me this. (I've added line breaks to make it readable here - you probably don't want them if you're using this bookmarklet yourself. Just copy/paste the code into a text editor and remove the line breaks so it is all on a single line.)

javascript:function%20 getSelText()%7Bvar%20txt=%27%27; if(window.getSelection)%7Btxt=window.getSelection();%7D else%20if(document.getSelection)%7Btxt=document.getSelection();%7D else%20if(document.selection)%7Btxt=document.selection.createRange().text;%7D else%20return%20%27%27; return%20txt;%7D var%20q=getSelText(); if(q!=%27%27)%7B q='%3Cblockquote%3E%5Cn'+q+'%5Cn%3C%2Fblockquote%3E%5Cn';%7D var%20l='dayone://post?entry='+'%5B' +encodeURIComponent(document.title)+'%5D%28' +encodeURIComponent(location.href+'%29%5Cn'); if(!document.referrer)%7Br='';%7D else%7Br='via%20'+encodeURIComponent(document.referrer);%7D window.location=l+r+'%5Cn'+q+'%5Cn';

Which, to my surprise (once I had got rid of some stray commas and semicolons) worked!
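If you want to see what that bookmarklet actually does, here is (roughly) the same code decoded back into readable JavaScript – a sketch for reading rather than for pasting into your bookmarks, with the javascript: prefix and the percent-encoding stripped out;

// Grab any text selected on the page (different browsers expose the selection differently).
function getSelText() {
  var txt = '';
  if (window.getSelection) { txt = window.getSelection(); }
  else if (document.getSelection) { txt = document.getSelection(); }
  else if (document.selection) { txt = document.selection.createRange().text; }
  else return '';
  return txt;
}

// Wrap any selected text in a blockquote tag.
var q = getSelText();
if (q != '') { q = '<blockquote>\n' + q + '\n</blockquote>\n'; }

// Build the entry: the page title as a Markdown link to the page URL...
var l = 'dayone://post?entry=' + '[' + encodeURIComponent(document.title) + '](' + encodeURIComponent(location.href + ')\n');

// ...plus a 'via' line if there is a referrer...
var r;
if (!document.referrer) { r = ''; }
else { r = 'via ' + encodeURIComponent(document.referrer); }

// ...then hand the whole thing over to Day One.
window.location = l + r + '\n' + q + '\n';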

With some text selected on a web page, this bookmarklet now opens Day One, creates an entry and populates it with the web page title (as a Markdown link to the page URL) and any selected text in a blockquote HTML tag, and looks something like this;

Drafts and Safari bookmarklets — Some Random Nerd

It occurred to me that an app that plays so nicely with URL schemes (ie. sending things to other apps via their URL schemes) would probably have a scheme of its own for pulling things in. A little googling later and I found that you can; like this bookmarklet

Not a bad result at all - especially considering I managed to put it all together on my iPad on a 25 minute train journey.

"What Really Happens On A Teen Girl's iPhone"

Every so often, I see a report like this – usually anecdotal data about a particular young heavy user of a technology (computers, games etc.) or service (MySpace, Facebook etc.) But the underlying thread is quite simple – the way people who have grown up with technology actually use it is quite different to the way it is used by those of us who were introduced to it later in life.

For me, social media is an additional 'layer' on top of 'real' social stuff – seeing people, phone calls, texts, emails etc.

This account of a teenage (American) girl's life through the lens of an iPhone is interesting enough. But it's not the points about weight/frequency of use that blow me away – it's how completely central digital communication is to these girls' lives that astonishes me.

It's one thing to imagine how my own life as a teenager/young adult would have been different (the kinds of photos that might have appeared on Facebook, for example), but it's another thing altogether to imagine how the important things in life would have been different – which friendships would have been strengthened, which ones might have fallen by the wayside.

Inventing the Future

A short piece in AdAge caught my eye, about the founder of FourSquare, talking about the idea of "frictionless check-ins" (ie. apps that track your location wherever you are, so it can be matched up with key locations, who you are with etc. so that advertisers can then pick people to send 'offers' to). His reply to a question about the privacy implications of this was;

"Whenever you're kind of inventing the future this happens," Mr. Crowley said. "I can think of the number of people who were like, 'I will never get a cellphone because I don't want people calling me all the time. And I will never get on Facebook because I don't want to share that stuff with people. And Twitter, that's not for me.' And this is just the natural progression of things."

It took me a while to figure out why this innocuous quote rubbed me up the wrong way, but I think I've got to the bottom of it.

Firstly, there is nothing "natural" about this kind of progression. This is about services designed by people, technology designed by people, and business models designed around collecting data from people and selling it as a commercial product. To describe this as 'natural progression' (especially as a deflection of a question about the privacy implications of a business tracking your every move) seems incredibly disingenuous – the idea presumably being that you (as a concerned user) are going to have your every move tracked anyway, so why not just go along with it instead of fighting it?

Well, there is clearly a trade-off to be made here - what you lose in terms of privacy and control when what used to be personal data moves into the digital (and commercial) space, against what you get in return. It seems to me that the real benefit of this kind of service is going to be more along the lines of what Google are doing with Google Now; in exchange for that personal information, you get useful information back – what Eric Schmidt has talked about as the idea that "[…]most people don't want Google to answer their questions. They want Google to tell them what they should be doing next."

But it isn't just the implications for privacy that bother me about the quote. The idea of 'inventing the future' had already been on my mind when I saw the interview, because I had been mulling over a particular quote for a little while;

"The best way to predict the future is to invent it"

I saw this inspirational quote written on the wall of an office meeting room. It is credited to Abraham Lincoln.

Now, I just don't think that this is something that Abraham Lincoln ever said. Just a hunch, but it seems unlikely to me that Lincoln was really thinking along those lines while trying to preserve the United States. Of course, it could be that the quote is really an inside joke – a nod to Abraham Lincoln's Internet Wisdom, and the running joke about "The trouble with quotes on the internet is that you can never know if they are genuine" ('attributed' to Abraham Lincoln.)

Or it might be that someone forgot that if you Google for something, you tend to find what you are looking for. I don't know who put it up on the wall, and I don't really want to ask in case it isn't a reference to a meme, and it's just a misquote.

The thing is, the actual source of the quote is a man called Alan Kay, and if you're interested in the history and development of technology and the people who 'invented the future' that we live in, you might be familiar with him as the inventor of the "Dynabook" – a concept from the early 1970s of what the "personal computer" might look like.

In a paper from 1972 outlining the Dynabook idea, he includes a quote;

"To know the world one must construct it" – Pavese

Although Kay is often credited with the "inventing the future" quote (and certainly deserves to be strongly associated with it), it seems pretty clear that the inspiration came from elsewhere. (And probably not Abraham Lincoln…)

But if you want to see the real meaning of "inventing the future", then you could do far worse than looking at Kay and the work that was going on at Xerox PARC (Palo Alto Research Centre, where he worked in the 1970s). Because PARC was basically where most of the ideas behind what we think of as 'computers' were put together into a coherent product.

At a point in time when the science fiction future of computers involved banks of switches, blinking lights and beeping computers, the guys at PARC were putting together computers with graphical interfaces (ie. WIMP - the idea of a user interface using Windows, Icons, Mouse and Pointer), the Paper Paradigm (the idea that the computer interface would be made up of analogues to the traditional desktop – so, the "desktop", files and folders, the trash can), WYSIWYG ("What You See Is What You Get" – beforehand, what you would type in a word processor wouldn't really give you any clear idea of what it would look like when you printed it out on paper.)

I've written before about how part of the role of fiction (especially science fiction) is to help us to prepare for the future; the idea that Art is a form of cultural defence, which gives us frameworks to think about the kind of ideas that are soon to become a reality. In that context, it's worth noting that the Dynabook concept document above was written, quite literally, as a piece of science fiction.

These basic concepts of what a "computer" is and how it works aren't (and weren't) inevitable, or set in stone. They weren't a "natural progression of things." They were people's ideas and designs, dreamt up, executed on, iterated and refined. What bugs me about the (misattributed) quote is how much of its meaning is taken away by pulling it out of its context.

Because right now, 40 years on, some of these ideas are being challenged by the current generation of touchscreen phones and tablets. Some of them will survive, some of them will be replaced. Some of them will fade away, because something better will (or has) come along. Some of them will stick around, despite something better coming along. (Consider the QWERTY keyboard layout — designed to stop typewriter keys from sticking, surviving the transition to electronic keyboards, handheld keyboards, and now touchscreens — despite arguably better designs being introduced.)

The thing that is so impressive about the work that was going on back then at PARC isn't just how many of their ideas have lasted so long, but how much of what they were creating is now considered 'normal.' I don't think many people really stop to think that everything about how a computer works wasn't 'natural' or 'obvious' or 'inevitable', but was the product of human imagination and the hard work of executing the ideas that people came up with. There is nothing fundamental to the workings of a microchip that means a computer should have a 'desktop' and 'windows', that files should be saved in folders, or that the documents on a screen should be laid out following the conventions of desktops and paper-based design.

So the story behind the quote, the man, the place and the work is a fascinating one. The culture of what was happening on the west coast of America – PARC, Stanford, ARPA, Douglas Engelbart, Vint[1] Cerf and so on – was fundamental in shaping the tech scene of Silicon Valley today. This was the birthplace of modern computers, networks and the internet, and of the designs that influenced the original Apple Macintosh and Microsoft Windows.

Because the point is that the future is ours — it is up to us to decide what we want to do, what problems we want to solve, what challenges are worth taking on, what ideas and dreams we want to turn into a reality – and what we don't.

That is what "inventing the future" means. "We" are inventing the future – people/companies like FourSquare are a part of that, but so are the users who choose to sign up for them, to opt in to particular services and to share particular data. The idea that "inventing the future" is the preserve of a Silicon Valley elite, while the billion+ users of these services are just somehow passively along for the ride is what grates to me.

  1. Originally, this read "Vintage", which I assume was an iPhone autocorrect mistake.

Throwing things

From XKCD's 'what if', as part of an answer to the question "How high can a human throw something?";

Throwing is hard. In order to deliver a baseball to a batter, a pitcher has to release the ball at exactly the right point in the throw. A timing error of half a millisecond in either direction is enough to cause the ball to miss the strike zone. To put that in perspective, it takes about five milliseconds for the fastest nerve impulse to travel the length of the arm. That means that when your arm is still rotating toward the correct position, the signal to release the ball is already at your wrist. In terms of timing, this is like a drummer dropping a drumstick from the 10th story and hitting a drum on the ground on the correct beat.

You should read the whole answer (actually, you should follow the blog, assuming you already know about and follow the XKCD comic itself.) But for a nice illustration of what we take for granted, here's a lovely video;

(Video by Juan Etchegaray – http://juanetch.tumblr.com/, Twitter: @juanetch; backstage footage at vimeo.com/59068094)

"Find The Thing You're Most Passionate About, Then Do It..."

"...on nights and weekends for the rest of your life."

A painfully resonant article from The Onion…

Before you get started, though, you need to find the one interest or activity that truly fulfills you in ways nothing else can. Then, really immerse yourself in it for a few fleeting moments after an exhausting 10-hour day at a desk job and an excruciating 65-minute commute home. During nights when all you really want to do is lie down and shut your eyes for a few precious hours before you have to drag yourself out of bed for work the next morning, or on weekends when your friends want to hang out and you’re dying to just lie on your couch and watch TV because you’re too fatigued to even think straight—these are the times when you need to do what you enjoy most in life.

Productivity through Laziness

It must be about 18 years since I figured out in my first job out of school that computers and laziness went together very well.

The job was a combination of data entry and data transfer - putting data from various sources into a CRM/sales database. But when I found a little-known Windows application called Recorder (part of Windows 3.1 or 95 or whatever it was that I was using at the time) that literally recorded where your mouse was moving and clicking around the screen, I discovered that I could increase the number of entries I was getting through by a huge factor.

What's more, this also meant that I didn't need to pay as much attention to what I was doing, which meant that I didn't get as bored, as I could distract myself with other activities. Among other things, this was when I set my personal record for Minesweeper, learnt to touch type, and learnt about the internet. Which prompted a whole bunch of other stuff…

(Unfortunately, despite the fact that I could point to better numbers that proved I was getting more work done, it didn't look like I was working harder. In hindsight, I obviously understand why, but still, it was a valuable lesson - sometimes looking busy can be more important than actually being productive...)

Anyway, Recorder has been and gone (I think largely replaced on the PC by Visual Basic) but I've been discovering what sort of things can be done with Automator and AppleScript on the Mac - tools that do a similar thing to Recorder, but in a much cleverer way (ie. giving commands to applications, rather than hoping that things like boxes and form fields will be in exactly the same place every time…)

And I'm feeling pretty pleased with myself. In the space of about two hours one morning, I managed to take 646 spreadsheets in 34 different folders (poorly labelled, I might add) and turn them into 34 organised and clearly labelled Excel workbooks.

In those same 2 hours, I attended a half hour long meeting.

Now, I have no idea how long that would have taken to do manually – I'm guessing at least a couple of days (given the mind-numbing nature of the task and opportunities for human error to mess things up.)

The bottom line is this - if you use a computer to work but don't really know how to use a computer, then there is a very real chance that you are actually doing the kind of work that a computer could do by itself in a lot less time with just a little supervision.

But, if you are lazy enough to figure out how to get your computer to do your job for you (or at least as much of it as possible), then that puts you in a much stronger position in the possible workplace of the future.

(But you might be lonely in the office…)

"To a Boy of the 1970s, the Line Between Comic Books and Real Life People Was Hopelessly Blurred"

(via DaringFireball.net)

I love this.

You have to understand that to a boy of the 1970s, the line between comic books and real life people was hopelessly blurred. Was Steve Austin, the Six Million Dollar Man, real or fake? Fake? Well, then, how about Evel Knievel jumping over busses on his motorcycle? Oh, he was real. The Superman ads said, “You will believe a man can fly,” and Fonzie started jukeboxes by simply hitting them, and Elvis Presley wore capes, and Nolan Ryan threw pitches 102 mph, and Roger Staubach (who they called Captain America) kept bringing the Cowboys back from certain defeat, and Muhammad Ali let George Foreman tire himself out by leaning against the ropes and taking every punch he could throw. What was real anyway?

It never occurred to me that before television, children probably didn't live in a world where reality and fiction were so confusingly intertwined.

It reminds me of a quote I heard (can't remember or find a source) along the lines that "The magic of Disneyworld wasn't in making people believe that it was real, but making them believe that the rest of the world wasn't just an illusion."

"BlackBerry CEO on the future of tablets"… For who?

“In five years I don’t think there’ll be a reason to have a tablet anymore,” Heins said in an interview yesterday at the Milken Institute conference in Los Angeles. “Maybe a big screen in your workspace, but not a tablet as such. Tablets themselves are not a good business model.”

I saw this get posted around a few times yesterday, and I can understand why; BlackBerry CEO sounding out of touch is obviously a fun story.

I haven't found the actual interview he is being quoted from, but my guess is that this is out of context. We know enough about the tablet market to know that there are three camps who are doing well; Apple being the biggest, selling iPads at a premium (and making money), everyone else selling devices virtually at cost (Amazon with the Kindle Fire, Google with the Nexus 7), and Samsung occupying a much smaller space in between — selling Galaxy tablets at premium prices, but in far less volume than Apple. (I should note that I think Microsoft's position in the market isn't yet clear — I wouldn't call the Surface a failure, but I just haven't seen enough data yet to make a clear call.)

If he is talking about there being no reason for people to have a tablet in 5 years, then he's probably wrong. But if he is saying that for a BlackBerry (or Motorola, Samsung, LG etc) to have a tablet in 5 years is not going to be a good business model, then I'm inclined to agree with him.

I'm still struggling to imagine Alicia Keys delivering a PowerPoint deck though…

"Google Now iOS App Keeps GPS Active"

When I woke up this morning, I noticed that the GPS icon was showing on my iPad. The only app I had running was my alarm clock, which doesn’t use Location Services. It was odd but didn’t give it too much more thought. Then I was browsing Twitter and noticed someone had tweeted that the new Google Search app (with Google Now) had a flaw, leading to the GPS being always on. Aha! Perhaps mystery solved. I checked my iPhone. Sure enough. GPS icon active.

Some confusion going on here about the 'GPS icon' on the iPhone. The iPhone (and iPad) has a small icon in the menu bar that appears when something on your phone is using Location Services. Location Services is an API that is used to identify where you are.

I would guess that the confusion comes from the fact that the icon is the same one used in the Maps app to say where you are (ie. to access Location Services), which is a concept that everyone is familiar with through GPS devices – and which uses GPS on devices that have a GPS chip. (Wifi-only iPads and iPod Touch have Location Services, but no GPS chip.)

But "Location services" doesn't mean GPS. GPS is a part of it, but at any given time, a phone that is connected to a network has some information about where it is, because it knows which network cells it is currently connected to. Similarly, devices on a WiFi hotspot can deduce information about where they are based on the hotspot and IP address. Location Services allows an application to tap into this information, and the icon is there to let you know that something you use is tracking where you are. If more detail is needed, then Apple's APIs will then use services like GPS to get more accurate locational data (at different levels of accuracy, depending on what is needed— more accuracy requires more power.)

This might mean that an application will 'wake up' when you are in a certain area and then switch on GPS to confirm (for example) whether you are actually in the area (say, a train station, a shop, at home) or just nearby.

Apple's developer documentation explains how this all works on a technical level, but the bottom line is that the location services icon is not a GPS icon, and it doesn't mean that something is using GPS and draining your battery.
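To illustrate that accuracy/power trade-off in code, here is a sketch using the web's Geolocation API (the browser-based cousin of Apple's Core Location framework, not the native API the Google app itself uses) – the developer says how accurate a fix they need, and the system decides whether a coarse cell/wifi position will do or whether the GPS needs to be fired up;

// A sketch: ask the browser for a rough, low-power position fix.
navigator.geolocation.getCurrentPosition(
  function (position) {
    console.log('Roughly here: ' + position.coords.latitude + ', ' + position.coords.longitude +
                ' (accurate to about ' + position.coords.accuracy + ' metres)');
  },
  function (error) {
    console.log('No location available: ' + error.message);
  },
  {
    enableHighAccuracy: false,   // a coarse fix (cell towers / wifi) is fine; don't fire up the GPS
    maximumAge: 10 * 60 * 1000,  // a cached fix up to 10 minutes old will do
    timeout: 10000               // give up after 10 seconds
  }
);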

Whether you still want to use an application that is constantly tracking your location is up to you though. Of course, a side effect of this is that even if you trust Google with that information, you won't be able to easily see if any other applications are using Location Services without going into the Settings menu to manually check.

WWDC 2013: $8 million in less than 2 minutes

WWDC- Apple’s annual developer conference for anyone who wants to stay ahead of the game when it comes to all things related to iOS and OSX development.

$1,600 a ticket. About 5,000 attendees (the capacity of the conference centre where it's held). Videos and slides are made available to all registered developers afterwards (for which you pay a fee of something like $80 a year, which you need to do if you want to get apps in the App Store.) And unlike Google's I/O conference, they don't have a history of giving away free gadgets. You get admission, and that is all.

Last year, it sold out in 2 hours, when dates were announced and tickets went on sale at the same time.

This year, dates for the conference and ticket release time were announced together earlier this week. And tickets just sold out in less than 2 minutes.

If you're wondering if developers are losing interest in Apple's platforms, then the fact that 5,000 of them just collectively handed over about $8 million just to hear what they have to say *in person* might be a relevant data point to bear in mind.

Falling Giants

Over the last decade or so, the most unforeseeable developments probably weren't the growth of a company, or the rise or emergence of a new consumer product, category or technology. The thing that was unthinkable 10 years ago was that the clear leaders in technology – Microsoft in computers, Nokia in mobile phones, AOL in the consumer web/internet – would not be able to retain their obvious leadership positions.

Smartphone install bases

Related to my last post, a post from Benedict Evans on (global) smartphone install bases; it's easy to forget that when the average phone lasts 24 months, the current pace of smartphone growth is really pretty fast. Based on some fairly simple arithmetic;

… about 90% of current Android users are on their first Android, as are 70% of iPhone users. It'll be interesting to see what their second purchase is.
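To give a flavour of that arithmetic (using numbers invented purely for illustration – these are not Evans' figures), imagine a platform whose installed base has grown from around 100 million devices two years ago to around 700 million today; even if every one of those earlier owners upgraded to another device on the same platform, they could only account for a small slice of today's base;

// Hypothetical illustration: the minimum share of today's users who must be on
// their first device, given how small the installed base was two years ago.
function minFirstTimeShare(baseTwoYearsAgo, baseToday) {
  // Even if every user from two years ago stayed on the platform and upgraded,
  // they can account for at most baseTwoYearsAgo of today's installed base.
  return (baseToday - baseTwoYearsAgo) / baseToday;
}

console.log(minFirstTimeShare(100e6, 700e6)); // ~0.86, i.e. at least ~86% are first-time buyers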

Mobile stats - "All the numbers, in one place"

A handy collection of mobile stats collected by Harry McCracken. A little too US focussed to be of much use for me personally, but still worth a look.

However, it is worth noting that, as he states, this isn't an analysis of why the numbers look like they do;

I’m collecting rather than interpreting, though I hope that some of you will draw conclusions in the comments.

The one that stands out for me (because it's the most interesting from a media perspective); "Which platform gets used most on the internet?"

According to NetMarketShare, it's iOS, with a 61.4% share (over 24.3% for Android.) But according to StatCounter, it's Android, with a 37.2% share (over 27.1% for iOS.)

Why the discrepancy? Harry doesn't offer much;

Of course, the two organizations’ methodology may be radically different; I’m not sure, for instance, whether both, either or neither of them include the iPad in these numbers. But the disparity is a healthy reminder that it’s risky to draw conclusions from data you don’t know very much about.

True enough. But it's my job to understand this stuff, so I thought I'd take a look.

StatCounter say that;

Net Apps base their stats on unique visitors per site per day. ("We 'count' unique visitors to our network sites, and only count one unique visit to each network site per day.") We base our stats on page views.

So, it would seem that NetMarketShare are reporting far more iOS devices online (based on daily site visits), while StatCounter are reporting slightly more pageviews from Android devices (indicating that the smaller number who are using Android are using them much more.)
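To make that concrete, here is a toy example (the numbers are made up, purely to show how the two measures can point in opposite directions while describing the same audience);

// Invented numbers: iOS has more devices showing up each day, but Android
// devices rack up more page views per device.
var ios     = { dailyUniques: 60, pageViewsPerUnique: 8 };
var android = { dailyUniques: 25, pageViewsPerUnique: 25 };

var iosPageViews     = ios.dailyUniques * ios.pageViewsPerUnique;         // 480
var androidPageViews = android.dailyUniques * android.pageViewsPerUnique; // 625

// A NetMarketShare-style measure (daily unique devices) puts iOS ahead...
console.log('iOS share of uniques: ' + ios.dailyUniques / (ios.dailyUniques + android.dailyUniques)); // ~0.71
// ...while a StatCounter-style measure (page views) puts Android ahead.
console.log('Android share of page views: ' + androidPageViews / (iosPageViews + androidPageViews));  // ~0.57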

Although it still isn't clear whether that really tells the whole story. As well as different metrics, the two companies are using different data collection methods and different ways of processing the data. This is far from an in-depth analysis – neither data source is one that I use regularly or know much about in terms of methodology – so this is still very much a top-line interpretation rather than a 'proper' analysis.

But, I do know that tablet users are likely to be more active online than smartphone users, and that iOS has a far greater share of the tablet market than the phone market. Some analysis I did last year on comScore's figures indicates that in terms of traffic, despite there being far fewer devices, iPads account for more than iPhones in the UK (and iPhones account for more than Android.) So for the UK (which is the only market I'm really interested in), iOS still looks like the dominant smartphone platform.

Not Necessarily News #3

This is the third of my 'not necessarily news' posts, where I try to wrap up the stuff that caught my eye in the past week (give or take a day or so – I haven't quite figured out my posting routine yet.)

The general idea here is that I don't want to be trying to cover 'news' – the idea of throwing something away because I didn't manage to write a few paragraphs about it within 24 hours isn't really what I want to be doing.

This week, I've gone with something of a theme around the whole 'news' thing, and trying to explain (through the words of people much smarter than me) why I'm trying to do this 'newsletter without the news' format.

What is the single most important life lesson older people feel young people need to know?

A nice summary of interviews with 1,500 70+ year olds about the advice that they would offer to younger people;

Another point worth making is advice the older folks consistently did not give: No one— not a single person out of a thousand— said that to be happy you should try to work as hard as you can to make money to buy the things you want. No one— not a single person— said it’s important to be at least as wealthy as the people around you, and if you have more than they do it’s real success. No one— not a single person— said you should choose your work based on your desired future earning power.

Do we need a new Save icon?

Some Random Dude — Why Redesigning the Save Icon is Important

The age or establishment of something shouldn’t preclude it from scrutiny and/or replacement. If anything, that should make us all the more eager to pull it down from its pedestal. Our job is to make things better—or at least try to do so. The Save icon is not good enough. We should try to make it better.

It's a fair point. My last computer had a floppy disc drive that I never used. I'm guessing it would be about 10 years or so since I last used one. For me, the various jobs that they did got replaced by hard drives, USB sticks, Dropbox and email.

But what does the save icon really mean? "Put this file I'm working with on a disc."

The issue isn't that the icon has become obsolete. It's that the act of saving has become obsolete. Why do you want to save?

My main tools today; nvALT – automatically saves as you go. Drafts – automatically saves as you go. Elements – automatically saves when you close a document (ie. finish working with it). VoodooPad – I honestly don't know if this saves as you go along or just when you close; I don't recall saving. What I do do in VoodooPad (and several other applications) is export – a VoodooPad document as a set of web pages, a layered Photoshop .PSD as a .PNG.

What I never think to myself is "this would be a bad time to save", or "I wish my auto save was set to every 10 minutes instead of every 5."

In a world with version control, document history (think of Photoshop's History tool), cloud storage etc. there isn't any point in saving. It should be automatic. There should never be a difference between the version of the document I'm working on in RAM and the version on disk.

More to the point, the most annoying thing in my day to day workflow isn't losing unsaved work. It's having two versions of the same document with different changes because I was working in an application that doesn't save as I go along.

So, I agree that the Save icon is overdue a rethink. But the issue isn't that it needs a different picture. It needs to be got rid of altogether.

"Digg Does The Google Reader Survey That Google Should've Done"

You can’t help but wonder how and why Google — a company that lives and breathes data — never bothered to survey Google Reader users to measure their passion for the product before they decided to shut it down on July 1st.

Probably because Google know exactly how many users Reader has, what feeds they follow, how often they check them etc.

(For every single user— not just the ones who chose to fill out a survey for Digg…)