"What Really Happens On A Teen Girl's iPhone"

Every so often, I see a report like this – usually anecdotal data about a particular young heavy user of a technology (computers, games etc.) or service (MySpace, Facebook etc.) But the underlying thread is quite simple – the way people who have grown up with technology actually use it is quite different to the way those of us who were introduced to it later in life do.

For me, social media is an additional 'layer' on top of 'real' social stuff – seeing people, phone calls, texts, emails etc.

This account of a teenage (American) girl's life through the lens of an iPhone is interesting enough. But it's not the points about weight/frequency of use that blow me away – it's the fact that digital communication is so completely central to these girls' lives that astonishes me.

It's one thing to imagine how my own life as a teenager/young adult would have been different (the kinds of photos that might have appeared on Facebook, for example), but it's another thing altogether to imagine how the important things in life would have been different – which friendships would have been strengthened, which ones might have fallen by the wayside.

Inventing the Future

A short piece in AdAge caught my eye, about the founder of FourSquare, talking about the idea of "frictionless check-ins" (ie. apps that track your location wherever you are, so it can be matched up with key locations, who you are with etc. so that advertisers can then pick people to send 'offers' to). His reply to a question about the privacy implications of this was;

"Whenever you're kind of inventing the future this happens," Mr. Crowley said. "I can think of the number of people who were like, 'I will never get a cellphone because I don't want people calling me all the time. And I will never get on Facebook because I don't want to share that stuff with people. And Twitter, that's not for me.' And this is just the natural progression of things."

It took me a while to figure out why this innocuous quote rubbed me up the wrong way, but I think I've got to the bottom of it.

Firstly, there is nothing "natural" about this kind of progression. This is about services designed by people, technology designed by people, and business models designed around collecting data from people and selling it as a commercial product. To describe this as 'natural progression' (especially as a deflection of a question about the privacy implications of a business tracking your every move) seems incredibly disingenuous – the idea presumably being that you (as a concerned user) are going to have your every move tracked anyway, so why not just go along with it instead of fighting it?

Well, there is clearly a trade-off to be made here – what you lose in terms of privacy and control when what used to be personal data moves into the digital (and commercial) space, against what you get back in return. It seems to me that the real benefit of this kind of service is going to be more along the lines of what Google are doing with Google Now; in exchange for that personal information, you get useful information back – what Eric Schmidt has talked about as the idea that "[…]most people don't want Google to answer their questions. They want Google to tell them what they should be doing next."

But it isn't just the implications for privacy that bother me about the quote. The idea of 'inventing the future' had already been on my mind when I saw the interview, because I had been mulling over a particular quote for a little while;

"The best way to predict the future is to invent it"

I saw this inspirational quote written on the wall of an office meeting room. It is credited to Abraham Lincoln.

Now, I just don't think that this is something that Abraham Lincoln ever said. Just a hunch, but it seems unlikely to me that Lincoln was really thinking along those lines while trying to preserve the United States. Of course, it could be that the quote is really an inside joke – a nod to Abraham Lincoln's Internet Wisdom, and the running joke about "The trouble with quotes on the internet is that you can never know if they are genuine" ('attributed' to Abraham Lincoln.)

Or it might be that someone forgot that if you Google for something, you tend to find what you are looking for. I don't know who put it up on the wall, and I don't really want to ask in case it isn't a reference to a meme, and it's just a misquote.

The thing is, the actual source of the quote is a man called Alan Kay, and if you're interested in the history and development of technology and the people who 'invented the future' that we live in, you might be familiar with him as the inventor of the "Dynabook" – an early-1970s concept of what a "personal computer" might look like.

In a paper from 1972 outlining the Dynabook idea, he includes a quote;

"To know the world one must construct it" – Pavese

Although Kay is often credited with the "inventing the future" quote (and certainly deserves to be strongly associated with it), it seems pretty clear that the inspiration came from elsewhere. (And probably not Abraham Lincoln…)

But if you want to see the real meaning of "inventing the future", then you could do far worse than looking at Kay and the work that was going on at Xerox PARC (the Palo Alto Research Center, where he worked in the 1970s). Because PARC was basically where most of the ideas behind what we think of as 'computers' were put together into a coherent product.

At a point in time when the science fiction future of computers involved banks of switches, blinking lights and beeping machines, the guys at PARC were putting together computers with graphical interfaces (ie. WIMP – a user interface built from Windows, Icons, Mouse and Pointer), the Paper Paradigm (the idea that the computer interface would be made up of analogues to the traditional desktop – so the "desktop", files and folders, the trash can), and WYSIWYG ("What You See Is What You Get" – before that, what you typed in a word processor gave you no clear idea of what it would look like when you printed it out on paper.)

I've written before about how part of the role of fiction (especially science fiction) is to help us to prepare for the future; the idea that Art is a form of cultural defence, which gives us frameworks to think about the kind of ideas that are soon to become a reality. In that context, it's worth noting that the Dynabook concept document above was, quite literally, written as a piece of science fiction.

These basic concepts of what a "computer" is and how it works aren't (and weren't) inevitable, or set in stone. They weren't a "natural progression of things." They were people's ideas and designs, dreamt up, executed on, iterated and refined. What bugs me about the (misattributed) quote is how much of its meaning is taken away by pulling it out of its context.

Because right now, 40 years on, some of these ideas are being challenged by the current generation of touchscreen phones and tablets. Some of them will survive, some of them will be replaced. Some of them will fade away, because something better will (or has) come along. Some of them will stick around, despite something better coming along. (Consider the QWERTY keyboard layout — designed to stop typewriter keys from sticking, surviving the transition to electronic keyboards, handheld keyboards, and now touchscreens — despite arguably better designs being introduced.)

The thing that is so impressive about the work that was going on back then at PARC isn't just how many of their ideas have lasted so long, but how much of what they were creating is now considered 'normal.' I don't think many people really stop to consider that the way a computer works wasn't 'natural' or 'obvious' or 'inevitable', but was the product of human imagination and the hard work of executing the ideas that people came up with. There is nothing fundamental to the workings of a microchip that means a computer should have a 'desktop' and 'windows', that files should be saved in folders, or that the documents on a screen should be laid out following the conventions of desktops and paper-based design.

So the story behind the quote, the man, the place and the work is a fascinating one. The culture of what was happening on the west coast of America — PARC, Stanford, ARPA, Douglas Engelbart, Vint1 Cerf etc. — was fundamental in shaping the tech scene of Silicon Valley today. This was the birthplace of modern computers, networks and the Internet, and of the designs that influenced the original Apple Macintosh and Microsoft Windows.

Because the point is that the future is ours — it is up to us to decide what we want to do, what problems we want to solve, what challenges are worth taking on, what ideas and dreams we want to turn into a reality – and what we don't.

That is what "inventing the future" means. "We" are inventing the future – people/companies like FourSquare are a part of that, but so are the users who choose to sign up for them, to opt in to particular services and to share particular data. The idea that "inventing the future" is the preserve of a Silicon Valley elite, while the billion-plus users of these services are just somehow passively along for the ride, is what grates on me.

  1. Originally, this read "Vintage", which I assume was an iPhone autocorrect mistake.

Throwing things

From XKCD's 'what if', as part of an answer to the question "How high can a human throw something?";

Throwing is hard. In order to deliver a baseball to a batter, a pitcher has to release the ball at exactly the right point in the throw. A timing error of half a millisecond in either direction is enough to cause the ball to miss the strike zone. To put that in perspective, it takes about five milliseconds for the fastest nerve impulse to travel the length of the arm. That means that when your arm is still rotating toward the correct position, the signal to release the ball is already at your wrist. In terms of timing, this is like a drummer dropping a drumstick from the 10th story and hitting a drum on the ground on the correct beat.

You should read the whole answer (actually, you should follow the blog, assuming you already know about and follow the XKCD comic itself.) But for a nice illustration of what we take for granted, here's a lovely video;

Video by Juan Etchegaray (Argentina) – juanetch.tumblr.com, Twitter: @juanetch (backstage: vimeo.com/59068094)

"Find The Thing You're Most Passionate About, Then Do It..."

"...on nights and weekends for the rest of your life."

A painfully resonant article from The Onion…

Before you get started, though, you need to find the one interest or activity that truly fulfills you in ways nothing else can. Then, really immerse yourself in it for a few fleeting moments after an exhausting 10-hour day at a desk job and an excruciating 65-minute commute home. During nights when all you really want to do is lie down and shut your eyes for a few precious hours before you have to drag yourself out of bed for work the next morning, or on weekends when your friends want to hang out and you’re dying to just lie on your couch and watch TV because you’re too fatigued to even think straight—these are the times when you need to do what you enjoy most in life.

Productivity through Laziness

It must be about 18 years since I figured out in my first job out of school that computers and laziness went together very well.

The job was a combination of data entry and data transfer - putting data from various sources into a CRM/sales database. But when I found a little-known Windows application called Recorder (part of Windows 3.1 or 95 or whatever it was that I was using at the time) that literally recorded where your mouse was moving and clicking around the screen, I discovered that I could multiply the number of entries I was getting through by a huge factor.

What's more, this also meant that I didn't need to pay as much attention to what I was doing, which meant that I didn't get as bored, as I could distract myself with other activities. Among other things, this was when I set my personal record for Minesweeper, learnt to touch type, and learnt about the internet. Which prompted a whole bunch of other stuff…

(Unfortunately, despite the fact that I could point to better numbers that proved I was getting more work done, it didn't look like I was working harder. In hindsight, I obviously understand why, but still, it was a valuable lesson - sometimes looking busy can be more important than actually being productive...)

Anyway, Recorder has been and gone (I think largely replaced on the PC by Visual Basic) but I've been discovering what sort of things can be done with Automator and Applescript on the Mac - tools that do a similar thing to Recorder, but in a much cleverer way (ie. giving commands to applications, rather than hoping that things like boxes and form fields will be in exactly the same place every time…)

And I'm feeling pretty pleased with myself. This morning, I managed to take 646 spreadsheets spread across 34 different folders (poorly labelled, I might add) and turn them into 34 organised and clearly labelled Excel workbooks, in the space of about 2 hours.

In those same 2 hours, I attended a half hour long meeting.

Now, I have no idea how long that would have taken to do manually – I'm guessing at least a couple of days (given the mind-numbing nature of the task and opportunities for human error to mess things up.)
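(For anyone curious what that sort of automation looks like spelled out, here's a rough sketch of the folder-by-folder consolidation as a script. To be clear, this is a hypothetical Python/openpyxl version with made-up paths, not the actual Automator/AppleScript workflow I used;)

```python
# Rough, hypothetical sketch of the consolidation job in Python (openpyxl),
# rather than the Automator/AppleScript workflow described above.
# Paths and names are made up for illustration.
from pathlib import Path
from openpyxl import Workbook, load_workbook

source_root = Path("~/Desktop/messy-spreadsheets").expanduser()
output_dir = Path("~/Desktop/consolidated").expanduser()
output_dir.mkdir(exist_ok=True)

for folder in sorted(p for p in source_root.iterdir() if p.is_dir()):
    combined = Workbook()
    combined.remove(combined.active)  # drop the default empty sheet

    for xlsx in sorted(folder.glob("*.xlsx")):
        src = load_workbook(xlsx, read_only=True, data_only=True)
        for sheet in src.worksheets:
            # Excel limits sheet names to 31 characters
            title = f"{xlsx.stem}-{sheet.title}"[:31]
            dest = combined.create_sheet(title=title)
            for row in sheet.iter_rows(values_only=True):
                dest.append(list(row))

    if combined.sheetnames:  # skip folders with no spreadsheets in them
        combined.save(output_dir / f"{folder.name}.xlsx")
```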

The bottom line is this - if you use a computer to work but don't really know how to use a computer, then there is a very real chance that you are actually doing the kind of work that a computer could do by itself in a lot less time with just a little supervision.

But, if you are lazy enough to figure out how to get your computer to do your job for you (or at least as much of it as possible), then that puts you in a much stronger position in the possible workplace of the future.

(But you might be lonely in the office…)

"To a Boy of the 1970s, the Line Between Comic Books and Real Life People Was Hopelessly Blurred"

(via DaringFireball.net)

I love this.

You have to understand that to a boy of the 1970s, the line between comic books and real life people was hopelessly blurred. Was Steve Austin, the Six Million Dollar Man, real or fake? Fake? Well, then, how about Evel Knievel jumping over busses on his motorcycle? Oh, he was real. The Superman ads said, “You will believe a man can fly,” and Fonzie started jukeboxes by simply hitting them, and Elvis Presley wore capes, and Nolan Ryan threw pitches 102 mph, and Roger Staubach (who they called Captain America) kept bringing the Cowboys back from certain defeat, and Muhammad Ali let George Foreman tire himself out by leaning against the ropes and taking every punch he could throw. What was real anyway?

It never occurred to me that before television, children probably didn't live in a world where reality and fiction were so confusingly intertwined.

It reminds me of a quote I heard (can't remember or find a source) along the lines that "The magic of Disneyworld wasn't in making people believe that it was real, but making them believe that the rest of the world wasn't just an illusion."

"BlackBerry CEO on the future of tablets"… For who?

“In five years I don’t think there’ll be a reason to have a tablet anymore,” Heins said in an interview yesterday at the Milken Institute conference in Los Angeles. “Maybe a big screen in your workspace, but not a tablet as such. Tablets themselves are not a good business model.”

I saw this get posted around a few times yesterday, and I can understand why; BlackBerry CEO sounding out of touch is obviously a fun story.

I haven't found the actual interview he is being quoted from, but my guess is that this is out of context. We know enough about the tablet market to know that there are three camps who are doing well; Apple being the biggest, selling iPads at a premium (and making money), everyone else selling devices virtually at cost (Amazon with the Kindle Fire, Google with the Nexus 7), and Samsung occupying a much smaller space in between — selling Galaxy tablets at premium prices, but in far lower volumes than Apple. (I should note that I think Microsoft's position in the market isn't yet clear — I wouldn't call the Surface a failure, but I just haven't seen enough data yet to make a clear call.)

If he is talking about there being no reason for people to have a tablet in 5 years, then he's probably wrong. But if he is saying that for a BlackBerry (or Motorola, Samsung, LG etc) to have a tablet in 5 years is not going to be a good business model, then I'm inclined to agree with him.

I'm still struggling to imagine Alicia Keys delivering a PowerPoint deck though…

"Google Now iOS App Keeps GPS Active"

When I woke up this morning, I noticed that the GPS icon was showing on my iPad. The only app I had running was my alarm clock, which doesn’t use Location Services. It was odd but didn’t give it too much more thought. Then I was browsing Twitter and noticed someone had tweeted that the new Google Search app (with Google Now) had a flaw, leading to the GPS being always on. Aha! Perhaps mystery solved. I checked my iPhone. Sure enough. GPS icon active.

Some confusion going on here about the 'GPS icon' on the iPhone. The iPhone (and iPad) has a small icon in the menu bar that appears when something on your phone is using Location Services. Location Services is an API that is used to identify where you are.

I would guess that the confusion comes from the fact that the icon is the same one used in the Maps app to show where you are (ie. using Location Services) – a concept that everyone is familiar with through GPS devices, and one which does use GPS on devices that have a GPS chip. (Wi-Fi-only iPads and the iPod Touch have Location Services, but no GPS chip.)

But "Location Services" doesn't mean GPS. GPS is a part of it, but at any given time, a phone that is connected to a network has some information about where it is, because it knows which network cells it is currently connected to. Similarly, devices on a Wi-Fi hotspot can deduce information about where they are based on the hotspot and IP address. Location Services allows an application to tap into this information, and the icon is there to let you know that something you use is tracking where you are. If more detail is needed, then Apple's APIs will use services like GPS to get more accurate location data (at different levels of accuracy, depending on what is needed – more accuracy requires more power.)
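To make that "cheapest source that meets the requested accuracy" idea concrete, here's a toy sketch. This is plain Python with made-up numbers purely to illustrate the trade-off – it isn't Apple's CoreLocation API, and the accuracy and power figures are just guesses;

```python
# Toy model of tiered location lookup: use the cheapest source that is
# accurate enough for what the app asked for. Not Apple's CoreLocation API;
# accuracy/power numbers are illustrative guesses only.

# (source, rough accuracy in metres, relative power cost)
SOURCES = [
    ("cell towers", 1000, 1),   # a connected phone always roughly knows this
    ("wi-fi lookup", 100, 2),   # hotspot / IP-based positioning
    ("gps", 10, 10),            # most accurate, most power-hungry
]

def locate(required_accuracy_m):
    """Return the first (cheapest) source good enough for the request."""
    for name, accuracy_m, power_cost in SOURCES:
        if accuracy_m <= required_accuracy_m:
            return name, power_cost
    return SOURCES[-1][0], SOURCES[-1][2]  # fall back to the best available

print(locate(5000))  # ('cell towers', 1)  - "roughly which area am I in?"
print(locate(50))    # ('gps', 10)         - "am I actually inside the shop?"
```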

This might mean that an application will 'wake up' when you are in a certain area and then switch on GPS to confirm (for example) whether you are actually in the area (say, a train station, a shop, at home) or just nearby.

Apple's developer documentation explains how this all works on a technical level, but the bottom line is that the location services icon is not a GPS icon, and it doesn't mean that something is using GPS and draining your battery.

Whether you still want to use an application that is constantly tracking your location is up to you, though. Of course, a side effect of the icon being permanently on is that, even if you trust Google with that information, you can no longer easily tell whether any other applications are using Location Services without going into the Settings menu to check manually.

WWDC 2013: $8 million in less than 2 minutes

WWDC – Apple's annual developer conference for anyone who wants to stay ahead of the game when it comes to all things related to iOS and OS X development.

$1,600 a ticket. About 5,000 attendees (the capacity of the conference centre where it's held). Videos and slides are made available to all registered developers afterwards (for which you pay a fee of something like $80 a year, which you need to do if you want to get apps in the App Store.) And unlike Google's I/O conference, they don't have a history of giving away free gadgets. You get admission, and that is all.

Last year, it sold out in 2 hours, when dates were announced and tickets went on sale at the same time.

This year, dates for the conference and ticket release time were announced together earlier this week. And tickets just sold out in less than 2 minutes.

If you're wondering whether developers are losing interest in Apple's platforms, then the fact that 5,000 of them just collectively handed over about $8 million to hear what Apple has to say *in person* might be a relevant data point to bear in mind.

Falling Giants

Over the last decade or so, the most unforeseeable developments probably weren't the growth of a company, or the rise or emergence of a new consumer product, category or technology. The thing that was unthinkable 10 years ago was that the clear leaders in technology – Microsoft in computers, Nokia in mobile phones, AOL in the consumer web/internet – would not be able to retain their obvious leadership positions.

Smartphone install bases

Related to my last post, a post from Benedict Evans on (global) smartphone install bases; it's easy to forget that when the average phone lasts 24 months, the current pace of smartphone growth is really pretty fast. Based on some fairly simple arithmetic;

… about 90% of current Android users are on their first Android, as are 70% of iPhone users. It'll be interesting to see what their second purchase is.
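The arithmetic really is simple: if most of today's install base was added within the last replacement cycle, then most of today's users must still be on their first device. As a back-of-the-envelope sketch (the install-base figures below are made-up illustrative numbers, not Evans' actual data);

```python
# Back-of-the-envelope model: anyone added to the install base within the
# last replacement cycle (~24 months) is assumed to still be on their first
# device. Install-base figures (in millions) are illustrative guesses only.

def first_device_share(base_now_m, base_two_years_ago_m):
    """Rough share of current users still on their first device."""
    added_in_last_cycle = base_now_m - base_two_years_ago_m
    return added_in_last_cycle / base_now_m

print(f"Android: {first_device_share(900, 100):.0%}")  # ~89%
print(f"iPhone:  {first_device_share(300, 90):.0%}")   # 70%
```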

Mobile stats - "All the numbers, in one place"

A handy collection of mobile stats collected by Harry McCracken. A little too US focussed to be of much use for me personally, but still worth a look.

However, it is worth noting that, as he states, this isn't an analysis of why the numbers look like they do;

I’m collecting rather than interpreting, though I hope that some of you will draw conclusions in the comments.

The one that stands out for me (because it's the most interesting from a media perspective); "Which platform gets used most on the internet?"

According to NetMarketShare, it's iOS, with a 61.4% share (over 24.3% for Android.) But according to StatCounter, it's Android, with a 37.2% share (over 27.1% for iOS.)

Why the discrepancy? Harry doesn't offer much;

Of course, the two organizations’ methodology may be radically different; I’m not sure, for instance, whether both, either or neither of them include the iPad in these numbers. But the disparity is a healthy reminder that it’s risky to draw conclusions from data you don’t know very much about.

True enough. But it's my job to understand this stuff, so I thought I'd take a look.

StatCounter say that;

Net Apps base their stats on unique visitors per site per day. ("We 'count' unique visitors to our network sites, and only count one unique visit to each network site per day.") We base our stats on page views.

So, it would seem that NetMarketShare are reporting far more iOS devices online (based on daily site visits), while StatCounter are reporting somewhat more pageviews from Android devices (indicating that the smaller number of Android devices in use are being used much more heavily.)
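A quick toy example shows how the two metrics can give opposite answers. The numbers below are entirely made up to illustrate the point – they aren't NetMarketShare's or StatCounter's figures;

```python
# Made-up numbers, purely to show how "unique devices per day" and
# "page views" can rank the same two platforms in opposite orders.
platforms = {
    #          daily unique devices (millions), pages per device per day
    "iOS":     {"uniques_m": 60, "pages_per_device": 8},
    "Android": {"uniques_m": 40, "pages_per_device": 15},
}

for name, p in platforms.items():
    pageviews_m = p["uniques_m"] * p["pages_per_device"]
    print(f"{name}: {p['uniques_m']}m uniques/day, {pageviews_m}m page views/day")

# iOS:     60m uniques/day, 480m page views/day  <- leads on unique visitors
# Android: 40m uniques/day, 600m page views/day  <- leads on page views
```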

Although it still isn't clear whether that really tells the whole story. As well as reporting different metrics, the two companies use different data collection methods and different ways of processing them. This is far from an in-depth analysis — neither of the data sources is one that I use regularly, and I don't know much about their methodologies; this is still very much a top-line interpretation rather than a 'proper' analysis.

But, I do know that tablet users are likely to be more active online than smartphone users, and that iOS has a far greater share of the tablet market than the phone market. Some analysis I did last year on comScore's figures indicated that, in terms of traffic, despite there being far fewer devices, iPads account for more than iPhones in the UK (and iPhones account for more than Android.) So for the UK (which is the only market I'm really interested in), iOS still looks like the dominant smartphone platform.

Not Necessarily News #3

This is the third of my 'not necessarily news' posts, where I try to wrap up the stuff that caught my eye in the past week (give or take a day or so – I haven't quite figured out my posting routine yet.)

The general idea here is that I don't want to be trying to cover 'news' – the idea of throwing something away because I didn't manage to write a few paragraphs about it within 24 hours isn't really what I want to be doing.

This week, I've gone with something of a theme around the whole 'news' thing, and trying to explain (through the words of people much smarter than me) why I'm trying to do this 'newsletter without the news' format.

What is the single most important life lesson older people feel young people need to know?

A nice summary of interviews with 1,500 70+ year olds about the advice that they would offer to younger people;

Another point worth making is advice the older folks consistently did not give: No one— not a single person out of a thousand— said that to be happy you should try to work as hard as you can to make money to buy the things you want. No one— not a single person— said it’s important to be at least as wealthy as the people around you, and if you have more than they do it’s real success. No one— not a single person— said you should choose your work based on your desired future earning power.

Do we need a new Save icon?

Some Random Dude — Why Redesigning the Save Icon is Important

The age or establishment of something shouldn’t preclude it from scrutiny and/or replacement. If anything, that should make us all the more eager to pull it down from its pedestal. Our job is to make things better—or at least try to do so. The Save icon is not good enough. We should try to make it better.

It's a fair point. My last computer had a floppy disc drive that I never used. I'm guessing it must be about 10 years or so since I last used one. For me, the various jobs that floppy discs did got replaced by hard drives, USB sticks, Dropbox and email.

But what does the save icon really mean? "Put this file I'm working with on a disc."

The issue isn't that the icon has become obsolete. It's that the act of saving has become obsolete. Why do you want to save?

My main tools today:

  - nvALT – automatically saves as you go.
  - Drafts – automatically saves as you go.
  - Elements – automatically saves when you close a document (ie. finish working with it.)
  - VoodooPad – I honestly don't know whether this saves as you go along or just when you close; I don't recall ever saving.

What I do do in VoodooPad (and several other applications) is export – a VoodooPad document as a set of web pages, a layered Photoshop .PSD as a .PNG.

What I never think to myself is "this would be a bad time to save", or "I wish my auto save was set to every 10 minutes instead of every 5."

In a world with version control, document history (think of Photoshop's History tool), cloud storage etc. there isn't any point in manually saving. It should be automatic. There should never be a difference between the version of the document I'm working on in RAM and the version on disk.

More to the point, the most annoying thing in my day to day workflow isn't losing unsaved work. It's having two versions of the same document with different changes because I was working in an application that doesn't save as I go along.

So, I agree that the Save icon is overdue a rethink. But the issue isn't that it needs a different picture. It needs to be got rid of altogether.

"Digg Does The Google Reader Survey That Google Should've Done"

You can’t help but wonder how and why Google — a company that lives and breathes data — never bothered to survey Google Reader users to measure their passion for the product before they decided to shut it down on July 1st.

Probably because Google know exactly how many users Reader has, what feeds they follow, how often they check them etc.

(For every single user — not just the ones who chose to fill out a survey for Digg…)

The Big News

It seems like being on Twitter a lot is a great way of staying on top of a) news in niche areas (the kind of things that nobody around me is bothered about), and b) little news (the kind of news that someone else will mention half an hour after I have read about it.)

But on days like yesterday, when there is 'big news', it's funny how I never seem to hear it first on Twitter. Yesterday, someone who had walked past a TV in the office told us that Margaret Thatcher had died. (My first reaction – check Twitter to see what was going on.)

When the 7/7 bombing happened in London, the first I heard was a phone call from my mum, who had heard it on the radio.

Unless you are literally living in Twitter – and don't get me wrong, I'm not denying that I have days like that – then when news hits that is so big that it hits the airwaves immediately, broadcast media is still the place to find out what is happening. The fact is, no matter how carefully selected the people you follow are, there are times when all eyes in the newsroom turn to the TV. Where audiences are measured in millions on a daily basis (and not the meaningless "potential reach" kind of millions that people talk about when they want to make a Twitter campaign sound impressive). Where someone can be mic'd up and talking in front of a live camera while the web guys are still entering their passwords into the CMS…

Today, of course, it is the front page story for the papers. Which is probably the subject of a whole other post…