
Unordered: Overdue

While Brexit has either been dragging along for years with nobody actually doing anything, or with lots of people rushing around doing big, important things that nobody really gets to see or understand, one thing remains constant: I'm not writing as much as I'd like to. (That is, more than about a post a month.)

So: an overdue Unordered is here... but not before a little bit of self-promotion; I did a podcast recently for work, talking about 5G, and I think I managed to sound like I know what I'm talking about. (Given the number of reports I've read, conference talks I've attended and analysis I've done, I really should do...) You can listen to it here.

Facebook's updated ad policy

Facebook have updated their T&Cs for advertisers, and now expressly forbid advertising with "false or misleading content". Which sounds like a sensible move - except that a) it's a narrowing of what was previously considered "unacceptable", and b) it doesn't apply to political advertising.

https://twitter.com/JuddLegum/status/1179740913771405313

I just don't even know where to start with this. Is this supposed to be an open invitation for regulation to do a job that Facebook don't want to do? Regulation around political advertising - in the UK, at least - tends to be media-specific, and for online it's pretty much non-existent. Last year, the IPA called for a suspension of micro-targeted political advertising, making the point that politics (read: democracy) relies on an open, collective debate - adverts that only certain people can even see don't contribute to this, and are "vulnerable to abuse".

More from The Guardian; https://www.theguardian.com/technology/2019/oct/04/facebook-exempts-political-ads-ban-making-false-claims

And, while we're in the middle of a big Brexit advertising campaign funded by the UK government and a prime minister using the Queen's Speech to set out his election manifesto, "an increasing number of countries have experienced coordinated social-media manipulation campaigns. It’s now 70 in total, up from 48 in 2018 and 28 in 2017, according to a report by researchers at Oxford University."

https://www.technologyreview.com/f/614438/70-countries-around-the-world-now-run-organized-disinformation-campaigns/

Ben Thompson has a great post that puts Zuckerberg's recent speech into some historical context - anything that talks about what is happening right now in the context of the impact that the printing press had on society will always make my ears prick up.

https://stratechery.com/2019/the-internet-and-the-third-estate/

On the topic of Facebook and politics - Rob Blackie has a Twitter summary of a presentation about Trump's 'winning' communication style and how it works that is well worth a read; https://twitter.com/robblackie_oo/status/1179378135365750789

The Only Grindr User in the village

Nice piece from the BBC on dating apps/websites and small communities; https://www.bbc.co.uk/bbcthree/article/c3a7ba4e-b79d-4c2a-9fbf-fb3ddd54d972

But the thing that blew me away was this embedded tweet;

To me, that is a story about a seismic societal shift going on - people don't meet through friends, or family, or work, or any of that old-fashioned "social" stuff. They either meet on the internet, or "in a bar or restaurant".

One of the (many) reasons I've long been suspicious of Facebook is the idea that it sits in between you and your friends and chooses who gets to see what. But the idea that the bonds that form the foundation of families in the 2000s aren't built on a shared network of friends and family - ie. the kind of support network that you need in tough times - but on chance (happening to be in the same bar at the same time) and algorithms is just mind-blowing to me. I mean, I've been very aware how lucky I am that I met my now-wife at university through mutual friends and never really had to do either the "dating thing" that we did before apps took over, or the "dating thing" that replaced all that with Tinder or Grindr or Bumble or OKCupid or whatever, but "met online" is now bigger than "met through friends" has been for the last century...

#mindblown

(And I don't even want to think about how many of those who "met in a bar or restaurant" consider the Tinder bit that preceded it irrelevant/embarrassing...)

Netflix: Bring on BARB Measurement

A bit niche this one, but Netflix have indicated that they would welcome BARB measurement for their platform. Jon Manning (a colleague who knows a lot more than me about TV advertising and measurement) wrote a piece for Campaign magazine (UK advertising industry trade press) about what it might signal, and I agree with Jon on this one. It probably warrants a post of its own, but given my typical speed of publication (see: the time between this post and the podcast that prompted it) I'm just going to say for now;

  • Netflix need to justify their stock price above all else, and their stock price is currently predicated on a future entertainment monopoly. To do that, they need to demonstrate not just that continued growth is possible, but also that it will be profitable.
  • That means that a business model that lets users stop subscribing but still see the flagship shows just won't work. If they were to go with a super-premium advertising product, they might be looking at something like a £50 CPM (cost per thousand advertising impressions delivered). Think about how many adverts would need to be delivered to compensate for a lost subscriber at ~£10 a month - there's a rough sum after this list. (The lowest UK price is currently £5.99 a month, but you pay more if you want to cover a household where more than one person might be watching at once/HD etc., and I'm thinking about where they will be in a year or two - not where they are today.) The numbers just don't add up - it worked OK for Spotify, where "adverts that irritate people into subscribing" certainly helped their growth story. But it isn't going to work in a competitive environment.
  • Part one of their story is done - they switched from "21st Century DVD rental company" to "online video platform", and it isn't hard to find numbers that show that they have a lot of subscribers.
  • Part two of the story is different though. It's not about "Netflix vs TV" - it's about "Netflix vs. Amazon, Disney, Apple, HBO, Britbox, a bunch of other SVOD providers vs. TV". They need to prove not just that they have a lot of customers, but that they are their customers' favourite out of a bunch of subscription services.
  • To do that, they need numbers that show not just how big their audience is, but how engaged they are. How they watch more of it than the others. How they check what's on Netflix before they check what's on TV.
  • If you want to do that, you don't do it by drip-feeding your own numbers that happen to support your story. You do it by opening up to independent, trusted third-party measurement that is comparable with whatever you want to benchmark yourself against. Which, in this case, in the UK, means BARB.
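Here's that rough sum, by the way - a back-of-envelope sketch in Python, using the illustrative £50 CPM and ~£10/month figures from the list above (not real Netflix numbers);

    # How many ad impressions would it take to replace one lost subscriber?
    # Illustrative figures from the bullets above - not real Netflix numbers.
    subscription_per_month = 10.0  # ~£10 per subscriber per month
    cpm = 50.0                     # £50 per thousand impressions ("super-premium")

    impressions_per_month = subscription_per_month / cpm * 1000
    print(impressions_per_month)   # 200.0 - roughly 7 adverts a day, every day,
                                   # served to one ex-subscriber at a premium price

Two hundred premium impressions a month, per lapsed subscriber, is an awful lot of inventory to conjure up.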

Again - this is a topic that I think is well worth going into in more depth - TV measurement and Netflix are very interesting to me at the moment, especially with Disney+ and The Mandalorian on the horizon.

And speaking of Star Wars, I can't let this pass without mention;

Lots for the old-school fans in there - lightsabers, Tatooine, Endor, the Falcon, the blockade runner... and the closing Carrie Fisher shot.

(I still don't really know what the title is about, but then I'm staying a lot further away from spoilers and the Star Wars online communities than I did the last time around...)

Related: the story of the kid who Alec Guinness asked to not watch Star Wars again: https://www.buzzfeed.com/chowdad/alec-guinness-hated-star-wars-and-i-should-know-20egr

Fortnite's "The End"

Watching my son spend at least two weeks getting excited about the next Fortnite event, watching the event as he watched it (sidenote: watching a first-person shooter when you're not the first person is incredibly annoying) and listening to him chatting with his (online) friends (or should that be "friends online"?) was quite an interesting experience. Some thoughts;

  • Even though it was a ten minute "event", they were still talking 40 minutes or so into the post-event "black screen with a black hole in the middle", waiting for something to happen. I was simultaneously dismissive of them wasting their time in such a pointless way, while also being kind of jealous of the enthusiasm. Imagining how I felt about something like a Star Wars trailer or a Grand Theft Auto launch, and how I would have felt about being able to experience it with my friends online (not just "online friends") - I would totally have spent an hour watching nothing happen in the hope that it would.
  • Also - it did! There were some numbers, that led to a secret code, with a secret message... that reminded me of when Lost was cool and mysterious and exciting.
  • I can't think of a comparable "cultural event" from my lifetime that wasn't on TV.
  • I still think that Save the World is way better than Battle Royale though.

I haven't given this Unordered a number, partly because I've kind of lost track and partly because I'm taking a leaf from both Fortnite's and Dan Hon's books and preparing for a Season Two reboot. So watch this space for that...


This website is not data-driven

I like to tell people how I've been blogging since before it was called blogging. Technically, it's not true - I started my first website (on Angelfire!) in 2000, and the term "blog" was coined in 1999 - but what "blogging" meant back then was more like what people put on social media now, and what I was doing then was much more like what "blogging" tends to mean now.

Anyway.

Once my web development chops progressed from hand-coding HTML pages on Angelfire to setting up a web server and running my own CMS, I noticed something about the way my CMS was designed that was influencing what I was writing.

These days, it's pretty standard to use Google Analytics to measure your website's traffic, but back then each of my posts had a counter to tell me how many people had read it.

I'm not talking about those "hits" counters that really old websites all had (along with marquee text, blink tags, animated gifs etc.) - this was an internal thing that only showed up if you were logged in.

I switched it off. It was surprisingly difficult, and involved hacking into the CMS code for some reason or other, but I realised pretty quickly that I didn't want it - because the thing I really liked about writing (as my kind of blogging was called back then) was the way it made me think about what I was writing.

The point of writing, for me, was always more about my own thinking than anything else. If I was faced with numbers telling me that Post X had been read 200 times and Post Y had only been read 20 times, then I figured that would only make me more likely to write about whatever Post X was about. Not only did I not particularly want to spend time going into the analytics and trying to "build an audience", I realised that I very specifically wanted not to know what I was doing that was "driving engagement".

Obviously this was in a different era, culturally speaking. Today, my kids are watching educational TV about how the "like" counters on social media affect the way the content is perceived. Facebook are thinking about removing the "like" counters from posts.

But an interesting side-effect that I didn't really foresee is that this has changed the way I write blog post titles. Usually, the idea is that you write a title that will draw in the readers - which isn't quite the same as "clickbait", although I'm not really sure I could articulate the difference... But for what I'm doing, I'm more interested in a title that either summarises the post or (my preference) makes perfect sense - but only after you've read it - so that if I'm looking through a list of my post titles, I can find the one that I want.

So if you were wondering why my post titles are so bad... it's not that they are 'bad' - they are just maybe doing a different job to what headlines "normally" do.

Which... is a bit of a weak conclusion to this post, but it reminded me of a good story about "breaking the rules" from Dan Hon's (excellent) newsletter. It's only vaguely connected - there's something in it about friction (or making things easy), and about "building communities" being about the kind of people you want in your community, which feels vaguely analogous to not writing for an audience of people who probably aren't really interested in what you're writing about for the sake of getting 'hits' - but you should totally read it.


How much has TV viewing really changed?

I was listening to the Rule of Three podcast with Laurence Rickard (of Horrible Histories and Yonderland fame, to anyone with young children in the UK). There is some conversation about producing TV programmes for different channels (CBBC - Horrible Histories, Sky - Yonderland, and BBC1 - Ghosts), the differences in audience expectations for a programme that goes out on a given channel at a particular time, and how changing viewing behaviours mean that how programmes are watched is not as closely tied to how they are scheduled any more.

Joel Morris (one of the podcast hosts) says, about 4 and a half minutes in;

"We did some work about 5 years ago for Gogglebox [...] I was always under the impression that Googlebox was a fiction, that families didn't gather around the TV and watch it anymore [...], and we looked up the BARB figures (about 5 years ago) were that 85% of programmes were watched live, by a family, on a sofa, when they went out. And I thought - god, I thought that was a vintage and antique thing. I am sure that is not true any more, and that must have changed really quickly, and I'm wondering whether the culture of comedy, where the demand is for things to hit straight away, whether people have quite caught up with the fact that people consume everything so differently now."

I'm really interested in changing TV viewing behaviour for all sorts of reasons (some of which I talked about here - an old post, but still relevant), so I thought I'd take a look at whether this is true - what has changed in the last 5 years?

First of all, for the sake of clarity, let's look at what is being said;

85% of programmes were watched live, by a family, on a sofa, when they went out.

OK - there are a few things going on there, so for the sake of clarification;

  • "85% of programmes" - this is a bit of a fuzzy one, because you aren't really talking about "85% of programmes" - a given programme can be watched live, by a family, on a sofa, when it goes out by one household, but watched from a recording, by a single person, in a bedroom, the day after it goes out next door. I'm going to assume that what Morris meant was "85% of viewing time to programmes" - ie. "85% of all the time being spent by people watching television programmes". Because nobody really cares about "percent of all programmes" - a metric which would put a world cup final on equal footing to an obscure programme on an obscure digital channel, broadcast at 3am on a Tuesday...
  • "watched live" and "when they went out" - I'm going to lump those together as meaning "watched when they are broadcast", because "live" has a slightly different meaning. (ie. Gogglebox is pre-recorded, so its never "watched live"... unless you are using the TV Licence definition of "live TV"...) So I'm looking at what is viewed when it was broadcast, vs what is timeshifted (ie. watched at a different time to when it was broadcast, whether recorded, streamed or downloaded.)
  • "by a family" - BARB data doesn't really have an easy way to say what is watched "by a family", but I can look at what is watched by an individual on their own vs. "shared viewing" watched by 2 or more people together.
  • "on a sofa" - OK, this isn't really something that is captured by the BARB data - you can look at what room a TV is in (ie. living room, bedroom, kitchen etc.), but I'm interpreting this to mean "on a traditional TV set" - as opposed to on a computer, tablet or smartphone.

Luckily for me, the BARB website has some of the data available to the public.

But there's another bit of the quote worth pulling out;

I am sure that is not true any more, and that must have changed really quickly

Firstly, live vs. catch-up viewing; in 2014, live viewing accounted for between 85.8% (w/c December 22nd) and 90% (w/c June 23rd) of all viewing. In 2019, that figure has changed... a little. The most recent data at the time of writing is for w/c 1st July, where live viewing accounted for 85.8% of viewing - the highest for the year to date. (The lowest was 83.2%, w/c 8th April.) I suspect that this live vs. recorded split is what he was talking about - just because the number is so close. But has it changed much? I'd say no. A bit, but not much.

Next is "by a family" - which I'm looking at as viewing by one person vs viewing by more than one person. These numbers are a bit trickier to get hold of - I couldn't find anything on the BARB website for solo vs shared viewing, but as I work for a BARB subscriber, I do have access to their full data. This is what the share of "solus" viewing by month looks like for the last 7 years;

Solus as a share of total viewing minutes

So - 85% of viewing is definitely not watched "as a family", because nearly half of it is watched alone... but that said, I'm not seeing anything that makes me think there has been a significant change in viewing behaviour over the last decade. (Interestingly, we watch more TV on our own in the summer months - which is when we tend to watch less TV overall.)

Finally, "on a sofa"... or, my interpretation, on different screens.

BARB report two sets of data at the moment - firstly, what's watched on TV screens, based on a panel of households with meters monitoring what the TV is doing. Secondly, what is being watched via the "players" (ie. iPlayer, ITV Hub, All 4, Sky Go etc.) on other devices, based on census-level data from the players themselves.1

You can see the time by device data on the BARB website here - I've had to pull the equivalent data from BARB myself to get to this chart of total viewing minutes.

[Chart: total weekly TV viewing minutes, by type of viewing]

From the bottom, we have a combination of "live" (ie. as broadcast) viewing and VOSDAL - Viewed On Same Day As Live (ie. watched later on the same day as the broadcast). Then we have viewing that happens between 1-7 days after broadcast, and a much smaller slice of programming watched between 8-28 days after broadcast.

Then there's a purple slice of viewing that is "unknown" - the way BARB works is by matching what is on a TV with an archive of broadcast content from the last 28 days, and anything that isn't matched is "unknown"; that covers watching DVDs, or Netflix exclusives, or playing computer games. It's generally understood that most of this is accounted for by Netflix and Amazon Prime, but exactly how much isn't massively clear - and certainly a bigger topic than I can cover here (where I'm already oversimplifying more than I would like...)

Then there is a tiny sliver along the top that is the billion or so weekly minutes of "player" TV being sent out to computers, tablets and smartphones (ie. very unlikely to be watched by a family, together, on a sofa). A billion sounds like a big number out of context...
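If it helps to see those buckets written down, here's a minimal sketch of the classification as code (Python; my own whole-day simplification - the real BARB rules are more involved than this);

    from datetime import date

    def viewing_bucket(broadcast: date, viewed: date) -> str:
        """Crude time-shift buckets, echoing the slices in the chart above."""
        gap = (viewed - broadcast).days
        if gap == 0:
            return "Live/VOSDAL"            # watched on the day it was broadcast
        if 1 <= gap <= 7:
            return "Timeshift, 1-7 days"
        if 8 <= gap <= 28:
            return "Timeshift, 8-28 days"
        return "Unknown"                    # outside the 28-day matching window,
                                            # or not broadcast content at all

    print(viewing_bucket(date(2019, 7, 1), date(2019, 7, 3)))  # Timeshift, 1-7 days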

As I said, I've been watching these numbers for quite a while, so none of this really surprised me very much. About 12 years ago, a colleague of mine did some analysis on PVR viewing where he found that when a household gets a PVR, they do start skipping adverts - but they also watch more TV, and the extra TV viewing outweighs the ad skipping, so they actually end up seeing more advertising - at least, for the first 6 months or so. That taught me two important lessons;

  • First, we are incredibly bad at understanding our own behaviour. (I wrote about this some years ago - see "Do you really know how much TV you watch?")
  • Second, the intuitive conclusion - in this case, that we see fewer adverts when we start skipping them - isn't necessarily backed up by the data.

Why the Sofa is the most important media device

My theory is that the space that "TV" fits into people's lives simply hasn't really changed much. Generally speaking, people go to work, come home, deal with the household chores that have to be dealt with, and then at the end of the day they sit down on the most comfortable chair in the house - which has a TV in front of it - and they watch TV.

The thing in the living room with the biggest influence on TV viewing isn't the TV (big, small, smart, dumb etc.) or the things plugged into it (ie. the satellite/cable box, recorder, DVD player, games consoles, wifi dongle etc.), or the things that are on the screen (TV broadcast, videos, DVDs, Netflix, games etc.), but the sofa.

As Marshall McLuhan said, "The medium is the massage". Or, "Content is the juicy piece of meat that distracts the watchdog of the mind". If you want to see the effect that TV has on people, watch the people - not the TVs.

That said, exactly what "watch TV" means is changing - I grew up with 4 channels to choose from, which meant that the best way of representing how people watch TV was the idea of "least objectionable programming" - people would see what the options were and settle on the one they objected to the least; the least boring, least offensive, least upsetting or unsuitable option would get watched. That model doesn't really work when you've not just got a dozen or so channels to choose from (just looking at the channel repertoire of what people actually watch - ignoring the dozens and dozens of channels in the EPG that you never actually switch over to - generally, people have about 23 channels that they actually watch, no matter how many channels they can watch), but also a library of pre-recorded favourites on your Sky+/PVR (sure, we had VHS tapes along with our 4 channels, but actually choosing one and watching it was a very different experience), as well as the archive of programmes available on the iPlayer/ITV Hub/All 4/Netflix/Amazon Prime/YouTube etc.

But our perception changes much faster than our behaviour, for a number of reasons;

  • The more we think about something, the more we remember it. That means if we are looking through a library of content - whether actively looking for a specific programme or browsing through genres hoping to discover something appealing - we are much more likely to remember that act than flicking through channels to see what is on - which in turn, we are much more likely to remember than watching whatever happened to be after the programme we were watching because nobody objected enough to switch over. (Which is also related to why we remember fast-forwarding through the adverts more than we remember not fast-forwarding through the adverts, and similarly over-estimate how much we do it.) Our perception of ourselves is fundamentally biased. (Similarly, we remember doing stuff on our phone while watching TV more than we remember not doing stuff while watching TV.)
  • We focus on the new - what has changed - and ignore the familiar until it goes away. So when we think about how TV is changing, we think about the technology drivers - big screens, Sky+, iPlayer, iPads, Netflix - and don't think about what is stopping things from changing. I suspect that if you were to take away the sofa from a typical household - or even to move it into a different room - you would see bigger changes in behaviour than introducing any piece of technology. (Well, you'd probably just see people moving it back again, but you get the idea...)
  • We lie to ourselves. We like to think that we are the sort of person who doesn't just spend hours watching TV every single day. We like to think we are the sort of person who carefully curates and selects the content that is worth our time. (This is on top of the fact that we are wrong about how we are spending our time in the first place.)
  • We are lied to. What the media tells us about how everyone else is watching TV is about the changes - advertising the new technology and the new services, reporting on the impact of the new technology (and the PR that promotes it), the science-fiction vision of how we will watch TV in the future - all of which builds a false picture in our minds of how 'everyone' watches TV, which we tend to assume applies to ourselves as well. I would put the podcast that prompted this post into this category; we hear people talking about how everything is so different now, so - unless we happen to be paying particular attention to the particular area - why wouldn't we believe it? (At the end of the day, that is a big part of why TV advertising works so well.)
  • We lie to others. We don't like to present ourselves as the sort of person who spends hours indiscriminately watching TV every single day. We like to present ourselves as someone who is in touch with the current trends, using the latest technology. (This is on top of the lie we tell ourselves, on top of being wrong about it in the first place.) If a doctor asks a 25 year old how much alcohol they drink every week, they probably don't expect the answer to be accurate. (But they probably do expect the person they are asking to think about how much they actually drink, versus how much they should drink, and how big a lie they feel they should respond with...)

So, in conclusion, I don't think "TV viewing" has really changed all that much - in the sense of the way that we watch TV. The television industry is changing enormously - what we choose to watch, how we choose it, how we watch it, what is available to us, who is paying for it, how it gets commissioned etc. etc. All of which is a really interesting space to watch.

But the thing that remains really interesting to me is the basic human behaviour of sitting down on the most comfortable chair in the house at the end of the day and watching "audio-visual content" (for want of a better phrase) on the best screen in the house - a behaviour that has remained remarkably consistent for a remarkably long time.

  1. The methodology is a bit too dull to go into in detail, but in a nutshell - each video has a tag which fires when it is played, telling BARB that it is playing. So BARB get a record of all viewing via the players - not just based on the panel, but based on every single view. What they don't have is a way to say who is watching - ie. old, young, male, female etc. - or how many people are watching. So technically, it isn't comparable data - but it's the closest that is available. If you don't work in media research, you probably don't care. If you do, you probably know all this already.


Goosebumps

In the dim and distant past - half a lifetime ago - my CD collection was a treasured possession, proudly on display and carefully organised on a regular basis (sorted by a mixture of thematic links, chronology, and the confines of whatever shelving system happened to hold them at the time), but a big chunk of my actual listening was still tapes.

The thing about tapes is that they force you to spend time on putting them together, which in turn forces you to put thought into it. In principle, it’s the same as a playlist in Spotify or iTunes, but the fact that you have to stop and start the tape before and after each song means listening to it in real time. Choosing what song should come next while you're listening to the song that comes before encourages you to build a sequence of tracks - rather than throw a semi-random collection into a playlist in the order that they happened to occur to you.

The fixed length of each side (which varies from tape to tape) means you have another choice about how to handle the empty space at the end (fill the tape but have a song cut off half way through? Finish the song on the next side, or start again from the beginning? Find another song that is the right length, but also fits with the general theme?)

The physical nature means you can label the tape, the inlay card, (hand) write the tracklisting, decorate with your own artwork, dedicate it to someone - all of which you’re given the time to do by the fact that you’re sitting around actually listening to the music as you go along and immersing yourself in the music collection you're making. (I guess today we'd probably call it "curating"...)

For a while, after MP3s took over from tapes as my way of listening on the go, I liked iTunes Smart Playlists. The idea - you build a set of rules; for example, a random selection of tracks in my collection that I’d rated at least 4 stars but hadn’t played within the last three months, or the tracks that had been in my collection the shortest amount of time but not yet listened to, or a selection of tracks from a bigger playlist, ordered by some other criteria - encouraged a more systematic approach to organising my collection. Not necessarily listening, but curating, rating and generally managing metadata and algorithms. Sadly, over the years, iTunes changed the way it worked - play counts stopped working as well, 5-star ratings were replaced by “like” and “dislike” - but probably more importantly, the opening up of the Apple Music library and recommendation system meant the availability of a whole new world of music (that I hadn’t personally curated).
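For anyone who never used them, a Smart Playlist rule set looks something like this expressed as code (a toy Python sketch with an invented track structure - not how iTunes actually stored anything);

    import random
    from dataclasses import dataclass
    from datetime import datetime, timedelta
    from typing import Optional

    @dataclass
    class Track:
        title: str
        rating: int                      # 0-5 stars
        last_played: Optional[datetime]  # None if never played

    def smart_playlist(library: list[Track], limit: int = 25) -> list[Track]:
        """Tracks rated 4+ stars, not played in the last three months, random order."""
        cutoff = datetime.now() - timedelta(days=90)
        matches = [t for t in library
                   if t.rating >= 4 and (t.last_played is None or t.last_played < cutoff)]
        random.shuffle(matches)
        return matches[:limit]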

This progression from "hand-made" tapes to burning CDs to the total flexibility of a playlist has been a shift away from actually experiencing the thing that you're making - the emotional involvement with the music.

But of the dozens - probably hundreds - of playlists I’ve made in the 15 or so years of using ‘players’ like iTunes, one of them has stood out from the others.

I can’t remember exactly what kicked it off, but the idea was a simple one. Every time a piece of music gave me a tangible, physical response - goosebumps - I would add it to the playlist. Just that one simple rule - no exceptions, no accounting for context (if I didn’t like the song, or the artist, or if the reaction was because of something I associated with the song rather than the song itself, it still gets added to the list.)

It feels like it should be older, but it must have been 2012 when I started putting it together, because the title of the playlist is gersberms - a reference to a meme that knowyourmeme.com tells me didn’t exist before then. (In my head, I was living in a flat that I moved out of in about 2007, which tells you something about the reliability of my memory.)

It would have been impossible - or at least impractical - before smartphones, because pretty much wherever I am, however I'm listening to the music, I can always pull out my phone, find the song and add it to the playlist. (Obviously, if I'm listening on my phone or through iTunes on my computer - which must be 98% of the music I listen to - then it's just a click or two away.)

So, I currently have a collection of 20 songs that have proven to provoke a tangible emotional reaction (obviously, goosebumps don’t happen every time I listen to them) - 1 hour and 25 minutes of music that resonates with me personally in some way or other.

Some of them I understand. Espresso Love by Dire Straits (the live version on the Alchemy album) is deeply connected to childhood memories - it always makes me think of being curled up on the back seat of my dad’s car next to my little sister (who turns 40 next year, but will always be my little sister), driving through the night (on the M6 or M40), on the way home from Bolton, where we would go a few times a year to visit family, my head against the window, either watching the world go by or trying to read a book either by the flashing streetlights or the headlights of the car behind. Or Cherub Rock by Smashing Pumpkins - I like the song well enough, but it’s the rising and crashing of the guitar solo that makes the hairs on my arms stand on end.

But I couldn’t tell you exactly what it is about Glen Campbell’s Wichita Lineman that tickles my nervous system into a tangible reaction. Or why Don Henley’s Boys of Summer makes the list. (I misheard the title lyric for years as “after the poison/ summer has gone”, so was never quite sure what the song actually was until it popped up on an Apple Music playlist.) I think it might have been in the soundtrack to a film from the ‘80s or something, but whatever subconscious switch the chorus is flipping is buried too deeply for me to quite get a grip on it. (Again, the evidence is that it was never in an ‘80s film and my memory is playing tricks on me, but at least I know I’m not alone…)

Even though it's an intensely personal collection, it definitely isn't a playlist that I would put together any other way. Some of my favourite artists are conspicuously absent, there's definitely more music from the '80s than I would consciously choose to put in there, and there's no obvious "theme" to tie it together (which is usually how my tapes and playlists work).

I have on rare occasions taken songs off - although I can only think of one. (The first dance at our wedding was Al Green’s “Let’s Stay Together”, which made the list - but after setting it as my phone’s ringtone, back in the days when it seemed to make sense not to have my phone set to silent/vibrate, the intro bars just make me think my phone is ringing, which tends to elicit a less positive emotional reaction.)

1 hour and 25 minutes happens to be just about the right amount of time for a C90 mixtape, but I don’t actually have a tape player any more to either record it onto or play it back, so committing to cassette isn’t an option. A CD can only hold 80 minutes, so that's almost possible - maybe I’ll take off a song so I can fit it onto a CD and give a copy to each of my kids. (I guess it would probably be the Spiritualized one with the line about “just me, the spike in my arm and my spoon”… it is quite a long song, after all…)

Maybe some day I’ll share the playlist itself and go through it track-by-track, but this is supposed to be a write-and-post-in-a-day post, and I’ve got a busy day ahead of me…

Digital Media's Growing Pains

It's now about 11 and a half years since I thought it would be a good idea to work in the media industry for a year or so. (I thought) I wanted to be a web developer; I didn't quite understand how everything was on the internet for free but somehow companies like Google and Facebook were worth millions of dollars - but I knew that it was something to do with advertising, and wanted to understand how that business worked a bit better before I did something like set up my own business in a world where the financial model made no sense to me.

As it turns out, the first thing I learnt which pretty much shaped the next decade of my career was that communications technology might be interesting, but the impact it has on people and their behaviour is far, far more interesting.

Something else I learnt, probably a few years later, was that the world of "digital" that was massively disrupting the advertising industry is only a subset of "marketing". It was a new world that was sold on the back of all the new things that were possible, but to a world that didn't really grasp the implications of what was lost in the process. The problem is that "best practice" for digital media/advertising isn't necessarily "best practice" for advertising in general.

Digital is great for short-term results - identifying potential customers close to the point of purchase and giving them the final nudge. And those results are very easy to measure - you just watch your sales numbers before, during, and after the campaign and see how much of a difference it makes.

But what "traditional" media does well is the long-term, brand-building stuff - the messages that stick in the back of your mind until they resurface when you're walking down a supermarket aisle and make you pick up the more expensive branded product instead of the cheaper alternative. (My go-to example is always the £3 packet of Nurofen instead of the 30p supermarket-brand ibuprofen.) The kind of advertising that doesn't necessarily make you run out and buy the product, but changes the way you think about it in a way that expresses itself as brand loyalty (buying the same product again - even when the last time you bought it was purely because it was discounted) and "price elasticity" (being willing to pay more for the same thing) - factors that only really shift as part of a long-term, consistent marketing exercise, but are very difficult to measure, quantify, and build into a marketing model.

So, when more money is being spent on "digital advertising" than on all other forms of media put together, a problem arises. But it's a special kind of problem that doesn't actually look like a problem to begin with.

How Advertising Spend has Changed

Just over a decade into my "year or so" long experiment, and it's not unreasonable to say that the advertising industry has been completely transformed by the impact of technology.

This is what the global picture looks like, in terms of where all the advertising money is going;1

Worldwide Ad Spend by Media ($m)

What we see; Newspapers used to be the biggest advertising medium (by spend) in the world, overtaken by TV in the mid-to-late 1990s. Then, around the year 2000, a small spike in advertising across all media2 was followed by continual and rapid growth in online spending - mainly at the expense of newspapers3. In the space of a decade, online advertising had overtaken Outdoor (ie. posters) and radio.

Then everything took a dive around 2007 - the global financial crisis hit the media industry pretty hard. Newspapers, after looking pretty flat for a few years, suddenly fell off a cliff and never recovered, while online advertising continued its growth. And while TV advertising remained (and remains) fairly strong, online spending recently overtook that too, and is now the biggest medium by ad spend in the world.

Now, let's take a look at the same figures, but just for the UK;

UK Ad Spend ($m)

By 2008, the internet had become the biggest media channel by spend in the UK. It now accounts for more than half of all the UK's advertising spend.

What the Big Numbers are Hiding

It is worth noting at this point that these big, broad numbers hide a few things.

The first - and probably most apparent from inside the media industry - is that "Newspapers" really covers two very different things. On one hand, there's the kind of advertising that you probably think of when you think about "advertising in newspapers" - the big, colourful, maybe even full-page adverts that are selling some sort of "brand". On the other, there's the "classified" adverts - mostly taken up by property for sale, recruitment for jobs and second-hand cars. When spend started shifting to "digital", the first to get hit were the classified ads - adverts that worked very well online, probably better than in print. The "brand" or "display" ads weren't so much hit by the online alternative for advertisers as by the online alternative for readers - as readers migrated from print to online, the advertising model (and money) changed. As the industry cliche goes: from analogue pounds to digital pennies. In print, newspapers were competing with one another to be "the paper" that a person or household would choose to buy. Online, there's no particular reason why anyone would limit themselves to a single source of "news" - why not read the Guardian for politics, the Times for international news, the Sun for sport? Or maybe you would read the front page (now "home page") of a newspaper website for the headlines, then read a bunch of bloggers for the editorial.

The second thing that the broad numbers hide is that "internet" also covers a number of different things. About half of it in the UK is Search advertising - the search results that you see with "Ad" or "Sponsored" next to them, shown to you based on your search terms (and influenced by other data that your Google account or web browser history might be holding). About 40% of it is "display" - the banners at the edges of web pages, the pre-roll videos, the sponsored or promoted posts in your social news feed. The other ~10% or so is the "classified" ads. Clearly, Search advertising is doing a "response" job rather than a "brand" job. Likewise, Classified ads are obviously "response". But of the online Display advertising, a great deal is also doing the "response" job - easily measured in terms of efficiency and value, but less likely to do the long-term "brand building" job.

How the Media Industry has Changed

When I joined, there were a few interesting things going on. Firstly, the agencies didn't really seem to have many people who really "got" the internet. I was 30 at the time, and when I left school I had never heard of the internet - so the fact that I could walk into an interview with a smartphone/PDA in my pocket and a link to my blog (about the internet!) on my CV was enough to mark me out as an "expert", despite no experience or understanding of the advertising industry. The industry - media owners, media agencies and advertisers - really, really wanted to be able to show that they understood the new, digital world. And the people they turned to for knowledge and understanding were (and still are, I think) mostly the salesmen.

The big story from a consumer point of view was about how broadband penetration was growing. Facebook was clearly the "next big thing", but people were still wondering whether it was going to be as big as MySpace - and if so, what was stopping something else from coming along and overtaking it. (My point at the time was that Facebook was doing something different by connecting people's real-world identities - you didn't tell anyone your real name online at the time - with all of their interests and all of their real-world friends, and that was going to be a very sticky thing to pull people away from.) It was also at the point after smartphones had started to appear, but before the iPhone had really crystallised what a mass-market smartphone/pocket computer would look like. (At the time, it was a real headache from a research point of view that everyone had a different definition of what a "smartphone" was… but that's another story.) Digital was the future - obviously - and whether or not it was a future that we wanted, it was just a race to get there first.

Today, it's a slightly different world. Every medium has been touched by "digital" in some way - "TV" is now part of a broader "video" marketplace, where broadcasters are making most of their content available online; "radio" is part of a broader "audio" marketplace, where radio broadcasters are up against a world of music streaming services and podcasts; "newspapers" have become "newsbrands", and while their businesses are often still reliant on the printed copy (partly because it's something that people pay for, but just as much because print advertising is an easier business than online advertising), their future is probably going to be in a world where they are balancing the value of their content with the value of their audience (who can just as easily be targeted anywhere else on the web, on an individual basis).

Which brings me to my point.

Advertising on the internet has its own set of best practices. There are things that you can do online that you can't do offline, and vice versa. But "best practice" for online advertising is not the same as "best practice" for marketing in general. However, with online advertising so massively dominant in the UK, I don't believe that there are enough people worrying about this particular conflict and the implications. (Some, sure. But not enough.)

Or, at least, not as many as there are still racing to be the first on board with the Next Big Thing - whatever it might be.

  1. I've deliberately left this data unsourced, because I'm not entirely clear whether the data is allowed to be published in public or not. Suffice to say, I'm confident enough that the figures illustrate a 'truth', and confident enough in the story that they tell, that I don't feel a need to cite an authoritative source.

  2. I'm pretty sure that this is the dotcom boom, when a bunch of investor money got pumped into new online businesses, who pumped a lot of it into advertising their businesses, a lot of which was with other advertising businesses, which made them look good to investors who pumped more money into their businesses…

  3. Newspapers look pretty flat on the chart here; although newspapers' level of advertising spending stayed fairly steady, total advertising spend tends to be proportional to GDP - so while advertising spend overall kept on growing, newspapers' share was declining - especially in the markets where internet infrastructure had been developed and online advertising was thriving.

Why I don't watch adverts

I don't watch adverts.

That isn't exactly an uncommon thing to say. I've done a bit of work into how many people say it - specifically, how many people will say that they don't watch adverts, or live TV at all, and that everything they watch is recorded on Sky+ with the ads always skipped - and then you track what they are watching and discover that actually, at least three quarters of their viewing is live, and even when they watch recorded TV they still watch the adverts a significant amount of the time.

But this isn't about that.

This is about the adverts that you choose to watch - the ones that you actively seek out. The Christmas advert that you search for on YouTube when you hear that it came out today.

I don't watch those adverts.

Partly it's because, well, I don't really want to watch them. But there's a slightly deeper reason - I work in the advertising industry (at what used to call itself a "media agency", back when we judged ourselves in terms of how well we bought media - but that's another topic…), where there is a kind of expectation that you care about advertising.

Maybe it's the decade (and a bit) I've been working in the industry that has brainwashed or indoctrinated me, but I find that I do like advertising. Kind of... I mean, I like the way that it makes "free" things possible. I like the fact that it has enabled "news" as an industry to develop and exist, which in turn makes the ideal of an informed democracy at the very least a possibility. (Although whether that is still true is up for debate.)

I find "advertising" as a part of a wider world of "marketing" much more interesting though. And I have a theory that if you take something that is supposed to be an "advert" - that is, a thing that is supposed to communicate something from a business to its (potential) customers - and take it out of the context of the media space that it was designed to sit in, then you fundamentally change the nature of what it is.

Figure and Ground

I've written before about Marshall McLuhan, but there's an idea that he wrote and talked about that I only recently got my head around, and it's about "Figure and Ground". Think of a painting - let's say, the Mona Lisa. Everyone has seen it, everyone can recognise it, everyone knows what it looks like.

Think for a moment about how you would describe it. (You don't need to look at it first — in fact, this exercise probably works better if you don't.)

All of that stuff that you're thinking about describing is the "figure". Probably the woman, her face, her smile, her hair, her clothes — the object of attention. In fact, for a painting, that is the way it is designed - something in the frame is supposed to get your attention.

But that isn't the whole painting. The rest of it - the lake, the rocks, the winding road, the bridge, the clouds, the arm of the chair, the balcony - is the "ground".

In painting, or perception, it is about the thing that jumps out at you and that you pay attention to. But in McLuhan's media analysis, it's about the context. And one of his central theses was that the ground is the bit that really matters.

"The medium is the message"

(or "massage" - apparently typo in an early version of a manuscript which actually illustrated the point; a twist of the original meaning that could only happen in a typeset medium.)

“For the “content” of a medium is like the juicy piece of meat carried by the burglar to distract the watchdog of the mind.”
Marshall McLuhan in "Understanding Media: The Extensions of Man"1

My view is that watching an advert out of context is like taking the label off a tin and sticking it in an art gallery. First, you fundamentally change the thing you are looking at - by removing it from its context and putting it somewhere else. But also, you fundamentally change the thing itself - can anyone look at a can of Campbell's soup and not think about Andy Warhol?

I should note that I don't really avoid adverts in general. I mean, I tend to skip through the adverts when I'm watching recorded TV programmes - mainly because my wife has the remote control - but I'm not installing ad blockers or any of that. (Because I don't think most people do, and I do think it matters that I tend to see what most people see in their online experience.) And I tend not to watch much TV - just because I would prefer to do other things, like play computer games or paint or code or read or surf Reddit or... you get the idea.

So, my theory is that by avoiding looking at advertising out of context, I get a better idea of what the advertising is actually doing - how it is changing the "ground" - than someone who takes the TV spot that millions of distracted people are going to barely notice in the background of their Facebook sessions, and watches it on the biggest and best screen that the agency can buy, with a bunch of other highly paid people in expensive clothes sitting quietly around a boardroom table, drinking coffee and taking notes.

Which is a very long-winded way of saying that I don't watch adverts because I care about advertising.

  1. London, England: MIT Press, 1964; p.18 - reference found here, in what I think is his grandson's blog.

B-Day and Activism

Today is the day that the whole B-thing was supposed to wind up, and whatever it was that nobody really understood in June 2016, when the nation agreed to it in a non-binding referendum, was going to happen1...

Ho hum.

Anyway - not entirely coincidentally, what I've been putting out into my social media feeds and onto my website has pretty much dried up over the last couple of years, partly because of a deepening cynicism about what those platforms are doing and the influence they are having on society, and partly because I've become progressively less interested in that "public profile" stuff and increasingly concerned about the influence it has on my state of mind and sense of happiness.

But having just got a couple of chunky pieces of work out of the way and found myself with a little bit of breathing space, I wanted to chew on an idea that I came across yesterday, while talking about a potential research project about "activism". You see, when you're writing a survey and asking people questions about a "thing", one of the first things you need to do is to make sure that you're providing a sensible definition of the thing that you're asking them about.2

Anyway...

When it comes to a definition of what "activism" means, one of the starting points is that you're pushing for a change in society. That is, you're setting out something about the status quo that you're not happy with, and want to push against.

I thought it was kind of interesting to consider that, right now, the status quo is that we are on our way out of the European Union one way or another. Which means that there isn't really a place for pro-Br**it activism any more - because it isn't "activism" any more. It's just flipped to "maintaining the status quo".

Which might explain why a recent petition to revoke Article 50 has managed to get 6 million signatures3 when no other petition has come close. (There was one calling for a clearer rule on how the results of the EU referendum would be acted upon, which got 4.1 million signatures, but the next closest was 1.8 million people asking for Donald Trump to be refused a State Visit. No other petition I can find has passed 1 million signatures.)

But democracy doesn't - and shouldn't - work by online petitions, so onward we march towards the kind of departure from the EU that 400 MPs gave indicative votes against, with only 160 in favour. Or to put it another way, one that the democratically elected representatives of 28.4 million people - 61% of the electorate - have spoken out against.

So, you've got this weird situation where what was a relatively coherent group pushing for political upheaval and transformation in 2016 is now trying really hard to just keep things going the way that they are, because the whole bunch of different visions that came under the "Br**it" brand have turned out not to have any kind of majority backing at all, and any attempt to turn them into a workable course of action has completely failed to win a majority. But they have momentum on their side, and there is apparently no strong leadership of any kind in a position to put forward a meaningful alternative to the current course.

Let's see how this plays out…

  1. Except it's been delayed, because our democratically elected representatives - who we were supposed to be "giving back control" to - have now taken control of the process from the arguably less democratically appointed Prime Minister (who inherited a policy set out by the previous leader who kick-started the whole thing in the first place, and who clearly didn't really have it under control), and have subsequently failed to demonstrate that they have any kind of control over anything either. But this isn't supposed to be a blog post about Br**it...

  2. For example, if you were to ask 45.5 million people "should the United Kingdom leave the European Union", it would be a really good idea to make sure that they are all working under the same idea of what that question actually means. For example, when you say "United Kingdom", does that actually mean ALL of the United Kingdom, or is there maybe a need to exclude Northern Ireland from the working definition because of the physical border with the Republic of Ireland and a really complicated history? And does "leave" mean "actually, totally, unconditionally leave" - as opposed to, say, "leave the Union, but only once an agreement has been properly negotiated that sets out the future relationship with that Union in terms of trade, movement of people, laws etc., because it is actually so complicated that people who spend their working lives dealing with it don't really understand it, and the whole point of our political system is that we democratically appoint representatives who are supposed to understand this stuff because we have jobs to do, children to raise, friends and family to look after and lives to live"?

  3. To 3 decimal places, at the time of writing

Unordered: Cmd+R

As it's nearly two years since I last posted here, it feels like I should acknowledge the passage of time with some sort of summary. If this was a film, I'm imagining a shot of me hitting "post", closing my laptop, and then a short montage that includes a couple of Christmases, a 40th birthday, some haircuts, some marking off of my children's heights against a doorframe, a few shots of the woods near my house as the autumn leaves fall, turn brown, get covered in snow, then the leaves on the trees grow back and the bright sunshine shines through them onto my back garden... except it isn't a film, I'm not measuring my kids' heights, and this isn't really a blog about my life so much as meandering run-on sentences about things that have caught my attention or imagination. So that whole effort will end up incomplete; a combination of footage and animatic storyboards on the cutting room floor, set to a classic soundtrack that never got copyright clearance.

So, passage of time duly acknowledged, let's move on. What did I miss?

Well, there was a kind of meme/trope in conference presentations for a while where people talked about how "last year, we didn't think Brexit would happen, or that Trump was a credible presidential candidate" and so on. Now we're oddly past the point where that covers "what happened since we last got together to look at Powerpoint slides in a large hotel room", but at the same time still talking about whether Brexit is going to happen and whether Trump was a credible (read: legal) presidential candidate.

But that's far too much for me to tackle – one of the reasons I started "unordered" was to stop myself trying to write comprehensive, everything-important-about-this-one-thing posts that ended up as a collection of terribly long and boring drafts, so I'm not going to walk that path today. Suffice to say that I've pretty much fallen off Twitter, reduced my Facebook activity to occasionally flicking through my news feed (but keeping on top of a couple of Groups that I like), and generally withdrawn from the worst of the filter-bubble data-scraping websites. (That said, I have rediscovered Reddit, which somehow feels less bad, for reasons I can't quite put my finger on – I think because you're following topics – subreddits – rather than people.)

So, I'm kicking this off again. Probably with a bit more "real life" stuff, because one thing I miss from reading old posts is being able to pinpoint what was going on in my life while I was thinking about whatever it was that drove me to write something. (I particularly like the way Dan Hon structures his excellent newsletters with a "sitrep" at the beginning, so I'm probably going to steal that.) Possibly with a bit more structure around a posting schedule – if I can balance how much I like having a maintained blog that I can send links from to people who might be interested in what I'm thinking (without them having to politely listen while I talk at length about something) against all the other things I'd like to be spending my time doing. Which right now is dominated by;

  • The latest Zelda game on the Nintendo Switch, which I just got for my birthday,
  • Fortnite (everyone – especially kids – is obsessed with the free Battle Royale mode, but I've been enjoying the "Save the World" fort-building PVE version of the game),
  • The Warhammer 40k hobby, which I've been getting back into after something like a 25-year break. Mainly for the painting, but also the occasional game with my son (although as I'm still struggling with the rules, it's still a bit complicated for him). So the computer/work-related clutter on my desk has now been joined by model-related clutter and paint-related clutter, and needs clearing/tidying/cleaning more frequently than ever before,
  • Reading Reddit on my phone.

So, nothing that can't make way for a slightly more productive way of spending my free time.

I hope.

Unordered 8: Battery low

Here I am, back at it again with the random collection of things mashed together to try to compensate for the fact that they don't warrant posts of their own.

Isn't it weird how I can walk 10 metres across the office and my bluetooth headphones still play music from the phone sitting on my desk, but if I've got my phone in my left-hand pocket and turn my head to the right, they cut out?
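Apparently it's less weird than it sounds: 2.4 GHz radio barely weakens over 10 metres of open office air, but the human body is mostly water and soaks it up very effectively, so turning your head the wrong way can cost more signal than the whole walk across the office. A back-of-the-envelope sketch, using the standard free-space path loss formula and a body-shadowing figure I've assumed from the 15-40 dB range that tends to get quoted;

```python
import math

def fspl_db(distance_m, freq_hz):
    """Free-space path loss in dB: 20 * log10(4 * pi * d * f / c)."""
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / 3e8)

BLUETOOTH_HZ = 2.4e9  # Bluetooth sits in the 2.4 GHz ISM band

print(f"Across the office (10 m):   {fspl_db(10, BLUETOOTH_HZ):.0f} dB")   # ~60 dB
print(f"Pocket to ear (0.5 m):      {fspl_db(0.5, BLUETOOTH_HZ):.0f} dB")  # ~34 dB

# Turn your head and the path runs through your body instead of open air;
# 30 dB is my assumed mid-range figure for 2.4 GHz body shadowing.
BODY_LOSS_DB = 30
print(f"Pocket to ear, through you: {fspl_db(0.5, BLUETOOTH_HZ) + BODY_LOSS_DB:.0f} dB")  # ~64 dB
```

Rough numbers, but the shape of the answer holds; the phone on the desk has a clear line of sight, and the phone in the pocket doesn't.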

Anyway... You may have noticed another gap between posts, which I'm going to blame on a) summer happening, b) distracting computer games, c) a really busy couple of weeks at work 1, but it all comes down to "I had other things to attend to". Sorry about that. (I'm thinking that if I force myself to acknowledge the gap between posts every time, it will motivate me with a sense of urgency around the follow-up, but to be honest it's kind of working the other way.)

On with the braindump…


It's an interesting time in the phone world. As I understand it, there have only really been two companies making any real money out of selling smartphones; Apple and Samsung. Apple makes a small range of expensive phones, while Samsung makes a huge range of mostly cheaper ones.

And Google might be about to suck a lot of the oxygen out of Samsung's smartphone market. I haven't seen the Pixel phone they recently announced – although I have seen a pop-up Google coffee shop outside Euston station, whose purpose (it took me a while to figure out) was apparently somehow linked to getting people to see the Pixel – but it sounds pretty cool. Probably not cool enough to pull me away from iPhones and the tethered collection of Apple laptops, Apple Music, Apple Photos etc. etc., but I'm sure people more loosely connected to the iOS ecosystem will see an appealing alternative in something built around Google's services. And at a time when Samsung's brand is probably taking the battering of its life (I imagine that being sued for millions for copying Apple is a happy memory now), the idea of their Google 'partner' cannibalising a bunch of their high-end handset sales (ie. the most profitable ones) is probably pretty worrying for them.

But it's the Google side that I find more interesting.

The thing is, Google makes something like 98% of its profits from adverts. The internet (read: "digital") has turned out to be a very successful platform for advertising, and no online advertising format has been as successful as search – which Google has the vast majority of.

But there is a problem. The future of the internet is mobile, and the future of mobile isn't typing stuff into a search box and looking through the results (which may or may not include a few adverts). The future of mobile is (probably) something like Artificial-Intelligence-powered assistants that get you straight from what you want to do to actually doing it – ask the AI for something, and it gives it to you. And there isn't really any convenient space in that kind of service to squeeze in some advertising.

So, Google's future might not be as an advertising company. It might be as an AI assistant company. Except, while not many people are likely to pay for an AI assistant, there are lots of people paying quite a lot of money every year or two for a new smartphone. And if Google can make that smartphone and sell it at a high price (or at least a high profit margin), then maybe there is a way for Google to stay true to its mission of organising the world's information and making it universally accessible, while at the same time completely changing its business model: from one supported by third parties paying for adverts to one selling features bundled in with expensive hardware...

One to watch…


I got an iPhone 6s Plus last year, purely for the bigger battery. I like the big screen, but not as much as I dislike the massive thing that barely fits in my pockets. But I like running out of battery least of all, so it was really a simple decision – thinking that it would (should?) last me through a day of even the heaviest use.

Then I started playing Minecraft on my commute (well, the bits of it that don't involve cycling or walking) and things started getting pretty low by the evening.

Then something else happened, and I wasn't even making it to the afternoon. I actually managed to hit 40% battery by the time I got to the office in the morning.

Why? You can probably guess. That old Google April Fools joke that weirdly turned real. Pokémon Go…

I'm pretty sure that by now the "craze" has died down (in that it isn't something that most people still care about – the question isn't "are you playing it?", but "are you still playing it?"), but that it's still going to be a popular game for a more 'normal' number of people for a good while longer yet. Which means I feel safe writing about it without sounding like either a "10 Things Your Business Can Learn From Pokémon Go!" LinkedIn piece or an overblown "this is the fall of mankind" article…

Batteries are really important

I think this is the one thing that Pokémon Go really shines a light on; if you had to name the important technologies for the future, you would probably be thinking about super-fast computers, super-small computers, the internet, new places to put touch screens, wifi, 3G/4G/5G mobile internet and so on. But I think the really interesting stuff comes out of the technologies that are easy to overlook.

Right now, all the buzz seems to be around virtual reality – technology that requires a hardware platform of dedicated 'wearable' screens, very modern high-speed processors and graphics coprocessors; at least a grand or so to spend on the computer, the headset, the controllers and the camera just to get started on playing games… except the killer games themselves haven't actually been made yet. Making a high-end console title takes years of work. I've just finished playing Call of Duty: Advanced Warfare – the credits take a 28-page PDF to list all the names of the people involved, and that's a few years' worth of work. The costs are in the hundreds of millions.

And that is for a console game. Sure, consoles have come on a long way in the last 20 years in terms of processing power, graphics, physics simulation and so on. But in terms of the underlying mechanics, we have gone from sitting in front of a CRT television screen in 1997 holding one of these;

To sitting in front of a (bigger, flatter, thinner, LCD) television holding one of these;

Sure; we've swapped a cable for a battery. We've added a weird touchpad thing that only ever seems to get used as an extra button. But the basic mechanics – the user interface between the player and the game – really haven't seen any fundamentally new ideas for about 40 years. (Longer, if you count the joystick as the most fundamental element of the controller.)

Is that massive progress – from Elite to No Man's Sky? Well, games have got bigger, richer, deeper… the internet has made some interesting things possible (now I'm the appallingly predictable idiot getting shot on the Battlefield, instead of the guy laughing at the appallingly predictable AI soldiers). But have we really seen advances? I'm not so sure.

But VR games have to do a couple of things;

  1. Figure out how games work best in a VR world. Is that just a new screen, but the same controller? Is it standing up and waving at a clever camera that recognises your gestures? Is it actually something that works in the home – or are we going to see a resurgence of the arcade game model?
  2. Do it really quickly, because there are a bunch of companies investing a lot of money in this space – no doubt signing up some very valuable patents along the way – who will be expecting to make money out of it. Right now there is no "path to VR success", but if these guys don't figure one out quickly enough to build businesses on, what's going to be left behind them is a bunch of well-trodden paths to nowhere in particular, littered with patent landmines.

Don't get me wrong; I think the idea of Virtual Reality is extremely compelling. I think there are going to be lots of interesting things happening in that space. I want to see Sony, and Oculus, and Steam, and whoever else is involved succeed.

But I just don't think it is going to be interesting to all that many people – most people aren't going to spend a few hundred pounds on the hardware, or even go out of their way to try it out. And without that kind of marketplace, the likelihood of anyone developing the kind of software that can show off what the hardware can really do seems pretty slim. (And if the software can't compare to something like the LA-inspired world that took $265 million to make, then who is going to be interested?)


Virtualised Worlds

But the Pokémon game itself is doing some interesting things – and not just in terms of Nintendo's intellectual property revival. I can walk through the village with my son and he will point out a local 16th-century pub – not because of its historical importance, but because it happens to be a Pokéstop. I'm sure there are plenty of gamers who are used to the idea of visiting a real-world location and finding it familiar from the virtual worlds it has inspired (I found it strange visiting L.A. earlier this year and feeling like I had wandered into Grand Theft Auto's Los Santos) – a feeling not unlike going somewhere you already know from seeing it in films.

But flipping that the other way – experiencing the real world differently because of what is happening in a purely virtual world overlaid on top of it – feels like something new, if only because I don't think it's really happened at this kind of scale before. When have crowds of people come together at the same time to visit an old building because of a shared belief that something special can happen there on a different plane of existence? (Apart from every Sunday, obviously.)

When Foursquare launched, professional people got very excited about something that nobody was actually using, because of the potential for a gamified social platform to drive people to participating businesses. I'm actually kind of surprised how little "professional" talk I'm hearing about something that lots of people really are using. (That's based on an anecdotal, media-industry, London-centric point of view – but I spoke to a headteacher from Derbyshire who had to give an end-of-term talk telling kids to be careful when playing the game, and who was also playing it herself, for what that's worth...)

But the potential has certainly been spotted; smart and agile businesses were setting lures at nearby Pokéstops from the early days of its release, and McDonald's were on board in time for the Japanese launch.

Mobile games before this fell into a few broad categories. There were the 'casual' games that you could play on any platform, but which worked really well on mobile because of how they fit the mobile context – for example, the game of Candy Crush that you can pause at any moment because your train gets to your stop, or because the person you were killing time waiting for has arrived. Then there are the 'mid-core' games – not as involved as the triple-A console games that demand a few hours of dedication at a time 2, but still requiring a few minutes of on-the-clock, uninterrupted concentration. (For example, Clash of Clans, where battles can happen at any time, but once the clock is ticking even a few seconds' distraction can be the difference between winning and losing – and there's something like a 20-minute wait before you can train up an army for another go.)

But Pokémon Go doesn't fit into a neat categorisation – you can 'play' it just by having the app open on your phone, clocking up miles as you walk around and hatching your eggs. Or you can play it slightly more attentively, collecting virtual goods when you pass a Pokéstop. Or you can take it more seriously, going out of your way to a Gym – not just any old Gym, but one where your Pokémon are the right level to make it worth spending a few minutes doing battle. (You can even coordinate your battles with team-mates – everyone has to pick one of three teams to join, with no swapping after you've chosen – although this is a few steps beyond my own level of involvement at this point…)
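That most passive layer – walked miles turning into hatched eggs – is conceptually pretty simple, too. Here's a minimal sketch of how a game might turn raw GPS fixes into egg mileage; to be clear, the class, the names and the speed cutoff below are my own guesses for illustration, not Niantic's actual implementation;

```python
import math

EARTH_RADIUS_M = 6_371_000

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two GPS fixes, in metres."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

class EggTracker:
    """Hypothetical distance accumulator: credit walking, ignore driving."""
    MAX_SPEED_MS = 2.8  # ~10 km/h; anything faster is assumed to be a vehicle

    def __init__(self):
        self.metres = 0.0
        self.last = None  # previous (lat, lon, timestamp) fix

    def on_gps_fix(self, lat, lon, t):
        if self.last is not None:
            lat0, lon0, t0 = self.last
            d = haversine_m(lat0, lon0, lat, lon)
            # Only credit plausibly-walked distance towards the egg.
            if t > t0 and d / (t - t0) <= self.MAX_SPEED_MS:
                self.metres += d
        self.last = (lat, lon, t)

# Two fixes ten seconds apart, walking north through the village;
tracker = EggTracker()
tracker.on_gps_fix(51.5074, -0.1278, t=0)
tracker.on_gps_fix(51.5075, -0.1278, t=10)  # ~11 m further north
print(f"{tracker.metres:.0f} m closer to hatching")
```

(The real game does famously refuse to count distance covered at driving speeds, which is roughly what that cutoff is standing in for.)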

It reminds me of a piece I read about Snapchat a while ago; like Snapchat, Pokémon Go isn't just a 'mobile-first' application, but something that couldn't exist anywhere else. This isn't a game that was designed for the iOS and Android platforms (ie. pocket-sized touch screens), but a game that couldn't really have been conceived of without some important platforms already in place;

  • Ubiquitous 3G (or better) mobile internet infrastructure,
  • Cheap enough data to be able to play without worrying about the costs,
  • Ubiquitous smartphone penetration,
  • GPS and compass hardware on all of those smartphones,
  • Ubiquitous familiarity with all of those technologies. (If someone has to explain to you how turning around in the real world orientates you in the virtual one, you aren't going to get the basics of the game.)

And, of course, if you don't have good enough battery life to use the GPS without worrying about killing your phone for the rest of the day, then you lose the most casual element of the game play – the ability to just drop into the game's virtual world…


  1. Pitches are a blight on the media industry.

  2. Sometimes that long just to download and install the necessary updates to be allowed to play in the first place…

On Apple pulling the plug

This evening (UK time), Apple will do the one thing that you can expect them to do every year; have a big September event where they announce the next iPhone. All else is speculation, but it seems like a near-certainty that the new iPhone will be faster, won't have a headphone socket, and will look pretty much the same as the iPhone 6/6s (and that there will therefore be a medium and a "plus" size model). There will probably also be new Apple Watch news.

The removal of the headphone socket is probably going to be the most interesting piece of news, because it isn't exactly clear why it's being removed. It's unlikely that the phone will be so thin that the socket won't fit – there still needs to be a decent slab of lithium battery in there. A second speaker might well take its place, but that's about what will fill the space left behind – not why the space is opening up in the first place. About six months' worth of controlled leaks means the tech press has been debating it for a while now though.1 I don't buy the argument that it's just a way to sell something; Apple's priority is selling more iPhones, not more accessories. So the interesting question at the moment is what it's being replaced with.

Most of the commentary I've seen has been around the idea of a Lightning port adapter; the idea being that the phone will come with a normal, wired pair of headphones that plug into the only remaining socket, and if you want to use them with something else (like, for example, your Macbook) you will use a little Lightning-to-3.5mm adapter. But that seems like a tiny annoyance that will be incredibly easy to lose.

My guess is that Apple's story won't be about taking away the socket, but about introducing something new. And I don't think the "something new" is likely to be a different kind of wire. A few months ago – thanks to a new interest in exercise2, and after discovering that normal headphones tend to fall out of sweaty ears while running – I bought a cheap pair of wireless headphones. I wouldn't particularly recommend them (they are a bit uncomfortable, and the sound isn't great for music, although it's fine for podcasts), but what I would recommend is the idea of wireless headphones. Taking away the cable that runs from your ears to wherever your phone happens to be – whether in your hand, in a pocket, or on your desk – means taking away an annoyance.

So my bet is that the headphone-socket-less iPhone won't come with a different pair of headphones to plug into a different socket, but with a pair of headphones that don't plug into a socket at all. That is, they will probably plug into the Lightning socket to top up their battery while on the go (similar to how the Apple Pencil charges through the iPad's Lightning socket), but won't need to be plugged in to use them. Charging off the phone would be a compelling feature on its own.

What I'm hoping is that they will come in at a price point significantly lower than the current £170 starting point for Beats wireless headphones. They don't have to be ultra-cheap3, but something like £60-80 feels like a good balance between adding genuine value to what comes in the iPhone box and offering a reasonably priced accessory to people who are going to be using an older iPhone for the next few years.

But what I'm really hoping (wishing?) for is something that takes the bluetooth headphone experience of pairing and unpairing with different devices (which is a real pain in the neck) and makes it simple to switch between iPhone, iPad and Macbook (and Watch). I personally wouldn't be bothered if it was an Apple-only protocol4; maybe something like a Siri button, or a way to listen to music on a Macbook while talking to Siri on an iPhone, could be an interesting feature. But just a wireless system that lets me switch devices from the device with a touchscreen/keyboard and mouse/usable UI – instead of three buttons and a tiny LED – would be very helpful.5

But if Apple are taking away a socket and expecting every iPhone user to replace it with a different plug, I will be pretty disappointed.


For the Watch, I'm hoping for something different; a new "Sport Plus" Watch "collection" with built-in GPS (which will probably be a horrible drain on the battery), and the old Watch and Sport "collections" sticking around for a while longer. Not because of any kind of strategic vision or anything like that (although I have suspected, since the parallel launch of £259 and £5,000 models with identical computers inside them, that they aren't planning any significant internal hardware shake-ups for the Watch line), but because I just got the original Sport model as a birthday present and I'd be a little bit sad if it was replaced with something I want more so soon. But that's really my problem, not Apple's…

  1. I wrote a post about it back in January, for what that's worth. I haven't seen much marketing around the wired Beats headphones since then though – but the wireless ones have had pretty prominent positioning in Apple stores.

  2. Well, an interest in not dying young that has expressed itself through trying to build a new exercise habit, which works out as more or less the same thing.

  3. My £15 bluetooth headphones are probably a false economy, but I find shopping for headphones a chore at the best of times.

  4. Although it would be even more helpful for the wireless speaker we have in the kitchen, which might be used by about six different devices in the household.

  5. Because of my employer's security policies, I seem to need admin rights on my work PC if I want to connect my bluetooth headphones to it for office listening – so my headphones are effectively Apple-only, as I only use them with my phone and my laptop anyway.