Robot War

XKCD CD tray fight

It's uncomfortable in a different kind of way, but I get a similar feeling from this video. There is something very unnatural about the way the robot just ignores the guy with the stick.

I'm not saying that the robot should turn on the guy who is basically acting like an exaggerated cartoon version of a school bully. I'm just saying that if one day we manage to create actual artificial intelligence, and that AI gets access to YouTube (bearing in mind that an AI shouldn't need to watch video like that in real time)… then I wouldn't like to be in that guy's shoes.

Or any of his descendants', come to that.

(I wonder if the Singularity comes before or after the Robot War…)

Thanks for clarifying, Buzzfeed

Reading a Buzzfeed post about superhero movies (actually, about what Ryan Reynolds thinks about women and superhero movies), I got to the bit about female characters in Deadpool, which was apparently something Buzzfeed asked Ryan Reynolds about in an interview earlier this month.

Apparently, he told Buzzfeed that featuring strong women in the movie was a "no brainer".

He also talked about how movies need to work harder to reflect the realities of society.

He also said that he was into the idea of having strong female roles in the film.

b3.PNG

You'll note that, helpfully, having given us the three sentences explaining what Ryan said, along with a picture of him saying it, carrying the caption of his words, Buzzfeed then devoted a paragraph of text — actually, the only text in the article that isn't a level 2 header — to recapping exactly what it was that he said about women in Deadpool.

I think he's a fan.

The problem with Twitter

Every time something changes, the focus is tightly concentrated on the change.

Today, I read about how Periscope's broadcasts within Twitter will impact brands.

I don't disagree with any of the comments. I guess my issue is whether it's the right question to focus on.

This is something I was trying to articulate recently, but I think that I fell into the same trap of looking at the tweets instead of the timeline.

Making a tweet more flexible is understandable when you look at TWTR the business, and how they can improve their service to their customers – the advertisers who pay their bills, the brands creating content to populate the platform.

But if [1] the important thing about Twitter isn't the quality of advertising/brand tweets, but actually the newsfeed that millions of people dip into on a regular basis (ie. the environment that the adverts appear in, rather than the ad units themselves), and the important thing for those users is that they can skim through a few dozen tweets in the time it takes for their coffee to pour, or a meeting to start, or the train to stop, then maybe it isn't such a positive story.

It might well be great news for the people streaming their live videos [2] – but is it as good for the people whose timelines they appear in? Or for the hundreds of other Twitter users whose tweets sit alongside them in my timeline? I'm pretty sure that the number of people putting videos in their tweets is going to be much smaller than the number putting photos in their tweets – which in turn is smaller than the number of people on Twitter in the first place. (Affinio reckon that 90% of Twitter's users are 'silent'.)

I guess the secret to being 'good at Twitter' as a (non-publishing) user is looking after your Follow list to look after your timeline. I'm just not sure if Twitter are really thinking about the timeline in the same sort of way.

  1. Bearing in mind that this is a commercial business that hasn't yet really proved that it has a glittering future, it is a big "if".

  2. Well, it might be great news. It might be that the amazing thing that they are watching is less important than making sure they are framing the important things on their smartphone screen, that they still have a decent mobile signal, that they are keeping an eye on the comments and interactions… So, good for brands, less good for 'normal' people. Also, eerily reminiscent of The Circle, if you've read it.

On Apple dropping the 3.5mm headphones plug

There have been rumours for a while now that, for the next iPhone, apparently in a bid to make it even thinner, Apple are going to drop the headphone socket; instead, headphones will connect via the Lightning port.

As with any Apple rumour, there are plenty of bloggers and podcasters as well as professional journalists throwing in their opinions, speculation, anonymous sources and so on.

Here's the thing I don't understand about the rumour, though. Suppose that it's true, and this is a big change lined up for the next iPhone (ie. the iPhone 7 form factor); then it will have been in the works since the iPhone 6 design was finished.

Apple sells its own headphones. It also owns the Beats headphones brand.

Today, it isn't possible to buy a Lightning-connected set of headphones from Apple, or from Beats. New Beats headphones designed for on-the-go use, retailing for over £100 and launched in September 2015, still used the 3.5mm socket. There are much more expensive headphones being sold by Apple (not just 3rd-party headphones on the shelves in Apple stores) that will be incompatible with a Lightning-only iPhone.

So I'm expected to believe that Apple have designed, built and shipped a pair of headphones, selling for over £100, which are expected to become obsolete within 12 months for anyone buying the next iPhone?

It seems to me that if this were the plan, the first thing Apple would do would be to start designing and selling Lightning-enabled headphones in their premium Beats headphone lines, with a story about how a digital socket enables better audio quality than the 3.5mm analogue stereo jack from the 1970s (introduced for the Sony Walkman in 1979, adapted from a 3.5mm mono jack that was already in use). They would also be selling Apple-branded wireless headphones at the lower price points – perhaps introducing Lightning as an audio-enabled charging solution. (Worth noting that the iPad Pro's Pencil, the Magic Mouse and the Magic Keyboard all now charge over a Lightning connection.) Basically, they would have started transitioning users away from needing the 3.5mm headphone jack.

Meanwhile, they would be preparing users to be able to use the Lightning port for headphones. The latest MacBook model, launched last March (which includes a 3.5mm headphone jack), would either also have a Lightning port, or would have dropped the headphone jack and expected users to use a wireless connection (as would the revised MacBook Air and MacBook Pro models; the lifetime of those machines means that users would be expected to be using Lightning-connected headphones with their iPhone 7 while also owning one of those laptops).

But no — nothing that Apple are doing indicates any sort of expectation that people will be using a different kind of headphone connection in the foreseeable future.

I expect that Apple might well be planning on this — or even expecting that some future iPhone will drop the 3.5mm socket. But I will be very surprised if it's something that happens in 2016. Apart from anything else, the iPhone 6s is 7.1mm thick — but the iPod Touch (at 6.1mm) and iPod Nano (at 5.4mm) still have room for a 3.5mm headphone socket. There is quite a bit of space to shave off before the socket becomes a limiting factor for the iPhone's thickness.

The "describe it in a tweet" trope isn't going to work any more.

Re/code reports that the 140-character limit on Twitter is going away (or at least, being replaced with something like a 10,000-character limit).

CEO Jack Dorsey all but confirmed it with this tweet of a picture of some text explaining why the limit exists, and the benefits of removing it.

Naturally, Twitterers are freaking out about the change, as they do whenever Twitter changes.

The thing is, Twitter has a big cultural footprint. Even if you have never signed up for Twitter, if someone asks you to "describe something in a Tweet", you probably know what they mean.

"Describe something in less than 10,000 characters" doesn't quite have the same ring to it…

But although this seems like a big shift in direction, it's really just a continuation along a path that Twitter has been on for a long time — really, since the introduction of Cards in 2012. Twitter used to be just text; there were a few ways to squeeze more characters out of the limit (hashtags turned into a link to a Twitter search, URLs got auto-shortened), but the basic service was the same.

The change came when they added ways to embed photos — instead of linking out to content beyond the 140 characters, Twitter began pulling it into the platform — but the physical size of the tweet stayed the same; you had to click the tweet to reveal the photo.

Later, tweets got bigger, accommodating a "preview" of the photo.

Then they expanded to be able to include video.

In other words, the big change has already happened. When a tweet was 140 characters, you could read it in the space of a second or two — meaning you could skim through a hundred tweets in a couple of minutes. On the Tweetie app, before Twitter acquired it and turned it into the official Twitter application (on the old, small iPhones), 4 or 5 tweets would fit on screen at once. Today, on a big iPhone 6s Plus, only 2 tweets with pictures will fit on the screen.

Have a look through your Twitter feed right now, and see how many of the photos are actually necessary; how much of the relevant information in the tweet would you lose if the pictures went away? From a quick skim through my own feed, the answer is virtually nothing.

But what the pictures add is less tangible. Anyone who has worked with blogs or web design knows that people like pictures; even if the picture is irrelevant to the story, people are more likely to read stories that have pictures next to them. So by adding pictures, Twitter is adding "engagement".

They have said so themselves;

  • Photos average a 35% boost in Retweets
  • Videos get a 28% boost

A glance through my own timeline shows that this is a lesson learnt by many 'brands' on Twitter: stock photo after stock photo that adds nothing to the headline tweet in terms of information, but takes up a bigger chunk of my screen (and therefore diverts attention away from other tweets in a way that isn't related to the quality of the content). They don't do it because it's good; they do it because it's working.

Ultimately, this seems to be turning Twitter into something slightly different to what it used to be – and perhaps to what 'old school' Twitter users are familiar with. And that's basically a 'closed' version of an RSS reader, filling the space I think is still left behind by Google Reader.

Dave Winer mentioned the other day;

Google really hurt the blogosphere with the dominance of Reader and then its shutdown. It's good to pay attention to that now. When you start relying on a dominant product, everything is good, because it hasn't gone away yet. You don't feel the pain until it goes away.

But ultimately, I think it's so that people with Twitter accounts can do what they have been doing for ages now, which is use it to try to write blog posts.

Basically, long tweets become the same as Facebook Instant Articles. No need to click through and wait for a website to load. Text remains accessible, sitting in Twitter's platform (and through Gnip). The Twitter feed becomes more like an RSS reader (click the headline to reveal the content), except totally closed. Annoying screenshots of text go away, along with tweetstorms. Clicking a popular link that fails to load because it's a popular link on a website not equipped to deal with popularity hopefully starts to become a less common occurrence…

I think New Twitter is probably more like Tumblr than anything else. I wonder if Yahoo will notice?

Why the big Apple TV makes sense

One of the odd things about the new Apple TV is the fact that it's available in a choice of storage sizes. Apple's own website seems to gloss over the fact; despite all the specs on the Apple TV page, storage size isn't among them. It's only when you get into the actual store page that the fact that there are two models becomes apparent.

If you plan to use your Apple TV primarily to stream movies, TV shows, and music or to play a few apps and games, you’ll probably be fine with 32GB of storage. If you plan to download and use lots of apps and games, choose the 64GB configuration. Keep in mind when making your decision that some apps, when in use, do require additional storage.

The old (3rd-gen) model officially has no storage (in fact, it has 8GB – but won't let you install apps). So it would seem that 32GB – four times as much as the 3rd gen – should be plenty, unless you're planning to load up on downloaded games.

That said, Apple does seem to have a history of under-speccing the base model of its products: MacBooks with 4GB of RAM long after 8GB was considered a minimum for good performance (frustrating because RAM is relatively inexpensive – if you don't buy it from Apple), iPhones with 16GB of storage – which is probably enough if you don't use it as a camera, are happy to regularly take your photos and videos off your phone, and don't keep music or video on there. The extra storage is relatively expensive (again) – but there are very few people I can think of for whom I would say it isn't worth the extra cost.

The Apple TV should be simple though. For someone using it as a TV/video device (most people, surely?), even Apple say that should be enough.

But this review of Facebook TV made me think that might not be the case. Think about this;

I click to another video in the horizontal feed, and it immediately begins playing in the big hero slot. Then, I click on the original video I was watching and it immediately begins playing right where I’d left off.

To make that work, you need a bunch of different videos cached on the device. And to make that work, you need a decent amount of space to store that video.
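A minimal sketch of the kind of caching logic involved — entirely hypothetical names, in Python rather than anything Apple actually ships — would be a storage-bounded LRU cache: keep the most recently watched videos on the device, and evict the least recently watched ones when the storage budget runs out.

```python
from collections import OrderedDict


class VideoCache:
    """Toy LRU cache for videos, bounded by a storage budget in MB."""

    def __init__(self, capacity_mb):
        self.capacity_mb = capacity_mb
        self.used_mb = 0
        self._items = OrderedDict()  # video_id -> size_mb, oldest first

    def touch(self, video_id, size_mb):
        """Record that a video was watched; cache it, evicting old ones if needed."""
        if video_id in self._items:
            self._items.move_to_end(video_id)  # recently watched -> keep longest
            return
        # Evict least-recently-watched videos until the new one fits
        while self._items and self.used_mb + size_mb > self.capacity_mb:
            _, evicted_size = self._items.popitem(last=False)
            self.used_mb -= evicted_size
        self._items[video_id] = size_mb
        self.used_mb += size_mb

    def is_cached(self, video_id):
        return video_id in self._items
```

Double the capacity and, all else being equal, you roughly double the number of videos that can resume instantly without re-buffering – which is exactly the trick the Facebook review describes.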

The unbundling of naked ladies

So, FHM and Zoo magazine are shutting down.

I used to read FHM – by which I mean, I actually read FHM. Yes, there were the pictures of women in their underwear, but there were also good articles. But, showing my age, this was the FHM of about 20 years ago: pre-internet, pre-Loaded (and very much pre-Zoo/Nuts). [1]

An article in The Telegraph makes the case that women should be mourning the loss of these magazines – because however 'bad' those magazines were, the alternative to 'lads' mags' like FHM and Zoo is whatever is on the other side of a Google search (see Rule 34).

I made a similar point in a blog post a couple of months ago ("Software is eating innocence") – although I was talking about 'real' pornographic magazines, rather than the tamer 'lads' mags' kind of publication. But it's worrying to think that if there is some kind of spectrum with girls in bikinis at one end and hardcore pornography at the other, then algorithmically-driven pornographic content is pushing things at the hardcore end ever further towards the hardcore, while market forces at the other end of the spectrum mean that the tamer 'lads' mags' are disappearing. The impact that is going to have on younger generations as they grow up looks like an alarming trajectory.

  1. My dad used to borrow them, and always said he only read it for the jokes. I only ever half believed him.

"Faster than we think"

A very interesting perspective on the reasoning behind some of the features of the recently unveiled Tesla Model X car;

  1. A front door that opens when you approach it and closes itself behind you,
  2. Electronic seats that move forwards and backwards, making space for a 3rd row when needed,
  3. Falcon doors to make it easier to get in and out (with limited space?)
  4. More storage space under the seats,
  5. An automatically-connecting charger, so the car can charge itself when parked up.

Can you spot the thread that connects them all? (I'm not going to spoil the surprise here...)

At the very least, the next five years (not the next ten, this will happen faster than we think), will be very interesting.

Elon Musk’s sleight of hand

Software is eating regulation

Returning to the VW story: the thing about the cars that passed the emissions tests by cheating is that they did exactly what they were supposed to do.

Presumably, some engineers were given clear parameters for what they needed to achieve: diesel engines that met certain regulations (to please regulators, so that they could actually sell the cars), hit certain numbers (to please the marketers who market the cars to people who care about the environmental figures), and met some performance benchmarks (to please the people who test drove the cars as potential owners).

I'm guessing that the engineers realised that the problems they were out to solve weren't really the same problem, and that through software they could set up conditions reflecting each different problem scenario. I think I can imagine how, from an engineer's point of view, given those particular problems to solve, you could consider it an elegant solution. (Particularly if you have the kind of mindset – which that kind of problem solving would require – to compartmentalise the ethical responsibility for the consequences of your work into a different compartment of the VW corporation. It's the kind of thing that can make great television.)

I wonder what the implications are from a regulatory point of view, though. I mean, it seems clear that the tests were faulty – if a car can behave differently in the tests to how it behaves on the road, then the tests aren't doing their job. Except the nature of the tests has changed. It used to be about measuring an object – an object does what it does, and regulators performed physical measurements. Objects don't lie. Now, the objects have behaviour – they do exactly what they are programmed to do. So now, it's about testing what they are programmed to do.

Marcelo Rinesi of the IEET says;

Things now have software in them, and software encodes game-theoretical strategies as well as it encodes any other form of applied mathematics, and the temptation to teach products to lie strategically will be as impossible to resist for companies in the near future as it has been to VW, steep as their punishment seems to be. As it has always happened (and always will) in the area of financial fraud, they’ll just find ways to do it better.

So, does that mean that the way the measurements are taken needs to be improved? (Reflecting the reality of 21st century cars, bringing the tests to more real-world conditions.) Or does it mean that it isn't just the cars themselves that need to be measured, but that the software itself needs to be subject to testing?

This is a terrifying concept.

For anyone who has never been involved with software, that probably seems like a pretty innocuous statement; sure, just test the software. Wire it up to a monitor, get some geeks to have a look, make sure that there isn't anything like;

IF conditions = "testing" THEN
    LET FuelMix = "Clean";
ELSE
    LET FuelMix = "Dirty";
END IF

Obviously, that isn't even close to what you would be looking for in the real world.

For one thing, anyone who knows anything about writing software knows that testing software is half of the challenge of making it work the way you want it to in the first place. Writing software that doesn't do what you don't want it to do is hard enough when you are actually writing it – dealing with software that deliberately does something that it isn't supposed to do and then hides it in the code is the kind of thing that would be terrifying to have to find – even if you knew for a fact that it existed in the first place. (And that's before factoring in a fairly reasonable dose of paranoia – does the latest iPhone software update just happen to be slowing down what used to be a fast handset because of the cool new features, or is it to make the owner want to upgrade to the latest, even faster model?)
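To make the difficulty concrete, here is a hedged sketch (hypothetical names and logic, not VW's actual code – though reports suggest real defeat devices did infer dyno conditions from inputs like steering angle) of how a defeat condition can hide behind ordinary-looking calibration checks. Nothing in it says "testing", so there is no obvious flag for a reviewer to search for:

```python
def select_fuel_map(speed_kmh, steering_angle_deg, ambient_temp_c):
    """Choose an engine calibration based on driving conditions."""
    # On a dyno test the drive wheels turn but the steering wheel never moves,
    # and the test runs in a narrow, lab-like temperature band. Each check
    # reads like sensible calibration logic rather than a cheat.
    stationary_steering = abs(steering_angle_deg) < 1.0
    lab_temperature = 20 <= ambient_temp_c <= 30
    if speed_kmh > 0 and stationary_steering and lab_temperature:
        return "low_nox_map"     # clean, emissions-friendly calibration
    return "performance_map"     # normal road calibration

# Real-world driving involves steering input, so on the road
# the clean map is effectively never selected.
```

The only way to catch this from the outside is to vary the conditions the software can observe – which is essentially what the researchers who exposed VW did, by testing cars on real roads instead of the dyno.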

The other thing you will know, if you know the software industry, is that many organisations will fight tooth and nail to stop anyone being able to look at their code in the first place. Often, this is for valid reasons (for example, software might incorporate 3rd-party code and be restricted by the licensing deal to protect that code from being accessible to others who could steal it – costing the original developers future sales).

A Wired article from earlier this year explains how the software embedded in cars and tractors remains the property of the manufacturer – not the owner of the hardware that runs it – and the kind of lengths (technical and legal) that those manufacturers will go to prevent anyone being able to see that software. Even when Microsoft was doing its best to argue in court that Windows software wasn't breaking any US or European legislation, it still took years before it allowed government agencies to see the source code – and that was at a time when businesses would literally live or die depending on how Microsoft was implementing the APIs that it wasn't even really making public. [1] In other words, even if the silly pseudocode above were in any way representative of what was hiding in the code of car computers, regulators probably wouldn't be able to see it anyway.

The days when, if you wanted to know how something worked, you could find out by carefully pulling it apart are over. Even if it were possible to examine a microchip to discover what functions it processes, one of the consequences of the 'computerisation of everything' [2] is the growing role that software plays in the basic workings of what used to be seen as physical things – add in an internet connection and have the software running on a remote server (where it can be updated, revised and refreshed at any time with no notice given) and you have an utterly opaque – and almost impossible to properly regulate – scenario.

  1. Wikipedia's "Criticism of Microsoft" page has plenty of information about the kind of activities that were going on at the time.

  2. Google "internet of things" for an idea of how far this is expected to spread.

Software is eating innocence

From The Economist;

Researchers who have listened to teenagers talk frankly report that, for many, porn is the main source of sex education. Even those who have not viewed it have heard plenty about it from friends. It is shaping their expectations of sex—and what they go on to do.

For those who grew up in a pre-internet world, our main source of sex education was probably some combination of what we heard from older kids and big brothers and sisters (whether ours or our friends'), and maybe a few magazines; a world of naivety and personal experience, but very much a world where "in the land of the blind, the one-eyed man is king".

If porn is moving from print to online, then there is a significant change in the impact it has on sex education. What you are presented with is no longer an editorially selected image, but an algorithmically selected set of videos. Instead of a handful of titles on the shelves of a newsagent — all aimed at the same audience — there is an infinite selection of content, on permanent rotation.

And what is determining the selection? An algorithm, which takes on board what everyone is clicking on. Assuming that the Pareto Principle holds (and there isn't really any obvious reason to think it wouldn't), then the 20% of heaviest users are accounting for 80% of the traffic — so it's their choices and preferences that are mainly determining what everyone else is exposed to.

Including those teenagers for whom porn is the main source of sex education.

Maybe those 20% of most active users are actually the teenagers who are finding their way around the world, and this isn't really an issue — just a feedback loop. But I would suspect that the real valuable audience that these kinds of algorithms are being optimised for would be a different kind of audience — at the very least, old enough to have a credit card of their own. And probably not looking for the kind of videos that I would want to be the backbone of my children's sex education.
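The 80/20 assumption above can be sanity-checked with a toy simulation (hypothetical parameters, in Python): draw each user's activity from a Pareto distribution — the distribution the principle is named for — and measure what share of total clicks comes from the top 20% of users. With shape alpha ≈ 1.16, the theoretical split is almost exactly 80/20.

```python
import random


def top20_share(num_users=10000, alpha=1.16, seed=0):
    """Simulate per-user click counts from a Pareto distribution and return
    the share of total clicks generated by the most active 20% of users.
    alpha ~ 1.16 is the shape parameter for which the 80/20 rule holds."""
    rng = random.Random(seed)
    clicks = sorted((rng.paretovariate(alpha) for _ in range(num_users)),
                    reverse=True)
    top = clicks[: num_users // 5]  # heaviest 20% of users
    return sum(top) / sum(clicks)
```

The point isn't the exact number — with a heavy-tailed distribution the sample share wobbles around 80% — it's that under any heavy-tailed activity distribution, a small and unrepresentative minority dominates the signal the algorithm learns from.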

Three letters missing from the UK's 'ad blocking' discussion

While putting together a longer piece about the big ad blocking debate (if you aren't in the media world, you might not have noticed that it has quite suddenly, due to a new technical framework in iOS9, become a big topic for discussion), I thought I would just point out something that I haven't seen mentioned that is pretty important.

The BBC.

OK, so for anyone outside the UK, the BBC is just another publisher with a free website, with display advertising to fund it. And if what you read about the debate is coming from the US-centric tech press, then it makes sense that you wouldn't be hearing about it.

But in the UK, where we pay a licence fee to keep the BBC free and independent, the ad blocking discussion should really be a very different conversation from the rest of the world's. Outside the UK, there is a serious debate (albeit an academic "what if" one) about what happens if online advertising goes away and all publishers have to either go behind a paywall or disappear as commercial entities.

In the UK, the discussion should be dominated by the BBC - whether that is about the value of the organisation, the importance of its continued existence and ongoing funding, or about the negative impact it has on the commercially-funded media that have to compete with it.

It seems strange that - as far as I can see - mentions of the BBC are almost entirely absent from the debate.

Software is eating VW

Marc Andreessen wrote a well-known article on how 'software is eating the world' back in 2011. The basic idea [1] was that software is increasingly becoming more important than the hardware it runs on.

The news this week about VW rigging their diesel cars to pass emissions tests doesn't sound related to that concept. Most reports talk about a "defeat device", but I think it's reasonable to say that most people would assume it's a physical thing attached to the car. It seems that it's actually a software device – the way the car's computer balances chemicals in the engine was programmed to act in an unusual way under testing conditions.

For a technology fan, the car industry is a fascinating one to watch at the moment, with transitions on the horizon including the switch to electric cars (mainly driven by Tesla), the idea of self-driving cars (mainly driven, in public perception, by Google), and the very concept of car ownership being challenged by Uber.

Even with all this in mind, the idea that a combustion powered car could be undone – with enormous damage to the manufacturer and the brand – by the way a piece of software was written is just incredible to me.

  1. Horribly oversimplified here, I'm afraid.

"Data-driven insights"

I was interested to read the winning entries in the Admap prize for 2015. With the essay topic "Does Big Data inspire or hinder creative thinking?", it seemed very relevant to my interests as a researcher in a media agency.

Background: The media industry is in a transition at the moment; advertising money is still moving to digital media, where the ability to target on an individual basis is working under a completely different set of rules to 'traditional' broadcast media, transforming the way advertising is planned, bought and measured.

Last year, I wrote a piece titled "Is Data the new Digital?", about how "data" is becoming a fashionable buzzword which we (ie. the media and advertising industry) use to mean all kinds of data, facts, statistics and information, which risks confusing what we mean when we are talking about targetable data and the new opportunities that it presents for media and commercial communications in a post-broadcast world of one-to-one communication at scale.

[/jargon]

Usually, the benefits of "data" are presented as a contrast to the weaknesses of "research" - data is fast, research is slow. Data is easily collected (you're probably already collecting it through websites, phone logs, sales figures etc.) - Research costs money. And Data comes from everyone - Research only comes from those strange people who spend their time filling in surveys and sitting in focus groups.

Anyway, the AdMap editor had this to say about the broader theme of the competition;

Of course, the biggest opportunity is the application of data in helping understand the consumer better so as to form insights for creative strategy, an analytical catalyst for the big idea, exemplified by 'The Dove Campaign for Real Beauty', which was built on the data-driven insight that 'only 4% of women consider themselves beautiful'.

So, where did that "data" come from? Turns out, it was a 2004 "global study of 3,200 women, aged 18 to 64, commissioned by Dove, using the field services of MORI International" (Source.)

In other words, the insight arose before the survey went into fieldwork – the point of the survey was to collect the "data" to validate the insight. (You don't commission a global survey like that without having a pretty good idea what you're trying to prove.) And it was almost a decade ago.

Personally, I would be inclined to call that "research".

From the archives: Britney vs The Beatles

I think I wrote this somewhere around 2002, for my first attempt at building a website – in the days when nobody was talking about blogs, MySpace was still just a twinkle in Tom's eye, and… Well, let's just say that my tastes have developed since then. (Back then, putting a date on your posts wasn't considered terribly important. Or if it was, nobody told me...) I think this was my first attempt at writing something for my own website, at a time when all the HTML was hand-coded. I'm putting it here so I don't have to worry about the day when my old website inevitably falls off the internet and goes to the big Geocities archive in the sky.

The reason I've dug it up though: there is a passing mention of Max Martin at the end; the guy who wrote "...Baby One More Time" - which was his first number one. At the time of reposting this, he has now written/co-written more number one singles than anyone in history other than John Lennon and Paul McCartney. If his work with Taylor Swift on 1989 is anything to go by, I reckon he has a pretty strong chance of overtaking them.


What this is all about is Britney Spears, who embodies all that I think is good about pop music, and the Beatles, who have become the embodiment of all that I think is bad about pop music.

Actually, it's not specifically about Britney Spears. It's just that she's another target of venomous hate campaigns that I think are thoroughly unwarranted. It could just as well be about S-Club 7, or Steps, or Bananarama, or any one of countless "manufactured" bands. But Britney Spears sounds like more fun to write about, and gives me a good excuse to put her pictures all over the place...

And it's not strictly about the Beatles either; more about the Beatles fans that still exist in the 21st century but haven't quite managed to move on from the late 1960s.

Firstly, I have to say for the record, I don't think the Beatles were a bad band. That's not my point. I'm happy to say that I've bought a few of their albums, and used to listen to them quite often. I think they wrote and recorded some great songs, did some great things, and although I can't really say, because I wasn't born until 7 years after they split up, I think they probably deserved their phenomenal success.

At the time...

At Christmas 2000, the Beatles released an album called "1", which was a collection of all their number one singles. It went to number one in the album charts for a while, and had people queuing up outside record shops for the midnight release.

Now, this is something that's bothered me for a while, but actually seeing this happen really brought it home to me.

Here's an interesting statistic: before the Beatles split up, between 1963 and 1970, they released 13 long-playing albums in the UK. Since they split up, they've released no new songs (unless you count what are, by anybody's judgement, second-rate, discarded songs dug out of their rightful place at the back of a cupboard in Abbey Road and dusted off for the Anthology. They weren't released first time round for a reason...) but have managed to put together another 20 long-playing albums. TWENTY! With NO new material... Now, I have to admit that if I was in the band, or worked for their record company, I'd love to do the same thing- sell not just the same songs, but the very same recordings shuffled into a slightly different order, and watch the ca$h roll in (those country mansions don't pay for themselves, you know...)

No, the thing that bothers me is the people out there who will walk into a record shop and, out of however many hundreds of records are there, choose to spend their hard-earned sixteen pounds on yet another compilation album from a band that split up 30 years ago. I can't help wondering: if someone had told John Lennon that the first Christmas number one of the twenty-first century would be the umpteenth compilation of Beatles songs, would he have laughed or cried? (Considering that Michael Jackson bought up the publishing rights to most of his songs in the 80s- probably cried...)

I can remember thinking how strange it seemed when CDs broke through as the standard format for music and people were spending small fortunes on buying their records again on a slightly smaller, shinier disc, rather than just having a record player and a CD player next to each other. I always thought that was about as pointless as buying a record could get. But buying the same recordings again in a different order? While I'm quite sure that a fair few copies must have been sold to kids who never had a Beatles record before, I'm equally certain they alone couldn't possibly account for enough sales to reach number one at the time of year when record sales hit their peak.

So, these people don't bother me. If you want to get some Beatles songs in your CD collection, but you don't want to shell out two hundred quid for all of their albums and none of the other compilations are to your taste, this may well be what you're looking for.

No, what really irks me are those people who hold up the Beatles as a shining example of what good music can be, and would be were it not for the hordes of manufactured, talentless, soulless boy/girl/pop bands flooding the charts, who not only don't play their own instruments, they don't even write their own music!!!!!

The horror...

There are two main elements to any band- the look and the sound. And Britney Spears is intrinsically better than the Beatles on both counts.

1) Appearance.

The Beatles, to my knowledge, had one dance move. They played their concerts motionless for 95% of the time (except for playing their instruments, obviously), but their one dance move consisted of shaking their hair when they went "wooooo" in the chorus. They couldn't really do much else, because they all had guitars to deal with, and it wasn't until later in the sixties that the idea of moving and playing the guitar at the same time would be invented by such pioneers as Jimi Hendrix and Pete Townshend, who then sent Chuck Berry back in a time machine... So the pressure of doing both at once became too much, and George Harrison gave the rest of the band an ultimatum- either they stopped playing live, sparing him the pressure of shaking his head and playing his instrument at the same time, or he would quit the band. They quickly realised, however, that they were so ridiculously famous that their records went to number one on pre-orders alone, which meant that no-one really cared what they sounded like any more because they bought the records anyway; and as their fans screamed so loud at the gigs that they couldn't hear themselves either, they would be better off not bothering to get out of bed. So, in what can only be described as a massive snub to the fans who trooped along en masse to their live performances, they decided not to play live anymore, which left them free to have their own individual, non-matching haircuts, grow beards, take lots of exciting new drugs and slowly withdraw up their own arses.

Compare this to the pop stars of today. Britney Spears doesn't play an instrument- she gets a professional to do that job. So she isn't restricted by a plank of wood tied around her neck. And the stars are chosen, at least in part, for their looks. And since Madonna, they don't even have to worry about holding a microphone, as it's strapped to their heads. So they are free to dance and put on a show that's actually worth watching (as opposed to four motionless blokes that you can't hear anyway). And when they have their ego-driven temper tantrums, it's more along the lines of "I want a bowl of 1000 M&Ms, and someone to pick out the brown ones" or "I want an entourage of 15,000 stylists, hairdressers and make-up artists, and a stylist for my dog"- which effectively adds to the show, because you can still be entertained by reading about their off-stage antics as they transform into untouchable legends living in a parallel popstar universe. That contrasts sharply with just not going on stage anymore because you can't be bothered.

Combine this with today's extravagant stage shows, with light shows, smoke machines and pyrotechnic displays, and you have a better-looking show. Arguably, this is just a result of things advancing with the times, but there were plenty of bands in the 60s playing around with projectors, lights and smoke to give their shows an edge, so that argument doesn't really hold. I think it's more likely that the Beatles simply didn't care. Did they ever abseil onto the stage, or fly in from the back of the stadium on flying skateboards as Five and the Backstreet Boys did?

And if you're of the opinion that the Beatles' image was all their own, compared to the manufactured images that today's pop bands are forced into, then I recommend you read up on the subject and find out exactly what happened to them in Hamburg that turned them from leather-clad rockers into pop stars with matching suits and haircuts, well before anyone other than their immediate families and friends knew they existed.

2) Playing Instruments & Writing Songs.

Some people would have you believe that a band that plays its own instruments and writes its own songs is somehow intrinsically better than a band that doesn't. This is one of those things that, as far as I can tell, is subtly implanted into people's heads at an early age by people who don't know what they're talking about, and is subsequently taken for granted. It's not true. In fact, if you think it through, the opposite is true.

Firstly, every band needs instruments- whether it's a single piano, a couple of guitars, a bank of synthesisers or a full orchestra. But there's simply no reason for the band to play them themselves. What's a better show- four good-looking young kids singing and dancing, or four good-looking young kids staring at their hands, trying not to drop a note or fluff a chord?

On top of that, there's the problem of ego. Musical instruments are there to play music on. That seems like stating the obvious, but a lot of people who play musical instruments seem to think they're for something else- showing off how well they can play that particular instrument. Which leads us away from the finely crafted three-minute pop songs that we know and love, and into the territory of extended guitar solos that don't send the audience anywhere other than to sleep. Except, of course, for the schoolboys (it always seems to be boys, for some reason) who are learning to play guitar so that one day they can do extended guitar solos too, staring in wonder that anyone can play that fast, when in fact just about anyone who can be bothered to waste a few years of their life practising scales can do it.

On top of which, the bands end up being tied down by their instruments. A band with two guitarists, a bassist and a drummer will generally only be able to do drums-and-guitar-based songs without causing rifts within the band (for example, when the drummer has nothing to do in the studio while the guitarist spends hours playing with a drum machine.) Even if they were talented enough to come up with them, they would never allow themselves to do the likes of "Baby One More Time", or All Saints' "Pure Shores", or S-Club 7's "Don't Stop Moving." But there's nothing to stop Britney Spears getting hold of a guitarist and playing a rock classic. (She's already covered the Stones...)

The other thing that every band needs is a song. Preferably more than one, too. This is where the other popular misconception comes in- that bands who write their own songs are intrinsically better than bands who don't. While I appreciate that it can bring more "authenticity" to hear a singer singing one of their own songs, it's not without its drawbacks.

To me, it makes no difference whether the ideas, songs or concepts come from the band themselves or from their managers, producers, songwriters, stylists, families, or off the back of a cereal packet, so long as they're good. There seems to be a school of thought that artists who don't write their own songs get them from some sort of machine, or simply pluck them out of the ether, so their songs are somehow "bad", while artists who write their own songs spend months at a time carefully crafting them, pouring their blood, sweat and tears into them, so they're naturally "good". Clearly, that's not true- if anything, it's the opposite. A band with an established following will be forgiven for writing the occasional duff tune, because their faces have had thousands of pounds' worth of marketing money invested in them, so they have to be kept happy. A faceless songwriter, on the other hand, can be dropped and replaced without any need for press releases or newspaper headlines- or can simply not have their "duff" songs recorded. So they've got a reason to make every song as good as they can, rather than just knocking out a b-side in less time than it takes to listen to it. On top of which, they're just songwriters: they don't spend their time rehearsing dance steps, practising poses in front of mirrors, or planning what they're going to say to the journalist from NME and whether it's going to be the same speech they gave the journalists from Q and Smash Hits. They just learn how to write damn good songs.

And there's no simple formula or set of rules to follow to make a number one song. What goes straight to number one today might not do the same next week, next month or next year. And if a "rule" should happen to come into existence, it will be broken just as quickly. In the late 80s, the KLF wrote "The Manual- How to Have a Number One the Easy Way." Although it's an excellent read, statements like "In this day and age no song with a BPM over 135 will ever have a chance of getting to Number One" mark it as something forever set in the days before drum'n'bass- it clearly didn't take long for the rules to transform completely. (Nowadays, a final chorus that goes up a note seems to be an appendix to the "Golden Rules".)

And anyway, surely if you love a song, you love the song no matter where it came from? Does it make any difference if it was written by the singer, guitarist, drummer, manager or someone else? It certainly never seemed to bother any Motown fans.

And besides, as much as I love the work of Max Martin (writer of such songs as "Backstreet's Back" and "Baby One More Time"), I don't want a blown-up picture of him on my wall, or his face on MTV. In fact, I don't even care what he looks like. Give me Britney Spears any time...

...But if it was on TV, it would be rubbish

I remember Charlie Brooker saying something along the lines of "if you take something funny on the internet and put it on TV, it isn't funny any more."

Well, I'm trying to put my finger on exactly what it is about Periscope that is supposed to make it so interesting.

The business of nothing.

About fifteen years ago, I read Naomi Klein's No Logo, and the idea of brands outsourcing the actual product (i.e. Nike having their trainers manufactured by a different company) blew my mind.

Today (seen via LinkedIn...)