Microsoft In Talks To Acquire Mobile App Development Startup Xamarin

Sources: Microsoft In Talks To Acquire Mobile App Development Startup Xamarin | CRN

Microsoft is in the final stages of negotiations that could lead to either an acquisition or major investment in Xamarin, a mobile startup whose tools make it possible to code iOS and Android apps using Microsoft development tools, sources with knowledge of the discussions told CRN recently.

When the iPhone SDK was first released, the fact that you had to use a Mac to develop for the iPhone was seen by many as a significant disadvantage – even more so once Android started gathering steam.

I don't know how much of an issue it is today, but I would guess that for developers, it is now seen more as a selling point for OS X than a disincentive to work on iOS.

(via Benedict Evans' newsletter.)

Ctrl-Alt-Del on the Mac

On a PC, Ctrl-Alt-Delete (a.k.a. the Three Finger Salute) is a keyboard shortcut that just about everybody knows. If you need to kill something that isn't responding, then that's usually the solution.

The Mac has an equivalent - Command-Option-Escape, which brings up a window to let you force-quit an application.

I've been using a Mac for about four and a half years, but I still have to Google for that key combination every time I need it – because I've only had to use it about four times.

All-at-once-ness

Ours is a brand-new world of all-at-once-ness. "Time" has ceased, "space" has vanished. We now live in a global village...a simultaneous happening. We are back in acoustic space. We have begun again to structure the primordial feeling, the tribal emotions from which a few centuries of literacy divorced us.

Marshall McLuhan, "The Medium is the Massage", 1967, p. 63

Will "Print" outlive "desktop"?

"Print will be around longer than the desktop," New York Times Publisher Arthur Sulzberger Jr. told a group of media professionals Thursday morning.

Interesting point of view — I don't think anyone is really arguing against the idea that "mobile" (including tablets) is the future of "computers." (A report from Enders Analysis today talks about mobile devices accounting for 50% of time spent online in the UK, and tablet shipments overtaking PC sales.) The "PC" is clearly in decline as mobile is growing.

But the same story has been accepted as true of print for a good few years now — print has been in decline while "digital" was the growth story. So it's a thought-provoking question – which one will last longer: "old" physical print, or "old" digital?

To be honest, I wouldn't like to bet against the long-term future of print. But then again, is its future going to be just as much of a 'speciality' as the personal computer's?

The new way to do subtraction

Apparently an image is doing the rounds on Facebook, depicting the crazy way schools today are teaching maths.

I haven't seen it – partly because I haven't been spending much time on Facebook recently (especially over the last few weeks), and partly because I don't think it's the kind of thing my friends would be posting 1. But it's one of those things where it isn't until you stop and think about it that you realise that the way 'we' were taught how to do subtraction is actually a) pretty complicated, b) pretty impossible in most day-to-day situations, and c) not the way that you actually tend to do subtraction in your head.

Subtraction is one of those things that I think I assumed we had cracked hundreds of years ago. Turns out we hadn't quite nailed it yet.

I do wonder what sort of things my kids are going to ask me for homework help with in a few years that are going to baffle and confuse me.

[EDIT - updated with working link to the original story...]

  1. At least, not the ones that Facebook predicts that I'm going to find interesting.

"…on every desk, and in every home"

Microsoft's original mission statement;

A computer on every desk and in every home, running Microsoft software.

Something I talked about recently;

Consumers have made it clear over the last decade or so that they want laptops that they can use anywhere, rather than desktops that are tied to a desk. And I don't think on the whole that they are particularly interested in using their laptops at their desks either.

The 'Microsoft software' part is interesting to think about. If Office used to be what people wanted to run, then Windows was what they needed to run.

Today, Office is the only Microsoft software running in my house – not because I want it 1, but because if I want to work at home without lugging my office laptop back with me, I need it.

But what else is there that Microsoft make that I would want to install on my own Mac? I was digging around the applications that came with a recent work upgrade to Office 2013, and came across OneNote, which I thought might have been interesting. Until I noticed;

OneNote is available for Windows, iOS, Android, Windows Phone, and Symbian.

So, an application for "free-form information gathering and multi-user collaboration" (interesting) that I can't use on my 'main' computer (effectively useless, unless I want to use it through a browser.)

Let's suppose that Microsoft have a killer app in the pipeline – what are the chances that they will make a Mac version out of the gate? Seems unlikely.

What are the chances of a genuine Excel alternative appearing any time soon? For me – pretty slim. Excel is the Swiss Army knife of 'data' tools – I use it as a calculator, a spreadsheet, an organiser, a chart builder, for natural language processing, statistical analysis… Pretty much anything involving 'data', I'll do my first pass in Excel. For anything boring, I would rather delegate it to an Excel macro than to someone else to deal with manually. (Often, it's quicker to tell a computer what to do than to explain the problem to a human.)
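That "delegate the boring job to the computer" idea can be sketched in a few lines. This is a toy Python stand-in for the sort of Excel macro described above – not anyone's actual workbook; the data and steps are invented for illustration:

```python
# A toy version of the "first pass over the data" described above:
# clean some messy values, tally them, and summarise - the sort of
# boring, repetitive job you'd rather hand to a macro than a human.
# (Python standing in for VBA here; the data is made up.)
from collections import Counter

raw = ["  Apple", "banana ", "apple", "BANANA", "cherry", "banana"]

# Clean: trim whitespace and normalise case.
cleaned = [item.strip().lower() for item in raw]

# Tally and summarise.
counts = Counter(cleaned)
print(counts.most_common(1))  # → [('banana', 3)]
```

Once a pass like this is written down, re-running it on next week's data is free – which is exactly the appeal of the macro over explaining the task to a person.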

But let's face it – that's not how most people use a computer. Most people either have no interest in complex .XLSM workbooks, or if they do then they are using them rather than building them. And if they aren't building them, then they are probably better off using something other than Excel for the job (something Benedict Evans has posted some thoughts about.)


So. Back to Microsoft.

…on every desk

Somewhere I don't think most people want a computer any more.

…in every home

Somewhere I don't think most people need Microsoft software any more.

I can't help but think that in 5 years or so, 'everybody' will be criticising the new Microsoft CEO for not doing the things that, with hindsight, he "obviously" should be doing right now.

But right now, for the record, I just can't see what it is that he should be doing. Other than maybe figuring out a better way for Microsoft to say what they want to be doing.

  1. although my relationship with Excel is kind of love/hate…

"Your an idiot"

"Your an idiot" – apparently the 'most internet sentence.'

It's a lovely analysis, and I thought 1 I'd have a look at just how 'internet' the phrase turned out to be.

The most retweeted "your an idiot" tweet is a reasonably standard 'step four' response from Frankie Boyle;

614 retweets at the time of writing

I particularly liked the most retweeted 'step three' (ie. non-referential) tweet;

To quote from Wikipedia;

At the bottom of the table, Queens Park Rangers were relegated after a thoroughly dismal campaign in which they recorded the worst start in Premier League history, with not even Harry Redknapp's appointment as manager and a substantial investment in players during the January transfer window significantly improving their fortunes.

The most authoritative news story (according to Sysomos' measure of 'authority') – a story about another 'step four' usage - ABC News - Samuel L. Jackson schools the president;

“I’ll be reading scripts and the screenwriter mistakes ‘your’ for ‘you’re.’ On Twitter someone will write, ‘Your an idiot,’ and I’ll go, ‘No, you’re an idiot,’ and all my Twitterphiles will go, ‘Hey, Sam Jackson, he’s the grammar police,’” Jackson said. “Somebody needs to be,” he added. “I mean, we have newscasters who don’t even know how to conjugate verbs, something Walter Cronkite and Edward R. Murrow never had problems with. How the f*** did we become a society where mediocrity is acceptable?”

The 'most authoritative' tweet;

Again, 'step four' is a popular one – especially when from a celebrity. (Rob Lowe has 815,084 followers.)

Fox News – joint biggest "your an idiot" authority…


A wordcloud of "your an idiot" Twitter mentions isn't particularly enlightening (although the word 'irony' is a nice indication of 'step four' popularity…)

Twitter wordcloud


But looking at volumes over time, perhaps it's actually a bit of a 2013 thing – volumes seem to have been steadily decreasing. (The rise of mobile and autocorrect, perhaps?) The 600-tweets-a-day estimate in the original article seems a little high – although it's definitely down from about 850 a year ago.
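The volume-over-time comparison above boils down to bucketing tweets by calendar day. A minimal sketch – the timestamps are invented for illustration, not the monitoring-tool data from the post:

```python
# Count tweets per calendar day from a list of timestamps, so that a
# trend (like a drop from ~850 to ~600 tweets a day) becomes visible.
# The timestamps here are made up.
from collections import Counter
from datetime import datetime

timestamps = [
    datetime(2013, 2, 1, 9, 15),
    datetime(2013, 2, 1, 18, 40),
    datetime(2013, 2, 2, 11, 5),
    datetime(2014, 2, 1, 20, 30),
]

# Bucket by date: Counter maps each day to its tweet count.
per_day = Counter(ts.date() for ts in timestamps)

for day, volume in sorted(per_day.items()):
    print(day, volume)
```

With real data you would plot `per_day` over months to see whether the daily average is rising or falling.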

Tweet volume


Perhaps unsurprisingly, it's less of a big deal in the blogosphere;

Blog mentions


…and, although nowhere near as big as Twitter, still fairly popular on forums.

Forum mentions


  1. before things went a bit crazy over the last couple of weeks

TEDx – David Puttnam: Does the media have a "duty of care"?

Vaguely related to my recent everyone is wrong about everything post – a TEDx talk that (kind of) addresses the issue of why people are so misinformed, and what might be done about it: Does the media have a duty of care?

The summary;

In this thoughtful talk, David Puttnam asks a big question about the media: Does it have a moral imperative to create informed citizens, to support democracy? His solution for ensuring media responsibility is bold, and you might not agree. But it's certainly a question worth asking ... (Filmed at TEDxHousesofParliament.)

Framed in the context of the Paisley snail and laws around duty of care, it's ultimately a discussion of the role of the media in building an informed democracy.

Personally, I tend to share Kevin Kelly's optimistic view 1 that, while technology creates new problems, the benefits of the ones that we adopt outweigh the problems that they create. Which means that I do have a belief that things will get better.

But when it comes to the question of how media/communications will lead to a more informed electorate, able to make better decisions about what to do with their power – and to a shift of power to those people (as opposed to the politicians, media, corporations etc.) – I'm rather less sure.

It's not that I don't see it as a goal. I just don't see a likely path towards it.

  1. Better expressed in his book "What Technology Wants", but I can't find a suitable link.

Twitter, TV and the One Direction effect

Around this time last year, we did a fun little project around pancake-related tweets on pancake day. The biggest driver of tweets over time was a hashtag game, around #replacebandnameswithhashtags.

But the biggest single spike in activity came from this relatively innocuous tweet;

The impact was pretty clear; a massive spike in activity as it was retweeted and replied to;

(The secondary spike a short while later was the result of his girlfriend also tweeting about pancakes.)

I've been doing a few projects around tracking Twitter mentions and conversations around various topics, and it has become something of a joke – if there is a massive, inexplicable spike in tweets, the first thing to check is whether one of One Direction happened to say something related. And I'd say that as often as not, if no other explanation is apparent, then that is what it turns out to be. We call it the One Direction effect.

Another topic I've been watching (along with many others) is the interaction between Twitter and television. Just over a year ago, a Twitter spokesman said that 40% of tweets were about television during peak TV hours – a huge volume.

What would happen if the two collided? This week, we found out, thanks to SecondSync's analysis. They tweeted;

So, how much of a difference did it make?

This much.

To put that into context, in last week's round up they also mentioned the Graham Norton show;

The top show on Friday night’s leaderboard was the final episode of the current series of The Graham Norton Show, which attracted 16,551 tweets, the most it has recorded for an episode in 2014. The most popular guest on the show was Breaking Bad’s Aaron Paul, mentioned in over 5,600 tweets, while a peak of 506 TPM was reached in reaction to Ellie Goulding’s live performance. Overall, 75,646 tweets have been recorded for the eight episodes broadcast in this series.

So, the most tweeted-about programme on a Friday night generated sixteen thousand tweets. But a single tweet from a member of One Direction (during a repeat – not the live broadcast) generated nearly fourteen thousand – in a very intense burst.

This is their analysis of Twitter volume over time – for a particularly popular (in Twitter terms) episode of a popular show, in its original broadcast.

Note the difference in scales to the chart above.

Television gets a lot of attention when it comes to discussion of Twitter and its share price/market value.

Funny how One Direction don't get as much attention. I wonder what would happen if they started posting exclusively to Facebook?

Everyone is wrong about everything

A study from Ipsos MORI carried out last year took a look at how public perceptions and actual facts differ wildly. For example;

  • We think that 15% of girls under 16 get pregnant every year - official figures suggest it is around 0.6%
  • 29% of people think we spend more on Job Seekers Allowance than pensions – actually, we spend £74.2 billion on pensions and £4.9 billion on JSA
  • The public (on average) think that £24 of every £100 spent on benefits is claimed fraudulently, when official estimates are £0.70
  • 26% think foreign aid is one of the top 2–3 items the government spends money on, when it actually makes up 1.1% of expenditure (£7.9bn). More say this is the top item of expenditure than say pensions.

Somewhat depressing, when you consider that these are the people who elect the politicians they think will address the issues they feel are most important.

Data tables (albeit in a horribly formatted PDF) are also available for download.

BRICs and Morton

Along with some changes in the team structure at work, I've been working on a series of market reports on different countries – China, Russia and Brazil so far (and I'm assuming that India is going to be next on the list). Fascinating countries to be learning about, and wanting to do the best I can do has meant work hours and life hours have got a little blurred… And although I would love to share what I've learned and written, I'm not entirely comfortable with doing that just yet. (Although I will say - China's pace of urbanisation is insane.) Which has meant that February posts here haven't quite managed to keep pace with January's updates. 1

So, the time I've had to myself has increasingly been time away from a screen and keyboard. For Christmas, my wife got James Morton's Brilliant Bread. This has worked out quite well for her – she likes to read all about cooking, but I seem to have taken over the actual work of doing the baking. I would strongly recommend the book to anyone interested in the idea — rather than just a simple collection of recipes, lists of ingredients and step-by-step guides, it explains the why just as well as the how — meaning that it works as an excellent jumping-off point when you get to a stage where you want to start trying out your own thing. (Something I used to find particularly intimidating in baking, where you have to worry about the science as much as the flavour — for example, you don't want to chuck a load of salt in for flavour if you know that it's going to stop the yeast from working properly.)

We got excited about a bread machine some years ago – partly for simple bread, but particularly for malted bread and pizza bases. I then progressed to making my own pasta and focaccia (which are beyond the capability of a bread maker) – but the machine has been retired to the cupboard for some time. 2

Having started off with a couple of the simple recipes – 'mug bread' (where everything is measured out with a mug; no weighing or measuring tools needed), simple white bread and soft rolls (both of which require no kneading) – this weekend I had my first stab at an 'advanced bread': a white sourdough.

The crazy thing about this is that the only ingredients are flour, water, and a little salt. The first thing you need is a sourdough starter – which is just 100g each of flour and water mixed in a jar, with a little 'seeding' (James recommends raisins, but doesn't say how many – I think I threw in about a dozen or so), left for 24 hours, 'fed' with another 100g of flour and water, left for another 24 hours before I started using it. The natural yeasts and bacteria in the flour (and maybe the raisins) get activated and start to grow, eating the flour and making bubbles – and become the raising agent for the bread. Somehow, the bacteria that you want in your dough create an environment that other bacteria don't thrive in 3, so you end up with what seems like the bread equivalent of the 'healthy bacteria' you get in tiny bottles of drinking yoghurt, or something… Anyway, whatever the biology behind it, with the raisins picked out, the 'starter' is then mixed with flour and water, a little salt (about 10g – although my scales are cheap and rubbish, so for 'about', read ± 4g or so), and mixed, rested, kneaded, rested, shaped, rested and then baked.

The result is a lovely sourdough loaf. Which I can munch away on while learning all about India…

Lessons learned

  1. When you make your sourdough starter, use a large jar to store it in – ie. make sure that there is enough room for it to grow, because as it bubbles away, it will. As I learnt.
  2. You can get a proving basket to rest your dough in for the last stretch (after shaping it), or you can make your own with a bowl, tea towel and lots of flour. I'm not sure what went wrong with mine – maybe using a muslin instead of a tea towel was a mistake (the weave is too loose, leaving space for the dough to stick), maybe I didn't put enough flour in, or maybe I didn't knead the dough enough. It could be any combination of the three, but when it came to turning out my shaped dough into the pan, the result was something of a sticky mess. Not that it made much of a difference to the finished product (I think), other than a slightly unusual shape on the top.
  3. There is obviously a knack to kneading. The book suggests 10 minutes should be enough – but although my dough got to something like a chewing gum texture, it still didn't pass the 'window-pane test' after a good 20 minutes of working. That might be because I was being too timid – with an 18-month-old daughter asleep upstairs, I didn't want to be throwing the dough around as hard as I could (attending to a woken toddler with dough-covered hands didn't feel like a great idea…) But again – the finished product was something I was very happy with.

So, plans for the future;

  • Sourdough, because of all the funky bacteria breeding in the dough (I think) lasts longer than normal bread – my loaves and buns were dried out within a couple of days, but sourdough is supposed to last a couple of weeks. Which should make it much more practical to have in the house than normal bread (and also means I don't have to buy any yeast, which is another plus.)
  • A proving basket seems like it might be a wise investment – although I'll give the tea towel method another couple of goes first (unless I can find a cheap one somewhere – £10 for a little basket to help bake a loaf of bread that probably costs about 40p in ingredients seems a little bit pointless. I would rather invest that money in better quality flour, given that it's the only ingredient that isn't coming out of my taps. Should I try baking with fancy bottled mineral water?) There is a health shop in the village that seems to have quite a selection of flours, so I'm sure that something there is what I'm looking for. Although how to judge the quality of flour (versus the quality of my baking) is a mystery to me at the moment.
  • Bread with bits in! I want to have a think about what sort of stuff I want to throw in there – ideas right now include a savoury olive sourdough, a sweet raisin/sultana/fruity loaf, sundried tomato… and with my other current cooking obsession being smoked barbecue (pulled pork, beef brisket and ribs) – soft white rolls go very well, but I'm wondering if there's something that I can do with some of the chipotle, honey and mustard that make up the bbq sauce to make a perfect sourdough accompaniment…
  • My mum recently had some bread with some sort of salt flakes baked into the crust, which sounds like a nice idea… So I'm wondering what else might be an interesting thing to bake into the crust – as opposed to mixed throughout the dough.
  • It seems that baking 2 loaves at once would be twice as much bread for just a little bit more work – I'm guessing that it would take less than twice the energy in kneading. So that's something else to experiment with at some point. And making 2 loaves at a time means that I can try out some experimental ideas and still have some nice bread if they go horribly wrong.

4

  1. I made a New Year's resolution to write less and post more – too many half-thought-through ideas and re-re-drafted contemplations of current events which were long past being current by the time I had over-edited any interest or excitement out of my writing.

  2. I know lots of people get bored of bread makers, but I didn't – my wife started cutting wheat out of her diet, and there isn't too much you can do with a breadmaker if you aren't putting flour in it. Although apparently they can also double up as jam makers, which might be useful – I can't see myself using it for bread again.

  3. As I understand it, the bacteria like an acidic environment, and the yeasts produce alcohol which acidifies the dough. The alcohol/acid stop other bacteria from growing. Apparently there are bakeries in San Francisco using sourdough starters that are over 150 years old. I don't know what the hygiene practices of American bakers 150 years ago were, but I can't help but be impressed by that.

  4. I can't think of a blog post I have written where I have been so pleased with the title as this one.

Creative Computers

It's been clear for years that the 'desktop' PC is in decline – the world has chosen the laptop form factor, which can be used anywhere, over more powerful and more economical devices that are tied to a desk.

But is the more interesting trend that the one place most laptop users aren't interested in using their computers is sitting at a desk?


"No more real than it is real-time"

Last Sunday, an American Footballer 1 announced that he was gay.

Like last month when the first Premier League footballer came out (five months after retiring from the game), this caused a stir in the news.

But an interesting angle was pointed out in AdAge about the reaction from the advertising world – where "real time" is the latest buzz, vocal brands and real-time advertisers had nothing to say.

Too sensitive a topic? Perhaps. But more likely, 'real-time advertising' isn't really the 'real time', 'agile', 'always-on' approach that it's being pitched as. Instead, it's just a case of forward planning – maybe some quick Photoshop work or some fast-turnaround video production.

The commenters on the article don't seem to agree – consensus seems to be that commenting on the story would have been inappropriate. For example;

"Brands, in general, are not weighing in on slow-burning issues that culminate in RT moments (laws affecting same-sex marriage, for example)."

However, a little searching reveals this to be a bad example.

Forbes has a slideshow of some outdoor advertising. Mashable has a story about Microsoft and Amazon having some videos. Chevrolet are running some pro-gay marriage ads over the Winter Olympics, and Business Insider has an article – including plenty of social media examples.

I think the issue is that brands are weighing in on these slow-burning issues. But only the slow-burning issues. (Maybe 'social media' still isn't ready for fireworks.)

Apparently brands have already been in touch with Sam about sponsorship deals – so maybe it's that they are looking to make a stronger statement than merely tweeting their support.

Maybe it's just that I'm not paying close attention to the kinds of brands who are tweeting their support or posting about it to their Facebook pages – I am, after all, neither American nor an American Football fan (and haven't been spending much time on Twitter or Facebook this last week or two), so I'm taking it on faith that AdAge's writer, editors and commenters would have noticed if it were a false premise.

But the feeling I get is that this is an example of where 'real time' is falling short. Right now, it's about either preparing for moments of planned spontaneity, or looking for the technology that will detect the stories that meet certain key brand-related criteria (read: use the right keywords).

The point where 'real time' becomes 'real' still seems some way off yet.

  1. That is, a player of American Football. Not an *actual* footballer.

"The trouble with new tech is how democratising it all is"

Bob Wootton on MediaTel;

Just as would-be record producers could create music in ProTools, Logic, Cakewalk, Cubase etc on their home PCs or laptops, so budding video producers now work in software like Adobe Premiere. And boy can they create.
[…]
London creative agencies and West End production and post-companies find themselves facing change on an unprecedented scale because the great ideas they specialise in - and they can be great - have been paid for through massive production costs.
But as I've said, there's a sea change afoot. How will these ideas be funded as the production costs that once funded them collapse?

The music industry has been pushing the story for quite a while about how the cost of a CD is more than just putting some music on a disc and shipping it out to the shops.

It's an interesting view that something similar is happening in the world of video — the cost of the ideas has been buried in the cost of the technology.

So what happens when the cost of the technology drops to near-negligible numbers? When the high-definition cameras are built into your phone, and the video editing software comes free with your laptop? Sure, the aspiring, hungry young enthusiasts can put together some incredible work — but what happens if there isn't an industry there to support them and help them match their skills and hobbies with paying clients to turn them into careers and a regular income?

Cable spaghetti

Having a one-and-a-half-year-old bimbling around the place again means thinking about all the potential death traps in the house, and the latest one that my wife was worrying about was the possibility of our daughter pulling the 37" TV off the table it's standing on and onto herself.

So, after a minor diversion 1, I picked up a wall-mounting bracket and fitted the TV to the wall.

Which left what I thought would be a reasonably straightforward task: tidying up the various cables to gain the extra foot or two needed to plug the now wall-mounted TV back in.

What I had underestimated was just how much cable spaghetti was hiding out of sight, and how tangled 18 months' worth of occasionally moving cables around had left everything.

I probably have more 'stuff' connected to my TV than most people – but I don't think it's necessarily an unusual amount. (I posted recently about what I think a 'typical' household might have.)

This is a diagram of all the cables I have – power, video and internet. (Laid out more or less how they are in my living room.)

Living Room Schematic.png

It's complicated. That many plugs going into two power sockets on the wall probably isn't the safest arrangement it could be. And in the real world, all of those wires criss-crossing one another are a tangled mess – organising them all was not a quick and simple task.

What I would like to see is something that would rationalise all of this stuff I have connected up. Is there a 'dream box' that could deal with that?

When people talk about the idea of an Apple TV, it tends to be either what they could do if they made a beautiful TV set with the Apple TV functionality, or what else the current Apple TV could do.

Well, right now there are two Apple boxes – a router and an Apple TV. Why not put them in the same box? Roll a Time Capsule and an Apple TV into one box and you've got a 'hub' for all the household's computers, internet and online video.

Next up – make it the 'hub' for A/V. My TV's remote control is effectively an on/off button and an input selector. Let this dream box do that job – let the TV people control their TV content (ie. unlike Google's approach) – just a handful of inputs 2, a single HDMI output to the TV and an audio output to some speakers (whether that's something like a soundbar or a full-on 7.1 surround system.)

The result would be turning what I have above into something like this;

Dream Box.png

With some additional inputs, there is plenty of space for extra games consoles, online video boxes (say, a Chromecast and a Now TV, or a Roku or whatever else), Blu-Ray/DVD players etc.

And that's without going into the really interesting space – how you design a user interface for whatever sits at the heart of the living room entertainment system of the future.

  1. After buying a cheap bracket from Sainsburys, rated for 'up to 42" sets', I discovered that my TV's VESA measurements of 600x400 are unusual for a 37" set – more typical for a much larger set.

  2. Multiple HDMI devices are a real problem, and I can only assume it's going to become more of an issue as the technology gets more prevalent. One or two sockets on a TV set is far from uncommon – although newer sets do seem to typically have 3 or 4. HDMI switchers are just an added level of hassle – either an additional remote control, or having to get up and switch inputs.

Chromecast coming to the UK soon?

Talking about the future (or lack of one) for Smart TV, I said;

Today, a typical household might have;

  • TV (well, probably a couple — but ignoring secondary screens for the moment to try to simplify the picture…)
  • TV service (cable/satellite/terrestrial) - probably a separate set-top box (given that over half of the UK pays for subscription TV service.)
  • PVR (eg. Sky+/V+) for recording broadcast TV - two thirds of the UK have one (probably built into the TV set top box, or possibly as a separate VCR-like box — probably not built into the TV.)
  • DVD player/Blu-Ray player for watching pre-recorded films/video.
  • Games console (55% of households) — mainly for playing games, but often used to access online video services.
  • Some sort of dedicated 'Internet video' device (might be an Apple TV, Now TV, Roku etc. Might be a connected PC. Might even be more than one.)

Those last two are somewhat different to the others. 98% of UK households have a television set, and if you have a TV then you have some sort of TV service (whether free or paid). If you have a PVR, then it's probably come from your TV service provider, bundled with the package.

DVD/Blu-Ray players are another 'must-have' – whether it's a low-end DVD player, cheap enough to throw in with your supermarket shop, or a high-end, high-definition player to watch your favourite films in the best possible quality.

But games consoles – while popular – aren't for everyone. If you aren't interested in games, you probably don't have one in your house, and if you do you probably aren't too interested in the 'additional' features it offers – like watching online video.

Finally, the 'internet video device'. If you are interested in streaming films, setting up a Netflix subscription etc. then you're probably interested enough to get something to let you watch it on the big screen. But that's not 'mainstream' – if you're not interested in gadgets, you're probably not interested in finding the best box for your requirements. And even if you are, you might not be sufficiently motivated to go and spend the best part of £100 (or, to put it another way, more than a Blu-Ray player).

Which is what makes Google's Chromecast such an interesting device. At just $35 in the US (about £21 equivalent), it plugs into your TV (and a power supply), connects to your home wifi network, and lets you stream video from your smartphone/tablet to your TV screen.1

And its UK launch is rumoured to be soon.

For YouTube and Netflix, this is probably going to be great news (they are already supported in the US, and both go for a general strategy of ubiquitous availability.) Whether the UK's broadcasters bring iPlayer, ITV Player and 4OD to the platform (especially the BBC's iPlayer) will be the make-or-break factor.

…Which leaves Sky. Their Now TV box is just £9.99, and is being marketed as a way to access Sky's (subscription) TV services without a satellite dish. But it also has apps for iPlayer, 4OD, Spotify, Vimeo and a number of other online services – Netflix and YouTube being conspicuous by their absence.

The thing is, these are two very similar pieces of technology with clearly very different functionality; one for putting online video on your TV, the other for giving you TV through online video. And while the price is low enough to make getting both quite affordable, there is the issue of needing two spare HDMI sockets in your TV set.

But, for those not interested in shelling out for a Smart TV or sticking a games console into their living rooms, this should be an interesting and cheap way to get some online video onto their TV screen.

  1. At least, that's the illusion. In practice, the mobile device just tells the Chromecast what video to stream and where to pull it from – the phone doesn't actually do the work, which means it's free to find the next video you want to watch.
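In other words, the phone only sends a tiny control message; the Chromecast fetches the stream itself. A minimal sketch of that split (purely illustrative – hypothetical function and message names, not Google's actual Cast protocol or SDK):

```python
# Illustrative sketch of the control/data split described above:
# the phone names the stream, the device does the actual fetching.

def phone_cast(video_url):
    """The phone sends only a small control message naming the stream."""
    return {"type": "LOAD", "url": video_url}

def device_receive(message):
    """The device itself pulls the stream directly from the source."""
    if message["type"] == "LOAD":
        return "streaming " + message["url"] + " directly from the source"
    return "ignored"

msg = phone_cast("https://example.com/video.mp4")
print(device_receive(msg))
```

The point of the design is that once the `LOAD` message is sent, the phone is out of the loop – which is why it stays free to queue up the next video.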

The Death of Expertise

The Death of Expertise

Having equal rights does not mean having equal talents, equal abilities, or equal knowledge. It assuredly does not mean that “everyone’s opinion about anything is as good as anyone else’s.” And yet, this is now enshrined as the credo of a fair number of people despite being obvious nonsense.

I think there is something strange going on. On one hand, there is the rise of the 'knowledge economy' and 'information working' (i.e. working with your head, rather than your hands), while at the same time, the value of real knowledge/wisdom seems to be declining. (Which is what the linked article is about.)

My guess is that this is a transitional period – we are seeing the end of the days of isolated pockets of 'clever people' (i.e. people in universities, editorial teams of publications, advisors to the powerful), but haven't yet moved to a world where we really understand the impact of global networks (whether that's people – Facebook, Twitter etc. – or information – Wikipedia, blogging communities and so on.)

(I like to think that's my view as an 'expert', rather than just an optimist…)

"Social TV" measurement

Last week, I mentioned an announcement of a partnership between Twitter and GfK (a research firm) to provide an "official" measurement of TV-related conversations on Twitter. This comes on the back of similar partnerships with Nielsen (for the US) and Kantar (for the UK.)

This is an area that I've been watching for a while — a couple of years ago, I did some analysis at work around online mentions of TV programmes, comparing volumes of mentions to TV audience sizes. I then spent a fair amount of time prototyping and then building my own little Twitter-tracking application which would plug into TV listings to create an ongoing measurement of TV programme mentions on Twitter. I ended up shelving the project for a number of reasons — not least because a company called SecondSync were doing something very similar (but as a proper business, rather than just a hobbyist coding project.)

What has been clear to anyone paying attention to the world of television is that a few different trends have been colliding:

The rise of social media. People talking to one another online about (among other things) television.

Twitter — the ideal platform for this kind of conversation, for a number of reasons:

  • Public — most tweets are visible to anyone who wants to see them (as opposed to Facebook's 'semi-private' nature — much of what happens on Facebook is only visible within limited social circles, either to friends of the poster, or 'friends of friends.')
  • Searchable — put a keyword into Twitter's search, and you can see anyone's tweets mentioning that keyword. Hashtags make this very easy to do, encouraging public content to become part of a public conversation.
  • APIs — It's reasonably easy to plug into Twitter's data feeds and automate the searching process, which means that it's fairly straightforward to count mentions of keywords and see the volumes of mentions.
  • Real-time — Twitter's design focusses on what is happening right now (as opposed to "interesting things your friends have shared", which has been Facebook's focus, with a lot of their innovation revolving around figuring out the most "interesting" stuff to put at the top of your news feed.)
  • Marketing — Twitter have made a concerted push to position themselves as the de facto platform to talk about television. (Zeebox were looking like they might replace them because they were focussed on just talking about TV, but I don't think the idea of only talking about TV has really caught on.) And, as this analysis by my colleague Mat Morrison shows, Twitter gets a lot of news/media attention for its size.
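The counting side of this kind of tracker is simple in principle: pull tweets matching programme keywords, then tally them against a programme list. A minimal sketch (hypothetical tag-to-programme mapping and function names, not SecondSync's or Twitter's actual code — a real tracker would fetch tweets from Twitter's search API and map hashtags to TV listings):

```python
from collections import Counter

# Hypothetical mapping from hashtag to programme name; a real
# tracker would build this from TV listings data.
PROGRAMME_TAGS = {"#bbcqt": "Question Time", "#xfactor": "The X Factor"}

def count_mentions(tweets):
    """Tally programme mentions across a batch of tweet texts."""
    counts = Counter()
    for text in tweets:
        for word in text.lower().split():
            tag = word.strip(".,!?")  # drop trailing punctuation
            if tag in PROGRAMME_TAGS:
                counts[PROGRAMME_TAGS[tag]] += 1
    return counts

tweets = [
    "Great debate on #bbcqt tonight!",
    "#bbcqt is getting heated",
    "Can't believe that #XFactor result",
]
print(count_mentions(tweets))  # Question Time: 2, The X Factor: 1
```

The hard parts in practice are the ones this sketch skips: collecting tweets at scale, disambiguating generic programme names, and aligning mention volumes with broadcast schedules.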

So, despite a much bigger audience, Facebook has been largely left out of the "social TV" conversation — we don't really know what other people are talking about, or the scale of conversations around particular topics. While Twitter can be viewed as a network of conversations tied together by common hashtags, there is no way to connect a conversation I'm having on Facebook with that of, say, a teenager in Taunton or a mother in Middlesex that happens to be about the same thing at the same time. Unless we have mutual friends, those other conversations might as well be happening on MySpace, as far as my experience is concerned.

Last Thursday, Facebook made an announcement:

Today we’re announcing an international partnership with SecondSync, a social TV analytics specialist, intended to help clients understand how people are using Facebook to talk about topics such as TV.
[…]
The first output from this partnership will be a forthcoming white paper, Watching with Friends, showing how different types of people use Facebook to talk about TV across a range of programs in the US, UK and Australia.

Interesting for a number of reasons:

  • Apparently Facebook have been privately sharing some numbers with TV networks in the US, but this is the first time we will get a proper look at what is being talked about on Facebook – not brands being 'talked about', but 'natural' conversations. (There was a very limited tool some years ago that was apparently hacked together in Facebook's early days, but seems to have long since been forgotten.)
  • The fact that it's being done by a firm used to doing it on Twitter suggests that there should be a degree of comparability between the figures. At the very least, it should give us an idea of the difference between programmes that are talked about on Twitter, Facebook – or both. In other words, we start getting a proper idea of 'social TV' – not just 'Twitter TV'.
  • The fact that it's being done by a firm based in the UK should be good news for those of us in the UK media industry.
  • The fact that it's also being done outside of the UK indicates that SecondSync have been developing their business/technology. Which is nice to hear (although a little frustrating on a personal level…)
  • Some visibility into Facebook conversations should give us (that is, researchers/media types) some better idea of what is actually going on, outside of our own circles or brand pages we are involved in.
  • Which, in turn, should tell us a bit more about TV audiences and viewing behaviours.

What the iWatch might do to Apple's revenues

Apple might be working on a watch.

It might be called the "iWatch".

It might be priced at around $299.

And it might be aimed at an existing Apple customer base (like the iPad), rather than to new customers (like the iPod or iPhone.)

And it might generate $17.5 billion in revenues during the first 12 months – which is more than the iPad managed when it launched.

But it might be constrained by supply issues, limiting revenues to between $12 and $15 billion.

That's according to a Morgan Stanley analyst, as reported by CNET. (Who have a nice picture of what it might look like.)

Seems like an awful lot of speculation around how something might happen.

Also seems to be based on the assumption that it will be the next big thing.