Teaching the Media Industry to code

As you've probably heard, something new was introduced to the National Curriculum last year: coding lessons.

I saw this Mediatel article from last month: "The coding revolution and what it means for media". It makes the point that software is now integral to the media world 1 — as it's been put elsewhere, "software is eating the world", and the generation growing up now will – or at least, should – have a better understanding of it than any generation before:

This is a revolution that will be 10 years in the making but could change this country forever.

On one hand, I'm a little sceptical. I remember my own computer education at school involving learning about word processors, spreadsheets and databases – seeds of knowledge that I probably spent 20 years without ever calling upon, until I was in a working world where most people who have the need for a database immediately turn to a spreadsheet, while MS Access remains an unopened application on almost every desktop.

But I digress; it is an initiative I am totally behind. But the reason it will be "ten years in the making" is that this is a change to the national curriculum: kids will now be "learning to code" at school, and therefore (the logic follows), over the next ten years we will see new graduates with an increasing understanding of "how to code".

Two points:

  1. If this is a good thing, why are we waiting 10 years to feel the impact?
  2. Who is going to make sure that the industry is keeping up with the education system?

Perhaps as an industry, we just haven't really thought about it yet. (Maybe it won't be until a few of us have found ourselves baffled by our kids' homework that the impact starts to sink in.) Perhaps we are hoping that, by the time this wave of truly computer-literate graduates joins the industry, those of us thinking about this stuff now will have been promoted high enough that we won't need to worry about being outsmarted and outgunned by what they are able to do, safe in the knowledge that it won't help with boardroom discussions or high-level negotiations.

If so, then that is a slightly worrying thought.

I wrote something at the beginning of the year about three levels of "using a computer". I worry that most people in the media industry see computers as digital paper. They might spend several hours each day using Excel, but they haven't taken the time to think about how to automate their most repetitive workflows.
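To make that concrete with a hypothetical example: a chore like adding up a column of figures from a folder full of CSV exports – the kind of thing that often gets done by opening each file in Excel, one at a time – collapses into a few lines of Ruby. (The folder layout and the "Spend" column name are invented here purely for illustration.)

```ruby
require "csv"

# Hypothetical example: sum the "Spend" column across every CSV export
# in a folder - a chore that is often done by hand, file by file.
def total_spend(folder)
  Dir.glob(File.join(folder, "*.csv")).sum do |path|
    CSV.read(path, headers: true).sum { |row| row["Spend"].to_f }
  end
end
```

The point isn't the specific script – it's that once you can see the repetition, a few minutes of scripting can replace an hour of clicking.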

I suspect that the real problem is that there are plenty of people who have absolutely no need to understand how to do that kind of thing. If you are in senior management, then you are probably (rightly) more interested in the software you can buy to simplify the problems that exist for everyone in your organisation, or to address key business challenges, than in the training and learning that could help everyone address their own personal computing problems. (Not to mention the massive difference between what you need to know about software for 20th-century processing of invoices and purchase orders, versus the 21st-century world of programmatic, biddable, real-time media trading and so on.) Or maybe you are worrying about how to quickly bring those kinds of skills into your business: outsourcing software development to a trusted partner/supplier, acqui-hiring relevant businesses, and so on. Probably not hiring a bunch of suitable people to fill roles that don't yet exist, reporting to people who don't understand their skills or work, recruited by people with no idea how to interview them.

But here is my point: if you are a manager, then it's your job to teach the people who report to you how to do their jobs.

"Coding" wasn't a part of most people's jobs yesterday. But that doesn't mean that the ability to code isn't something that can make most people's jobs easier, more efficient, or more productive today, or that it won't be as much of a core skill as the ability to edit an Excel spreadsheet or PowerPoint presentation tomorrow.

Sure, a board-level manager probably doesn't need to spend time learning how to make themselves more efficient in Excel. But I think they do need to set an example for the people they manage. I mean, I doubt that Barack Obama will spend any time reading through StackExchange trying to figure out how to squash his bugs – but he has made public his attempts to learn to code (and who knows, maybe he has spent some time learning a thing or two about data processing?)

But ultimately, I think we need to think about the image of 'coders'; both the view from the outside of wizards who make computers do magical things, and the view from the inside of nerds who would rather spend time staring at text on a computer screen, making data rearrange itself in particular ways, than spend time with other people. (Not that it's necessarily true, but I think it can be an image that is easy to play up to.)

So… Last year, I resolved to be a better coder and learn more about how to use a computer. This year, I'm resolving to be a better teacher.

  1. By "media world", I mean the media/advertising world that I work in, as opposed to the broader media/journalism/entertainment world.

More thoughts on "Using a computer"

I came across this story about someone who is boycotting Amazon. The Amazon part of the story isn't what caught my attention, though – it was this bit:

I got out a pen, paper and calculator and, going through my Amazon history, I totted up everything I had spent. It came to an eye-watering total. Over the past eight years, I have spent £4,279 on Amazon.

(Worth noting that she had just mentioned the Microsoft Office package that she bought from Amazon for university.)

So,

  • She went through her Amazon history – presumably on a computer, which it is reasonable to assume also had a copy of Excel installed,
  • Wrote down the cost of each order with a pen and paper,
  • Put each one of those numbers into a calculator,
  • Got totals for her overall Amazon spending, as well as subtotals for certain months and years (quoted in the article).

To me, this seems crazy. Surely it would have been quicker to copy/paste those numbers into a spreadsheet, where she could have quickly totted up her total spending and done any additional analysis that she wanted to. (From the article, it seems that she did some further analysis – concluding that she spent £4,279 in total over 8 years, and that although "a small amount of the total was DVDs and computer equipment […] the vast bulk — £4,000-worth — was spent on books".)
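For the sake of illustration (with made-up amounts – her actual order history obviously isn't reproduced here), the whole sum-and-subtotal exercise is a couple of lines once the numbers are out of the order history, whether in a spreadsheet or a scripting language:

```ruby
# Hypothetical order totals, one per line, as they might be pasted
# straight out of an order history page
amounts = "12.99\n30.00\n7.50\n4.95".lines.map(&:to_f)

total = amounts.sum.round(2)                             # the equivalent of =SUM(A1:A4)
subtotal_over_10 = amounts.select { |a| a > 10 }.sum.round(2)  # e.g. only the bigger orders
```

No pen, paper, or calculator required – and any follow-up question ("how much of that was books?") is one more line, not another evening of typing.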

This is exactly what I was talking about when I wrote about using a computer and described different types of computer users:

1. The Computer as electronic paper. You put your numbers into a spreadsheet, but still do the calculations with a calculator. In other words, you see a spreadsheet as a table of numbers – which it is – but don't understand why a spreadsheet is anything more than just a table. (Similarly, you might not see much of a difference between a word processor and an electric typewriter.)

What I wanted to point out here were a few things:

  • Just in case you live in a highly technical/'computer-literate' environment where everyone is an expert, this is an example of a "Type 1" computer user in the real world.
  • This is someone who clearly owns a computer of their own, and also had a need for Microsoft Office software for their university degree. (The article doesn't mention whether she graduated – it isn't relevant to the story – but she does appear to have spent a few years at university, and now holds a job as a Commissioning Editor at the Daily Mail.) This certainly implies someone with a good level of intelligence, not to mention being highly literate, who probably uses a computer every single day of her professional life.
  • Given her profession, I imagine that the word processor/electric typewriter comparison doesn't really apply here.

Has Black Friday transformed Christmas Shopping in the UK?

This is a fuller version of some work we did looking at Black Friday in the UK this year. I say "we" — there was some debate and discussion around this, and what is presented here should be considered my own opinions rather than those of my employer or colleagues. (That is to say, alongside the standard disclaimer: not everything here was necessarily originally my idea, but it is reflective of my current thoughts and opinions.)

The traditional Christmas retail cycle used to be fairly simple: a build-up over the course of November and December, leading up to the last-minute gift-shopping frenzy, followed by the Boxing Day sales, when people would go and spend whatever they had left after Christmas (and often a fistful of gift vouchers) on themselves. But as an increasing amount of Christmas shopping activity has moved online each year, that pattern is changing.

For several years, we have seen a kind of "double peak" pattern in online traffic around the Christmas run-up: the first peak in early December as the more organised online shoppers get their orders placed in plenty of time for a Christmas delivery, and then a second peak later in the month as shoppers look online for information to help with their last-minute shopping — presumably well aware that they have missed the chance of a Christmas delivery.

In the US, where Thanksgiving is celebrated at the end of November, that first peak has traditionally been pushed by the "Black Friday" phenomenon. When I mentioned it in a weekly round-up post for my work blog last year, I thought it was worth explaining exactly what "Black Friday" meant — assuming that the concept of the post-Thanksgiving retail event would be unfamiliar to UK readers.

This year, that phenomenon is much more familiar. Firstly, because more UK retailers than ever have been joining in with "Black Friday" marketing — although Amazon claim to have led the charge, British brands such as Tesco, Sainsbury's, Topshop and Argos — even that most British of high street retailers, John Lewis — were promoting Black Friday discounts this year, giving them their best sales week on record.

But we have also seen a sharp increase in online mentions of “Black Friday”, identifiable as coming from the UK – a fourfold increase on mentions last year.

Other countries have also seen significant increases in mentions this year, but while Black Friday remains a predominantly American phenomenon, the UK now accounts for the second-largest number of online mentions of “Black Friday”.

Perhaps the most telling difference is in the volumes of mentions over time; in the US, Black Friday sees significant volumes of mentions in the days leading up to the event itself, with almost as many mentions at the end of the day on Thursday as shoppers talk about preparing for the sales and shops’ early openings.

In the UK, there was relatively little build-up; the peak was in the morning of the Friday, remaining high until lunchtime, after which it gradually declined over the course of the afternoon and evening.

Two brands stand out very clearly among UK mentions – Amazon (who claim to have brought the tradition to the UK – although there were retailers with “Black Friday” sales previously, Amazon can probably be credited with bringing the phenomenon to the attention of a broader audience), and Tesco. Mentions of Amazon are broadly split between those who love the discounts being offered, and those who are disappointed that they don’t offer much more than typical ‘sale prices’ – and don’t discount as steeply as US sales.

Tesco mentions were generally less favourable, referencing reports of “chaos” at the physical stores, referencing the “ridiculous” and “hilarious” videos being shared on YouTube, and questioning whether the discounted electronics are items worth fighting over. (Perhaps an easy criticism to make when focussing on the items on sale, rather than the value of the discount to shoppers – as the Washington Post points out, it isn’t the wealthy or the comfortable who are standing in line in the cold, or wrestling with one another over a slightly discounted Xbox.)

Is it here to stay?

Many of the mentions in the UK are specifically talking about the US tradition coming over to the UK, and commenting on the unattractive scenes at supermarkets and other stores. (In fact, Walmart is one of the most mentioned brands in the UK, as people comment on the scenes of bargain-hunters fighting over limited stocks.)

Whether the interest will keep up after the novelty has worn off is hard to say. There is a clear benefit to the strategy if it works: getting customers to spend with you earlier (rather than later, when they might spend with a competitor) benefits the individual retailer, while getting people to do their Christmas shopping earlier could mean that some shoppers will be doing their Christmas shopping for longer — in other words, rather than simply spending the same amount earlier, they will keep on buying more gifts (stocking fillers, "I saw this and just had to get it for you" gifts and so on), which is good for the broader industry as well.

But it does seem likely that retailers who are pushing their pre-December sales as online discounts will be best positioned to make the most of the buzz around the discounts, without the negative associations that can come with images and videos of people fighting over discounted large-screen televisions in supermarket aisles. This is also more in keeping with the way UK shoppers prefer to do their holiday shopping – more shoppers in the UK plan to do their shopping online than via brick-and-mortar stores.

So – expect to see more next year: more sales, more discounts, and no doubt more ugly scenes from the shop fronts. From an advertising perspective, I would expect to see more media money being spent on earlier messaging promoting Black Friday offers as competitors work harder to get top-of-mind association with Black Friday, which should drive earlier excitement and buzz. But I think that for the smart marketers, the place to watch will be how the bigger retailers are handling their online presence, and – with mobile accounting for more Thanksgiving traffic than desktop devices in the US – how they are looking to cater for smartphone and tablet shoppers. Setting up an 'online queue' system to manage high levels of traffic might be OK for a desktop web experience, where users can leave a browser window open in the background and get on with whatever they need to get on with, but trying to do the same on a bandwidth- and battery-constrained mobile experience seems like a recipe for disaster.

I'm hoping that the subject of next year's Black Friday marketing conversations will be the interesting technical innovations in handling large volumes of mobile shoppers as quickly as possible, rather than the crowd control (or lack thereof) at physical retail stores where people fight it out amongst themselves over big-ticket electrical items (perhaps to save themselves money — perhaps to make it back on eBay).

An OSX Service to get a web page title

The issue: I have a bunch of services that I use to drop URLs into a journal-type text file that lives in Dropbox, which I then go through to write blog posts, newsletters and the like.

Going through each link (opening it up in a web browser, then copying the relevant details from the web page back into the text file) is a boring task. But the real problem is that it's a boring task that I only do when I'm in the right mood for the more creative task of writing up whatever it is that I'm writing.

The idea: I want a service where I can just click on a URL and automatically convert it to a (Markdown) link, looking up the web page from the URL to get the title of the page.

It turns out that it's pretty simple. I set up a Service in Automator which receives selected text, with its output set to replace the selected text.

All the Service does is run the following shell script (using Ruby):

require 'open-uri'
require 'nokogiri'

# Read each line of the selected text (a URL), fetch the page, and
# print it back as a Markdown link: [Page title](url)
ARGF.each do |line|
  url = line.strip
  next if url.empty?
  doc = Nokogiri::HTML(URI.open(url))  # Kernel#open no longer opens URLs on modern Ruby
  title = doc.at_css("title").content.gsub(/\s+/, " ").strip
  print "[#{title}](#{url})"
end

To make it work, you will need the Nokogiri gem installed in your system Ruby. (Nokogiri can be straightforward to install – it can also be a complicated mess – so the instructions are outside the scope of this blog post.)

Obviously, there is room for improvement here. For starters, it seems like overkill to pull down a whole page of HTML and then use a full HTML/XML parsing library like Nokogiri just to get a page title. (readline seems like it could be useful here.) It would also be nice to extract URLs from a selected piece of text – that is, to turn it into a service that could be used on a selection of text with multiple URLs in it. And it would be nice to detect URLs that are already part of HTML or Markdown links and ignore them.
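A minimal sketch of that first improvement: scan the response line by line and stop as soon as a title is found, rather than parsing the whole document. (This assumes the title sits on a single line, which real-world HTML doesn't guarantee – it's an illustration, not a robust parser.)

```ruby
require 'open-uri'

# Scan an IO line by line and return the first <title> content found,
# so we never read or parse more of the page than we need to.
def title_from(io)
  io.each_line do |line|
    return Regexp.last_match(1).strip if line =~ %r{<title[^>]*>(.*?)</title>}
  end
  nil  # no title found
end

# Usage (network required): title_from(URI.open("http://example.com/"))
```

Because it takes any IO-like object, the same function works on a live `URI.open` response or on a local file, which also makes it easy to test.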

But as a starting point, it does the job.

Keyboard Maestro macro to help with assigning Keyboard Maestro macros

If you use a mac and are into macros, shortcuts, customising things and generally making your computer do the kind of things you want it to do, I strongly recommend giving Keyboard Maestro a whirl.

If you're already using it, you might be familiar with the feeling when, having put a great macro together, you need to figure out how you want to trigger it. There are quite a few strategies out there – using special characters to denote a macro (e.g. I use "@@h" as a shortcut for my home email address – apart from occasionally using class variables when writing code in Ruby, I can't think of any instances where I'm likely to type "@@"). Sometimes, you might double the initial letter (e.g. I use "ddate" and "ttime" to insert the current date and time, respectively).

Once you've got some sort of idea of a trigger you want to use, this is a quick macro to check whether your chosen string of characters appears inside any standard English words.

It's pretty straightforward – I have this triggered when I type "ccheck", and it runs a simple one-line shell script, which looks for what you typed in a dictionary file that comes built into most UNIX systems.
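The macro's script isn't reproduced here, but the check it performs amounts to searching the system word list for the candidate trigger (in shell terms, essentially a single `grep`). A hypothetical Ruby equivalent of that check – `/usr/share/dict/words` is where most UNIX systems keep their built-in word list:

```ruby
# Return every dictionary word that contains the candidate trigger
# string, so you can judge whether it is likely to fire by accident.
def words_containing(fragment, dict = "/usr/share/dict/words")
  File.readlines(dict, chomp: true).grep(/#{Regexp.escape(fragment)}/i)
end
```

An empty result means the trigger can't appear inside normal typed words; a non-empty one means you get to decide how likely you are to ever type those words.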

If there are no matches, you get a brief alert (which you can safely ignore) telling you so. If anything does match, a window will pop up to let you know which words contain the string you typed, so you can then decide whether you care or not.

I'm pretty sure that there is a cleverer way of going about this (maybe using one of the "proper" OSX dictionaries), but this is more of a quick fix to stop me using something stupid that is going to get accidentally triggered than a guarantee that I'm never, ever going to trigger it by accident.

For example, it tells me that there are 5 words that include "ttime" – but I'm quite confident that I'm never going to type any of them.

The Magazine is shutting down

Sad news.

The Magazine was an interesting publication. A paid-for collection of long-form articles, with a simple content-first idea: putting the writers first, so that they could publish great content.

I'm guessing that you probably haven't heard of it. And I'm guessing that's a big part of the reason why it has shut down.

Jim Dalrymple (publisher of The Loop website, and its sister Apple Newsstand publication The Loop Magazine) had this to say:

I understand the issues of being an independent publisher on Apple’s Newsstand—it’s not fun. Apple should just admit that they don’t give a shit about digital magazines and be done with it.

Maybe it's just that the idea of tablet publishing is going through a trough of disillusionment. But I imagine that in a parallel world, where it got picked up by someone with the money to promote it to a wider audience, it would still be being read by the kind of people who subscribe to Wired but can't be bothered to download the enormous and frankly pointlessly "multimedia" iPad application, or the kind of people who pick up copies of gadget magazines, flip through the pages, and wish that there was something stimulating amongst the endless stream of smartphone and hi-fi reviews.

I like to think that, back in this world, there is still a space for something like The Magazine – that the same idea, perhaps slightly tweaked, can be a long-term commercial success. At least, I hope so.

Don't underestimate the power of convenience

A couple of years ago, we got a new 1 car. Being a gadget nerd, I was very keen to make sure that it had a Bluetooth stereo that could connect to my phone (although not quite clever enough to make sure that it would play music from my phone, not just connect for hands-free calls).

But even as a gadget nerd, I thought that the "keyless" feature – where you only have to have the key with you, then press a button to unlock the doors or turn a knob to start the engine – was pointless. How lazy would you have to be to not be bothered to get a key out of your pocket? And isn't it just asking for trouble – for when your key runs out of battery and leaves you unable to get into or start your car? Or when it gets wet and the electronics break?

I was so wrong – because I was looking at it from the perspective of someone completely used to the way I had worked with my previous cars. I had never even thought about situations like:

  • Carrying a load of bags to the car from the supermarket and wanting to open the boot,
  • The car knowing if my wife's keys are inside when I try to lock the doors,
  • Having a baby in my arms who is fighting tooth and nail to get free, while I prepare to wrestle her into her car seat,
  • Coming back from a service station with a cup of coffee in one hand and some food in the other,

…and probably plenty of other situations where I have breathed a quiet sigh of relief that I just have to press a button – or even ask my 5-year-old to press the button – to lock or unlock the car.

This is the experience that I keep coming back to when I'm thinking about things like NFC payments and "smart watches". Sure, I don't need contactless payments as a feature on the cards in my wallet – but when I need to make a payment quickly and I've got my hands full (which is much more likely to happen if you have small children than if you don't), the speed of contactless is a definite plus. And although it's easy to mock Apple's video of "how payments work today", as a woman fumbles with the stack of cards in her wallet, it is without a doubt far easier to take a phone out of your pocket and present it to a payment terminal with one hand than it is to take out your wallet, remove a credit card, and put it back in again.

As for online payments and the "Verified by Visa" system that makes me enter my card details repeatedly and try to remember which unique, ultra-secure password is attached to which card – I get a sinking feeling every time the logo pops up on my screen.

Sure, I can get by quite happily taking my phone out of my pocket when it buzzes to tell me there is something I have already said I want to be alerted to. But for those occasions where I'm in a meeting at work, or having a conversation with someone, I would much rather be able to glance at my wrist to see whether I need to even think about it right now, or if I can happily ignore it for the next hour or so.

I think that for systems which promise to add a level of ostensibly pointless 'convenience', the question to ask isn't "do I need this feature?" It's "if I had this feature already, would I choose to switch to what I'm currently doing?"

For example, if I was using QR codes, I think I would quite happily switch to typing in a short URL. If I was paying for everything with my phone (one-handed), I can't imagine switching to a world where I have a collection of cards in my wallet.

For all the talk about retailer security, data protection, and so on (which is important in making the system available – not in whether people will use it), I think that's the issue that is going to determine whether systems like Apple Pay succeed or fail. And while the Apple Watch is being pitched as a fashion item rather than a functional phone accessory (which, similarly, is important in making the convenience available), I think that's the issue that is going to determine whether those early adopters find it to be something that offers genuinely useful functionality, rather than simply a 21st-century take on finely engineered jewellery.

  1. By "new", I mean "used".

Accuracy vs Precision

Today's XKCD:

I'd be inclined to take this a step further; when you say "people are stupid compared to your expectations", what you are really saying is "my expectations of how smart people are are constantly and consistently wrong, yet I am unwilling to change my expectations to accommodate this information."

One of the things that stuck with me from my degree 1 is the idea of "Accuracy" vs "Precision."

Precision is essentially about consistency – a repeated measurement will give very similar values. Accuracy is about a measurement being close to a "true" value. So, a measurement using an incorrectly calibrated set of weighing scales can be very precise (always reporting the same weight), but not very accurate. Or reporting a figure to a large number of decimal places can be a very precise measurement – but that doesn't mean that it is an accurate one.
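As a toy numerical illustration of the weighing-scales example (the readings here are invented):

```ruby
true_weight = 100.0                      # the "true" value being measured
readings = [102.1, 102.2, 102.1, 102.2]  # hypothetical readings from a miscalibrated scale

mean   = readings.sum / readings.size
spread = readings.max - readings.min

# spread is tiny, so the scale is precise;
# but the mean sits about 2 units above the true value, so it is not accurate
```

The two properties are independent: recalibrating the scale would fix the accuracy without changing the precision at all.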

So, the "people are stupid" statement could be taken as saying that your views are not accurate, you know that they are not accurate, you know how you could change your views – but you refuse to. Perhaps to maintain your belief in your own level of relative 'smartness'.

Which, I guess, is pretty stupid…


It also reminds me of my favourite George Carlin quote: "Think about how stupid the average person is. Then realise that half of the population is more stupid than that." 2

  1. I studied Chemistry with Environmental Chemistry. On one hand, I probably should have switched to a topic I found more interesting when I realised that I wasn't going to maintain a keen interest for 3 years. On the other hand, I'm not sure I would have necessarily made a better choice.

  2. Of course, if you interpret "average" to mean "mean", then this assumes that stupidity/smartness is normally distributed, and the mean and median are the same – which is not necessarily true.

Teaching kids to code

I've noticed a lot of talk about "learning to code" over the last couple of years — both from the likes of Codecademy promoting their services, and from people I know who are interested in the idea of "learning to code."

Now, there is a whole topic around what "learning to code" means, who should really be doing it, and at what level; but as someone who has taught themselves languages like PHP, JavaScript, Ruby, Objective-C, VBA and a few others, my main interest is how to get my kids interested at an early age. I don't know if it's something that they will be interested in, but I do want to make sure that if they are, then I'm making it as easy for them as possible.

I didn't think that it was something that would be happening any time soon — my eldest is about to turn 5, and his maths knowledge is still at the "adding and taking away" stage. But I was surprised to discover that apparently as a part of the National Curriculum, next year he will be learning about "code" at school. Which means, I suppose, my "job" isn't so much about giving him a head start as giving him support.

Which means that a couple of apps that I recently discovered should be worth sharing with anyone with kids of a similar age.

The first one is Daisy the Dinosaur. It's very simple, and basically consists of writing instructions to get a cartoony dinosaur to perform certain tasks. For example, to get her to reach a star, you give her the instructions to move forwards and then jump. It's a simple drag-and-drop interface for moving "commands" into the "program", and (I think) quickly gets across the idea of chaining commands and loops.

It's very basic, but enough to give you an idea of whether this is going to be interesting or not. Once you've cleared the (low) bar that it offers, had a play around with the "free" mode and got to the point where you want to do something beyond the ability of poor Daisy, then take a look at Hopscotch from the same developers. This is pretty similar at first glance, but it also lets you create your own "rules" from a set of different commands. It's got another level of complexity, and it's probably worth spending a bit of time with it yourself before introducing it to your kid (and playing along with them). On one hand, this seems like it could be a bit intimidating; on the other, the language it uses is "Turing complete" — which means that the possibilities of what can be done with it are limited by what can be done with computers in general, rather than by specific limitations of the application. Considered in that light, it's actually a very impressive application.

(And if you're in the position now where you might find your 5-year-old teaching you things you didn't know about coding, maybe you should just quietly download them for yourself.)

One Note for Mac

A couple of months ago, I came across and wrote about Microsoft's OneNote application;

I was digging around the applications that came with a recent work upgrade to Office 2013, and came across OneNote, which I thought might have been interesting – until I noticed:

OneNote is available for Windows, iOS, Android, Windows Phone, and Symbian.

So, an application for "free-form information gathering and multi-user collaboration" (interesting) that I can't use on my 'main' computer (effectively useless, unless I want to use it through a browser). Let's suppose that Microsoft have a killer app in the pipeline – what are the chances that they will make a Mac version out of the gate? Seems unlikely.

Well, since then, Microsoft has released OneNote for Mac.

It seemed like fortuitous timing – it happened that on the same day, my work email moved from Lotus Notes to Outlook, so I'm hoping that a few of my daily pain points around email will disappear (and that the inevitable new pain points won't be as bad…). And, having also upgraded to Office 2013 a few weeks before, I've got plenty of new tools to play with. So, I've installed OneNote on my own laptop, my phone, and my iPad. (Naturally, I've also installed the new Office apps too – although I haven't yet had a reason to have a proper play and figure out whether a 365 subscription is worth my while.)

What am I going to do with it, though? Currently, my 'notes' system revolves around a combination of nvALT (Mac), Notepad++ (PC), Nebulous Notes (iPhone/iPad) and a Dropbox folder to tie it all together. The only catch is that it's very much text-based – so the idea of a system for 'richer' notes is appealing. (For example, taking notes during a meeting, where I can easily separate out action points from points I want to reference, etc.)

The first thing that appeals about OneNote is the free-form-ness of it. More than any other application that comes to mind, it resembles the way I like to use a paper notepad – but without the limitations of space that a paper notepad clearly has.

The second thing that appeals is the idea of collaborative working – I'm a fan of the idea of personal information management through notes, wikis, etc. My biggest frustration with Office documents is the 'read only' message when you try to open a document that someone else is using. The idea of being able to store my own notes on, say, ways to use a piece of software we use at work – notes that others can also add to – is very interesting to me.

So, first impressions: now that it's cross-platform, it has become a lot more useful. From a first glance, it looks interesting. (And seeing the icon on the IFTTT page is very encouraging.) So, I'm going to be looking out for ways it might be helpful – whether that means taking a laptop into meetings/presentations, or using a phone/iPad for note-taking and then the desktop version for organising/formatting. I'm not sure just yet. But it's worth a whirl...

…except – irony of ironies – my PC version (that is, the non-free, paid-for-by-my-employer version that is part of Microsoft Office, and that Outlook is peppered with links to) doesn't work. Or at least, it doesn't work with OneDrive (presumably disabled to keep 'work stuff' under work's control) or SharePoint (not yet activated while the IT guys deal with the rest of the Outlook/Exchange/SharePoint transition – I hope…).

So now, in a complete reversal of my previous position, I'm able to make full use of it in a 'personal' capacity (on my own computer, my own phone, and my own tablet), but not in a 'professional' capacity on my work hardware. For a piece of software which, as far as I can see, would be much more valuable to me in a work context than a personal one.

Microsoft's troubles are starting to become a little clearer to me…