Two Bears, revisited

An interesting look at the different challenges faced by Apple and Samsung in the smartphone market by Ben Thompson at Stratechery.com;

Samsung is being challenged by lower-cost competitors; the company’s average price per phone fell by $30 last year, and its share of >$400 phones slipped from 40 percent to 21 percent. This kept up Samsung’s volume – they now account for one in three smartphone sales – but the result was their first profit decline in nine quarters.

Apple had the exact opposite problem: the iPhone's average selling price jumped from $577 to $636 quarter-over-quarter, and was only down $6 year-over-year. Apple also increased its share of the >$400 market from 35 percent to 65 percent. Growth, though, was meager: a mere 7%, despite the addition of NTT DoCoMo and a much earlier China launch for the iPhones 5S and 5C as compared to the iPhone 5. According to Tim Cook, this was compounded by stricter upgrade policies amongst North American carriers.

He makes the point that with Samsung's Android devices, there is no meaningful software differentiation — the iPhone competes with 'everything else', but only at the high end of the market. Samsung are competing with 'everything else' across the spectrum — while running the exact same Android OS (and therefore the same applications and services.)

It brings to mind something I mentioned in a recent post;

A recent episode of the Cubed podcast talks about Microsoft's leadership position changing over time – Ben Evans talks about the 'old' PC market, where a dozen or so PC makers ran low-margin businesses, outsourcing their industry's innovation to Intel and Microsoft.

In the PC world, the actual experience on the cheapest vs the most expensive laptop in the shop is not massively different — the fact that a Windows user knows how to use any Windows computer is one of its strengths. But from the point of view of a hardware manufacturer, the difficulty in differentiating one product/brand from another is a significant challenge.

With the growth of Android, the comparison between Android vs iOS and Windows vs Mac is drawn quite regularly (where Microsoft dominated the market, Apple were relegated to a niche and almost driven out of business), leading to the forecast that history will repeat itself.1

It's an interesting twist that for Samsung — arguably Apple's strongest competitor in the smartphone market — the analogy probably isn't a welcome one.

  1. The fact that Apple came out of it as the most profitable computer company in the world is usually left out of the comparison, as it doesn't really fit the narrative.

Beautiful but dumb

Back in 2012, Anthony Rose said that "in the future your TV will be a beautiful but dumb hi-res panel that will play the content it is told to by your smartphone or tablet." Which is looking increasingly accurate — but probably only telling part of the story.

"Save" vs "Sync"

It's a peculiarity that the floppy disk icon is still the standard symbol for 'save', despite the floppy disk being obsolete technology. But there is a broader issue going on – the action of "saving" is quickly becoming obsolete too.

How often I use the buttons on my TV remote control

[Photo: my TV remote control]

Makes me think that the future of TV remote controls is going to be more like this;

...and less like these;

Because occasionally – very occasionally – I need to do something that the buttons on my TV remote don't let me do: enter text into the Apple TV (generally when searching for something specific in YouTube.) When I need to do that, it's easier to get my iPhone out and use the Remote app.

I should point out that I've actually got one of these 1 which, since setting it up, I have never actually used. Because it's more effort to get it from whatever dusty corner it's ended up in than to enter text with the annoying left/right/up/down controller, navigating around an on-screen alphabet.

  1. although with a slightly different 'special' keyboard layout.

Twitter and GfK announce TV measurement partnership

Twitter and GfK have announced a partnership to "introduce GfK Twitter TV Ratings in Germany, Austria and the Netherlands. The new service will provide insights into the frequency and reach of messages from Twitter users associated with television programs and campaigns."

Those watching the "social TV" industry will recall the deal Nielsen announced with Twitter at the end of 2012 (I wrote about the deal and implications of this kind of measurement for my work website at the time.)

And those wondering why the UK wasn't included in the deal may want to cast their mind back to last August, when a similar partnership between Twitter and Kantar (with SecondSync providing data) was announced.

The big question from my point of view is how this "reach" figure is actually being measured.

Will it be based on inflated counts that assume every follower sees every tweet, and don't account for the fact that people might follow more than one person who tweets about a programme?

Or will it be based on actual data from Twitter, who presumably have the ability to know how many people actually see each tweet (given that they have to do the work of putting it in front of them.)
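To put rough numbers on why the distinction matters, here is a quick sketch in Python — the figures are entirely made up, and even the 'deduplicated' number is really potential reach rather than actual views;

```python
# Hypothetical example: two accounts tweet about the same programme,
# and half of account B's followers also follow account A.
followers_a = set(range(0, 1000))     # account A: 1,000 followers
followers_b = set(range(500, 1500))   # account B: 1,000 followers, 500 shared

# The 'inflated' measure: add up the follower count of every tweeter.
naive_reach = len(followers_a) + len(followers_b)   # 2,000

# The deduplicated measure: unique accounts that could have seen a tweet.
actual_reach = len(followers_a | followers_b)       # 1,500

print(naive_reach, actual_reach)  # the gap grows with every overlapping follower
```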

Sadly, I'm expecting the former…

Reality

For some reason, I thought it was a good idea to let the BBC News application give me alerts on breaking news.

Bad idea for two reasons.

  1. Its alerts come with a BBC alert sound. So if I'm with people, I feel that I have to explain that I don't have the BBC alert sound as a ringtone (I have sound effects from Legend of Zelda, but that's another "reality" topic altogether), and that it's a news alert from the app. (Unless they also have the app, in which case we all get these alerts at the same time.)

  2. It is always either bad news, or a reminder/update of a previous bad news story.

Apparently, there is no such thing as breaking, national good news. Maybe something like the royal baby would have qualified as "breaking national good news" — but for me, that's an irrelevant distraction that I'm going to be hearing plenty more about than I really want to anyway.

The Onion hit it on the head with its "This is what the world is like now" article;

“If you are not hyper-vigilant and in some way fearful for your very life then, I’m sorry, you’re living on a completely different planet,” National Security Advisor Tom Donilon told reporters. “Now, if you feel like you live in an unpredictable place where somebody hates you for no reason whatsoever and literally wants to murder you even though they’ve never even met you, well, that’s living in the world based on how it truly is. It’s an age of constantly searching for answers but realizing there are none because there are simply no logical answers when it comes to insanity.”

It's difficult to step outside of this world, because it's the world that we live in. What is less obvious is that it is the world we choose to live in — perhaps because it isn't a choice we think about.

At work, I had the opportunity to join in some training sessions around Mindfulness. One exercise involved looking at your daily routine, and identifying whether each of your regular activities had a positive or negative ('draining') effect on you. Almost everyone had "watching TV" as an activity, and almost everyone considered it a 'draining' effect. So there was a bit of discussion about what people felt they should do. The general answer seemed to be "watch something else."

The idea of not watching TV on a daily basis barely surfaced in the discussion. As though we didn't have a choice in buying a TV set, putting it in our living room, setting up our most comfortable furniture around it for optimal viewing angle (god forbid we should have the chairs we sit in all evening facing one another!) and switching it on every evening. Our only choice was what channel to watch.

It's kind of like Pepsi vs Coke — we get so caught up in what's going on in "one vs the other" that we don't think about what the third option might be. (Lemonade? Mineral water? Tap water?)

In the media industry, there is a lot of energy going into a semi-manufactured "TV versus online video" war at the moment, with the idea that online video will be able to disrupt the TV market, bringing down the monopolies of cable networks in the US and bringing quality video content to everyone, without all the stuff we don't like about TV (ie. subscription prices, advertising - the stuff that funds the content we want in the first place…)

But this isn't about all of that.

This is about reality.

First point;

Reality is what you choose it to be.

That doesn't mean that if you want to live in a world without gravity, you just need to close your eyes, click your heels and believe… We have a limit to what we can choose from. But the choices available to us are almost infinite. It means that if you choose to live in a world of "news" — watch nothing but 24 hour news channels, read nothing but news editorial, listen to nothing but news radio — then the reality that you live in won't have much in the way of art, poetry, music or dance. (For example…) But that is a consequence of a choice.

What's more, if you believe that we experience the world subjectively — that is, we are constantly creating subjective representations of our experiences, based on what we notice and what we selectively focus on — then beyond our choice of what we want to experience is our choice of how we want to process those experiences.

So, your reality is your choice.
Understanding what that means is easy.
Understanding the consequences of those choices… isn't.

Second point;

Media is an extension of your senses.

Media theorist Marshall McLuhan went so far as to define "media" as "any extension of ourselves", or more broadly, "any new technology". It fundamentally changes our relationship with the world around us.

Whether that is a live video feed from a warzone on the other side of the planet (which extends our senses of sight and sound), the printed word (allowing us to communicate not just across space, but across time), motorised transport (which compresses distance, bringing distant people closer — for better or worse), or even the light bulb (opening up new spaces, changing what we can do in our homes or in public spaces once sunlight has vanished) — it's all "media".

What's more, it doesn't do it in a balanced way — a medium that is purely visual extends the sense of sight at the expense of the other senses.

The well-known aphorism "The medium is the message" is all about this idea — that the effect the printing press had on society had nothing to do with the content that was being printed, and everything to do with the nature of the printed word.

The reason this is important right now is that it's changing. Networked computers take this idea to the next level. Instead of merely extending our senses, they extend our intellect.

So, take the idea of technology as "extensions of our senses", along with the fact that the latest generation of communications technology is all about networked, multi-purpose devices (whether PC, smartphone, tablet, or whatever else might be around the corner), and you are left with a pretty clear picture of the importance of understanding the impact of all of this on society.

Third point;

Whoever controls your senses controls your reality.

McLuhan again;

Once we have surrendered our senses and nervous systems to the private manipulation of those who would try to benefit from taking a lease on our eyes and ears and nerves, we don't really have any rights left. Leasing our eyes and ears and nerves to commercial interests is like handing over the common speech to a private corporation, or like giving the earth's atmosphere to a company as a monopoly.
"Understanding Media", Marshall McLuhan, 1964

But this isn't something being done to us by other people, with choices forced upon us. It isn't about "them". It's about us.

It used to be that your choice was limited to the newspaper you read, or the TV channel you watched the news on.

Now, you have a choice of any media organisation in the world to follow. Or, you can turn to your chosen friends on Facebook, people you find interesting on Twitter, good blogs on Tumblr.

The World is like a ride in an amusement park, and when you choose to go on it you think it's real, because that's how powerful our minds are. And the ride goes up and down and round and round, and it has thrills and chills and is very brightly colored, and it's very loud.

Bill Hicks - "It's just a ride".

It's only a choice - no effort, no work, no job, no savings of money, a choice, right now, between fear and love. The eyes of fear want you to put bigger locks on your door, buy guns, close yourself off. The eyes of love instead see all of us as one.

Bill Hicks' suggestion of what we can do to change the world is a pretty ambitious one. My suggestion would be to recognise what the world you live in is, how much of it is the way it is because of choices you've made, and how much of that is the way you want it to be. Because it's your ride.


So, anyway, I've switched off BBC News Alerts. I think not knowing what is going on in the world is going to make me happier.

Corporate hacking

After I wrote about the idea that there might not be a next big thing in the world of technology (specifically, consumer electronics), an article appeared on Recode.net about "Big is the next big thing."

I loved the article, because it talks about an idea that I found exciting about 6 or 7 years ago, but have kind of forgotten about. It's about companies embracing 'digital' – not in the sense of building a Facebook page, or developing a social/content strategy, or making a useful iPhone app (or any of the other ideas that seem to pop up in every brainstorming session), but companies embedding digital technology within every aspect of the business, redefining themselves and their business models, and looking to collaborate with other sectors.

The internet has always been about connections and networks – first computer networks with ARPANET/internet, then information networks in the 1990s with the World Wide Web, then social networks with Web 2.0 and social networking sites in the 2000s. The next stage is networked businesses – where 'digital' moves up a step beyond just redefining marketing, and starts redefining businesses.

But I think this is the paragraph that really got my attention;

Collaboration between sectors often starts when digitally enabled companies find new life through enhancements that expand their products or services well beyond their originally intended use, which attracts new buyers in new industries.

In other words, businesses are hacking at a corporate scale.

The Next Big Thing

It feels like something weird has happened to the technology industry.

It used to be pretty simple - a new gadget would appear, early-adopters would buy it and figure out what the interesting use cases were, the price would come down as it sold more, more people could afford it, so more people bought it.

Eventually, the gadgets would become mainstream. And "gadgets" were generic - "Walkman" and "Discman" might have been Sony brands, but I don't recall anyone really caring who made their portable tape/CD player. The benefit of one brand over another wasn't particularly clear.

At some point, that seems to have changed: there was clearly a "best" version of a gadget, and it was usually Apple's. iPod was the best MP3 player, iPhone was the best smartphone, iPad was the best tablet. (I think this is to do with the growing importance of software and UI design — with the growing complexity of gadgets, fighting against the growing complexity of user interfaces has become a significant struggle.)

Wearable Technology

Right now, there seems to be a lot of excitement about "wearable technology". (Because everyone is expecting Apple to make a watch, I think.) The common narrative seems to be that, in the next couple of years, everyone is going to be wearing Smart Watches or Glasses (or smart earbuds or smart rings and so on.)

Now, there is an interesting thing going on in the world of health and fitness (not an area I've got a particular interest in — which might well be blinding me to a broader level of interest than I'm giving it credit for) where tracking and measuring what you do has a very clear benefit. But that is for particular people, in particular times and places.

But as a mainstream technology that "normal" people are going to buy and use? I just can't see it.

Is it because I'm getting old and cynical about the new and shiny things? I don't think so.

The thing is, everyone seems to be looking/waiting for the Next Big Thing – the gadget that is going to appear one day, and within a couple of Christmases every household will either have one, want one, or wish they had another.

I watched it happen with mobile phones — first, they were laughable gadgets for people who needed to be in touch with the office while they were driving to their next meeting, or networking on the golf course. Then they became affordable. Then everyone wanted one — for emergencies. Now everyone has one with them at (pretty much) all times.

Then the phones became smart — from pocket phones to pocket computers. Again — something that only a few people wanted (so they could keep on top of their email wherever they were) became something that everyone needed.

We watched everyone replace their record collections with CD collections, and we are watching them replace their CDs with either a library of MP3s (eg. iTunes) or streaming services (eg. Spotify.) We watched everyone replace their 23" CRT TVs with 40" LED HDTVs. Everyone threw out their VHS recorders, replacing them with PVRs and DVDs. And it looks like we are in the early stage of everyone who replaced their desktop PCs with laptops now figuring out what happens when they replace their laptops with tablets.

So, naturally, we want to know what is the next piece of technology that everyone is going to have. And how is it going to change what we do?

Which brings my attention back to wearables. Because I don't think it's going to be like that. Put simply, I don't think everyone wants Google Glass, or a Pebble watch, or a Fitbit or Nike FuelBand, etc. And the devices that might be interesting to everyone (assuming a drop in cost and complexity) aren't interesting all of the time.

My guess is that's the key misunderstanding around "wearables" – the assumption that if it's going to be interesting, it's going to be interesting all of the time. That, in the same way we have our phones with us everywhere we go, we will want to be wearing Google Glass (or whatever) all of the time — not just when it's useful, or when it's practical, or when you're doing something where you could really benefit from having information presented to you in your field of vision. (Or maybe just when you want to be doing something with your hands at the same time as taking photos or videos?)

Will anything less be a failure? That seems to be what is being set up;

But what I’m looking for from any of the ['wearables'] companies during the Consumer Electronics Show is a device that gets the “Consumer” part of that equation exactly right, and delivers an experience people will be glad to go out of their way to actually wear – and not for a fortnight, but for a long, long time, until something better that fits the same need comes along.

Have we seen The Last Big Thing?

But not everyone wants an Apple Mac, or a 5.1 surround sound system, or a treadmill or exercise bike, or a games console, or a pair of Beats headphones. Doesn't mean that they aren't interesting, or successful, or a good business. It just means that they aren't for everyone. Not every "good thing" has to be a "big thing".

The interesting question is what if there is no Next Big Thing? What if the next 5-10 years of new technologies are going to be limited to niche markets — and there isn't another tech revolution just around the corner?

Have we been conditioned to expect something bigger?

It seems like anything less than an iPod/iPhone/iPad scale revolution would spell doom and disaster for Apple. But other technology markets (from products like TVs and smart watches, to services like Spotify or Netflix, to technologies like NFC or Ultraviolet) seem to be perpetually in Apple's shadow. Are Apple going to enter the market and destroy the competition? Or suppress growth by not getting involved?

But if the Next Big Thing fails to appear, are we going to blame Silicon Valley and the tech industry for not being innovative enough? Investors, for not nurturing the Next Big Thing startup that might have been — too eager to sell out, too slow to build a sustainable business around the Next Big Idea?

Are we going to blame consumers and the economy for not spending enough money on the latest shiny gadgets — being too quick to buy the cheaper version of yesterday's innovation instead of the more expensive one which carries the true revolutionary technology? Or maybe the NSA, GCHQ, Facebook and Google, for scaring everyone about what doors the next wave of gadgets might be opening into our lives?

Or maybe — and probably more likely — the Next Big Thing isn't going to be a physical thing. I mentioned the impact that gadgets had when they became more about software than hardware. My guess is that the interesting developments right now are going on inside the devices that we already have — the services that are being built for a world of smartphones and tablets that wouldn't make sense in a world of desktops and laptops.

I don't know what the Next Big Thing is, but if I had to make a bet I would say that it won't be something we buy. It will be something we do.

Link hoarding and web design

I have a bad browser habit. At any given time, I will usually have dozens of tabs open. It's got to the point where my work PC regularly freezes up, and I'm pretty confident that it is due to the amount of memory being sucked up by 3 or 4 Chrome windows, each with 15-20 tabs open. (When it gets to the point where the tab icons disappear due to lack of space, I will usually start another window.)

The idea is that all of these pages have something interesting or useful in them – something I'm considering buying, work related research, interesting stories that I want to read properly but didn't have the time when I opened the page, things I was distracted from while reading (eg. opened a link in a new tab.)

But the thing is, this is supposed to be a powerful PC (Intel i7, 2.13GHz, 4GB of RAM, running Windows 7 64-bit – seems pretty good to me.) At home, my MacBook has 8GB of memory – enough that I can edit HD video, multitrack audio and do the kind of data crunching tasks in an hour or two that my last computer would have had to be left to run overnight (and probably still be running in the morning.) But the one thing that consistently brings it grinding to a halt is Safari (or whatever web browser I'm running), taking up too much memory with too many pages open.

Years back when I was more involved in web design practice, I was astonished at how designers would treat bandwidth and memory as near-infinite resources – throwing all sorts of Flash, Javascript and images onto web pages where simple text would have done the job just as well. In fact, text would have been better in many ways – for those with accessibility requirements, using devices that didn't support Flash/Javascript (like my old Windows Mobile phones), or with limited/metered bandwidth.

I don't think that it's something that has got much better – 5-6 years ago, the fact that the iPhone didn't support Flash was seen by many as a problem with the iPhone. Today, if a website relies on Flash, it is more likely to be seen as a problem with the website than with the device used to look at it. (I have avoided even installing Flash on my Mac, and run a Chrome plugin that stops Flash from loading on websites unless I specifically allow it.)

Motherfuckingwebsite.com explains these kinds of issues – quite angrily – but the key point is at the end.

What I'm saying is that all the problems we have with websites are ones we create ourselves. Websites aren't broken by default, they are functional, high-performing, and accessible. You break them.

Anyway, bloated web pages are a real issue – but at the same time, there are "web apps" – websites that include all manner of additional functionality, not because they are over-designing and over-engineering what is ultimately a piece of text to read, but because that's the point of the app. So, a web browser needs to be able to handle these. Right now, I expect to be able to have my CMS admin window, an RSS reader, Twitter, Instapaper, webmail and at least a couple of other web apps open at any given time, without my computer grinding to a halt.

So, while bad web design is a problem, it isn't really my problem.

My problem is that I need to stop keeping so many web pages open.

"Men with very troubling issues"

John Gruber;

Heads-up displays and augmented reality are coming, no doubt. But a lot of the people who are excited about it today seem to be men with very troubling issues.

Just watch the video.

I'm sure that there are dozens of ways that this kind of technology will be able to improve people's lives. But it seems that we aren't yet at the stage where the people selling the tech are looking at improving normal people's lives. (It feels like there is something very telling about the fact that the user of the AR tech in this video drives a Ferrari — not a symbol of engineering or style, but of ostentatious luxury and wealth.)

If the applications being touted are things like this, is it any wonder that people find the idea of Google Glass a bit creepy?

Windows 9: The operating system formerly known as Windows 8.1

Paul Thurrott recently shared some of what he's been hearing about the next big Windows release — codename "Threshold".

Some interesting stuff about "Metro 2.0", and the first major "vision" announcement since Longhorn, but the bit I found most interesting was;

Windows 9. To distance itself from the Windows 8 debacle, Microsoft is currently planning to drop the Windows 8 name and brand this next release as Windows 9. That could change, but that's the current thinking.

This strikes me as a dangerously bad idea. So much so that I just can't believe it might happen.

Firstly, a big difference between an '8 to 9' upgrade and an update/service pack is the price. If the tainted 'Windows 8' brand is ditched, will it still be free, as previously reported? Seems like something that would be very likely for a point release, but would be setting a revolutionary precedent for Microsoft. (Unless they were to move to a 'free software, paid support' model – consumers get to run the latest version for free, businesses pay for ongoing support etc?)

But the bigger issue – is this really addressing the problem? Microsoft had a problem with Vista, which it addressed by making Windows 7 a significant improvement. If Windows 9 does turn out to be a rebadged Windows 8.1, then it risks turning the "Windows 8" problem into a "Windows" problem.

And, if the future of the Windows device really is something more like the Surface than the traditional laptop form factor, that becomes a big problem.

My Windows Weekly cohost Mary Jo Foley notes that Update 1 will be tied, schedule-wise to Windows Phone 8.1 and that both releases are a step towards that future when Windows and Phone are combined.

Lots of marketing moves that make total sense. My worry is whether the company that didn't see the problems they had with Windows 8 before they released it will be able to fix them for the next release and restore the confidence of PC buyers.

It's not like the people selling PCs don't have enough to worry about already. Even Intel looks like it's hedging its bets.

SomeRandomNerd.net Redesign

I've just clicked the button to make a new template go live on the site.

I hope you like it - any feedback (particularly bugs or awkward looking pages) would be helpful.

Straight Outta Cullompton

I have a very good friend and colleague, who I have been badgering to start blogging, tweeting, and generally moving into the digital world of the 21st century.

Partly because I get excited about that kind of thing (see: everything I've written on this site. And the fact that this site exists…), and partly because I'd like to know more about what is going on in the world of music, which I increasingly hear about through him.

So, I was pretty pleased recently when – presumably through someone else's insistence (I had long since given up) – he started writing a blog about various music-related things.

If music has an important place in your life, I would recommend that you have a look. A couple of good starting points — Mobile phones and pop music and a narrative context for music.

Oh - and he has written a book of the same name, which is a great read. Watch out for the film…

(Also worth noting - I'm in the band.)

Keeping my computer tidy with Hazel

Over the Christmas break, I went on a bit of a tidying and organising mission, clearing up my desk space at home, and clearing up my hard drive.

As anyone knows, the difficult bit isn't tidying up – it's maintaining tidiness as you go along. And I thought that's something that I could really do with some help with.

The first thing is setting up some sort of 'system' – ie. what goes where, what do you keep, what do you throw away, what do you archive etc. (My own policy is that I usually don't throw away anything unless I am absolutely certain that it will have no value at any point in the future. This isn't always a good policy — especially when it comes to real world things — but in a digital world where I have a few terabytes worth of drives, backups and external storage, I think it makes sense.)

I tend to find myself with a few folders that gradually become general purpose dumping grounds – which is what I need help with.

Downloads – a massive collection of files that stay around long after they are useful, while more useful/interesting things get easily lost in the clutter.
Desktop – where I put things while I'm working with them, but move them out again much less frequently.
And Dropbox – a repository of things that I want to be able to access from work, home and mobile devices, but again – the system for putting things in doesn't have much of a 'companion' system for getting them out again.

So, in imposing some sort of order on them, I find myself building 'rules' – I want to keep software I've downloaded; data tables and MS Office files are usually related to work.

Which is what led me to the Hazel app for the Mac. $28 (£17), with a free trial (14 days with full functionality – which should be enough time to figure out if you are going to get any use out of it.)

Basically, it's an automated organisation system for the Mac. The interface is simple - in System Preferences, a Hazel pane appears;

[Screenshot: the Hazel preference pane]

Here, I've got 7 folders that Hazel is watching, with 3 rules for my Dropbox folder.

The first rule is called 'Tag recent files' – named because that's exactly what it does; anything in my Dropbox folder that has been recently modified gets a "Recently Modified" tag. I've also got this set up in Finder as a 'favourite' tag, so at a glance, I can see anything in there that has been recently modified.

[Screenshot: the 'Tag recent files' rule]

It has a companion rule, which will remove the tag from anything last modified more than a week ago;

[Screenshot: the companion rule that removes the tag]

So, where before I tried to maintain an "Active" folder of things that I was currently working with (which just ended up turning into yet another general purpose dumping ground), I now click on the Tag in Finder, and I'll see everything that I've been working on in the last week. (I could probably do something with Spotlight instead, but I'm playing with tags for now…)
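For the curious, the logic behind that pair of rules is simple enough to sketch in a few lines of Python. This is only an illustration of what the rules do – Hazel itself watches folders for changes rather than scanning on a schedule, and writes the tag metadata directly, whereas this sketch assumes the third-party `tag` command line tool (jdberry/tag, installable via Homebrew) is available;

```python
# Rough sketch of the 'Tag recent files' / untag rule pair.
# Assumes the third-party `tag` CLI is installed (brew install tag);
# Hazel itself does the equivalent work natively.
import subprocess
import time
from pathlib import Path

DROPBOX = Path.home() / "Dropbox"
TAG = "Recently Modified"
ONE_WEEK = 7 * 24 * 60 * 60  # seconds

def set_tag(path, add):
    # `tag -a` adds a tag to a file, `tag -r` removes it.
    subprocess.call(["tag", "-a" if add else "-r", TAG, str(path)])

now = time.time()
for path in DROPBOX.rglob("*"):
    if path.is_file():
        # Tag anything modified in the last week; untag everything else.
        set_tag(path, add=(now - path.stat().st_mtime) < ONE_WEEK)
```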

The downside here is that the tags are OS X only - so none of this organisation is much help on my work PC or iPhone/iPad, but I can live with that for now.

Another rule will pop up a notification to let me know if any large files are taking up space in my Dropbox folder, to help me keep the size down.

[Screenshot: the large-file notification rule]
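Again, just to illustrate the logic of that rule – a minimal sketch in Python, using the `osascript` tool that ships with OS X to fire the notification. The 100MB threshold is my own arbitrary choice, not what Hazel uses;

```python
# Rough sketch of the 'warn me about large files in Dropbox' rule.
import subprocess
from pathlib import Path

DROPBOX = Path.home() / "Dropbox"
LIMIT = 100 * 1024 * 1024  # arbitrary threshold: flag anything over 100MB

for path in DROPBOX.rglob("*"):
    if path.is_file() and path.stat().st_size > LIMIT:
        # 'display notification' needs OS X 10.9 (Mavericks) or later.
        script = f'display notification "{path.name} is taking up space" with title "Dropbox"'
        subprocess.call(["osascript", "-e", script])
```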

Right now, most of my rules are either moving files into some sort of folder organisation, or just throwing tags at files – tags being one of the Mavericks features that I've only just started exploring (again, as part of my Christmas tidying/organising drive). I haven't quite figured out what's useful (ie. lets me do something faster, or more easily) and what's just interesting. For example, I didn't realise that the download source domain gets saved as part of the file metadata – so adding those as 'source:url' tags lets me use that for filing, like stuff from certain analysts getting automatically filed away into an 'analysis' folder.
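That 'where from' metadata is easy to inspect for yourself – a quick sketch using the `mdls` tool that ships with OS X (the default file name here is a made-up example);

```python
# Print the download-source metadata that OS X records for a file.
import subprocess
import sys

# Pass a file path on the command line; "report.pdf" is a hypothetical default.
path = sys.argv[1] if len(sys.argv) > 1 else "report.pdf"
out = subprocess.check_output(["mdls", "-name", "kMDItemWhereFroms", path])
print(out.decode())  # e.g. kMDItemWhereFroms = ("http://example.com/report.pdf")
```

This attribute is what a 'source:url' tagging rule would key off.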

As far as official documentation goes, Hazel does seem to be a little thin on the ground. That said, the interface is simple enough – what is probably more useful is inspiration than direction. There is so much that can be done with Hazel that it's more a case of figuring out what would be useful for the way you work (ie. taking care of things you do manually so that you don't have to) than trying to figure out how to go about doing whatever you want to do. Chances are, if Hazel can't do it on its own, then combining Hazel actions with AppleScript/shell scripts will get you where you want to be.

A good starter for ideas can be found on the official forums.

Take 5 iPads into the shower? Not me…

This story is truly insane;

Master of His Virtual Domain

At one point, he was bringing five iPads into the shower with him, each wrapped in a plastic bag, so that none of his accounts would go inactive.
During this period, while children like Ichi were dreaming of becoming the next Jorge Yao, George Yao himself lost 20 pounds, almost without noticing.

I've been playing Clash of Clans lately – I have a strange sense of pride that I play 'casual games' with a policy of not spending any money (and certainly not losing weight over it), so I'm a long way away from playing it like these top players, but I do have some sense of what the article is talking about.

John Gruber's comment on the article;

Yao is obviously an extreme case, but in-app purchases are driving game design more towards addiction and less towards fun.

I would definitely agree that this kind of behaviour is a kind of addiction. But I'm not so sure that I'd attribute it to in-app purchases. I don't know how many people are playing this particular game, but with millions of dollars a day being spent, it must be well into 8 figures.

The thing that I find fascinating is how the players of the game spread out, in terms of this kind of behaviour. This particular game is clearly an extreme case – simply from the amount of money being spent. (There are hundreds of games out there that are barely covering their costs.) And this particular player would appear to be an extreme case within that extreme case – no other player has managed anything close to his achievement.

But how many people are sinking money into this kind of game? (I have to admit, I have mentally planned what I would do if I were to spend some money within the game – a game which, I feel I should point out, I have probably spent a good few hours playing without either spending a single penny, or seeing any advertising – so the people who are paying for gems within the game are effectively paying for my games.) I'm pretty sure that most people are just playing for a little while, getting bored and moving on. A smaller number must – like me – be dipping into the game, providing live opponents for similarly matched players to play against. (The way this particular game works is that you can only fight a player when they aren't online.)

And – I presume – a smaller number still are putting money into the game. (I would guess that it starts out with buying a small bag of gems and building a second Builder's Hut – and once that threshold has been crossed and that first purchase has been made, it takes a lot less thought to put a second payment through to boost your stock of gems…) And within those is a hardcore base, spending 'too much money' (I struggle to conceive of a value system where $250 a week on virtual goods in a casual, mobile game isn't 'too much money'), and/or too much time.

But… how many people are spending that kind of time and money on other kinds of computer games? How much do you need to spend on the latest games consoles (and games) to hit the 'too much' threshold?

I can think of a few games I've played where I have crossed the line between the fun of the initial stages of the game through to the 'work' of having to grind your way through some menial, time-consuming task or other.

Usually – unless there is an especially good storyline that I want to see through to the end – that's the point where I put down the controller and move on to something else.

But sometimes, games become an obsession. I've lost track of how many people I've talked to who understand the phenomenon of playing too much Tetris and starting to see block shapes in things like wall tiles. Tetris didn't need micropayments to worm its way that deeply into people's brains. Whether in-app purchases are deepening those kinds of 'addictions', I don't know – but my guess would be that it's more of a social than an economic effect. It's relatively easy to walk away from a financial investment (at least, to stop throwing good money after bad.) But is it as easy to walk away from a 'social investment', when you've spent hours getting to know people in your 'clan', helping each other out, and chatting with one another while waiting for your next match/power up/etc.?

I would guess that the answer is no.

Why take 'no' for an answer?

When I signed up for a renewed Wired subscription (which I originally received as a gift), I thought I was clear about what sort of marketing communications I wanted to receive from Condé Nast and their partners.

Now I'm not sure whether carefully checking the web form was even worth the effort…

How not to argue

Daniel Finkelstein, on the two rules of politics;

After several fruitless exchanges I fear that I responded to a personal comment by telling my interlocutor that I had reluctantly reached the conclusion that he was “pathetic”. I can’t say that I immediately regretted doing this. Or that, even now, I resile from the judgment. But the next day it didn’t feel as good as it did at the time. And a couple of days later I began to wince at the error.

The first might be considered the “shoot the messenger” rule…
When someone issues an angry rebuke, observers associate that person with anger. When accusing someone else of being pathetic, whatever the merit of my case, I mainly succeeded in making myself appear pathetic.

The second rule one might call the university fraternity rule. My favourite social psychologist, Robert Cialdini, points to a study of fraternity initiation rituals in the US. The more humiliating the initiation, the more the membership was valued. It’s a psychological trick we play on ourselves to make us feel the humiliation was worthwhile.

Thus by calling someone pathetic I was increasing my antagonist’s commitment to the position I was arguing against. He would become even more wedded to it in order to justify to himself the insults he was enduring.

I had made myself look small while simultaneously making my adversary feel more certain that he was right. Good work there.