(Pre) WWDC 2023
So - Apple event today. WWDC usually isn’t one to look forward to - unless you’re the sort of person who cares about things like Xcode features - because it isn’t the venue where Apple talk about new iPhones. Maybe there will be clues about new iPhone features in some new APIs, but the focus is generally on developers.
This year is different; the expectation is that this is where Apple will introduce their next new platform. If you want to build a VR app, and Apple wants VR apps to be available when people are able to buy their VR headset, then they need to tell developers exactly what that involves - and WWDC is the place where they talk to developers.
Partly, that’s about the ways Apple have come up with to make your apps better - new APIs that will let you plug new features into your existing applications, or that will spark ideas for new types of apps. Sometimes there’s hardware news - usually not iPhone news, but I wouldn’t be surprised to hear about new iPads and Macs (in other words, it’s a good time to talk about what M3 will mean).
But that’s unlikely to make the headlines - VR is going to be big news, and there is no shortage of rumours, leaks and speculation. The challenge is figuring out which is which.
I’m expecting news of a headset that probably won’t launch until about six months down the road. (Any earlier and there isn’t really enough time to design and build new, compelling apps - and it would also clash with ‘new iPhone’ news in September/October; any later and they are missing out on Christmas sales and might as well wait another quarter or two.)
I’m expecting it to be an expensive Version 1 - rumours seem to anchor around $3k, but with the caveat that rumours around the V1 iPad were about double the actual price; the difference being that the iPad was a ‘device for everyone’ at a price point that didn’t really change much over the years (if you buy an iPad Pro today, you’re likely to be paying more than the inflated rumoured price.) For a VR headset, I’d expect the ‘traditional rules’ of V1 Apple products to apply - the first one is the flashy expensive version, the second one is more refined at a cheaper price point, the third is the ‘mainstream consumer’ version. It’s going to be easier for Apple to build a platform that starts at the high end and then moves into manufacturing efficiencies and a smaller/lighter/cheaper version of the same thing in subsequent years, rather than the trajectory of a ‘stable’ platform where this year’s version costs the same as (maybe more than) last year’s, but with additional features/benefits to compel people to upgrade.
I’m not expecting much in the way of ‘augmented reality’ - this is about a device you stick your head in as a portal to a virtual world, rather than layering the digital world on top of the real world. I do like the HoloLens proposition of ‘annotated reality’ (eg. the use case of having a plumber see what you see and highlight the taps you need to turn or the pipe you need to cut) - but I don’t think anyone is dropping $3,000 to save a few hundred quid on plumbers’/electricians’ call-out fees. Maybe some day - but not 2023-2025...
Or maybe the rumoured $3k price point is missing a key piece of information; maybe it’s a ~$1k headset that needs a $2k+ M-series Mac (or iPad Pro?) to provide the computing power it needs to do its ‘proper’ job? Maybe the key point about the weirdly timed Meta Quest 3 announcement last week was the emphasis on ‘standalone device’ - perhaps Zuck knows something we don’t... Maybe it’s a good time to revisit what I was thinking about ten years ago about what the real point of ‘heavy duty computers’ like the Mac Pro would be in the future? (Could Siri Pro, served from inside your home network, do a ‘better’ job than Siri over the internet?)
The other thing is AI - the thing the tech world has got very excited about over the last six months or so, which has very quickly made Siri feel like a clunky artefact of a previous generation of technology. Customer expectations for how ‘smart’ a smart assistant can be have been reset, and Apple (and their competitors - Alexa, and ‘Hey Google’) have been slow to adjust to the new reality. Which is kind of weird, considering how hard Google have been pushing to keep up - Bard and ‘Hey Google’ seem to have no real connection to one another. I get the feeling that Alexa has backed itself into a corner with its 3rd party ecosystem, while Google is more focussed on reclaiming the ground lost to OpenAI/ChatGPT - perhaps Siri has a unique opportunity for a reinvention; nobody is using Siri on a cheap smart speaker or 3rd party TV app the way they are with the others, so an OS upgrade might be all Apple needs to launch a ‘true’ next-generation assistant.
This is perhaps the thing that I’m most excited about - a big weakness of VR headsets is the physical interface; using a ‘normal’ keyboard with a screen strapped to your face is awkward, and virtual keyboards where you point to the letters are fine for entering a word or two (eg. username/password) but useless for anything more than a sentence. Apple has a number of advantages over Meta when it comes to making a physical headset - they design and build the best mobile SoCs themselves, and they have a massive production pipeline with lots of partners and strong relationships. But it’s the software side where I think the real opportunity lies - if the selling point is a headset that understands what you say to it, then Meta will have to do much more than provide cheaper hardware that can’t really compete at a software level if they want to stay relevant. Maybe the speculation has been about the headset device, but the real story will be about the software that powers it. Raising the bar on how you actually interact with a virtual world, rather than just sticking your head inside it, would be a positive thing for the industry in general.
My final point is that Meta have a VR paradigm that they seem to be locked into: you stand up, you put on a headset, and you wave your arms around. In Meta’s offices (or Mark Zuckerberg’s house) that probably works really well - but for most households, the idea that you can have a space where you can stand up and wave your arms around without being massively disruptive (read: annoying) while other members of your household want to watch TV, have a cup of tea, cook dinner, read a book etc. just isn’t a good fit. Maybe it’s just because I’m a middle-aged man, but I like to sit down; sitting down with a VR headset feels like playing Wii Sports sitting down - you can do it, but it feels like a hack. When Steve Jobs demoed the iPhone, he was standing up on stage. When he demoed the iPad, he was sitting down in a comfortable chair; the idealised iPad experience.
I’m really hoping that, assuming Apple have someone wearing the headset on stage, we see them sitting down; a sign that their experience is designed for someone at a desk or on a sofa - ie. that they have considered the physical space it lives in at least as much as the virtual space it creates, not requiring the kind of space that doesn’t actually exist in the majority of households.