Latest Posts:
In the growing buzz around generative AI, a new concept in research methodologies has arisen: "synthetic respondents". Instead of asking real people your questions, a Large Language Model creates 'synthetic respondents', which you can ask as many questions as you like. And they will give you answers. And they will probably sound like real people. They will never get bored. They will never try to disguise their "true" thoughts and feelings (as David Ogilvy once said, “People don’t think what they feel, don’t say what they think, and don’t do what they say.”) You can get answers from thousands of them, very quickly and at very little cost.
(Also - they never leave behind a bad smell, and won't eat all of your biscuits.)
But again - so obvious as to be barely worth mentioning - they aren't real people. They are synthetic - "made up." Just like the 'actors' pretending to be the sort of people we actually want to talk to.
They will do it faster. They will do it cheaper. Will they do it better - or at least, 'good enough'? Well... that's the real question.
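To make the idea concrete, here's a minimal sketch of what a 'synthetic respondent' amounts to in practice - assuming the OpenAI Python client, with an entirely made-up persona and an illustrative model name standing in for whatever a real study would use. It's a sketch of the technique, not a research tool.

```python
# A minimal "synthetic respondent": a persona prompt plus a question loop.
# Assumes the OpenAI Python client; the persona and model choice here are
# illustrative assumptions, not recommendations.
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

# The "respondent" is nothing more than a system prompt describing
# who we want the model to pretend to be.
PERSONA = (
    "You are a 42-year-old primary school teacher from Leeds. "
    "Answer survey questions in the first person, in your own voice, "
    "in two or three sentences."
)

def ask(question: str) -> str:
    """Put one survey question to the synthetic respondent."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative choice
        messages=[
            {"role": "system", "content": PERSONA},
            {"role": "user", "content": question},
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    # It will never get bored, however many questions you ask.
    print(ask("How do you feel about supermarket own-brand biscuits?"))
```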
A rough theory of why voice notes get such wildly different reactions from different people.
The Apple Vision Pro is now on sale. People are getting their hands on them, and sharing their opinions. People who haven't got their hands on them are sharing their opinions. There are a lot of opinions flying around.
First thing - sure, I'm interested in the headset, and the device actually getting into 'normal' people's hands (or onto their faces) is this week's news. But I'm not going to buy one, because it's ridiculously expensive - and if I had that sort of money to throw around, I probably wouldn't be driving a car that's approaching either its 18th birthday or its last trip to the scrapyard, and has done the equivalent mileage of five times around the circumference of the Earth.
But what I'm really interested in is the Vision platform: the bits of the software that will stay the same when the next headset device is launched. And, once there are a bunch of different ‘Vision’ devices, where they will fit in the spaces in people's lives.
The promise of the internet plus the World Wide Web was an open, free network of hyperlinked pages, filled with all of the world's knowledge. For years, the terms "internet" and "World Wide Web" were almost interchangeable.
30 years on... It has issues.
If VR is going to truly take off, we’re going to need a virtual sofa to sit on.
Fifteen years ago, I wrote a blog post titled “Losing a Virtual Limb”, trying to articulate the funny feeling I was getting that buying my first iPhone was going to change everything.
Recently, I got that funny feeling again.
I’ve been using the same iPhone for six years - by far the longest I’ve kept a phone, smashing my previous record.
It's finally time to upgrade.
Charger plug standards are a weird thing to get excited about, but I’m excited about the proliferation of USB-C.
Except for one thing…
Actually, the best programming language of the future is probably going to be English…
General observations on trying out a new social networking site…
Some Featured Posts:
Actually, the best programming language of the future is probably going to be English…
This is the tech war of the moment: a race to be the first to develop an AI/Machine Learning/Deep Learning product that will be a commercial success. Google have a head start; Microsoft+OpenAI look like they could catch up, and maybe even overtake them. But if this is a race, then where is the finish line? What is the ultimate goal? Is it all about the $175 billion Search advertising market - or is it bigger than that?
After a long time of trying to come up with a simple answer to the simple question of “what is television?”, I decided to go the long way around.
One reason the Metaverse is doomed is the idea that it will straddle all of the different computing platforms; too many conflicting business interests will make this impossible to execute.
One reason the Metaverse will succeed is the idea that when something, sooner or later, straddles all of the computing platforms, it will deliver something incredibly useful.
Two narratives, one story:
An AI developed by Google has achieved sentience - the ability to have its own thoughts and feelings - and a Google engineer working with it has been fired for making the story public.
A Google engineer thinks a 'chatbot' AI should be treated like a human, because he believes that it has developed the ability to have and express its own thoughts and feelings. After Google looked into and dismissed his claims, the engineer went public with them, and was then placed on paid administrative leave and subsequently fired.
The subject of the first story is artificial intelligence - with a juicy ethical human subplot about a whistleblower getting (unfairly?) punished.
The subject of the second story (which is a little more nuanced) is a human engineer going rogue, with an interesting subplot about ethics around artificial intelligence.
I think most of the reporting has been around the first version of the story - and I think that's because it fits into a broader ongoing narrative: the idea that 'our' machines are getting smarter, moving towards a point where they are so smart that humans can be replaced.
It's a narrative that stretches back for centuries - at least as far back as the industrial revolution.
If “moving seamlessly between virtual spaces” is a key feature of the metaverse, how might that actually work with virtual identities on a decentralised platform? (And why would Mark Zuckerberg, who holds a bigger centralised database of virtual identities than anyone, want that to happen?)
I mentioned in my last post that I had been working on a report & series of podcasts around the topic of the metaverse - well, I'm proud that I can now share the first episode of the podcast series, which just so happens to be the one in which I'm the guest/interviewee.
More Posts:
There's an archive of all my posts here, my posts about post-pandemic work/life, things about the advertising/media industry, and other stuff. Maybe.
(Or at least, there will be...)