For years, Apple has made a name for itself through the design of its products – their combination of appearance, materials, and software functionality (which is part of the “design”, aka “how it works”). It has been able to command premium prices for desktops, laptops, phones, tablets, even routers by making things that not only work well, but look good.
What happens to that advantage and ability to command a premium, though, when there isn’t a product to hold? What happens if you don’t have a phone to pull out, a tablet to press, or a router to put in the corner of your room?
This thought struck me while listening to John Gruber and Ben Thompson discussing Amazon’s Echo, which we could roughly call a home automation device, and considering Google Home, which is going to be approximately the same thing. Both de-emphasise the physical product (there isn’t even a screen) in favour of an unobtrusive always-listening device which doesn’t need to be pressed or waved at; it just responds when spoken to.
It’s not hard to imagine that future versions of Google Home or the Amazon Echo will have less and less physical hardware; essentially, they only need to power a microphone, a speaker and an internet connection. In which case, what would Apple’s version look like? It might look like – might even be – the Apple TV. It’s nice, but many people would struggle to pick it out of a lineup. And once it’s underneath or behind your TV, you could forget it’s there.
Razing the playing field
But when you reach that point, the ground on which Apple used to fight – appearance, materials, “look and feel” – has suddenly vanished. The shift to systems which don’t need us to look at them directly and which feed information back to us by means other than an integrated item with a screen, doesn’t so much move the goalposts as set fire to them and terraform the field where they were standing.
In the same vein, I was asked a few years ago – when Siri had newly been announced, but Samsung was already making inroads into the premium market with the Galaxy Note – what I thought the phone of the future would look like. I suggested you wouldn’t actually look at it much. It would probably be Galaxy Note-sized, but it would sit in your pocket and feed information to your headphones in response to questions you asked into the mic on the headset. Less need for the screen, less need for typing on the physical object.
Our obsession with photographs and cameras has forestalled that shift; Instagram and Snapchat demonstrate that, when it comes to social interaction, we love the visual. That suggests screens and devices – in other words, things we actually hold and carry around – remain important.
Even so, the invisible device does seem to me the biggest risk Apple faces. The advantage it has is that Amazon’s Echo and Google Home are devices for, well, the home and, although we might talk a lot about it, the extent of our desire to have computing interaction with our home is surprisingly limited. Jan Dawson made this point well recently. The current “smart home” market is composed of early adopters because your fridge can’t ever be that smart, and you’ll still have to load and unload your washing and put coffee into the coffee machine.
What’s more, most of us spend most of our time outside the home. And that’s where we need our devices. So far, we haven’t quite taken up the idea of chatting away to our headsets in the manner of Joaquin Phoenix in Her. But bear in mind that social norms can shift; our parents’ generation would have been (and still are) appalled by the way teenagers today will ignore each other across a table, or their elders, in favour of the glowing screen. And 30 years ago, walking along the street talking aloud to nobody was a sign of insanity. Now it just means you’re on a call. (Spot the difference: if you’re wearing earphones, nobody will turn a hair.) If intelligent assistants really take off, devices might shrink away too. Though every time I follow that reasoning, I arrive back at the need to digest visual information. We’ll still need screens, and hence housings for screens, and hence design.
These answers brought to you by…
That this potentially poses a threat to Apple doesn’t mean everyone else is safe. Amazon’s pretty safe; if people order things via the Echo, it benefits. But Google relies on people looking at ads for 90% of its revenues, and rather more of its profits. If we don’t look at a screen, how do we get the ads? Perhaps it will adopt the solution chosen in the UK by the “Speaking Clock”, a phone service you called to have the precise time read out to you. In 1986, the newly privatised British Telecom put it out to sponsorship, which was eagerly snapped up by Accurist – and so for 22 years, you would be told, “the time sponsored by Accurist is…”
Maybe that’s how Google will adapt if voice is the new interface. Equally, maybe that will open the door for companies like Apple to charge extra so we don’t hear the ads. The invisible device might still yield a premium. It’s just a question of what you’re paying for.