Earlier this week, Bloomberg reported Apple is evaluating moving into digital glasses. According to “people familiar with the project who did not want to be identified” (when do they ever want to be?), the device would connect wirelessly to iPhones to show images and other information in the wearer’s field of vision. If there is a product and if Apple decides to actually bring it to market, it won’t happen before 2018.
It was not the news itself that made me think about this but rather the different comments I saw pop up on social media and in press commentary. They quickly pointed to a couple of interesting underlying misconceptions that I thought were worth fleshing out.
Google Glass is not the Benchmark for Smart Glasses, Let’s Move On!
When we started talking about wearables, the list of devices was pretty long: bands, watches, pendants, straps, helmets, smart-fabrics, adhesive strips, cameras, and glasses. The initial vendor excitement was met with limited interest by consumers and, as vendors were trying to figure out what worked, we saw the focus centering more and more around the wrist. Google Glass very much helped that process of elimination.
But Google Glass’ flop does not mean there is no role for smart glasses. The key difference between success and failure is focus. Google Glass was not only early to market but was also trying to be too many things at once, leaving users confused about its reason to exist. Google thought of Glass as a wearable in the same way we now think of a smartwatch – something we have on all the time. Yet, rather than focusing on a few specific tasks, Google Glass attempted to replicate many of the tasks our smartphones were performing. Glass was a camera, a search engine, an assistant, and one of the initial voice-first devices.
As people commented on the Apple rumor this week, many were quick to remind us of what Tim Cook said about Glass back in 2013 in an interview at D11:
“I wear glasses because I have to. I don’t know a lot of people that wear them that don’t have to. They want them to be light and unobtrusive and reflect their fashion. … I think from a mainstream point of view [glasses as wearable computing devices] are difficult to see. I think the wrist is interesting. The wrist is natural.”
As someone who has been wearing glasses since the age of three, I can certainly relate to what Cook meant. Glasses are not something you want to wear every hour of your day. Yet, I have no problem wearing glasses built for specific tasks, like sunglasses or swimming goggles. The difference is shorter periods of time, focused tasks, and a high return from the experience. This is what I think Apple would have in mind if it moves into this space. Plus, of course, a design that would appeal to the mass market and would not scream “tech”.
Snap – formerly known as Snapchat – has taken part of the idea behind Google Glass and made it commercially appealing to millennials. Spectacles have a funky design without being obnoxious, they are affordable, and the task they perform is perfect for the device. Snapping videos unobtrusively, so as to capture the true moment, and doing it fast – all you need to do is look toward the action – is very simple and a perfect fit for how Snapchat is used.
I Say Wearables, You Say Smartwatch
While Spectacles are worn, I would not put them in the wearables category, any more than I would put a GoPro there. When Cook said, “The wrist is natural”, he was looking at the Nike FuelBand he was wearing. So it would be safe to assume that, while he voiced his concerns about convincing people to wear something on their wrist, his focus was more on delivering a device that could be with us all the time, so that it would learn from us and increase its value to us over time.
When we talk about wearables today, we really mainly talk about fitness bands and smartwatches, and I see their role as very different from Google Glass. Their focus is to capture data as much as to display data – to be transmitters more than receivers. Think about all the sensors these devices have that help capture information, which is then processed and used in different ways. Today the best showcase is fitness but, with time, the use cases will increase. They are certainly not portrayed as all-powerful computing devices. They are companion devices, especially for Apple, that might alleviate some of the load our smartphones have been carrying for so long. Interestingly, this view was not initially shared by the Android Wear team, who seemed to pitch wearables in a very similar way to Google Glass: a do-it-all approach meant to replace most, if not all, of what your phone does.

The longer and more consistently you wear these devices, the greater the benefit. This means they have to be extremely comfortable, somewhat fashionable and, if failing on the fashion part, they should almost disappear. While they might take over from your phone at times, they are not designed to be your main computing device for long periods. The wrist is the ideal location both for collecting heartbeats and for allowing a quick peek at short, timely information.
AR and VR goggles are very different. Like Spectacles, devices such as Oculus Rift, Gear VR, Google Daydream, and Microsoft HoloLens are worn, but their main function is to display content. For that reason, proximity to your line of vision is critical. Consider the main difference between VR and AR/mixed reality: the former is about placing you in a fully immersive world different from the one you are actually in, while the latter is about enhancing your current world. You can see how different kinds of glasses or goggles will be required. Given Tim Cook’s public position that AR is more interesting than VR, I can see how some of his comments about designing something people want to wear still apply.
More than the design, however, I would expect Apple to prioritize the experience with regard to safety and privacy. This might mean the use cases, at least initially, are limited by experience or location – much like a car DVD player that will not play on the dashboard screen while you are driving.
If it is true that the glasses will connect to your iPhone, it seems Apple is trying to avoid the battery issues Google Glass faced while, at the same time, showing it does not think slapping a phone in front of your face is the right thing to do, even when that phone is an iPhone.
The Wearable Market of the Future
Many make projections about what the wearable market will look like by 2025, and there are almost as many definitions of what is included as there are numbers thrown around. When it comes to deciding which devices should and should not be counted, the wearable market will be much more complex than the PC and smartphone markets ever were.
Whether or not Apple is really working on glasses will be confirmed in due course. But the fact it might be considering glasses does not negate what Cook said back in 2013. To be a wearable device in the strictest sense, you need to be able to wear it 24/7 or very close to it. Wearing a device for part of the day does not make it a wearable, any more than being able to move an all-in-one desktop from room to room makes it a mobile computer. I see wearable technology as the next phase of “connected anytime, anywhere”: the main task of the devices we will be wearing will be to feed into AI and big data more than to feed off them.