As a technology industry analyst, I have had the privilege of covering Apple since 1981. During this period, I have watched Apple introduce new products and change leadership many times, and I have delved deep into their product roadmaps and strategies. In the process, I have learned a great deal about Apple’s culture and how they think about advancing the world of personal computing.
From my years of tracking Apple, two consistent threads seem to drive Apple’s strategy:
- The first is embodied in the original Mac strategy but was defined even more clearly when Steve Jobs left Apple and started a new computing company called NeXT. In fact, the name of Jobs’ new company is the consistent thread that drives Apple’s overall view of the world. Apple’s core DNA has been to continuously deliver the “next” major advancement to the personal computing experience.
- The second thread is how they advance user interfaces with the computer. Their innovations in man-machine interfaces started with the Mac and then extended to the iPod, the iPhone, the iPad and, most recently, the Apple Watch.
This focus on advancing the computing experience started with the Mac, when Jobs and his team created a new computing design that included a mouse and a graphical user interface. In those days, a PC was battleship grey in most cases and consisted of a monitor, a CPU box and a keyboard. Microsoft’s MS-DOS was text-based, and Apple and Steve Jobs upset the PC applecart with an all-in-one design that paired a mouse with a GUI, introducing the world to a whole new approach to personal computing.
Apple extended the computing experience by adding a CD-ROM drive, launching the era of multimedia computing. When Jobs came back, he brought a new take on the all-in-one design with the candy-colored iMacs. With the iPhone and iPad, Apple introduced another new user interface for mobile devices: instead of a mouse, they used a finger and touch for navigation. Again, they advanced the man-machine interface with a different approach to the GUI and a new way to interact with these mobile devices.
As I look at Apple’s future role in driving a “next” computing experience, I am drawn to two essential technology developments that hint at where they could be headed.
The first is augmented reality, which Apple introduced at their developers conference in June 2017. I see ARKit and AR apps being at the center of Apple’s “next” big thing that advances the personal computing experience. At the moment, the focus is on using the iPhone and its screen as the delivery device for this AR experience.
The second is some form of goggles or glasses, which is where Apple can and will expand the “next” big thing in personal computing. Patently Apple has been uncovering Apple’s various patents for glasses or goggles over the last year, and a couple of these patent drawings can be found in this link.
A Google search for “Apple Glasses” also turns up other illustrations of the various designs described in some of the patent filings.
While these patent drawings are impressive, I think of them only as first-generation designs, a long way from what I believe Apple has to deliver: glasses that will be acceptable to their broad user base and allow them to extend their innovative leadership and create the next personal computing experience.
Creating the “next” way we interact with a computer is in Apple’s DNA. With each major new version of hardware and software, Apple created a new way to use a computer and then evolved the hardware and software over time. Just look at the Macs of 1984 and 1998 and compare them with the desktop Macs of today. The same goes for the original Mac laptops; they are nothing like the first models Apple brought to market in the mid-1980’s. Look at the original iPod and then look at the ones Apple brought to market six years after launch. Later designs were different, and the UI was much better than in the first two models.
The same goes for the original iPhone: compare it to an iPhone today, and it is hardly recognizable. At the same time, Apple evolves the UI and improves the software to match each new hardware design with each new release. Apple will follow the same pattern with AR: software and apps on the iPhone today, evolving to include glasses as the most efficient and practical way to deliver AR to the masses in the future.
While other companies are working on similar AR apps and even glasses, Apple’s historical track record suggests they could be the one that defines and refines this “next” computing experience for the masses via AR, the iPhone, and AR or mixed reality glasses. Apple’s iPod was not the first MP3 player on the market, but it became the best and dominated that market for over ten years. Apple did not bring out the first smartphone, but the iPhone redefined what a smartphone should be, and Apple is a dominant player that captures over 50% of all smartphone revenue. The iPad was not the first tablet, but it is still considered the best and brings in the most revenue of any tablet maker in the market today.
However, as Apple moves to deliver their “next” version of the man-machine interface with smart glasses, they will most likely deviate from the competition in one significant way. Most vendors I talk to believe all of the intelligence for smart glasses needs to be integrated into the headset or glasses themselves. The problem is that this makes the glasses bulky and heavy and makes a person look like a dork.
I believe Apple’s strategy will have the iPhone serve as the CPU and brains behind these glasses, feeding data and AR content to what I call “skinny” glasses. This would be Apple’s “next” significant way to drive the future of personal computing. By using a future iPhone designed and tuned to deliver rich AR content to a set of smart glasses wirelessly, Apple would allow the glasses to be light and, most likely, look like a regular pair of glasses. They will need special optics as well as a small battery, but all of the processing and content displayed through the glasses will come from an iPhone.
With this design, Apple introduces two crucial ways to enhance the man-machine interface: to use AR apps tied to these glasses, Apple will integrate voice and gestures into the user interface.
This approach benefits Apple in three ways.
- First, it allows them to create glasses that would be acceptable to a mass audience, who would resist any glasses or goggles that make them look odd or different. People are used to wearing glasses or being around people who wear glasses and don’t think twice about it. This alone becomes a critical design criterion for any set of AR glasses meant to be acceptable to the majority of users.
- Second, it makes the iPhone even more critical to the person who will use these glasses for AR. If all the processing and AR functionality is done on the iPhone and then wirelessly transmitted to the smart glasses, the iPhone becomes indispensable to the overall “next” computing experience Apple brings to the market.
- Third, this approach keeps the string of products tied to Apple’s DNA moving forward. Creating the next significant computing experience through hardware, software and new user interfaces like voice and gestures is what Apple does best. While others will surely copy Apple’s approach to AR, control of their hardware, software, applications, and services ecosystem puts them in a position not only to define and deliver this next-generation computing model but to dominate this segment if they do it right.
Apple is laying the groundwork now to deliver the “next” big thing in personal computing. While it may be a few years out, if you look closely at these glasses patents and understand that Apple’s real value lies in how they advance the man-machine computing experience, you can see that Apple is on pace to bring the “next” big thing in personal computing to a mass market in the near future.