Apple’s Hardware Stars at a Software Developer Conference

Apple has not traditionally talked much hardware at WWDC, but when you think about it, doing so makes perfect sense. Any and all hardware advancements are foundations for developers to build upon. Developers will be the first to tell you they never want hardware innovation to stop, because it is what gets them excited and empowers them to create the software of the future. Advancements in CPUs, GPUs, image sensors, etc., all let developers do more than they could with previous hardware.

Understanding the relationship between hardware innovation and software/developer innovation is why it actually makes sense to show your developer community the powerful new hardware you have coming and the toolkits you are giving them to utilize that hardware. That being said, there were a few important software updates, really core technology updates, that got the crowd excited. I'll discuss those first, then talk about the hardware, which was the star of the show.

Software and Core Technologies Laying a New Foundation for the Future
Much of the commentary was that the software updates, the front-facing things, were incremental improvements rather than big leaps forward for macOS. iOS on the iPad was a different story, and we will get there shortly. Even from the name High Sierra, we can gather this update is really an improvement on the last one rather than something entirely new or a big step forward. Next year will probably be the year for macOS to take a leap. But at the foundational level, many new kits and developer tools were released that give us hints of where Apple is going. Kits for machine learning and augmented reality were two of the big core technologies coming to either macOS or iOS 11, or in some cases both.

Apple’s operating systems are acquiring the foundation, in their core architectures, to start to learn and adapt to their users. Even today, much of Apple’s machine learning is hidden in plain sight: my experience with predictive text, Siri app suggestions, Maps, etc., is not like yours, because the software has learned our individual habits and traits and started to customize its interface to our unique needs. This is only going to deepen as Apple’s operating systems become learning, more dynamic systems providing unique experiences to their users. In many analyses in the past, I’ve used the term anticipation engine to articulate how a piece of software can become more user aware and more contextually aware. At its core, Apple is turning macOS and iOS into an anticipation engine.
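For developers, the most concrete expression of this at WWDC was Core ML, the on-device machine learning framework coming with iOS 11 and High Sierra. As a rough illustration, here is a minimal sketch, not Apple's sample code, of what running a pre-trained image classifier on device looks like with Core ML plus Vision; the MobileNet name is just a placeholder for whatever .mlmodel file a developer drops into Xcode (Xcode generates a Swift class with that name).

```swift
import UIKit
import CoreML
import Vision

// Minimal sketch: classify a UIImage on device with Core ML + Vision.
// "MobileNet" is a placeholder for any .mlmodel added to the Xcode project,
// which generates a Swift class of the same name.
func classify(_ image: UIImage) {
    guard let cgImage = image.cgImage,
          let visionModel = try? VNCoreMLModel(for: MobileNet().model) else { return }

    let request = VNCoreMLRequest(model: visionModel) { request, _ in
        // Results come back sorted by confidence; take the top label.
        guard let top = (request.results as? [VNClassificationObservation])?.first else { return }
        print("Saw \(top.identifier) with confidence \(top.confidence)")
    }

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```

The point of the sketch is how little ceremony is involved: the model runs locally, so the kind of on-device learning and anticipation described above does not require a round trip to a server.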

ARKit is a really big deal, and it was obvious. Most folks have never had a good AR demo, so it was interesting to see the reactions of people experiencing incredible AR demos on iPad and iPhone in the hands-on areas. Developers can download the kit and the examples and start learning how to use them. Apple’s statement that, from day one, it will have the largest AR development platform in the world is something to seriously consider. While one could argue Facebook has more reach, we know Apple has more engaged users and developers, and that single factor trumps reach and gives Apple the advantage. Just looking at the devices it supports, it is likely that by the end of the year over 80% of Apple’s iOS installed base will be augmented reality customers. That equates to a market opportunity of around 600m people by the end of the year. Developers will be quite pleased with that.
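To give a sense of how approachable the kit is, the following is a hedged sketch of what starting an ARKit session looks like in the shipping iOS 11 API: a world-tracking configuration, horizontal plane detection, and a delegate callback when ARKit finds a surface. The class and outlet names are illustrative, not from Apple's examples.

```swift
import UIKit
import SceneKit
import ARKit

// Minimal sketch of starting an ARKit session in iOS 11.
// Assumes the view controller owns an ARSCNView wired up in a storyboard.
class ARViewController: UIViewController, ARSCNViewDelegate {
    @IBOutlet var sceneView: ARSCNView!

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        sceneView.delegate = self

        // World tracking fuses camera frames with motion data and can
        // detect horizontal planes like tables and floors.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal
        sceneView.session.run(configuration)
    }

    // Called when ARKit adds a new anchor, e.g. a detected plane,
    // giving the app a place to attach virtual content.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let plane = anchor as? ARPlaneAnchor else { return }
        print("Found a plane of extent \(plane.extent)")
    }
}
```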

The App Store redesign is also worth mentioning. Apple stated 500 million people visit the App Store every week. The design is being updated for better discoverability but, most interestingly, the addition of a Today tab could turn that weekly number into a daily number. If that happens, the app opportunity for developers goes up dramatically. This one change could end up being one of the biggest changes to the App Store for both consumers and developers.

The Hardware that Stood Out
In what is turning out to be a somewhat more transparent Apple when it comes to roadmap (a welcome pivot in the new world), the company offered a sneak peek at a few products. First is the new iMac Pro, a workstation beast that goes up to 18 Intel Xeon cores and carries AMD’s latest and greatest Vega discrete graphics architecture. The sleeper addition here, and for all Macs in general, is the new support for eGPU (external GPU) solutions. In recent years, Thunderbolt has gained the bandwidth to reliably push massive amounts of data over a cable, meaning we can attach modern GPUs externally and make our notebooks, or all-in-one devices like an iMac, expandable with the most important chip in today’s world, the GPU.
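As a small illustration of what eGPU support means in practice, here is a minimal sketch, assuming a macOS High Sierra target, of how a Mac app can enumerate every GPU Metal exposes; my understanding is that an external GPU shows up as a removable device, so treat the exact flag as an assumption rather than a documented guarantee here.

```swift
import Metal

// Minimal sketch (macOS): list every GPU Metal can see, including eGPUs.
// On High Sierra, an external GPU is expected to appear as a removable device.
let devices = MTLCopyAllDevices()
for device in devices {
    let kind: String
    if device.isRemovable {
        kind = "external (eGPU)"
    } else if device.isLowPower {
        kind = "integrated"
    } else {
        kind = "discrete"
    }
    print("\(device.name): \(kind)")
}
```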

Let’s also be clear: the iMac Pro is not the Mac Pro Apple said they are working on. We are not sure what the timeline for that is, but we do know they are not the same. This did, however, help ease the concerns of the pro community that Apple was letting them down. I think they can rest assured Apple is as committed to the Mac as ever, and honestly, the eGPU support is a really big deal that I’ll dive into at some point.

Apple made some important new performance and screen-size updates to the iPad. Going from 9.7 to 10.5 inches on iPad Pro yielded more screen-size value than I initially thought once I played with it; the roughly 20% more screen actually feels like more than that, and the full-size keyboard support single-handedly moves it up as a productivity device. Between the screen and the CPU/GPU performance updates, the new iPads truly enter a class of productivity they were not in before. But it really is the software that will finalize this claim.

From what I saw, the specific updates to iOS 11 for iPad look very promising. Features like organizing multiple workspaces (similar to Spaces on macOS), drag and drop for multitasking, screenshot editing, and screen recording are all things once reserved for “computers.” Now the list of things you can’t do on an iPad that you can on a Mac or Windows PC has dwindled to, perhaps, just using a mouse.

The really impressive stuff to me was how the touch-based workflows have improved. If Apple had shown us this OS when the iPad first came out, people would have laughed, maybe cried, and there is no way the product would have done as well as it did or gotten the opportunity to grow up into the mature adult it is today. The file system demos were truly unique in that they were not just a duplicate of what you get on a PC. They were a genuine rethink of the workflows around file management in a modern, mobile, touch-based world.

Lastly, HomePod. As I wrote in my case for a Siri speaker last week, the whole-room audio use case is the strongest. Most consumers simply don’t have this, and we know it is a highly desired experience they have tried to fill with low-cost Bluetooth speakers that sound OK but not great. Sonos is a great solution, and they have always been the company with a target on their back in my opinion. I have friends with a Sonos setup and it really is amazing, but it is one Apple can do better, and it looks like that is exactly where they are going.

I had the opportunity to hear the sound quality of the HomePods, and they were truly amazing. It’s worth mentioning that over the years I have done projects with high-end speaker, audio codec, and audio technology companies and have had my fair share of demos in a controlled room, listening to full surround theater setups costing north of $5,000 and in some cases $10,000. I’m not joking when I say the sound quality out of these was not that far off. The spatial-awareness technology governing how sound is intelligently distributed is the true enabler here. The device was able to know which parts of the track were center audio (lyrics), accompaniment, and ambient elements like background vocals or percussion, and smartly distribute that sound to the right speaker and off the right wall in order to fill the room. This is going to be one of those things you need to hear for yourself, but I’m confident enough in my audio knowledge to stand by the claims I made.

Apple led the value proposition with Music, which I think is very smart since playing music is the top and most frequent use case for all smart speakers today. However, that does not mean HomePod is not going to be great at assistant features as well. Much of the commentary was that Apple did not lead with Siri because Siri is not great, but whether that point is true or not (and I may have quantitative evidence to dispute it), the reality is that leading with Music is what the market will be most attracted to today. Contrary to what many believe, average consumers are not buying Google Home or Amazon Echo because of their assistants; early adopters may be, but average consumers are not. Music is the lead the mass market will appreciate, and the rest is just icing on the cake.

That being said, Apple confirmed to us there is a lot they did not share about HomePod. Today was truly a sneak peek and more information on all it can do will come later in the year.

Overall, this WWDC set a new tone, and hopefully a new pattern, of talking about new hardware and new software tools together. Apple covered a lot of ground, leaving the fall just for the next iPhone. These two events together may be looked back on as defining moments for Apple in its transition to the next era of computing.

