Apple AirPods: More than just Headphones
Prior to their going on sale, we had quite a bit of information about the AirPods and what they were capable of doing. We knew they would pair easily and that built-in sensors could tell when you were wearing them and when you weren't. But some things have to be experienced to appreciate their magic, and the AirPods are one of them.
First, you will never see a more seamless pairing experience than the first time you pair the AirPods. Open the case, press Connect, and they are instantly paired with all of your iOS devices, including your iPad and Apple Watch. As soon as you put one AirPod in your ear, a subtle sound lets you know they are on and ready to be used.
Perhaps my favorite feature is that, when you take one AirPod out, the music automatically pauses. Put it back in and it resumes flawlessly. This is useful when someone is talking to you and you need an ear free to listen and respond. I have some context with this experience, having used the Plantronics BackBeat Pro 2, which offers a similar smart sensor that pauses your music when you take the headphones off. For whatever reason, I found taking one AirPod out much more convenient than lifting the entire headset off my head. Perhaps just preference, perhaps not. In either case, the seamlessness of this experience is fantastic.
Whenever you need to know the battery level of the AirPods or the charging case, simply open the case next to your iPhone and a battery-status screen instantly pops up on the phone. Apple is using some sort of close-proximity solution because, if you move the case even a foot away and open it, nothing happens on the phone.
I’ve been using Bluetooth headphones for years, so the awesomeness of wireless headphones was not new to me. But these were the first I’d used that are independently wireless, not connected to anything. With sports Bluetooth headphones, you notice and feel the wire on the back of your neck as you move. Similarly, with over-the-ear wireless headphones like the Bose QuietComfort or Beats Wireless, you feel the band that goes over the top of your head. The point is, they don’t disappear. I was surprised and delighted by how comfortable the AirPods are in my ears and how easily you forget they are there. Interestingly, I feel the same way about my Apple Watch. It seems the theme with both of Apple’s wearable computers (and yes, I consider the AirPods to be wearable computers) is comfort to the degree of making them feel as though they disappear. This may be ear-shape dependent, so my statement may not be true of everyone, but it is for me.
Many others who have tried them have commented on how well they stay in your ears. I found this to be true. I used them while doing light exercise like yoga and even some living-room cardio (via the Apple TV app Zova), and they stayed in perfectly. The lack of a cable makes a real difference in helping them stay put. I took it one step further and played a singles tennis match with my playing partner. I’m sure Apple wouldn’t recommend them for an intense run or similar activity, but I figured I’d try it. I’ve tried every form of sport Bluetooth headphones and, because of the wire behind my neck and some of the violent movements of tennis, they all fall out regularly. Here again, not having wires attached made all the difference in the world. Maybe the AirPod shape fits my ears like a glove, but they didn’t fall out once during my match. In case it matters, I’m a fairly high-level (by USTA ranking) tennis player, so I go at it pretty hard.
When I was tweeting my thoughts about AirPods, I got resistance from some saying, “Aren’t they just wireless headphones?” Apple’s AirPods are “just” wireless headphones about as much as the Apple Watch is “just” a watch and the iPhone is “just” a phone. Nothing makes this more apparent than the Siri experience.
Siri in Your Ear
It is remarkable how much better Apple’s Siri experience is with AirPods. In part, this is because the microphones are much closer to your mouth and, therefore, Siri can hear and understand you more clearly. I’m not sure how many people realize how many Siri failures have to do with the distance between you and your iPhone or iPad, as well as ambient background noise and the device’s ability to hear you clearly. Thanks to the beamforming mics and some bone-conduction technology, Siri with the AirPods is about as accurate a Siri experience as I’ve had. In fact, in the five days I’ve been using the AirPods extensively, I have yet to have Siri fail to understand a request. Going further, the noise canceling built into the AirPods is impressive as well. I’ve intentionally created noisy environments to test how the AirPods and Siri handle loud situations. Perhaps the most intense was when I turned my home theater system to nearly peak volume, blasted Metallica, and activated Siri. Remarkably, it caught every word and processed my request.
Furthermore, having Siri right in your ear, available with just a double tap on the side of either AirPod, profoundly changes the experience. In many ways, the AirPods deliver on the voice-first interface in ways that have impressed me about Amazon’s Alexa.
There is something to not having to look at a screen to interact with a computer, especially in a totally hands-free fashion. The AirPods create an experience that feels as though Siri has been set free from the iPhone. This enhanced the experience but also exposed some holes I hope Apple addresses.
Voice-First vs. Voice-Only Interfaces
There is, however, an important distinction to be made where I believe the Amazon Echo shows us a bit more of the voice-only interface, and where I’d like to see Apple take Siri when it is embedded in devices without a screen, like the AirPods. The more you use Siri with the AirPods, the more quickly you realize how much the experience today assumes you have a screen in front of you. For example, if I use the AirPods to activate Siri and say, “What’s the latest news?”, Siri will fetch the news and then say, “Here is some news — take a look.” The experience assumes I want to use my screen (or at least that I have a screen near me) to read the news. The Amazon Echo and Google Home, by contrast, just start reading the latest headlines and tidbits. Similarly, when I activate Siri on the AirPods and say, “Play Christmas music,” the query processes and the music plays. With the Echo, the same request prompts Alexa to say, “OK, playing Christmas music from top 50 Christmas songs.” When you aren’t looking at a screen, that feedback is important. If I make the same request while looking at my iPhone, I can see that, as Siri processes it, “OK” appears on the screen but is not spoken in my ear. In voice-only interfaces, we need and want feedback that the request is happening or has been acknowledged.
Again, having Siri in your ear for a relatively hands-free and screen-free experience breaks down when you ask Siri something that requires unlocking your phone. For example, one of my most common Siri actions is locating a family member, particularly my daughter, who takes a bus home from school with a variable drop-off time due to traffic or student tardiness. Nearly every day I ask Siri to locate her. But when I do so via the AirPods and my phone has been idle long enough to lock, Siri says I need to unlock my iPhone first. I hit this wall because of Apple’s security protocols, which I appreciate greatly. I wonder if, in the future, we could have a biosensor in the AirPods that authenticates me and thus grants security clearance to process a sensitive request, like reading email or checking on a family member, without having to unlock the phone first.
There were other cases where Siri assumed I could look at my iPhone to complete the request. There are certainly plenty of queries where Siri works in a voice-only experience — asking Siri to read your new emails, set timers or appointments, ask what time a sports game is, and so on — but the sweet spot will be when you can use Siri thoroughly without needing any screen for the full experience. I’m confident Apple will increasingly move in this direction.
Evolving the Siri experience from voice-first to voice-only will be an important exercise. I strongly believe that, when voice exists on a computer with a screen, it will never be the primary input for interacting with that screen. Take the screen away and things start to get really interesting. This is when new behaviors and new interactions with computers take place, and it’s what happens when you start to integrate the Amazon Echo or Google Home into your life, as both are voice-first experiences.
There is a great deal to like about the AirPods. Those who buy and use them will be pleasantly surprised and delighted by their performance as wireless headphones and impressed with the upside of Siri in your ear. I consider the AirPods an important new product in Apple’s lineup, in the same category as the Apple Watch in terms of importance for the future. Here is a significant observation about both the Apple Watch and the AirPods worth pointing out. Apple has a tendency to push engineering limits in order to perfect a technique it believes is important for the future, or to learn from it and integrate it into other products. While iPads and iPhones are getting larger, the Apple Watch and AirPods are pushing the limits of miniaturization, something that is key when we start thinking about future wearables, where companies will pack tremendous amounts of technology into extremely small objects. The exercise of packing sensors, microprocessors, batteries, and more into extremely small objects, and manufacturing them at scale, is an incredibly important skill set to develop. Both the Apple Watch and the AirPods are key engineering milestones to build on for where I believe Apple is headed.