I believe most analysts, including those who monitor Apple’s every move, are seriously underestimating the ramifications of Apple baking Shazam’s music-identification service into iOS 8. This is not merely about increasing song downloads. Rather, this move marks Apple’s determined leap to re-position the iPhone in our lives. The digital hub metaphor is now much too limiting. As the physical and digital worlds mix, merge and mash together to create entirely new forms of interaction and new modes of awareness, the iPhone will become our nerve center. It will guide us, direct us, watch, listen and even feel on our behalf.
A bold statement, I know, especially given the prosaic nature of the rumor. Let’s start then with the original Bloomberg report:
(Apple) is planning to unveil a song discovery feature in an update of its iOS mobile software that will let users identify a song and its artist using an iPhone or iPad.
Apple is working with Shazam Entertainment Ltd., whose technology can quickly spot what’s playing by collecting sound from a phone’s microphone and matching it against a song database.
Song discovery? Ho hum. Only, look beyond the immediate and there’s potential for so much more. It is no coincidence that late last year Shazam updated its iPhone app to support an always-on, always-listening ‘Auto Shazam’ feature. Our phones are becoming increasingly aware of their surroundings. I expect Apple to leverage this technological confluence for our mutual benefit.
Today, Song Discovery.
Apple’s move no doubt satisfies a near-term need. While Shazam’s iPhone app has been around since 2008, and the company claims 90 million monthly users across all platforms, having its service baked into the iPhone will almost certainly spur increased sales. Song downloads have slowed, not just with iTunes, the world’s largest seller of music, but across the industry.
Instead of having to download the Shazam app, iPhone users will now simply point their device near a sound source and summon Siri: “what song is playing?” So notified, they can then buy it instantly from iTunes.
Little surprise that music industry site MusicWeek was generally positive about the news. Little surprise, also, that the tech industry could not muster much excitement. Thus, The Verge essentially just summarized Bloomberg’s report.
Daring Fireball’s John Gruber offered little more than “sounds like a great feature.”
Windows Phone Central readers offered only gentle mocking, reminding all who would listen that this feature is already embedded in Windows Phone.
That’s about it. Scarcely even a mention that Shazam has a similar, if less developed, TV show identification feature, one that could also prove a boon for iTunes video sales.
Place me at the other end of the spectrum. I think the rumored Shazam integration is a big deal and not because I care about the vagaries of the music business. This is not about yet another mental task the iPhone makes easier. Rather, this move reveals Apple’s intent to enable our iPhones to sense — to hear, see and inform, even as our eyes, ears and awareness are overwhelmed or focused elsewhere.
Tomorrow, Super Awareness.
Our smartphones are always on, always connected to the web, always aware of our location (via GPS) and, with minimal hardware tweaks, can always be listening, via the mic, and even always be watching, via the cameras.
What sights, sounds, people, toxins, movements, advertisements, songs, strange or helpful faces, and countless other opportunities and interactions, some heretofore impossible to assess or even act upon, are we exposed to every moment of every day? We cannot possibly know this, but our smartphones can, or soon will. I believe this Shazam integration points the way.
It’s not just about hearing a song and wanting to know the artist. It’s about picking up every sound, including those beyond human earshot, and informing us if any of them matter. Now apply this same principle to every image and face we see but do not consciously process.
Our smartphone’s mic, cameras, GPS and various sensors can record the near-infinite amount of real and virtual data we receive every moment of every day. Couple that with the fact that our smartphone’s ‘desktop-class’ processors will be able to toss out the overwhelming cruft, determine what’s actually important, and notify us in real time of whatever should demand our attention. That is huge.
Going forward, the iPhone becomes not simply more important than our PC, for example, but vital for the successful optimization of our daily life. This is not evolution, but revolution.
The Age Of iPhone Awareness
Yes, it’s fun to have Siri magically tell us the name of a song. Only, this singular action portends so much more. At the risk of annoying Android and Windows Phone users, Apple’s move sanctions and accelerates the birth of an entirely new class of services and applications which I call ambient apps.
Ambient apps hear, see and record all the ‘noise’ surrounding us, instantly combine this with our location, time, history, preferences — then run this data against global data stores — to inform us of what is relevant. What is that bird flying overhead? Where is that bus headed? What is making that noise? Who is the person approaching me from behind? Is there anything here I might like?
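That pipeline can be sketched in a few lines. A minimal, purely illustrative example, assuming a hypothetical fingerprint-to-label store and a hypothetical urgency rule (none of these names come from any real product): match a captured sound against a reference database, attach location and time, and decide whether the hit deserves an interruption.

```python
# Illustrative sketch of an "ambient app" pipeline. All names and data
# here are hypothetical: match a sound fingerprint against a reference
# store, combine the hit with location and time, decide whether to notify.

from dataclasses import dataclass
from datetime import datetime
from typing import Optional

# Hypothetical global data store: fingerprint -> human-readable label.
SOUND_DB = {
    "fp:sparrow-song": "House sparrow",
    "fp:bus-41-brakes": "Bus 41 (downtown route)",
    "fp:smoke-alarm": "Smoke alarm",
}

# Only some identifications should interrupt the user; the rest are logged.
URGENT = {"Smoke alarm"}

@dataclass
class AmbientEvent:
    fingerprint: str
    location: str
    timestamp: datetime

def identify(event: AmbientEvent) -> Optional[str]:
    """Match the captured fingerprint against the reference store."""
    return SOUND_DB.get(event.fingerprint)

def should_notify(label: Optional[str]) -> bool:
    """Interrupt the user only for urgent matches."""
    return label in URGENT

event = AmbientEvent("fp:smoke-alarm", "home/kitchen", datetime.now())
label = identify(event)
print(label, should_notify(label))  # -> Smoke alarm True
```

The interesting design question is the last step: identification is cheap, but deciding what is relevant enough to surface, against our history and preferences, is where the real work lies.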
Your smartphone’s mic, GPS, camera, sensors and connectivity to the web need never sleep. Set them to pick up, record, analyze, isolate and act upon every sound you hear, every sight you see.
This has long been the dream of some, though until now it was impossible: battery life, connectivity, on-board processing and data access were all too limited. No longer.
Let’s start with a simple example.
Why ask Siri “what song is this”? Why not simply say, for example, “Siri, listen for every song I hear (whether at the grocery store, in the car, at Starbucks, etc.). At the end of the day, provide an iTunes link to every song. I’ll decide which ones I want to purchase. Thank you, Siri.”
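The end-of-day report described above is trivial to express. A minimal sketch, with a hypothetical store URL and the identification step assumed to have already happened: collect every song heard over the day, de-duplicate in first-heard order, and emit one purchase link each.

```python
# Sketch of the day-long song log described above. The link format is
# hypothetical; identification itself is assumed done upstream.

def daily_song_report(identified_songs):
    """De-duplicate a day's identifications, preserving first-heard
    order, and map each song to a (hypothetical) store search link."""
    seen = set()
    report = []
    for song in identified_songs:
        if song not in seen:
            seen.add(song)
            report.append("https://store.example/search?q=" + song.replace(" ", "+"))
    return report

# Songs heard at the grocery store, in the car, at the coffee shop...
heard = ["Happy", "Royals", "Happy", "Get Lucky"]
for link in daily_song_report(heard):
    print(link)
```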
Utterly doable right now. Except, why limit this service to music?
For example, perhaps your smartphone can detect and take action based upon the fact that, unbeknownst to you, the sound of footsteps behind you is getting closer. It can sense, record and act upon the fact that you walk faster each time you hear a particular song, or that you slowed down when passing a particular restaurant. What do you want it to do based upon its “awareness” of your own actions — actions of which you were not consciously aware?
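The walking-pace pattern above is the kind of correlation such a device could compute. A toy sketch with simulated data (a real system would fuse accelerometer, mic and GPS streams): compare average pace while a given song plays against average pace otherwise.

```python
# Toy sketch of the pattern imagined above, on simulated data:
# does walking pace change when a particular song is playing?

from statistics import mean

# (steps_per_minute, song_playing) samples; None means no song detected.
samples = [
    (112, "That Song"), (118, "That Song"), (121, "That Song"),
    (96, None), (101, None), (99, None),
]

def pace_delta(samples, song):
    """Average pace while `song` plays minus average pace otherwise."""
    with_song = [pace for pace, s in samples if s == song]
    without = [pace for pace, s in samples if s != song]
    return mean(with_song) - mean(without)

delta = pace_delta(samples, "That Song")
if delta > 10:
    print(f"You walk about {delta:.0f} steps/min faster when 'That Song' plays.")
```

Whether the phone should then do anything with that finding is exactly the question the paragraph above leaves open.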
Our smartphone can hear and see. It is always with us. It makes sense then to allow it to optimize and prioritize our responses to the real and virtual people and things we interact with every day, even those outside our conscious involvement.
Ambient Apps Are The New Magic
The utility of our smartphone’s responses will only get better. Smartphones sense by having ears (mic), eyes (cameras), by knowing our exact location (GPS) and by being connected to the internet. These continue to improve. It is smartphone sensors, however, that parallel our many nerve endings, feeling and collecting all manner of data and notifying us when an appropriate action should be taken.
Though smartphones are still a relatively young technology, they have added a wealth of new sensors with each iteration. These sensors should radically supplement the recording, tracking and ambient ‘awareness’ of our smartphones, and thus further optimize our interactions, both online and offline.
Jan Dawson posted this Qualcomm chart which illustrates the amazing breadth of sensors added to the Samsung Galaxy line over just the past five years. What becomes standard five years from now?
Hear, see, sense. The smartphone’s combination of hardware, sensors, cloud connectivity, location awareness and Shazam-like algorithms will increasingly be used to uncover the most meaningful bits of our lives then help us act upon them, as needed. This is not serendipity, this is design. I think Apple is pointing the way.