Apple’s Future Is Ear

Apple’s Transition From Looking and Touching to Listening and Talking

Part 1: Looking Back


As you may well know, there’s an awful lot of angst concerning Apple’s removal of the headphone jack from their latest model iPhones.

Every new idea has something of the pain and peril of childbirth about it. ~ Samuel Butler

I won’t rehash it all other than to say that a lot of people — and I mean a LOT of people — disagree with Apple’s decision to remove the headphone jack from the recently released iPhone 7. And when I say a lot of people disagree with the removal of the headphone jack, I mean they VEHEMENTLY disagree.

[pullquote]Taking the headphone jack off the phones is user-hostile and stupid[/pullquote]

Taking the headphone jack off phones is user-hostile and stupid. ~ Nilay Patel

Wow. Strong words.

Don’t worry about people stealing an IDEA. If it’s original, you will have to ram it down their throats. ~ Howard Aiken

If you’re going to sin, sin against God, not the critics. God will forgive you, but the critics won’t. ~ paraphrasing Hyman Rickover

So, are the critics right? Is Apple doing their customers a disservice?

New ideas come into this world somewhat like falling meteors, with a flash and an explosion, and perhaps somebody’s castle-roof perforated. ~ Henry David Thoreau

Or is it we who are doing Apple a disservice?

Our dilemma is that we hate change and love it at the same time; what we really want is for things to remain the same but get better. ~ Sydney J. Harris

Looking Back

Before we try to answer the questions posed above, let’s first take a step back in order to gain a broader perspective.

Acquire new knowledge whilst thinking over the old, and you may become a teacher of others. ~ (probably not) Confucius

I believe that the past can teach us a lot about the future.

The further back you look, the further forward you can see. ~ Winston Churchill

So before we discuss the removal of the headphone jack and the viability of Apple’s new Bluetooth AirPods, let’s first take a look back on some computing history.

Smaller, Ever Smaller

Since the advent of the Apple II and the rise of the mass-market consumer PC, you hear “computer” and you think “monitor, mouse, keyboard,” in some variation. ~ Matt Weinberger, Business Insider


But that’s not the way it’s always been.

Computers have gone from Mainframes that took up an entire floor, to Minis that filled an entire office, to PCs that sat on desktops, to Notebooks that lay on our laps, to Smartphones that rested in pockets, to watches that wrapped around our wrists. I can’t be the only one who sees the pattern here. Every generation of computer has gotten smaller and smaller. And that trend is not going to stop. It’s not a question of “if” computers are going to get smaller, it’s only a question of “when.”

Well, let me correct myself. It’s not only a question of “when” computers will get smaller, it’s also a question of “how.” Making computers smaller is relatively easy. Making them smaller while maintaining their usefulness is not so easy and does, in fact, pose a significant challenge.

The Windows Mobile Mistake

We may not know what the next User Interface should be, but we know what it shouldn’t be. It shouldn’t be a smaller version of the current User Interface.

Remember how Microsoft tried – for years and years and years – to squeeze its desktop User Interface into tablets and phones? Nowadays, we look back and mock Microsoft for those early, lame attempts to create a modern phone interface. But that smug point of view is simply retroactive arrogance. We now know what Microsoft should have done, so we’re astonished that they didn’t employ the same 20/20 hindsight that we now do. But at the time — although almost none of us liked the tiny menus or the easy-to-lose styluses — no one had a better idea.

Not only that, most of us didn’t even know that a better idea was needed.

Unique User Interface

So a smaller version of the current User Interface provides a bad experience for the User. What then is the solution?

The solution, of course, is a brand new User Interface. It turns out that each successive generation of computer requires its very own unique User Interface — a User Interface specifically tailored to work with the new, smaller form factor.

Unfortunately, creating a brand new User Interface is easier said than done, in part because it’s extremely counterintuitive. In hindsight, all the best user interfaces look obvious. In foresight, those self-same user interfaces look like obvious failures.


Take, for example, the User Interface employed by the Macintosh. The User Interface of the Macintosh was soon to become the standard for desktop PCs, with many of its features still in use today. But at the time, Xerox — which created several of the building blocks for the soon-to-be Macintosh User Interface — didn’t know what they had.

When I went to Xerox PARC in 1979, I saw a very rudimentary graphical user interface. It wasn’t complete. It wasn’t quite right. But within 10 minutes, it was obvious that every computer in the world would work this way someday. ~ Steve Jobs

Yeah, Steve Jobs instantly saw the promise of what was to become the Macintosh User Interface…

…but most of us aren’t Steve Jobs. Like the engineers at Xerox, we don’t recognize the value of a new User Interface even when we’re looking right at it.

By the way, anyone who thinks that Steve Jobs and Apple “stole” the UI from Xerox needs to read this article and see this video.


Another example of not knowing what we had was the iPhone. I think most everyone would now agree that, with the introduction of the iPhone, Steve Jobs and Apple knocked the Smartphone User Interface out of the park. But that’s not how people saw it at the time.

Some thought the iPhone was an embarrassment to Apple:

Apple should pull the plug on the iPhone… What Apple risks here is its reputation as a hot company that can do no wrong. If it’s smart it will call the iPhone a ‘reference design’ and pass it to some suckers to build with someone else’s marketing budget. Then it can wash its hands of any marketplace failures… Otherwise I’d advise people to cover their eyes. You are not going to like what you’ll see. ~ John C. Dvorak, 28 March 2007

Internet commentators were no more impressed with the newly announced iPhone than were the press:

Im not impressed with the iPhone. As a PDA user and a Windows Mobile user, this thing has nothing on my phone..i dont see much potential. How the hell am I suppose to put appointments on the phone with no stylus or keyboard?!…No thanks Apple. Make a real PDA please….

lol last i checked many companies tried the tap to type and tap to dial … IT DOESNT WORK STEVIE, people dont like non-tactile typing, its a simple fact, this isnt a phone its a mac pda wow yippie….I mean it looks pretty but its not something i forsee being the next ipod for the phone industry…

im sorry but if im sending text messages i’d rather have my thumb keyboard than some weird finger tapping on a screen crap.

Touch screen buttons? BAD idea. This thing will never work.

Apparently none of you guys realize how bad of an idea a touch-screen is on a phone. I foresee some pretty obvious and pretty major problems here. I’ll be keeping my Samsung A707, thanks. It’s smaller, it’s got a protected screen, and it’s got proper buttons. And it’s got all the same features otherwise. (Oh, but it doesn’t run a bloatware OS that was never designed for a phone.) Color me massively disappointed.

And, of course, even years after the iPhone appeared on the scene, competitors continued to overlook its significance:

Not everyone can type on a piece of glass. Every laptop and virtually every other phone has a tactile keyboard. I think our design gives us an advantage. ~ Mike Lazaridis, Co-CEO, Research In Motion, 4 June 2008

So globally we still have the world running on 2G internet. Blackberry is perfectly optimized to thrive in that environment. That’s why the BlackBerry is becoming the number one smartphone in those markets. ~ Mike Lazaridis, Co-CEO, Research In Motion, 7 December 2010


The lesson here is fourfold:

First, new user interfaces are hard. Really, really hard.

Second, we often don’t realize a new user interface is even needed.

Third, each user interface is unique — radically different from the User Interface that preceded it.

People should think things out fresh and not just accept conventional terms and the conventional way of doing things. ~ Buckminster Fuller

Fourth, even when a new user interface is introduced, and even if it ends up being the perfect solution in the long run, in the short run it’s not met with cries of “Thank goodness you’ve arrived!” No, it’s met with scorn, derision and dogged resistance.

Why Bother?

Before we go any further, I guess we should ask ourselves: “Why do we even need smaller computers that require a new User Interface anyway? Smartphones are great, right?”

Well, yes and no.

Smartphones are wondrous supercomputers that we carry in our pockets and which can solve a multitude of problems. But for some tasks, Smartphones are far from ideal.

One problem with Smartphones is that they are demanding. They cry out for our attention. They buzz, they beep, they ring, they flash, they vibrate. They call to us, “now, Now, NOW! Pay attention to me now, damn it!”

The word ‘now’ is like a bomb through the window, and it ticks. ~ Arthur Miller

Another problem with Smartphones is that they are intrusive. To interact with a Smartphone, we must look at it. When our focus is on the Smartphone, our focus is off everything else.

This can be socially awkward when we hunch over our Smartphones and ignore those around us. It can be amusing as we watch those using Smartphones bump into walls or walk into water fountains. It can be deadly as we walk into traffic while staring at our Smartphones or foolishly attempt to text while driving.

Part 2: The Next User Interface

Just as we needed a brand new “touch” user interface in order to turn smaller Smartphone form factors into a usable computing device, we now need a brand new User Interface in order to turn even smaller computer form factors into usable computing devices.

— PCs used a mouse and a monitor;
— Notebooks used a trackpad and a monitor;
— Smartphones used a touch-sensitive screen and a monitor; and
— Watches are still a work in progress, but they currently use a variety of interfaces, like touch, 3D Touch, the Digital Crown, the Taptic Engine…and a monitor.

But this presents us with a new challenge. Ever since the Apple I was introduced in 1976, every User Interface has had one thing in common — a monitor. But usable screen sizes have gotten as small as they can get. How then do we make a computer both smaller AND more usable?

Google Glass

Google Glass was an early attempt at creating a User Interface suitable for a smaller computer form factor. It solved the screen size dilemma by resting the screen on one’s face like a pair of glasses. It used augmented reality to superimpose bits of helpful information over the world as viewed through a small camera lens. The vision was for the device to always be present, always be watching, always be listening, always be ready to assist with some digital task or to instantly recall some vital piece of information. People had very, very high hopes for Google Glass.

Google glasses may look and seem absurd now but (Brian) Sozzi says they are “a product that is going to set the stage for many other interesting products.” For the moment, at least, the same cannot be said of iPhones or iPads. ~ Jeff Macke, Yahoo! Breakout, 27 February 2013


So did Google Glass “set the stage for many other interesting products”? Not so much. It failed so badly that it came and went within the span of three short years.

So what went wrong?


Well…other than that picture, what went wrong?

Google Glass was incredibly intrusive, both for the user and, significantly, for those in the presence of the user. From the outside, Google Glass stood out like a sore thumb. From the inside, Google Glass inserted itself between the user and the world.

Google Glass is in your way for one thing, and it’s ugly…It’s always going to be between you and the person you’re talking to. ~ Hugh Atkinson

Further, Google Glass was a pest, always bombarding the user with distracting visual images.

I don’t think people want Post-it notes pasted all over their field of vision….The world is cluttered up enough as it is! ~ Hugh Atkinson

Perhaps even worse was the way Google Glass intruded upon the lives of others. People resented the feeling that they were being spied upon and began to call those who wore the devices “Glassholes.”

Finally, Google Glass just wasn’t that useful. It didn’t do many things, and the things it did do, it didn’t do all that well.

Enter The Voice User Interface

[pullquote]We’re moving from view first to voice first[/pullquote]

I’m convinced that Voice is going to be the next great User Interface; that we’re moving from touching and looking on our Smartphones to talking and listening on our Headphones; that we’re moving from View First to Voice First.

Most agree the next major UI shift after touch is voice. ~ J. Gobert (@MrGobert)

More importantly, I’m convinced that Apple is convinced that Voice is the next great User Interface…

…which is no big deal, because Amazon, Google, Microsoft and most others are convinced too…which is why they’re all investing so heavily in the area.

(D)igital assistants are poised to change not only how we interact with and think about technology, but even the types of devices, applications and services that we purchase and use. ~ Bob O’Donnell

The User Interface Company

Apple is a User Interface company. Their business model is to:

— Create a revolutionary new User Interface;
— Use design principles to build an integrated hardware and software product;
— Iterate the hell out of it;
— Carefully select another area of computing ripe for disruption; and
— Do it all over again.

Some past examples:

— The Apple I added a monitor.
— The Macintosh added the Graphical User Interface (GUI) and the Mouse.
— The Apple Notebook added the recessed keyboard and trackpad.
— The iPod added a click wheel and relegated all the heavy lifting to the Personal Computer.
— The iPhone and the iPad added a touchscreen.

In 1976, with the Apple I, Apple started the modern era of personal computing by adding a monitor to the User Interface. In 2016, Apple intends to extend the era of the personal computer by removing the monitor from the User Interface.

The new User Interface would be — as all User Interfaces must be — a radical transition. It would take us from touching to talking; from looking to listening.

The most interesting disruption comes from attacking an industry from what looks like an irrelevant angle. ~ Benedict Evans (@BenedictEvans)

Introducing The AirPod

Apple recently announced a line of wireless headphones, called AirPods. The AirPod appears to represent Apple’s vision for the visionless User Interface of the future. With advanced Bluetooth audio, a powerful W1 chip, two microphones, and yes, the elimination of the 3.5mm audio jack, the AirPod is the beginning of Apple’s transition from User Interfaces for the eyeballs to a User Interface for the eardrums.

So what’s the big deal? We’ve had wireless headsets for a while. True enough. But they’ve been confusing to pair, frustrating to use, limited in battery life, and, overall, relatively powerless. The AirPods are not just another set of headphones. Rather, they are the start of a whole new generation of headsets. The new AirPods provide:

— Painless Pairing;

— A Charging Case that stores, charges, and pairs the earbuds;

— Optical sensors that make the first earbud placed in the ear the primary earbud for phone calls;

— Sharing between two people;

— A long tube that makes room for a larger battery, thus providing longer battery life;

— Microphones at the ends of the tubes, which reduce the interference caused by our heads and allow better ear-to-ear communication; and

— Activation of Siri either by saying “Hey Siri” or by double tapping on either of the earbuds.

Good Design

The AirPods’ simplicity and demure demeanor are consistent with the principles of good design.

Good design is unobtrusive. Products fulfilling a purpose are like tools. They are neither decorative objects nor works of art. Their design should therefore be both neutral and restrained, to leave room for the user’s self-expression. ~ Dieter Rams

The advance of technology is based on making it fit in so that you don’t really even notice it, so it’s part of everyday life. ~ Bill Gates

If it disappears, we know we’ve done it. ~ Federighi 9/10/13

Technology is at its best and its most empowering when it simply disappears. ~ Jony Ive

I like things that do the job and kind of disappear into my life. Like Levis. They just kind of get faded and disappear, and you don’t think about it much. ~ Steve Jobs

The Invisible Hand

[pullquote]Apple has a secretive project in the works named “Invisible Hand”[/pullquote]

Bloomberg has reported that Apple has a secretive project in the works that would dramatically improve Siri. Currently, the Siri voice assistant is able to respond to commands within its application. With an initiative code-named “Invisible Hand,” Apple is researching new ways to improve Siri. Apple’s goal is for Siri to be able to control the entire system without having to open an app or reactivate Siri. According to an unnamed source, Apple believes it’s just three years away from a fully voice-controlled iPhone.

Note that the report said that Apple thinks it is three years away from employing all of these features. Not today. Not tomorrow. Not the day after tomorrow, but three years. So don’t expect to see these advanced features anytime soon.

Veteran Apple engineer Bill Atkinson — known for being a key designer of early Apple UIs and the inventor of MacPaint, QuickDraw, and HyperCard—saw this coming a long time ago. He gave a presentation at MacWorld Expo back in 2011 in which he explains exactly why the ear is the best place for Siri. ~ Fast Company

— AirPods don’t require we look and touch. They only require we talk and listen.
— The AirPod will be always with us.
— The AirPod will be always on us.
— The AirPod does not require the use of our eyes.
— The Smartphone stands between us and the world and demands our eyes and our attention. The AirPod stands behind us and discreetly whispers in our ears.

“Yuck,” you say. “Always on? Who wants that?”

We all will.

— We can have the AirPod in our ears at all times.
— We needn’t reach into our pockets to look at our Smartphones.
— We needn’t even turn our wrists and glance at our Smartwatches.
— Tap, tap or “Hey Siri.” Computing at our beck and call.

Apple has already started down this path. If you activate Siri using the home button of your iPhone, Siri more often directs you to look at the screen. If Siri is activated hands-free via “Hey Siri,” Siri is more talkative and less visual.

Today And Tomorrow

The possibilities for Voice activated computing are endless.

Of course, you can request music or even a specific song.

A Voice Interface will allow us to listen to our emails and texts.

Driving directions might best be served by using both visual and audio instructions. But walking instructions — which are in their infancy, but on their way — are a different matter. It’s not a good idea to look at a screen while walking. Audio only instructions are the way to go.

Third party apps will have access to all of the AirPods’ functionality.

Using a double-tap, the user can quietly request information.

Soon we’ll be able to identify a document, and simply say “print” and the artificial intelligence will do the rest.

AirPods will one day be spatially aware. They’ll remind us to take the mail with us when we leave the house, and to buy toilet paper when we pass by the local supermarket.

Soon we’ll be able to simply say “help” and the system will help us navigate a particular task or application.

We don’t have immediate recall, but our AirPods — which hear everything — will.

Siri might soon be able to recognize people, know places, identify motions and connect them all with meaningful data.

Sensors in the device will know if we are in conversation and will break in only with the most important verbal notifications.

As Siri becomes more environmentally aware, it will start to recognize important sounds in the environment. For example, if the AirPods detect a siren while the user is driving, they might temporarily mute any messages or other audio.

Soon we’ll be able to request that our intelligent assistants ask someone else a question, get the answer, and then relay that answer back to us.

Or we’ll be able to schedule a meeting, with the artificial intelligence navigating all the various questions and answers required from multiple parties to make that happen.

Proactive assistance will remind us that we have an upcoming appointment and — knowing the time and distance to the meeting — prompt us to leave for it in a timely manner.

Or better yet, a truly intelligent device will know us and understand us and remind us when our favorite sports team is scheduled to begin play.


“Science fiction,” you say? Really? Look how very far we’ve come since the iPhone was introduced in 2007. Just to take one small example, people forget that Smartphones were out for years and years and years before we were able to ditch our dedicated GPS devices and switch to using GPS on our phones instead. Now using the GPS on our phones is so normal that many can’t even imagine how we managed without it.

Apple opening up Siri, is like everything else they do. Building the momentum slowly and for the long haul. ~ Nathan Shochat (@natisho) 9/26/16

Like great men and women, great computing starts from humble beginnings.

Over the years, many of the most important features to come to the Apple ecosystem were launched as somewhat basic and rudimentary iPhone features.

• Siri told funny jokes.
• Touch ID unlocked iPhones.
• 3D Touch made Live Photos come to life.

In each case, a feature was introduced not to set the world on fire overnight, but rather to serve as a foundation for future innovation and functionality. Siri has grown from giving funny, canned responses to being one of the most widely-used personal assistants that relies on natural speech processing. Touch ID is now used to facilitate commerce with Apple Pay. 3D Touch has transformed into an emerging new user interface revolving around haptics and the Taptic Engine. ~ Neil Cybart, Above Avalon


Reviewers who have worn the Apple Watch have written that it has untethered them from their phone — that their iPhone has joined the MacBook as the “computer in the other room.” That is all going to be doubly so with AirPods and other sophisticated headphones.

— Yesterday, people used their computers when they were at their desktops or when they carried their laptops with them.

— Today, people use their phones all the time and their second screen devices – Desktops, Notebooks, Tablets — some of the time.

— Tomorrow, people will listen to their headphones all the time and look at their phones, tablets, notebooks and desktops some of the time.

Not One Device, But Many

So, am I saying the Smartphone is going away?

Hell no.

Did the mainframe go away? Did the PC go away? Did the freaking fax machine — which was invented in 1843 — go away?

Old tech has a very long half-life. ~ Benedict Evans on Twitter

Bill Gates foresaw what was going to happen to computing as far back as 2007. Well, he ALMOST foresaw what was going to happen:

MOSSBERG: What’s your device in five years that you’ll rely on the most.

GATES: I don’t think you’ll have one device

I think you’ll have a full screen device that you can carry around and you’ll do dramatically more reading off of that – yeah, I believe in the tablet form factor – and then we’ll have the evolution of the portable machine and the evolution of the phone will both be extremely high volume, complimentary, that is if you own one you’re likely to own the other…

Reverse “phone” and “tablet” and Gates got it just about right. We’re not going to have just one device with just one user interface. We’re going to seamlessly move from device to device as best suits our needs at that particular time, at that particular place.

Part 3: Critiques

Fantasizing Fanboys

Nilay Patel thinks the Apple fanboys who are buying into the whole AirPod thing are as bad as the Google fanboys who bought into the whole Google Glass thing.

Watching Apple fans repeat the mistaken dreams of Google Glass is super fun. ~ Nilay Patel (@reckless) 9/16/16

I think Nilay Patel is a really, really smart guy who’s being incredibly, and inexcusably, short-sighted.

The most important things have always seemed dumb to industry experts at the beginning. ~ Jeff Bezos

Professional critics of new things sound smart, but the logical conclusion of their thinking is a poorer world. ~ either Benedict Evans or Ben Thompson ((Sadly, I don’t know who to attribute this quote to. My notes have both saying it and a search did not reveal the original source. My bad.))

The AirPods aren’t obnoxious the way Google Glass was. They aren’t building a barrier between you and me; between you and the world. And while Google Glass was incredibly intrusive and incredibly useless, AirPods are not intrusive at all, while they’re incredibly useful today and will become even more useful with each passing tomorrow.

Echo Chamber

Many, many, very intelligent and respected technology observers really like the Amazon Echo and think it is the wave of the future.

I’ll admit I’m swimming in dangerous waters here — there have already been reports that Apple is working on an Echo-like device — but I don’t think Apple is going to go in the direction of the Amazon Echo. Based on Apple’s investor call, held on October 25, 2016, Tim Cook doesn’t think so either:

Most people want an assistant with them all the time. There may be a nice market for kitchen ones, but won’t be as big as smartphone.

Here’s my issue with the Echo and Echo-like products:

First, the Echo, and competing devices like Google Home, are fixed to one room. It makes no sense to have your Artificial Intelligence anchored to a single location when you can have it with you anywhere, anytime.

Second, the Echo and its lookalikes are designed to be used by multiple people. That’s convenient…but it also means that it muddles the information the artificial intelligence receives which, in turn, muddles the information that the Artificial Intelligence can provide. In other words, devices used by many people will not be able to provide data tailored for single individuals.

Many, many, many, many very smart, very thoughtful, very respected industry observers disagree with me.

When I wrote my original Siri Speaker article in March, I heard from a lot of people who didn’t understand why Apple needed to make such a product [as the Echo] when our iPhones and iPads and Apple Watch can do the job…It’s a very different experience to have an intelligent assistant floating in the air all around you, ready to answer your commands, rather than having that assistant reside in a phone laying on a table (or sitting in your pocket). ~ Jason Snell, MacWorld

[pullquote]What if your intelligent assistant were always in your ear and always with you?[/pullquote]

Well, that’s true, but what if your intelligent assistant were always in your ear and always with you?

Job To Be Done

There are those who argue that Voice Input may be a nice supplement to computing but a voice Interface is not sufficient because it’s inadequate — it doesn’t do everything that we can currently do on our Smartphone, or even our Smartwatch.

Maybe this’ll feel retrograde in a decade, but how many people really want to control everything with their voice? It’s handy for some stuff, but not everything…. ~ Alex Fitzpatrick, Time

Don’t we say the exact same thing at the introduction of every new generation of computer?

— The Notebook couldn’t do what the Desktop did.
— The Tablet couldn’t do what the Desktop or the Notebook did.
— The Smartphone couldn’t do what the Desktop, the Notebook or the Tablet did.
— The Watch couldn’t do what the Smartphone did.
— The AirPod can’t do what the Smartphone, or even the Smartwatch does.

OF COURSE the new device is not as good as the old device at doing what the old device did best. A Notebook computer is a lousy Desktop computer. A tablet is a lousy Notebook. A Smartphone is a lousy Desktop, Notebook or Tablet. And a passenger vehicle is a lousy truck. But we don’t hire a passenger vehicle to be a truck. Neither will we hire a device using a Voice-First Interface to be a Desktop, Notebook, Tablet, Smartphone or Smartwatch.

We don’t recognize the value of a new User Interface because we measure it against the wrong standard.

Lesson #1: The New User Interface is not trying to “replace” the old user interface.

Tablets will not replace the traditional personal computer. The traditional PC is changing to adapt to the customer requirements. The tablet is an extra market for some niche customers. ~ Yang Yuanqing, Chief Executive Officer, Lenovo Group Ltd., 11 Jan 2012

The above quote misses the mark because it assumes that tablets WANT to replace the traditional personal computer.

‘This new thing will be great – once we can do all the old things on it in the old way’ ~ Benedict Evans

Each new computer form factor is being hired to do something different than its predecessor, otherwise, we wouldn’t want or need to migrate to the new device in the first place.

[pullquote]The goal is to use the new device for something it can do extremely well, especially if that something is something the old device did poorly or not at all[/pullquote]

The goal is not for the new Interface to duplicate the functionality of the old Interface; to use our new devices to do what our old devices already do well. The goal is to use our new devices for those things that they do best.

Lesson #2: We shouldn’t judge a User Interface by what it CAN NOT do.

Instead of judging a new User Interface by what it can not do, we should judge it by what it CAN DO EXTREMELY WELL, especially if it can do something well that the old User Interface does poorly or not at all.

Before you can say ‘that won’t work’, you need to know what ‘that’ is. ~ Benedict Evans

Socially Awkward

Some observers say we will not want to use Voice First because — well frankly, because it makes us look like socially awkward nerds and sound like socially oblivious geeks.

I personally still feel self-conscious when I’m using Siri in public, as I suspect lots of folks do as well.

This kind of thinking is already passé in China.

(Voice may be awkward) in the US. 100% not true in Asia. Voice is dominant input method whether public or private. ~ Mark Miller (@MarkDMill)

(F)or certain markets, like China…voice input was preferred over typing. ~ Ben Bajarin (@BenBajarin)

But let’s forget for a moment that the social awkwardness we fear is already irrelevant to a minimum of 1.3 billion people. Even for those of us who live in the West, our fear of social awkwardness is — well — it’s a little bizarre.

Apparently, this is considered ‘normal’ looking:

…but this is considered abnormal and abhorrent.


Who knew?

Don’t fool yourself into thinking that resistance to the new AirPods is anything new. There has never been a meaningful change that wasn’t resisted by self-righteous, holier-than-thou know-it-alls — like me.

People are very open-minded about new things – as long as they’re exactly like the old ones. ~ Charles Kettering

You don’t believe that resistance to the new is the norm? Then I strongly suggest you follow Pessimists Archive @pessimistsarc. (Even if you DO believe me, I still strongly suggest you follow Pessimists Archive @pessimistsarc)

Here are just a couple of the things that the guardians of goodness have deemed irredeemable:

— CELL PHONES: Don’t you remember when cell phones were considered anti-social?

And pretty dorky looking, too.


— WALKMAN: In the 1980s, in response to the Walkman, a town in New Jersey made it illegal to wear headphones in public. That law is still on the books today.

— RADIO: A 1938 article opined that it was “disturbing” to see kids listening to the radio for more than 2 hours a day.

— AUTOMOBILES: Early automobiles caused as much controversy then as driverless cars do today. It was common for people to yell “Get a horse” as the newfangled cars passed them by (both literally and figuratively).

— BICYCLES: Yes, bicycles. First, bicycles were decried for allowing the youth to stray far from the farm. Second, bicycles were blamed for leading to the “evolution of a round-shouldered, hunched-back race” (1893).

— PHONOGRAPH: In 1890, The Philadelphia Board & Park commissioners “started a crusade against the phonograph.”

— KALEIDOSCOPES: Yes, kaleidoscopes! In the early 1800s kaleidoscopes were blamed for distracting people from the real world and its natural beauty.

— BOOKS: You read that right. Books. Novels were considered to be particularly abhorrent. In 1938, a newspaper ran an article with some top tips for stopping your kids from reading all the time.

Little men with little minds and little imaginations go through life in little ruts, smugly resisting all changes which would jar their little worlds. ~ Zig Ziglar


[pullquote]This isn’t the first time Apple has changed the way we do things[/pullquote]

You know, this isn’t exactly the first time that Apple has changed the way we do things.

The Macintosh got us to use the mouse. And that wasn’t a given.

The Macintosh uses an experimental pointing device called a “mouse”. There is no evidence that people want to use these things. ~ John C. Dvorak, In a review of the Macintosh in The San Francisco Examiner (19 February 1984)

(emphasis added)

Remember the Day-Glo colors of the first iMacs?


Remember the iconic white earbuds of the iPod (introduced just 15 years ago this week)?


Remember what it was like before the Smartphone and how quickly we adapted to having a Smartphone with us all the time?


You think going from the Smartphone User Interface to the AirPod User Interface is going to be hard? Are you kidding me? This is going to be the easiest User Interface transition ever.

If you’re strong enough, there are no precedents. ~ F. Scott Fitzgerald

[pullquote]How hard will it be to go from using headphones with our smartphones to simply using headphones all by themselves?[/pullquote]

AirPods build upon already existing habits. We’re already talking into our phones and Bluetooth devices. Who cares if we start talking into our AirPods instead? And we already use headphones with our Smartphones. How hard will it be to go from using headphones with our smartphones and smartwatches to simply using headphones all by themselves?

Good products help us do things. Great products change the things we do. Exceptional products change us. ~ Horace Dediu (@asymco) 9/4/16

Siri Sucks

If you want to doubt Apple’s ability to create a truly meaningful Voice-First User Interface, look no further than Siri. If Apple is going to rely on voice for input, and Artificial Intelligence for output, then Siri needs to be top-tier. Right now, not only isn’t Siri “good enough”, it’s just plain not good. True, sometimes Siri can be magical…but far more often it’s maniacal.

Some people think Siri is a joke. I disagree. There’s nothing funny about the way Siri fails to do what it’s supposed to be doing.

The good news is Apple is very well aware of the fact that Siri is moving from backstage to center stage. The bad news is that Apple has yet to prove that they have the ability to transition Siri from the role of a bit player to that of a lead actor.

If you’re an optimist, like me, one hopeful precedent is Apple Maps. They too were widely panned when they first appeared. But gradually — year after year after year after year — Apple improved them until, over time, they became “good enough” (although the title of “best” still resides with Google Maps).

Part 4: The Apple Way

Nilay Patel, and other critics, can’t understand why Apple is doing what it’s doing.

Invention requires a long-term willingness to be misunderstood. ~ Jeff Bezos

There’s nothing new in that. Apple has always been misunderstood.

Human salvation lies in the hands of the creatively maladjusted. ~ Dr. Martin Luther King, Jr.

Apple has always been willing to take chances.

Success is the child of audacity. ~ Benjamin Disraeli

Boldness has genius, power and magic in it. ~ Johann Wolfgang von Goethe

And they’ve always been mocked for doing so.


The price of originality is criticism. The value of originality is priceless. ~ Vala Afshar (@ValaAfshar) 9/27/16

But why take chances?
Why do things that you know are going to be heavily criticized?

Clear thinking requires courage rather than intelligence. ~ Thomas Szasz

Well, for one thing, that’s where the opportunity lies.

The biggest opportunities are going after complex solutions that incumbents trained everyone to think could never be made simple. ~ Aaron Levie (@levie)

For another, Apple knows that the real danger lies in NOT taking chances.

Don’t play for safety. It’s the most dangerous thing in the world. ~ Hugh Walpole

Avoiding danger is no safer in the long run than outright exposure. The fearful are caught as often as the bold. ~ Helen Keller

If you risk nothing, then you risk everything. ~ Geena Davis

The trouble is, if you don’t risk anything, you risk even more. ~ Erica Jong

It is better to err on the side of daring than the side of caution. ~ Alvin Toffler

Nilay Patel doesn’t understand what Apple understands. You can make small incremental changes — baby steps, if you will — to improve an existing product. But designing a new User Interface is revolutionary and requires radical change.

A truly great design is innovative and revolutionary. It’s built on a fresh idea that breaks all previous rules and assumptions but is so elegant it appears simple and natural once it has been created. ~ David Ngo

You can’t get to a new User Interface by taking baby steps. You get there by making a leap.

[pullquote] The most dangerous strategy is to jump a chasm in two leaps[/pullquote]

The most dangerous strategy is to jump a chasm in two leaps. ~ Benjamin Disraeli

Apple is not going to sit around and wait for their competitors.

If you’re competitor-focused, you have to wait until there is a competitor doing something. Being customer-focused allows you to be more pioneering. ~ Jeff Bezos

The competitor to be feared is one who never bothers about you at all, but goes on making his own business better all the time. ~ Henry Ford

[pullquote]Apple is not going to sit around and wait for Nilay Patel’s permission[/pullquote]

And they’re not going to sit around and wait for Nilay Patel’s permission either, that’s for damn sure.

Standing still is the fastest way of moving backwards in a rapidly changing world. ~ Lauren Bacall

Apple thinks AirPods are going to be a significant part of their future.

The future lies in designing and selling computers that people don’t realize are computers at all. ~ Adam Osborne

So they’re moving toward that future today.

Most important, have the courage to follow your heart and intuition. ~ Tim Cook 10/5/16

Why is that so hard to understand?

Published by

John Kirk

John R. Kirk is a recovering attorney. He has also worked as a financial advisor and a business coach. His love affair with computing started with his purchase of the original Mac in 1985. His primary interest is the field of personal computing (which includes phones, tablets, notebooks and desktops) and his primary focus is on long-term business strategies: What makes a company unique; How do those unique qualities aid or inhibit the success of the company; and why don’t (or can’t) other companies adopt the successful attributes of their competitors?

137 thoughts on “Apple’s Future Is Ear”

  1. I tuned out about 2/3 through… all that on the Apple thingies and how innovative they are and whatnot, and not a word on the Moto Hint from 2014, which is very much the same concept? They had good critical success, not a runaway sales success, which probably is a hint to the actual value of such things.

    As for Apple boldly innovating, which is claimed in there somewhere I’m sure, well… is this 2014?

    1. You really need to start reading the articles you link to. From the article:

      “The Hint’s fundamental problem is no different than Google Glass: it’s not useful often enough to be worth keeping it on all the time. Motorola tried to make the future – but all it wound up with was a Bluetooth headset.”


      “Moto Hint is ready to pair. Go to Bluetooth menu on device to complete pairing.”

      “Paired,” she said, after I’d spent a few seconds digging through the menus on my Moto X. “Moto Hint connected.”

      Yeah, that sounds seamless


      “You can use Siri or Google Now by tapping on the face of the device, which produces a really ugly, mushed clicking noise despite the fact that you’re not actually moving anything – it’s just a horrible, skeuomorphic sound that supposedly acknowledges your input.

      I say “supposedly” because as often as not, nothing happens when the sand-in-the-machine clicking sound goes off. I tap and I wait. I wait for Siri to ding, for Google Now to ask me who I want to call, for Moto Voice to start listening. Then I tap again. And again. More than once I just took my phone out of my pocket, or yelled “OKAY MOTO X” extra loud hoping the Moto X would activate in my pocket. (Unlike the X, the Hint isn’t always listening; it always takes a tap to wake it up.) It has a habit of disconnecting when my phone goes into a pocket or a bag, too, and never holds up beyond more than about 30 feet.”


      “On the other hand, it’s always a mystery what will happen whenever I put it on. Sometimes my music would start back up when I put the earpiece in; other times, silence.”


      “This is all, of course, ignoring the most important flaw with the Motorola Hint: it just doesn’t sound good. It’s not worse than your average Bluetooth headset, but it’s not good at all. It’s actually worse than Apple’s EarPods”


      “It’s a very comfortable, mediocre-sounding Bluetooth headset, and comes in handy sometimes when I’m quickly sending a text and don’t want to dig out my phone, but neither voice control in general nor the Hint in particular is powerful enough to make the hands-free connection a permanent one.”

      It’s not about being First!, it’s about getting it right. Apple is rarely first to market. There really is no prize for being first.

    2. “not a word on the Moto Hint from 2014”

      Sometimes I wonder about you, obarthelemy. Are you a parody account? Because with comments like the above, you sometimes seem to be mocking yourself.

      There were over 3,000 car manufacturers in business prior to the introduction of the Model T. Do we remember their names? No, we do not. Because they did not popularize motor vehicles.

      It’s not first that matters, it’s first to get it right. Now I’m not guaranteeing you that Apple has gotten it right with the AirPods, but I am guaranteeing you that the Moto Hint is irrelevant to our discussion. Did they move us toward a voice-first user interface? No they did not. End of discussion.

      1. “It’s not first that matters, it’s first to get it right.”
        No, it’s the one with the power to impose their will.

        1. “it’s the one with the power to impose their will”

          If that were true, there would never be a successful startup, and all the companies on the Fortune 500 would be over 100 years old.

          1. Yeah, you’re right…

            Barnes and Noble, Amazon, A&P never put a mom and pop shop out of business. ExxonMobil never had a war fought for it.

            If it weren’t for anti-trust laws, how many start ups would never happen?

            But still, power comes in many forms. Removing something to promote an agenda is a pure power play.

          2. You’re right, “innovation” comes in many forms. Amazon “innovated” around the supply chain and company structure — not around their consumer devices. But somehow, Amazon is Wall Street’s and the Justice Department’s and the consumer’s Darling.

          3. Which only shows that innovation is far from being the sole arbiter of commercial success.

          4. That’s right. So, one has to do some analysis regarding how that commercially successful company arrived there. What part did innovation play?

            With Apple, success has a lot to do with having “innovative” products, or products with innovative features to sell. Is Apple “innovative”? Yes, Apple’s “innovations” are quite quickly adopted by a couple hundred million people who often make them integral to their workflows. And these stick around, and largely get adopted by other companies as the new bar.

            Is a company “innovative” just because it shoves the latest feature (along with 60 others) in a device as a gimmick, only to abandon the product/service/feature six months later? That’s debatable.

            It’s especially debatable when the company *clearly* isn’t commercially successful on the back of their devices or software, but relies on becoming the de facto supplier to large companies or selling user data or some such. In those cases, the selling point really isn’t “innovation” or attention to detail, it is ubiquity, convenience or price.

            Fortunately, it’s easier to analyse Apple than most companies, because Apple actually is more transparent than you wanted to admit in another comment above. There are lots of data points that they provide to the SEC, etc., for example. Much more so than most others. (Of course, they are not particularly transparent about their road map).

          5. Well reasoned, if a little bit biased.

            And by the same token, how much of that acceptance comes from having nowhere else to go within the ecosystem? How many features useful to others are we willing to see excluded, for the benefit of the team? How many times have I read about the innovativeness of floppy and CD removal, the justification of the Blu-ray “bag of hurt,” the courage to remove the headphone jack? How many superlatives are placed on things that benefit Apple more than the customer, many of which are mere removals?

            Not being particularly transparent about their roadmap? Good enough for me, if an understatement. How about being transparent enough to tell us exactly which CPU is in the machine, front and center? No, “quad-core i7” is not telling me enough.

        1. “Pretty much all the ideas of Apple earbuds are in the Hint”

          You will always be baffled by what does and does not succeed in tech because you use the wrong definition of “first” and the wrong definition of “innovation”.

          For the consumer, first means first to make a product that has commercial value. Nothing that occurs prior to that is relevant.

          And innovation is arranging things in a new, more useful way. Sometimes just the tiniest changes can have the most profound consequences.

          Most tech observers can’t seem to recognize the asymmetric nature of innovation. Sometimes they conflate innovation with invention. Sometimes they think that the change has to be big for the effect to be big. Reality says otherwise.

          1. And you seem to conflate innovation with success. You can be an unsuccessful innovator, the Wright brothers aren’t Boeing, de Dion-Bouton isn’t GM, the guy who convinced the medical establishment about microbes and infections died doing the demo…

            I seem to remember you were the one having difficulty about my calling “lock-in” on satisfied customers too.

            “Success” and “innovation” are two different concepts. The tree that fell in the empty woods still fell.

          2. I agree with you 100%. Innovation is an intrinsic property, a fleeting one, but intrinsic. It’s like saying it’s red, round, or heavy.

            This is where I really get my shorts in a tizzy when business types define innovation. They can define it for their own purposes, but it’s their purposes, not some law of nature.

          3. No-one has an issue with its being intrinsic. But, viewed that way alone, it’s not measurable: it’s like beauty — in the eye of the beholder.

            Now, everyone thinks, I hope, that their wife is beautiful. And, similarly, everyone no doubt thinks that every idea they ever had is “innovative”. There are some 3-4 billion women in the world. And there are probably at least 3-4 billion ideas being hatched every day. They are all “innovative”, more or less. Great! I mean, that’s what an “idea” is, right? A new, non-obvious take on something that we think is useful. I don’t know about you, but I have six innovative ideas before breakfast every day!

            Now, how do we measure them? How do we measure that someone else’s idea was innovative? Because the person with the idea brought it to life and made it available to a significant number of people; and many people voted with their wallets to adopt it as something that either performed a job that wasn’t being done, or replaced something else by doing the previous thing’s job (and possibly more jobs) better.

            Of course we stand on the shoulders of giants. But a key requirement you listed (that it must be useful) has little meaning if it stays in a lab, or if a few die-hard early adopters try it out for a short time and try to make it work for them. There is a broad, culture-change aspect to a “universally” useful innovation; and yes, commercial success tells us something about that — if nothing more than to give us hard numbers about the number finding it useful. But, it is just an indicator, and you still have to apply some analysis of just what it is an indicator of…

            Obart will turn around and say, as he has before, “well, MS must be really innovative, then” (a large number of people paid to use their products). Sure, but innovative at what? At the mouse, or the GUI? No, at the business model; getting to users through the workplace, for example. (The mouse that every company makes today bears more similarities to the first one made by Apple 32 years ago, than Apple’s did to the inspiration Apple saw in the Xerox Parc lab.)

            “Something” takes a product over the threshold or tipping point into more widespread adoption as something useful (and therefore often leads to commercial success as an indicator), and that “something” is part of the innovation. It’s part of the recipe; it’s the completion of the thought or idea that brings it full-circle. And with Apple, this “something” rarely happens by accident, they actively seek it. And it turns a good idea (which we all have) into a reality.

            You may criticize John for using “commercial success” as an indicator of how Apple has managed to innovate products at a societal level (which makes it measurable) by using a more transparent business model (“you get what you pay for”). But other companies may “innovate” and experiment more on a commercial business-model basis; and with some the cost is largely unknown and yet to be counted (such as in terms of privacy). So, if you don’t like talking “business”, let’s revisit the business models of your favourite companies!

          4. Even if everything you say is correct, and I largely agree, the problem is that this idea hasn’t even reached a level of moderate success yet. The AirPods aren’t even shipping. So far, if this is their future, it isn’t “ear” yet.

            If his position were one of predicted success based on, I don’t know, the strength of Siri? The success of the EarPods? Something? Anything relevant? The argument about “innovation” would be moot at best.


          5. Oh, sure. I was just discussing the notion of “innovation as something intrinsic”. Whether these AirPods are “innovative” or not is yet to be seen (based on success).

            Quick and shared/remembered pairing has the potential to be innovative, because it definitely seems to fit a job to be done and needs/frustrations that are already perceived.

          6. Agreed. I know this is the reason my wife gave up on her Bluetooth earphones. She couldn’t easily use them between her phone and MacBook.

            Whether they can be hijacked is the other obstacle.


          7. Again, their innovative character is already built in, their success as yet undetermined. Two distinct things.

          8. Thank you for a truly thoughtful, thorough, and sincere reply.
            Success is distinct from innovation, with innovation merely being a component of success. There are far too many factors to success, to attribute it all (good or bad) to the inherent innovativeness of the product or practice.
            How do we measure them? We can’t, as you’ve already stated. Show fifty shades of red to a group, and get 1000 different opinions. Trying to force fit a one-dimensional measure, market success, as an indicator of innovation is often irrelevant. Case in point, MS leveraging Windows to force Office onto customers. IBM forcing customers to only buy IBM punch cards, etc. These products succeeded not because of their extreme innovation (especially the punch cards 🙂 ) but because of other factors.
            All this is to say that I agree with you that commercial success is an indicator, at best, of “measuring” innovation. That attempt may satisfy the “metrics” people, who often aren’t good at interpreting the metric, which is the real value anyway, and which you’ve already pointed out.
            My favorite companies….
            Don’t really have any “favorite” companies. I do love innovation and creativity and I hate to see it cheapened with shallow metrics.

          9. Separate issue. You are far too intelligent and honest to be calling Apple a transparent company. The “metrics” speak against you, and “you get what you pay for” is propagated by fans.

          10. “Nothing that occurs prior to that is relevant.”

            Good thing Moore, Shockley, et al. didn’t believe you. Never mind the great physicists before them who made it all possible.

            “If I have seen further, it is by standing on the shoulders of giants.” -Isaac Newton

            So… nothing happens in isolation, and yes what happens before, and even first is very important. Except perhaps to the one dimensional outlook of some business people.

            Regarding innovation versus invention, as a recovering attorney, you surely know the requirements for obtaining a patent to an invention.

            -It must be statutory
            -It must be new
            -It must be useful
            -It must be non-obvious

            The last three of the four requirements define innovation. The statutory requirement speaks to the nature of the invention (does it have the right to be patented), and has been prostituted to no end.
            So I submit to you that invention encapsulates innovation, just non-exclusively.

      2. However, even Moto wasn’t first. Technology has been trying for decades to make a voice UI plausible and operable with lousy success. If there are critics, the criticism is warranted. It is up to the developer to _show_ they have gotten it right. And currently even Apple’s track record is working against them.

        Not that they haven’t succeeded, just that we have no reason to believe they have, yet. And with the AirPods delayed, we STILL don’t know.


  2. Welcome back!

    Much to process here, so some random thoughts…

    As always you’re an interesting pundit, promoter, evangelist, apologist (yes, the other “f” word) promoting the virtue of Apple. As proof I offer this…

    “Or is it we who are doing Apple a disservice?” – John Kirk

    Slide whistle! Talk about EXACTLY backwards! The payer is the one being served. Always! Yes, a developer can offer their vision and their respected authority, but a seller is never served, they get paid for serving. End of transaction. Next.

    As further proof, where is the equitable analysis and critique of competitor products?

    “High tide will be at 9:32 today, I sure hope that’s good for Apple…” – klahanas’ perception of people of the other “f” word.

    But boy you are interesting! (what’s the emoji for sincerity?)

    Unlike you, the optimist, Apple has taught me cynicism, for reasons we’ve discussed ad nauseam in the past. So, to contribute opposing noise to the narrative, lest it be one-sided, this is about selling ever more granular functionality, unbundling if you will, in order to sell devices. Our devices. Our ecosystem-trapped devices. You recognize this yourself where you describe the notebook-to-AirPod progression.

    The photo, now common, of the white earbuds indicated, and still indicates, an air of insular drones. It did then, it does now, but we know they were doing one of two things:

    -Listening to something.
    -Talking on the phone.

    With the AirPod always listening, I wonder what the clever term for an auditory Glasshole would be? Respect points for the most clever term….

    Once awareness of function becomes widespread, public scorn will follow.

    They are too intimate. Much too intimate. They would only scare me more if they were more deeply implanted. What next? Suppositories? Implants?
    I bring this up from time to time. Please watch the movie Demolition Man. It’s prophetic.

    1. With respect to AirPods and the era of the voice interface they are supposed to usher in: I don’t mind the always-listening part, it’s the always-talking part that I object to when people go full utopian mode about the voice interface future.

      1. You might not mind, but people you’re speaking with, or just in the room with, might. Let’s see what they’re all about when they’re in the wild.

        1. I don’t mind people around me listening to their ear thingies all day, I would mind their talking to it all day.

          1. We didn’t adjust to people talking into their phones in public transportation, in dentist’s waiting lounges, and other such places. And music is different from babbling human voices. I think the jerk factor of constant voice interfacing is just too great. I’m not saying voice interface is going to be a failure but I do think it’s not going to be as ubiquitous as some people envision.

          2. “We didn’t adjust to people talking into their phones in public
            transportation, in dentist’s waiting lounges, and other such places.”

            Maybe you didn’t, but are you able to cite studies or surveys showing that your experience is the norm?

            Also, give me a break. People talking to their phones in public is no more distracting or annoying than people talking to the friend next to them in public. Every so often I encounter someone who forgets to use their inside voice while they talk on their phone and I become aware of their half of the conversation. The rest of the time it blurs into the background. How in hell will people talking to Siri or Google Now on their phone (with the other side of the conversation being inaudible because they’re using earphones) be any different at all from people talking to their friend on their phone?

          3. As I said, voice interface is not going to fail but it won’t be as ubiquitous as some people envision. We’ll put up with one or two people in the train making the occasional phone call, but not half the people droning on all at the same time. If all the people now messaging, emailing, arranging their schedules, and just generally fiddling with their cell phones while in a relatively crowded public setting shift from touch to voice interface, I doubt if people will not complain either to the noisemakers or the authorities in charge of the place. Voice interfaces will have their appropriate settings and usage: when you’re alone, maybe driving, and not for dictating your term paper.

          4. “If all the people now messaging”

            Now that’s an interesting point. If the primary mode continues to become texting/messaging, why is that? People can still talk to each other via voice, but they choose not to. I think this will be a voice UI stumbling block for those who think it will become the primary interface.


          5. “If the primary mode continues to become texting/messaging, why is that?”

            Privacy. Some prefer text to voice calls because they don’t wish to be overheard. That doesn’t stop public transit here in Toronto from being chock full of people of all ages talking on the phone (although I am a freak in that I hold the phone to my head, most people I see have a headset on and keep the phone in their pocket or in their lap).

          6. Right, but that idea of privacy is likely going to keep voice UI from being as ubiquitous as John’s article here wants us to believe.

            I think it is more than just privacy, though, since people will just as readily share texts with other friends around them.

            As someone who has spent a great deal of time on mass transit, it is actually kind of amazing how quiet that A train can be in general, much less on smartphones.

            People can already talk to their smartphones, even without wireless headphones. But they don’t. And I don’t think it is just Siri’s failure rate. Google’s solution comparatively is far superior, but the public use is pretty sparse. At most I’ve seen it in small social circles.

            All that to say, I don’t think it will add to noise pollution except marginally. And I don’t think it will contribute to a more ubiquitous voice UI, either, except in isolated situations, for the exact same reasons.

            John’s talk of human nature should examine human nature a bit more.


          7. “talk to their smartphones, just not over wireless headphones. But they don’t.”

            Your experience and mine are very much at odds. Obviously there are huge cultural differences. In Toronto, I see people talking (usually quietly) on their phones on public transit all the time. Your experience (in NYC?) is that people don’t. Maybe it’s peer pressure, if the social disapproval of talking on one’s phone in public is as fierce in NYC as it seems to be among some of the commenters on this thread. Whereas Canadians tend to be much more tolerant of other people doing as they please.

          8. NYC and Chicago (some Atlanta, but I mostly drive around here). People talk freely while waiting, but once on the train or bus most people hang up. A few will talk quietly. But it really is the exception. Maybe they are waiting for the “inflight entertainment” to start or something. I don’t know. But they really are fairly quiet.


          9. “Canadians tend to be much more tolerant of other people doing as they please.” That’s pretty much the definition of what a Canadian is.

          10. “We’ll put up with one or two people in the train making the occasional
            phone call, but not half the people droning on all at the same time.”

            Again I ask how is this any different than all the people droning on all at the same time as they talk to their physically present friends on the train or in the cafe or wherever? If you object to one, you have to object to both. And if hearing the people next to you converse bothers you, then may I suggest the problem is not with them, but with you.

          11. I see more and more people in the US, anyway, ducking outside of a waiting area to talk on the phone. And some retailers won’t serve you if you are on your phone. Oddly, they have no problem if you are chatting with an accompanying friend.


          12. What is this constant drone of conversation that you talk about on trains, airplanes and buses? Not very many people talk while taking public transportation, at least not in the US.

          13. “folks will not suddenly display a raised tolerance for increased noise just because voice input becomes available”

            I agree, it won’t be sudden. But I do think it will happen over time. Obviously voice input will never be tolerable in some situations, I don’t think anyone is saying voice input will be everywhere at all times, but many places where we don’t have much ambient voice noise now will change over the next five to ten years. I could be wrong about that of course, but I’d make that bet.

          14. You might also mind if someone else’s device was recording you.
            These could be “auditory glassholes”.

          15. You can do that now with a cell phone in your pocket. And for some reason, people just aren’t as alarmed by it as they are with surreptitious video recording. Which is understandable if you don’t normally talk about felonies you are planning or have committed.

          16. So true. But how about confidential business stuff, or a job interview. Or the creepy guy on the next table?

            The cell phone is extremely multi-purpose. It doesn’t raise suspicion the way a dedicated Olympus recorder would. These are more specific-purpose, like the Olympus. Once recognized as such…

          17. You know, people used to say the same thing about a cell phone that you’re now saying about an earpiece. If you used a cell phone in public, it would bother the people around you, they would be able to see you type in your password, they would be able to see your screen and you’d have no privacy, etc, etc, etc.

            Stuff changes. In China, it’s not just acceptable to use voice for input, it’s the norm. That could happen in the West too, and I predict it will.

          18. But these are “vertically integrated” devices. All they do is transmit sound, in both directions, directly to a computer that processes them in real time. This is potentially worse than laying down an old Olympus mini tape recorder.

          19. China is a pretty big place. Are you sure this isn’t a regional, even “city-fied”, phenomenon?


          20. Some parts of Asia eat dogs; by the same reasoning, we’ll soon all be eating dogs. Except they’ve been doing it for hundreds of years.

          21. If there was a really good reason to eat dogs, we’d quickly overcome our prejudices and do it.

          22. And in Italy they enjoy horse meat. The point is, what relevance is China (even if the study could demonstrate that this is true across the board in China and not localized to certain demographics, like every other piece of product research has to explain)? By what history do we use China as an example of what the rest of the world (or even just the West, or even just the US) will do? You’re the one who brought up China in your article.


  3. More seriously, I think IT companies, incl. Apple, are kind of fumbling for the next big thing. Glasses, watches, earbuds… all seem possible candidates, and work very well in a few scenarios, yet nothing really thrives or even just sticks because the benefits over reaching for one’s phone are so minimal they don’t justify the expense, silly looks, bother w/ charging, taking care of and remembering to carry the darn thing, bad UI/UX…

    There are vanishingly few instances where I resent having to reach for my phone; and in those instances, my preferences for a shortcut would be, in order of decreasing desirability: glasses (incl. earbuds), earbuds only, watch. I guess I’ll have to wait for glasses to become non-ugly and discreet (and camera-less).

    Maybe new apps/uses will pop up where a perma-UI into the phone and AI will justify the various costs. Maybe the costs will lower significantly (several day battery, less silly looks, cheaper, silent queries…).

    1. One thing we always underestimate is the value of convenience. Even extremely tiny improvements can be highly valued.

      For example, keyless car entry. I mean, seriously? We’re too lazy to put a key in the lock on our car door or in our ignition? But now people love the keyless systems and wouldn’t want to live without them.

      1. I’m with you on that. Two things actually are underestimated that weigh heavily on consumer behavior: convenience and inertia.

        1. Yes, inertia is a good one. I would point to the power of the default option, which may be the same thing or a variant on that theme. For example, most iOS owners use Apple Maps simply because it is the default option.

          And to take it out of the realm of technology, studies have shown that the biggest determinant for organ donation is how the question is worded. If the default is to donate, then some 90% donate. If the default is to not donate, then some 90% don’t donate. Human nature may sometimes be foolish but it is always foolish to ignore human nature.

      2. Exactly.

        A car keyfob is pure gain: nothing extra to carry, nothing to charge, doesn’t make you look silly… there’s no compromise, no flip side. That’s pure convenience.

        Wireless earbuds are so full of compromises and inconveniences, the convenience factor is probably on the side of not using them: they’re ugly so you probably need to take them out for use, they get lost, they need daily charging, careful storing…

        Do you think key fobs would have been successful if they had all those drawbacks/requirements?

        1. Actually, most key fobs were something you added to your set of keys. And yes, carrying something small would work if it added to our convenience. For example, people love garage door openers.

          I remember when remote controls came out for TVs (yes, I’m old) how people thought they were crazy. Why not just change the channel with your hand? Now people think it would be crazy to change the channel any other way than with a remote control, and many TVs hide the buttons needed to change the channel manually.

          1. “I remember when remote controls came out for TVs (yes, I’m old) how people thought they were crazy. Why not just change the channel with your hand?”

            Really? I was there, too. I remember the exact opposite. But then when I got older I just used my daughter as our remote control since we were constantly losing the remote, which is still a problem since our daughter is now in another city.

            I really don’t think people are going to leave the AirPods in as much as you think they will, at least not until they are the ear-wigs I mentioned earlier. People look equally ridiculous keeping regular earphones in, too. As you say, one is really no more “ugly” than the other.


          2. Here’s how my mom solved the lost remote problem: Put it right next to the TV. You want to change something? Get up and walk to the remote to use it, then put it back right where it was before you return to your seat.

          3. We use a similar solution, but remotes are placed on a side table between two couches. So you never lose a remote (we have a bunch) AND you don’t have to get up.

          4. I forgot to close my post with the “you lazy bum” quip that was the subtext of mom’s TV remote rule.

          5. I had already assumed your subtext. My own subtext was ‘sometimes working smarter isn’t lazy, it’s just working smarter’.

          6. This is a woman who lived in a war zone during WWII, where she was the cook, starting at age 7, for a family of 8, and had to move residences seven times in three years at very short notice because the neighborhood was bombed or the house was ‘requisitioned’ by the army, both Allied and Axis. I knew even then that whining about having to get up to change the TV channel was not going to get any mileage at all.

          7. Aside from the remote, progress has made your mother’s life easier in many ways. Same with my own parents, who now use iPads (first computer they can really use well), drive in modern cars, and on our farm we’ve been able to expand greatly by using modern equipment. Are we all lazy bums because we take advantage of what progress gives us? I would say no.

          8. I agree with you. But try telling my mom that about the TV remote. . . Funny, it was only with the TV remote though that she ‘resisted progress’. I think she was trying to drive a point home to us kids.

          9. Yeah, so that was obviously not about the remote for your mother, but rather a good lesson for her kids. I’ve done the same kind of thing with my own kids, digging post holes with a century old piece of equipment instead of using a gas powered post hole digger. We also finished an in ground pool by hand with shovels, although I did do a major portion of the work with our backhoe. I appreciate your mom’s lesson, but generally speaking working smarter isn’t being lazy.

          10. Agreed, those 3 things are examples of devices that brought superficially minimal functionality, but have still been very successful.

            I’d argue earbuds shine more for their differences than their commonalities:
            – require daily charging
            – are something separate and extra to carry about (car and garage openers fit on the keychain you had anyway, or got left in the car; remotes got dumped on tables)
            – impact your looks (as much as makeup, jewellery, facial hair, pimples…)
            – don’t fully replace their wired predecessors (music, battery life) nor their host system (I’m fairly sure a high %age of queries will end up being finalized on-phone)
            – are socially awkward (wireless headsets w/ voice dial were briefly a thing in the… I want to say late 90s? before being mocked to extinction)

            The issues with earbuds are a lot more numerous and powerful than with TV remotes and car/garage openers. Indeed, they might be counterbalanced by functionality: Glass is successful in some circles, as are phone headsets. AirPods aim at very wide acceptance, I think, and Apple is very good at creating demand and making things cool, but there hasn’t been any ripple in demand for earbuds in the decades they’ve been around, including years of supporting voice assistants.

          11. I think Apple is doing what it does best — removing enough friction to allow something to make the leap to mainstream. You feel otherwise. We’ll have to wait and see.

  4. Great article, thank you. There is another aspect of a voice interface that has not been mentioned explicitly. Visual interfaces are limited by the amount of space on the screen. Even with a large screen, the designer must keep it simple and avoid clutter. In a complicated application, this forces the use of hierarchical menus. The organization and complexity of these menus can make or break the application. With voice input and AI, there is no need for the hierarchy of menus. You can say “show me the check to ACME, Inc in February” without selecting the app, then choosing the account, then searching for the check. The addition of Siri to the Mac will give Apple a lot of input to use in developing this new interface.
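    To make the contrast concrete, here is a minimal, illustrative sketch (the function name, regex, and intent fields are all hypothetical, not from any real assistant) of how a voice/AI front end can map a free-form utterance straight to a structured action, bypassing the app → account → search menu hierarchy:

```python
import re

def parse_query(utterance):
    """Map a free-form request like 'show me the check to ACME, Inc
    in February' directly to a structured intent, skipping the
    app -> account -> search menu hierarchy a visual UI needs."""
    m = re.search(r"check to (?P<payee>.+?) in (?P<month>\w+)$",
                  utterance, re.IGNORECASE)
    if not m:
        return None  # not understood: fall back to the visual UI
    return {
        "action": "find_check",
        "payee": m.group("payee").strip(),
        "month": m.group("month").capitalize(),
    }

intent = parse_query("show me the check to ACME, Inc in February")
print(intent)
```

    In a real assistant the parsing would be done by a trained language model rather than a regex, but the payoff is the same: one utterance replaces several levels of menus.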

    1. Or you can have an assistant, and type exactly that on a keyboard? Or even voice the query to the PC/phone…

      I’m mostly wondering if we’re supposed to have that cigarette thingy in our ear at all times (it’s ugly and doesn’t have the battery for it, at least the Moto Hint was more unobtrusive), or if we’re supposed to take it out of its case, put it in, voice our query, take it out of our ear, put it back in storage… in which case, using the phone directly is quicker.

      1. “I’m mostly wondering if we’re supposed to have that cigarette thingy in our ear at all times (it’s ugly …”

        If an ugly device is useless (like Google Glass) it remains ugly. If it’s useful, we adapt to it quickly and the ugly can become the beautiful. Remember, people thought cars were ugly when they were introduced, but by the fifties, many considered them to be beautiful and even sexy.

        They say beauty is in the mind of the beholder, but perhaps it’s in the mind of the beholden.

          1. beholden: owing thanks or having a duty to someone in return for help or a service

            Few of us want to feel we have a duty to do something, but almost all of us are happy, and more than happy, to do things as a way of showing our gratitude.

          2. synonyms: indebted, in someone’s debt, obligated, under an obligation; grateful, owing a debt of gratitude

            You are an optimist!

  5. After giving us 100,000 words, John, I hope you’re right with this prediction!

    I have that hope for two reasons: (1) I’d hate to see someone put that much effort into an article and be wrong, (2) I have an intuitive feeling that you’re right.

    1. Yeah, my articles are too long. As I’ve always said, I never met a sentence that I couldn’t turn into a paragraph.

      My bad. 🙂

  6. John, great piece of work. Was with you until Apple Maps 🙂 Seriously, Maps is still full of address errors for places I navigate to using the business name. Even their submittal process is broken… took 3 tries for them to fix an error.

    1. And for me it works well enough, whereas google maps has obvious errors vis a vis business names and locations for my neighbourhood of Toronto.

      Mapping apps are always going to be very YMMV depending on where you live and what your needs are. Every mapping app is going to have mistakes and lacunae, the only question is whether or not it’s good enough for what an individual needs it to do. And since every mapping database is also evolving all the time, it’s a good idea to check in every year or so to determine if an app has improved enough to be a viable solution for you or not. Then from the field of available “good enough” mapping apps, pick the one that has the features that you find desirable.

  7. I don’t think Google Home is trying to replace the assistant on the phone or in the ear but to supplement it. It’s just giving you a stationary option in a place where you know you’ll very often be. I can’t wait to get one in the kitchen, a place where I don’t want to use my hands to touch a screen, and I definitely don’t want to have something in my ear at home. Longer term, though, I think it’s inevitable that we’ll let our devices listen to us 24×7 and proactively assist us rather than waiting for us to ask. Apple’s privacy stance may prevent it from doing that, though.

    1. I could most definitely be wrong about Apple creating an Echo-like device. Perhaps it would just be an extension of one’s desktop, laptop, tablet, phone, watch, etc. Many people think that’s the way we’ll go. I don’t think it makes sense — but that may merely be because my experience with such devices is limited.

      1. I could imagine eventually these devices will all just be everywhere – in the TV, the stereo, the thermostat, the fridge, the bathroom mirror – and we won’t need a standalone mic/speaker. But until then, I’m happy to have a hub at home. If I start cooking or I’m about to leave the house and want to give a quick command or question – when’s the next bus, how do I make salsa, play SongX on Spotify through my stereo, add Y to my shopping list, SMS my wife, should I take an umbrella – then my life is slightly easier. Yes, it’s a bit limited now, but I can see it becoming an important part of life.

        1. “I could imagine eventually these devices will all just be everywhere”

          Yes, I could definitely see that as a possibility. Think of the computer in Star Trek. It’s everywhere, all around them. They simply say “computer” and ask a question or demand a service.

          But I think it’s more likely that computers will be more like the badges the Star Trek crew wear. The badges are always with them — whether they’re on the ship (at home) or not — and they can call for the computer whenever and wherever they need it.

  8. Love the Bucky quote; one of my heroes.

    With AirPods now indefinitely delayed, they are off to a slow start.


    1. Could never know for sure, but I’m guessing it’s not a hardware challenge for the delay. No new tech is being used as far as I can tell. The software integration on the other hand is indeed a huge challenge. Even 95% accuracy is a miserable experience.

      1. One guy at Ars was speculating there is no way to get it to zero in on one device when pairing. It opens up to any device within range.

        I have no idea, myself. I wish them luck. Voice UI (much less Siri) needs to improve, but I doubt it will ever be primary. I remember the old Apple Speech days. I had it pretty well tailored, with AppleScript doing a lot of the work. But in the end there is nothing simpler than just sitting down quietly at the desk and working while listening to music at whatever level you want, not needing to speak to anyone.

        And I think that is a biggy. I think, culturally, the US has more introverts than other cultures. Introverts don’t like to talk, period, not just to other people. A voice UI will be part of the mix, sure, but not likely primary.

        I think it goes without saying that Apple could have done this without deprecating the minijack. I think Apple thought the AirPods would be much further along by now. I even think they probably planned on including the AirPods with the iP7 but had to ditch that plan when it became obvious they wouldn’t be ready in time. THAT would have been mind-blowing. Including them with the phone would have given more people a reason to give Siri a second chance. I know I gave up on Siri.

        If the AirPods are THAT important for a voice UI future, that would have been the only way to seriously kickstart it. As merely an option (an expensive one at that), and now delayed, it is more of an uphill battle.


          1. I’m really interested to see how an AI parses out a command when someone uses the word “point”.


      2. Accuracy isn’t the only critical thing here. Ease of making a correction to a misunderstood voice input is also very critical. And my experience so far is the process of correcting a misunderstanding once it occurs is very, very frustrating. Enough for me to lay off the voice interface.

        When a human being misunderstands what you said, it takes what, 2 seconds to correct it? If your phone misunderstands you, well it’s off to the circular races.

          1. “I was so much older then, I’m younger than that now…” – Robert Allen Zimmerman
            Thanks Joe!

      3. Accuracy is an interesting perspective. Back in the “good ole days,” Ma Bell based telephone voice quality on the fact that most people only hear about 40-60 percent of a conversation anyway. People are able to fill in the blanks. This is the huge problem with AI and computer-based assistants. That last 5% is the last mile.


    2. I’m always amazed at how quotes from all sorts of miscellaneous sources seem to fit seamlessly into my articles on tech. I think it’s because these quotes often deal with human nature, and they never go out of date.

      1. What amazes/amuses/interests me is how similar the conversations in technology are in the arts world, both in terms of creating as well as business.

        I lit a play based on Buckminster Fuller. One of my favourites and for many of the same reasons lighting Red, the play based on Mark Rothko, was one of my favourites.


  9. Voice interface? Are we condemned to a future of being subjected to incessant, annoying, perhaps even deafening mumbling and murmuring wherever we go?

        1. People always think the future will be dystopian. They worried about cameras, they worried about mobile phones, they worried about smartphones. They railed against the walkman. Guess what? We adjust and acclimate.

          1. We were annoyed too about the Segway, same for Google Glass. Mobile phones were accepted but yakking into one while you’re on a train or bus can get you ejected. The walkman was accepted but you try to play it loud in a bus with the headphones’ chirps drilling into other passengers’ skulls and you will be inviting antisocial behavior in response to your antisocial behavior. You only talk of the adjustments that people were willing to put up with, not the ones that society put up with but only under certain restrictions or the ones that society pretty much completely rejected.

  10. After a brief thought, I believe the AirPods are just a cover-up for the real reason for the removal of the analog headset connector. Their cigarette look is indeed ugly. And the real reason is adding more sensors to the headset, such as brainwave sensors. This could be Apple’s entry into VR, adding edits to the songs based on a level of brainwaves.

    I did not know it was possible, but some research uncovered headsets with biofeedback sensors, such as NeuroSky.

  11. While I agree with most of this essay, I think the starting point is off.

    “Every generation of computer has gotten smaller and smaller. And that trend is not going to stop.”

    Actually that trend is going to have to stop, both because human beings cannot be given a process shrink the way chips can, and because of the limitations of our energy storage technology. On the one hand, computers intended to be used by humans have to be large enough to be manipulated by humans. Not talking interface, talking big enough to pick up, to be noticeable when you set it down and not get lost. Suppose after another 10 years we can put a computer in a grain of sand — no human being will be able to use it unless we embed that grain of sand in a much larger enclosure. On the other hand, computers need electricity and our ability to store electricity is not keeping up with Moore’s law. Even if we aren’t limited in shrinking the hardware by the limits of human hands, we are limited by the hardware’s need for electricity to do the things we want it to do, which means we need to give it batteries of non-trivial size.

    Basically, I doubt very strongly that computers intended to be used by humans will shrink much below their current minimum wristwatch-like size. The new paradigm of a voice UI (which I agree is now being explored and which is what all the tech companies are piling on voice assistants for) is not going to be driven by the *need* to interface with ever-smaller computers so much as by the *desire* to interface with computers in a hands-free and sometimes even glance-free manner.

    “One problem with Smartphones is that they are demanding. They cry out for our attention. They buzz, they beep, they ring, they flash, they vibrate.”

    If that’s your experience of your phone, then I say, once again, you are using your phone incorrectly. The default response to “appname would like to send you push notifications” should *always* be “no”. We are in the infancy of the smartphone era. The cultural knowledge of how one should use one’s phone is still in its teething stage. I predict that in the next few years there’s going to be a gradual realization by more and more people that their phones have become their masters rather than their servants, and a cultural shift away from allowing one’s phone to demand your attention whenever it likes. OS makers will start to enable options that restrict the phone from alerting you about new messages too often, especially new messages not from people on a preapproved list. (And by “phone” I mean whatever kind of computer you have that is always with you, whether wearable or pocketable.)

    “Another problem with Smartphones is that they are intrusive. To interact with a Smartphone, we must look at it. When our focus is on the Smartphone, our focus is off everything else.”

    And here is where the current tech industry fascination with audio assistants is running off into the weeds. Listening distracts our focus *just as much* as looking. In fact, it can distract *more*, because we can glance quickly, but we cannot listen quickly. Audio is a *much* lower bandwidth form of UI compared to visually based UI. Audio is sequential, visual is random access.

    And that is why I think having your computer verbally tell you things (beyond very brief things like driving directions) is not going to be nearly as popular as being able to command your computer with your voice. And why I think computers without any kind of monitor are not going to catch on except as ancillary add-ons to your primary computer, which is always going to have a screen of some kind.

    1. “Actually that trend (getting smaller) is going to have to stop…”

      I very much doubt it. I think computers are going to become wearables. We might wear them on our heart or somewhere else on our bodies — they may even be inserted under our skin.

      And, as I said, we won’t have just one computer. We’ll have a phone, probably a watch, almost certainly an earpiece, and we’ll use the interface that works best.

      1. “I very much doubt it”

        You are wrong. As I said, there are limits, both to power and to manipulable size. And Moore’s law cannot fix the power issue because most of the power is no longer required to run the CPU, but to connect the device to the network over wifi and to enable it to communicate with us (to run speakers or display or taptic engine or whatever).

        Running an implanted computer off waste body heat or off electricity siphoned from your nervous system or whatever is an SF pipe dream. It’s always going to need a battery and it’s always going to need to be charged, and that means you can’t make it much smaller than a watch or an earbud, because batteries stubbornly refuse to get much more power dense than they are now (for reasons of basic chemistry) and because our fingers stubbornly refuse to get any smaller than they are now.

        “I think computers are going to become wearables.”

        Maybe, maybe not. Smartwatch sales are supposedly tanking. The whole wearable computer trend is still in the crib. It may live and grow, or it may die of the tech version of SIDS, only time will tell.

        Re: using the interface that works best. A huge issue being overlooked here by all the young able-bodied people running the tech industry these days is that aural interfaces are not going to work well for people as they get older and more hard of hearing. I’ve ranted before about the rabbit-eyed coders who are making all these wonderful new apps and web interfaces that I can barely read (and more often than not am prohibited from pinch-zooming to make more legible). My mother is starting to go deaf like her parents before her. The thing is, you can make a text interface use bigger type quite easily (just force all the coders to wear glasses that give them the vision of their great-grandparents while they’re designing the UI). But you can’t make an aural interface louder without surrendering privacy. Ten years from now, if my mom is going to be using Siri, she’ll have to turn the volume up to maximum to hear what is said… and that means she’ll have to surrender her privacy to those next to her, even if she’s wearing earphones.

        1. “You are wrong. As I said, there are limits, both to power and to manipulable size.”

          We’ve said there were limitations throughout the history of computing… right up until we figured out how to overcome those limitations.

          1. You have a big blind spot. You’re only talking about technological limits which are limits arising from the device itself. Just as important are cognitive/physiological limits imposed by the user of the device.

            Manned fighter planes can be made more maneuverable than they are today, but they aren’t because they are already at the limits of a pilot’s g-force tolerance. Automobiles can be designed to run and accelerate faster, but they aren’t because, for one thing (and not the only thing), they would outrun normal human reaction times. With pretty much every product you can think of, you can see how it has in some aspects hit the limits dictated by the product’s user even though the underlying technology could have gone past them. That includes computing.

    2. I’m with you on this one. In the end, whatever limits there are on a computer’s function and form factor will not be determined by the state of miniaturization technology but by human biological limits. How small computers get will be dictated by our manual dexterity and visual acuity. The type of interface will be determined by our cognitive limits. An all-purpose everyday computer will have to have a screen because, as you said, humans seem to process information, especially of the abstract/symbolic type, better and faster when it is presented visually. That’s why we invented maps, graphs, spreadsheets, etc. Nothing mysterious or magical as to why we are that way; it’s just a consequence of evolution.

  12. Apple didn’t steal the graphical UI in any way, but the Jobs quote you use does give too little credit to Xerox PARC. I remember reading the 1977 Scientific American article by Alan Kay about the future of personal computers, and how it felt like a bolt of lightning coming from the future, full of so many new ideas and viewpoints.

    The book “Fumbling the Future” by Smith and Alexander is a wonderful account of how Xerox made PARC to give them a future after the copy machine patents ran out, and how brilliantly the scientists did their job, to no avail since administrators couldn’t understand what they’d achieved.

    1. You make a good point, norm, but I doubt I could have asked my readers to read a book on the topic rather than an article and a short video. 🙂

  13. I’m a bit surprised by how much ‘luddite-ness’ there is in the comments here. Is it just because some commenters are always mad at Apple no matter what, or is it because they really are opposed to technological change?

    1. Yes, I didn’t say Apple was first; in fact, they aren’t. Obviously other companies have done voice, with the Amazon Echo probably being the most prominent. I’m just saying this is Apple’s take on voice, and they’re doing it in a typically Apple way — they’re trying to create an integrated hardware/software experience along with a hard-to-create but easy-to-use user interface to make their vision work.

      I’m reading a book called: Innovation and its Enemies. Our fear of the new has been with us since the beginning of our existence. A couple of examples from the book: coffee, the printing press, margarine, farm mechanization, electricity, mechanical refrigeration, recorded music, transgenic crops, etc.

      One last quote from the book:

      “Public debates over new technologies…can rage for decades, if not centuries. For example, debates of coffee spanned the Old World from Mecca through London to Stockholm and lasted nearly three hundred years.”

      1. There does seem to be an obsession with being first among the tech crowd (even in the face of overwhelming evidence that being first does not matter), as well as a lack of understanding that the very habits they have today which are driving their resistance to change were resisted by others before them.

      2. For one, first matters to the Patent Office. Period.

        Change is good, when it’s chosen, not imposed. When USB finally became so prevalent, that you couldn’t get a parallel printer or RS-232 device anymore, then it made sense to remove them and offer adapters. The transition happened not because Apple removed the LPT and RS-232 ports, but because they were so inferior to USB (for all involved) it happened naturally.

    2. It has nothing to do with “luddite-ness”. It has everything to do with a specious position based on circumstantial evidence. Especially around a product that is currently vaporware.


      1. I think the other problem many commenters have (including you) is thinking in the now, rather than five to ten years out. Of course voice UI sucks now, and AirPods have many issues now, and there are cultural/behaviour issues now, but it should be obvious that none of this is static, technology will improve and the world will change, as it always has. I’m reminded of the very first iPhone and how so many people said similar negative things, but the inevitable better future of that product was obvious. The future now is distributed computing, and voice will be an important part of that. I’ve been calling this an Apple Network of Things for years already. We’re still a ways off of course, but it is coming.

          1. You’ve said that before. Care to elaborate and give some examples of things I’ve made up? Keep in mind that thinking about the future in reasonable, logical, and obvious ways isn’t making anything up, it is simply thinking about the future in reasonable, logical, and obvious ways. Could I be wrong? Certainly, but I’ve been right a lot over the past couple of decades. I like my chances.

      1. Reductionist tropes and memes are great affirmations for the believers, aren’t they? My favourite is:

        “…everyone wants a magical solution for their problems and everyone refuses to believe in magic.”

        Jefferson from Once Upon a Time

        They don’t really add to any critical discussion, but they sure make you feel great about yourself.


          1. I did read the article. And as you say, while appropriate, it is still no more critical.

            So, I do feel the same about the constant use of quotes. Great entertainment. Lousy analysis, even in the follow-up comments. I’ve gotten to the point that I skip most of the quotes these days. They are witty and add to the tenor of the article, but rarely add supporting material.


            [eta: But Buckminster Fuller is still one of my heroes.]

          2. Or did you mean did I read Chuq’s article? In internet time it feels like weeks ago, though it was probably yesterday or the day before. I didn’t get what he was referencing. I’ve not read any outrage. Lots of disappointment, but not much outrage. FCX had more outrage in internet postings than I’ve seen about the new Macbook Pro. I probably just don’t frequent those forums, I guess.

            Still, given the expectation set by proclaiming “Hello, again” for the event, and not having had a substantial update for several years, it really did not pull out the whiz-bang. But then no one really has. MS had the Surface whatever. But even that is “whatever”.

            But that’s just me. There really isn’t much left for the general purpose PC to innovate around that I can see. I hope to be proven wrong.


    3. Clearly the future will be very different from today, but you sound like one of the smug futurists who predicted we’d all be flying our own helicopters instead of driving cars.

      1. Heh, you wisely edited your original comment and went with a much shorter version. For the record, I’ve never predicted flying helicopters. I’m a pragmatist, and I work hard to shed all bias. This helps me see what is obvious and inevitable. Distributed computing, abstraction, an Apple Network of Things, voice UI, wireless, wearables, these are all things which are easy to see coming, and they are inevitable.

  14. I think the other thing that will have to change for AirPods to make this theoretical leap in consumer behaviour successful — even though they haven’t even shipped yet, at least in the US — is the social contract that says “I have my earphones in, don’t bother me.”

