Apple’s Very Human Interface Guidelines

I am impressed with the speed of Apple’s foray into entirely new UIs. No, I am not talking about the re-jiggered version of iOS the company built for the Apple Watch. It merely reveals the way forward. Apple is clearly focused on transforming our bodies into the next great interface.

The devices this could enable are nearly limitless.

Starting with the launch of the Apple Watch, our voice and flesh, maybe soon our eyes, become common input methods — the modes by which we interface with data and interact with machines (or screens, clothes, wearables).

You already know about voice. The iPhone is now good enough to be used reliably for dictating notes, tweets, and texts, setting reminders and appointments, even searching the web. The Apple Watch incorporates this voice capability, along with touch, right from the start. The Apple Watch also incorporates physical interactions — haptics. Apple brands this as “taptics.”

My recent article in Macworld examined this new UI:

Haptic technology—haptics—uses force upon the skin to deliver real-time tactile feedback. These physical sensations are created by tiny motors called actuators. Done right, haptics can mimic the feeling of a pin prick by a wearable that tracks your blood sugar, simulate the plucking of virtual guitar strings on a tablet screen, or re-create the physical recoil of a phaser from your favorite game controller.

To date, use of haptics has been limited in part by middling accuracy — how much and where exactly the force is applied. Apple appears to have uncovered the use cases and improved the accuracy enough to make haptics a core feature of its next big thing. As the company boldly states:

Because (Apple Watch) touches your skin, we were able to add a physical dimension to alerts and notifications — you’ll feel a gentle tap when you receive an incoming message. Apple Watch also allows you to connect with your favorite people in some new, spontaneous ways not possible with any other device.


Physical sensations — haptics — are core to the Apple Watch UI.

It’s called the Taptic Engine, a linear actuator inside Apple Watch that produces haptic feedback. In less technical terms, it taps you on the wrist. Whenever you receive an alert or notification, or perform a function like turning the Digital Crown or pressing down on the display, you feel a tactile sensation that’s recognizably different for each kind of interaction. Combined with subtle audio cues from the specially engineered speaker driver, the Taptic Engine creates a discreet, sophisticated, and nuanced experience by engaging more of your senses. (emphasis added)
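The core idea in that passage, a recognizably different tactile sensation for each kind of interaction, can be modeled as a small lookup. A minimal sketch follows; every type, case, and pattern value here is invented for illustration, since Apple has not published the Taptic Engine's actual patterns.

```swift
// Hypothetical model of the Taptic Engine idea: each kind of interaction
// maps to a recognizably different tactile pattern.
// All names and values are invented for illustration.

enum Interaction {
    case notification
    case incomingMessage
    case digitalCrownTurn
    case forcePress
}

struct HapticPattern: Equatable {
    let taps: Int          // number of discrete taps on the wrist
    let intensity: Double  // 0.0 (subtle) ... 1.0 (strong)
}

func pattern(for interaction: Interaction) -> HapticPattern {
    switch interaction {
    case .notification:     return HapticPattern(taps: 1, intensity: 0.3)
    case .incomingMessage:  return HapticPattern(taps: 2, intensity: 0.5)
    case .digitalCrownTurn: return HapticPattern(taps: 1, intensity: 0.1)
    case .forcePress:       return HapticPattern(taps: 1, intensity: 0.8)
    }
}
```

The point of the design is simply that no two interactions share a pattern, so the wrist alone can tell them apart.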

Where might this lead us?

I won’t predict any specific devices. I will say that, by leveraging human voice, touch and sensation, entirely new forms of interaction become possible — with data, objects and people. Thus, while I confess I am not terribly interested in the Apple Watch per se, I am very excited by Apple’s deliberate if somewhat under-the-radar efforts at launching these human-centric UIs.

See Me, Feel Me, Touch Me

Sogeti Labs predicts a “personalization” revolution by 2025, a world filled with an amazing array of mobile devices, sensors, wearables, things, robots and semi-autonomous machines. In this brave new world, current input methods simply won’t work. No matter how capable artificial intelligence or Big Data may be ten years from now, the world of “computing everywhere” will be severely limited if it cannot be instantly and reliably engaged by voice, touch, physical force and/or eyesight. Apple — with its pricey, jewelry-like watch — is showing us the way forward. Not with a failed beta like Google Glass but with a very real product soon available for sale around the world.

I predict the potential for human UI to be so great, in fact, that I suspect Apple’s appropriately named Human Interface Guidelines only barely scratches the surface of what will soon be possible. These are the early days of the human-computing interface, akin to when the early PC makers touted the benefits of “storing your recipes”.

Here are some of the present ways the Apple Watch will leverage our bodies to interact with data (emphasis mine):


Siri. Dictate a message, ask to view your next event, find the nearest coffee shop, and more. Siri is closer than ever with the Apple Watch.


Phone. Use the built-in speaker and microphone for quick chats, or seamlessly transfer calls to your iPhone for longer conversations. You can also transfer calls from the Apple Watch to your car’s speakers or your Bluetooth headset. 


Force Touch. In addition to recognizing touch, the Apple Watch senses force, adding a new dimension to the user interface. Force Touch uses tiny electrodes around the flexible Retina display to distinguish between a light tap and a deep press, and trigger instant access to a range of contextually specific controls — such as an action menu in Messages, or a mode that allows you to select different watch faces — whenever you want. It’s the most significant new sensing capability since Multi‑Touch.


Heartbeat. When you press two fingers on the screen, the built-in heart rate sensor records and sends your heartbeat. It’s a simple and intimate way to tell someone how you feel.


Apple Pay. To pay with Apple Watch, just double click the button next to the Digital Crown and hold your wrist up to the contactless reader. You’ll hear and feel a confirmation from the Apple Watch once your payment information is sent.



Notifications. Since the Apple Watch sits on your wrist, your alerts aren’t just immediate. They’re intimate. With a gentle tap, notifications subtly let you know when and where your next meeting starts, what current traffic conditions are like, even when to leave so you’ll arrive on time.

According to Apple, “You won’t just see and respond to messages, calls, and notifications easily and intuitively. You’ll actually feel them.”
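The Force Touch item above reduces to one core distinction: the display reports how hard you press, and the system classifies the touch as a light tap or a deep press. A minimal sketch of that classification follows; the threshold value and type names are invented, and Apple does not expose raw force readings to Watch apps.

```swift
// Hypothetical sketch of the Force Touch distinction: a reported force
// value is classified as a light tap or a deep press.
// The threshold and names are invented for illustration.

enum TouchKind: Equatable {
    case lightTap   // ordinary selection
    case deepPress  // triggers contextual controls, e.g. an action menu
}

func classify(force: Double, deepPressThreshold: Double = 0.6) -> TouchKind {
    return force >= deepPressThreshold ? .deepPress : .lightTap
}
```

The design choice worth noting is that a single threshold turns one physical sensor into two logical inputs, which is exactly what lets a tiny screen offer a second layer of controls without extra buttons.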


Confession: There are no eye-driven UI features in Apple Watch. I do wonder, however, if such a UI may be coming soon. Consider how “Looks” will work in Apple Watch 1.0:

A Short Look provides a discreet, minimal amount of information—preserving a degree of privacy. If the wearer lowers his or her wrist, the Short Look disappears. A Long Look appears when the wearer’s wrist remains raised or the user taps the short look interface. It provides more detailed information and more functionality—and it must be actively dismissed by the wearer.
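The Short Look / Long Look behavior described in that passage is effectively a small state machine: raising the wrist shows a Short Look, lowering it dismisses the Short Look, keeping the wrist raised or tapping escalates to a Long Look, and only an explicit dismissal clears the Long Look. A minimal sketch, with all type and case names invented for illustration:

```swift
// Hypothetical state machine for the Short Look / Long Look behavior
// the guidelines describe. Names are invented for illustration.

enum Look: Equatable {
    case none
    case short  // minimal, privacy-preserving glance
    case long   // detailed view; must be actively dismissed
}

enum WristEvent {
    case raised
    case lowered
    case remainedRaised
    case tappedShortLook
    case dismissed
}

func next(_ current: Look, _ event: WristEvent) -> Look {
    switch (current, event) {
    case (.none, .raised):              return .short
    case (.short, .lowered):            return .none  // Short Look disappears
    case (.short, .remainedRaised),
         (.short, .tappedShortLook):    return .long
    case (.long, .dismissed):           return .none  // Long Look needs explicit dismissal
    default:                            return current
    }
}
```

Note the asymmetry the guidelines call out: the Short Look is dismissed passively by lowering the wrist, while the Long Look persists until actively dismissed.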

I can absolutely envision an Apple Watch 2018 model, for example, which can and does change the information presented based on actual eye glances, not just wrist movements.

The overall design of the Apple Watch, its innovative computer on a chip, the clever Digital Crown input and other features and technologies are all laudable. That said, I think the most important aspect of the Apple Watch is what it portends: entirely new ways of interacting with data, machines and people all thanks to entirely new forms of human-centric interfaces.

Published by

Brian S Hall

Brian S Hall writes about mobile devices, crowdsourced entertainment, and the integration of cars and computers. His work has been published with Macworld, CNBC, Wall Street Journal, ReadWrite and numerous others. Multiple columns have been cited as "must reads" by AllThingsD and Re/Code and he has been blacklisted by some of the top editors in the industry. Brian has been a guest on several radio programs and podcasts.

5 thoughts on “Apple’s Very Human Interface Guidelines”

  1. I suspect that this is going to be very promising, very concerning, and tragic as well as comical at times. Like any technology, it will bring out all aspects of a culture. Watching the bugs getting ironed out will be a new level of endeavor.

    Imagine a future president looking at a map of a region in the briefing room, sneezing, and her watch inadvertently sends a command to nuke the region.
