With Touch Bar, Apple Again Puts Faith in Third-Party Developers

Apple this week introduced new MacBook Pros that, in addition to bright new screens, fast new processors, and, of course, ever-thinner form factors, include a new hardware feature called the Touch Bar. It’s a high-quality miniature screen that runs the length of the keyboard, replacing the old row of function keys above the number keys. In person, the display looks great, and a unique coating makes using it feel remarkably smooth.

As you might expect, Apple’s macOS and first-party apps support the Touch Bar right away, but if it is going to become a must-have feature worthy of driving Mac buyers to upgrade, third-party developers will also have to embrace it. At launch, Apple already had buy-in from big firms such as Microsoft and Adobe. But the real question is whether the rest of the developer community will follow suit and, if so, how soon.
Sticking to Its Guns
I’ve lamented before that, after using a number of touch-enabled Windows notebooks, using a non-touch Mac notebook felt like a step backward. It’s easy to see Apple’s decision to put a small touch screen above the keyboard as a simple, stubborn unwillingness to bend to the larger trends in the PC industry, just as it once resisted larger smartphone screens. To its credit, with the Touch Bar, Apple has put together a touch technology its executives clearly believe is a better option than a touch screen.

Apple has long suggested that reaching up to touch the screen of a Mac is unnatural and that it breaks the usage model of the notebook. In theory, I agree that touching a notebook screen seems unnatural. But I also know, now that I’ve been doing it for a while, that reaching up to scroll a Web page feels pretty natural to me.

Keeping the Touch Bar on the horizontal axis means, as a user, I’m not reaching for the screen. But it also means I’m looking down from the screen toward the keys to find the specific, custom keys each application serves up on the Touch Bar. I suppose over time you could develop some muscle memory for unique Touch Bar keys you use often, but because those keys change from app to app, that seems unlikely.

After the keynote on Thursday, I participated in a deep-dive session and had the chance to spend some time with the new hardware. I can tell you this much: The Touch Bar is addictively enjoyable to use.

It works as you would expect for tasks such as scrolling through pictures and video (fast and fun), changing system settings (as precise as physical buttons), and using the calculator (it’s the killer app, seriously, you heard it here first).

But where the Touch Bar really shows promise is with large, complicated apps such as Microsoft Word and Excel, and Adobe Photoshop. These apps tend to have tons of features that get lost in icon-dense ribbons or buried deep in drop-down menus. With the Touch Bar, developers can surface some of these features, making them visible and more easily accessible to the average user. Power users might scoff but, for many people, this level of increased visibility could lead to real productivity gains.

Apple tells me it is very easy for developers to enable the Touch Bar in their apps, noting that the partners who appeared on stage this week did so in a very short amount of time. It will be interesting to see whether other major Mac software developers do the same in the coming weeks. And it will be perhaps more telling whether smaller developers, with more constrained development time and budgets, decide such an update is worthwhile for their users.
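
For context, the hook Apple gives developers is AppKit’s NSTouchBar API. Below is a minimal sketch of what adoption might look like in Swift; the identifier string, the “Save” button, and the saveDocument(_:) action are hypothetical stand-ins chosen purely for illustration, not anything Apple or its partners shipped.

```swift
import Cocoa

// Hypothetical identifier for illustration; real apps use their own reverse-DNS strings.
extension NSTouchBarItem.Identifier {
    static let saveButton = NSTouchBarItem.Identifier("com.example.touchbar.save")
}

class DocumentViewController: NSViewController, NSTouchBarDelegate {

    // AppKit walks the responder chain asking each responder for a bar;
    // overriding makeTouchBar() is how a view controller opts in.
    override func makeTouchBar() -> NSTouchBar? {
        let touchBar = NSTouchBar()
        touchBar.delegate = self
        // .otherItemsProxy lets items from responders further up the chain show through.
        touchBar.defaultItemIdentifiers = [.saveButton, .otherItemsProxy]
        return touchBar
    }

    // The delegate builds each item lazily, on demand.
    func touchBar(_ touchBar: NSTouchBar,
                  makeItemForIdentifier identifier: NSTouchBarItem.Identifier) -> NSTouchBarItem? {
        guard identifier == .saveButton else { return nil }
        let item = NSCustomTouchBarItem(identifier: identifier)
        item.view = NSButton(title: "Save", target: self, action: #selector(saveDocument(_:)))
        return item
    }

    // Hypothetical action, standing in for whatever the app already does on save.
    @objc private func saveDocument(_ sender: Any?) {
        print("Save tapped on the Touch Bar")
    }
}
```

Because the bar follows the responder chain, a view controller like this needs little other plumbing, which makes it plausible that a partner could wire up a first pass in days rather than months.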

Touch ID Impact or 3D Touch Impact?
What’s not clear to me yet is whether the Touch Bar is one of those new features that will instantly resonate with customers and become a part of their daily lives, or whether it is merely an interesting technology that makes for a great demo but never really takes off in common use. A good example of the former is Apple’s introduction of Touch ID on the iPhone (now also available on the MacBook Pros with Touch Bar). That technology fundamentally changed the way the vast majority of iPhone users interact with their phone every single time they pick it up. An example of the latter is 3D Touch, an interesting technology I often forget is on my phone unless I accidentally trigger it. 3D Touch may eventually become an integral part of the iPhone interface but, right now, it doesn’t feel like most people see it that way. It’s too soon to tell which way the Touch Bar will go.

One thing is clear: Apple sees it as a feature some customers will pay to have, as the 13-inch MacBook Pro with Touch Bar carries a roughly $300 premium over a comparable model without it (note: the Touch Bar model also has a better CPU). After a brief hands-on, the Touch Bar feels to me like an important refinement to a tried-and-true interface. I’m not sure yet whether it’s better or worse than a touch screen, but I look forward to testing the hardware in the coming weeks to see how it affects my usage. And I’ll be watching closely to see which developers embrace the technology and which do not.

Published by

Tom Mainelli

Tom Mainelli has covered the technology industry since 1995. He manages IDC's Devices and Displays group, which covers a broad range of hardware categories including PCs, tablets, smartphones, thin clients, displays, and wearables. He works closely with tech companies, industry contacts, and other analysts to provide in-depth insight and analysis on the always-evolving market of endpoint devices and their related services. In addition to overseeing the collection of historical shipment data and the forecasting of shipment trends in cooperation with IDC's Tracker organization, he also heads up numerous primary research initiatives at IDC. Chief among them is the fielding and analysis of IDC's influential, multi-country Consumer and Commercial PC, Tablet, and Smartphone Buyer Surveys. Mainelli is also driving new research at IDC around the technologies of augmented and virtual reality.

25 thoughts on “With Touch Bar, Apple Again Puts Faith in Third-Party Developers”

  1. Configurable + labelled Fn keys seem a nice incremental update for devices that aren’t designed for dual use, except for the few apps that heavily use the current Fn keys (is WordPerfect still around? ;-p).
    What I’m wondering is whether the Touch+Display part wouldn’t make more sense built into the trackpad. It’s a large surface, arguably with a more convenient form factor; it’s already there, the thumbs are already hovering over it…
    I do think devices should be dual-use, with a 360 hinge or detachable screen, but that’s another discussion, maybe…

    1. The trackpads on the new MacBook Pros are huge. So huge that you would probably have your palms resting on half of them while typing (which probably means that Apple used some clever UI to ignore inadvertent touches).

      You would be hiding the buttons most of the time if you put them on the trackpad. I don’t think that’s a good idea.

  2. It’s interesting that when you explain how you like and use touch screens, you always talk only about scrolling. Is scrolling the only way you use touch screens, or do you actually use them to edit cells in Excel, choose icons in Photoshop, copy and paste phrases in MS Word, etc.?

    My point is, the problem with touch in general is that fingers are fat. Much fatter than pens and the mouse cursor. Desktop UIs were developed with these fine pointing tools in mind, and the click targets are not designed to be large enough for your fat fingers. It’s OK when you’re scrolling, but for anything more, your fingers aren’t the best choice.

    The touch bar, interestingly, introduces a UI that is designed for fat fingers. It’s designed for touch. I’m not sure whether it’s going to become an important UI element, but I think it’s wrong to compare it to touching the screen. It’s probably going to be a completely different experience, utilised totally differently within apps.

    1. “It’s OK when you’re scrolling, but for anything more, your fingers aren’t the best choice.”

      I guess that depends on what you’re doing. On proprietary computer devices (like lighting and audio consoles for live performances), the best operators can fly through operations on touch screens faster than with keypad entry alone. In tandem, keyboard and touch screen make quite a formidable workspace. I imagine the same would be true for anyone who becomes proficient in a software-specific screen layout on a PC.

      It is an additional learning curve, but I imagine with all the secondary training people are receiving with their mobile devices this is not as large an obstacle as it was prior to 2007, fat fingers and all.

      Frankly, I don’t see how a touch bar is any more immune to fat fingers than a screen.

      Joe

      1. Yes, but in that case you are talking about a UI that was designed for touch. In the case of regular PC software, you will be working with a UI that was originally designed for a mouse. Take the typical Adobe UI, for example, with its tiny icons/buttons.

        With a touch bar, the UI would be designed for fat fingers from the beginning, even if the rest of the UI was designed for mouse input.

        1. In terms of the PC, it’s not like anyone is getting rid of the mouse or trackpad, so the smallest of selectable elements are not without recourse.

          Also, I would imagine (but who would know for sure) that the SDK would allow some control over what is touchable and what is mouseable.

          Also, I imagine that the (needed) new UI would take fat fingers into account, either at the OS level or in the software itself.

          And speaking of Photoshop, I would imagine the Pencil would be as suitable a device on a PC as on an iPad.

          I can’t imagine the touch bar being any easier or more difficult than iOS already is, especially in apps like Music and Facebook, where somehow the text is the button, or in the tiny toolbar across the top these days with its tiny arrow or “Back to Mail” text. Even made-for-touch UI has its issues.

          I just don’t think it is as hard as people are trying to make it, particularly with the advent of mobile devices.

          Also, the platforms I mentioned earlier are usually two or three monitor systems, which is where a mouse gets really annoying and the touch screen is far more direct.

          What I think is the greatest obstacle is for developers to figure out whether it is worth their time or resources. Adobe obviously thinks so. I think it would transform spreadsheets. I wish it would transform CAD, which needs it the most. CAD is annoying as hell even with a mouse and keyboard, until you learn (if you learn) the keyboard shortcuts.

          I also think actual usability research would be needed to determine which commands are best where. Muscle memory will be hard to get around once something is set. People can play musical instruments with great dexterity because nothing changes on them, except by the user where possible.

          Joe

          1. “CAD is annoying as hell even with a mouse and keyboard, until you learn (if you learn) the keyboard shortcuts.”

            That’s probably it. I may be wrong, but I think the best way to look at the touch bar is as a glorified, contextual keyboard-shortcut system.

          2. For me, I’d rather see it on an external full-size keyboard. I literally use my laptop as a desktop when I am doing CAD work: I have the laptop closed and connected to an external 24″ monitor, keyboard, and mouse. I don’t know anyone who sets up their workspace the way they showed at the keynote.

            Joe

      1. Hmm. I would think that would drive touch typists nuts. Not being one myself, I can only speculate.

        [eta: I can see it now, “Add a keyboard to your laptop with the laptop keyboard case!”]

        Joe

        1. Generally speaking, I agree. But I’m assuming Apple would implement a Force Touch / haptic capability for feedback.

        2. There might be a way to reach “good enough” via the vibrating feedback like they have on the Apple Watch? And a couple of bumps on the home keys?

          Plus, are there any touch typists left? And in 20 years? Do they matter?

          1. In my circles, not being a touch typist (either by training or sheer repetition), I seem to be the anomaly. Although I can type lightning fast, I still need to look at the keyboard.

            I remember seeing an Apple patent a long time ago that had tiny posts under the screen that would pop up depending on something, I can’t even remember what. At the time I thought the potential for a braille keyboard on a touch screen would be fascinating!

            Joe

        3. And the “add a keyboard” idea kind of makes sense: have something like the Lenovo has (two flat surfaces), but with the second one more versatile, not just a flat keyboard but a full screen + input area, with an optional extra slice of real keyboard you can clip onto it.

  3. You say “stubborn”. I’m sure Apple would say “courageous”. I agree with “stubborn”. Although I disagree with Obarthelemy below about the need for the laptop to convert to a tablet form via 360-degree hinges or a detachable screen; I think the laptop can add a touch screen without becoming a tablet. But as you point out, it is up to the creativity of developers to figure out how to accomplish this.

    And if history shows us anything, it shows us that people making business decisions are averse to creativity and to coming up with new ideas, or even with new ways to do the same things. That is too risky. Have any of the touch screens on non-Apple PCs made any inroads in new thinking for long-time developers? Will a touch bar help them with baby steps? I doubt it. Of those who do take advantage of the touch bar (beyond gaming developers), I doubt many will do anything beyond what they already did with mapping F keys.

    It’s going to take a new generation of developers, to whom a touch interface is their primary way of interacting with computing devices, to take advantage of any of this. Vectorworks is not going to dilute or fork its cash cow by bifurcating its programmers’ focus. And its users will have to suck it up, no matter how much we may wish they would think different.

    Joe

    1. It’s not quite a need, but why *not* do it? It adds <$50 to the BOM, doubles the use cases, and has no adverse impact on anything.

  4. I just realized the touch bar is kind of like a hardware “Ribbon”, MS’s mostly disliked UI from a few years back. Surprised nobody mentioned that earlier.

    1. Perhaps because most people realise the obvious differences. For example, the window doesn’t lose productive space, and both hands can be employed in direct manipulation of on-screen items at the same time. And perhaps because using the Ribbon was never compared to “playing an instrument”.

      1. I think that comparison binds only those who made it ^^

        I agree about the gained screen space, though I’d rather have a slightly bigger screen (and less bezel). I disagree with “both hands is better”: my hands are already on the keyboard and mouse/trackpad, and the last thing they need is a third place to be (whether screen or ribbon), unless I get a graft. I dislike reaching for the function row, and I think I’d dislike reaching for a touch ribbon too (hence my trackpad suggestion).

        1. I think the comparison is valid and I actually had that in mind as well.

          Microsoft has actually pioneered contextually aware UIs. The contextual menu that came up with a right click was innovative for its time and is now standard in both Windows and Mac apps. The Ribbon interface is also contextual, and although others may not use the same design, Apple’s iWork apps also have contextual control panes on the right-hand side of the window.

          With complex apps, contextual menus and control panes are the way to go. Lots of apps use these and it’s a standard idea now. Whether you can make it good or not is a matter of implementation, and at least I can say that Apple has a good bit of experience with these.

          So yes, they are similar in concept, but that’s probably not important. What matters today is the implementation.

        2. “my hands are already on the keyboard and mouse/trackpad, the last thing they need is a 3rd place to be (whether screen or ribbon)”

          I said “direct manipulation of on-screen items”. I agree that when the touch bar is only providing function keys, you have a point. But the demo clearly showed the user dynamically affecting the items the trackpad was interacting with.

          In that case, both hands are better than one, because you don’t need to use the pointing hand (mouse or trackpad) to go to an onscreen control panel and change the tool or parameters of the tool that you are using. You continue the flow of what you are doing with the pointing/dragging hand.
