The Next Evolution In User Interfaces

With the iPhone, Apple introduced the touch UI and changed the way people interact with their smartphones. When it extended the touch UI to the iPad, it set in motion an industry stampede to create PCs, laptops, tablets, and smartphones with touch-based interfaces. In the world of technology, this was a real milestone: for decades we navigated our PCs with a keyboard, mouse, or trackpad. While Apple was not the first to bring touch to tablets or smartphones, it clearly gets credit for commercializing it and making it the de facto standard for next-generation user interfaces.

But two products released recently, both of which I have tested, give us an early glimpse of the next evolution in user interfaces. They could prove just as groundbreaking as the graphical user interface and the touch UIs in the market today.

Touch Freedom

The first comes from a pair of gesture features in the new Samsung Galaxy S4 smartphone. One is called Air View. In the S4's email application, you can simply "hover" your finger over a message and the subject line and first two or three lines of the email pop up, overlaid on the email list, so you can see at a glance what the message is about and decide whether to read it or move on to the next one. Air View works only in the email app for now, but the software community will likely soon get the tools to use it in other apps. This gesture alone is a game changer: it takes limited information on a small screen and blows it up in context, so to speak, so you can learn more about the item you are looking at.

The second feature is just as cool. It is called Air Gesture. Have you ever been cooking from a recipe, gotten your hands dirty, and needed to flip to the second page for the rest of the details? With Air Gesture, you just wave your hand in front of the screen and it moves to the next page, without your ever touching it. I often take my tablet with me to restaurants when I am alone on the road and catch up on the day's news, or even read a magazine or book while chowing down. My hands are often full with knife and fork, and today I actually use my knuckle to touch the screen to open or turn a page.
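For the curious, the core of a wave-to-turn-the-page gesture like this can be sketched in a few lines. This is purely an illustrative assumption of how such a feature might work, not Samsung's actual implementation: it assumes the device's proximity sensor reports a stream of normalized hand positions, and classifies a large enough horizontal sweep as a page turn.

```python
# Hypothetical sketch of "wave to turn the page" detection. The sensor
# input format and the threshold value are illustrative assumptions,
# not Samsung's actual Air Gesture implementation.

def detect_wave(x_positions, min_travel=0.3):
    """Classify a sampled hand motion as a page-turn wave.

    x_positions: hand x-coordinates normalized to [0, 1], oldest first.
    min_travel:  minimum horizontal sweep to count as a deliberate wave
                 (filters out small incidental hand movements).

    Returns "next" for a left-to-right sweep, "prev" for right-to-left,
    or None if the hand did not travel far enough.
    """
    if len(x_positions) < 2:
        return None
    travel = x_positions[-1] - x_positions[0]
    if travel >= min_travel:
        return "next"   # left-to-right sweep: advance a page
    if travel <= -min_travel:
        return "prev"   # right-to-left sweep: go back a page
    return None
```

For example, a hand sampled at positions 0.2, 0.4, 0.7 sweeps 0.5 of the field of view left to right and registers as "next", while a small jitter from 0.5 to 0.52 is ignored.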

To be fair, Microsoft has had gesture-based user interfaces on the Xbox for almost two years, but to date they have been designed only for the game console and have not transferred over to PCs or mobile devices. These two features on the Galaxy S4 represent the first major shift toward making gestures an integral part of a mobile UI. While they are only on the S4 today, I'm sure they will eventually find their way to Samsung's Galaxy tablets, perhaps later this year.

The other gesture-based technology introduced recently comes from Leap Motion. This pad-like device sits between the keyboard and the monitor of a PC and turns Windows into a gesture-based UI for supporting software; it can also be used with a laptop via a USB dongle, with the device sitting in front of the laptop's keyboard. Leap Motion has seeded over 10,000 developers with SDKs to make their apps work with its Leap Motion Controller, so after the device ships this summer we should start to see a good number of Leap Motion-enabled apps later this year. HP considers this important enough that it recently struck a major deal with Leap Motion and has committed to using the technology in its products in the future.
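Gesture SDKs of this kind typically expose a listener pattern: the device streams "frames" of tracked hand data and the application reacts to each frame. The sketch below illustrates that pattern in miniature; every name in it (Frame, GestureController, on_frame) is a hypothetical stand-in, not the actual Leap Motion API.

```python
# Illustrative sketch of the listener pattern a gesture SDK commonly
# exposes. All names here (Frame, GestureController, on_frame) are
# hypothetical assumptions, not the real Leap Motion SDK.

from dataclasses import dataclass

@dataclass
class Frame:
    hand_count: int   # number of hands currently in view
    palm_x: float     # palm position, normalized to [0, 1]

class PageTurner:
    """App-side listener: turns a page when a hand crosses the
    right edge of the sensed field."""
    def __init__(self):
        self.page = 1

    def on_frame(self, frame: Frame):
        # Called once per tracked frame by the controller.
        if frame.hand_count > 0 and frame.palm_x > 0.9:
            self.page += 1

class GestureController:
    """Stand-in for the device driver: delivers frames to a listener."""
    def __init__(self, listener):
        self.listener = listener

    def feed(self, frames):
        for f in frames:
            self.listener.on_frame(f)
```

The appeal of this design is that app developers never deal with raw sensor data; they subscribe a listener and interpret whatever per-frame hand state the SDK delivers.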

As with the Kinect, what is appealing about Leap Motion is the way you can interact with a game in 3D: just use your hands as the controller, or use them to manipulate 3D objects on screen. With support from the software community, you can imagine eventually waving your hand to turn Web pages or using your hands to mold pottery on the screen. The key point is that Leap Motion's technology is an enabler; once the software community gets behind it, it could become the next major step in making the user interface friendlier and even easier to use than it is today.

The reality is that Apple, Microsoft, Intel, and others are all working on gesture-based UI technology and believe that gestures represent the next significant evolution in computing interfaces. In fact, Intel has a human-factors project around gestures, and while not much is known about it, I would not be surprised to see the controller for gesture UIs become part of the SoC in the future.

While many had hoped voice would be the next big thing in user interfaces, a lot of work remains to bring it into mainstream computing. I have no doubt that voice commands, like those used with HAL in 2001: A Space Odyssey, will eventually be the main way we interact with computers. For now, though, the next evolution will be gesture-based. The technology in Samsung's Galaxy S4 smartphone and in Leap Motion's controller will most likely help define how gestures soon become a major part of the interface on all of our computing devices.

Published by

Tim Bajarin

Tim Bajarin is the President of Creative Strategies, Inc. He is recognized as one of the leading industry consultants, analysts and futurists covering the field of personal computers and consumer technology. Mr. Bajarin has been with Creative Strategies since 1981 and has served as a consultant to most of the leading hardware and software vendors in the industry including IBM, Apple, Xerox, Compaq, Dell, AT&T, Microsoft, Polaroid, Lotus, Epson, Toshiba and numerous others.

6 thoughts on “The Next Evolution In User Interfaces”

  1. I agree that real-space gestures are a key part of where interface design is going, but it is going to take someone with Apple's historic ability to make them a reality. Icons and touch screens were around before Apple made them work, and the same is true of non-touch gestures. What is needed is someone to build them into an integrated, compelling system. MSFT had the lead with Kinect, but they are still struggling to integrate touch into the computer (in many ways Windows 8 is more like the touchpad-based OS X than iOS).

  2. Gestures are nice, but I think the next important phase in UI is passive UIs. For example, how about a thermostat that just “knows” you’re hot and turns on the AC? Why gesture to set the temperature when the temperature can set itself?

  3. FYI – “Gilt Taste”, an iPad app, has had the Air Gesture functionality for a few years now – you wave your hand in front of your iPad and you can turn the page of a recipe that you are making. I don’t know if it’s based on an Apple API or if the developers found some way to do it on their own.
