The Missing Link in VR and AR

on May 14, 2018
Reading Time: 4 minutes

VR and AR are big buzzwords in the world of tech these days. At Tech.pinions we have been covering these technologies for over five years and have shared solid perspectives on significant AR and VR products when we feel they move the technology forward.

Everyone on our team has tried out or tested most of the AR and VR products on the market today, and at least in my case, I only see their value at the moment in vertical markets. This is especially true for VR. Apple and Google have tried to bring AR to a broader audience, but here too, AR delivered on a smartphone is still a novelty, most acceptable in games like Pokemon Go and in some vertical markets.

In multiple columns over the last year, I have shared my excitement for AR, especially after seeing some cool AR applications in the works that should be out by the end of the year. Although they are still delivered via a smartphone, ARKit and ARCore are giving software developers the tools to innovate on iOS and Android, and in that sense I see the possibility of broader interest in AR later this year. I also expect Apple to make AR one of the highlights of its upcoming developer conference in early June.

However, I feel the most effective way to deliver AR will be through some form of mixed reality glasses. While the intelligence to power these AR glasses may still come from the smartphone, the glasses will be an extension of the smartphone's screen and deliver a better way to view AR content than a smartphone screen alone can provide.

I see glasses as the next evolution of the man-machine interface and a technology that will be extremely important to billions of people over the next ten years. In my recent Fast Company column, I shared how I believe Apple will tackle the AR opportunity and how it could be the company that defines the AR-based glasses market.

But if you have used any of the VR or mixed reality headsets or glasses available so far, you understand that interacting with the current models is difficult when you have to use a joystick or handheld wand to trigger the features or actions in a given VR or AR application. Even more frustrating, these handheld interfaces do not yet deliver pinpoint precision, which often makes it difficult to activate an AR or VR application's functions.

I believe there are three high hurdles to clear before AR is valuable to and accepted by mass-market users. The first is creating glasses or eyewear that are both fashionable and functional. Today's VR and AR glasses or goggles make anyone who wears them look like a nerd. In our surveys, this type of eyewear is panned by the people we have talked to about what is acceptable to wear for long periods of time.

The second significant hurdle will be how the wireless technology in smartphones is designed to communicate with what I call "skinny glasses," in which the glasses rely almost entirely on the smartphone for their intelligence. Getting the wireless connections right and extending the smartphone's functions and intelligence to these glasses will be difficult but critical if we want AR glasses that people will actually wear without standing out as tech dweebs.

But the missing link that gets little attention when we talk about VR and AR is the way we will interact with these glasses to get the kinds of functions we want and need to make these headsets valuable. Undoubtedly, voice commands will be part of the interface solution, but there are too many occasions where calling out commands will not be acceptable, such as in a meeting, at church or a concert, or in a class, to name just a few.

Indeed, we will need other ways to activate applications and interact with these glasses, which will most likely include gestures, object recognition via sensors, and virtual gloves or hand signals such as those Magic Leap created to navigate its specialized mixed reality headset.

However, I believe this is an area ripe for innovation. For example, a company called Tap just introduced a Bluetooth device that fits over four fingers and lets you tap out actual words and characters as a way to input data into existing applications such as Word, or eventually into virtual applications on a mixed reality headset.

The folks from Tap came by and gave me a demo of the product, and I found it very interesting. There is a real learning curve involved in learning to tap out the proper letters and punctuation marks, but they have great teaching videos as well as a teaching game to help a person master this unique input system. Check out the link I shared above to see how it works. They are already selling thousands to vision-impaired folks and others for whom a virtual keyboard like Tap is needed for a specific app or function.

But after seeing Tap, I realized that creating a powerful way to interact with AR apps on glasses should not be limited to joysticks, virtual gloves, voice commands, or gestures. This missing link needs out-of-the-box thinking like Tap's. Hopefully, we will see many other innovations in this space as tech companies eventually deliver mixed reality glasses that are acceptable to all users and drive the next big thing in man-machine interfaces.