This week Amazon and Microsoft announced the rollout of Alexa and Cortana integration. First discussed publicly one year ago, the collaboration represents an important step forward for smart assistants today and voice as an interface in the future. I’ve been using Alexa to connect to Cortana, and Cortana to connect to Alexa, and while it’s clearly still in the earliest stages of development, it generally works pretty well. The fact that these two companies are working together—and other notables in the category are not—could offer crucial clues about the ways this all plays out over time.
Cortana, Meet Alexa
Enabling the two assistants to talk to each other is straightforward assuming you’re already using both individually. You enable the Cortana skill in the Alexa app and sign into your Microsoft account. Next, you enable Alexa on Cortana and sign into your Amazon account. To engage the “visiting” assistant, you ask the resident one to open the other. So you ask Alexa to “open Cortana” and Cortana to “open Alexa.” In my limited time using the two, I found that accessing Cortana via Alexa on my Echo speaker seemed to work better than accessing Alexa via Cortana on my notebook. Your mileage may vary.
One of the biggest issues right now is that it gets quite cumbersome asking one assistant to open the other so that you can then ask that assistant to do something for you. One of the reasons Alexa has gained such a strong following—and is the dominant smart assistant in our home (four Dots, two Echos, and two Fire tablets and counting)—is that it typically just works. The reason it just works is that Amazon has done a fantastic job of training us Echo users to engage Alexa the right way. It’s done this by sending out weekly emails that detail updates to existing skills as well as introduce new ones. Alexa hasn’t so much learned how we humans want to interact with her. Instead, we’ve adapted to the way she needs us to interact with her.
The issue with accessing Alexa through Cortana is that we lose that simplicity. I found myself trying to remember how I needed to engage Alexa while talking to the microphone on my notebook (Cortana). The muscle memory I’ve built around using Alexa kept getting short-circuited when I tried to access it through Cortana. I suspect this will self-correct with increased usage, but it’s obviously an issue today.
That said, even at this early stage, the potential around this collaboration is clear and powerful.
Blurring of Work and Home
We all know that the lines between our work lives and home lives are less clear than ever before. Most of us use a combination of personal and work devices throughout the day, accessing both commercial and consumer apps and services. But when it comes to smart assistants, the lines between home and work have remained largely unblurred. As a result, today Amazon has a strong grip on the things I do at home, from setting timers to listening to music to accessing smart-home devices such as connected lightbulbs, thermostats, and security systems. But Alexa knows very little about my work life. Here, I’d argue, Microsoft rules, as my company uses Office 365, and Cortana can tap into my Outlook email and calendar, Skype, and LinkedIn, among other things.
During my testing, I did things such as ask Alexa to open Cortana and check my most recent Outlook emails, or to access my calendar and read off the meetings scheduled for the next day. Conversely, I asked Cortana to open Alexa and check the setting of my Ecobee smart thermostat and to turn on my Philips Hue lights.
Probably the biggest challenge around this collaboration, once we get past the speed bump of asking one assistant to open another, is the need to discern individual users and then address their privacy and security requirements when working across assistants. Now that I’ve personally linked Alexa and Cortana, anyone in my house can ask Alexa to open Cortana and read off the work emails that previously were accessible only through Cortana (on a password-secured notebook). That’s a security hole they need to fill, and soon. The most obvious way to do this is for each of these assistants to recognize when I am asking for something versus when other members of my household (or visitors) are doing it.
Will Apple, Google, and Samsung Follow?
It makes abundant sense for Amazon and Microsoft to be first into the pool on this level of collaboration. While the two companies obviously compete in many markets, Cortana and Alexa represent an area where I’d argue both sides win by working together. I look forward to seeing where the two take this integration over the next few years.
But what about the other big players? Among the other three serving primarily English-speaking markets, I could imagine Samsung seeing a strong reason to cooperate with others. Its Bixby trails the others in terms of capabilities, but the company’s hardware installed base is substantial. At present, however, it seems less likely that either Apple with Siri or Google with Google Assistant would be interested in joining forces with others. With a strong position on the devices most people have with them day and night (smartphones), both undoubtedly see little reason to extend an olive branch to the competition. Near term, this might be the right decision from a business perspective. But longer term, I’m concerned it will slow progress in the space and lead to high levels of frustration among users who would like to see all of these smart assistants working together.