Privacy is Complicated

on May 9, 2019
Reading Time: 5 minutes

It has been a busy couple of weeks for privacy, and I am sure it will continue to be so for a while. We started last week with Mark Zuckerberg at F8 saying: “The future is private.” Then on Monday, at Microsoft Build, Satya Nadella said: “Privacy is a human right,” echoing the words that Tim Cook at Apple has been using for quite some time. On Tuesday, at Google I/O, Sundar Pichai said that “Privacy and security are for everyone, not just a few.” Different pledges to offer more privacy and security to users, but with varying degrees of delivery.

Business Models and Core Competencies

Why all this focus on privacy now? We have to thank Facebook for it. Privacy has always been important, but the escalation we have seen, both in the amount of time tech companies spend addressing this topic and in the scrutiny they are under from governments and regulators (which of course are intertwined), started with the Cambridge Analytica debacle.

The different responses tech companies have to privacy are highly dependent on their business model. It is the business model that created a split between Microsoft and Apple, who monetize their products, on one side and Google and Facebook, who monetize advertising, on the other. But this split became less clear this week as Google’s focus on privacy materialized in concrete steps to give users more control over their data.

Pichai said on stage: “We always want to do more for users but do it with less data over time.” If you are skeptical about this, you can look at Google’s track record and see that they have changed their ways over time. In 2014, Google stopped tracking email content for ad targeting in the student version of Gmail. In 2016, they stopped scanning emails in Gmail before they hit the inbox, and in 2017 they stopped doing so altogether. Of course, it would be disingenuous not to point out that the first two changes were in response to lawsuits, but the last one was driven by a business model change that came from Google moving into the enterprise with G Suite.

Another point of confidence on my part comes from the core competence that Google has in AI. Pichai spoke about how AI plays a role in enhancing users’ privacy and then moved on to talk about federated learning. While federated learning is important, and something Google first talked about in 2017, I ultimately think what makes more of a difference is that their AI models have benefitted from vast amounts of data and have learned what matters and what does not. They have also learned how to use data more efficiently. Put this way, it sounds a little less altruistic than depicted on stage, but the benefit to the consumer remains.

The rat and the dent

One criticism leveled at Google after Tuesday’s keynote is that the added focus on privacy puts the burden on the users rather than the responsibility on the services and technology that Google offers. It is up to the user to go and change the settings across Google’s apps so that their data is not tracked or stored. Credit to Google, they did make it easier to find your settings and change them. But, even so, most users will not bother.

Most users won’t bother changing settings for two reasons. First, consumers see the value that Google having their data brings to their experience. Pichai even said it on stage: “data makes your experience better.” The other reason relates to something that Microsoft CVP Julia Liuson called, earlier in the week, the rat and the dent syndrome. If you have a dent on your phone, you are unlikely to do anything about it, although you might complain about it all the time. But if you find a rat in your home, you will do something right away. I think lack of privacy for many consumers is a dent, not a rat. They complain about it, but when given a chance to do something about it, they will likely pass on the opportunity, especially if it impacts their convenience.

It goes without saying that this dent syndrome favors Google: they provide the tools that make them compliant with regulations, but they are unlikely to see an impact on the amount of data consumers share. Of course, the dent can quickly turn into a rat the moment you are caught doing something wrong, and doing so intentionally, as the backlash against Facebook has clearly illustrated.

Competitive Advantage

From a pure marketing perspective, it is clear to me that talking about privacy as a competitive advantage is going to be more complicated. The conversation might have to shift from “we care about your privacy” to explaining why the business model a company is built on allows it to put privacy first for its customers, but also how that same business model may make it impossible to deliver that level of privacy to everybody. This is at the core of what Pichai wrote in his New York Times article this week:

“Privacy cannot be a luxury good offered only to people who can afford to buy premium products and services. Privacy must be equally available to everyone in the world.”

A statement that was aimed at Apple, the same way his keynote comment was:

“So far, we’ve talked about building a more helpful Google; it’s equally important to us that we do this for everyone, for everyone is a core philosophy for us at Google. That’s why from the earliest days, search works the same whether you’re a professor at Stanford or a student in rural Indonesia. It’s why we build affordable laptops for classrooms everywhere. And it’s why we care about the experience on low-cost phones in countries where users are just starting to come online, with the same passion as we do with premium phones.”

But helpful Google also fits well with their business model. Putting advertising aside for a moment, it is evident that if you are in the services business, you want to reach as many users as possible with your solutions, which is what Google is doing. It will be interesting to see what changes will come from Apple’s move into services, not so much on privacy, as Apple has already made clear it will not track what you read or watch through its services, but on device reach.

Business models also get caught up in the “doing good, being helpful, advocating for the people” marketing message. Microsoft was the first brand to strongly advocate for ethical AI and technology that empowers all people. Google this week used similar talking points:

“And it goes beyond our products and services. It’s why we offer free training and tools to grow with Google, helping people grow their skills, find jobs, and build their businesses. And it’s how we develop our technology, ensuring the responsible development of AI, privacy, and security that works for everyone, and products that are accessible at their core. Let’s start with building AI for everyone.”

Yet both companies have faced criticism for providing their technology to governments and helping with the surveillance of the very people they want their technology to help.

It is indeed complicated. When it comes to privacy, security, and ethics, between the black and white of right and wrong, there seem to be so many shades of gray that companies can use to position their business. Marketing aside, however, consumers’ decisions on whom they trust will be driven by both rational and irrational components. The intent companies demonstrate in putting users first, or in slipping on their promises, as well as the value consumers get from the technology and services these companies provide, will both play a role in whom they trust.