Apple and Parental Control Apps
On Friday the New York Times published an article on how Apple was cracking down on parental control apps, arguing that the company wants to limit competition and steer users toward its own Screen Time feature. The article also puts forward the theory that Screen Time's much more permissive approach does not reduce device engagement, and that less engagement is, according to the New York Times, ultimately not something Apple wants. So let's start here with two points:
- Screen Time is not monetized by Apple, which makes it hard to frame the removal of these apps from the App Store as an anti-competitive move. If anything, Apple earns revenue from third-party parental control apps, as it does from all apps in the store.
- The argument that Apple does not want users to spend less time with their devices is also flawed. For both the user and Apple, what matters is not how much time one spends with the device but the quality of that engagement. Someone spending two hours watching a movie bought from the TV app drives more value for Apple than someone spending eight hours chatting on Facebook Messenger.
After the article was published, Apple clarified in a statement that some parental control apps were removed from the App Store because of their use of Mobile Device Management (MDM). For those who have not worked in a large corporate environment, MDM is a solution widely used by IT departments to control and manage employees' devices. The level of access these tools give an IT department is deep and broad, but it is directed at the assets, not the users. MDM gives a third party control and access over a device and its most sensitive information, including user location, app use, email accounts, camera permissions, and browsing history.
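To make that scope concrete, here is a minimal sketch of what a restrictions payload inside an Apple configuration profile looks like. The identifiers and UUIDs are placeholders I have invented for illustration; `allowCamera` is one of the documented restriction keys in Apple's `com.apple.applicationaccess` payload, and many more restrictions exist alongside it:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>PayloadType</key>
    <string>Configuration</string>
    <key>PayloadIdentifier</key>
    <!-- placeholder identifier, not a real product -->
    <string>com.example.restrictions</string>
    <key>PayloadUUID</key>
    <string>00000000-0000-0000-0000-000000000001</string>
    <key>PayloadVersion</key>
    <integer>1</integer>
    <key>PayloadContent</key>
    <array>
        <dict>
            <!-- the restrictions payload type -->
            <key>PayloadType</key>
            <string>com.apple.applicationaccess</string>
            <key>PayloadIdentifier</key>
            <string>com.example.restrictions.rules</string>
            <key>PayloadUUID</key>
            <string>00000000-0000-0000-0000-000000000002</string>
            <key>PayloadVersion</key>
            <integer>1</integer>
            <!-- disable the camera device-wide -->
            <key>allowCamera</key>
            <false/>
        </dict>
    </array>
</dict>
</plist>
```

Once a profile like this is delivered through MDM, the restriction applies device-wide and cannot be lifted by the user, which is exactly why handing this channel to a consumer app developer raises the concerns Apple describes.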
Because of this level of access, Apple started to look into MDM use in consumer apps and updated its guidelines in mid-2017. Here is what Apple said:
“MDM does have legitimate uses. Businesses will sometimes install MDM on enterprise devices to keep better control over proprietary data and hardware. But it is incredibly risky—and a clear violation of App Store policies—for a private, consumer-focused app business to install MDM control over a customer’s device. Beyond the control that the app itself can exert over the user’s device, research has shown that MDM profiles could be used by hackers to gain access for malicious purposes.
Parents shouldn’t have to trade their fears of their children’s device usage for risks to privacy and security, and the App Store should not be a platform to force this choice. No one, except you, should have unrestricted access to manage your child’s device.”
As a parent, this last sentence is key to the argument. No one, except the father of my child and me, should have such a level of access to my child through her device. And with MDM, that access extends to the developer as well. This is very different from the IT manager in an organization, who has been trained, vetted, and is kept in check by an organization whose interest is in accessing company assets and data, not personal data. As a matter of fact, there are many warnings against allowing MDM tools on personal devices, because in terms of what can and cannot be accessed, such a device effectively ceases to be yours and becomes a company device.
Let me be clear: the point is not about me, as a parent, judging other parents who use these apps. As a parent, I know we are all as different as our kids and situations are. The point is that I am concerned about how these apps could be misused, directly or indirectly, without parents knowing about it. In the spirit of all situations being different and unique, I also cannot help but think that some cases do not fit a "happy family" scenario, which creates complexity in how the data could be used and manipulated by one of the parents. What if you are in an abusive environment, you rely on WhatsApp to communicate, and that lifeline is taken away because the app is disabled? You might think I am paranoid, but sadly realities like this exist, and not considering such a scenario would be irresponsible. So I welcome Apple's decision to take a more cautious approach and work with these developers to find alternatives that, while possibly giving parents less control, also keep the safety and security of the kids in mind. Mute, one of the apps covered in the TechCrunch story announcing its shutdown, actually worked with Apple following the publication of the article, and as a result, the app is still available in the store.
Room for Improvement
The fact that Apple had been reassessing parental control apps first surfaced about five months ago in a TechCrunch article whose author also questioned Apple's decision in relation to the launch of Screen Time. It seems to me that, aside from the higher level of scrutiny on Apple and the current argument that the App Store is anti-competitive in nature, there is a lack of understanding of what the developer guidelines are, and we only hear about them when something happens.
In this case, the apps do not all seem to have infringed one specific rule but a variety of them, from background location tracking to having one app regulate another to the use of MDM. This makes it look like Apple is cracking down on parental control apps across the board using a selection of different rules.
I think it is fair to assume that the App Store would have seen an increase in parental control apps as the focus on screen addiction grew. It is also reasonable to think this happened around the time Screen Time on iOS and Digital Wellbeing on Android were released. Apple updated its guidelines, but as is often the case, this did not make news because it impacts developers rather than consumers. I would argue that, given the sensitive nature of these apps, Apple could have provided some context, perhaps on its newly designed page for families, to help parents figure out which tools to use and to highlight how these decisions are linked to security and privacy.
Being a parent in this day and age is not easy, but I have said this before and will keep saying it: technology should help us do our job, not do our job for us. This may be why, when other parents ask me to recommend tools or an approach to manage their kids' tech usage, I am always hesitant. What works for me might not work for someone else, and this is as true of the tech my kid is allowed to use as it is of the books she can read, the movies she can watch, or the music she can listen to. I wish something as simple as age could help, but even that is an arbitrary measure that assumes all, say, 11-year-olds have the same level of maturity to understand or experience something. So age is far from the only yardstick we should use before applying our judgment, and we cannot expect any one app to replace our role as engaged parents.