Following a hue and cry from a coalition of parental control app developers last week, it looks like Apple has backed down and loosened the restrictions that would block parental controls apps from using the Mobile Device Management (MDM) technology designed for use in managing corporate and organizational devices.
Earlier this year, it was revealed that Apple had been suddenly and seemingly capriciously cracking down on parental control apps, removing them from the App Store. After the news of the move came to light, the company went on the defensive, issuing a rare public statement explaining that the apps were putting kids at risk by relying on the relatively invasive device management technologies normally reserved for businesses and schools, which use them to manage the devices they own and issue to their employees and students.
While at least one popular developer spoke out against Apple’s statement, calling it misleading, Apple maintained its stance that the use of MDM features was dangerous and put children at risk from bad actors who could potentially abuse the level of control it gave the developers of those apps over kids’ devices.
Although Apple’s position on the use of MDM was a reasonable one in our opinion, some frustrated developers also accused the company of anti-competitive behaviour, suggesting that it was attempting to stifle competition from third-party apps in order to push users onto its own Screen Time feature, which debuted in iOS 12. Such complaints likely wouldn’t have held much merit — Apple makes no direct revenue whatsoever from Screen Time, and actually does make money from the 30 percent cut it takes from third-party apps sold on the App Store, meaning that in terms of raw dollars it has more to lose than to gain by blocking these apps — but they were filed with European and Russian regulators nonetheless, and would have been an extra thorn in Apple’s side at a time when the company is already facing antitrust investigations on other fronts.
Late last week, a coalition of developers, led by iPod patriarch Tony Fadell, called for Apple to commit to releasing a solution that would allow their apps to function. The premise was simply that if MDM is a bad and risky technology, then Apple needs to come up with a better way of handling this in the form of a new API specifically for parental control apps.
Following Apple’s WWDC keynote yesterday, the company quietly published a blog post on its site for developers highlighting several new App Store Guidelines. Buried within several updates was a new Guideline 5.5, regulating the use of MDM within apps, while also acknowledging that it can be used — in limited cases — by parental control and similar security apps.
Although the new guideline seems like an admission by Apple that the use of MDM isn’t necessarily the big evil thing that the company had previously maintained, it’s not entirely an about-face. In fact, if anything it’s a move by Apple to actually police the use of MDM by all applications, and specifically to ensure that the extensive data that can be gleaned through the technology isn’t misused.
Until now, Apple placed no restrictions at all on the use of MDM; any developer could create and submit an app that used the Mobile Device Management technology. Users of course still needed to accept the installation of MDM profiles on their devices, but apps could easily walk them through this process without their ever gaining a clear understanding of what they were agreeing to.
With the addition of Guideline 5.5, Apple has added a new special capability that developers must request, which will more clearly flag MDM apps and tell users what kind of controls and data they intend to access. Further, Apple makes it explicitly clear that MDM apps are prohibited from making use of any data obtained through MDM — the language is strong enough to suggest they’re not allowed to even look at the data, much less sell it or disclose it to third parties. Apple is also insisting that developers commit to this in their privacy policies.
Apple has also updated Guideline 5.4, which regulates the use of VPN apps — another category of technology previously used by parental control apps to filter browsing — adding the same restrictions.
Most notably, in both cases Apple also adds that the use of these technologies by parental control apps will be permitted only “in limited cases” and for “approved providers,” suggesting that Apple will likely work closely with individual developers to ensure that their apps use the technologies in a safe, secure, and privacy-focused manner.
In a statement to The New York Times, an Apple spokesperson suggested that Apple’s main concern over the use of parental control apps was the amount of data that they were capable of collecting.
“These apps were using an enterprise technology that provided them access to kids’ highly sensitive personal data. We do not think it is O.K. for any apps to help data companies track or optimize advertising of kids.”

Apple spokesperson
Other recent reports have suggested that this kind of tracking of kids is something Apple is working to limit on a larger scale, which is a laudable goal, and it’s clear that something needed to be done to ensure that developers of these kinds of apps exercise the necessary due diligence. Still, many developers are now questioning why Apple put them through the wringer over the past year only to end up at almost the same place where they began. Many developers have lost significant revenue while Apple has seemingly been figuring out its strategy for handling parental control apps on the fly, and despite yesterday’s near-reversal of the policy, it’s still unclear when and how these apps will be able to return to the App Store.