Can Apple Delete TikTok and Other Apps From Your iPhone?

Apple is notorious for exerting tight control over how it reviews apps before they’re made available for download from the App Store. For example, Apple rejected more than 1.6 million app submissions in 2022, nearly as many as the roughly 1.7 million apps available on the App Store at the time.

One of Apple’s lesser-known but significant capabilities is its ability to remotely delete an application from iPhones. Apple’s unilateral app review process has received plenty of criticism. Does the ability to remotely delete apps warrant the same level of scrutiny, particularly in the era of TikTok?

What Does Remote Deletion Mean for Users?

Apple’s ability to remotely delete or disable apps on iPhones isn’t widely advertised, but it’s a key part of the company’s strategy for keeping the platform secure and safe for its users.

This feature is designed as a safeguard: it lets Apple quickly eliminate threats from apps found to be malicious, to violate user privacy, or to otherwise break its strict App Store guidelines. From a security standpoint, this is a sensible proactive policy. If a malicious app slips through the initial screening process, the ability to remove it remotely lets Apple protect its users effectively.

To be clear, while Apple has removed many apps from the App Store over the years for various reasons, we’ve never seen evidence of Apple throwing the “kill switch” for an app distributed through the official App Store.

The few occasions where Apple has blocked apps after the fact all involved abuse of the company’s Enterprise Developer Program. This program lets businesses distribute internal apps to their employees, but others — including Facebook and Google — have abused its extended privileges for more insidious purposes, bypassing App Store policies or distributing invasive, spyware-like apps.

Apple has killed more than a few of these after the fact, but it’s done so by revoking the Enterprise Developer certificate entirely — a move that renders every app signed with that certificate inoperable, since those apps are no longer authorized to run on the iPhone.
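To see why revoking a single certificate knocks out many apps at once, here’s a minimal conceptual sketch in Swift. It’s purely illustrative: every type and check here is hypothetical and doesn’t reflect Apple’s actual signing internals, but it captures the one-certificate-to-many-apps relationship described above.

```swift
// Purely illustrative model of certificate-based launch gating.
// None of these types or names reflect Apple's actual internals.

struct Certificate {
    let identifier: String
    var isRevoked: Bool
}

struct App {
    let name: String
    let signingCertificate: Certificate
}

// Hypothetical launch gate: the system validates the certificate an app
// was signed with before letting it run. Revoking one enterprise
// certificate therefore disables every app signed with it in one stroke.
func canLaunch(_ app: App) -> Bool {
    !app.signingCertificate.isRevoked
}

// Two unrelated apps signed with the same (now revoked) certificate:
let revokedCert = Certificate(identifier: "ENTERPRISE-CERT-001", isRevoked: true)
let internalApp = App(name: "Internal Tools", signingCertificate: revokedCert)
let researchApp = App(name: "Research App", signingCertificate: revokedCert)

print(canLaunch(internalApp)) // false: both apps fail the same check
print(canLaunch(researchApp)) // false
```

Note that in this model the apps themselves are untouched; they simply fail the authorization check at launch, which matches how revoked enterprise apps behave in practice.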

That said, Apple does have the power to do this for any app, and it certainly reserves the right to use it. So far, however, nothing sinister enough has slipped through App Store review to justify removing or disabling an app on iPhones where it’s already installed.

Privacy and Legal Concerns

The idea that a third party can make changes to the contents of one’s iPhone without permission may seem unsettling to some. However, it’s worth noting that in the rare cases where Apple has exerted this control, the apps weren’t removed from end users’ iPhones; they were simply rendered inoperable by being de-authorized. The apps, along with all of their data and settings, remained on the iPhone. A kill switch for an App Store app would very likely work the same way. That’s a subtle but important difference.

Still, this capability sparks a broader debate about ownership and control of digital content after purchase or download. The same debate has raged for years over copy-protected media content such as music, movies, and TV shows purchased from stores like the iTunes Store, which could similarly be rendered unusable if their Digital Rights Management (DRM) authorizations were revoked. The legal and ethical considerations are complex.

On one hand, Apple’s terms of service, which users agree to when setting up their iPhones, clearly state the company’s rights. On the other, there are broader questions of consumer rights and the limits of corporate control over consumer devices.

We’re seeing these issues unfold in the EU with new policies on app distribution and default choices. Jurisdictions around the world clearly vary in how they interpret these rights, which complicates how Apple applies its policies globally. However, even with third-party app marketplaces in the EU, Apple retains control over which apps can be installed and run on the iPhone through its “notarization” process. This means that even an app downloaded directly from a developer’s website could still be disabled by Apple to protect iPhone users from dangerous and harmful apps — something the European Commission insists is the government’s responsibility, not a tech company’s.

Will Apple Remove TikTok from iPhones?

Unless Apple independently discovers and confirms a concrete threat to users, it’s extremely unlikely to delete TikTok from iPhones. Apple could also challenge any law compelling deletion in court, just as TikTok has. Consider that Apple has been forced to remove thousands of apps from the Chinese App Store over the years, yet it has never thrown the kill switch on any of them. The Great Firewall of China makes some of those apps difficult to use in the country, but people who already have them installed can keep trying to find ways around it.

The looming TikTok ban would mean the app would no longer be available for download, and those who already have it installed would stop receiving updates, since updates are delivered through the App Store. The app’s usability could degrade over time, but it would likely keep functioning, as the US doesn’t have a national firewall like China’s, and building one would be untenable for a nation that values net neutrality.

It’s tough to tell whether the US government has concrete evidence that China is using TikTok data against national security interests; we don’t get to see what’s presented behind closed doors in classified briefings. Is there a real need for immediate action, or is the ban based simply on what China “could” do with TikTok’s data? Without more information, young Americans are right to be skeptical.

As is often the case, the challenge lies in balancing legal and ethical considerations against the benefits of a secure and controlled app ecosystem. Transparency is usually part of the solution. For example, involving users more directly in these decisions, through notifications and options to contest removals, could be a middle ground that respects user autonomy while maintaining security. That works in most cases; in others, users are left to trust that Apple or their governments make these decisions on their behalf only in the most egregious circumstances. Hence the great TikTok debate. Without more transparency, users are left guessing, and many will understandably grow suspicious.
