We Now Know Why Telegram Was Pulled from Apple’s App Store

Credit: Reuters

Popular secure messaging app Telegram was removed from the App Store last week — and now we officially know why.

The primary Telegram app and Telegram X (which is in testing) were both removed from the iOS App Store on Jan. 31. The next day, Telegram CEO Pavel Durov tweeted that the app was removed due to the presence of “inappropriate content.”

No other reason was given, but Durov added that once “protections” were in place, the Telegram app would reappear on the iOS app storefront. Indeed, the Telegram app returned to the App Store on Feb. 2.

But today, we’re getting a clearer picture of what “inappropriate content” caused the secure messaging platform to be taken down.

In an email reply to a 9to5Mac reader, Apple marketing chief Phil Schiller said that Apple's App Store team had been alerted to "illegal content," specifically child pornography, being shared through Telegram.

“After verifying the existence of the illegal content, the team took the apps down from the store, alerted the developer, and notified the proper authorities, including the NCMEC (National Center for Missing and Exploited Children),” Schiller wrote in the now-verified email.

Presumably, the Telegram apps returned to the App Store with protections to stop the illegal content from being spread. Apple’s App Store guidelines require platforms to contain filters for objectionable material and the capability to report it, as well as the ability to block users, TechCrunch reported.

“We will never allow illegal content to be distributed by apps in the App Store and we will take swift action whenever we learn of such activity,” Schiller added. “Most of all, we have zero tolerance for any activity that puts children at risk — child pornography is at the top of the list of what must never occur. It is evil, illegal, and immoral.”

Distribution of child pornography is among the most grievous offenses on the internet, and the vast majority of social networks, tech platforms, and websites include mechanisms to detect and remove it immediately. Telegram, apparently, was not as prepared, so Apple removed the app while its developer worked out how to eradicate the problem.

The secure messaging app offers a suite of advanced security features, including private, end-to-end encrypted "secret" conversations. Notably, it was one of the first messaging platforms to offer end-to-end encryption when it launched in 2013.

But while the ability to hold secret conversations may be Telegram's main selling point, it is also the app's primary liability: the platform has repeatedly struggled with terrorist-related content. Telegram and its developers have been widely criticized by governments for serving as the "app of choice" for terrorist organizations like ISIS, Vox reported in June 2017.

Telegram was nearly banned by the Indonesian government over "terrorist-related content," and its developers were forced to create a moderation team to tackle such content in the country, The Verge reported.
