Is Apple Finally Cracking Down on Deepfake AI Image Apps?

App Store icon Credit: BigTunaOnline / Shutterstock

Apple is pretty vigilant about keeping the App Store as “family-friendly” as possible, but sometimes, it’s hard to keep up with the latest technology trends, especially in the fast-paced world of generative AI.

With the proliferation of AI image generation on the web and in standalone apps, it seems that a few questionable ones have snuck past Apple’s censors — specifically, apps that could generate nude images without the consent of the subjects involved.

According to a report by 404 Media, these apps weren’t just seemingly harmless AI apps that could be abused for this purpose — they were actually advertising the ability to “create nonconsensual nude images” (although not necessarily in those exact words).

Apple has since removed at least three of these apps from the App Store, but it only did so after the folks at 404 Media called them out.

Overall, Apple removed three apps from the App Store, but only after we provided the company with links to the specific apps and their related ads, indicating the company was not able to find the apps that violated its policy itself.

Emanuel Maiberg, 404 Media

404 Media discovered the apps through online advertising outside the App Store, following a report earlier this week of AI nude apps being advertised on Instagram. The publication found dozens of ads in Meta's Ad Library linking to five different apps: two web apps and three available on the App Store.

The report doesn’t go into detail on the nature of the three apps, but they’re likely similar to a deepfake face-swapping app that Google recently removed from the Play Store after it was found advertising its ability to “Make any porn you want” on several adult sites.

Sadly, this isn’t a new problem, although 404 Media’s Emanuel Maiberg is optimistic that both Apple and Google are becoming a bit more proactive in taking action. Two years ago, both tech giants seemed comfortable leaving some of these apps alone as long as they weren’t blatantly advertising their deepfake capabilities.

Back in 2022, when Sam and I first reported on apps that seemed innocent if you looked at their app store pages, but that advertised their deepfake porn capabilities on porn sites, Google and Apple did not remove them, but required them to stop running those ads. One of those apps continued running ads on porn sites until we reported on it again this year, prompting Google to finally remove it.

Emanuel Maiberg, 404 Media

As Maiberg points out, such apps are responsible for some of “the worst harms we’ve seen as a result of generative AI” since they’re readily available to middle schoolers and teens who use them on photos of their classmates without consent, often leading to harassment and cyberbullying.

Sadly, many of the developers of these apps are using deceptive tactics to fly below Apple’s radar. For example, one app was listed simply as an “art generator” with no reference to its ability to generate nudes. However, it was advertised on Instagram as an app that “can delete any clothing” and “undress any girl for free,” with images showing young women in various states of AI-generated undress.

The good news is that Apple is now at least responding to these reports. Still, it’s clear the company will need to improve its App Review process so that generative AI apps are examined more closely before they’re cleared for publication on the App Store, rather than simply taking developers’ descriptions at face value.
