Apple Is Losing the Battle against Deepfake AI Image Apps

Apple logo on glass facade in front of older building Credit: Niccolò Chiamori

Earlier this year, we highlighted a report on how Apple has been facing challenges keeping the App Store “family-friendly” in the face of generative AI tools, and it seems the problem has only been getting worse as a new rash of “dual-use” apps blurs the line between legitimate image generation and deepfake pornography.

In April, 404 Media discovered several apps on the App Store that were explicitly advertising their ability to “create nonconsensual nude images.” Of course, the App Store listings weren’t that blatant, so it’s perhaps easy to understand how they may have slipped past Apple’s censors. For example, one app was listed as an “art generator,” but it was advertised on Instagram with taglines like “make any porn you want” and “undress any girl for free,” complete with images showing normal pictures being turned into AI-generated nudes.


The App Store listings were innocuous, but opening the apps made it obvious what they were designed for. It’s hard to imagine how Apple’s review team could have missed this, but it turns out many malicious developers have figured out a way to blindside App Store reviewers.

Deceptive Geofencing

Earlier this month, 9to5Mac took a “deep dive” into how developers of pirate streaming apps deliberately engineer their apps to trick Apple’s reviewers into thinking they’re something else, and it’s a surprisingly simple technique.

Since Apple’s App Store review team is based in California, malicious developers create a geofence that makes their apps behave differently when opened anywhere near Apple Park or other known Apple offices. The app’s actual user interface and features are hidden from anyone in that area, so App Store reviewers never see the real app; instead, they get a clean, PG-rated interface with legitimate features. To make matters worse, 9to5Mac discovered that there’s a common code base developers can use to make this happen, so they don’t even need to build these tricks on their own — they just need to add the appropriate frameworks to deceive Apple before submitting the app.
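To underscore how little it takes, here’s a minimal Swift sketch of a geofenced feature flag along the lines 9to5Mac describes. The coordinates, radius, and names here are illustrative assumptions rather than code from any real app, and the shared code base 9to5Mac found presumably covers more locations and signals than this.

```swift
import CoreLocation

// Illustrative sketch only: a geofenced "feature flag" of the kind described above.
// All values and names are hypothetical.
final class ReviewGeofence {
    // A wide circle centered roughly on Apple Park in Cupertino (assumed values).
    private let appleCampus = CLCircularRegion(
        center: CLLocationCoordinate2D(latitude: 37.3349, longitude: -122.0090),
        radius: 50_000, // meters: a generous net around Apple's Bay Area offices
        identifier: "review-geofence"
    )

    /// Returns true when the supplied location falls inside the geofence,
    /// which is when a deceptive app would show only its "clean" decoy interface.
    func shouldShowDecoyUI(at location: CLLocation) -> Bool {
        appleCampus.contains(location.coordinate)
    }
}
```

Because geofencing is an ordinary, legitimate API, a check like this looks unremarkable in isolation, which is part of why it’s so hard to catch during review.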

This is likely how many of these deepfake porn apps are finding their way onto the App Store, despite obvious cues inside the app that they’re not just innocent image generators but apps designed to create nonconsensual pornography from any photo one happens to have handy.

While Apple removes these apps as soon as it discovers them, that typically only happens when they’re reported by sites like 404 Media and 9to5Mac. The company appears to have upped its game on dealing with “undressing” apps, but now it faces a new challenge: face-swapping apps.

‘Dual-Use’ Apps Create a Problem

According to 404 Media, Apple is struggling with the “dual-use” nature of these face-swapping apps. Since they’re tools that can be used for both good and evil, the App Store review team has been hesitant to drop the ban hammer on them, but like the other deepfake apps, many blatantly promote their ability to generate nonconsensual AI-generated intimate images (NCII) on various social media platforms.

Last week, I was scrolling Reddit when I stumbled upon an ad for a face swapping app. It seemed innocent enough at first, but I know enough about this space to know that the kind of face swapping apps that spend money to promote themselves often have a more insidious purpose, which allow for the creation of nonconsensual pornographic deepfakes and charge a high price for that service in order to make the ad spend worth it.
Emanuel Maiberg, 404 Media

According to 404 Media’s Emanuel Maiberg, Apple doesn’t seem to be prepared to deal with this problem, despite how easily these apps can be used to create nonconsensual sexualized images from nearly any photograph — including those of children.

One video ad Maiberg reported on highlighted an app’s ability to pull videos from any site you want — with a strong implication that it includes Pornhub — and swap custom faces onto the performers in those videos. Maiberg tested that app and found that he was able to create a highly convincing deepfake pornographic video from an ordinary snapshot and a Pornhub video in under five minutes.

After Maiberg contacted Reddit, the ad was removed for violating the site’s policies, which prohibit “sexually explicit content, products, or services.” He then contacted Apple to say he’d found “another face swapping app that was advertising its ability to generate NCII.” Apple removed the app — but only after Maiberg provided a link to help the company locate it, meaning it couldn’t find the app on its own.

On background, Apple gave Maiberg the usual canned responses about the App Store policies, which naturally prohibit these kinds of apps. However, Apple didn’t engage with questions about how it’s dealing with the “dual use” problem with face-swapping apps like these.

Apple will remove the app once we flag it, but refuses to have a substantive discussion about the problem in general and what it plans to do about it.
Emanuel Maiberg, 404 Media

Sadly, this has become an all-too-common problem in both the App Store and Google’s Play Store. Apps appear in both stores with innocuous names and descriptions that make them seem like a fun way of swapping social media photos with friends. Elsewhere, however, they explicitly advertise their ability to create nonconsensual deepfake porn — and many include features, like pulling videos from Pornhub, that make it clear that’s exactly what they’re intended for. Maiberg suggests the only solution may be a blanket ban on face-swapping apps and other potentially harmful AI image generators.

To make matters worse, these apps charge in-app subscriptions to unlock their features, and Apple gets its usual 30% cut of those. This leads to the cynical take that Apple has little motivation to ban apps it’s profiting from — an accusation that some developers have leveled against the company in the past.

However, the money Apple is making from these apps is likely a rounding error on its balance sheet — and certainly not enough for Apple to risk compromising its reputation. It’s far more likely that Apple is struggling under the sheer load of App Store submissions while falling prey to developers’ deceptions. In 2022, Apple reported that it had rejected nearly as many apps as there are on the App Store — 1,679,694 app rejections compared to a catalog of 1,783,232 apps. While each rejection doesn’t necessarily represent a distinct app — developers often tweak and resubmit the same app — it’s still a staggering number of submissions for Apple’s reviewers to deal with, which means they can’t go over every single one with a fine-toothed comb.

That’s also Maiberg’s take; he believes the problem isn’t going to stop, given the sheer scale of apps on the App Store and of ads running through the advertising networks on Reddit, Instagram, and elsewhere. “The business model for these platforms works at scale, and that scale doesn’t give time for humans to manually review, investigate, and approve every ad or app,” he says.

Still, Maiberg notes that several clues could tip Apple’s reviewers off if they were determined to look for them, such as charging high fees for features that typically cost little or nothing elsewhere online. In his view, however, it would be better to ban these types of apps entirely. Hany Farid, a professor at UC Berkeley and a leading expert on digitally manipulated images, points out that “the vast majority of use cases [for face-swap deepfakes] is to create nonconsensual sexual imagery.”

With the scales tipped so sharply in that direction, Apple should at least scrutinize these apps much more closely. Developers can add plenty of guardrails to ensure that their face-swap apps aren’t being used for nefarious purposes, such as working solely with images of the user’s own face captured live from the selfie camera or refusing to generate adult videos (a rough sketch of one such check appears below). As Maiberg reported earlier this week, a recent survey found that one in 10 teenagers acknowledges that either they or someone in their peer group has used AI tools to generate nudes of other minors. They aren’t finding these tools in the dark corners of the internet, but rather through Instagram, Reddit, and TikTok ads — “ads on the most popular internet platforms in the world which are directing them to the most popular app store in the world for the stated purpose of creating NCII.”
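As a rough illustration of the live-capture guardrail mentioned above, here’s a hedged Swift sketch; the function and error names are hypothetical, and it only verifies that the input came from a live camera capture and contains exactly one detectable face. Confirming that the face actually belongs to the person holding the phone would require additional on-device matching beyond what’s shown here.

```swift
import UIKit
import Vision

// Hypothetical guardrail sketch: reject anything that wasn't captured live
// from the device camera, and anything without exactly one detectable face.
enum FaceSwapInputError: Error {
    case notFromLiveCamera
    case noImageData
    case unexpectedFaceCount(Int)
}

func validateSelfieInput(_ image: UIImage, capturedLive: Bool) throws {
    // Guardrail 1: only accept frames captured live from the camera
    // (e.g., UIImagePickerController with sourceType == .camera), not
    // arbitrary images pulled from the photo library or downloaded.
    guard capturedLive else { throw FaceSwapInputError.notFromLiveCamera }

    guard let cgImage = image.cgImage else { throw FaceSwapInputError.noImageData }

    // Guardrail 2: require exactly one face in the frame as a basic sanity
    // check; group photos and faceless images are rejected outright.
    let request = VNDetectFaceRectanglesRequest()
    try VNImageRequestHandler(cgImage: cgImage, options: [:]).perform([request])

    let faceCount = request.results?.count ?? 0
    guard faceCount == 1 else {
        throw FaceSwapInputError.unexpectedFaceCount(faceCount)
    }
}
```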
