Is Apple’s ‘Safe’ App Store a Myth? Senators Want X Booted Over Grok Deepfakes
Grok has been making headlines lately for its lack of anything even vaguely resembling guardrails, and now three US senators are stepping in after Apple and Google have failed to proactively deal with the problem in their respective app marketplaces.
Elon Musk’s evil little AI bot recently gained image editing capabilities allowing any user on X to edit any image without permission. Unsurprisingly, the darker side of the internet responded in an entirely predictable way, flooding the social media platform with nonconsensual deepfake pornography.
Since then, things have only gotten more chaotic, as deviants and miscreants have tried to push the limits to see just how far Grok will go in undressing both adults and minors — and nobody seems to be doing anything to stop it, despite xAI’s Terms of Service clearly prohibiting “the sexualization or exploitation of children,” and “violating a person’s privacy or their right to publicity.”
A public statement from X’s Safety account insisted that it takes action against illegal content, and Elon Musk himself echoed that, adding that “Anyone using Grok to make illegal content will suffer the same consequences as if they upload illegal content.” However, despite Musk’s assurances on January 3, the tide of illicit content hasn’t just continued — it’s surged. That highlights a fundamental truth: moderation is a choice, and xAI is choosing not to moderate.
It also doesn’t in any way change the argument that Grok should be prevented from creating child sexual abuse materials (CSAM) in the first place. After all, while the public timeline is where these images are being weaponized, the availability of Grok as a standalone tool means this “undressing” is also happening behind closed doors — turning an iPhone into a more readily accessible private studio for nonconsensual imagery.
However, while X and xAI share the crux of the blame for this, Apple’s and Google’s hands aren’t exactly clean, either. As Caroline Haskins points out at Wired (Apple News+, via Daring Fireball), both companies have banned apps and developers for far less than what Grok is doing right now.
Apple and Google both explicitly ban apps containing CSAM, which is illegal to host and distribute in many countries. The tech giants also forbid apps that contain pornographic material or facilitate harassment. The Apple App Store says it doesn’t allow “overtly sexual or pornographic material,” as well as “defamatory, discriminatory, or mean-spirited content,” especially if the app is “likely to humiliate, intimidate, or harm a targeted individual or group.”
Caroline Haskins
In fact, Haskins goes on to note that Apple and Google have removed several apps that did precisely what Grok is now doing, as we also shared in 2024, following investigations by 404 Media and the BBC.
Yet, in today’s political climate, it seems that Apple has lost its moral compass, being more willing to ban an app that merely crowdsources information on ICE sightings than one that openly allows users to sexualize children. That’s a surprising turnaround for a company that once seriously considered building features into iPhones to scan for photos of child abuse.
With Apple and Google seemingly paralyzed by the fear of going up against the Musk empire, US lawmakers are now trying to light a fire under the two tech giants. Senators Ron Wyden, Ed Markey, and Ben Ray Luján have penned an open letter to Apple CEO Tim Cook and Google CEO Sundar Pichai asking for both X and Grok to be pulled from their respective app stores.
In what feels like a sad twist, the senators are asking for nothing more than that the two companies actually do what their own terms of service say they're supposed to:
We write to ask that you enforce your app stores’ terms of service against X Corp’s (hereafter, “X”) X and Grok apps for their mass generation of nonconsensual sexualized images of women and children. X’s generation of these harmful and likely illegal depictions of women and children has shown complete disregard for your stores’ distribution terms. Apple and Google must remove these apps from the app stores until X’s policy violations are addressed.
Senators Ron Wyden, Ed Markey, and Ben Ray Luján
‘Just Plain Creepy’
While the letter is worded as a respectful request, the three senators aren’t pulling any punches here. They point out what X and Grok are doing, and then cite Apple’s policies, including one that gives it the right to remove apps for being “just plain creepy” — a standard that could easily be applied to what Grok has been up to lately.
Your app stores’ policies are clear […] Apple’s terms of service bar apps from including “offensive” or “just plain creepy” content, which under any definition must include nonconsensually-generated sexualized images of children and women. Further, Apple’s terms explicitly bar apps from including content that is “[o]vertly sexual or pornographic material” including material “intended to stimulate erotic rather than aesthetic or emotional feelings.”
Senators Ron Wyden, Ed Markey, and Ben Ray Luján
Then, to drive the point home, the open letter adds that “turning a blind eye to X’s egregious behavior” would completely undermine Apple’s most oft-used justification for maintaining control over the App Store: that it’s a “safe and trusted place for users around the world to discover and download apps.” There’s a veiled threat to be read between these lines: if life inside Apple’s walled garden isn’t any better than life outside, maybe it’s time for those walls to come tumbling down.
However, the senators don’t stop there. They also contrast Apple’s (and Google’s) inaction on Grok with their willingness to remove apps like ICEBlock and Red Dot. Granted, those removals didn’t happen right away; ICEBlock was on the App Store for at least three months before Apple took action, although that delay may say more about White House officials not realizing they could simply ask the company to remove it.
Under explicit pressure, and perhaps threats, from the Department of Homeland Security, your companies quickly removed apps that allowed users to lawfully report immigration enforcement activities, like ICEBlock and Red Dot. Unlike Grok’s sickening content generation, these apps were not creating or hosting harmful or illegal content, and yet, based entirely on the Administration’s claims that they posed a risk to immigration enforcers, you removed them from your stores.
Senators Ron Wyden, Ed Markey, and Ben Ray Luján
The letter asks the two chief executives to “demonstrate a similar level of responsiveness” and remove the apps immediately, even if only as a temporary measure while a full investigation is performed. The senators are also requesting a written response from both companies by January 23, 2026.
This puts the ball solidly in Apple’s (and Google’s) court. That’s arguably where it should have already been, but with this pressure being brought to bear, it will be interesting to see what happens next. At the very least, the companies are now being asked to account for their inaction.
The App Store is far from a free market. Apple has long exercised ironclad control over the content it allows on its marketplace, often quite controversially and for far lower stakes. Few people would argue that nonconsensual deepfakes and child sexual abuse material don’t warrant action, and Apple has cracked down far harder on far less in the past.
For example, in 2018, Apple booted Tumblr from the App Store merely for hosting child pornography, forcing the service to scrub its content before the app would be allowed to return. Now, Grok is literally generating it, and Apple is sitting on its hands, leaving its famed walled garden looking less like a sanctuary and more like a selective gated community where the rules only apply to those without a blue checkmark or a rocket company.