Apple Battles AI Deepfake Apps But the War Looks Unwinnable


Header image: the Apple App Store icon, blurred, against a blue background.

Following a 404 Media investigation into apps advertised as being able to generate non-consensual nude images, Apple has begun removing AI image-generation apps from its App Store.

In a follow-up article this morning, 404 Media explains that Apple has removed three of the offending apps from its store, but only “after [404 Media] provided the company with links to the specific apps and their related ads,” which the outlet suggests reflects Apple’s inability to track down the violating apps itself.

In 404’s original investigation, published on April 22, the outlet noted that Instagram and its parent company, Meta, were “profiting from several ads that invite people to create non-consensual nude images with AI image generation apps.”

The investigation highlights a disturbing problem: apps that help users make non-consensual nude images, better known as “deepfakes.” Thanks to the wonders of AI, causing this kind of harm is easier than ever.

The App Store Problem

These apps must pass a screening process to be accepted onto platforms such as the Apple App Store and Google Play Store.

For Apple’s part, its guidelines for developers include a section on safety — in fact, it’s the first section. Google has similar guidelines, by the way.

“When people install an app from the App Store, they want to feel confident that it’s safe to do so — that the app doesn’t contain upsetting or offensive content, won’t damage their device, and isn’t likely to cause physical harm from its use,” Apple explains. “We’ve outlined the major pitfalls below, but if you’re looking to shock and offend people, the App Store isn’t the right place for your app. Some of these rules are also included in Notarization for iOS apps.”

While Apple’s safety rules are unsurprisingly robust, loopholes remain that aren’t easily closed. As 404 observes, the offending apps present themselves as “art generation” apps. They aren’t described as “porn generation” apps because, if they were, they would never make it through Apple’s review.

If a user hadn’t learned what one of the offending apps could do from, say, an ad on a major social media platform, they would likely never know it was available on the App Store at all. And if people seeking to generate deepfakes of others without their knowledge or consent can’t find such an app without outside help, what are the chances that Apple will?

There are countless “AI face swap” and “AI art generator” apps on the App Store, and presumably most of them are not wolves in sheep’s clothing, but some are. The only way to tell which is which, unless the developer outright says so on another platform, is to use the app.

According to 404 Media, the outlet asked Apple for comment once it located the apps. No response arrived, so it published its article. Apple subsequently asked for direct links to the offending apps, which have now been removed.

In at least some cases, Apple employees test apps before they’re published to the App Store, but it is unrealistic and impractical to investigate every part of every app firsthand. There are around two million apps on the App Store and even more on the Google Play Store.

Degenerate Developers Need Ad Space, and Companies Seem Willing to Sell It

The subterfuge required to pass Apple’s safety checks means these apps must be advertised elsewhere to reach their intended audience. Enter the second systemic failure: advertising platforms.

This is the more troubling issue because the ads aren’t secret; the AI-generated porn is front and center. How did such content make it into Meta’s Ad Library and get served to users? It’s a good question, and one Meta didn’t answer when 404 Media asked.

It is not Meta’s first issue with sexually explicit ads. Or its second. Or third.

“Meta’s ads quality has been exceptionally bad recently,” writes Emanuel Maiberg for 404 Media. Thanks to Maiberg’s reporting, Apple removed the apps that were featured in the most recent Instagram ads. A drop in the bucket, perhaps, but certainly something.

When developers actively hide the unsavory “features” of their apps, perhaps the only way to root them out is to wait for them to say the quiet part out loud and act as quickly as possible. The most significant damage an app like that can cause comes only when people actually download it, and advertising is the most practical way to break through all the noise on the App Store and reach users.

Some of these developers don’t necessarily care to play the long game, either, as certain porn-ifying deepfake features are tucked behind a substantial in-app purchase. These developers aren’t looking for a small payout over time; they want big money as quickly as possible. So what if the app gets taken down after a few days if those few days were a windfall? That said, Apple doesn’t pay out instantly.

A Porn Problem With No Easy Solution

As repeated offenses have demonstrated, Apple, Google, Meta, and others are fighting an unending game of whack-a-mole against AI deepfake porn apps. Some companies appear to be putting up a more vigorous fight than others, and some don’t seem interested in swinging the hammer at all.


Image credits: Header photo background licensed via Depositphotos and then pixelated for illustrative effect.
