Apple is reportedly cracking down on apps categorised as ‘AI image generators’ in the App Store, after some were found to be creating “nonconsensual nude images.”
A media outlet found that a handful of apps on the App Store were advertising themselves on Instagram as art generators but were, in reality, creating nude images of women. Since this violates App Store policies, Apple removed the reported apps.
Surprise Sneak-in of Apps
Getting an app listed on the App Store is a tough task, as the developer must comply with a number of rules to get through. Yet hundreds of malicious apps slip past Apple’s rigorous checks every year, the latest being a handful of AI image generators.
As noted by 404 Media, Apple’s App Store hosted several nude image generators that produce undressed images of women. While these images are AI-generated and fake, they are still harmful and strictly violate App Store rules. These apps advertised themselves as having the “ability to create nonconsensual nude images,” notes the outlet.
While this is distressing, what’s more striking is that Apple wasn’t able to find the apps until a media outlet notified it. 404 Media says it received no comment from Apple on its initial report (published Monday), but Apple later reached out to seek more details about the reported apps. When the outlet shared the relevant links, Apple removed the apps concerned from the App Store.
“Overall, Apple removed three apps from the App Store, but only after we provided the company with links to the specific apps and their related ads, indicating the company was not able to find the apps that violated its policy itself.”