Apple finally pulls generative AI nude apps from the App Store

Apple has removed apps from the App Store that claimed to generate nonconsensual nude imagery, a move signaling that the company is now more willing to tackle this hazardous app category.

App Store icon

Generative AI's ability to create images from prompts has become a useful tool in photography and design. However, the technology has also been misused to create deepfakes and nonconsensual pornography.

Despite the danger, Apple had been remarkably hands-off. Prior to this removal, it had done little to address a potentially major problem.

