Jan. 27 (UPI) — The Apple and Google app stores host applications that use AI to generate images of women with their clothes removed, a tech watchdog reported Tuesday.

The Tech Transparency Project found 55 apps on Google Play and 47 on the Apple App Store that alter images of women to make them appear nude or partially nude.

“That means the Android and Apple users have access to dozens of apps that can create nonconsensual, sexualized images of women,” the report said.

These apps have been downloaded 705 million times and generated about $117 million in revenue, a portion of which goes to Google and Apple.

Apple and Google’s app store policies prohibit apps from displaying “sexual nudity” and “overtly sexual or pornographic material.”

Elon Musk’s AI chatbot Grok drew backlash earlier this month for following user prompts asking it to remove the clothes of children.

“TTP’s findings show that Google and Apple have failed to keep pace with the spread of AI deepfake apps that can ‘nudify’ people without their permission,” TTP said in its report.

An Apple spokesperson said Monday that 28 of the apps TTP identified in its report had been removed, but two were later restored after their developers made changes to address violations of the app store's guidelines.

TTP said Monday that only 24 apps have been removed from the Apple App Store.

Google also said it has suspended some of the apps that TTP flagged in its report.