Policy Violations Persist: Apple and Google App Stores Continue Hosting "AI Undressing" Applications Despite Explicit Bans

Stock News

A report released on Wednesday by the Tech Transparency Project reveals that despite explicit policies from Apple and Alphabet prohibiting non-consensual intimate imagery, both companies continue to offer related mobile applications in their app stores. The report indicates that searching for keywords like "nudify" or "undress" in Apple's App Store and the Google Play Store yields multiple applications capable of altering photos of celebrities and ordinary people to make the subjects appear nude or semi-nude. Both companies also place advertisements for similar "undressing" applications within those search results.

Citing revenue estimates from market research firm AppMagic, the report states that the identified applications have accumulated 483 million downloads, generating $122 million in revenue. An AppMagic spokesperson noted that the Tech Transparency Project's findings have led to the removal of several applications and prompted others to revise their user agreements. Over the past year, politicians globally have intensified calls to curb the proliferation of such "undressing" apps. Earlier this year, Apple and Google removed problematic applications flagged by the Tech Transparency Project, but researchers claim dozens of similar apps reappeared within months.

"The issue is not only that these companies fail to properly vet these applications, allowing them to remain available while profiting from them," said Katie Paul, Director of the Tech Transparency Project, in an interview, "but they are also actively steering users toward downloading them." Through app store searches, the organization identified 18 apps with "undressing" capabilities on the Apple App Store and 20 on the Google Play Store. Researchers also noted that Apple and Google's search autocomplete features suggest additional similar apps as users type, effectively guiding users to them. Some apps feature overtly suggestive names and imagery, while others, though not explicitly marketed for such purposes, can be easily misused, with lower barriers to use than traditional photo editing software.

The Tech Transparency Project highlighted that some applications offer subscription services. Apple's App Store guidelines explicitly prohibit "overtly sexual or pornographic content," while the Google Play Store bans apps that "degrade or objectify individuals, including those claiming to undress or see through clothing, even if labeled as for pranks or entertainment." Google stated that several apps mentioned in the report have been removed from Google Play for policy violations, with investigations ongoing. "We investigate and take action when we receive reports of policy violations," Google said in an emailed statement. Apple stated that it removed 15 apps flagged by the organization after media inquiries about their existence.

Researchers mentioned that among the removed apps was PicsVid AI Hot Video Generator. The developer of PicsVid did not respond to requests for comment. Another flagged app, Uncensored AI—No Filter Chat, was capable of removing clothing from images of women uploaded by researchers. A representative from the app's developer stated that the "undress" feature has since been removed. Apple noted it contacted six app developers regarding necessary changes to avoid removal and added that other apps mentioned by the Tech Transparency Project did not violate its guidelines. Apple also stated it has proactively rejected numerous applications and removed other non-compliant ones.

Professor Anne Helmond from Utrecht University in the Netherlands commented that enforcement actions by these tech giants are "inconsistent and lack transparency." "If an app presents itself as a general image generator, even if it can be misused, it might pass review," said Helmond, who also directs the international research group App Studies Initiative. "App visibility is determined by ranking and search systems, which are driven by user engagement, meaning controversial uses can actually increase an app's exposure."

One app found by researchers on the Google Play Store, Video Face Swap AI: DeepFace, advertised the ability to superimpose actress Anya Taylor-Joy's face onto the character Daenerys Targaryen from Game of Thrones. However, investigation revealed that within the app's "Girls" category, users could overlay faces onto explicit pornographic video templates. This app, rated "E for Everyone," had over one million downloads and was accessible by searching "face swap." The developer, Okapi Software, stated it is investigating the raised issues and has removed some user-uploaded content. "The app does not offer an 'undress' feature and does not permit the creation of nude or explicit content," Okapi said. "We take content safety and compliance seriously."

Regulatory bodies are increasingly urging both companies to strengthen policy enforcement. Last year, U.S. President Trump signed the "Take It Down Act," making the distribution of non-consensual intimate imagery a criminal offense and mandating its removal from social media and websites. In April, the UK government proposed legislation that would create a pathway to prosecute executives of tech companies that fail to effectively remove such imagery.

