Meta has initiated legal action against a company running ads for “nudify” apps, which use artificial intelligence (AI) to create unauthorized nude images of individuals. The lawsuit targets the firm behind the CrushAI apps and seeks to stop it from advertising on Meta’s platforms, following months of attempts to remove the ads.
“This legal action underscores our commitment to protecting our community from such abuse,” Meta stated in a blog post.
Alexios Mantzarlis, who writes for the Faked Up blog, reported that there have been “at least 10,000 ads” promoting nudifying apps on Meta’s Facebook and Instagram. He expressed support for Meta’s actions but cautioned that more needs to be done. “Even as this announcement was made, I found a dozen ads from CrushAI still live and many more from other ‘nudifiers’,” he noted.
“This issue requires ongoing monitoring from researchers and the media to hold platforms accountable and limit the spread of these harmful tools,” he added.
In its blog, Meta affirmed, “We’ll continue to take necessary steps, including legal action, against those who misuse our platforms.”
‘Devastating Emotional Toll’
The rise of generative AI has led to an increase in nudify apps. In April, the Children’s Commissioner for England urged the government to legislate against these applications. Although creating or possessing AI-generated sexual images of children is already illegal, Matthew Sowemimo of the NSPCC warned that predators are using these apps to produce such material.
“The emotional toll on children can be devastating,” he remarked. “Many feel powerless and violated. The government must act now to ban nudify apps for all UK users and prevent their promotion.”
Meta also announced it has been sharing information with other tech companies to combat the spread of nudify apps, providing over 3,800 unique URLs since March. The company acknowledged that some advertisers circumvent its rules by creating new domains to replace banned ones.
To tackle this, Meta has developed technology to identify problematic ads, even those that do not feature nudity. Nudify apps are part of a larger trend of AI misuse on social media, including deepfakes that can mislead users.
In June, Meta’s Oversight Board criticized the company’s decision to leave up a Facebook post featuring an AI-manipulated video of Brazilian football legend Ronaldo Nazário. Meta has previously deployed facial recognition technology to combat fraudulent celebrity promotions and requires political advertisers to disclose their use of AI, citing concerns about deepfakes affecting elections.
source: BBC