Technical Reporter

Meta has filed a lawsuit against a company that ran ads on its platforms promoting so-called "nudify" apps, which typically use artificial intelligence (AI) to create fake nude images of people without their consent.
After months of a cat-and-mouse battle in which it repeatedly removed such ads, Meta has sued the company behind the CrushAI apps in an attempt to stop it from publishing ads on its platforms altogether.
"This legal action underscores both the seriousness with which we take this abuse and our commitment to doing all we can to protect our community from it," Meta said in a blog post.
Alexios Mantzarlis, author of the Faked Up blog, said nudify apps had been promoted on Meta's Facebook and Instagram platforms in "at least 10,000 ads".
Mr Mantzarlis told the BBC he was glad to see Meta take this step – but warned that more needs to be done.
"Even as it made this announcement, I was able to find a dozen ads by CrushAI live on the platform, as well as ads for other 'nudify' apps," he said.
"This abuse vector requires continued monitoring by researchers and the media to keep platforms accountable and reduce the reach of these harmful tools."
Meta said in its blog post: "We will continue to take the necessary steps – which could include legal action – against those who abuse our platforms in this way."
"Devastating" emotional harm
In recent years, the growth of generative AI has led to a surge in "nudify" apps.
They have become so widespread that in April the Children's Commissioner for England called on the government to introduce legislation to ban them altogether.
It is illegal to create or possess sexual content featuring children.
But Matthew Sowemimo, deputy director of child safety policy at the NSPCC, said the charity's research shows that predators are "weaponising" these apps to create illegal images of children.
"The emotional toll on children is absolutely devastating," he said.
"Many feel powerless, violated and stripped of control over their own identity.
"The government must act now to ban 'nudify' apps for all UK users and to stop them being advertised and promoted at scale."
Meta said it had also recently made another change aimed at tackling the wider problem of "nudify" apps online: sharing information with other tech companies.
“Since we started sharing this information in late March, we have provided more than 3,800 unique URLs to participating tech companies,” it said.
The company acknowledged that it has struggled with firms that evade its rules to keep deploying ads, for example by creating new domain names to replace banned ones.
It said it has developed new technology designed to identify such ads, even when the ads themselves do not include nudity.
Nudify apps are just the latest example of AI being used to create problematic content on social media platforms.
Another concern is the use of AI to create deepfakes – highly realistic images or videos of celebrities – to scam or mislead people.
In June, Meta's Oversight Board criticised a decision to leave up a Facebook post showing someone who appeared to be Brazilian football legend Ronaldo Nazário.
Meta has previously tried to crack down on scammers who fraudulently use celebrities' likenesses in ads, using facial recognition technology.
It also requires political advertisers to disclose their use of AI, because of concerns about the impact of deepfakes on elections.
