Boys Use AI to Create Deepfake Nude Photos – A Lawsuit Could Stop It

Nearly a year after AI-generated nude images of high school girls rocked a community in southern Spain, a juvenile court this summer sentenced 15 of their classmates to a year of probation.

But the AI tool used to create the damaging deepfakes is still readily available on the internet, promising to “undress any photo” uploaded to the site in seconds.

Now, a new effort to shut down the app and others like it is underway in California, where San Francisco filed a first-of-its-kind lawsuit this week that experts say could set a precedent, but will also face many obstacles.

“The proliferation of these images has exploited a shocking number of women and girls across the globe,” said David Chiu, San Francisco’s elected city attorney, who brought the case against a group of widely visited websites based in Estonia, Serbia, the United Kingdom and elsewhere.

“These images are used to bully, humiliate and threaten women and girls,” he said in an interview with The Associated Press. “And the impact on the victims has been devastating to their reputations, to their mental health, to the loss of autonomy and in some cases has led some to take their own lives.”

The lawsuit filed on behalf of the people of California alleges that the services violated numerous state laws against fraudulent business practices, non-consensual pornography and child sexual abuse. But it can be difficult to determine who is running the apps, which are not available in phone app stores but are still easy to find on the Internet.

Contacted late last year by the AP, one service claimed via email that its “CEO is based and moving to the US,” but declined to provide evidence or answer further questions. The AP is not naming the specific apps being sued in order not to promote them.

“There are a number of sites where we don’t know at this point exactly who these operators are and where they’re operating from, but we have investigative tools and subpoena authority to look into that,” Chiu said. “And we will certainly use our powers in the course of this litigation.”

Many of the tools are used to create realistic fakes that “nudify” photos of clothed adult women, including celebrities, without their consent. But they have also appeared in schools around the world, from Australia to Beverly Hills in California, usually with boys creating images of classmates that are then widely circulated on social media.

In Almendralejo, Spain, the site of one of the first widely publicized cases last September, a doctor whose daughter was among the victimized girls and who helped bring the case to public attention said she was pleased with the severity of the punishment the classmates received in a court decision earlier this summer.

But “it’s not just the responsibility of society, education, parents and schools, but also the responsibility of the digital giants who profit from all this garbage,” Dr. Miriam al Adib Mendiri said in an interview on Friday.

She applauded San Francisco’s action, but said more efforts were needed, including from larger companies such as California-based Meta Platforms and its WhatsApp subsidiary, which was used to spread the images in Spain.

While schools and law enforcement agencies have tried to punish those who make and share deepfakes, authorities have struggled with what to do with the tools themselves.

In January, the European Union’s executive branch explained in a letter to a Spanish member of the European Parliament that the app used in Almendralejo “does not appear” to fall within the bloc’s new general rules to strengthen online safety because it is not a large enough platform.

Organizations that have watched the rise of AI-generated child sex abuse material will be watching the San Francisco case closely.

The lawsuit “has the potential to set a legal precedent in this area,” said Emily Slifer, policy director at Thorn, an organization that works to combat child sexual exploitation.

A Stanford University researcher said that because so many of the defendants are based outside the US, it will be harder to bring them to justice.

Chiu “has an uphill battle with this case, but he may be able to take some of the sites offline if the defendants running them ignore the lawsuit,” said Stanford’s Riana Pfefferkorn.

She said that could happen if the city wins by default in their absence and obtains orders affecting domain name registrars, web hosts and payment processors “that would effectively shut down those sites even if their owners never appear in the litigation.”
