"AI was used to turn a teen's photo into a nude image. Now the teen is fighting for change to protect other kids."

Francesca Mani was 14 years old when her name was called over the loudspeaker at Westfield High School in New Jersey. She headed to the principal’s office, where she learned that a picture of her had been turned into a nude image using artificial intelligence.

Mani had never heard of a “nudify” website or app before. When she left the principal’s office, she said she saw a group of boys laughing at a group of girls who were crying.
“And that’s when I realized I should stop crying and that I should be mad, because this is unacceptable,” Mani said.

What happened to Francesca Mani

Mani was sitting in her high school history class last October when she heard a rumor that some boys had naked photos of female classmates. She soon learned that she and several other girls at Westfield High School had been targeted.

According to a lawsuit later filed by one of the other victims through her parents, a boy at the school had uploaded photos from Instagram to a site called Clothoff, which is one of the most popular “nudify” websites. 60 Minutes has decided to name the site to raise awareness of its potential dangers. There were more than 3 million visits to Clothoff last month alone, according to Graphika, a company that analyzes social networks. The website offers to “nudify” both males and females, but female nudes are far more popular.


Dorota Mani and Francesca Mani (60 Minutes)

Visitors to the website can upload a photo or watch a free demonstration, in which an image of a woman appears clothed, then naked just seconds later. The results look very real.

Clothoff users are told they need to be 18 or older to enter the site and that they can’t use other people’s photos without permission. The website claims “processing of minors is impossible,” but no one at the company responded when 60 Minutes emailed asking for evidence of that claim, along with many other questions.

Mani never saw what had been done to her photo, but according to that same lawsuit, at least one student’s AI nude was shared on Snapchat and seen by several kids at school.

The way Mani found out about her photo made it even worse, she said. She recalled how she and the other girls were called by name to the principal’s office over the school’s public address system.

“I feel like that was a major violation of our privacy while, like, the bad actors were taken out of their classes privately,” she said.

That afternoon, Westfield’s principal sent an email to parents informing them “some of our students had used artificial intelligence to create pornographic images from original photos.” The principal also said the school was investigating and “at this time we believe that any created images have been deleted and are not being circulated.”

Fake images, real harm

Mani’s mother Dorota, who’s also an educator, was not convinced. She worries that nothing shared online is ever truly deleted.

“Who printed? Who screenshotted? Who downloaded? You can’t really wipe it out,” she said.

The school district would not confirm any details about the photos, the students involved or disciplinary action to 60 Minutes. In a statement, the superintendent said the district revised its Harassment, Intimidation and Bullying policy to incorporate AI, something the Manis say they spent months urging school officials to do.

Francesca Mani feels the girls who were targeted paid a bigger price than the boy or boys who created the images.

“Because they just have to live with knowing that maybe an image is floating, their image is floating around the internet,” she said. “And they just have to deal with what the boys did.”

Dorota Mani said that she filed a police report, but no charges have been brought.

Yiota Souras is chief legal officer at the National Center for Missing and Exploited Children. Her organization works with tech companies to flag inappropriate content on their sites. She says that while the images created on AI “nudify” sites are fake, the damage they can cause to victims is real.

“They’ll suffer, you know, mental health distress and reputational harm,” Souras said. “In a school setting it’s really amplified, because one of their peers has created this imagery. So there’s a loss of confidence. A loss of trust.”

Fighting for change

60 Minutes found nearly 30 similar cases in schools in the U.S. over the last 20 months, along with additional cases around the world.

In at least three of those cases, Snapchat was reportedly used to circulate AI nudes. One parent told 60 Minutes it took over eight months to get the accounts that had shared the images taken down. According to Souras, a lack of responsiveness to victims is a recurring problem the National Center for Missing and Exploited Children sees across tech companies.

“That isn’t the way it should be. Right? I mean, a parent whose child has exploitative or child pornography images online should not have to rely on reaching out to a third party, and having them call the tech company. The tech company should be assuming responsibility immediately to remove that content,” she said.


Yiota Souras, chief legal officer at the National Center for Missing and Exploited Children, speaks with Anderson Cooper (60 Minutes)

60 Minutes asked Snapchat about the parent who said the company didn’t respond to her for eight months. A Snapchat spokesperson said they were unable to locate her request and said, in part: “We have efficient mechanisms for reporting this kind of content.” The spokesperson went on to say that Snapchat has a “zero-tolerance policy for such content” and “…act[s] quickly to address it once reported.”

The Department of Justice says AI nudes of minors are illegal under federal child pornography laws if they depict what’s defined as “sexually explicit conduct.” But Souras is concerned some images created by “nudify” sites may not meet that definition.

In the year since Francesca Mani found out she was targeted, she and her mom, Dorota, have encouraged schools to implement policies around AI. They’ve also worked with members of Congress to try and pass a number of federal bills. One of those bills, the Take It Down Act, which is co-sponsored by Senators Ted Cruz and Amy Klobuchar, made it through the Senate earlier this month and is now awaiting a vote in the House. It would create criminal penalties for sharing AI nudes and would require social media companies to take photos down within 48 hours of getting a request.

This is going to be the problem with AI. When your face can be planted in any photo, the idea of a photo being used as evidence or actual proof goes right out the window.

Knowing it’s a fake should make someone feel apathetic about it, since it’s not a real photo of them. But I imagine knowing that others believe it could be is what would bother someone the most. It also makes obvious the perverted thoughts others are having about someone. The elephant in the room is pointed out, and the perverts attempt to deflect their degeneracy by making fun of the victim, when in reality they’re the real assholes in all of it!

But it would also be a crime to lock up an innocent person who comes across a fake and is prosecuted for possessing CSEM when it’s not an actual photo of the person, just their face on an AI body. I don’t think that would be fair. I agree that people doing this should be criminalized, as it deeply hurts the other person, especially publicly sharing it. It’s such a slippery slope with this kind of stuff. What’s real, what’s fake? It blurs the lines too much.

Then we wonder why they want to come after lolicon and AI nude art, which, as long as it’s 100% false and not a real person, shouldn’t be considered a crime. A nude painting is not the same as an actual photo. It’s a lot to ponder.

I don’t entirely disagree, but if I’m not mistaken, AI of real minors (especially if it’s considered too realistic/indistinguishable from unedited photography) is currently defined as a form of CSAM. So, those caught making this stuff can and will be charged with producing CSAM. Don’t quote me on that, but I think that’s what’s currently on the books.

Edit: in fact, I want to say that ANY sexual manipulation of a photograph of a minor is technically illegal, not just AI. That includes photoshop, and even ordinary SFW photographs placed in sexual situations (like so-called “tributes”). Again, don’t quote me on that, I might not be entirely correct.

I believe you are correct. If it’s indistinguishable from real, it’s illegal.

Added an edit to my original comment. I don’t think it even has to be “indistinguishable”, or even AI. I imagine that even “tributes”, or even just posting SFW pics and just making sexual comments about them would get smacked down hard by the law. And remember that whole thing with the mother who accused a doll company of basing their product on her child?

Now, it was found that this woman made the whole thing up (FL Woman LIED about Child Sex Doll using her Daughter's Likeness), but let’s philosophize and assume a situation like this actually occurred. Could the dollmaker be charged with producing sexual content of a minor? Let’s consider another scenario, where a person made a 3D model instead of a doll. Would that be charged as CSAM?

The grayest area I’ve seen is ripping a mo-capped model from a video game and manipulating it for sexual purposes (like Sarah Miller from The Last of Us, mo-capped by then-12yo Hana Hayes), or clearly basing your drawing on the likeness of an actual child (like Emma Watson as Hermione or Millie Bobbie Brown as Eleven).

Sometimes, the line gets so blurred it’s hard to keep track. Obviously, the one line most do not cross is directly engaging with unambiguous CSAM or actually interacting sexually with a minor. Other than that, it’s kinda all over the place.

Meanwhile, there seems to be little or no interest from academia, or from much of the mainstream news media, in the increasing rate of boys harassed or sextorted online. Here in Canada, for example, it’s a growing problem.

In fact, three unrelated boys have committed suicide since February 2022 after someone anonymously claiming to be an interested female manipulated them into sending sexually compromising images of themselves, then threatened to post them if the boys didn’t wire money.

Of course, the boys greatly feared being publicly humiliated, maybe in part because they had been tricked. Perhaps they, being male, also dreaded looking weak by asking others for help on the matter.

According to the Canadian Centre for Child Protection, males aged 14-24 are primarily targeted. Of 322 new cases the CCCP opened in July 2022, 92 percent concerned boys or young men being sextorted.

“Realizing that youth, especially boys at this age, are vulnerable to manipulation tactics to get them to engage in sexual acts online … biologically, they basically move too quickly [and] comply with these kinds of requests,” said Stephen Sauer, the CCCP’s director of Cybertip.ca, in a CBC News story posted online.

“There’s a lot of shame associated with this, so then they will also comply with paying to hopefully mitigate the distribution of that image or video.”