When news broke in February that AI-generated nude pictures of students were circulating at a Beverly Hills middle school, many district officials and parents were horrified.
But others said no one should have been blindsided by the spread of AI-powered "undressing" programs. "The only thing shocking about this story," one Carlsbad parent said his 14-year-old told him, "is that people are shocked."
Now, a newly released report by Thorn, a tech company that works to stop the spread of child sexual abuse material, shows how common deepfake abuse has become. The proliferation coincides with the wide availability of cheap "undressing" apps and other easy-to-use, AI-powered programs for creating deepfake nudes.
But the report also shows that other forms of abuse involving digital imagery remain bigger problems for school-age kids.
To measure middle- and high-school students' experiences with, and attitudes toward, sexual material online, Thorn surveyed 1,040 9- to 17-year-olds across the country from Nov. 3 to Dec. 1, 2023. Well more than half of the group were Black, Latino, Asian or Native American students; Thorn said the resulting data were weighted to make the sample representative of U.S. school-age children.
According to Thorn, 11% of the students surveyed said they knew of friends or classmates who had used artificial intelligence to generate nudes of other students; an additional 10% declined to say. Some 80% said they did not know anyone who'd done that.
In other words, at least 1 in 9 students, and as many as 1 in 5, knew of classmates who used AI to create deepfake nudes of people without their consent.
Stefan Turkheimer, vice president of public policy for the Rape, Abuse & Incest National Network, the country's largest anti-sexual-violence organization, said that Thorn's results are consistent with the anecdotal evidence from RAINN's online hotline. A lot more children have been reaching out to the hotline about being victims of deepfake nudes, as well as the nonconsensual sharing of real images, he said.
Compared with a year ago or even six months ago, he said, "the numbers are certainly up, and up significantly."

Technology is amplifying both kinds of abuse, Turkheimer said. Not only is picture quality improving, he said, but "video distribution has really expanded."
The Thorn survey found that almost 1 in 4 youths ages 13 to 17 said they'd been sent or shown an actual nude photo or video of a classmate or peer without that person's knowledge. But that number, at least, is lower than it was in 2022 and 2019, when 29% of the surveyed students in that age group said they'd seen nonconsensually shared nudes.
Not surprisingly, only 7% of the students surveyed admitted that they had personally shared a nude photo or video of someone without that person's knowledge.
The study found that sharing of real nudes is widespread among students, with 31% of the 13- to 17-year-olds agreeing with the statement that "It's normal for people my age to share nudes with each other." That's about the same level overall as in 2022, the report says, although it's notably lower than in 2019, when nearly 40% agreed with that statement.
Only 17% of that age group admitted to sharing nude selfies themselves. An additional 15% of 9- to 17-year-olds said they had considered sharing a nude photo but decided not to.
Turkheimer wondered whether some of the perceived decline in sexual interactions online stemmed from the shutdown last year of Omegle, a site where people could have video chats with random strangers. Although Omegle's rules banned nudity and the sharing of explicit content, more than a third of the students who reported using Omegle said they'd experienced some form of sexual interaction there.
He also noted that the study didnโt explore how frequently students experienced the interactions that the survey tracked, such as sharing nudes with an adult.
According to Thorn, 6% of the students surveyed said they'd been victims of sextortion: someone had threatened to reveal a sexual image of them unless they agreed to pay money, send more sexual pictures or take some other action. And when asked whom to blame when a nude selfie goes public, 28% said it was solely the victim's fault, compared with 51% blaming the person who leaked it.