Last month, a detective in a small town outside of Lancaster, Pennsylvania, invited dozens of high school girls and their parents to the police station to undertake a difficult task: one by one, the girls were asked to confirm that they were depicted in hundreds of AI-generated deepfake pornographic images seized by law enforcement.
In a series of back-to-back private meetings, Detective Laurel Bair of the Susquehanna Regional Police Department slid each image out from under the folder’s cover, so only the girl’s face was shown, unless the families specifically requested to see the entire uncensored image.
[…]
The photos were part of a cache of images allegedly taken from 60 girls’ public social media accounts by two teenage boys, who then created 347 AI-generated deepfake pornographic images and videos, according to the Lancaster County District Attorney’s Office. The two boys have now been criminally charged with 59 counts of “sexual abuse of children” and 59 counts of “possession of child pornography,” among other charges, including “possession of obscene materials depicting a minor.”
Forty-eight of the 60 victims were their classmates at Lancaster Country Day School, a small private school approximately 80 miles west of Philadelphia. The school is so small that nearly half of the high school’s female students were victimized in the images and videos. The number of underage victims makes this the largest-known instance of deepfake pornography made of minors in the United States.
[…]
Experts say that it is rare for criminal charges to be brought in deepfake pornography cases where both the victims and the perpetrators are minors.
“My guess [as to why there aren’t more such prosecutions] is just that there may be a general recognition that arresting children isn’t going to resolve this,” Riana Pfefferkorn, a policy fellow at the Stanford Institute for Human-Centered Artificial Intelligence who has long studied the intersection of child sexual abuse material and AI, told Forbes.
Of course not. But the genie is out of the bottle, and the tools keep getting cheaper, easier to use, and more realistic, so what's going to change? This could reasonably qualify as an AI safety problem, one that has been sidelined because profits are more important.