Court ordered penalties for 15 teens who created naked AI images of classmates

(credit: master1305 | iStock / Getty Images Plus)

A Spanish youth court has sentenced 15 minors to one year of probation for spreading AI-generated nude images of female classmates in two WhatsApp groups.

The minors were charged with 20 counts of creating child sex abuse images and 20 counts of offenses against their victims’ moral integrity. In addition to probation, the teens will also be required to attend classes on gender and equality, as well as on the "responsible use of information and communication technologies," a press release from the Juvenile Court of Badajoz said.

Many of the victims were too ashamed to speak up when the inappropriate fake images began spreading last year. Prior to the sentencing, a mother of one of the victims told The Guardian that girls like her daughter "were completely terrified and had tremendous anxiety attacks because they were suffering this in silence."

Millions of OnlyFans paywalls make it hard to detect child sex abuse, cops say

(credit: SOPA Images / Contributor | LightRocket)

OnlyFans' paywalls make it hard for police to detect child sexual abuse materials (CSAM) on the platform, Reuters reported. Newly created CSAM, in particular, can be harder to uncover online.

Because each OnlyFans creator posts content behind their own paywall, five specialists in online child sexual abuse told Reuters that it's hard to independently verify just how much CSAM is posted. Police would seemingly need to subscribe to each account to monitor the entire platform, Trey Amick, an expert who aids in police CSAM investigations, suggested to Reuters.

OnlyFans claims that the amount of CSAM on its platform is extremely low. Out of 3.2 million accounts sharing "hundreds of millions of posts," OnlyFans removed only 347 posts as suspected CSAM in 2023. Each post was voluntarily reported to the CyberTipline of the National Center for Missing and Exploited Children (NCMEC), which OnlyFans told Reuters has "full access" to monitor content on the platform.
