
Apple “clearly underreporting” child sex abuse, watchdogs say

22 July 2024 at 12:46

(Image credit: Bloomberg / Contributor | Bloomberg)

After years of controversies over plans to scan iCloud to find more child sexual abuse materials (CSAM), Apple abandoned those plans last year. Now, child safety experts have accused the tech giant of not only failing to flag CSAM exchanged and stored on its services—including iCloud, iMessage, and FaceTime—but also allegedly failing to report all the CSAM that is flagged.

The United Kingdom’s National Society for the Prevention of Cruelty to Children (NSPCC) shared UK police data with The Guardian showing that Apple is "vastly undercounting how often" CSAM is found globally on its services.

According to the NSPCC, police investigated more CSAM cases in the UK alone in 2023 than Apple reported globally for the entire year. Between April 2022 and March 2023 in England and Wales, the NSPCC found, "Apple was implicated in 337 recorded offenses of child abuse images." But in 2023, Apple reported only 267 instances of CSAM to the National Center for Missing & Exploited Children (NCMEC), supposedly representing all the CSAM on its platforms worldwide, The Guardian reported.


Court ordered penalties for 15 teens who created naked AI images of classmates

10 July 2024 at 16:31

(Image credit: master1305 | iStock / Getty Images Plus)

A Spanish youth court has sentenced 15 minors to one year of probation for spreading AI-generated nude images of female classmates in two WhatsApp groups.

The minors were charged with 20 counts of creating child sex abuse images and 20 counts of offenses against their victims’ moral integrity. In addition to probation, the teens will also be required to attend classes on gender and equality, as well as on the "responsible use of information and communication technologies," a press release from the Juvenile Court of Badajoz said.

Many of the victims were too ashamed to speak up when the inappropriate fake images began spreading last year. Prior to the sentencing, a mother of one of the victims told The Guardian that girls like her daughter "were completely terrified and had tremendous anxiety attacks because they were suffering this in silence."


Millions of OnlyFans paywalls make it hard to detect child sex abuse, cops say

3 July 2024 at 11:12

(Image credit: SOPA Images / Contributor | LightRocket)

OnlyFans' paywalls make it hard for police to detect child sexual abuse materials (CSAM) on the platform, Reuters reported—especially new CSAM that can be harder to uncover online.

Because each OnlyFans creator posts their content behind their own paywall, five specialists in online child sexual abuse told Reuters that it's hard to independently verify just how much CSAM is posted. Police would seemingly need to subscribe to each account to monitor the entire platform, Trey Amick, an expert who assists police in CSAM investigations, suggested to Reuters.

OnlyFans claims that the amount of CSAM on its platform is extremely low. Out of 3.2 million accounts sharing "hundreds of millions of posts," OnlyFans only removed 347 posts as suspected CSAM in 2023. Each post was voluntarily reported to the CyberTipline of the National Center for Missing and Exploited Children (NCMEC), which OnlyFans told Reuters has "full access" to monitor content on the platform.

