West Virginia Sues Apple Over Alleged Failure to Prevent Child Sexual Abuse Material
West Virginia AG alleges Apple prioritized user privacy over child safety, reporting just 267 CSAM cases compared to millions by Google and Meta, seeking damages and product changes.
- On Thursday, West Virginia Attorney General JB McCuskey filed a consumer-protection lawsuit in Mason County accusing Apple of enabling CSAM storage and distribution.
- Apple's 2021 plan to scan iCloud images relied on NeuralHash, its CSAM-detection technology, but the state says Apple abandoned the system in favor of privacy branding, a choice McCuskey calls a conscious decision.
- The complaint points to a 2020 internal message in which an Apple executive said "we are the greatest platform for distributing child porn," and cites Reuters data showing Apple filed only 267 CSAM reports in 2023, compared with Google's 1.47 million and Meta's more than 30.6 million.
- Apple has responded by moving to dismiss, arguing the firm is shielded from liability under Section 230 of the Communications Decency Act, a law that provides broad protections to internet companies from lawsuits over content generated by users.
- The suit places Apple at the center of a wider debate over end-to-end encryption and federal duties to report detected CSAM to the National Center for Missing and Exploited Children, and marks the first government lawsuit of its kind, following private litigation and watchdog findings in 2024.
89 Articles
State sues Apple over providing iCloud services for 'child porn' * WorldNetDaily * by Bob Unruh
Apple has already denied the allegations in a lawsuit brought by private individuals, but now a state, West Virginia, has gone to court to demand action over the tech corporation's involvement, through its iCloud services, in child sex abuse. State Attorney General JB McCuskey has filed documents accusing Apple of prioritizing …
The West Virginia Attorney General's office sued Apple on Thursday, claiming that the tech giant allowed child sexual abuse materials (CSAM) to be stored and distributed in its iCloud service.
A U.S. prosecutor announced an action against Apple on Friday, alleging that its iCloud storage service serves as a "refuge" for child sexual abuse material. The action was filed by West Virginia's public prosecutor, John Bohen McCuskey, who accused …
Coverage Details
Bias Distribution
- 59% of the sources are Center