‘This conduct is despicable’: State sues Apple over providing iCloud services for ‘child porn’

WND 

Apple has already denied the allegations in a lawsuit brought by private individuals, but now a state, West Virginia, has gone to court to demand action over the tech corporation’s involvement, through its iCloud services, in child sex abuse.

State Attorney General JB McCuskey has filed documents accusing Apple of prioritizing user privacy over child safety.

A statement from his office called its case against Apple the first of its kind by a government agency over the distribution of child sexual abuse material on the tech company’s data storage platform.

A report by Reuters said the state’s complaint cited a 2020 text message from the corporation’s anti-fraud chief acknowledging that, because of Apple’s priorities, iCloud was “the greatest platform for distributing child porn.”

“The state said it is seeking statutory and punitive damages and that the lawsuit filed in Mason County Circuit Court asks a judge to force Apple to implement more effective measures to detect abusive material and implement safer product designs,” Reuters said.

The report noted the company had previously considered scanning images but dropped the plan over worries about user privacy.

Among the concerns is that a government could look for material to support censorship or arrests.

But McCuskey said there’s a reason a crackdown is needed.

“These images are a permanent record of a child’s trauma, and that child is revictimized every time the material is shared or viewed,” his statement said. “This conduct is despicable, and Apple’s inaction is inexcusable.”

Other online platforms scan uploaded photos against a database of identifiers for known child sexual abuse material. Apple’s platform, by contrast, was unencrypted, meaning law enforcement could search it with a warrant.

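That hash-matching approach can be illustrated with a short sketch. The example below is purely illustrative and rests on assumptions not in the article: it uses a hypothetical set of known-image identifiers and a plain SHA-256 digest of file bytes, whereas real platforms rely on perceptual hashes supplied by clearinghouses such as the National Center for Missing and Exploited Children. It does not represent Apple’s or any other company’s actual implementation.

import hashlib

# Hypothetical database of identifiers for known abusive material
# (placeholder value for illustration only).
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def file_identifier(path: str) -> str:
    # Compute a hex digest of the file's bytes. Real systems use perceptual
    # hashes that tolerate resizing and re-encoding, not exact-match SHA-256.
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def should_flag_upload(path: str) -> bool:
    # Flag an upload when its identifier matches the known-bad database.
    return file_identifier(path) in KNOWN_HASHES
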
Apple later considered encrypting the platform but dropped the idea.

Reuters said, “In August 2021, Apple announced NeuralHash which it designed to balance the detection of child abuse material with privacy by scanning images on users’ devices before upload.” But that, too, was soon abandoned.

Now it has a feature called Communication Safety that blurs images containing nudity.

The report pointed out the state is charging, “Federal law requires U.S.-based technology companies to report abuse material to the National Center for Missing and Exploited Children. Apple in 2023 made 267 reports, compared to 1.47 million by Google and 30.6 million by Meta Platforms.”