West Virginia’s attorney general filed a lawsuit against Apple on Thursday, accusing the iPhone maker of knowingly allowing its software to be used for storing and sharing child sexual abuse material.
John B. McCuskey, a Republican, accused Apple of protecting the privacy of sexual predators who use iOS, which can sync photos to remote cloud servers through iCloud. McCuskey called the company’s choices “completely inexcusable” and accused Apple of running afoul of West Virginia state law.
“Since Apple has so far refused to police themselves and do the morally right thing, I’m filing this lawsuit to demand Apple follow the law, report these images, and stop re-victimizing children by allowing these images to be stored and shared,” McCuskey said.
The West Virginia attorney general said the state would seek “statutory and punitive damages,” changes to Apple’s child abuse imagery detection practices, and other remedies to make the company’s product designs “safer going forward.”
In the new lawsuit, the state cites a handful of known complaints about Apple’s largely hands-off approach to its image-hosting service. The biggest concern: Apple finds far fewer instances of online child exploitation than its peer companies do because it isn’t looking for them.
In a statement provided to Fast Company, Apple pointed to an iOS feature that “automatically intervenes” when nudity is detected on a child’s device. “All of our industry-leading parental controls and features . . . are designed with the safety, security, and privacy of our users at their core,” an Apple spokesperson said.
Apple walks the privacy tightrope
The West Virginia lawsuit isn’t the first of its kind that Apple has faced in recent years, though it is the first to come from a state. In late 2024, a group of thousands of sexual abuse survivors sued the company for more than $1 billion in damages after Apple walked away from a plan to more thoroughly scan the photos it hosts for sexual abuse material. In that case, the plaintiffs’ legal team cited 80 instances in which law enforcement discovered child sexual abuse imagery on iCloud and other Apple products.
Most tech companies rely on a tool developed by Microsoft more than a decade ago to automatically scan the images they host and cross-reference those images against digital signatures in a database of known child-abuse imagery. That tool, called PhotoDNA, flags those images and acts as the first step in a reporting chain that leads to law enforcement.
In the U.S., internet platforms are required by law to report any instances of suspected child sexual abuse material (CSAM) to the National Center for Missing & Exploited Children, the organization that spearheads child abuse prevention in the country. NCMEC collects tips from online platforms through a centralized CSAM-reporting system, called the CyberTipline, and forwards those concerns, many generated via PhotoDNA, to the relevant authorities.
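Conceptually, this kind of scanning reduces to computing a signature for each hosted image and checking it against a database of known signatures. Below is a minimal, purely illustrative Python sketch under assumed names; a SHA-256 digest stands in for PhotoDNA’s proprietary perceptual hash, which, unlike an exact cryptographic hash, is designed to keep matching an image after resizing or re-encoding.

```python
import hashlib
from pathlib import Path

# Hypothetical signature database. A production system would hold PhotoDNA
# perceptual hashes rather than SHA-256 digests; SHA-256 is an exact-match
# stand-in here because PhotoDNA itself is proprietary.
KNOWN_SIGNATURES: set[str] = set()


def image_signature(path: Path) -> str:
    """Compute a signature for a hosted image (exact-match stand-in)."""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def flag_if_known(path: Path) -> bool:
    """Return True when the image matches a known signature, meaning it
    should enter the reporting chain (platform review, then a CyberTipline
    report to NCMEC as U.S. law requires)."""
    return image_signature(path) in KNOWN_SIGNATURES
```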
In 2023, NCMEC received only 267 reports of suspected CSAM from Apple. Over the same period, the organization received 1.47 million reports from Google, 58,957 reports from Imgur, and 11.4 million reports from Meta-owned Instagram.
Apple appears to understand the scale of the problem. “We are the greatest platform for distributing child porn,” Apple executive Eric Friedman said in an infamous 2020 text message that surfaced in discovery during the lengthy court battle between Apple and Fortnite maker Epic Games. Friedman made the statement in a conversation about whether the company’s policies are weighted too heavily toward user privacy rather than safety.
Apple is known for strong privacy practices that make its products famously safe from potential hackers. Over the years, those same encryption systems have frustrated law enforcement agencies, such as the FBI, that have sought data locked away on iPhones in the course of their investigations.
“At Apple, protecting the safety and privacy of our users, especially children, is central to what we do,” an Apple spokesperson said. “We are innovating every day to combat ever-evolving threats and maintain the safest, most trusted platform for kids.”