HOW THE RULING WAS DECIDED
In the Los Angeles case, Kaley's lawyers argued that Meta and Google deliberately targeted children through platform design, rather than content, and made choices that prioritised profit over safety.
The lawyers' strategy made it harder for the companies to hide behind legal provisions such as Section 230, which generally shields platforms from liability over user-generated content.
Jurors were shown internal documents revealing how Meta and Google sought to attract younger users, and heard testimony from executives, including Meta CEO Mark Zuckerberg.
One juror, who identified herself only as Victoria, said the panel focused heavily on what protections the platforms had in place to shield Kaley from harm, as well as on the long-term consequences for future young users.
"We looked at the history of everything that Kaley went through, and what was the process that these platforms had in place that was going to possibly prevent any harm," she said.
Collin Walke, partner and head of the cybersecurity and data privacy practice at law firm Hall Estill, said the case's focus on platform design rather than content mattered in the eventual ruling.
The content posted on social media is not the responsibility of the companies, Walke explained.
"But what is their responsibility is the manner and method by which they design their algorithms in order to show you that content," he said.
"And that is a unilateral choice that they make in the design of their products – and that is why they were found liable here."