HOW THE RULING WAS DECIDED
In the Los Angeles case, Kaley’s lawyers argued that Meta and Google intentionally targeted kids through platform design, rather than content, and made decisions that prioritised profit over safety.
The lawyers’ strategy made it harder for the companies to hide behind legal provisions such as Section 230, which generally shields platforms from liability for user-generated content.
Jurors were shown internal documents revealing how Meta and Google sought to attract younger users, and heard testimony from executives, including Meta CEO Mark Zuckerberg.
One juror, who identified herself only as Victoria, said the panel focused heavily on what protections the platforms had in place to shield Kaley from harm, as well as on the long-term consequences for future young users.
“We looked at the history of everything that Kaley went through, and what was the process that these platforms had in place that was going to possibly prevent any harm,” she said.
Collin Walke, a partner and head of the cybersecurity and data privacy practice at law firm Hall Estill, said the case’s focus on platform design, rather than content, was central to the eventual ruling.
The content posted on social media is not the companies’ responsibility, Walke explained.
“But what is their responsibility is the manner and method by which they design their algorithms in order to show you that content,” he said.
“And that is a unilateral choice that they make in the design of their products – and that is why they were found liable here.”
