Before the summer ends, California may pass the first US bill that would hold social media companies liable for product features that research has found are harmful to children. If passed, the law could have far-reaching consequences, potentially impacting how kids throughout the US use social media sites like TikTok, Instagram, and Snapchat. On Tuesday, the bill—the Social Media Platform Duty to Act—cleared what The Wall Street Journal called “a crucial vote in the State Senate.” Although much of the prior reporting on the bill focused on its earlier goal of granting parents the right to sue over harm to individual children, WSJ reports that the amended version would instead “permit the state attorney general, local district attorneys, and city attorneys in California’s four largest cities to sue social media companies” for unfair business practices known to harm children.
Following this week’s 8-0 vote in the California Senate Judiciary Committee, the bill must clear a few more hurdles before the end of the legislative session in August to stay alive. It now moves to the state Senate Appropriations Committee, then to a full Senate vote, before it could be signed into law by California Governor Gavin Newsom. Though it seems unlikely that passing this version of the bill would lead to many lawsuits, the bill states that the “heart” of the legislation is to deter companies like Meta and TikTok owner ByteDance Ltd. from creating features specifically designed to make their platforms addictive to children. And the law’s impact would likely go beyond California. Because social media companies would be unlikely to build technology specific to California users, Yahoo Finance tech editor Dan Howley suggested that, if passed, the bill could force new standards at social media companies that would apply to “every user throughout the US.”
More children “online almost constantly”
The California bill was introduced following a whistleblower report of an internal Facebook study showing that Facebook (now Meta) was aware its products were addictive to children. The problem has grown as more children report being online and engaging with social media. The bill cites surveys showing that 40 percent of tweens use social media and 45 percent of teenagers say they are “online almost constantly,” as well as a 2018 report showing that 70 percent of teens use social media. It argues that by logging onto social media, minors are subjected to negative mental and emotional effects, and it notes that minors began spending even more time online during the pandemic.
If passed, the law would allow lawsuits if enough evidence showed that a design or feature was “a substantial factor” in causing a child user's addiction or harm, that the negative impact could have been reasonably foreseen by the company, and that the child did, in fact, become addicted or harmed. Civil penalties could reach $25,000 per violation, with additional penalties of up to $250,000 for knowing and willful violations. In the bill, addiction is defined as a “preoccupation or obsession” with social media, where children experience withdrawal that makes it hard to reduce or quit using social media when they want to. Harm from this addiction is defined as “physical, mental, emotional, developmental, or material harms to the user.”
Opposition from tech-industry groups
The WSJ reports that consumer groups, youth advocates, and teachers' unions spoke in support of the bill during the most recent hearing, but the bill has faced continual opposition from Internet-privacy advocates and business and tech-industry groups. The watered-down version that passed this week, WSJ notes, was the result of lobbying to remove both retroactive application of the law and a parent's right to sue over addiction or harm to their children. That lobbying is expected to continue as the bill moves forward. Meta was the only social media company that commented for the WSJ report, saying the California bill “would do nothing to encourage companies to make meaningful changes.”
Meta told Ars Technica that it has no further comment but noted that last week the company added new tools to increase parental supervision and to notify teens to take a break on Instagram. Meta also shared new information on its internal process for ensuring that product design focuses on “youth well-being.” ByteDance and Snap did not immediately respond to requests for comment on this story. The bill's future is unclear, but if it becomes California law this summer, it wouldn't just affect the young people aged 13 and up targeted by social media; it would also impact any future social media products designed for younger kids. Howley pointed to Meta's proposed Instagram for kids under 13 as a product indicating social media platforms' interest in breaking into younger markets.