A new California law would require companies to shut off those algorithms by default for users under 18, and implement other mandated tweaks that lawmakers say would reduce the negative mental health effects of social media on children.
The bill, dubbed the Protecting Kids from Social Media Addiction Act by its author, state Sen. Nancy Skinner (D-Berkeley), was announced at a news conference with California Atty. Gen. Rob Bonta on Monday, alongside another proposed law that would tighten privacy protections for minors.
“Social media companies have the ability to protect our kids,” Skinner said. “They could act; they have not.”
One of the act’s key provisions would make a chronological feed the default setting on platforms, showing users posts from the people they follow in the order they were uploaded, rather than arranging content to maximize engagement.
This change would show young users “the things that they want to see, as opposed to the addictive algorithmic feed that is presently being fed to our children,” Bonta said.
The act would also require the default settings on social media apps to mute notifications between midnight and 6 a.m., cap use at one hour daily, and hide “like” counts. Parents — and in practice, most likely, the children using these apps — would be able to change these default settings.
Assemblymember Buffy Wicks (D-Oakland), who introduced the bill to tighten privacy protections for minors, said changing the settings can yield big benefits for children.
“We know there are some kids that will change the default setting,” Wicks said, “but the default setting is a very powerful tool.”
The new bills are just the latest in a string of legislative and regulatory actions taken by California lawmakers and lawyers in recent years aimed at changing how social media companies do business.
In October, Bonta’s office filed a lawsuit against Meta, the parent company of Facebook, Instagram and WhatsApp, alongside 32 other states, alleging that the company designed its apps specifically to addict young users while misleading the public about the adverse effects of these “harmful and psychologically manipulative platform features.”
Portions of internal company documents included in that lawsuit show that Meta knew that more than a million children under 13 were using Instagram, while company officials publicly stated that underage users were not allowed on the platform. The suit also alleges that Mark Zuckerberg, the company’s chief executive, personally vetoed a proposal that would have banned filters that simulate the effects of plastic surgery, despite pushback over the negative effect on girls’ mental health.
Bonta’s office also won a $93-million settlement in a case against Google last year, which alleged that the company had deceived users by collecting their location data for ad targeting and other applications after they had opted out.
But a prior law intended to rein in social media companies’ treatment of young users ran into trouble in the courts last year. A federal judge in San Jose issued a temporary injunction against the California Age-Appropriate Design Code Act in September, ruling that the law likely violates the 1st Amendment rights of the tech companies that it seeks to regulate.
The law, co-authored by Wicks and signed in 2022, would require companies to provide privacy protections to children by default. The court found that enforcement of these provisions could either require more data collection — to verify the age of certain users — or limit the content that adult users were allowed to see. Bonta’s office is appealing the decision.
The lawmakers at the Monday news conference cited research published by the U.S. surgeon general last year as evidence of the harms that social media use inflicts on minors. That report found that “adolescents who spend more than three hours per day on social media face double the risk of experiencing poor mental health outcomes,” nearly half of adolescents said that social media made them feel worse about their body image, and a majority saw “hate-based content” on a regular basis.
“Profit is being made off of our kids and at the expense of their well-being,” Skinner said Monday. The new law “is designed to prevent these very preventable harms.”