
The states are moving because they believe social media is contributing to rising rates of mental illness among children, and because Congress hasn’t acted. There’s bipartisan support on Capitol Hill to do more, but lawmakers there can’t agree on whether a national privacy standard should override state laws.

A national standard that protects all kids would be best, said Vermont state Rep. Monique Priestley, a Democrat who recently introduced a child-safe design bill. “In the meantime, there’s a great network of states that are kind of coming together … to fill in that gap.”

Some states, like Utah and Arkansas, aren’t mandating site design changes but have passed laws requiring minors to get parental consent to access social media. South Carolina and New York lawmakers are considering bills that would regulate the algorithms social media companies use to direct content to minors.

On a separate tack, 33 states sued Meta, parent company of Facebook and Instagram, in October in federal court in San Francisco, alleging it violated children’s privacy. If successful, the suit could also force the company to change its sites.

The legal battle is wide-ranging and the outcome far from clear. A 2022 California law, the first to mandate website design changes, is now in limbo after a tech industry group challenged it in federal court.

The prospect of having to comply with varying state laws has alarmed the tech firms, which are moving to convince state lawmakers new rules aren’t needed.

To do that, the firms are tightening their own controls over what kids see online. Meta is rolling out new protections that help children avoid content deemed harmful, such as posts about violence, sex or eating disorders.

The firms insist they don’t oppose regulation but would prefer a national standard to a patchwork of 50 state rules.

“Laws that hold different apps to different standards in different states will leave teens with inconsistent online experiences,” said Liza Crenshaw, a public affairs manager at Meta.

Federal rules or federalism

Eighteen months ago, the House Energy and Commerce Committee voted 53-2 to pass the American Data Privacy and Protection Act to give Americans more control of their data and ban targeted advertising to minors.

It would have also created a new Federal Trade Commission division charged with considering additional rules to protect kids online.

But it failed to advance in the Senate, where the chair of the panel with jurisdiction, Maria Cantwell (D-Wash.), declined to take it up. Cantwell said it was too weak on enforcement considering it would preempt more robust state privacy laws like California’s.

State preemption is non-negotiable for many Republicans, who foresee a legal minefield and an impediment to entrepreneurs if states can layer their own rules on top of a federal privacy standard.

“The only entities with the legal sophistication to comply with a regulatory landscape that complex are large companies with buildings full of lawyers,” said GOP Rep. Jay Obernolte, who represents a California district east of Los Angeles.

Cantwell’s Commerce Committee advanced two kid-focused privacy bills in July, but the full Senate has yet to vote.

Republicans object because one of the measures, the Kids Online Safety Act, would not preempt state laws and, in their view, would spur lawsuits.

“I don’t think it’s a good idea to create a new litigation magnet when we have an opportunity in advance to solve future conflicts,” Sen. Ted Cruz (R-Texas) said at the time.

House backers of federal privacy legislation are continuing to build the case nonetheless. Energy and Commerce Chair Cathy McMorris Rodgers (R-Wash.) has held seven hearings on data privacy and launched an investigation into how data brokers profit off data.

But disagreements over whether a federal law reining in social media would serve as a regulatory floor on which states can build, or as a ceiling beyond which states cannot go, have thus far proven intractable.

Evidence of harm

Still, pressure to act is rising. In 2021, more than 40 percent of high school students felt so sad or hopeless over a two-week period that they stopped keeping up with their regular pastimes, according to the Centers for Disease Control and Prevention’s most recent Youth Risk Behavior Survey. The survey also said that 30 percent of teen girls seriously considered suicide, up from 19 percent 10 years earlier.

Experts are concerned that social media companies are contributing to the problem — and profiting from it.

In testimony to a Senate Judiciary subcommittee in November, former Facebook engineering director Arturo Béjar said that company data showed that a fifth of 13- to 15-year-olds experienced bullying on the platform and 13 percent experienced unwanted sexual advances. Another 40 percent found themselves making negative social comparisons.

So far, 13 states have passed 23 online child safety laws, according to a 2023 report from the Center on Technology Policy at the University of North Carolina at Chapel Hill.

Advocates of the design code laws say they want to create kid safety standards for online products, just as they exist for other products. The bills are broad in scope in an effort to close potential loopholes, requiring the tech firms to assess their sites’ features and mitigate any harms they find.

“Tech is the one space where we have not applied the lens of child safety,” said Minnesota state Rep. Kristin Bahner, a Democrat behind one of the forthcoming bills.

Rather than trying to broadly restrict access to content, which could run afoul of the First Amendment, Priestley said her bill aims to stop companies from taking advantage of kids’ data and using it to target them with harmful content they weren’t looking for.

The U.K. pioneered child-safe design requirements in 2021, and the International Association of Privacy Professionals, a privacy advocacy group, says the law has forced social media companies to reduce the data they collect on kids and cut features.

YouTube, for instance, disabled for minors an auto-play feature, which plays videos continuously and which some consider addictive. The Google-owned video-sharing site also added a “take a break” feature and bedtime reminders for kids.

Roy Wyman, a privacy lawyer at Bass Berry & Sims, said a critical mass of states could effectively set a national standard if social media firms grow weary of trying to operate their sites differently across state lines.

Following the U.K. law, for example, Google parent Alphabet implemented child-safe design changes to Google sites and YouTube globally. And according to a policy blog from last year, the company supports age-appropriate design principles.

Tech’s dual response

Social media companies are taking a two-pronged approach to the regulatory push.

Meta is voluntarily making design changes to its sites that it says will protect kids, and it’s pledging support for federal legislation to set rules.

Days after Béjar testified, Meta published a blog post calling for a law that requires parental consent on app downloads for kids under 16.

Meta has also rolled out new tools to limit content recommendations to teens, set stronger privacy settings and make it harder to find self-harm and eating disorder content.

But two industry trade groups of which it’s a member, NetChoice and the Computer & Communications Industry Association, are lobbying state lawmakers to oppose legislation to mandate design changes. NetChoice is going to court to stop new laws.

It argues they violate the First Amendment rights of both the kids and the firms.

NetChoice General Counsel Carl Szabo says these rules put the onus on tech companies to decide what’s appropriate for teens.

“This law is about denying free speech online, and that is why The New York Times filed a brief opposing the CA AADC,” he said.

It was a NetChoice lawsuit that prompted a federal district court in San Jose, Calif., to halt implementation of the child-safe law there in September.

The organization most recently convinced a federal judge in Columbus to stop Ohio’s new Parental Notification by Social Media Operators Act — which would require children to get parental consent to start an account — from going into effect while the judge considers the First Amendment argument.

Last month, California Attorney General Rob Bonta, a Democrat, filed an appeal of the court’s decision to pause his state’s law.

The American Psychological Association and the American Academy of Pediatrics subsequently filed an amicus brief in support of Bonta’s position.

In it, they say the internet and social media present “unique” risks to children.

Adolescents are “vulnerable to many of the manipulative design and privacy practices commonly employed by social media and digital platforms,” they said. “Broad protection across childhood and adolescence is needed.”
