Jury finds Meta, YouTube liable in landmark social media addiction case

A Los Angeles County jury on Wednesday found Meta and YouTube liable in a social media addiction case. File Photo by Adam Vaughn/EPA

March 26 (UPI) — A California jury has found Meta and YouTube liable for negligently designing addictive social media platforms that harm children, in a landmark verdict that could have lasting implications for the tech industry.

The Wednesday verdict marks the first time technology companies have been found liable for creating addictive online products, amid increased scrutiny of the industry and a wave of litigation.

“This jury saw exactly what we presented from the very first day of trial: that these companies built digital spaces designed to negatively influence the brains of children, and they did it on purpose,” Mark Lanier, lead trial counsel and founder of The Lanier Law Firm, said in a statement.

“The evidence showed that Meta and YouTube knew their platforms were hooking children and harming their mental health, and instead of fixing the problem they kept developing features to maximize the time kids spent on their apps. Now a jury has told them that is not acceptable, and you are being held accountable.”

UPI has contacted Meta and YouTube for comment.

The verdict follows a seven-week trial centered on a now-20-year-old plaintiff known to the court by her initials K.G.M., who testified that her use of Instagram, owned by Meta, and YouTube, an Alphabet product, from a young age caused her to develop anxiety, depression, body dysmorphia and suicidal thoughts.

During the trial, she testified that the platforms’ addictive design features, including algorithm-generated recommendations, beauty features and push notifications caused her severe mental harm.

“[The plaintiff] put a human face on what these companies have known for years: that their platforms were engineered to hook young users, and that the children most vulnerable to trauma were the ones they were most effectively reaching,” Rachel Lanier, co-lead counsel and managing attorney of The Lanier Law Firm’s Los Angeles office, said in a statement.

In its verdict, the jury found Meta 70% responsible for the harm the plaintiff suffered and YouTube 30% responsible, and ordered the Mark Zuckerberg-owned tech behemoth and Google‘s video-sharing service to pay her a combined $6 million, half for compensatory damages and half for punitive damages.

Of the punitive damages, Meta is to pay $2.1 million and YouTube $900,000.

This was the first trial in a much larger consolidated case involving more than 1,600 plaintiffs seeking to hold social media companies responsible for the harm they suffered from using those products.

“This is a major victory for the public, for social media users and for child safety,” Libby Liu, CEO of nonprofit legal organization Whistleblower Aid, told UPI in an emailed statement.

“Each successful lawsuit paints a crystal clear picture showing that Meta is not above the law and can and should be held accountable.”

The verdict came down a day after a New Mexico jury found Meta liable for misleading consumers about the safety of its products, ordering the company to pay $375 million in civil penalties for violating the state’s consumer protection laws.

During the trial, state prosecutors showed that Meta’s design features enabled predators to engage in child sexual exploitation, while demonstrating that Meta intentionally designed its platforms to addict young people.

Following the verdict in Los Angeles County, New Mexico Attorney General Raul Torrez, a Democrat, celebrated it as “another critical step toward justice that puts Meta and other big tech executives on notice that they cannot evade responsibility for design choices that jeopardize child safety.”

“We will seek court-mandated changes to Meta’s platforms that offer protections for kids,” he said in a statement.

The rulings come amid growing attention to the effects of social media on youth. In December, Australia banned those under the age of 16 from social media, and other countries are considering similar restrictions.

US jury finds Meta, Google liable in social media addiction trial | Social Media

NewsFeed

A Los Angeles jury has found Alphabet’s Google and Meta liable for $6 million in damages in a landmark social media addiction lawsuit. The case involved a 20-year-old woman who said she became addicted to the apps at a young age due to their platform design. Meta says it plans to appeal the decision.

US jury finds Meta, Alphabet liable in landmark social media addiction case | Social Media News

A California jury found Alphabet’s Google and Meta liable for $3m in damages in a landmark social media addiction lawsuit that accused the companies of being legally responsible for the addictive design of their platforms.

The decision was handed down by a Los Angeles-based jury on Wednesday after more than 40 hours of deliberation across nine days, and more than a month after jurors heard opening statements in the trial.

Among those who testified in the case were Meta CEO Mark Zuckerberg and Instagram head Adam Mosseri, although YouTube chief executive Neal Mohan was not called to testify.

The plaintiff in the case, referred to as KGM or Kaley, was awarded $3m in damages. The 20-year-old said she became addicted to social media at a young age, which exacerbated her mental health issues. She began using YouTube at age six and Meta-owned Instagram at age nine.

Kaley’s legal team alleged that the social media giants used design features intended to hook young users, including notifications and autoplay.

“Today’s verdict is a historic moment — for Kaley and for the thousands of children and families who have been waiting for this day. She showed extraordinary courage in bringing this case and telling her story in open court. A jury of Kaley’s peers heard the evidence, heard what Meta and YouTube knew and when they knew it, and held them accountable for their conduct. Today’s verdict belongs to Kaley,” lawyers for the plaintiff said in a statement shared with Al Jazeera.

Jurors were instructed not to consider the content of the posts and videos Kaley saw on the platforms. That is because tech companies are shielded from legal responsibility for user-posted content under Section 230 of the 1996 Communications Decency Act.

Meta consistently argued that Kaley had struggled with her mental health separate from her social media use, often pointing to her turbulent home life. In a statement following closing arguments, Meta also said “not one of her therapists identified social media as the cause” of her mental health issues. But the plaintiffs did not have to prove that social media caused Kaley’s struggles — only that it was a “substantial factor” in causing her harm.

YouTube focused less on Kaley’s medical records and mental health history and more on her use of the platform itself. The company argued that YouTube is not a form of social media, but rather a video platform, akin to television, and pointed to her declining use as she got older.

According to company data, she spent about one minute per day on average watching YouTube Shorts since the feature launched in 2020. Shorts, the platform’s section for short-form, vertical videos, includes the “infinite scroll” feature that the plaintiffs argued was addictive.

“We disagree with the verdict and plan to appeal. This case misunderstands YouTube, which is a responsibly built streaming platform, not a social media site,” Jose Castaneda, a spokesperson for Google, told Al Jazeera.

Meta did not respond to Al Jazeera’s request for comment.

Snap and TikTok were previously named in the suit but settled with the plaintiff for undisclosed terms before the trial began.

Shifting momentum

The verdict is the latest in a wave of lawsuits targeting social media companies. There is a looming federal social media addiction case slated to begin in June in Oakland, California.

On Tuesday in New Mexico, a jury found that Meta violated state law by misleading users about the safety of Facebook, Instagram, and WhatsApp, and by enabling child sexual exploitation on those platforms.

This case has been closely watched by legal experts, who say the verdict will shape future litigation.

“The fact the jury found Meta and Google liable represents that these cases have real exposure to the social media giants, and are going to frame how future litigation will proceed. Although this case will certainly be appealed, I would not be surprised if Meta and Google are already making changes within their platform to reflect the real exposure, and hopefully, the states will start to enact laws regulating social media in a manner congruent with the ruling,” entertainment lawyer Tre Lovell told Al Jazeera.

Professor Eric Goldman, associate dean for research at the Santa Clara University School of Law, echoed Lovell’s assessment.

“The Los Angeles jury verdict is the first of three bellwether trials in Los Angeles, with more bellwether trials to follow in summer, in the federal case. As such, today’s verdict is just one datapoint about liability and damages. The other trials could reach divergent outcomes, so this jury verdict isn’t the final word on any matter.”

Despite the ruling, Meta’s stock has not taken a hit; the verdict came the same day CEO Mark Zuckerberg was appointed to a new White House advisory council, and the stock is up 0.7 percent. Alphabet’s stock, however, is down 1 percent in midday trading on the heels of the verdict.

California considers restrictions on social media for kids

Meta, YouTube and Snapchat are already under scrutiny for risks they pose for young people. Now they are facing another hurdle in their home state.

California lawmakers are considering legislation to restrict social media use for teens and children under 16 years old. Assemblymember Josh Lowenthal (D-Long Beach) and others introduced a bipartisan bill that would bar social media platforms from allowing users under 16 years old to create or maintain accounts.

The legislation comes amid mounting concerns about how social networks affect the mental health of young people. Anxiety among parents and lawmakers has heightened as platforms and AI chatbots become more intertwined with people’s daily lives.

Last month, tech executives, including Meta’s chief executive and co-founder Mark Zuckerberg, testified in a landmark trial in Los Angeles over a lawsuit that alleges social media is addictive and harms children.

The trial centers on whether tech companies behind platforms such as Instagram, which is owned by Meta, and YouTube can be held liable for allegedly promoting a harmful product and addicting users to their platforms.

California has previously passed legislation aimed at making social media platforms and chatbots safer, but faced pushback from tech industry groups that have sued to stop new laws from taking effect. Tech companies have responded by releasing more parental controls and restrictions for young users.

Other countries have been moving forward with restrictions on social media. Last year, Australia barred children under 16 years old from having social media accounts.

TechNet, whose members include Meta and Google, said in a statement that it hasn’t taken a position on the California bill but doesn’t believe a ban will effectively achieve the Legislature’s goals.

“We support balanced, evidence-based solutions that strengthen protections for young people, equip parents with meaningful tools, and ensure accountability across platforms. Our companies have made significant investments in teen safety and parental controls, and we remain committed to building on that progress,” Robert Boykin, TechNet executive director for California and the Southwest, said in a statement.

The use of social media by young people has divided tech executives.

Pinterest Chief Executive Bill Ready wrote in an op-ed in TIME published on Friday that governments should follow Australia’s lead and ban social media for kids under 16 years old if tech companies don’t prioritize safety.

“Social media, as it’s configured today, is not safe for young people under 16,” he said. “Instead, it’s been designed to maximize view time, keeping kids glued to a screen with little regard for their well-being.”

Lowenthal’s bill cited social media’s dangers such as “exposure to harmful content, compulsive use patterns, exploitation, and adverse impacts on mental health and well-being.”

“Existing age-based restrictions that rely primarily on user self-attestation have proven ineffective and place an unreasonable burden on children and families rather than on the entities that design, operate, and profit from social media platforms,” the bill states.

A spokesman for Lowenthal didn’t immediately respond to a request for comment.
