Parents and lawmakers say executives are not doing enough to thwart dangers, including sexual exploitation and bullying.
CEOs from Meta, TikTok, X and other companies have been grilled by United States lawmakers over the dangers that children and teens face on their social media platforms.
On Wednesday, the executives testified before the US Senate Judiciary Committee amid a torrent of anger from parents and lawmakers that companies are not doing enough to thwart online dangers for children, such as blocking sexual predators and preventing teen suicide.
“They’re responsible for many of the dangers our children face online,” US Senate Majority Whip Dick Durbin, who chairs the committee, said in opening remarks. “Their design choices, their failures to adequately invest in trust and safety, their constant pursuit of engagement and profit over basic safety have all put our kids and grandkids at risk.”
Durbin cited statistics from the National Center for Missing and Exploited Children non-profit group that showed financial “sextortion”, in which a predator tricks a minor into sending explicit photos and videos and then demands payment to keep them private, had skyrocketed last year.
The committee also played a video in which children spoke about being victimised on the social media platforms. “I was sexually exploited on Facebook,” said one child in the video, who appeared in shadow.
“Mr Zuckerberg, you and the companies before us, I know you don’t mean it to be so, but you have blood on your hands,” said Senator Lindsey Graham, referring to Mark Zuckerberg, CEO of Meta, the company that owns Facebook and Instagram. “You have a product that’s killing people.”
Zuckerberg testified along with X CEO Linda Yaccarino, Snap CEO Evan Spiegel, TikTok CEO Shou Zi Chew and Discord CEO Jason Citron.
X’s Yaccarino said the company supported the STOP CSAM Act, a bill introduced by Durbin that seeks to hold tech companies accountable for child sexual abuse material and would allow victims to sue tech platforms and app stores. The bill is one of several aimed at addressing child safety. None have become law.
X, formerly Twitter, has come under heavy criticism since Tesla and SpaceX CEO Elon Musk bought the platform and loosened moderation policies. This week, the company blocked searches for pop singer Taylor Swift after fake sexually explicit images of Swift spread on the platform.
Wednesday also marked TikTok CEO Chew’s first appearance before US lawmakers since March, when he faced harsh questions over the Chinese-owned short video app, including some suggesting it was damaging children’s mental health.
“We make careful product design choices to help make our app inhospitable to those seeking to harm teens,” Chew said, adding that TikTok’s community guidelines strictly prohibit anything that puts “teenagers at risk of exploitation or other harm – and we vigorously enforce them”.
At the hearing, the executives touted existing safety tools on their platforms and the work they’ve done with non-profits and law enforcement to protect minors.
Ahead of the heated session, Meta and X announced new child-safety measures.
Yet child health advocates say the social media companies have failed repeatedly to protect minors.
“When you’re faced with really important safety and privacy decisions, the revenue in the bottom line should not be the first factor that these companies are considering,” said Zamaan Qureshi, co-chair of Design It For Us, a youth-led coalition advocating for safer social media.
“These companies have had opportunities to do this before. They failed to do that, so independent regulation needs to step in.”