Elon Musk has made himself Europe’s digital public enemy number one.

Since Hamas attacked Israel on Saturday, the billionaire’s social network X has been flooded with gruesome images, politically motivated lies and terrorist propaganda that authorities say appear to violate both its own policies and the European Union’s new social media law.

Now Musk is facing the threat of sanctions — including potentially hefty fines — as officials in Brussels start gathering evidence in preparation for a formal investigation into whether X has broken the European Union’s rules. Authorities in the U.K. and Germany have joined the criticism.

The tussle represents a critical test for all sides. Musk will be keen to fight any claim that he’s failing to be a responsible owner of the social network formerly known as Twitter — all while upholding his commitment to free speech. The EU will want to show its new regulation, known as the Digital Services Act (DSA), has teeth.

Thierry Breton, Europe’s commissioner in charge of social media content rules, demanded that Musk explain why graphic images and disinformation about the Middle East crisis were widespread on X.

“I urge you to ensure a prompt, accurate and complete response to this request within the next 24 hours,” Breton wrote on X late Tuesday.

“We will include your answer in our assessment file on your compliance with the DSA,” said Breton. “I remind you that following the opening of a potential investigation and a finding of non-compliance, penalties can be imposed.” Those fines, under Europe’s content rulebook, can total up to 6 percent of a company’s global revenue.

The heat on Twitter did not begin with the Hamas attacks. Ever since Musk bought the platform, he’s been hit by criticism that he’s failing to stop hate speech from spreading online.

X has cut back on its content moderation teams, in the spirit of promoting free speech; pulled out of a Brussels-backed pledge to tackle digital foreign interference; and tweaked its social media algorithms to promote often shady content over verified material from news organizations and politicians.

Musk has responded — via his social media account with 159 million followers — with jeers and attacks on his naysayers. But the latest uproar over content apparently inciting and praising terrorism has made it a surefire bet that X will be one of the first companies to be investigated under the EU’s social media rules.

In response to Breton’s demand, Musk asked the French commissioner to outline how X had potentially violated Europe’s content regulations. “Our policy is that everything is open source and transparent,” he added. In the U.K., Michelle Donelan, the country’s digital minister, also met with social media executives Wednesday to discuss how their firms were combating online hate speech.

The probe is coming

In truth, an investigation into X’s compliance with Europe’s new content rulebook has been on the cards for months. Over the summer, Breton and senior EU officials visited the company’s headquarters in San Francisco for a so-called “stress test” to see how it was complying.

Under the EU’s legislation, tech giants like X, TikTok and Facebook must carry out lengthy risk assessments to figure out how hate speech and other illegal content can spread on their platforms. These firms must also allow greater access to external auditors, regulators and civil society groups that will track how social media companies are complying with the new oversight.

Investigations into potential wrongdoing under Europe’s content rules will likely involve months-long inquiries into a company’s behavior, a Commission decision on whether to levy fines or other sanctions, and an all-but-certain appeal from the firm in response. Such cases are expected to take years to complete.

Within Brussels, the Commission has been compiling evidence of potential wrongdoing across multiple social media companies, even before the EU’s new content legislation came into full force in August, according to five officials and other individuals with direct knowledge of the matter.

The goal is to start at least three investigations linked to the Digital Services Act by early next year, according to three of those people. They spoke on condition of anonymity because the discussions are not public and remain ongoing.

In recent days, Commission officials have been compiling evidence associated with Hamas’ attacks on Israel — much of which has been shared on X with little, if any, pushback from the company.

That content included verified X accounts with ties to Russia and Iran reposting graphic footage of alleged atrocities targeting Israeli soldiers. Some of these posts have been viewed hundreds of thousands of times. Other accounts linked to Hezbollah and ISIS have similarly posted widely with few, if any, removals.

It is unclear whether such footage will lead to a specific investigation into X’s handling of the most recent violent content. But it has reinforced the likelihood that Musk will soon face legal consequences for not removing such material from his social network.

Combating violent and terrorist content requires “people sitting at a computer screen and looking at this and making judgments,” said Graham Brookie, senior director of the Atlantic Council’s Digital Forensic Research Lab, which has tracked the online footprint of Hamas’ ongoing attacks. “It used to be that there were dozens of people that do that at Twitter, and now there’s only a handful.”

Steven Overly contributed reporting from Washington.
