The European Commission said Tuesday Meta faces a new probe over deceptive advertising and political content on its Facebook and Instagram platforms. File Photo by Terry Schmitt/UPI
April 30 (UPI) — The European Commission said on Tuesday that it has opened an investigation into Meta, the parent of Facebook and Instagram, for suspected violations of the Digital Services Act over deceptive advertising and political content on its services.
The commission said in a statement that it was also concerned about the lack of effective third-party, real-time civic discourse and election-monitoring tools ahead of the European Parliament elections.
The commission also pointed out that Meta has yet to replace its public insights tool CrowdTangle.
“This commission has created means to protect European citizens from targeted disinformation and manipulation by third countries,” European Commission President Ursula von der Leyen said in a statement. “If we suspect a violation of the rules, we act.
“This is true at all times, but especially in times of democratic elections. Big digital platforms must live up to their obligations to put enough resources into this and today’s decision shows that we are serious about compliance.”
The commission said it will probe whether Meta's mechanisms for flagging illegal content on its services, as well as its user redress and internal complaint mechanisms, work effectively enough to comply with DSA rules.
“If we cannot be sure that we can trust the content that we see online, there’s a risk that we end up not believing anything at all,” Margrethe Vestager, the commission’s executive vice president for a Europe Fit for the Digital Age, said.
“We suspect that Meta’s moderation is insufficient, that it lacks transparency of advertisements and content moderation procedures. So today, we have opened proceedings against Meta to assess their compliance with the Digital Services Act.”
The European Union’s Digital Services Act took effect in February for all but the smallest platforms, handing tech companies more responsibility for policing their sites for disinformation and misinformation.
It also requires mechanisms for reporting illegal content and protections to keep minors from revealing personal information.