The EU on Tuesday launched an investigation into Meta’s Facebook and Instagram over concerns the platforms are failing to counter disinformation ahead of EU elections in June.
The probe falls under the EU’s new Digital Services Act, a landmark law that cracks down on illegal content online and forces the world’s biggest tech companies to do more to protect their users.
The European Commission said it suspected Meta’s moderation of adverts was “insufficient” and that an increase in paid spots in those conditions could harm “electoral processes and fundamental rights, including consumer protection rights”.
EU leaders are especially worried about Russian attempts to manipulate public opinion and undermine European democracy.
The probe seeks “to make sure that effective actions are taken in particular to prevent that Instagram’s and Facebook’s vulnerabilities are exploited by foreign interference,” EU internal market commissioner Thierry Breton said.
“We suspect that Meta’s moderation is insufficient, that it lacks transparency of advertisements and content moderation procedures,” commission executive vice-president Margrethe Vestager said in a statement.
Facebook and Instagram are among 23 “very large” online platforms that must comply with the DSA or risk fines of up to six percent of a platform’s global turnover, or even a ban in egregious cases.
Other platforms include Amazon, Snapchat, TikTok and YouTube.
Meta did not comment on the investigation’s specific focus, saying more generally that the US company had “a well-established process for identifying and mitigating risks on our platforms”.
A Meta spokesperson added: “We look forward to continuing our cooperation with the European Commission and providing them with further details of this work.”