The European Commission began investigating Tuesday whether Meta, the provider of Facebook and Instagram, violated the Digital Services Act (DSA). The goal of the investigation is to prevent disturbances to voting in the June European elections.
In the investigation, the Commission will focus on four areas. The first is non-compliance with DSA obligations relating to deceptive advertising and disinformation. Second, the Commission raised concerns about the transparency of political content and the related obligation to provide users with effective means of redress. The third assessment concerns the availability of Meta's civic discourse and election-monitoring tools ahead of the coming European elections. Finally, the mechanism Meta provides for flagging illegal content is considered problematic because the current system is not deemed user-friendly. Overall, the concerns relate not only to shortcomings in consumer protection but also to potential interference with fair electoral procedures, particularly in the first three areas.
After gathering the relevant evidence, the Commission may take further enforcement steps. In particular, since the EU seeks to ensure free and fair European elections, it can adopt measures within its competences to prevent such disturbance and manipulation. It can also require remedies from Meta, such as correcting or improving its platforms.
The DSA is a 2022 EU regulation. It applies to all intermediary services offered to users in the EU, setting rules for a safe, predictable and trusted online environment. Very Large Online Platforms (VLOPs) must comply with specific obligations set out in the DSA, such as providing user-friendly terms and conditions and ensuring transparent advertising. Facebook and Instagram have been designated as VLOPs since April 2023. If the investigation finds violations in the four areas above, they would constitute infringements of several articles of the DSA.
In the past, the Commission opened proceedings against other VLOPs, including TikTok and X, to assess potential DSA violations. Those proceedings raised concerns about the transparency and quality of content on those platforms and about risks to the well-being of their users.