Less than two weeks ahead of the European Parliament election, AI-powered deepfakes aimed at swaying voters are circulating on social media and online news outlets. In particular, the Kremlin has launched targeted disinformation campaigns via a scattered network of proxies – well-rooted across the EU – designed to disrupt the democratic process and weaken European support for Ukraine.
The risks of such foreign interference and disinformation are significant – European Commission president Ursula von der Leyen recently identified them as among the biggest threats to European democracy. The European Parliament has passed landmark legislation, such as platform accountability measures and AI transparency requirements, to upgrade Europe’s response to harmful online content, but the European Union is failing to keep up with Russia’s disinformation efforts. Some member states are taking the lead, whether by establishing bodies to counter digital interference, as France has done with Viginum, or by filling gaps in the enforcement of digital regulation, as in Belgium. However, this leads to a patchy and uncoordinated European response.
Building a European Democracy Shield
Von der Leyen has promised that, if re-elected, she will set up a “European Democracy Shield” to detect, track, and delete deceitful online content in coordination with national agencies. The initiative would also take a tougher approach towards conventional and AI-engineered disinformation, focusing on pre-emptive debunking – so-called pre-bunking – and resilience-building. The scope and rollout of the plan are yet to be disclosed.
The European Democracy Shield could bring about a coordinated and assertive response to the individuals and media outlets that channel disinformation – regardless of whether von der Leyen leads the next commission. To achieve this, the EU should use the shield to:
Break down the existing silos in the EU’s approach. Together with the commission, member states should work towards a collective strategy on countering foreign interference, as called for by the Weimar Triangle. The commission, in consultation with the European Parliament, should then launch a new EU-level taskforce to nurture this vision across member states through information sharing, research and institutional coordination, and by boosting media literacy. This taskforce should also bridge gaps in the existing tools of digital governance and push for improvements where necessary. For example, smaller platforms such as Telegram are not currently obliged to carry out risk assessments for disinformation and propose mitigation measures, because their user numbers fall below the relevant threshold. The taskforce could put forward regulation for smaller platforms that risk spreading disinformation.
Work with like-minded third-country partners through the European External Action Service’s Digital Diplomacy efforts by sharing relevant insights and exchanging best practices. The focus should be on regions that share a human-centric approach to emerging technologies and that could benefit from enhanced strategic engagement with the EU.
Disinformation hotbed
There has been a surge in Russian disinformation ahead of the European Parliament election, from fake news about ‘unprecedented migration flows’ in Bulgaria and foreigners ‘assaulting people’ on the streets of Sofia, to photos falsely claiming to show Slovak prime minister Robert Fico’s alleged attacker alongside the father of the opposition party’s leader. But disinformation is not limited to election periods. Whatever the outcome of the election, fighting information manipulation should be high on the EU’s agenda for the next institutional term.
The European Council on Foreign Relations does not take collective positions. ECFR publications only represent the views of their individual authors.