How recommender algorithms threaten election integrity
A study published by EDRi member Asociația pentru Tehnologie și Internet (ApTI) Romania analysed how the recommender algorithms on Facebook, Instagram and TikTok distributed political content during the 2025 presidential election. The quantitative analysis identified cases in which these social media platforms complied neither with national electoral laws nor with EU regulations such as the Digital Services Act (DSA).
How do recommender systems disseminate political content during elections?
During the 2025 Romanian presidential election campaign, ApTI conducted a quantitative analysis of how the recommendation algorithms on Facebook, Instagram and TikTok distributed political content. The goal was to evaluate the platforms’ compliance with user choices, electoral legislation, and mandates to protect underage users.
Inspired by a methodology used in similar election-focused studies during the German Federal Elections, the researchers gathered data using four new accounts on the social media platforms. Three accounts were registered as adults and followed only the official accounts of the presidential candidates, while one account was registered as underage and followed only educational and gaming content.
To limit algorithmic inference, the accounts were operated on smartphones reset to their factory settings, with both GPS and Wi-Fi turned off. Following the methodology of similar studies, these accounts did not create new posts or stories, nor did they like or share the posts in their feeds.
A quantitative analysis of thousands of posts revealed that recommender algorithms consistently overrode explicit user preferences. “Adult” users were shown political content from candidates’ official accounts in similar proportions to political content posted by personal accounts that they did not follow or engage with.
During the legally mandated electoral silence period, when all campaigning was prohibited under Romanian law, the amount of political content did not decrease. On the contrary, some accounts saw a greater proportion of political content than non-political content compared to during the campaign.
Similarly, both TikTok and Instagram delivered political content to the underage account. During the electoral silence period, 80% of the posts in the underage account’s TikTok feed carried political messaging.
Social media algorithms pose threats to election integrity
The research was conducted after the Romanian Constitutional Court annulled the 2024 presidential elections. In its decision, the Court cited findings of alleged foreign interference conducted through the “manipulation of digital technologies” and noted that the integrity of the electoral process was compromised because one candidate received disproportionate online exposure amplified by algorithms.
The European Commission opened an investigation into ByteDance, the owner of TikTok, to assess whether the company failed to mitigate systemic risks under the Digital Services Act (DSA). The preliminary conclusions published by the Commission do not address risk in the context of elections. The investigation is ongoing.
ApTI’s leading question in the quantitative analysis was whether algorithmic recommendation systems could facilitate fair and balanced electoral debates. The findings indicate the contrary.
Exposure of minors to political content, particularly during an election period, points to a major risk mitigation failure. It is understandably complicated, yet not impossible, for platforms to prevent their algorithms from recommending political content during mandatory electoral silence periods. The increase in the volume of political content propagated by recommender algorithms is an infringement of national laws and a threat to electoral integrity.
A study published in Science in November 2025 and summarised by TechPolicy.press provided some of the clearest evidence that social media algorithms exacerbate political polarisation and directly shape electoral outcomes. With indications that algorithms systemically push polarising political messages and expose minors to political content while infringing national electoral laws, these findings raise concerns about the systemic risks that social media platforms pose to election integrity.
Possible safeguards
To mitigate the systemic risks posed by recommender algorithms during elections, ApTI proposes that large online social media platforms give users incentives and tools to curate their own feeds. Very Large Online Platforms (VLOPs), as designated under the DSA, are mandated to offer a chronological feed that is not populated using recommender algorithms. An Amsterdam court ruled that Meta must honour user-expressed preferences for a chronological feed.
The results of the annulled 2024 presidential elections in Romania revealed that the structural flaws of the algorithms can be easily exploited for political gain, as social media platforms remain the main battleground during electoral campaigns.
ApTI’s findings underline a gap between the existing legal frameworks and their enforcement during elections. The EDRi member organisation believes that, without making drastic changes to their operating model, the platforms cannot provide a favourable and fair environment for electoral campaigns. Furthermore, while the European Commission’s investigation into TikTok is essential for holding the company accountable for its systemic failures, civil society monitoring and watchdog organisations like ApTI remain the first line of defence. Authorities should require platforms to develop new systems that would mitigate these risks through user-led curation of content.
Contribution by: EDRi member, Asociația pentru Tehnologie și Internet (ApTI)
