Presidential Elections on Social Media: Defamation, Bots, and Questionable Research

23 January 2025

The presidential elections confirmed both the importance of social media in election campaigns and the ways in which they can be exploited for electoral manipulation: defamation campaigns, fake profiles, and questionable research disseminated just before the electoral silence period, as shown by Gong’s analysis of the presidential elections on social media.

As part of Gong's comprehensive monitoring of election implementation, we closely followed the period from December 1st 2024 to January 12th 2025, covering the pre-campaign, the unofficial part of the campaign, and the official campaign up to election day. Our analysis encompassed 26 official accounts of eight relevant political candidates on Facebook, X (formerly Twitter), Instagram, and TikTok, as well as 32 anonymous TikTok accounts that disseminated political content during the campaign. We focused in particular on how individual social networks were used, on the political narratives promoted, and on mapping instances of manipulation and defamation campaigns targeting specific political candidates and parties.

Analysis Results

Although 14 of the most relevant candidates were observed, only those who collected 10,000 signatures and thereby became official candidates were included in the analysis: Zoran Milanović, Dragan Primorac, Ivana Kekin, Marija Selak Raspudić, Miro Bulj, Branka Lozo, Tomislav Jonjić, and Niko Tokić Kartelo. Consequently, candidates Mislav Kolakušić, Karolina Vidović Krišto, Ava Karabatić, Anton Filić, Aurora Weiss, and Dražen Keleminec were not included. All observed presidential candidates, except Branka Lozo, treated social media as important channels for their campaigns. The most popular platforms were Facebook and Instagram, where all candidates maintained active accounts. TikTok and X were used by all candidates except Zoran Milanović. These facts indicate that Facebook remains the most popular social network for politicians, but Instagram and TikTok are also emerging as key tools for political communication. This is particularly true for TikTok, which was widely used during the presidential elections by candidates for personalized and more relaxed communication with voters.

From a content perspective, most candidates published identical posts across their social media platforms, without tailoring them to the format of each platform. Significant differences between posts were observed only for Dragan Primorac (who ran a negative campaign against Zoran Milanović on Facebook and Instagram while leading a positive, personalized campaign on TikTok) and Miro Bulj (who emphasized his political positions on Facebook and Instagram while taking a more relaxed approach with memes on TikTok). The topics addressed by the candidates were diverse and included criticism of other candidates (primarily Zoran Milanović and Dragan Primorac), support for social policies (assistance for retirees and youth, affordable housing, and the healthcare system), and self-promotion. Among a number of candidates on the political right, themes of anti-globalism, anti-migration, opposition to “gender ideology,” and criticism of mainstream media stood out.

Zoran Milanović and Dragan Primorac advanced to the second round. Milanović largely focused his campaign on a positive portrayal of his character and previous mandate, with minimal and satirical commentary on other candidates. Dragan Primorac, on the other hand, directed his campaign toward presenting a negative image of Milanović.

The analysis also focused on the same anonymous TikTok accounts that were observed during the parliamentary elections, specifically accounts that create or share political content, but whose real authors are unknown. Of the 32 accounts observed, eight were inactive during the presidential elections, while six were shut down, indicating that anonymous political accounts have a limited lifespan.

The active accounts mostly shared posts and media statements of the politicians they supported. Notable examples include the accounts Fenomentastic (supporting Marija Selak Raspudić), PolitikazagenZ (supporting Ivana Kekin and Zoran Milanović), Zokipedia (supporting Zoran Milanović), and Mladi_za_Karolinu_i_OIP (supporting Karolina Vidović Krišto). Notably, the Zokipedia account appeared only just before the presidential elections, with its first post on November 11th 2024. Despite this, it quickly became one of the most popular political TikTok accounts of the presidential elections, giving Zoran Milanović a strong digital presence on TikTok even though he, as a candidate, did not have an official account there.

A smaller number of accounts, however, criticized certain politicians, created satirical videos, or provided information to users. Among these three groups, the most notable were accounts like Nemožemo, Zagreb Istok Bez Cenzure, and ZagrebeUstani!, whose content is usually characterized by criticism of the management of the City of Zagreb. During the presidential elections, they published content focused on criticizing presidential candidate Ivana Kekin and the Možemo political party, to which she belongs.

Three Attempts at Election Manipulation

In terms of content, the presidential elections on social media were marked less by the candidates’ official campaigns than by attempts to manipulate voters. Among these, we highlight the coordinated defamation campaign against Ivana Kekin surrounding the Coffee Affair with Nikica Jelavić, the use of fake profiles and bots by Dragan Primorac, and the publication of a questionable analysis of pro-Russian bot activity in support of Zoran Milanović on the eve of the electoral silence before the second round.

The first case began on December 4th 2024, when the TikTok account Ne Možemo (@nemozemo1), one of the most popular anti-Možemo accounts (alongside Zagreb Istok Bez Cenzure and ZagrebeUstani), posted a video claiming that presidential candidate Ivana Kekin owned a hidden villa that she had concealed from the public. Kekin quickly denied these claims, emphasizing that the information had been publicly available in her asset declaration since she became a Member of Parliament in 2021 and was thus easily verifiable. Nevertheless, the Ne Možemo account continued to publish content accusing her of lying about this "hidden" villa and claiming that it had been acquired through theft.

This campaign reached its peak after the so-called Coffee Affair, in which Mile Kekin, the husband of Ivana Kekin, met with Nikica Jelavić, a businessman associated with organized crime. Within just one day of Kekin's announcement, disinformation on social media intensified, and a post based on false information went viral, with over 130,000 views and 600 shares.

Following Gong’s warning and report of the controversial viral content spreading disinformation, TikTok removed the reported videos and subsequently, on its own initiative, removed the entire Ne Možemo (@nemozemo1) account. Gong’s further digital investigation determined that the same account also published posts on Instagram, where a number of politicians and political actors were linked to it through the Collaborate option. Furthermore, the controversial Ne Možemo TikTok account, which the platform removed, reappeared under the names Nemozemo2 (recently also removed) and Nemozemo3, which continues to publish content against Možemo and Ivana Kekin in the run-up to the local elections in mid-2025.

It is important to note that the posts related to this affair did not involve constructive criticism, discussion, or satire, but a coordinated disinformation and defamation campaign disseminated by anonymous accounts that cannot be held accountable during elections. Although Gong urged the State Election Commission (DIP) to determine whether such content was covertly financed, the DIP decided, in accordance with its Protocol for Handling the Emergence of Disinformation during Elections, that “it will not react to potential disinformation that appears in the media space or on social media, which are aimed at potential defamation of individual election participants, given that it does not have prescribed jurisdiction nor does it have the tools to verify such types of disinformation.”

The second case concerned the use of fake profiles and bots in election campaigns to manipulate voters. Although Meta removed 150 fake profiles on Facebook and Instagram linked to members of the HDZ Youth during the parliamentary elections in April 2024, the use of fake profiles and bots was observed again during the presidential campaign, this time on the TikTok profile of the HDZ's presidential candidate, Dragan Primorac. Such profiles raise doubts about their authenticity because of their low follower counts, nearly identical account-creation dates, scarce personal information, and the use of profile pictures stolen from other people (a sketch of how such indicators can be combined follows the image below).

Source: TikTok, account @miradji00 (profile page showing support for Primorac, with evidence that the profile photo is inauthentic)
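To illustrate how indicators like these could be combined in practice, below is a minimal sketch in Python. The account fields, thresholds, and the flag_suspicious helper are illustrative assumptions made for this example; they are not Gong's methodology or any platform's actual detection system.

    # Illustrative sketch: combining the inauthenticity indicators described above.
    # All field names and thresholds are assumptions made for demonstration only;
    # this is not Gong's methodology or any platform's actual detection system.
    from dataclasses import dataclass
    from datetime import date
    from typing import List

    @dataclass
    class Account:
        handle: str
        followers: int
        created: date        # date the account became active
        bio_length: int      # amount of personal information in the profile
        photo_reused: bool   # profile picture also found on another person's profile

    def flag_suspicious(accounts: List[Account],
                        max_followers: int = 20,
                        creation_window_days: int = 3) -> List[str]:
        """Return handles of accounts that match several indicators at once."""
        if not accounts:
            return []
        earliest = min(a.created for a in accounts)
        flagged = []
        for a in accounts:
            signals = [
                a.followers <= max_followers,                          # very low follower count
                (a.created - earliest).days <= creation_window_days,   # near-identical activation dates
                a.bio_length == 0,                                     # no personal information
                a.photo_reused,                                        # stolen profile picture
            ]
            if sum(signals) >= 3:  # require several indicators to coincide, not just one
                flagged.append(a.handle)
        return flagged

A single indicator proves little on its own, which is why the sketch flags an account only when several indicators coincide.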

After the fact-checking portal Faktograf published an article on December 24th 2024 exposing the bot activity, TikTok removed over 7,000 inauthentic followers from Dragan Primorac's account: where he previously had 10,900 followers, Primorac was left with only 3,424. Even after the removal of the fake profiles, however, Gong detected their activity, albeit in significantly smaller numbers. Notably, the few remaining accounts (five) were deactivated the day after the conclusion of the second round of elections, in which Dragan Primorac lost. This further supports the conclusion that these were fake profiles whose purpose was to create the illusion of widespread support for Dragan Primorac among real voters.

Source: TikTok, account @miradji00 (profile page after the second election round, when the account was deleted)

The final case pertains to a study by the Center for Information Resilience (CIR), published on January 8th 2025, only five days before the second electoral round. The study, covered in Croatian media (Novi list) just two days before the pre-election silence period, highlighted the presence of a coordinated network of pro-Russian fake profiles supporting Zoran Milanović.  

However, Gong, alongside numerous scientists, journalists, and IT experts, pointed out several ambiguities, omissions, and errors in the analysis that raised serious and legitimate doubts about the validity of its conclusions, as well as concerns about the unusual timing of its publication. For example, the report claims to comprehensively analyze the social networks (Facebook, X, Telegram, Reddit, TikTok) of both presidential candidates but fails to mention the previously noted bots associated with Dragan Primorac's TikTok account. Furthermore, the study is anonymous, as its authors are not credited, and its methodology is inadequately explained, lacking information on the number of accounts observed and posts analyzed for network activity on the X platform. Additionally, its analysis of Facebook comments does not specify the number of fake profiles identified, stating only that there were "many." Finally, some of the claims about alleged bot accounts proved to be false (e.g., Darko Lesinger and Mila Hodak are real individuals), while certain pro-Russian alternative media portals, such as Maxportal, were found to criticize Milanović rather than support him.

Beyond these shortcomings, the platforms TikTok, X, and Meta responded to Gong and Croatian media, asserting that they had investigated the claims made in the analysis and found no evidence of inauthentic accounts or bots influencing the Croatian elections. Meta specifically rebutted the analysis’s evidence, noting that no foreign influence was detected in the identified accounts. On January 14th 2025, HAKOM, Croatia's national regulator under the Digital Services Act, addressed the situation by stating that it had not found any significant impact of Russian bots on the elections.

Digital campaigning is becoming an increasingly important aspect of elections. Advances in technology enable ever more sophisticated manipulation, with governments and electoral administrations struggling to keep pace and voters left to face new challenges that undermine the integrity of elections. These practices pose a persistent threat to the quality of democratic processes, eroding electoral integrity and fueling doubts about the validity of election outcomes. This is precisely why the EU adopted the Digital Services Act, aimed at preventing illegal and harmful online activities and at combating disinformation, foreign election interference, and violations of electoral integrity.

Recommendations for Improving Regulation of Digital Political Campaigns  

Enhancing the Monitoring of Digital Political Campaigns

Given the growing use of social networks in election campaigns, the State Election Commission (DIP) should, in addition to monitoring campaign expenses, place greater emphasis on digital content. Gong proposes that the DIP proactively monitor digital campaigns and sanction politicians and political parties for violations of electoral rules. Drawing on lessons learned in the recent presidential elections, the DIP should revise its Protocol for Handling the Emergence of Disinformation during Elections to cover disinformation targeting election participants, in the same way it already provides for addressing disinformation about the technical conduct of elections. The DIP should also monitor viral political content published via anonymous accounts to determine whether it constitutes covert promotion or undeclared campaign spending.

Greater Transparency in Social Media Election Campaigns

Data on politicians' and political parties' use of social networks are insufficiently transparent. Currently, this information can only be obtained through manual collection or the use of difficult-to-access tools. Gong recommends that the State Election Commission require political parties and politicians to promptly publish accessible, machine-readable, and more detailed reports on their official social media accounts during election campaigns. These reports should include statistics for each platform used, such as changes in follower counts and the total number of posts, as well as a list of posts for each account, their content, post metrics (likes, shares, comments, total views), and links to the posts. A minimal sketch of what such a machine-readable report might look like is given below.
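To illustrate what "machine-readable" could mean in practice, the sketch below shows one possible structure for such a report, serialized as JSON with Python's standard library. The field names, values, and links are hypothetical assumptions made for this example, not a format prescribed by the DIP.

    # Illustrative sketch of a machine-readable campaign report. The structure,
    # field names, and example values are assumptions made for demonstration;
    # the DIP has not prescribed any such format.
    import json

    report = {
        "candidate": "Example Candidate",                     # hypothetical candidate
        "reporting_period": {"from": "2024-12-01", "to": "2025-01-12"},
        "platforms": [
            {
                "platform": "Facebook",
                "account_url": "https://www.facebook.com/example",  # hypothetical link
                "followers_start": 10000,
                "followers_end": 12500,
                "total_posts": 42,
                "posts": [
                    {
                        "url": "https://www.facebook.com/example/posts/1",  # hypothetical link
                        "published": "2024-12-15",
                        "content": "Post text goes here.",
                        "likes": 300,
                        "shares": 45,
                        "comments": 60,
                        "views": 9000,
                    }
                ],
            }
        ],
    }

    print(json.dumps(report, indent=2, ensure_ascii=False))

A published, fixed structure along these lines would allow journalists and researchers to compare campaigns across candidates and platforms without manual data collection.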

Mandatory Cataloging of Official Social Media Accounts of Political Candidates

To improve the transparency of the social media accounts that politicians and political parties use to communicate with citizens, Gong suggests that the State Election Commission issue election-promotion guidelines requiring political candidates to submit information about their official social media accounts, including links. Gong further recommends that other institutions and public bodies follow this example. For instance, the official website of the Croatian Parliament includes a personal page for each elected representative, with personal information such as education and date of birth, but it lacks a list of the official social media accounts they use, which would be useful for monitoring their political communication.

Protecting Children and Minors

Observed TikTok content includes anti-democratic, disinformation-driven, and intolerant communication, which can easily reach minors. Gong recommends that relevant national authorities, particularly the Children's Ombudsman, respond more proactively to such cases to sanction those responsible and curb the manipulation of minors.  

Regulating Democracy-Threatening Narratives

Gong insists that the DIP actively monitor social networks to ensure election campaigns comply with DSA rules and its guidelines for electoral integrity, which require the suppression of inauthentic behavior, deepfake content, and non-transparent influencer promotion. All these issues leave room for foreign influence in the electoral process, which cannot be effectively countered by a single body. The DIP, in coordination with HAKOM, the State Attorney's Office of the Republic of Croatia (DORH), and the Ministry of the Interior (MUP), should therefore collaborate with other relevant national regulatory and oversight bodies to jointly prevent foreign influence on digital platforms.

Relevant authorities should respond to anonymous defamatory campaigns by investigating their origins and ownership if such campaigns go viral.  
