Press "Enter" to skip to content

Elections, AI, and Influence Campaigns

By Ori Swed, Texas Tech University, and Bryan Giemza, Texas Tech University

In 2014, Russia changed the information landscape by using mass-production tactics and building on the tools of surveillance capitalism to flood Ukraine with disinformation. Similar campaigns on a larger scale, aimed at influencing the 2016 US presidential election, brought these tactics to light. That campaign involved deploying hundreds of workers to animate bots and fake accounts across multiple social media platforms, all managed from a troll farm in St. Petersburg. From a Russian perspective, these operations were a successful illustration of the utility of mass manipulation online. Following that election, Russian trolls expanded their influence attempts to elections worldwide. More recently, in the African Sahel, their interference has helped seed coups, support the establishment of new regimes, and legitimize the operations of Russian mercenaries in the region.

Yet Russia is hardly alone. In the years following the 2016 influence campaign, many other countries began running influence campaigns aimed at democratic discourse and, in particular, elections. Beyond Europe and North America, AI-powered influence has infiltrated electoral landscapes worldwide. In Ghana’s 2024 general election, investigators uncovered a network of 171 fake accounts generated with ChatGPT, promoting one party and smearing the opposition leader. In France, deepfake videos circulated ahead of the 2024 legislative vote, including one fabricating Marine Le Pen’s family life and another altering a France24 broadcast to claim Ukraine plotted to assassinate President Macron. In the spring of 2025, the Canadian federal election saw India among the foreign actors targeting diaspora communities. And in the Philippines, Beijing funded troll farm activity to spread pro-China narratives ahead of national elections. Such cases underscore how AI has “democratized” influence operations, making tools once limited to state actors widely available to actors ranging from great powers to grassroots movements. Nonstate actors also offer influence campaigns for hire (consider the notorious example of Cambridge Analytica).

The effect of any single influence operation, or the cumulative effect of many, is vexingly difficult to measure. This is because the actual objectives of each campaign are hidden, the tools used are covert and often indistinguishable from other online information, and the outcomes are not reported. By definition, the most “effective” campaigns might evade detection altogether. Nonetheless, we do observe a marked increase in such attempts, including one reported on August 27, 2025, when Denmark summoned the US envoy over an alleged influence campaign in Greenland aimed at promoting secession.

As this approach gains popularity and internal legitimacy among different actors, we can also see how technological developments are changing the game. Russian troll farms required a significant investment in human resources, including hundreds of employees with expertise in coordinated messaging, language training, media, creative content production, and other specialized fields. The introduction of AI has transformed these operations. Instead of running multiple teams, a few operators can manage various AI agents that produce content and actively engage with users in real time, allowing for real-world trials and refinement at an unprecedented rate. Instead of hiring content professionals, AI can independently produce catchy political memes, images, and text instantaneously and at negligible marginal cost. It is no longer necessary to locate specialists to penetrate language barriers in different countries, as AI agents can readily operate in Vietnamese, Swahili, Urdu, or Indonesian, with synthetic media providing an increasingly realistic veneer of authenticity for content of all kinds. Instead of recruiting motivated personnel, AI-animated personae are relentless, working around the clock, on a global scale, to promote engagement. (Contrast this with human personnel in St. Petersburg, whose productivity demonstrably went down on very cold winter days and holidays.)

In sum, AI-driven influence campaigns have become easier and potentially more powerful by orders of magnitude, as they shift costs from salaries to software and can be scaled up rapidly. User engagement can be more deceptive than ever, as AI mimics indigenous slang and local conversational conventions, a task that often proves challenging for human trolls. AI agents, and the influence campaigns they animate, are thus more likely to avoid detection or suspicion, as AI-empowered campaigns have increasing access to information and growing sophistication in microtargeting (both aiming at specific audiences and tailoring the best messaging to each individual). This does not require weeks of training; it requires only a modestly clever prompt engineer. Whole sections of a troll shop can be dispensed with, as AI-powered influence campaigns can generate mountains of bespoke fake content (visual, viral, or otherwise) to suit a specific end. Already, AI visualization is proving to be a master puppeteer, churning out fake videos that animate political figures and make them say or do whatever the prompt asks.

What does this mean for us? It means that we are witnessing the democratization of influence operations deployed at the heart of digital democracy. We expect to see a proliferation of influence campaigns, especially around election seasons, given the low cost, potentially high yield, and low risk of this type of intervention for states and corporations. Yet the democratization of this tool extends beyond states and companies. An activity previously reserved for technologically sophisticated and well-funded entities, such as states or large corporations, has become accessible to grassroots operatives. If disinformation once trickled down from state power, it now bubbles up from anywhere, threatening not just national security but the foundations of trust that sustain democratic life. We can therefore expect to see grassroots influence operations in smaller communities, organized around specific causes or targeting local elections. The weaponization of influence has become as simple as opening a laptop, but are states prepared for this reality?
