Tuesday, November 5, 2024

AI Could Influence Political Propaganda in the 2024 US Election

Experts are raising the alarm that new AI tools could ramp up the spread of misinformation and propaganda, making the online information environment even harder to navigate.

Back in the 2016 US Presidential Election, social media was a major tool for spreading false information. Fast forward to 2020, we saw conspiracy theories and baseless claims of voter fraud running wild. Now, with the 2024 election on the horizon, AI technology might push the propaganda game to new heights.

AI-Powered Propaganda: An Embattled Info Space

False information generated by AI could deceive people more easily and flood the internet with misleading content. As a result, trust will take a hit and spreading accurate information will be harder.

“Trust will dip, sharing actual information will be tougher,” warns Ben Winters, senior counsel at the Electronic Privacy Information Center. “It won’t be good for the info space.”

New Tools, Old Tricks

AI tools that generate hyper-realistic images, mimic voices, and write human-like text have become increasingly widespread. Made available to the public by companies like OpenAI, these tools are already disrupting a range of industries, and they are now being used to generate political content.

For instance, when an AI-generated image depicting an explosion at the Pentagon circulated online, it briefly rattled the stock market. AI audio parodies featuring US presidents have gone viral, and AI-created images have shown Donald Trump in altercations with law enforcement officers. Political groups are also using AI to produce advertisements that depict imagined calamities that could follow if their opponents win.

What makes this development a genuine turning point is that AI lets anyone with basic digital skills create such content. That magnifies the potential for propaganda to spread quickly and widely, particularly during election periods.

AI Makes Disinformation & Propaganda Easy

With AI tools, fake bots, manipulated images, and misleading robocalls are easier to produce and harder to detect. Even non-English speakers can now create convincing content in English. AI could also enable large-scale voter suppression campaigns, especially ones targeting marginalized communities.

AI could also be used to generate fake engagement or constituent letters, making it harder to gauge how voters are actually reacting. In a recent experiment, AI-generated letters were nearly as effective as human-written ones at drawing responses from state legislators.

Political Campaigns Join the AI Game

Political campaigns are already using AI to produce content, from fake images showing politicians embracing or kissing their opponents to deceptively edited videos designed to create misleading impressions. The use of AI in political campaigns is only expanding.

How much AI will shape the 2024 election remains uncertain, but the combination of surging AI capabilities and relaxed social media moderation rules is clearly cause for concern.

“AI-generated images and videos can be produced at a faster rate than fact-checkers can verify them,” says Josh Goldstein, a researcher at Georgetown University. While some AI services, such as ChatGPT, have guidelines meant to prevent the generation of misinformation, open-source models lacking these safeguards can be exploited to spread false information.
