Commentary

Foreign influence operations in the 2024 elections

September 12, 2024


  • Attorney General Merrick Garland unveiled nefarious foreign activities designed to sway the electorate and turn voters against one another.
  • Recent polls show that 70% of Americans “are worried about how fake news might affect the upcoming election” and 58% already say they have been deceived by AI-generated news.
  • Influence operations are not limited to foreign entities; domestic organizations and actors run them as well.
A dollar bill held up in front of the X/Twitter logo in Brooklyn, NY on Oct. 18, 2023, after X/Twitter announced that it would begin testing a new feature charging users a dollar a year to prevent bot abuse. Meir Chaimowitz/NurPhoto

The recent indictments of Russian individuals and sanctions against Russian entities show the risks facing America in the upcoming elections. Attorney General Merrick Garland unveiled nefarious foreign activities designed to sway the electorate and turn voters against one another. The Russian state media firm RT is accused of using AI and bots to spread propaganda videos with false narratives about crime, immigration, and the Ukraine war. Russians also have created fake sites designed to look like U.S. news organizations that promulgate misleading pro-Russian narratives.

China is also engaging in deceptive practices. On TikTok and X, a Chinese influence operation has posted videos purporting to be from U.S. voters complaining about reproductive rights, Israel, and homelessness. Some of the videos were seen by as many as 1.5 million people before being taken down. They are part of an influence operation called “Spamouflage,” designed to heighten American divisions and sow doubts about democratic political systems.

Iranian operatives have already hacked the emails of GOP political consultant Roger Stone. After gaining access to his account, they contacted top Trump campaign officials in an effort to obtain confidential information. Among the materials they supposedly acquired were the vetting files of Republican vice presidential nominee JD Vance and other documents.

These threats help explain why recent polls show that 70% of Americans “are worried about how fake news might affect the upcoming election” and 58% already say they have been deceived by AI-generated news. More women than men indicate they have been tricked, and half of young people say they have been taken in by fake news.

Influence operations are not limited to foreign entities; domestic organizations and actors run them as well. In recent weeks, there has been a proliferation of fake pictures, videos, and narratives. For example, fabricated pictures show Kamala Harris in a bathing suit hugging convicted sex offender Jeffrey Epstein, an encounter that never happened.

There have been numerous fake images of Donald Trump being arrested by police, dragged away by officers, and fleeing law enforcement, none of which actually happened. The goals of those circulating such images were to inflame Trump supporters, feed the narrative that he was being mistreated, and amplify the idea that the system was rigged against him.

Trump himself has circulated false materials showing Harris speaking at a communist rally, a hammer-and-sickle insignia hanging above the large crowd. That image played to Trump’s complaint about “Comrade Kamala” and his allegation that she was dangerously radical and far outside the political mainstream.

This narrative has also been pursued by businessman Elon Musk, who posted a fake image of Harris dressed in a communist-style military uniform. The caption on Musk’s post, which was made on his social media platform X, reads, “Kamala vows to be a communist dictator on day one. Can you believe she wears that outfit!?”

The circulation of fake images has not been limited to depictions of the candidates. In a Truth Social post, Trump included a collection of images, many of which appear to be fake, depicting fans of Taylor Swift sporting “Swifties for Trump” shirts. Another fake image shows Swift dressed as Uncle Sam with text that reads, “Taylor wants you to vote for Donald Trump.” Trump captioned his post “I accept!”, implying that he had accepted an endorsement from Swift.

We should not be surprised at the proliferation of so many fake images; the growing practice is documented in a new Brookings Press book, “Lies That Kill: A Citizen’s Guide to Disinformation.” Disinformation is a tough challenge because so many people have financial and political incentives to create and disseminate it. Digital sites can make money through premium subscriptions, advertising, and merchandise sales. Tech tools make it easy to fabricate fake pictures and videos. And the toxic political environment makes disinformation believable to many people who want to assign terrible motives to the opposition and accept the worst about it. It is a chicken-and-egg problem: the toxic political environment makes people willing to believe disinformation, and the disinformation helps perpetuate the toxic political environment.

Beyond their own false narratives, fake images open an additional avenue for disinformation. As such images proliferate, politicians can claim, with greater apparent plausibility, that genuine images painting them in a negative light (or their opponents in a positive light) are fake. This was seen when Trump falsely claimed that a picture of a large crowd at a Harris rally in Detroit was created using artificial intelligence, arguing that the sizable crowd “didn’t exist.”

To safeguard democracy, we need to take several actions to protect people from false narratives. We should demonetize disinformation so that it is harder to profit from it. That can happen by giving advertisers more transparency and control over their ad placements so their advertisements don’t unwittingly finance known disinformation sites.

Social media platforms need to get more serious about content moderation. Most of the leading sites have terms-of-service agreements that preclude use of their platforms to incite violence, promote hate speech, or engage in fraud. Companies that reduced their trust and safety staffs need to hire people whose job is to enforce the very standards those firms claim to uphold.

Finally, voters need to exercise common sense in evaluating information from partisan or foreign sources. If something seems far-fetched, it probably is, and people should act accordingly. Just as consumers have grown more skeptical of email phishing, consumer fraud, and “get rich quick” schemes from supposed Nigerian princes, voters need to consider the source of information and weed out the pictures, videos, and narratives that are too outlandish to be true. Only in those ways can people protect themselves and help the country safeguard its voting processes in the face of nefarious actions from foreign and domestic sources.
