Russia is the most prolific foreign influence actor using artificial intelligence to generate content targeting the 2024 presidential election, U.S. intelligence officials said on Monday.
The cutting-edge technology is making it easier for Russia as well as Iran to quickly and more convincingly tailor often-polarizing content aimed at swaying American voters, an official from the Office of the Director of National Intelligence, who spoke on condition of anonymity, told reporters at a briefing.
“The [intelligence community] considers AI a malign influence accelerant, not yet a revolutionary influence tool,” the official said. “In other words, information operations are the threat, and AI is an enabler.”
Intelligence officials have previously said they saw AI used in elections overseas. “Our update today makes clear that this is now happening here,” the ODNI official said.
Russian influence operations have spread synthetic images, video, audio, and text online, officials said. That includes AI-generated content “of and about prominent U.S. figures” and material seeking to emphasize divisive issues such as immigration. Officials said that is consistent with the Kremlin’s broader goal of boosting former President Donald Trump and denigrating Vice President Kamala Harris.
But Russia is also using lower-tech methods. The ODNI official said Russian influence actors staged a video in which a woman claimed to be the victim of a hit-and-run by Harris in 2011. There is no evidence that ever happened. Last week, Microsoft also said Russia was behind the video, which was spread by a website posing as a nonexistent local San Francisco TV station.
Russia is also behind manipulated videos of Harris’s speeches, the ODNI official said. They may have been altered using editing tools or with AI, and were disseminated on social media and through other channels.
“One of the efforts we see Russian influence actors do is, when they create this media, try to encourage its spread,” the ODNI official said.
The official said the videos of Harris were altered in a number of ways, to “paint her in a bad light both personally but also in comparison to her opponent” and to focus on issues Russia believes are divisive.
Iran has also tapped AI to generate social media posts and write fake stories for websites posing as legitimate news outlets, officials said. The intelligence community has said Iran is seeking to undercut Trump in the 2024 election.
Iran has used AI to create such content in both English and Spanish, and is targeting Americans “across the political spectrum on polarizing issues” including the war in Gaza and the presidential candidates, officials said.
China, the third main foreign threat to U.S. elections, is using AI in its broader influence operations, which aim to shape global views of China and amplify divisive topics in the U.S. such as drug use, immigration, and abortion, officials said.
However, officials said they had not identified any AI-powered operations targeting the outcome of voting in the U.S. The intelligence community has said Beijing’s influence operations are more focused on down-ballot races in the U.S. than on the presidential contest.
U.S. officials, lawmakers, tech companies, and researchers have been concerned about the potential for AI-powered manipulation to upend this year’s election campaign, such as deepfake videos or audio depicting candidates doing or saying something they didn’t, or misleading voters about the voting process.
While those threats may yet materialize as Election Day draws closer, so far AI has been used more frequently in different ways: by foreign adversaries to improve productivity and boost volume, and by political partisans to generate memes and jokes.
On Monday, the ODNI official said foreign actors have been slow to overcome three main obstacles to AI-generated content becoming a greater risk to American elections: first, getting past the guardrails built into many AI tools without being detected; second, developing their own sophisticated models; and third, strategically targeting and distributing AI content.
As Election Day nears, the intelligence community is also monitoring for foreign efforts to introduce deceptive or AI-generated content in a variety of ways, including “laundering material through prominent figures,” using fake social media accounts or websites posing as news outlets, or “releasing supposed ‘leaks’ of AI-generated content that appear sensitive or controversial,” the ODNI report said.
Earlier this month, the Justice Department accused Russian state broadcaster RT, which the U.S. government says operates as an arm of Russian intelligence services, of funneling nearly $10 million to pro-Trump American influencers who posted videos critical of Harris and Ukraine. The influencers have said they did not know the money came from Russia.