Nevada Law Aims to Shine Light on AI-Generated Political Imagery
CARSON CITY, NV – Nevada’s new law requiring disclosure of AI-generated content in political advertising took effect this month, as campaigns grapple with the rising threat of digitally manipulated imagery. The law seeks to address concerns that deceptive AI-created videos and images could mislead voters, though enforcement challenges remain.
The legislation mandates that political advertisements utilizing artificial intelligence to depict realistic visuals or audio must include a disclaimer identifying the content as AI-generated. Secretary of State Cisco Aguilar acknowledged the difficulty of policing the new rules, stating, “Bad actors are going to bad act,” and noting that digitally altered videos could be disseminated anonymously, bypassing state oversight.
The challenge lies in balancing transparency with the protection of legitimate political speech, according to Peter Koltak, a Democratic campaign consultant. “I think it’s smart to err on the side of not overcorrecting, to maybe stay in a position of being a little more reactive, to see how some of this stuff plays out,” Koltak said.
The law is already being tested. Clark County School Board trustee Lydia Dominguez, a Republican candidate for Nevada’s 3rd Congressional District, launched her campaign with a video featuring manipulated content portraying Rep. Susie Lee (D-NV) and other Democratic leaders in attire suggestive of organized crime figures.
Dominguez defended the ad, stating, “The new mob, as my ad points out, consists of career politicians like Susie Lee who, at taxpayer expense, seek to enrich themselves while the people they represent struggle to make ends meet.” Under the new law, Dominguez is now required to disclose the use of AI in her campaign video.
Rep. Lee’s spokesperson, Greg Lademann, criticized the ad, saying it demonstrated Dominguez’s need to “cover up for her lack of vision for Nevadans [by turning] to deepfakes and falsehoods.” Lademann added, “Nevadans should be able to trust what they see with their own eyes. Sadly, in our age of AI this is no longer guaranteed.”
Experts note that even when identified as false, manipulated media can reinforce existing beliefs. “People from very different persuasions can look at a piece of media in totally different interpretations,” explained Dr. Sofia Soto-Vasquez. “If you agree with that worldview, you might say, ‘I know it’s fake, but it feels true to me.’”
While AI represents a new frontier in political manipulation, the practice of digitally altering campaign materials is not new. Kenneth Miller, an assistant professor of political science at UNLV, pointed out that tools like Photoshop have been used for decades. “I suppose the newer part would be it’s a little cheaper to do now,” Miller said.
Jeremy Hughes, a Republican political consultant, echoed this sentiment, stating that campaigns have been digitally altering photos of opponents for years. However, Hughes predicted the issue will become substantially more prominent in the future. “In 10 years it’ll be a much bigger deal than it is now,” he said. “We’re just seeing the beginning of it.”