With the 2024 U.S. presidential elections just six months away, how serious a threat is artificial intelligence when it comes to politics and disinformation? AI is taking a more visible role in political campaigns and transforming campaign strategies.
The federal government is pushing for greater accountability. Jessica Rosenworcel, the chair of the Federal Communications Commission (FCC), pressed for heightened transparency in how AI is used in campaign materials.
“As artificial intelligence tools become more accessible, the Commission wants to make sure consumers are fully informed when the technology is used,” Rosenworcel said.
Both major U.S. political parties are utilizing AI, from deploying AI-voiced advertisements to analyzing voter data. The increasing awareness of potential technology misuse — especially in creating persuasive deepfakes — is spurring discussions about the need for more stringent regulations.
The proposed FCC regulations would require political advertisements on radio, TV and cable to clearly disclose any AI-generated content. The FCC, however, does not have authority to regulate internet or social media ads.
This comes as the FCC proposed a $6 million fine against Steve Kramer for his role in the AI-generated robocalls that mimicked President Joe Biden’s voice ahead of the New Hampshire primaries to discourage voter turnout.
Kramer, a political consultant for Democratic presidential long-shot Dean Phillips, also faces two dozen criminal charges in New Hampshire.
In an interview with local media, Kramer said he sent out the calls to highlight the urgent need for stricter AI regulations.
“It’s exceeded what my initial thoughts were when doing this,” Kramer said in March. “I think that regulators realize this is a problem and I think legislators realize this is a problem. Now there is real impact and it’s going to keep happening so that we have real regulations that protect those same people who thought they were duped.”
The Biden and Trump campaigns said they’ve limited the use of generative AI to behind-the-scenes productivity tools for data analysis.