Artificial intelligence is free speech, let’s treat it that way


In July 2023, Public Citizen submitted a petition to the Federal Election Commission urging it to clarify that the law against fraudulent misrepresentation of campaign authority (52 U.S.C. § 30124) encompasses the use of deliberately deceptive AI-generated content in campaign advertisements. When the FEC considers the petition’s request in an open meeting on Sept. 19, 2024, I will vote against initiating a rulemaking because the FEC lacks the statutory authority to promulgate the rule the petitioner seeks. And upon careful consideration, I have concluded that, as a matter of policy, it should not do so even if it had that authority.

First and foremost, the FEC may not pass a rule that goes beyond the statutory authority granted to it by Congress in the Federal Election Campaign Act. FECA generally prohibits a person from fraudulently misrepresenting that their words or actions are on behalf of a candidate or political party committee, for example, by misleadingly claiming to be raising money on behalf of a candidate or speaking on a candidate’s behalf. And the FEC has a track record of prosecuting these violations and will continue to do so, including where AI is used to effectuate the fraud. However, FECA does not prohibit general untruthfulness in a campaign ad, such as making untrue statements about a candidate’s voting record (of course, other laws cover such conduct). Thus, I cannot vote to initiate a rulemaking that would expand the FEC’s regulatory authority.

If Congress wants the FEC to regulate the use of AI beyond the scope of current law, it can amend FECA. But this path forward is a bad idea fraught with perils both known and unknown. To be sure, the technological landscape of campaign communications is rapidly evolving. But as the FEC’s decades-long struggle to apply FECA to communications on the internet demonstrates, the FEC is not equipped to take on the complex and nuanced challenges posed by AI regulation. For example, how much AI in a campaign advertisement is too much? Is it legally relevant, for campaign finance purposes, what data was used to train the model that generated the content? Questions such as these carry the all-too-real risks of arbitrary enforcement for lack of considered standards and of inadvertently stifling innovation in the marketplace.

No matter the dangers of AI, an ill-conceived knee-jerk response risks constraining free political expression. As history has taught, from fear does not often spring good law. And this concern is heightened when such laws regulate the core of political speech in our country. As the use of AI permeates our daily lives, society will have to wrestle with questions such as what governmental interests justify restricting its use, a process that the FEC should not be spearheading.


After thoroughly considering the more than 2,000 public comments submitted in response to the petition, including feedback from members of Congress, political party committees, advocacy groups, and individual citizens, I found that opinions on the uses and effects of AI in elections vary widely. This complexity underscores the need for a cautious and informed approach rather than an immediate regulatory response. As always, I remain committed, as I believe my colleagues on the FEC do, to safeguarding the integrity of our electoral processes, but any steps taken must be rooted in sound legal authority and careful policy considerations. For now, I believe the decision not to initiate a rulemaking is the most prudent course of action, ensuring that we can continue to adapt to the evolving electoral landscape while upholding the rule of law.

Trey Trainor currently serves as a commissioner on the Federal Election Commission. He was nominated to that role by former President Donald Trump and was confirmed by the Senate in 2020. He has practiced campaign finance and election law for over two decades.
