Jan 02, 2024

TO: Interested Parties

FROM: Elizabeth Poston and Maxwell Schechter

RE: Michigan Introduces Disclaimer Requirements on Political Ads Using AI

Michigan Governor Gretchen Whitmer recently signed into law several bills regulating the use of artificial intelligence in political advertisements in the state. The new laws require the use of disclaimers to inform the public if ads employ artificial intelligence (“AI”). Although several states have taken steps to regulate so-called political “deepfakes,” Michigan is the first state to more broadly regulate the use of AI technology in political advertisements, regardless of whether the use of AI has the potential to mislead the public.

Notably, Michigan’s new requirements apply not only to media regarding state and local elections, but also to advertisements related to federal elections in Michigan. Because the laws go into effect on February 13, 2024, they will apply to the 2024 election in a state that is key for both the presidential race and for control of the U.S. House and Senate. Note that there are material questions regarding whether the application of these laws to federal campaigns is preempted by the Federal Election Campaign Act.

Campaigns and other political organizations airing advertisements in Michigan should discuss these requirements with legal counsel to ensure compliance and should consider adding provisions to media vendor contracts addressing the use of AI.

Required Disclaimers on All Political Advertisements that Use AI

Ads that meet the definition of “qualified political advertisements” must include, in a clear and conspicuous manner, a statement saying that the ad was generated in whole or substantially by artificial intelligence. A qualified political advertisement is a paid advertisement that relates to a candidate or election for federal, state, or local office in Michigan and that contains any image, audio, or video that is generated in whole or substantially with the use of artificial intelligence.1 “Artificial Intelligence” is defined to mean “a machine-based system that can, for a given set of human-defined objectives, make predictions, recommendations, or decisions influencing real or virtual environments, and that uses machine and human-based inputs to do all of the following:

(a) Perceive real and virtual environments;

(b) Abstract such perceptions into models through analysis in an automated manner; and

(c) Use model inference to formulate options for information or action.”

The law encompasses a very broad range of ad types, including “search engine marketing, display advertisements, video advertisements, native advertisements, issue advertisements, messaging service advertisements, mobile application advertisements, and sponsorships.” The format of the disclaimer depends on the type of advertisement.

Violations of this requirement may result in fines of up to $1,000 for each qualified political advertisement. The law does not apply to a qualified political advertisement “that constitutes satire or parody.”

Additional Disclaimers for Materially Deceptive Media (Deepfakes)

The laws also regulate the use of “materially deceptive media” (colloquially known as “deepfakes”). Materially deceptive media means any image, audio, or video that was produced by artificial intelligence and that falsely depicts an individual engaging in speech or conduct in which they did not actually engage, but that a reasonable viewer or listener would incorrectly believe was real. Under the new laws, it is illegal for a person to distribute (or to have others distribute) materially deceptive media if:

(1) The person knows the material falsely represents a depicted individual;

(2) The distribution takes place within 90 days of an election;

(3) The person intends the distribution to harm the reputation or electoral prospects of a candidate and the distribution is reasonably likely to cause that result; and

(4) The person intends the distribution to change voting behavior by deceiving voters into incorrectly believing that the depicted individual in fact engaged in the speech or conduct depicted, and the distribution is reasonably likely to cause that result.

Note that the law does not apply only to paid ads. Even free distribution of materials, such as on social media, could violate this law.

However, distribution of this type of media is not illegal if it includes the following disclaimers:

1. A disclaimer informing the viewer that the media “has been manipulated by technical means” and “depicts speech or conduct that did not occur” (e.g., “This (image, audio, or video) has been manipulated by technical means and depicts speech or conduct that did not occur.”). The format of the disclaimer depends on the type of advertisement.

2. If the media was generated by editing an existing image, audio, or video, the media must include a citation directing the viewer or listener to the original source from which the unedited version of the existing image, audio, or video was obtained.

Violation of the materially deceptive media law is a criminal offense that could lead to fines of up to $1,000 per violation and up to 5 years in prison. The law also permits the state attorney general, any depicted individual, the opposed candidate, or any organization that represents the interests of voters likely to be deceived by the materially deceptive media, to seek an injunction to stop the deepfake.

The law applies to federal, statewide, legislative, judicial, county, or local elections.

Other Jurisdictions

Michigan is only the latest state to enact legislation around political deepfakes. California, Minnesota, Texas, and Washington each have their own laws regarding the use of AI in political advertising. We recommend that all organizations sponsoring political advertisements consult with counsel prior to running ads using AI technology and consider taking steps to require that their vendors inform them of any use of AI in advertisements.

-------------------------------------

1 The term also includes advertisements related to ballot questions.