A deep fake is audio or video of a person that has been digitally altered so the person appears to say or do something that never actually happened. Deep fakes are typically created with malicious intent, such as spreading false information.
The victim is often unaware of the deep fake and only finds out after their reputation has been damaged or their life has seemingly been turned upside down.
Effective Aug. 1, 2023, a new law aims to fight the misuse of artificial-intelligence-generated video, images, and sound.
Sponsored by Rep. Zack Stephenson (DFL-Coon Rapids) and Sen. Erin Maye Quade (DFL-Apple Valley), the law will:
• establish a cause of action against a person who intentionally disseminates a deep fake without the consent of the depicted individual, when the deep fake realistically depicts the intimate parts of another individual, artificially generated intimate parts presented as the intimate parts of the depicted individual, or the depicted individual engaging in a sexual act; and
• make it a crime to disseminate, or enter into an agreement to disseminate, a deep fake if the disseminator knows or should know it is a deep fake, and the dissemination occurs within 90 days of an election, is made without the consent of the depicted person, and is made with the intent to injure a candidate or influence the result of an election.
Consent to the deep fake’s creation will not be a defense to unauthorized dissemination. The law establishes immunity for internet service providers and similar entities.
HF1370*/SF1394/CH58