June 26, 2019

Congress Considering Deep Fakes Law

The technological ability to create convincing “deep fakes” is getting some attention in Congress. The adult entertainment industry has already struggled with deep fake porn and the unsettled intellectual property issues generated by this type of content. On the one hand, rights holders can assert trademark, copyright, and/or publicity rights claims against producers of deep fakes. Publishers, on the other hand, can argue “fair use,” Section 230 immunity, or First Amendment protections in certain circumstances. But the recent publication of a doctored video of Nancy Pelosi appearing to slur her words has apparently caught the eye of some politicians, who are poised to take action.

In early June 2019, Rep. Yvette D. Clarke [D-NY] introduced H.R. 3230, the Defending Each and Every Person from False Appearances by Keeping Exploitation Subject to Accountability Act of 2019, in the House of Representatives. The DEEP FAKES Accountability Act aims to “combat the spread of disinformation through restrictions on deep-fake video alteration technology.” If passed, the bill would create both criminal and civil penalties for failing to disclose a covered deep fake and for altering required disclosures. The bill would also create a private right of action for those injured by covered deep fakes. The bill was referred to the House Committee on the Judiciary, Committee on Energy and Commerce, and Committee on Homeland Security. If enacted, the law would take effect one year later.

Rather than imposing restrictions on all deep fakes, the bill would impose watermark and disclosure requirements on all deep fakes that are “advanced technological false personation records” – meaning any deep fake that a reasonable person would believe accurately depicts a living or, in more limited instances, deceased person who did not consent to the production. The bill would apply only to those productions that appear to authentically depict the speech or conduct of a person by technical means. It would purposefully exclude productions that rely on the skills of another person capable of physically or verbally impersonating the falsely depicted living or deceased person, and would also provide an exception for parodies, historical reenactments, and fictionalized programming that a reasonable person would not mistake as depicting actual events.

All visual-only “advanced technological false personation records” would have to include an unobscured written statement at the bottom of the image, for the duration of the visual element, disclosing that the deep fake contains altered visual elements and explaining the extent thereof. All audio-only records would likewise have to include at least one clearly articulated verbal statement at the beginning of the record disclosing that the deep fake contains altered audio elements and explaining the extent thereof, repeated at least once every two minutes of audio. All audiovisual records would have to include both the unobscured written statement and the verbal statement. Finally, all records that include a moving visual element would have to contain a watermark clearly identifying the deep fake as containing altered audio or visual elements.
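The disclosure matrix is easiest to see laid out schematically. The following is a minimal sketch in Python – with entirely hypothetical names and types, since the bill mandates disclosures, not any software implementation – of how a covered record’s media type would map to the required disclosures:

```python
# Illustrative encoding of the bill's disclosure matrix, as summarized above.
# All names and structures here are hypothetical; the bill prescribes the
# disclosures themselves, not any implementation.
from dataclasses import dataclass

@dataclass
class Disclosures:
    written_statement: bool  # unobscured text at the bottom of the image, for the duration of the visual element
    verbal_statement: bool   # clearly articulated, repeated at least every two minutes of audio
    watermark: bool          # required whenever a moving visual element is present

def required_disclosures(has_visual: bool, has_audio: bool,
                         visual_is_moving: bool = False) -> Disclosures:
    """Map a covered record's media type to the disclosures the bill would require."""
    if has_visual and has_audio:   # audiovisual record: both statements
        return Disclosures(True, True, visual_is_moving)
    if has_visual:                 # visual-only record: written statement only
        return Disclosures(True, False, visual_is_moving)
    return Disclosures(False, True, False)  # audio-only record: verbal statement only

# Example: a silent, moving visual deep fake would need a written statement
# and a watermark, but no verbal statement.
print(required_disclosures(has_visual=True, has_audio=False, visual_is_moving=True))
```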

Software developers who reasonably believe their software may be used to produce deep fakes would be required to ensure that the software allows for the insertion of the necessary watermarks and disclosures, and to include terms of use requiring users to affirm their general awareness of their legal obligations under the bill.

If the bill passes, an individual could be fined, imprisoned for up to five years, or both, for knowingly failing to include a required watermark or disclosure (1) with the intent to humiliate or harass by falsely, visually depicting a person engaging in sexual activity or in a state of nudity, (2) with the intent to cause violence or physical harm, incite armed or diplomatic conflict, or interfere in an official proceeding, where the deep fake did in fact pose a credible threat of doing so, (3) in the course of criminal conduct related to fraud, or (4) by a foreign power or agent, with the intent of influencing policy debates or elections.

The legislation also provides criminal penalties for knowingly altering a deep fake to remove or obscure the watermark or disclosure, with the intent to distribute the altered deep fake and with one of the four intents listed above. In addition to prison time, the proposed law allows for a civil penalty of up to $150,000 per deep fake as well as appropriate injunctive relief. An individual or affiliated business entity falsely exhibited in a deep fake would be able to seek damages and injunctive relief against anyone who violates the disclosure requirements or anti-alteration clauses of the bill. Damages would be the greater of actual damages or $50,000 per deep fake; the limit would increase to $100,000 per deep fake depicting extreme or outrageous conduct by the falsely depicted person, and to $150,000 per deep fake containing sexually explicit visual content intended to humiliate or harass the falsely depicted person. An individual would be able to file the private action under seal if there is a reasonable likelihood that the creation of public records would result in embarrassing or harmful publication of falsified material.
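To make the damages schedule concrete, here is a minimal sketch in Python of the tiered amounts described above. The function name and parameters are hypothetical, and the assumption that the greater-of comparison is made against the aggregate statutory amount is a simplification; the bill itself prescribes no such computation.

```python
# Illustrative sketch of the bill's civil damages tiers, as summarized above.
# All names are hypothetical; assumes the greater-of comparison is made on
# the aggregate statutory amount, which is a simplification.
def estimated_damages(actual_damages: float, num_deep_fakes: int,
                      extreme_conduct: bool = False,
                      sexually_explicit_harassment: bool = False) -> float:
    """Return the greater of actual damages or the per-deep-fake statutory amount."""
    per_fake = 50_000                      # default statutory amount per deep fake
    if extreme_conduct:
        per_fake = 100_000                 # extreme or outrageous conduct depicted
    if sexually_explicit_harassment:
        per_fake = 150_000                 # sexually explicit content meant to humiliate or harass
    return max(actual_damages, per_fake * num_deep_fakes)

# Example: three sexually explicit deep fakes with $10,000 in provable harm;
# the statutory amount (3 x $150,000 = $450,000) controls.
print(estimated_damages(10_000, 3, sexually_explicit_harassment=True))  # 450000
```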

The bill would also create a process by which producers of deep fakes could seek an advisory opinion from the Attorney General, within 30 days, about the legality of a proposed deep fake. The Attorney General would not be able to enforce the law against any producer of deep fakes who relies on an advisory opinion in good faith. The Attorney General would also be required to issue rules governing the technical specifications of the required watermarks within one year of enactment, and would designate a coordinator in each United States Attorney’s Office to receive reports from the public regarding potential violations by foreign states and agents, as well as any violations depicting acts of an intimate or sexual nature.

In the year after the bill is passed, the Attorney General would be required to publish a report containing a plan to enforce the law, a description of foreign efforts to use deep fake technology to influence elections and policy debates in the U.S. and abroad, a description of the impact of sexual deep fakes on women and marginalized communities, and official guidance for Federal prosecutors. In addition, the bill would require the Secretary of Homeland Security to establish a “Deep Fakes Task Force” to combat the national security implications of deep fakes, research and develop technologies to detect, counter, and distinguish deep fakes from actual events, and work with the private sector on these issues.

The bill would not serve as a defense against, preempt, or limit any Federal, State, local, or territorial laws on deep fakes or related content. Producers would still be able to seek other legal remedies against those who use their copyrighted content without authorization to create deep fakes, and individuals falsely depicted in deep fakes would still be able to bring other claims – including privacy, defamation, false light, and unauthorized use of likeness claims – against those who use their likenesses. Sites that host user generated content, potentially including deep fake material, would still be able to claim the defenses provided by Section 230 of the Communications Decency Act. However, some members of Congress have expressed interest in amending Section 230 to more directly address liability for deep fakes.

Future regulation of deep fake technology remains uncertain, as Congress struggles to sort out the numerous legal and constitutional issues generated by this content. While the adult industry continues to wrestle with the problems caused by deep fake porn, politicians seem interested in nipping the issue in the bud before a deep fake costs one of them an election.

This post was co-authored by Lawrence Walters and Bobby Desmond, of Walters Law Group. Nothing herein is intended as legal advice.

