By: Hannah Lief
The Nurture Originals, Foster Art, and Keep Entertainment Safe Act of 2023, also known as the NO FAKES Act, is a bipartisan proposal designed to create a uniform right of publicity, protecting a celebrity’s voice, image, and visual likeness. Four U.S. senators introduced the draft federal bill on October 12, 2023 in response to advancements in artificial intelligence, which have allowed individuals to create unauthorized digital replicas of celebrities and to profit financially from their reputations. These advancements also permit individuals to replicate celebrities’ past performances, generating a lifelike virtual figure and voice. Celebrities whose voice or image has been replicated without their consent through artificial intelligence may seek damages under the NO FAKES Act, suing both the creator of the unauthorized replica and any digital platform that knowingly hosted, published, or distributed the content. However, certain types of content are excluded from liability and protected under the First Amendment, including digital replicas used in documentaries, sports broadcasts, historical works, and parodies.
Currently, celebrities can assert their rights of publicity against digital replicas through various state laws; however, those laws differ on how long the right extends. For example, New York extends the right of publicity for 40 years post-mortem, whereas the Oklahoma statute extends it for 100 years post-mortem. The NO FAKES Act would create a uniform federal right of publicity lasting 70 years post-mortem, granting celebrities the ability to control their image, voice, and reputation well beyond their lifespans. Additionally, the increased duration of the right in the proposed legislation evinces an intention to prevent third parties from gaining commercially from a celebrity’s reputation.
In the wake of the proposal, businesses may need to spend additional resources identifying unauthorized digital depictions to avoid liability. Digital platforms could, for example, train artificial intelligence models to distinguish authorized from unauthorized content, similar to the way Instagram identifies images and videos that violate its community guidelines. However, the proposal leaves open the question of how effective a business must be in adhering to its requirements: is a good-faith effort to identify unlawful content adequate, or will digital platforms be held liable if they fail to remove all unlawful content even with reasonable efforts and measures in place? Additionally, users of these platforms should have to affirmatively agree not to share or distribute unauthorized content. Through this explicit action, users would be fully informed of the liability implications of the NO FAKES Act, as they, too, can be sued under the Act.
Spotify and YouTube are examples of platforms that could be held liable under the Act for failing to identify and remove unauthorized digital replicas. For instance, in May 2023, an anonymous individual shared “Heart on My Sleeve” on YouTube, Spotify, and Instagram, using artificial intelligence to make the track’s vocals sound identical to those of the artist Drake. Before the platforms removed the content, the creator built a reputation through this unauthorized use, benefiting from Drake’s established image, likeness, and voice. Although the track was removed because of copyright violations, it exemplifies the type of content the proposal attempts to regulate.
Despite the proposal’s potential benefits in protecting a celebrity’s right of publicity, it inherently restricts free speech by eliminating uses of artificial intelligence that were permissible before the Act’s introduction. Businesses that permit individuals to publish images and music will need to understand the Act’s liability implications while still encouraging creators to express themselves in a manner protected under the First Amendment. As artificial intelligence improves, digital replicas will grow more sophisticated, further straining the balance between freedom of speech and the risk of ill-gotten gain.
1. Nurture Originals, Foster Art, and Keep Entertainment Safe (NO FAKES) Act, 118th Cong. (2023) (as proposed by Sens. Coons, Blackburn, Klobuchar & Tillis, Oct. 12, 2023), https://www.coons.senate.gov/imo/media/doc/no_fakes_act_draft_text.pdf.
2. Nurture Originals, Foster Art, and Keep Entertainment Safe (NO FAKES) Act, supra note 1; Jennifer A. Kenedy & Jorden Rutledge, Locke Lord QuickStudy: The NO FAKES Act: With Proposed Bill, Congress Set to Protect Against Unauthorized Digital Replicas of Faces, Names and Voices, Locke Lord (Oct. 16, 2023), https://www.lockelord.com/newsandevents/publications/2023/10/no-fakes-act.
3. Marc Tracy, Digital Replicas, a Fear of Striking Actors, Already Fill Screens, N.Y. Times (Aug. 4, 2023), https://www.nytimes.com/2023/08/04/arts/television/actors-strike-digital-replicas.html.
4. Brian Contreras, Senators draft policy aimed at deep fakes of Drake, Tom Hanks and noncelebrities, L.A. Times (Oct. 12, 2023, 6:00 AM), https://www.latimes.com/entertainment-arts/business/story/2023-10-12/senators-draft-policy-aimed-at-deep-fakes-of-drake-tom-hanks-and-other-celebs.
5. Rebecca Klar, Bipartisan bill aims to protect actors, singers from AI recreations, The Hill (Oct. 12, 2023, 11:50 AM),
6. Isaiah Poritz, AI Deepfakes Bill Pushes Publicity Rights, Spurs Speech Concerns, Bloomberg Law (Oct. 17, 2023, 5:05 AM), https://news.bloomberglaw.com/ip-law/ai-deepfakes-bill-pushes-publicity-rights-spurs-speech-concerns.
7. Poritz, supra note 6; Okla. Stat. tit. 12, § 1448.
8. Poritz, supra note 6.
9. Poritz, supra note 6 (quoting Katharine Trendacosta, the Electronic Frontier Foundation’s (EFF’s) director of policy and advocacy).
10. Nurture Originals, Foster Art, and Keep Entertainment Safe (NO FAKES) Act § 2(c)(1), supra note 1.
11. See generally Nick Clegg, How AI Influences What You See on Facebook and Instagram, Meta (June 29, 2023),
12. Nurture Originals, Foster Art, and Keep Entertainment Safe (NO FAKES) Act § 2(c)(2)(B), supra note 1; Poritz, supra note 6 (noting that individuals who knowingly share unauthorized content can be held liable for $5,000 per violation).
13. See Nurture Originals, Foster Art, and Keep Entertainment Safe (NO FAKES) Act § 2(c)(2)(B), supra note 1.
14. Scott M. Hervey, Legit or Lawsuit – Fake Drake AI Song, Weintraub Tobin (May 18, 2023),
15. See Mia Sato, Drake’s AI clone is here — and Drake might not be able to stop him, The Verge (May 1, 2023, 10:35 AM), https://www.theverge.com/2023/5/1/23703087/ai-drake-the-weeknd-music-copyright-legal-battle-right-of-publicity (noting that the song generated millions of streams before Spotify, Apple Music, TikTok, and YouTube removed it).
16. Nurture Originals, Foster Art, and Keep Entertainment Safe (NO FAKES) Act, supra note 1; Hervey, supra note 14.
17. Poritz, supra note 6.