Human Artistry Campaign Endorses Landmark ‘No AI FRAUD Act’


Photo Credit: Alexander Sinn

The Human Artistry Campaign announces its support for the ‘No AI FRAUD Act,’ legislation that would establish safeguards against abuses stemming from generative AI.

A group of US Representatives has introduced a bill to protect against generative AI abuses, and the Human Artistry Campaign has voiced its support. The “No AI Fake Replicas and Unauthorized Duplications Act of 2024” — the “No AI FRAUD Act” — would establish safeguards against the unauthorized copying of a person’s likeness or voice to create deepfakes and voice clones.

The legislation was introduced in the US House of Representatives today by a bipartisan coalition led by Representatives María Elvira Salazar (R-FL), Madeleine Dean (D-PA), Nathaniel Moran (R-TX), Joe Morelle (D-NY), and Rob Wittman (R-VA).

“The most unique and foundational aspects of any person’s individuality should never be misappropriated or used without consent,” says Human Artistry Campaign Senior Advisor Dr. Moiya McTier. “We applaud Representatives Salazar, Dean, Moran, Morelle, and Wittman’s forward-thinking No AI FRAUD Act as a massive step forward in protecting people, culture, and art — while also urging other policymakers to follow their lead to shield us all from voice, image, and likeness manipulation.”

“Timely action is critical as irresponsible AI platforms are being used to launch deepfake and voice impersonation models depicting individuals doing and saying things they never have or would. This not only has the potential to harm these artists, their livelihoods and reputations, but also degrades societal trust,” continues Dr. McTier. “There has never been a more important time for our leaders to demand responsible and ethical AI that works for people — not against them.”

“Artists spend years, sometimes decades, meticulously crafting a brand with their labels to connect with fans. Permitting AI firms to profit from imitated artist voices undermines the core principle of copyright: to reward the risk-takers who promote the progress of the arts. This proposal is timely and addresses a significant void,” adds A2IM President and CEO Dr. Richard James Burgess.

“AI deepfakes and voice cloning threaten the integrity of all music,” says NMPA President and CEO David Israelite. “Music creators face enough forces working to devalue their work — technology that steals their voice and likeness should not be one of them.”

SAG-AFTRA President Fran Drescher concludes: “Without smart regulation, AI technology poses risks to individuals and intellectual property rights. We applaud the introduction of the No AI FRAUD Act. It’s essential for the public to get involved and advocate. Let’s ensure AI technology remains a tool for humans and not a means of human exploitation.”