
Artist Rights Against AI Voice Cloning

Legal options for artists whose voices are replicated without consent. What you can do if someone uses AI to clone your voice.

Legal Desk · 10 min read · December 2024

Disclaimer: This is educational content, not legal advice. If your voice has been cloned without consent, consult an attorney who specializes in entertainment or intellectual property law.

The Problem

AI voice cloning has reached a point where anyone with a few minutes of audio can create convincing replicas of a singer's voice. This technology is being used to create unauthorized "AI covers," fake collaborations, and even entirely new songs that sound like they're performed by specific artists.

High-profile cases have involved AI clones of Drake, The Weeknd, Frank Sinatra, and many others. For artists, this raises serious concerns about identity, control over their work, and lost revenue.

The legal framework is still catching up, but artists do have options. Here's what's available today.

Legal Avenues

Artists can pursue several claims against unauthorized voice cloning:

Right of Publicity

Most U.S. states recognize a right to control commercial use of your identity, including your voice. California, New York, and Tennessee have especially strong protections. Courts applied this to voice imitation long before AI: in Midler v. Ford Motor Co. (9th Cir. 1988), a sound-alike singer in a commercial was held to violate Bette Midler's rights. This is often the most direct legal claim.

Lanham Act (Trademark)

Federal trademark law (Lanham Act § 43(a), 15 U.S.C. § 1125(a)) prohibits false endorsement. If an AI voice clone implies you endorsed or created content you didn't, this claim may apply.

State-Specific Laws

Tennessee's ELVIS Act (the Ensuring Likeness, Voice, and Image Security Act, 2024) specifically addresses AI voice cloning. Other states are following with similar legislation. See our voice cloning laws guide.

Contract Violations

If someone who had access to your vocals (a studio, collaborator, or label) used them for AI training, they may have breached contractual agreements.


Takedown Process

For AI voice clones appearing on platforms like YouTube, Spotify, or TikTok:

  1. Document everything. Save URLs, screenshots, and recordings before content is removed (a minimal archiving sketch follows this list).
  2. File platform reports. Most platforms have specific forms for impersonation or rights violations.
  3. Work with your label/distributor. They often have faster channels for takedowns.
  4. Send cease-and-desist letters. Your attorney can send formal notices to uploaders.
  5. Consider legal action. For serious or repeated violations, a lawsuit may be necessary.
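To make step 1 concrete, here is a minimal Python sketch for preserving page evidence. It saves each page's raw HTML along with a SHA-256 hash and a UTC timestamp, so you can later show what was online and when. The URL is a hypothetical placeholder; real evidence collection should also capture screenshots and the audio/video files themselves.

```python
# evidence_snapshot.py — minimal sketch for preserving takedown evidence.
# Saves raw page HTML plus a SHA-256 hash and UTC capture time.
import hashlib
import json
import pathlib
import urllib.request
from datetime import datetime, timezone

URLS = [
    "https://example.com/unauthorized-ai-cover",  # hypothetical placeholder URL
]

out_dir = pathlib.Path("evidence")
out_dir.mkdir(exist_ok=True)

for i, url in enumerate(URLS):
    with urllib.request.urlopen(url, timeout=30) as resp:
        body = resp.read()
    digest = hashlib.sha256(body).hexdigest()
    stamp = datetime.now(timezone.utc).isoformat()
    (out_dir / f"page_{i}.html").write_bytes(body)
    (out_dir / f"page_{i}.json").write_text(json.dumps({
        "url": url,
        "sha256": digest,          # lets you prove the file is unaltered later
        "captured_at_utc": stamp,
    }, indent=2))
    print(f"saved {url} ({digest[:12]}...) at {stamp}")
```

The hash matters: if the recorded digest still matches the saved file months later, that supports the claim that your evidence was not modified after capture.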

Platform-Specific Notes

  • YouTube: Has policies against "synthetic or manipulated media" that impersonates real people
  • Spotify: Removed the AI-generated Drake/The Weeknd track "Heart on My Sleeve" and updated its policies
  • TikTok: Requires disclosure of realistic AI-generated content
  • Streaming services: Increasingly screening uploads for unauthorized AI vocals

Prevention

Steps artists can take to protect their voice:

Proactive Measures

  • Include AI voice protection clauses in all contracts
  • Register with voice monitoring services (emerging technology; a conceptual sketch follows this list)
  • Trademark distinctive vocal elements where possible
  • Work with your label on anti-AI-cloning strategies
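How might such monitoring work under the hood? The sketch below is illustrative only: it assumes a speaker-embedding model (the `embed_voice` function is a hypothetical placeholder, not a real API) and flags uploads whose embeddings sit unusually close to the artist's reference voice. Commercial services use proprietary models and far more robust pipelines, and the 0.85 threshold here is an assumed value.

```python
# voice_monitor.py — illustrative sketch of how a voice-monitoring service
# might flag suspected clones. `embed_voice` is a HYPOTHETICAL stand-in for
# a real speaker-embedding model (e.g. an x-vector network).
import numpy as np

def embed_voice(audio: np.ndarray) -> np.ndarray:
    """Hypothetical placeholder: map audio to a fixed-size voice embedding."""
    raise NotImplementedError("swap in a real speaker-embedding model")

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def flag_suspected_clone(reference: np.ndarray, candidate: np.ndarray,
                         threshold: float = 0.85) -> bool:
    # Embeddings of the same voice cluster tightly, so a high cosine
    # similarity between the artist's reference recording and an unknown
    # upload warrants human review. The threshold is an assumption —
    # real systems tune it against labeled data.
    return cosine_similarity(embed_voice(reference),
                             embed_voice(candidate)) >= threshold
```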

Contract Language

Ensure contracts explicitly prohibit use of your vocal recordings for AI training or synthesis without additional consent and compensation. This applies to label deals, sync licenses, and collaborations.

What's Changing

The legal landscape is evolving rapidly:

  • More state laws: Following Tennessee, other states are passing AI voice protection laws
  • Federal legislation: The NO FAKES Act would create national protections
  • Platform policies: Streaming services are implementing AI detection and removal
  • Industry standards: Labels and organizations developing best practices
  • Technology: Voice watermarking and detection tools are improving (a toy illustration follows this list)
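To give a flavor of how watermarking works, here is a toy spread-spectrum sketch: a secret pseudorandom pattern is mixed into the audio at very low amplitude, then recovered later by correlating against the same pattern. Every parameter (seed, amplitude, threshold) is an illustrative assumption; production watermarks must survive compression, re-recording, and editing, which this toy does not.

```python
# watermark_sketch.py — toy spread-spectrum audio watermark, conceptual only.
import numpy as np

SECRET_SEED = 1234   # assumption: a key only the rights holder knows
ALPHA = 0.002        # watermark amplitude, kept small to stay inaudible

def pn_sequence(n: int, seed: int = SECRET_SEED) -> np.ndarray:
    rng = np.random.default_rng(seed)
    return rng.choice([-1.0, 1.0], size=n)  # ±1 pseudonoise pattern

def embed(audio: np.ndarray) -> np.ndarray:
    # Mix the secret pattern into the signal at low amplitude.
    return audio + ALPHA * pn_sequence(len(audio))

def detect(audio: np.ndarray, threshold: float = 0.5) -> bool:
    pn = pn_sequence(len(audio))
    # Correlation with the secret pattern is ~ALPHA * len(audio) when the
    # watermark is present and ~0 otherwise (audio and pn are uncorrelated),
    # so the normalized score lands near 1.0 or near 0.0 respectively.
    score = float(np.dot(audio, pn)) / (ALPHA * len(audio))
    return score > threshold

if __name__ == "__main__":
    clean = np.random.default_rng(0).normal(0, 0.1, 48_000)  # ~1 s of noise
    print("clean flagged:      ", detect(clean))         # expect False
    print("watermarked flagged:", detect(embed(clean)))  # expect True
```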


Official Sources

  • Actor and performer union resources
  • Music industry advocacy
  • Official AI copyright guidance
