According to Forbes, the AI music generators Suno and Udio were sued in June 2024 by the three major record labels for copyright infringement. Over a year later, partial settlements have emerged: Universal Music Group settled with Udio in October 2025, and Warner Music Group settled with both Udio and Suno in November 2025. Sony Music and UMG continue to litigate against Suno. During their two years of unlicensed operation, these companies built out their technology and user bases; Suno reached a $2.45 billion valuation after a November 2025 funding round. The settlements, however, only address the concerns of the major labels that sued, leaving a vast pool of independent artists, who represent 46.7% of the global market, without compensation or recourse for their work being used in training.
The Two-Tier Copyright System
Here’s the thing that really grinds my gears. These settlements didn’t fix the problem. They institutionalized it. We now have a clear, two-tiered system for copyright. If you’re an artist signed to Warner or Universal, you get an opt-in mechanism and maybe some future compensation. If you’re one of the tens of thousands of independent artists whose music was almost certainly scraped and used to train these multi-billion-dollar models? Tough luck. Your work is now a permanent, uncompensated part of their infrastructure. Majors like UMG had the financial heft to litigate their way to a deal. As artist Tony Justice’s class action highlights, indie creators have no such leverage. So the law, in practice, only protects those who can afford to enforce it. That’s a terrible precedent.
The Infringement-First Playbook
And that’s the real danger here. This sequence of events writes a playbook for the next wave of AI startups. The calculus is now terrifyingly simple: ignore copyright, build your product as fast as you can with stolen data, and grow until you’re too big to fail. By the time the lawsuits hit, you’re negotiating from a position of immense strength. Your tech is built, your users are hooked, and your valuation is sky-high. The worst-case scenario isn’t being shut down anymore; it’s just cutting a licensing deal with the biggest players, which is still cheaper than licensing everything properly from the start would have been. You get to rebrand from “thief” to “responsible partner” overnight. It’s a brilliant, if utterly cynical, business strategy. And it tells every other developer that ethics are just a cost to be managed later.
The Black Box of Settlements
But what’s in these deals? We don’t know. That’s by design. The financial terms are secret. How artists get paid, or whether old unrecouped balances get deducted first, is a mystery. As the Music Artists Coalition pointed out, there’s a desperate need for “consent, compensation, and clarity” that just isn’t being met. This lack of transparency isn’t an accident. It keeps independent artists in the dark, with no benchmark to demand fairness. It also lets the AI companies and labels control the narrative: they can claim they’ve “done the right thing” without ever showing their work. The framing conflates two separate issues, too. Even if an artist opts into future use, that doesn’t retroactively approve the initial training that extracted value from their life’s work to build the model. That value has already been captured.
What’s Really at Stake
Look, this isn’t really about whether an AI can make a catchy tune. It’s about a fundamental misunderstanding of creativity itself. Venture capitalists see success as output: more songs per minute, more content per dollar. But creativity is a slow, difficult process of learning and synthesis. It’s the labor. When an AI model is trained on a song, it’s consuming thousands of hours of human labor: writing, practicing, refining. That labor created value, and these companies captured it. Now they want to sell monthly subscriptions to a black box built on that unpaid labor. That’s not democratization; it’s a new form of extraction. The public gets it. A UK government consultation found that 95% of respondents believed AI companies should license their training data; only 3% thought it should be free. Yet policymakers drag their feet. The path forward is obvious: start with licensing, compensate everyone fairly, and be transparent. Until that happens, this isn’t innovation. It’s just theft, followed by a PR campaign.
