A North Carolina man has pleaded guilty in one of the most brazen AI-assisted fraud cases in the music industry's history, admitting that he used artificial intelligence to generate hundreds of thousands of songs and bots to stream them billions of times, fraudulently collecting more than $8 million in royalties.
Michael Smith's scheme, confirmed by the Department of Justice's Southern District of New York, represents a new frontier in AI-enabled fraud: systematically exploiting the economics of streaming royalty systems at a scale that was previously impossible without AI.
How the Scheme Worked
The mechanics were straightforward and scalable:
- Generate content at scale: Smith used AI tools to produce hundreds of thousands of songs. These weren't high-quality productions; quantity, not artistry, was the point.
- Upload everywhere: the AI-generated tracks were distributed across major streaming platforms, including Spotify, Apple Music, and Amazon Music.
- Bot the streams: automated bots were deployed to stream the songs "billions" of times, according to the DOJ. Streaming platforms calculate royalty payments from play counts, so artificially inflated streams translate directly into real money.
- Collect the royalties: the fraudulent stream counts generated more than $8 million in royalty payments, which Smith collected.
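The arithmetic behind the last two steps is simple, which is part of why the scheme worked. A minimal sketch, using an assumed per-stream rate for illustration (real rates vary by platform and contract, and are typically fractions of a cent):

```python
# Illustrative arithmetic only: the per-stream rate below is an
# assumption for this sketch, not an actual platform figure.
def estimated_royalties(streams: int, rate_per_stream: float) -> float:
    """Royalties earned for a given stream count at a flat per-stream rate."""
    return streams * rate_per_stream

# Assume a blended rate of $0.004 per stream (hypothetical).
RATE = 0.004

# At that rate, roughly 2 billion streams yields on the order of
# $8 million, the figure reported in this case.
print(f"${estimated_royalties(2_000_000_000, RATE):,.0f}")
```

The point of the sketch is the linearity: payouts scale directly with play counts, so anyone who can manufacture plays can manufacture revenue.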
Why This Case Matters
Smith's case isn't just about one person's fraud — it's a preview of a systemic vulnerability in how the music industry monetizes streaming.
Streaming royalty systems were designed for a world where creating and uploading music had meaningful friction. AI has eliminated that friction entirely. Anyone with access to a music generation AI and basic technical knowledge can now produce thousands of "songs" in hours. Combine that with bot infrastructure for artificial streaming, and the fraud economics are compelling.
The scale Smith achieved — billions of streams, $8 million in royalties — required AI. A human couldn't manually create and upload hundreds of thousands of tracks. That's what makes this case a landmark: it demonstrates that AI doesn't just assist fraud, it enables fraud at a scale that wasn't previously feasible.
The Industry Response
Streaming platforms have been fighting bot-driven streaming fraud for years, but the AI content generation layer adds a new dimension. Previously, fraudsters needed some amount of real music. Now they need none.
Spotify, Apple Music, and Amazon have not commented specifically on this case. The broader industry body representing performance rights organizations has acknowledged that AI-generated content is complicating existing royalty frameworks — though the focus has been on compensation questions for human artists, not fraud prevention.
What Comes Next
Smith's guilty plea is significant but unlikely to deter others. The technical barrier to replicating his scheme is low, and the potential upside — millions in fraudulent royalties — is high. Until streaming platforms develop more robust detection systems capable of identifying AI-generated content and distinguishing genuine listeners from bots at scale, the vulnerability remains.
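The kind of detection the industry needs can be sketched as a rate-based heuristic. Everything here is a hypothetical illustration — the account fields, thresholds, and rules are assumptions, not any platform's actual fraud-detection logic:

```python
from dataclasses import dataclass

# Hypothetical listener record; fields and thresholds below are
# illustrative assumptions, not a real platform's system.
@dataclass
class ListenerActivity:
    streams_per_day: int
    distinct_tracks: int
    avg_listen_seconds: float

def looks_automated(a: ListenerActivity,
                    max_daily_streams: int = 500,
                    min_listen_seconds: float = 31.0) -> bool:
    """Flag accounts whose behavior exceeds plausible human listening.

    Bots optimize for volume: thousands of plays a day, or listens that
    stop just past the ~30-second mark at which major platforms count a
    stream. Both patterns are cheap to check per account.
    """
    if a.streams_per_day > max_daily_streams:
        return True
    if a.avg_listen_seconds < min_listen_seconds and a.distinct_tracks > 100:
        return True
    return False

print(looks_automated(ListenerActivity(2000, 1500, 31.5)))  # inhuman volume
print(looks_automated(ListenerActivity(40, 25, 180.0)))     # plausible human
```

Simple rules like these are easy to evade individually, which is why real detection has to combine many signals at scale — exactly the capability the platforms have yet to demonstrate against AI-generated catalogs.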
The music industry spent years dealing with stream manipulation fraud. AI just made the problem orders of magnitude worse.