Rick Beato argues there's a fundamental gap between what artificial intelligence can produce and what music truly represents—a gap that no algorithm can bridge.
Beato recently appeared on CBS and NPR discussing his experiments with AI music generation platforms like Suno and Udio. He used ChatGPT to generate prompts, Anthropic's Claude to write lyrics, and then fed those into AI music generators to create original songs. The results sound technically impressive, but they miss something essential.
AI can generate sounds, beats, and lyrics—but it cannot feel what musicians feel when creating.
Why Musicians Are Uniquely Positioned to Judge AI
The distinction matters because only human musicians can actually evaluate whether AI-generated music works. Beato points out that the Beatles, Rolling Stones, Beach Boys, Led Zeppelin, and Earth, Wind & Fire all used seventh chords throughout their recordings—from the 1950s through the 1970s. That musical heritage isn't just data for algorithms to ingest; it's a living tradition that musicians understand intuitively.
When Beato interviewed Pink Floyd's David Gilmour in London, he sat two feet away while Gilmour played guitar—experiencing music physically, emotionally, and historically. That's the kind of context AI lacks entirely.
The Counterargument
Some might argue that AI can already replicate musical patterns convincingly, or that audiences can't distinguish between AI-generated and human-produced music. But Beato's position is clear: technical replication isn't artistic understanding. The question isn't whether AI can mimic chord progressions—it's whether anything digital can comprehend why musicians choose those specific sounds in the first place.
Critics might note that many listeners genuinely enjoy AI-generated music, and that the emotional distinction is becoming less obvious as technology improves. That's a fair challenge—and it points to exactly why this conversation matters now.
Bottom Line
Beato's strongest argument is simple: music carries human history, emotion, and intention in ways algorithms cannot replicate. The vulnerability in his case lies less in proving AI insufficient than in explaining what makes human musical choice irreplaceable. The 1970s rock bands he references aren't just examples—they're evidence that certain sonic decisions carry decades of cultural weight that no prompt can capture.