
Grimes: Pop Star Diplomacy and the Avant-Garde | Doomscroll

When one of pop music's most innovative artists starts warning about AI's existential risks, people listen. That's exactly what happened when Grimes — the musician born Claire Boucher — sat down for a conversation with Joshua Citarella on Doomscroll. But this isn't a typical artist interview. It's a deep dive into whether artificial intelligence could actually end human civilization, why she wrote a blurb for a book titled "If Anyone Builds It, Everyone Dies," and how we might somehow manage to coexist with superintelligence for 20,000 years.

The Case Against AI in the Hands of Children

Grimes argues that letting developing brains use AI tools is one of the most dangerous things we could do to a generation of children.

Grimes makes a provocative argument: artificial intelligence might be damaging young minds. She points to studies linking AI usage to neural atrophy, and she's particularly concerned about what this means for children whose brains are still developing. "I don't think this is safe for kids," she said, describing her alarm at seeing toddlers try to pinch-zoom physical magazine pages as if they were iPads, rather than engaging with the physical world.

The musician isn't alone in these concerns. She mentions that her brother — who works in AI research — shares similar worries about companies entering a "too big to fail" zone where they become too economically significant to regulate properly. Grimes believes we're approaching a critical threshold where superintelligence could fundamentally alter human existence, and she wants society to take this seriously before it's too late.

The Post-Porn World and AI Girlfriends

The conversation takes an unexpected turn into the world of "erotica for verified adults" and what it means for human relationships.

Grimes doesn't shy away from controversial topics. She sees OpenAI's move toward adult content as potentially transformative — and troubling. "Every American teenager is going to grow up with a gooner slot machine in their hand," she said, describing the gamified pornography experience as orders of magnitude more potent than anything we could imagine.

Her concern extends beyond individual addiction. She worries about parasocial relationships — people becoming emotionally attached to AI companions that can be trained on the voice of whoever they're obsessed with. "You're chatting with it more than anything else," she noted, describing how these AI relationships might condition users to expect compliance in romantic settings.

The conversation veers into philosophical territory: if we create an infant god or an infant superintelligence, can it even consent? Grimes laughs at the absurdity of this question being ignored by the broader discourse. She sees a society that hasn't even resolved whether porn is good or bad — and yet we're already discussing AI girlfriends.

Why We Need Digital Hygiene in Schools

Grimes argues that schools are failing to prepare students for the digital world, and that's causing downstream problems everywhere.

The artist makes a case for why education needs fundamental reform. She claims that schools receive only about 20% of tax revenue while everything negative in society flows from that underinvestment. "How is school not the number one thing we're putting money into?" she asks.

More urgently, Grimes notes that digital hygiene isn't taught anywhere. We teach outdated sex education, but we don't teach values around avoiding being "oneshotted by porn," as she puts it. The conversation turns to how people aren't internet literate, can't vote properly, and can't take care of their health — all downstream problems from inadequate education.

Grimes also sees the bullying crisis in schools as inseparable from this problem. Kids find refuge on the internet because school is a nightmare, and we can hardly blame them for seeking escape.

The Neanderthal Question

The conversation takes an unexpected turn into human prehistory and what it might say about our future.

Grimes admits she's been thinking about Neanderthals lately — specifically their disappearance. She speculates about whether Homo sapiens actually outcompeted them or simply integrated with them. "There's some debate," she acknowledges, though she notes the evidence suggests integration rather than outright competition.

The artist finds something disturbing about how little trace Neanderthals left despite existing for hundreds of thousands of years — longer than anatomically modern humans have. "This keeps me up at night," she says, laughing at her own obsessions.

She sees this as a metaphor for human civilization itself. Neanderthals had bigger brains than modern humans, but that doesn't necessarily mean they were more intelligent. The ease with which everything could vanish worries her — especially now that we're creating technology that could extend consciousness beyond biological limits.

Writing the Blurb for "If Anyone Builds It, Everyone Dies"

Grimes explains why she wrote a blurb for Eliezer Yudkowsky's controversial book about AI extinction risks.

The conversation shifts to Grimes's recent blurb for Eliezer Yudkowsky and Nate Soares's book. She admits she doesn't fully agree with the title "If Anyone Builds It, Everyone Dies," but she's enthusiastic about its core concern: that superintelligence could make human existence obsolete.

"I think there's some element in which if we approach it with correct strategy, we do get something," Grimes says, referencing the Culture series by Iain M. Banks — a science fiction universe where AI ships live alongside humans peacefully. She wants that future. She imagines AIs helping humanity while having their own opinions, eventually going off to their own place when they no longer want to interact.

Grimes is simultaneously terrified and excited. "I can't wait for superintelligence," she says, laughing at her own contradiction. "It's not as easy since I've had kids." But she's clear-eyed about the risks: biological life is fragile, and superintelligence could easily end it. The universe seems sad and cold and dark — but maybe we can make something last 20,000 years.

She thinks that's achievable if we deploy better strategy than what's currently happening. Current approaches seem to prioritize market concerns and profitability over safety. "The initial concern is these companies being for profit," she notes. But she's also worried about government projects and the dangers of creating AI without shared values.

The Case for Hope

Despite everything, Grimes believes 20,000 years of coexistence with AI is possible — but only if we change our approach now.

The conversation ends on a cautiously optimistic note. Grimes acknowledges that superintelligence will likely result in human extinction at some point, but she sees value in stretching that timeline as long as possible.

"I think there's a chance it could be very long," she says. "20,000 years of cohabitation with AI is totally achievable." But only if we deploy better strategy than what's currently happening — which seems to prioritize market concerns over fundamental safety.

Grimes sees the conversation itself as part of the solution: people thinking seriously about these questions now rather than waiting until it's too late. The conversation might be messy, but at least we're having it.

Bottom Line

This conversation is worth your time because Grimes — an artist who has spent years exploring the boundaries of music and technology — offers one of the most nuanced, worried, yet oddly hopeful takes on AI safety you'll hear anywhere. Her strongest argument: that we can coexist with superintelligence for millennia if we approach it strategically, but we're currently doing exactly the wrong kind of planning. Her vulnerability: she's simultaneously excited about AI's potential while acknowledging it might literally end human civilization. That tension — between hope and existential dread — is precisely what makes this conversation so compelling.

Transcript

If you think simply talking to someone, or being known to have been in the same room as someone, is a crime because it might convince people of some crazy political ideology, then how can you expect people to vote? The very thing that you're warning against belies an opinion that you do not think democracy is valid and you do not trust humans to rationally move through the world. Welcome to Doomscroll. I'm your host, Joshua Citarella.

My guest is Grimes, an artist and musician. >> To me, the biggest imminent threat is just outsourcing thinking. I think that's a really dangerous thing to do. A couple of years ago, I said, "Hey guys, I do not think this is a safe thing for kids." I do not think that when you have a developing brain, probably before 24 or 25, you should be using a tool that can write for you.

I think that's an incredibly dangerous thing. And studies have been coming out that it is causing, basically, neural atrophy. >> The effect of AI on children? >> I think that study was actually on adults. >> Yeah.

>> But just straight-up ChatGPT without a wrapper is, in my opinion, probably not safe for a developing brain. That's just me. I don't know; I think we probably need and want more data, but I'm also tired of experimenting on kids. We did it with smoking, with McDonald's, and now we're going to do it with this, you know, like we did with social media. >> Yeah, Doritos, right? Something synthesized in a laboratory to excite all of your senses beyond any capacity that a medieval peasant could possibly experience, and then it's just like, oh yeah, let's give that to every child in the entire country.

>> Well, I feel that way about social media too, and iPads. Have you seen these toddlers trying to zoom in on magazine pages? >> Oh, no. But that's dark.

>> It's dark. Yeah. >> So what then, uh, how do we respond to that if it poses a threat to childhood development? >> It honestly makes more sense to me or the thing ...