{"The artificial rapper has a problem: he's being used to spread far-right politics across Britain—and it's getting harder to tell what's real.
That's the core finding from an investigation by Novara Media and the Bureau of Investigative Journalism into Danny Bones, a fictional AI-generated rapper created by a creative agency called The Node Project. Analyzing millions of streams and social media interactions, the reporters found that Danny Bones has been deployed in targeted political campaigns, particularly during recent by-elections in Gorton and Denton, spreading messaging that critics characterize as Islamophobic.
The investigation also uncovered ties between The Node Project and Advance UK, a far-right political party led by Ben Habib. A video produced for the party, depicting a history of Britain from ancient times to the modern day, was recently removed from Advance UK's website. When questioned by journalists, The Node Project denied any formal affiliation with a political party and rejected the characterization of its content as Islamophobic.
"It's just kind of like so easy to create this kind of world that he's in."
Simon Childs, one of the authors of the Novara Media investigation, said he was surprised by how effective the content has proven to be: not because it's good, but because it's catchy. The song "This Is England," featuring Danny Bones, has gotten stuck in listeners' heads despite being, by most accounts, terrible. The videos are hyperrealistic, almost like watching something from a video game.
The real concern is what's coming next. AI technology is getting cheaper and more convincing. The boundaries between what is real and what is completely fake are already fraying. People can now shoot real footage and add filters to create synthetic content that looks authentic.
The Political Impact
During the Gorton and Denton by-elections, Danny Bones made a number of interventions in the campaign, specifically targeting voters with messages about certain communities. One track expressed anger about recent electoral outcomes, pointing to the presence of Palestinian and Pakistani flags and the absence of Union Jacks. Critics might note that attributing intent to AI-generated content is complex; the characters themselves have no agency.
The investigation found Danny Bones also attacked specific candidates, including Hannah Spencer from the Green Party, expressing what listeners described as "unbelievable" anger through synthetic vocals.
The Platform Problem
TikTok did remove Danny Bones content after the report was published. YouTube added labels to let viewers know they're watching AI-generated material. Spotify reviewed the music and found it hadn't broken any rules—meaning nothing happened. The Electoral Commission responded by encouraging people to think critically about what they're seeing but offered little concrete reassurance that it's addressing the problem.
The bigger issue is whether current laws governing election material need updating for artificial intelligence. Simon Childs says platforms reward people for stirring things up—and that's exactly what AI-generated content does. The algorithm rewards division, and synthetic media makes division easier to produce at scale.
Bottom Line
This investigation reveals something genuinely new: far-right groups are using AI-generated characters to spread divisive rhetoric directly into electoral politics, and the technology is improving faster than regulation can keep up. The strongest part of this argument is the documentation that these tools already exist and have been deployed in real elections. Its biggest vulnerability is that pinning responsibility on artificial entities like Danny Bones is legally and ethically complex: AI doesn't have agency, but it can amplify human-written hate speech efficiently enough to matter. Watch for more AI-generated content in upcoming campaigns; the technology is becoming cheaper and more realistic by the month.