The aftermath of the 2016 election was a strange time. Many of us, grasping for any silver lining, half-jokingly declared that at least the art and music would get good again under Trump. Whether that prediction came true is debatable. Artists like Eminem and Kendrick Lamar produced powerful, politically charged work, but their protest often addressed issues predating the Trump era, and the broad wave of protest music that everyone from earnest online activists to musicians had promised largely failed to materialize. Amid this artistic landscape, however, a different kind of inspiration emerged, fueled by the rise of MAGA rallies and the emboldened presence of white nationalists and alt-right groups. That environment sparked a renewed sense of urgency and, for some, a darkly humorous approach to protest, leading to what could be termed the “Nazi song”: music explicitly confronting and condemning neo-Nazism.
Inspired by witnessing a Nazi being ejected from a rally, an experience familiar to anyone from the punk or hardcore scenes, the author penned a song for their indie-emo band, no hope / no harm. Titled “Punch a Nazi in the Face,” it was intended as a straightforward anthem about solidarity and resistance. The author soon hit a significant obstacle, however: the seemingly indifferent but powerful algorithms of online music distribution platforms. Sharing a protest song in the digital age, it turned out, wasn’t as simple as recording and releasing it. The attempt to distribute this overtly anti-Nazi song revealed a complex web of content moderation policies, legal concerns, and the sometimes-confusing stances of various online music platforms.
While the internet allows for unprecedented freedom of expression, with even extremist sites like The Daily Stormer initially finding refuge online, the landscape isn’t entirely unregulated. Platforms like Twitter have, albeit belatedly, taken steps to ban certain hate groups and figures. However, the core issue isn’t necessarily about outright censorship, but rather the content guidelines of private businesses involved in music distribution. These platforms, under no obligation to amplify any and all content, are increasingly grappling with where to draw the line, particularly when it comes to potentially offensive or legally problematic material.
This reality became starkly apparent when RouteNote, a service independent artists use to distribute music to major platforms like Spotify and Apple Music, rejected the author’s “Nazi song.” The reason? The track was deemed to contain “offensive/inappropriate material” that could “incite hatred or discrimination.” That justification pointed to store policies prohibiting content considered offensive, abusive, defamatory, illegal, or likely to incite hatred based on various factors. The irony, of course, was that the song was explicitly anti-Nazi, targeting a group widely associated with hate and discrimination.
The author questioned whether the rejection was an algorithmic error, akin to automated systems misclassifying art as pornography. Platforms like Spotify and Deezer had, after all, begun removing music with overt white power and neo-Nazi messages in the wake of Charlottesville. But rejecting an anti-Nazi song raised a different question: where is the line between legitimate protest music and prohibited content, especially on a topic as sensitive as Nazism?
RouteNote further explained that “partner stores” might reject the content because of laws like section 86a of the German criminal code (Strafgesetzbuch), which criminalizes the use of symbols of unconstitutional organizations, including Nazi symbols and slogans. Similar laws exist in Austria, Switzerland, and Poland. Given that many music distribution services operate out of Europe, these legal considerations appeared to play a significant role in content moderation decisions.
Inquiries with other distributors yielded similar concerns. FreshTunes cited Strafgesetzbuch section 86a and its content moderation rules, while Horus Music pointed to platform guidelines, including iTunes’s prohibition on offensive material. CD Baby referred to its Hate Speech Policy, which bars content promoting violence against specific groups based on race, religion, nationality, and other categories. The question remained: do Nazis count as one of those protected groups? CD Baby clarified that it does not distribute pro-Nazi content, but its stance on anti-Nazi content remained ambiguous.
This situation raises critical questions about the limits being placed on protest music in the digital age. Many iconic punk songs that directly confront Nazism, such as Dead Kennedys’ “Nazi Punks Fuck Off,” would likely face similar hurdles under these restrictions. Some might argue that those older songs are general condemnations while “Punch a Nazi in the Face” is a specific call to action, but that distinction only highlights the increasingly granular nature of content moderation in the tech world.
Amuse, another distributor, acknowledged the influence of platforms like Apple and Spotify in shaping its content guidelines. Apple Music’s guidelines explicitly prohibit “Nazi Propaganda” in content sold in Germany, Austria, Switzerland, and other countries with similar restrictions, citing Strafgesetzbuch section 86a. Spotify, by contrast, indicated that it had not blocked the track and likely wouldn’t; its refusals typically targeted racially offensive content, which made the rejection of an anti-Nazi song elsewhere seem all the more contradictory.
Despite the initial rejections, the author persisted. DistroKid suggested that the word “Nazi” itself shouldn’t be a problem, pointing to songs like “Nazi Punks Fuck Off” already available on major platforms. Legal expertise complicated the picture further: a specialist in German law argued that anti-Nazi statutes like Strafgesetzbuch section 86a likely do not apply to songs like “Punch a Nazi in the Face,” suggesting the rejections stemmed from misreadings of the law, or from broader concerns about inciting violence, rather than anything to do with promoting Nazism.
Ultimately, after persistence and some leveraging of media connections, TuneCore, initially hesitant, allowed the song through, and it appeared on Spotify. The outcome underscores the opaque and inconsistent nature of content moderation policies, and the challenges facing artists who release politically charged music in an environment where algorithms and legal interpretations can stifle even explicitly anti-hate speech. The journey to release a “Nazi song,” even one intended to combat Nazism, reveals a paradox: in the quest to regulate harmful content, legitimate protest and artistic expression can become collateral damage, leaving artists to navigate a complex and often confusing landscape just to be heard.