Revealed: How Substack makes money from hosting Nazi newsletters

Substack, a popular platform for self-published articles and newsletters with millions of subscribers, has been found to be generating revenue from hosting Nazi newsletters. The investigation found that the platform's recommendation algorithm promotes content based on user engagement, inadvertently amplifying extremist views.

One such newsletter, NatSocToday, which has 2,800 subscribers, charges a premium subscription fee of $80 a year. Its posts frequently express white supremacist ideology and praise Adolf Hitler as "one of the greatest men of all time", and its profile picture features a swastika, the symbol of the Nazi party.

Another account, Erika Drexler, has 241 subscribers and describes its author as an "NS activist". Her posts describe Hitler as her hero and claim he was "the most overqualified leader ever." A third account, Ava Wolfe, with 3,000 followers, promotes Holocaust denial and conspiracy theories, including the claim that doctors found no evidence of deliberate murder during the Holocaust.

These newsletters are not isolated cases. The investigation found that Substack's algorithm often directs users to other profiles featuring similar content, creating a network of like-minded accounts. This is particularly concerning given the rise in antisemitism and Islamophobia worldwide.

Critics argue that this platform profits from hateful material while failing to adequately address it. Danny Stone, CEO of the Antisemitism Policy Trust, stated that online harm can inspire real-world violence and called for greater regulation of harmful content.

The Holocaust Educational Trust echoed these concerns, describing the spread of conspiracy theories and Holocaust denial as a "disgrace." Joani Reid, Labour's chair of the all-party parliamentary group against antisemitism, planned to write to Substack and Ofcom to demand action.

Substack has faced criticism in the past for hosting extremist content. In 2023, its co-founder Hamish McKenzie wrote that the platform doesn't condone Nazi views but instead advocates for open discourse to strip bad ideas of their power. However, critics argue that this approach is inadequate given the real-world consequences of such ideologies.

As Substack continues to generate revenue from these newsletters, it raises questions about the responsibility of tech companies to curb hate speech and promote a safer online environment.