Substack's Nazi problem
If you haven’t heard, Substack has a Nazi problem. The main problem is that the founders don’t really have a problem with the presence of Nazis on their platform. And that gives everyone who interacts with Substack a Nazi problem.
“Just leave Substack” is a fairly appropriate response to this. How and when to leave Substack is a longer discussion.
What is Substack’s Nazi problem?
Other people have written far more informed takes than I can, so I’m just going to share enough to provide context for anyone who hasn’t already been following this issue.
On 12/14/23, a number of people who publish through Substack shared an open letter to the platform’s founders. The letter asked one simple, direct question:
Why are you platforming and monetizing Nazis?
The letter goes on to cite a number of specific examples of content that would be immediately removed from most platforms but is allowed to remain on Substack.
Before posting this, I looked at some of the content that people want to see removed. These are not just posts that people happen to disagree with. There is actual Nazi content on Substack. I’m not sharing links to any of this content, but if you want to read about how bad it gets, I’ve added a bit more commentary in the first footnote of this post. When making sense of the founders’ response to the open letter, it’s important to understand that there is actual Nazi content here.1
Substack’s core response
Most of the response is bad-faith justification for their course of action. I’ll start with their actual response, and then share a bit of their rationale. Here’s their core response:
Our content guidelines do have narrowly defined proscriptions, including a clause that prohibits incitements to violence. We will continue to actively enforce those rules while offering tools that let readers curate their own experiences and opt in to their preferred communities. Beyond that, we will stick to our decentralized approach to content moderation, which gives power to readers and writers.
The message here is clear: Substack will continue to allow Nazi content to be posted, because you have the power as a reader to avoid that content.
There’s a message here that’s just as clear, without being stated explicitly. Substack will allow a pro-Nazi community to develop on its platform.
Substack’s (problematic) rationale
Most of the response is Substack’s attempt to rationalize their decision to not remove Nazi content. I’ll share a few points they made, and why those points are disingenuous at best.
I just want to make it clear that we don’t like Nazis either—we wish no-one held those views. But some people do hold those and other extreme views. Given that, we don't think that censorship (including through demonetizing publications) makes the problem go away—in fact, it makes it worse.
It’s easy to say that you don’t like Nazis. But if you’re the founders of a publishing platform, your actions mean much more than a simple claim of “we don’t like Nazis”.
Substack’s founders are saying they don’t “think” that censorship makes the problem go away. In fact, they go on to claim right here that censorship of Nazi content makes the problem worse. That is simply wrong; deplatforming hate content works. It reduces the reach of people pushing hate, it reduces their opportunities to build a following that supports and funds their work, and it reduces the normalization of these positions in our general society.
This rationale is another version of the claim that “sunlight is the best disinfectant”. But there’s a very important distinction between shining sunlight on a horrible corner of society where people plot the extermination of others, and offering those people space on the platform you’re building.
But what about freedom of speech?
It’s important to note that Substack does not use the phrase “freedom of speech” in any of their responses that I’ve seen. Here’s what they have to say about freedom.
First, from Hamish’s post:
We are committed to upholding and protecting freedom of expression, even when it hurts.
And from Chris’s co-signing post:
We believe strongly in protecting and upholding freedom of expression, and we think that it can thrive on the internet when matched with the right model.
They’ve chosen their words carefully; they’re claiming to uphold “freedom of expression.” Many people believe that platforms are obligated to do this, but that’s not the case at all; if you run a platform of any kind, you’re allowed to curate the content that appears on it.
If my take on this isn’t convincing, here’s the actual full text of the first amendment to the US Constitution:
Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press; or the right of the people peaceably to assemble, and to petition the Government for a redress of grievances.
I think people assume that the Bill of Rights is a complex legal document outlining how the rights of US citizens are protected. That’s not the case at all; each of the first ten amendments is a single short paragraph.2
It’s worth repeating that the First Amendment’s prohibition on restricting speech applies only to Congress. The people running a platform like Substack are quite free to curate the content that appears there. They do, in fact, prohibit some kinds of content that are not illegal. So they’re already curating content; it’s just that they’re okay with allowing Nazi content on their platform.
So where does this leave us? We’ve seen repeatedly that when people allow extremist content on a platform, that content starts to crowd out other content on the platform. You can see it in the discussions of this issue on Notes, and in the comments of posts discussing this issue.
Most reasoned takes on the issue are drowned out by people shouting “freedom of expression”. Some of these people misunderstand the restrictions laid out in the first amendment. Others understand quite well that freedom of speech is primarily about government censorship, but exploit the common misunderstanding that it’s supposed to apply to all private platforms as well.
The bigger picture here is that once this kind of content is allowed on a platform, it draws more and more people who will shout any reasoned conversation down. If you want to build a platform that allows extremist content, you are quite free to do so. But you should not be surprised to look around one day and find that most of the content on your platform is extremist, and most of the people on your platform are shouting loudly about extremist views.
I knew this was a possibility when I started using Substack. They didn’t promote Nazi content directly, but they have very visibly promoted right-wing commentary as they’ve grown. I had hoped they’d take a more neutral stance as the platform grew and their reach became more prominent. This open letter, and the response by Hamish and Chris, make it quite clear that they’re going to continue doing exactly what they’ve been doing.
They’ve also made our choice quite clear. We can stay on Substack, and see more extremist content and people all around us. Or we can leave Substack. I haven’t made my final decision yet, but it’s really hard to imagine staying here much longer.
How do you “leave” Substack?
When people who have the privilege of living comfortably talk about people living in oppressive areas, there’s a tendency to say “Well, they choose to keep living there.” But leaving a place is harder than many people realize. And leaving a publishing platform isn’t as easy as people think, either.
What is Substack?
If we’re looking for somewhere else to go, it’s helpful to recognize what we came here for in the first place. Substack offers a platform for building an email newsletter, along with a growing archive of posts that can be managed in different ways. There’s an integrated payment system, and a network of readers and writers that supports interactions in a variety of ways.
Substack has done a really good job of building all this, and that’s why many of us have chosen to publish here and not somewhere else. If you choose to leave Substack, I don’t think you can find all this somewhere else; you have to piece together elements that were offered as a complete package by Substack.
What else is out there?
I don’t have a clear plan for leaving Substack yet. I’ve been keeping my eye on what else is out there, because I’ve known I might need to leave the platform from the time I started using it.
Here’s a brief rundown of some of the tools that are currently available:
Buttondown offers a markdown editor for composing newsletter posts, and supports free and paid subscriptions. It’s free for up to 100 subscribers, $9/month for up to 1,000 subscribers, and $29/month for up to 5,000 subscribers. You have to sign up for the $29/month plan to use a custom domain.
This doesn’t strike me as particularly expensive for an established newsletter with a reasonable paid subscriber ratio, but it is prohibitive for people with a few hundred to several thousand subscribers, who aren’t bringing in much money yet.
beehiiv is similar to Buttondown. It’s free for up to 2,500 subscribers, and jumps to $42/month for up to 10,000 subscribers. You also have to choose the $42/month plan to use a custom domain or to offer paid subscriptions.
The distinguishing characteristic of Ghost is that you can self-host the software. If you have the skills to do so, you can build out your newsletter infrastructure for the cost of hosting those services. If you want to use the hosted version, which most Substack writers would need to do, you’ll need to choose a paid plan. The cheapest plan is $9/month (500 subscribers, with limited features), and the next level is $25/month (1,000 subscribers, most features).
I think Flodesk is meant more for email marketing, but it’s something I’m going to look into further if I decide to leave Substack. It’s the only platform I’ve seen that has a flat rate regardless of how many subscribers you have.
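To make the tiers above easier to compare, here’s a rough sketch in Python that picks the cheapest listed plan for a given subscriber count. The numbers come only from the tiers mentioned above; real pricing has more tiers and changes over time, and Flodesk’s flat rate isn’t included, so treat this as a snapshot rather than an authoritative comparison.

```python
# Pricing tiers as (max_subscribers, dollars_per_month), taken only
# from the plans described above -- a snapshot, not current pricing.
TIERS = {
    "Buttondown": [(100, 0), (1_000, 9), (5_000, 29)],
    "beehiiv":    [(2_500, 0), (10_000, 42)],
    "Ghost":      [(500, 9), (1_000, 25)],
}

def monthly_cost(platform: str, subscribers: int):
    """Return the monthly cost of the cheapest listed tier that fits,
    or None if the subscriber count exceeds the tiers listed here."""
    for limit, price in TIERS[platform]:
        if subscribers <= limit:
            return price
    return None

# Compare costs for a hypothetical newsletter with 800 subscribers.
for platform in TIERS:
    print(platform, monthly_cost(platform, 800))
```

At 800 subscribers, for example, beehiiv’s free tier still applies while Buttondown and Ghost both require a paid plan, which matches the trade-off described above.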
If I choose to leave Substack, I’ll write up my migration process, and a description of my new setup. If you leave, please take some time to share your process, and how it goes.
Substack has a Nazi problem, and it doesn’t appear to be going away. There are no easy answers, partly because most platforms are either controlled by people with similar outlooks, or have the potential to be bought up and taken in a similar direction. The people promoting extremist views want you to have a hard time getting away from them.
It’s also hard to migrate to another platform, because most platforms aren’t as easy to get started on and build an audience with as Substack. Most require more technical work, and require you to glue together more pieces of infrastructure.
If you’re writing on Substack, there are a few specific things you can do right now:
Make sure you’re downloading an archive of your work on a regular basis. If you do this, you’ll have the resources needed to recreate your work on a different platform. You should know, however, that it won’t be styled correctly and there may be some missing pieces. Open up the archive you download, and take a look at what’s there so you’ll know what you have to work with.
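If you’d rather sanity-check your downloaded archive programmatically, here’s a minimal Python sketch that inventories a ZIP export by file extension. The filenames and layout inside an export aren’t guaranteed and vary by platform; this just counts whatever is actually in the archive, so you can see what you’d have to work with in a migration.

```python
import zipfile
from collections import Counter
from pathlib import PurePosixPath

def summarize_export(path: str) -> Counter:
    """Count the files in a downloaded export archive by extension."""
    with zipfile.ZipFile(path) as archive:
        return Counter(
            PurePosixPath(name).suffix or "(no extension)"
            for name in archive.namelist()
            if not name.endswith("/")  # skip directory entries
        )

# Example usage (assumes you saved your export as export.zip):
# for ext, count in summarize_export("export.zip").most_common():
#     print(f"{count:4d}  {ext}")
```

A quick inventory like this tells you roughly how many posts exported as HTML, whether there’s a CSV index, and what might be missing before you commit to a migration path.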
Explore the alternatives. Look at each of the tools described earlier, and start to become familiar with other options. Talk to your writer friends, and see what other people are thinking about where to go.
Consider using a custom domain. If your newsletter is only served over newsletter-name.substack.com, then readers are used to going to a Substack-specific URL to see your posts. You can set up a custom domain (for example, mostlypython.com in my case). For now, everything will keep looking the same as it always has. But if you choose to migrate, people will already be going to a domain that you control, not one that Substack controls. This is a good thing for the long term, even if Substack sorts out the Nazi issue for the time being.
Communicate with your readers. If your audience isn’t focused on writing newsletters, I don’t necessarily recommend a full post about Substack’s issues. But consider saying something to your readers. Add a note to one of your posts sharing your thoughts, and invite your readers to share their thoughts. Don’t be taken by surprise if there’s suddenly a large exodus from Substack. More importantly, don’t be surprised if you see more and more extremist content here to the point where you can’t ignore it.
If you mostly use Substack for reading, there are a couple things you can do as well:
Read just enough about this to know where Substack stands. Read the open letter, and read Hamish’s response. If you need convincing about the seriousness of the issue, follow some of the links in The Atlantic article.
Share your thoughts with the writers you’re here for. You can reply to any email you receive from a Substack writer, and most writers will receive your message. (It’s possible to turn that feature off, but most writers want to hear from their readers, and keep it turned on.) Let writers know what you think. Let them know if you’ll follow them on a different platform and let them know if you decide to cancel your Substack subscriptions because of this issue.
Finally, I want to share three posts that have been really helpful for me in making sense of all this.
Ken White writes under the name popehat. I’ve read his work for a long time, and have a lot of respect for his thinking. He recently wrote Substack has a Nazi Opportunity, and it’s a much more informed take on this subject than what I’m offering.
Dave Karpf wrote On Substack Nazis, laissez-faire tech regulation, and mouse-poop-in-cereal-boxes. It’s an excellent article whose basic point is that you should accept the same number of Nazis on your publishing platform as you accept mouse poop in your breakfast cereal. It’s a good, nuanced take; the answer isn’t actually zero in either case. But we should always be striving for zero, in both cases.
A.R. Moon wrote Questions for Substack. It’s another great response to Hamish’s post. Moon poses questions that should be asked, but which people like Hamish are quite unlikely to ever answer in good faith. He also has a great take on the “sunlight is the best disinfectant” claim, when used to defend “free speech” for Nazis on private platforms.
Thank you for reading. If you have any feedback, feel free to leave a comment or reply to any email you receive from me.
Content warning: Extreme antisemitic content. Most of my footnotes are quite benign. This one is not.
Shortly after Hamish made his post stating that Nazi content would not be taken down, a newsletter mentioned in The Atlantic article Substack Has a Nazi Problem released a new post with explicitly extremist content. I originally included two brief quotes from that post. One excerpt called for the “extermination” of Jewish people all over the world. The second quote included a claim that they were not advocating for “individual deaths of Jews” or “lone wolf attacks”. This second quote is typical of the way people espousing Nazi goals claim that they’re not really inciting violence.
I won’t link to any extremist content, but I wanted to share one specific example to counter any claims that there’s no “actual” Nazi content here. I wanted to include one specific example of the kind of content people are talking about, in case anyone reading this thinks “Nazi” just refers to “anything people disagree with these days”.
Casey Newton is one of the authors of Platformer, one of Substack’s largest publications. Casey shared yesterday (1/2/24) that they’re meeting with Substack about the issue of moderating extremist content, and stated that they’ll leave Substack if this is not addressed effectively. Casey and his team are also planning to build a database of extremist Substack newsletters. They’re planning to share their findings with Substack, and if that doesn’t go anywhere meaningful they’re planning to reach out to Stripe as well. Stripe handles payments for Substack, and Stripe does not support payment processing around extremist content.
This morning (1/3/24), the extremist post I mentioned earlier was finally taken down. Many people who want to see this issue just go away will claim that Substack is in fact taking down incitements to violence. But it’s really hard to see this as anything more than just taking down the most visible example of extremist content that others have pointed out quite publicly. There’s been no acknowledgment that this kind of content needs to be proactively moderated by Substack itself. If your most prominent authors have to build a database of Nazi content on your platform and share it with you, that’s not a good sign.
Substack is becoming like Twitter and many other social platforms: it’s a private company, but it’s building public communications infrastructure. The decisions of a few private individuals are having a serious impact on public discourse. I will continue watching how this plays out, and make a decision about where and how to publish before much longer.
The idea of applying US legal principles to a platform with international reach is a topic I won’t go further into here. But this is also about a human right to expression, which is recognized in many forms around the world; the larger point I’m making should apply in most of those contexts.