Is Substack a safe place to build your community?

Like us, you’ve probably been hearing a lot of chatter about Substack lately. And not necessarily in a good way. The CMJ Group's Director of Strategy, Kelly Shetron, and I sat down to unpack what we’ve been seeing:

Is Substack a place for community?

Kelly: A few weeks ago, I was listening to a podcast in which Liz Gilbert praised Substack as a safer space than social media, saying: “The reason it’s such a wonderful community is because you have to sign up for it. Nobody signs up for something to be a troll. And so the community itself is safer.”

Before we get into the safety part, I first want to talk about Substack—a newsletter service with a comment feed and Twitter-like Notes feature—as a community. I know Liz Gilbert isn’t the only one who thinks of it as one, but I also know we define community in a specific way: a set of people who share mutual concern for one another. We also talk about audiences as the product of one-to-many communication (when one person broadcasts to many people), whereas communities are spaces where everyone can form relationships and a sense of belonging with each other. I wonder if Substack is a hybrid of these two concepts.

What do you think? Is Substack a place for communities, audiences, or both? What do you think about the strengths and limitations of a platform like Substack in fostering community?

Carrie: Substack and other writer platforms like Ghost (its open-source alternative) are tools to bridge audiences into communities. A helpful analogue to Substack is Patreon. Patreon’s creators have audiences built around them, and usually, a subset of that audience is interested in being in real conversation with the creator and the creator’s other followers. Simply financially supporting a creator does not mean you are in their community. But starting to cross the line from consumers of their content to active contributors to their conversations? That’s where a community begins. Substack is similar in that it allows audience members to pass that threshold from passive readers to active commenters and conversationalists.

"Simply financially supporting a creator does not mean you are in their community."

In terms of strengths and limitations, Substack is great for publishing text-based content, holding an archive of that content, and easily monetizing it. Many talented writers would otherwise find it difficult to share their work with the world and financially benefit from it. Many, like Anne Helen Petersen, do a fantastic job of using the comments feature to start threads where hundreds of people can share everything from how they afforded to buy a house in their 20s and 30s to what they’re reading or watching.

The limitation is that, as the creator, you’re the only one who can initiate conversation. So if you want a community to extend beyond just yourself, this isn’t gonna cut it.

Kelly: That seems like such an important distinction. What’s the value of members being able to start conversations? Is it important for communities to extend beyond a creator?

Carrie: If members can’t start conversations, the onus is on the creator to initiate engagement. That can become tedious and exhausting. In tech-speak, it doesn’t scale. There’s a hidden benefit, though: you get to choose when to have conversations, rather than the community being a 24/7, always-on space. Anyone creating a community needs to decide whether they want to prioritize scale or control. If you want control, you can’t scale.

"If members can’t start conversations, the onus is on the creator to initiate engagement."

Finding a safer space than social media

Kelly: We can probably agree that social media is unsafe in many regards for many people, as Gilbert mentioned. For better or worse, it also has a place in our lives, and that doesn’t seem to be changing anytime soon. From a community strategy perspective, what is social media good for, and how does it fall short?

Carrie: Social media is best for discovering and maintaining individual relationships. Research has shown that spending time on social media to strengthen relationships or our self-image benefits our well-being. However, social media is terrible for encouraging relational maintenance. The algorithms favor content that distracts rather than deepens our commitment to each other.

"Social media is terrible for encouraging relational maintenance. The algorithms favor content that distracts rather than deepens our commitment to each other."

Does Substack have a Nazi problem?

Kelly: Substack recently faced backlash from writers (some of whom abandoned the platform altogether) when it failed to ban Nazi accounts (eventually, it did ban a few). What responsibility do community or community-like platforms have when it comes to content moderation?

Carrie: Substack does have a Nazi problem. The Internet has a Nazi problem. Despite repeated pleas from top creators (Casey Newton, Anne Helen Petersen, Jeanna Kadlec) for Substack to reconsider its policy on antisemitic content, the company won’t back down.

The issue with Substack is that allowing such content is a business decision, one that prioritizes profit above all else. They know that hate makes money, and they’ve chosen to profit from it, much like YouTube allows its algorithms to serve up harmful content to children as it rakes in cash from advertisers. There is the ideal level of responsibility… and then there is late-stage capitalism.

A few platforms are navigating these decisions responsibly, most notably Reddit and Discord. Discord just updated its hateful conduct policy in October 2023, and Reddit was one of the first major community platforms to take a stand against Nazi content in 2017. Does that mean these platforms and their leadership are perfect? Absolutely not. Do they still have plenty of issues with moderation? Yes. But they offer helpful role modeling.

What does keep a community safe?

Kelly: What ingredients go into community safety? Is there a recipe?

Carrie: As we know through the work we’ve done with our clients, community safety is a moving target that requires an ongoing combination of platform-wide and community-specific policies, leadership, and user contribution. The community needs documented policies and a way to surface them and ensure people understand them (ideally prior to participating), and the platform needs to take a stand on things like hateful content. If it doesn’t, there is always going to be a limit to what the creator can do to make the space safe. When creators spoke out about Substack’s policies, their own comment sections filled with backlash from Nazis. All the creators could do was turn off comments, which effectively turned off their community.

"The community needs documented policies and a way to surface them and ensure people understand them, and the platform needs to take a stand on things like hateful content."

Beyond policies, you also need leaders who actively model positive behaviors, plus super-active moderation from both leaders and moderators. The moderation can be “chill” (mods can speak like community members and say things like “buddy, we don’t talk like that here”), but it needs to be there.

One resource that’s especially helpful for staying on top of changes in moderation is Ben Whitelaw’s Everything in Moderation newsletter. I also highly recommend the work of Venessa Paech and Kat Lo.

 
