Note: The following is a companion post to my session “A Renaissance for Online Communities”, presented at PodCamp Toronto 2020 on February 22nd.
A community is not a single place or platform. A community is a connected group of people with something in common.
Communities form around different things. There are communities of place, like your local neighbourhood. There are communities of profession, like web design. And there are communities of interest, like the communities that form around podcasting.
People don’t identify as community members straight away. Showing up at a community event doesn’t mean you’ll feel like a member. At first, we often feel like an outsider, someone dropping in uninvited.
To feel like a community member, we need to achieve a sense of belonging. That sense of belonging grows through participation in shared experiences, both online and offline.
Participation follows a pattern often called the 90-9-1 rule: in a typical community, 90 percent of members are passive observers, 9 percent are occasional contributors, and 1 percent are highly active.
It’s similar to the Pareto principle, which you’ve likely heard of as the 80/20 rule: “80 percent of the effects come from 20 percent of the causes”.
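To make the ratio concrete, here’s a quick sketch. The function name and the round-number community size are my own, purely for illustration:

```python
# Illustrative arithmetic only: apply the 90-9-1 rule of thumb
# to a hypothetical community of 10,000 members.
def participation_breakdown(total_members: int) -> dict:
    """Split a member count using the 90-9-1 rule of participation."""
    return {
        "passive observers (90%)": round(total_members * 0.90),
        "occasional contributors (9%)": round(total_members * 0.09),
        "highly active (1%)": round(total_members * 0.01),
    }

for group, count in participation_breakdown(10_000).items():
    print(f"{group}: {count}")
```

In other words, in a community of 10,000 members, roughly 100 people end up driving most of the activity.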
A common trait of an active community is having unique symbols or language that mean a lot to the community members, but may not mean anything to outsiders: The maple leaf. “We The North”. Raccoons!
We anchor ourselves on things that convey meaning and identity. It’s how we project who we are, how we recognize others like us, and how we group ourselves as separate from others.
“This is who I am. This is who we are.”
So, to summarize:
- A community is a connected group of people with something in common.
- A community can form around many different things.
- A sense of belonging grows through participation in shared experiences.
- A minority of the community will drive the majority of the activities.
- Communities use symbols and language to identify and distinguish themselves.
Part 1: Where are we now?
Apple introduced the iPhone in 2007. It wasn’t the first smartphone, but it was a pivotal moment in consumer tech. The web, previously limited to our laptops and desktop computers, now fit in our pocket. We could take it everywhere. Android was a fast follow: an affordable alternative for everyone else who couldn’t stomach the cost of an iPhone.
Jump ahead a few years. The 2010s were about radically open social media. We went all in. Facebook wanted us to share as much information about ourselves as possible, so that we could all be more connected. Twitter invited us to share what we were doing and thinking at any given time of day. Instagram wanted us to share the mundane snapshots of our lives that never would’ve justified pulling out a camera for.
These platforms are driven by live feeds, dictated by an algorithm, monetized through ads. Twitter brought the feed. Then Facebook, in their first of many “copy the competitor” moments, introduced their News Feed. Then Facebook switched from reverse-chronological to algorithmic. Twitter followed, and then Instagram.
These platforms have built-in behaviour hooks in the form of Likes and Follows, each one acting like a virtual reward with a little dopamine boost. Likes and Follows also let our social influence be quantified.
For generations of people who grew up obsessing over grades and scores, these metrics become a new indicator of success and personal worth. Strong metrics also mean you’re more likely to be seen by others. The more often you appear in the feed, the stronger your metrics. It’s a perpetual cycle.
On the commercial side, the algorithm also means that what businesses share is less likely to appear in the feed. It’s all pay-to-play. But the feed is where the attention is, so businesses buy ads.
That’s how Facebook makes billions. Own the attention, keep people in the feed, sell ads in the feed. Keep people in the feed, sell more ads. Feed. Ads. Feed. Ads. Feed. Rinse and repeat.
(Aside: The term “feed” is pretty on-the-nose, isn’t it? We’re like animals at a trough. We started using these apps a decade ago because they promised some fulfillment through sharing and connecting with other people. Now look where we are.)
Welcome to the tragedy of our social media commons.
“The tragedy of the commons is a situation in a shared-resource system where individual users, acting independently according to their own self-interest, behave contrary to the common good of all users by depleting or spoiling the shared resource through their collective action.” — Wikipedia
Starting in the 90s and moving into the mid-2000s, the open web was a network of independent spaces. Now we pile into centralized platforms, pouring our thoughts and photos and connections into the insatiable maw of social media, fueling the endless ad feed cycle.
We used to protect our data. I remember when sharing your real name — let alone a photo of yourself — was considered risky. Now we don’t think twice about sharing our daily lives, and entrusting all of it to a handful of companies.
Think about the breadth of coverage Facebook has. There’s the namesake blue app, but also Messenger, WhatsApp, and Instagram. All that data. All that activity. All that attention. All in the hands of a single company, which can do what it wants with it because, hey, we signed their Terms of Service.
What did we get in return for sharing all that data? Exploitation.
Cambridge Analytica, however significant or insignificant their actions, is a flagship story of unintended consequences. Then there are the fun pastimes of doxxing, where trolls distribute personal information to harass a target. Or swatting, another gem, where trolls put lives in danger as heavily-armed law enforcement agents respond to a bogus call.
On the less extreme side of things, but still all-too-common, are cyberbullying and harassment. I’m sure we’ve all been bullied before. I was bullied a ton when I was a kid. But home, at least, was a safe space.
For victims, the harassment is in their pocket, following them everywhere. There’s no safe space. It comes through the same platforms that bring us the ads and the feeds.
Stalking. Trolling. Doxxing. Sure, the platforms have policies against all of it, but enforcement is a layer removed and outsourced.
This is what happens when you treat moderation as an afterthought. Social networks were designed to be open and all-consuming. Pour it all in and they’ll figure out what to do with it later. And don’t even get me started on the PTSD of content moderators.
Facebook isn’t a community. Twitter isn’t a community. Instagram isn’t a community. They’re our commons, and the tragedy is real. But we can do better.
Part 2: How did we get here?
Let’s go back in time. In the 1970s, we had Bulletin Board Systems, aka BBSes, small local hubs you dialed into directly over a phone line. You could upload and download files, exchange private messages, post to public discussion threads, and compete in multiplayer games.
In the 1980s we had Usenet. Unlike the local BBSes, Usenet was a global network of topical discussion groups. Each group was its own community. The system spread over dial-up links between servers, earning it the nickname “the poor man’s ARPANET”, after the predecessor of the internet.
The late 80s introduced Internet Relay Chat (IRC). Users could connect to an IRC server and jump into real-time conversations, both public and private, with other users connected to the same IRC server.
In the 1990s, discussion forums, aka message boards, took off. Unlike Usenet, these forums ran on web servers, and you accessed them through your web browser. This is where I found my footing on the web as a kid, in the late 90s, hopping around gaming forums.
With the turn of the millennium, the early 2000s, we entered the age of blogging. Platforms like Blogger made it easy for anyone to start publishing to the web, without having to spin up their own web servers. This new medium, made up of unfiltered and independent voices from around the world, unbound by the limitations of newsrooms or legacy organizations, was quick to break headline news and disrupt the traditional media establishment.
In the mid-2000s came Web 2.0. This brought us social networks like Friendster and MySpace, and increasing volumes of user-generated content. The introduction of the iPhone and Android in the late 2000s took these platforms off our computers and stuck them in our pockets.
We now carry the web with us 24/7. As tech continued to improve, at both the device and infrastructure levels, the sheer volume of what we pushed out reached unfathomable scale. Social networks gave way to social media. Now anyone with a smartphone and an internet connection can act as an independent producer, publisher, and broadcaster.
So here we are today. Our attention shifted from a network of independent online communities to a consolidated handful of apps. We’re adrift in an ocean of media, leaning on closed algorithms and government regulations to make things better.
“Facebook CEO Mark Zuckerberg said Saturday that social media companies need more guidance and regulation from governments in order to tackle the growing problem of harmful online content.” — Facebook CEO Mark Zuckerberg calls for more regulation (CNBC)
What can we learn from online communities of the past?
There are three types of guidelines that have, until now, largely been an afterthought for our social media behemoths: community guidelines, moderation guidelines, and governance policies.
Community guidelines set clear rules around what is and is not allowed. These rules set expectations for new members and act as a reference for community leaders to enforce.
Moderation guidelines cover how the rules are enforced, and how corrective actions escalate from minor infractions to severe penalties. Think verbal warnings through global bans. A gentle reminder on one end. A complete removal from the community on the other.
Governance policies lay out how community leaders make decisions. How are the moderation policies and community guidelines developed? Who has the final say?
So, in other words:
- What are the group rules?
- How do we enforce the rules?
- How do we make decisions as a group?
All community organizations must answer these questions, regardless of their size. But, as far as I can tell, these important guidelines were never priorities for the big social media companies. They were more concerned about getting more users, keeping their users’ attention, and profiting from that attention. Content and conduct be damned.
Now, that’s not to say that things were always rosy in the old communities. Harassment, abuse of power, spam — those have always been a problem. And there were bad corners of the web, communities that prided themselves on being offensive and doing it for the lulz.
The difference back then was that you had more options. You had freedom of choice. If a community wasn’t a fit, you could go somewhere else.
And for particularly bad groups, service providers could make them harder to reach: search engines could suppress their websites from results, and hosting companies could refuse to provide web hosting.
We don’t have those options on centralized platforms. And rather than taking a stance, these platforms have, for the most part, used “free speech” as a cover. Or they wait for governments and law enforcement to step in and make the judgement call for them.
Why? Here’s my hypothesis: These platforms profit off attention. Pushing people away from their platforms means attention goes somewhere else. That’s bad for business. They call for regulation because they don’t want to rock the boat.
The big social media platforms don’t want confrontation. They’d rather let the problems fester, wait for someone else to step in, set the rules, and then they can claim their hand was forced and they had no choice but to do as they were told.
We’re tired. We’re annoyed. We’re sick of it. And people are starting to do something about it.
Part 3: The pendulum swings back.
I’ve had a foot in the community management space for years. I got my start with gaming forums and fansites in the late 90s. It was a hobby I kept up through college. We were pretty good at what we did, bringing in millions of active sessions every month, finding scoops and getting cited by big industry names.
We were influencers before influencers were a thing. We were around before Facebook, before Twitter, before YouTube. Our sites were the go-to destinations for gamers to connect with each other, even if they couldn’t play with each other, because online multiplayer wasn’t around yet.
What I’m seeing play out now is making me nostalgic for where we were fifteen years ago.
Independent communities are on the rise. People are moving away from the open social media commons and into private groups.
Social media companies see the writing on the wall. That’s why Facebook is investing so much in Facebook Groups after years of neglect. Twitter recently introduced the ability for users to follow topics, not just profiles — similar to how Usenet, IRC, and message boards brought people together back in the day.
Chat apps like Slack and Discord are being repurposed as community platforms. I’m an active participant in about a dozen different professional groups piggybacking on Slack, like Post Status for WordPress; Online Geniuses for marketers; and CMX for community managers. I’ve met other folks who also hang out in equivalent groups on private Discord servers.
Then there are services like Substack and Community.com, taking legacy protocols — email and SMS, respectively — and building new premium membership layers on top of them. Substack lets writers charge a monthly fee for emails, while Community.com lets celebrities run two-way conversations over SMS without exposing their personal phone numbers.
And the membership sites — there are so many to choose from. You can run self-hosted on WordPress with plugins like MemberPress, Memberful, and WooCommerce. Or you can go for a hosted platform like Mighty or Mobilize.
That’s just the tip of the iceberg. There are more community platforms spinning up all the time as new startups launch with a focus on community. And there are the legacy forum platforms that can pivot into this space, like Invision, vBulletin, Vanilla, Discourse, XenForo, Flarum, and phpBB.
More options for members.
When we consolidated our attention around a handful of social media behemoths, we lost one of the greatest strengths of the open web: our freedom to choose.
In the days of Usenet and IRC and web forums, we could choose where and how we participated. If we didn’t like the leadership of a particular group, or if we couldn’t stand the other members, we could go somewhere else.
We could also choose how we presented ourselves within each context. We could wear a different identity within each community, sharing only what was relevant to that group, while protecting our privacy and isolating that part of our life from others.
Basically, we could choose how much information we wanted to share. Compare that to today. Our Facebook and Instagram profiles are a collage of our whole selves and our personal networks. Our Twitter profiles blend political posturing with angry customer service complaints with knee-jerk retweets and pile-ons.
A shift back to disparate, independent online communities means we can return to this sort of choose-your-own-online-adventure experience. We aren’t bound or anchored or pressured by an established presence. We can show up as we want to.
And hey — the kids are already doing it. Ever heard of a “Finsta”? It’s a fake Instagram account detached from your real-world identity. It’s a self-preservation tactic for an online existence, and it’s how we all got started on the web, hiding behind handles and screen names.
More opportunity for nonprofits & activists.
The future of online community, in my opinion, follows the organizational structure of nonprofits and activism. Both are anchored in grassroots movements that grow through 1:1 relationships between existing and potential members.
The opportunity we have now is for nonprofits and activists to add a technology layer on top, leveraging the benefits of a social network. We can bring decentralized groups together for collaboration and shared resources, without handing control over to a large corporation.
More opportunity for creators.
Transform your audience into a community. Imagine you’re performing in a room, rows upon rows of fans in front of you. Now turn those people toward each other, then lead them from the ground. Facilitate the connection through the shared experiences you create.
Musicians are great at this, creating a visceral and emotional atmosphere, especially on the dance floor. Think about the vibes of a great show, and knowing that you’re connected to everyone around you through that same feeling.
More opportunity for brands.
This is where I spend a lot of my time. I joined GoDaddy in 2015 to serve as the Community Manager for GoDaddy Pro, our partner program for web designers & developers.
My profession is marketing, and I think of what I do as Community Marketing. It spans three areas: earning awareness; earning trust; and earning loyalty.
We earn awareness through participation. That includes contributing to online and offline conversations, joining shared experiences like events and workshops, and supporting or sponsoring the existing community groups that align to your brand purpose.
We earn trust by leading a community of interest or purpose. This includes an online community punctuated by offline meetups. Everything is fueled by original, helpful, and interesting content, because it’s the content that brings people in. The community brings people back.
We earn loyalty by investing in our customer community. That includes a community program for rewarding and recognizing customer success and growth; identifying and empowering brand ambassadors to represent the community; and hosting online and offline events that bring customers together around shared experiences.
More job opportunities for community professionals.
Community managers have long been synonymous with customer support and social media. As community is increasingly embraced as a foundational layer, I see an opportunity for community professionals to lean into other areas.
On the marketing side, we’ve already touched on the alignment between community and brand. But community, and its tight relationship with content, can spill over into content marketing, social media, sponsorships, events, and more.
On the product or services side, community professionals can serve as the voice of the organization to customers, and as the voice of the customers within the organization. We ensure that qualitative insights and opinions from our customer community are honestly and earnestly represented to stakeholders. When our teams have something to share, we bring that information to the community on behalf of the organization.
And yes, on the care & success side, we’re there to help our members solve problems, and to help our members solve problems for each other. I believe that’s a fundamental role of any community, regardless of the intent.
More community revenue models.
The big social media companies have a simple business model: bring people in, get them to share content, use that content to bring other people in, then sell ad space that appears alongside that content.
In this setup, the users aren’t the customers — the advertisers are. The users are more like inventory, and their collective attention increases the value of the ad space.
When we pivot to private groups, we don’t need to be so reliant on advertising. We have options.
We can sell premium experiences, like retreats and events, the way my friend Mendel Kurland does with his Geek Adventures, or my other friend Kristina Romero does with her Recurring Revenue Retreat for web designers.
Or we can earn affiliate revenue by directing our members to make purchases through our links, the same way Wirecutter does for The New York Times.
And yes, we can still sell advertising — but in the form of sponsorship, either directly to advertisers or through a third-party service like BuySellAds.
Next steps: Plan for success from the start.
Legacy social media is largely “anything goes”. There are minimal restrictions on content, because restrictions are bad for their business model. But because we’re not following their model, we can do the right thing from the beginning.
Keep group conversations and activity focused by establishing a clear purpose around clear topics. Create a Code of Conduct that encompasses all community activities, online and offline.
For example, in all of the groups I’ve ever managed, we had a major rule: no politics, no religion. This won’t work for everyone, but for the interest-focused groups I’ve led, this helped us avoid a common source of conflict, and let us bring members together around shared interests. If they choose to have those conversations outside of our space, that’s fine, but that’s on them.
Second, create your Moderation Guidelines so your volunteer and staff community managers have a policy to follow. Define the corrective action levels, escalation paths, and enforcement options available for moderating the community.
Keep in mind that these are guidelines, not laws. Community managers are ultimately empowered to make decisions. Context has a major impact on how these decisions get made.
That’s why I’m of the opinion that every corrective action should be recorded somewhere — ideally, tied to the member’s user profile, or in a CRM — so that other community managers can reference past offenses. If someone has a history of pushing the envelope with minor offenses, it may be time to crank up the corrective actions.
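To make that concrete, here’s a minimal sketch of what such a record might look like, assuming a simple in-memory store. Every name here (ActionLevel, CorrectiveAction, ModerationLog) is hypothetical; in practice this data would live in your CRM or member database.

```python
from dataclasses import dataclass, field
from datetime import datetime
from enum import IntEnum

class ActionLevel(IntEnum):
    """Escalation path, from a gentle reminder to a global ban."""
    REMINDER = 1
    WARNING = 2
    TEMPORARY_SUSPENSION = 3
    GLOBAL_BAN = 4

@dataclass
class CorrectiveAction:
    """One corrective action, tied to a member's profile."""
    member_id: str
    level: ActionLevel
    reason: str
    recorded_at: datetime = field(default_factory=datetime.now)

class ModerationLog:
    """In-memory stand-in for the CRM where actions get recorded."""
    def __init__(self):
        self._actions: list[CorrectiveAction] = []

    def record(self, action: CorrectiveAction) -> None:
        self._actions.append(action)

    def history(self, member_id: str) -> list[CorrectiveAction]:
        # Lets any community manager reference a member's past offenses.
        return [a for a in self._actions if a.member_id == member_id]

    def suggest_escalation(self, member_id: str) -> ActionLevel:
        # A member with a history of minor offenses gets a stepped-up
        # response; a first-time offender starts at a gentle reminder.
        past = self.history(member_id)
        if not past:
            return ActionLevel.REMINDER
        highest = max(a.level for a in past)
        next_level = min(highest + 1, ActionLevel.GLOBAL_BAN)
        return ActionLevel(next_level)

log = ModerationLog()
log.record(CorrectiveAction("member-42", ActionLevel.REMINDER, "off-topic politics"))
log.record(CorrectiveAction("member-42", ActionLevel.WARNING, "repeat offense"))
print(log.suggest_escalation("member-42").name)  # TEMPORARY_SUSPENSION
```

The point of the structure, not the code: because every action is attached to a member and timestamped, any community manager can see the full history before deciding how hard to escalate.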
Third, document your governance model. This outlines how decisions get made within the community. Consider including a top-level ethos, “this is what we believe in”, a set of guiding principles and values. From there, detail the hierarchy of roles and responsibilities of who does what within the community.
Let’s take back the web.
“Our digital social environments will feel very different over the next 5+ years, re-emphasizing private interactions and helping us build the smaller communities we all need in our lives.” – Mark Zuckerberg, CEO of Facebook
I owe my career to the online and offline communities. My self-taught skills in web development, which I picked up through the fansite and forum work, helped me land my first tech jobs straight out of school. The network I built through community events led to every role I’ve had since.
Now I have the privilege of working on content and community programs full-time.
If you’d like to learn more about community management as an industry:
- Community Manager Breakfast
- Community resource list from Rosie Sherry
- My Twitter list of community professionals