First! Moderating social networks and other deep thoughts

A new monthly newsletter, Weaver's Deep Thoughts.

Up close image of young woman looking upward; a rainbow is cast across the middle of her face.

Originally published on LinkedIn.

Welcome to the first of hopefully many newsletters. Since it is the first, I'll take a moment to describe what I'd like to do with this newsletter and the kind of content you can expect. The first thing to know is that I'm publishing monthly for now. I didn't want to deep dive too much too fast. I don't even know how to swim, y'all. I can flop around and manage a few strokes, but pretty soon I start sinking.

Wait... not really relevant.

For this newsletter, I want to write about writing, books, technology, and the occasional one-off topic via deep thoughts. I'm also considering starting an interview series for authors, so if you're an author, stay tuned for details on that. I'll also share a song with each newsletter because I'm a huge music fan.

🤔 November Deep Thoughts: Social networks and moderating

I've been a moderator in various corners of the internet off and on for years: a writing forum, a Google+ community for Blogger users that blew up, Pluspora (a Diaspora instance that welcomed thousands of Google+ users when that network closed), a couple of other writing communities, and other online communities I can't recall now. All that to say, when I talk about moderating content on social networks, I speak as someone who has been in the muck and filth. I've dealt with Nazis, racists, sexists, antisemites, people who glorify mass shooters, videos of the Christchurch mass shooting, and reports of child pornography that I passed along to authorities who could help the victims. Muck. Filth.

If I had to sum up content moderation in one sentence, it would be this:

Social networks must have content moderation because without it a network implodes.

People who claim we do not need it because it impedes free speech do not understand what that means. Without content moderation, we're literally talking child porn and exploding heads. I wish everyone would take some time to read articles or books about content moderation and how miserable a job it can be for those who must sift through all the reports and trash people put on the web. It's one of the most thankless jobs in the world today. Not just thankless but despised: some people hate the internet's content moderators.

At the very least, the next time you start to say or post something about free speech, take a moment to think about this. Think of all the ugliness that would run rampant without content moderation tools and the people behind them.

📖 Book recommendation: Behind the Screen by Sarah T. Roberts

About the book: "Social media on the internet can be a nightmarish place. A primary shield against hateful language, violent videos, and online cruelty uploaded by users is not an algorithm. It is people. Mostly invisible by design, more than 100,000 commercial content moderators evaluate posts on mainstream social media platforms: enforcing internal policies, training artificial intelligence systems, and actively screening and removing offensive material—sometimes thousands of items per day..." Learn more.

An image of a phone screen where the emphasis is on the Twitter app.

Moderating Social Networks

Originally published July 24, 2018, on my blog.

I’ve been thinking a lot lately about how social networks have become hotbeds of hostility and how new networks are trying to position themselves to answer actual and perceived problems. I want to talk a bit about that.

But first, a true story!

Several years back, I created an online community for writers. It was an odd system that never quite worked perfectly, but it was fun while it lasted. At its peak we had just under 100 members, as I recall.

I had a single webpage of simple, no-nonsense rules about how to behave and how not to, including the behavior, or writing, that could get you banned. I wanted the rules to be simple, so that they just made sense when a person read them. As I recall, they fit on one page, because honestly, it doesn't take much to spell out the kinds of behavior or posts you don't want on a platform. They were called Da Rules. Here's a snippet of those rules:

Da Community Postin'. For Da Members. The Community is great, because you can communicate with your fellow authors, talk about randomness, have fun, and start collaborating if you want. Community tools are pleasant tools and should not be used to stir up trouble. The Community tools Include Wall, Forums, profiles (where you can become friends, post to their wall or send a private message), and Instant Messaging with your friends on Emerald Dragon.

Please refrain from the following when posting to the Community.

  • Sharing nudity, porn or the such like.

  • Cussing like a sailor or otherwise being abrasively offensive to others.

  • Posting anything malicious, such as an attack on a fellow writer. Such nonsense will be deleted, and your account may be deleted if you do not learn your lesson or if it was deemed malicious.

    Show some R-E-S-P-E-C-T.

  • Emerald Dragon will not tolerate malicious behavior, so be respectful. Those who are authors here are not professional authors, or at least most aren't. They are learners, so be encouraging, not discouraging. Take Into account that not everyone is perfect like you.

  • Teach instead of becoming irritated if you read some bad grammar or spelling. Share the knowledge.

    Laugh and be happy,

    Smile right in their face.

    Cause pretty soon

    You're gonna take their place

    Randy Newman.


Things were smooth. People respected Da Rules. And each other.

And then… I got an email.

One of our female members had been harassed on our platform by a male member. He began by writing criticism of her stories, then escalated beyond criticism, sending her private messages through the platform's messaging system. Eventually it escalated to threats, including real-life threats against her family. That's when she deleted her account and went running from the platform. She then decided to email me and let me know what had happened, as a warning, in case he went after other members.

It was one of those things where, as a human, I let out a big sigh. I don't like it when people mistreat people, especially in a space I provided. It was very upsetting, and I was hopeful her days of his terror were over.

I restricted the guy's movements on the platform and sent him a private message. My first step was to try to be unbiased and get to the bottom of it: give him a chance to hear the allegations and respond. I did not tell him who had reported him.

Within the first few messages, he turned into a very critical, grumpy man. He had strong opinions about the fact that I was even talking to him. It was a personality and attitude I was all too familiar with: he was the victim, everyone else was wrong, and he was mad as hell about it. And this despite the fact that nothing had happened to him yet. We were just talking.

As his anger and retorts grew, it became clear this guy was trash, and just as hostile as portrayed. I banned him from the site and required new members to go through admin approval before joining (assuming he'd come back). I also took note of his IP address's location to help determine whether a new member request was him or someone else. It wasn't perfect, but it was something.

Almost immediately, he asked to join under a different name (his email address gave him away). I rejected the request. He tried again (using feminine names and info, I might add), but his IP address gave him away. He tried several times, but his IP always betrayed him. Eventually he gave up.
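The approval flow above can be sketched in a few lines of code. This is a minimal, hypothetical sketch (the function and variable names are mine, not anything from my old platform): reject a signup whose email matches a banned account, and flag one whose IP matches for manual admin review.

```python
# Hypothetical sketch of admin-approval screening for new signups.
banned_ips = {"203.0.113.42"}          # IPs noted from banned accounts
banned_emails = {"troll@example.com"}  # emails of banned accounts

def review_signup(email: str, ip: str) -> str:
    """Return 'reject' for a known banned email, 'flag' for a matching
    IP (likely the same person), otherwise 'approve' for normal review."""
    if email in banned_emails:
        return "reject"   # the same email gives them away
    if ip in banned_ips:
        return "flag"     # same IP: hold for a closer look
    return "approve"
```

As the story shows, this kind of check is a heuristic, not a guarantee: IPs change, and shared connections can produce false matches. It's not perfect, but it's something.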

Why did I tell you this story?

Because it always comes to my mind when questions of how to squash abusive users arise. And those questions have been everywhere as of late. People want Nazis off Twitter, Jack is all like, “But with a checkmark?” People want to squash the trash heap of trolls, racists, bots and filth of social platforms, and Mark Zuckerberg and others are like, “But what if they just don’t know any better?” People just want to not have threats slung at them for being women, black, Muslim, Jewish, trans, whatevs, and the response is “this does not break our rules of bad behavior on our crap platform.”

WELL… I’m here to say that it should. You set the rules. So, ask yourself, as a platform, do you want Nazis? Do you want racists? Do you want spammers? Do you want bots, trolls? Do you want the scum of the earth? Is that the kind of community you want to foster?

When I was first putting together our online writing community, the key was understanding that what I was building was a community. Yes, a community. We were going to live together. Work together, play together. Read and write together. A community cannot thrive, and stay together, if it is divided. If you let in people who are angry, divisive, and who exist to cause a ruckus, it will become a hot mess. Good people will leave. You must have your community's back. You must know the difference between a good member of the community and a bad actor. A bad actor is one who just wants problems, always argues, can't get along. Sure, they'll often claim innocence. Sure, they'll bemoan that they're being attacked for having different viewpoints or whatever. But the point is that as a community leader, you must choose what kind of members you want. Ones who are there to build up, or to tear down?

New, reformed, and future social media networks.

Like I said at the start, I've been thinking a lot about this stuff lately. With all that's been going on, with all that's not been going on, and so on and so forth. But I've also been trying new and upcoming platforms, and I'm noticing some trends over there. No one can know for sure what the future holds, but here are some things I'm seeing and some things I'm thinking.

Some new networks are touting that they are pro-free-speech and don't censor. This should stand out as a red flag, a major one. You should stop, drop, and roll when you see this message. That makes me sound horrible, but here's what it means: people who got banned from Twitter, Facebook, and other networks for being human garbage can run free on these networks. Cool, right?

Some new networks are niche in focus. This might just be the future of networks, honestly. Think about it: if your network has a niche, like mine did so many years ago as a writing community, it stands a better chance of everyone being on the same page about standards of behavior. Some new networks are trying to meet this need times a thousand by being group focused, which may work. But it runs the risk of trolls from other groups mingling into groups where they don't belong to stir up trouble. When I was 13, I entered a Titanic (the movie) chat on mIRC with a friend and we trolled it until we got banned. Been there, done that.

Is reforming social networks even possible? Serious question; I want serious answers in the comments, too. Look at Facebook, which is frantically trying to reform to regain the trust of its userbase and repair public perception. They're trying new things with how news and news stories are handled. They're trying new things with how content is pushed to users. But at the end of the day, when you're as many years steeped in a method as Facebook is, what are the odds of changing? Really changing? Without a major overhaul? Without pissing off your current userbase and chasing a bunch of them off, whether on purpose or not? Zuckerberg seems determined to keep every single user (gotta pay the bills), even when those users are the bad actors we talked about. On Google+, a place I love, a different sort of reform is trying to take place. Spammers are horrendous on the platform, especially in Communities, and it's not an easy task to reverse those behaviors, even after plucking out the bad actors, when so many have already infiltrated. It's an endless battle of reporting, blocking, muting, and any other tool a network gives its users.

Wow, this escalated out of control.

But seriously… what in the world? Guess I had a lot on my mind. This is the stuff that's been spinning in my head about social networks, communities, and so on. I think so much of it is a perspective problem for leaders like Jack and Zuck: they aren't trying to build and foster a community.

🎧 This month's song of choice.

The pick for this month felt relevant to the current state of social networking. Here is Tom Jones and The Cardigans covering Burning Down the House.
