Hacker News

> how meaninglessly violent people become when their posts are modded

Maybe if mods were more transparent across all the subs.

Most people get angry when someone removes their post and yet they see it reposted and approved hours or even minutes later. (Gallowboob is infamous for that, but regular users do it too.)



> Maybe if mods were more transparent across all the subs.

This is a common refrain I've heard going back well before Reddit ever existed.

I helped moderate the Vault Network boards for a while back when they were a thing.

It's hard to overstate how amazingly... disturbed and vitriolic some of the individuals of a community can be. And dishonest. And spiteful.

No amount of transparency helps when all it takes to refute any evidence is to label the mods as lying or "corrupt".

Let's say a user says something bad and you mod the post and give them a warning. You tell them exactly what the offense was and why it was moderated. They might even act civil in response.

Then later you see them talk about how "X mod totally censored their post for no reason and refuses to explain why. X and the rest of the mods are totally corrupt".

So, what do you do? Post a screenshot of the private messages exchanged (something, for instance, we weren't supposed to do)? Take a screenshot of a browser window with the "mod view" (a.k.a. uncensored) of the original post? Something said user will point out can be easily edited, because, well, it's a browser showing a website, not exactly hard to alter.

And that happens every day, constantly. And no one is being paid, and there is a constant stream of other stuff you are trying to stay on top of.

And sure, some of the mods are shitty and "corrupt" in a sense. But I would say it's as much a reflection of the community itself as anything.


As someone who used to moderate a pretty large gaming community back in the day I think there are two things to keep in mind.

The first is that transparency doesn't "fix" hostile community members. What it does is justify the actions of those in power to the rest of the community. Without this the community will very quickly lose faith and the situation just devolves into factions and hostility.

This is very indicative of your last line really. It's like any form of government. When the people lose faith in those in power due to perceived capricious or opaque behavior there tends to be a lot of civil disorder. This usually leads to those in power entrenching themselves and enacting even more draconian measures. It's a vicious circle. Perhaps all governance is subject to the laws of entropy and will eventually fail, but I believe transparency, consistency, and effective communication are the only methods to slow such an eventuality.

The second point is one of size. As the population of a community grows and the proportion of those who wield power shrinks you also end up with a lot of discontent. Many community members will no longer feel they can effectuate change as they're a small voice amongst many without any real connection to the small group in power.

Also note that my experiences are in relation to fairly tight knit communities. Reddit is a little different in the sense that plenty of subreddits are far less communal. Effective communication is very difficult when the community is mostly transitory right up to the point where mob mentality takes over.


Yeah, you definitely raise some good points here.

> When the people lose faith in those in power due to perceived capricious or opaque behavior there tends to be a lot of civil disorder.

I would be really interested in figuring out how to combat "perceived opaque behavior" especially when the source of the complaints is more artificial, where the complaints are being used as a tool for manipulating the community rather than being based in an actual grievance.

That's the reality you run into sometimes, like that oft-quoted line from the Batman movies: "Some men just want to watch the world burn."


> how to combat "perceived opaque behavior"

As the GP said, transparency for the punished alone is not sufficient. Governance must be transparent to the public, otherwise it will sow distrust. The behavior you described isn't "perceived" to be opaque; it is opaque.


I've been thinking about what I would do if I were a forum moderator, and I came to the conclusion that I would have to implement a minute-keeping rule like actual governments have. In the UK, for example, you have the 30-year rule, which says that all the minutes of cabinet meetings get released after 30 years.

Maybe you could keep complete raw backups of the "minutes" (or mod logs in this case). So you would take a backup every 24 hours, encrypt it, and upload it to an independent append-only server (which proves you couldn't tamper with it afterwards). And after e.g. 1 year (let's reduce it from 30 years), you would release the encryption key to that mod log backup you made. This ensures transparency and trust.
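A minimal sketch of the tamper-evidence half of this scheme, using a hash commitment in place of encryption (publishing an encrypted blob and releasing the key a year later achieves the same effect); the field names here are illustrative:

```python
import hashlib
import json

def snapshot(entries):
    """Deterministically serialize a day's mod-log entries."""
    return json.dumps(entries, sort_keys=True).encode()

def commitment(blob):
    """Digest to publish immediately: it commits the mods to the
    log's exact contents without revealing them."""
    return hashlib.sha256(blob).hexdigest()

# Day 0: take the backup and publish only the digest.
log = [{"action": "remove", "post_id": 123, "mod": "alice", "reason": "spam"}]
blob = snapshot(log)
public_digest = commitment(blob)

# A year later: release the raw log. Anyone can re-hash it and check
# it against the digest published a year earlier, so the log provably
# wasn't edited in the meantime.
assert commitment(snapshot(log)) == public_digest
```

The same property is what the append-only server buys you: once the digest (or encrypted blob) is out of the mods' hands, any later edit to the log is detectable.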


That sounds like a job for public automated mod logs.

Naturally people will still complain, since it's impossible to fix people, but I'd imagine having an "authoritative" list of every moderator action and the accompanying explanation would help stave off the corruption/lies accusations. That, combined with a reputation of transparency.
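For what it's worth, the data side of such a log is simple; a hypothetical sketch as an append-only JSON-lines file (the function and field names are my own, not any real platform's API):

```python
import json
import time

def log_mod_action(logfile, mod, action, target, reason):
    """Append one moderator action, with its explanation, as a JSON
    line to an append-only public log file."""
    entry = {
        "ts": time.time(),   # when the action happened
        "mod": mod,          # who acted
        "action": action,    # e.g. "remove", "warn", "ban"
        "target": target,    # the affected post or user
        "reason": reason,    # the accompanying explanation
    }
    with open(logfile, "a", encoding="utf-8") as f:
        f.write(json.dumps(entry) + "\n")
    return entry
```

Serving that file read-only is the whole feature; the hard part, as the thread notes, is whether anyone believes it.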


> That sounds like a job for public automated mod logs.

I mean, I think that would be an interesting experiment, as I can't think of a large community that provides such data.

But again, if the underlying assumption is "the mods are corrupt", then that accusation can easily be transferred to whatever logs are provided as well.


The Something Awful forums, which have been around for a very long time and have been relatively successful in maintaining a decently sized community, have a public moderation log[0].

Something Awful also has a number of other moderation features that I think more sites should emulate:

1. Temporary bans from posting ("probation") of variable length, to allow for punishment short of a full ban. Usually 6 hours to a couple days, depending on the offense, occasionally a week or longer.

2. A nominal registration fee ($10, one time) to register an account, to cut down on bad actors just making new accounts.

3. Normal bans for being a dick are reversible by paying $10 (same cost as registering a new account), unless you get "permabanned" for either repeated bad behavior or posting illegal content. If you get permabanned, any new accounts you create get permabanned as well (assuming the admins can find them, which they do remarkably effectively using IP and I think payment info).

That last point sounds like it incentivizes the mods to ban users, so that the forums get more money. But it doesn't seem to actually have that effect, possibly because most of the mods are not paid.

There have also been a few interesting experiments in moderation that were less useful, but are definitely entertaining, such as the ability to limit an account to a single subforum (usually the one for posting deals, or one of the ones for shitposting). It's also possible to view a user's "rap sheet" of moderation actions from any of their posts.

[0] https://forums.somethingawful.com/banlist.php


Mods apparently inflict their PTSD from bad posters on good posters, given the common ban-first, ask-questions-later approach.

I really don’t care how shitty another poster was to you. I only care how shitty you are treating me.


While I am sure that occurs, I have personally been banned from several subreddits not for rules violations or violent comments, but because the moderator simply disagreed with my comment, or because I was harsh in my commentary about the product the subreddit was about. For example, a certain web browser subreddit banned me because I talked badly about a policy of that web browser.

I have much more experience dealing with corrupt and biased mods than with anything else.


Just to provide the other side of this.

Imagine seeing a post exactly like yours, literally word for word, with the same civil tone and all.

Except the user in question had been banned for posting a tubgirl image (it's gross) by a moderator who happened to be female, and had responded in private messages on a clear alt account (minutes after the ban, with the same IP address) calling that moderator a slut who deserved to be raped and killed.

And that style of interaction being relatively common.


Imagine treating a civil poster as if they were a jerk, just because a different civil poster once turned out to be a jerk.


It's not about treating people like a jerk; it's just that you learn quickly to never trust a person at face value when they say "I didn't do anything wrong", because that's usually the first and most common phrase a person who did something wrong uses.

I was merely pointing out that the "public" persona a user portrays doesn't have to match the truth.


Which can be equally applied to moderation staff: mods that say "We are not biased, and always are fair, only banning people that break the rules" are just as likely to be bad actors as the regular users in your narrative.


Definitely. I mean, in the vast majority of cases, the moderators are just "regular users" who are given (or chose to take, in the case of creating a subreddit) power over other users. Even if the moderators are employees they are still fallible, with personal motives, just like the people they moderate.

In the end, that's why it's such a difficult problem. You take the normal conflict that occurs in communities, add in the potential for malicious actors on both sides, and it's no surprise that the normal conflicts can spiral out of control. Especially in the virtual and relatively anonymous setting of online communities.

And from a person on the outside looking in, it can be impossible to actually know what the truth is.


Not sure how that invalidates the equally, if not more, common occurrence of banning based on philosophical, political, or other disagreements with the moderation staff.


> Maybe if mods were more transparent across all the subs.

We're (r/relationship_advice) rarely transparent with removal reasons. Our templatized removal reasons generally look something like this:

> u/[user], please message the mods:

> 1. to find out why this post was removed, and

> 2. prior to posting any updates.

> Thanks.

or

> User was banned for this [submission/comment].

The reason is that we have a high population of people who:

1. post too much information and expose themselves to doxxing risks, and

2. post fantasies that never happened.

So in order to protect the people who inadvertently post too much information, we tend to remove these posts using the same generic removal template. However, if people knew that a post was pulled for one of these two reasons, the submitter might still end up on the receiving end of harassment, so we have to fuzz the possibility by withholding removal reasons much more broadly.

This is specific to r/relationship_advice. Other subreddits have other procedures.


> Maybe if mods were more transparent across all the subs.

Don't get me started.

Particularly "muting" posts with auto-moderator (silently hiding them for others without notification/warning/explanation). It was originally created for spam control but is regularly abused for generic moderation. It needs more controls placed on it.

To give a recent example, I wrote a long reply in /r/fitness's daily discussion, but it was muted because it contained an offhand remark about COVID-19 (vis-a-vis getting hold of fitness equipment right now). Why are they muting all comments that contain "COVID"? Who even knows, but the /way/ it was done was pretty irksome and resulted in wasted effort on my part for a comment that violated zero published rules or etiquette.

/r/politics has a huge dictionary of phrases and idioms that result in auto muting. None of which are defined in their rules.


> /r/politics has a huge dictionary of phrases and idioms that result in auto muting. None of which are defined in their rules.

This is true here on HN too. One such word that I've seen cause comments to get auto-killed is "m∗sturbation" (censored for obvious reasons), and I am sure there are others.


Here's the list:

  masturb circlejerk faggot nigger feminazi shitstain lmgtfy.com #maga
...plus a bunch of spam sites. Why those words and not others? Shitstained if I know. These things tend to get added when somebody does something, and then stay there.
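For the curious, a filter like that is usually nothing more than a pattern scan over the comment text; a hypothetical sketch (the pattern list here is illustrative, not HN's actual code):

```python
import re

# Illustrative patterns only; real lists, like the one above, accrete
# over time as incidents happen.
KILL_PATTERNS = [
    re.compile(r"\bmasturb", re.IGNORECASE),   # prefix match, as above
    re.compile(r"lmgtfy\.com", re.IGNORECASE),
    re.compile(r"#maga", re.IGNORECASE),
]

def autokilled(comment):
    """Return True if the comment trips any pattern and should be
    silently killed pending moderator review."""
    return any(p.search(comment) for p in KILL_PATTERNS)
```

Which is exactly why such lists feel arbitrary from the outside: a one-line addition made after one incident applies forever, to everyone, with no notice.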


Plenty of people complain about overmoderation here on HN as well... I am one of those people.


I firmly believe that the only way an online forum, particularly an anonymous one with free signups, can remain relatively civil is very heavy-handed moderation. This is not a free speech zone (it's a privately owned space), and the only way to prevent bad actors is to be very liberal in applying heavy moderation.

If a few "medium" actors get banned by accident, that's the price to pay for the rest of us getting to enjoy discussing tech without dealing with a toxic cesspool.


I strongly disagree. Bad actors thrive just as well if not better in places with moderation and even sign up fees.

Metafilter is a forum which used to be very diverse in opinion and is now basically captured by a small vocal minority. How did they do this? The minority produced a large amount of content for the forum and was very active. 95% of that content was high quality and on topic, but the remainder was biased and very opinionated. As a result of their interaction with the site, they were closer to the mods, regarded more highly, and given the benefit of the doubt in "bad actor" conversations.

Today Metafilter is a dying community of territorial users who eviscerate anyone who doesn't know how to play the game. The minority won and created their little clubhouse corner of the internet. Metafilter as a forum, though, is a shell of its former self. There are fundraisers to keep the site alive, and formerly paid mods and devs are now either retired or working for free. Not only did it drive away old and new users, but the minority also seems to have become bored without constant drama and moved on (as seen by new posts and commenting activity falling off a cliff).

I know the common refrain on HN and in general is "on a private platform there is no such thing as free speech" but be careful. Don't let blatantly toxic users run your platform but beware of users who are quick to call everything toxic.


Every community has its rise and fall, and there is no one solution for all of them, but I still think without heavy moderation that fall comes much quicker and more violently.


Yes, moderation might be necessary, but what the anecdote illustrates is that simply having moderation will not be enough.


>> how meaninglessly violent people become when their posts are modded

> Maybe if mods were more transparent across all the subs.

Does the latter really justify the former?

If you disagree with a moderation decision, take it up with them politely. If you consistently disagree, maybe this community just isn't for you!

Mods are just people donating their time. Even if they're inconsistent or "corrupt", there's no reason you should respond in any way that can be described as "violent".


What if you take it up with them politely and this is their response? https://imgur.com/a/jhmGXzJ


Find or create another community.

There are basically two scenarios here. One, a lot of people agree with you — in which case you should be able to appeal the decision or splinter off successfully. Two, most people disagree with you — in which case the mod is probably right, or at the very least you're simply not welcome in the community.

Let's also not forget that we're talking about violence in response to moderation decisions. So even if that's their response, it's still not okay to e.g. threaten them.


The problem with the first suggestion is that the moderator in the above screenshot mods 1000+ subreddits. You can be pretty sure they mod the other subreddits with the same mindset; how many of them do you recreate? This is the whole problem that has caused the drama of the past few days.

> Let's also not forget that we're talking about violence in response to moderation decisions. So even if that's their response, it's still not okay to e.g. threaten them.

Agreed.


> If you consistently disagree, maybe this community just isn't for you!

Ah the classic "don't let others make changes but outcast them" approach.


If you are in conflict with enough other members of a community, you are de facto outcast anyway. And if enough other members agree with you, just break off and form your own community!

We’re talking about online communities — low stakes to join, low stakes to leave. There are exactly zero reasons that anyone should resort to doxxing/threats/etc in response to moderation decisions.



