Hacker News

If you can't afford to pay a sufficient number of people to moderate a group, you need to reduce the size of the group or increase the number of moderators.

Your speculation implies that platforms bear no responsibility for taking on more than they can handle, and externalizes the consequences to society at large.

There are responsible ways to run a community: clear, bright-line, easily understood, well-communicated rules, and sufficient staff to enforce them. I don't know why it's simply accepted that giant social networks get to play these games when it's cold, calculated economics driving the bad decisions.

They make enough money to afford responsible moderation. They just don't have to spend that money, and they beg off responsibility for user misbehavior and automated abuses, wring their hands, and claim "we do the best we can!"

If they honestly can't use their billions in adtech revenue to responsibly moderate communities, then maybe they shouldn't exist.

Maybe we need to legislate something to the effect of "get as big as you want, as long as you can do it responsibly, and here are the guidelines for responsible community management..."

Absent such legislation, there's no possible change until AI is able to reasonably do the moderation work of a human. Which may be sooner than any efforts at legislation, at this rate.



What guideline for community management could possibly avoid being a flagrant First Amendment violation?



