>It makes sense for Facebook to show people content they want to see.
The problem is, Facebook doesn’t show people content that they want to see. They show people content that they will engage with. That’s a very important distinction.
HN's algorithm/moderators actually explicitly do the opposite: if a thread gets too many comments too quickly, it's ranked downward. The assumption is that too many comments too quickly indicate a flamewar, and the HN moderators want to keep discussion civil. The approach Facebook takes is to "foster active discussion," which on the Internet typically means a flamewar. Nothing generates engagement like controversial political views. So that's what Facebook's algorithm/moderators show to their users.
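As a rough illustration of the difference (purely hypothetical scoring formulas, not HN's or Facebook's actual ones), engagement-optimized ranking rewards comment velocity, while flamewar damping penalizes it past a threshold:

```python
# Hypothetical ranking sketch -- neither HN's nor Facebook's real formula.

def engagement_rank(upvotes: int, comments_per_hour: float) -> float:
    # Engagement-optimized: more rapid commenting -> higher score.
    return upvotes + 2.0 * comments_per_hour

def flamewar_damped_rank(upvotes: int, comments_per_hour: float,
                         threshold: float = 20.0) -> float:
    # Past a comment-velocity threshold, assume a flamewar and
    # penalize the thread instead of boosting it.
    score = float(upvotes)
    if comments_per_hour > threshold:
        score /= 1.0 + (comments_per_hour - threshold) / threshold
    return score

calm = (100, 5.0)     # 100 upvotes, 5 comments/hour
heated = (100, 60.0)  # same upvotes, 60 comments/hour

print(engagement_rank(*calm), engagement_rank(*heated))            # heated thread ranks higher
print(flamewar_damped_rank(*calm), flamewar_damped_rank(*heated))  # calm thread ranks higher
```

With identical upvotes, the first scorer surfaces the fast-moving thread and the second buries it, which is the distinction being drawn here.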
Facebook absolutely is a social conditioning tool, it’s designed from the ground up to show people content that stirs their emotions enough to click “like” or the mad face icon or even leave a comment and wait around until someone replies back.
My point is that this is what happens in real life. People will continue to engage with stuff that they want to engage with. Facebook doesn't force people to engage with anything.
I think it is far worse to attempt to condition people by showing them things *they wouldn't otherwise engage with*. What is scarier: showing somebody something they want to engage with based on their past behaviour, or showing somebody something they wouldn't have otherwise seen if you hadn't gone out of your way to shove it down their throat?
People seem to want Facebook to make people more placid. Oh, you have extreme views? Here, let's condition that out of you by only showing you more moderate stuff. Oh, you think x is bad? Let's not show you anything to do with x, so that you'll hopefully forget about it and not engage with that part of your brain any more.
Like I've already said, this alternative is far more Orwellian and far more of a tool for social control, than simply optimising for engagement.
> I think it is far worse to attempt to condition people by showing them things *they wouldn't otherwise engage with*. What is scarier: showing somebody something they want to engage with based on their past behaviour, or showing somebody something they wouldn't have otherwise seen if you hadn't gone out of your way to shove it down their throat?
I don't think that makes sense, and I don't think that's what anyone's advocating for.
If you friend someone, or follow a page, or whatever, you are explicitly saying "I want to hear what this person/group has to say". You aren't saying "I want FB to carefully curate what this person/group says in order to increase my engagement with FB". FB shouldn't promote, hide, or reorder anything coming from someone I've explicitly chosen to follow. It should just show me all of it and let me decide what I do and don't want to see.
That's no distinction at all, what people engage with is just one effective way of measuring what people want to see. HN simply optimizes for something else, that's no less of a social conditioning tool than optimizing for engagement, just in a different direction. You could say that it's designed from the ground up to show people content that stirs their curiosity enough to comment cautiously, or to hide content that stirs their emotions enough to engage strongly.