
My idea for a quick fix is to remove likes/dislikes/emoji reactions. If people can't see how others feel about a post/article in an instant, they might have to read and decide for themselves how to feel.


Upvotes on HN/Reddit do essentially the same thing. I don't have the user statistics, but I'd imagine the typical HN reader only reads a sampling of the top-voted comments, which essentially allows them "to see how others feel about a post/article in an instant."

If you want an example of the opposite approach, you can always look at 2009-era YouTube, when all comments were shown in strict "newest" order. If you remember, YouTube had a bit of a reputation for some of the lowest-quality comment sections anywhere.

There simply is no "quick fix" for clickbait, aside from altering basic human psychology, or some type of (probably complex) regulation.


What I've noticed a lot on sites like HN and Reddit is that there's always heavy 'tilting'. If any comment ever drops below +1 (or below that of a different viewpoint, so +10 vs +1), people just blindly start ramming the downvote button without any thought, as if they aren't capable of analyzing the comment on its own merit. 'The crowd' has already decided for them, and they take the path of least resistance in terms of mental effort. This is also why 'brigading' is so tremendously effective: you only have to tilt the balance a little bit and the crowd will take care of the rest.


For what it's worth, I always upvote downvoted comments when they are contributing something novel to the thread and/or taking a position which is explained and justified, and of course are civil. Even when I wouldn't have otherwise upvoted them, or when I don't agree with them.

That's just to help get them back to neutral, and avoid their point being grayed out and thus probably more likely skipped over by others. I think everyone should be allowed to put across their point, and we all end up better for seeing those different angles.

I quietly hope that others who agree with this also do the same, thus making the problem you describe a little less pronounced. We all need to play our role in creating the kinds of online communities we want to see.


I am not sure how true that is. I have had many comments go negative and still end up positive.

I think people are more likely to think about voting when they see a downvoted post.

PS: I also feel some net negative posts are a good sign that someone is not simply voicing the crowd's opinions.


I've caught myself before sometimes being more careless with the downvote button if the comment was already grayed out. After I realized this risk, I now attempt to be more mindful.

So it is a thing that happens, at least to a certain extent.


> YouTube had a bit of reputation for some of the lowest-quality comment sections anywhere

Has this reputation changed much?


I find top YouTube comments more or less funny/enjoyable now. The worst comments get downvoted, so I no longer see them unless I deliberately go looking for them. It's very different from YouTube comments back then, though the top comments are still mostly unhelpful/uninformative.


This might be true for some videos, but there seems to be a first-in approach behind the sorting: the first person to say something remotely relevant is often the top comment for a long while. I can't see much quality in those posts usually. Newer posts barely have a chance to get to the top unless you're a YouTube influencer of some kind.


Reputation and kill switches. Ultimately: editorial voice.

If I've got tools to block sources of clickbait, and those feed into reputation systems, that's a start. On Google's properties, I can block G+ posters, but I cannot block a YouTube channel -- not even if I'm logged in; that channel will still show up in searches and recommendations. I'm actually better off not logging in, so that recommendations are keyed far more to current search and view history, with no long tail of "you watched this once three years ago so we'll cram it down your throat".

For Web, I've taken to fairly ruthlessly blocking clickbait domains at the firewall. They pretty much completely disappear.

(And, if I've got second thoughts or want to poke through: archive or simplified-view sites will generally show me the content. And it's almost never worth looking at.)
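The domain-level blocking described above can take many forms (firewall rules, DNS sinkholes, hosts files). A minimal sketch in hosts-file format, with placeholder domain names rather than any real blocklist:

```shell
# Build a hosts-format blocklist from a plain list of domains.
# The domains here are placeholders for illustration only.
printf '%s\n' clickbait.example outrage.example > blocked_domains.txt

# Null-route each domain; the result can be appended to /etc/hosts
# or fed to a DNS sinkhole such as a local resolver's block list.
awk '{ print "0.0.0.0 " $0 }' blocked_domains.txt > blocklist.hosts

cat blocklist.hosts
```

The same list could instead be loaded into a firewall or DNS resolver, which is closer to what the comment describes; the hosts file is just the simplest self-contained variant.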

I'm not saying that my individual tools will scale out, but if they're tied in to larger systems, and feed back to what's promoted, there's the possibility of this.

Ultimately, though, massive media channels have to take an active voice in what they do or don't permit, most especially where there are real and serious harms possible. That's the editorial voice option. It's not a popular opinion on much of the Web, and it's not one I would have advocated even just a few years ago, but I'm coming to feel it's increasingly proving necessary, and that the historical record, over the past 500 years of mass communications, starting with the printing press, bears this out repeatedly.

Free speech is not a responsibility-free zone.


As you point out, upvotes are a useful filter. However, "like"-only voting allows things to go viral that piss off large chunks of your readership. Readers end up being assaulted with emotional appeals devoid of content; many people disagree, but without downvotes their objections simply don't count.


How about collaborative filtering? [1]

I.e., the stuff that is upvoted by people who "upvote like you" is ranked higher. (I would like this feature on HN, as it would eliminate e.g. articles about CSS which I find boring; of course that is personal, but that's the point).

Otoh, there's a danger of "echo chambers"; not sure how to deal with that.

[1] https://en.wikipedia.org/wiki/Collaborative_filtering


> but I'd imagine the typical HN reader only reads a sampling of the top-voted comments

That, and it doesn't help that on threads with hundreds of comments, most of them are hidden away.


Well, that's the same problem that any feed faces - 100s of pieces of content, and the user only wants the best 10-15.


I think they should remove the media content they add to external links and only show URLs. If people had to click a link to see the title, summary, images, or other clickbait meta content, they'd be less likely to share links thoughtlessly. They might also see that the URL is untrusted and choose not to click through. Facebook could charge brands to show their content alongside clickbait media. It would be a lot easier to verify the authenticity of ten thousand brands than every random link that's shared.

Wouldn't solve all the issues but it would be a start.


They'd just be less likely to click, which isn't going to go over well with marketing teams using Facebook for their campaigns.


Passive ranking is subject to the same bias, though. If I click - even if I dislike the content - it contributes positively.

I don't think there's a quick fix for wanting autonomously produced graph-based timeline without the ability to exploit it for clicks, propaganda, etc.


I've been doing this using uBlock Origin's filtering on the social media sites I visit; anecdotally, it makes a significant difference in how much I engage with content that is not highly upvoted. I highly recommend it.
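For illustration, hiding vote counts with uBlock Origin comes down to a couple of cosmetic filters. These selectors match the current markup of the named sites and may break when the sites change; treat them as a starting point, not a maintained list:

```
! Hide submission points on Hacker News (the "score" span)
news.ycombinator.com##.score
! Hide comment and link scores on old Reddit (class names may change)
old.reddit.com##.score
```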


I think we need new signals to decide about content. This issue happens everywhere from top music charts to movie rankings.


It's clickbait; it doesn't matter whether they click "like" or not.


My idea for a permanent fix is for Facebook to go out of business. Completely. Forever.


If there's a market for something like it, there will exist something like it. The interesting question is how to make something like it so uncompetitive that nothing like it can survive.


Create a cryptocurrency which you mine based on limiting social media usage


I upvoted this comment and then immediately felt dirty.


To be fair, HN hides a comment score until the user decides to see it.


How can one see the HN comment score (except on one's own comments)?

I think it was there some years ago, but it vanished, along with the karma-ratio info, when pg stepped back.


You can't. And they were hidden before pg stepped back; in fact, he made a poll asking what people thought of the change: https://news.ycombinator.com/item?id=2595605


I was under the impression that you could see it when clicking the "[-]" at the right side of the comment metadata. But it seems it's just the number of child comments, my mistake.



