> "Don't we want companies like Meta to be responsible for misinformation?"
No. This is one of those questions that sounds easy but is hard in practice.
Sure, it's bad for society if a lot of people promote an obviously false idea such as flat earth.
But do we want companies like Meta to suppress that information? That creates its own set of problems. For example, it breeds a sense of persecution, which generates more followers. ("Big tech is censoring me!")
And you can't really automate that, because an automated filter could inadvertently block, say, people discussing the history of ancient cultures and their views of the earth.
So humans have to be involved.
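To make the automation problem concrete, here's a minimal sketch of the failure mode, assuming a naive keyword blocklist; the `BLOCKED_PHRASES` list and the sample posts are hypothetical, for illustration only:

```python
# Naive keyword-based moderation: flag any post containing a blocked phrase.
# BLOCKED_PHRASES and the sample posts are hypothetical examples.
BLOCKED_PHRASES = ["flat earth"]

def is_flagged(post: str) -> bool:
    """Return True if the post contains any blocked phrase (case-insensitive)."""
    text = post.lower()
    return any(phrase in text for phrase in BLOCKED_PHRASES)

posts = [
    "Wake up! The flat earth is real and NASA is hiding it.",  # advocacy
    "Ancient Mesopotamian cosmology pictured a flat earth "
    "beneath a solid dome of sky.",                            # history lesson
]

for post in posts:
    print(is_flagged(post), "->", post[:50])

# Both posts are flagged: the filter can't tell advocacy apart from
# historical discussion, which is why humans have to be involved.
```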
Now throw politics and money into the mix and it gets murkier still. Was it misinformation when people said the state of Georgia had elected President Trump before all the votes were counted? (And the inverse: was it misinformation when CNN refused to call Georgia for Trump even after it became mathematically impossible for Vice President Harris to win it?)
It was not long ago that the COVID accidental lab-leak theory was considered misinformation. Do you want Facebook censoring posts about that? Today the same theory is treated as a real possibility; what was once labeled misinformation is now considered plausible.
Too often tech folks think in binary here: a post is either misinformation or factual information. But reality isn't always so clear-cut. I don't want social media companies being the arbiter of what's true.
>But do we want companies like Meta to suppress that information? That creates its own set of problems. For example, it breeds a sense of persecution, which generates more followers. ("Big tech is censoring me!")
I don't think that's the question. Companies like Meta are already allowed to "suppress" that information by moderating content. But that isn't an enforcement of universal "truth," since a platform can only moderate itself.
I think the question is whether the government should be allowed to determine what is and isn't true and force all platforms to either suppress or publish based on that determination alone. That's where repealing Section 230 leads: once platforms are legally liable for user content, courts and regulators end up deciding which speech must come down.
>It was not long ago that the COVID accidental lab-leak theory was considered misinformation. Do you want Facebook censoring posts about that? Today the same theory is treated as a real possibility; what was once labeled misinformation is now considered plausible.
OK? And were people sent to the gulags because Facebook banned discussion of it? Did it even hinder popular discussion of the lab leak theory in any significant way? From what I recall, it didn't.
>I don't want social media companies being the arbiter of what's true.
They aren't, and never have been. But you do seem to want government to be the arbiter of what's true, and that seems worse to me. Yes, mistakes will be made in any attempt to adjudicate truth and misinformation, but between being banned from Facebook and having men with guns knock down my door for publishing illegal facts, I'd rather have the former.
It is, though. The parent asked, "Don't we want companies like Meta to be responsible for misinformation?"
I am responding to that question, so yes, that is the question.
> "And were people sent to the gulags because Facebook banned discussion of it?"
Must there be gulags involved for free speech to be suppressed?
And yes, it did hinder discussion of the lab leak theory. Dozens of news outlets suppressed this theory, even mislabeling it as a conspiracy theory and ostracizing those who held it.
> "They aren't, and never have been. But you do seem to want government to be the arbiter of what's true"
Wow, either there is a serious miscommunication here or you are deliberately misreading my post. I don't want government or social media companies to be the arbiters of what's true. I don't want the current or future administration mislabeling critiques of the government as "hateful" or "misinformation". Nor do I want Zuck, Pichai, or any other tech company suppressing speech because it is unpopular, hateful, or misinformation. We cannot entrust government or corporations with guarding our rights to free speech.