Hacker News
[flagged] Just How Fucked Up Is Texas’ Social Media Content Moderation Law? (techdirt.com)
27 points by Vladimof on May 18, 2022 | 54 comments


I posted this because I like this law... large corporations shouldn't be in the business of deciding what speech is acceptable...


Contrary to your perspective, why should a business be forced to carry speech it doesn't agree with? Should a newspaper allow anybody to have a say in its op ed section? Are there any exceptions to your rule of not moderating, such as illegal content or violence against specific people? If you want free speech, go to the government and/or host your own, nobody is stopping you from having your own megaphone on the street corner.


Businesses are not people and shouldn’t have the same rights. Hard no from me on the concept of corporate personhood.


You don't believe businesses should have the right to free speech, because they are not a person. Fair enough. Do you believe religious groups should have the right to free speech? Religion isn't a person. If not free speech, what about the right of assembly?

What about the press? The press isn't a person, either. What about political parties? What about your family? What is the limit? Two people? Four? How many people does one need to gather together under common interests before their rights are nullified, and replaced by some other set of rights governing "entities" like businesses, religions, political parties, etc?

There's a very good reason the First Amendment refers to "the people" and not individual persons - it makes no sense to deny the existence of rights in aggregate. Yes, the press has rights, despite being an abstract entity, because it - along with religious groups, political parties and businesses - represents a collection of people, and its rights are thus synonymous with the rights of the sum of the individuals making up those entities.


> Contrary to your perspective, why should a business be forced to carry speech it doesn't agree with?

A business chooses to offer a 'carrying speech' service to the public, so it should offer that service in an impartial and fair manner. That is a pretty basic consumer protection approach (at least in Europe; I have the impression that the US has lower standards).

> Should a newspaper allow anybody to have a say in its op ed section

That is a completely different, incomparable case. Social networks publicly offer their service to anybody, while op-ed slots are individually offered at the newspaper's discretion.

I am ok with social networks using their discretion on which messages they promote/recommend (as that is in principle a discretion-based decision), but not for mechanistic functionality like 'distribute message to followers'.

> Are there any exceptions to your rule of not moderating, such as illegal content or violence against specific people?

It is acceptable if a service has rules and enforces them, but then it should also be acceptable to sue a service that does arbitrary moderation (either moderation exceeding its own rules, or selective enforcement of those rules).

Also, there is an argument that such rules should not be discriminatory.


>Contrary to your perspective, why should a business be forced to carry speech it doesn't agree with?

A publisher isn't forced. They choose the speech they like and censor what they don't like, and they are liable for what they say. So when CNN, for example, published lies about that MAGA kid... the MAGA kid got a nice payout from them.

A platform, though, isn't a publisher and isn't liable for content; they get this liability protection by agreeing they cannot censor speech. The only allowed censorship is reasonable, good-faith moderation. So when Twitter hosted extreme examples of lies about the MAGA kid, he couldn't sue Twitter.

If these platforms could censor speech, they could operate exactly like a publisher but without liability? That's not how it works. It would be a disaster if publishers could escape liability by pretending to be platforms. So yes, you must force platforms to accept all speech with very limited good-faith moderation.


That is a common misconception. At least under US federal law, there is nothing special about a platform. There is no distinction between a "platform" and a "publisher".


I'm not sure how to interpret that. What do you mean by "platform"?

I would have thought the normal use of "platform" would be similar to "interactive computer service" and Section 230 is very explicit: "No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider"

In essence the point of section 230 is to allow social media platforms to provide content while not being considered the publisher.

https://www.eff.org/issues/cda230

https://www.law.cornell.edu/uscode/text/47/230


The normal use of a term is legally irrelevant. Section 230 applies equally to all online services, regardless of whether they are social media platforms or not. The law dates back to 1996 so obviously the point of the law wasn't specific to social media.

Again there is no distinction between "platform" and "publisher". Go read the actual law.


> Again there is no distinction between "platform" and "publisher".

Not quite right. Section 230 says that online services are not considered publishers when they provide services that allow other people to publish information online (ie, when they act like a platform, though the law doesn't use this term).

However, they are considered publishers of content they create themselves.

Thus the distinction between publisher and platform is who creates the content.

> No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.


Of course normal usage doesn't matter. But do you have a legal definition of the word "platform"? If you don't have a legal definition, you're just spewing words.

There is a legal distinction between an "interactive computer service" - which covers any normal usage of "platform" - and a publisher. That's what the actual law says. Section 230 is surprisingly legible if you haven't read it yet.

Do you remember 1996? There were a lot of systems then that would absolutely qualify as social media today, even if the term wasn't used then. Those things are what section 230 addresses.

You seem to be making some sort of pedantic strawman argument and I can't tell what you're really trying to convey. Instead of phrasing it as a negative, is there something you're trying to positively affirm?


The law doesn't use the term "platform" so your point is moot.


>That is a common misconception. At least under US federal law, there is nothing special about a platform. There is no distinction between a "platform" and a "publisher".

Ok so how do you differentiate the liability protection Twitter has and CNN doesn't?

Also why do you believe Judge William Orrick of San Francisco is wrong? There is literally case law confirming the distinction between platform and publisher.

Note, this is the problem with the 'disinformation governance board'.


The liability protection in Section 230 relates to user generated content. CNN formerly allowed such content on their web site in the form of user comments on news stories, but they removed that feature in about 2014 because it wasn't profitable.

Which specific case law are you referring to? Please provide an actual citation because I don't think anything in my comment contradicted an opinion by Judge Orrick.


> Should a newspaper allow anybody to have a say in its op ed section?

It's completely different... one allows everyone and then bans the ones they don't like...


Common carrier laws already force some businesses to carry speech they don't agree with. I see no problem with extending those laws to cover social media companies, as suggested by Justice Clarence Thomas.

https://www.npr.org/2021/04/05/984440891/justice-clarence-th...


Businesses have been forced to bake cakes. Precedents have been set.


The Supreme Court overruled that about four years ago.

https://www.scotusblog.com/case-files/cases/masterpiece-cake...


If you're murdered and the court finds you shouldn't have been murdered, that doesn't help you out much.


You know that went the other way? The precedent is that businesses don't have to bake cakes based on religious viewpoints.

https://www.bbc.com/news/world-us-canada-44361162

Precedents have been set exactly the other way. This law looks like conservatives having a hissy fit that this time around market forces are keeping them from dominating all conversations, so they're making laws to enforce their social dominance.


I'm pretty sure that's not true. If it's the case I'm thinking of the result was exactly the opposite. Do you have more details on what case you're referring to?


"Just build it yourself" is a common lie, often told by people who either don't know or don't care that even domain registrars and backbone internet providers have been forced into censorship by activists.

Given you might be in a position where nobody will sell you bandwidth and nobody will register your domain name, "just build it yourself" becomes "just build an entire network infrastructure provider, domain registrar, hosting company and website". Don't forget that Parler, a relatively milquetoast 'free speech' website, was kicked off some pretty core internet infrastructure and was denied access by dozens of others. "Just build your own AWS and internet backbone!" is now your argument.

As others have pointed out, the pro-censorship activists generally also hypocritically support free speech when it comes to their issues -- such as forcing private businesses to bake cakes they disagree with, or forcing retail companies to stock books with certain themes.


> As others have pointed out, the pro-censorship activists generally also hypocritically support free speech when it comes to their issues

A few months ago I would have called the "pro-censorship" label hyperbole. The folks who support censorship on social media have a few decent, good-faith-sounding arguments in favor of censorship. The primary argument being something like "I'm not in favor of censorship, but censorship by corps isn't unconstitutional!". But as soon as it seemed clear that Elon would buy Twitter, the mask came off and those same people who claimed they weren't in favor of censorship were openly bemoaning the end of the pro-censorship regime.


> Don't forget that Parler, a relatively milquetoast 'free speech' website, was kicked off some pretty core internet infrastructure and was denied access by dozens of others.

Parler[0] and Gab[1] are still alive and well. In other words, despite being kicked off, they still managed to survive.

[0]: https://parler.com/

[1]: https://gab.com/


Except the bill's author has said that basically anything covered by section 230 is exempted from the bill.[0] In other words, the bill's own author has said it is legally meaningless.

[0]: https://www.techdirt.com/2022/05/17/author-of-texas-social-m...


Put a gun to my head if you don't want me to block "I made $2051 working from home" from my blog comments. I have a right to remove that.


Yes. Spammers/advertisers won't be able to resist externalizing the cost of their compulsive lying. This kind of law is the death of topical forums.


This isn’t about who gets to speak, it’s about who gets a free megaphone.


I keep hearing this, but if everyone has a megaphone, I don't see the issue... it's not like you have time to listen to everyone. In other words, you get flooded with information and it isn't really a megaphone anymore...


You only have a megaphone when your post gets amplified by ‘the algorithm’ and widely shown.


> kwertyoowiyop - You only have a megaphone when your post gets amplified by ‘the algorithm’ and widely shown.

Are you saying that the companies ban people because the company itself decides to promote those users more than others because of their own algorithm?

Just get rid of the algo? (They want the algo to control speech, but it looks like it's failing on them...)


I read the actual bill and find it fantastic. I wish a similar law were passed in my jurisdiction.

All it does is establish Section 230 - which isn't being enforced at the federal level - at the state level, so theoretically no change will occur there. Then it goes on to say you can't censor politics or religion. Frankly, you cannot be upset about this unless your position is that you intentionally wish to censor politics or religion. Which yes, they do; Twitter especially so.

Which is quite stupid in my book. Whatever side you are on in your country, your political opponents are still your team. If your policies or intentions are to harm your political opponents... you're breaking that team. When the teams are broken, maintaining the peace by force can only be done temporarily. Inevitably the polarization from maintaining the peace by force will become impossible to sustain. Which is where we are now.

I've been talking to many people on both sides in the USA and I also see the complete lack of any attempt at fixing this; or worse, you have the ministry of truth coming along to inflame and make the situation far worse.

So what do you plan to wear to the civil war?


Forcing a business to operate in a given state? Ordinarily I’d think ‘let’s have the Supreme Court weigh in on that one’ but given how things are now I no longer have that wish.


This law will be the death of topical forums. I mean, one man's off-topic is another man's censorship. We saw that on Usenet back in the late 1990s/early 2000s.

If you can't remove posts for being offtopic, all forums are just a sea of unrelated, meaningless things.


As a private company, don't all of the mentioned platforms have the option to just boycott the state entirely? Just say no Texas IP addresses will receive access, so as not to trigger any frivolous lawsuits. I imagine if any single one of the mentioned 50M-user companies threatened to pull out like this, there would be such massive public outcry that this law would be immediately reversed. And for those who say that Texans are a large part of the userbase, I'll just mention that the EU (and others) require various forms of "censorship" (hate speech, calls to violence...). So really it's most of the world vs. just Texas on this.
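For illustration, the kind of geoblock described here is a few lines of request-filtering logic. This is a hypothetical sketch - all names are made up, and a real service would resolve IPs to regions with a GeoIP database (e.g. MaxMind's) rather than a hard-coded table:

```python
# Stand-in for a real GeoIP lookup; these are documentation-range IPs
# (RFC 5737) mapped to made-up region codes for the sake of the sketch.
IP_TO_REGION = {
    "203.0.113.7": "US-TX",
    "198.51.100.4": "US-CA",
    "192.0.2.9": "DE-BY",
}

BLOCKED_REGIONS = {"US-TX"}


def allow_request(ip: str) -> bool:
    """Deny requests whose IP geolocates to a blocked region.

    Unknown IPs fail open (allowed), which is one reason IP-based
    geoblocks are trivially evaded with a VPN.
    """
    return IP_TO_REGION.get(ip) not in BLOCKED_REGIONS
```

Technically trivial, in other words - whether doing it would itself count as "censorship based on geographic location" under the law is the open question discussed below.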


> As a private company, don't all of the mentioned platforms have the option to just boycott the state entirely?

According to the author of this article, no.

"I haven’t even gotten to the bit that says that you can’t “censor” based on geographic location. That portion can basically be read to be forcing social media companies to stay in Texas. Because if you block all of your Texas users, they can all sue you, claiming that you’re “censoring” them based on their geographic location."

I'm not a lawyer, so I'm not sure that argument would actually work in court, but it might?


The law forbids that, saying that

> A social media platform may not censor a user, a user’s expression, or a user’s ability to receive the expression of another person based on…a user’s geographic location in this state or any part of this state.

Removing their ability to use the platform would be geographic censorship under the law.


But considering the internet is a nation-wide thing, would the state even have jurisdiction? If you never employ in a state and actively prevent access to the website from said state's residents, are you even doing business there? Sure, if FB employed Texans who work remote, they can't avoid it, but if they had no physical or even virtual presence?

Because, in order to sue an out-of-state entity, you need to sue across state lines. That takes it out of state court jurisdiction and puts it in federal court. And in federal court, only federal laws apply.

Is there any Supreme Court precedent for this situation?


And what is the possible enforcement mechanism?


What if they did it based on the user’s state explicitly, rather than their geographic location?


There is some blurring between moderation and censorship here. There are plenty of forums and entire platforms that cater to your favorite religion and politics. Free speech absolutists would have us turn every platform into 4chan.


The 50M-people provision just means that social media companies only need to create a "Texas Edition" offshoot that only allows 49M users at a time. This would not be discrimination based on location, because it would only allow Texans to sign up. Nobody outside of Texas would have standing to sue.


They can create a Texas edition, but they can't make their non-Texas edition unavailable in Texas. So they'll still be subject to law suits based on their main edition.


I believe they could. Twitter: Texas Edition is still Twitter. Texans are not being deprived of Twitter as a service for their geographical location, only being served a different version of it. Which is already true: the engagement algorithm already biases toward local content.

Besides which, Twitter need only open a nominal office in each state, make 50 different Twitter editions, and then choose which ones get to talk to each other. All of those little Twitters will have fewer than 50M users, because no state has 50M people.

One thing is for certain: this law will not work out the way they planned. That's inevitable, because the government doesn't know Twitter better than Twitter.

The most likely outcome, though, is that moderation changes so that the rules are different for users with Texas IPs, and the normal "censorship" outcomes apply for anyone not on a Texas IP address.


>A social media platform may not censor a user, a user’s expression, or a user’s ability to receive the expression of another person based on:

So not only are they at fault if they prevent your tweets from being seen, they're at fault if they prevent someone in Texas seeing your tweets. Which would prevent what you're suggesting.


Random thought from a Texan who is a resident of Germany, but still only a US citizen (and one who votes in Texas, because that was my last US residence): how’s that going to work if one of my cousins posts something Nazi-adjacent enough on Facebook that Germany’s well-known (but apparently not well-known enough) law against publishing Nazi-supporting materials kicks in, which obliges Facebook and other services to block said posts being displayed to users in Germany, even if those users are not Germans, but Americans?


That's indeed an intractable problem with cross-border legislation over the exact same content.

My guess is that Facebook is not allowed to serve Nazi content to German IP addresses, but can be sued for removing that same content for Texans.

Freedom of speech remains our greatest blessing and curse.


Your example demonstrates how broken the current system is and why it's good that there are states in the US that are at least trying to kickstart change. Starting with the early days of the internet, which was about connecting the world and the free flow of information/knowledge, we've ended up with a global censorship machine, controlled in places like China by totalitarian states and in the US by dystopian megacorporations that remove everything governments in various jurisdictions tell them to, plus some extra for their own agenda and politics.

That German law is wrong (as are most other such laws) and it's done nothing to prevent political calamity. On the contrary, it can be argued the overly protected German public sleepwalked into another war on European soil, one that could have possibly been prevented if they'd realized the danger posed by Putin's regime earlier.

Free expression can be difficult and a struggle. Free repression on the other hand feels nice and safe for some folks initially. They don't have to deal with all those uncomfortable opinions. In the long run it almost always makes things much worse when you don't talk things out.


> law against publishing Nazi-supporting materials kicks in

If you want to be able to see their content from Germany, just use a VPN located in Texas?


This is your regular reminder Mike Masnick is a paid tech industry lobbyist, not an actual journalist. Check the Copia Institute link in the footer.


LoL... why was this post flagged?


I've seen worse laws. Politicians not understanding how websites count users is a given.

"It also excludes email and chat apps (though it’s unclear why)."

Because they're not common carriers. I can't read someone else's email or chat messages. They are invite or whitelist based and therefore not public (common) information.

"First, even if there is a good and justifiable reason for moderating the content — say it’s spam or harassment or inciting violence — that really doesn’t matter. The user can simply claim that it’s because of their viewpoints — even those expressed elsewhere — and force the company to fight it out in court. This is every spammer’s dream."

Death threats and spam are not opinion. I'm questioning the author's honesty when he suggests calls to violence are protected and make for good court cases. And I don't think spammers are going to court anytime soon either.

"... must “disclose accurate information regarding its content management, data management, and business practices” ... it also is going to effectively require that websites publish all the details that spammers, trolls, and others need to be more effective."

Shadowbanning and throttling come to mind on this one. They are essentially requiring transparent policies, like how HN's [dead] tags work. This might indeed help spammers be more efficient, but who knows.

"“qualified immunity” for blocking certain commercial email messages, but only with certain conditions, including enabling a dispute resolution process for spammers"

This borders on pure bait. 'certain commercial email messages' is known in the industry as spam. Which they are therefore still allowed to block first and resolve later.

The email section of this law is just pure bliss. If enforced: no more bounced emails from self-hosted domains, a resolution procedure finally exists when you're spam-filtered, and there are fewer monopolistic walls around entering the market.

What social media companies are ACTUALLY required by this law to state biannually:

(A) content removal; (B) content demonetization; (C) content deprioritization; (D) the addition of an assessment to content; (E) account suspension; (F) account removal; or (G) any other action taken in accordance with the platform's acceptable use policy

This codification isn't bad. My only concern is not the law but the absolute state of how social media companies operate. We've already seen the weirdest implementation horrors with GDPR, sometimes outright geoblocking the entirety of Europe, which is silly and a minor inconvenience.


No matter how much you dislike it, people are allowed to vote for their own manifest destiny. Also, it's clear from the Twitter leaks and Elon's findings on bots that Twitter attacks anyone not extremely liberal.

So I only expect more counter-reactions from both parties, as the left and right have no middle ground anymore.


In this case this isn't people voting for their own manifest destiny. This is a Texas law that decides a California company must operate in Texas; that while operating in Texas it can't decide its own moderation policies; and that if it does choose to moderate, it can get sued repeatedly in Texas. Oh, and not just over stuff people in Texas post - a person in Texas can sue this California company for censoring something a California person has posted, because the Texas user must be allowed to see what the California person posted.

So to be clear, under this law, if a California person posts a nude photo and Twitter censors it, every single user in Texas can sue Twitter, repeatedly in every different court, under the claim that the California person is being censored for their viewpoint that nudity is ok.



