
> Will the services be forced to accept whatever unmoderated feed of filth the linked to service sends their way?

I was kinda wondering that. If a user gets banned, and moves off-platform to continue the behaviour there, but is somehow magically still linked to everyone they were linked with before... has anything really changed? Will facebook be forced to show its users the same nastiness it just booted off, because of the interop necessities?




> Will facebook be forced to show its users the same nastiness it just booted off, because of the interop necessities?

Facebook should be forced to show me the feeds I choose to subscribe to. I'm perfectly capable of deciding for myself whether I find something nasty, and I find the idea of Zuckerberg as an arbiter of morality both absurd and dystopian.


> I find the idea of Zuckerberg as an arbiter of morality both absurd and dystopian

This is a very common characterization that I find frustratingly absurd. Zuckerberg is not "an arbiter of morality" in any respect; the content on Facebook is a corporate business decision, not a reflection of morality.


If FB are blocking things that they deem bad, then they are clearly setting themselves up as arbiters of morality, even if they say they aren't.


Facebook has opinions about the type of content they'd prefer not to facilitate on the Facebook platform. That doesn't make them "arbiters of morality"; it just makes them curators of their product identity.


So they might say "we don't want $X type of content on our platform, as it would negatively affect our product identity".

But that's in fact a moral judgement! It's saying "some people think $X is bad, and we agree with them (or at least choose to pretend to so as not to soil our image)".


I would disagree that it is a judgement based on morality. It is a judgement made by the logic of corporations, which are machines made up of people that take on a life of their own.

While the actions of organizations like this have moral consequences, and the people that are part of it have their own morals, the machinery itself does not use morality to make decisions, except in hindsight as justification.

The moral consequences may be beneficial or harmful; it is all the same to a profit machine that acts in its own interests, which transcend the people who it is made of in the form of workers and users.


Saying that corporations should act to maximise their profits is in fact also a moral judgement.


I never put a value judgement like should on it. It just seems to me to be the dynamic that happens with organizations. They take on a life of their own, independent of the people they are composed of.


It's not one or the other, they can both care about their product identity, but by the same stroke arbitrate morality by doing so. (Product identity takes popular morality as an input).


By that definition, they're no more an arbiter of morality than anyone else who does this.


What do you mean? Not everyone has the power they do.


The point is that if they're simply responding to consumer demand, then it's collectively us who are the censors.


If we were the censor, we would have the government do it (which represents us). Facebook's opinion of what it perceives to be morally dubious does not represent anyone but Facebook itself.


> If we were the censor, we would have the government do it (which represents us).

Well sort of. We can't legally, right? The constitution prevents that. If Facebook is simply censoring the things that consumers ask it to, it is absolutely serving itself (in a profit driven way), but it isn't asserting Facebook's morals. It's asserting something more like "American consumer morals".


Facebook's clients are not individual users; they're advertisers. Facebook is serving what it believes to be advertisers' interests, which may not align with most people's.

Plus, the reason we can't legally is because we agree it shouldn't be done.


> Facebook's clients are not individual users; they're advertisers. Facebook is serving what it believes to be advertisers' interests, which may not align with most people's.

But this is still ultimately consumer morals, since it's about content that advertisers don't want to be associated with because that content would reflect badly on them in the eyes of the consumer.

> Plus, the reason we can't legally is because we agree it shouldn't be done.

We (generally) agree that the government shouldn't engage in censorship, yes. The claim that individuals and groups should not be able to themselves moderate the stuff that appears on their websites is a much more controversial claim.


> the contents of Facebook is a corporate business decision, not a reflection on morality.

Are you suggesting business decisions are exempt from having moral consequences? Are the decision makers somehow less culpable as long as it is "just business"?


Of course business decisions have moral consequences. Does this imply that all business leaders are arbiters of morality, or is it only Zuckerberg--who is just making business decisions like the rest--who is an arbiter of morality?

The answer is, he's not. He's just a greedy person with a powerful company. There are no real moral decisions to be found here, hyperbole notwithstanding.


> Does this imply that all business leaders are arbiters of morality, or is it only Zuckerberg--who is just making business decisions like the rest--who is an arbiter of morality?

Zuckerberg and other social media CEOs are the only ones using moral language to justify censorship, hence the irony of Zuckerberg as moral arbiter.


If yellow journalism can stampede a nation into war (e.g. "Remember the Maine") through presenting a slanted view of the situation, well, Facebook and however it chooses to editorialize what it allows to be presented to its users has a vastly bigger audience than the newspapers of old. There's nothing "absurd" about that.


In the future, as social media gets more powerful, they will be able to determine who gets elected. No one will be able to get elected if they go against them.

Once that happens, Google and Facebook will never have to pay their share of taxes ever again!


I think this gets you into really weird places really quickly.

How do you not end up with a censorship department of the government, that is tasked with writing moderation rules? It probably wouldn't be called a censorship office, but that's what it would be: a government office determining what can and can't be posted on social media. This seems entirely antithetical to the idea of the first amendment.

Do you have any other way to implement your idea of forcing Facebook to show you the feeds you subscribe to?

EDIT: Just saw in your profile that you might not be from the USA - feel free to substitute "the idea of free speech in modern liberal democracies" for "the idea of the first amendment" as needed. :)


> How do you not end up with a censorship department of the government, that is tasked with writing moderation rules?

That already exists: governments already have laws limiting some forms of speech, e.g. copyright infringement or pornography. Obviously, corporations that want to stay in business have to obey those laws.

> "the idea of free speech in modern liberal democracies"

As I've already pointed out, all states already have laws limiting speech. Maybe I think there should be more such laws; maybe I think there should be less; but either way it's orthogonal to whether I want FB to have power to control what I can say and hear.

If my government passes laws that I don't like, citizens can vote them out at the next election. I can't vote Zuckerberg out, he's an undemocratic locus of power.

> It probably wouldn't be called a censorship office, but that's what it would be: a government office determining what can and can't be posted on social media.

As I said, such laws already exist.

> Do you have any other way to implement your idea of forcing Facebook to show you the feeds you subscribe to?

Yes: require all social media apps to provide interoperability through ActivityPub and similar protocols. Make sure there are no legal or technical impediments to people running social media servers (Mastodon etc.) on their own hardware through their home internet connections.
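To illustrate, interoperability via ActivityPub ultimately comes down to servers exchanging ActivityStreams 2.0 JSON documents over HTTPS. A minimal sketch in Python (the actor and object URLs here are hypothetical examples, not real accounts):

```python
import json

# Minimal ActivityStreams 2.0 "Follow" activity - the kind of document one
# interoperable server would POST to another server's inbox endpoint.
# The actor/object URLs below are hypothetical examples.
follow = {
    "@context": "https://www.w3.org/ns/activitystreams",
    "type": "Follow",
    "actor": "https://alice.example/users/alice",
    "object": "https://bob.example/users/bob",
}

print(json.dumps(follow, indent=2))
```

Because the payload is plain JSON, a self-hosted Mastodon instance and a mandated Facebook bridge could exchange follows and posts without either side controlling the other's moderation policy.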

What I actually want is a world where people don't care whether FB censors things, because it doesn't matter as people can and do use other means to communicate. So FB censors stuff and no-one notices because it has no practical effect. That is, a world where FB is stripped of their power.


> What I actually want is a world where people don't care whether FB censors things, because it doesn't matter as people can and do use other means to communicate.

Fortunately, we already live in this world! We have the internet!


Lol, except that internet has the public square embodied in a few big systems.

If it were me, I would have those regulated like utilities. Pre-internet, business was done via phone. Deplatforming = not being allowed to have a phone, and that basically did not happen.


Deplatforming doesn't apply to phones because phone communication is fundamentally different from social media.

And prior to the internet, deplatforming still existed: when would you ever hear anyone make neo-nazi arguments on any of the broadcast tv networks?


Well, broadcast has been gatekeeping on a LOT of fronts for a very long time, nothing new there.

When do we see economic reporting from the labor point of view, for example? We don't on broadcast anymore. And after massive media consolidation, that reporting only happens via indie media, and it's largely via the Internet, discoverable through social media.

The answer to broadcast issues was public access broadcasting, and the likes of PBS. That sort of worked, except for underfunding and the need for big business to underwrite most of the programming.

In terms of social media, what is needed is process. Trying to do business today without it is similar to not having a phone. However, let's set that aside and just look at conflict of interest and speech issues.

The major players are the public square today. Whether that was intended isn't the point. That they are, is.

Back when Alex Jones was deplatformed, a lot of us, myself included, said we were headed down this increasingly difficult road, and here we are. We were also frequently judged to be supporters of Jones. (I am not, but I see his case as high value for this discussion.)

Process needs to be there to manage expectations, provide opportunity for improvement, provide consistency and all the basics needed. In other words, some growing up needs to happen. Costs are higher, risks are higher, and the money is more than good enough.

Without it, what we've got is a very highly arbitrary, quite easily abused system that far too many people depend on for important discussions. Look at the folks advocating for direct democracy. They assume all this is here, working properly, and working for us, not against us.

The neo-nazis were on public access, BTW. I've seen 'em, along with quite a few others, on the mandated access channels.


> How do you not end up with a censorship department of the government

That department of censorship will exist no matter what. Internally, it is called the trust and safety department, at most big companies.

The only question is: do you want the decisions of that department to be limited by judges and rights that are laid out in the constitution, or do you want an unaccountable free-for-all, where they can do whatever they want?

> Do you have any other way to implement your idea of forcing Facebook to show you the feeds you subscribe to?

Well, phone companies are currently forced to allow most people on their network. We could regulate social media the same way phone companies are regulated.


> The only question is do you want… or do you want…

Therein the problem lies. Users have no choice over which moderation policies their feeds are curated by at all. I should be able to choose moderation that I deem acceptable. To continue your analogy to telcos, this is akin to robocall blocking services which are a user decision and often tell you when blocking occurs.


> rights that are laid out in the constitution

Just to be clear, under US law, what you are describing (creating a government censorship department that determines how social media companies are allowed to moderate) would have NOTHING to do with the rights laid out in the constitution - it would in fact need to overcome the companies' constitutional rights to publish whatever they wish, just to exist.

You can argue that it's a good idea anyway, but claiming that government-mandated moderation is somehow enforcing constitutional rights is way, way off base.


> Just to be clear, under US law, what you are describing (creating a government censorship department that determines how social media companies are allowed to moderate

Yes it could.

The point is that if the government is doing the "censoring", or whatever you want to call it, then at the very least we have judges, rights, and similar checks and balances that prevent the government from doing too much.

If something is so bad, that it needs to be censored, then our court system should handle it.


The legislatures are already the "censors", and the executive is the one enforcing "censorship". And there is an "appeal" process, to some degree, in the judicial branch. The legislatures already write the "moderation rules" for entire societies.

Unlike the big companies, who are only beholden to their shareholders - and sometimes really only that one shareholder called Mark - the legislatures are beholden, at least in theory, to the citizens (who vote) in our democracies, and at least in theory are curtailed by things like the US Constitution and Bill of Rights and equivalent constitutional documents and human rights legislation in other places.

Is leaving it to legislatures to make the rules a perfect solution? Certainly far from perfect, in my humble opinion, at least in practice with all the lobbying, corruption and self-interest going on. But it still beats leaving such extremely significant decisions of what can and cannot be said to the rulers of some big corporations.

Facebook, Instagram, Twitter, YouTube, etc. will also never tire of telling everybody that they are not publishers and do not "edit" or "curate" information on their "open" platforms, and thus fall under Section 230 and are not liable. All the while they constantly curate and edit and purge.


Having the government writing censorship rules sounds absolutely dystopian to me.

There are plenty of ways to get information on the internet - just because Facebook says "no porn" doesn't mean you can't get porn on the internet, you know?

But there's only one government (per country, of course). If your government says no porn... What then?


> Having the government writing censorship rules sounds absolutely dystopian to me.

No, because our court system could prevent any censorship that infringed on our rights.

It is better to have our court system stop people from censoring too much, than it is for there to be zero rules at all, as to what could or could not be censored.


> No, because our court system could prevent any censorship that infringed on our rights

No they can't. The legislature would. Do you trust Nancy Pelosi and Ted Cruz to write censorship rules you agree with?


> No they can't. The legislature would

Yes the courts would. Lol. Any laws that infringe on our rights can be struck down by the court system.

That's literally the point of the court system: to strike down laws that infringe on our rights.

> Do you trust Nancy pelosi and Ted Cruz to write censorship rules you agree with?

Hey, at least if they write them, the courts can strike them down, if they infringe on our rights.

At least there is some check and balance there, where our court system, will strike down laws that infringe on our rights.

I'd rather have that than no court system being able to strike any of it down.

If you want to suggest that the court system should be able to strike down FB censorship rules, that would work as well though!


You're simultaneously advocating for a sweeping change in how the courts interpret our rights, while proclaiming that they will continue to protect them.

Currently, the courts have an opinion on censorship that is extreme enough that it protects what Facebook is doing. Today, the government cannot write censorship rules. They all, to use your words, "infringe on our rights". So if you want the government to be able to do that, and the courts to allow some censorship, you have to amend the constitution, and change the recognized legal framework of what our rights even are.

That (broadly) falls to lawmakers, not the courts. Do you trust Pelosi and Cruz (and other lawmakers) to change the very framework of what our right to free speech is, while simultaneously allowing the courts to protect it?


> while proclaiming that they will continue to protect them.

Well, our rights aren't particularly protected from the censorship that happens right now.

There are zero checks and balances on the censorship that already happens. At the very least, it would be better to get the courts more involved, so that they have some say over what censorship can happen.

> Today, the government cannot write censorship rules

Sure they can. They are already doing so. Go look at the laws that cover phone companies. The government already has laws that prevent phone companies from engaging in certain acts of censorship, or whatever we want to call it, on the telephone network.

Those laws don't apply to other things at the moment. But they could, if we changed them, and it would be no bigger an issue than how those existing laws already apply to phone companies.

> you have to amend the constitution

No we don't. All we have to do is take our existing common carrier laws and expand them.

Common carrier laws, as they apply to phone companies, are already constitutional.

> to change the very framework

There is no massive change to the framework of free speech when we already have similar laws that apply to a similar thing: the phone network.

Whatever is happening there already, and already does not infringe upon our rights, could be expanded to cover other things; no massive constitutional change would be needed to apply these existing laws to new things.


> Well, our rights aren't particularly protected from the censorship that happens right now.

Given that I don't believe that it is fundamentally possible for any private entity to infringe on my right to free speech, I'm going to have to disagree with you here.

> No we won't. All we have to do is take our existing common carrier laws, and expand them.

The Cato Institute appears to disagree with you that it's that simple[0]. It's not clear that applying common carrier status to Facebook or Google would be constitutional, because they do not and have never claimed to be neutral, so using the force of law to make them so would be an infringement of the companies' rights to speech and association.

[0]: https://www.cato.org/blog/are-social-media-companies-common-...


> they do not and have never claimed to be neutral, so using the force of law to make that so would be an infringement on the companies' rights to speech and association.

You are not being creative enough with how we could apply these laws.

One big reason phone companies claim to be neutral in the first place is to get certain protections that those laws give them.

They claim to be neutral to get certain benefits.

Phone companies are not claiming neutrality simply because they want to. They do it on purpose, because the law strongly incentivizes them to.

A similar strategy could be used with Facebook: take away certain protections that we currently give them, and only give them back in exchange for them following something similar to common carrier laws.

The most common protection people bring up as one to take away is Section 230.

When smart people talk about repealing Section 230 protections, the goal is not actually to remove those protections.

Instead, the strategy is to make it so that if companies want to keep those protections, they have to meet other requirements, by making the protections contingent on acting a certain way, such as following common carrier rules.

That is the way around being unable to force Facebook to follow common carrier laws. You don't actually force them. Instead, you make the alternative extremely difficult, by making previously necessary protections contingent on following new neutrality rules.


This doesn't make sense. If you repeal 230, Facebook doesn't gain anything from being a common carrier. Repealing 230 solves the problem you seem to be solving, which is that the companies can no longer do any form of moderation.

The problem there, of course, is that both the average citizen and the average lawmaker want the companies to be able to do some forms of moderation. Which brings us back to the original question: Do you trust Pelosi and Cruz to define a reasonable framework for internet content moderation?


> If you repeal 230, Facebook doesn't gain anything from being a common carrier

Yes they do. They would not be able to be sued if they were a common carrier, and in return they would be unable to engage in many forms of censorship.

Without Section 230 protections, existing in their current form would be almost impossible, so they would have to choose the alternative protections: common carrier protections.

Common carrier protections are different from Section 230 protections. If they are put in a position where their only real option is to come under common carrier laws, then that is the point.

> the average lawmaker want the companies to be able to do some forms of moderation

Common carriers are already allowed to do some forms of moderation/censorship. It is simply much much more limited.

And additional forms of moderation could simply be handed over to the user, to choose what they want.

For example, it would be completely allowed for a customer to have some automated calling blocklist, where they choose to block all adult-content phone lines that call them.

> Which brings us back to the original question

No, it does not bring us back to that, because common carriers are already allowed to do very limited forms of moderation, and that is what it would be good to incentivize Facebook into following.

We don't have to have lawmakers do anything except get us into a position where Facebook is strongly incentivized to choose common carrier protections, where they will then be limited to the narrow types of censorship that common carriers are allowed.


> Yes they do. They would not be able to be sued, if they were a common carrier, and they would be unable to engage in many forms of censorship.

If facebook doesn't moderate, they can't be sued anyway. Without section 230, moderation makes them liable, but if they choose not to moderate, they aren't liable.

You're suggesting that by opting into a common carrier status, they gain what?

> Common carriers are already allowed to do some forms of moderation/censorship. It is simply much more limited.

Such as? As far as I can tell, they're allowed to moderate forms of pornographic and harassing content which, to be clear, a non-common carrier would be legally required to remove anyway. The statutes you're referencing basically say that a common carrier still has to do the moderation that everyone else is legally required to do.

> No, it does not bring us back to that, because common carriers are already allowed to do very limited forms of moderation, and that is what it would be good to incentivize Facebook into following.

Except, you're

1. Mistaken about the status of the moderation common carriers can do, as I explained just above and

2. Mistaken about the amount of moderation that people want. Both lawmakers and the average citizen want more moderation than a common carrier is allowed to engage in, which brings us once more back to my question: do you trust Cruz and Pelosi to develop a moderation framework?

I already know the answer. You don't. What you appear to want is a form of no-moderation absolutism that forces these companies to promote all speech that isn't harassment or pornography.

The problem with that is that, from a profit perspective, most users of those systems (and transitively, most advertisers) don't want to be associated with that kind of content.

The practical result would be a mass exodus from these platforms to other, different, new forms of social media that avoid being classified as common carriers (or do things to avoid the broad section 230 issues, perhaps by being even more highly moderated and small-group oriented or totally private, like discord). Maybe that's a win for some people, but if your goal is popular unmoderated platforms, that's not what you'll get, because that's not what consumers want.


> You're suggesting that by opting into a common carrier status, they gain what?

It means that they can't be sued for content on their platform. Without Section 230 protections, they would be responsible for every piece of content on the platform.

And if they then choose to be a common carrier, they also get that immunity from being sued. There are two different ways of receiving this immunity from lawsuits. If you take away way 1, which is Section 230, then their only option is way 2: being a common carrier.

> but if they choose not to moderate, they aren't liable.

The point would be to strongly incentivize them to respect the user's choice of moderation. Common carrier laws are the way people often talk about doing this, but sure, maybe there is a third way that you have found.

> Such as?

Such as moderation that is done at the request of the user, for that specific user only. The phone company can block calls if the user wants it, and common carrier laws don't stop that.

If Facebook were only allowed to do moderation at the request of that specific, individual user (or, of course, of things that are already illegal), that would solve most of these problems where users want moderation for themselves.

If we want to get more detailed, we could imagine allowing categories of moderation that Facebook can engage in if and only if the user opts into them. Then we don't have to define moderation; it can simply be whatever that specific user wants.
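A rough sketch of what opt-in category moderation could look like; the category names and data shapes here are made up for illustration, not anything Facebook actually exposes:

```python
from dataclasses import dataclass, field

# Hypothetical opt-in moderation: a post is hidden only if it falls into a
# category this specific user has chosen to block. The platform applies no
# moderation the user didn't ask for (beyond what's already illegal).
@dataclass
class UserPrefs:
    blocked_categories: set = field(default_factory=set)

def visible_posts(posts, prefs):
    """Keep every post except those in a category the user opted to block."""
    return [p for p in posts if not (p["categories"] & prefs.blocked_categories)]

posts = [
    {"id": 1, "categories": {"politics"}},
    {"id": 2, "categories": {"adult"}},
    {"id": 3, "categories": set()},
]

prefs = UserPrefs(blocked_categories={"adult"})
print([p["id"] for p in visible_posts(posts, prefs)])  # → [1, 3]
```

The key design point is that `blocked_categories` lives in the user's preferences, not in a platform-wide policy, so two users subscribed to the same feed can see different things, each by their own choice.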

> the average citizen want more moderation

That can be done by incentivizing Facebook to give users a choice of their own moderation. Problem solved: they can get that moderation, and we don't have Facebook forcing it on others.

> What you appear to want is a form of no-moderation absolutism

Nope. I want moderation to be left in the hands of the specific user. Therefore, everyone gets what they want, except Facebook I guess.

> most users of those systems

And yet, the phone system works perfectly fine. I don't particularly care if the phone companies are mad about being forced to follow certain laws. We wouldn't have to make laws, if they wanted to follow them.


> It means that they can't be sued for content on their platform. Without Section 230 protections, they would be responsible for every piece of content on the platform.

No! See Cubby, Inc. v. CompuServe [0], which predates Section 230. If you do zero moderation, you are not liable for user generated content period. Section 230 creates a good-faith protection so that if you do choose to moderate some content, you don't become liable for failing to moderate the rest.

The legislative history of Section 230 is literally that without it, companies were liable for user-generated content only if they tried to moderate, and lawmakers wished to encourage companies to moderate content.

> Such as moderation that is done at the request of the user, for that specific user only. The phone company can block calls if the user wants it, and common carrier laws don't stop that.

This is not "moderation" in the common use. In the context of Facebook, this would mean "I'm unable to block another user", which is silly (and perhaps should reveal to you why trying to shoehorn Facebook into common carrier status doesn't make sense).

> That can be done through incentivizing facebook to allow user choice, of their own moderation. Problem solved. Then they can get that moderation, and we don't have facebook forcing this moderation on others.

Facebook already allows users to block other users and pages. All of the options you want are already available. You're suggesting a strict reduction in moderation tools, and claiming that this will solve everything the way everyone wants.

So let me pose a simple question: how does your approach address coordinated misinformation campaigns? Say I come along tomorrow and start spreading lies about how and when voting works. I'm so effective at this that turnout next year is significantly lower because people are confused. I haven't done anything illegal. Can Facebook remove my content? It's actively and demonstrably harming the democratic process by preventing people from voting. What should we do?

> And yet, the phone system works perfectly fine.

I basically disagree here. Spam is a much larger problem on my phone than anywhere else. Plus, due to the implicit attributes of the phone system, things like spam and misinformation are handled by being crimes on the part of the caller. In other words, Ben Shapiro can post whatever he wants on Facebook, but if he were to call every American citizen to share the same information, he would be committing a crime. Not because of the information, but because sharing things widely via the phone system is essentially illegal.

[0]: https://en.wikipedia.org/wiki/Cubby,_Inc._v._CompuServe_Inc.


> If you do zero moderation, you are not liable for user generated content period

Ok, sure, whatever. But the point would be to put Facebook in a position where they only do the moderation a user requests. We can use common carrier laws, or we can use some different legal mechanism. That's the goal; the specific method or process isn't really the main point.

> you're suggesting a strict reduction in moderation tools, and claiming that this will solve everything the way everyone wants.

I am suggesting that users should be allowed to make voluntary choices about what moderation they want, as it relates to them, without Facebook forcing those decisions on the user.

> All of the options you want are already available

No, users are still forced to accept Facebook's chosen moderation preferences. They can't opt out of Facebook's censorship or moderation, even if they voluntarily choose to subscribe to certain content.

> solve everything the way everyone wants.

It will solve what a lot of people want. Sure, I am sure there are people who are just explicitly pro-censorship, who want the 1st Amendment removed and want the government to ban all opposing political opinions.

But there are also a lot of people who don't want facebook to be forcing users to accept certain moderation, and instead want users to be in control of what they see.

> how does your approach address coordinated misinformation campaigns?

If something is so bad that it needs to be censored, then that is what the government is for. Terrorist threats are already illegal, for example.

Backdoor censorship isn't the way to go. If you really believe that misinformation is so bad that it needs to be stopped by these types of methods, then the onus should be on you to gather enough support to get around the protections of the 1st amendment.

Checks and balances are difficult to get around for a reason. And if you can't get around them, well, that's the point of those checks and balances.

> I haven't done anything illegal

Well, that's life. You are trying to backdoor your way into getting stuff censored, because you aren't able to convince the government to do it. We don't need this backdoor method of supporting censorship.

> Spam is a much larger problem on my phone than anywhere else

There is nothing illegal about phone companies offering additional, voluntary methods of letting users choose more things to be moderated for themselves.

Such problems could be solved that way: by Facebook, or whoever, offering voluntary programs to block certain types or categories of things, if the user chooses to opt in to that form of moderation.

> Ben Shapiro can post whatever he wants on Facebook

You don't have to listen to Ben Shapiro if you don't want to, in this situation. Instead, you could just voluntarily choose to subscribe to a "ban right wingers" block list, or whatever voluntary algorithm the user opts in to.
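To make the opt-in idea concrete, here is a minimal sketch (all names and data are hypothetical, not any real platform's API) of moderation the user controls: the platform applies only the blocklists a user has chosen to subscribe to, and a user who subscribes to nothing sees everything.

```python
def filter_feed(posts, subscribed_blocklists):
    """Apply only the blocklists this user has opted into.

    `posts` is a list of dicts with an 'author' key; each blocklist
    is a set of author names. All names here are made up.
    """
    blocked = set().union(*subscribed_blocklists) if subscribed_blocklists else set()
    return [p for p in posts if p["author"] not in blocked]

feed = [
    {"author": "alice", "text": "hello"},
    {"author": "spammer99", "text": "BUY NOW!!!"},
]
spam_blocklist = {"spammer99"}

# One user opts into the spam blocklist, another opts into nothing:
filtered = filter_feed(feed, [spam_blocklist])    # only alice's post
unfiltered = filter_feed(feed, [])                # both posts
```

The point of the sketch is that the filtering decision lives with the subscriber, not the platform; Facebook would just execute whichever lists each user picked.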


> to get around the protections of the 1st amendment.

I have gotten around the protections of the first amendment though! You're objecting to how I did it, saying that it's not fair or not right, and that I should instead get around them in a different way. But why am I supposed to trust that you won't also object to that other way when, or if, I do gain that support?

> But there are also a lot of people who don't want facebook to be forcing users to accept certain moderation, and instead want users to be in control of what they see.

They are: they're free to use a different service. No one is forcing you to use Facebook or to distribute your content via Facebook. There are plenty of other ways to distribute your content, and you have no more right to require Facebook to distribute your content than they have a right to force you to use their website.


> I have gotten around the protections of the first amendment though!

Yeah, that's the point, lol.

If you are just pro-censorship, don't care about the principles that the 1st amendment protects, and just want to find a way around it, just say so.

If you would support the government censoring all opposing political opinions, or jailing them, or worse, because of some legal technicality, just say so.

As long as you are honest about it, and you want to do things that are basically equivalent to the government censoring anything for any reason, just say so: say that you want to ignore all of this stuff and find a way to censor all of your political opponents, in any way that you can.

Yes, I completely agree that people such as yourself don't care about any of these principles, and just want to figure out whatever ways you can to de facto censor anyone for any reason, regardless of the motivation, which in the first place was to stop censorship. That's the point.

The whole point is that you don't care about any of this, and just want to engage in authoritarianism in any way you can get away with. That's the problem.

That's what everyone else is trying to fight against: authoritarians who look for loopholes to engage in authoritarianism in whatever way they can get away with.

That's why we want laws to stop that!

> They are: they're free to use a different service.

But the alternative solution, one that provides a lot of benefits to most people, is to simply incentivize Facebook to implement these policies.

That would allow people to get most of what they want, by preventing these workarounds where the principles of anti-censorship are ignored through the back door, while also giving users the ability to moderate the content they see.

> have no more right to require facebook distribute

I agree that the law does not currently prevent Facebook from doing certain things. Which is why the conversation, the entire time, was about making new laws, or changing laws, so that they cannot engage in what is, in your own words, a way to get around the 1st amendment and engage in mass censorship that is almost as bad as if the government did it.

That's the problem!


This goes back to what I said far upthread: "Given that I don't believe that it is fundamentally possible for any private entity to infringe on my right to free speech", I'm not pro censorship or trying to sneak around the first amendment or whatnot.

My point was that given your view of free speech and the first amendment, a view that I fundamentally don't accept, I have no reason to trust that you're being forthright. And I think we have proof now that you weren't. You admit that, in your view, we have gotten around the first amendment.

But you said that that was okay, that if it was so important, we could gather support and bypass the first amendment.

> If you really believe that misinformation is so bad that it needs to be stopped by these types of methods, then the onus should be on you to gather enough support to get around the protections of the 1st amendment.

And my point is that we have that support right now. You're the one advocating for a change. So which is it, are we allowed to get around the first amendment, or not? When is it okay?

> The whole point is that you don't care about any of this, and just want to engage in authoritarianism in any way you can get away with. That's the problem.

I think it's authoritarian to force groups to associate with people they don't want to. You're trying to use the first amendment, something that's supposed to protect our rights, as a cudgel to allow the government to strong-arm businesses into acting the way you'd prefer. That's authoritarian!

You're advocating for laws to enforce anti-authoritarianism. You want the authority to force people to (in your view) be less-authoritarian. Hopefully you at least recognize the irony.

> But the alternative solution, that provides a lot of benefits, to most people, is to simply incentivize facebook to implement these policies.

But you're not talking about "incentivizing"; you're talking about forcing via law and government.

> that is almost as bad as if the government did it.

And what's even worse is the government having control over how and what they can moderate. Like it absolutely baffles me that you think further centralization would make this better.


> in your view, we have gotten around the first amendment.

There are principles that the 1st amendment protects, and it is those principles that matter.

For example, there are laws that prevent the government from locking people up, without due process. If someone were to find a way to get around that law, by having a private company lock people up, without due process, that would be getting around the principle of due process.

> But you said that that was okay, that if it was so important, we could gather support and bypass the first amendment.

It is "ok", in the same way that it would be "OK" to change the constitution, so as to remove everyone's right to due process, and allow the government to lock people up without a trial.

And by "OK", I mean, you better have a really good reason for it, and you better have to follow the extremely difficult process of removing everyone's right to trial, and you really shouldn't be using a loophole to do that (such as if private companies locking people up without trial, was allowed).

Thats what I mean by that. That getting around these principles, should require a very high bar, and you better be able to convince a whole lot of people, if you want to remove everyone's right to trial, or some other similarly important thing.

> as a cudgel to allow the government to strong-arm businesses into acting the way you'd prefer.

Common carrier laws already exist and are uncontroversial.

If you want to call that authoritarianism, ok, go ahead. But most people think that these laws are perfectly ok. Most people do not think that anyone's 1st amendment rights are infringed because phone companies have to carry all users.

So you don't get to say that extending these already existing, uncontroversial laws is some giant change, when it isn't.

If we really want to go even further, we could talk about the Civil Rights Act of 1964. That forces businesses not to discriminate. But few people would say that our rights are infringed because businesses can't discriminate.

Your line of argument should cause you to think that anti-discrimination laws are some "cudgel" to force groups to associate with people that they don't want to.

Which would be an extremely controversial argument to make: that all anti-discrimination laws are authoritarian.

> you want the authority to force people to (in your view) be less-authoritarian. Hopefully you at least recognize the irony.

It forces people to be less authoritarian, using authority, in the same way that the civil rights act forces people to be less authoritarian, using authority.

If you want to call the civil rights act, or common carrier laws, ironic, well OK I guess.


>Today, the government cannot write censorship rules.

That actually was my point: governments already write censorship rules and define the punishment you get for violating them.

Child porn is censored, for example. Here in Germany there are censorship laws against promoting national socialism or antisemitism.

Our democracies are structured in a way that legally prevents legislatures from writing censorship rules as they please and without good reasoning. That is usually achieved by constitutional law guaranteeing freedom of speech to the widest extent possible, along with freedom of the press, of assembly, and of religion, plus a division of powers and the mandate for those divisions to check each other. The courts, but also multi-party democratic systems, the press, voters, activists, and protestors, play a significant role in this.

While this system is not perfect, and there were and will be mistakes and abuses along the way that need to be corrected - which usually takes a lot more time than anybody likes - it is still the best we ever had and, I think, will ever get. The only real "alternative" we tried, that of a "strong" leader or an oligarchic group of "strong" leaders, was and is always a lot worse.

But now we have moved a very significant part, if not the major part, of public discussion onto the platforms of private companies, and are now surprised that the "strong" leaders at the top of these companies do not necessarily share our values, or even the minimal set of values our societies as a whole agreed on, and make up and change rules as they please, without any real repercussions so far.

And not just that: they sometimes actively try to use their immense powers of content distribution and "moderation" to nudge discussion one way or another. We already see these companies getting bolder in restricting certain topics all by themselves, be it around certain Corona-related topics, be it to "fight white supremacy" ideology or other "extremism". And of course the companies started by censoring topics where reasonable people would most likely say "oh, those are dangerous or vile ideas by dangerous or vile people" [and then somebody brings up the Paradox of Tolerance to justify giving the companies free rein in those areas]. I am pretty much convinced that a lot of these are trial balloons by the companies to see how far they can push the envelope right now before the momentary backlash becomes too much. And if we let them, then over time they will slowly push the envelope further and further for more and more control.

We and our politicians even help those companies gain more power and control, by writing laws delegating policing responsibilities to the companies and by demanding the companies police more and more against misinformation, extremism, racism, and other such -isms. Not only is that a transfer of power, it's at the same time regulatory capture, making sure any future competitors to these companies find it almost impossible to enter the market at all.

The first step on the route to doom is signing a petition requiring people to "provide ID before making a social media account" (as is happening right now, e.g. in the UK, because some racists said some vile things about English football players), and then before you know it you end up with a "social score" system like in China, except it's the big companies maintaining and regulating it, not even the (repressive) government.

Good intentions and all that...

[/rant over]


> But now we have moved a very significant part, if not the major part, of public discussion onto the platforms of private companies, and are now surprised that the "strong" leaders at the top of these companies do not necessarily share our values, or even the minimal set of values our societies as a whole agreed on, and make up and change rules as they please, without any real repercussions so far.

But the alternative is that the government forces its own values onto the companies. I'm suspicious of "we're going to hide this information because it's bad", but I'm also suspicious of "we're going to prevent companies from expressing values different from the ones the government allows".

I much prefer independent systems to a government that ultimately controls how and what corporations can say. The first is perhaps bad, the second is far worse.


You seem to want some sort of aggregator ... ? I'm not sure facebook's the best model to start with there. I think you'd be happier with a third-party service that could receive from FB, and then we wouldn't get into forcing facebook to carry stuff.


I don't want to force FB to do anything. I want to require all social media companies to allow aggregation (through ActivityPub etc.) but if Zuckerberg/Facebook decide they don't want to be in that industry, I would have no problem with them simply closing down.
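For what it's worth, ActivityPub already defines the plumbing for this kind of aggregation: a follower posts a Follow activity (a small JSON-LD document) to the target actor's inbox. A rough sketch of building one follows; the actor URLs are hypothetical, and a real client would also need HTTP delivery and request signing on top of this.

```python
import json

def make_follow_activity(follower: str, target: str) -> dict:
    """Build an ActivityPub 'Follow' activity (per the W3C ActivityPub spec).

    `follower` and `target` are actor IRIs; the example URLs used
    below are hypothetical, not real endpoints.
    """
    return {
        "@context": "https://www.w3.org/ns/activitystreams",
        "type": "Follow",
        "actor": follower,
        "object": target,
    }

# A hypothetical third-party aggregator following an account on a big platform:
activity = make_follow_activity(
    "https://aggregator.example/users/alice",
    "https://bigplatform.example/users/some-page",
)
print(json.dumps(activity, indent=2))
```

Mandating that platforms accept (or at least answer) documents like this is the whole interop requirement; everything else about their product could stay as-is.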


>Facebook should be forced to show me the feeds I choose to subscribe to.

What about constantly manipulating you to become aligned to the more extreme versions of your ideals, so you will subscribe to more extreme feeds?


>Facebook should be forced

I would consider this sentiment more absurd and dystopian than Facebook freely removing content from its own website.


Why? Facebook is not a person, it should exist solely at our discretion. "People should not be afraid of their governments, governments should be afraid of their people" applies even more strongly to corporations than to governments.


Corporations are just citizens cooperating in a venture. That shouldn't result in them losing rights simply because they are incorporated.

The law should be applied to relevant details of a situation, not something arbitrary like incorporation unless the issue is specific to the details of incorporation.


> Corporations are just citizens cooperating in a venture.

No, that's partnerships or unincorporated associations (well, “people”, not necessarily “citizens”). Corporations are distinct legal entities created by the power of the state and imbued with state-issued charters, whose investors are granted the immense legal privilege of limited liability.

> That shouldn't result in them losing rights simply because they are incorporated.

The exercise of state power involved in incorporation, and the privileges at public expense it grants, should absolutely be tied to restrictions and conditions that ensure that a sufficient public benefit is served to warrant the cost, and that the exercise of state power involved in the creation, existence, and operation of the corporation is consistent with the restrictions otherwise applicable to state power.


> Corporations are just citizens cooperating in a venture.

No they're not. They come with an extraordinary privilege of limited liability, which is meant to be paired with a corresponding responsibility to create a social benefit.


You are right.

On a different tack, if governments legislate some kinds of moderation for corporations, then the corporations would become an extension of the government and that would become government censorship in practice. Not good.

If governments legislate no moderation for corporations, then corporations would lose the ability to encourage different kinds of communication with moderation.

I don't see either case as healthy for freedom, capitalism, or individuals in the end.

Legislating that moderation rules must be made public, and that moderation actions be logged so the public can review a corporation's compliance with its own chosen rules, seems reasonable to me. It would avoid deceptive moderation, which to me is a problem similar to other kinds of fraud that are outlawed.
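As a sketch of what such a public log might look like (the field names and rule IDs are invented for illustration), each entry could hash the previous one, so anyone can later verify that no action was silently edited or deleted:

```python
import hashlib
import json
from datetime import datetime, timezone

def log_action(log: list, rule_id: str, action: str, content: str) -> dict:
    """Append a moderation action to a tamper-evident public log.

    Field names are invented for illustration. Chaining each entry's
    hash to the previous one makes after-the-fact edits detectable.
    """
    prev_hash = log[-1]["entry_hash"] if log else "0" * 64
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "rule_id": rule_id,    # which published rule was applied
        "action": action,      # e.g. "remove", "label", "demote"
        "content_sha256": hashlib.sha256(content.encode()).hexdigest(),
        "prev_hash": prev_hash,
    }
    entry["entry_hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()
    ).hexdigest()
    log.append(entry)
    return entry

log = []
log_action(log, "rule-12-spam", "remove", "BUY NOW!!!")
log_action(log, "rule-3-harassment", "label", "some borderline post")
```

Publishing only a content hash, rather than the content itself, would let the public audit consistency without re-hosting whatever was removed.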


All governments apply rules to corporations forcing them to do or not do things. Are you in principle against that?


Not at all. I am against arbitrarily forcing them to do or not do things, which is what some people are calling for here.

I think Facebook is a terrible site, and I don't even know why anyone would want to keep using it tbh.

edit: typo


Do you consider the already-existing (in other fields) common carrier arrangement to be absurd and dystopian?


[flagged]


There’s a huge gap between ‘legal but morally unacceptable to some’ and ‘blatantly illegal’. Making an argument against the latter while ignoring that the discussion is largely about the former is the very definition of a straw man.


Characterising people who want free speech as the criminals who will use that speech for the worst ends is no different to people who opposed a fair trial for all because criminals would want a chance to avoid jail.


Will somebody please think of the children?



