Hacker News

> Most people do not want cars to crash into Christmas fairs either. But the solution is not to ban cars. Most people do not want kids to be bullied at school. But the solution is not to ban schools.

Right! And the solution also isn’t “fuck it. Too complicated to do perfectly. No discipline in schools / no traffic laws”.



Decentralization does not mean anarchy. The discipline and the laws come from the bottom up. They are also usually better than the ones imposed by a central planner, because they can be developed faster and within the context of the social norms and culture of the people who are subject to them.


Decentralization doesn't mean much of anything on its own. There are "decentralized" setups that do moderation fairly well (e.g., Mastodon or most oldschool web forums, where there's ultimately someone accountable). This (Nostr), specifically [1], makes fun of such setups.

To moderate any system, there must be affordances for moderation and someone(s) accountable to the users of the system. As far as I can tell, like most blockchain projects, the Nostr project has effectively stripped (nearly) all affordances for doing moderation. Given that the Nostr audience seems to overlap considerably with crypto enthusiasts, I think their stance is basically the same: no moderation, no "censorship", etc. Given that there's literally child porn stored on the Bitcoin blockchain right now, I don't think your argument holds that "decentralization" can just Jeff Goldblum it and "find a way" without explicit affordances / accountability.

[1] https://nostr.com/comparisons/mastodon


Nostr makes fun of the idea that you can have centralized moderation and moves it explicitly to the relays. It is up to the relay owner to determine what is allowed or not.
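Concretely: in the Nostr protocol, clients publish signed JSON events to relays, and each relay independently decides whether to accept them, replying with a NIP-01 style `["OK", ...]` frame. A minimal sketch of what relay-level moderation could look like; the block lists and policy here are hypothetical placeholders, not any real relay's rules:

```python
import json

# Placeholder policy data: a real relay would load something like this from config.
BLOCKED_PUBKEYS = {"deadbeef" * 8}   # 64-char hex pubkeys this relay refuses
BLOCKED_WORDS = {"spamcoin"}         # naive keyword filter, purely for illustration

def handle_event(raw: str) -> str:
    """Decide whether to accept a client's ["EVENT", {...}] frame,
    replying with a NIP-01 style ["OK", <id>, <accepted>, <reason>]."""
    _, event = json.loads(raw)
    if event["pubkey"] in BLOCKED_PUBKEYS:
        return json.dumps(["OK", event["id"], False, "blocked: pubkey banned on this relay"])
    if any(w in event["content"].lower() for w in BLOCKED_WORDS):
        return json.dumps(["OK", event["id"], False, "blocked: content policy"])
    # An accepted event would be persisted here and served to subscribers.
    return json.dumps(["OK", event["id"], True, ""])
```

The point being: the policy lives in each relay's `handle_event`, and nowhere else; there is no global switch.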


Which practically means unanimous agreement is needed among all relays in order to moderate anything. I also don't see what incentive / accountability relays face to remove content. So... I don't see that strategy being particularly effective. But hey! If more than a couple dozen people wind up actually using the thing, guess we'll find out!


> Which practically means unanimous agreement is needed among all relays in order to moderate anything.

No, it means that the people will tend to congregate around the relays that work according to what they expect/want to see.

You are thinking from the assumption that things are only acceptable if total compliance is enforced. Even on the highly controlled and regulated Internet there is still abhorrent content out there, so why would you expect total compliance from the alternative?

The interesting question is: do you think that the majority of people don't see this side of the internet because of how effective the centralized control and policing is, or just because the majority of people are not interested in seeing this content in the first place?


> The interesting question is: do you think that the majority of people don't see this side of the internet because of how effective the centralized control and policing is, or just because the majority of people are not interested in seeing this content in the first place?

Both! Most people don't want to see beheading videos and would be very upset if one turned up in their YouTube recommendations. Fortunately, YouTube's "CENSORSHIP" is pretty good at not surfacing such videos (even though they're completely legal content!)

However, there is a significant long-tail pool of people who are totally into watching such abhorrent content (e.g., 8channers), and they could easily cause deeply offensive content to land in the unmoderated "Democratically trending" video feed, as demonstrated in the Flare example walkthrough.

Maybe that's your point? Platforms like these will necessarily be used pretty much exclusively by people who like or will tolerate seeing extremely offensive content because everyone else will be put off by the occasional display of horribleness.

Libertarian "DON'T CENSOR ME, BRO" havens like 8chan, kiwifarms, daily stormer, etc. already exist. They're not particularly popular, when compared to the likes of Twitter, Facebook, Instagram, etc. But they're certainly popular enough to draw millions of users. And every one of them would be delighted if they could post their inflammatory nonsense on Facebook or YouTube to reach a wider audience. And Facebook and YouTube have wider audiences because they moderate content.

In a previous life I worked building popular social media apps that included user-generated content. And I saw first-hand how horrific content moderation is. The shit people post to social media sites is as vile as it is vast. I'm certain most "anti-censorship" people's opinions would be changed if they'd watched an actual content moderator do their job for 30 minutes.


The point that I am trying to make, and I am not sure you are getting, is this: if we take the decentralized system to be the Internet and the different social networks to be "relays with autonomy over their own content", isn't that already an example of each subnetwork getting to enforce the policies that their own communities value?

The problem I take with your view is that it shows a not-so-subtle hint of totalitarianism. It uses the abuses committed by people with freedom to justify that we all should lose our liberties (or accept global subjugation to a common set of rules as inevitable).

Yeah, currently all platforms that promote "censorship resistance" are predominantly used by those who got hit by large-scale censorship. Yeah, most of these people are doing or saying despicable things. But that should not be an argument for making the case that centralized platforms and worldwide gatekeepers are the best solution.

To repeat: you keep arguing as if the enthusiasts of decentralized platforms are "anti-censorship", when in fact the fight is about claiming back some sense of autonomy and agency, so that people can do the moderation/curation themselves (or delegate it to someone closer to them who understands their values and social context better).
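In practice, that user-side agency is just filtering: each user (or a delegate whose judgment they trust) applies their own mute lists over whatever the relays return. A hypothetical sketch; the event shape is Nostr-like, but the mute lists and the helper are invented for illustration:

```python
# Hypothetical client-side curation: the mute lists belong to the user
# (or a delegate they trust), not to any central platform.
MY_MUTED_PUBKEYS = {"ab" * 32}     # authors this user never wants to see
MY_MUTED_WORDS = {"beheading"}     # topics this user opts out of

def my_feed(events: list[dict]) -> list[dict]:
    """Filter whatever the relays returned by the user's own rules."""
    return [
        e for e in events
        if e["pubkey"] not in MY_MUTED_PUBKEYS
        and not any(w in e["content"].lower() for w in MY_MUTED_WORDS)
    ]
```

Two users running the same client against the same relays can see entirely different feeds; the moderation decision sits with whoever maintains the lists.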


> The point that I am trying to make, and I am not sure you are getting, is this: if we take the decentralized system to be the Internet and the different social networks to be "relays with autonomy over their own content", isn't that already an example of each subnetwork getting to enforce the policies that their own communities value?

Yes. I agree that this model can, and does, work. Mastodon and old-school web forums were the examples I gave upthread where they do work. And I believe these work because forum owners / Mastodon server owners have the capability and necessary incentives to moderate content posted on their subnetworks. As I understand Nostr's design (relays / clients), and its explicit citation that Mastodon's model is bad because "3rd party (server hosts) can censor you", I do not believe your success model is applicable. Nostr relays lack the incentive and accountability needed to moderate content, much the same way Bitcoin miners do.

> The problem I take with your view is that it shows a not-so-subtle hint of totalitarianism. It uses the abuses committed by people with freedom to justify that we all should lose our liberties (or accept global subjugation to a common set of rules as inevitable).

Holy slippery slope, Batman! Global Totalitarianism! Believe it or not, there exists a middle ground between total "LIBERTY! Moderate yourselves, nerds!" freedom and endorsing a Global Cabal of Media Reviewers.

What I completely reject is that the primary problem with YouTube, Facebook, and the like is that they're arbitrarily censoring "views they don't like". The handful of examples of this I've ever seen have been comically obvious censorable material, or actual very difficult decisions that it's completely reasonable to understand why the decision was made. I have many other concerns about these mega-tech companies that I find infinitely more troubling than their current content moderation practices. At the end of the day, these companies are accountable. If nothing else, you have the option to leave! See Twitter. Embracing decentralization for the sake of decentralization only complicates this accountability, potentially to the point where no-one has any appreciable accountability (a la Bitcoin, again).

What I also reject is the implication that this is a simple problem. The Paradox of tolerance is a thing, and it's just plain complicated. And not something decentralization—or any other technology—can solve.

> To repeat: you keep arguing like the enthusiasts of decentralized platforms are "anti-censorship", when in fact the fight is about claiming back some sense of autonomy and agency to let people be able to do the moderation/curation themselves (or to someone closer to them who understands their values and social context better)

I think you and I actually have a lot in common in that respect. I'm a big Mastodon fan, run my own #HOMELAB to reclaim ownership of my data, do the whole POSSE thing, etc. I'm all about all of this stuff. What I reject is this perverse idea that the biggest problem with big tech is that they're just haphazardly censoring ideas they don't like (it's always conservative ideas). It's little more than a conspiracy theory that's led to virtually every social media platform today happily platforming actual Nazis (because of the free speech, you know?) and forbidding such taboo things as sex workers, critics of the CEO, etc. These are decisions. And platforms like Flare exist for basically one reason: to further propagate the conspiracy that big tech (like the mainstream media, colleges & universities, etc.) is just completely and irrationally biased against poor old conservatives who just want to, like, share their opinions, man.


The (main) criticism of the Mastodon model is that your identity is still centralized. The federated model is fine, but the whole thing still depends on domain names which can be seized.

And this issue is not just about moderation. Let's say you have been a model citizen on a server, but one of the moderators wakes up in a bad mood, finds something they don't like about you, and kicks you out. Now you are locked out of your account and cannot even migrate it away. FYI: not a hypothetical; this happened to some of my friends who were working with crypto.
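For contrast, Nostr sidesteps the domain-name dependency by making identity a keypair: per NIP-01, an event's id is a hash over its own serialized contents, and the signature is checked against the author's pubkey, so no single server holds your account. A sketch of the id computation, assuming a well-formed event (the example pubkey is a placeholder, and signature verification is omitted since it needs a secp256k1 library):

```python
import hashlib
import json

def event_id(pubkey: str, created_at: int, kind: int, tags: list, content: str) -> str:
    """NIP-01 event id: sha256 over the canonical serialization
    [0, pubkey, created_at, kind, tags, content]. Nothing here mentions
    a server or a domain: identity is the pubkey, and the id travels
    with the event to whichever relay stores it."""
    payload = json.dumps([0, pubkey, created_at, kind, tags, content],
                         separators=(",", ":"), ensure_ascii=False)
    return hashlib.sha256(payload.encode("utf-8")).hexdigest()
```

If one relay bans you, the same key and the same events work unchanged on the next relay; there is no account to migrate because there was never an account, only signed data.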

> the biggest problem with big tech is that they're just haphazardly censoring ideas they don't like

To me, the problem is that they are too big, plain and simple. Too big, too powerful and too far removed from their actual customers to even care about the individual customer or anything that slightly deviates from the norm. I don't like them much like I don't like the EU-style of bureaucratic government.

What (I hope) Nostr is trying to build is something where centralization is outright impossible first, and only then come the mechanisms to tackle content curation/moderation.


Why compare against neo nazi and shock websites?

Why expect everything to end in "trending" feed?

Telegram solved this problem a loooong time ago.

Those who want beheading and castration videos subscribe to Russian and Arab channels.

Those who don't go elsewhere.

And if one of them shows up in the comments of a sane channel, we report them and they are gone.


Exactly right. I don't understand why people miss this detail. Maybe Google does have too much power over how content is served, but this isn't a fix, at all.


No, but the solution is hopefully much closer to the traditional Western-style punishment of switchboard operators who listen in or interfere, and far, far away from Stasi-style mandatory snitching.



