
There are really 2 options here:

1. Live with the First Amendment and Section 230 as they are. That means things stay mostly the same. Big tech will continue to censor too many of the things I like and too few of the things I don't like. And we wait for the pre-internet generation to die off and be replaced so society can progress.

2. Restrict the First Amendment and force fact checking, fairness, neutrality, etc. This would get us back to a functional media/societal/democratic state a lot faster. But it would require pretty radical change. Also, the biggest losers wouldn't be big tech. They would be Fox News and similar orgs.

I don't see any other option here. I don't see how "regulating" (wtf that even means; different things to different people, for sure) Facebook but not Fox is productive, or fair, or even really possible.

Personally, I don't trust the current system to do anything other than beat tech companies with a stick for no reason but that they exist and don't pay enough in "lobbying". I don't see Fox or similar, much bigger propaganda/censorship orgs being touched. Certainly not on a bipartisan basis (and that's what's needed to do more than sneeze in Washington).

So here we are, and I am glad the right and left are (stupidly) fighting each other...

Thanks for reading, you may now downvote these inconvenient truths...




>force fact checking

It's not possible, period. You can remove this option. FB/Twitter/YT all have "fact checkers", and they are neither neutral nor do they get it right. Even when it's outsourced, it still fails miserably. On top of that, people should not be told what the "facts" are for complex topics. It removes the need to think and build an opinion, which is essential, because for almost everything controversial there is no way to find the absolute truth. People need to learn what it means to accept that we don't know something and likely never will know what the truth is.


Option 2 implies that there is some truly unbiased entity without conflicts of interest that would both be able to handle speech vetting and be allowed to do it. That is pure fantasy.


I won't pretend I know what it would look like or how it should work. That said, it's not impossible to at least improve the current situation: removing at least some demonstrably false statements would be a start. Requiring a "right to reply" and equal air time for different parties would be too.

Again, I don't really support this approach; I'm just pointing out that it's what people presumably want, since Option 1 is out of favour and that only really leaves Option 2.


https://oversightboard.com/ is trying to do this.


Such vetting would be (1) extremely expensive, and (2) in most cases the decision would be "well... we don't know for sure".


The worst part is that people would be told "you don't know for sure" rather than using their own brains and coming to the conclusion that they don't know for sure, which is actually the default for any critically thinking person. And that default helps a lot in not falling for extremist views or ideas, which are usually presented in oversimplified form and thus fool people who quickly pick sides rather than accepting the default of not knowing for sure.

Rather than a fact checker, maybe an algorithm could find opposing content and present it. That would force the user to make some "fact" checks, i.e., to think. Needless to say, such an algorithm could/would be biased as well.
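
Very roughly, such an algorithm might look something like the sketch below (a toy illustration only: the item fields, the stance score, and the ranking rule are all invented, and producing the stance labels in the first place is exactly where the bias would creep in):

    from dataclasses import dataclass

    @dataclass
    class Item:
        title: str
        keywords: set    # rough topic terms attached to the item
        stance: float    # -1..+1; the sign encodes which "side" the item argues

    def opposing(item, corpus, k=3):
        # Rank candidates by topic overlap times how strongly they disagree,
        # then surface the top k as "here is the other side".
        def score(other):
            overlap = len(item.keywords & other.keywords)
            disagreement = max(-item.stance * other.stance, 0.0)
            return overlap * disagreement
        ranked = sorted(corpus, key=score, reverse=True)
        return [other for other in ranked if score(other) > 0][:k]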


> Rather than a fact checker, maybe an algorithm could find opposing content and present it. That would force the user to make some "fact" checks, i.e., to think. Needless to say, such an algorithm could/would be biased as well.

What do you think of a small social network composed of vetted (and always subject to review), reasonably trustworthy, fairly unbiased, but definitely open-minded users who decompose, fact check, and deeply debate very small volumes of stories, using a platform more sophisticated than nested discussions with voting? Something with structures like a "points for" and "points against" format, among many other novel (as far as social media goes) features, where "it is not really known for sure" is a perfectly acceptable conclusion, subject to review as new information becomes available?
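
For concreteness, a bare-bones sketch of the kind of structure I have in mind (nothing here is an existing product; the names and fields are invented):

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Point:
        text: str
        sources: List[str] = field(default_factory=list)

    @dataclass
    class Claim:
        statement: str
        points_for: List[Point] = field(default_factory=list)
        points_against: List[Point] = field(default_factory=list)
        # "it is not really known for sure" is a legitimate end state here,
        # revisited whenever new points or sources come in.
        status: str = "unknown"   # e.g. "supported", "contested", "unknown"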


Almost everything will be "not really known for sure". So the next step to make it useful would be to somehow count/vote and get "X is more likely true than Y". Maybe even add percentages, and now you essentially have "mob fact-checking". Needless to say, the majority isn't a source of truth, especially not if you have a broad range of topics and a broad range of people, so that for any given topic only a fraction of the people have deep knowledge and "the truth" is actually defined by the rest (the mob), who don't have deep knowledge about the topic.

What you would actually need is a peer-review-like system, where people familiar with the topic do the fact-checking. But this just moves the problem somewhere else, because someone would need to define who is familiar with a topic without simply grouping people with aligned views together, and that is just as impossible as the fact-checking itself.

Lastly, even if we actually were able to create a working fact-checking system, the moment it is used on one of the long-standing controversial topics, for example fact-checking a statement about abortion being murder, almost everyone who disagrees with that fact-check would lose trust in the system, which renders it essentially useless. You now have a "source of truth", but a significant portion of people (probably roughly 50%) don't trust it.


If I'm hearing you correctly, it sounds like you're kind of setting a goal of reaching something like "X is True"? My thinking is, very often (usually?) the best that can be done is to decompose X down into sub-components as fine-grained as possible, and then tentatively flag things with True/False/Unknown... or maybe even something more fine-grained than that ("Seems True", "Seems False").

From my perspective, the biggest issues are that we refer to issues with extremely ambiguous names/perspectives, and that we assert true/false where it is absolutely not settled. I believe that if there were a system run by a transparent, independent organization that took the definitions and the epistemic status of these issues very, very seriously, some people would start to have some trust in it, especially if it developed a track record over time.
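
To make the decomposition idea a bit more concrete, here is a toy illustration (every name and status label here is hypothetical, just to show the shape of the thing):

    from enum import Enum

    class Status(Enum):
        SEEMS_TRUE = "seems true"
        SEEMS_FALSE = "seems false"
        UNKNOWN = "unknown"       # explicitly allowed, not a failure state

    issue = {
        "name": "a precisely worded issue, not an ambiguous slogan",
        "sub_claims": [
            {"text": "narrow, checkable sub-claim #1",
             "status": Status.SEEMS_TRUE,
             "history": [("2020-06", Status.UNKNOWN)]},  # track record over time
            {"text": "narrow, checkable sub-claim #2",
             "status": Status.UNKNOWN,
             "history": []},
        ],
    }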


>...a goal of reaching something like "X is True"?

Not really. It would serve no purpose; only the people who already agree with it would accept a system that outputs "their truth".

I don't think the goal should be such a system, because I think it will be flawed no matter what, and also because I think people should be triggered to think and not fed simplified "facts".

People would also do better, and stop caring about a lot of the "garbage" facts they get fed every day, if they actually had to think and build an opinion on something.

Instead they get an opinion presented to them and either take it or reject it, based mostly on bias. This is not useful. It would be better if a person who isn't sufficiently interested in a topic simply did not have an opinion on it. At least then, if he gets interested at a later point, he would not be preoccupied by past "copied/rejected opinions". Many people today have very strong opinions on very irrelevant topics and can hardly reason about their stance, because it didn't "grow" in them; it was mostly planted/absorbed from media.


> Not really. It would serve no purpose; only the people who already agree with it would accept a system that outputs "their truth".

Does this not assume that people's minds cannot be changed? I understand the generalization you're making and very much agree with it, but I suspect we differ greatly on the underlying causality.

> I don't think the goal should be such a system, because I think it will be flawed no matter what...

Is "perfect is the enemy of good" relevant here, and perhaps also "perception is reality", and some others?

> and also because I think people should be triggered to think and not fed simplified "facts".

100% agree. A proper system would have numerous goals and features, I imagine you can think of many that I overlook (despite how much more time I've spent thinking about this problem).

> People would also do better, and stop caring about a lot of the "garbage" facts they get fed every day, if they actually had to think and build an opinion on something.

This is a fine idea - how might one cause (force) such ideas to manifest in physical reality?

> Instead they get an opinion presented to them and either take it or reject it, based mostly on bias.

Under the current system, agreed.

> This is not useful.

That depends on one's perspective, goals, etc - it is immensely useful to some people.

> It would be better if a person who isn't sufficiently interested in a topic simply did not have an opinion on it. At least then, if he gets interested at a later point, he would not be preoccupied by past "copied/rejected opinions". Many people today have very strong opinions on very irrelevant topics and can hardly reason about their stance, because it didn't "grow" in them; it was mostly planted/absorbed from media.

Agree again - so, what can be done to alter this state of affairs? What is the most efficient approach that can be devised and implemented (and, how might one go about that)?

The lack of systems & logical thinking on HN when it comes to certain topics is an extremely interesting phenomenon to me. What do you think?


> force fact checking

Just remember that less than a year ago it would have been your worst enemy appointing the fact checkers.


Nothing is "inconvenient truths" from what you said, you've just showed yourself as an ignorant fool, that's all.

At least try and read half the article next time before engaging in conversation.


I didn't downvote but I read twice and I don't understand what you're saying. Maybe you could try and rephrase to make your point clearer?


* Problems with fake news, censorship, etc. are much bigger outside of big tech than inside it (media monopolies and propaganda like Fox News, for a start).

* So if you actually care about those issues (rather than just hating big tech), you need to fix things beyond big tech.

* That means reform of the First Amendment.

* I suspect that no one actually wants that; they just don't like Facebook etc. but are used to Tucker Carlson or Joe Rogan.

Is that any better :)


Yes, I understand now, thanks!


No worries. I was pretty wordy the first time so it was good for me to summarise. :)



