
Do you have any other options? I don’t care who solves it, but when a company is run for the intent to produce propaganda it’s pointless to ask them to self regulate.


There is always the option to let the issue sort itself out. To allow space and time for a solution to emerge.

We should be careful not to fall into action bias. E.g. the thought that we need to do something, anything, since that can lead to counterproductive solutions.

I've begun to look at information problems like this not too differently than viruses of thought. Right now these viruses are running rampant because we've never had to deal with anything like them before on such a wide scale. It seems perfectly possible to me that over time we will develop social standards that immunize us from these viruses. More and more people will begin to disregard clickbait, outrage-inducing headlines, etc. They will simply become less salient the more and more we experience them.

Reframing the question at hand around this metaphor: What would an effective vaccine look like for these thought viruses? I'm not at all sure, but I can't imagine any kind of partisan response that would work, since these viruses infect left and right alike, and many people will bend over backwards to argue otherwise. Until we can face that fact honestly, I don't see how we could even begin to have a productive conversation about a solution.


>Reframing the question at hand around this metaphor: What would an effective vaccine look like for these thought viruses? I'm not at all sure, but I can't imagine any kind of partisan response that would work, since these viruses infect left and right alike, and many people will bend over backwards to argue otherwise. Until we can face that fact honestly, I don't see how we could even begin to have a productive conversation about a solution.

Actually, there are effective ways to assess the credibility of information. From the well-known CRAAP[0] test to "lateral reading"[1] and a host of related[2][3][4][5] methods for clarifying the credibility of online (or offline, for that matter) material. There are even curricula[6] that address these issues.

And no, none of these methods are partisan. Rather, they give the reader tools to help them determine the validity and credibility of information.

That many folks don't do so is definitely a problem. One of the less involved methods is "lateral reading" as described in [1]. I heartily recommend it, as well as other methods.

[0] https://researchguides.ben.edu/source-evaluation

[1] https://www.nytimes.com/2021/02/18/opinion/fake-news-media-a...

[2] https://www.library.georgetown.edu/tutorials/research-guides...

[3] https://paperpile.com/g/find-credible-sources/

[4] https://libanswers.tcl.edu/faq/6286

[5] https://www.virtualsalt.com/evalu8it.htm

[6] https://www.schrockguide.net/critical-evaluation.html


They're not partisan, no, but as you kind of allude to ("many folks don't do so"), a prerequisite for using them properly is a certain ideological flexibility that is... less common these days. If someone is ideologically possessed, they will use these tools to skewer outgroup ideas but not apply them to ingroup ideas. As the letters in this very post demonstrate, even our congresspeople can't apply them to their own thinking!

And as far as a governing apparatus, I'm not sure these tools really help provide the structure needed to declare any given piece of media misinformation or not.


>And as far as a governing apparatus, I'm not sure these tools really help provide the structure needed to declare any given piece of media misinformation or not.

If my post came across as suggesting that the methods I linked to should be used in some sort of [quasi]-governmental way to determine what is "good" or "bad" information, then I apologize.

My focus was strictly on answering GP's question[0] on an individual basis:

"What would an effective vaccine look like for these thought viruses?"

I was also trying to imply that there are already ways to "separate the wheat from the chaff" that are quite well-known and well thought out.

But they are just tools. And what use someone makes (or doesn't make) of such tools is up to them.

[0] https://news.ycombinator.com/item?id=26244080


> If someone is ideologically possessed, they will use these tools to skewer outgroup ideas but not apply them to ingroup ideas.

Not just that, but ideologically-possessed people will flat-out reject a truth-finding methodology that results in conflicts with their worldview.

There's no point in giving someone the tools to find the truth if they're so wedded to their "truth" that evidence will not make them change their minds.


>There's no point in giving someone the tools to find the truth if they're so wedded to their "truth" that evidence will not make them change their minds.

Are you making the argument that because some folks won't use them, such tools/methods are useless?


Perhaps useless in the sense that those who need them the most will either refuse to use them or misuse them.

Going further... I think it might be fair to say that those tools just don't scale.


>Perhaps useless in the sense that those who need them the most will either refuse to use them or misuse them.

I'd argue that such tools are valuable to everyone, even those who have no interest in verifying the credibility or veracity of information sources.

As the old saw goes, "you can lead a horse to water, but you can't make him drink." Or both more snarkily and (IMHO) more accurately, "you can lead a fool to knowledge, but you can't make him think."

>Going further... I think it might be fair to say that those tools just don't scale.

I'm not sure what you mean by "scale" in this context.

Determining for oneself the credibility/veracity of information or an information source is (and should be, IMHO) inherently an individual pursuit.


> Reframing the question at hand around this metaphor: What would an effective vaccine look like for these thought viruses?

Well, if stopping disinformation is too hard for various reasons, maybe we can focus on the problem from the other direction: we need to find ways for accurate information to be easier to find and to verify.

If you think of misinformation more like a bacterium, then one of the common causes of bacterial infections is that the regular good bacteria have been wiped out for one reason or another. Antibiotics might help, or they might make the problem worse.

I do think we have some serious institutional problems that are preventing the usual sources of accurate information from operating effectively. News that's become overtly partisan, and an economic model that selects for the most sensational headlines. Scientific research findings that aren't reproducible. Universities becoming increasingly run like profit-focused corporations, and too expensive for many to attend due to lack of public funding. Misinformation is always a problem even in the best of times, but it can also fill the void when there's a lack of accurate information.

I don't know what the solution is. I tend towards more distributed models of information sharing that have fewer institutional gatekeepers declaring who the experts are, but I don't know exactly what that looks like, or how to do that in a way that tends towards more credibility rather than less.


So in other words no, but you don't think it's a long-term problem?

I honestly think your faith in humanity is refreshing. Personally, I think this is just reversion to the mean, where simply lying was historically the most common response.


> I honestly think your faith in humanity is refreshing. Personally, I think this is just reversion to the mean where simply lying was historically the most common response.

It's your faith in humanity that's "refreshing" if you think giving the people with the guns the power to police speech is the proper solution.


Don’t put words in my mouth, I didn’t suggest government regulation of speech.

Though I will admit debate rules where each side gets equal time back to back to be somewhat humorous. That’s mostly my love of chaos and the spectacle of such an idea.


Apologies, didn't mean to put words in your mouth, but that's how I interpreted the following:

dnissley: "Do you think the solution should come from the government? It seems implied here but just checking."

Retric: "Do you have any other options?"


Odd, I was initially thinking in terms of a non-governmental organization regulating the terms Reporter and News, much like how Doctor is a protected title. But that doesn't seem viable.


The medical monopoly is more expansive than that. It regulates not just the use of the terms but the practice of medicine itself. And those rules are backed by the force of the government. If that’s what you’re suggesting for journalism, it’s even worse. It ultimately has the backing of government, but without the political accountability.


That’s one of many issues, however the peanut butter vs peanut spread line feels like a useful benchmark.

You could call yourself a current events organization and say anything, or call yourself a News organization and be held to some standard. That IMO avoids limitations on free speech, as what matters is the body of the message, not the label attached to it. As you say, using government force to require organizations to change their name is distasteful.

However, coming up with a new term like whizphish, which currently has no meaning but could gain meaning in this context, should avoid stepping on any toes while achieving similar goals. LEED Platinum certification doesn't directly have government backing, but a building falsely claiming it is simple fraud.



