Hacker News
Cognitive bias cheat sheet (coach.me)
684 points by charlieirish on Oct 27, 2016 | 138 comments



I find the treatment of psychology on HN to be perplexing. On one hand, there have been attacks on psychology as a field [1] due to legitimate concerns related to replication. On the other hand, blog posts such as this come up every few days that take the same results for granted and frame them in everyday terms.

I wonder: are there different groups of HN readers with different attitudes towards psychology? Or does the treatment also depend on the presentation, e.g. a scientific publication vs. brain-hacking tips or a cheat sheet?

[1] https://news.ycombinator.com/item?id=12643978


I actually get the impression that most of the articles that are "liked" by HN are somewhat grassroots. Take this one: it is a "hacker" spending time trying to improve on the biases they hold.

It is somewhat ironic, to me, that I am basically accusing us of having an anti-intellectual bent. However, it is not unique to psychology. There is a similar slant against "academic" computer science.


I think the same is true with philosophy too.

Hackers often seem to have an anti-authority preference and are more likely to respect the plucky DIY-er than the member of the establishment who has paid their dues. Other fields are by default assumed to be obfuscatory rather than hard or deep.


> There is a similar slant against "academic" computer science.

As someone who went to college for CompSci, I see good reason for a slant against computer science. The problem is that it's the wrong topic for somebody who does software development!

What we do is not science, it's engineering. Much of the time spent teaching us the science aspect is wasted, at least compared to the opportunity to use the time on engineering topics like modeling and design, documentation, etc.


> The problem is that it's the wrong topic for somebody who does software development!

For somebody building the next CRUD app, perhaps. In my line of work (EDA software), I routinely apply concepts from my computer science education on the job and in my code. Algorithms, data structures, discrete mathematics, graph theory, set theory, type theory, universal algebra, tractability theory, automata theory: these are all topics where knowledge has come in handy in my professional experience at various jobs.

Of course, concepts from other fields — biology, linguistics, and physics, for example — have also come in handy on occasion. I almost feel like there's no such thing as a "wrong topic for somebody who does software development", though some topics are certainly more relevant/useful than others.


>I routinely apply concepts from ... theory

Sounds exactly like what I would expect an engineer to do: apply theory to the real world. Versus a scientist, whom I would expect to develop, test, and prove new theories.


Well, then you're most likely part of a minority. And even in that minority, I'm willing to bet that a lot of the time is spent on things that have little to do with computer science.

Be honest and think of all the time you've spent fiddling with build systems and various random tools (compilers, scripts, databases, version control, etc.).


Far too often, to be sure. Two things I have yet to encounter in my (admittedly short, so far) professional career are a sane build system and version control that isn't a PITA...


If you are building programming languages, compilers, tools, etc, Comp Sci is certainly pertinent. If you aren't building programming tools, you can get away with a minimal Comp Sci background.


I went to school for Software Engineering, in the engineering faculty. My school also had Computer Science.

The difference was somewhat ironically that the CS degree had more practical, hands on coding, and engineering was more theoretical, plus we had to take all the physics/chemistry/math courses to meet the requirements of the Engineering board.


> What we do is not science, it's engineering.

You could argue that for many of us it is Software Carpentry.

I have a B.Sc. in something-something-computers. I have worked on some serious stuff (real distributed sensor systems across huge geographical areas, and at another time the fringes of a soft-realtime image-recognition system).

Much of the time however I feel I'm much more of a software carpenter.

I actually quite enjoyed some of the material from the "Software Carpentry" online course, even though I was already an engineer.


I completely agree. I love computer science. I love software development. But those two things are radically different things.

What I'm finding hard to locate, these days, is actual mature-age accredited uni courses in computer science. Instead, you get a plethora of development courses which, as you say, are akin to engineering, not science.


>Take this one. It is a "hacker" that is spending time to try to improve biases held.

Funny, the other day there was a thread about Cialdini and his books (including Influence). He has a chapter on this phenomenon (I think he called it Social Proof).

We are more receptive to ideas if it comes from "one of us".


Amusingly, I skipped it based on the headline of "Pre-suasion." It just sounded stupid. I'll have to take another look at it now.


Sounds like we have a serious case of in-group favouritism, with a hint of bandwagon effect.


There is a difference between anti-intellectual and anti-academic.


> I wonder, are there different groups of HN readers with different attitudes towards psychology?

Yes. This isn't a monolithic hive mind or fundamentalist religion. There are people from all walks of life with different ideas here. You will find trends, but no unanimity.


HN favors Kahneman's work, whether out of interest or approval. It hits all the HN high notes (oriented around decision-making, thinking, personal effectiveness, and reason) while also being backed by Kahneman's more-than-typically-rigorous models.


On replication, this same criticism has recently been directed at physics, chemistry and biology, among others. [1]

Psychology, with regard to its practical (non-research) use, has not helped me, but at least three people I know say they benefitted from cognitive behavioural therapy. Perhaps in the future, medicine and science will relegate it to the role philosophy once had: explaining human behaviour and thinking, as the latter did through less technical approaches for centuries. [2]

And yes, I am somewhat skeptical of said blog posts that cite some "new study" claiming to explain something only to be refuted or contradicted months later.

[1] http://www.nature.com/news/1-500-scientists-lift-the-lid-on-...

[2] Wittgenstein conceded that philosophy in the 20th century would only be useful in its approaches to linguistics and language


The psychology replication crisis is much worse than any other field.

http://slatestarcodex.com/2015/09/05/if-you-cant-make-predic...


>The psychology replication crisis is much worse than any other field.

The article doesn't get into this.

Coming from a physics background, I will not say there's a crisis, but:

1. Little research is replicated.

2. We who did physics research were very unconcerned about whether our results could be replicated. Our journal papers were very concise, and it was common practice not to give all the details needed to replicate, partially due to paper-size concerns, but very often also to keep the "secret recipe" to ourselves as a means of preventing others from conducting future research that we wanted to conduct ourselves (less competition for grant money).

3. Pick any researcher in any field of physics, and ask them: "Can you list a few papers published in the last 15 years that are highly cited that you think are completely wrong?" Everyone will have a list like that. Everyone.


> keep the "secret recipe" to ourselves
> Everyone will have a list like that

You and/or your publishers are terrible human beings.


>You and/or your publishers are terrible human beings.

Eh? Not sure why. Because we have a list of papers we don't believe? It would be naive to believe everything that is published.

I don't like the notion of withholding information, and personally I would not like to accept any paper that does that, but I'm merely stating the norm in the field.

You still have to contend with journal size restrictions. A paper in Physical Review Letters needs to be less than 3 pages. You simply cannot put in enough detail for replication in there. Nature has a page limit of 5.


Maybe I was wrong, but here are the reasons for the condemnation:

1. Propagating falsehoods or flaws in published work without correcting or refuting them. By your own strong admission, everyone knows of such work (which is subsequently used and cited) yet considers it "completely wrong"; there must be so much flawed work getting published, but there is a disinclination to do anything about it.

2. Withholding information necessary for reproducing the results of experiments, as a means of discouraging cooperation with peers you view as rivals or adversaries.

The second is really serious because it effectively subverts the scientific method, undermining scientific consensus and progress. That these are "norms in the field" means nothing save for everybody's indifference and lack of cooperation.

Is the above correct?


Historically you have the roles reversed, in that medicine and science are both subordinates of philosophy. But admittedly, due to the predominance of analytic philosophy, particularly in modern Western philosophy, it has become much more of what you describe. This is really more a failing of modern philosophy than a condemnation of the field as a whole. There were/are/will be many philosophers who don't subscribe to its limitations.


As a former philosophy student, I disavow philosophy for its uselessness. Save for ethics and a few niche topics of inquiry where it has practical utility, its only function is to add epistemological detail to science's study of nature (historically known as natural philosophy).

To paraphrase what John Locke said to Newton: "Sir, it is the lesser duty of mine as Philosopher and under-labourer of the Natural Philosopher, to remove the debris and furnish detail that the latter imparts to our Knowledge."


As someone who is probably guilty of ignoring most psychology articles due to doubts about psychology research, I think this article is an interesting step forward for popular psychology. This categorization of biases is something I can use both for myself and when I try to help others. This article is unlikely to lead to debates about research methods and statistics because it's very high level and presents few opinions.

Incidentally, I noticed these cognitive biases probably apply equally well to strong AI (artificial intelligence). No matter how powerful an AI is, it's always going to lack some information, it will take time to calculate things, it will lack experience, and it will not have enough memory capacity to remember every detail. Strong AI will probably have to learn to use cognitive biases.


> This article is unlikely to lead to debates about research methods and statistics because it's very high level and presents few opinions.

Huh? The article is entirely opinion. Every single thing that's in bold text is a testable hypothesis presented as pure fact. Even the selection of "problems" is pure and problematic conjecture (are these actually the most important problems? Are these actually problems at all? Would addressing these problems improve our decision making?).

There'll be no debate about research methods because there's no scientific research here. What's presented in this article might be based on observation, but it's definitely not based on scientific testing (or, to the extent it is, that evidence isn't presented).

There's nothing wrong with an article like this, per se. But shrugging off psychology research (much of which explores and attempts to rigorously test the conjectures made in this article), while also accepting this article's statements at face value, is a bit ridiculous. It amounts to "philosophy I agree with is not problematic, but science I agree with should be harshly critiqued"


I can see your viewpoint. I see the article as a suggested categorization of a complex subject, where that suggested categorization is an opinion of what's useful. You're right, that's an opinion. I've read many textbooks that were also opinion in the same sense.


Notice how this particular article is devoted to skepticism generally, rather than promoting a particular hypothesis.

It should be heartening that most science topics here are treated with this kind of Feynmanian attitude, always willing to accept contradictory evidence, always reluctant to draw strong conclusions.


Psychology as a field has many problems, but subfields within psychology are backed by a fair amount of evidence.


It's possible to be interested in a field's subject matter while being critical of the methods.

For example, US space research at NASA may be awash in politics, crippled by a lack of long-term vision, and overly focused on military applications, but space is still cool.

The workings of the mind and the results of cognitive biases are fascinating, but that doesn't excuse poor statistics and a lack of rigor.


> are there different groups of HN readers with different attitudes towards psychology?

yes


> I wonder, are there different groups of HN readers with different attitudes towards psychology?

Yes and no. Yes, we are all different people. No, there is not much sense to grouping us based on a single dimension. That is an excellent example of what Vonnegut called a granfalloon.


This is cool, although the real trick is knowing when and how to employ methods that will mitigate the problems caused by cognitive bias to accurately identify and resolve conflicts and facilitate clear decision-making. This is the purpose of courts, the purpose of peer review, the purpose of debate.

Generally, it is not necessary to understand every single type of cognitive bias in a nuanced way to mitigate the problems. Indeed, sometimes cognitive biases overlap to the extent that trying to mitigate one, you'll wind up affected by another. What's important is that your behavior and social rules be oriented towards uncovering truth through dialectic methods.

Jonathan Haidt, a moral psychologist, recently posted a terrific video on the lack of viewpoint diversity on college campuses. Problems of confirmation bias are exaggerated, especially in the social sciences, when there's a lack of viewpoint diversity on campus. When everyone likes the conclusions put forth by a paper, no one is motivated to find the flaws. Thus the flaws are not found, the flawed papers get cited by other papers, and you wind up with a knowledge base that is increasingly divorced from reality. Whether you know the name for that bias or not is less relevant than actually addressing the structural problems.


If you are interested in learning how your brain makes decisions, where biases and error come from, and a whole lot more, I highly recommend this book by Nobel Prize winner Daniel Kahneman: https://www.amazon.com/Thinking-Fast-Slow-Daniel-Kahneman/dp...

Interestingly, he won his Nobel Prize in the field of economics, but he's a psychologist, not an economist. His research was so influential that it changed business strategies (esp. around how meetings are held) forever.

I can't say enough good things about Thinking Fast and Slow. Go read it.

I posted this article in reply to another comment in this thread, but I think many will find it interesting and useful. It's a good jumping off point into his research and why it's important.

http://www.newyorker.com/tech/frontal-cortex/why-smart-peopl...


Do you have any examples around how meetings have changed? I'll need to relay that information promptly...


http://juanreyero.com/article/technology/meetings.html

"The principle of independent judgments (and decorrelated errors) has immediate applications for the conduct of meetings, an activity in which executives in organizations spend a great deal of their working days. A simple rule can help: before an issue is discussed, all members of the committee should be asked to write a very brief summary of their position. This procedure makes good use of the value of the diversity of knowledge and opinion in the group. The standard practice of open discussion gives too much weight to the opinions of those who speak early and assertively, causing others to line up behind them."

(Following this advice is rare; I think it's bold to say that Kahneman's research has already changed meetings.)


I would agree. For me, the most important thing is to know that there are biases that could be impacting me, and to always leave room for a) the possibility that a competing viewpoint is sincerely held and possibly correct over the viewpoint I am convinced of, and b) asking myself whether there is a self-interested reason I want the belief I hold to be true.

Many biases cannot be overcome even when you know about them... but accepting that it's possible I'm wrong, and that the crazy guy on the other side is both sincere in his competing belief and possibly correct, allows me to evolve, learn, and ultimately get along with more people.

I have some strongly held beliefs and opinions... but I try to keep my mind open, and without considering cognitive biases you are going to be more closed-minded than you should be in most cases.


> but accepting that it's possible I'm wrong and that the crazy guy on the other side is both sincere in his competing belief and possibly correct, allows me to evolve and learn and ultimately get along with more people.

As often as not you're both right but simply have different or conflicting priorities.


Very true. I would add conflicting preconceptions to conflicting priorities. In other words, conflicting priorities reflect bias in general, but cognitive bias can be independent of self-interested priorities and simply be about a different preconceived perception, frame, or foundation.


> Jonathan Haidt, a moral psychologist, recently posted a terrific video on the lack of viewpoint diversity on college campuses.

As someone who just finished reading his book, I must ask for clarification. Is he talking about lack of diversity in the student body, or in the faculty body?

I know he has published work showing a bias against conservatives in his field...


I believe it was faculty bias that was implied with regards to the point about disconfirmation.

He did point to statistics showing that liberal bias has been present in universities for quite some time, but that until 1990 it seemed to remain at a steady 4:1 ratio or so. But that in the last 25 years, that ratio has become much more skewed. Up to 67:1 in some fields.


> This is the purpose of courts, the purpose of peer review, the purpose of debate.

Ideally. But keep in mind: not for everyone, and it would be in the interests of those primarily concerned with power and manipulation to convince those concerned with truth and the greater good that this is the case.


I'd agree, in fact I'd say that in many cases finding the truth is not the first priority of most individual participants in those systems. That's why they're systems, and that's why concepts like integrity are so important and why the commitment to truth, honesty, and sincerity must be constantly reinforced.


Can you share a link to the video?


He's spoken about it in a number of forums, take your pick: http://righteousmind.com/viewpoint-diversity/ (near the bottom).


Here's the one I referenced: http://heterodoxacademy.org/2016/10/21/one-telos-truth-or-so...

Telos, incidentally, I consider to be a very fascinating and important concept, though only tangentially relevant to this thread.


"The world is very confusing, and we end up only seeing a tiny sliver of it, but we need to make some sense of it in order to survive. Once the reduced stream of information comes in, we connect the dots, fill in the gaps with stuff we already think we know, and update our mental models of the world."

So wonderfully said. I wonder what biases AI will develop in its models.


I once heard that our senses primarily function as filters: our eyes only see by filtering out all sorts of other visual stimuli, our ears hear by filtering out other noises, etc.

That means we are only capturing a "tiny sliver" of what's actually happening around us.

If we keep just this small detail in mind, we can be sure that lack of evidence is never proof of anything, and that any evidence we do see can only be used to find the most probable of competing hypotheses; its diagnosticity should always be considered. (E.g., a fever might mean you are sick, but it has no diagnostic value as to which particular ailment you have, since it's consistent with other possibilities.)

The same goes for using bounce rate as a metric in web analytics: its diagnostic value on its own doesn't tell us much, but with other context, like time on site and the site's conversion goal (e.g., a phone-call conversion will register as a bounce if visitors don't navigate elsewhere), it can tell us a lot.
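The fever example above is really a point about likelihood ratios. A tiny Bayesian sketch (with made-up, purely illustrative probabilities) shows why evidence that is consistent with many hypotheses barely moves a belief, while more diagnostic evidence shifts it strongly:

```python
def posterior(prior, p_evidence_given_h, p_evidence_given_not_h):
    """Update a prior probability with one piece of evidence via Bayes' rule."""
    numerator = p_evidence_given_h * prior
    denominator = numerator + p_evidence_given_not_h * (1 - prior)
    return numerator / denominator

prior = 0.5  # no initial preference between ailment A and the alternatives

# Fever is nearly as likely under either hypothesis: likelihood ratio ~1,
# so it is barely diagnostic and the belief hardly moves.
after_fever = posterior(prior, 0.90, 0.85)

# A symptom specific to ailment A (likelihood ratio 8) moves belief a lot.
after_rash = posterior(prior, 0.40, 0.05)

print(round(after_fever, 3))  # stays near 0.5
print(round(after_rash, 3))   # jumps well above 0.5
```

The numbers here are hypothetical; the point is only that "consistent with my hypothesis" is not the same as "evidence for my hypothesis" unless the evidence is much less likely under the alternatives.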


Likely a lot of the same biases... Look at the categorization, these are things any general system of intelligence is going to have to do at some level. Only a system with infinite computational power can afford to be without bias.


I even think these biases could be the basis of a good AI. Just code the biases in and you're almost good to go!


This is an artifact of evolution - mental shortcuts enabled faster decision making, and species that got to the conclusion fastest survived. Of course, there is an error rate, but as long as it was small, the faster decision makers won.

Logic is simply a way to systematically eliminate these errors in wetware (and ultimately hardware).


I'm impressed that he managed to figure this out whilst taking care of a baby.

When my son was born, I spent most of the wee hours testing which Star Wars theme I hummed worked best for getting him back to sleep.

(By the way, The Imperial March worked best, especially slowed down and rocking him on every 4th beat)


My kid loves the Imperial March too for some reason. I thought I was a bad parent. Strange we came to the same conclusion. There must be something to this. /s (n=2)


No, you're not a bad parent, your son is just going to grow up to be a Sith Lord.


Now I want to know if anyone else uses the Galaxy Song from _Monty Python's Meaning of Life_.


I made up my own song, something about... little baby falls asleep so dad can go back to bed...


This was my first reaction as well. Paternity leave is not what I'd call an opportunity for deep thoughts and pondering. He must have had a very good baby!


Not a Star Wars theme but the one that worked the best for me was Ravel's Bolero.


I think that tunes with a strong repetitive beat seem to work best over something that is pure melody. Bolero hits on that count.


I wonder how his partner feels about the experience.


In politics, I find gambling is the best way to resolve disputes among friends stemming from cognitive bias. Put a bit of money on various results and someone will be right and someone will be wrong.


I've heard it said that wagering on real life events (presidential election outcome, climate change, etc) is "a tax on bullshit."


Not sure if this is the original source, but it's at least the first place where I saw this claim:

http://marginalrevolution.com/marginalrevolution/2012/11/a-b...

And another view:

http://noahpinionblog.blogspot.fi/2013/05/bets-do-not-necess...


Prediction markets work ... up to the point that skewing the message of the prediction market itself is an incentive and a party has sufficient resources to do that.

Information is used both to understand and to influence. It's a three-edged sword.


Interesting; can you expand on that?


Sure. For instance, I had a buddy that was sure Bernie would win the NY democratic primary since Bernie had such huge rallies and people were excited. I trusted the polls. We both made small bets on PredictIt - I even bet Clinton would win by at least a 15 point margin, since that's where the polls were. I won, and he now trusts poll aggregates more.


If you'd run a similar exercise for the 2015 UK General Election, the outcome would not have been as you desired. Famously, the polls were wrong and the Conservatives won a small majority. In fact, Sir Paddy Ashdown stated on live television that he'd "eat his hat" if the polls were wrong, an action I suspect he regretted.


Yet polls work most of the time. I lost money when Bernie won Michigan in a surprise upset, but that taught me to respect polling error. And I won money on the vast majority of my bets based on polling aggregates.


I think gambling like this provides incentives for people to follow up and review their predictions. Or to have their friends follow up in some cases :) Otherwise people never realize that they were wrong.

And once you know you are wrong, you can now start figuring out what you did wrong.


Heh. A new meaning to "put your money where your mouth is." Perhaps a corollary.


Isn't this the old meaning of "put your money where your mouth is"? What other meaning does it have?


I was trying to think of a more clever twist on it, like "put your mind where your money is."


I always interpreted it roughly as "Do what you say/preach".


That's roughly correct, but the reason it means that is this: if you really believe something, you should be willing to back up that belief with something that will cost you if you're wrong. A genuine belief that you honestly think is correct is something you should be willing to bet on, because you expect to win the bet.

People telling someone else to "put their money where their mouth is" are essentially asking them to prove the sincerity of their belief, or else the person saying it has an opposing sincere belief and would like to make some money off a bet over who is right.


Cool, you were being quite literal.

And I love it!


It's quite funny to hear yourself think "Yeah, this is confirming my own thoughts," and then stop and think about what you just read.


Confirmation biasception?


How can any list of cognitive biases be complete without the end-of-history illusion [1]? Given the number of young people I see with tattoos, it would have to be the most common illusion.

1. https://en.m.wikipedia.org/wiki/End-of-history_illusion


I'm fascinated by the graph Manoogian created. Does anyone know what specific tools were used to create it, or which could be used to create similar ontologies?

https://cdn-images-1.medium.com/max/2000/1*71TzKnr7bzXU_l_pU...

I'm working with a largish ontology of my own I'd like to present to 2-3 and possibly more levels of depth. GraphViz isn't cutting it.

(I'd asked Manoogian himself; he vaguely pointed at some R graphics tools, which is as far as I've gotten.)
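I don't know what Manoogian used either, but one low-effort option for a 2-3 level radial view is networkx's shell layout, which places each level of the hierarchy on its own concentric ring. This is a sketch only; the node names are placeholders, and a real bias codex would need hundreds of leaf nodes and proper label handling:

```python
import math
import networkx as nx

G = nx.Graph()
root = "cognitive bias"
categories = ["too much information", "not enough meaning"]
biases = {
    "too much information": ["anchoring", "confirmation bias"],
    "not enough meaning": ["halo effect", "stereotyping"],
}

# Build the ontology as a tree: root -> category -> bias.
G.add_node(root)
for cat in categories:
    G.add_edge(root, cat)
    for b in biases[cat]:
        G.add_edge(cat, b)

# shell_layout puts each list in nlist on its own concentric circle,
# approximating the radial "codex" look for a few levels of depth.
leaf_nodes = [b for bs in biases.values() for b in bs]
pos = nx.shell_layout(G, nlist=[[root], categories, leaf_nodes])

for node, (x, y) in pos.items():
    print(f"{node}: radius {math.hypot(x, y):.2f}")
```

The resulting `pos` dict can be fed to `nx.draw` (matplotlib) or exported to another renderer; for very large ontologies an interactive tool would still be needed.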


To summarize biases further, in one word: Incompetence.

Lab results confirming cognitive biases come from testing small groups of students (up to 200). Among other things, this means: (1) respondents have similar backgrounds, so we can't generalize; (2) respondents don't care about the outcomes; (3) the tests are synthetic. Plus publication bias and other standard issues.

These results are themselves a sort of confirmation bias.

Mistakes in real life happen when we don't know what we're doing. If a person can learn, he'll discover systematic mistakes. But that comes with domain experience, not cognitive science.


It's not that simple. Some cognitive biases, especially those related to shortcuts in decision-making, are more prevalent in smarter people.

Here's a good jumping off point: http://www.newyorker.com/tech/frontal-cortex/why-smart-peopl...

I highly recommend reading more of Daniel Kahneman's work if you find the above interesting.


Great find! Doctors are famous for lousy financial decisions, which I think proves the point.


I'm interested in the questionnaire they mention.


> If a person can learn, he'll discover systematic mistakes. But that comes with domain experience, not cognitive science.

Describing and categorizing the systematic mistakes people make is precisely what cognitive science does. The human mind is a domain of its own.


Of course someone would come into the comments and try to rationalize why this doesn't apply to him, and why he's a special snowflake who can do no wrong.


Limitation :)


To respond to myself,

Resource-constrained

(Are we counting hyphens as one word or two?? :) )


Granted, this might be useful, but I believe the ability to recognize these biases and fallacies can only be improved by experience. You can read or memorize what each bias is, but you might not be quick to recognize it in a discussion; it takes practice, not just a cheat sheet [0].

[0] https://en.wikipedia.org/wiki/List_of_cognitive_biases


First line of article: I’ve spent many years referencing Wikipedia’s list of cognitive biases[0] whenever I have a hunch that a certain type of thinking is an official bias but I can’t recall the name or details.

I don't think there's anything the article author mentioned that argues against improvement through experience, critical thinking, and sensitivity to the limitations of the human mind.

[0] actually links to https://en.wikipedia.org/wiki/List_of_cognitive_biases


This is excellent. It could be improved by adding a very short summary of each bias, which would help the reader drill down to a specific bias of interest.


If a one-sentence tooltip for each bias were added to the codex graphic on a webpage, and each bias were linked to its own page, the graphic would be more sensible and accessible.

And if each quadrant of the codex had a color-coded mouse target, you could pop up a menu of just that cluster of biases, sorted into whatever spectrum makes the most sense.


Though, if the article is to be believed, 'a one-sentence tooltip for each bias' would itself be highly incomplete and therefore subject to all sorts of biases in its interpretation and application ;)


And if "Not Enough Meaning" and "Too Much Information" were not so similar in colour. :( (At least to my eyes)


Rendering the graph as an SVG with embedded links might work.


The article starts off by complaining about the Wikipedia article, but as I was reading his post I kept thinking, "But this is exactly what was in the Wikipedia infographic!" ... aaand then I realized the infographic is a side effect of his blog post!

A little ah-hah moment for me. :)


Thanks for putting this together! As someone interested in cognitive biases, I wonder: how many of these effects have survived the recent replication crisis intact?


I learned a long time ago not to trust my memory. For example, when I'm trying to find a piece of text in a book, or a specific story in a newspaper, my memory might give me a clue, something like "it's on that page next to that red thing" or "there's a story on the opposite page about xyz." I never trust this anymore; it's a wild goose chase.

A good way to appreciate how biased your brain is: think of that time you were witnessing a wonderful sunset and decided to take a photo. Later you look at the photo and it doesn't look at all like you remember. Why? Because your brain was playing tricks on you. You have a built-in image filter in your brain that adjusts the image and your memory of it, whereas the camera sees it "objectively."

Once you become aware of all the cognitive biases you just get tired of listening to people talk (about anything really), when it's full of logical holes and anecdotes. In fact it becomes painful especially when listening to some electoral candidate / politician talk about stuff that might actually matter. sigh


>We notice flaws in others more easily than flaws in ourselves. Yes, before you see this entire article as a list of quirks that compromise how other people think, realize that you are also subject to these biases.

I found this section was written very pourly. /s ;]


This exact title/link has been posted a number of times (>5) in the past 1-2 months [1].

I'd be interested to know which cognitive bias is at work when the instance of the post today gets 480+ points -- but no instances of this post (same title, same url) in the past 1 month garnered more than 26 pts.

I am both new to HN, and legitimately curious. Perhaps the content of the site was improved dramatically?

edit: clarity.

[1] https://hn.algolia.com/?query=cognitive%20bias%20cheat%20she...


Great article, but what on earth is the point of the huge wheel chart? It's pretty, but I'm not sure what conclusions it helps me draw, other than sorting sources of bias by group/subgroup in a difficult-to-read manner.


The point is to put it on the wall of your office so that when people come in they will understand that you are an expert on human biases and be on notice not to try anything funny.


> they will understand that you are an expert on human biases

unless they correctly identify your hanging of the image (and implied expertise therefrom) as a fallacious appeal to internet-blog-authority!


I agree. However, the grouping of biases is pretty brilliant and useful.


I got tired reading this, but had gotten too far not to finish.


If you had really been tired, you wouldn't have kept reading.


One has to make a distinction between dialectic situations, where you are trying to get at the truth, vs. rhetorical situations, where the attempt is to convince others (often not the person you are speaking with) of the truth.

For example, trotting out cognitive biases in a rhetorical situation is often very effective, particularly when your opponent is operating in a dialectic mindset.

Know thyself, but also know thy situation.


Useful resource to review. There is always time to think about how we think. I know from experience that I always underestimate by 30% the time it takes to do things, be it gardening or coding. I measure it to find out. Cognitive bias is something that we all need to be aware of. Just give yourself a 30-second "CB check" before any decision and see what happens.


This is the third article on the front page that is running on medium.com infrastructure with that annoying banner in the footer.

Is that the new standard?


Which banner are you talking about? The persistent footer w/ SocMed link litter in it? Which I find annoying, frankly.


Yes, that one. It links to Medium so I think of it as a banner.


4.3 "We reduce events and lists to their key elements." I liked how this one relates to the cheat sheet itself.


Is there a dataset somewhere with sentences/paragraphs labeled with cognitive biases? The idea of a machine learning program auto-labeling college essays with faulty deductive logic is enticing. Then run the system backward to generate logically sound arguments, like the Yahoo NSFW detector post recently.
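A rough sketch of the labeling half of that idea: a tiny naive-Bayes-style text classifier that tags a sentence with a bias label. The two training examples are invented for illustration; I don't know of a real labeled corpus, which is exactly the missing piece.

```python
from collections import Counter
import math

def train(examples):
    """examples: list of (text, label) pairs -> per-label word counts."""
    counts = {}
    for text, label in examples:
        counts.setdefault(label, Counter()).update(text.lower().split())
    return counts

def predict(counts, text):
    """Score each label with add-one-smoothed log-likelihoods, pick the best."""
    words = text.lower().split()
    best, best_score = None, float("-inf")
    for label, c in counts.items():
        total = sum(c.values())
        score = sum(math.log((c[w] + 1) / (total + len(c))) for w in words)
        if score > best_score:
            best, best_score = label, score
    return best

model = train([
    ("everyone else already bought one so it must be good", "bandwagon"),
    ("we've spent so much already we can't stop now", "sunk cost"),
])
label = predict(model, "we've spent too much to stop now")
```

With a real dataset you'd swap this for a proper classifier, but the pipeline (labeled sentences in, bias tags out) is the same shape.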


I highly recommend "Charlie Munger On the Psychology of Human Misjudgement."

PDF transcription here http://www.rbcpa.com/mungerspeech_june_95.pdf

Explains a lot (he says, during Presidential Election season in the US)


This reminds me of what master persuader Scott Adams has been saying about "cognitive dissonance"[1] in some of his recent tweets.

[1] https://en.wikipedia.org/wiki/Cognitive_dissonance


His book is really interesting and an easy read. I guess it pays off to write things as simple as possible.

https://www.amazon.com/How-Fail-Almost-Everything-Still-eboo...

The one thing to take away from this election is that persuasion trumps everything else. It's scary and crazy!


Another thing to take away from this election is that Scott Adams turns out to be somewhat of a moron when discussing the topic of presidential politics.

I followed his arguments in the primaries and found them somewhat compelling, the idea that Trump was in better shape than people thought. But he's lately gone off the rails totally and completely.

And more to the point of this post, he completely misuses the phrase cognitive dissonance, having apparently defined it to mean something like "a way to tell people who disagree with me they don't know what they're talking about" rather than an actual psychological characteristic.


Lots of Clinton supporters here, with their powerful downvote clicking :)


There are so many cognitive biases that we see them on the street, in popular music, in homeless people, university professors, alcoholics, politicians... everyone exhibits more than a few of these.

So many, in fact, that I just look at the list, overwhelmed, and curl up in bed and suck my thumb.


Related to biases are fallacies in argument as illustrated (in a less serious but instructive manner) here: http://existentialcomics.com/comic/9


And of course the biggest cognitive bias: that knowing about cognitive biases will make you less of a monkey in a baseball cap. Just remember that at all times you are probably wrong, and you'll do okay.


For those interested in the topic, there's a really good book (also available as audiobook) that covers this in some depth, "Thinking, Fast and Slow" by Daniel Kahneman.


Nicely done.

I like to refer to it as "short-cut thinking" rather than cognitive bias, because people that might not know the term immediately understand what I'm getting at.


20 cognitive biases that screw up your decisions

http://i.imgur.com/czyJsjO.png


I like that "anchoring bias" only has one example.


Off topic: I'm developing a resistance to scrolling up. It has become common to significantly reduce the size of the viewport every time you go up.


Hurray Buster! Cool to see this article doing so well!


Can one get that graph in a non-rendered format, or is it only available as that blurry JPG?

I looked for it myself, but I can't find anything.


My suspicion is that cognitive biases have a lot to do with what it means to be human. In particular, I think the "language instinct" is a derangement of the ability to reason about uncertainty in a consistent way, one that makes language learning possible.


This is a 'cheat sheet': a short and handy reference for making a quick decision. The full reference is probably a few tons of books.

I feel that someone who just uses instinct will make faster and better decisions, even with all the biases and cognitive illusions.


All these cognitive biases actually tell you that your instincts are not working properly; they are biased. Sure, it will be faster (your biases and prejudices will guide your decision), but not necessarily better.

If you see a spider, your instinct is to kill it; that's not necessarily the best solution.


Looks like the wheel chart has already been added to the Wikipedia page... talk about circularity...


Want a poster of this in every meeting room, maybe sans brain picture in favor of Rodin's Thinker sculpture.


Insulate yourself from Machiavellianism (willingness to manipulate and deceive others), Narcissism (egotism and self-obsession), Psychopathy (the lack of remorse and empathy), Sadism (pleasure in the suffering of others)


It talks about the sunk cost fallacy but links to the entry on actual sunk costs, which at most has an argument about fixed vs. recurring sunk costs.


I think it's a slew of excuses we tell ourselves. We just don't care all that much.

It's all about filling your brain with the latest instant fad. People get bored if they don't get their fix of fake news or cat pictures. How would they even spend a few hours doing real research and actually trying to think?

Way too hard. The internet only exacerbates these issues. This blog is even a prime example. It's telling you what to think so you don't have to, and it matches the popular bias, so it's easy to just praise it and move on to the next insta-fad.


It does appear to be classic conditioning, despite the downvotes. I am concerned for the future of concentration & independent thought as I observe people of all ages around me who cannot stop fiddling with their gadgets... especially unnerving when they are operating 2000+lbs of plastic & steel and drifting into my lane.

