🌱 Some cognitive biases are rational under some assumptions

July 2024

Cognitive biases have a bad reputation, and rightfully so. They can lead to predictably irrational decisions and beliefs. Unfortunately, even being aware of a bias does little to alleviate its effects, which can make it frustrating that our brains are designed this badly. But I think it's worth taking the time to appreciate that the brain is actually doing a pretty good job of making sense of the world and constructing correct beliefs.

Many cognitive biases come from the brain taking mental shortcuts: fast, efficient inferences that reach a quick conclusion. We will see that these shortcuts are predictably flawed, but also completely rational under certain assumptions. It's also a small miracle that we can overcome cognitive biases simply by taking a step back and thinking carefully through the problem.


The illusory truth effect

If we hear the same information multiple times, we are more likely to believe it's true. This is rational if the sources of the information are independent: if two people independently report the same thing, the chance of both being randomly wrong is lower than the chance of one being wrong. The problem occurs when sources are highly correlated, which is common today because information is widely shared. So we end up with scenarios where both sources cite the same erroneous source.

Useful when sources are independent. Breaks down when sources are correlated.
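The independence argument can be sketched numerically (the 20% error rate below is invented for illustration):

```python
def p_all_wrong(p_wrong: float, n_sources: int) -> float:
    """Chance that n independent sources are all wrong about the same claim,
    assuming each source is wrong with probability p_wrong."""
    return p_wrong ** n_sources

# One source that is wrong 20% of the time:
one = p_all_wrong(0.2, 1)   # 0.2
# Two independent sources: both wrong only 4% of the time,
# so repetition is genuine evidence.
two = p_all_wrong(0.2, 2)   # 0.2 * 0.2 = 0.04
# But if both sources copy the same erroneous origin, they are perfectly
# correlated: the chance of error stays at 0.2, and hearing the claim
# a second time adds no evidence at all.
correlated = 0.2
```

Under independence, each repetition multiplies the chance of error down; under perfect correlation, it multiplies it by nothing.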

Confirmation bias

The tendency to believe information that matches our existing worldview and to disbelieve information that contradicts it. This can be seen as an attempt to implement Bayesian inference with a non-uniform prior. It works really well if most of our beliefs are true, but fails when we already hold many false beliefs, or when the beliefs related to the new information are wrong.

The part of confirmation bias where we seek out information that confirms our existing beliefs is less rational, in the sense that it increases the risk of becoming entrenched in false beliefs, but it also allows us to efficiently weed out hopelessly wrong information. For example, no astrophysicist will spend much time studying flat-Earth theories when building astronomical models.

Useful when prior beliefs are mostly correct. Breaks down when prior beliefs are wrong.
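The prior-weighting argument can be sketched with Bayes' rule (the probabilities below are invented for illustration):

```python
def bayes_update(prior: float, p_ev_if_true: float, p_ev_if_false: float) -> float:
    """Posterior probability of a belief after seeing evidence, via Bayes' rule."""
    numerator = p_ev_if_true * prior
    return numerator / (numerator + p_ev_if_false * (1 - prior))

# With a strong prior (99% sure), even evidence that is three times more
# likely if the belief is false barely dents it:
strong = bayes_update(0.99, 0.3, 0.9)   # ~0.97
# The same evidence against an uncertain prior moves the belief a lot:
weak = bayes_update(0.5, 0.3, 0.9)      # 0.25
```

If the prior is mostly right, discounting contradicting evidence this way is exactly what Bayes' rule prescribes; if the prior is wrong, the same arithmetic produces entrenchment.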
