Heuristics and Cognitive Biases

1. Introduction

I wrote an article titled Thoughts, Emotions, Behaviour and Cognitive Distortions – A Primer, in which I spent considerable time on the important concept of cognitive distortions, explaining what they are, how they cause negative emotions and so on. I also mentioned that they are the more severe forms of thinking disturbances. In this article, we will look at another type of thinking error.

Cognitive distortions are closely related to another phenomenon of the human mind called cognitive biases – the human tendency to make systematic errors of perception, interpretation and judgment under certain circumstances. Please note that the term used is “systematic” and not “abnormal” or “irrational”. It seems that we are, by nature, hard-wired for such mental errors! Cognitive biases can be thought of as milder forms of thinking disturbances, accompanied by much less prominent negative emotional components. But milder does not mean less important: many cognitive biases profoundly affect our interpretation of the world around us and our resultant behaviour – mostly, in my opinion, for the worse. It is therefore very important for us to be aware of such errors in our own thinking, and such awareness will surely make us strive to be better human beings.

2. Historical Perspective

A historical note first – the fact that we may be riddled with irrationality in our thinking has been suspected ever since rational thought began!

Socrates, the father of Western philosophy, said, Gnothi seauton: Know thyself. The following passage about Socrates is taken from the famous and must-read book, The Story of Philosophy by Will Durant –

“…agnosticism…was the starting point of his philosophy – “One thing only I know, and that is that I know nothing.” Philosophy begins when one learns to doubt – particularly to doubt one’s cherished beliefs, one’s dogmas and one’s axioms. Who knows how these cherished beliefs became certainties with us, and whether some secret wish did not furtively beget them, clothing desire in the dress of thought? There is no real philosophy until the mind turns round and examines itself…”

Another noteworthy character in the history of thinking errors is the great 11th century thinker Ibn al-Haytham (known in the West as Alhazen). He was the first person ever to set down the rules of science. He created an error-correcting mechanism, a systematic and relentless way to sift out misconceptions in our thinking. The following, which is attributed to Ibn al-Haytham, has been transcribed from the must-watch television series, Cosmos: A Spacetime Odyssey –

“Finding Truth is difficult and the road to it is rough. As seekers of the Truth, you will be wise to withhold judgment and not simply put your trust in the writings of the ancients. You must question and critically examine those writings from every side. You must submit only to argument and experiment and not to the sayings of any person, for every human being is vulnerable to all kinds of imperfections. As seekers after Truth, we must also suspect and question our own ideas as we perform our investigations, to avoid falling into prejudice or careless thinking. Take this course, and the Truth will be revealed to you.”

Another noteworthy attempt to characterise and classify mental biases was made by the great Sir Francis Bacon, the 16th–17th century English philosopher, visionary and statesman. In his classic magnum opus, The New Organon, he wrote that if we really want to understand the secrets of Nature through science, we have to clear our minds of mental errors; Sir Francis called such mental errors “Idols”. He further classified these mental errors into –

– Idols of the tribe: mental errors that are common to all human beings, for example, the innate human tendency to impose order or cause and effect on events when none exists (see below).

– Idols of the cave: mental errors that are peculiar to an individual human being, as part of his character as formed by nature and nurture. For example, some minds are analytic whereas others are synthetic.

– Idols of the market-place: mental errors that arise from a human being’s interactions with others in society. These include errors imposed by the use of language.

– Idols of the theatre: mental errors that arise out of the various schools and dogmas of philosophy. For example, Plato’s philosophy represents Plato more than the Truth it seeks to convey.

It is amazing that, nearly 400 years ago, a human being could look at the functioning of the human mind and point out systematic errors in the human thought process! My admiration and respect for Sir Francis Bacon have only grown ever since. Just take a look at this passage from The New Organon:

“…the human understanding when any proposition has been once laid down (either from general admission and belief, or from the pleasure it affords), forces everything else to add fresh support and confirmation; and although most cogent and abundant instances may exist to the contrary, yet either does not observe, or despises them, or it gets rid of and rejects them by some distinction, with violent and injurious prejudice, rather than sacrifice the authority of its first conclusions…All superstition is much the same, whether it be that of astrology, dreams, omens, retributive judgment, or the like, in all of which the deluded believers observe events which are fulfilled, but neglect and pass over their failure, though it be much more common…For what a man had rather were true, he more readily believes.”

After you have finished this article, come back and read this passage; you will realise that it is an amazingly accurate description of the so-called confirmation bias, written more than three hundred years before cognitive psychology itself emerged as a scientific discipline! Cognitive psychology, by the way, is the relatively new branch of science (founded in the 1960s and ’70s) in which such biases are identified and studied.

3. Heuristics

Let us now get back to cognitive biases. Cognitive biases usually result from information-processing short-cuts of the normal mind called heuristics. A heuristic method is a mental mechanism used to arrive at a solution rapidly, based on readily accessible but incomplete information; such a mechanism may not be the correct one but rather an approximation – like an educated guess or a rule of thumb. Using the correct method would necessitate additional information, which may not be possible or practical to obtain, especially when time is of the essence.

Confused? Well, it is not as esoteric as its fancy name suggests. We use heuristics every day in our lives. One of the most common heuristics is the ‘trial and error’ heuristic. Let me give you an example: Suppose you want to loosen a nut, and you have a set of spanners. The ideal solution would be to measure the size of the nut with callipers and then choose the right size of spanner. But the chances are you won’t have callipers at hand, and therefore you won’t know the correct size of spanner to use (incomplete information), unless you are an expert. So what will you do? Rather than thinking of a way to find a set of callipers (which entails additional time and effort), you will obviously eyeball the approximate size of the nut and employ the trial and error heuristic using the available set of spanners (readily accessible information). If the spanner you try first is too small, you will try the next bigger size, and vice versa, till you get to the right size and thus successfully loosen the nut. The process you used is not the ideal one but rather an approximation – but who cares? It works, and it is much quicker.

“But why is that not the ideal method?” you may ask. “It looks like this is what every average human being would do!” You are correct. We use heuristics so commonly, so automatically, that we mistake heuristics for the correct method. Think about this: what if the correct size of spanner is the one that is missing from the set? If you use the trial and error heuristic, the chances are that you will have tried a few spanners before you realise that the right size is missing; after wasting time and energy (sometimes considerable), you are back to square one; you are stuck. But if you had used the ideal method of measuring the size of the nut, you would have identified that potential obstacle quite early on. Then again, most of us have a set of spanners, but very few of us have precision callipers handy.
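For those who like to see the idea in code, here is a minimal sketch, in Python, of the two approaches; the spanner sizes, the missing 13 mm spanner and the function names are all hypothetical, chosen only to illustrate the trade-off described above:

```python
# Hypothetical scenario: a 13 mm nut, and a spanner set missing the 13 mm size.
SPANNERS = [8, 10, 12, 14, 17]   # sizes in mm that we actually own
NUT = 13                         # the true size, unknown to us at the outset

def with_callipers(nut, spanners):
    """The 'ideal' method: measure the nut first, then pick the matching spanner.
    A missing spanner is discovered after a single measurement."""
    return nut if nut in spanners else None

def trial_and_error(nut, spanners):
    """The heuristic: try spanners in increasing size until one fits.
    A missing spanner is discovered only after exhausting the whole set."""
    attempts = 0
    for size in sorted(spanners):
        attempts += 1
        if size == nut:
            return size, attempts    # success, and the number of tries it took
    return None, attempts            # failure, but only after trying everything

print(with_callipers(NUT, SPANNERS))   # None  (known after one measurement)
print(trial_and_error(NUT, SPANNERS))  # (None, 5)  (known only after 5 tries)
```

Note that when the right spanner is present, trial and error usually succeeds within a handful of attempts, which is exactly why the heuristic feels like the correct method in everyday use; the two approaches only come apart in the unlucky case.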

It appears that many heuristics were hard-coded into our genes during evolution (the “nature” component), since the ability to arrive rapidly at an approximate but workable solution in the face of incomplete information must surely have given us a survival advantage; it also appears that some heuristics are learned, mostly from the environment in which we grow up (the “nurture” component). In other words, all of us are hard-wired by evolution with a set of rules of thumb called heuristics, upon which various environmental and cultural factors act – just as with the language module.

Thus, heuristics are information-processing and problem-solving short-cuts; they are approximations; they can lead to errors; but most of the time they work very well, especially when you have to decide and act quickly.

4. Heuristics to Cognitive Biases

I mentioned that heuristics in humans can lead to cognitive biases. Let us look at a specific heuristic: the availability heuristic is a mental process by which people judge the probabilities of events based on how easily those events can be visualised or retrieved from memory; in other words, “If you can think of it, it must be important”. This availability heuristic can lead to several cognitive biases.

One such bias is “hindsight bias”: because events which actually occurred are easier to imagine than the other possible events which did not, people often overestimate the probabilities they previously attached to events that later happened; in other words, “I knew this would happen!”

Another bias that can arise from the availability heuristic is the “curse of knowledge” bias – people who know a lot find it hard to imagine how little others know! In other words, “How can people not understand something as simple and straightforward as the Theory of Evolution by Natural Selection?”

I have listed below some (only some) well-known cognitive biases, mainly to show how common they are and how much they are part and parcel of us, often without our even being aware of them:

Loss aversion – the magnitude of pain from a loss is much greater than the magnitude of joy from an equal-sized gain (see the sketch after this list).

Gambler’s fallacy – people expect tails to be “due” after a coin has landed heads three times in a row, even though each flip is independent.

Confirmation bias – the tendency to accept or encode events which confirm one’s beliefs, and to reject or despise those that are contrary. In my opinion, this is one of the most dangerous of biases, misleading people into all sorts of wrong beliefs, shutting off their minds, denying them the joy of new knowledge and preventing the consequent enrichment of their lives. I plan to write a whole article about it.

Narrative fallacy – the tendency to organise past events into a narrative sequence of causes and effects, when in reality no such links exist.

Round-Trip fallacy – absence of evidence is mistaken for evidence of absence.

Framing effect – how choices are presented to an individual profoundly affects what choice is made.

Endowment effect – people overestimate the value of an asset they own, over and above its actual market value.
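To make loss aversion concrete, here is a minimal sketch, in Python, of the value function from Kahneman and Tversky’s prospect theory, one standard way this bias is modelled; the parameter values below are the median estimates reported by Tversky and Kahneman (1992), and the code is only an illustration, not a definitive psychological model:

```python
# A minimal sketch of the prospect-theory value function v(x).
# Parameters are the Tversky & Kahneman (1992) median estimates:
# losses loom larger than gains by a factor of lambda ~ 2.25.
ALPHA = 0.88    # curvature for gains (diminishing sensitivity)
BETA = 0.88     # curvature for losses
LAMBDA = 2.25   # loss-aversion coefficient

def value(x):
    """Subjective value of a gain (x > 0) or a loss (x < 0)."""
    if x >= 0:
        return x ** ALPHA
    return -LAMBDA * ((-x) ** BETA)

# The joy of gaining 100 versus the pain of losing 100:
print(value(100))    # ~ 57.5   (subjective value of the gain)
print(value(-100))   # ~ -129.4 (the loss hurts about 2.25 times as much)
```

On these numbers, losing 100 feels roughly 2.25 times as bad as gaining 100 feels good, which is exactly the asymmetry described in the loss aversion item above.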

Most heuristic methods and some cognitive biases are adaptive, helping people make decisions when facing complex problems or incomplete information. In some instances, cognitive biases result from mal-adaptation of heuristics (i.e., a heuristic applied outside its limits of applicability).

There is a school of philosophers and thinkers who insist that we should rid ourselves of mal-adaptive cognitive biases such as confirmation bias and narrative fallacy because they are no longer relevant to the world we live in; they helped us to survive and make sense of the world since early human history, but the world has changed drastically over the past century or so, and many of these cognitive biases are becoming irrelevant, even downright harmful. I fully agree with them.

5. Heuristics / Cognitive Biases to Cognitive Distortions

My hunch is that cognitive distortions are more extreme forms of mal-adaptive heuristics and cognitive biases. I think it is possible to trace the origin of the 10 cognitive distortions described by Dr. Burns to misapplied heuristics (check this article for the complete list of 10 cognitive distortions). For example, it is possible that the so-called representativeness heuristic (i.e., the tendency to judge the nature of a group based on a single representative or event) may be the basis for the all-or-nothing and overgeneralisation types of cognitive distortions; the availability heuristic is probably misapplied to produce the magnification and minimisation types of cognitive distortions. Knowledge about the underlying heuristics may give us a firmer foundation for formulating better rational responses, thus helping us overcome crippling cognitive distortions. The following figure shows the possible progression from heuristics to cognitive distortions:

[Figure: the possible progression from heuristics, through cognitive biases, to cognitive distortions]

6. Summary and Conclusions

In summary, heuristics are “quick and dirty” mental short-cuts which help us to make decisions rapidly in the face of incomplete information. Cognitive biases are systematic errors in human thinking under certain circumstances. Many cognitive biases probably have their origins in heuristics, and at least some of them are mal-adaptive. Cognitive distortions are more severe forms of thinking disturbances, accompanied by a strong negative emotional component; they probably have their origins in mal-adaptive heuristics and cognitive biases. Identification and characterisation of cognitive biases and distortions can help us to think better and to become better human beings.