

Mind Traps : The Ultimate Guide

February 4,

Survivorship Bias

  • Survivorship bias is a cognitive mind trap where people tend to focus on the things that have survived a process while overlooking the ones that failed.
  • This bias often results from people looking only at samples that have already survived some kind of filtering process, such as successful companies, and not noticing the failures or other results which led to the current sample.
  • An example of this is Navy researchers during World War II deciding where to armor planes based on the planes that had survived their missions, ignoring the planes that had been shot down.
  • To combat this bias, be aware of it and try to look for the data that is not present to more accurately understand the situation.

Self-Serving Bias

  • Self-serving bias is a cognitive mind trap in which, when explaining successes and failures, we tend to attribute our successes to our own actions and our failures to external forces.
  • This bias can often lead to people being over-optimistic about their chances at success and underestimating the challenge of a task.
  • To combat this, first become aware of the bias and practice humility. Always try to seek feedback and accountability from others.

Fundamental Attribution Error

  • The fundamental attribution error is a cognitive mind trap where, when judging the behavior of others, we tend to attribute their actions to their internal character while overlooking any external factors that may have contributed to their decisions.
  • For example, if a coworker is late to work, we may attribute it to laziness, while when we are late ourselves we are more likely to blame external factors, such as traffic or the weather.
  • To combat this, try to stay aware of the bias and consider any external factors when judging the decisions of others.

Hindsight Bias

  • Hindsight bias is commonly known as the “I told you so” phenomenon.
  • It is a memory distortion that leads people to believe their past opinions or beliefs were more in line with the actual outcome of an event than they really were.
  • It leads people to be overconfident in explaining why something happened and can cause blame to be directed at good decisions that happened to have bad results.
  • We remember events as more obvious and simple in hindsight than they may have seemed at the time.

Availability Bias

  • Availability bias is the tendency to base judgments on examples that are easily recalled, such as events seen recently in the media.
  • It leads individuals to misjudge the likelihood of various events occurring: they overestimate the chances of an event if it is easily remembered or recalled, and underestimate the chances if it is not.
  • These false perceptions can lead to irrational fears as well as poor decision making.

Availability Cascade

  • An availability cascade is a self-sustaining chain of events, which may start from media reports of a relatively minor event and lead up to mainstream public panic and/or large-scale government intervention.
  • People begin to adopt the belief, not because it’s true or prevalent, but because it’s popular. Put more simply, an availability cascade is an enormous overreaction to a minor problem.
  • Whenever we encounter minor risks, we either ignore them completely or wildly exaggerate them, with nothing in between.
  • Examples include rumors of razor blades in candy leading to a nationwide fear of homemade Halloween treats.
  • It is caused by availability bias and hinders our ability to accurately assess the risk of certain events.
  • An example of an availability cascade is a parent waiting up at night for their teenage child to come home.
  • They might experience a self-sustaining chain of thoughts that grows into more thoughts about something disastrous having happened, which can eventually lead to panic.
  • When consuming news sources, it is important to take into account whether or not it is an objective piece of journalism, or just an availability cascade.

The Sunk Cost Fallacy

  • The sunk cost fallacy occurs when an individual continues to invest more time, energy and resources into an endeavor because they have already invested a significant amount and feel that it is irrational to quit.
  • An example of this is two people realizing after 25 minutes that the movie they are watching is terrible, yet staying because they feel that the money they paid for the tickets is wasted if they leave.
  • This analogy can be applied to relationships, investments and many other life choices.
  • To avoid the sunk cost fallacy, one should focus more on the current status and forecast rather than the past investments of time, energy or money.

The Framing Effect

  • The framing effect is the tendency to draw different conclusions from the same information depending on how it is presented.
  • For example, when asked to choose between a product labeled “99% fat free” and one labeled “1% fat,” people will usually choose the 99% fat free option despite the two being identical.
  • This example can be used to illustrate how companies often frame their offers to make them more desirable.
  • Additionally, the framing effect can be used to demonstrate the power of decoys, where the introduction of a third option changes the perception of the two existing options and influences the decision making of the consumer.

The Clustering Illusion

  • The clustering illusion is when individuals perceive a pattern out of random data points.
  • Common examples are seeing faces in the clouds, Jesus on toast, or perceiving faces in rock formations on Mars.
  • While the clustering illusion is mostly harmless, it can introduce bias into investments where investors sense patterns in the stock market where none actually exist.
  • Remembering that we are prone to seeing patterns in noise can help us separate luck from logic when making important decisions.
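One way to see why random data produces apparent clusters is the following sketch: scatter points uniformly at random into a grid and the occupancy comes out uneven, with some cells holding several points and many holding none, even though no clustering process is at work. The grid size and point count here are illustrative assumptions, not figures from the article.

```python
import random
from collections import Counter

rng = random.Random(1)
GRID, POINTS = 10, 100  # 10x10 cells, 100 random points (1 per cell on average)

# Drop each point into a uniformly random cell.
cells = Counter((rng.randrange(GRID), rng.randrange(GRID)) for _ in range(POINTS))

counts = Counter(cells.values())      # how many cells hold exactly k points
empty = GRID * GRID - len(cells)      # cells that received no points at all

print("cells with k points:", dict(sorted(counts.items())))
print("empty cells:", empty)
```

Even with one point per cell on average, roughly a third of the cells end up empty while a few hold three or more points; an observer inclined to see patterns would read those dense cells as meaningful clusters.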

Barnum Effect

  • The Barnum effect is a phenomenon where people readily accept vague, generalized statements as accurate descriptions of their own personality, even though the statements could apply to a wide range of people.
  • Bertram Forer conducted an experiment in 1948 in which he gave every participant the same set of generalized statements taken from astrology magazines, telling each that he had personally written the description just for them.
  • 86% of the people found that the text was an accurate description of their personality.
  • Be wary of horoscopes, palm reading and psychics due to the Barnum Effect.

Exponential Growth

  • Exponential growth is unintuitive to humans because our ancestors had little need to grasp it; their experiences were mainly linear.
  • If one grain of rice is placed on the first square of a chessboard and the number of grains doubles on each subsequent square, about 18.4 quintillion grains are needed to fill the board. This illustrates how quickly a small quantity can grow when subject to exponential growth.
  • To approximate the time it takes a quantity to double, divide 70 by the percentage growth rate (the rule of 70). For example: a 10% return on investment doubles in approximately 7 years; 8% inflation halves the value of money in approximately 8.75 years; and a country with 5% population growth doubles its population in approximately 14 years.

Pattern Recognition

  • Pattern recognition is our tendency to perceive patterns, even in data that is completely random.
  • People often come up with laws or rules to explain patterns of letters, such as a string of X’s and O’s, even when it is completely random.
  • To overcome this sensitivity, regain your skepticism and ask yourself whether the pattern is more likely to be pure chance, and whether you are falling for the clustering illusion.
