
If you’re like me, you’ve probably asked yourself, multiple times, something along the lines of “how can that person think that way?” or “how did they arrive at such a weird/ridiculous/wrong/stupid conclusion?” We’ve all thought it, and it’s absolutely certain that someone else has thought the same about us. How ideas/opinions/thoughts emerge, and how they are formed and shaped over time and through human experience, is incredibly fascinating. As soon as I saw the title of this book and the blurb, I was hooked.
Written by Woo-kyung Ahn, a professor of psychology at Yale, Thinking 101 takes a look at some of the most common thinking processes, the traps we fall into, and the tools we can use to avoid them. Overall, there are eight chapters, each focusing on one aspect of our thought processes. You have probably heard or read about most of them before, but it’s still worthwhile to take a tour through them. Let’s list them first:
- Fluency
- Confirmation Bias
- Causal Attribution
- Perils of Examples
- Negativity Bias
- Biased Interpretation
- Perspective Taking
- Delayed Gratification
The chapter on fluency focuses on how we perceive skilled individuals in their field of work. The easiest example I can think of is watching professional athletes. These people are so skilled, so unbelievably athletic, that they make everything look easy. You might have heard about a poll that came out a couple of months ago in which a majority of US tennis players said they could take a game off an ATP player (i.e. one of the best players in the world). Sure, when you watch professional players on TV, they just play tennis, a bit faster, but it’s still tennis. It doesn’t look hard. Obviously, a random tennis player has basically no chance of taking a single point off a professional, let alone a game. In other domains too, people tend to overvalue their own knowledge and skillset. One aspect she mentions that is very near and dear to my heart is metacognition: knowing what you know and what you do not know.
> When we see final products that look fluent, masterful, or just perfectly normal, like a lofty soufflé or a person in good shape, we make the mistake of believing the process that led to those results must have also been fluent, smooth, and easy.
Confirmation bias is a well-known trap we fall into. In short, when we already hold some sort of opinion or expectation, we are primed to confirm it as soon as humanly possible, and to potentially ignore whatever contradicts it. I encourage you to read (or watch YouTube videos) about the 2-4-6 problem; it’s incredibly interesting. Another well-known example she writes about is the Mozart effect, the idea that listening to Mozart boosts intelligence in some way.
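For the curious: in Wason’s 2-4-6 task, you are told that the triple 2-4-6 satisfies a hidden rule, and you must discover the rule by proposing further triples and getting yes/no feedback. The actual rule is simply “any ascending sequence”, but most people guess something narrower, like “numbers increasing by 2”, and then only propose triples that fit their guess, so the guess keeps getting “confirmed”. Here is a minimal Python sketch of that dynamic; the rule, the hypothesis, and the test triples are my own illustrative choices, not taken from the book.

```python
# Wason's 2-4-6 task in miniature: a hidden rule, a too-narrow
# hypothesis, and two testing strategies.

def hidden_rule(a, b, c):
    # The experimenter's actual rule: any strictly ascending triple.
    return a < b < c

def my_hypothesis(a, b, c):
    # A typical first guess: "numbers increasing by 2".
    return b - a == 2 and c - b == 2

# Confirmation-seeking: only test triples the hypothesis predicts are "yes".
confirming_tests = [(2, 4, 6), (10, 12, 14), (1, 3, 5)]
# Falsification-seeking: also test triples the hypothesis predicts are "no".
disconfirming_tests = [(1, 2, 3), (5, 10, 100), (3, 2, 1)]

for triple in confirming_tests:
    # Every answer is "yes", so the narrow hypothesis looks confirmed.
    print(triple, "->", hidden_rule(*triple))

for triple in disconfirming_tests:
    # (1, 2, 3) and (5, 10, 100) also get "yes", revealing that the
    # hypothesis is too narrow; only these tests expose the error.
    print(triple, "->", hidden_rule(*triple))
```

Only the disconfirming tests, the ones your hypothesis says should fail, can reveal that the guess is wrong; that is exactly the kind of test confirmation bias steers us away from.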
> Confirmation bias might be a side effect of meeting our need to satisfice, stopping our search when it’s good enough in a world that has boundless choices. Doing that can make us happier and it can also be more adaptive. Nonetheless, the problem with confirmation bias is that we continue to use it even when it is maladaptive and gives us wrong answers.
Causal attribution speaks for itself: our tendency to assign a cause to an event, which can lead us to perceive things in a manner that does not accurately reflect reality. It has many potential consequences, like wrongfully assigning blame, pinpointing actions rather than inaction, or trying to solve an issue without understanding its real root.
The chapter on the perils of examples focuses on the danger of extrapolating generalities from a single event, thereby ignoring statistical realities like the law of large numbers and regression to the mean.
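Regression to the mean in particular is easy to see with a tiny simulation. The sketch below is my own illustration, not from the book: performance is modelled as stable skill plus random luck, and the top scorers of one round land, on average, closer to the mean in the next.

```python
# Regression to the mean: performance = stable skill + random luck.
# The round-one top performers regress toward the average in round two.
import random

random.seed(0)
skills = [random.gauss(100, 10) for _ in range(10_000)]   # stable ability
round1 = [s + random.gauss(0, 10) for s in skills]        # skill + luck
round2 = [s + random.gauss(0, 10) for s in skills]        # fresh luck draw

# Take the top 1% of round-one performers and compare their averages.
top = sorted(range(len(skills)), key=lambda i: round1[i], reverse=True)[:100]
print(sum(round1[i] for i in top) / 100)  # well above 100: skill AND good luck
print(sum(round2[i] for i in top) / 100)  # closer to 100: the luck doesn't repeat
```

The round-one stars weren’t only skilled; they were also lucky, and the luck doesn’t carry over. Judging them from that single round, one striking example, is exactly the trap.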
Negativity bias is all about how we give the “bad stuff” more weight than the “good stuff”. It’s a bit vague, but it touches on many different aspects. For example, we can be swayed by a single negative review of an item online and ignore the thousands of positive ones. We also put a lot more weight on the negative events that happen to us, often letting them overshadow all the positive ones. In the economics realm, loss aversion can lead to irrational behaviour.
Biased interpretation, while more complex and varied, can be summed up by the following quote from the book: “if participants start out […] and develop the initial belief that A causes B, that belief is imprinted and does not get revised, even after seeing the full pattern of data that clearly indicates that the causal association between A and B is false.”
I have often claimed that the ability to put yourself in someone else’s shoes is a very important aspect of living in a society, so I was very interested to read what the author wrote about that very idea. It became perhaps my favourite chapter, especially the idea of the curse of knowledge, which as a teacher I have to fight against on a daily basis.
> That is the curse of knowledge: once you know something, you have trouble fully taking the perspective of someone who doesn’t know it, even if you are an adult.

> In fact, smart people who know a lot are not necessarily good teachers or coaches, partly because of the curse of knowledge.
We’ve all heard about delayed gratification and the famous marshmallow test. As you can probably guess, the reality is much more complicated, as various factors and circumstances come into play. One very important factor is uncertainty; it can have a major impact on our judgment.
> Whenever we are faced with a choice that involves delayed gratification, there is the possibility that our preference for certainty (getting it now) over uncertainty (getting it in the future) will be a factor.
I really, really enjoyed the book. It’s not very long, yet the author manages to get her points across, accompanied by many supporting studies. It’s an easy read, one that will likely make you think twice about many ideas you believed to be obviously right or wrong. Ideally, it would help the reader become more patient and more aware of how our thoughts are born and how they evolve over time. It can be uncomfortable and feel a wee bit unnatural at times. Let’s face it, it’s much easier to stay camped in our beliefs and stick to what we think. We do not like being challenged or exposed to ideas that can fill us with doubts. As the author aptly puts it:
> That is why it is so important for society that we have conversations with people who hold different views than we do. We tend to be drawn to people who share our views. When we stay in our bubbles, we do not talk about the impacts of the policies we support, because we assume that our allies already know them. It’s only when we are forced to explain the consequences of the positions we hold to someone who does not share our views that we can begin to recognize the holes in our knowledge and the flaws in our reasoning, and work to fix them.