'The Art of Thinking Clearly' by Rolf Dobelli

This book is a list of 99 common thinking errors and cognitive biases. Some of these you’ve probably heard many times before, but many will likely be new. I found it a quick, fun, interesting read, but it has 3 major flaws:

  1. Because it’s just a list of 99 disconnected items, with no common “story” to tie them all together, you will forget the vast majority of it shortly after finishing the book.

  2. The book will tell you about the thinking errors, but not the solutions. Granted, there is value in being aware of the thinking errors in the first place, but without a concrete plan of how to avoid the errors, there isn’t much actionable to take away from the book. In short, don’t expect to be thinking all that much more clearly when you’re done reading.

  3. The book leans very heavily on a few other authors: especially Robert Cialdini (“Influence: The Psychology of Persuasion”), Daniel Kahneman (“Thinking, Fast and Slow”), and Nassim Nicholas Taleb (“The Black Swan”). Dobelli gives you the TLDR version of these other authors, which loses much of the nuance and value. My recommendation would be to skim Dobelli’s book, figure out which topics you find interesting, and go back to the original source material for a deeper, more fulfilling read.

Despite these problems, I still found a few fun ideas/thoughts/concepts that I took down as notes:

  • How to make people believe you can predict the stock market. First, email 50,000 people one stock prediction and a different 50,000 people the opposite prediction. A week later, one of those two predictions will be correct. Now repeat the process with the group where your “prediction” was correct: email 25,000 of them a new prediction, and the other 25,000 the opposite prediction. Keep repeating this process for several weeks, and at the end, a small group of people will have watched you make a seemingly impossible streak of correct predictions. These people will think you’re a genius and will readily hand you all their money to invest.
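
    The math behind the scam is just repeated halving: after n rounds, initial/2^n people have seen n consecutive “hits.” Here’s a minimal Python sketch of that arithmetic (the 100,000 total recipients come from the setup above; the ten-week run is an illustrative choice):

        def remaining_believers(initial=100_000, weeks=10):
            """Each week, keep only the half that got the 'correct' prediction."""
            group = initial
            for week in range(1, weeks + 1):
                group //= 2  # the other half saw a wrong call and is dropped
                print(f"week {week}: {group} people have seen {week} straight hits")
            return group

        remaining_believers()
        # After 10 weeks, ~97 people have watched you call the market correctly
        # ten times in a row -- a 1-in-2**10 streak, produced by construction.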

  • The concept of “social loafing.” When someone is working on something alone, they typically work harder than when working on the same thing in a group. For example, in a tug-of-war, the more people you add to the rope, the less hard each one pulls. The less your individual contribution is noticed, the less effort you put in. This is a critical lesson for management and team building.

  • Omission bias. Most people believe you are less “culpable” if you allow something bad to happen through inaction than if you cause it through action. For example, shooting someone is seen as worse than letting someone die. Building no new products and going out of business because the market changed is seen as less bad than trying to build a new product and failing.

  • Hyperbolic discounting. We value instant rewards far more than the same, or even a larger, reward that comes with any sort of delay. For example, if I let you pick between earning $1,000 12 months from now or $1,100 13 months from now, most people would take the latter. After all, what’s one more month after waiting 12? But if I let you pick between $1,000 right now and $1,100 one month from now, almost everyone would pick the former. It’s exactly the same one-month difference, but the possibility of getting something now carries a lot of weight.
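
    A common way to model this is a hyperbolic discount function, V = A / (1 + kD), where A is the amount and D the delay. A minimal Python sketch, assuming an illustrative discount rate of k = 0.2 per month (not a measured value), reproduces the preference flip from the example:

        def present_value(amount, delay_months, k=0.2):
            """Hyperbolic discounting: V = A / (1 + k * D)."""
            return amount / (1 + k * delay_months)

        # $1,000 in 12 months vs. $1,100 in 13 months: the larger, later reward wins.
        print(present_value(1000, 12), present_value(1100, 13))  # ~294 vs. ~306

        # $1,000 now vs. $1,100 in one month: the immediate reward wins.
        print(present_value(1000, 0), present_value(1100, 1))    # 1000 vs. ~917

        # Under exponential discounting (V = A * d**D) this flip can't happen:
        # 1100 * d**(D+1) > 1000 * d**D iff d > 10/11, independent of the delay D.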

  • The power of because. People will comply with a request much more readily if you give a reason (i.e., include a “because” in your speech), even if it’s not much of a reason at all. E.g., people were much more willing to let someone cut in line at a copy machine when they said “Could I cut in line because XXX” rather than just “Could I cut in line.” The XXX itself barely mattered: “because I’m late for class” worked more or less as well as “because I need to make copies” (i.e., a meaningless reason). The mere presence of “because” was the important part.

  • The Will Rogers phenomenon (AKA stage migration). It comes from his joke: “When the Okies left Oklahoma and moved to California, they raised the average intelligence level in both states.” Here’s an example where this sort of thing can cause problems in real life: cancer is often grouped into different stages (e.g., stage 1, stage 2, stage 3), and we measure average survival rates for each group (e.g., stage X patients survive on average Y years). It turns out that if we develop better cancer detection techniques that catch the disease even in people who otherwise seemed healthy, we’ll end up with more healthy people in stage 1. This will increase the average survival rate for patients in stage 1, even though treatment hasn’t actually improved in any way!
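
    To put numbers on the Okies joke: here’s a toy Python example (the survival times are made up) showing how reclassifying a single patient can raise both groups’ averages, even though no individual patient lives a day longer:

        def mean(xs):
            return sum(xs) / len(xs)

        stage_1 = [10, 8, 6]  # hypothetical survival times, in years
        stage_2 = [4, 3, 2]
        print(mean(stage_1), mean(stage_2))  # 8.0 and 3.0

        # Better imaging reclassifies the sickest stage-1 patient as stage 2,
        # where they are still healthier than everyone already in that group.
        stage_1.remove(6)
        stage_2.append(6)
        print(mean(stage_1), mean(stage_2))  # 9.0 and 3.75 -- both went up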

  • Effort justification. If we work hard for something, or suffer for it, we value it much more. This is one of the reasons hazing and initiation rituals are so prevalent in groups: the pain you go through to join the group makes you value the group much more. This also explains the “Ikea Effect,” where customers value their Ikea furniture more because of the effort they had to put in to assemble it. And this also partially explains the “not invented here” syndrome, where companies prefer their internal, home-spun solutions, simply because they took part in building them, and not because those solutions are actually better than the alternatives.

  • The sleeper effect. We forget the source of a message more quickly than the message itself. For example, you see a political campaign ad with a negative message about a candidate. Initially, this has little impact on your opinion of that candidate, as you know that message was paid for by the opposition, which is obviously biased. However, a while later, the negative message about the candidate is likely to stay in your mind, whereas you may no longer remember the biased source of the message, and therefore, it’ll start to affect your opinion. Advertising likely takes advantage of this effect too: when you first watch a commercial touting the benefits of a product, you largely ignore it, as you know the ad is obviously biased and trying to sell you something. But a while later, you’ll remember the product benefits, but not necessarily where you heard them, and you’re more likely to buy the product.

  • The confusion between risk and uncertainty. Risk is when you can predict the probability of various outcomes. Uncertainty is when those probabilities are totally unknown. For example, a die roll is risk (each face comes up with probability 1/6), while the chance that a brand-new technology upends your industry in ten years is uncertainty. We can calculate risk and make informed decisions about it; we cannot do the same with uncertainty.

  • The house money effect. People tend to categorize money, which makes no sense, as all money should be the same to us. For example, a blackjack player goes to Vegas with $500 and plays a very deliberate and disciplined strategy. But if they suddenly win $500, they might treat it as “house money” and start betting it wildly. This makes no sense, as that $500 is no different from any other $500, but we mentally put it in a different bucket. The same happens with investors who suddenly get a big payout and, instead of following their usual, disciplined investing approach, buy a bunch of high-risk stocks. This is also why many services give you “free credits” when you first start: you’ll end up spending those free credits differently than you would your normal money, which gets you used to spending more on that service.

  • The idea of strategic misrepresentation. That is, lying that is socially acceptable. For example, women wearing makeup, a rich person driving a fancy sports car, or a contractor promising a short timeline just to get the deal signed.

  • The effect of TODO lists and planning. If you have a long list of TODOs on your mind, it leads to a lot of anxiety. It will actually be hard to focus on anything else until those TODOs are all done… Except in one case: if you come up with a clear, solid plan for getting those TODOs done, studies show that it significantly reduces anxiety and lets you clear your mind.

Rating: 3 stars