Your Brain Is Not a Logic Machine

We like to believe our decisions are rational — that we weigh the facts, consider the options, and choose the best course of action. Decades of behavioral research suggest otherwise. The human brain relies heavily on mental shortcuts known as heuristics, and while these shortcuts are often useful, they also produce cognitive biases: predictable, systematic errors in judgment.

The good news: once you understand these biases, you can catch them in action — and correct course before they cost you.

Confirmation Bias: Seeing What You Already Believe

Confirmation bias is the tendency to seek out, interpret, and remember information in a way that confirms what you already think. If you believe a particular diet works, you'll notice the success stories and ignore the contradictory evidence. If you've decided a colleague is difficult, you'll interpret neutral behavior through that lens.

How to counter it: Actively seek out the strongest argument against your current position before making important decisions. Ask: "What evidence would change my mind?"

The Sunk Cost Fallacy: Throwing Good Money (and Time) After Bad

We irrationally overvalue things we've already invested in — time, money, effort — and keep committing to them even when logic says to stop. Staying in a failing business, finishing a terrible book, holding a losing investment "until it recovers" — these are all examples of sunk cost thinking.

The money and time already spent are gone regardless of what you do next. The only rational question is: "Given where I am right now, what's the best path forward?"

The Availability Heuristic: Overweighting Memorable Events

We judge the likelihood of something based on how easily an example comes to mind. Plane crashes get massive media coverage, so people dramatically overestimate the danger of flying. Car accidents are common but rarely covered, so we underestimate the risk of driving.

  • We overestimate risks that are dramatic and memorable
  • We underestimate risks that are mundane and chronic
  • This skews our decisions in health, safety, investing, and politics

The Dunning-Kruger Effect: Confidence Without Competence

People with limited knowledge in a domain tend to overestimate their competence — they don't know enough to know what they don't know. Conversely, genuine experts often underestimate their abilities relative to others.

This is why beginners are often the most opinionated and experts are often the most cautious. Recognizing this in yourself requires deliberate intellectual humility — regularly asking "What am I missing here?"

Anchoring Bias: The First Number Wins

The first piece of information we receive disproportionately shapes our subsequent judgments. In salary negotiations, whoever names a number first tends to anchor the conversation. In shopping, a "was $200, now $120" price tag makes $120 feel like a deal — even if the fair value is $80.

Counter it by: Researching independently before seeing any offered price or figure. Form your own baseline before being exposed to someone else's anchor.

The Planning Fallacy: Every Project Takes Longer Than You Think

Humans consistently underestimate how long tasks will take and how much they'll cost — even when they have experience with similar tasks. We imagine the best-case scenario, ignore potential obstacles, and remain optimistic against all evidence. The antidote is to base estimates on how long similar projects actually took in the past, not on how smoothly this one should go — and then add a buffer anyway.

Bias                     The Mistake It Causes              The Fix
Confirmation Bias        Ignoring contradictory evidence    Seek disconfirming information
Sunk Cost Fallacy        Continuing losing ventures         Focus only on future outcomes
Availability Heuristic   Misjudging probability             Look up actual statistics
Dunning-Kruger           Overconfident decisions            Find genuine experts to consult
Anchoring                Over-relying on first info         Research before engaging
Planning Fallacy         Unrealistic timelines              Use historical data, add buffers

You Can't Eliminate Bias — But You Can Manage It

Recognizing these biases doesn't make you immune to them. Even people who study cognitive biases professionally fall prey to them regularly. The goal isn't perfection — it's building habits of thought that introduce a pause between impulse and decision.

Slow down on high-stakes choices. Ask for a second opinion from someone who disagrees with you. Write out your reasoning before committing. These simple practices won't eliminate bias, but they'll catch the most costly mistakes before they happen.