How To Avoid Bad Decisions, Vol. 2: the Bandwagon Effect


A couple weeks ago, we started our dive into cognitive biases: the mental shortcuts we all develop to make quick sense of the world, no matter how (in)accurate or (un)fair the resulting thoughts might be. Cognitive biases can interfere with our judgments in any part of life—but they can be especially expensive when they lead business decisions astray.

In Volume 1, we covered Errors of Metacognition, or the ways that we fail to observe and “check” our own thoughts. In this week’s Volume 2, we’ll be turning outward instead of inward to tour Errors of Groupthink: ways that the influence (or mere presence) of other people can distort our thinking.

If you never took Social Psychology in college—and if we had to recap it in one sentence—here’s what you need to know for today:

Each person is an independent soul, but all of us share the same social-animal instincts—and the fusion of these two properties explains countless facets of human behavior, some fascinating and some horrifying.

This dual nature is reflected in the anatomy of your brain: human cognition and rationality are made possible by your huge cerebral cortex 🧠 yet you still need the underlying monkey brain for all your survival functions—including many of the social ones. 🙈 The cognitive biases we’re covering in Volume 2 result largely from the friction between these two systems.

Having said all of that, today’s starter example is widely known: the bandwagon effect. Why is it so hard to be first on the dance floor and so easy to be last?

Well, this is a weird crossover between our human-rational brains and our monkey brains. The “content” and “quality” of the bandwagon are things only the human-rational brain can evaluate. What the monkey brain understands is much simpler: being “wrong” is safe when you’re part of a big group, but being “wrong” by yourself could mean that you die a virgin. (High school was fun.)

When that feeling bubbles up into our thoughts, the phrasing frequently becomes “I don’t want to look stupid.” But we all know it goes deeper than that. Any social animal’s primal fear—reinforced by eons of evolution—is separation from the pack, and this fear is wired into so many things we do. Of course, part of maturing into an adult is learning how to (rationally) manage this fear, and we do—but the main reason it can warp our thinking is that the impulse always comes before the thought.

All we’ll say for marketers: it’s often way more valuable to create a bandwagon (however modest) than to join one (however massive).

Confirmation Bias

Confirmation bias, or the tendency to listen only to what confirms our beliefs, is one of the single worst bugs in the human code.

Let’s take the simple principle from yesterday—that monkeys are terrified of being “wrong by themselves”—and add a very human complication, which is that questions like ethics and strategy don’t yield clear answers (sometimes ever). Even if we accept this intellectually, it’s maddening to our monkey brains because we crave direct feedback. Phrased as a human thought: I really hope I’m right, but even if I’m wrong, I just want to close the loop and move on.

The monkey brain hates being wrong, but this complication creates two things it hates worse: self-doubt and self-contradiction. Neither one matters at trivia night, because the point of trivia is to be, well, trivial: you’re either right or wrong, and then you move on to the next question. But with higher stakes, and in the absence of a clear truth, your credibility is largely a function of your confidence, and to the monkey brain, changing your mind is the same as confessing prior fault and fallibility.

The human-rational brain can re-interpret information according to any paradigm it chooses, which is where changing your mind can be healthy. But the default paradigm, inherited from the monkey brain, is “play defense for your team”—and so our natural temptation is to seek out information which satisfies this urge (often creating a vicious cycle).

This metaphor might get weird, but it explains why fixing confirmation bias is harder than just “having an open mind.” Imagine you’ve been dropped behind enemy lines like the 101st Airborne before D-Day—except your rifle is permanently glued to your shooting hand. Your rifle can defend you as long as you have ammo, but it might as well be a walking stick once you run out—and you can’t afford to run out because you can’t switch weapons and Nazis don’t care about your problems. So you’ll always have an eye peeled for ammo your rifle can use, and you’ll ignore anything it can’t use. Wrapping this up: our rational brains don’t regard hostile ideas like literal Nazis, but our monkey brains do. (Dude, we need to rewatch Band of Brothers.)

The internet and social media didn’t create confirmation bias, much less intentionally—but they circulate most of its ammunition in our time. To borrow a Gary Vee phrase, social media is a mirror; it reflects you, but more importantly and dangerously, this mirror allows you to see (and defend) whatever you want, no matter how irrational or downright wrong it may be.

Conservatism Bias

Let’s start with conservatism bias, which is easier to unpack because we’ve all seen it at Thanksgiving. Conservatism bias, simply put, means weighing prior information or beliefs more heavily than new ones. This is technically a human universal: everyone prefers being comfortable in what they already know and believe. Beyond that, it’s a question of degree; there are people who vote blindly along party lines their entire lives (not ideal) and there are people who drive Toyotas (you do you, yo).

In any case, why does this happen? An analogy makes the explanation tidier: imagine everything you know and believe as the contents of a single book (or Google Doc if you prefer). Now imagine you’re told you have to accommodate new information in your book. If you just have to add a sentence or paragraph, no big deal—but the more you’re required to change what’s already written, the more intrusive it will feel and the more you will naturally resist.

Part of the reason is that editing is hard, tedious work (ask us how we know)—but even more, this type of “editing” is painful in an Alexis Carrel sort of way:

“Man cannot remake himself without suffering, for he is both the marble and the sculptor.” – Alexis Carrel

Discomfort aside, there are certain sections of the Self Book that we try never to edit at all. Some pages are like precious mementos; other pages are like source code for our psyches (where it feels like edits might cause a crash).

On the other hand (sort of), there’s pro-innovation bias: the tendency to overvalue the usefulness of new technology and undervalue its limitations. In some ways, this is the equal and opposite of conservatism bias, but it also raises the separate question of what counts as “innovation.”

Let’s take a quick (but relevant) detour to the patent office. Patents are essentially verifiable claims to innovation but, of course, patent offices only issue patents when the claims meet certain criteria. For something to formally count as an “innovation,” a patent office’s essential standards are Novelty, Non-Obviousness, and Usefulness—easy enough to meet individually, surprisingly hard to unite in a single invention.

Having explained that, the essential cause of pro-innovation bias is that most people who are excited about technology don’t think like patent offices. Congratulations: you now understand the city of San Francisco.


Stereotyping

This is a behavior we all indulge in, both on our own and in groups, but if left unchecked, the cross-pollination can bear truly toxic fruit.

Just for a clean definition: stereotyping is when we make assumptions about a broad group of people and/or any individual members of that group, based on appearances and nothing else. It’s another shortcut we learned from a hostile primordial world, where the ability to quickly distinguish friend from foe could literally save your life. Stereotyping, like your appendix, was shaped by circumstances very different from the ones we now inhabit, and by now it has far more potential to damage us than to protect us.

There’s an obvious moral danger here: you shouldn’t care about a person’s appearance because it tells you very little about their inner humanity. There’s also a business danger here: you shouldn’t care about a person’s appearance because it tells you very little about the money they could spend with you (or the degree to which they could help your business indirectly). In other words, stereotyping doesn’t just narrow the mind; it also narrows the market, and that’s dangerous to businesses much the same way that narrowing blood vessels is dangerous to the body.

There’s no good way to teach open-mindedness and acceptance in a single paragraph, but two thoughts might help keep it in mind. First and more individually: Socrates explained kindness as understanding that each person is fighting a battle you know nothing about. Second and more broadly: if there’s any merit to the Law of Attraction, then doing well at “your thing” will attract people to you and you can open yourself to the ones who come gladly, which is easier and more natural than forcing yourself to find strangers where they are.

Remember, too, that it’s natural—if rarely constructive—to pick up on the stereotyping of others, especially when those stereotypes are laced with fear. As with many things, the best way to be sure(r) is to learn everything you can from both sides.

Get more from theCLIKK



