Everyone knows (but especially ’90s kids) what it’s like when your computer isn’t strong enough to do something. Fortunately, the solution has always been pretty simple—some version of “get a stronger computer”—and because of Moore’s Law, a stronger computer has always been just around the corner.
Here’s where things go left. Suppose you’ve built the ultimate supercomputer (you couldn’t possibly make it more powerful), but it’s still not strong enough to run every single computation in your program without crashing. You are forced, at this point, to adapt in the opposite direction: instead of adding computational power, you need to simplify the computations. The program won’t run as perfectly as you hoped—in fact, it’ll get buggy in some places—but it’ll run without crashing.
Congratulations: you now understand how the human brain is a flawed supercomputer. It’s always looking for computational shortcuts because it can’t evolve in its own lifetime (a few million years between firmware updates) and it has to do something with the incredible amount of data being fed to it.
One category of shortcuts might be called Impressions, or the ways that certain details stick in your mind more than others. For a useful concrete image: you’d be making a literal impression upon yourself if you took a coin and pressed it into your skin, so mental impressions are the details that you continue to “feel” and “see” with that sort of immediacy before other details return to mind.
These mental shortcuts developed to help our ancestors survive, but at the expense of perfect understanding and decision-making. Errors of Impression exist because the details we’re likeliest to notice and remember don’t, won’t, and can’t always represent the full truth.
Today’s cognitive bias is pretty familiar, and appropriate for starting Volume 3: anchoring bias is the tendency to rely more heavily upon the first pieces of information you hear. Three quick nuggets of analysis:
(1) Anchoring bias is, in a sense, built atop confirmation bias. As long as you don’t reject the First Piece of Information outright, it becomes the “seed” of what you understand, and you’ll naturally seek information which is compatible with that understanding.
(2) Robert Cialdini (the persuasion expert) might point out a principle called “commitment and consistency,” which says that human beings naturally expect everyone (including themselves) to remain consistent in their own words and behavior. Changing your mind is hard, but it’s doubly hard to undo your “first impression” of something and rethink it from scratch—and triply hard when other people are witness to this process.
(3) Anchoring bias is naturally aided by something we usually consider a good thing: that we’re especially likely to listen to the people who teach us the “first things” about anything, like parents and teachers. So it’s quadruply hard to undo the first thing you learned when you learned it from someone you loved or trusted.
P.S. Ever notice that back-and-forth negotiations play out the same way 99% of the time? Now observe that there’s a predictable relationship between the starting number and the ending number for most negotiations.
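To make that starting-number/ending-number relationship concrete, here's a toy simulation (invented numbers, not real negotiation data) in which each side simply concedes halfway toward the other's last offer. The only thing that changes between the two runs is the opening anchor:

```python
# Toy "split the difference" negotiation: each side concedes halfway
# toward the other's most recent offer. All figures are illustrative.

def negotiate(anchor, counter, rounds=4):
    """Return the settled price after a fixed number of back-and-forths."""
    a, b = anchor, counter
    for _ in range(rounds):
        a = (a + b) / 2  # anchoring side concedes toward the counteroffer
        b = (a + b) / 2  # countering side concedes toward the new ask
    return round((a + b) / 2)

# Same item, same counteroffer of 400 -- only the opening anchor moves:
print(negotiate(anchor=1000, counter=400))  # -> 600
print(negotiate(anchor=2000, counter=400))  # -> 934
```

Doubling the opening ask dragged the final number up by more than 300, even though the buyer's behavior never changed. Real negotiations are messier, of course, but the pull of the first number works the same way.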
Before we get to today’s cognitive bias, it’s useful to continue the computer-versus-human-brain analogy from yesterday.

Let’s focus on a specific piece of the computer/brain: long-term memory. At this writing, the storage capacities of brains and hard drives are eerily similar; most computational neuroscientists estimate the brain’s data-storage capacity is between 10 and 100 terabytes, which is only 1-2 orders of magnitude higher than current consumer drives. Having said that, human memory and hard drives are engineered in very, very different ways.
Most data stored on a hard drive is searchable and sortable, meaning that your computer can find everything matching your terms in just a few seconds—but no part of the computer understands what it’s finding, much less how the different pieces of its data connect. The brain is sort of the opposite: it has an “organic” understanding of its data and the connection between pieces, and it can draw these sorts of insightful connections with incredible speed, yet this same faculty makes the brain a badly unreliable database.
Thus we arrive at today’s Error of Impression, which is called the availability heuristic (a “heuristic” is just a problem-solving strategy based on experiences with previous problems). In computational terms: the availability heuristic is a typical strategy used by the brain to simplify a complex search query so that results return faster. In human terms: the brain grabs the first examples which pop into mind so it can start working.
The trouble, of course, is that the first things to come to mind are usually the most vivid and memorable things, which almost never constitute a representative sample of reality. Our ancestors’ brains developed this strategy because noticing and remembering “vivid” things first—like a food source, or a predator who sees you as one—could make a life-or-death difference, especially in fast-moving situations. In our time, this strategy still has cognitive uses (therapy, free association) and practical uses (crossing the street safely), but it makes us seem like silly computers much of the rest of the time.
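The "simplified search query" framing can be sketched in a few lines of Python. The memory list and its vividness/base-rate scores are invented for illustration; the point is only the difference between the two ranking rules:

```python
# Hypothetical "memories," scored by how vivid they feel and by how
# common they actually are (all numbers invented for illustration).
memories = [
    {"event": "plane crash on the news",  "vividness": 9, "base_rate": 0.00001},
    {"event": "uneventful car commute",   "vividness": 2, "base_rate": 0.99},
    {"event": "shark attack documentary", "vividness": 8, "base_rate": 0.000001},
    {"event": "minor fender bender",      "vividness": 6, "base_rate": 0.01},
]

def exhaustive_search():
    """What a database would do: rank by how common things actually are."""
    return sorted(memories, key=lambda m: m["base_rate"], reverse=True)

def availability_heuristic(k=2):
    """The brain's shortcut: grab the k most vivid examples and stop."""
    return sorted(memories, key=lambda m: m["vividness"], reverse=True)[:k]
```

The shortcut returns the plane crash and the shark attack first, which happen to be the two rarest events on the list. That's the whole bias in miniature: vividness quietly stands in for probability.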
Two things which make more sense if you know about the availability heuristic:
Rules and Exceptions. The human-rational brain can understand that there are very few (if any) airtight rules which have absolutely zero exceptions. Even better: we can understand that exceptions don’t diminish good rules, and that exceptions can even prove the rule. But the underlying monkey brain thinks in simpler, more binary terms and therefore prickles at any sort of exception. This means we’ve all experienced two intrusive thoughts as a result: “so your rule is bulls**t” and/or “the exceptions I know must prove a different/better rule.”
The Fundamental Attribution Error (FAE). The FAE is our shared human tendency to assume that a person’s behavior, at any given moment, is a function of their character and not their circumstances (the classic example is assuming that someone is an asshole rather than a normal person having a bad day). The availability heuristic plays into this because, most times, a person’s circumstances are invisible and the only information available to us is what we see. Any decent person who’s had a hellish day understands the difference between “observing bad behavior” and “judging bad people,” but the monkey brain snaps to here-and-now judgment of others because (historically) it’s a protective behavior.
The next two cognitive biases are pretty simple and easy to understand—and yet still no easier to undo, given how often they steer our thoughts.
Recency is something we’ve discussed before, at which time (1/29) we called it “an incredibly powerful marketing principle.” The nutshell version for marketers is that recency is an unusually good predictor of behavior; as one simple example, some of the people likeliest to buy from you are those who have bought from you recently.
Recency is an unusually good predictor of behavior because it’s consistent with how our brains are naturally wired to think. And our brains are naturally wired to return the most recent results first, regardless of their relevance, because…
(1) Danger awaits mostly in the present moment. To the extent that you need ANY degree of memory in life-and-death situations, remembering whether or not you turned off the stove won’t save you. But you might save yourself by remembering, for example, your steps into a minefield.
(2) Most information ages very poorly. Given the above, it’s unsurprising that most of the information your brain has ever collected was laughably useless a day later; a tiny sliver remained useful for a little while, and a trace fraction stays with you still. Evolution tends to “notice” things like this and give advantages accordingly.
Salience, defined simply as “prominence within a situation,” is similar in principle but different in practice. It’s similar in principle because, aside from the most recent results, your brain will also return the most popular results first—except, of course, that (A) “popular” is an internal poll and (B) the audience is full of brain systems which have survived by being skittish.
In practice, salience is different from recency because the most salient examples in a person’s mind can seem “stuck out of time.” Whether it happened 10 seconds ago or when you were 10 years old, the most salient examples for any person are the ones which still feel most present. (In Deepak Chopra’s words: the things we fear the most have already happened to us.)
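One way to picture recency and salience working together (the decay curve and every number below are assumptions for illustration, not a model from the literature): rank each memory by a recency score that fades fast, but let a salient-enough impression override the clock entirely.

```python
import math

def retrieval_score(age_days, salience):
    """Illustrative ranking of which memories 'come to mind first.'"""
    # Recency: usefulness decays quickly, per "most information ages poorly."
    recency = math.exp(-age_days / 7)
    # Salience: a strong enough impression ignores its own age;
    # it stays "stuck out of time."
    return max(recency, salience)

yesterdays_errand = retrieval_score(age_days=1, salience=0.1)     # recent but dull
decade_old_scare = retrieval_score(age_days=3650, salience=0.9)   # old but searing
```

Here the decade-old scare outranks yesterday’s errand, which is the "stuck out of time" effect in miniature: the scare still feels present, so the brain surfaces it first.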
Frankly, even therapy can’t undo the influence of these two biases in our lives, and it’s impossible to reflect any version of the world without them. Yes, we live in strange times… but in this sense, perhaps all times are strange.
Everyone knows about “selective hearing,” where you’re talking to someone and they completely ignore most of what you say. This is maddening behavior to experience—and sometimes, bluntly, it’s only happening because the other person sucks—but we’ve all done it at some point (without meaning to) because the causes run both wider and deeper than we’d usually imagine.
The bigger Error of Impression behind it is called selective perception. In a phrase, selective perception happens when our expectations “bend” our perception of the world, most often to irrational results.
It’s easier to see why and how selective perception works by unpacking the two key words in the definition above…
Perception. Yes, perception is subjective by nature and reality is objective by nature, yet perception is not the opposite of reality (“fantasy” would be a better antonym). It’s more useful to think of them as overlapping layers; if reality is the world around you, perception is your particular pair of glasses for seeing it.
A second note: perception and reality are not exclusive because they have a reciprocal relationship. Yes, reality was “here first” and perception layers over top of it, but then perception can also change reality… by way of whatever you do next.
Third and finally: while we can play a conscious role in how we perceive the world, we cannot change that we perceive the world. We can’t turn it off. To get a rough idea of why changing your perception can be difficult, imagine trying to fix a sink while the water is still running.
Expectations. We’re all familiar with the idea, but there are a few different kinds of expectations (and that’s the real rub here)…
Conscious — Expectations which you experience as complete thoughts, like what you want to do today. To borrow a Brené Brown phrase, these are “the stories you tell yourself.”
Explicit — Conscious expectations which are spoken aloud or otherwise made clear to others, like “this is what I want to do today.”
Implicit — Conscious expectations which are NOT made explicitly clear to others in words, e.g. that you’ll show up to work on a normal weekday.
Unconscious — Expectations which “underpin” your understanding of the world so deeply that you may not be able to notice (much less address) them on your own. Tiny example: “people will be, somehow, rational.”
The first three are familiar in everyday ways, but the fourth is frighteningly invisible and best for exemplifying…
👉 Why people sometimes “just don’t get it.” We wish we could write so much more, but here’s the essence of it: when people don’t “hear” you, it’s because they somehow don’t understand you.
Anything that anyone ever thinks has to be able to “fit” somewhere in the brain; like Earth, the trouble is NOT “lack of space” per se so much as “lack of space on the grid.” When someone doesn’t understand your idea—and thus can’t supply mail and city water to that new address—they literally can’t retain what you said, because they’re reduced to memorizing your exact words, and none of us can hold more than 4-5 things in mind at once. For anyone keeping track: that’s one short sentence at best.