
How Cognitive Biases Work

Episode Transcript

Speaker 1

Welcome to Stuff You Should Know, a production of iHeartRadio.

Speaker 2

Hey, and welcome to the podcast.

I'm Josh, and there's Chuck, and Jerry's here too, and we are getting down to business, getting right to it here on Stuff You Should Know, because we've got a lot to cover.

That's right.

So, Chuck, I got a little bit of an intro.

Speaker 1

Let's hear it. Was that it?

That wasn't it?

Speaker 2

Do you remember how homeostasis used to come up a lot?

Yes. So, for those of you who haven't been listening that long, homeostasis is what your body and your mind and your brain want to return to.

Right.

You just want everything nice and even-keel and normal, without exerting too much effort and energy.

Right, that's homeostasis.

Speaker 1

That's... are you asking me?

Sure?

Speaker 2

Okay?

So one of the ways that your brain returns to homeostasis as fast as it can is to use shortcuts in making decisions, right? Because if you're having to decide something, you're actively being challenged. You're not in your homeostatic space.

So if you use a shortcut, you can say something like I've had the red apple in the past and it was delicious, I've eaten the brown mushy one before and it was awful.

I'm going to eat this red apple, right.

Rather than going to the trouble of pulling both apples out and like analyzing them with a microscope and all that, you can just kind of use a little shortcut.

That's a heuristic, and it makes a lot of sense because your brain is like, great, I didn't use that much energy.

I made the right decision, and we're good to go.

The problem that comes about, though, is that with heuristics, you're not always right.

You don't always make the right decision, you're not always taking all of the information into account, and when that happens, you start stumbling into cognitive biases.

Speaker 1

Yeah.

Like, this is a frustrating episode, because I feel like the title could be Cognitive Biases: Everything You Think You Know Is Wrong.

Speaker 2

Yeah, well that's a great title.

Let's go with that.

Speaker 1

It just it made me feel like a dummy the whole time.

Speaker 2

Oh, don't. You're not a dummy.

All humans are dummies as far as cognitive biases go.

It's not just you, and this stuff is hardwired into us because like I just said, we take mental shortcuts and the problem, Chuck, is what we're talking about mostly today are unconscious biases.

Right, there's conscious biases.

We just usually call those biases, right, those are the active challenges that you need to overcome to be a better version of yourself.

These are like unconscious.

So there's not a lot you can do about it.

Although at the end we're going to kind of give you some tips and pointers, but it's a challenge for absolutely everybody.

It doesn't make you dumb.

Speaker 1

Yeah, I mean I think the tips and things can help, for sure, But it's just part of being human, you know, the unconscious bias, and there's not a lot we can do to completely eradicate them.

And if it's a you know, if it's a real problem, then I'm sorry.

Speaker 2

Well.

One of the big problems that we all kind of face is that we are predictably irrational, as was said by Dan Ariely, the behavioral economist. And because of that, corporations, marketers, basically everybody who wants to sell you something knows about these things, and they can manipulate them.

They can trick you into making decisions you wouldn't otherwise make.

Speaker 1

Yeah, for sure. And we probably wouldn't even be here talking about this had it not been for two kind of revolutionary thinkers who ended up being some of the most oft-cited researchers in the history of research, as we'll learn, especially when it comes to economics.

And they were a couple of Israeli psychologists named Amos Tversky and Daniel Kahneman. And I looked up different ways to pronounce Tversky, because we always get guff, and I've heard everything from a hard T, "T-versky," to his colleague Daniel Kahneman doing more of a "Versky."

Speaker 2

Oh, I just heard him referred to as Big T.

Speaker 1

The Big T, because it starts with T-V.

But those were the two guys working together.

They developed this concept in the seventies at the Hebrew University of Jerusalem, and really got down to it pretty quickly as a result of Kahneman, I think, taking some issue with Big T's research, and I guess they kind of bonded over that or something.

Speaker 2

Yeah, it was pretty cool, because Tversky was basically a mathematical psychologist. And anytime you hear "mathematical" and it's to do with something other than math, what that means is you've taken something and set it out in a very standardized way, so you can explore it and teach it based on certain facets. And the upshot of mathematical psychology, as far as human behavior goes:

These are the people who came up with the idea that humans behave as rational actors.

We're self interested, we take all the best information available to make the best decision for ourselves.

And Daniel Kahneman was like, this is not at all true, and he started challenging Amos Tversky's theories. And Tversky, instead of saying, like, no, you shut up, he was like, all right, let's go figure it out, let's get to the bottom of this.

And because of that, yeah, they formed this partnership that had a huge impact on the world.

Yeah.

Speaker 1

I think it's kind of heartening that they, as academics, you know, got together.

There were no ruffled feathers, or at least it didn't end up that way, and they worked together.

It's kind of a heartening thing I think these days.

Speaker 2

Yeah, there's got to be at least one.

Speaker 1

Yeah, that's right.

They came up with a program called the Heuristics and Biases Program, to basically, you know, study how human beings make their decisions, how they go through life making choices when they don't have all the information at hand, all the most perfect information to make that choice, or when they don't have all the time in the world to look at the information that they do have.

So, like, how are people making decisions?

How are they making mistakes in their decision making?

And they ended up coming up with a couple of different systems, one which is super quick, and one which is much more deliberate.

Speaker 2

Yeah, Daniel Kahneman came out with Thinking, Fast and Slow, which was one of those super popular airport books.

Speaker 1

You know. Yeah, Thinking, comma, Fast and Slow.

Speaker 2

Yes, thank you, Eats, Shoots and Leaves, that's right.

And in it he basically lays out this kind of shorthand model.

He's very explicit to say, like, this is not how your brain is actually laid out, but it's a good metaphor for it.

And System one is how you think quickly, You think almost unconsciously, you make rapid decisions, and that is kind of how we generally navigate life.

System two is much more deliberate.

It's where we take into account like different ideas, It's where we really stop and think about something before making a decision, and they're essentially competing.

There's something called interference, and system one has a really great tendency to interfere with system two.

And there was a psychologist working all the way back in nineteen thirty-five named John Ridley Stroop, who basically discovered the Stroop effect, which is a way of demonstrating how System one interferes with the slower, more deliberate System two.

Speaker 1

Yeah, boy.

I bet he patted himself on the back after this one, because it's one of those things that's so simple. But I bet he winked at everyone like, watch this.

Yeah, this is going to break your brain.

Speaker 2

It's genius.

Speaker 1

It really kind of is.

So what he did was he simply wrote down the names of colors, but he would write the name of each color in a different color of ink, and then he would just ask the person to say out loud the color of ink that the word is written in, not the color that the word spells.

And it is surprisingly difficult to do that.

It's just a little weird brain breaking thing.
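[A quick illustration, not from the episode: a minimal Python sketch of an incongruent Stroop-style trial, printing color words in a mismatched ink color using ANSI terminal codes. The palette and trial count are arbitrary choices.]

```python
import random

# An arbitrary palette of ANSI foreground color codes for this sketch.
COLORS = {"red": "\033[31m", "green": "\033[32m", "blue": "\033[34m", "yellow": "\033[33m"}
RESET = "\033[0m"

def stroop_trial():
    """Print one incongruent trial: a color word rendered in a different ink.

    The correct response is the ink color, not the word. Naming it feels slow
    because reading the word (fast System one) interferes with naming the ink
    (slower, deliberate System two).
    """
    word = random.choice(list(COLORS))
    ink = random.choice([c for c in COLORS if c != word])  # force a mismatch
    print(f"{COLORS[ink]}{word.upper()}{RESET}")
    return ink  # what the participant should say out loud

if __name__ == "__main__":
    for _ in range(5):
        answer = stroop_trial()
        print(f"  (correct answer: {answer})")
```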

Speaker 2

Yeah.

Yeah, so he's showing that your system one just wants to hurry up and read it, yeah, and it's getting it wrong and that's interference, right.

So that kind of like started to lay the groundwork for this idea that we do have kind of competing ways of seeing the world and making decisions.

And what Kahneman was saying is that most of the decisions we're walking around making are actually the System one super fast shorthand decisions.

But we think that we're using our more rational mind because we make post hoc explanations for why we decided that.

And that's not to say we're all walking around with these creepy little secrets that we know.

We're like fooling ourselves.

We don't realize we're doing this.

That's why these biases are unconscious.

Even if you stop and think about what you're doing, you may still not come up with the answer, like, oh yeah, I was making up explanations after the fact to make it seem like I used System two when I didn't.

It's really hard to do that.

Speaker 1

Yeah, for sure. Livia gives a pretty good example of that.

As far as like hiring somebody, Someone may make an impression in an interview that kind of locks it up from the second they walk in.

Maybe they look like their mom or dad, or a relative, or maybe they remind them of themselves, or maybe like who knows what it could be.

Then they end up getting that job, and later, if you ask the person who hired them, they might say, oh, it was because of this, this, and this, when in fact that's really just System two kind of confirming. Like, no, it's because the guy walked in wearing a New York Giants T-shirt.

Speaker 2

Yeah, and we'll get into some of the problems with the stuff throughout, but this is a good example.

Right. If the opposite happened, if, like, you didn't hire somebody because they weren't quite like you, that's an example of a bias too, even if you don't think that's why you did it.

If you're looking at their CV afterward and you're like, oh, they didn't graduate from college, that's why.

But really it was not because you're racist, not because you're a woman hater. It was because you're preserving your own level of comfort, because other groups that are different from you make you uncomfortable.

And that's how groups can become entrenched.

Right. Once one group kind of dominates an organization, they tend to continue doing that, because people hire other people who they're comfortable around, rather than pushing themselves outside of their comfort zone and probably improving their organization.

And that's why diversity programs exist in the first place, because of that human tendency.

Speaker 1

Yeah, or maybe they were just a Jets fan.

Speaker 2

That's possible.

I mean, no Jets fan is going to hire a Giants fan.

Speaker 1

Yeah.

Well, here's a tip.

I don't know a lot about interviewing other than be yourself and try and get someone to like you.

But don't go into any interview wearing any sort of branded sports apparel.

Speaker 2

Yeah, especially a jersey.

Yeah, I think that says quite a bit.

Speaker 1

Yeah, you wear that Giants jersey in there.

Well, I guess you're rolling the dice.

You've either got that job right or there's no way you're going to get it.

So maybe it's not a bad idea.

Then again, I don't even know. I might be wrong.

Speaker 2

Yeah, I mean, I guess if you dressed it up with a bow tie, maybe you could get away with it.

But yes, it is still a gamble regardless.

Speaker 1

Well, not all jobs you have to wear a suit and tie to, you realize.

Speaker 2

I know. But I'm saying, like, you dress for the part you want.

If you're wearing a Giants jersey with a bow tie, I think you're making a good impression out of the gate.

Speaker 1

All right. So we're going to go through a list of about ten different biases, and they're all pretty interesting, and I know everyone can probably identify with each of these at some point.

But before we do that, we need to point out that, like, these are all mental shortcuts or the result of a mental shortcut, but not all of them work in the same way.

And in how our brains work, you know, there could be a lot of things at play.

Emotions can come into play, maybe, like we were just talking about, like it's hard to reassess something after you've gotten a first impression.

People, humans, historically tend to make bad guesses at things, because if we made great guesses, then things like, you know, gambling would be super easy.

So all of these things come into play.

There's not just, like, a single way that this broken system works.

Speaker 2

Right yeah, yeah, yeah, But people generally, it seems like universally in a lot of these cases, behave in these ways under the same circumstances.

Speaker 1

Yeah, like some of their stuff wasn't replicable, but that's sort of standard for studies in psychology.

Like a lot of this stuff, as we'll see, has checked out across cultures.

Speaker 2

Yeah, which is huge, you know, considering the whole WEIRD problem in psychology in particular.

Speaker 1

That WEIRD people...

Speaker 2

...are Western, Educated, Industrialized, Rich, and Democratic, I think.

Speaker 1

All right, so we'll start.

We'll leave the biggest guy for last, I think, which will be after a break probably, but we'll start then with maybe hindsight bias.

And this is the idea, and we've talked about this one before here and there, that after something has occurred, we tend to think, like, oh, well, of course that was going to happen.

In fact, not only should I have seen that coming, it was probably inevitable that it happened.

And a lot of the time, maybe, that's because you're misremembering your expectation before it even happened, right?

Speaker 2

Like we can rearrange our memory of how we felt about the event or the outcome of the event afterward to basically match the outcome.

Yeah, I guess because we have this never-ending need to be right.

Speaker 1

Yeah, that probably had something to do with it.

Speaker 2

I knew you were going to say that. So I've got another one for you, Chuck. All right: self-serving bias, combined with a little fundamental attribution error on the side.

Speaker 1

Yeah, that's a good one.

It's a good side dish.

Speaker 2

So these things basically go hand in hand.

It's basically how we see ourselves in a great light and how we see other people in a more negative light.

Self serving bias is basically saying, if something good happens to you, it's because you are good, like you earned it.

It's because of you doing something right.

Something bad happens to you, it's external forces that made that happen, right.

Fundamental attribution error is the exact opposite with other people, If they do something right, it was just luck.

If something bad happens to them, it's their own fault.

So good example of this is like if a coworker comes in late one day, you're like, they're just lazy and slack, but then you come in late the next day and you're like it was traffic.

Speaker 1

Right.

Speaker 2

That's basically the two things going hand in hand, and those are both biases.

Speaker 1

Yeah, and I hope people understand that, Like all of those things can also be true.

You know.

Sure. So if you're thinking, like, well, no, but sometimes I did deserve the thing, and sometimes it was someone's fault.

Yeah, sure, that can happen.

We're not... these aren't absolutes.

Speaker 2

No, it's more just, yeah, your tendency to think in certain ways.

Yeah, sometimes you're going to be right.

Sometimes you're going to be wrong.

Speaker 1

Like, humanity's tendency.

Yeah, you got to take a big broad view here.

Speaker 2

Yeah, but also you specifically. Right, you, James Kirkland, listening in Baltimore.

Speaker 1

There's one.

Oh, man, James Kirkland is going to pull over to the side of the road right now and really freak out.

Speaker 2

I hope, man, I hope I nailed it.

Speaker 1

Yeah, I think you picked a common enough name.

Speaker 2

We'll see, all right.

Speaker 1

So anchoring bias is another one.

This one I've fallen prey to.

I'm gonna say that probably about all of these, but this is the idea that the first piece of info you get about something can really affect, even in a very disproportionate way, things that happen after that. Like, once something is kind of locked in, it's hard to unwind it.

Speaker 2

Yeah, that first piece of information, it's like, oh, okay, this is going to basically prime you in your answer your decision.

Right.

So a good example I saw is there was a study that says, like, okay, the Mississippi River is less than two thousand miles long. How long is it?

And those people would say something like fifteen hundred miles.

And then other people would be told, okay, the Mississippi River is less than five hundred miles long, and those people would say, like, it's three hundred miles.

And then another group was the Mississippi River is less than eighty miles long.

Those people would answer like sixty.

It's the same thing, the length of the Mississippi River.

But they were presented with basically this priming number, a large one, a middle number, or a smaller one, and their answers were related to that first piece of information they got. And that's anchoring bias.
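[One toy way to see that pattern in code, not from the episode: the classic anchor-and-adjust account says people start from the number they were given and adjust only partway toward their own estimate. The gut figure and adjustment factor below are invented, so the outputs only echo the direction of the study, not its exact numbers.]

```python
def anchored_estimate(anchor, gut_feeling, adjustment=0.4):
    """Anchor-and-adjust toy model: start at the anchor and move only
    part of the way (adjustment < 1) toward your own gut estimate."""
    return anchor + adjustment * (gut_feeling - anchor)

# Hypothetical respondent whose unanchored guess would be 1,000 miles.
for anchor in (2000, 500, 80):
    estimate = anchored_estimate(anchor, gut_feeling=1000)
    print(f"anchor {anchor:>4} -> estimate {estimate:,.0f} miles")
# Higher anchors drag the estimate up, lower anchors drag it down,
# even though the question (the river's length) never changed.
```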

Speaker 1

Yeah. And Livia pointed out another little side dish here, which is called the decoy effect, and this is just kind of one way this can affect economics, which will come up a lot.

But you'll go into a restaurant and they might have one super expensive bottle of wine on the menu, and maybe it's even placed at the top so you see it first, and then the other bottles of wine might seem like a decent deal after that, even if they're also overpriced.

Speaker 2

Yeah, exploitation, but.

Speaker 1

All wine in restaurants is overpriced.

I hope everyone realizes that.

Speaker 2

I mean, by a lot, right? Yeah, I think that's most of their margin right there.

Speaker 1

Oh, totally.

It's frustrating, but that's the business.

Speaker 2

Well, that's also, I think, Chuck, this anchoring bias is why they say you should never lead in a negotiation with your actual price. You want to go higher or lower depending on your position.

Speaker 1

Oh like if they ask what you want to get paid?

Speaker 2

Yeah, or like a hiring question or something like that.

Speaker 1

You know, yeah, I hate all that stuff.

Speaker 2

Yeah.

Yeah.

If somebody's like, well, what do you think I should get paid?

Speaker 1

You know, what are you looking to make at this job?

But that's what I'm saying, "what do you make?" should always be the answer.

Speaker 2

Exactly.

You just want to add, I don't know, fifty percent to what you actually want, and then you've got room for negotiation.

There's one other thing that's related to this: framing bias. And that's basically the same thing, but rather than the first piece of information guiding you, this is more directly guiding you.

So, for example, some drug maker says ten percent of patients die. You're like, oh God, that's a lot, right?

You could say it the opposite way, ninety percent of patients live, and you're like, oh, that's great. It's the same amount of people dying.

It's just framed differently to exploit your response.

Speaker 1

To exploit your aversion to dying.

Speaker 2

And that's a human thing, isn't it?

Speaker 1

For sure?

Shall we take a break?

Yeah, all right, Josh said human things.

So it's time for a break and we're going to come back with more biases right after this.

Speaker 2

All right, Chuck. Up to bat, number twenty-three: availability heuristic.

Speaker 1

Can we put like a stadium echo effect on that?

Speaker 2

Next to bat?

Manny Mota?

Speaker 1

Very nice?

Speaker 2

So what is that availability heuristic?

And I'm sure this has happened to you before.

Speaker 1

Right, Yeah, none of these have ever happened to you, which is a funny thing about us doing this episode.

But the availability heuristic is what you have available to call up in your brain at any given moment, So you're you're going to rely more on what you can immediately think of in the moment, and chances are what you're immediately able to think of in the moment as something that probably aligns with your worldview or something like that, which is a sort of a well, we won't talk about the sea bias because that's coming up.

Speaker 2

Well, yeah, or something that, like, really kind of goosed you emotionally. That's very available, because it's, wow, you know, loud and scary in your mind. Like, if you saw something about a plane crash in the last day or so, when somebody asks you how frequent plane crashes are, you're probably going to give a much higher estimate than you would have before that, rather than one based on, you know, the number of times you've flown and nothing bad happened.

Speaker 1

Yeah, that's a good one.

Speaker 2

There's also inattentional blindness.

And before we talk about this, because we're gonna spoil it...

Yeah, yeah. I want to send everybody, if you have the means to do this, go onto YouTube and search for "selective attention test."

This is on Daniel Simons's YouTube channel. Then watch the test where the people are passing the basketball back and forth.

We'll wait a second.

Speaker 1

Dude, Okay, that's enough.

Speaker 2

All right, great. So hopefully you pressed pause and didn't just try to watch it while Chuck was doing the Jeopardy theme.

Speaker 1

It is short, but it's not that short.

Speaker 2

It's like a minute and a half or something, right, Yeah, So tell them about this video, Chuck, because it's pretty great.

Speaker 1

That's right.

In the video, they have a group of, what was it, like six people? Probably six on the nose, six college students, I guess. Three are wearing white shirts, three are not wearing white shirts, and they're in a very tight, small circle.

It looks very awkward.

They have, I think, was it two basketballs?

There's two groups, yeah, two groups, two basketballs. And what you're told is the task at hand is to count the number of times that the people in white, the white team, pass the basketball.

So you're counting, right? One, two, three, four, five, six, seven. And that's all you're supposed to do.

And at the end you're supposed to say, you know, how many times they passed the basketball.

Speaker 2

Right?

Speaker 1

And now, now, hit them with the good stuff.

Speaker 2

So apparently half of the people who do this, which is astounding to me, half of the people who watch this video and take this test don't notice that in the middle of it, a person in a gorilla suit walks into frame and turns to the camera and I think beats on their chest and then walks out of frame.

Like, in the middle of these people throwing these basketballs around, half of the people are paying such close attention to counting how many times the people wearing white T-shirts are passing the basketball that they do not notice the gorilla until the end of the video, when it's pointed out.

Speaker 1

Yeah, and we're assuming it's a person in a gorilla costume.

Speaker 2

I'm hoping, first of all.

Speaker 1

That might be a bias at play, that it's not a real gorilla.

Speaker 2

Well, I guess it depends on the amount of funding they had.

Speaker 1

Yeah, it actually looks like the gorilla from Trading Places.

Speaker 2

Totally, which is like, were they even trying?

No. Okay, good, I just wanted to make sure.

Speaker 1

Uh, did you watch this video before you knew about this?

Speaker 2

I had heard about it from some friends who do magic, and they were basically talking about this on a little podcast that they made.

Speaker 1

Who? Bobo? You have friends who do magic?

Speaker 2

So you know our friend Toby.

Oh, yeah, he has very good friends that do magic.

Speaker 1

Wow.

Speaker 2

Yes, I became kind of friends with them.

So yeah, I guess I do.

I have friends that do magic.

Speaker 1

Well, buddy, next time we are in Los Angeles at the same time: our good friend and friend of the show, Adam Pranica, is a member of the Magic Castle, and one of his and his wife Elaine's favorite things to do is to take friends to the Magic Castle.

So have you ever been?

Speaker 2

No?

Speaker 1

And it is great fun.

Speaker 2

Adam Pranica just keeps getting better and better, doesn't he?

Speaker 1

Yeah.

I haven't been with him, but I've been a couple of times, once many many years ago, then another probably ten years ago.

But it's a lot of fun.

I'm a big fan of magic.

Yeah, And it's pretty magical.

When people don't see that gorilla in a very tight frame, it's not like it's on a big basketball court and the gorilla sneaks in there. Like, there's six people and then there's a seventh.

Very clearly it is.

Speaker 2

It is very obvious. So I mean, what this is showing is that our attention is limited when we're really focused on a task.

You saw that gorilla. Those half of the people who didn't notice the gorilla, you still saw it, but you were so focused on the task that your brain was just getting rid of information that was unrelated to the task, because it's not pertinent.

It can become pertinent, though, when that gorilla decides to attack you.

And so this is a cognitive bias we have where we're ignoring potentially unimportant information to take in the stuff that's related to the task at hand.

Yeah.

Speaker 1

Well, you know where they could really get away with this because where you have great concentration.

Is it a professional sports game on the jumbo tron when they have the baseball under the helmet or whatever, uh huh, and then they're moving them around and you got to find you know, it's like three card money, yep, Because you're concentrating so hard on that, they could they could put whatever they wanted on that screen while that's going on, And I bet you most people would not or maybe I guess it's half if that's what they found.

Speaker 2

Yeah, I'll bet you're right.

I'll bet you're right, man.

Speaker 1

I was just trying to think of something where you're super trying to follow because I was happy I came up with the correct amount of passes at the end.

Speaker 2

You did.

You said, what does it mean if I noticed the gorilla and got the correct number of passes?

And I said, it means you're a perfect human.

Speaker 1

That's right, which we all know is not true.

Speaker 2

None of us are, Chuck, None of us are.

Speaker 1

Yeah, I know.

Speaker 2

So there's another one that you may have heard of before, even if you've not heard of any of these other ones, called the Dunning Kruger effect.

It became kind of viral because if you take it through the pop culture meat grinder, it becomes much more simplified and kind of loses some of its actuality.

But yeah, people still like it because it's a good way to put other people down.

Speaker 1

Yeah, it is.

This is the idea, the correct idea, that people with a little understanding in an area tend to overestimate their ability and their knowledge about something, right? Because they know so little, they don't even know what they don't know.

Kind of, right, exactly. But what you were talking about, it's kind of been transformed into, like, morons are the most braggadocious, which can be true.

Speaker 2

It can be you know.

I think that's one of the things.

Like you said, you can be right with cognitive biases, you're not wrong with them all the time.

So yeah, that kind of supports that. But that's not what the Dunning-Kruger effect actually says.

You said it.

And then there's the opposite way too, where the more experience you have, the more expert you are in a field, the more you assume that it should be just as easy for everybody else as it is for you.

Speaker 1

Yeah.

That's a very valuable thing to understand, I think, and you get much further in life if people are like, well, you're the expert, and the expert's usually the one going, yeah, but I don't know, maybe we should hold off, because, you know, X, Y, and Z.

Speaker 2

Right.

Yeah. So that's the actual Dunning-Kruger effect, and I saw that it's being assailed right now.

People are starting to question even the basic version of it, like the actual academic version of it.

Yeah, so we'll see what happens with that.

Speaker 1

Oh interesting.

We've got the gambler's fallacy next, and that is oh boy, if you have ever gambled anywhere.

But if you like go to casinos and stuff like that, you're going to see this all over the place.

You're going to hear it spoken out loud.

And this is the idea that you find patterns where there are no patterns.

Speaker 2

Yeah.

Speaker 1

So if you're at the blackjack table and you hear the person next to you like, well, oh, man, see, I've lost four in a row, so I'm gonna bet, like, I'm gonna go all in on this because I'm bound to win because I've lost four in a row.

There's no way I'm gonna lose five in a row.

Speaker 2

Right.

The problem is, each of those hands of blackjack is unrelated to the others.

They don't form a pattern.

But you are predicting a pattern that just doesn't exist.

Yeah, that means you're a fallacious gambler.

Speaker 1

It can get you in real trouble.

I mean, you can do the same thing on the playground with coin tosses.

In fact, coin tosses, I think, are a lot of times the way they sort of try and prove this.

Speaker 2

Yeah, because each coin toss, considering you're playing with a perfect, unflawed coin that has no bias whatsoever...

Each coin toss is totally unrelated to the last.

So you could get one hundred heads in a row and that doesn't mean anything.

It doesn't mean a tails is coming, because each of those hundred heads, each of those coin tosses, had nothing to do with the last one or the next one.
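[If you want to convince yourself, here's a minimal simulation sketch, not from the episode: flip a fair virtual coin a million times and check the odds of heads on the flip right after four heads in a row. The trial count and streak length are arbitrary.]

```python
import random

def prob_heads_after_streak(streak=4, trials=1_000_000):
    """Estimate P(heads) on the flip immediately after `streak` heads in a row.

    If the gambler's fallacy were right, this would come out well below 0.5.
    """
    run = after_streak = heads_after = 0
    for _ in range(trials):
        flip = random.random() < 0.5   # fair coin: True means heads
        if run >= streak:              # this flip follows a run of `streak` heads
            after_streak += 1
            heads_after += flip
        run = run + 1 if flip else 0
    return heads_after / after_streak

print(prob_heads_after_streak())  # hovers around 0.5, streak or no streak
```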

Speaker 1

I know.

That's hard to break out of, though, because it seems very human to think, like, they've flipped four heads in a row.

There's no way there's going to be a fifth.

Speaker 2

Well, that's another reason why this is so hard.

We're hardwired to find patterns and stuff.

It's a way to navigate the world.

The way we navigate the world is by finding patterns so that we can recognize things in the future and thus spend less energy getting back to homeostasis.

Speaker 1

That's right, This is all so interesting to me.

Speaker 2

I love this stuff.

Speaker 1

I knew that you loved it.

This is Josh Clark Central.

Speaker 2

I love observing it, because I can't grasp what it feels like to suffer any of these. So just to discuss it in this way is really fascinating to me.

Speaker 1

All right, let's talk about the base rate fallacy. That means you put more weight on just, like, one very specific piece of information instead of looking at all the pieces of information that have come your way.

Speaker 2

Yeah, and usually it's individuated information, meaning, like, say, some quality or characteristic of one person, and then you're ignoring the base rate, which is, like, pure statistical information about what you're trying to figure out.

And a really good example of this is, like, let's say that you are looking at somebody who is super fit, a woman who's very fit and athletic, and you're asked, do you think that woman is a personal trainer or a teacher? Because basically the only evidence you have there is that this woman is very athletic and fit.

You might say personal trainer.

But if you took all the base rate information into account, you would know that even though the portion of teachers who are very fit and athletic may be small compared to the total number of teachers, it's still much larger than the total number of personal trainers in the world.

So statistically speaking, it's much likelier that that very fit athletic woman is a teacher and not a personal trainer.

But you don't do that, because you think: personal trainer, athletic, fit, must be a personal trainer.

You've just fallen prey to the base rate fallacy, my friend.

Speaker 1

Yeah, but she has on yoga pants and Hokas.

That doesn't narrow down anything these days.
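[Here's the arithmetic behind that example as a sketch. None of these figures are from the episode or from real labor statistics; they're invented just to show how the base rates dominate.]

```python
# Invented base rates: far more teachers than personal trainers exist.
teachers, trainers = 4_000_000, 300_000

# Invented conditional probabilities: P(very fit | job).
p_fit_given_teacher = 0.10   # assume only 10% of teachers are very fit
p_fit_given_trainer = 0.80   # assume 80% of trainers are very fit

fit_teachers = teachers * p_fit_given_teacher   # 400,000 very fit teachers
fit_trainers = trainers * p_fit_given_trainer   # 240,000 very fit trainers

# Bayes' rule: of all the very fit people in these two jobs,
# what fraction are teachers?
p_teacher_given_fit = fit_teachers / (fit_teachers + fit_trainers)
print(f"P(teacher | very fit) = {p_teacher_given_fit:.2f}")  # ~0.62
```

Even with trainers assumed eight times likelier to be very fit, the sheer number of teachers wins.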

Speaker 2

You know, how about the mere exposure effect, Chuck? And, like, "mere" is part of it. I'm not making a judgment about it.

I'm not making a judgment about it.

Speaker 1

That's right.

That means just merely being exposed to something has a vast impact.

So the more we experience something, the more we like it, which is why you see that commercial for the thing over and over and over, that Burger King ad over and over and over.

Although I wouldn't say you'd like that one more the more you heard it.

Speaker 2

That's the outlier for me too.

Speaker 1

But that's the idea, though. Just mere exposure will get you there.

Speaker 2

And then there's a related thing called the illusory truth effect, which is basically that repeated exposure to a lie causes you to eventually believe in it if you hear it enough times, even if you initially knew that it wasn't true.

So that makes me wonder if like it just wears you down over time, like your brain is tired of defending itself against being assailed with a lie.

And it's just like, fine, that's true.

I don't care.

Speaker 1

Yeah, I mean sure, politics certainly comes to mind.

Repeat the lie, repeat the lie, repeat the lie.

Speaker 2

Yeah. And I mean, like, it's a viable way to exploit people's cognitive biases in that respect.

Speaker 1

Should we close out with the big daddy of them all, the big C, confirmation bias?

Speaker 2

Yeah, let's do it, baby.

Speaker 1

All right, why don't you start this one?

Speaker 2

Okay. So there's a guy named Peter Wason, back in the sixties.

He coined the term confirmation bias, and he basically had an experiment that's really clever.

It's hard to understand at first, but it's very clever.

He basically said, hey, here is a sequence of numbers two, four, and six.

Figure out what the pattern is.

Just to be clear, this is really hard to explain.

If you find somebody who can explain this well, you'll get it.

But I don't think I'm a candidate for that.

I think we all know that I'm not going to explain this very well.

Speaker 1

Oh I don't think that's true.

Speaker 2

Do you want to take a crack?

Speaker 1

Okay. The original numbers were two, four, six, and people might tend to go with, like, all right, eight, ten, twelve, thinking it might be, all right, an ascending even number sequence, right? And they would say, no, that's not correct.

And you'd say, well, maybe it's four, eight, twelve, and it's, like, doubled or something, and they would say, well, that's also incorrect.

And then you're at wit's end, because what you haven't done is just tried any old ascending order. You didn't go one, seventy-nine, three hundred.

Speaker 2

All right, let me take a crack at it.

You ready?

Speaker 1

Sure?

Speaker 2

So the original numbers are two, four, six, and the participants would try to come up with the explanation of, like, what pattern are those numbers following?

Speaker 1

Right?

Speaker 2

So you might say, like, does eight, ten, twelve work? And they would say yes. And you'd say, okay, well, then you're just looking at ascending even numbers. And they would say, no. You still got this right, this still fits the pattern, but your hypothesis for it is wrong.

Speaker 1

Right, that's yeah, that's the key.

Speaker 2

Right.

Here's where the confirmation bias came in.

People would then go back and continue trying to find versions that fit their hypothesis to explain this even though it was wrong.

Yeah, rather than take their hypothesis and say, okay, this fits the pattern but my hypothesis still isn't correct, and start trying to break their original hypothesis by coming up with, like, just completely random stuff that doesn't fit it. In which case they might have said something like, does one, six, twenty-seven work? And they would say, yes, that fits. And then that might lead the person to see that actually, the only thing that has to be true to be part of the model is that the numbers have to ascend in order.

That was it.

But people, people, man...

Speaker 1

Might even try, you might even try and break it by saying three, five, seven, but you're still using that original a version of that original hypothesis.

Speaker 2

Exactly, yes, like you think it goes up by two or something like that.

Very few people go back and try to break their own hypothesis, and that's the point of confirmation bias.

Let's move on from that experiment.

The point of confirmation bias that this shows, if you can actually understand it from people other than us, is that we tend to take our initial ideas, our beliefs in a lot of cases, and look for information that supports them and discard information that doesn't.
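[For anyone who'd rather see the experiment than hear it explained, here are a few lines of Python, not from the episode, with Wason's hidden rule written out. Confirmatory guesses all pass and teach you nothing; only guesses designed to break your hypothesis reveal the actual rule.]

```python
def fits_rule(triple):
    """Wason's actual hidden rule: the three numbers simply ascend."""
    a, b, c = triple
    return a < b < c

# Confirmatory tests: all consistent with the "goes up by two" hypothesis,
# so every one of them just comes back True and teaches you nothing.
for guess in [(8, 10, 12), (20, 22, 24), (100, 102, 104)]:
    print(guess, fits_rule(guess))          # True, True, True

# Disconfirming tests: triples chosen to BREAK the hypothesis.
print((1, 6, 27), fits_rule((1, 6, 27)))    # True  -> "up by two" was wrong
print((3, 2, 1), fits_rule((3, 2, 1)))      # False -> ascending is what matters
```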

Speaker 1

Right. And, of course, you mentioned politics a minute ago.

Where you most firmly see that these days is if you are in a media bubble.

Probably. I don't know a ton of people that get their news from completely disparate points of view, and a lot of the news you're getting these days is so slanted to begin with.

It's probably not even the best example to use anymore. But, long way of saying, you're probably going to be seeking out news that confirms your beliefs, because you don't want your beliefs challenged.

Speaker 2

Yes. I mean, I, like everybody else, have trouble with that as well.

But I have to boast, Umi is actually really good about getting news from different sources.

Speaker 1

Yeah.

Speaker 2

And one reason that I find it difficult to do is because I have like physical reactions sometimes.

Yeah. And that is a thing. That's one of the reasons why they think we use confirmation bias: because it sucks to have your beliefs challenged, right? It's really difficult to overcome that.

And there's this thing called belief perseverance, which is, even when your beliefs are challenged with, say, an indisputable fact, you can still use confirmation bias to preserve that belief, because we usually attach our identity to, or build our identity around, our beliefs.

That's who we are.

So it's like we're being personally attacked.

And then even more than that, there's the backfire effect.

Right? Did you see that?

Speaker 1

I did not.

Speaker 2

So the backfire effect says that in the face of being presented with information that basically counters your own beliefs, you can actually solidify your original incorrect belief.

Right. You'll believe it even more strongly, even though you've just been given facts that contradict it.

So we really really hang on to our beliefs as much as possible.

And that is a huge, huge thing that humans trip over.

That confirmation bias is probably the granddaddy of all biases.

Speaker 1

I think, yeah, that's why I saved it for last. And, you know, there are a lot of reasons people do this.

You might be protecting yourself, like your self-esteem, because otherwise you're admitting that you may have been wrong about something, and it, you know, takes a big person to do that.

You want to believe that you're right about stuff.

And it also might just be difficult to process more than one hypothesis at once.

It might just be a little too brain breaking.

Speaker 2

Yeah, because once you lock into an explanation, your brain is just like, I know, we've got it.

We don't have to figure this other thing out.

Homeostasis, homeostasis.

You know.

It's it is very hard to entertain something that is counter to what we already think is true.

Speaker 1

That's right, all right everyone.

As you can tell by the clock, we are taking our second break, this is a long one, and we're going to come back and talk about behavioral economics right after that.

Speaker 2

Well, wait, before we do that, let's try to explain this confirmation bias study again.

Speaker 1

Yeah, we should. All right, we'll be right back.

Okay, as promised, let's talk about behavioral economics.

A lot of the work that Tversky and Kahneman did was super applicable and kind of revolutionary in a lot of ways for the world of economics and how people's buying behavior is affected.

They didn't invent it. Like, Adam Smith wrote about stuff like this, and starting around World War Two is when they started really kind of homing in on stuff like this, like using mathematical models. And it all kind of started with the assumption that people and companies and organizations are really just trying to pursue their self-interest at the end of the day.

Speaker 2

Yeah, and they're going to make the most rational decision.

And they were like, yes, we know people make irrational decisions, but those are outliers.

Like, if you take all of the information and the data in aggregate, you will see that humans generally try to make the most rational decision.

That's just not true.

People don't do that.

We make all sorts of irrational decisions that very frequently run counter to our own best interests.

And again, we'll even reject information that would help us make decisions in our own best interests if it counters our beliefs.

So there's a guy named Richard Thaler who ended up becoming a colleague of Tversky and Kahneman, and he took some of their papers, and he realized that these mistakes, these cognitive biases, can be predictable.

Right, you can actually map how somebody's going to make a bad decision.

And this became the basis of behavioral economics.

Speaker 1

Yeah. Well, let's talk about this prospect theory, because this was from Tversky and Kahneman.

It was an article from nineteen seventy-nine, "Prospect Theory: An Analysis of Decision under Risk," and Livia says it's probably the most cited economics paper of all time.

Like this was a revolutionary, landmark paper and they didn't write a ton of papers for researchers.

They did like eight total, which just shows what an outsize impact they had.

But in this paper they talk about a lot of attitudes about risk.

One is loss aversion, which is the idea that you're going to experience more emotional suffering when you lose money than you will gain happiness if you gain something. So you may pass up an offer that gives you equal odds of winning twenty-five dollars or losing twenty.

There was another example I think that kind of gets it across more. There was an experiment in nineteen ninety-six where they gave participants a lottery ticket, and before you scratched it off or whatever, let's say it was a scratch-off...

They said, all right, well, hold on, before you do that, I'll trade you: another lottery ticket, plus ten dollars in cash.

And for no logical reason at all, people tended to think that that first ticket was the one, even though it was just a lottery ticket; there was no difference at all.

They would turn down that extra ten bucks.

I think less than fifty percent of them took that deal.
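[For the math-curious, a sketch of the idea, not from the episode: prospect theory scores gains and losses with an asymmetric value function. With the ballpark parameter estimates commonly cited from Tversky and Kahneman's later work, the win-25-or-lose-20 coin flip feels like a loss even though its expected dollar value is positive.]

```python
# Ballpark parameters often cited from Tversky and Kahneman's estimates:
# diminishing sensitivity (alpha) and loss aversion (lambda).
ALPHA, LAMBDA = 0.88, 2.25

def value(x):
    """Subjective value of gaining or losing x dollars under prospect theory."""
    return x ** ALPHA if x >= 0 else -LAMBDA * ((-x) ** ALPHA)

# 50/50 gamble: win $25 or lose $20.
expected_dollars = 0.5 * 25 - 0.5 * 20           # +2.50, objectively favorable
felt_value = 0.5 * value(25) + 0.5 * value(-20)  # negative: losses loom larger
print(f"expected dollars: {expected_dollars:+.2f}")
print(f"felt value:       {felt_value:+.2f}")
```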

Speaker 2

Yeah, because in giving away or trading that first ticket, they risked a loss even though the gain was right there.

Just trading the ticket, you got an extra ten bucks, right, Yeah, that's fairly irrational.

We also have a lot of trouble with rare events.

Yeah, we tend to overestimate them.

It can be a positive event and it can be a negative event.

But we're really bad at probabilities and statistics.

And this is essentially it's like you won't let your kid walk to school because you're afraid of your kid being kidnapped, even though the chance of your kid being kidnapped is just ridiculously low.

It's technically irrational, even though very few people would fault you for that, but it's still an irrational decision.

Speaker 1

Yeah, for sure.

We talked about that in the...

Speaker 2

Well, was it Free-Range Kids?

Speaker 1

Maybe, I can't remember. It tied in with the satanic panic and stuff like that, I think, definitely, back in the day.

We also think about money in relative rather than absolute terms, and there's a great example.

You might drive an extra ten minutes in the car to a store you know is selling a shirt for twenty bucks rather than buy it at the one closer to you that sells it for thirty, because that saves you ten bucks.

But you won't save twenty dollars on a car, even if you only have to drive five minutes down the road, because you're like, oh, it's twenty dollars; the car is twenty thousand.

But in absolute rather than relative terms, you're saving twice as much money as you did on that T-shirt purchase.
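[The arithmetic in a sketch, with made-up prices: the same few dollars of absolute savings looks huge or trivial depending on the base price.]

```python
def savings(price, cheaper_price):
    """Return the absolute and relative savings of the cheaper option."""
    saved = price - cheaper_price
    return saved, saved / price

for label, price, cheaper in [("shirt", 30, 20), ("car", 20_000, 19_980)]:
    saved, fraction = savings(price, cheaper)
    print(f"{label}: save ${saved} ({fraction:.2%} off)")
# shirt: save $10 (33.33% off) -> feels worth a ten-minute drive
# car:   save $20 (0.10% off)  -> feels like nothing, though it's twice the cash
```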

Speaker 2

Right, But it's like you said, it's all relative again, totally irrational.

But all this stuff relates to economics, and like you said, this stuff can be replicated.

There's a twenty twenty study that looked at prospect theory in particular, and this major study was conducted in nineteen countries and thirteen different languages, and it held up. Not bad.

No, that's not bad at all.

And so it's not just economics.

It's not just being exploited by the wine list or you know, Kentucky Fried Chicken or something like that to make you buy their stuff.

This actually can have, like, life and death consequences too, although I guess so can wine and Kentucky Fried Chicken.

Speaker 1

You know what you're gonna get at Kentucky Fried Chicken?

What? Pepsi.

Speaker 2

That's right, you will get some Pepsi.

You know how I know that? 'Cause you just had Kentucky Fried Chicken.

Speaker 1

I did.

After our tour.

I was a little tired and needed just some fried chicken.

So I got fried chicken.

Speaker 2

What did you get, just the original or extra crispy?

Because you're crazy if you don't get extra crispy.

Speaker 1

I get extra crispy.

But they were out; they couldn't satisfy it.

I got the three piece.

They had two more pieces of extra crispy, and they asked if one piece of original recipe was okay, and I was like, yeah, sure, I'm not gonna not eat a piece of chicken.

Speaker 2

It is good.

They do chicken right. Yeah, they do.

Did you get the mashed potatoes and gravy?

Speaker 1

You know it, buddy.

Times two, and an extra biscuit.

I went all in.

It was a rare treat eating frenzy.

Speaker 2

Did you drink a Pepsi?

Speaker 1

I did?

Speaker 2

Awesome.

Well, that all fits somehow.

I don't know how, but it somehow fits this episode. So, where I was saying that this can be life and death: as with medicine, because although doctors have God complexes and like to present themselves as infallible, they are quite fallible.

They're humans and they can suffer the same cognitive biases as us.

But they have your life in their hands.

We rarely have others' lives in our hands.

Speaker 1

Yeah.

Do you watch the show The Pitt?

Speaker 2

I tried, and it just did not grab me.

I gave it like ten minutes, but I hear nothing but good things.

Speaker 1

Yeah, I mean, I really like it.

I've never been a hospital show guy, so this is kind of one of my first forays into it, but I like it a lot.

I haven't started season two, but I noticed when reading through these medical biases that they do, or at least Noah Wyle does, a really good job on the show with these younger residents trying to bust through, and a lot of this stuff comes up.

He doesn't say, hey, that's affect heuristic.

He just will talk about what that is.

And now that I know the definitions, I'm like, oh, he's talking about an outcome bias or an anchoring bias.

It's fairly interesting.

Speaker 2

Yeah.

Rather than, say, being presented with a really high price for a bottle of wine to make the other overpriced wine seem like a bargain, this can be, like, your first lab work comes back and that forms the anchored impression of your condition.

And even as new lab work comes back, that doctor may fail to adjust their view of your condition because they're not taking into account this new stuff.

They're giving more weight to that original number.

So yeah, and that's just the anchoring bias and the way that it can affect things.

Like you said, there's all sorts of other ways for it to happen, and all of it can result in poorer outcomes for patients, just because their doctors are humans and we don't really approach cognitive biases in a methodical or deliberate way.

Speaker 1

Yeah.

In fact, now that I'm thinking about it, they do this so much on the show.

The show could be called Medical Confirmation Bias: The Show, because you see it all the time.

Outcome bias is when you're convinced a shift in the patient's health is the result of a treatment, like, it's because of that thing I did. Or the affect heuristic that I mentioned: an emotional reaction to a patient, you know, kind of overrunning deliberating on the thing in a logical way.

This happens all the time on the show.

Speaker 2

Yeah.

Well, another field this happens in is forensic science, which, as we've gone to great lengths to kind of point out, is junk science in most cases, and a lot of that junk is just based on cognitive biases.

Speaker 1

Yeah, for sure.

I mean certainly the way they do lineups is flawed.

I mean, I feel like, you're right, we've done this a lot on the show.

The way they have done a lot of this is super flawed.

And I think maybe they're looking at it some but not a lot.

Speaker 2

No, So if you want to fight cognitive bias in your own mind, Chuck, what do you do?

What do you do?

Speaker 1

Well, there's a list of good tips here, and I think these are pretty good tips.

The first one is just being aware that you have these, which is something that we've already kind of worked through on this episode, except for you, of course, because you don't have these.

Sure, but studies show that, like, just being aware, it's not one of those things where, like, well, being aware is half the problem.

It's like being aware seems like two percent of the problem.

Speaker 2

Yeah, it's like you're aware that you have an unconscious bias.

It doesn't make you understand the bias.

You just know that they're there, right? That's the problem with it being unconscious.

Speaker 1

What else?

Speaker 2

There are some, like, actual things you can do, like delay decision making.

Yeah, don't come to snap judgments.

Go get more information. Go get information from a contradictory source or a different source or something like that.

And then, like, kind of tied into that, you can have, like, a personal rule, like, if there's a big decision, you will not make that decision until you've slept on it.

Yeah, for example, don't buy a TV unless your friend says, yeah, good idea.

Speaker 1

Try and consider your past experience, for sure, because optimism bias could come into play, like, hey, it worked out last time, so why would I take more time this time?

Speaker 2

Yeah?

And that's another way that you can kind of do that.

An exercise you can do is write down your expectations for an outcome and then go back and look at it afterward and see if you were right or not.

It can kind of help you realize, like, uh, I do kind of tend toward the optimism bias.

Speaker 1

Yeah, because I believe that was one of the other biases. It's even, like, hard to recognize, because you're biased in that you misremember what you thought going into it.

So writing it down is a good, that's a good one.

Speaker 2

One, right, But if you're super super unconsciously biased, you might be like someone else wrote this in my handwriting, Right, I've never been this wrong.

Speaker 1

What about Thomas Bayes and Bayesian reasoning?

Speaker 2

So he was a minister from the eighteenth century, and he basically came up with a standardized formula for taking into account the probability of an outcome, right, that things aren't, essentially... So, I saw this on LessWrong dot org, founded by one of the guys who wrote If Anyone Builds It, Everyone Dies, about AI, Eliezer Yudkowsky.

The whole point of LessWrong dot org is to overcome your biases in a methodical way.

And they love Bayesian reasoning, and it basically says there's no such thing as something being just true.

Everything is just a probability, and you can kind of try to determine how probable something is based on whatever evidence you can gather about it.

Just basically going through life like that.
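[A minimal sketch of what that looks like in practice, not from the episode; the numbers are invented. The point is that evidence shifts a belief by degrees instead of flipping it to absolutely true or false.]

```python
def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Bayes' rule: P(H | E) = P(E | H) * P(H) / P(E)."""
    p_evidence = p_evidence_if_true * prior + p_evidence_if_false * (1 - prior)
    return p_evidence_if_true * prior / p_evidence

# You start 70% sure a claim is true, then see evidence that's twice as
# likely to show up if the claim is false.
belief = 0.70
belief = bayes_update(belief, p_evidence_if_true=0.2, p_evidence_if_false=0.4)
print(f"updated belief: {belief:.2f}")  # ~0.54: weakened, not demolished
```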

Speaker 1

You know who hates that website? L-E-S-S-W-R-O-N-G? That dude who started his own personal comedy website, right, less wrong dot God.

That's right, he's just getting smashed.

Speaker 2

What else, Chuck?

Speaker 1

What else is there? Cultivate a growth mindset.

Speaker 2

That's a big one.

Speaker 1

Hey, I make mistakes.

I screw things up, and, like, I need to recognize that and try and grow from that, rather than, you know, just having my own biases confirmed constantly.

Speaker 2

Yeah, maybe, like, looking around at some of the ways that you're commonly exploited, say, like, by advertisers. Like, scarcity is one, when somebody says act now, supplies are limited.

They're creating a scarcity mindset in you. Social proof is basically, like, these people like this, so you probably should too, and you're like, oh, I should like that too.

Speaker 1

Yeah.

Speaker 2

And then two other things I saw.

There's something called cognitive bias modification, I think is what it is.

Okay, you can use this for, like, treating anxiety, right? Like, people tend to seek out negative facial expressions.

Oh yeah, and this treatment is, like, here's a thousand frowny faces.

Find the smiley face in there. And just screen after screen, you're looking for the smiley face, and you're training your brain to stop putting as much weight on negative facial expressions. Basically exploiting your cognitive bias to get over your cognitive bias.

Speaker 1

Oh wow.

Speaker 2

And then the last thing, Chuck, is apparently AIs are starting to show signs of emergent cognitive biases, because they use heuristics too. So they're starting to make errors in judgment in predictable ways, which are cognitive biases, just like humans.

Speaker 1

Rob Zombie, "More Human Than Human."

Speaker 2

That's right, you got anything else?

Speaker 1

I got nothing else.

Speaker 2

This is a good one.

This is fun.

Chuck, I'm going... well, since Chuck and I both liked this episode, that means we have no choice but for listener mail to be triggered.

Speaker 1

Right now. I'm going to call this "follow-up on Sebastopol," because I wondered what the connection there was. Because, if you didn't listen, there's a Sebastopol, California, and we were talking about the Sevastopol in the Crimean War, and I was like, there's no way that's a coincidence.

And it's not.

Hey, guys, listening to the podcast on the Light Brigade from Sonoma County.

Our Sebastopol was named after Sevastopol.

And here's a little information.

The settlement was apparently originally named Pine Grove, and the name change to Sebastopol was attributed to a bar fight in the eighteen fifties, which was allegedly compared by a bystander to the long siege of the seaport of Sevastopol during the Crimean War.

Wow. So the original name survives only in the name of the Pine Grove General Store downtown, and that is it.

There's also the Russian River Valley, so apparently there was some Russian influence in that area, which I didn't know about.

And that is from Marsha Ford.

Speaker 2

Yeah.

Also, we want to apologize to all of our Iron Maiden fans who wrote in to be like, yeah, that song "The Trooper" is about that whole battle.

Speaker 1

Yeah, I didn't know.

I am not.

I like Iron Maiden, but I didn't have as much shame upon my head as you.

But reading the lyrics, it doesn't say, you know, Crimean War or Charge of the Light Brigade, does it?

Speaker 2

I don't know.

I haven't heard it in a while.

I'm a big fan of the poster.

I love the poster a lot.

Speaker 1

Yeah, me too.

Speaker 2

Well, sorry all of you Iron Maiden fans out there.

We'll try to do better next time.

Speaker 1

Yeah, missed opportunity.

Speaker 2

Who is that that wrote in about Sebastopol?

Speaker 1

That was Marcia, I believe.

Speaker 2

Thanks, Marcia. Marcia, Marcia, Marcia.

We really appreciate you, and if you want to be like Marcia, you can email us as well.

Send it off to Stuff podcast at iHeartRadio dot com.

Speaker 1

Stuff You Should Know is a production of iHeartRadio.

For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.
