
How to Spend Your Time and Money Better (with Nobel Prize Winner Richard Thaler)

Episode Transcript

Speaker 1

Pushkin.

As humans, we do a lot of things that don't make all that much sense.

I sometimes imagine a completely rational alien species somewhere out there in the universe that's observing us earthlings and is totally shocked by how often we act against our own best interests.

We eat foods that we know aren't good for us.

We avoid small, simple actions like exercising or flossing, preventive care that we know will pay off in the long run.

But we also do some truly beautiful things, behaviors that might puzzle those rational aliens.

We donate blood or even an organ to someone in need and at real risk to our own health.

We're kind to strangers whom, in many cases, we will never even see again.

I also wonder what those aliens would make of the way humans experience happiness.

How complex and counterintuitive the things that promote our well-being must look to a totally rational being.

Unfortunately, it's unlikely I'll ever get a chance to chat with a rational extraterrestrial like one of the Vulcans from Star Trek, those creatures who embody pure logic and self-discipline, about the anomalies of human behavior.

But in this episode, I get the chance to do the next best thing.

I get to chat with a Nobel Prize winning economist, although, as you'll hear in this episode, he doesn't exactly fit the stereotype.

Speaker 2

Uh, let's see, what's my name?

I'm Richard Thaler.

I'm a professor of behavioral economics at the University of Chicago's Booth School of Business.

Speaker 1

You don't mic drop the, like, Nobel laureate?

Did you bring it?

Did you bring the medal?

Like, do you bring it in your pocket,

in case, to pull it up on video?

Speaker 2

It's a little heavy, so I don't.

Speaker 1

Doctor Richard Thaler started writing his seminal book The Winner's Curse: Paradoxes and Anomalies of Economic Life all the way back in the nineteen eighties, so you might be surprised to hear that I'm featuring it in this series about my favorite books of twenty twenty five.

But several decades after creating the book that transformed the way I think about human behavior, Richard is republishing it with new research and new reflections on the irrationalities of everyday life, all with the help of a new co author.

Speaker 3

My name is Alex Imas.

I'm also a professor of behavioral economics and behavioral science at the Booth School of Business.

I don't have any Nobel laureate stuff to mention, so.

Speaker 1

You'll come back in a couple of years on the podcast, and I'm sure those things will be there.

So we're featuring The Winner's Curse as one of my favorite books of twenty twenty five.

But that feels kind of odd because I remember reading The Winner's Curse when I was like in graduate school.

So it's kind of an odd book to have as my favorite book of twenty twenty five and one of the books that really taught me important things.

But maybe I'll have you, Richard, set the stage of, like, the origin story of this book.

Speaker 2

Okay, Yeah, the origin story is when I was about Alex's age, about forty years ago.

I had just come back from spending a year with Danny Kahneman, my mentor.

Speaker 1

Just for context, the late Danny Kahneman was also a Nobel Prize winner, but he was a psychologist, a heavyweight in my field, who is credited with pioneering the field of behavioral economics.

Richard, then a young professor at Cornell, was building on Kahneman's insights as he forged his own path.

Speaker 2

And somebody suggested that I write a series of columns in a new journal called the Journal of Economic Perspectives.

The journal was aimed at like the general economist, so it would be articles that non specialists could understand.

The idea was in each issue, I would write about something an anomaly.

And what is an anomaly?

An anomaly is something that is unexpected, right, so an elephant in real life would be an anomaly for an economist.

An anomaly is something that the theory says won't happen, like if price goes up and demand goes up, that would surprise us.

So I did this for about four years, and when I had enough of these that it looked like a book, I kind of stapled them together and called it the Winner's Curse because it's an intriguing phrase and it was one of the anomalies.

Speaker 1

At the time, Alex was in a very different place in life. He was being born.

Speaker 2

How was it?

Speaker 3

I was in Moldova at the time.

I was born in Moldova in Eastern Europe.

Basically I wasn't around when the book first came out.

He was shirking, yeah. But then Richard and I met when I was a graduate student at San Diego, and our offices ended up being adjacent. I was super interested in behavioral economics, super shy, kind of didn't want to bug him, but eventually I kind of got the courage to be like, what do you think about this idea?

What do you think about that idea?

We started chatting.

I got my first faculty job at Carnegie Mellon, and then at some point I joined Booth.

Speaker 2

And then a few years ago the publisher of this book said, hey, you know, this book is getting old like you, and it's going to go out of print.

Maybe you want to freshen it up or something.

And I think they had in mind a new cover, and stupidly I thought, oh, well, maybe there's something more ambitious we could do.

Speaker 3

Fortunately for me, by the way, maybe stupidly for you.

Speaker 2

Yeah.

Well, I got the clever idea to get Alex to help.

So I had this pile of anomalies.

I wrote another half dozen of these after the book came out, and the question was, you know, it's forty years later, how does this stuff hold up?

Speaker 3

So that was the concept. We got on the phone, and it sounded kind of straightforward

and easy, like, let's knock this thing out in six months. It ended up being anything but that. It took us about four years to actually finish it up.

That's largely because two thirds of the book is brand new.

Speaker 1

The new version of The Winner's Curse includes the original anomalies Richard began exploring in the eighties, but also new research examining whether the old ideas hold up today.

Speaker 3

Really, how has this held up in terms of the empirical robustness of the results?

Have these results been replicated?

Do they show up in the real world? The original results were largely, you know, with some exceptions, in lab experiments with college students, relatively small samples in some cases.

And you know, we've had forty years, there's been hundreds of replications.

Where are we now?

So part of the book, and we emphasize this throughout, is we went ahead and asked: have these studies been replicated?

The findings are really robust. Everything replicates; everything has been replicated.

Speaker 1

The anomalies the book explores are all about the surprising ways humans act differently from what's known as the standard economic model, the model of rational behavior that both Vulcans and economists tend to expect from us.

Speaker 2

The standard economic model is really that people, or as economists call them, agents, solve problems by optimizing.

So what route did you take to get to your office today?

You took the best route?

Why are you doing this podcast?

Because it's the best possible use of your time?

That plus markets, that is what distinguishes economics from psychology, say.

Speaker 1

And so this idea of these agents that are optimizing in the best way, interacting in these markets, produced a certain kind of view of what people tended to do when making these decisions.

And that's what's often been called homo economicus. Give me, like, the homo economicus one-oh-one. Like, you know, what do we think this guy is doing?

Speaker 3

Homo economicus is a rational agent who obeys principles of rationality. So they have rational expectations.

They don't have any memory issues, they don't have any sorts of biases about what's out there in the real world.

They're fully rational.

They take these beliefs as inputs and then they optimize.

They make the best decision given their correct beliefs about the world.

Speaker 2

We should add that these agents are also selfish jerks, so they don't really care about anybody else, possibly members of their family, though possibly not.

And they also have no self control problems, So no need for any weight loss drugs in this world.

Everybody weighs just the right amount.

Speaker 3

No AA, no alcoholics, nobody gets addicted to drugs unless they really want to.

Speaker 2

And no need to worry about saving for retirement.

People will do that because otherwise they're going to starve when they're old.

Speaker 1

Of course, you're saying all this stuff quite facetiously, because real people don't tend to do this.

We are in a world with AA and weight loss drugs and people who haven't saved for retirement.

And these are the kinds of anomalies that you pointed out in your book, These cases where you looked and you said, hey, people are supposed to be obeying the standard economic model and they're just not.

And so what does that mean for human psychology?

And maybe how can we avoid these anomalies, especially in cases where they're kind of hurting us and messing us up.

And so that's what I want to go through today.

I want to go through my favorites of the anomalies that you cover in the book, six of them.

And anomaly number one that I love is the anomaly that the book is named for, the so-called winner's curse.

So Alex, tell me what the winner's curse is, using the famous jar example that shows this anomaly.

Speaker 3

Yeah, so imagine you go into a bar with a jar of coins.

The jar has a certain amount of money in it, and then you tell everybody at the bar, look, whoever bids the most for this jar gets the jar and all the money that's in the jar.

What's the winner's curse?

Pretty simple.

The person who wins the jar will end up losing money in the sense that their bid is going to be higher than the amount of money that's in the jar. So they'll get the jar, they go home, they take all the money out, they're like, oh crap, I paid twenty bucks.

This has fifteen dollars in it.

So that's the winner's curse that by winning the auction you're actually losing.

The jar example is really kind of easy to demonstrate.

You could do it in a class. Richard, you've done this in a class before.

Speaker 1

I'm sure you do it in classrooms, but do you ever do it in a bar?

Is this like your bar trick?

Speaker 2

It better be a friendly bar.

Speaker 3

I don't want to get beaten up.

Speaker 1

So that's the bar version, you know, the kind of toy version that economists talk about.

But you've talked about lots of examples where we see this in real life.

A curious example I didn't know until I read your book was the case of oil companies. Richard, how do oil companies fall prey to this winner's curse?

Speaker 2

Yeah.

In fact, the winner's curse was discovered by oil engineers at a company that was called Arco.

Oil companies were bidding for leases for a certain plot of land in the Gulf of Mexico.

We're going to still stick with the original name, the Gulf of Mexico.

And what they found was that every time they won one of these auctions, there was less oil there than their engineers had predicted.

And they're saying, what's the story, lousy engineers or are we just unlucky?

And then they figured out no, actually, it's that if there are lots of people bidding, the winner is going to be the one whose engineers on this plot had the most optimistic forecasts.

Speaker 1

And the winner's curse stems from the fact that we're not tracking that, well, hey, if everybody's bidding as if there's less money in this jar than I think, or if all the other oil engineers are bidding less than I am for this oil, maybe I'm wrong about how much oil or money is really out there.

Perspective taking failures are ones that we talk about a lot on this podcast because many of them really impact our happiness in these bad ways.

Right, we don't realize that other people want to connect with us, or we don't realize that other people don't know we're super grateful, or don't know that it'd be fine if they asked us for help.

So there's all these cases in the happiness science where perspective taking failures kind of mess up how much happiness we could be getting from other people.

But this is one where it seems to be messing up a lot of the value that we get from winning, quite ironically. And so how do we overcome the winner's curse?

How do we take into account what other people are bidding, and other people's perspectives, a little bit more to break this?

Speaker 2

Well, a simple lesson is, in the actual bidding circumstances, the more bidders there are, the less you should bid.

Now that is really counterintuitive advice.

If Alex is auctioning off this jar of coins and there are ten of his friends at the bar, and then twenty more people come in and we let them bid, all of Alex's friends should lower their bids. And that's just hard to get your head around.

Speaker 1

Yeah, I feel like my intuition is exactly the opposite.

Speaker 2

Right, it's because you have the intuition, oh, I want to win.

No, the goal should be to submit a bid such that if it wins, it will be lower than the amount of money in the jar.

It's very counterintuitive, which is why people get it wrong.

It's a classic anomaly.

Speaker 1

Okay, so that was my number one.

Classic irrationality number two, which is one that just makes me feel good about the human race, is that people aren't the self-interested jerks that economists think they are.

You mentioned this in your description of homo economicus, that homo economicus is just out for himself.

But explain why standard economic theory sort of predicts that.

Speaker 2

In a sense, economists have talked about a world in which people don't care at all about anybody else.

Paul Samuelson, one of the great economists of the twentieth century, wrote a paper about what he called the public goods problem.

A public good is something that if you provide to one person, you provide to everybody, like a fireworks display.

And he said it will be underprovided because no one will contribute because they can watch it for free.

Now, of course, people do contribute to charities and to public radio and all kinds of other good causes.

If you open your eyes and look out the window, you'll notice that people aren't always selfish jerks.

Speaker 1

And this is the kind of thing that experimenters were starting to notice around the time of The Winner's Curse, too.

Alex, tell me about some of the classic violations of selfishness that folks talked about early in the literature.

Speaker 3

The real anomaly came when Werner Güth and his colleagues ran something called the ultimatum game.

It's super simple.

Richard and I are playing the ultimatum game.

I have ten dollars, and I decide how to split that with Richard.

I give him an ultimatum.

I'm the proposer.

Richard is the receiver.

Richard sees my offer to him.

Let's say, I say, look, I like money.

I'm going to keep nine dollars.

I'm going to give a buck to Richard, and then Richard says, I want the dollar.

Everything's fine.

I end up with nine, Richard ends up with one.

But if Richard says no, both of us end up with zero.

Mm hmm.

Speaker 1

And so if homo economicus is the proposer, he should give, you know, one penny, right, and keep nine dollars and ninety-nine cents for himself.

And if Richard is a homo economicus, Richard'd be like, wow, one penny is better than nothing.

I should go for that.

But if you're playing with real humans, if you offer one penny of your ten bucks, most real humans are like, expletive you, you jerk. Like, you know, I'll take a hit, right?

You jerk like you know, I'll like take a hit, right.

And so what did Güth find in the original ultimatum game experiments?

Are people homo economicus or do they throw some expletives in there?

Speaker 3

That's exactly what he found: basically, people are not homo economicus.

Essentially, offers under twenty percent of the pie are rejected.

So if I offer something less than two dollars, Richard's going to reject my offer.

Both of us end up with nothing.

And so this is a real puzzle.

So the proposer's decision of how much to give, that could be driven by a lot of different things.

But the real puzzle is the receiver saying I would prefer nothing.

Speaker 2

To two dollars.

Speaker 1

So this is the case in these, like, standard economics games, where we're really setting up these kind of arbitrary situations.

You know, I'm giving you ten bucks with these really specific rules.

But folks were also finding that people are nicer than you expect when you give them more real world contexts.

Richard, tell me about these famous wallet studies and how they showed that people were nicer than we think.

Speaker 2

Yeah, so there's a guy called Alain Cohn who was a postdoc here for a while, and he undertook this unbelievably ambitious project where they would take a little wallet that would have some kind of ID.

Suppose it's Alex's wallet and it has something like his email address on it, and sometimes money and sometimes a key, and they would turn it in at some place like a hotel lobby or a train station.

And they did this with thousands of wallets all around the world.

And the question is what do people do.

Are they more or less likely to try and find the owner of the wallet if it has money in it?

And if people are selfish jerks, the more money that's there, the less likely they are to turn it in.

Speaker 1

Or at least to turn it in with the money.

Speaker 2

Right, exactly.

Yeah. And the opposite happens.

The more money that's there, the more likely that it gets returned with the money to the fictional owner.

Speaker 1

And so these are anomalies when it comes to the standard economic model, but they're kind of great when it comes to human nature, that we're kind of naturally not selfish jerks.

But a question I had for you, given all the evidence, is what can we do to get people to become even more cooperative?

Are there ways that we can bump this lack of selfishness up even more?

Speaker 3

Yeah, So one of the things that's been shown to matter a lot is how much people can connect interpersonally.

So any sort of communication between people before some sort of potential exchange takes place or any sort of decision takes place, facilitates more cooperative, less selfish behavior.

People kind of get to know each other, that connection brings them closer together, and then they're a lot less selfish.

So that's one thing that really very intuitively boosts cooperation.

The other thing is that in society we have a bunch of norms.

Some of those norms involve punishment of people who don't cooperate, and when you introduce these sorts of norms into these kind of very abstract games, turns out that boosts cooperation tremendously.

One of the largest effects in the behavioral economics literature is that when you take a standard sort of abstract game, where people can contribute to the public good or something like that, if you introduce the opportunity for other people to punish non-cooperators at a cost to themselves, by the way, which is itself an anomaly in the standard economic model.

If it costs me something to punish somebody else, I would never do it.

But turns out people do, and because there's this threat of punishment, everybody ends up cooperating.

So this is kind of like a little model of society where you take a group of people interacting, you introduce the types of things that we see in the real world, like norms and the ability to punish, and all of a sudden, the world looks a lot less selfish.

Speaker 1

So it seems like there are some fun happiness implications from this anomaly.

It seems like happiness implication number one is, like, we can just trust people more than we might think.

The other is that there are ways you can boost cooperation even more: talk to people more, get more social connection, which honestly is great for happiness anyway, and, if possible, give people the opportunity to set up norms where you can call out bad actors, which is good.

Speaker 2

Yeah.

The one finding is that if you are trusting, you produce more trustworthy behavior.

You know, in the old book I talked about farm stands in Ithaca, where I used to teach at Cornell, where a farmer would put out like fresh corn on the honor system, and people would put money in the box.

And Alex and some of our behavioral economics friends were out hiking in the Swiss Alps this summer and came across a place that had wine and cheese.

What were they selling, Alex?

Speaker 3

There were, like, you know, these little cabins, secluded in the Alps.

There's nobody there.

You come in, there's wine and cheese set up.

You take however much you want, then you put your money in the little box and you leave.

Speaker 1

It's less quaint than Cornell, I guess, with fresh corn versus really nice French wine.

Speaker 2

But yeah, but you have to hike up there to get it.

Speaker 1

Coming up after the break, we'll dive into more of my favorite irrationalities, like why I keep paying for a gym that I've only ever used once.

And why it's so hard to actually redeem the frequent flyer miles I've been hoarding.

The Happiness Lab will be right back. All right, so now we're moving on to irrationality number three, something that I struggle with a lot personally, which is that, as humans, we seem to have more of a problem with inertia than the standard economic model might predict.

Richard, tell me about the classic mug study that showed us the problems that people face with inertia.

Speaker 2

Yeah.

So there's something that I originally called the endowment effect, because your endowment is something you own, and the empirical result is that we value stuff we have more than we would be willing to pay to get it.

Speaker 1

So, maybe have me and Alex play. If you were putting me in an endowment effect study, how would you do it?

Speaker 2

You and Alex are in a class, sitting next to each other, and I go around and I give half of you a Yale University coffee mug.

It says Boola Boola or something on the mug.

Speaker 1

Boola Boola, which, of course, for those that don't know, is the Yale fight chant.

There are literally those mugs in the bookstore.

Speaker 2

So yeah, yeah, yeah, And then we say, all right, what we're going to do is we're going to have a market for these mugs.

Laurie, you have a mug, and you're asked, for each of the following prices, are you willing to sell it or not?

So ten dollars, nine fifty, and so forth on down. And Alex is asked, at each of these prices, are you willing to buy one?

Now, a principle of economic rationality is if you wouldn't pay six dollars to get it, then you should be willing to sell it for six dollars.

Suppose we hand out five-dollar bills. Well, people will be willing to trade those for six dollars, and they won't pay six dollars to get one.

Right, So mugs should be like five dollar bills.

But when we run those experiments, the people who get mugs demand twice as much to give them up as the people who randomly didn't get a mug are willing to pay to get it.

And notice it's not like we're asking about your favorite hat, right? This mug has been in your possession for about two minutes and you haven't grown to love it, but you act like you do.

Speaker 1

So Alex, explain how the endowment effect is an example of us showing inertia, because I think that it's part of a broader set of biases that you talk about in the book that I find really fascinating.

Speaker 2

Yeah.

Speaker 3

So basically, there's a bunch of different biases, each essentially an effect that's observed in a particular context, that are kind of driven by the same sort of underlying psychology, which you can describe as inertia, and that is driven by something that we discuss as loss aversion.

In particular:

Look, I'm already here.

I have this thing or this activity that I'm doing, or whatever you want to call it.

This is now kind of my status quo.

This is where I'm at. Lose that thing, and that loss really hurts.

In fact, a loss hurts about twice as much as an equivalent gain feels good, maybe sometimes more.

This is what we call loss aversion.

Speaker 1

Fits with the mug results that Richard was just talking about, because people are demanding twice as much to sell this mug as buyers are willing to buy.

It's like losing the mug kind of hits them twice as hard financially.

Speaker 2

Exactly.

Speaker 3

Now this mug is mine, this is kind of now my status quo, and giving it up will hurt a lot.

So I need more money to be compensated for that pain.

Speaker 1

And so that's in the context of, again, this sort of toy example where you're selling mugs, but this kind of status quo inertia bias is something that lots of companies are using against us all the time.

I have my own example right now.

I have paid to join a gym as part of this summer program.

This really cool bouldering gym near my house was like we have yoga classes, and I was like, oh great, I'm going to join the bouldering gym.

And it turns out that so far this summer, I've had a very expensive single yoga class for the price of the entire summer enrollment.

It's like a three-hundred-dollar yoga class, which is really embarrassing.

But this is my status quo bias at work.

Richard, explain why this is.

Speaker 2

Yeah, our friends, the married couple Stefano DellaVigna and Ulrike Malmendier, wrote a paper about this when they were grad students.

That was called Paying Not to Go to the Gym, which is what Laurie did this summer.

You know, this is an example of what's called status quo bias.

Whatever the status quo is you tend to stick with.

We all know this happens.

Suppose you're watching some show, say you're streaming, and an episode ends and then the next one starts.

So if you do nothing, which people are really good at, then you start watching the next one and the next thing you know, you've watched four of this stupid sitcom that really wasn't all that much fun, but you just found yourself doing it.

You know, you can use this for good or for bad, and firms have figured this out. So that gym, warning, Laurie: they are going to automatically renew your subscription.

Yeah right, and you know what, you're only going to go once this fall because you'll be even busier.

So I'm gonna save you a lot of money here and suggest you quit now.

But we've used exactly that trick for good.

When the new kind of retirement plans, 401(k)-type plans, first came on the scene, one of the problems was that people just didn't sign up even if their employer was matching their contributions dollar for dollar, which is really, really stupid.

And so we got the clever idea, why don't we just change the default and say, hey, Laurie, we have a new retirement plan.

We're going to put you in unless you fill out some form and say you don't want it, and boom, that gets enrollment up to ninety percent.

And notice that's something economists would say will have no effect, because what kind of idiot would turn down a dollar-for-dollar match of six percent of their salary?

No idiot would say, oh, no, because I have to fill out a one-page form.

But you know, enrollments went from fifty percent to ninety percent just by doing that.

So that was using inertia for good.

Automatically renewing your gym membership is for profit and we observe both.

And there are lots of policy decisions.

One that is a little up in the air: in the last weeks of the Biden administration, they passed a rule saying that you have to be able to unsubscribe the same way you subscribed.

So if it was with one click, it should be one click.

Some gyms, during COVID, were requiring people to come to the gym, which was closed, in person to quit.

That's really evil, right, So this rule is up in the air.

I don't know whether it's going to be enforced or not, but I strongly believe that's a rule for good.

Speaker 1

So it seems like the happiness lesson from this problem we have with inertia is, like, if you're stuck in some status quo, make sure it's a status quo that you like, or build your own status quos that might be helpful. But try to notice when you're getting stuck in something just because it's the thing that you're used to.

Speaker 2

Yeah, okay.

Speaker 1

So that was irrationality number three that I think has interesting implications for happiness.

Now we get to irrationality number four, which is I think a very very big one when it comes to our happiness.

And this is this idea that we have a defective telescope, as economist Arthur Pigou put it.

What does Pigou mean by a defective telescope?

Speaker 2

The idea is that the difference between today and tomorrow seems bigger than the difference between two days a year from now, irrationally so.

So Laurie is going to say, well, today, she's busy, she's taping this podcast, she doesn't have time to quit that gym.

She's going to do it tomorrow, and tomorrow there's going to be something else, so we all procrastinate.

That's maybe one reason it took us four years to do this simple revision of this book. Alex had a couple of kids, so he has an excuse.

So that defective telescope affects things like saving for retirement, because retirement seems like it's a long way off, and then suddenly you wake up and you're old like me.

Speaker 1

I mean, I'll give you an even closer-to-home example for listeners who might not be as old as you are, Richard.

Speaker 2

Is imagine that.

Speaker 1

Which is just what's happening in your calendar weeks or months from now. You know, today Laurie, if you ask me, hey, would you want to write a chapter for a book that doesn't even sound that interesting, I'm like, no way.

But if you ask me, does December Laurie want to put some time into writing a chapter that sounds kind of interesting?

I'm like, oh my god, December Laurie would love that.

She would love to spend her time doing that thing.

And so I feel like my personal calendar is filled with these instances of past Laurie's myopia. Like somehow she thought that, you know, doing this podcast interview today would be totally fine to do with a bunch of student meetings, and she doesn't need lunch that day.

She'll just squeeze other things in.

Speaker 3

And I think that, you know, in terms of, like, rules in order to kind of solve these sorts of issues,

my favorite one is: if that person asked you to do that today, would you do it? If the answer is no, don't do it a year from now, because a year from now, unless something real bad happens, will at some point become today, and you will be like, holy, you know.

Speaker 1

Know, yeah No.

I think this is so powerful, right, because I think one of the big happiness implications of this defective telescope is that we're just constantly screwing over our future selves.

Interestingly, we talked recently on the podcast about a different way we screw over our future selves, and I'd be curious how behavioral economists think about this.

Behavioral economists are constantly talking about cases of myopia, but happiness scientists often consider these cases of what's called hyperopia, right, where you kind of wind up saving these good things for the future that you never end up enjoying.

So I'm thinking of cases of like my frequent flyer miles.

I'm sitting on one hundred thousand frequent flyer miles that I'm like, someday I'll want to enjoy these, but I never actually plan the vacation to use them.

Or you know, a really nice bottle of wine that a good friend gave me for a birthday and I'm like, oh, I want to wait for the special day to use that, and then I never use it, and years later I'll probably open it and it's corked and I've forgotten about it.

Are there hyperopic cases that behavioral economists think about?

Speaker 2

Yeah. So I have a student named Suzanne Shu, and she talks about the example that you have two tomatoes sitting on your kitchen counter, and one is perfect, it's like God's tomato, and one is two days past its prime. Which one do you eat tonight?

And in my family we refer to the perfect one as Suzanne's tomato.

My wife is more frugal than me, and she can't bear the thought of throwing away that almost still good tomato, whereas I always want to go for the perfect one.

Speaker 1

And that's important because if you wait till tomorrow, Suzanne's tomato is going to be not so good too.

Speaker 2

You never eat Suzanne's tomato, that's the problem.

So you gotta go for Suzanne's tomato when it's there.

Speaker 3

Our friend George Loewenstein has, and we talk about this in the book, this idea of anticipatory utility.

You were talking about a nice bottle of wine or all of these travel miles or something like that.

You're kind of saving it for something.

You have this dream in your head that there's gonna be this perfect event where you open up the bottle, you have this incredible dinner party, everything is amazing.

It's gonna happen at some point in the future.

And what this allows you to do is kind of just like go on imagining that that's gonna happen, which provides you some sort of happiness over time.

And this sort of anticipatory utility leads to what looks like putting nice things into the future too much.

Speaker 1

And so it seems like we want to embrace the savoring.

We want to embrace the anticipatory utility.

But maybe just put in the calendar that, like, you know, October twenty-first, I'm gonna drink the good wine, or maybe, like, in January write in your calendar, use the frequent flyer miles this month.

Speaker 2

And next time Thaler's in New Haven.

That's when you open that.

Speaker 1

We'll see if we have some of Suzanne's tomatoes then.

I don't know if we will. Okay.

When we get back from the break, we'll dive into my final two favorite anomalies.

Things like why Dustin Hoffman hit up Gene Hackman for lunch money.

And why people buy convertibles in snowy climates.

The Happiness Lab will be back in a moment.

Irrationality number five gets us into monetary territory, and it's the idea that money is actually fungible.

But we don't treat it like that.

Alex, what's this idea of fungibility?

Speaker 3

The idea of fungibility underlies the whole reason we have money in the first place, and we're not bartering anymore.

It just basically means a dollar is a dollar.

It doesn't matter how I got the dollar.

It'll buy you the same thing.

So you should spend it the same way, regardless of how you got it.

So let's say you got a nice bonus at work, you come home, you think about how you want to spend that five hundred dollars, or you found a five-hundred-dollar envelope on the ground and you think about how you want to spend it.

You should basically spend it the same way.

It doesn't really matter how you got it.

The money is worth the same to you.

You should spend it exactly the same way.

And this idea of fungibility underlies the very very basic principles of economics.

Speaker 1

Except the problem is that we don't tend to do that.

In fact, we violate this fungibility principle in pretty much every behavior we engage in. Richard, in the book, you use an example of the way we violate it when we're purchasing gas under certain situations.

I hadn't heard about this one. Share the gas example.

Speaker 2

Yeah.

So one of the big themes in the book, in terms of methodology, Alex referred to this earlier, is that a lot of stuff that was demonstrated with thought experiments or lab experiments has now been replicated with actual data.

And this is a good example of that.

Our friends Jesse Shapiro and Justine Hastings looked at what happened when the price of gasoline fell by fifty percent during the financial crisis, right, so this was really bad times.

Also unemployment.

People are cutting back on everything.

But gas has gotten cheap.

So they were spending eighty dollars a week on gas and now it's forty. So there's forty dollars extra in their budget.

And what do they do with that windfall, Well, they spend some of it very stupidly on better grade gas.

Speaker 1

So it'd be like I would buy the eighty-seven octane, you know, the cheapo gas, the cheapest one on the board.

But then when the price of gas falls, instead of saying, like, oh, I can save that money and go get an extra coffee or something like that, I say, today I'm going to get the ninety-two octane, the premium, right?

Speaker 2

Exactly, and that will do exactly no good.

The car won't appreciate it.

Go buy a better bottle of olive oil.

Speaker 3

You know.

Speaker 2

So that's a good example of what I call mental accounting.

Speaker 1

Yeah, and so what's mental accounting?

Because I think it's so intuitive once you explain it.

Speaker 2

Mental accounting is sort of the way we keep track of stuff.

There's a great video you can find online Dustin Hoffman and Gene Hackman and they're having this discussion.

It's about back when they were starving actors, and Hackman tells this story about going to, he calls him Dusty, going to Dusty's apartment in Pasadena, and Hoffman says he needs some money, can he loan him some money?

Can he loan him some money?

And then Hackman goes into the kitchen and he says, hey, you need money.

There's these jars in your kitchen and they've got money in them.

Why do you need money?

And Hoffman says, yeah, but there's no money in the food jar, right? So that is mental accounting.

Now, I should say budgeting per se is not stupid, you know, and making sure that you have enough money to pay the rent each month, that's smart, and in fact, creating the equivalent of those jars can be helpful.

But spending the gas windfall on gas that's stupid.

Speaker 1

But we can harness these biases also for good. You mentioned before the example of the tool you were using to get people to save more, and this is another spot where you were able to use people's mental accounts to do a little bit better too, using people's raises, for example, to get them to save a little bit more.

Explain what you did in that program.

Speaker 2

Yeah, so I mentioned before that the first problem we had to solve was getting people to sign up.

Then the second problem was getting them to put more money in.

And the trick there was, another one of my former students, Shlomo Benartzi, and I created a program we called Save More Tomorrow.

And this goes back to what we were talking about before, because it's tomorrow.

Speaker 1

Tomorrow is far away.

I'm happy to save.

Laurie's going to be a great saver tomorrow.

Speaker 2

Right, So we would go to people and say, how about if you increase your saving contribution when you get your next raise?

All right, So that's combining two of the things.

So first of all, it's tomorrow, right? You know, it's in January when I get the raise, so sure. And I'm going to have more money, so I'll take some of that new money and I'll save it.

And that program, in the first company we tried it at, tripled saving rates.

Wow.

That has created billions and billions of dollars in savings around the world.

Speaker 1

And this kind of fits with the ideas we've been talking about for so many of these other anomalies, right, which is that mental accounting in some contexts looks sort of silly, when you're just kind of spending your gas budget on higher-priced gas just because you can.

But you can also use mental accounting to do things that make you happier.

Speaker 3

Right.

Speaker 1

You can think of the account of, oh, I'm going to get this future raise that's not allocated yet, so I can put that into savings.

I feel like I do this all the time when the so-called pain of paying is kind of high. Like, I want to do something, but it feels sort of luxurious. If I take some random windfall I get, say I get an extra honorarium from a talk I wasn't expecting,

I put that towards the stuff that normally I'd feel a little bit embarrassed about getting, and then that makes me feel not so bad.

That's like a happiness hack that I use for mental accounting all the time.

Speaker 2

I'll tell you a funny story about my daughter Maggie, who has been raised by a behavioral economist.

Right.

She lives in Rhode Island, and one of her neighbors grew up to be a Major League Baseball pitcher playing for the Mets.

And I noticed the Mets were going to have a game in which this kid was going to be pitching, and so I called Maggie and said, hey, would you want to go to that game?

I'll treat you to two tickets, and she said, okay, great.

So this game was like in another day and a half, so we had to act fast.

So I look online.

I send her a link and say, look, mag you can get what looked like pretty nice tickets for three hundred dollars each.

Buy the ones you want, I'll send you one thousand dollars, and go have fun.

And she texted me back and said, well, this is just like in your book.

If you send me one thousand dollars, I'm not going to use it on going to a baseball game.

Speaker 1

All right, so now we get to my final irrationality that I think matters for happiness, which is the problem that we don't always know what we're going to like in the future.

Alex.

In the book, you use this so-called hungry shopper example.

Speaker 3

What's that? Well, I think everybody's kind of probably familiar with this one.

You shouldn't shop on an empty stomach, because everything feels like it's going to be really good, and you fill your grocery cart with all of these delicious-looking things.

You go home, you eat dinner, and then you're like, why did I buy seventeen different types of potato chips?

And it's the idea that you can't really accurately imagine what you're going to be like and what you're going to want in the future.

So if I'm hungry, I kind of think I'm always going to be hungry, I'm always gonna want these things.

You're going to fill your grocery cart with those things, and then all of a sudden, you eat and your house is filled with stuff you don't want, especially, you know, stuff you really don't want, like chips and other sorts of unhealthy things.

It's this idea of perspective taking that we started with, only it's not perspective taking with respect to other people; it's perspective taking with respect to what you're going to be like.

Speaker 2

In the future.

Speaker 3

So people struggle with both.

Speaker 1

In some ways, it makes sense that we struggle with both when it comes to, like, hunger, right? It's hard to kind of imagine a different state from the one you're in.

But we see these kinds of effects more broadly than like what's happening in the grocery store.

Alex, tell me about the example of the effect of a person's weather in the present day on their car purchases.

Speaker 3

Yeah.

So this is by our colleague Devin Pope and his co authors.

They basically looked at data.

This is, again, going into this kind of revolution in behavioral economics.

We started out with these experiments with college students, and now most of the results that we're seeing are with data: millions of car purchases.

They take this database of what sort of cars people are buying and when, and they find that when it's sunny, people are a lot more likely to buy a convertible.

And the funny thing is that, you know, if you're in Minnesota, you're out on that one sunny day, June seventh.

Speaker 1

Yeah, you all live in Chicago.

We know the one sunny day in Chicago.

Speaker 3

Yeah, and you know people are buying these convertibles and they're going out, they're happily driving off the lot, and then you know, the rest of the year starts and they're like, oh crap.

And the idea is that when it's sunny, you kind of imagine yourself, wind flowing through your hair, driving down the highway, and it's really hard for you to imagine, wait, it's probably going to snow tomorrow and I'm going to have to close it up, and it's just going to be kind of a windy, cold car.

And then similarly, you know, when it's a cold day, people are really not likely to buy convertibles because they can't imagine the sunny day where they're actually going to enjoy it.

Speaker 2

Danny Kahneman, my mentor, coined the phrase the focusing illusion: nothing is as important as the thing you're thinking about right now.

Speaker 1

So this is a big problem, right because we need our predictions to make decisions about the kinds of things that we're going to engage with in the future that are going to make us happy, and it seems like this focusing illusion really messes us up.

Right, whatever our attention is focused on, we start thinking about that and we ignore all the other stuff.

So how can we do better? Alex, any advice for how we can kind of notice what we're going to like in the future a little bit better and maybe open up our narrow attention?

Speaker 3

Yeah, so I have some work on this myself.

I mean the kind of easy advice is just to think about it for longer.

People become a lot better calibrated just by being given what we call a waiting period.

And these waiting periods are actually not something we invented.

As you probably know, waiting periods are all over the place.

Many states have them as part of, kind of, gun laws.

The idea is that if I'm in a hot state, I can't imagine what it's going to feel like when I'm in a cold state.

If I'm in a hot state, I'm angry at somebody, I go out there and buy a gun and do something really stupid that I'm going to regret.

So states impose a waiting period to kind of simulate like, let me think about it for a little longer and then maybe I won't need it.

It's there for marriages, too, right.

Somebody meets somebody at a bar and is like, you're the love of my life.

Let's go, and it's like wait a second, think about it for a little bit.

And this same sort of very simple intervention, we found this in careful experimental tests, allows you to kind of simulate what you're going to feel like in the future a lot more accurately.

Speaker 1

And so it seems like there are two pieces of advice there when you're making a big life decision, right.

One is give your attention a moment to catch up with all the stuff it's not currently paying attention to.

And the other piece of advice is, just like when in doubt, give it a little time, because over time more stuff will reveal itself.

And whatever state you're in now, if that's hungry, sitting out in the sunshine thinking you want a convertible, you might revert to a different state, which might be more of a baseline for how you're going to feel in the future.

Speaker 2

No, and there's one other extension of that.

When I'm talking to students about career choices, the mistake I find people make all the time is they decide on a career based on what they like to study in school.

And I always say, no, think about what you can imagine doing every day for the rest of your life, and that may not be the same thing.

Go, you know, shadow somebody for two weeks.

See if that seems like that's exciting.

I think there are lots of careers that sound good until you think about doing it every day.

Speaker 1

Well, Richard, I'm so glad that what you decided to do every day for the rest of your life was to be a behavioral economist because it's been so fun to learn about these anomalies from you through the years.

It was fun to read this book back in the late nineties, and it was even more fun to read it anew in twenty twenty five, when we have all of Alex's new insights there.

So everyone should go out and check out The Winner's Curse.

There are so many cool anomalies that we didn't get a chance to go into. Richard and Alex, thanks so much for being on the show.

Speaker 2

Thanks Laurie, great to see you.

Speaker 1

So it seems humans are not Vulcans.

We don't always act rationally, but as these six anomalies show, understanding the irrational quirks of our species can help us live richer, happier lives in ways that I think pure logic may not have predicted.

The new and improved version of the Winner's Curse is out in October, and it's available for pre order now.

Next time on the Happiness Lab, we'll hear about another of my favorite books of twenty twenty five, one that's also from a scholar trained in economics and who, like Richard Thaler, also wants to break the rules.

But her rule breaking involves teaching us why we should all be working a bit less.

All that next time on the Happiness Lab with me, Doctor Laurie Santos.
