S6E20 - The Final Boss(es) (Effective Altruism & Longtermism)

Episode Transcript

Kayla

Kayla: When you google Sam Bankman Fried and effective altruism, you get headlines back like: how effective altruism let Sam Bankman Fried happen.

Effective altruism is as bankrupt as Sam Bankman Fried's FTX.

Effective altruist leaders were repeatedly warned about Sam Bankman Fried years before FTX collapsed. And: Sam Bankman Fried and the effective altruism delusion.

It sounds fairly damning.

And in the fallout after the FTX scandal and Bankman Fried's arrest, effective altruism itself seemed to be on trial along with him.

Now, when you google Sam Bankman Fried and polycule, you get back: partner swapping, pills and playing games inside Sam Bankman Fried's FTX party house. Polyamory, penthouses and plenty of loans: inside the crazy world of FTX.

Kayla: And FTX's crypto empire was reportedly run by a bunch of roommates in the Bahamas who dated each other, according to the news site that helped trigger the company's sudden collapse.

What the hell exactly is going on?

Chris

Chris: Kayla?

It was really hard for me to not reply to anything.

You're like, okay, I'm gonna do the intro snippet now.

I'm gonna do the cold open now.

And I was like, okay.

And then you started doing it and I was like, how am I supposed to not fucking respond to any of this?

Kayla

Kayla: Well, respond now.

Chris

Chris: I had to, like, point my mouth away from the microphone cause I started giggling when you brought up the polycule roommates.

Bahamas.

I mean, it sounds pretty fun.

Kayla

Kayla: It really does, honestly.

Chris

Chris: Where to begin?

Kayla

Kayla: You know what?

We're gonna get into it.

So you'll have plenty of time to respond.

Chris

Chris: You should know where to begin cause you did the research.

Kayla

Kayla: I did.

Welcome back to Cult or Just Weird.

I'm Kayla.

I'm a tv writer and a goddamn expert on effective altruism.

After all of these episodes.

Not really, but it sounds good to say.

Chris

Chris: I'm Chris.

I am a data scientist, game designer.

I have never been to the Bahamas with a polycule and caused a major financial collapse before, but I'm looking, if anybody else wants to join me.

Kayla

Kayla: I already said welcome back to Cult or Just Weird.

But I'm going to say it again.

Thank you for supporting the show by listening to it and coming back and hearing us Yap week after week.

If you'd like to support us further, you can go to patreon.com/cultorjustweird.

And if you want to yap with us about cults and weirds, you can find us on Discord, linked in our show notes. Last week on the show, we tackled effective altruism and longtermism, the last two letters of the TESCREAL bundle.

It was a bit of a primer, allowing us to dip our toes into what exactly these concepts are. In short, effective altruism, or EA, means using evidence and reason to figure out how to benefit others as much as possible.

Kayla: And taking action on that basis, usually with, like, charitable donations. And longtermism means extrapolating that idea out to include considerations of the far future and the humans who might live then.

Chris

Chris: And that's why you'd be in a polycule, so you can make as many babies as possible.

A lot of cross pollination going on.

Kayla

Kayla: That's not.

Not a thing.

I don't.

Seriously, I don't think this polycule was.

I don't think this polycule had anything to do with population stuff.

But, like, that's why Elon Musk has twelve kids, or however many.

Chris

Chris: Should we explain polycule?

Kayla

Kayla: Well, we'll get to it.

Don't worry.

Chris

Chris: Okay.

Okay.

Kayla

Kayla: We could explain it later.

I want to keep our listeners on their toes, on bated breath for a moment.

Chris

Chris: Don't forget, my mom still listens to this.

Kayla

Kayla: She'll enjoy this episode.

This week we're going to get into some of the criticisms of effective altruism and longtermism. Just some of the criticisms.

It's not really possible to be totally comprehensive on this because it's a big topic.

In short, we are all now here for the good stuff.

Chris

Chris: Yeah.

The juicy episode.

Kayla

Kayla: The juicy episode.

So what the hell happened with infamous effective altruist and alleged polycule enthusiast Sam Bankman Fried?

I would like to say upfront that we at Cult or Just Weird are pro-polyamory for consenting parties.

Chris

Chris: Hell yeah.

Kayla

Kayla: Absolutely.

Nothing wrong with it.

Probably a good thing that more people are learning about this relationship structure.

That said, we really want to talk about those polycule headlines because, boy, did they scandalize the news media for a while in 2022.

Chris

Chris: I feel like we run into this, like, frequently where it's like there's stuff that we genuinely are supporting and in favor of, but like, somehow it's still.

We still have this, like, vestige of, like, societal, whatever, shame or something that, like, makes it fun to, like, gossip.

Scandalize about.

Kayla

Kayla: Yeah.

Chris

Chris: You know.

Yeah, we're hypocrites.

We feel like we run into that more than once.

Kayla

Kayla: Yeah.

That's why we have a podcast.

Chris

Chris: That's right.

Kayla

Kayla: Okay, so Sam Bankman Fried, who I will refer to, as many others do, as SBF, founded a cryptocurrency exchange called FTX.

Cause we're gonna get a lot of.

Chris

Chris: Acronyms in TESCREAL.

Yep.

Kayla

Kayla: Yeah.

He courted a ton of investors, made a boatload of money for himself.

Like, I think he ranked 41st in Forbes list of richest Americans at one point.

Holy shit.

Chris

Chris: He got that high?

Kayla

Kayla: Oh, yes, he.

Yes.

Yes.

Chris

Chris: That dude.

Kayla

Kayla: That dude.

Chris

Chris: Like, what am I doing with my life?

Kayla

Kayla: This is like, a Mark Zuckerberg 2.0.

He very much leaned into the aesthetic of, like, my hair is unkempt, and I wear t shirts and flip flops.

That guy.

Chris

Chris: Oh, my God.

Kayla

Kayla: Okay, nothing wrong with it.

I mean, there's a lot wrong with it.

But in November 2022, FTX went bankrupt really badly, and the following month, SBF was arrested and indicted on charges of wire fraud, commodities fraud, securities fraud, money laundering, and campaign finance law violations.

Chris

Chris: That's a lot of fraud.

Kayla

Kayla: Oh, yeah.

He was convicted on seven counts, sentenced to 25 years in prison.

And everyone invested in FTX lost their money.

Before shit hit the fan, he helped popularize the concept of effective altruism in the mainstream, as he was a big EA guy.

Great, good.

We are all caught up.

But what about the polycule?

Chris

Chris: Okay, get to that.

Kayla

Kayla: So the story goes that in 2021, SBF and the whole FTX crew moved the operation from Hong Kong to the Bahamas, allegedly because there were fewer regulations there for financial stuff.

Chris

Chris: Yeah, I think I.

That tracks with what I know about the Caribbean.

Kayla

Kayla: Ten members of an inner circle, essentially, like, the execs of FTX, lived together in, like, a palatial mansion.

Like, just the most Succession-y, most, like, Silicon Valley-y Bahamas mansion you can imagine.

Chris

Chris: Sorry.

Why isn't there a tv show yet, though?

Kayla

Kayla: There will be.

You better believe there will be.

Chris

Chris: Okay.

Kayla

Kayla: While running, quote, unquote, the cryptocurrency exchange, they also partied hard, obviously.

Chris

Chris: Yeah, yeah.

Kayla

Kayla: The stories go that SBF and his roommates would, like, stay up all night and take amphetamines, like speed.

He himself once tweeted, quote, stimulants when you wake up, sleeping pills, if you need them, when you sleep.

Like, that was the tweet.

Chris

Chris: I mean, you know, I drink coffee in the morning, and sometimes I smoke weed at night, so I can't really make fun of him too much.

Kayla

Kayla: A lot of time was also spent watching SBF play video games.

Like, it's confirmed that he was literally playing League of Legends during at least one very important fundraising call with Sequoia Capital.

Chris

Chris: All right, so here's where.

Kayla

Kayla: Also, he was bad.

Chris

Chris: Oh, he was like a baddie.

Kayla

Kayla: They found his ranking and he was not good at it.

Chris

Chris: Wood league.

I would love to make fun of him for playing LoL.

And I would love to equate the toxic nature of these types of games with him being a douchebag.

But also, I have a bunch of friends that work at Riot, so I feel like I shouldn't do that.

Kayla

Kayla: But also, anyone who plays video games is bad and wrong.

Chris

Chris: That's right.

Kayla

Kayla: That's the position of the show.

Yes, of course, the rumors go that all ten of the co-ed inner circle group were all dating each other, either pairing up and then mixing and matching in either an on-again, off-again situationship or something akin to swinging, or they were doing straight-up polyamory.

Chris

Chris: Okay, okay.

So it's unclear whether they were all cuddle puddling in every single night, all ten of them.

Or whether it was some amorphous, amoebic sort of.

Something would break off and then they would come back in.

Kayla

Kayla: It was like an incestuous pile.

Chris

Chris: Awesome.

Kayla

Kayla: And to be clear, polyamory is a relationship structure in which multiple people, rather than just a couple, form a romantic unit.

So, like, multiple people all dating each other in a single relationship.

Chris

Chris: Do we know the ratio of these ten people?

Kayla

Kayla: You know, that's a good question.

I don't.

But we do know that Caroline Ellison, CEO.

Not related to Larry Ellison, as far as I know.

Chris

Chris: Oh, okay.

Yeah, you're saying tech people and then Ellison.

I'm just assuming it's Larry Ellison.

Kayla

Kayla: Not related to Larry Ellison.

Different Ellison.

She was the CEO of Alameda Research, a trading firm tied to FTX, and she was also SBF's.

There's too many acronyms.

She was also Bankman Fried's sometimes girlfriend.

Chris

Chris: SBF's.

Gf.

Kayla

Kayla: Correct.

She also blogged about polyamory on Tumblr and her journey into it.

Chris

Chris: She did.

You know, I remember her in the sort of, like, fallout news of FTX.

Like, she also got quite a bit of heat.

Kayla

Kayla: Part of the maelstrom.

My favorite quote from her tumblr that I found was this.

There's problems here.

And if you are a polyamorous person, you will be able to identify them immediately.

And if you are not a polyamorous person, I think they'll still, like, scream right in your ear.

When I first started my foray into poly, I thought of it as a radical break from my trad past.

But TbH, acronyms.

I've come to decide the only acceptable style of poly is best characterized as something like imperial Chinese harem.

None of this non-hierarchical bullshit.

Everyone should have a ranking of their partners.

People should know where they fall on that ranking, and there should be vicious power struggles for the higher ranks.

Chris

Chris: That sounds awesome.

Kayla

Kayla: I cannot confirm or deny whether this is, like, a joke or reality.

It definitely falls in that, like, troll.

Not a troll realm, but, like, yeah, it does.

I have some thoughts.

Chris

Chris: Like, if you treat it like a sport, that sounds awesome.

You know, if it's, like, not, like, a serious, like, you know, if you're like, well, I'm just gonna play the sport of polyamory and not take it seriously.

That sounds like that could be fun.

Kayla

Kayla: But, yeah, I mean, from my limited understanding of, like, ethical polyamory practices, this is perhaps not the most sound way to pursue a poly relationship.

I don't think that relationships should be about vicious power struggles and rankings.

That's my personal.

Chris

Chris: Then how do you get a Game of Thrones, Kayla?

Kayla

Kayla: Well, I mean, they didn't even have time to get their Game of Thrones on.

Cause they all got arrested or whatever.

Chris

Chris: That's true.

Kayla

Kayla: But why bring any of this up?

What does Bankman Fried's alleged polycule have to do with EA, effective altruism?

Doesn't this kind of just seem like we're talking about scandalized, sensationalized reporting to make these tech bros look like weirdos to pearl-clutching Guardian readers?

Chris

Chris: Yeah.

Isn't that what we're doing?

Kayla

Kayla: Yeah, I mean, I think Scott Alexander points this out in his essay In Continued Defense of Effective Altruism, which we talked about last week.

Chris

Chris: Right, the rationalist guy.

Kayla

Kayla: Correct.

He says, quote, nobody cares about preventing pandemics.

Everyone cares about whether SBF was in a polycule or not.

Effective altruists will only intersect with the parts of the world that other people care about when we screw up, therefore, everyone will think of us as those guys who are constantly screwing up and maybe doing other things.

I'm forgetting right now.

In short, he's saying that for every article about SBF's polycule, there are a dozen articles that should have been written about the 200,000 lives Alexander estimates effective altruism has saved.

Chris

Chris: I guess he has a point about sensationalism in media and clicking.

Kayla

Kayla: That's why I brought this up.

That's why I brought this up in this episode and not in the previous episode, because I read that quote, and I was like, oh, yeah, let's talk about the.

Maybe the good stuff first.

So I'm not just, like, going, ooh.

Yeah, but I still want to do the ooh a little bit.

Chris

Chris: Mm mm.

Dammit, Kayla.

Kayla

Kayla: Frankly, however, to go back on myself.

Chris

Chris: Yeah.

Oh, okay.

Kayla

Kayla: I bring it up because it is relevant.

Like, especially when we're getting into the criticism section of effective altruism.

Like, if a movement is going to support and even encourage tech billionaires to acquire as much wealth and power as they can so they can donate it to causes of their choice, we also have to look at the whole package of the person we're allowing to acquire said wealth and power.

Chris

Chris: I think that's actually a really good point.

No, you're bringing me back around now because you're right.

Kayla

Kayla: A big part of. Not to say that polyamory is unethical.

I'm just saying that upfront.

Sorry, continue.

Chris

Chris: No, no, the point is not that polyamory is unethical.

It's perfectly ethical if it's consensual and whatever, like anything else, it's more that, like, yeah, maybe there is a reason to interrogate deeply into the lives of people who we have allowed to accumulate such wealth.

Because effectively accumulating that wealth is equivalent to saying, we are giving this person a lot of decision making power over how resources are spent and what things are built.

Kayla

Kayla: Right.

Chris

Chris: And if these guys are using those resources to build yachts instead of building bridges or shelters for homeless people, I think that we need to be saying, like, okay, well, what are they doing?

Like, what would you say you do here?

Kayla

Kayla: What would you say you do here?

Yeah, I don't know if it's really possible to disentangle one's lived ethics from one's charitable ethics, especially when we're talking about, like you said, people who are hoarding billions of dollars.

Chris

Chris: Yeah.

Not at that level of wealth.

Right, right.

Kayla

Kayla: But again, there's nothing wrong with polyamory, and there's nothing even wrong with, like, taking drugs or playing video games or like, fucking off to the Bahamas or whatever.

These aren't the problem.

Chris

Chris: That's lucky.

Kayla

Kayla: But when Caroline Ellison is helping make her boyfriend violently wealthy, and then blogging, this is another quote, blogging that her ideal man can, quote, control most major world governments and has sufficient strength to physically overpower you.

Chris

Chris: Okay, hold on.

Kayla

Kayla: I'm gonna look at that twice.

Chris

Chris: Okay, so she's bragging about how her boyfriend can.

Kayla

Kayla: No, she's saying, this is her ideal man.

She's not saying, oh, her ideal man.

This is my ideal man.

Which, I mean, she did, and she continued to date SBF during this period.

Chris

Chris: Well, he became not that guy for sure.

I mean, it sounds like he never was that guy, but after FTX collapse.

Kayla

Kayla: I don't think he's the ideal anymore.

But I think that he was certainly on his way at one time, right? When you're ranked the 41st richest American.

Chris

Chris: He didn't look like he could physically overpower anyone.

Kayla

Kayla: Well, I don't know.

Maybe she was like four foot eleven, I don't know.

Chris

Chris: Oh, physically overpower her.

Kayla

Kayla: She's talking about the ideal man is for her, is somebody who can physically overpower herself.

Chris

Chris: Oh, I thought she was doing like, a my boyfriend can beat your boyfriend up thing.

Kayla

Kayla: You know what?

That could be an interpretation.

But my interpretation, she's saying that, like, I am.

I am deeply aroused by a man who has extreme global power and can also physically overpower me.

Chris

Chris: Okay.

Kayla

Kayla: I find that erotic.

Chris

Chris: Got it.

Well, you know, like, to each their own.

Kayla

Kayla: To each their own.

But I don't think that we should be giving a lot of money to people who are eroticizing and idealizing, in reality, individual Silicon Valley tech bros to be able to control major world governments.

I know that this is probably some Tumblr roleplay fanfic bullshit, and also, it's kind of what was actually happening, right?

Chris

Chris: Troll.

Not a troll.

Kayla

Kayla: And another way I'm going to read the polyamory situation is that sometimes shitty people use polyamory as a cover for shitty, abusive behavior in relationships.

And the EA community has been accused of exactly that.

Even outside of the Sam Bankman Fried stuff, both Bloomberg and Time reported on women accusing the EA community, particularly the Silicon Valley EA community, of cultivating a culture in which predatory sexual behavior thrived.

Chris

Chris: Because, of course, yeah, this is.

Now we're talking about classic cult stuff here, right?

This is like source family or NXIVM or any number of others.

Kayla

Kayla: Men within the movement were accused of using their power to groom younger women and utilized the guise of polyamory to do so.

The accusers also stated that EA was largely male dominated in the community and sexual misconduct was either excused or ignored.

The Center for Effective Altruism, which is William MacAskill's organization, argued to Time that they had banned accused perpetrators from the organization and offered to investigate new claims.

They also said it's hard to figure out whether the sexual misconduct that went on was unique to the EA community or whether it was just good old fashioned societal misogyny.

Chris

Chris: I don't disagree with that.

I mean, you see misogyny get its own flavor no matter where it is.

In fact, I was even gonna say, yeah, polyamory can be used as a cover for certain types of abuse.

So can just regular old monogamy.

Kayla

Kayla: Sure.

Chris

Chris: The individual family unit is used as a bludgeon by a lot of people to advance political agendas.

Kayla

Kayla: I hope that they're then donating some money to the structural issues behind societal misogyny that might be taking down their very organization, but I don't think that they are.

Oh, we'll get to that.

I don't know.

I agree with you.

And also, that response rubbed me the wrong way a little bit.

I don't think it's wrong to acknowledge that these problems grow out of systems, not simply some effect of altruism itself.

And also, it feels a little bit like a cop out and a little.

Chris

Chris: Bit like washing your hands.

Kayla

Kayla: A lack of understanding of how, like, you are currently shaping culture and you're continuing to shape culture in the image of, like, the shitty stuff.

Chris

Chris: Yeah, it's hard to tell, especially with these billionaire guys, because so many of them seem like we can't do anything, pass it on, but, like, they're also creating culture.

So I.

Yeah, I don't know.

Kayla

Kayla: It's tough.

Regarding SBF specifically, there is some question about whether he was, quote, unquote, really an effective altruist.

And I think those questions kind of expose a deep criticism of EA.

It is extremely easy to bastardize the original concept and use it for personal gain.

That ends up hurting a lot of people.

SBF publicly declared himself an eaer, stated that he was, quote, earning to give, and made donations, quote, not based on personal interests, but on the projects that are proven by data to be the most effective at helping people.

He was a member of giving what we can.

The organization we talked about last week, where members pledged to donate at least 10% of their incomes to EA causes.

He founded something called Future Fund, which was supposed to donate money to nonprofits.

And guess who was on the team?

Chris

Chris: Eliezer Yudkowsky.

Kayla

Kayla: William McCaskill.

Chris

Chris: Oh, okay.

Kayla

Kayla: One of the founders of the EA movement.

And it wasn't the only way MacAskill was connected to SBF.

Like, I read somewhere that at one point, Sam Bankman Fried had, like, worked at the Center for Effective Altruism.

I'm not sure if that's true, but in 2022, when Elon Musk was looking to fund his Twitter purchase, William MacAskill acted as a liaison between Musk and Sam Bankman Fried.

MacAskill reached out to Musk, who had once described MacAskill's book, What We Owe the Future, as a, quote, close match for my philosophy.

Chris

Chris: Right.

That quote comes up, like, everywhere.

That quote has been plastered across the Internet by now.

Yeah.

Kayla

Kayla: And MacAskill said, hey, my collaborator.

And, yes, he used the phrase, my collaborator, can maybe help secure this funding.

And then, you know, ultimately, of course, that did not go through because Sam Bankman Fried got arrested and went to jail.

Chris

Chris: Yeah.

Getting arrested would make.

That would put a damper on that.

You know, I.

I'm picturing now, because you were saying, like, oh, there's other ways that they were tied together.

And now I'm picturing there's, like, a fives League of Legends team, and it's like Sam Bankman Fried, MacAskill, Musk, and, I don't know, pick another fifth, Bostrom or something.

And they're like.

They're all playing league of legends, and I'm trying to figure out who would go where.

Cause there's very specific roles.

Kayla

Kayla: But they're also all enemies.

Chris

Chris: Yeah, of course.

And they're yelling at each other like, dude, you should have been there for the gank, man.

Kayla

Kayla: Kill me.

Leading up to his arrest, Bankman Fried did an interview with Vox reporter Kelsey Piper via Twitter DM, which I can't tell if that's really smart or really dumb.

He stated that his, quote, ethics stuff was, quote, mostly a front, and that ethics is a, quote, dumb game we woke Westerners play, where we say all the right shibboleths, and so everyone likes us.

Chris

Chris: Ooh, dropping the shibboleth word.

Kayla

Kayla: Many.

Chris

Chris: Should we define that?

Kayla

Kayla: Can you.

It's like a word.

No, I can't.

It's a word that is used to define in-groups and out-groups.

Chris

Chris: Yeah.

Yeah.

Kayla

Kayla: So, like, if you know the word, then you're on the in, and if you don't know the word, then you're identified as an outsider.

Chris

Chris: It's like how we talk about jargon being part of the ritual criteria.

Kayla

Kayla: Yeah, he could have.

Yeah.

Many, of course, took this statement to mean that he was using EA as a cover for his shady dealings and his accrual of wealth.

He later claimed he was referring to things like greenwashing and not EA efforts, but, like, damage kind of done.

Chris

Chris: Right.

Kayla

Kayla: MacAskill has since expressed deep regret at being duped by SBF, for what it's worth.

Chris

Chris: So did they break up their League of Legends team?

Kayla

Kayla: I think they did.

Chris

Chris: Oh, no.

Kayla

Kayla: Well, I don't think you can play lol in jail.

Chris

Chris: Shit.

SBF needs another support.

Oh, yeah.

Now he's just gonna get, like, you know, like, neo-Nazi Steve, his cellmate, is gonna have to be his playing partner.

Kayla

Kayla: Unfortunately, neo-Nazi Steve is probably not that far from a regular LoL player.

Chris

Chris: Oh, zing.

Sorry, Riot friends.

Kayla

Kayla: I don't know anything about LoL.

I just wanted to make a burn.

For what it's worth, as I mentioned before the intro music, Time reports that MacAskill and other EA leaders were actively warned about SBF being a fraud, being a deceiver, very early on, like 2018, and those warnings were essentially ignored.

Like, MacAskill was literally with this guy till the end.

Chris

Chris: When he.

When the whole FTX thing went down, did MacAskill play it?

Like, I had no idea? Or was he more like, well, I was warned?

Kayla

Kayla: But, you know, I think he played it more as, like, I'm outraged at how duped I was.

I'm outraged at this harm that this guy has caused.

I don't think he said, like, I should have known better.

I could be wrong.

He definitely tweeted about this, so, like, it's free to go and, like, look at and kind of see how you feel about it.

But there was a lot of expression of, like, I'm outraged that this guy did this.

Chris

Chris: Yeah, I'll give him a couple empathy points here, because, like, I do understand that, like, when you have a cause that you think is really important and you have a hose of money feeding that.

Cause, right.

There's gonna be a lot of sunk cost fallacy of, like, no, no.

This guy has to be fine.

Cause if he's not fine, then I'm fucked.

Kayla

Kayla: Yeah, and that's a really good point.

Like, MacAskill has all these organizations, but, like, he himself is not an FTX tech bro.

He himself is not Elon Musk.

He himself is not generating billions and billions of dollars of wealth.

Chris

Chris: Yeah.

So there's a lot of motivation for him to dismiss warnings.

Kayla

Kayla: Yeah.

And, like, none of us are perfect, but I think you gotta be more perfect when you're doing stuff like this.

Chris

Chris: Yeah, absolutely.

There's a higher standard when you're in command of that many resources.

Kayla

Kayla: Let's actually continue this conversation about MacAskill.

Obviously, we talked about him in last week's episode, but I kind of held off on saying how I felt about him.

And part of that was because I wanted people to come and listen to this episode.

But another part of it is that I feel really complicated about it.

I don't think he's a villain.

I do think he's demonstrably naive or has been demonstrably naive, and I think he's in a really unfortunate position right now.

Chris

Chris: Yeah.

Kayla

Kayla: Academic and researcher Gwilym David Blunt, whose area of focus is, among other things, the ethics of philanthropy, wrote an article for The Philosopher titled Effective Altruism, Longtermism, and the Problem of Arbitrary Power, which you sent to me.

So thank you for finding that.

Chris

Chris: Wait, his last name was Blunt?

Kayla

Kayla: Yeah.

Chris

Chris: Sweet.

Kayla

Kayla: In this essay, he explains.

Ha ha.

That was me laughing at you.

Chris

Chris: Yeah.

Thanks for the support.

Kayla

Kayla: In the essay, he explains that there's an atmosphere of schadenfreude surrounding MacAskill now, particularly in the wake of FTX's spectacular fall, largely coming from other philosophers and academics.

And, I think I would also argue, the media.

Blunt explains that part of this might be related to MacAskill's success in doing one of the more difficult things in academia, breaking out of it, and having a direct and recognized impact on the wider world.

Blunt rightfully credits MacAskill with creating both the effective altruist and longtermist movements, and states that his Center for Effective Altruism has, quote, annual expenditures approaching $400 million, with about $46 billion more in funding commitments.

Chris

Chris: That's a lot of money.

Kayla

Kayla: That's a lot of impact, baby.

Chris

Chris: That's like a Scrooge McDuck swimming your little gold coins amount of money.

Kayla

Kayla: Blunt goes on to describe how, in 2022, MacAskill expressed concern about a deep change in the overall vibe of effective altruism.

What he originally imagined and embodied as a practice of ascetic frugality had now become a way for the very wealthy to wield more wealth.

In short, his own philosophy in breaking through to wider culture had also gotten away from him and its original roots.

Chris

Chris: It's interesting that he felt that switch because I didn't feel it in time, but I definitely feel it in space with this, where I feel like there's kind of.

I don't know, there's two different kinds of effective altruists, right?

There's, like, the people that like to do some math and figure out where to donate their $10,000 or $5,000, and then there's, like, this Sam Bankman Fried set of, like, crazy wealthy billionaires that are, like, you know, using it again as, like, a club.

Kayla

Kayla: I think that it's probably.

They were able to tap into a.

Ooh, if I tell people that if they invest in me, they're not just investing in me, they're investing in the future, and they're investing in these good causes.

I get more money.

Chris

Chris: Right.

And especially people are going to take advantage of that.

People, you know, compared to 1020, 30 years ago, people are much more interested investing in things that are more activist, investing things that are more.

Kayla

Kayla: I hate that phrase.

Chris

Chris: I know.

I hate that phrase, too.

But people are more likely today to invest in something that feels like it's doing something for the capital-G Good.

Kayla

Kayla: Right.

So in this way, I feel for William MacAskill, because that's tough.

If you come up with this idea and you have this.

Monkish is not the greatest word, but it's supposed to be.

It was originally supposed to be more frugal, more ascetic is the word that is used.

More austere, versus big and ostentatious and billions of dollars.

This article kind of softened my heart towards him a little bit, which is good.

And I think MacAskill was 24 years old when he developed the idea of effective altruism.

He's a little baby boy, and 24 year olds are, of course, well into adulthood.

And MacAskill was deeply educated in philosophy, among other things.

And still, 24 year olds, while they have many gifts of youth that we lose in older age, 24 year olds also often lack wisdom that does come with age.

Kayla: And I think that there is some wisdom lacking around his approach to his own philosophy.

It's worth talking about how he was unable to see some of this coming, that he couldn't look at history and recent events and think, wealthy people usually act in their own self interest and often resort to unethical means to accrue wealth.

It's worth talking about how, despite being warned about SBF and his shady practices, MacAskill was still duped, along with the rest of the FTX investors, and had to take to Twitter to express his rage and embarrassment over the whole thing.

Chris

Chris: So were the warnings, like, I mean, if somebody had said, hey, this dude's sus.

No cap, skibidi, do you think that would have gotten through to him?

Kayla

Kayla: I think he would have been like, what the hell are you talking about?

Chris

Chris: Oh, he was 24 in 2018.

Okay, so he's a millennial.

Kayla

Kayla: I think he's 37 now.

No, he wasn't 24 in 2018.

He was 24 when he came up with these ideas.

Chris

Chris: Okay.

Kayla

Kayla: When he founded, like, giving what we can in those things.

Chris

Chris: But he's more of a millennial.

Kayla

Kayla: Yeah, I think he's my age.

Chris

Chris: Okay, so, okay, so he's, like, right smack in the middle of millennial.

Okay.

So you'd have to be, like, hey, man, this SBF guy is cheugy.

Kayla

Kayla: Cheugy was a Gen Z term, baby.

Chris

Chris: Oh, that was a Gen Z term against millennials, right?

I don't know what we're talking about.

Kayla

Kayla: You don't even remember millennial jargon from 15 years ago or whatever.

Harry Potter as effective altruism.

Chris

Chris: SBF is like Draco Malfoy.

Kayla

Kayla: Yeah, that would have gotten through.

Chris

Chris: Okay.

Okay.

Kayla

Kayla: It's also worth talking about how the Effective Ventures Foundation, a coalition of EA organizations, including the Center for Effective Altruism, Giving What We Can, and 80,000 Hours, all of which MacAskill is involved in, bought Wytham Abbey, a literal manor house on 25 acres of land, for 17 million pounds to use as their new headquarters.

And MacAskill does not really seem to see the problem there.

Chris

Chris: Yeah.

I mean, the optics there aren't great.

He does say that.

Well, he used the word ascetic.

You used the word monk.

But if you're gonna get a.

You know, if you're gonna be monkish, get an abbey.

Kayla

Kayla: I guess that's true.

Yeah.

Like, you should go look up pictures of it.

Chris

Chris: It's like a palatial abbey.

Kayla

Kayla: It's not versailles, but it's a fucking abbey, man.

Chris

Chris: Yeah.

Kayla

Kayla: And it does just bring up the question of why is an effective altruist group that is like, put your money where your mouth is.

Why are they spending 17 million pounds on a mansion when the whole mission of the movement is to spend the most money where it can do the most effective material good?

Chris

Chris: Yeah.

And you know what?

I've heard the argument.

I don't know if you're going to bring this up, but I've heard the argument about, like, well, it's, you know, the.

Kayla

Kayla: We can do the best work always for show.

Chris

Chris: Well, it's for show and for influence.

So, like, if.

Yeah, and we can do the best work, right?

Like, we can work better here.

People will take us more seriously, blah, blah.

All the, you know, the sort of, like, aesthetic things that maybe it brings to the table, and then that has a positive ROI.

So it's actually good.

Don't really buy it.

Kayla

Kayla: I just don't buy it anymore, because.

Chris

Chris: I feel like if you.

If your thing is effective altruism, if your thing is, like, ascetic, you know, high-ROI giving, then wouldn't you be better off advertising yourself as being like, yeah, man, we work out of a warehouse?

Like, that, to me, is much more effective.

Like, I.

There was.

I forget who the.

The knowledge of this is long lost to me.

But in business school, I remember hearing a story about, like, some CEO that was CEO of this company that was.

And they were, whatever.

It was, like they were, like, very conscious about costs and they were like, hey, we need to, you know, do things cheaply so we can give it, you know, pass on the savings to the customer, whatever it was.

They wanted to be really cognizant about costs.

Chris: And so this guy, like, sat on some, like, his desk was like some, like, busted ass table in like an.

It wasn't a corner office.

And it was like he was sitting on, like, milk crates or something like insane like that.

And.

But, and it was like he, you know, he could have spent $10 to get a decent chair, but, like, it was for show.

It was for, like, hey, I'm sitting on milk crates.

So, like, that's the attitude I want you guys to take.

And I feel like that also could apply here.

Kayla

Kayla: If SBF figured out he could get more money by not getting haircuts and wearing flip flops, then, like, I feel like that could maybe translate to, I don't know, business.

But also, I still just, like, don't buy an abbey.

There's other options between milk crates and an abbey.

Chris

Chris: Right, right.

But, like, it's just, it's not like it's fine and all, I guess if you're, again, going for the show of it, but don't you want to show your DNA and not, like, what you're not about?

Kayla

Kayla: That's what I think.

Blunt goes on to explain that MacAskill and others have failed to include a working theory of power, which results in major blind spots and loopholes in the EA framework.

Chris

Chris: I think that sentence there is why I was like, oh, you got to read this, because I think that's sort of the key insight.

Kayla

Kayla: There seems to be little or no understanding of the fact that the harm caused by billionaires leveraging a system that allows them to become billionaires enough to embrace EA ideals cannot then be outweighed by the good they might do in their EA endeavors.

Chris

Chris: Okay, maybe it's that sentence.

Kayla

Kayla: Actually, that was my sentence.

Chris

Chris: Oh, that was your sentence summarizing what they're talking about.

Yeah, I think that's just another thing that's so.

And that particular insight, I think even goes beyond just EA, but to altruism in general, to charity in general, because that's a lot of these, starting with the Carnegies and the Rockefellers, that's what they like to do.

But why are they in a position to be doing that in the first place?

Do they want to interrogate that?

Not really.

Kayla

Kayla: I still don't understand why Bill Gates keeps going.

I'm just donating all my money to charity, and then he gets richer and richer.

Yeah, I don't understand.

Chris

Chris: But God forbid we get taxed.

Kayla

Kayla: EA founders like MacAskill and Peter Singer seem hellbent on viewing someone like SBF as a bad apple, an outlier, an unfortunate one-off.

He's not the result of a flaw in the philosophy, even though the philosophy facilitated his rise to money and power, which facilitated his harmful behavior.

Without a working theory of power, without grappling with structural power and what it means, EA and long termism helps put power in the hands of the ultra wealthy and then keep it there, which is.

Do I need to say why that's a problem?

Chris

Chris: No, I think we've said it before, like three or four times in this episode.

They get to make all the resource decisions.

If that happens, that's not great.

Kayla

Kayla: EA and long termism look around and say, like, hey, let's put our money over here where the people need help, but they do not look around and say, like, hey, what are the structures in place that cause those people to need help in the first place?

And how do we change that?

Because those causes are so often necessary in generating the kind of wealth disparity that we're talking about.

If you buy into the idea, which I do, that billionaires only exist when massive amounts of people live in poverty, it's easy for those billionaires to go, hey, I'll donate some of my wealth, rather than aid in the dismantling of the structures that allowed them to become so rich and powerful.

Chris

Chris: Right?

Kayla

Kayla: It's an indictment on the whole system.

Chris

Chris: I feel indicted.

Kayla

Kayla: EA seems unable to grapple with the fact that there are structural issues within their own philosophical framework and movement.

And part of that is because the philosophy seems to avoid grappling with the very idea of structural issues.

And like you said, this is a problem that comes up in the ethics of philanthropy and charity time and time again.

Like, this is not reinventing the wheel.

This problem is time immemorial.

Chris

Chris: Right.

Kayla

Kayla: And they're not fixing it.

Chris

Chris: No, because these people want to have their names on their college buildings.

Kayla

Kayla: It's very important.

I will also note again that it's funny to me that the center for effective altruism was like, maybe the issues with sexual misconduct in our ranks was actually the result of systemic misogyny, when they don't really seem equipped to talk about or engage in systemic issues elsewhere.

Chris

Chris: Just, yeah, that's a little like, have your cake and eat it, too.

Kayla

Kayla: MacAskill and SBF.

And these guys aren't the only place to look for these criticisms of EA and long termism's effect on empowering the mega wealthy with both, like, a lot of money and a lot of material power.

Professor Gary Marcus, who is a researcher focused on the intersection of neuroscience, cognitive psychology, and AI, which is very cool, recently wrote an article titled OpenAI's Sam Altman is becoming one of the most powerful people on earth.

We should be very afraid.

Chris

Chris: Great.

And I am done.

Yeah.

Kayla

Kayla: So Sam Altman, another Sam. It's just a Sam problem all the way down.

Sam Altman is the CEO of OpenAI, the company behind the ubiquitous ChatGPT, and he's kind of recently become the poster child for the Silicon Valley AI movement.

And he definitely has that Mark Zuckerberg.

I'm just a regular guy, and I'm just doing this because I really like it.

Chris

Chris: Just a regular guy having people take pictures of me.

Wakeboarding on the Fourth of July.

Kayla

Kayla: Gary Marcus goes on to describe how Sam Altman uses deceit to build an image.

He lies about owning no stock in OpenAI when he owns stock in companies that own stock in OpenAI. He just, like, puts layers in between.

Chris

Chris: Wait, so he says, I don't own stock in OpenAI, but then he has an ownership stake in Y.

Kayla

Kayla: In Y Combinator, and Y Combinator owns stock in OpenAI.

Chris

Chris: Okay, so he's just straight up lying.

Okay.

Kayla

Kayla: And it's not like he doesn't know, because he was like, I don't know if he currently is, but he was at one time the president of Y Combinator, so he knows.

Chris

Chris: That's like me saying, like, well, I don't own XYZ stock, even though, like, it's part of a mutual fund that I own.

Yes, I do.

Kayla

Kayla: Yes, you do.

He lies about being pro AI regulation while actively working against it.

He lies about whether or not a certain famous actress voiced his ChatGPT program, even when that certain actress said, don't fucking use my voice.

And then he tweeted, quote, her, when he launched the ChatGPT voice. Her being the name of a movie that Scarlett Johansson was in. And it sounded exactly like Scarlett Johansson's voice, even though she said, don't use my voice.

Chris

Chris: Yeah.

And then she sued.

Kayla

Kayla: Yeah.

Which she should have.

Chris

Chris: Of course, at no point in that process did she say maybe?

Kayla

Kayla: No.

Chris

Chris: Yeah.

Kayla

Kayla: No, he was moving fast and breaking things, and you shouldn't do that.

Ex-employees of OpenAI have been forced to agree to not talk badly about the company.

I forget exactly what it was, but they were coerced into signing contracts that said they would lose all their stock options or something if they talked badly about the company, which is illegal.

I think OpenAI had to be like, oh, sorry, I guess we won't do that.

Chris

Chris: Okay.

I guess that's good at least.

Kayla

Kayla: Sam Altman's been recruited by Homeland Security to join their AI safety and security board.

I do not know whether he actually has, but I know that, like, they've tried to recruit him, okay.

While actively working to dismantle AI safety in his own company.

And he's made a shit ton of money doing all this, even though he's, like, one of those guys.

Like, I don't take salary.

I don't have stock.

Yes, you do.

Chris

Chris: I thought OpenAI was just, like, losing money, like, burning piles of cash hand over fist.

Kayla

Kayla: I don't know how anything works, because it seems like that's how every company these days in Silicon Valley is.

Chris

Chris: No, you're right.

Kayla

Kayla: They're losing all this money while the CEO and the execs become fabulously wealthy somehow.

Chris

Chris: Yeah.

Yeah.

I also don't really know how.

Kayla

Kayla: How is Elon Musk so rich when, like, half of his companies are constantly losing money?

Chris

Chris: I feel like this is an important question to answer, and I don't quite have the answer to it. But it was like, when we were watching Succession, it didn't faze me at all that there was, like, this whole plotline of, like, oh, my God, Waystar Royco, like, owes bajillions, and, like, we're way in the red.

Way in the red.

And yet all the Roys were just, like, on yachts and private jets.

I was like, yeah, that makes sense.

I don't really understand how this person is less solvent than I am, but they're the one on the yacht.

I don't really get it, but that does track.

Kayla

Kayla: Yeah, it does.

There's a lot more to the Sam Altman story.

We'll link the article in the show notes so you can read further, because it is worth reading, and it's not an anti-AI article.

This Gary Marcus fellow is a pro AI guy.

Read the article, see what you think.

But just know that Sam Altman is another self-proclaimed effective altruism guy.

Chris

Chris: Oh, of course he is.

Kayla

Kayla: And there's no safeguards in place keeping this guy from fucking everything up with his quest to move fast and break things, deregulate and de-safetify AI.

So he can either make his company rich or himself rich, or at least become very powerful, even if he doesn't have any money.

This is a powerful man.

He's being recruited by homeland security.

Chris

Chris: Right.

Right.

Kayla

Kayla: There's two more things I want to talk about before we wrap up.

First.

Last week I said I'd explain the difference between EA and utilitarianism.

It didn't really fit anywhere, so I'll just say it here.

Chris

Chris: Oh, yeah.

Cause we said it was kind of like.

It kind of feels like a modern iteration of utilitarianism.

Kayla

Kayla: Luckily, Wikipedia has a section on exactly this.

Chris

Chris: Oh, perfect.

Kayla

Kayla: It states that EA does not claim that the good is the sum total of well-being, and that, quote, EA does not claim that people should always maximize the good regardless of the means. It goes on to explain that Toby Ord, one of the original philosophers behind EA, described utilitarians as number crunching, compared with effective altruists, who are guided by conventional wisdom, tempered by an eye on the numbers.

So they're the same but different.

They're different flavors.

Chris

Chris: Okay.

Okay.

I think I understand.

I'm gonna have to give that more thought.

But thank you for the disambiguation.

Kayla

Kayla: I think the effective altruists try to paint the utilitarians as, like, they just look at the numbers and nothing else matters, which, like, maybe some don't.

And effective altruists look at the numbers, but they also consider other things.

And I kind of think that.

Chris

Chris: Okay, so what you're telling me, I.

Kayla

Kayla: Think there's a bit of an overlap.

Chris

Chris: Is that utilitarians are pro mote or no pro torture, and the eas are pro mote, probably.

Kayla

Kayla: Okay, you gotta ask them.

Lastly, I wanted to talk about community, and sorry if I'm getting on a soapbox here, because it's something that's been.

Chris

Chris: Like, Kayla, this is a podcast.

Kayla

Kayla: I know this whole episode has been a little soapboxy.

I was like, here's the criticism, and they're all mine.

This is something that's been rolling around in my head ever since we started talking about these episodes and this topic.

I think one of my personal criticisms of EA and longtermism is that it seems to remove a sense of community building and mutual aid from the idea of philanthropy or helping.

And I don't think.

Again, I don't think it's wrong for MacAskill to argue that it's better, if you have $100, to use that money to help 100 people living in an impoverished area across the world from you rather than helping ten people living next door to you.

There's certainly an argument there.

I don't think that's wrong.

Kayla: I think it's good to think about the global community and consider where we can help and who matters in this world.

But I also think that philosophically diminishing the help you can do in your own community as ethically inferior has its own downsides.

Like, I think that people like MacAskill and the Elon Musks and the various Sams of the EA world, they feel very cut off from people like you and me.

I think the wealthier you get, the more cut off from, quote unquote, regular society you become, to the point where you can only relate to other extremely wealthy people, and you live in this really hard-edged bubble that cannot be penetrated.

Ha ha.

The world becomes a series of, unfortunately, calculations and hypotheticals.

Like, hypotheticals you're detached from, and that's really, like, the opposite of community.

Kayla: You do not live with the people who are your neighbors.

Chris

Chris: Yeah, man.

Yeah.

I also am of two minds on this.

You're right.

It's good to widen the scope of who you think of as your neighbor and who you are willing to give charity to.

And consider the global community and all of humanity.

That all sounds nice.

But on the other side, there's the contribution to the atomization of society.

And if we're all just doing the math, which seems to be what they complain about with utilitarians, but anyway, if we're all just doing the math to say we can help the most people in XYZ place, don't worry about physically going down to the soup kitchen or whatever, or even just, I don't know, baking a pie for your neighbor, maybe they're still into that.

But it just.

It feels like it's emphasizing one thing over the other just because of the efficiency and the effectiveness.

Chris: Yeah, I don't think.

Kayla

Kayla: It's not the only thing eroding it.

Chris

Chris: Yeah.

Eroding the community down to the, like I said, atomization, where everything you do has to be mediated through an app, and you have to.

If you're gonna swim at a friend's pool, it has to be through Swimply.

And if you're gonna, you know, get food from a friend, it has to be through Uber and yada, yada.

If you're gonna stay at a friend's place, it's gotta be through Airbnb.

Kayla

Kayla: Right?

Right.

I read a quote, during all this research that I lost in one of my tab closures, and I haven't been able to find it again.

Forgive me.

Chris

Chris: Oh, tab closures.

Kayla

Kayla: No, forgive me for paraphrasing, especially, but someone pointed out that the Silicon Valley effective altruists literally live in their ivory towers above San Francisco, debating and calculating how to get more wealth and what future populations to help with it, while scores of people literally suffer and die in the streets of San Francisco below them, like, literally below them, literally beneath their feet.

And that point stuck with me.

Like, we live in Los Angeles.

We live in a city that has a tremendous unhoused population.

Like, I think it's 75.

It's north of 75,000 people.

And I think about that's so many.

And I think that's just LA city.

I think LA county is more.

And so I think about that 200,000 number that Scott Alexander talks about.

Kayla: And I think about if Elon Musk were to take some of his exorbitant wealth and do something to house unhoused people in Los Angeles, if you want to think about it in long term stuff, that's not just helping those 75,000 people, that's helping the unhoused people that come after them as well.

I don't know why they're not thinking about these things.

I think that the destruction of community does have material impact.

Chris

Chris: Well, I think that's part of my problem with long termism, is that there's just a lot of, like, assumption that we are doing the calculations correctly.

Kayla

Kayla: Right.

Chris

Chris: And I just don't know that you can do that.

You know, it's like, oh, well, we're definitely spending the money in the right place.

Like, are you, like, you have certainty on that?

Like, we're applying the certainty of mathematics to something that doesn't feel that certain to me.

I'm not saying there isn't a correct answer.

I'm just saying that we don't know what it is.

Kayla

Kayla: Right.

Chris

Chris: And you certainly don't know what it is with your EA calculations or your utilitarian calculations.

It's just.

Yeah, that's one of my problems with it.

Kayla

Kayla: Essentially, it's not morally inferior to help your neighbors.

Like, our communities are important, and I think that effective altruism and longtermism divorces the wealthy from that idea.

Yeah, I lied.

There's one more point.

Chris

Chris: Oh, my God.

You're just like, one more point.

Kayla

Kayla: You're like, I know.

The long termist argument.

This is a little point.

The long termist argument that future people morally matter just as much, if not more, than people literally alive right now is like a super duper steep, slippery slope.

And I worry about the anti abortion argument lurking in that philosophical mire.

Like, I really worry about.

Chris

Chris: That's fundamentally natalist.

Kayla

Kayla: I hope long termists at some point grapple with that, because I'm worried about that one.

Chris

Chris: Yeah.

And like, I gotta say, as an economist, I also am kind of annoyed that, like, they're not applying a future discount to the moral value, you know, like, you do that with money.

Future money is worth less than current money.

Kayla

Kayla: That's true.

Chris

Chris: So why aren't you doing that with future people?

Like, people ten years from now should be like, 50% of people now and so on.

Kayla

Kayla: You should email Sam Altman.

Tell him that.

Chris

Chris: See, but they're like saying, oh, we're so objective with all of our math.

And, like, have you even taken an econ 101 class?

Kayla

Kayla: I think they have, and I think that's the problem.

Caroline Ellison's father is like an econ professor at MIT.

Chris

Chris: Well, then he should be bringing this up.

Kayla

Kayla: You should go email him.

Chris

Chris: I'll email him.

Kayla

Kayla: I'm sure he doesn't have anything else to worry about right now.

Okay.

I have talked a lot about these guys and about effective altruism and long termism and everything that goes along with it.

And I think it's finally time we made our way through the test grail bundle.

We hit basically every letter on our way down the ladder.

So next week on Cult or Just Weird: is TESCREAL a cult, or is it just weird?

Chris

Chris: Oh, actually, sorry, Kayla.

There's one more thing that we need to cover, I think.

It's context.

Kayla

Kayla: But we did all the TESCREAL letters.

Chris

Chris: I know we did do all the TESCREAL letters.

And I know that we're really chomping at the bit at this point.

I kind of feel like EA, by itself, could be its own criteria.

We probably could have evaluated that.

But I want to talk a little bit about the context of eugenics that sort of is not.

Behind is not a good word.

It's sort of like a precursor, but it's a complex precursor to all this stuff.

And I don't want to.

We'll get to that.

That'll be next week's episode.

But I don't want to give the impression that, yeah, eugenics, the super Nazi part of eugenics, is just everything we've talked about.

Kayla

Kayla: You're saying it's E-TESCREAL.

Chris

Chris: It's sort of E-TESCREAL, but the TESCREAL bundle has some DNA in the eugenics movement, and I feel like that's important context to bring up before we do the criteria.

Kayla

Kayla: That's really good, because I left out a bunch of stuff from this episode that has to do with eugenics-adjacent stuff that's related to effective altruism.

So.

Perfect.

All right, next time on Cult or Just Weird.

We're talking about eugenics again.

Again, this is Kayla, this is Chris.

Chris

Chris: And this has been Cult or Too Much Context.