Episode Transcript
Also media, step up to the slop trough.
It's time for Better Offline, and I'm your host, Ed Zitron.
As ever, buy the merchandise, sign up to the newsletter, message me.
I'm everywhere.
But today I'm joined by the author of a book about how much I should be punished, what punishments I deserve, and how long I should be punished.
I'm of course talking about More Everything Forever, written by astrophysicist Adam Becker.
Speaker 2Adam, welcome to the show.
Thanks for having me ed, It's good to be here.
What's the book actually about?
I apologize now, it's okay.
The book is actually about the horrible ideas that tech billionaires have about the future that they're trying to shove down our throats, and why they don't work.
Speaker 1You're an astrophysicist, right.
Speaker 2Yeah, by training, what is that?
Well, I did a PhD in astrophysics looking at how much we could learn about what was happening right after the Big Bang by looking at what was happening in the universe.
You know what's happening in the universe right now.
Speaker 1So how did you get into the, touching on the world of Silicon Valley? Because you have to talk about some of the damnedest stuff I've ever heard.
Speaker 2Sane.
Yeah.
Well, I started out my career as a science journalist straight out of grad school and was writing mostly about physics.
Speaker 1Why are you writing?
Speaker 2I was writing for pretty much everybody.
I started at New Scientist and then moved on to writing for the BBC, wrote a book about quantum physics, wrote some stuff for NPR, the New York Times, Scientific American.
Yeah, and you know, like was sort of having a normal science journalist career, and then in twenty sixteen, the weirdest fucking thing happened.
What happened?
You know?
We elected a fascist?
Oh right, yeah, that thing, yeah, that thing.
Speaker 1Yeah.
Speaker 2And I thought, oh, you know I should be doing something more to, you know, directly combat this.
If I write another book, I would like it to have a more directly political angle, right, and uh.
I live in Berkeley, I live in the San Francisco Bay area and just surrounded by tech bros constantly and was getting tired of their bullshit and it seemed more and more directly connected to the disintegration of American politics.
And I thought, Okay, you know, somebody needs to write about how they have these insane ideas about the future and how that informs their terrible politics.
So how long have you lived in the Bay?
God, oh, thirteen years?
Speaker 1Okay, so you've got like a good backing of where the bay has been in that time as well, because I think a lot of these people are transplants.
And I say this as someone who literally moved to the Bay for two years in twenty fourteen.
Yeah, like and even then it was weird watching what they were doing.
Yeah but wait, so that was so you've been there thirteen years.
But when did the book get started?
Like, how did all this start? Because this kind of came out of nowhere, in a good way.
Speaker 2Yeah, yeah, No, I got started on the book probably the first inklings of it were around twenty nineteen or twenty twenty, right before the pandemic.
I uncovered an online magazine that was trying to sanewash creationism and climate denial, and that was being funded by Peter Thiel.
Hell yeah yeah, And I broke that story and thought, oh, yeah, yeah, these these tech bros are awful and everybody thinks they know a lot about science and technology.
Even the people who don't like them seem to think that they know a lot about science and technology, right, and it's just not true.
Like they don't know anything about physics, they don't know anything about biology.
Peter Thiel thinks that creationism is, you know, plausible, or that evolution isn't the whole story.
That's nonsense, you know.
Elon Musk thinks that we can live on Mars.
That's nonsense, or at least he says that.
Whether he actually believes it, I don't know.
Speaker 1But that's actually kind of my question.
Yeah, how much of this shit do you actually think they believe?
Because I know Bezos is tied up with the Long Now Foundation.
Yeah, and they do make nice tea over there.
However, the rest of the stuff not so good.
But how much do they really believe in this?
Because I just I you've said that this is kind of a homecoming for them.
This is kind of them coming back to the things they truly believe.
I don't think they believe in anything.
I think some of them don't believe in anything.
Go into I'm not saying, I'm you know that's really right here?
Speaker 2Absolutely.
Yeah.
So, like, for example, I think it's very plausible.
Obviously, we can't know for sure, but I think it's very plausible that, say, Sam Altman doesn't believe in anything.
Yeah, that's quite possible.
Yeah right.
Karen Hao makes a good case for that in her reporting and in her books.
Speaker 1Excellent, she's been a guest on the show, she's fantastic.
Speaker 2She's amazing.
But at the other end of the spectrum, I think it's very plausible that Jeff Bezos really really does believe that we need to go to space.
And the reason I say that is, when he was the valedictorian of his high school down in Florida, in like nineteen seventy eight or something like that, he gave a speech about how we need to go to space, we, humanity, need to go to space.
Speaker 1That doesn't feel as ludicrous a belief, like, we go to space.
But what does go to space mean for this guy?
Speaker 2Well, that's the thing.
The specifics of that belief that he professed at the time are pretty similar to what he's saying now, and it is pretty ridiculous, like he has said very recently, and it echoes the stuff from his valedictorian speech when he was like eighteen, that we need to move into you know, hundreds of thousands or millions of enormous cylindrical space stations.
Oh, of course.
And have, you know, a trillion people living in the Solar System.
Speaker 1Tubes, yes, full of trillions of people, exactly, tubes just work really well, like the Hyperloop for example, yeah, a successful tube.
Speaker 2Yes, or you know the Internet itself a series of tubes.
Yeah.
Speaker 1And because those tubes in the Internet worked, of course, the tubes work in space.
Speaker 2Yeah, that's I think that the bongo.
Speaker 1Yeah, yeah, this is science.
I believe I failed all the sciences.
I'm really sorry.
I'm really sorry, but no, keep going.
Speaker 2Yeah.
So you know, he said, then we can like make Earth into a beautiful park that you know, allows us to you know, save the environments.
I know.
And he said, when we have a trillion people living in the Solar System, we can have a thousand Mozarts and a thousand Einsteins.
I'm like, buddy, we probably already have people who are just as talented as Mozart and Einstein and all the other geniuses of history who are living and dying in poverty.
Yes, and you know you don't care about that, And you don't seem to care about climate change because you know, the carbon footprint of the Blue Origin rockets and the Amazon warehouses and all that stuff, right, But instead of that, he's like, no, no, no, the solution is to go to space.
Like, why that's not gonna work.
Speaker 1Because more space is up in space, Adam, you put tubes in space. It's really, it's funny, because these people are insanely rich but also sound very stupid.
Yeah, when you really get down to the crux of it, like, what's your solution, Jeff? You've got all the money in the world. Tubes, yep, tubes, space tubes, yep, a trillion people, Mozarts, more Mozarts, just sweating.
Speaker 2Profusely, exactly.
Yeah, And he does try to give an argument, but the argument is hilariously bad.
He says that we need to go to space, among other reasons, like there's all that environmental stuff, but the thing he keeps harping on and coming back to is, he says, we need to keep using more energy per capita?
Speaker 1Right, why?
Speaker 2Right?
Why?
Exactly?
He never says exactly WHOA Okay, Yeah, he says.
The only defense he gives for that is he says, if we don't do that, we'll have a civilization of stagnation and stasis.
Speaker 1As opposed to now when there's tons of innovation happening and all of big tech is focused on a diverse series of options rather than one big expensive dogshit thing.
Speaker 2Precisely, he nailed it.
Yeah, and he says we have to go to space because like otherwise, we're gonna run out of energy here on Earth.
We won't be able to keep expanding the energy we use per capita.
What does the energy do?
Speaker 1Well?
Speaker 2Yeah, first of all, what does the energy do?
And second, uh, this is this is actually my favorite part.
Hell yeah, he is right that if you just, like, idiotically kept that trend going in a way that's physically impossible, you know, for hundreds of years, you would run out of energy here on Earth.
Like, you wouldn't be able to keep the energy use per capita growing at the exponential rate that it has been; in about three hundred years you'd be using all the energy that we get from the Sun here on the Earth.
But if you keep that trend going, if you try to do that by you know, going out and living in tubes in the solar system, right, that only gets you like another few hundred maybe one thousand years.
Then you're using all the energy that comes from the sun.
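To put rough numbers on the argument Adam sketches above, here is a minimal back-of-envelope calculation in Python; the wattage figures and the three percent annual growth rate are illustrative assumptions, not figures from the episode.

```python
import math

# Back-of-envelope check of the "ever-growing energy use" argument.
# All figures are rough, assumed orders of magnitude, not sourced numbers.
current_use_w = 2e13       # ~20 terawatts: roughly today's global energy use
solar_on_earth_w = 1.7e17  # ~170,000 terawatts: sunlight intercepted by Earth
solar_total_w = 3.8e26     # the Sun's entire power output
growth = 0.03              # assumed 3 percent growth per year

def years_until(start_w, target_w, rate):
    """Years of compound growth needed for start_w to reach target_w."""
    return math.log(target_w / start_w) / math.log(1 + rate)

print(round(years_until(current_use_w, solar_on_earth_w, growth)))  # ~306 years
print(round(years_until(solar_on_earth_w, solar_total_w, growth)))  # ~728 more years
```

Even granting the premise, moving the growth out into space only buys a few more centuries.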
Speaker 1Right, and we've still not really established what we're using the energy for.
Nope, nope, no. So, data centers? Yeah, finally.
Speaker 2Right, and this, and this brings us to, like, the bullshit that Sam Altman said about building a Dyson sphere.
Speaker 1Explain what a Dyson sphere is.
I thought it was the ball on the fucking vacuum.
Speaker 2But yeah, no, that's the only kind of Dyson sphere that actually exists.
No, a Dyson sphere is a giant megaconstruction project, you know, beyond anything that anyone's ever actually built or probably could build.
That just encloses a star and captures all of the energy from that star.
Speaker 1Right, and have we ever built anything like that? No, of course, okay, just making sure. So it's building a big.
Speaker 2Shell around the star to capture all of the energy from that star to use it for data centers.
Speaker 1Where would the energy How would the energy get from the star to Earth?
Speaker 2I mean, oh, tubes, yeah, tubes exactly.
Yeah, so I battery.
Speaker 1Yeah, like, it's so safe, it's so cool.
We live in a society.
I wish I could do what he does.
I would be saying shit all the time.
I just I'd be like, yeah, actually, you need we can change the world if we just create a series of tubes that just gave me money every day.
No, wait, that's too that's that's too obvious.
I'd need to come up with the best.
Speaker 2Well no, I mean, I just think it's pretty interesting that these guys are spouting obvious bullshit and the only reason people listen to them is that they're rich.
Like if if they weren't saying this stuff, but then I went around saying this stuff, nobody would listen to me unless they funded me.
Speaker 1If a guy on the street who smelled kind of bad walked up to you and said the price of intelligence is getting too cheap to meter, you'd be like, all right, mate, yeah, can't do anything, but Clammy Sammy says it and everyone loses their fucking shit.
Speaker 2Well yeah, and that, that actually, that brings me to something else that we were planning to talk about, you know, speaking of weird dudes on the street who are not billionaires making insane claims.
Eliezer Yudkowsky.
Speaker 1That's how you say his name, don't even Liza, I don't really.
Here's the thing.
He's a disrespectful sexist moron grifter.
So I really don't give a shit.
Speaker 2Yeah, no, it is rather bizarre that anybody listens to anything he has to say about anything. So, who the fuck is this fuck?
Nun?
No?
No, the alternate title for my book, like, in my head, the head canon, was These Fucking People.
Yeah, I love that.
Speaker 1No, no, I write these fucking bastards a lot.
Right, Eliezer Yudkowsky?
Speaker 2Yes, who what does he do?
Why does what does he do?
Speaker 1And why do so many seemingly smart people believe this dipshit?
Speaker 2So, Eliezer Yudkowsky. I'm gonna give you, like, the formal version of who he is, what he might say, what would be in, like, his online bio, and then I'll tell you the reality.
Okay, so uh.
Eliezer Yudkowsky is the co-founder of the Machine Intelligence Research Institute, which has been around for about twenty five years, and he has been researching artificial superintelligence for all of that time, mostly going on about how dangerous it could be if anybody built it without ensuring that it would serve humanity.
Speaker 1And this is just to be clear.
He has no scientific knowledge.
Can he even code?
Like?
Does he have any kind of.
Speaker 2He doesn't even have a high school diploma.
Speaker 1So I won't judge people for that, but I'll judge him for all the rest of here's the thing.
Speaker 2Yeah, no, no, I'm not judging him for that, but it doesn't really fill me with confidence.
No, no, no, no, he has no formal qualifications.
And again, that's fine.
You know, there are many people who've made major contributions to many fields of human endeavor without any formal qualifications.
Right, that's fine.
The thing is, if you make extraordinary claims like he's making, you need extraordinary evidence, and not having those qualifications, like you said, doesn't really inspire confidence.
He has made a series of really outlandish claims about what the future of AI could be, right, based on essentially nothing, based on like reading a bunch of science fiction.
He explicitly cites, uh science fiction.
There's, like, Vernor Vinge, as you know, oh, Vernor Vinge wrote a bunch of books like Marooned in Real Time and, uh, god, I'm trying to remember the names of the others.
It doesn't matter.
Point is, yeah, he's a fiction writer who's also, I think, a scientist of some stripe, I don't remember what, but still writing fiction, and Vinge came up with this idea, or was one of the originators and popularizers of, an idea called the singularity.
Right, So the singularity is this idea that the rate of technological change is just going to keep getting faster and faster, and specifically, the rate of intelligence of AI is going to keep getting smarter and smarter, uh, until we reach this sort of point of no return where we have a singularity accompanied by an intelligence explosion that.
Speaker 1Leads to like the singularity moment.
Speaker 2Yeah, the singularity moment is very ill defined.
Speaker 1Oh and the I can't fucking believe this.
Yeah, I've heard this bollocks so many times.
I thought they had a moment.
I thought they had a point.
Speaker 2No, not really. Oh, you, yeah, so like Kurzweil, right, the patron saint of evangelizing the singularity, the guy who wrote The Singularity Is Near, and then the sequel last year, The Singularity Is Nearer, which is the real fucking title.
Bro, I know that's the real title of the book.
Speaker 1But his next book's just called, sorry, yeah no, his next book's, like, It's Here. Again, you see, is the singularity in the room with us? Yes, exactly, but he doesn't.
Speaker 2He tries to, but it's incredibly vague.
He says, like Kurzweil says, the singularity is going to be here in twenty forty five.
He also said, in two thousand and five, in The Singularity Is Near, that, you know, we would have all kinds of nanotechnology by now.
They love nanotech, they love nanotechnology.
They use it as a synonym for magic.
Speaker 1It was, I swear to God.
Also, there was a nanotechnology bubble briefly like ten years ago.
I vaguely remember them trying.
Speaker 2It didn't really go anywhere. I mean, there was also a nanotech sort of hype bubble back in the eighties and nineties, and it also didn't go anywhere.
And it didn't go anywhere because it turns out that this idea of nanotech as, like, magic pixie dust that fixes everything is nonsense, and it's being echoed right now in the AI bubble.
Yeah right.
It's the same kind of hype, often pushed by the same people with the same logic, sometimes working at like the same nonprofits.
I mean, Yudkowsky talks about nanotech constantly.
It's in his new book it's all over you know, the website that he's created.
Uh.
Speaker 1And his book is called If You Buy This Book, I Make Money.
Yeah, that's, sorry, it's called If Anyone Builds It, Everyone Dies.
It's such a stupid fucking title.
Speaker 2Sorry.
Yeah, no, it's a very stupid title.
I will say the one thing I'll say about Yudkowski.
I am sure that he is a true believer.
He is not a, yes, he's not a grifter.
Why? Because, like, it's hard to explain, but I am so much more sure about him than I am about anybody else, even Bezos.
Speaker 1I trust your judgment.
It's just he gives off the air of like a despot for a mad men.
Speaker 2Yeah.
I would say the best way to think about Yudkowski, like the way that I often think about him, is imagine like a really smart, self educated fifteen year old.
Yeah yeah, and like, you know, because if a fifteen year old was running around saying the stuff that Yudkowski is saying right now, I'd be like, wow, bright kid.
Speaker 1I hope he grows out of it, and I hope his parents have a lock on the gun cabinet.
Speaker 2Yeah yeah, yeah, well, and also like I hope he you know, I hope he grows up.
Speaker 1Yes, and I'm still thinking that.
Speaker 2Yeah, and like, and I don't think Yudkowsky did.
I think, you know, I think like.
Speaker 1I also think everybody fell for it.
Speaker 2Yeah.
Well, and that's the thing, Like he got a lot of support online.
He, you know, he got money from Peter Thiel.
Speaker 1Sam Altman said that he could, he should, he may win the Nobel Peace Prize.
Speaker 2Oh yeah, yeah, Sam Altman said that he should win a Nobel Peace Prize.
Speaker 1Falling down moment if that happens.
Speaker 2Yeah, no, that's not happening.
Speaker 1Defense.
Speaker 2Yeah, no, there's no way.
But like, look, he got a bunch of money from Peter Thiel because Thiel thought that, you know, Yudkowsky was saying smart stuff about AI, right.
Thiel now doesn't much like Yudkowsky because he thinks Yudkowsky is too pessimistic.
But the sort of, the damage has been done.
Speaker 1The ultimate Oh yeah yeah, classic old grins and smiles of that foul.
Speaker 2Yeah, no, too pessimistic for Peter Thiel.
Speaker 1That's actually bad.
Speaker 2Yeah no it is, no, it's but no, he's he's a true believer.
He's just kind of nuts.
Speaker 1So what does he do all day?
I say this as a blogger, PR person, newsletter writer and podcaster and all this shit. Like, I realize I have an email job, fine, but at least I can tell you what I do all day.
What does he do, like, go to parties with people like Kevin Roose, going, the computer's going to kill us all?
Speaker 2I think that's a good chunk of it.
And I also think he writes an enormous amount, right. Like, this is a guy who wrote that, you know, Harry Potter fan fiction that's longer than War and Peace.
Right, he wrote like a one and a half million word BDSM decision theory novel.
Speaker 1I say this as someone who writes a lot, a lot of words.
That's an unhealthy amount of words.
Speaker 2I agree, and it does help.
I think for him being able to write that many words, he's not a very good writer.
Speaker 1Yeah.
I mean even again, I write fifteen thousand word blogs, so I can't really judge him too harsh.
But one point five million words, how do you even know what it's about?
Speaker 2At that point?
I only know what it's about because that's what he said it's about.
I haven't read that one.
I did read most of the Harry Potter one, for research.
Yeah, bad, it was really really incredible.
Speaker 1Any any sexism or racism in there, it's just strange.
I mean it's JK.
Speaker 2Rowling, So yeah, exactly.
Yeah, that's a good question.
I don't remember anything specific. Good.
Speaker 1Yeah, I mean it's just strange.
Speaker 2Yeah.
I mean, he's definitely got a hard-on for eugenics and.
Speaker 1These and this is somewhat paraphrasing the comic book Preacher, but it's like, why do these fucking guys always look like that?
Speaker 2If you're gonna claim.
Speaker 1You're like a eugenicist, you should not look like an egg with a hat on.
And I won't get into but I don't generally get into personal appearance because I'm self conscious myself.
But it's like, if your whole thing is like, yeah, we need to make the perfect human beings, it's like, you can't look like that, mate, I'm sorry, you can't do that.
Speaker 2Well I don't.
Speaker 1I guess you can.
You'll be in the New York fucking Times.
Speaker 2Yeah.
No, it's it's crazy.
It is really crazy that anybody listens to him.
But no, he's really into eugenics.
Speaker 1Why do they listen to him?
Speaker 2He's really into evolutionary psychology, and he's got like the sexism and racism that's like tied up in that.
Why do people listen to him?
I mean part of it is that he got that money from those billionaires, right.
He was hanging out in the bay saying the kind of insane contrarian shit about AI that attracts the kind of like brain dead billionaires like Peter Thiel.
And then you know, he became the guy and started, you know, a series of online platforms that attracted a following, right, like you know, Less Wrong, and then that spun off this whole rationalist subculture.
Yeah, that's a very good question.
Less Wrong is an online platform that serves, slash maybe served, as a home and, like, epicenter for this movement called the Rationalists, right, which sort of formed around Yudkowsky's writing, including this set of writings he has called The Sequences where he lays out, oh, he's a cult leader, yeah, in a way.
Yeah.
Yeah.
Speaker 1The rationalists are just people, I'm guessing, guys with trilbies who say that we need to focus on rational thought and logic.
Speaker 2And there's a lot of it.
I mean, some of them are women, and some of them are nonbinary.
Speaker 1Really surprising, yeah.
Speaker 2I mean, look, there are nerds of all stripes.
Speaker 1Yes, And also he's very much playing in the older Internet.
Yes, but the idea of a large forum with any kind of following is actually kind of adorable these days, except when it's less wrong, it's not adorable.
Well.
Speaker 2Also, Less Wrong's been around since the somewhat older Internet, right, it's not been around since the nineties, but it's been around since, like, the mid to late two thousands.
Okay, and Yudkowsky is, you know, a lot of the rationalists are in their twenties and maybe early thirties, but Yudkowsky himself is in his mid forties, right, because, you know, he is terminally online.
And I'm sure, like, obviously he'd be unhappy with many of the things I've said about him, but that one I'm sure he'd agree with.
You know, like he's been online since he dropped out of school at age what like thirteen or fourteen.
He's been online since the mid nineties, yeah, and like, you know, he was on transhumanist forums, like, you know, since the mid nineties, like email threads and stuff like that.
Speaker 1God, yeah, he really, he is like the detritus of the Internet, in a way, brought together like a Katamari of center-right freaks.
Speaker 2Yeah, bearing ever right.
I wouldn't even say center right.
I would say techno libertarian.
But that is just that's just right.
Oh, no, it is right wing.
It's the center part that I disagreed with.
Speaker 1Yeah, perhaps he started that when he was fifteen, before he learned yeah, the wrong things.
Speaker 2Yeah, I will say, like, I don't get the sense that, like, he likes Donald Trump, but he certainly, like, will parrot a lot of standard libertarian talking points along the way to, you know, making his point. The one thing.
Speaker 1I keep thinking though, is I don't know if I can shake this thing he's a grifter just because you're taking a bunch of twenty year olds, you've got all of this writing thing.
He's either a grifter or a true cult leader.
He may actually just be a cult leader.
Speaker 2Why I would say cult leader is.
Speaker 1Closer, Yeah, because he seems to I mean, dangerous is probably the wrong word.
Speaker 2Yeah, I think that's straight.
Speaker 1He's not.
Speaker 2I wouldn't call him dangerous, but he is.
Speaker 1You think he's only dangerous to, like, a Hot Topic worker, a very nerdy Hot Topic. No, no, no, not to them, just, just that he'd speak to them.
It's just peculiar as well, because, and this actually gets into some of your science background, you've got my continual frustration, because I am self taught with all this economic stuff, which is insane, so I probably shouldn't criticize Yudkowsky quite as much, but I will.
I'm a hypocrite.
Looking through financial journalism and tech journalism, the thing that I keep noticing is that people keep accepting things that are just patently wrong.
There's just shit that they say, like even with this Nvidia OpenAI deal, people saying Nvidia invested one hundred billion dollars.
They didn't.
They're investing progressively.
When they do the first gigawatt. Gigawatts of, like, a gigawatt of data center will take about a year and a half, two years.
That though, it's just bollocks.
I imagine the last few years have been a little bit mind bending for you, hearing all this stuff about AGI and the future and all that.
Speaker 2Yeah, gobshite, yes, yeah, I mean the AGI stuff, which, like, I started working on this book before ChatGPT came out, right, and it.
Speaker 1Was twenty nineteen, a few years into OpenAI.
Speaker 2Yeah exactly.
So, like, I knew about OpenAI, and I knew about, like, transformer models, but, like, you know, ChatGPT comes out and, suddenly, you know, the public conversation shifts in a way that I didn't anticipate.
I realize, oh, this book is going to have to be a little bit different than I thought it was going to be.
But also, you know, all of this conversation about AGI, right, like in a way it helped me for writing the book because I thought I was going to have to spend a lot of time in the book explaining what AI is, what people think AGI is, right, There's gonna be a lot more explanation, And then all of this stuff came out and I'm like, oh, actually, this, you know, I can I can spend more time in the meat of the book.
This is this is helpful for me because you could just quote them directly exactly.
Yeah.
But the thing is, AGI is this hopelessly ill defined thing, like superintelligence, this thing that Yudkowsky is on about.
Uh you know, what does it even mean?
Like, have you looked at the definition of AGI in the OpenAI charter, like the original one?
Oh yeah, no, it's great, Like the original charter from way back.
It says something like AGI is a machine that can reproduce any economically viable or economically productive activity that humans engage in.
That's a bad definition.
Speaker 1That's anything.
Yeah, I mean that could just mean anything that could.
It's it's a machine that can do anything.
Speaker 2It's both vague and really narrow, right, because it's like, Okay, I thought AGI was supposed to be like, you know, commander data on Star Trek, right, and so that means you know, it's gonna be sort of like humans.
It can do the things that.
Speaker 1Humans do, also economically viable work.
And the first thing they start with is fucking writing.
Yeah, like Jesus Christ.
That's like, oh my god.
Yeah, the first and the most.
We're gonna build boats and sell like we're gonna buy boats as an investment vehicle, like like what the fuck?
Speaker 2Yeah, these people don't do any real work.
Speaker 1It's so strange as well, because it almost never comes up in the AGI conversation.
When it's about AGI.
Because my favorite thing to do with media is go.
Speaker 2Go.
Speaker 1Isn't this slavery?
Because it is.
It's like, oh, yeah, we'll do an autonomous thing.
We'll make it do things, but it will be conscious, which will allow it to work better.
Speaker 2Yep.
And so then you get people talking about like a data center filled with geniuses, and like, oh, okay, wouldn't a data center filled with geniuses not want to work for you?
Speaker 1Wouldn't a data center full of geniuses that can't leave and have to work be called a prison?
Speaker 2Yep?
Yeah?
Cool, yeah, exactly.
No, I I get into this in my book.
You know, the inspiration for a lot of these ideas ultimately traces back to mid twentieth century science fiction, right, And so you get people like Isaac Asimov, Arthur C.
Clarke, right, Asimov's robot stories in particular.
If you go back and look at Asimov's robot stories, it is very hard with like a modern eye to look at them as, especially certain ones of them.
It's hard to see them as anything other than, like, kind of being about slavery and race relations, because you get, like, for example, there's this one short story, I think it's called, oh God, I think it's called Catch That Rabbit, but I might be confusing it with a different one.
It might be Little Lost Robot.
I get those two confused.
But either way, it's about a robot that is trying to escape gain its freedom.
And in that story, the humans are like addressing a bunch of they're interviewing a bunch of seemingly identical robots to try to find the one that they're looking for that's trying to escape.
And they interview these robots and when they're interviewing them, they address them as boy, and the robots called the humans Master.
Yeah, And these stories are from like nineteen fifty five, like you know, the Jim Crow South is alive and well, it's really bad, it's really really uncomfortable.
And then, like, forty years later, in the nineteen nineties, you get Vernor Vinge writing about the singularity and how great it's going to be when we all have these robot assistants, and he refers to Asimov's wonderful dream of, and this is a direct quote from Vinge.
Willing slaves. Jesus fucking Christ.
Yes, And that's something that someone wrote in like nineteen ninety one.
I mean, but that's what this is.
Speaker 1And this is an uncomfortable topic because that's what this is.
It's what pisses me off, other than, like, nineteen other things, about Kevin Roose at the Times, because he's written several things about AI and AGI and one thing about AI welfare, and it's like, the AI welfare begins with slavery.
And if you can't write that, you're a fucking coward and a bitch.
I'm sorry you can't write.
Yeah, everyone is excited about slavery because that's what it is, and it's nothing else.
It's not oh well, it's like they wouldn't be.
They wouldn't be.
They'd like doing it, and it's like, fuck you man, it's slavery.
What I really hope happens is if AGI happens, it's just a just a regular dude, yep, and he's lazy and he's a yeah, like you just do this.
What I think is way more likely is, I don't think AGI is possible.
Speaker 2Yeah.
No, Actually that's a good question.
Do you think it's possible?
Not really?
No likely?
Speaker 1I say this as a non-scientific person.
Speaker 2Yeah, yeah, no, I don't think that you can build.
Well, first of all, I think AGI is just hopelessly ill defined, right, But if we want to say, like an artificial machine that has the cognitive capacities of a human, like they can do all of the tasks, like all of the things that humans do.
First of all, I think you're gonna need a completely different kind of machine.
I don't.
I don't think certainly.
I don't think that you know, scale is all you need.
And if just scale attention, yeah, give it, give it more data.
And if you don't have enough data, you make more synthetic data with more LLMs, like, dude, why, why would that work? No.
Absolutely not?
But, but I also think that, like, there is a very simplistic set of ideas behind the idea of AGI, right, and the two that I keep coming back to are the idea of the brain as a computer and the idea of our bodies as, like, meat spacesuits for our brains.
And both of those are just wrong.
The brain is not really very much like a computer.
It is more like a computer than it is like say, a clock.
But there was a long history of comparing the brain to, you know, the most complex piece of machinery that humans have at the time, right. Before the brain was like a computer, it was like a telephone network.
Before that, it was like a hydraulic system.
Before that, it was like a clock or a windmill, right, And it's not really actually like I mean, it's a little like some of those things.
But the brain is like the brain, and the main difference.
Speaker 1We don't understand thinking, no, and we don't.
Speaker 2Understand exactly how the brain works.
And part of that is that the brain was not built.
The brain evolved.
Right.
But also we are not our brains.
We are our bodies in our environments.
Right.
The brain is inextricably connected to the body, and the body works in an environment surrounded by other bodies, in a culture, a society, a world.
Right.
You need all of those things in order to get the human cognition that you know.
These guys are so you know, determined to reproduce inside of a computer.
If you just take a human baby and, like, leave it with a bunch of food in the woods, even if you get rid of all the predators and everything, that baby's gonna starve. Get it a bunch of books, man, right, yeah, like, the baby books.
Yeah, if you give the if you somehow, if you feed the baby but don't talk to it, the baby will not grow up being able to think properly.
Speaker 1Or its thinking will be vastly different.
Speaker 2Exactly.
Yeah, and so like you need so much more than just the brain.
Speaker 1It also compresses human experience.
They conflate experience with learning.
Speaker 2Yep.
Speaker 1And we don't know how we learn. Like, we learn, we learn intentionally but also unintentionally, societal, yup, conditions around us, how we felt in a particular moment.
And memory is also insane, yep, yep.
We experience the world in this is my personal experience.
My experience of the world is vastly different to my memory.
My memory is like crystal clear and beautiful, and my real life is a mixture of slops.
Yeah, and it's it's frustrating as well.
Because these people also don't appreciate, like, people.
They don't, they don't like, the human brain is, kind of like human bodies, even the dumbest dumb-dumb, it's kind of an amazing, yep, thing.
Speaker 2Yeah, no, And one of the things that's amazing is, Yeah, we don't know how the human brain works.
We don't know how thinking or learning works.
But what we do know is that we don't do it in any way anything like an LLM.
Yes, right, because the amount of you know, material that we take in over the course of the first three years of our lives.
When we go from not knowing a language to knowing a language, maybe multiple languages, it is nowhere near the amount of material that is, you know, force fed into these LLMs, and yet we get the trick done, and three year olds know things that no LLM knows.
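A rough order-of-magnitude version of that comparison, sketched in Python; the per-year word count and the token count below are illustrative assumptions, not figures from the episode.

```python
# Rough comparison of the text a toddler hears versus what a large language
# model is trained on. Both figures are order-of-magnitude assumptions.
words_heard_per_year = 7_000_000                   # assumed words a young child hears per year
child_words_by_age_3 = 3 * words_heard_per_year    # ~21 million words
llm_training_tokens = 10_000_000_000_000           # assumed ~10 trillion tokens for a large model

ratio = llm_training_tokens / child_words_by_age_3
print(f"Roughly {ratio:,.0f} times more text than a three-year-old has heard")
# Prints roughly 476,190 -- hundreds of thousands of times more material,
# and the three-year-old still ends up knowing a language.
```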
Speaker 1Also, there's no affordance for the fact that some people can't learn stuff, like, I cannot learn languages.
Speaker 2I've really tried.
I'm pretty trash at that too.
Speaker 1But I also was really bad at like I was uniquely bad at a lot of things.
I have my various no but I have ADHD, dyspraxia and other stuff I won't get into.
But it's I can't like certain things, don't like the things that I pick up insanely quickly.
Other people can't.
Other people can't even see the connections.
It doesn't, Robert Evans actually had a really good point on the subreddit, yes Robert, I read all your stuff, where he was saying that, like, he is very good at, like, picking up stuff, like, almost, to me, he can read faster than most people as long as it's about conflict, and it's, no, but it's true, and it's one of the remarkable things about the human brain.
And I think that it's actually kind of disgusting how little appreciation there is for, like, human bodies and the brain, how incredible the average person is, even average people are.
Speaker 2Yeah, no, and this is this is the thing.
These guys don't have a proper appreciation for the human brain and the human body.
And going back to the tech billionaires, and I guess Yudkowsky as well, they don't have an appreciation for how remarkable Earth is in particular, right, you know, they, you know, especially when you talk about somebody like Bezos or Musk, they talk about Earth like it's doomed, like we need to get off of this planet.
Yeah, like this is this is our home.
It's a remarkable place, and there is nowhere that we could get to in the Solar System.
There's nowhere else in the Solar System that's remotely as hospitable as the Earth.
Speaker 1I also think that they want more.
They want their own land, they want their own countries, they want they want to escape governance.
Speaker 2Yeah, yeah, they see, they see space as an escape from politics, because they're, like, living a libertarian wet dream, which.
Speaker 1Is really funny.
Speaker 2Because when they get that, they'll immediately do fascism.
Speaker 1Yeah, that's what's on the agenda.
Speaker 2Seconded, you cannot run away from politics. The minute you have more than one person in a room, there's politics.
It's just really sad.
Speaker 1And I actually think on a grander scale, they don't have an appreciation for tech.
I was just writing something last night.
It was on the way to New York where it was like the actual state of technology is kind of fucking amazing.
Yeah, like we can message, I could message you, you happen to be in town, you message me on Bluesky hundreds of miles, thousands of miles away.
I was like, I'm able to write a note that was on my computer, that's on my iPad here.
I know that this sounds like boosterism, but it really isn't.
We have the raw tools that are just fucking incredible.
Yeah, and these people do not appreciate them.
They don't appreciate them, which is why generative AI is so fucking ugly, because it's bad technology.
It's not even good technology.
It's poorly run, inefficient, endlessly expensive, and directionless.
Speaker 2Yeah, and it inflicts harms on users that like we would not accept from anything that was not subjected to such an enormous hype cycle.
Speaker 1Right literally nothing, yeah, nothing, No, if ten.
Speaker 2Years ago you you know, took any you know person off the street and said, hey, there's this cool new technology.
It takes up enormous amounts of electricity.
It can do things.
It can do things that you know, no other piece of technology you've ever seen can do.
Also, it's very good at talking teenagers into killing themselves.
Yes, should we release it into the you know, wider.
Speaker 1World? And they'll say, well, no, but can it do anything else? And you, of course, would say yeah.
And you, of course would say yeah.
It can sometimes write code, yeah exactly.
And sometimes it also gets things horribly wrong and.
Speaker 2It writes bad prose.
Yeah like, and it just kind of makes everything feel kind of mediocre and smeared out, yes exactly, like, yeah exactly, and it makes some people go crazy.
Speaker 1And yeah, it drives people to actually, yeah, how do you feel about that?
Like, how do you like did you see this coming?
Because this really jumped out No.
Speaker 2Yeah, no, no, this surprised me.
This did surprise the hell out of me because I think that you know, these machines, I don't even like calling them AI right, because I think that's a marketing term.
It is.
Yeah, like, if you go back in time to, you know, nineteen ninety and tell me when I'm a kid, hey, I have a little device in my pocket that lets me talk to an AI, then, you know, I would have thought, oh, like, that lets me talk to, you know, like, Commander Data from Star Trek, exciting. And instead it's this, and I would have been like, what the hell is that?
No, AI is this marketing term.
It's, it's a text generation engine.
It produces, you know, homogenized thought-like product.
I was.
Speaker 1Also, I'm in the midst of a long one as usual.
It also conflates doing stuff with outputs.
I know that sounds kind of flat, but it's like that everything is a unit of work rather than actually creating stuff.
Well that you pay an you pay a person for their experience too.
And it's just also not very good at stuff, which is pissing me off.
No, really, it's really bad at stuff.
Speaker 2It is.
And I think, and I think that's where you know, this sort of driving people insane is coming from, right, Like I I like what I missed.
The reason I think I didn't see that coming is I failed to think about how, like, I knew that these things just generate text and in a lot of ways they just sort of spit back to you what you put in, right, right, which is, which is an old thing with chatbots that goes way before the LLMs, goes all the way back to Eliza, right.
Speaker 1Oh yeah, and that was the first the first AI computer.
Speaker 2Yeah, the first chatbot.
I wouldn't even call Eliza AI, right, And even.
Speaker 1The creator of Eliza, god huz.
Yeah, this is from Karen Howe's Empire of AI.
Great bit about it in there.
Speaker 2Yeah, exactly, Yeah, no, no, no, Eliza.
Eliza was just like a one hundred or so lines of code that you know, you'd say I'm having a bad day, and it would say, oh, I'm sorry to hear that.
Yeah, why are you having a bad day?
Right?
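For anyone who has never seen one, here is a minimal sketch in Python of the kind of pattern-matching reflection trick being described; the rules and responses are illustrative assumptions, not Weizenbaum's original script.

```python
import re

# A toy Eliza-style exchange: the program has no understanding, it just matches
# a pattern in what you typed and mirrors your own words back as a question.
RULES = [
    (r"i'm having (.*)", "I'm sorry to hear that. Why are you having {}?"),
    (r"i am (.*)",       "How long have you been {}?"),
    (r"i feel (.*)",     "Why do you feel {}?"),
    (r"(.*)",            "Tell me more about that."),  # fallback
]

def respond(user_input: str) -> str:
    text = user_input.lower().strip().rstrip(".!?")
    for pattern, template in RULES:
        match = re.match(pattern, text)
        if match:
            return template.format(*match.groups())
    return "Tell me more about that."

print(respond("I'm having a bad day."))
# -> I'm sorry to hear that. Why are you having a bad day?
```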
Speaker 1But like the gassing-up engine, the, yeah.
Speaker 2Like that's the thing I didn't think about.
Oh wait, if it just repeats back what you put in, but it does it in a way that's compelling and convincing to some people, that's going to just get them sort of caught in this, like, dopamine self-validation loop, and that could drive them off the edge.
Speaker 1And I think that there is a condescension.
I judge myself for this, where I was like, oh, this doesn't fool me, and it's like, but the harm also, I'm very, I'm blessed to have tons of people who love me who also give me clear feedback, which is not just what I want to hear.
But I definitely, when I was younger and very depressed, like, craved validation and craved someone to just tell me what I want to hear, and I definitely never thought, what if someone did, and the actual danger of having every fucking thought validated, and also just the sheer horrors. Like, Matt Hughes, my editor, just did a great story about this, kind of a horrible story, where he simulated someone going through a mental health episode, and Claude was very clear to go, yeah, man, you don't seem so good, and ChatGPT.
It's like, no, everyone is out to get you, man.
Yeah, actually, if any other tech in the world did this.
You'd shut the shit down immediately.
Yeah, exactly, close it, you know, But where's fucking Eliezer on this bullshit?
Speaker 2Because this feels like.
Speaker 1If you write a book about how everyone dies, this should be the thing that, if he actually believed in anything, he should be out saying, like, hey, look, this is what I was talking about.
Speaker 2Oh, I mean I I do think that he thinks this is like an incipient version of what he's talking about.
I think, like a baby version.
I think he loves it.
I think I think it helps him out well.
I think that he finds that useful for making the argument that he made exactly.
But he is not again the argument he's making.
And this is the only nice thing I'll say about him.
He means it, seriously, He's not a grifter.
He's three anxiety disorders in a trench coat.
Speaker 1Damn, you put that on the fucking book jacket, stupid ass.
I can't think of any other movement in tech ever, yeah, that is anything like this, specifically because of how much it sucks. Like, I can't think of any, maybe the metaverse and crypto, but even then, I don't like the comparison.
Yeah, they're so much smaller.
Speaker 2Honestly, what I keep thinking about is is that in a way, it is taking the daily experience of the tech billionaires and like bringing it to the masses, right, because what is it?
Speaker 1Like?
Speaker 2Yeah, what is it like to be Sam Altman?
Speaker 1Right?
Speaker 2You've got billions of dollars and you're surrounded by people who will never tell you no and validate your every thought.
Speaker 1And they'll convince you that you understand every subject exactly.
Speaker 2Yeah, and now, well.
Speaker 1I've been saying this as well, because if you're an executive, a machine that can write emails, read them, and otherwise you go to lunch, it's kind of magic.
Yeah, but no, I like this idea that it's the extension as well, just this completely, like, separate thing that just says, yeah, that's completely right, man, I fully agree.
Speaker 2Right, and so of course they don't see the harm, because that's their entire goddamn life, that's all that happens, and so, well, but if this were bad for people, that would mean that, you know, I'm in a bad environment.
That's unhealthy for me, and like, yeah, actually it is, but they don't think they don't think that.
But I genuinely believe that, like, the best thing for the tech billionaires themselves that could happen to them would be to lose all their money, it would be the best thing for their mental health, for many of them.
Speaker 1Yeah, no, let me get them on the show.
I think that I could have a great chat with any of them.
Sure, just because I went to a private school.
Like a lot of these American billionaires as well, they would get destroyed by the average scum aristocrats of old in England, like the real blood drinkers.
Speaker 2They're amateur vampires.
Speaker 1No they really are though, like like they It's the classic thing why British colonialism and American colonialism have never matched up, because Britain was just evil.
They just fucking murdered people and destroyed communities.
And they're like, why are we doing this? Of course we're British.
Of course we're British.
This is what we do here.
What do you mean, what's a moral?
I've not heard of that.
Speaker 2No, what do you mean?
Speaker 1No?
No, send my send my eighth cousin to to Africa, shoot whoever you see.
Like that was the horrifying stuff.
But they knew they didn't care about what people think.
I still think the billionaires care.
Speaker 2Oh, they definitely do.
Like this is this is the thing that which.
Speaker 1Is insane to me.
If I had had one billion dollars, I would no longer care.
Speaker 2Well, if I had a billion dollars, I would just try to make sure that nobody knew my name and that you know, it's the same amount.
Speaker 1Yeah, well I would be posting.
Speaker 2No, I think I think I'd just be done.
Man, I'd be like, okay, cool, you know, I'm gonna donate to a bunch of causes that matter to me, and, you know, like, I still think it's bad for them to be billionaires, and I'd try to change that, but also, like, I'm just gonna, like, you know, hang out in a nice house with my friends and have a good time.
Speaker 1That's the problem though they don't have those.
Speaker 2Well, yeah, because you've heard that.
Speaker 1Have you ever heard the really depressing story about Elon Musk and this guy, there was this investor who got really cooked by COVID, Peter something, and he told this story of going over to Elon Musk's house, and there was a decanter of wine, and Elon Musk picked up the wine before it was done decanting, and then said something along the lines of, honey badger don't care.
And I just want to say, that's one of the saddest fucking things I've heard in my life, just absolutely, just unfathomably depressing.
Because you can get things like a Coravin that can kind of aerate it.
There are various ways around aeration.
If you're really feeling it and you have hundreds of billions, or however many dollars Elon has liquid, you could just have someone whose job is to make sure the wine is aerated.
They could make two hundred and fifty thousand dollars a year.
It wouldn't matter to you.
That's what you lose in the couch.
Speaker 2Yeah, but I think that what matters to Elon is not doing what he's supposed to do, right.
Yeah, he can be seen as cool.
Speaker 1Or just drinking as quickly well, because otherwise he might feel something.
Speaker 2Yeah, I mean, I just he desperately wants to be liked and it's never gonna happen.
Speaker 1It's so funny as well, because it could be so easy for him.
I know, he could just post this lunch every day and nothing else.
Then would be like, look at this is actually what pisses me off as well, though, because people like el must fucking sucks.
It's like he was sending people off to Aaron Bieber, a science reporter.
Speaker 2In twenty one's a friend of mine.
Speaker 1Yeah, that's p and like when that happened, not a single Kara Swisher didn't say shit, then here Kevin Rouse Case and you and on these fucking people I thought necessary, but now.
Speaker 2Like must have such a bad guy.
He's such a bad No, he's always been like.
Speaker 1And also he called a guy a pedophile for saving children yep, because he wasn't allowed to send his submarine.
Speaker 2No, he's never been good like this is this is the thing that I.
Speaker 1Don't think any of these people enjoy anything as much as I enjoyed diet coke.
Like I'm one hundred percent sure of that, because I love these things.
Like it's if this kills me, if this shit's meant to like in three years, they're like, it's rat blood, Like I'm like, I will.
Speaker 2Keep drinking. Better Offline, brought to you by Diet Coke.
Speaker 1It's rat blood.
I really hope they sponsor the show at one point, so that's the commercial.
But that's the thing, Like I'm only kind of joking because it's I really enjoyed that coke.
I'm sitting down chat and chew with my friends.
I love watching football with chat with my friends.
Like it's like there are very basic things I enjoy.
What do these people Like?
These people just must walk around in this haze of anger, yeah, or like emptiness.
Speaker 2I think they're they're really cut off from their own emotions, right and like and again, that's gonna happen if you just constantly get validation, right, you know.
One of one of the many tweets from back when Twitter was less shitty, before Musk bought it.
There are many tweets that just like live rent free in my head, and one of them is about the cognitive impact of being a billionaire.
Yeah, it ends.
It's like, you know, like everything around you is really expensive.
It's just a constant, every chair is fifty grand. Yeah, in terms of the cognitive impact, it must be, you know, roughly equivalent to being kicked in the head by a horse every day, exactly.
Yeah.
Speaker 1I think I'd be fine, but that's my pathology, I guess, but it no, but it's they have this weird isolated thing.
And even Benioff, who used to seem okay.
Speaker 2Well, I mean he his whole game was like to be the best of the billionaires.
Speaker 1Which, low bar, and then he was just like, ah, fuck it, yep, just fuck it.
I don't give a shit anymore.
Yeah, Agentforce, it doesn't sell to anyone.
No one likes it, but it's the future, Agentforce, Jesus, he's.
Speaker 2Donated to all.
I know, it's so cool.
Speaker 1It must be really cool being a guy who actually has qualifications, and comes from proving things, to watch the world, like, all these guys being like, yeah, this is the future, and it just doesn't work.
Speaker 2No one likes it.
Yeah, I mean cool is one word.
Incredibly frustrating is another.
Speaker 1Right, this is this is stymying real innovation.
Speaker 2Yeah, yeah, I know there's opportunity costs and and also just like actual stifling of real innovation in the effort to achieve even possible ends that would be bad even if we could achieve them.
Speaker 1So slight directional shift.
Is there anything within like science and tech innovation that you're actually excited about?
Anything you look at and like that's fucking cool.
Speaker 2I mean, mRNA vaccines are the first thing that comes to mind, exactly.
Yeah, they're really awesome.
Speaker 1Tell them.
Speaker 2I mean, like, look, you know the fact and what is.
Speaker 1An mRNA vaccine, said flawlessly.
Speaker 2Yeah, an mRNA, now I'm able.
Yeah, an mRNA vaccine.
Speaker 2Yeah, it's the kind of thing that, you know, we have with the COVID vaccines, right. Basically, the thing that's so exciting about them is that they are so much easier and faster to synthesize, right, than previous vaccines.
Speaker 2You know, I think the previous record before the COVID vaccine, for, you know, how long it took to develop a safe, widely deployed vaccine, was something like five to ten years, Jesus. And then this vaccine.
Most of the time delay, most of that year that we were waiting for the vaccine was actually a little less than a year.
Most of that was testing.
The actual time that it took to synthesize the damn thing was I believe on the order of weeks.
Speaker 1And what's crazy is, I believe that was venture backed, right? Yeah, some of it was venture backed, which is like, see, venture capital can be useful.
Speaker 2Yeah, it can be, but once Yeah, some of it was venture backed.
Some of it was backed by you know, NIH grants.
We do need those, yeah, we surefucking do.
Speaker 1No.
Speaker 2Government funding of basic research is important and not just because it leads to amazing technological breakthroughs like mRNA vaccines, but also because, like basic scientific research is an important thing for humans to do, like the same way that art is important, right, right, but it also does enable a massive scientific and technological breakthroughs.
And I you know, there's promise for MR and A vaccines to like open up a whole new class of vaccines that you know, for things that were previously very hard to vaccinate against.
I am not an expert in the field, but, like, everyone I know who works in biomedicine, they're all very excited about this, and they're all really depressed by the fact that, you know, we have an anti-vaxxer who sounds like a fork that got stuck in a fucking garbage disposal as the Health.
Speaker 1And Rise from your Grave guy from that one video.
Yeah, yeah, very depressing.
I I just wish we were like green energy as well.
Feels like an oh.
Speaker 2Yeah, green energy was the next thing I was gonna say, Yeah, batteries, solar panels.
It's incredible.
Speaker 1Uh, this opportunity is there.
Yeah, it's we need to innovate, Like we are innovating.
Speaker 2But like, yeah, and also like we even had the legislation that we needed, right or some of it right, like the you know Biden's big bill, the Build Back Better It uh, you know, was not a perfect bill, but it was the best environmental bill in American history.
Yeah, and and now you know, it's being destroyed because we have a government in this country that that you know, does not believe in climate change and doesn't believe in anything other than short term profits at the expense of everybody else, and also doesn't believe in democracy.
Speaker 1That feels, that feels like a big problem though, the growth-at-all-costs thing, yeah.
Speaker 2I mean, that's that's the thing.
And that's why my book has the actual title that it does, rather than going with the title these fucking people.
Speaker 1Yeah, so These Fucking People, or just The Bastards.
I think it is also very good too.
Speaker 2I think so too.
I mean, like there's ah, it would be so easy for them to do better, No, like you know, to forget forget the best thing that they could do.
They're doing some of the worst things that they could do.
Doing better than they are right now is just an incredibly low bar.
Speaker 1Even through like very poorly guided generosity, they could very easily.
Yeah, they could just fund media outlets versus whatever it is they're doing to them, tearing them down.
Speaker 2But that would mean, you know the possibility of losing control and losing you know, losing some of their power and money, and they just are not willing to do that because they've got something broken in their hearts.
Speaker 1We need to heal them.
No, now I think that, I think.
Speaker 2I think we need to tax their money away.
Speaker 1I think that too, but I think we actually true.
My truth here is that we need to change how we do that though.
We need to start doing executive liability hmmm.
We need to make it so if, like, CrowdStrike happens again, like, a bunch of people potentially die in the NHS system because the computers shut down, that Satya Nadella can lose something, because it isn't enough to fine the companies.
Fining the company is not going to do shit unless you do scaling, percentage-of-revenue fines.
This, and more, when I become the FTC.
Speaker 2No, they I'm not kind of let me.
Speaker 1But it's just I feel the one of the wonderful things of having you on is you're able to come at this from a science communicator perspective.
You're actually able to because it's not just about what these people want.
It's the practicality of it, which is that nothing's really happening, yep.
But that's the actual weirdest thing about the real nihilism of this is that nothing seems to actually be occurring.
Speaker 2Yeah, and they also act like there's not going to be any accountability for like, forget their actions, just even their words.
Right, you know, Sam Altman says—you know, this thing that just drove me up a wall, that he said about a month ago—he said that, you know, in ten years, college graduates are going to have really cool jobs going out to explore the solar system in spaceships enabled by AI.
That is not happening. Like, on the list of things that are not happening: exploration.
No, No, that's not happening.
He is just wrong.
He's lying, right, and he is probably still going to be alive in ten years, and you and I are also likely still going to be alive in ten years.
And then we're gonna say, hey, remember when he said that? You know, now we can show he's just wrong, and nothing's gonna happen to him.
Speaker 1And what—this is why I'm so harsh on media criticism as well.
Yeah, the one thing you can do is at least say area man full of shit.
Yes, stupid bastard wanks off again.
Speaker 2Well, this is my this is what I attempted to do.
What you do?
Speaker 1Well, yeah, and it's—I think that the change that we need in our hearts is to just regularly say this stuff.
I regularly say on the show.
I don't care if you quote me, just say this shit about them.
Yeah, Clammy Sammy.
He's been promising.
He said that this was the year of agents.
He said that.
I don't—but now I read on The Information dot com that next year is the year of agents, so maybe I do.
Actually, here's a question for you.
Yeah, AI twenty twenty seven.
Did you read that?
Speaker 2I've read a little bit of it.
It's nonsense.
It's nonsense.
Yeah.
Speaker 1Why do you think things like that fool so many people?
Speaker 2Though?
Speaker 1Why do you think it got the media coverage it did?
Speaker 2See—I mean, part of it is bad journalism, right.
Part of it is that Kevin Roose has confused reiterating the views of the wealthy, influential, and powerful with taking a brave contrarian stance.
And how he made that mistake, I don't know, but, you know, I really get the sense that he thinks he's being very brave when he's doing exactly what journalists are not supposed to do, which is just uncritically parroting the powerful.
Speaker 1And it feels like it's the large language model again.
It's just the affirming thing.
It's like, oh, I'm being contrarian by stepping out against these people who say it isn't making any money and isn't very good at stuff.
Speaker 2And it's like, look, buddy, if there's, you know, two sides to a debate—I mean, obviously there's more than two sides—but if on one side you have the wealthiest people in the world, and on the other side you have people who say mean things about you personally online, and you think that, you know, it's the first side that's the contrarian underdog—
Yeah, something is wrong with your brain, and that's the thing.
Speaker 1But this is—and this, I think, is a weird thing in our society—that people just trust the rich, and the media has got to a point where they've just bred out the real cynicism. Because I swear, like, ten to fifteen years ago, you used to have some tech journalism. Like, I read a thing about Amazon Web Services that Kevin Roose wrote, and it was actually pretty cynical about it.
Really, yeah, it was actually pretty critical.
He then—I feel bad for him, because this one can't entirely be his fault.
He basically said at the end, Yeah, they'll never be profitable.
No, no, no, it gets worse: a month later, Amazon announced that AWS was profitable for the first time.
Just like Buddy miss the bean.
Speaker 2Come on, whoa, Yeah, maybe that's the origin story.
Maybe he was like, oh wow, I screwed that up.
Speaker 1I guess—I believe the origin story is actually social media.
I think I think he felt.
I actually think a lot of journalists think that they missed the boat on social media.
I have been in media relations since two thousand and eight.
I have read and this sounds insane, yeah, but it's true.
I think I've read just about everybody's work since then within the tech media, at least, including Kevin's, and he has always had a little bit of anxiety that he missed social media.
Nobody missed social media, not a single fucking one since two thousand and eight.
Everybody was on Zuckerberg's jock.
They sucked him off hardcore.
Sorry, sorry, but nevertheless, they were on top of this, they wrote about it—social media was written about immediately.
If anything, I think the media was a little slow to get on apps.
Then they went in hardcore.
I know the history of this shit.
I have been taking detailed notes.
They sound crazy, but I think there's just this weird thing of, like, the powerful would never lie to us.
And then PRISM came out, and then Cambridge Analytica happened, and people are like, maybe Zuckerberg's bad, but he wouldn't lie to us.
He would! And they're like, well, they know things we don't know.
And that's actually another—that's my favorite AI thing: when it's like, the secret things they're working on. Secret things, secret things, sitting there waiting in the wings, you'll never believe what's coming.
And it's just—I actually think it's just, what's his name, Ilya Sutskever, just goes to bars on occasion.
He's like, you'll never guess what I've seen—you'll never get it.
Speaker 2Okay, okay, no, I have a thing to say about—this is me just being petty and making a point that other people have made, but yeah, of course.
Speaker 1No.
Speaker 2When he announced that he was, you know, putting together a team to just go straight for safe superintelligence.
He meant to say when he posted this on social media, that he was putting together a crack team, but that's not what he wrote.
What did he write? He wrote that he was putting together a cracked team.
Hmm—crack, E-D, cracked. And, like, yeah, actually, you know what? Yeah, I think that's true.
Speaker 1I agree, yeah, exactly.
I also think him and Mira Murati—I can't.
And so, for the listeners: Ilya Sutskever, one of the co-founders of OpenAI, raised two billion dollars at a thirty-billion-dollar valuation.
Mira Murati did a billion, some amount, some bullshit.
Neither of these companies have told their investors how they will spend the money, or what on or what they will build.
And you may think I'm being facetious.
Mira Murati literally said to investors, I will not tell you—and she has board rights where she can veto everyone. I will be honest: go for it.
Fuck yeah. I think, at this stage, if these people are so fucking stupid, that you're just like: I promise you literally nothing.
I won't give you hogs a single oink. You're not gonna get anything from me.
Give me money now, fuck yeah, go for it.
But on the other hand, I cannot wait for the investigation.
Speaker 2I just hope there is one.
Right.
These people are acting with impunity. And also, like—again, accountability just for their words.
Right.
The most basic criticism of the wealthy in the media—like, this is to shift just a little bit—
Eric Schmidt, right, said about a year ago, shortly before the election—he said, we're never going to meet our climate goals anyway, so we might as well just burn as much carbon and use as many resources as possible to get to AGI, and then that will solve climate change for us. Which is ridiculous, because we don't—
Speaker 1So cool. Yeah, yeah. Like, we have no responsibility for our actions until we hand them off to someone else.
Speaker 2Right. So he said this, which is ridiculous for lots of reasons, and echoes stuff that Sam Altman has said, right. It's ridiculous, among other reasons, because, like, AGI is not a thing, and also because we don't need—like, we know what we need to do to solve global warming, right? We know what we need to do to solve the climate crisis; it's just a matter of actually getting everybody—
Speaker 1You don't need AI to do it—but, hey, Sam Altman said that they know what they need to do to get to AGI, and then said a few months later that AGI wasn't a useful term.
Speaker 2QED. Well, no, this is exactly what I was about to say, right.
He repeats this claim about like just pushing as hard as we can about a month after the last time he said it, maybe two months.
Only a few weeks ago, he comes out in the New York Times with this op-ed saying, oh, AGI is not really a thing and we shouldn't care about it.
It's like, buddy, you were saying just a few weeks ago that this was gonna save the world from the biggest emergency of our time, and now you're saying it's not a thing.
Do you think we're all stupid?
Are you that stupid?
What the fuck is going on?
Speaker 1I can actually tell you.
I think he thinks the media is that stupid and will write and publish anything.
Speaker 2He says.
I mean, to be fair, the New York Times published it.
Speaker 1I wasn't—yeah, I'll be honest.
That's the least of it—we've got fucking Ezra Klein being like, hey guys. The fucking—Ezra. Ezra?
What a peculiar fella, what a peculiar fella Ezra Klein is.
What's going on there?
You ever run into Mister Klein?
Speaker 2I've never met him directly—I know people who know him—but no, I think he just hung out with too many tech billionaires while he was living in the Bay Area.
He's this fucking mind poison.
These people are boring.
Yeah, these people are bored.
Speaker 1You sit down.
Speaker 2They have amounts of money.
And, like, if you're someone who's never been cool—and they have never been cool, and I've also never been cool—love it, don't I know.
But like, if you've never been cool, one of two things happens to you as you grow up.
Either you desperately want to be cool and that can, you know, go wrong in many different ways, like Musk and possibly like Klein.
I don't know, maybe like I'm willing to believe that something else is going on with him.
I don't know.
But—or you become like us and you stop giving a shit, yeah, and you accept: oh, I'm just permanently uncool.
Whatever, I'll make my way through life.
Speaker 1And that's the thing.
It's these people are just disconnected from humanity.
None of these people seem to have friends or loved ones, because—there's just—if I did any of this whacked-out shit, I would get texts from Casey, or any number of people who love me, just like: hey, man, are you okay?
You sound insane.
No, Casey would definitely not be just—
But hey, yeah, what the fuck? You okay?
That doesn't make any sense.
Speaker 2What do you mean?
Speaker 1What do you mean a Dyson sphere?
Do you know what that is?
A Dyson sphere.
It's just they don't have friends, and I don't know if they want them.
I think it could require a certain level of vulnerability.
Have you talked to—have you met up with any of them?
Speaker 2With the billionaires, no, I tried.
There's a list at the end of my book of all of the tech billionaires I tried to interview and they all said no.
The only one who I successfully interviewed was, like, a lower-tier billionaire guy named Jaan Tallinn, who's in deep with the effective altruists—he did Skype and Kazaa.
Yeah yeah that's right, christ Yeah, yeah, yeah.
Actually, can't hate him for those—those are two pretty good ones.
Speaker 1Yeah, exactly—though I will say, Skype is definitely one of those inventions where I've never seen something just stop.
Yeah, no, Skype just got, like—no, just, no, I mean, trapped in amber.
It was the same product for fifty—oh yeah.
And Microsoft are just like what happened to Boxer in Animal Farm—bang, it's off to the glue factory with Skype.
We fucked this up well enough. It's just—it's also sad.
But this has been such a wonderful conversation.
Oh where can people find you?
Speaker 2Well, I'm on Bluesky, because I don't want to be on a platform like X that's filled with Nazis now, so Bluesky is the best place to find me.
I'm Adam Becker dot bsky dot social.
And yeah, I got a book, is the main thing.
Yeah, I got a book called More Everything Forever—link to it in the notes.
Yeah, it is available wherever fine books are sold.
And if you liked what I had to say on this episode, I think you'll like the book.
Speaker 1And if you like what I have to say on this show, you're a sick puppy, you know where to find me.
Thank you so much for your time as ever, I love you all.
Thank you to Beahet, of course, here in New York City for producing this episode, and of course to Mattosowski, the wonderful producer at home.
I will catch you with them monologue in a few days.
Thank you so much, Thank you for listening to Better Offline.
The editor and composer of the Better Offline theme song is Mattosowski.
You can check out more of his music and audio projects at Mattosowski dot com—that's M A T T O S O W S K I dot com.
You can email me at E Z at Better Offline dot com, or visit Better Offline dot com to find more podcast links and, of course, my newsletter.
I also really recommend you go to chat dot Where's Your Ed dot at to visit the Discord, and go to r slash Better Offline to check out our Reddit.
Thank you so much for listening.
Better Offline is a production of cool Zone Media.
For more from cool Zone Media, visit our website cool Zonemedia dot com, or check us out on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.