Episode Transcript
1
00:00:00,000 --> 00:00:02,880
These people are doing it because they're too stupid
2
00:00:02,880 --> 00:00:05,040
to realize that what they're in is kind of a scam.
3
00:00:05,190 --> 00:00:07,680
I think that they're scamming in the sense that they're selling something
4
00:00:07,680 --> 00:00:10,500
they don't really understand, but they're not like, I think this will die.
5
00:00:10,740 --> 00:00:12,630
I think that they think this will go forever.
6
00:00:12,690 --> 00:00:16,290
Kind of like crypto and when I, this is the other thing, people
7
00:00:16,290 --> 00:00:18,420
with my work, they're like, oh, you started like a year ago.
8
00:00:19,320 --> 00:00:20,490
I was writing in 2020.
9
00:00:20,580 --> 00:00:22,140
I was on crypto before
10
00:00:22,140 --> 00:00:22,860
everyone.
11
00:00:27,780 --> 00:00:29,640
Welcome to Screaming in the Cloud.
12
00:00:29,730 --> 00:00:33,690
I'm Cory Quinn, and I've been looking forward to this episode for a while.
13
00:00:33,990 --> 00:00:38,370
Ed Zitron is the host of Better Offline, the writer of the Where's Your
14
00:00:38,370 --> 00:00:42,990
Ed At newsletter, and has often been referred to as an AI skeptic.
15
00:00:43,170 --> 00:00:47,760
I tend to view him as much more of an AI realist, but we'll get into that.
16
00:00:48,030 --> 00:00:49,770
Ed, thank you for joining me.
17
00:00:50,160 --> 00:00:51,060
Thank you for having me.
18
00:00:53,265 --> 00:00:57,585
So you have been very vocal about AI and I, I wanna say your
19
00:00:57,585 --> 00:01:01,155
skepticism of it, but that feels like it's an unfair characterization.
20
00:01:01,155 --> 00:01:03,615
I think you're, you're approaching this from the perspective
21
00:01:03,615 --> 00:01:06,465
of you are trying to sell me something, I'm not going to buy
22
00:01:06,465 --> 00:01:10,155
it without examination, and the rest sort of flows from there.
23
00:01:10,785 --> 00:01:14,505
Yes, and that's the thing right now, and I've said this
24
00:01:14,505 --> 00:01:17,535
a few times in a few interviews, I am being and have been
25
00:01:17,535 --> 00:01:20,835
characterized as a radical because I'm like, Hey, this company
26
00:01:20,835 --> 00:01:23,685
loses billions of dollars and has no path to profitability.
27
00:01:23,685 --> 00:01:28,365
Or, Hey, none of these companies appear to be making much revenue, let alone
28
00:01:28,365 --> 00:01:32,265
profit, but they're spending so much money on this, is that sustainable?
29
00:01:32,265 --> 00:01:33,914
And people are like, Ed, you are crazy.
30
00:01:34,365 --> 00:01:35,475
You are crazy.
31
00:01:35,625 --> 00:01:39,345
I, irrational person, believe that these companies will
32
00:01:39,345 --> 00:01:42,045
simply work it out in some way that I cannot describe.
33
00:01:42,645 --> 00:01:43,785
I have blind faith.
34
00:01:43,845 --> 00:01:45,165
You are the radical for not
35
00:01:45,255 --> 00:01:45,645
Right?
36
00:01:45,645 --> 00:01:47,985
Strangely, when I try and pitch people on things with blind
37
00:01:47,985 --> 00:01:50,175
faith in them, they, they tend to ask a whole bunch of
38
00:01:50,175 --> 00:01:52,335
uncomfortable questions that I'm not prepared to answer.
39
00:01:52,335 --> 00:01:54,315
But nowadays with AI, when you ask those
40
00:01:54,315 --> 00:01:56,205
questions, you're treated like you are the problem.
41
00:01:56,295 --> 00:01:56,805
Exactly.
42
00:01:56,805 --> 00:02:01,305
You, it's like a gifted child that you have to, it's what is that, um, Twilight
43
00:02:01,305 --> 00:02:04,485
Zone episode with the child that can dream, thinks of something.
44
00:02:04,635 --> 00:02:06,795
It's just everyone wants to coddle AI in case it
45
00:02:06,795 --> 00:02:09,285
hurts them or in case it breaks the narrative.
46
00:02:09,525 --> 00:02:14,475
It's just, it's very frustrating because disagreeing with me is one thing.
47
00:02:14,505 --> 00:02:18,315
I do not have trouble with that unless the person is just like, their
48
00:02:18,315 --> 00:02:23,595
argument is no, because that is most of the response to my work is just nah.
49
00:02:24,315 --> 00:02:24,945
No, it's not.
50
00:02:25,305 --> 00:02:26,955
Well, I'm trying to sell something and if what you're
51
00:02:26,955 --> 00:02:29,445
saying is true, that impedes my ability to do that.
52
00:02:29,445 --> 00:02:30,525
So shut up, please.
53
00:02:30,825 --> 00:02:33,675
Yeah, could you, but even then, it's like it goes beyond that.
54
00:02:33,675 --> 00:02:35,355
'cause you've got tons of journalists as
55
00:02:35,355 --> 00:02:37,305
well who are just like, no, it's not true.
56
00:02:37,305 --> 00:02:38,115
They're gonna work it out.
57
00:02:38,115 --> 00:02:39,375
This is all part of the plan.
58
00:02:39,495 --> 00:02:40,425
What's the plan?
59
00:02:40,425 --> 00:02:41,715
Can you tell me the rest of it?
60
00:02:41,715 --> 00:02:45,885
Because it doesn't seem like a good plan unless the plan is raise as much
61
00:02:45,885 --> 00:02:50,565
money and get as much awareness as possible, which they've succeeded at.
62
00:02:50,595 --> 00:02:51,555
They've not made any money.
63
00:02:51,555 --> 00:02:53,715
In fact, they're not making much money at all.
64
00:02:53,715 --> 00:02:55,155
No one is making any money.
65
00:02:55,245 --> 00:02:59,085
I think there's like two profitable companies within this space.
66
00:02:59,595 --> 00:03:01,065
Maybe, maybe three.
67
00:03:01,740 --> 00:03:06,240
If we go back in time and we look at the standard generative AI chatbot or
68
00:03:06,240 --> 00:03:09,960
coding assistants and you offer it to someone in a vacuum, they would spend
69
00:03:09,960 --> 00:03:13,260
thousands of dollars a month for this based upon the perceived utility.
70
00:03:13,260 --> 00:03:16,800
Not everyone, but enough people would. Now, though, that has
71
00:03:16,800 --> 00:03:21,150
been anchored at about $20 a month per user; that is the perception.
72
00:03:21,270 --> 00:03:24,540
So I don't see a path to suddenly multiplying each price by
73
00:03:24,540 --> 00:03:28,200
a hundred to wind up generating some sort of economic return.
74
00:03:28,350 --> 00:03:31,410
They have anchored their pricing at a point where I don't see a
75
00:03:31,410 --> 00:03:35,190
path to significant widespread industry defining profitability.
76
00:03:35,400 --> 00:03:38,370
I'm gonna be honest, I don't think they would've paid thousands of dollars.
77
00:03:38,370 --> 00:03:40,440
You had code autocomplete things.
78
00:03:40,440 --> 00:03:45,000
I talked to a client about this before COVID, like, I remember, I was in a WeWork,
79
00:03:46,290 --> 00:03:48,570
and they were telling me, yeah, it was kind of like autocomplete for coding.
80
00:03:48,570 --> 00:03:50,010
I'm like, oh, seems useful.
81
00:03:50,010 --> 00:03:52,380
I bet people would like that. They, I think
82
00:03:52,380 --> 00:03:54,150
they sold part of it to Microsoft or something.
83
00:03:54,150 --> 00:03:54,930
I, I forget.
84
00:03:55,260 --> 00:03:56,940
But also, if they would've paid thousands
85
00:03:56,940 --> 00:03:58,500
for it, wouldn't they have paid thousands of
86
00:03:58,875 --> 00:04:00,135
dollars for it already?
87
00:04:00,279 --> 00:04:04,274
'cause GitHub Copilot basically got out there before everyone.
88
00:04:04,424 --> 00:04:07,035
Would people not have paid thousands for that?
89
00:04:07,424 --> 00:04:11,924
That's the thing, Microsoft launching this product, they are the pricing gods.
90
00:04:11,924 --> 00:04:13,515
If they thought they could get a thousand
91
00:04:13,515 --> 00:04:16,305
dollars a month per head, they would do it.
92
00:04:16,305 --> 00:04:17,265
They would do it today.
93
00:04:17,265 --> 00:04:20,385
They would've done it yesterday or several years ago, but they didn't.
94
00:04:20,445 --> 00:04:24,615
And there was a story that came out in like 2023, I think that said that, um,
95
00:04:25,034 --> 00:04:29,085
Microsoft was losing like 20 bucks a customer with GitHub Copilot as well.
96
00:04:29,085 --> 00:04:34,305
But putting that aside, I think that people overestimate, or sorry,
97
00:04:35,025 --> 00:04:38,200
underestimate how much software engineers love automating shit.
98
00:04:39,390 --> 00:04:42,030
Like there, there was the whole platform as a service era.
99
00:04:42,330 --> 00:04:45,780
Uh, the, the, I mean, with containerization, when it came
100
00:04:45,780 --> 00:04:49,350
in, I remember talking to a client at the time PR firm who was
101
00:04:49,350 --> 00:04:51,780
just like, yeah, there are people who just do containerization.
102
00:04:51,780 --> 00:04:54,099
'cause their boss said they'd heard it at a conference.
103
00:04:54,430 --> 00:04:58,830
Nick Suresh, uh, Ludicity, he did a great piece about Monte Carlo.
104
00:04:58,920 --> 00:05:01,770
I don't even know what Monte Carlo does, but he has a great
105
00:05:01,770 --> 00:05:04,620
piece about how a bunch of people were trying to get Monte Carlo.
106
00:05:04,620 --> 00:05:06,270
They were like, why am I talking about Monte Carlo?
107
00:05:06,450 --> 00:05:07,560
And no one knew what it was.
108
00:05:07,920 --> 00:05:11,790
This is the modern software era and we can't even make
109
00:05:11,790 --> 00:05:15,720
real money off of AI in this kind of specious software era.
110
00:05:15,720 --> 00:05:17,070
It's crazy to me.
111
00:05:17,070 --> 00:05:17,460
Man.
112
00:05:17,850 --> 00:05:19,680
It, the whole thing feels insane.
113
00:05:20,220 --> 00:05:22,620
Well, as you said, it's automation, and this is on some level from
114
00:05:22,620 --> 00:05:25,350
a coding assistant perspective, the purest form of automation of
115
00:05:25,350 --> 00:05:27,930
the workflow, which is copying and pasting outta Stack Overflow.
116
00:05:28,170 --> 00:05:29,310
IDEs aren't new.
117
00:05:30,150 --> 00:05:34,200
Like, I mean, it's just, I'm not saying that this is, and one, one common
118
00:05:34,200 --> 00:05:37,020
mischaracterization of my work, and frankly, I've done myself no favors.
119
00:05:37,020 --> 00:05:40,620
In my earlier stuff, I was a little unilateral, but one
120
00:05:41,070 --> 00:05:44,790
common mischaracterization is that I believe there's no value.
121
00:05:44,850 --> 00:05:47,940
No, I think this is a $50 billion total addressable market.
122
00:05:48,000 --> 00:05:52,110
I think that we are nowhere near the top, but the top is much smaller,
123
00:05:52,470 --> 00:05:56,640
and I don't think there is going to be some magical breakthrough.
124
00:05:56,700 --> 00:06:01,440
Every breakthrough that's happened is just
125
00:06:01,440 --> 00:06:04,530
kind of an incremental one since about 2023.
126
00:06:04,830 --> 00:06:07,920
Even with the launch of reasoning, remember what everyone was
127
00:06:07,920 --> 00:06:12,480
beating off about Strawberry and it comes out and nothing happens.
128
00:06:12,600 --> 00:06:13,050
Yes.
129
00:06:13,050 --> 00:06:15,150
The number of Rs in strawberry, which then all
130
00:06:15,150 --> 00:06:17,610
the models special-case, that'll solve it.
131
00:06:17,640 --> 00:06:19,770
We'll just play whack-a-mole a hundred million times.
132
00:06:19,770 --> 00:06:21,570
Every time someone finds something embarrassing.
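For reference, the ground truth in that test is trivial to check outside of a language model; a minimal sketch in Python, assuming nothing more than a plain string count:

```python
# The famous test: how many letter r's are in "strawberry"?
word = "strawberry"
r_count = word.lower().count("r")
print(f"'{word}' contains {r_count} r's")  # prints 3
```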
133
00:06:21,840 --> 00:06:22,920
It's so cool.
134
00:06:22,980 --> 00:06:25,710
It, it's so well, the strawberry one was there for a while.
135
00:06:26,039 --> 00:06:28,080
I know because every single time a new model would
136
00:06:28,080 --> 00:06:29,580
come out, I'd be like, how do you, how many Rs?
137
00:06:29,580 --> 00:06:31,020
And just the first thing I did.
138
00:06:31,020 --> 00:06:34,739
Or it'd be like, give me 50 sta, give me all the states with R in them.
139
00:06:34,890 --> 00:06:36,390
And I'd sit there and be like, woo hoo.
140
00:06:36,450 --> 00:06:37,229
They did it again.
141
00:06:37,229 --> 00:06:39,419
And then you get easy posts.
142
00:06:39,510 --> 00:06:41,340
The one I love is trying to cyberbully it
143
00:06:41,340 --> 00:06:43,799
into ranking the US presidents by absorbency.
144
00:06:43,859 --> 00:06:45,900
That that can go surprisingly well.
145
00:06:45,990 --> 00:06:51,210
It's so funny that we're like boiling lakes to do this, but it, it's
146
00:06:51,210 --> 00:06:56,789
just, it's very frustrating because I'm, I don't feel,
147
00:06:56,789 --> 00:06:59,429
I genuinely, nothing you hear is really anger.
148
00:06:59,609 --> 00:07:04,710
I'm frustrated because what I am saying is true and it will eventually,
149
00:07:04,710 --> 00:07:09,299
gravity exists and everyone's going to feel like me eventually, I think.
150
00:07:09,900 --> 00:07:13,140
And it's just like, how long do you want to do this to yourself?
151
00:07:13,140 --> 00:07:14,909
How long do you wanna debase yourself
152
00:07:15,060 --> 00:07:17,580
pretending that these children are gifted?
153
00:07:17,820 --> 00:07:18,150
Yes.
154
00:07:18,180 --> 00:07:20,700
Uh, we're, we're still trying to summon God via JSON and
155
00:07:20,700 --> 00:07:23,250
we're hoping for the best, but it's not going super well.
156
00:07:23,340 --> 00:07:25,349
Uh, you've been tracking this longer than I have.
157
00:07:25,349 --> 00:07:30,270
I, I didn't want to be tracking this, but AWS had a corporate
158
00:07:30,300 --> 00:07:33,840
inability to shut up about any, about anything touching
159
00:07:33,840 --> 00:07:36,750
AI and it was all that they were willing to talk about.
160
00:07:36,750 --> 00:07:38,520
So I was brought in kicking and screaming
161
00:07:38,580 --> 00:07:41,490
relatively late 'cause I resisted for a while.
162
00:07:41,940 --> 00:07:43,320
What have you seen economically?
163
00:07:43,320 --> 00:07:44,370
Let's talk about the numbers.
164
00:07:44,460 --> 00:07:48,659
So this recently came out, as in like a day before we spoke, the
165
00:07:48,719 --> 00:07:52,740
Micro, so Microsoft. Fun story about Microsoft and numbers.
166
00:07:53,280 --> 00:07:55,664
Two quarters ago Microsoft said they had $10
167
00:07:55,664 --> 00:07:59,789
billion of annualized recurring revenue from AI.
168
00:08:00,060 --> 00:08:04,530
Now Microsoft does not have an AI section of their earnings.
169
00:08:04,530 --> 00:08:07,920
They have Intelligent Cloud, productivity and software, and I think,
170
00:08:09,270 --> 00:08:12,599
like something, like it's something, and more intelligent
171
00:08:12,599 --> 00:08:14,760
cloud, which also speaks to the existence of a moron
172
00:08:14,760 --> 00:08:16,770
cloud somewhere, but they don't like to talk about it.
173
00:08:16,800 --> 00:08:17,070
Yeah.
174
00:08:17,130 --> 00:08:19,830
Yeah, that's, that's actually what large language models are.
175
00:08:20,099 --> 00:08:21,960
I, so they have these, they don't have these
176
00:08:21,960 --> 00:08:25,230
sections, but AI is basically all combined AI revenue.
177
00:08:25,349 --> 00:08:30,450
So last quarter, their quarterly earnings in 2024, they said $10 billion ARR.
178
00:08:30,450 --> 00:08:35,520
So monthly revenue times 12. Hey, January rolls around, proud as can be.
179
00:08:35,610 --> 00:08:39,960
Microsoft's at $13 billion of ARR in AI.
180
00:08:39,990 --> 00:08:41,039
Exciting, right?
181
00:08:41,220 --> 00:08:42,870
Next quarter, earnings come round.
182
00:08:42,870 --> 00:08:46,145
They do not update, the number doesn't update.
183
00:08:46,525 --> 00:08:47,305
Why would they do that?
184
00:08:47,535 --> 00:08:50,310
Then The Information came out a few weeks ago with a story that
185
00:08:50,310 --> 00:08:55,950
said, uh, Microsoft's projected revenue in AI was $13 billion.
186
00:08:56,040 --> 00:08:59,550
And then another story came out yesterday that said that
187
00:08:59,910 --> 00:09:02,790
10 billion, well, first of all, $10 billion of that was
188
00:09:02,790 --> 00:09:07,260
OpenAI's spend, but OpenAI is Azure's biggest customer.
189
00:09:08,505 --> 00:09:12,855
And it's very bad that that's the case because Azure, they pay a discounted
190
00:09:12,855 --> 00:09:17,415
rate on Azure and there's a decent chance that a lot of that money isn't real.
191
00:09:17,415 --> 00:09:22,545
'cause Microsoft invested 10 billion ish in 2023 with a large portion of
192
00:09:22,545 --> 00:09:25,845
that, that we don't know how large, in cloud credits, Semafor reported.
193
00:09:26,205 --> 00:09:29,685
And so we have the, we have one of the largest
194
00:09:29,685 --> 00:09:31,725
cloud providers feeding itself cardboard.
195
00:09:32,220 --> 00:09:34,320
Like it's what it, that is what's happening.
196
00:09:34,320 --> 00:09:35,820
I've spoken to multiple reporters.
197
00:09:36,030 --> 00:09:38,730
It's like Azure's old biggest customer to my understanding was Xbox.
198
00:09:38,730 --> 00:09:39,690
I'm like, okay, great.
199
00:09:39,690 --> 00:09:40,680
Congratulations.
200
00:09:40,710 --> 00:09:41,940
Uh, the Snake eats itself.
201
00:09:41,940 --> 00:09:42,990
But that makes sense.
202
00:09:43,050 --> 00:09:43,830
That makes sense.
203
00:09:43,890 --> 00:09:47,850
If you're providing, I imagine 365 is probably a very, very large Azure
204
00:09:48,060 --> 00:09:52,860
customer just because, but it isn't reported as revenue because it is cost,
205
00:09:52,860 --> 00:09:55,290
which probably has tax preferential, blah, blah, blah, blah, blah.
206
00:09:55,350 --> 00:09:56,910
They've also stuffed AI into it.
207
00:09:56,910 --> 00:09:59,070
So how much do you want to bet that that is now being
208
00:09:59,070 --> 00:10:01,560
considered AI revenue, despite the fact that it's just
209
00:10:01,560 --> 00:10:03,870
people who want to write documents for work?
210
00:10:03,930 --> 00:10:07,830
It is, but Microsoft's only making $3 billion of annualized revenue
211
00:10:07,830 --> 00:10:12,390
from all of their AI bullshit other than OpenAI's $10 billion.
212
00:10:13,170 --> 00:10:17,520
But what's crazy about the story is that apparently Microsoft invested
213
00:10:17,520 --> 00:10:21,870
in OpenAI with top executives believing that they would fail.
214
00:10:22,980 --> 00:10:24,360
This is this.
215
00:10:24,570 --> 00:10:25,860
I've talked to multiple reporters today.
216
00:10:25,860 --> 00:10:26,880
I'm trying to be like, Hey.
217
00:10:27,420 --> 00:10:28,469
This is weird.
218
00:10:28,469 --> 00:10:31,050
Have you looked? And everyone's like, yeah, yeah, maybe.
219
00:10:31,050 --> 00:10:32,520
You know, these things happen.
220
00:10:32,550 --> 00:10:34,410
You know what people do?
221
00:10:34,500 --> 00:10:35,699
No one does this.
222
00:10:35,819 --> 00:10:39,030
I have never heard in my entire history in Silicon Valley ever
223
00:10:39,300 --> 00:10:42,750
of someone investing in a company and thinking they would die.
224
00:10:43,050 --> 00:10:44,339
There's only one reason.
225
00:10:44,550 --> 00:10:47,459
These, these throw money at early-stage ideas, but they throw
226
00:10:47,459 --> 00:10:50,280
a million here, 2 million there, out of a hundred-million-dollar fund.
227
00:10:50,310 --> 00:10:52,140
But that's a slight difference.
228
00:10:52,140 --> 00:10:58,050
They are believing it might die, but kind of hedging their bets.
229
00:10:58,380 --> 00:11:00,510
Microsoft executives believing they would
230
00:11:00,510 --> 00:11:04,170
eventually fail, is a vastly different situation.
231
00:11:04,709 --> 00:11:08,730
And the reason in my opinion, is that Microsoft has access to all of
232
00:11:08,730 --> 00:11:13,860
OpenAI's research IP and they host all of their cloud infrastructure.
233
00:11:14,370 --> 00:11:16,589
Kind of feels like they can let OpenAI die and then
234
00:11:16,589 --> 00:11:19,829
take everything OpenAI has because they already own it.
235
00:11:20,220 --> 00:11:22,199
But nevertheless, putting all that aside.
236
00:11:23,820 --> 00:11:25,980
Microsoft's biggest Azure customer is a
237
00:11:25,980 --> 00:11:30,570
company that, that loses money. That is bad.
238
00:11:30,570 --> 00:11:34,320
We are, and Microsoft is the company making the most from
239
00:11:34,320 --> 00:11:38,880
AI by handing money to a company to hand it back to them,
240
00:11:39,090 --> 00:11:42,660
Google is doing a deal to serve compute to OpenAI.
241
00:11:43,440 --> 00:11:46,140
By which I mean they are hiring CoreWeave to provide the
242
00:11:46,140 --> 00:11:48,630
compute to Google, to provide the compute to OpenAI.
243
00:11:48,840 --> 00:11:50,670
This is fracking wash trading.
244
00:11:51,300 --> 00:11:53,580
It's wash trading in the cloud.
245
00:11:53,940 --> 00:11:56,460
We, this should terrify people.
246
00:11:56,520 --> 00:11:58,920
'cause when you remove OpenAI's compute, Microsoft's
247
00:11:58,920 --> 00:12:03,510
making $3 billion a year from AI off of what?
248
00:12:03,570 --> 00:12:05,700
$80 billion of pledged CapEx.
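To make the arithmetic being described concrete, here is a minimal sketch using the figures quoted in the conversation (the $13 billion of AI ARR, the roughly $10 billion of it attributed to OpenAI's Azure spend, and the roughly $80 billion of pledged CapEx); the variable names and the split are illustrative, not an official Microsoft breakdown:

```python
# Figures as quoted in the conversation (approximate, not an official breakdown)
ai_arr_total = 13e9        # Microsoft's stated AI annualized recurring revenue
openai_azure_spend = 10e9  # portion attributed to OpenAI's own Azure spend
pledged_capex = 80e9       # pledged AI capital expenditure

# ARR is just the monthly recurring figure multiplied by 12
monthly_run_rate = ai_arr_total / 12

ai_arr_excluding_openai = ai_arr_total - openai_azure_spend   # ~$3B
revenue_to_capex_ratio = ai_arr_excluding_openai / pledged_capex

print(f"Monthly AI run rate: ${monthly_run_rate / 1e9:.2f}B")
print(f"AI ARR excluding OpenAI: ${ai_arr_excluding_openai / 1e9:.0f}B")
print(f"Against pledged CapEx: {revenue_to_capex_ratio:.1%}")  # roughly 3.8%
```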
249
00:12:06,030 --> 00:12:07,470
What the frack are we doing?
250
00:12:07,950 --> 00:12:10,680
Like this is, this is not even radical stuff.
251
00:12:10,680 --> 00:12:12,780
I'm speaking objective fact.
252
00:12:13,050 --> 00:12:15,090
Like this is stuff that is real.
253
00:12:15,270 --> 00:12:17,715
But people are like, ah, he's just, he's just a pessimist.
254
00:12:17,715 --> 00:12:18,630
He's just a cynic.
255
00:12:19,080 --> 00:12:20,700
You just have to believe harder.
256
00:12:21,225 --> 00:12:24,045
I don't even know what I'm believing in at this point.
257
00:12:24,315 --> 00:12:28,245
No one can express what it is that I am not believing in beyond,
258
00:12:28,515 --> 00:12:32,475
They will work it out and model get better.
259
00:12:32,715 --> 00:12:33,885
How does model get better?
260
00:12:34,125 --> 00:12:35,025
Oh, I'm sorry.
261
00:12:35,025 --> 00:12:35,955
I can't answer that.
262
00:12:36,375 --> 00:12:37,905
The what?
263
00:12:38,535 --> 00:12:39,705
Oh, millions of people.
264
00:12:39,765 --> 00:12:40,665
A hundred million.
265
00:12:41,025 --> 00:12:43,545
It's actually really good that we're on a cloud podcast.
266
00:12:44,415 --> 00:12:45,045
Sure.
267
00:12:45,795 --> 00:12:46,605
Here's a fun fact.
268
00:12:46,605 --> 00:12:50,565
Do you ever notice that, that OpenAI never announces monthly active users?
269
00:12:50,565 --> 00:12:51,585
They only do weekly.
270
00:12:52,245 --> 00:12:52,395
Mm-hmm.
271
00:12:52,785 --> 00:12:53,265
I did.
272
00:12:53,625 --> 00:12:54,075
Yeah.
273
00:12:54,195 --> 00:12:57,345
Do you, do you, do you know why they're not doing that?
274
00:12:57,345 --> 00:12:58,275
I have a theory.
275
00:12:58,905 --> 00:13:00,615
I'm, I don't have anything concrete.
276
00:13:00,615 --> 00:13:01,815
I'm curious to hear your theory.
277
00:13:02,175 --> 00:13:07,065
So The Information reported earlier in the year that, uh, I think it was, they
278
00:13:07,065 --> 00:13:13,935
have 15.5 billion, sorry, 15.5 million paying subscribers on ChatGPT.
279
00:13:13,935 --> 00:13:14,355
Right.
280
00:13:15,045 --> 00:13:16,995
I reckon their monthly active users are much
281
00:13:16,995 --> 00:13:19,425
higher than 500 million, which would seem good.
282
00:13:19,425 --> 00:13:19,785
Right.
283
00:13:20,280 --> 00:13:23,760
Would seem good to have except, okay, so say they're 600,
284
00:13:23,760 --> 00:13:27,660
700 million and then you divide that by 15.5 million.
285
00:13:27,839 --> 00:13:29,610
That's a dog shit conversion rate.
286
00:13:29,729 --> 00:13:34,439
That's absolute doo-doo. That is some of the worst conversion rates in SaaS.
287
00:13:34,589 --> 00:13:37,079
And you have to treat OpenAI like a SaaS business.
288
00:13:37,079 --> 00:13:39,360
You cannot treat it like a consumer subscription.
289
00:13:39,540 --> 00:13:41,400
They lose money on every customer, but they
290
00:13:41,400 --> 00:13:44,160
want to be given SaaS style valuations.
291
00:13:44,400 --> 00:13:47,130
Okay, so you have a 2% conversion rate.
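A quick back-of-the-envelope version of that conversion math, assuming the 15.5 million paying subscribers quoted above and the hypothetical 500 to 700 million monthly active users:

```python
# Figures as quoted in the conversation; the MAU numbers are hypotheticals
paying_subscribers = 15.5e6

for monthly_active_users in (500e6, 600e6, 700e6):
    conversion_rate = paying_subscribers / monthly_active_users
    print(f"{monthly_active_users / 1e6:.0f}M MAU -> {conversion_rate:.1%} paid")
# Roughly 2-3% at these scales, which is the "2% conversion rate" described here
```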
292
00:13:47,130 --> 00:13:49,439
The Information came out yesterday, well, day before yesterday,
293
00:13:49,439 --> 00:13:52,410
I think, with a whole thing about retention rates. Open
294
00:13:52,410 --> 00:13:57,780
AI's got like a 73, 72% retention rate after six months,
295
00:13:57,990 --> 00:14:01,020
which is in line with like Spotify, which has 72%.
296
00:14:02,160 --> 00:14:05,849
That's not very good for a company that wants SaaS style valuations.
297
00:14:06,540 --> 00:14:08,910
That's more consumer product valuation than anything else.
298
00:14:09,060 --> 00:14:09,719
Exactly.
299
00:14:09,719 --> 00:14:12,449
And a consumer product that only loses money.
300
00:14:13,349 --> 00:14:14,834
That's not great.
301
00:14:15,555 --> 00:14:16,815
It's not, it's not brilliant.
302
00:14:16,875 --> 00:14:17,385
It's not.
303
00:14:17,475 --> 00:14:19,275
I juggle my consumer stuff all the time.
304
00:14:19,305 --> 00:14:22,155
The business stuff that I use that tends to stay put for years.
305
00:14:22,155 --> 00:14:25,215
That's why you see relatively small churn rates.
306
00:14:25,215 --> 00:14:28,395
That's why you see renewal rates being very sticky.
307
00:14:29,235 --> 00:14:31,365
It, there aren't these big drop-offs at one-
308
00:14:31,365 --> 00:14:32,835
year cliffs at most of these companies.
309
00:14:32,895 --> 00:14:36,015
And the thing is, it's because they're effectively a consumer subscription.
310
00:14:36,075 --> 00:14:38,145
They, no one really, you can't really express.
311
00:14:38,265 --> 00:14:41,955
I realize that a lot of SaaS is sold in this kind of specious way.
312
00:14:41,955 --> 00:14:44,685
You, where you sell to a CTO or a CEO or a CIO
313
00:14:44,685 --> 00:14:46,575
that doesn't really touch the rest of the stack.
314
00:14:47,265 --> 00:14:50,865
You can't really do that with AI quite as well because
315
00:14:51,675 --> 00:14:53,985
there's not really an outcome that you can even look at.
316
00:14:54,135 --> 00:14:57,015
It's just kind of a thing you offer and it's so heavily commoditized.
317
00:14:57,015 --> 00:14:59,595
Sure, you could get Cursor, sure, you could get Chat
318
00:14:59,595 --> 00:15:03,225
GPT, but at the same time, okay, I've now got it.
319
00:15:03,495 --> 00:15:05,205
How are you making more money outta these companies?
320
00:15:05,205 --> 00:15:06,975
'cause that's how SaaS companies really cook.
321
00:15:07,215 --> 00:15:08,985
You can't really upsell someone with Chat
322
00:15:08,985 --> 00:15:11,565
GPT because, oh, my model will be better.
323
00:15:11,865 --> 00:15:12,225
How?
324
00:15:12,690 --> 00:15:13,890
One company goes out with a good model.
325
00:15:13,890 --> 00:15:17,460
The other one winds up matching or exceeding it within the next two weeks.
326
00:15:17,490 --> 00:15:20,310
Uh, if you suddenly say, you're not allowed to use this one, I can
327
00:15:20,310 --> 00:15:23,340
switch to the other one for anything that I'm doing in less than
328
00:15:23,340 --> 00:15:25,890
an hour's worth of work, and we're right back where we started.
329
00:15:25,890 --> 00:15:26,580
There's no moat.
330
00:15:26,580 --> 00:15:27,450
They're they're fungible.
331
00:15:27,510 --> 00:15:28,230
Exactly.
332
00:15:28,230 --> 00:15:31,350
And on top of that, if you were trying to sell into the enterprise,
333
00:15:31,740 --> 00:15:34,710
I guess if you're doing enterprise ChatGPT subscriptions
334
00:15:35,100 --> 00:15:38,130
fine, but they're not getting the kind of revenue they want.
335
00:15:38,250 --> 00:15:39,510
They're not getting the kind of revenue that
336
00:15:39,510 --> 00:15:42,180
like a Salesforce would have out of that.
337
00:15:42,480 --> 00:15:42,930
They're not.
338
00:15:43,170 --> 00:15:46,260
They are, right now they are, let's see, let's see.
339
00:15:46,260 --> 00:15:51,120
Salesforce earnings, bring 'em up live, because I think that
340
00:15:51,120 --> 00:15:56,130
they want to, for ChatGPT and OpenAI to make sense, they
341
00:15:56,130 --> 00:16:01,200
need the combined value of the consumer, sorry, the consumer market.
342
00:16:01,200 --> 00:16:03,930
I think the smartphone market's like half a trillion dollars.
343
00:16:04,260 --> 00:16:05,640
And the enterprise SaaS market, which is
344
00:16:05,640 --> 00:16:08,220
about a quarter of a trillion, they need both.
345
00:16:08,715 --> 00:16:10,455
That's the only way any of this makes sense.
346
00:16:10,455 --> 00:16:14,145
And they are not even at the total revenue.
347
00:16:14,234 --> 00:16:16,305
They may be just a little bit above the total revenue.
348
00:16:16,604 --> 00:16:19,935
If you combine OpenAI, Anthropic, Microsoft, Amazon,
349
00:16:20,084 --> 00:16:23,984
all of the projected revenues, they're like 38 billion.
350
00:16:25,124 --> 00:16:29,805
The global smartwatch revenue for this year is projected to be about 32 billion.
351
00:16:30,749 --> 00:16:32,339
So congratulations everyone.
352
00:16:32,339 --> 00:16:34,530
All the king's horses, all the king's men came together
353
00:16:34,680 --> 00:16:36,989
and we've barely beaten the smartwatch industry.
354
00:16:37,349 --> 00:16:39,390
The Register had an article come out this morning that
355
00:16:39,390 --> 00:16:42,810
says that, uh, a quarter of all gen AI proof of
356
00:16:42,810 --> 00:16:45,269
concepts are turned off by the end of the year.
357
00:16:45,659 --> 00:16:46,169
People are.
358
00:16:46,394 --> 00:16:49,334
It's, people are experimenting with it, and I work on cloud bills.
359
00:16:49,334 --> 00:16:51,374
I see what the spend looks like.
360
00:16:51,435 --> 00:16:54,435
People are experimenting with these things, but I'm not seeing
361
00:16:54,435 --> 00:16:57,824
anyone saying, well, we're, we're spending $150 million a year
362
00:16:57,824 --> 00:17:01,154
with AWS, but we'd better make it 200 million on our commitment
363
00:17:01,184 --> 00:17:05,174
'cause of all the gen AI. They're doing $50,000 a month here and there.
364
00:17:05,174 --> 00:17:07,935
But on that basis of a spend, what's it matter?
365
00:17:07,935 --> 00:17:10,514
It's, if it succeeds, maybe they'll start rolling, rolling
366
00:17:10,514 --> 00:17:12,674
it out more widely, and then care about optimizing it.
367
00:17:12,944 --> 00:17:16,364
But it's getting turned off if it doesn't hit certain, uh, certain metrics.
368
00:17:16,395 --> 00:17:19,065
And I'm not seeing the stories that show that this
369
00:17:19,065 --> 00:17:21,585
stuff is actually being transformative in any way.
370
00:17:21,614 --> 00:17:24,464
And if it were even slightly, they wouldn't shut up about it.
371
00:17:24,525 --> 00:17:25,545
Exactly.
372
00:17:25,634 --> 00:17:27,435
These motherfrackers love going to conferences
373
00:17:27,435 --> 00:17:28,995
and talking about how important stuff is.
374
00:17:28,995 --> 00:17:32,625
But I guess, another Nick Suresh piece, I'll have to link you to,
375
00:17:32,924 --> 00:17:36,284
but talking about why people get, uh, why people talk about Snowflake.
376
00:17:36,674 --> 00:17:37,815
Like it's a database company.
377
00:17:37,815 --> 00:17:39,915
It's like, it's the most boring thing, but guys like buying
378
00:17:39,915 --> 00:17:42,134
Snowflake so they can go and talk about using Snowflake.
379
00:17:42,465 --> 00:17:45,284
But with AI it's like, it's all experimental.
380
00:17:45,284 --> 00:17:48,074
I don't think people realize how bad that is because
381
00:17:48,105 --> 00:17:50,415
it would be one thing saying this a year ago, I was
382
00:17:50,415 --> 00:17:53,685
right then and I'm right now, but a year ago, fine.
383
00:17:54,255 --> 00:17:56,054
It's experimental, fine.
384
00:17:56,384 --> 00:17:57,915
These experimentations are taking an
385
00:17:57,915 --> 00:18:01,215
awfully long time by enterprise standards.
386
00:18:01,335 --> 00:18:01,844
Exactly.
387
00:18:01,844 --> 00:18:04,215
Like they would say this is a, this is a good thing.
388
00:18:04,215 --> 00:18:05,235
Like this is moving well.
389
00:18:05,235 --> 00:18:07,034
Also, I wanna correct myself, I brought it up.
390
00:18:07,034 --> 00:18:10,065
Salesforce's revenue is $9.83 billion in the
391
00:18:10,065 --> 00:18:14,685
last quarter, so only four x to go for OpenAI.
392
00:18:14,745 --> 00:18:19,965
Also, they made $1.5 billion in profit. OpenAI is yet to make $1 in profit.
393
00:18:20,445 --> 00:18:23,835
I spent top dollar for Slack and I sort of work my way between
394
00:18:23,835 --> 00:18:26,985
all the AI stuff that they're shoehorning into it against my will.
395
00:18:27,195 --> 00:18:30,134
Oh, I, I am paying for Slack with a gun to my head.
396
00:18:30,134 --> 00:18:34,005
Like I do not wanna pay. The idea of paying Marc Benioff sickens me.
397
00:18:34,065 --> 00:18:34,545
But.
398
00:18:35,384 --> 00:18:37,694
Salesforce, as I mentioned earlier, they say
399
00:18:37,694 --> 00:18:39,824
there's not gonna be any revenue growth from this.
400
00:18:39,944 --> 00:18:46,065
It's just so weird because all of these things have been said publicly.
401
00:18:46,335 --> 00:18:50,054
All of these things keep being said publicly, but
402
00:18:50,054 --> 00:18:52,094
when you put 'em all together and you say, Hey, this
403
00:18:52,094 --> 00:18:54,705
doesn't look good, everyone goes, you rude bastard.
404
00:18:55,125 --> 00:18:55,844
How dare you?
405
00:18:56,475 --> 00:18:58,364
How dare you insult my beautiful child?
406
00:18:58,364 --> 00:18:58,844
AI?
407
00:18:59,054 --> 00:19:02,534
It's like, I dunno, because it sucks.
408
00:19:03,165 --> 00:19:06,554
Like even if it worked, even if it was flawless, if these
409
00:19:06,554 --> 00:19:09,165
were the revenues and the losses, it would still kind of suck.
410
00:19:09,344 --> 00:19:11,384
Like it's not, it's not good business and
411
00:19:11,384 --> 00:19:13,574
frankly I don't think it's great software either.
412
00:19:13,814 --> 00:19:15,750
Um, it's.
413
00:19:16,229 --> 00:19:18,749
It's just so bizarre because sure, there are people who really
414
00:19:18,749 --> 00:19:21,360
like it and I think there's a philosophical angle here where
415
00:19:21,360 --> 00:19:23,850
there are people who are really attached to being attached to the
416
00:19:23,850 --> 00:19:27,449
future where they want, they wanna be the guy that said it first.
417
00:19:27,449 --> 00:19:28,379
I think you see it in journalism.
418
00:19:28,379 --> 00:19:31,019
I think you see it in software engineering and founders in particular.
419
00:19:31,019 --> 00:19:34,229
They wanna be the person that said, I was on this trend before everyone.
420
00:19:34,229 --> 00:19:35,279
Look how smart I am.
421
00:19:35,399 --> 00:19:37,409
I think that that's what Andy Jassy's doing.
422
00:19:37,829 --> 00:19:38,879
Management consultant.
423
00:19:38,939 --> 00:19:39,870
MBA motherfracker.
424
00:19:40,709 --> 00:19:41,519
He, okay.
425
00:19:41,519 --> 00:19:43,829
I don't think he was a management consultant. He is an MBA though.
426
00:19:44,100 --> 00:19:46,110
He does have an MBA, but he was a marketing
427
00:19:46,110 --> 00:19:49,110
manager at Amazon before he started AWS.
428
00:19:49,410 --> 00:19:52,800
He was one of the 21 people that take credit for founding AWS.
429
00:19:53,010 --> 00:19:55,230
It's like a, there's like a bunch of people.
430
00:19:55,350 --> 00:19:56,430
You ever notice it's, yeah.
431
00:19:56,430 --> 00:19:58,860
You can tell on LinkedIn whether a project's successful or not by how many
432
00:19:58,860 --> 00:20:01,770
people claim credit for the thing versus distance the hell outta themselves.
433
00:20:01,830 --> 00:20:05,460
But I think the Jassy, I think it's an AWS,
434
00:20:06,090 --> 00:20:08,280
it's kind of a hard situation for AWS as well.
435
00:20:08,280 --> 00:20:09,780
'cause of course they have to jump on this.
436
00:20:09,990 --> 00:20:11,850
Like the, they couldn't not, the stock would
437
00:20:11,850 --> 00:20:16,200
get pummeled, but they are, they're doing this.
438
00:20:16,200 --> 00:20:18,570
Jassy's staking his future on this.
439
00:20:18,600 --> 00:20:21,720
He wants this to be his everything store.
440
00:20:21,780 --> 00:20:23,520
He wants this to be the thing that defines
441
00:20:23,520 --> 00:20:25,910
the future of Amazon, except it's dog shit.
442
00:20:26,040 --> 00:20:26,580
Amazon.
443
00:20:26,610 --> 00:20:30,930
Amazon may be one of the most embarrassing AI companies.
444
00:20:30,930 --> 00:20:32,520
I think Apple truly is.
445
00:20:32,520 --> 00:20:34,290
I think Apple's doing the smartest thing
446
00:20:34,710 --> 00:20:38,100
in that they are kind of slow-walking it back.
447
00:20:38,130 --> 00:20:41,040
But Apple Intelligence, it was like the most radicalizing thing.
448
00:20:41,190 --> 00:20:42,870
They shot their mouth off too early.
449
00:20:43,020 --> 00:20:43,830
I love it.
450
00:20:43,920 --> 00:20:44,400
I love it.
451
00:20:44,400 --> 00:20:46,530
So I love that they were like, what if.
452
00:20:47,025 --> 00:20:49,665
Hey, you know, autocorrect, what if it sucked?
453
00:20:50,685 --> 00:20:52,785
And okay, I, I don't like that idea.
454
00:20:52,845 --> 00:20:56,925
What if we randomly added buttons that loaded things you didn't want?
455
00:20:57,315 --> 00:20:58,515
No, I don't want that.
456
00:20:58,515 --> 00:20:58,965
Cool.
457
00:20:58,965 --> 00:21:00,765
This update will apply automatically.
458
00:21:01,185 --> 00:21:04,305
I think they radicalized millions of people against AI.
459
00:21:04,455 --> 00:21:07,275
And on the keynote too, it's what if your Siri had this
460
00:21:07,275 --> 00:21:09,525
magic knowledge of everything you did on your iPhone?
461
00:21:09,525 --> 00:21:11,415
It could tie it to, it's like, wow, that'd be kind of cool.
462
00:21:11,415 --> 00:21:12,135
Like, when do we get that?
463
00:21:12,140 --> 00:21:12,405
No, no, no.
464
00:21:12,405 --> 00:21:13,755
We just said, what if, what if?
465
00:21:13,755 --> 00:21:14,445
Wouldn't it be neat?
466
00:21:14,445 --> 00:21:16,425
Like we don't know how to do it, but wouldn't it be neat?
467
00:21:16,455 --> 00:21:17,235
We are riffing.
468
00:21:17,295 --> 00:21:18,315
We are riffing right now.
469
00:21:18,615 --> 00:21:22,275
Well, uh, Google did the same thing at the last, uh, I/O in 2024.
470
00:21:22,275 --> 00:21:26,505
They were like, imagine if you needed to return a shoe, you hit one button.
471
00:21:26,685 --> 00:21:29,325
An agent takes the shoe data, it does this, it does
472
00:21:29,325 --> 00:21:31,245
that, and then this happens, and then this happens.
473
00:21:31,845 --> 00:21:33,765
All of that is entirely theoretical.
474
00:21:33,765 --> 00:21:34,995
But wouldn't it be cool?
475
00:21:35,805 --> 00:21:38,565
I swear to God it's illegal for public companies to lie.
476
00:21:38,565 --> 00:21:41,235
Or like they, there should be, I thought there was
477
00:21:41,235 --> 00:21:43,395
something that stopped them from just making shit up.
478
00:21:43,395 --> 00:21:43,935
But there isn't.
479
00:21:43,935 --> 00:21:45,655
They can just do what they want.
480
00:21:46,255 --> 00:21:49,245
And then when you look behind the curtain, there's no money.
481
00:21:49,695 --> 00:21:52,245
There was an analyst that said that Amazon is estimated
482
00:21:52,245 --> 00:21:55,575
to make $5 billion in generative AI this year.
483
00:21:55,755 --> 00:22:00,705
That's, that's, these are not just rookie numbers, these are alarming.
484
00:22:00,945 --> 00:22:03,765
One of their, uh, one of their job ads a while back
485
00:22:03,765 --> 00:22:07,065
listed S3 as an AI service, as one of the list of them.
486
00:22:07,065 --> 00:22:07,455
So, great.
487
00:22:07,515 --> 00:22:10,155
Where does that start and stop? The, the broad buckets
488
00:22:10,155 --> 00:22:12,735
that no one will ever break down further for you.
489
00:22:12,765 --> 00:22:14,085
So here's the thing.
490
00:22:14,655 --> 00:22:16,245
S3 is storage, right?
491
00:22:16,245 --> 00:22:16,725
Forgive me,
492
00:22:16,725 --> 00:22:17,805
I'm, it is.
493
00:22:17,925 --> 00:22:20,205
I have heard that Anthropic is the biggest customer there.
494
00:22:20,700 --> 00:22:22,080
I have been told by someone.
495
00:22:22,680 --> 00:22:26,040
There are multiple exabyte-scale customers on S3.
496
00:22:26,040 --> 00:22:28,380
I don't, I don't have the details at this point.
497
00:22:28,425 --> 00:22:29,970
I, I would not be surprised.
498
00:22:30,150 --> 00:22:33,560
Oh, and this is just a, this is just doing the Joe Rogan "you know
499
00:22:33,560 --> 00:22:38,190
that guy told me that once," but it's still, five billion dollars is
500
00:22:38,190 --> 00:22:41,280
not a lot of money on a hundred billion dollars of CapEx this year.
501
00:22:41,580 --> 00:22:43,965
Microsoft last year, I think only made $5
502
00:22:43,965 --> 00:22:46,830
billion in AI, and that included OpenAI spend.
503
00:22:47,370 --> 00:22:50,400
This is not an in, this isn't an industry.
504
00:22:50,580 --> 00:22:54,480
Like if you really look at the cost benefit analysis, this isn't real.
505
00:22:54,630 --> 00:22:57,270
Like that's it, it's a Fugazi.
506
00:22:57,420 --> 00:22:59,550
If all this growth is there, then why haven't we seen
507
00:22:59,550 --> 00:23:02,280
massive posted growth numbers from all of these companies?
508
00:23:02,280 --> 00:23:04,050
Rather than what looks a lot like typical
509
00:23:04,050 --> 00:23:06,780
organic, more of the same style growth.
510
00:23:06,990 --> 00:23:13,110
I mean, Microsoft has been amortizing $2.5 billion of, of
511
00:23:13,110 --> 00:23:16,800
revenue from OpenAI's cloud spend across Intelligent Cloud.
512
00:23:16,980 --> 00:23:19,020
Any growth within Intelligent Cloud
513
00:23:19,409 --> 00:23:20,699
is now suspicious.
514
00:23:21,330 --> 00:23:21,629
Yes.
515
00:23:21,750 --> 00:23:22,169
Yes.
516
00:23:22,379 --> 00:23:25,590
And they're the biggest investor in OpenAI, so they're effectively putting
517
00:23:25,590 --> 00:23:29,429
their own money back into their own pocket by laundering it through OpenAI.
518
00:23:29,550 --> 00:23:32,310
And if it's cloud credits, is it even money?
519
00:23:32,669 --> 00:23:34,949
I, I've never trusted Azure growth numbers and Azure usage
520
00:23:34,949 --> 00:23:37,770
numbers, just because they're so inextricably linked to Office
521
00:23:37,770 --> 00:23:41,580
365, to enterprise agreements that big companies have for Windows.
522
00:23:42,060 --> 00:23:43,290
Whoa, whoa, whoa, whoa.
523
00:23:43,770 --> 00:23:45,870
Do you think that they're charging, that they're
524
00:23:47,399 --> 00:23:51,000
amortizing revenue from their own spend on Azure?
525
00:23:51,090 --> 00:23:51,929
That can't be.
526
00:23:52,470 --> 00:23:55,439
I would not be entirely surprised.
527
00:23:55,439 --> 00:23:57,240
I'm sure it's by, I'm sure it's above board.
528
00:23:57,240 --> 00:24:00,870
I would not want to accuse them of doing anything other, other than that.
529
00:24:01,139 --> 00:24:04,379
But Amy Hood, their CFO, has been an absolute
530
00:24:04,379 --> 00:24:07,620
master of financial engineering for over a decade.
531
00:24:07,919 --> 00:24:13,050
Uh, it's impossible to get much signal from the things that they say based
532
00:24:13,050 --> 00:24:16,110
upon the way that they very intentionally have structured and laid them out.
533
00:24:16,470 --> 00:24:18,899
Can I run my favorite.
534
00:24:19,500 --> 00:24:23,010
This is tin foil hat, like this probably won't
535
00:24:23,010 --> 00:24:25,290
happen, but I wanna put my flag in the ground.
536
00:24:25,530 --> 00:24:27,990
I think, I think Amy Hood takes over Microsoft.
537
00:24:29,040 --> 00:24:32,429
I think when whatever happens with OpenAI shits the bed.
538
00:24:32,520 --> 00:24:36,389
I think they put a, they put, like Boxer in Animal Farm.
539
00:24:36,389 --> 00:24:37,610
They send Satya off.
540
00:24:38,310 --> 00:24:41,250
Bye-bye, Satya, because, 'cause he is, so, I'm reading between
541
00:24:41,250 --> 00:24:43,770
the lines here, but The Information story that came out over
542
00:24:43,770 --> 00:24:46,800
this tense negotiation between OpenAI and Microsoft, there
543
00:24:46,800 --> 00:24:50,970
was a thing where top executives thought OpenAI would fail.
544
00:24:51,090 --> 00:24:53,070
Wall Street Journal, Berber Jin, there were like
545
00:24:53,070 --> 00:24:54,600
four different bylines on The Information.
546
00:24:54,600 --> 00:24:55,980
But Berber Jin wrote the Journal one.
547
00:24:56,189 --> 00:25:00,750
That one had a thing about how top executives didn't like the clause
548
00:25:00,750 --> 00:25:04,860
about AGI. Basically, when AGI is called, OpenAI doesn't have to share
549
00:25:04,860 --> 00:25:09,300
IP with Microsoft, but Satya Nadella apparently pushed it through.
550
00:25:09,360 --> 00:25:14,110
What I think happened was Amy Hood went, this company is going to die.
551
00:25:14,879 --> 00:25:16,379
This company is gonna die.
552
00:25:17,040 --> 00:25:19,440
This is stupid, but they're gonna die.
553
00:25:19,500 --> 00:25:21,300
Let's buy them effectively.
554
00:25:21,300 --> 00:25:23,130
And then when they die, we take all their stuff.
555
00:25:23,400 --> 00:25:26,070
And then, Satya Nadella said, well, the AGI thing.
556
00:25:26,070 --> 00:25:27,070
And everyone went, no.
557
00:25:27,070 --> 00:25:27,450
Satya.
558
00:25:27,620 --> 00:25:28,500
That's stupid.
559
00:25:28,960 --> 00:25:29,530
Don't do that.
560
00:25:29,530 --> 00:25:30,760
So, and he agreed to it anyway.
561
00:25:30,910 --> 00:25:33,250
It's now a major sticking point, I think.
562
00:25:33,520 --> 00:25:35,530
I think when it all falls down and it, well,
563
00:25:35,860 --> 00:25:38,920
this is, the Azure thing is a freaking scandal.
564
00:25:39,130 --> 00:25:41,200
I think the tech press is doing it, and the business
565
00:25:41,200 --> 00:25:42,940
press is doing a terrible job with this.
566
00:25:43,240 --> 00:25:46,360
The fact that your largest customer is a company that burns money
567
00:25:46,360 --> 00:25:50,680
and has incredibly shaky finances, that is a counterparty risk.
568
00:25:51,010 --> 00:25:52,030
That is a huge thing.
569
00:25:52,360 --> 00:25:55,420
So when this breaks, I wouldn't be surprised if Satya is the
570
00:25:55,420 --> 00:25:59,620
one that gets blamed and Amy Hood in a responsible leadership
571
00:25:59,620 --> 00:26:02,110
role comes in and just starts freaking savaging the company.
572
00:26:02,560 --> 00:26:06,370
Sort of an interim CEO story where they conduct a search, but
573
00:26:06,370 --> 00:26:09,880
the search drags on and they remove interim from her title.
574
00:26:10,030 --> 00:26:11,350
Or whoever she picks.
575
00:26:11,350 --> 00:26:12,370
It's another like.
576
00:26:13,405 --> 00:26:16,534
Another elder god, she goes back, finds a Cthulhu-
577
00:26:16,555 --> 00:26:20,125
adjacent creature to take over Microsoft, because I have
578
00:26:20,125 --> 00:26:22,975
friends who think that this Microsoft thing is a plan.
579
00:26:23,215 --> 00:26:26,665
It was always the plan, kind of, 'cause of the nonprofit
580
00:26:26,665 --> 00:26:28,975
situation. I've kind of like poorly explained the situation.
581
00:26:28,975 --> 00:26:29,784
Do you want me to?
582
00:26:30,504 --> 00:26:31,645
It's entirely up to you.
583
00:26:31,645 --> 00:26:33,625
I, I feel like there's, there's a lot of information
584
00:26:33,625 --> 00:26:35,365
around this floating around out there, but.
585
00:26:35,784 --> 00:26:38,725
Basically OpenAI needs to become a for-profit entity by the
586
00:26:38,725 --> 00:26:41,305
end of the year or they lose $20 billion from their funding.
587
00:26:41,485 --> 00:26:43,014
Microsoft is the stopping block.
588
00:26:43,014 --> 00:26:45,145
'cause OpenAI wants a bunch of concessions
589
00:26:45,145 --> 00:26:46,675
and Microsoft doesn't want to give them.
590
00:26:47,095 --> 00:26:47,395
Yeah.
591
00:26:47,754 --> 00:26:51,865
And um, no one knows what's gonna happen and no one wants to say
592
00:26:52,045 --> 00:26:55,195
what I'm saying, which is what if Microsoft just lets them die?
593
00:26:55,225 --> 00:26:57,595
Because if Microsoft says no, we won't let you convert.
594
00:26:58,585 --> 00:26:59,485
OpenAI dies.
595
00:26:59,889 --> 00:27:01,060
They can't go public.
596
00:27:01,690 --> 00:27:04,510
And uh, Microsoft owns all their IP and research
597
00:27:04,510 --> 00:27:06,070
anyway and handles all their infrastructure.
598
00:27:06,100 --> 00:27:09,070
Well, what about all the CapEx they did explicitly for OpenAI?
599
00:27:09,100 --> 00:27:09,340
Yeah.
600
00:27:09,370 --> 00:27:11,050
'cause those things can't ever be repurposed
601
00:27:11,050 --> 00:27:12,820
for other customers doing the different things.
602
00:27:12,820 --> 00:27:15,910
They also didn't, they never said it was just for OpenAI.
603
00:27:15,910 --> 00:27:17,139
They just said for AI.
604
00:27:17,440 --> 00:27:19,870
Amazon has been saying that Project Rainier, all
605
00:27:19,870 --> 00:27:22,750
of these specific data centers are for Anthropic.
606
00:27:22,960 --> 00:27:24,700
And there was a New York Times story about it.
607
00:27:24,700 --> 00:27:25,030
Yeah.
608
00:27:25,600 --> 00:27:28,210
Indeed, and I'm very curious as to how that plays out because they're
609
00:27:28,210 --> 00:27:31,530
talking a lot about how, oh, Anthropic says that they're running,
610
00:27:31,860 --> 00:27:36,330
uh, Claude Opus or Claude, the Claude 4 on Trainium, which is
611
00:27:36,330 --> 00:27:39,660
interesting because I have looked far and wide and I can't find
612
00:27:39,945 --> 00:27:41,875
any Trainium or Inferentia customers.
613
00:27:42,055 --> 00:27:47,185
The two that I'm aware of from keynotes were Apple and Anthropic. Anthropic,
614
00:27:47,185 --> 00:27:50,005
they've invested four billion dollars in, and Apple has
615
00:27:50,065 --> 00:27:53,155
been known to be a large AWS customer for quite some time.
616
00:27:53,665 --> 00:27:57,565
When we see contracts being negotiated between these things, who says
617
00:27:57,565 --> 00:28:01,285
what on stage at whose conference is always negotiated into that?
618
00:28:01,465 --> 00:28:03,895
Apple never says shit about other companies'
619
00:28:03,895 --> 00:28:06,385
products, but they did at re:Invent last year.
620
00:28:06,385 --> 00:28:09,715
Gee, I'm sure there was no contractual stipulation there.
621
00:28:10,405 --> 00:28:12,865
Also, no one has said that they're running their
622
00:28:12,865 --> 00:28:15,455
Claude 4 models exclusively on Trainium.
623
00:28:15,625 --> 00:28:16,855
So what did they do?
624
00:28:16,855 --> 00:28:19,495
Did they try running a quick like, and here's the thing that
625
00:28:19,495 --> 00:28:22,465
ties the last bow on it and just this one side bit we can
626
00:28:22,465 --> 00:28:24,835
run on Trainium and the rest, we use Nvidia like grownups.
627
00:28:25,285 --> 00:28:27,985
I don't know, but they're being cagey and I don't trust them.
628
00:28:28,195 --> 00:28:30,805
Well, they also, one of the reasons they wouldn't have said they
629
00:28:30,805 --> 00:28:34,315
exclusively run on Trainium is that both Amazon and Google have
630
00:28:34,755 --> 00:28:37,935
separately claimed they're the exclusive cloud provider of Anthropic.
631
00:28:38,025 --> 00:28:38,685
It rocks.
632
00:28:38,685 --> 00:28:39,315
I love that.
633
00:28:39,315 --> 00:28:42,075
The, just lying. Just lies.
634
00:28:42,195 --> 00:28:45,195
It's an amazing competition though between Amazon and Google
635
00:28:45,195 --> 00:28:47,595
over who can hit Anthropic harder with the money stick.
636
00:28:47,595 --> 00:28:50,445
I, I wanna be the, the centerpiece in that competition.
637
00:28:50,450 --> 00:28:53,660
I, I would love to get like $100 million.
638
00:28:54,000 --> 00:28:55,610
That sounds so sick.
639
00:28:55,850 --> 00:28:57,030
4 billion each.
640
00:28:57,075 --> 00:29:00,465
Uh, then, then Amazon made it eight and I'm sure it's all AWS credit.
641
00:29:00,465 --> 00:29:02,205
So out of one pocket into the other.
642
00:29:02,325 --> 00:29:03,255
Oh, no, no, no, no.
643
00:29:03,255 --> 00:29:03,675
It did.
644
00:29:03,885 --> 00:29:09,375
Amazon then converted whatever it was into some sort of new vehicle.
645
00:29:09,435 --> 00:29:11,775
So they were actually able to book that as
646
00:29:11,775 --> 00:29:14,565
profit, like their stock in Anthropic as profit.
647
00:29:15,435 --> 00:29:16,215
Corporate bullshit.
648
00:29:16,275 --> 00:29:17,805
They're better at this than Microsoft.
649
00:29:17,805 --> 00:29:18,465
Microsoft is.
650
00:29:18,675 --> 00:29:22,335
Microsoft made such a bad deal, they really thought OpenAI would die faster.
651
00:29:22,485 --> 00:29:25,975
But looking at the original $4 billion investment from November
652
00:29:25,975 --> 00:29:29,955
22nd, 2024, Amazon and Anthropic deepen strategic collaboration.
653
00:29:29,955 --> 00:29:34,119
Anthropic names AWS its primary training partner and will use
654
00:29:34,149 --> 00:29:38,019
AWS Trainium to train and deploy its largest foundation models.
655
00:29:38,289 --> 00:29:39,429
Yeah, use them too.
656
00:29:40,029 --> 00:29:42,609
Great, that, that does an awful lot of heavy lifting.
657
00:29:42,759 --> 00:29:43,419
Use them.
658
00:29:43,419 --> 00:29:44,019
How?
659
00:29:44,049 --> 00:29:47,439
How many of them? Is Amazon still buying GPUs from Nvidia?
660
00:29:47,439 --> 00:29:49,119
I didn't hear them cancel their order.
661
00:29:49,824 --> 00:29:52,044
Yeah, I, I'm no expert in this space.
662
00:29:52,044 --> 00:29:53,514
I want to be very clear on that.
663
00:29:53,514 --> 00:29:56,394
For a long time it sounded like Amazon execs would get on stage and
664
00:29:56,394 --> 00:29:59,124
recite Star Trek technobabble when it came to machine learning.
665
00:29:59,574 --> 00:30:02,214
And now, like I, I tend to recognize patterns.
666
00:30:02,334 --> 00:30:05,454
I know that when people are telling me to get in on something now while
667
00:30:05,454 --> 00:30:08,934
it's on the ground floor, they're trying to sell me something every time.
668
00:30:09,144 --> 00:30:12,474
And I also know that when someone is selling something this
669
00:30:12,474 --> 00:30:16,824
hard, it doesn't necessarily live up to a lot of those things.
670
00:30:16,854 --> 00:30:19,614
My 4-year-old daughter loves ice cream, and I don't
671
00:30:19,614 --> 00:30:22,014
find myself having to shove it down her throat.
672
00:30:22,134 --> 00:30:24,054
It's a pull rather than a push.
673
00:30:24,384 --> 00:30:27,354
Everyone's pushing this so hard, I don't
674
00:30:27,354 --> 00:30:29,424
see the outcome that they're talking about.
675
00:30:29,664 --> 00:30:34,014
What if, just, what if the, the power of AI is more useful to folks
676
00:30:34,014 --> 00:30:37,134
on an individual level than it is an enterprise corporate one?
677
00:30:37,284 --> 00:30:41,874
I think your idea that this is a $50 billion TAM is directionally correct.
678
00:30:42,084 --> 00:30:42,414
Yeah.
679
00:30:42,534 --> 00:30:44,544
I'll spend 20, 30 bucks a month for this stuff.
680
00:30:44,544 --> 00:30:45,114
No problem.
681
00:30:45,114 --> 00:30:46,164
Would I spend five grand?
682
00:30:46,164 --> 00:30:46,674
No.
683
00:30:46,884 --> 00:30:47,634
So.
684
00:30:48,009 --> 00:30:50,589
There is an inherent limit to all of this until
685
00:30:50,589 --> 00:30:53,469
they un, they uncover some massively awesome use case.
686
00:30:53,619 --> 00:30:55,299
But they've been looking for that for three
687
00:30:55,299 --> 00:30:57,879
years now and haven't much to show for it beyond.
688
00:30:58,059 --> 00:31:01,389
It's really good at writing code badly, just like our engineers.
689
00:31:01,779 --> 00:31:04,419
And the thing is, I like to talk about the iPhone
690
00:31:05,139 --> 00:31:07,059
and I picked up the iPhone, Cingular Wireless.
691
00:31:07,059 --> 00:31:09,459
I was visiting Penn State, where I went to college.
692
00:31:10,029 --> 00:31:12,129
And um, I remember getting it and I remember
693
00:31:12,129 --> 00:31:14,709
handing it to, I was dating a girl.
694
00:31:14,709 --> 00:31:18,399
It was like a social studies, like, her dad worked as a local accountant
695
00:31:18,429 --> 00:31:22,359
in central PA. So really good representation of just like normal folk.
696
00:31:22,569 --> 00:31:23,949
Every single person who touched it
697
00:31:23,949 --> 00:31:24,309
got it.
698
00:31:24,969 --> 00:31:26,559
They were like, holy shit, there's like a
699
00:31:26,559 --> 00:31:28,149
little computer and you can just touch it.
700
00:31:28,989 --> 00:31:29,469
Wow.
701
00:31:29,589 --> 00:31:30,939
Like I can take photos on there.
702
00:31:31,149 --> 00:31:32,649
I, my music is on here too.
703
00:31:32,769 --> 00:31:33,099
Wow.
704
00:31:33,129 --> 00:31:34,509
Your email as well.
705
00:31:34,914 --> 00:31:36,834
Do you even like, they kind of immediately, like
706
00:31:36,984 --> 00:31:39,924
the channels started forming in their head. With AWS,
707
00:31:39,924 --> 00:31:41,904
people loved it. Oh, my fracking God, I want
708
00:31:41,904 --> 00:31:44,934
to strangle every person. AWS lost money at
709
00:31:44,934 --> 00:31:48,924
first. AWS did that, big fracking deal.
710
00:31:49,404 --> 00:31:50,934
There was a path to profitability.
711
00:31:51,024 --> 00:31:53,964
There was an obvious way in which this would become profitable.
712
00:31:53,964 --> 00:31:57,444
On top of that, you didn't have to explain to people why AWS was important.
713
00:31:58,284 --> 00:31:59,424
It made sense.
714
00:31:59,544 --> 00:32:00,174
It made sense.
715
00:32:00,174 --> 00:32:00,744
Immediately.
716
00:32:00,804 --> 00:32:03,564
Suddenly you could provision stuff instantly, overnight.
717
00:32:03,564 --> 00:32:06,684
You could run an experiment in your dorm room and realize, oh, this is stupid.
718
00:32:06,684 --> 00:32:09,024
No one's gonna buy it and you're out 27 cents.
719
00:32:09,024 --> 00:32:12,414
As opposed to having to buy and rack servers and wait 16 weeks because
720
00:32:12,414 --> 00:32:15,234
Dell can't get its act together, and sign leases on data centers.
721
00:32:15,414 --> 00:32:16,464
That was transformative.
722
00:32:16,734 --> 00:32:18,834
And it was obviously transformative.
723
00:32:19,614 --> 00:32:23,184
It was obviously, like, you didn't have to, like, you had to sell people
724
00:32:23,184 --> 00:32:24,894
on it, and Amazon loves to sell it, but
725
00:32:25,464 --> 00:32:27,564
you didn't have to, like, con people.
726
00:32:27,564 --> 00:32:28,584
There were no people going around saying,
727
00:32:28,584 --> 00:32:31,404
If you don't get, I imagine there was someone saying, if you don't get in on
728
00:32:31,404 --> 00:32:35,304
the cloud early, there's always someone, but there was an obvious point.
729
00:32:35,304 --> 00:32:38,004
You didn't have to like talk like a wizard.
730
00:32:38,664 --> 00:32:43,734
I got seriously into cloud in 2015, 2016 era, very late by a lot of standards.
731
00:32:43,734 --> 00:32:45,114
I seemed to have done okay with it.
732
00:32:45,899 --> 00:32:48,474
I, I got into enterprise tech in like 2013.
733
00:32:48,594 --> 00:32:52,164
That was when I started doing PR clients and that, and it's like, guess what?
734
00:32:52,224 --> 00:32:55,134
It's like it was still cooking because there were obvious things.
735
00:32:55,134 --> 00:32:57,114
You have the mixture of on-prem and off-prem.
736
00:32:57,114 --> 00:32:58,794
You have all sorts of solutions within it.
737
00:32:58,884 --> 00:33:01,194
There were obvious things and obvious ways it could proliferate.
738
00:33:01,464 --> 00:33:03,474
I am a gizmos and gadgets guy.
739
00:33:04,134 --> 00:33:06,174
I work in a, I run a PR firm.
740
00:33:06,264 --> 00:33:07,914
I have to find reasons
741
00:33:07,914 --> 00:33:09,354
things are interesting all the time.
742
00:33:09,564 --> 00:33:11,064
I know what to look for.
743
00:33:11,484 --> 00:33:12,594
I genuinely can't here.
744
00:33:12,669 --> 00:33:17,769
I genu, I, I have tried. Even the coding, on the show I had Carl
745
00:33:17,769 --> 00:33:20,949
Brown from Internet of Bugs who I adore, and he really explained it where
746
00:33:20,949 --> 00:33:24,669
it's like, yeah, it can do some code, but mediocre code can be dangerous.
747
00:33:24,789 --> 00:33:26,409
If you know what you're looking for, it's great.
748
00:33:26,409 --> 00:33:28,059
But do you always know what you're looking for?
749
00:33:28,059 --> 00:33:29,649
How much of this are you handing off?
750
00:33:30,159 --> 00:33:31,329
And it's like, yeah.
751
00:33:31,329 --> 00:33:34,359
It sounds like pretty much every cloud automation solution
752
00:33:34,359 --> 00:33:36,519
I've ever freaking heard of. This is really useful.
753
00:33:36,519 --> 00:33:37,929
It can speed up your workflow.
754
00:33:38,139 --> 00:33:40,569
Something that would take you 10 minutes might take you two.
755
00:33:40,749 --> 00:33:41,979
You need to watch it.
756
00:33:42,369 --> 00:33:44,049
You can't trust it on its own.
757
00:33:44,139 --> 00:33:46,479
But this can automate and turn a 10 minute job
758
00:33:46,479 --> 00:33:48,279
into a two minute job, which compounds over time.
759
00:33:48,399 --> 00:33:48,909
Awesome.
760
00:33:48,969 --> 00:33:50,049
$50 billion.
761
00:33:50,079 --> 00:33:50,859
TAM market.
762
00:33:50,859 --> 00:33:51,579
Absolutely.
763
00:33:51,849 --> 00:33:54,129
Rock and roll. It requires stealing from everyone.
764
00:33:54,129 --> 00:33:55,869
It's horribly horrible for the environment.
765
00:33:56,244 --> 00:33:59,783
But I even think that once this bubble pops, all that shit still exists,
766
00:33:59,963 --> 00:34:02,664
but you're not gonna see the scale and you're probably gonna see more
767
00:34:02,664 --> 00:34:08,094
money than ever invested in finding ways to make inference cost, like, nothing.
768
00:34:08,243 --> 00:34:10,314
I don't think they are even close.
769
00:34:10,314 --> 00:34:14,153
And I think because if they were, my evidence here is real simple.
770
00:34:14,394 --> 00:34:18,833
If they were close, if they were there, they would say, we've made inference
771
00:34:18,894 --> 00:34:19,554
free.
772
00:34:19,554 --> 00:34:19,974
Nothing.
773
00:34:20,214 --> 00:34:21,024
Inference doesn't cost anything.
774
00:34:21,054 --> 00:34:22,073
They would just say it.
775
00:34:22,434 --> 00:34:24,533
They would say, we have made this profitable.
776
00:34:24,743 --> 00:34:27,024
It would be the easiest thing in the world.
777
00:34:27,024 --> 00:34:31,283
You would just say, you would just say like, it's profitable, it's good.
778
00:34:31,314 --> 00:34:35,243
It's good. When they have these, when, with the growth of Azure, the
779
00:34:35,243 --> 00:34:38,214
classic growth of Azure, the growth of AWS, the growth of any software
780
00:34:38,214 --> 00:34:42,474
as a service solution, they trumpeted it in very simple terms.
781
00:34:42,474 --> 00:34:45,175
They say it. Marc Benioff saunters out, slimily.
782
00:34:45,180 --> 00:34:48,054
And it's like, ah, number has gone up so many times.
783
00:34:48,649 --> 00:34:51,399
And he says the word agentic and the industry shits itself for the next
784
00:34:51,399 --> 00:34:55,179
18 months trying to fall over themselves to get in on that buzzword now.
785
00:34:55,329 --> 00:34:57,969
So last year I sat down and I went through a bunch of
786
00:34:57,969 --> 00:35:01,479
Salesforce, um, Dreamforce conference things, the transcripts.
787
00:35:01,689 --> 00:35:06,069
He said they were in AI like five times since 2014.
788
00:35:06,069 --> 00:35:09,609
I think he has said Einstein AI was happening so many times.
789
00:35:09,788 --> 00:35:13,479
It is remarkable how bad the business press is, and it's remarkable
790
00:35:13,479 --> 00:35:19,419
how disconnected the valuations are because this is foolhardy stuff.
791
00:35:19,449 --> 00:35:21,009
These people are not smart.
792
00:35:21,639 --> 00:35:26,739
But when you look at these numbers as well, this is just,
793
00:35:26,739 --> 00:35:28,989
this is the hubris point that's always been coming for them.
794
00:35:29,229 --> 00:35:32,739
When you have just had businesses that it's kind of incredible
795
00:35:32,739 --> 00:35:35,469
how much they've grown, how big these businesses have got.
796
00:35:35,469 --> 00:35:37,869
It's kind of impressive. Good or bad people that they
797
00:35:37,869 --> 00:35:40,089
are, they've been able to make remarkable businesses.
798
00:35:40,269 --> 00:35:42,129
I just think that they got it by accident.
799
00:35:42,129 --> 00:35:44,559
I don't think that they were super intentional
800
00:35:44,559 --> 00:35:46,239
with everything, because if they were,
801
00:35:46,899 --> 00:35:48,489
Well, they would be doing it again.
802
00:35:49,118 --> 00:35:52,449
They would've, someone would've found it. And Microsoft
803
00:35:52,449 --> 00:35:55,229
was the one that seemed to have worked it out.
804
00:35:55,349 --> 00:35:57,019
They had more even if it wasn't profit,
805
00:35:57,038 --> 00:35:58,659
they had more revenue than everyone else.
806
00:35:59,469 --> 00:36:03,069
And then you see, oh, it's, they're meeting cardboard.
807
00:36:03,159 --> 00:36:08,019
They fed themselves their own money, and there should be a shareholder lawsuit.
808
00:36:08,019 --> 00:36:10,659
I'm just, it's unbelievable.
809
00:36:10,839 --> 00:36:14,319
This episode is sponsored by my own company, the
810
00:36:14,319 --> 00:36:18,069
Duckbill Group. Having trouble with your AWS bill?
811
00:36:18,159 --> 00:36:20,949
Perhaps it's time to renegotiate a contract with them.
812
00:36:21,159 --> 00:36:23,259
Maybe you're just wondering how to predict
813
00:36:23,259 --> 00:36:26,319
what's going on in the wide world of AWS.
814
00:36:26,499 --> 00:36:29,439
Well, that's where the Duckbill Group comes in to help.
815
00:36:29,859 --> 00:36:32,109
Remember, you can't duck the Duckbill
816
00:36:32,109 --> 00:36:34,209
bill, which I am reliably informed by my
817
00:36:34,209 --> 00:36:36,909
business partner is absolutely not our motto.
818
00:36:37,449 --> 00:36:40,209
One question I do have for you on the inference economics, though,
819
00:36:40,209 --> 00:36:43,179
I, I hear what you're saying, but one of the arguments I would
820
00:36:43,179 --> 00:36:46,328
make against that would be the fact that Google has been shoving
821
00:36:46,328 --> 00:36:51,009
their dumb AI nonsense into every search result for a while now.
822
00:36:51,339 --> 00:36:54,368
And I, I have a hard time believing that they're losing money on every
823
00:36:54,368 --> 00:36:59,139
search by shoving that in. That would, that would, well, on some
824
00:36:59,139 --> 00:37:02,589
level that would show from orbit at the scale that they tend to operate at.
825
00:37:03,158 --> 00:37:04,479
And their engineers, some of
826
00:37:04,479 --> 00:37:06,459
whom I trust, have said that they are not.
827
00:37:06,459 --> 00:37:08,649
But I, I have a hard time seeing how that gets there
828
00:37:08,649 --> 00:37:11,259
except for the fact that TPUs sound like what Trainium and
829
00:37:11,259 --> 00:37:14,559
Inferentia wish they were as far as actually supporting things.
830
00:37:14,769 --> 00:37:17,589
I don't have any insight into the economics of it.
831
00:37:17,589 --> 00:37:21,969
I just can go by what I see and they have a stupendous amount of
832
00:37:21,969 --> 00:37:25,299
inference running all the time now for these things that no one wants.
833
00:37:26,139 --> 00:37:27,699
And it's fast inference too, because people
834
00:37:27,699 --> 00:37:29,589
aren't gonna wait five minutes for a page to load.
835
00:37:29,589 --> 00:37:30,519
A few things.
836
00:37:30,759 --> 00:37:31,269
One.
837
00:37:32,004 --> 00:37:36,114
They're not just using TPUs, they're absolutely using, uh, Nvidia chips.
838
00:37:36,114 --> 00:37:37,674
They buy an absolute shit ton of them.
839
00:37:38,004 --> 00:37:40,134
They have, they did a recent thing.
840
00:37:40,134 --> 00:37:40,344
Wow.
841
00:37:40,344 --> 00:37:42,774
We can run one model on one H100.
842
00:37:45,444 --> 00:37:46,314
well done.
843
00:37:46,703 --> 00:37:49,614
Um, doesn't mean it's, doesn't mean they're not losing money.
844
00:37:49,764 --> 00:37:53,094
And an engineer, they may have some insight,
845
00:37:53,094 --> 00:37:54,864
but how much financial insight do they have?
846
00:37:54,894 --> 00:37:58,134
If Google was, and also real simple, if Google was not
847
00:37:58,134 --> 00:37:59,964
losing money, they would say they weren't losing money.
848
00:38:00,114 --> 00:38:02,453
How would we see that they're losing money?
849
00:38:02,453 --> 00:38:04,314
Where would that be on their earnings?
850
00:38:04,374 --> 00:38:06,834
We wouldn't, 'cause they bury everything, these companies.
851
00:38:07,283 --> 00:38:09,864
Amy Hood is not the only finance wizard out there.
852
00:38:09,864 --> 00:38:13,194
I forget who the CFO at Google is, probably intentional there.
853
00:38:13,464 --> 00:38:15,294
But Sundar is a McKinsey guy.
854
00:38:15,924 --> 00:38:17,214
These people are evil.
855
00:38:17,964 --> 00:38:20,694
Uh, McKinsey, the flesh-eating bacteria of business.
856
00:38:21,894 --> 00:38:22,913
They're not gonna say it.
857
00:38:22,913 --> 00:38:25,434
But also, they would just say, inference is free.
858
00:38:25,434 --> 00:38:27,413
And would they burn billions of dollars?
859
00:38:27,684 --> 00:38:28,644
Are you asking,
860
00:38:29,109 --> 00:38:33,249
whether, just wanna be clear, a hyperscaler would burn
861
00:38:33,249 --> 00:38:38,169
billions of dollars chasing AI? Because they already have.
862
00:38:38,828 --> 00:38:40,029
They already are.
863
00:38:40,089 --> 00:38:41,288
They would burn billions.
864
00:38:41,349 --> 00:38:44,859
Maybe they're even wrapping some of that loss into their capital expenditures.
865
00:38:44,919 --> 00:38:45,879
Who freaking knows?
866
00:38:45,879 --> 00:38:47,019
They don't tell us.
867
00:38:47,078 --> 00:38:48,788
But there is plenty of evidence that these
868
00:38:48,788 --> 00:38:50,618
companies are willing to lose billions.
869
00:38:50,709 --> 00:38:56,299
Google hired, they bought Character.AI, kind of, for $2.7 billion just to get Noam
870
00:38:56,299 --> 00:38:59,949
Shazeer back, who's one of the original Attention Is All You Need paper writers,
871
00:38:59,949 --> 00:39:01,328
the transformer paper that kicked off this
872
00:39:01,328 --> 00:39:05,049
generation. Google does dumb shit all the time.
873
00:39:05,259 --> 00:39:07,149
Everyone hates their integrations.
874
00:39:07,449 --> 00:39:09,099
Yeah, I think they are losing money.
875
00:39:09,099 --> 00:39:11,559
I think they're willingly doing so and I think they'll do it again.
876
00:39:11,739 --> 00:39:16,839
I think that they have their nasty growth pig, Search, and all of
877
00:39:16,839 --> 00:39:20,709
their monopolies allow them to finance many money losing operations.
878
00:39:21,038 --> 00:39:23,919
Do you think I, I would actually really need to look into the economics
879
00:39:23,919 --> 00:39:26,529
of Chrome, 'cause I can't imagine that that has direct revenue.
880
00:39:26,949 --> 00:39:29,349
But it's kind of like Meta. Meta's
881
00:39:29,349 --> 00:39:30,788
absolutely losing billions.
882
00:39:30,788 --> 00:39:33,249
100%. And do you not think that
883
00:39:34,179 --> 00:39:37,029
Meta's very clearly thrashing and desperately trying to find the next thing.
884
00:39:37,239 --> 00:39:38,979
I love, I love what they're doing.
885
00:39:39,129 --> 00:39:39,788
I love it.
886
00:39:39,849 --> 00:39:42,699
I think it's the funniest shit in the world I've ever seen.
887
00:39:42,849 --> 00:39:45,609
It's like watching Zuckerberg's midlife crisis play out before us.
888
00:39:45,939 --> 00:39:46,839
It really is.
889
00:39:46,839 --> 00:39:49,029
He is having like a hell of a midlife crisis.
890
00:39:49,029 --> 00:39:53,199
I thought Elon Musk was having, uh, uh, like, a midlife crisis.
891
00:39:53,199 --> 00:39:56,649
But someone said something to me on Blue Sky about this earlier.
892
00:39:56,649 --> 00:39:56,919
Yeah.
893
00:39:57,279 --> 00:40:00,339
That Meta is being the LIV Golf of AI research.
894
00:40:00,459 --> 00:40:02,769
I think that that is one of the funniest ways to put it.
895
00:40:02,769 --> 00:40:06,578
'cause it really is just like this giant evil company that everyone hates being
896
00:40:06,578 --> 00:40:12,519
like, do you want more money than realistically a person can conceptualize to
897
00:40:12,519 --> 00:40:17,078
come and work with me on large language models becoming super intelligence.
898
00:40:17,259 --> 00:40:20,809
By the way, here's my chief AI scientist, his name's Yann LeCun.
899
00:40:21,038 --> 00:40:22,449
He disagrees with me.
900
00:40:22,449 --> 00:40:23,859
He does not think this is possible.
901
00:40:23,979 --> 00:40:25,449
This is your boss.
902
00:40:25,809 --> 00:40:28,809
Oh, and, and Altman was saying that Meta was offering a hundred million
903
00:40:28,809 --> 00:40:33,129
dollar signing bonuses that people were not taking for bullshit.
904
00:40:33,189 --> 00:40:35,709
That is generational wealth money to come work
905
00:40:35,709 --> 00:40:37,569
with some douche bag for a couple of years.
906
00:40:38,198 --> 00:40:38,948
Who were they offered to?
907
00:40:38,948 --> 00:40:39,609
Sam Altman?
908
00:40:39,819 --> 00:40:40,658
Two things.
909
00:40:41,049 --> 00:40:44,529
One, they have hired some people, including a guy
910
00:40:44,529 --> 00:40:46,929
with like one of the most insane names I've ever said.
911
00:40:46,929 --> 00:40:49,569
I just, I've been looking forward to saying this out loud all day.
912
00:40:49,569 --> 00:40:50,679
Lemme just get to this.
913
00:40:51,249 --> 00:40:53,229
Oh God, this name is so good too.
914
00:40:54,209 --> 00:40:57,129
His name is Trapit Bansal.
915
00:40:57,749 --> 00:40:58,689
That is quite the name.
916
00:40:58,959 --> 00:41:00,759
That's an incredible name.
917
00:41:00,879 --> 00:41:02,019
I'm called Ed Zitron.
918
00:41:02,019 --> 00:41:06,609
I have a phenomenal name, but Trapit Bansal, I might have to marry him.
919
00:41:06,999 --> 00:41:11,679
Uh, but is it, is it Trapit Bansal-Zitron? That's actually pretty good. Anyway.
920
00:41:11,889 --> 00:41:15,489
Uh, girlfriend's gonna hate that, but there we are.
921
00:41:15,788 --> 00:41:16,509
She'll understand.
922
00:41:16,509 --> 00:41:19,569
It's for the content. But where was I?
923
00:41:19,569 --> 00:41:21,279
Yeah, these a hundred million dollar offers.
924
00:41:21,279 --> 00:41:23,828
I reckon Mr. Bansal probably took one.
925
00:41:24,953 --> 00:41:27,233
Well, as Altman said, he offered them to their best employees.
926
00:41:27,233 --> 00:41:30,294
So definitionally, anyone who takes it is not one of their best.
927
00:41:30,624 --> 00:41:31,134
Oh, of course.
928
00:41:31,493 --> 00:41:35,063
And of course he's gonna say that, but I think it's great
929
00:41:35,063 --> 00:41:38,033
because like I said, Yann LeCun, chief AI scientist says that
930
00:41:38,033 --> 00:41:40,344
large language models will not become super intelligence.
931
00:41:40,464 --> 00:41:41,964
It's like, this is your boss.
932
00:41:42,084 --> 00:41:46,523
I've hired you to do a job that your boss doesn't like, but I am also your boss.
933
00:41:46,523 --> 00:41:48,294
And his boss. Have fun.
934
00:41:48,563 --> 00:41:48,924
Oh good.
935
00:41:48,924 --> 00:41:50,304
Mommy and daddy are fighting again!
936
00:41:50,393 --> 00:41:54,714
And constantly like it's like being born into a divorce.
937
00:41:54,834 --> 00:41:56,153
Like insane stuff.
938
00:41:56,153 --> 00:41:59,153
But on top of that, what are you freaking doing there?
939
00:41:59,214 --> 00:42:02,663
You, the head of your team is Alexandr Wang of Scale AI.
940
00:42:02,754 --> 00:42:03,384
Not for nothing.
941
00:42:03,384 --> 00:42:05,453
If they offer me a hundred million dollars signing bonus,
942
00:42:05,453 --> 00:42:07,344
I'm doing whatever the hell they ask me to do until that
943
00:42:07,344 --> 00:42:09,504
earnout is done and then you'll never hear from me again.
944
00:42:09,594 --> 00:42:11,754
You want me to hit, hit myself in the nuts every morning?
945
00:42:11,754 --> 00:42:12,294
It'll happen.
946
00:42:12,294 --> 00:42:13,823
I can buy, I can buy new ones.
947
00:42:14,453 --> 00:42:14,844
That's right.
948
00:42:14,844 --> 00:42:15,563
I, I would learn.
949
00:42:15,624 --> 00:42:15,983
You're right.
950
00:42:15,983 --> 00:42:16,764
Hit myself in the nuts.
951
00:42:16,764 --> 00:42:18,384
Every morning. I would learn Node for that.
952
00:42:18,443 --> 00:42:18,684
Yeah.
953
00:42:18,684 --> 00:42:20,754
I mean, I will. A hundred million dollars.
954
00:42:20,783 --> 00:42:21,384
Yeah, sure.
955
00:42:21,653 --> 00:42:22,523
But on top of that.
956
00:42:23,154 --> 00:42:25,703
The other thing, there's two other things that I think have pushed people back.
957
00:42:25,703 --> 00:42:28,104
The information had some good reporting on this as well, where
958
00:42:28,104 --> 00:42:31,314
it was, first of all, people were like the Yann LeCun thing.
959
00:42:31,314 --> 00:42:34,283
They were like, Hey, what are you doing here?
960
00:42:34,283 --> 00:42:35,663
I don't think you really understand.
961
00:42:35,694 --> 00:42:38,064
'cause Mark Zuckerberg, it's actually kind of a compelling offer.
962
00:42:38,304 --> 00:42:42,024
He was saying, I will give you a shit ton of money and also you won't
963
00:42:42,024 --> 00:42:45,953
have to worry about compute resources because OpenAI classically would,
964
00:42:46,374 --> 00:42:50,364
Sam Altman is very vengeful and he will restrict access on compute and
965
00:42:50,364 --> 00:42:53,993
resources at OpenAI if he doesn't like you, as has been reported repeatedly.
966
00:42:54,474 --> 00:42:56,964
I think it was in Karen Hao's Empire of AI as well,
967
00:42:56,964 --> 00:42:59,154
and the Washington Post and Natasha Tiku had some stuff.
968
00:42:59,484 --> 00:43:01,344
I think, pardon me if I got that wrong.
969
00:43:01,554 --> 00:43:04,254
No one has ever offered me all the compute resources
970
00:43:04,254 --> 00:43:06,743
I can use because they know my kind of horse shit.
971
00:43:06,834 --> 00:43:09,234
I mean, I would be working out ways to burn it.
972
00:43:09,234 --> 00:43:12,323
I would, I would just go and there'd be like a hundred million dollars.
973
00:43:12,323 --> 00:43:15,384
This is a, this is time to be the most annoying gentleman.
974
00:43:15,624 --> 00:43:16,734
But the other thing is.
975
00:43:17,124 --> 00:43:20,243
Is, I wonder if that's not a hundred million upfront.
976
00:43:20,243 --> 00:43:22,674
Maybe it's more like still $5 million.
977
00:43:22,733 --> 00:43:24,264
I would probably still do the nuts thing.
978
00:43:24,894 --> 00:43:25,134
Yeah.
979
00:43:25,163 --> 00:43:25,943
20 year earnout.
980
00:43:26,004 --> 00:43:26,214
Yeah.
981
00:43:26,453 --> 00:43:26,874
Yeah.
982
00:43:26,964 --> 00:43:29,334
But it's probably structured in a weird way.
983
00:43:29,964 --> 00:43:32,334
Oh, there's no way they're cutting checks for that.
984
00:43:32,334 --> 00:43:33,294
There's, there's just no way.
985
00:43:33,323 --> 00:43:35,453
It's probably got weird cliffs to it.
986
00:43:36,203 --> 00:43:37,193
Probably milestones.
987
00:43:37,193 --> 00:43:42,983
And on top of it even you are, and on top of that, the actual
988
00:43:42,983 --> 00:43:45,684
job sounds like it would be fracking miserable, because do you
989
00:43:45,684 --> 00:43:48,504
think Mark Zuckerberg is gonna put more or less pressure on you?
990
00:43:48,624 --> 00:43:49,044
No.
991
00:43:49,163 --> 00:43:53,544
You are the, you are the person that he's going to blame when this doesn't work.
992
00:43:53,544 --> 00:43:56,424
That is what you, I would take a hundred million dollars to do that.
993
00:43:56,663 --> 00:43:57,174
Oh, sure.
994
00:43:57,174 --> 00:43:57,953
I, I'll do it.
995
00:43:57,953 --> 00:43:59,424
And I'll just expect that I'm gonna need
996
00:43:59,424 --> 00:44:01,163
to cry to my therapist after every meeting.
997
00:44:01,163 --> 00:44:01,584
Fine.
998
00:44:01,674 --> 00:44:03,983
Oh, I, I would, I wouldn't be crying at all.
999
00:44:03,983 --> 00:44:05,214
I'd have a hundred million dollars.
1000
00:44:05,214 --> 00:44:07,943
I would go into work every day dressing
1001
00:44:07,943 --> 00:44:09,943
however Mark Zuckerberg last dressed.
1002
00:44:10,464 --> 00:44:12,069
And then just claim I'm copying him.
1003
00:44:12,069 --> 00:44:13,959
I would just create chaos like I would
1004
00:44:13,959 --> 00:44:16,658
jokify the entirety of Meta's AI department.
1005
00:44:16,868 --> 00:44:17,739
It would be amazing.
1006
00:44:17,979 --> 00:44:20,199
Well, you don't need a hundred million dollars to do that.
1007
00:44:20,288 --> 00:44:22,779
No, but I would, if I got that money, I would really, I
1008
00:44:22,779 --> 00:44:25,359
would have someone, my Joker makeup guy, every morning.
1009
00:44:25,929 --> 00:44:27,158
Just, just a guy who comes in.
1010
00:44:27,699 --> 00:44:28,029
Yeah.
1011
00:44:28,029 --> 00:44:30,158
He makes $250,000 a year.
1012
00:44:30,399 --> 00:44:32,828
He's also my driver and he also dresses like the Joker.
1013
00:44:33,129 --> 00:44:35,439
'cause Mark Zuckerberg fracking hates me by the end of it.
1014
00:44:35,618 --> 00:44:37,419
Just like he's doing the Joker.
1015
00:44:37,629 --> 00:44:37,779
Yeah.
1016
00:44:37,779 --> 00:44:40,419
I try to be just like Mark, proving that I too can hire a clown.
1017
00:44:41,634 --> 00:44:43,704
He does look like a clown without his makeup on.
1018
00:44:43,824 --> 00:44:44,424
He really does.
1019
00:44:44,424 --> 00:44:47,764
It's the frizzy hair and the weird, like, like weird tan.
1020
00:44:47,844 --> 00:44:50,064
We saw a few years of Business Mark, where he,
1021
00:44:50,064 --> 00:44:51,474
he honestly, I will give him credit where due.
1022
00:44:51,594 --> 00:44:55,704
He has managed to reinvent himself in a way that few people necessarily can.
1023
00:44:55,734 --> 00:44:58,524
He became, he went from the hoodie dude to wearing a suit
1024
00:44:58,524 --> 00:45:01,464
all the time dude, to people confusing him for a robot dude.
1025
00:45:01,644 --> 00:45:04,254
And now he's just weird because I, I have to imagine at
1026
00:45:04,254 --> 00:45:06,294
that tier, you don't have to have anyone around you who
1027
00:45:06,294 --> 00:45:08,484
doesn't empower you to do whatever the hell you want to do.
1028
00:45:08,604 --> 00:45:10,913
You surround yourself by nature with yes men,
1029
00:45:11,154 --> 00:45:13,464
and that is not good psychologically for anyone.
1030
00:45:13,554 --> 00:45:18,564
You realize though, that like Mark Zuckerberg can't be fired.
1031
00:45:19,703 --> 00:45:19,913
Yeah.
1032
00:45:19,974 --> 00:45:23,154
Like he can't be, he, he owns more board seats than anyone.
1033
00:45:23,154 --> 00:45:23,964
There's nothing they can do.
1034
00:45:23,964 --> 00:45:26,243
They couldn't even sue him out, so he would
1035
00:45:26,243 --> 00:45:28,224
drive this bad boy into the, into the ground.
1036
00:45:28,224 --> 00:45:34,374
But on top of all of that, his last rebrand worked only for a minute, and because
1037
00:45:34,374 --> 00:45:38,033
it worked on the media, which he thought would be enough, no, he needed
1038
00:45:38,033 --> 00:45:41,274
it to work on Donald Trump, except Donald Trump doesn't like new money.
1039
00:45:41,964 --> 00:45:44,724
The actual trick for Mark Zuckerberg would've been to align with
1040
00:45:44,724 --> 00:45:50,033
some sort of elder god creature, some sort of aged finance freak.
1041
00:45:50,033 --> 00:45:52,404
Like if Sheldon Adelson was still alive.
1042
00:45:52,899 --> 00:45:54,609
Ah, rest in piss.
1043
00:45:54,819 --> 00:45:55,449
Oh, yes.
1044
00:45:55,509 --> 00:45:57,759
My, my personal favorite Zuckerberg trivia on this is apparently
1045
00:45:57,759 --> 00:46:00,759
he's five foot eight or so and hates it to the point where
1046
00:46:00,788 --> 00:46:04,538
he has people stage his Instagram photos, so he looks taller.
1047
00:46:04,749 --> 00:46:07,089
Uh, in Congress, someone took a picture, there was a cushion
1048
00:46:07,269 --> 00:46:10,029
on the chair, so he didn't look like he was drowning in it.
1049
00:46:10,209 --> 00:46:11,169
Now, I don't give a shit.
1050
00:46:11,199 --> 00:46:12,279
People are gonna be however tall
1051
00:46:12,279 --> 00:46:14,889
they're going to be, but the fact that he's so wildly
1052
00:46:14,889 --> 00:46:17,679
sensitive about it, that for all his money, he can't buy himself
1053
00:46:17,679 --> 00:46:20,679
four inches of height is nothing short of hilarious to me.
1054
00:46:20,859 --> 00:46:24,519
I'm, I'm five nine, neither short nor tall, and now that I
1055
00:46:24,519 --> 00:46:26,828
know I'm taller than Mark Zuckerberg, I feel so powerful.
1056
00:46:27,038 --> 00:46:29,529
Also, who cares if anyone's insulted by height?
1057
00:46:29,674 --> 00:46:30,024
Sorry.
1058
00:46:30,069 --> 00:46:34,209
He's five seven and a half, apparently, which, whenever people
1059
00:46:34,209 --> 00:46:37,299
start saying a half, that's how you know they're sensitive about it.
1060
00:46:37,299 --> 00:46:39,399
Oh, I hear Sam Altman's the same.
1061
00:46:39,819 --> 00:46:42,069
I, I've heard, I don't know his true height.
1062
00:46:42,158 --> 00:46:45,189
If Sam Altman's like five foot one, that would be that.
1063
00:46:45,189 --> 00:46:46,658
That's what I'm gonna start saying.
1064
00:46:47,078 --> 00:46:49,899
I realize we've kind of got off of topic, but the
1065
00:46:49,899 --> 00:46:53,889
fact is this does tie back, which is we're discussing
1066
00:46:53,889 --> 00:46:56,288
people just spending billions of dollars for no reason.
1067
00:46:56,948 --> 00:46:57,988
It's vanity projects
1068
00:46:58,389 --> 00:47:01,989
But it's vanity projects that have absorbed the entirety of the economy.
1069
00:47:02,259 --> 00:47:05,649
The thing that I scare people with, no one likes hearing this, so enjoy.
1070
00:47:06,249 --> 00:47:11,919
So the Magnificent Seven stocks, they are 35% of the economy, right?
1071
00:47:11,948 --> 00:47:15,339
The US economy. 19% of those stocks is Nvidia.
1072
00:47:15,429 --> 00:47:17,649
Nvidia's continued growth is the only thing
1073
00:47:17,799 --> 00:47:20,529
continuing Nvidia's valuations.
1074
00:47:20,618 --> 00:47:24,639
How does Nvidia grow? By the rest of the Magnificent Seven buying GPUs from them.
1075
00:47:25,749 --> 00:47:27,399
What happens if that changes?
1076
00:47:28,528 --> 00:47:29,453
Well think about this.
1077
00:47:29,453 --> 00:47:32,064
I think that the same people, you ever notice how a lot of
1078
00:47:32,064 --> 00:47:35,754
the AI folks who are the big, the most bullish folks you'll
1079
00:47:35,754 --> 00:47:39,113
meet, were also dyed-in-the-wool crypto bros? It's like, what?
1080
00:47:39,113 --> 00:47:39,384
What?
1081
00:47:39,384 --> 00:47:40,764
Why do these people all pivot?
1082
00:47:41,064 --> 00:47:43,854
Because they're the Nvidia street team. As long
1083
00:47:43,854 --> 00:47:46,344
as they're using the GPUs, that's what it is.
1084
00:47:46,794 --> 00:47:49,224
And if this falls through, we'll go back to using 'em for gaming.
1085
00:47:49,493 --> 00:47:51,684
I, it's not the same like that.
1086
00:47:51,743 --> 00:47:53,004
I will push back on that one.
1087
00:47:53,453 --> 00:47:55,344
You're 50 to 80% right.
1088
00:47:55,344 --> 00:47:55,794
And the, yeah.
1089
00:47:55,794 --> 00:47:58,044
A ton of people who are like, I'm in crypto, I'm
1090
00:47:58,044 --> 00:47:59,964
in AI, just 'cause they were never in anything.
1091
00:48:00,504 --> 00:48:01,434
They're just hype chasing.
1092
00:48:01,434 --> 00:48:01,554
Yeah.
1093
00:48:01,554 --> 00:48:04,733
It turns out that being an influencer with no actual expertise
1094
00:48:04,733 --> 00:48:07,613
in a thing means you can retool and pivot super quickly.
1095
00:48:07,644 --> 00:48:08,903
'Cause you don't have to spend that time
1096
00:48:08,903 --> 00:48:11,033
gathering this pesky thing called expertise.
1097
00:48:11,064 --> 00:48:12,264
It's also sold in the same way.
1098
00:48:12,384 --> 00:48:13,974
It's, I have a specious idea.
1099
00:48:13,974 --> 00:48:16,524
I've connected nascent tech you don't understand to it.
1100
00:48:16,524 --> 00:48:17,033
Wow.
1101
00:48:17,214 --> 00:48:18,653
And get in quick before it's too late.
1102
00:48:18,743 --> 00:48:19,374
Exactly.
1103
00:48:19,403 --> 00:48:20,363
Same kind of pressure.
1104
00:48:20,724 --> 00:48:23,754
The only thing is, is that the GPU side is just wholly different.
1105
00:48:23,754 --> 00:48:28,999
And I mean the rack mounted insanity of a, oh god, what is it?
1106
00:48:29,064 --> 00:48:31,193
GB200 or whatever the Blackwell
1107
00:48:31,704 --> 00:48:35,514
things are, they're all these rack-mounted, 3,000-pound monstrosities.
1108
00:48:35,514 --> 00:48:36,294
They're horrifying.
1109
00:48:36,444 --> 00:48:37,314
They are the rack.
1110
00:48:37,524 --> 00:48:41,124
They, they really, and they have cooling issues because science
1111
00:48:41,184 --> 00:48:45,984
exists, but they're, these people are doing it because they're
1112
00:48:45,984 --> 00:48:48,894
too stupid to realize that what they're in is kind of a scam.
1113
00:48:49,044 --> 00:48:51,504
I think that they're scamming in the sense that they're selling something
1114
00:48:51,504 --> 00:48:54,354
they don't really understand, but they're not like, I think this will die.
1115
00:48:54,594 --> 00:48:56,454
I think that they think this will go forever.
1116
00:48:56,514 --> 00:48:57,654
Kind of like crypto.
1117
00:48:57,834 --> 00:49:00,413
And when I, this is the other thing, people with my
1118
00:49:00,413 --> 00:49:02,244
work, they're like, oh, you started like a year ago.
1119
00:49:03,204 --> 00:49:04,314
I was writing in 2020.
1120
00:49:04,404 --> 00:49:06,684
I was on crypto before everyone.
1121
00:49:07,104 --> 00:49:07,464
Yeah.
1122
00:49:07,524 --> 00:49:10,163
I, I found that crypto has been looking for a business
1123
00:49:10,163 --> 00:49:14,784
model that is not fraud or fraud adjacent, uh, for 15 years.
1124
00:49:14,784 --> 00:49:16,314
At this point, it doesn't seem to have one.
1125
00:49:16,464 --> 00:49:18,384
It's, uh, liquidity for venture capital.
1126
00:49:18,413 --> 00:49:19,764
That's what it actually is.
1127
00:49:19,884 --> 00:49:21,174
At, at least AI,
1128
00:49:21,174 --> 00:49:22,044
I could point at that.
1129
00:49:22,044 --> 00:49:24,654
I could make a robot say something funny very quickly.
1130
00:49:25,148 --> 00:49:25,688
Okay.
1131
00:49:25,688 --> 00:49:26,648
Is there value to that?
1132
00:49:26,648 --> 00:49:27,219
I don't know.
1133
00:49:27,219 --> 00:49:29,108
Not a trillion dollars worth, but yeah.
1134
00:49:29,108 --> 00:49:29,379
Okay.
1135
00:49:29,379 --> 00:49:30,759
It amuses me for 10 minutes
1136
00:49:30,908 --> 00:49:34,118
and I have the grand business idiot theory, which is that the people
1137
00:49:34,118 --> 00:49:38,648
in power who have the money and the power and the hands of the media,
1138
00:49:38,799 --> 00:49:42,009
they don't understand much of what's going on. They never did before.
1139
00:49:42,249 --> 00:49:46,118
And AI is this kind of mystical thing where it kind of has value, it kind
1140
00:49:46,118 --> 00:49:49,479
of has something you can point to and said outcomes you can point to and
1141
00:49:49,479 --> 00:49:54,009
go, well, to extrapolate from what this does now, it will do this next year.
1142
00:49:54,009 --> 00:49:54,608
Why?
1143
00:49:55,538 --> 00:49:56,438
I don't know.
1144
00:49:56,469 --> 00:49:58,868
But these people have a lot of money and this
1145
00:49:58,868 --> 00:50:01,148
much money can't be wrong. This much money?
1146
00:50:01,148 --> 00:50:03,339
Ha ha ha.
1147
00:50:03,639 --> 00:50:03,999
Damn.
1148
00:50:03,999 --> 00:50:05,078
Can it be freaking wrong?
1149
00:50:05,408 --> 00:50:05,528
Yeah.
1150
00:50:05,528 --> 00:50:07,749
But that bet's now big enough that when
1151
00:50:07,749 --> 00:50:09,759
it falls, it takes a lot of other people out.
1152
00:50:09,999 --> 00:50:13,209
And I mean, if, what happens when people don't buy GPUs?
1153
00:50:13,209 --> 00:50:14,139
'cause here's the other thing.
1154
00:50:14,528 --> 00:50:18,339
Whether or not you agree with me on AI, maybe you want to think about
1155
00:50:18,339 --> 00:50:22,749
something which is NVIDIA's, uh, data center revenue in the last quarter.
1156
00:50:22,749 --> 00:50:24,219
So the one where GPUs live.
1157
00:50:24,249 --> 00:50:27,069
It was, uh, $39.1 billion.
1158
00:50:27,069 --> 00:50:28,059
Pretty big, right?
1159
00:50:28,299 --> 00:50:31,389
Except they missed estimates by a couple hundred million,
1160
00:50:31,538 --> 00:50:35,229
which, you know, that happens. It happened to me yesterday.
1161
00:50:35,379 --> 00:50:36,009
I knew a guy.
1162
00:50:36,459 --> 00:50:38,349
Um, here's the problem.
1163
00:50:38,948 --> 00:50:43,538
That number is the single most important economic indicator right now.
1164
00:50:44,559 --> 00:50:48,669
That number is, I think, 80 something percent of NVIDIA's revenue.
1165
00:50:48,759 --> 00:50:51,309
It is the reason that Nvidia grows every quarter.
1166
00:50:51,549 --> 00:50:55,899
If that number does not continue growing, because for me to be wrong,
1167
00:50:56,078 --> 00:50:59,559
for this to keep going, everyone needs to keep buying Nvidia GPUs.
1168
00:50:59,559 --> 00:51:02,679
Nvidia will need to, like in a, in a year or so, they'll need
1169
00:51:02,679 --> 00:51:06,729
to sell 60 to a hundred billion dollars of GPUs a quarter.
1170
00:51:06,879 --> 00:51:10,599
And then after that, they will need to sell 80 to $150 billion a
1171
00:51:10,599 --> 00:51:13,868
quarter, and then they will need to sell a hundred to 180 billion.
1172
00:51:14,229 --> 00:51:16,569
They will need to keep growing exponentially.
1173
00:51:16,569 --> 00:51:21,219
The rate that the market is requiring Nvidia to grow is too high.
1174
00:51:21,894 --> 00:51:24,983
At some point that growth has to run out because we
1175
00:51:24,983 --> 00:51:28,344
don't have the physical space for all of these GPUs.
1176
00:51:28,554 --> 00:51:30,743
And at some point, if we don't have the returns
1177
00:51:30,743 --> 00:51:32,934
coming out of them, in fact, we're losing money.
1178
00:51:33,474 --> 00:51:35,304
Maybe it's time to not buy them.
1179
00:51:35,573 --> 00:51:37,403
What if Google's TPUs help?
1180
00:51:38,153 --> 00:51:40,794
What if Google actually can move to TPUs?
1181
00:51:40,913 --> 00:51:41,934
What would happen then?
1182
00:51:42,354 --> 00:51:44,754
What if we find out that the training paradigm we're
1183
00:51:44,754 --> 00:51:47,363
under does not work, which we're already finding out?
1184
00:51:47,604 --> 00:51:50,663
Okay, that means we don't need to buy all of the GPUs because the GPUs
1185
00:51:50,663 --> 00:51:54,863
are really most useful for training. Having a lot of them allows you to train.
1186
00:51:55,854 --> 00:51:58,794
If training isn't helpful, why would you need that?
1187
00:51:58,794 --> 00:52:02,094
Because inference is not as demanding on a GPU.
1188
00:52:03,113 --> 00:52:06,924
So it's like what do we, what do we do here?
1189
00:52:06,953 --> 00:52:08,033
What are we doing?
1190
00:52:08,033 --> 00:52:08,874
There's not much money.
1191
00:52:08,874 --> 00:52:12,294
There's a lot of expenses and I think the.
1192
00:52:13,284 --> 00:52:15,699
It is funny that I'm on this podcast 'cause I think it's probably
1193
00:52:15,699 --> 00:52:20,288
the most germane to my argument, the actual way for AI to be sticky.
1194
00:52:20,288 --> 00:52:23,529
The actual way for AI to have made it was the enterprise.
1195
00:52:23,589 --> 00:52:24,759
It was never consumer.
1196
00:52:25,364 --> 00:52:27,639
It, it was going to be API access.
1197
00:52:27,729 --> 00:52:28,599
That's where the money is.
1198
00:52:28,719 --> 00:52:33,578
It is, it, but it was gonna be selling big-seat SaaS
1199
00:52:33,578 --> 00:52:38,859
contracts, basically, combining the two. The way it should have worked,
1200
00:52:38,859 --> 00:52:41,199
if OpenAI would've been a functional business, it
1201
00:52:41,199 --> 00:52:45,639
would've looked a little like Microsoft 365 plus, uh, Azure.
1202
00:52:45,939 --> 00:52:48,399
It would've been, we make a shit ton of money from selling
1203
00:52:48,489 --> 00:52:51,369
enterprise subscriptions, not consumer, but enterprise.
1204
00:52:51,699 --> 00:52:54,099
And we will make a shit ton of money on API.
1205
00:52:54,519 --> 00:52:57,578
They make a decent amount of money on both, because everyone
1206
00:52:57,578 --> 00:53:00,639
talks about AI and everyone mentions OpenAI every time.
1207
00:53:00,879 --> 00:53:02,379
They won the branding marketing war.
1208
00:53:02,619 --> 00:53:03,339
Exactly.
1209
00:53:03,983 --> 00:53:07,733
And on top of that, though, they're having, there's multiple stories, the Journal
1210
00:53:07,733 --> 00:53:10,644
and The Information this week, that say they're having trouble, they're having
1211
00:53:10,644 --> 00:53:14,033
to hit, actually you'll know this one, no one else would agree with me on this.
1212
00:53:14,033 --> 00:53:16,434
They're like, yeah, they're changing from a
1213
00:53:16,554 --> 00:53:19,344
subscription-based model to a usage-based model.
1214
00:53:19,584 --> 00:53:21,533
And I'm like, that's a bad sign.
1215
00:53:21,594 --> 00:53:24,684
That's a, if you know anything about SaaS, that's a bad sign.
1216
00:53:25,044 --> 00:53:27,983
That means that they're, they do not have product market fit.
1217
00:53:28,644 --> 00:53:30,894
OpenAI, the biggest name in the fracking game
1218
00:53:30,953 --> 00:53:33,294
does not have enterprise product market fit.
1219
00:53:34,044 --> 00:53:35,063
They don't have it.
1220
00:53:35,273 --> 00:53:36,474
That is doom.
1221
00:53:36,533 --> 00:53:38,514
That is certain doom.
1222
00:53:38,993 --> 00:53:42,413
If you, do you think a usage-based model is gonna, no, it'll probably
1223
00:53:42,413 --> 00:53:46,733
bring it down. It will hopefully allow you to sell, to scale more, but
1224
00:53:46,733 --> 00:53:49,644
it's not like it's gonna increase the amount that people are spending.
1225
00:53:49,794 --> 00:53:51,353
'cause if people wanted to increase the amount
1226
00:53:51,353 --> 00:53:53,334
of their spend, they'd use the subscription.
1227
00:53:54,189 --> 00:53:55,359
They'd use the subscription.
1228
00:53:55,899 --> 00:53:56,318
Right.
1229
00:53:57,129 --> 00:53:58,959
Uh, the problem too is you have a psychological issue,
1230
00:53:58,988 --> 00:54:01,059
especially in the consumer land when you do that, where people
1231
00:54:01,059 --> 00:54:03,399
feel that when everything you do is metered, then every time
1232
00:54:03,399 --> 00:54:05,408
you use it for anything, it feels like an investment decision.
1233
00:54:05,589 --> 00:54:07,719
Well, cons, they're not doing it for consumers,
1234
00:54:08,198 --> 00:54:09,639
they're just doing it for the enterprise.
1235
00:54:09,729 --> 00:54:10,389
Mm.
1236
00:54:10,568 --> 00:54:12,219
And they're having to, they're also having to
1237
00:54:12,219 --> 00:54:14,679
already deep discount to compete with Microsoft.
1238
00:54:16,359 --> 00:54:20,408
It's like these are on a SaaS and enterprise sales level.
1239
00:54:20,499 --> 00:54:25,988
These are terrifying stories, but most people, it's just really bad.
1240
00:54:26,379 --> 00:54:31,658
I built a dumb Bluesky bot, uh, the, uh, AWS snark bot, where it effectively
1241
00:54:31,658 --> 00:54:34,389
just tracks the, the news that comes outta their What's New feed,
1242
00:54:34,419 --> 00:54:37,929
summarizes it with inference and then puts it out there and that's it.
1243
00:54:38,198 --> 00:54:40,778
It's, I, it's, I haven't yet made it do anything funny, but
1244
00:54:40,778 --> 00:54:44,019
that costs me something like a dollar a month to run that thing
1245
00:54:44,019 --> 00:54:46,929
for the many hundreds of things that come across that RSS feed.
1246
00:54:47,379 --> 00:54:49,028
And it's okay.
1247
00:54:49,028 --> 00:54:51,549
That's an interesting toy, but it costs me nothing.
1248
00:54:51,939 --> 00:54:55,029
And if it started costing me something, I probably wouldn't do it.
1249
00:54:55,209 --> 00:54:56,974
And that is it.
1250
00:54:56,979 --> 00:55:00,099
It, it fuels a bunch of fun, interesting toys like that.
1251
00:55:00,399 --> 00:55:02,559
But this does, I don't see that I could build a business
1252
00:55:02,559 --> 00:55:04,839
around that and I wouldn't be bold enough to try.
1253
00:55:04,989 --> 00:55:07,929
And even then, right now we're at the height of the mania.
1254
00:55:08,529 --> 00:55:10,989
If this was going to be the world's stickiest
1255
00:55:10,989 --> 00:55:14,229
business, there would be metrics that showed that.
1256
00:55:14,754 --> 00:55:16,119
It is very simple.
1257
00:55:16,119 --> 00:55:18,009
However anyone feels about ai.
1258
00:55:18,578 --> 00:55:20,769
Where is the actual business?
1259
00:55:20,769 --> 00:55:27,099
Because Perplexity, for example, uh, RAG search, their whole thing is, they
1260
00:55:27,129 --> 00:55:31,389
re, I think they gave away $35 million last year in, like, discounts and refunds.
1261
00:55:31,719 --> 00:55:33,489
They're a massively lossy business.
1262
00:55:33,489 --> 00:55:34,059
It's just.
1263
00:55:34,403 --> 00:55:37,254
Instead of a three-hour keynote where Andy Jassy gets up there and
1264
00:55:37,254 --> 00:55:40,014
talks about how great AI is, he would basically get on the earnings
1265
00:55:40,014 --> 00:55:44,903
call and be like, okay, this past quarter we've posted 300% annual growth.
1266
00:55:45,084 --> 00:55:47,903
Uh, I will now be taking no questions.
1267
00:55:47,934 --> 00:55:48,983
Have a nice day.
1268
00:55:49,104 --> 00:55:51,384
Like, that's what a success story looks like.
1269
00:55:51,533 --> 00:55:53,004
And they would be kissing his ass.
1270
00:55:53,033 --> 00:55:55,554
They would be saying, Jassy, they, any of these ones,
1271
00:55:55,554 --> 00:55:57,863
they would be like, Satya Nadella has changed the world.
1272
00:55:57,863 --> 00:56:00,413
If this, this is the greatest investment ever.
1273
00:56:00,413 --> 00:56:04,613
Because he would say, AI revenue, blah, blah, profit, blah, blah.
1274
00:56:04,674 --> 00:56:05,519
He'd be like, beep, beep.
1275
00:56:05,618 --> 00:56:08,964
It would be, I don't need, like, it's like Jensen Huang. Jensen Huang
1276
00:56:09,424 --> 00:56:12,484
goes up there and, just like, sounds like he's having a stroke.
1277
00:56:12,514 --> 00:56:14,554
He just goes and is like, yeah, we will have a computer.
1278
00:56:14,554 --> 00:56:17,163
They'll control AI with a computer that goes like this.
1279
00:56:17,613 --> 00:56:20,014
He will scream at someone that works for him and then he leaves
1280
00:56:20,014 --> 00:56:24,004
and everyone's like, yay, I love you Jensen, sign my baby.
1281
00:56:24,064 --> 00:56:26,163
Well, for a year he was the one, he was
1282
00:56:26,163 --> 00:56:28,354
the one who doled out who got the GPU allocations.
1283
00:56:28,354 --> 00:56:31,024
He just made an entire year's study out of just, I'm
1284
00:56:31,024 --> 00:56:33,453
gonna go and speak at other people's, uh, keynotes.
1285
00:56:33,604 --> 00:56:34,504
Uh, why?
1286
00:56:34,533 --> 00:56:35,974
Because I choose to do so.
1287
00:56:36,004 --> 00:56:36,904
What are you gonna talk about?
1288
00:56:36,964 --> 00:56:38,554
Whatever I damn well, please.
1289
00:56:38,554 --> 00:56:39,573
Because what are they gonna do?
1290
00:56:39,573 --> 00:56:40,564
Risk upsetting me?
1291
00:56:40,774 --> 00:56:42,424
And that's because he had a good business.
1292
00:56:42,424 --> 00:56:44,044
I think he sounds like a huge asshole.
1293
00:56:44,134 --> 00:56:46,624
Oh, but you see a gold rush, sell a pickaxe.
1294
00:56:46,624 --> 00:56:47,194
My God.
1295
00:56:47,254 --> 00:56:50,944
But also, but also he, he performed. Scoreboard.
1296
00:56:51,573 --> 00:56:53,044
Like that's the thing, scoreboard.
1297
00:56:53,194 --> 00:56:56,584
Really like whether or not you like Nvidia, they got the fracking numbers.
1298
00:56:56,674 --> 00:56:57,124
Yeah.
1299
00:56:57,334 --> 00:56:58,863
Why doesn't anyone else?
1300
00:56:58,984 --> 00:57:01,203
In fact, here's a real weird question.
1301
00:57:01,474 --> 00:57:03,964
Why is it that everyone who buys GPUs from NVIDIA
1302
00:57:03,964 --> 00:57:06,184
installs them and then immediately starts losing money?
1303
00:57:07,419 --> 00:57:10,029
Is that not a, this is another thing that worries me.
1304
00:57:10,029 --> 00:57:14,439
It's like, not only is our economy built on buying GPUs, when you install
1305
00:57:14,439 --> 00:57:18,879
them, you start losing money like immediately because the costs are too high.
1306
00:57:19,059 --> 00:57:22,658
This feels like a corollary of something that a client of mine observed
1307
00:57:22,658 --> 00:57:25,779
years ago, and they're right, which is your AWS bill is not a function of
1308
00:57:25,779 --> 00:57:28,658
how many customers you have, but rather how many engineers you've hired.
1309
00:57:30,069 --> 00:57:30,549
Okay.
1310
00:57:30,759 --> 00:57:32,319
They're not entirely wrong.
1311
00:57:32,589 --> 00:57:34,899
Like once, once you have a very expensive hammer,
1312
00:57:35,139 --> 00:57:37,328
every problem starts to look like your thumb.
1313
00:57:37,569 --> 00:57:39,399
And that's the thing, the automation thing.
1314
00:57:39,399 --> 00:57:42,279
Software engineers are always gonna try and automate stuff.
1315
00:57:42,309 --> 00:57:44,559
Like, it's just, that's what, that's what software engineers do.
1316
00:57:44,559 --> 00:57:45,639
They love that shit.
1317
00:57:45,819 --> 00:57:49,479
Like the containerization fest where everyone's like, we got containers now.
1318
00:57:49,479 --> 00:57:49,809
Why?
1319
00:57:49,868 --> 00:57:50,019
Ah.
1320
00:57:51,114 --> 00:57:55,374
It's as if someone screamed Docker at me on the street and I'm just freaking out.
1321
00:57:55,493 --> 00:57:56,724
I need to install Docker.
1322
00:57:56,724 --> 00:57:57,234
What is it?
1323
00:57:57,234 --> 00:57:58,284
Do you work that out?
1324
00:57:58,434 --> 00:57:59,453
I need Docker now.
1325
00:57:59,663 --> 00:58:02,574
But even that did not work out so great for Docker because
1326
00:58:02,574 --> 00:58:04,794
it turns out that when people don't really know why they're
1327
00:58:04,794 --> 00:58:07,074
buying your shit, eventually, they go, why am I buying this?
1328
00:58:07,434 --> 00:58:09,084
They, they sold their enterprise business off.
1329
00:58:09,084 --> 00:58:13,044
My comment, my comment was to congratulate them on,
1330
00:58:13,044 --> 00:58:15,413
uh, getting rid of that pesky part of their business that made money.
1331
00:58:15,474 --> 00:58:15,924
Yes.
1332
00:58:15,924 --> 00:58:17,453
Uh, good work, everyone.
1333
00:58:18,084 --> 00:58:20,993
It's just, it's such a fantastical time.
1334
00:58:20,993 --> 00:58:24,924
And what's, what I'm finding is the conversations I have are more
1335
00:58:24,924 --> 00:58:28,614
like this one where people are just like, yeah, that is pretty weird.
1336
00:58:29,304 --> 00:58:30,114
What are we gonna do?
1337
00:58:30,203 --> 00:58:31,074
I don't know.
1338
00:58:31,134 --> 00:58:32,334
We'll watch and learn.
1339
00:58:32,484 --> 00:58:34,284
Oh, there's no next step for me.
1340
00:58:34,284 --> 00:58:37,734
I, I keep poking at the bear, but it turns out no one asked my
1341
00:58:37,734 --> 00:58:40,794
opinion before investing a hundred billion dollars into bullshit.
1342
00:58:40,974 --> 00:58:45,413
Well, that's the, my favorite thing here though, is no one, I mean,
1343
00:58:45,413 --> 00:58:48,984
people will mention it, but no one really wants to talk about the OpenAI
1344
00:58:49,479 --> 00:58:50,169
revenues.
1345
00:58:50,288 --> 00:58:51,879
No one wants to talk about the costs.
1346
00:58:52,029 --> 00:58:54,549
They will glance at 'em and be like, oh, they lost
1347
00:58:54,549 --> 00:58:57,729
$5 billion, in a larger paragraph about some stuff.
1348
00:58:57,819 --> 00:59:01,419
Well, I don't believe they have any formal reporting requirements, do they?
1349
00:59:01,749 --> 00:59:02,169
No.
1350
00:59:02,169 --> 00:59:03,279
They're a private company.
1351
00:59:03,309 --> 00:59:04,538
I'm just saying the,
1352
00:59:04,779 --> 00:59:07,719
I'm trying to remember what the nonprofit reporting, uh, rules were around them.
1353
00:59:07,868 --> 00:59:08,439
Uh, no.
1354
00:59:08,439 --> 00:59:10,509
There's, they, no, they don't have to because
1355
00:59:10,509 --> 00:59:12,368
the for-profit entity is a separate wing.
1356
00:59:12,519 --> 00:59:13,689
Oh, that's, that's right.
1357
00:59:13,689 --> 00:59:19,149
But I mean, journalists, I mean, analysts, they don't want to touch this.
1358
00:59:19,328 --> 00:59:21,368
I, there are other people that have talked about these
1359
00:59:21,368 --> 00:59:23,679
numbers in passing, but no one else has sat down and gone,
1360
00:59:24,038 --> 00:59:26,649
Hey, look, I'm writing something properly like today.
1361
00:59:26,649 --> 00:59:29,049
That's called, Why Did Microsoft Invest in OpenAI?
1362
00:59:29,559 --> 00:59:32,859
I've written OpenAI Is a Systemic Risk to the Tech Industry,
1363
00:59:33,009 --> 00:59:34,448
OpenAI Is a Bad Business,
1364
00:59:34,448 --> 00:59:36,459
and, what, How Does OpenAI Survive?
1365
00:59:36,549 --> 00:59:39,219
That headline feels like something Satya Nadella says to himself,
1366
00:59:39,219 --> 00:59:42,669
his head in his hands at his desk, like, why did Microsoft invest in this?
1367
00:59:42,908 --> 00:59:44,259
I'm asking it because.
1368
00:59:44,634 --> 00:59:46,794
The deal suggests many different things could be the
1369
00:59:46,794 --> 00:59:49,374
reason, such as maybe they invested hoping they'd die.
1370
00:59:49,884 --> 00:59:51,864
But no one wants to talk about this stuff.
1371
00:59:52,163 --> 00:59:56,514
And if I'm right and I believe I am, I very much, very much do.
1372
00:59:57,743 --> 00:59:59,993
I think this is one of the greatest failings
1373
00:59:59,993 --> 01:00:02,304
of financial journalism of all time.
1374
01:00:02,394 --> 01:00:05,934
I think it's only beaten by the great financial crisis.
1375
01:00:05,934 --> 01:00:07,674
I think the great financial crisis would be
1376
01:00:07,674 --> 01:00:09,654
much worse than what what's gonna happen here.
1377
01:00:10,134 --> 01:00:11,214
But it is.
1378
01:00:11,364 --> 01:00:13,854
And even then, I think had the great financial crisis happened
1379
01:00:13,854 --> 01:00:16,884
today, people would've noticed because you would, you had people
1380
01:00:16,884 --> 01:00:18,594
with the internet who could go and look at these things like
1381
01:00:18,594 --> 01:00:21,144
Michael Burry, and it's like, they had privileged information.
1382
01:00:21,324 --> 01:00:26,634
The proliferation of available internet services was large but not as large.
1383
01:00:26,634 --> 01:00:28,884
And you didn't have the proliferation of social
1384
01:00:28,884 --> 01:00:30,774
media at that scale. You didn't have Twitter.
1385
01:00:30,804 --> 01:00:33,714
We had, Twitter was out, but it wasn't. The point I'm making is,
1386
01:00:35,064 --> 01:00:39,354
you have the biggest company in AI, OpenAI, the most
1387
01:00:39,354 --> 01:00:43,074
well-known one, funded by the next biggest company in AI,
1388
01:00:43,449 --> 01:00:46,599
Microsoft. Microsoft's biggest customer is
1389
01:00:46,599 --> 01:00:50,769
OpenAI, and the core finances of OpenAI are bad.
1390
01:00:50,769 --> 01:00:51,159
Bad.
1391
01:00:51,339 --> 01:00:52,599
They are so bad.
1392
01:00:52,659 --> 01:00:53,859
They are so awful.
1393
01:00:53,979 --> 01:00:58,149
This company burned, I think they spent $9 billion to lose $5 billion last year.
1394
01:00:58,509 --> 01:01:01,359
And this is something that modern journalists do not want to talk about.
1395
01:01:01,359 --> 01:01:03,339
They don't wanna write about it, they don't wanna read about
1396
01:01:03,339 --> 01:01:06,159
it, don't wanna hear about it in passing conversations.
1397
01:01:06,159 --> 01:01:09,369
They all say the same thing, which is, yeah, I just don't wanna think about it.
1398
01:01:09,489 --> 01:01:10,869
Yeah, that's not, that's uncomfortable if
1399
01:01:10,869 --> 01:01:12,279
I have to actually deal with any of it.
1400
01:01:12,459 --> 01:01:16,659
But it's, the problem is, is that, and I understand why they
1401
01:01:16,659 --> 01:01:19,569
don't wanna do it, because it starts justifying you in real time.
1402
01:01:20,109 --> 01:01:24,819
You start thinking, okay, well this is the biggest company in the industry
1403
01:01:24,819 --> 01:01:29,649
and they're burning billions of dollars and Microsoft needs them to
1404
01:01:29,649 --> 01:01:32,859
live, but Microsoft owns all their IP, and this company can't go public
1405
01:01:32,859 --> 01:01:36,009
unless they become a, not a nonprofit, a for-profit, which is in Microsoft's hands.
1406
01:01:36,249 --> 01:01:39,759
Even if they do, though, SoftBank has to keep funding them forever.
1407
01:01:40,299 --> 01:01:44,078
This company says they're gonna burn, what, $40, $50 billion?
1408
01:01:44,078 --> 01:01:46,509
By 2030, I think it's gonna be more.
1409
01:01:46,959 --> 01:01:47,889
Where are they getting that money?
1410
01:01:47,889 --> 01:01:48,879
'cause they lose money.
1411
01:01:48,879 --> 01:01:49,959
They lose money all the time.
1412
01:01:49,959 --> 01:01:51,009
They're constantly losing money.
1413
01:01:51,339 --> 01:01:52,689
And then when you think all those things, you
1414
01:01:52,689 --> 01:01:57,069
go, is the entire AI industry built on sand?
1415
01:01:57,189 --> 01:01:58,839
And the answer is no.
1416
01:01:59,169 --> 01:02:01,389
The actual AI industry is built on dreams.
1417
01:02:02,319 --> 01:02:04,149
It's, it's on unicorn farts.
1418
01:02:04,479 --> 01:02:05,049
It is.
1419
01:02:05,229 --> 01:02:07,719
The only way for you to rationally look at this and
1420
01:02:07,719 --> 01:02:10,959
believe that it will keep going is fantasy. Fanta...
1421
01:02:10,959 --> 01:02:12,009
I was gonna say fantastical.
1422
01:02:12,009 --> 01:02:12,459
It's fan...
1423
01:02:12,639 --> 01:02:13,719
Fantastical thinking.
1424
01:02:13,719 --> 01:02:14,379
Magical thinking.
1425
01:02:14,379 --> 01:02:14,889
There we go.
1426
01:02:15,609 --> 01:02:16,839
Could have delivered that point better.
1427
01:02:16,839 --> 01:02:20,528
But nevertheless, you have to start engaging in fantasy.
1428
01:02:21,189 --> 01:02:21,459
You do.
1429
01:02:21,698 --> 01:02:24,908
That's the only way. It is irrational to approach
1430
01:02:24,908 --> 01:02:28,269
this from the perspective that OpenAI will survive.
1431
01:02:28,314 --> 01:02:33,444
Yeah, because SoftBank had to borrow $15 billion in a one-year loan, a
1432
01:02:33,444 --> 01:02:39,324
bridge loan, and it took 21 banks, 21, to get that $15 billion.
1433
01:02:39,654 --> 01:02:43,344
Note that SoftBank has also promised $19 billion to the Stargate Project.
1434
01:02:43,584 --> 01:02:46,224
The Stargate Project is yet to incorporate, but Oracle is
1435
01:02:46,224 --> 01:02:49,044
already agreeing to put $40 billion of chips inside it.
1436
01:02:49,614 --> 01:02:51,533
They are being built by a company called Crusoe
1437
01:02:51,533 --> 01:02:53,184
that has never built a data center before.
1438
01:02:53,844 --> 01:02:55,134
How hard could it possibly be?
1439
01:02:55,224 --> 01:02:55,913
But that's the thing.
1440
01:02:55,913 --> 01:03:02,724
It's like, it reminds me of one of the best threats of all time.
1441
01:03:02,724 --> 01:03:06,144
The IRA sent it, uh, to Margaret Thatcher, and
1442
01:03:06,144 --> 01:03:08,663
they said, you have to be lucky every time.
1443
01:03:08,663 --> 01:03:13,584
We only have to be lucky once. For OpenAI to survive,
1444
01:03:13,764 --> 01:03:17,634
they will have to raise more money than anyone has ever raised
1445
01:03:17,754 --> 01:03:21,804
before, then raise it again, then likely raise it again.
1446
01:03:21,804 --> 01:03:24,624
They need to raise another $17 billion in 2027.
1447
01:03:24,624 --> 01:03:25,464
They fracking said it.
1448
01:03:25,464 --> 01:03:25,524
It.
1449
01:03:26,274 --> 01:03:29,214
They will then need to build a massive data
1450
01:03:29,214 --> 01:03:31,913
center that is moving along decently well.
1451
01:03:31,913 --> 01:03:34,254
But I've heard reports that it's having some issues.
1452
01:03:34,584 --> 01:03:37,974
They will also have to keep being able to
1453
01:03:37,974 --> 01:03:41,064
broker deals with various companies to pay them.
1454
01:03:41,064 --> 01:03:42,894
'Cause OpenAI is a banana republic.
1455
01:03:43,374 --> 01:03:45,474
They don't make enough money to run themselves.
1456
01:03:46,014 --> 01:03:47,663
All of their money has to come from outside.
1457
01:03:48,534 --> 01:03:51,774
SoftBank also has to keep getting that money and SoftBank does not have it.
1458
01:03:51,774 --> 01:03:54,504
They had to sell four and a half billion dollars of T-Mobile stock.
1459
01:03:55,254 --> 01:03:57,953
And this is just to get a little bit of it. They need to give
1460
01:03:57,953 --> 01:04:01,524
OpenAI, unless they don't convert, another $30 billion.
1461
01:04:01,854 --> 01:04:03,564
Now they could syndicate $10 billion of
1462
01:04:03,564 --> 01:04:05,244
that, but still, that's another $20 billion
1463
01:04:05,244 --> 01:04:07,134
they do not have. They don't have the cash.
1464
01:04:07,314 --> 01:04:12,684
The 15 that they just borrowed, uh, six to seven of that went to Ampere.
1465
01:04:12,744 --> 01:04:16,524
Another company they're buying. The numbers are crazy, man.
1466
01:04:16,703 --> 01:04:19,913
When you sit and look at the numbers, they, they are crazy.
1467
01:04:20,244 --> 01:04:25,404
And I don't know what's wrong with the tech media, but they are missing this.
1468
01:04:25,779 --> 01:04:28,569
And when this blows up, I'm gonna look cool.
1469
01:04:28,599 --> 01:04:30,249
I'm gonna look cool as fracking shit.
1470
01:04:30,489 --> 01:04:31,899
This is gonna be very fun for me.
1471
01:04:31,899 --> 01:04:32,288
I'm
1472
01:04:32,408 --> 01:04:33,099
And everyone else is.
1473
01:04:33,099 --> 01:04:33,368
Very good.
1474
01:04:33,368 --> 01:04:34,929
Well, how could we have possibly known?
1475
01:04:35,139 --> 01:04:39,459
Oh, and I am not going to let anyone say that. The moment anyone says that,
1476
01:04:39,609 --> 01:04:40,419
I will have links.
1477
01:04:40,449 --> 01:04:41,828
I have links ready to go.
1478
01:04:41,828 --> 01:04:42,969
I'm gonna be like, quick shot.
1479
01:04:43,809 --> 01:04:46,059
Oh, oh, oh, you, how could you have known?
1480
01:04:46,059 --> 01:04:47,739
You follow me, you follow me, cupcake.
1481
01:04:47,949 --> 01:04:48,759
Talk to me baby.
1482
01:04:48,759 --> 01:04:49,538
You have my cell.
1483
01:04:50,019 --> 01:04:51,519
I am going to be annoying.
1484
01:04:52,328 --> 01:04:53,469
I'm gonna be trotting around.
1485
01:04:53,469 --> 01:04:55,149
I'm gonna find the exact level of annoying
1486
01:04:55,149 --> 01:04:56,919
I can get before I start losing credibility.
1487
01:04:57,489 --> 01:05:01,479
And what I think you're gonna see is you're gonna see a major outlet.
1488
01:05:01,479 --> 01:05:02,513
You're kind of already seeing it.
1489
01:05:02,513 --> 01:05:07,989
Deirdre Bosa of CNBC did a video. CNBC even did a video today, uh,
1490
01:05:08,169 --> 01:05:11,049
talking about how the AI trade is in trouble 'cause of the lack of revenue.
1491
01:05:11,409 --> 01:05:16,539
CNN's Allison Morrow has been on this ship for, oh, years now, just saying like,
1492
01:05:16,539 --> 01:05:22,058
Hey, this is rotten, like, this is a marketing scheme. But if I am wrong...
1493
01:05:22,989 --> 01:05:25,058
Something insane is gonna have to happen.
1494
01:05:25,058 --> 01:05:26,799
Something just completely unprecedented.
1495
01:05:26,979 --> 01:05:28,119
And I'll admit I'm wrong.
1496
01:05:28,209 --> 01:05:29,409
I'm not though.
1497
01:05:29,409 --> 01:05:32,319
It feels like Cassandra was never appreciated in her own time.
1498
01:05:32,558 --> 01:05:35,049
Ed, I, I really wanna thank you for taking the time to speak with me.
1499
01:05:35,289 --> 01:05:35,919
My pleasure.
1500
01:05:35,919 --> 01:05:36,699
Thank you for having me.
1501
01:05:36,939 --> 01:05:39,009
People wanna learn more about what you, uh,
1502
01:05:39,039 --> 01:05:41,109
what you espouse and I argue that they should.
1503
01:05:41,259 --> 01:05:42,969
Where's the best place for them to find you?
1504
01:05:43,089 --> 01:05:44,619
Go to betteroffline.com.
1505
01:05:45,129 --> 01:05:49,089
Just B-E-T-T-E-R-O-F-F-L-I-N-E.
1506
01:05:49,209 --> 01:05:50,379
I nearly didn't spell it.
1507
01:05:50,619 --> 01:05:51,279
Dot com.
1508
01:05:51,759 --> 01:05:52,719
Uh, it has everything.
1509
01:05:52,719 --> 01:05:53,679
It has links to everything.
1510
01:05:53,679 --> 01:05:55,569
Newsletter, podcast, PR firm.
1511
01:05:56,289 --> 01:05:57,668
Socials, all the good stuff.
1512
01:05:57,729 --> 01:06:00,008
And we'll of course put a link to that in the show notes as well.
1513
01:06:00,069 --> 01:06:01,058
And your snark bot.
1514
01:06:01,058 --> 01:06:02,048
I wanna see this.
1515
01:06:02,138 --> 01:06:04,508
I will include a link to that in the show notes as well.
1516
01:06:04,569 --> 01:06:08,918
Because the funny thing is, like, I'm like a PR person, I'm a writer.
1517
01:06:09,098 --> 01:06:11,798
People don't know me, like, I'm a SaaS freak, baby.
1518
01:06:11,798 --> 01:06:16,088
I, I know these companies, I know these economics and I feel like
1519
01:06:16,088 --> 01:06:19,029
maybe I'm just the only monster that could look at this stuff and make
1520
01:06:19,029 --> 01:06:23,319
sense of it because of the hours of SaaS-related content I've consumed.
1521
01:06:23,499 --> 01:06:25,928
Oh, we're like, um, Rutger Hauer at the
1522
01:06:25,928 --> 01:06:28,088
end of Blade Runner, talking about webinars.
1523
01:06:28,388 --> 01:06:30,008
I've seen things you people wouldn't believe.
1524
01:06:31,418 --> 01:06:34,058
It's just, it's three-hour-long re:Invent keynotes,
1525
01:06:35,258 --> 01:06:37,539
burning, burning me in Las Vegas, Nevada.
1526
01:06:37,928 --> 01:06:39,459
Ed, thank you again for your time.
1527
01:06:39,459 --> 01:06:40,539
I deeply appreciate it.
1528
01:06:40,749 --> 01:06:44,499
Ed Zitron, host of Better Offline, writer of the Where's Your
1529
01:06:44,499 --> 01:06:47,678
Ed At newsletter, and I'm cloud economist Cory Quinn.
1530
01:06:47,919 --> 01:06:50,258
If you've enjoyed this podcast, please leave a five
1531
01:06:50,258 --> 01:06:52,419
star review on your podcast platform of choice.
1532
01:06:52,419 --> 01:06:55,838
Whereas if you've hated this podcast, please leave a five star review on
1533
01:06:55,838 --> 01:06:59,468
your podcast platform of choice along with an angry, insulting comment.
1534
01:06:59,619 --> 01:07:02,229
Uh, be sure to let me know in that comment,
1535
01:07:02,258 --> 01:07:04,869
which pony you've bet your AI horse on.