Episode Transcript
1
00:00:00,000 --> 00:00:02,750
Bret: Welcome to DevOps and Docker
Talk, and I'm your host, Bret.
2
00:00:03,630 --> 00:00:05,460
This episode is a special one.
3
00:00:05,520 --> 00:00:10,860
It's actually the first episode from a
totally new podcast I launched called
4
00:00:10,860 --> 00:00:15,930
Agentic DevOps, and that podcast is
gonna run in parallel with this one.
5
00:00:16,290 --> 00:00:19,650
So for this one, the goal is still
what it's been for the last six years:
6
00:00:20,205 --> 00:00:24,795
everything related to containers,
cloud native, Kubernetes, and Docker,
7
00:00:24,795 --> 00:00:27,435
and the DevOps workloads around that.
8
00:00:27,975 --> 00:00:29,475
And I don't plan on changing any of that.
9
00:00:29,475 --> 00:00:31,455
We're gonna still have the same guests.
10
00:00:31,455 --> 00:00:36,135
A certain amount of those will be AI
related guests, but I was seeing a trend.
11
00:00:36,135 --> 00:00:38,379
One that I'll talk about in the show.
12
00:00:38,739 --> 00:00:44,290
And I thought that Agentic DevOps was
going to be a big thing here in 2025.
13
00:00:44,290 --> 00:00:49,480
So a few months back we started working on
content episodes and theming and branding.
14
00:00:49,690 --> 00:00:55,720
A whole new podcast that I recommend
you check out at agenticdevops.fm,
15
00:00:56,000 --> 00:00:57,110
links in the show notes.
16
00:00:57,380 --> 00:01:00,680
And this is the first episode from
that podcast that I'm just presenting
17
00:01:00,680 --> 00:01:02,270
here so that you can check it out.
18
00:01:02,556 --> 00:01:07,446
Nirmal and I talk theory around what
we see coming and what might be
19
00:01:07,446 --> 00:01:12,126
a huge shift in how we use AI to
do our jobs as DevOps engineers.
20
00:01:12,516 --> 00:01:16,506
And the intention for that show is
to have more guests and to really
21
00:01:16,566 --> 00:01:20,466
dial in and focus on that very
niche topic, at least for this year.
22
00:01:20,466 --> 00:01:20,976
Who knows?
23
00:01:21,026 --> 00:01:23,306
It might be a bigger deal than this show,
24
00:01:23,556 --> 00:01:27,096
so if you enjoy this episode,
subscribe to that second podcast of
25
00:01:27,096 --> 00:01:29,556
mine, and now I'm gonna have two.
26
00:01:29,736 --> 00:01:30,756
So I hope you enjoy.
27
00:01:41,084 --> 00:01:45,824
Welcome to the first episode of
my new podcast, Agentic DevOps.
28
00:01:45,964 --> 00:01:46,954
This episode
29
00:01:47,674 --> 00:01:53,374
is kicking off what I think is going
to be a big topic for my entire year,
30
00:01:53,554 --> 00:01:59,704
probably for the next few years, around
wrangling AI into some usable format
31
00:02:00,094 --> 00:02:06,384
for DevOps. You've probably heard of AI
agents by now, or the MCP protocol.
32
00:02:06,504 --> 00:02:09,384
I guess I should just say MCP,
since the P stands for protocol.
33
00:02:09,714 --> 00:02:14,454
And these two things together are
creating potentially something
34
00:02:14,454 --> 00:02:18,384
very useful for platform
engineering, DevOps, and that stuff.
35
00:02:18,684 --> 00:02:21,174
It has so much potential that,
36
00:02:21,564 --> 00:02:26,304
in the first quarter of 2025, I kind of
thought this was gonna be a big deal.
37
00:02:26,304 --> 00:02:29,754
If we can figure out how to keep these things
38
00:02:29,754 --> 00:02:32,394
from hallucinating and going crazy
in our infrastructure, this could
39
00:02:32,394 --> 00:02:36,834
potentially be the AI shift for
infrastructure that I was waiting for.
40
00:02:37,254 --> 00:02:39,294
So I started this podcast.
41
00:02:39,354 --> 00:02:43,749
We recorded our first episode at
KubeCon at the beginning of April,
42
00:02:43,749 --> 00:02:51,024
2025, and this is gonna be a series of
very specific episodes around getting
43
00:02:52,089 --> 00:02:56,679
AIs to do useful automation and work
for DevOps, platform engineering,
44
00:02:56,859 --> 00:02:59,979
infrastructure management,
cloud, you know, all those things
45
00:03:00,309 --> 00:03:02,619
beyond just writing YAML, right?
46
00:03:02,889 --> 00:03:06,579
So the intro for this podcast
is a separate episode.
47
00:03:06,579 --> 00:03:09,759
It kind of goes into my whole theory of
why I think this is gonna be a thing.
48
00:03:10,059 --> 00:03:13,599
And in this episode we really try to break
down the basics and fundamentals for
49
00:03:13,599 --> 00:03:15,189
those of you that are catching up.
50
00:03:15,189 --> 00:03:16,524
Because it's a lot.
51
00:03:16,674 --> 00:03:17,994
There's a lot going on.
52
00:03:18,174 --> 00:03:22,924
It seems like we have announcements
every day this year around AI agents or
53
00:03:22,924 --> 00:03:24,954
agentic AI, however you wanna call it.
54
00:03:25,204 --> 00:03:29,224
I am calling it Agentic DevOps,
and hoping that name will stick.
55
00:03:29,674 --> 00:03:32,734
Now, this episode is
from the beginning of April.
56
00:03:33,439 --> 00:03:37,309
And it is technically now just getting
released at the beginning of June.
57
00:03:37,819 --> 00:03:40,189
We're a little bit behind on
launching this new podcast.
58
00:03:40,519 --> 00:03:42,709
Um, I think everything
in it is still relevant.
59
00:03:42,709 --> 00:03:44,179
There's just been a lot more since.
60
00:03:44,389 --> 00:03:46,009
And I don't know the frequency yet.
61
00:03:46,009 --> 00:03:48,229
I don't know how often this
podcast is gonna happen.
62
00:03:48,469 --> 00:03:50,569
It could be potentially every other week.
63
00:03:50,749 --> 00:03:51,619
It could be weekly.
64
00:03:51,619 --> 00:03:54,649
I just don't know yet because we
are not gonna do the same thing
65
00:03:54,649 --> 00:03:56,719
here as on my usual podcast.
66
00:03:56,719 --> 00:03:59,509
If you're someone who knows that
one, DevOps and Docker Talk, that I've
67
00:03:59,509 --> 00:04:02,869
been doing for the last seven years, that
one is still gonna have AI in it.
68
00:04:02,869 --> 00:04:07,039
But this one is very specific and
there might be a few episodes that have
69
00:04:07,039 --> 00:04:11,629
syndication, or whatever you wanna call
it, of episodes on both podcasts.
70
00:04:12,184 --> 00:04:16,594
But most of the time we're gonna
keep the focus on everything
71
00:04:16,594 --> 00:04:20,434
DevOps, everything containers, on
the DevOps and Docker Talk show.
72
00:04:20,494 --> 00:04:24,934
And this one is gonna be very
specific around implementing useful
73
00:04:25,564 --> 00:04:31,744
AI-related things for Agentic DevOps,
or automating our DevOps with robots.
74
00:04:32,224 --> 00:04:35,734
So I hope you enjoyed this episode
with Nirmal from KubeCon London.
75
00:04:41,457 --> 00:04:42,187
Hey, I'm Bret.
76
00:04:42,257 --> 00:04:43,337
And we're at KubeCon.
77
00:04:44,282 --> 00:04:45,972
We are. Hi, Nirmal.
78
00:04:46,292 --> 00:04:47,072
Nirmal: I'm Nirmal Mehta.
79
00:04:47,072 --> 00:04:50,972
I'm a principal specialist solution
architect at AWS and these are
80
00:04:50,972 --> 00:04:55,472
my views and not those of my employer,
but this episode is all about
81
00:04:55,562 --> 00:04:56,042
Bret: AI
82
00:04:56,042 --> 00:04:56,822
Nirmal: agents
83
00:04:57,122 --> 00:04:59,762
Bret: for DevOps and platform engineering.
84
00:04:59,762 --> 00:04:59,792
Ooh.
85
00:04:59,852 --> 00:05:03,332
So let's just start off real
quick with what is an AI agent?
86
00:05:03,362 --> 00:05:03,542
Okay.
87
00:05:03,542 --> 00:05:07,922
So we've heard of AI, we
know AI, gen AI, ChatGPT.
88
00:05:08,132 --> 00:05:09,002
We've talked about
89
00:05:09,342 --> 00:05:12,672
running LLMs, running
inference on platforms.
90
00:05:12,702 --> 00:05:12,882
Yep.
91
00:05:12,942 --> 00:05:17,922
And that we are managing the workloads
that provide services to other people.
92
00:05:17,982 --> 00:05:18,432
Absolutely.
93
00:05:18,702 --> 00:05:21,642
So how are AI agents different from that?
94
00:05:22,202 --> 00:05:24,742
Nirmal: This is rare air in
terms of bleeding edge.
95
00:05:24,952 --> 00:05:25,192
Yeah.
96
00:05:25,672 --> 00:05:26,812
This is it, right?
97
00:05:26,812 --> 00:05:26,872
Yeah.
98
00:05:26,872 --> 00:05:28,132
Like, a year ago,
99
00:05:28,132 --> 00:05:28,492
No one
100
00:05:28,492 --> 00:05:28,792
Bret: had this
101
00:05:28,792 --> 00:05:29,212
Nirmal: term
102
00:05:29,272 --> 00:05:29,902
Bret: six months ago.
103
00:05:29,902 --> 00:05:30,412
I don't think anybody's
104
00:05:30,412 --> 00:05:30,832
Nirmal: talking about it.
105
00:05:30,832 --> 00:05:31,417
I mean, very few people.
106
00:05:31,487 --> 00:05:32,457
Yeah, very few people.
107
00:05:32,957 --> 00:05:36,977
And we've seen it in the news: a
lot of vendors and big companies
108
00:05:37,027 --> 00:05:41,637
announcing agentic AI, that's another
term. So, AI agents, agentic:
109
00:05:42,047 --> 00:05:49,937
it's giving your LLM, like your ChatGPT
or your Claude or a local LLM like Llama,
110
00:05:49,997 --> 00:05:50,297
Yeah.
111
00:05:51,197 --> 00:05:55,157
access to run commands
112
00:05:55,257 --> 00:05:56,337
On your behalf.
113
00:05:56,697 --> 00:05:57,867
Or on its behalf.
114
00:05:58,437 --> 00:05:58,617
Bret: Yeah.
115
00:05:58,617 --> 00:06:01,047
And we call those tools,
if you hear that word.
116
00:06:01,107 --> 00:06:01,227
Tools.
117
00:06:01,227 --> 00:06:01,407
Yeah.
118
00:06:01,437 --> 00:06:04,137
That's like the generic
term. Like, I guess a shell
119
00:06:04,602 --> 00:06:05,532
Could be a tool.
120
00:06:05,562 --> 00:06:06,042
Correct.
121
00:06:06,092 --> 00:06:08,552
Reading a file could be a tool.
122
00:06:08,702 --> 00:06:12,572
Accessing a remote API of
a web service is a tool.
123
00:06:12,602 --> 00:06:12,872
Yep.
124
00:06:12,902 --> 00:06:14,402
Searching could be a tool.
125
00:06:14,402 --> 00:06:18,322
And so these tools: what makes
that different from what we've
126
00:06:18,322 --> 00:06:20,962
been seeing in our code editors?
127
00:06:20,992 --> 00:06:21,322
Yeah.
128
00:06:21,472 --> 00:06:22,202
How is that different?
129
00:06:22,586 --> 00:06:25,636
Nirmal: I'm a platform engineer
and I want to build out an
130
00:06:25,636 --> 00:06:27,316
EKS cluster using Terraform.
131
00:06:27,316 --> 00:06:28,096
That's what we use.
132
00:06:28,396 --> 00:06:31,816
So I'll ask, let's say, Claude or ChatGPT.
133
00:06:31,936 --> 00:06:32,116
Yeah.
134
00:06:32,176 --> 00:06:36,796
I'm a platform engineer and I want to
build a production-ready EKS cluster.
135
00:06:36,826 --> 00:06:37,636
Please create
136
00:06:37,936 --> 00:06:42,166
the assets I need, and it
will spit out some Terraform.
137
00:06:42,171 --> 00:06:42,436
YAML, right?
138
00:06:42,436 --> 00:06:42,646
Yeah.
139
00:06:43,006 --> 00:06:44,231
Bret: And it's writing text.
140
00:06:44,231 --> 00:06:45,161
Nirmal: It's writing text.
141
00:06:45,161 --> 00:06:46,781
And there'll be
a little button.
142
00:06:46,781 --> 00:06:47,621
I copy that.
143
00:06:47,621 --> 00:06:50,351
Put it in, or there'll be, if you're
using Cursor, all these other tools,
144
00:06:50,351 --> 00:06:52,891
you can put it into some .tf file.
145
00:06:52,891 --> 00:06:52,951
Yeah.
146
00:06:53,491 --> 00:06:57,001
I can then take that and I can ask
the LLM what's the command that I
147
00:06:57,001 --> 00:07:00,121
need to run to apply this Terraform?
148
00:07:00,578 --> 00:07:04,876
To actually stand up
what's described in this Terraform.
149
00:07:05,086 --> 00:07:07,786
It'll spit out, okay, you wanna
do Terraform plan and then
150
00:07:07,786 --> 00:07:09,016
Terraform apply and all that.
151
00:07:09,016 --> 00:07:12,731
terraform init or whatever, and
I'll just copy those commands and
152
00:07:12,761 --> 00:07:14,621
check 'em and run them myself.
153
00:07:15,671 --> 00:07:20,171
So the LLM is not executing
anything on my behalf.
154
00:07:20,171 --> 00:07:20,666
On, on your behalf.
155
00:07:21,031 --> 00:07:25,051
An agent would be defining a tool set.
156
00:07:25,111 --> 00:07:30,031
So I could define
a tool called Terraform or a tool
157
00:07:30,031 --> 00:07:35,881
called Shell. I could describe what
that tool does in natural language.
158
00:07:36,181 --> 00:07:36,421
Bret: Okay.
159
00:07:36,721 --> 00:07:41,581
Nirmal: And then I can give
the LLM system a list of these
160
00:07:41,581 --> 00:07:43,111
tools and their descriptions.
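The tool set Nirmal describes here can be sketched in a few lines. This is a minimal illustration, not from the episode; the tool names and the prompt format are hypothetical.

```python
# Minimal sketch of the idea above: each tool is a name plus a
# natural-language description the LLM can reason about.
# All names here are hypothetical, for illustration only.

tools = [
    {
        "name": "terraform",
        "description": "Runs Terraform subcommands (init, plan, apply) "
                       "to manage infrastructure as code.",
    },
    {
        "name": "shell",
        "description": "Executes a shell command and returns its output.",
    },
    {
        "name": "read_file",
        "description": "Reads a file from the local workspace.",
    },
]

def render_tools_for_prompt(tools):
    """Format the tool list as plain text for the LLM's system prompt."""
    return "\n".join(f"- {t['name']}: {t['description']}" for t in tools)

print(render_tools_for_prompt(tools))
```

The LLM never calls these functions directly; it only sees the rendered descriptions and names a tool it wants, and the agent software does the actual execution.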
161
00:07:43,411 --> 00:07:44,851
And tell it.
162
00:07:45,091 --> 00:07:45,751
Okay?
163
00:07:46,681 --> 00:07:47,731
Back to the same scenario.
164
00:07:47,731 --> 00:07:51,151
I'm a platform engineer and I want
to create an EKS production cluster
165
00:07:51,151 --> 00:07:56,199
using Terraform, and I want you
to create it for me, because
166
00:07:56,199 --> 00:07:58,149
it has access to those tools.
167
00:07:58,149 --> 00:08:04,089
Now it internally reasons: okay,
I need to create some Terraform.
168
00:08:04,389 --> 00:08:07,089
I need to validate it in some
kind of way, and then
169
00:08:07,639 --> 00:08:09,199
I need to execute this Terraform.
170
00:08:09,769 --> 00:08:11,629
Are there any tools that
I have in my toolbox?
171
00:08:11,629 --> 00:08:13,759
Bret: In this case, sorry, the
"I" is, you're referring
172
00:08:13,759 --> 00:08:14,809
to yourself as the AI, right?
173
00:08:15,169 --> 00:08:15,319
Yeah.
174
00:08:15,319 --> 00:08:15,499
Sorry.
175
00:08:16,344 --> 00:08:17,809
It's no longer the
human doing this, right?
176
00:08:17,809 --> 00:08:17,893
No.
177
00:08:17,893 --> 00:08:19,303
We gave it instructions and we sit back
178
00:08:19,843 --> 00:08:22,277
Nirmal: From the perspective of the LLM, the
179
00:08:22,277 --> 00:08:27,377
gen AI tool itself, the LLM system,
that's the I in this scenario.
180
00:08:27,377 --> 00:08:27,378
Yeah.
181
00:08:28,277 --> 00:08:30,137
I, the LLM, is deciding.
182
00:08:31,007 --> 00:08:38,267
The gen AI tool is looking at its list of
available tools and matching what it needs
183
00:08:38,267 --> 00:08:44,327
to them. It's reasoning about
what the end goal is, and it looks and
184
00:08:44,327 --> 00:08:48,437
says, there's this tool called Terraform
that allows me to use infrastructure as
185
00:08:48,437 --> 00:08:51,167
code to deploy resources on the cloud.
186
00:08:51,677 --> 00:08:52,847
That sounds like what I need.
187
00:08:52,997 --> 00:08:53,507
Maybe.
188
00:08:53,647 --> 00:08:55,177
And it
189
00:08:55,867 --> 00:08:58,567
generates the Terraform just like
it did the first time around.
190
00:08:59,137 --> 00:09:02,017
It knows what command to run.
191
00:09:02,017 --> 00:09:07,067
It generates the command, and then
the magic here: a little box will
192
00:09:07,067 --> 00:09:10,907
show up and says, do you want me
to execute this on your behalf?
193
00:09:11,117 --> 00:09:14,897
You click the button, you click the
button, and then it executes that
194
00:09:14,927 --> 00:09:24,107
Terraform apply. And it sounds very
simple, but it's a very different paradigm
195
00:09:24,107 --> 00:09:29,517
in terms of thinking about how we interact
with infrastructure or systems in general.
196
00:09:29,517 --> 00:09:31,227
Like broadly systems in general.
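The confirm-before-execute flow described above can be sketched roughly like this. `ask_llm_for_command` is a hypothetical stand-in for a real model call; the point is that nothing runs until a human approves.

```python
# Rough sketch of the human-in-the-loop execution Nirmal describes:
# the LLM proposes a command, and a person approves before anything runs.
import subprocess

def ask_llm_for_command(goal):
    # Stand-in: a real agent would prompt an LLM with the goal here.
    return ["terraform", "plan"]

def run_with_confirmation(goal, confirm=input):
    """Ask the LLM for a command, then require approval before running it."""
    cmd = ask_llm_for_command(goal)
    answer = confirm(f"Run '{' '.join(cmd)}'? [y/N] ")
    if answer.strip().lower() != "y":
        return None  # human declined; nothing executes
    # Only reached after explicit approval.
    return subprocess.run(cmd, capture_output=True, text=True)
```

That approval prompt is the "little box" in the conversation: the guardrail that keeps the non-deterministic part (the LLM's choice of command) separated from the part that actually touches infrastructure.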
197
00:09:31,257 --> 00:09:35,247
Because, in this way of
looking at it or thinking
198
00:09:35,247 --> 00:09:42,057
about it, I, as the human, am no
longer executing those commands.
199
00:09:42,182 --> 00:09:42,272
I am
200
00:09:42,302 --> 00:09:47,162
trusting, to a certain extent, that
the LLM can figure out what it needs
201
00:09:47,162 --> 00:09:53,752
to do, and giving it a guardrailed
set of tools to use and execute.
202
00:09:54,052 --> 00:09:54,502
Bret: Yeah.
203
00:09:54,562 --> 00:09:57,212
And so we're giving the Chaos
Monkey the keys, I
204
00:09:57,212 --> 00:09:58,172
mean, it's automation, right?
205
00:09:58,172 --> 00:10:00,722
We could actually classify
this as just automation.
206
00:10:00,722 --> 00:10:01,622
It just happens to be
207
00:10:02,357 --> 00:10:04,547
figuring out what to
automate in real time.
208
00:10:04,787 --> 00:10:08,657
Rather than the traditional automation
where we have a very deterministic plan
209
00:10:08,657 --> 00:10:12,594
of steps that are repeated over and
over again by a GitHub action runner
210
00:10:12,624 --> 00:10:14,544
or a CI/CD platform or something.
211
00:10:14,754 --> 00:10:14,844
Yeah.
212
00:10:14,844 --> 00:10:20,544
Nirmal: And the agent part is the
piece of software that enables
213
00:10:21,249 --> 00:10:22,359
the LLM to execute.
214
00:10:22,659 --> 00:10:22,929
Bret: Yeah.
215
00:10:22,979 --> 00:10:27,409
Nirmal: and pulls this all
together. So, back to what I was
216
00:10:27,409 --> 00:10:30,949
talking about with the infrastructure
and there was a part where I said,
217
00:10:30,979 --> 00:10:37,939
okay, how do we define what tools are
available for the agent system to use?
218
00:10:37,939 --> 00:10:42,761
and how do I want the
agent to call those tools?
219
00:10:43,946 --> 00:10:46,826
And reason about them, and
there's a protocol called
220
00:10:46,826 --> 00:10:48,896
MCP: Model Context Protocol.
221
00:10:48,996 --> 00:10:54,306
It's just outlining a standard way of
defining the tools, the system prompt
222
00:10:54,796 --> 00:10:56,176
for that tool, and a description.
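A tool definition in that style looks roughly like this: a name, a natural-language description, and a schema for the inputs. The field names follow the general shape of MCP tool listings, but treat the details as illustrative rather than normative.

```python
# Hedged sketch of an MCP-style tool definition for the Terraform example:
# a name, a description the LLM reasons over, and a JSON Schema for inputs.
import json

terraform_tool = {
    "name": "terraform",
    "description": "Run a Terraform subcommand in the current workspace.",
    "inputSchema": {
        "type": "object",
        "properties": {
            "subcommand": {
                "type": "string",
                "enum": ["init", "plan", "apply"],
            },
        },
        "required": ["subcommand"],
    },
}

# An MCP server would advertise definitions like this to clients,
# which pass them to the LLM as the available tool set.
print(json.dumps(terraform_tool, indent=2))
```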
223
00:10:56,476 --> 00:10:58,836
Bret: And this is like an API where
you like define the spec of an API.
224
00:10:58,836 --> 00:11:04,081
Nirmal: It's a defined spec of an
API and the adoption of that API is
225
00:11:04,081 --> 00:11:05,101
Bret: just exploding right now,
226
00:11:05,221 --> 00:11:05,731
Nirmal: essentially.
227
00:11:05,821 --> 00:11:05,881
Bret: Yeah.
228
00:11:06,031 --> 00:11:08,971
So, to understand, if you're not,
okay, sorry, lemme back up a second.
229
00:11:09,211 --> 00:11:12,451
That's a very valid point, because that's
the reason I wanted to record this.
230
00:11:12,451 --> 00:11:14,301
I don't wanna be a hype machine.
231
00:11:14,631 --> 00:11:14,781
Correct.
232
00:11:14,781 --> 00:11:16,791
But I'm super excited right now.
233
00:11:16,951 --> 00:11:21,391
If you could see inside
my enthusiastic brain: I've
234
00:11:21,391 --> 00:11:24,391
only been paying attention to
this for a little over a month.
235
00:11:24,706 --> 00:11:27,046
If you asked me two months ago
what an AI agent was, I'd say,
236
00:11:27,286 --> 00:11:29,266
I don't know a robot that's ai.
237
00:11:29,296 --> 00:11:29,791
I don't know.
238
00:11:30,436 --> 00:11:32,716
I now think I've got a
much better handle on this.
239
00:11:32,791 --> 00:11:36,556
I've been spending so much of my life
right now, deep diving into this, to
240
00:11:36,556 --> 00:11:39,586
the point that you and I are talking
about changing some of the focus
241
00:11:39,586 --> 00:11:41,026
this year on all these topics.
242
00:11:41,026 --> 00:11:41,386
Absolutely.
243
00:11:41,386 --> 00:11:44,056
Because I think this is gonna
dominate the conversation.
244
00:11:44,356 --> 00:11:46,846
There's gonna be
a lot of predictions in this, and we're
245
00:11:46,846 --> 00:11:49,876
not gonna talk forever 'cause it's
gonna need to be multiple episodes to
246
00:11:49,876 --> 00:11:51,556
really break down what's going on here.
247
00:11:51,556 --> 00:11:52,906
But we now have the definitions.
248
00:11:53,156 --> 00:11:55,166
AI agents, what are tools?
249
00:11:55,586 --> 00:11:58,196
The protocol behind it is
essentially MCP right now.
250
00:11:58,196 --> 00:12:01,406
Although that's not necessarily gonna be
the only thing, it's just the thing right
251
00:12:01,406 --> 00:12:03,896
now that we're agreeing on, from one company.
252
00:12:03,956 --> 00:12:04,466
Exactly.
253
00:12:04,766 --> 00:12:10,626
Nirmal: We have to caveat this:
this is early, like early Docker days.
254
00:12:10,656 --> 00:12:11,166
This is like
255
00:12:11,466 --> 00:12:14,016
Bret: Docker in day 60, right?
256
00:12:14,016 --> 00:12:14,106
Yes.
257
00:12:14,106 --> 00:12:17,726
Like we were right after
PyCon in 2013 when he gave that
258
00:12:17,726 --> 00:12:19,136
demo, when Solomon gave that demo.
259
00:12:19,616 --> 00:12:23,966
Like we all saw it and didn't
understand it fully, but it
260
00:12:23,966 --> 00:12:25,316
felt like something, right?
261
00:12:25,316 --> 00:12:28,886
And like you and I both, that's why
we were early Docker Captains, is
262
00:12:28,886 --> 00:12:31,586
we saw that as a platform shift.
263
00:12:31,804 --> 00:12:35,294
We've seen these waves before
over our careers of many decades
264
00:12:35,484 --> 00:12:40,064
that earned us this gray
beard status with effort and toil.
265
00:12:40,364 --> 00:12:43,814
And I feel like this is maybe the moment.
266
00:12:44,624 --> 00:12:48,794
That was the moment of 2013.
And yeah, I'm not alone in that feeling.
267
00:12:48,854 --> 00:12:49,134
Yes.
268
00:12:49,134 --> 00:12:52,674
Nirmal: And, just to be clear,
there's massive differences between
269
00:12:52,794 --> 00:12:57,024
the paradigm shifts in terms of
virtualization, cloud, containers,
270
00:12:57,104 --> 00:13:02,004
And the tooling of software
development and systems development
271
00:13:02,004 --> 00:13:05,964
and systems operations,
it's still in that same vein, but
272
00:13:06,324 --> 00:13:07,154
Yeah, we're not replacing,
273
00:13:07,154 --> 00:13:09,944
Bret: this is not replacing infrastructure
or containers or anything like that.
274
00:13:09,994 --> 00:13:11,914
This is just gonna change the way we work.
275
00:13:12,214 --> 00:13:12,634
Nirmal: Correct.
276
00:13:12,634 --> 00:13:15,724
And also it's broader than
just like IT infrastructure.
277
00:13:15,854 --> 00:13:20,624
Like this has implications with
software design or application,
278
00:13:20,624 --> 00:13:22,154
like what an application does.
279
00:13:22,254 --> 00:13:25,344
And I want to think of
this as a teaser trailer
280
00:13:25,659 --> 00:13:27,644
to a subsequent new series episode.
281
00:13:27,644 --> 00:13:27,766
A new series.
282
00:13:27,771 --> 00:13:28,449
Yeah, absolutely.
283
00:13:28,479 --> 00:13:28,779
We're gonna have to
284
00:13:28,779 --> 00:13:29,589
Bret: come up with a name.
285
00:13:29,589 --> 00:13:32,739
I'm toying around with the idea of
Agentic DevOps, and just classifying
286
00:13:32,739 --> 00:13:36,709
that, absolutely, as the theme
of certain podcast episodes.
287
00:13:36,709 --> 00:13:37,760
You've heard it here first.
288
00:13:37,760 --> 00:13:37,904
Heard it here first.
289
00:13:37,939 --> 00:13:38,139
This
290
00:13:38,169 --> 00:13:39,439
Nirmal: is Agentic DevOps.
291
00:13:39,509 --> 00:13:41,999
Another term we're seeing is AI for Ops.
292
00:13:42,299 --> 00:13:43,559
Again, this is early days.
293
00:13:43,589 --> 00:13:44,549
None of this is like
294
00:13:44,609 --> 00:13:44,789
Bret: Yeah.
295
00:13:44,789 --> 00:13:45,569
Set in stone at all.
296
00:13:45,569 --> 00:13:48,839
Yeah, and if you're at KubeCon today
with us, if you were here at this
297
00:13:48,839 --> 00:13:52,019
conference all week, AI was a constant
topic, but it wasn't about this.
298
00:13:52,394 --> 00:13:57,164
Actually, there was only one talk in
an entire week that even touched on the
299
00:13:57,164 --> 00:14:03,794
idea of using AI to do the job of a
DevOps engineer, operator, or platform engineer.
300
00:14:03,854 --> 00:14:06,464
Like, what we've been talking
about at KubeCon for the last three
301
00:14:06,464 --> 00:14:10,664
years has been how to run the
inference and build the LLM models.
302
00:14:11,054 --> 00:14:15,164
And so we are just still using
human effort to do that work.
303
00:14:15,359 --> 00:14:19,514
But this, I feel like I'm gonna draw the
line in the sand and say, this is the
304
00:14:19,974 --> 00:14:24,804
month, or definitely
the year, that kicks off
305
00:14:25,274 --> 00:14:30,214
what will be a multi-year effort of
figuring out how we use automated
306
00:14:30,214 --> 00:14:33,874
LLMs, essentially, with access to all
the tools we want to give it, with
307
00:14:33,904 --> 00:14:36,284
the proper permissions and only
the permissions we want to give
308
00:14:36,284 --> 00:14:39,074
it, to do our work for us.
309
00:14:39,479 --> 00:14:42,809
In a less Chaos Monkey way, right?
310
00:14:42,809 --> 00:14:44,009
Like less chaotic way.
311
00:14:44,014 --> 00:14:44,204
Potentially.
312
00:14:44,279 --> 00:14:44,639
Potentially.
313
00:14:44,729 --> 00:14:46,709
This thing can
easily go off the rails.
314
00:14:46,799 --> 00:14:47,189
Absolutely.
315
00:14:47,189 --> 00:14:51,959
I will probably reference in the show
notes Solomon Hykes's recent talks about
316
00:14:51,969 --> 00:14:56,329
how they're now using Dagger, which
is primarily a CI/CD pipeline tool.
317
00:14:56,379 --> 00:15:01,179
So he's talking about this, and a lot of my
language is actually from him, iterating
318
00:15:01,179 --> 00:15:04,899
on his idea of what this might look
like when we're throwing a bunch
319
00:15:04,899 --> 00:15:10,599
of crazy hallucinating AI into what
we consider a deterministic world.
320
00:15:10,969 --> 00:15:11,219
Correct.
321
00:15:11,219 --> 00:15:17,399
Nirmal: I think with containers and cloud
and the infrastructure APIs we have,
322
00:15:17,894 --> 00:15:22,694
We were chipping away and really
aiming at deterministic behavior
323
00:15:22,694 --> 00:15:24,566
with respect to infrastructure.
324
00:15:26,434 --> 00:15:28,646
Ironically, maybe not
ironically, I don't know.
325
00:15:29,006 --> 00:15:33,776
Now we're introducing a paradigm
shift that reintroduces a lot
326
00:15:33,776 --> 00:15:35,971
of non-determinism right into
327
00:15:36,656 --> 00:15:40,376
a place where we have been fighting
non-determinism for a long time.
328
00:15:41,066 --> 00:15:43,376
Bret: We have been working
to get rid of all that.
329
00:15:43,376 --> 00:15:46,226
And now, that's why I keep
saying Chaos Monkey, because we're
330
00:15:46,226 --> 00:15:47,756
throwing a wrench into the system.
331
00:15:47,766 --> 00:15:52,056
That in some ways feels like we're
going back to a world of, I don't
332
00:15:52,056 --> 00:15:53,706
know, what's the status of the system?
333
00:15:53,706 --> 00:15:54,036
I don't know.
334
00:15:54,706 --> 00:15:57,896
And this will probably be another
episode. I feel like this agentic
335
00:15:57,896 --> 00:16:00,896
approach actually has
the potential to pit
336
00:16:01,271 --> 00:16:03,341
The LLMs against each other, right?
337
00:16:03,341 --> 00:16:05,231
And have different
personas of these agents.
338
00:16:05,381 --> 00:16:07,901
One is the validator, one is the tester.
339
00:16:08,051 --> 00:16:10,121
One is the builder.
340
00:16:10,181 --> 00:16:11,771
And they can fight amongst each other.
341
00:16:11,861 --> 00:16:12,701
And it all works out.
342
00:16:12,701 --> 00:16:15,011
It actually happens
to work out better.
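The builder-versus-validator idea can be sketched as a simple loop. Both `call_builder` and `call_validator` are hypothetical stand-ins for real LLM calls; the stubs below just show the control flow.

```python
# Rough sketch of pitting agent personas against each other: a "builder"
# proposes, a "validator" critiques, and they iterate until the validator
# accepts or a round limit is hit. The stub logic is illustrative only.

def call_builder(task, feedback):
    # Stand-in: a real builder would prompt an LLM with the task + feedback.
    return f"plan for {task} (revision {len(feedback)})"

def call_validator(proposal):
    # Stand-in: a real validator would prompt a second LLM to critique.
    # Here we accept anything past the first revision, to exercise the loop.
    return None if "revision 1" in proposal else "needs another pass"

def builder_validator_loop(task, max_rounds=5):
    feedback = []
    for _ in range(max_rounds):
        proposal = call_builder(task, feedback)
        critique = call_validator(proposal)
        if critique is None:
            return proposal  # validator accepted
        feedback.append(critique)
    return None  # no agreement within the round limit
```

The round limit matters: without it, two disagreeing models could loop forever, which is exactly the kind of guardrail this episode keeps coming back to.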
343
00:16:15,251 --> 00:16:19,091
And so if you're like me and
for the last three years of
344
00:16:19,151 --> 00:16:21,041
understanding, ever since GPT
345
00:16:21,551 --> 00:16:22,961
3.5 or whatever came out.
346
00:16:22,961 --> 00:16:27,671
We all saw ChatGPT as a product, and
then we started with GitHub Copilot and
347
00:16:27,721 --> 00:16:32,571
we started down this road. As a DevOps
person, I haven't had a lot to talk about
348
00:16:32,811 --> 00:16:36,531
because I'm not interested in which model
is the fastest or the most accurate.
349
00:16:36,531 --> 00:16:37,191
'cause you know what?
350
00:16:37,321 --> 00:16:41,501
they all still hallucinate,
even today, years later.
351
00:16:42,491 --> 00:16:45,791
Code agents too, and you can see
this on YouTube: you can watch basically
352
00:16:45,791 --> 00:16:49,421
thousands of videos on YouTube of
people trying to use these models to
353
00:16:49,421 --> 00:16:51,911
write perfect code and they just don't.
354
00:16:52,401 --> 00:16:56,151
And so we in ops look at
that, I think, and the people I
355
00:16:56,151 --> 00:17:00,201
talk to, even for years now, are like,
we're never gonna use that for ops.
356
00:17:00,241 --> 00:17:03,061
But now my opinion has changed.
357
00:17:03,121 --> 00:17:03,271
Yeah.
358
00:17:03,621 --> 00:17:03,981
Nirmal: yeah.
359
00:17:03,981 --> 00:17:07,981
And if you're listening to this
and your gut reaction is: wait, we
360
00:17:07,981 --> 00:17:10,291
have APIs that are deterministic.
361
00:17:10,291 --> 00:17:10,891
Like you just
362
00:17:11,221 --> 00:17:11,461
Bret: Yeah.
363
00:17:11,761 --> 00:17:12,871
Nirmal: We can just call an API.
364
00:17:12,871 --> 00:17:17,431
We can have an automation tool call an
API to stand up infrastructure and like,
365
00:17:17,431 --> 00:17:23,041
why do we need to create another
layer that makes it non-deterministic?
366
00:17:23,041 --> 00:17:27,751
That looks like an API but isn't an API,
and you don't really know what it might
367
00:17:27,751 --> 00:17:30,481
do or which direction it might go.
368
00:17:30,841 --> 00:17:31,021
Yeah.
369
00:17:31,051 --> 00:17:32,261
And you're feeling, I don't know,
370
00:17:32,261 --> 00:17:35,291
That doesn't seem like it would
solve any problems for me.
371
00:17:35,291 --> 00:17:37,301
And it seems like it might
introduce a lot of problems.
372
00:17:37,501 --> 00:17:39,811
You're in the right place because
that's exactly what we're gonna explore.
373
00:17:40,111 --> 00:17:40,531
Bret: Yeah.
374
00:17:40,881 --> 00:17:44,181
Nirmal: One thing for sure,
though, is it's here, right?
375
00:17:44,616 --> 00:17:51,206
And so I feel like, as good engineers,
as good system admins and operators,
376
00:17:51,686 --> 00:17:53,301
Bret: we enjoy, we love our craft.
377
00:17:53,301 --> 00:17:54,231
We look at this as an
378
00:17:54,546 --> 00:17:57,446
art form of brain power and, right,
379
00:17:57,546 --> 00:18:00,766
reaching for perfectionism in our
YAML and in our infrastructure
380
00:18:00,766 --> 00:18:02,716
optimization and our security.
381
00:18:02,936 --> 00:18:07,946
Nirmal: And we have a healthy
sense of skepticism on new tools,
382
00:18:07,946 --> 00:18:09,746
new processes, new mechanisms.
383
00:18:09,746 --> 00:18:09,956
Yeah.
384
00:18:10,016 --> 00:18:13,946
When availability of your
services and reliability are paramount,
385
00:18:14,306 --> 00:18:16,796
you want to introduce new things
386
00:18:17,186 --> 00:18:18,476
in a prudent manner.
387
00:18:18,576 --> 00:18:22,446
And so we're gonna take that
approach, but we're not going
388
00:18:22,446 --> 00:18:24,936
to dismiss that this exists.
389
00:18:24,991 --> 00:18:30,431
Clearly there's a lot of interest,
energy, and integration happening,
390
00:18:30,761 --> 00:18:35,781
experimentation happening, and some
people are already starting to see value.
391
00:18:36,021 --> 00:18:36,201
Yeah.
392
00:18:36,251 --> 00:18:39,175
And we're gonna explore
with you where that goes.
393
00:18:39,205 --> 00:18:39,415
Bret: Yeah.
394
00:18:39,415 --> 00:18:46,585
Just to be clear, this is
KubeCon, April 2025, and almost
395
00:18:46,585 --> 00:18:48,565
no one is talking about this yet.
396
00:18:48,845 --> 00:18:52,835
It feels like it's right under the
surface of a lot of conversations and
397
00:18:52,835 --> 00:18:56,075
a lot of people maybe are thinking
about it, but I'm not even sure that
398
00:18:56,075 --> 00:18:58,880
we're honest with ourselves around
399
00:18:59,645 --> 00:19:02,135
that this is coming,
whether we like it or not.
400
00:19:02,345 --> 00:19:09,455
And, yeah, not the only reason, but
one of the large reasons, is business.
401
00:19:10,055 --> 00:19:10,385
Okay.
402
00:19:10,745 --> 00:19:11,255
Lemme back up.
403
00:19:11,255 --> 00:19:15,275
You know how in a lot of organizations,
Kubernetes became a mandate, right?
404
00:19:15,275 --> 00:19:18,495
So there's lots of stories that came
out over the course of Kubernetes
405
00:19:18,495 --> 00:19:22,425
lifetime of teams being told that
they need to implement Kubernetes.
406
00:19:22,475 --> 00:19:27,545
It didn't come from a systems engineering
approach of solving a known problem.
407
00:19:27,545 --> 00:19:28,595
It came down
408
00:19:28,880 --> 00:19:33,390
because an executive read
a CIO magazine article
409
00:19:33,390 --> 00:19:35,580
that said Kubernetes was a cool
new thing, and so they did it, right?
410
00:19:35,730 --> 00:19:37,020
I hear this all the time.
411
00:19:37,020 --> 00:19:41,280
I confirmed this multiple times this
week with other people, and I now feel
412
00:19:41,280 --> 00:19:43,770
like we're not talking about it yet.
413
00:19:44,190 --> 00:19:51,300
But I did hear multiple analysts say the
organizations that they're working with
414
00:19:51,330 --> 00:19:57,480
expect that we are going to reduce the
number of personnel in infrastructure.
415
00:19:57,780 --> 00:19:58,830
Because of AI.
416
00:19:58,955 --> 00:20:02,315
The only way that's possible
is if we use agents to our
417
00:20:02,315 --> 00:20:04,295
advantage, because we can't, yeah.
418
00:20:04,295 --> 00:20:06,425
I still don't believe
we're replacing ourselves.
419
00:20:06,755 --> 00:20:09,735
I don't think the agents will,
in the near term.
420
00:20:09,735 --> 00:20:12,955
And as far as we can see out, let's
say five years, they
421
00:20:12,955 --> 00:20:16,255
won't be running all infrastructure
in the world by themselves.
422
00:20:16,375 --> 00:20:17,815
They can't turn on servers.
423
00:20:17,995 --> 00:20:22,095
Maybe you can actually PXE boot
and do a power-on over PoE or whatever, but.
424
00:20:22,845 --> 00:20:27,405
Like we still need someone to give them
orders and rules and guidelines to go
425
00:20:27,405 --> 00:20:31,695
do the work, but to me, I'm starting
to wonder if very quickly, especially
426
00:20:31,695 --> 00:20:35,475
for those bleeding-edge organizations that
are looking to squeeze out every cost
427
00:20:35,475 --> 00:20:40,425
optimization they can out of their staff,
that they're going to be mandated to
428
00:20:40,605 --> 00:20:46,785
not just take AI as a code gen for YAML,
but to start using these agents to
429
00:20:47,280 --> 00:20:51,375
increase the velocity of their
work. And one of my stories is
430
00:20:51,375 --> 00:20:55,185
over the last 30 years, I tell this in
talks, every major shift has been
431
00:20:55,185 --> 00:20:57,675
about cost reduction and speed.
432
00:20:57,915 --> 00:20:59,835
Sometimes we get 'em
both at the same time.
433
00:20:59,895 --> 00:21:01,275
Sometimes they're one or the other.
434
00:21:01,275 --> 00:21:03,435
We get a cost reduction, but we
don't go any faster, which is
435
00:21:03,435 --> 00:21:06,855
fine, or we're going faster, but
it's not necessarily cheaper yet.
436
00:21:06,855 --> 00:21:07,215
Nirmal: Right.
437
00:21:07,515 --> 00:21:07,755
Bret: And.
438
00:21:09,060 --> 00:21:13,020
I feel like this is maybe the next
one where we're gonna be feeling the
439
00:21:13,020 --> 00:21:17,010
pressure because all the devs are
gonna be writing code with ai, which
440
00:21:17,010 --> 00:21:21,150
in theory is going to improve their
performance, which means they're writing
441
00:21:21,150 --> 00:21:24,390
more code, shipping more, or
wanting to ship more code, potentially.
442
00:21:24,390 --> 00:21:27,300
And if we're not using AI ourselves
443
00:21:27,600 --> 00:21:32,250
to automate more of these platform
designs, platform build outs,
444
00:21:32,310 --> 00:21:35,430
troubleshooting when we're in production
and things are problematic and we
445
00:21:35,430 --> 00:21:38,050
don't wanna spend three hours trying
to find the source of the problem.
446
00:21:38,290 --> 00:21:43,510
If we're not starting to use agents
to automate a lot of that and reduce the
447
00:21:43,510 --> 00:21:48,550
time to market, so to speak, for a certain
feature or platform feature, then I don't
448
00:21:48,550 --> 00:21:52,860
think these teams are gonna hire more
of us to help enable the devs to deploy.
449
00:21:53,190 --> 00:21:57,180
What could end up happening is we
end up with more shadow ops, where
450
00:21:57,180 --> 00:22:01,020
the developers are so fed up with us
not speeding up. If they're
451
00:22:01,020 --> 00:22:03,210
gonna go 10x, we have to go 10x. Yeah.
452
00:22:03,260 --> 00:22:05,990
If they're gonna go 3x, or whatever
the number ends up being in the reports
453
00:22:05,990 --> 00:22:09,255
that Gartner puts out, like AI
makes it more efficient
454
00:22:09,255 --> 00:22:11,235
for developers to code with AI.
455
00:22:11,235 --> 00:22:13,725
And the models get better and
the way they use it is better.
456
00:22:14,115 --> 00:22:17,575
And so they're shipping code faster and
they can do the same speed with three
457
00:22:17,575 --> 00:22:19,383
times fewer developers, or they can just
458
00:22:19,825 --> 00:22:22,993
produce three times more work, which I
think is more likely, because if it's
459
00:22:22,993 --> 00:22:25,903
the common denominator and everyone
has it, then that means every company
460
00:22:25,903 --> 00:22:29,323
can execute faster, and they're
gonna want to do that because
461
00:22:29,323 --> 00:22:30,433
their competitors are doing that.
462
00:22:30,433 --> 00:22:33,193
So that's a very
loaded and long prediction.
463
00:22:34,063 --> 00:22:35,173
Nirmal: That's a hypothesis.
464
00:22:35,663 --> 00:22:36,753
I think there's
a lot of prediction here.
465
00:22:36,753 --> 00:22:40,473
It's gonna take some time for us to
even chip away at that hypothesis,
466
00:22:40,473 --> 00:22:42,033
but it's a good starting point.
467
00:22:42,133 --> 00:22:47,593
But assuming that is
the hypothesis that organizations
468
00:22:47,593 --> 00:22:51,103
are looking at to adopt these
tools, that's a great starting point
469
00:22:51,103 --> 00:22:53,773
for us to help you figure out
470
00:22:54,283 --> 00:22:57,133
what they are, why they are, what they do.
471
00:22:57,163 --> 00:22:57,313
Yeah.
472
00:22:57,313 --> 00:22:58,063
And how to use them.
473
00:22:58,363 --> 00:23:01,093
Bret: This is, by the way, a
little bit of that opinion
474
00:23:01,093 --> 00:23:03,553
of mine, and there's more to come
'cause I've got a lot more written
475
00:23:03,553 --> 00:23:04,423
down that we're never gonna get to.
476
00:23:04,843 --> 00:23:09,443
But a significant portion of that is
actually coming from what I've learned
477
00:23:09,443 --> 00:23:14,543
this week from analysts whose job it
is to figure this stuff out for their
478
00:23:14,543 --> 00:23:16,013
organizations and their customers.
479
00:23:16,013 --> 00:23:16,373
Interesting.
480
00:23:16,553 --> 00:23:20,693
And so I am a little weighted by their
481
00:23:21,698 --> 00:23:25,508
almost unrealistic expectations
of how fast we can do this.
482
00:23:25,508 --> 00:23:26,678
'cause we are still humans.
483
00:23:26,838 --> 00:23:30,078
An organization can't adopt AI until
the humans learn how to adopt AI and
484
00:23:30,078 --> 00:23:31,728
the humans have to go at human speed.
485
00:23:31,808 --> 00:23:34,928
So we can't just flip a switch
and suddenly AI is here and
486
00:23:34,928 --> 00:23:35,888
running everything for us.
487
00:23:35,888 --> 00:23:38,978
At least not until we
have Iron Man's Jarvis.
488
00:23:39,008 --> 00:23:39,488
Or whatever.
489
00:23:39,488 --> 00:23:42,908
Like until we have that, we still have
to learn these tools and still have
490
00:23:42,908 --> 00:23:44,528
to adapt our platforms to use them.
491
00:23:44,528 --> 00:23:44,529
Yes.
492
00:23:44,534 --> 00:23:46,088
And adapt our learning to use them.
493
00:23:46,448 --> 00:23:47,738
And that's gonna take some time
494
00:23:47,798 --> 00:23:48,308
Nirmal: and.
495
00:23:48,663 --> 00:23:51,633
I'd like the parting
thought for this to be: okay.
496
00:23:51,633 --> 00:23:56,193
And here, like you said, there's an under
the surface kind of thing happening.
497
00:23:56,223 --> 00:23:56,583
Yeah.
498
00:23:56,943 --> 00:23:57,783
So whispers,
499
00:23:57,783 --> 00:23:58,983
Bret: it's almost like murmurs and under
500
00:23:58,983 --> 00:23:59,583
Nirmal: the surface.
501
00:23:59,588 --> 00:23:59,788
Yeah.
502
00:24:00,088 --> 00:24:02,793
AI agent, AI agents, mag
503
00:24:03,093 --> 00:24:03,423
Bret: DevOps.
504
00:24:03,453 --> 00:24:03,843
Ooh.
505
00:24:04,383 --> 00:24:05,583
This is our ASMR
506
00:24:05,583 --> 00:24:06,333
moment of the podcast.
507
00:24:08,288 --> 00:24:09,518
Nirmal: Like MCP protocol.
508
00:24:09,818 --> 00:24:10,148
Bret: Yeah.
509
00:24:10,488 --> 00:24:14,448
Nirmal: You mentioned HAProxy on
the previous podcast, about load
510
00:24:14,448 --> 00:24:18,408
balancing and figuring out,
like, token utilization of
511
00:24:18,408 --> 00:24:20,958
GPUs and tokens and all that stuff.
512
00:24:21,208 --> 00:24:25,318
And we had a conversation at the Solo
booth, and they were talking about having
513
00:24:25,778 --> 00:24:30,008
a proxy for an MCP gateway. One of
the things that we're seeing the early
514
00:24:30,008 --> 00:24:32,688
signs of is these new workloads, right?
515
00:24:32,688 --> 00:24:39,268
This agentic kind of thinking around
even just executing the agentic platform,
516
00:24:39,268 --> 00:24:43,678
if you will. And everything from
looking at the tokens and optimizing
517
00:24:43,678 --> 00:24:50,268
load balancing to inference endpoints.
Or MCP, which doesn't behave the same
518
00:24:50,268 --> 00:24:52,398
way as just an HTTP connection.
519
00:24:52,498 --> 00:24:53,158
Necessarily.
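[Editor's sketch of the point above: MCP messages are JSON-RPC 2.0 exchanged over a long-lived session, so a gateway has to correlate message IDs and handle streamed responses rather than proxying stateless one-shot requests. The helper name `jsonrpc_request` and the `kubectl_get` tool are illustrative assumptions, not a real MCP SDK.]

```python
import json

def jsonrpc_request(method: str, params: dict, id_: int) -> str:
    """Frame one MCP-style message. Unlike a stateless HTTP call,
    the id must be remembered for the life of the session so the
    matching response (possibly streamed later) can be correlated."""
    return json.dumps({"jsonrpc": "2.0", "id": id_,
                       "method": method, "params": params})

# Example: ask a server to invoke a (hypothetical) tool.
msg = jsonrpc_request("tools/call",
                      {"name": "kubectl_get", "arguments": {"kind": "pods"}},
                      id_=1)
```

A plain reverse proxy can forward these bytes, but load balancing them well means keeping each session pinned and aware of in-flight ids, which is what the MCP/AI gateways discussed here are for.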
520
00:24:53,398 --> 00:24:53,968
And Solo,
521
00:24:53,968 --> 00:24:56,368
we were talking to them and
they have an MCP gateway.
522
00:24:56,648 --> 00:24:59,198
We're seeing a little bit more
of a trend on AI gateways.
523
00:24:59,198 --> 00:25:04,598
Istio, the project, has an AI gateway, and
so this is not just another workload
524
00:25:04,778 --> 00:25:06,368
that looks like just a web server.
525
00:25:06,417 --> 00:25:09,117
And the networking and
everything is gonna be different.
526
00:25:09,197 --> 00:25:12,497
Not dramatically different,
but different
527
00:25:12,497 --> 00:25:14,177
enough that we need to be aware.
528
00:25:14,777 --> 00:25:18,377
'cause even if you're not using
any of these tools, someone in your
529
00:25:18,377 --> 00:25:21,077
organization is probably gonna say,
oh, we need to integrate this stuff
530
00:25:21,107 --> 00:25:23,567
into our software, right?
531
00:25:23,567 --> 00:25:24,947
Whatever we're delivering.
532
00:25:25,427 --> 00:25:27,407
And we'll need to know
it even at that layer.
533
00:25:27,467 --> 00:25:30,887
So we're gonna also cover that
component as it relates to
534
00:25:31,427 --> 00:25:32,897
the Kubernetes ecosystem, right?
535
00:25:32,912 --> 00:25:33,542
And cloud native.
536
00:25:33,842 --> 00:25:34,202
Bret: Yeah.
537
00:25:34,322 --> 00:25:38,172
I think this, if we had to do like an
elevator pitch for this podcast, it would
538
00:25:38,172 --> 00:25:46,252
be: we now have an industry idea around
this term, agent, and then it uses a protocol
539
00:25:46,252 --> 00:25:50,568
called MCP to allow us to give more work
540
00:25:50,898 --> 00:25:55,218
to these crazy robot texting things
that we have to talk to in human
541
00:25:55,218 --> 00:25:56,898
language and not with code, right?
542
00:25:56,898 --> 00:25:58,878
It's running code, but we're
not talking to it with code.
543
00:25:59,148 --> 00:26:04,238
And that it can now understand all the
tools we need to use and we can just give
544
00:26:04,238 --> 00:26:05,558
it a list of everything we want it to use.
545
00:26:05,700 --> 00:26:08,820
Here's my Kubernetes API, here's
all my other things that you have
546
00:26:08,820 --> 00:26:11,070
access to, and here's my problem.
547
00:26:11,400 --> 00:26:12,420
Go solve it.
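[Editor's sketch of what "here's my tools, here's my problem, go solve it" can look like under the hood: a thin harness that hands the model a tool list and dispatches the tool calls it emits. The names `TOOLS` and `dispatch` are hypothetical, not any real agent framework or MCP SDK.]

```python
import json
import subprocess

# Hypothetical tool registry: the list of things we hand the agent.
TOOLS = {
    "kubectl_get": {
        "description": "Read-only: fetch a Kubernetes resource kind as JSON",
        "run": lambda kind: subprocess.run(
            ["kubectl", "get", kind, "-o", "json"],
            capture_output=True, text=True).stdout,
    },
}

def dispatch(tool_call: dict) -> str:
    """Route one model-emitted tool call to a local function,
    returning its output (or an error) as a string for the model."""
    name = tool_call.get("name")
    if name not in TOOLS:
        return json.dumps({"error": f"unknown tool: {name}"})
    return TOOLS[name]["run"](**tool_call.get("arguments", {}))
```

In a real agent loop, the model's reply would be parsed into such `tool_call` dicts and the dispatch output fed back as the next message until the problem is solved.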
548
00:26:12,570 --> 00:26:15,330
And that paradigm,
549
00:26:15,705 --> 00:26:19,125
three months ago, two months ago
for me, I didn't know existed.
550
00:26:19,695 --> 00:26:22,695
And that's why I've been sitting
on the sidelines with AI.
551
00:26:22,695 --> 00:26:26,775
Like it's cool for writing programs
that mostly work in a demo.
552
00:26:27,025 --> 00:26:30,235
It's cool for adding a feature to
something I already have, but it's
553
00:26:30,235 --> 00:26:34,135
not doing my job as a platform
engineer or DevOps engineer.
554
00:26:34,185 --> 00:26:36,075
It's just helping me write text faster
555
00:26:36,169 --> 00:26:37,459
than I can type on my keyboard.
556
00:26:37,509 --> 00:26:39,129
And that was not that interesting.
557
00:26:39,129 --> 00:26:41,499
That's why you didn't see a lot of
me talking about that on this show,
558
00:26:41,679 --> 00:26:42,609
it just wasn't that interesting.
559
00:26:42,759 --> 00:26:48,369
This is an interesting topic for ops, and
absolutely for engineers on the platform.
560
00:26:48,669 --> 00:26:48,849
Nirmal: Yep.
561
00:26:49,164 --> 00:26:49,464
Bret: So
562
00:26:49,944 --> 00:26:50,544
Nirmal: stay tuned.
563
00:26:50,634 --> 00:26:51,174
Yeah.
564
00:26:51,174 --> 00:26:54,504
And I love crazy texting robots.
565
00:26:54,534 --> 00:26:54,894
Crazy
566
00:26:54,894 --> 00:26:55,824
Bret: texting robots.
567
00:26:55,954 --> 00:26:57,544
Maybe that's the title.
568
00:26:57,544 --> 00:26:58,024
TBD.
569
00:27:00,244 --> 00:27:00,784
Alright.
570
00:27:00,844 --> 00:27:01,354
Alright.
571
00:27:01,354 --> 00:27:02,104
See you soon, man.
572
00:27:02,704 --> 00:27:02,914
See
573
00:27:02,914 --> 00:27:03,094
Nirmal: you.
574
00:27:03,094 --> 00:27:03,096
See you.
575
00:27:03,754 --> 00:27:04,324
Bye.
576
00:27:04,324 --> 00:27:04,384
Bye.