
Screaming in the Cloud
· E649
The Transformation Trap: Why Software Modernization Is Harder Than It Looks
Episode Transcript
1
00:00:00,290 --> 00:00:02,000
These were all the sort
of basic primitives.
2
00:00:02,000 --> 00:00:05,750
And then, you know, at some point
we said, well, recipes could also
3
00:00:05,770 --> 00:00:09,729
emit structured data in the form of
tables, just rows and columns of data.
4
00:00:10,270 --> 00:00:14,080
And we would allow folks to run
those over thousands or tens of
5
00:00:14,080 --> 00:00:18,119
thousands of these lossless semantic
tree artifacts and extract data out.
6
00:00:18,949 --> 00:00:25,440
This wound up being the fruitful bed
for LLMs eventually arriving, is that
7
00:00:25,440 --> 00:00:29,959
we had thousands of these recipes
emitting data in various different forms.
8
00:00:30,580 --> 00:00:34,359
And if you could just expose, as tools,
all of those thousands of recipes to a
9
00:00:34,359 --> 00:00:38,270
model and say, okay, I have a question
for you about this business unit.
10
00:00:38,440 --> 00:00:42,760
The model could select the right recipe,
deterministically run it on potentially
11
00:00:42,760 --> 00:00:46,360
hundreds of millions of lines of code,
get the data table back, reason about
12
00:00:46,360 --> 00:00:47,799
it, combine it with something else.
13
00:00:48,230 --> 00:00:52,329
And that's the sort of, I think,
foundation for large language
14
00:00:52,340 --> 00:00:55,730
models to help with large scale
transformation and impact analysis.
15
00:01:01,030 --> 00:01:02,899
Welcome to Screaming in the Cloud.
16
00:01:03,139 --> 00:01:08,679
I'm Corey Quinn, and my guest today has
been invited to the show because I've
17
00:01:08,719 --> 00:01:14,830
been experiencing, this may shock you, some
skepticism around a number of things in
18
00:01:14,830 --> 00:01:17,170
the industry, but we'll get into that.
19
00:01:17,450 --> 00:01:20,740
Jonathan Schneider is the CEO of Moderne.
20
00:01:20,849 --> 00:01:23,559
Uh, and before that, you've
done a lot of things.
21
00:01:23,559 --> 00:01:25,460
Jonathan, first, thank you for joining me.
22
00:01:26,070 --> 00:01:26,369
Yeah.
23
00:01:26,380 --> 00:01:27,580
Thanks for having me here, Corey.
24
00:01:27,580 --> 00:01:28,259
Such a pleasure.
25
00:01:28,740 --> 00:01:31,350
Crying Out Cloud is one of the
few cloud security podcasts
26
00:01:31,350 --> 00:01:33,600
that's actually fun to listen to.
27
00:01:33,690 --> 00:01:36,589
Smart Conversations, great
guests, and zero fluff.
28
00:01:36,640 --> 00:01:41,340
If you haven't heard of it, it's a cloud
and AI security podcast from Wiz run
29
00:01:41,349 --> 00:01:44,000
by CloudSec pros, for CloudSec pros.
30
00:01:44,219 --> 00:01:46,789
I was actually one of the first
guests on the show and it's
31
00:01:46,789 --> 00:01:48,279
been amazing to watch it grow.
32
00:01:48,400 --> 00:01:53,410
Make sure to check them out
at wiz.io/crying-out-cloud.
33
00:01:53,830 --> 00:01:56,460
Uh, we, we always have to start with
a book story because honestly, I'm
34
00:01:56,460 --> 00:01:58,130
envious of those who can write a book.
35
00:01:58,150 --> 00:02:02,610
I just write basically 18 volumes
of Twitter jokes over the years,
36
00:02:02,610 --> 00:02:05,560
but never actually sat down and
put anything cohesive together.
37
00:02:05,840 --> 00:02:10,210
Uh, you were the author of SRE with
Java Microservices and the co-author
38
00:02:10,210 --> 00:02:13,980
of Automated Code Remediation:
How to Refactor and Secure the
39
00:02:13,990 --> 00:02:15,730
Modern Software Supply Chain.
40
00:02:15,969 --> 00:02:18,459
So you are professionally
depressed, I assume.
41
00:02:19,330 --> 00:02:23,979
I mean, I, like most software
engineers, I hate writing documentation.
42
00:02:23,980 --> 00:02:28,260
So somehow that translated into, you
know, write a, a full scale book instead.
43
00:02:28,260 --> 00:02:30,640
I, I honestly don't
remember how that happened.
44
00:02:32,190 --> 00:02:35,270
A series of escalating poor life
choices is my experience of it.
45
00:02:35,270 --> 00:02:36,780
I think no one wants to write a book.
46
00:02:36,820 --> 00:02:40,349
Everyone wants to have written a book, and
then you went and did it a second time.
47
00:02:41,260 --> 00:02:42,840
Yeah, a much smaller one.
48
00:02:42,840 --> 00:02:47,340
That second one, you know, just a 35
pager, luckily, but, but, you know, still,
49
00:02:47,460 --> 00:02:49,910
um, it's always, uh, quite the effort.
50
00:02:50,200 --> 00:02:53,879
So one thing that I, I wanted to bring
you in to talk about is that the core of
51
00:02:53,880 --> 00:02:57,959
what your company does, which is, I, I,
please correct me if I'm wrong on this,
52
00:02:58,010 --> 00:03:00,820
software rewrites, software modernization.
53
00:03:00,910 --> 00:03:05,850
Effectively, you were doing what
Amazon Q Transform purports to
54
00:03:05,850 --> 00:03:08,510
do, uh, before everyone went AI crazy.
55
00:03:08,730 --> 00:03:12,660
Yeah, it started for me almost 10
years ago now at Netflix on the
56
00:03:12,660 --> 00:03:16,290
engineering tools team, where I was
responsible for making people move
57
00:03:16,290 --> 00:03:20,910
forward, uh, in part, but they had
that freedom and responsibility
58
00:03:20,910 --> 00:03:23,760
culture, so I could tell 'em, you're
not where you're supposed to be.
59
00:03:23,760 --> 00:03:25,480
And they would say, great, do it for me.
60
00:03:25,480 --> 00:03:27,130
Otherwise I'm, I got other things to do.
61
00:03:27,340 --> 00:03:32,269
Uh, and so really that forced our
team into trying to find ways to
62
00:03:32,269 --> 00:03:34,310
automate that change on their behalf.
63
00:03:35,400 --> 00:03:38,640
I never worked at quite
that scale in production.
64
00:03:38,640 --> 00:03:41,880
I mean, I've consulted in there in
places like that, but the, that's a
65
00:03:41,880 --> 00:03:45,100
very different experience 'cause you're
hyperfocused on a specific problem.
66
00:03:45,500 --> 00:03:49,619
But even at the scales that I've
operated at, there was, there was
67
00:03:49,620 --> 00:03:53,369
never an intentional decision of,
someone's gonna start out today, and
68
00:03:53,500 --> 00:03:57,110
we're gonna write this in a language
and framework that are 20 years old.
69
00:03:57,260 --> 00:03:59,330
So this stuff has always
been extant for a while.
70
00:03:59,370 --> 00:04:04,010
It is, it has grown roots, it has worked
its way into business processes and a
71
00:04:04,010 --> 00:04:05,769
bunch of things have weird dependencies.
72
00:04:05,770 --> 00:04:07,109
In some cases on bugs.
73
00:04:07,420 --> 00:04:11,839
People are not, uh, declining to modernize
software stacks because they haven't
74
00:04:11,850 --> 00:04:13,359
heard that there's a new version out.
75
00:04:13,770 --> 00:04:17,720
It's because this stuff is painfully
hard because people and organizations
76
00:04:17,720 --> 00:04:19,440
that they build are painfully hard.
77
00:04:20,190 --> 00:04:24,510
I I'm curious, in your experience
having gone through this at scale
78
00:04:24,510 --> 00:04:28,709
with zeros on the end of it, what,
what are the, what are the sticking,
79
00:04:28,740 --> 00:04:29,579
what are the sticking points of this?
80
00:04:29,590 --> 00:04:31,659
Why don't people migrate?
81
00:04:31,700 --> 00:04:34,539
Is it more of a technological problem
or is it more of a people problem?
82
00:04:35,370 --> 00:04:39,530
Well, first I would start and hopefully
with a, a sympathetic viewpoint for
83
00:04:39,530 --> 00:04:42,700
the developer, which is like, pretend
I haven't written any software yet
84
00:04:42,710 --> 00:04:44,200
and I'm actually starting from today.
85
00:04:45,259 --> 00:04:49,570
I look at all the latest available
things, and I make the perfectly optimal
86
00:04:49,570 --> 00:04:53,550
choices for every part of my tech
stack today, and I, I write this thing
87
00:04:53,550 --> 00:04:58,380
completely clean six months from now,
those are no longer the optimal choices.
88
00:04:59,099 --> 00:04:59,680
Oh God, yes.
89
00:04:59,690 --> 00:04:59,950
The
90
00:05:00,170 --> 00:05:01,970
worst developer I ever
met is me six weeks ago.
91
00:05:02,160 --> 00:05:02,669
It's awful.
92
00:05:02,679 --> 00:05:04,139
Like, what, what was this idiot thinking?
93
00:05:04,139 --> 00:05:07,960
You do git blame and it's you, and wow,
we need not talk about that anymore.
94
00:05:08,170 --> 00:05:10,950
But yeah, the past me
was terrible at this.
95
00:05:11,300 --> 00:05:11,720
That's right.
96
00:05:11,730 --> 00:05:13,340
And, and always will be. Future
97
00:05:13,340 --> 00:05:15,180
you will be the, the next past you.
98
00:05:15,190 --> 00:05:18,879
So it's, it's, there's never an
opportunity where we can say we're making
99
00:05:18,880 --> 00:05:22,490
the optimal choice and that optimal choice
will continue to be right going forward.
100
00:05:22,490 --> 00:05:27,650
So, uh, I think that paired with one
other fact, which is just that the
101
00:05:27,930 --> 00:05:32,190
the tools available to us have essentially
industrialized software
102
00:05:32,190 --> 00:05:37,159
production to the point where we can
write net new software super quickly
103
00:05:37,340 --> 00:05:41,749
using off the shelf and third party
open source components. We're expected
104
00:05:41,750 --> 00:05:46,270
to, because you have to ship fast
and you know, then what do you do when
105
00:05:46,270 --> 00:05:48,300
that stuff evolves at its own pace?
106
00:05:48,570 --> 00:05:52,820
So nobody's really been good
at it, and I think the more
107
00:05:53,059 --> 00:05:57,600
authorship, uh, automation that we've,
that we've, uh, developed for ourselves
108
00:05:57,600 --> 00:06:02,470
from IDE rule-based intention actions
to now, you know, AI authorship.
109
00:06:03,210 --> 00:06:06,890
This, like the time that we spend
maintaining what we've previously
110
00:06:06,890 --> 00:06:08,350
written has continued to go up.
111
00:06:08,889 --> 00:06:09,590
I would agree.
112
00:06:10,020 --> 00:06:16,089
I, I think that there has been a, a
shift and a proliferation, really, of
113
00:06:16,289 --> 00:06:18,299
technical stacks and software choices.
114
00:06:18,350 --> 00:06:23,090
And as you say, even if you make the
optimal selection of every piece of
115
00:06:23,090 --> 00:06:26,200
the stack, which incidentally is where,
where some people tend to founder, they
116
00:06:26,200 --> 00:06:30,140
spend six months trying to figure out the
best approach, pick a direction and go,
117
00:06:30,140 --> 00:06:32,300
even a bad decision can be made to work.
118
00:06:32,740 --> 00:06:36,039
But, but there are so many different
paths to go that it's a near certainty
119
00:06:36,039 --> 00:06:37,509
that whatever you have built, you're
120
00:06:37,999 --> 00:06:41,520
there, you're going to be one
of a wide variety of different
121
00:06:41,520 --> 00:06:42,350
paths that you've picked.
122
00:06:42,510 --> 00:06:47,190
You've effectively become a unicorn pretty
quickly regardless of how mainstream
123
00:06:47,290 --> 00:06:48,719
each individual choice might be.
124
00:06:48,830 --> 00:06:49,289
That's right.
125
00:06:49,659 --> 00:06:49,919
Yep.
126
00:06:50,139 --> 00:06:52,249
That's just the nature of,
of software development.
127
00:06:53,400 --> 00:06:56,770
I, I am curious, since, uh, you did
bring up the, uh, Netflix, uh,
128
00:06:56,790 --> 00:06:58,670
freedom and responsibility culture.
129
00:06:58,940 --> 00:07:02,140
Uh, one thing that has made me
skeptical historically of Amazon
130
00:07:02,140 --> 00:07:05,489
Q's transform abilities and, and
many large companies that have taken a
131
00:07:05,490 --> 00:07:10,310
bite at this apple is they, they train
these things and build these things
132
00:07:10,379 --> 00:07:14,710
inside of a culture that has a very
particular point of view that drives
133
00:07:14,710 --> 00:07:16,300
how software development is done.
134
00:07:16,720 --> 00:07:20,560
Uh, like, how many people have we met
that have left large tech companies
135
00:07:20,570 --> 00:07:24,289
to go found a startup, tried to build
the exact same culture that they had
136
00:07:24,300 --> 00:07:28,410
at the large company and just foundered
on the rocks almost immediately.
137
00:07:28,430 --> 00:07:31,890
Because the culture shapes the company
and the company shapes the culture.
138
00:07:32,070 --> 00:07:34,700
You, you can't cargo cult
it and expect success.
139
00:07:35,290 --> 00:07:37,990
How, how varied do you find
that these modernization
140
00:07:38,000 --> 00:07:39,720
efforts are based upon culture?
141
00:07:40,210 --> 00:07:43,560
I'm glad to say that for my own story,
I had a degree of indirection here.
142
00:07:43,560 --> 00:07:48,019
I didn't go straight from Netflix to, to
founding something, so I was at Netflix.
143
00:07:48,049 --> 00:07:51,640
I think that freedom and responsibility
culture meant that Netflix in particular
144
00:07:51,650 --> 00:07:57,249
had far less self-similarity or
consistency than, say, a Google that has a
145
00:07:57,260 --> 00:08:01,189
very prescriptive standard for formatting
and everything and the way they do things.
146
00:08:01,799 --> 00:08:03,300
And so I left Netflix.
147
00:08:03,300 --> 00:08:06,329
I went to Pivotal, VMware,
was working with large
148
00:08:06,650 --> 00:08:11,250
enterprise customers like JP Morgan,
Fidelity, Home Depot, et cetera, working
149
00:08:11,250 --> 00:08:15,450
on an unrelated problem in continuous
delivery and saw them struggling with the
150
00:08:15,450 --> 00:08:18,730
same kind of problem of, like, migrations
and modernization, like everybody does.
151
00:08:19,170 --> 00:08:23,509
And what struck me was that even
though they're very different cultures,
152
00:08:24,150 --> 00:08:29,810
uh, JP Morgan much more strongly
resembles Netflix than it does Google.
153
00:08:30,300 --> 00:08:36,059
Um, Netflix's uh, lack of consistency
was by design or by culture, intentional.
154
00:08:36,380 --> 00:08:39,664
And JP Morgan's is just by the very
sheer nature of the fact that they have
155
00:08:39,664 --> 00:08:43,630
60,000 developers and 25 years of this,
of history and development on this.
156
00:08:43,630 --> 00:08:49,550
And so a solution that works well
for, uh, a culture dissimilar by design
157
00:08:49,600 --> 00:08:53,750
actually works well in the typical
enterprise, which is probably closer
158
00:08:53,750 --> 00:08:55,170
to Netflix than it is to Google.
159
00:08:56,160 --> 00:08:57,900
Yeah, a lot of it depends
on constraints too.
160
00:08:57,929 --> 00:09:01,820
Uh, JP Morgan is obviously highly
reg, sorry, JP Morgan Chase.
161
00:09:01,840 --> 00:09:05,180
They're particular about the naming,
people are. They're obviously highly
162
00:09:05,180 --> 00:09:09,860
regulated and mistakes matter in a
different context than they do when your
163
00:09:09,990 --> 00:09:14,720
basically entire business is streaming movies
and also creating original content that
164
00:09:14,720 --> 00:09:16,160
you then cancel just when it gets good.
165
00:09:17,199 --> 00:09:17,426
Right, right, right.
166
00:09:17,479 --> 00:09:17,519
Right.
167
00:09:18,999 --> 00:09:19,809
Yes.
168
00:09:20,350 --> 00:09:23,500
So there's, there is that question,
I guess, of how this stuff evolves.
169
00:09:23,500 --> 00:09:27,023
But taking it a bit away from the
culture side of it, how do you
170
00:09:27,120 --> 00:09:31,120
find that modernization differs
between programming languages?
171
00:09:31,450 --> 00:09:34,140
I mean, I, I dunno if people are watching
this on the video or listening to it,
172
00:09:34,190 --> 00:09:38,280
if we may, we take all kinds, but you're
wearing a hat right now that says JVM,
173
00:09:38,290 --> 00:09:43,089
so I'm, I'm just gonna speculate wildly
that Java might be your first love, given
174
00:09:43,090 --> 00:09:44,760
that you did in fact write a book on it.
175
00:09:45,010 --> 00:09:46,489
It was one of my first loves.
176
00:09:46,490 --> 00:09:46,590
Yeah.
177
00:09:46,880 --> 00:09:49,150
I'm technically a Java Champion right now.
178
00:09:49,150 --> 00:09:52,100
Although, you know, I actually started
in C++ and I hated Java for
179
00:09:52,100 --> 00:09:53,370
the first few years I worked on it.
180
00:09:53,370 --> 00:09:55,595
But, um, I actually think, uh,
181
00:09:55,900 --> 00:09:57,450
Stockholm Syndrome can work miracles.
182
00:09:57,600 --> 00:09:58,880
It it sure can.
183
00:09:58,940 --> 00:09:59,920
It absolutely can.
184
00:10:00,310 --> 00:10:01,970
I, I don't know that the
185
00:10:02,260 --> 00:10:04,300
problems are, are that different.
186
00:10:04,309 --> 00:10:06,520
There's, you know, a lot of
different engineering challenges.
187
00:10:06,520 --> 00:10:09,570
How statically typed is something,
how dynamically typed is it, how
188
00:10:09,570 --> 00:10:12,780
accurate can a, a transformation
be provably made to be?
189
00:10:12,810 --> 00:10:17,640
But in general, I think the problems are,
um, the social engineering problems are
190
00:10:17,640 --> 00:10:21,750
harder than the, than the specifics of
the transformation that's being made.
191
00:10:22,139 --> 00:10:24,080
And those social engineering
problems are like:
192
00:10:24,530 --> 00:10:29,489
Do I build a system that issues mass
pull requests from a central team to
193
00:10:29,490 --> 00:10:32,900
all the product teams and expect that
everybody's gonna merge them because
194
00:10:33,300 --> 00:10:38,099
they love it when, you know, random things
show up in their, uh, in their PR queue, or,
195
00:10:38,540 --> 00:10:43,200
uh, do product teams perceive that, like
unwelcome advice coming from an in-law
196
00:10:43,210 --> 00:10:46,100
and they're just looking for a reason to
reject it, you know, and then they would
197
00:10:46,100 --> 00:10:50,329
prefer, instead to have an experience
where, you know, when they're about to
198
00:10:50,330 --> 00:10:53,970
undergo a large scale transformation
that they pull or they initiate the
199
00:10:53,970 --> 00:10:55,420
change and then merge it themselves.
200
00:10:55,430 --> 00:10:59,780
So like those are the things that I
think are, are highly similar regardless
201
00:10:59,780 --> 00:11:02,160
of the tech stack or company.
202
00:11:02,170 --> 00:11:05,000
That's, uh, because people are people,
uh, kind of everywhere.
203
00:11:05,520 --> 00:11:05,590
Now
204
00:11:05,609 --> 00:11:09,860
you take the suite of Amazon Q Transform
options and they have a bunch of
205
00:11:09,900 --> 00:11:13,570
software modernization capabilities,
but also getting people off of VMware
206
00:11:13,570 --> 00:11:17,590
due to, you know, extortion, as well as
getting off of the, of the mainframe,
207
00:11:17,620 --> 00:11:20,410
which that last one is probably
the thing I'm the most skeptical of
208
00:11:20,870 --> 00:11:23,630
companies have been trying to get
off of the mainframe for 40 years.
209
00:11:23,639 --> 00:11:26,860
The problem is not that you can't
recreate something that does the same
210
00:11:26,860 --> 00:11:30,069
processing, it's that there are thousands
of business processes that are critically
211
00:11:30,070 --> 00:11:31,979
dependent on that thing and you can't
212
00:11:32,139 --> 00:11:34,700
migrate them one at a time in most cases.
213
00:11:35,969 --> 00:11:40,540
I am highly skeptical that just pouring some
AI on it is necessarily going to move
214
00:11:40,540 --> 00:11:42,390
that needle in any material fashion.
215
00:11:43,180 --> 00:11:46,530
I think that there's, uh, two
different kinds of activities here.
216
00:11:46,560 --> 00:11:51,020
One is code authorship, net new
authorship, uh, that's what the
217
00:11:51,460 --> 00:11:53,760
copilots are doing, the Amazon
Q is doing, et cetera.
218
00:11:53,770 --> 00:11:55,479
It's, it's really
assisting in that respect.
219
00:11:55,620 --> 00:11:59,030
And then there's code maintenance,
which is, I need to get this thing from
220
00:11:59,030 --> 00:12:00,650
one version of a framework to another.
221
00:12:00,950 --> 00:12:04,520
Maintenance can also include,
I'm trying to consolidate one
222
00:12:04,520 --> 00:12:08,200
feature flagging vendor to, or two
feature flagging vendors to one.
223
00:12:08,559 --> 00:12:09,500
Um, but.
224
00:12:10,240 --> 00:12:14,969
When I think think of something like
a COBOL to a modern stack JV young
225
00:12:15,440 --> 00:12:19,159
or.net or whatever the case might be,
I honestly see that less of as, as
226
00:12:19,160 --> 00:12:22,990
a maintenance activity and more as
an authorship activity, a new, and
227
00:12:23,260 --> 00:12:26,470
you're, you're writing net new software
in a different stack and a different
228
00:12:26,480 --> 00:12:28,430
set of expectations and assumptions.
229
00:12:29,110 --> 00:12:31,000
Um, and so I'm skeptical too.
230
00:12:31,090 --> 00:12:34,494
I don't, I don't think there's a
magic wand, but to the extent that our
231
00:12:34,590 --> 00:12:39,010
authorship tools help us accelerate
net new development, those problems.
232
00:12:39,610 --> 00:12:41,140
The cost of those problems goes down.
233
00:12:41,210 --> 00:12:41,949
I think over time.
234
00:12:43,290 --> 00:12:47,089
Yeah, that, that does track
and makes sense of how I tend
235
00:12:47,090 --> 00:12:48,380
to think about these things.
236
00:12:48,650 --> 00:12:52,259
But at the same time that the cost
of these things goes down and the
237
00:12:52,309 --> 00:12:56,640
technology increases, it still feels
like these applications that are decades
238
00:12:56,640 --> 00:13:00,960
old in some cases are still exploding
geometrically with respect to complexity.
239
00:13:01,850 --> 00:13:02,380
That's right.
240
00:13:02,610 --> 00:13:02,930
Yeah.
241
00:13:03,889 --> 00:13:05,170
Like how do you outrun it all?
242
00:13:05,700 --> 00:13:06,680
Well, um.
243
00:13:08,140 --> 00:13:13,510
Uh, to me there's not just one approach
here, but I feel like, um, you know,
244
00:13:14,310 --> 00:13:20,880
for my own sake and my, where my focus
is, is really trying to reclaim, uh,
245
00:13:20,900 --> 00:13:25,079
developer time in some area so that
it can refocus that effort elsewhere.
246
00:13:25,110 --> 00:13:29,470
And I think one thing I hear
pretty consistently is that because
247
00:13:29,470 --> 00:13:32,980
of that explosion in software
under management right now.
248
00:13:33,490 --> 00:13:36,280
A developer spending like 30
or 40% of their time just kind
249
00:13:36,280 --> 00:13:39,060
of resiting applications and
keeping, keeping the lights on.
250
00:13:39,610 --> 00:13:43,269
And that's something we need to like
get rid of a bit or as minimize as
251
00:13:43,270 --> 00:13:47,959
much as possible so that, you know, the
next feature they're developing isn't
252
00:13:47,980 --> 00:13:51,810
just a net new feature but is actually,
you know, pulling some like old system
253
00:13:51,810 --> 00:13:53,970
into a more modern framework as well.
254
00:13:53,980 --> 00:13:56,210
That's just another activity
that can go back onto their,
255
00:13:56,849 --> 00:13:57,880
uh, is something they can do.
256
00:13:58,250 --> 00:14:03,859
But that does track the, I guess the
scary part too, is it having lived
257
00:14:03,860 --> 00:14:06,720
through some of these myself where
we know that we need to upgrade the
258
00:14:06,730 --> 00:14:11,219
thing to break off the monolith, to
master the wolf, et cetera, et cetera,
259
00:14:11,220 --> 00:14:14,960
et cetera, and it feels like there's
never time to focus on that because you
260
00:14:14,960 --> 00:14:18,459
still have to ship features, but every
feature you're doing feels like it's
261
00:14:18,460 --> 00:14:20,050
digging the technical debt hole deeper.
262
00:14:20,320 --> 00:14:20,770
It is.
263
00:14:21,809 --> 00:14:22,210
It is.
264
00:14:22,250 --> 00:14:22,510
Yeah.
265
00:14:22,510 --> 00:14:27,050
So I mean that's, and this is what I mean
is like if we can take the assets that
266
00:14:27,050 --> 00:14:32,970
we have on our management right now and,
and like keep them moving forward, um,
267
00:14:33,080 --> 00:14:39,799
then um, we have like less drift and less,
you know, um, complexity to deal with.
268
00:14:39,849 --> 00:14:40,749
Overall.
269
00:14:40,940 --> 00:14:43,500
It's an important part of
piece of that puzzle I think.
270
00:14:43,990 --> 00:14:45,780
As you said, you've been
working on this for 10 years.
271
00:14:45,890 --> 00:14:50,900
Uh, gen AI really took off at the end
of 2023, give or take Well, during 2023.
272
00:14:51,559 --> 00:14:55,530
And I'm curious to get your
take on how that has evolved.
273
00:14:55,530 --> 00:14:59,290
I mean, yes, we all have to tell
a story on some level around that.
274
00:14:59,290 --> 00:15:04,120
Uh, your URL is moderne.ai, so clearly
there's, there is some marketing
275
00:15:04,120 --> 00:15:07,240
element to this, but, but you're, but
you're a reasonable person on this
276
00:15:07,240 --> 00:15:08,870
stuff and you go deeper than most do.
277
00:15:09,139 --> 00:15:12,390
I think a lot of what, what I've
developed over the last several years,
278
00:15:12,390 --> 00:15:16,630
or our team has, has been, you know,
accidentally leading towards this moment
279
00:15:16,639 --> 00:15:21,720
where, um, we've got a set of tools
that, uh, an LLM can take advantage of.
280
00:15:21,730 --> 00:15:25,270
So the first thing was, you
know, when I'm looking at a code
281
00:15:25,280 --> 00:15:29,040
base, the text of the code is insufficient.
282
00:15:29,040 --> 00:15:34,090
I think even the abstract syntax
tree of the code is insufficient.
283
00:15:34,090 --> 00:15:37,850
So things like Tree-sitter, you know,
and I won't mention all the things
284
00:15:37,850 --> 00:15:40,790
built on top of Tree-sitter, but
if it's just an abstract syntax tree,
285
00:15:41,580 --> 00:15:46,300
there's not enough information often
for a model to latch onto to know
286
00:15:46,309 --> 00:15:47,809
how to make the right transformation.
287
00:15:48,550 --> 00:15:51,990
And the reason I started OpenRewrite
in the very, at the very beginning,
288
00:15:52,029 --> 00:15:55,439
10 years ago was because the very
first problem I was trying to solve
289
00:15:55,440 --> 00:15:59,160
at Netflix was moving from Blitz4j,
an internal logging library, to
290
00:15:59,360 --> 00:16:01,980
not Blitz4j. We were just trying
to kill off something we regretted.
291
00:16:02,470 --> 00:16:08,050
And yet that logging library looked
almost identical in syntax to
292
00:16:08,050 --> 00:16:09,870
SLF4J, any of the other ones.
293
00:16:10,490 --> 00:16:12,980
And so just looking at log.info,
294
00:16:13,310 --> 00:16:15,890
well, that looks exactly like
log.info from another library.
295
00:16:15,890 --> 00:16:19,610
I couldn't, you know, narrowly
identify where Blitz4j still
296
00:16:19,610 --> 00:16:21,129
was, even in the environment.
297
00:16:21,349 --> 00:16:23,440
So I had to kind of go one
level deeper, which is what
298
00:16:23,440 --> 00:16:24,970
does the compiler know about it?
299
00:16:25,780 --> 00:16:30,510
And that is actually a really difficult
thing to do, to just take text of code and
300
00:16:30,510 --> 00:16:32,250
parse it into an abstract syntax tree.
301
00:16:32,710 --> 00:16:36,990
You can go one step further than
Tree-sitter and actually exercise the
302
00:16:36,990 --> 00:16:38,720
compiler and do all the symbol solving.
303
00:16:38,870 --> 00:16:42,030
Well, that actually means you have to
exercise the compiler in some way.
304
00:16:42,059 --> 00:16:43,199
Well, how is that done?
305
00:16:43,200 --> 00:16:44,020
What are the source sets?
306
00:16:44,020 --> 00:16:44,939
What version does it require?
307
00:16:44,940 --> 00:16:49,130
What build tool does it require? Like, this winds
up being this hugely complex
308
00:16:49,130 --> 00:16:53,370
decision matrix to encounter an arbitrary
repository and build out that LST.
309
00:16:55,160 --> 00:16:58,039
We built out that, that LST,
or lossless semantic tree, and
310
00:16:58,040 --> 00:17:01,000
then we started building these
recipes, which could modify them.
311
00:17:01,000 --> 00:17:04,999
And those recipes stacked on other
recipes to the point where, like, a Spring
312
00:17:05,000 --> 00:17:08,109
Boot migration has 3,400 steps in it.
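To make the recipe idea concrete, here is a minimal sketch in the shape of an OpenRewrite search recipe, matching a logging call by the type the compiler resolved rather than by source text, which is exactly what the type-attributed LST makes possible. The Blitz4j package pattern is a guess and the exact API surface varies by OpenRewrite version, so treat this as an illustration rather than the canonical implementation:

```java
import org.openrewrite.ExecutionContext;
import org.openrewrite.Recipe;
import org.openrewrite.TreeVisitor;
import org.openrewrite.java.JavaIsoVisitor;
import org.openrewrite.java.MethodMatcher;
import org.openrewrite.java.tree.J;
import org.openrewrite.marker.SearchResult;

public class FindBlitz4jLogCalls extends Recipe {
    @Override
    public String getDisplayName() {
        return "Find Blitz4j log calls";
    }

    @Override
    public String getDescription() {
        return "Flags log.info(..) calls whose receiver resolves to Blitz4j rather than SLF4J.";
    }

    @Override
    public TreeVisitor<?, ExecutionContext> getVisitor() {
        // Matches on the declaring type the compiler resolved, not the source
        // text, so a textually identical SLF4J call site is left alone.
        // "com.netflix.blitz4j.*" is a hypothetical package pattern.
        MethodMatcher blitz4jInfo = new MethodMatcher("com.netflix.blitz4j.* info(..)");
        return new JavaIsoVisitor<ExecutionContext>() {
            @Override
            public J.MethodInvocation visitMethodInvocation(J.MethodInvocation method, ExecutionContext ctx) {
                J.MethodInvocation m = super.visitMethodInvocation(method, ctx);
                return blitz4jInfo.matches(m) ? SearchResult.found(m) : m;
            }
        };
    }
}
```

Composed migrations like that 3,400-step Spring Boot recipe are, in this picture, long ordered stacks of small primitives of roughly this shape.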
313
00:17:08,829 --> 00:17:10,710
And these were all the
sort of basic primitives.
314
00:17:10,710 --> 00:17:14,450
And then, you know, at some point
we said, well, recipes could also
315
00:17:14,450 --> 00:17:18,430
emit structured data in the form of
tables, just rows and columns of data.
316
00:17:18,980 --> 00:17:22,780
And we would allow folks to run
those over thousands or tens of
317
00:17:22,780 --> 00:17:26,830
thousands of these lossless semantic
tree artifacts and extract data out.
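As a rough sketch of that extraction idea, with entirely hypothetical types (this is the shape of the approach, not OpenRewrite's actual data table API): a table-emitting recipe becomes a pure function from an LST artifact to rows, and running it over a fleet of pre-built artifacts is just a flat-map.

```java
import java.util.List;
import java.util.stream.Stream;

// Hypothetical shapes, only to illustrate the idea: an LST is a parsed,
// type-attributed source file, and a "data table" recipe emits rows of
// structured data instead of (or alongside) code changes.
interface Lst {}

record Row(String repo, String sourcePath, String finding) {}

interface TableRecipe {
    // Deterministic: the same LST in always yields the same rows out.
    List<Row> run(Lst lst);
}

final class ExtractAll {
    // Fold one recipe over tens of thousands of pre-built LST artifacts,
    // concatenating the rows into one queryable table.
    static List<Row> extract(TableRecipe recipe, Stream<Lst> artifacts) {
        return artifacts.flatMap(lst -> recipe.run(lst).stream()).toList();
    }
}
```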
318
00:17:27,660 --> 00:17:34,150
This wound up being the fruitful bed
for LLMs eventually arriving, is that
319
00:17:34,150 --> 00:17:38,669
we had thousands of these recipes
emitting data in various different forms.
320
00:17:39,290 --> 00:17:42,390
And if you could just expose as
tools, all of those thousands
321
00:17:42,390 --> 00:17:43,950
of recipes to a model and say,
322
00:17:44,350 --> 00:17:46,980
Okay, I have a question for
you about this business unit.
323
00:17:47,150 --> 00:17:51,469
The model could select the right recipe,
deterministically run it on potentially
324
00:17:51,469 --> 00:17:55,199
hundreds of millions of lines of code,
get the data table back, reason about it,
325
00:17:55,219 --> 00:18:00,000
combine it with something else, and that's
the sort of, I think, foundation for
326
00:18:00,330 --> 00:18:04,429
large language models to help with large
scale transformation and impact analysis.
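A hedged sketch of that loop, with hypothetical names throughout: the catalog of recipes is what gets exposed to the model as tools, the model only chooses a name, and everything downstream of that choice is deterministic and auditable.

```java
import java.util.List;
import java.util.Map;
import java.util.function.Function;

public final class RecipeTools {
    record Row(String repo, String finding) {}

    // The tool catalog shown to the model: recipe name -> description.
    static final Map<String, String> CATALOG = Map.of(
            "find-blitz4j-usage", "List call sites still bound to Blitz4j",
            "list-boot-versions", "Report the Spring Boot version of each repo");

    // Deterministic runners keyed by the same names. Real runners would execute
    // a recipe over pre-built LST artifacts; these stubs just fabricate rows.
    static final Map<String, Function<List<String>, List<Row>>> RUNNERS = Map.of(
            "find-blitz4j-usage", repos -> repos.stream()
                    .map(r -> new Row(r, "0 call sites remaining")).toList(),
            "list-boot-versions", repos -> repos.stream()
                    .map(r -> new Row(r, "spring-boot 2.7.x")).toList());

    // The model picks a tool name from CATALOG given the user's question; the
    // resulting table goes back into its context to reason over and combine.
    static List<Row> answer(String recipeChosenByModel, List<String> repos) {
        return RUNNERS.get(recipeChosenByModel).apply(repos);
    }

    public static void main(String[] args) {
        System.out.println(answer("list-boot-versions", List.of("billing", "payments")));
    }
}
```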
327
00:18:05,070 --> 00:18:09,780
This episode is sponsored by my
own company, The Duckbill Group.
328
00:18:09,910 --> 00:18:12,250
Having trouble with your AWS bill?
329
00:18:12,380 --> 00:18:15,189
Perhaps it's time to renegotiate
a contract with them.
330
00:18:15,420 --> 00:18:20,560
Maybe you're just wondering how to predict
what's going on in the wide world of AWS.
331
00:18:20,730 --> 00:18:23,649
Well, that's where The
Duckbill Group comes in to help.
332
00:18:24,120 --> 00:18:26,350
Remember, you can't duck the Duckbill
333
00:18:26,350 --> 00:18:29,229
Bill, which I am reliably
informed by my business partner
334
00:18:29,250 --> 00:18:31,179
is absolutely not our motto.
335
00:18:31,190 --> 00:18:36,090
Uh, to give a, a somewhat simplified
example, uh, it's easy to envision 'cause
336
00:18:36,119 --> 00:18:40,370
some of us have seen this where we'll
have code that winds up cranking on data
337
00:18:40,370 --> 00:18:44,800
and generating an artifact, and then it
stashes that object into S3 because that is
338
00:18:44,800 --> 00:18:46,669
the de facto storage system of the cloud.
339
00:18:47,080 --> 00:18:50,369
Next, it then picks up that same
object and then runs a different
340
00:18:50,370 --> 00:18:52,400
series of transformations on it.
341
00:18:52,650 --> 00:18:56,449
Now, from a code perspective, there
is zero visibility into whether that
342
00:18:56,490 --> 00:19:00,389
artifact being written to S3 is
simply an inefficiency that can
343
00:19:00,390 --> 00:19:03,050
be written out and just have it
passed directly to that subroutine.
344
00:19:03,280 --> 00:19:06,800
Or if there's some external process,
potentially another business unit
345
00:19:06,980 --> 00:19:09,760
that needs to touch that artifact
for something for reporting.
346
00:19:09,990 --> 00:19:13,630
Uh, quarterly earnings are a terrific
source where a lot of this stuff sometimes
347
00:19:13,650 --> 00:19:18,889
winds up getting, uh, getting floated up
and it's, it is impossible without having
348
00:19:19,240 --> 00:19:21,590
conversations in many cases
with people in other business
349
00:19:21,590 --> 00:19:23,950
units entirely to, to get there.
350
00:19:24,179 --> 00:19:27,459
That's the stumbling block
that I have seen historically.
351
00:19:27,549 --> 00:19:30,229
I, is that the sort of thing that
you're, that you wind up having
352
00:19:30,230 --> 00:19:33,530
to think about when you're doing
these things or am I contextualizing
353
00:19:33,530 --> 00:19:35,170
this from a very different layer?
354
00:19:35,670 --> 00:19:39,820
I do think of this process of large
scale transformation, impact analysis,
355
00:19:39,830 --> 00:19:43,660
very much like what you're describing
as like a, a data warehouse, ETL
356
00:19:43,670 --> 00:19:46,669
type thing, which is, you know, I need
to take a source of data, which is
357
00:19:46,670 --> 00:19:48,349
the text of the code, and enrich it
358
00:19:48,570 --> 00:19:49,719
into something that's everything
359
00:19:49,720 --> 00:19:53,620
the compiler knows, all the dependencies,
and everything else from that point.
360
00:19:53,820 --> 00:19:57,370
And once I have that data, that's a
computationally expensive thing to do.
361
00:19:57,370 --> 00:19:58,530
Once I have that,
362
00:19:58,780 --> 00:20:03,270
there's a lot of different
applications of that same data source.
363
00:20:03,920 --> 00:20:08,290
I, I should point out that I have
been skeptical of AI in a number of
364
00:20:08,300 --> 00:20:11,680
ways for a while now, and I wanna be
clear that when I say skeptical, I
365
00:20:11,680 --> 00:20:13,389
do mean I'm middle of the road on it.
366
00:20:13,410 --> 00:20:14,660
I see its value.
367
00:20:14,660 --> 00:20:17,540
I'm not one of those, it's just
a way to kill trees and it's
368
00:20:17,540 --> 00:20:19,259
a dumb Markov chain generator.
369
00:20:19,270 --> 00:20:20,799
No, that is absurd.
370
00:20:21,210 --> 00:20:25,630
I'm also not quite on the side of this
changes everything and every business
371
00:20:25,630 --> 00:20:27,800
application should have AI baked into it.
372
00:20:28,030 --> 00:20:28,450
I am.
373
00:20:28,530 --> 00:20:33,740
I am very middle of the road on it,
and the problem that I see as I look
374
00:20:33,750 --> 00:20:37,889
through all of this is it, it feels
like it's being used to paper over
375
00:20:38,000 --> 00:20:40,559
a bunch of these problems
where you have to talk to folks.
376
00:20:40,699 --> 00:20:43,600
I've used a lot of AI coding
assistants and I see where these things
377
00:20:43,600 --> 00:20:45,520
tend to fall short and fall down.
378
00:20:45,800 --> 00:20:49,699
Uh, a big one is that they seem
incapable of saying, I don't know.
379
00:20:49,710 --> 00:20:51,780
We need to go get additional data.
380
00:20:52,030 --> 00:20:55,630
Instead, they are, uh, they're
extraordinarily confident and
381
00:20:55,630 --> 00:20:57,189
authoritative and also wrong.
382
00:20:57,330 --> 00:20:59,699
I say this as a white
dude who has two podcasts.
383
00:20:59,730 --> 00:21:02,960
I am conversant with the being
authoritatively wrong point of view
384
00:21:02,980 --> 00:21:04,879
here; it's sort of my people's culture.
385
00:21:05,600 --> 00:21:09,230
So it's, it's one of those, how do you,
how do you meet in the middle on that?
386
00:21:09,240 --> 00:21:13,740
How do you get the value without going
too far into the realm of absurdity?
387
00:21:14,099 --> 00:21:18,319
Well, I, I do think that these things need
to be, they need to collaborate together.
388
00:21:18,320 --> 00:21:22,550
And so, uh, so it is with Amazon
Q Code Transformer, that's, that's
389
00:21:22,560 --> 00:21:27,020
working to provide migrations
for, uh, Java and other things.
390
00:21:27,200 --> 00:21:30,539
You see that, that Amazon Q
Code Transformer actually uses
391
00:21:30,690 --> 00:21:33,979
OpenRewrite, a rule-based or
deterministic system, behind it to
392
00:21:33,980 --> 00:21:35,660
actually make a lot of those changes.
393
00:21:35,910 --> 00:21:38,210
An, an open source tool that
incidentally you were the
394
00:21:38,210 --> 00:21:39,410
founder of, if I'm not mistaken.
395
00:21:39,470 --> 00:21:40,210
That's right, yeah.
396
00:21:40,210 --> 00:21:43,690
And that, that our technology is really
based on as well, and it's not just
397
00:21:43,690 --> 00:21:45,930
Amazon Q Code Transformer, as we've seen.
398
00:21:46,420 --> 00:21:49,729
Uh, you know, IBM Watson Migration
Assistant built on top of OpenRewrite,
399
00:21:49,849 --> 00:21:53,219
Broadcom Application Advisor built on
top of OpenRewrite, Microsoft GitHub
400
00:21:54,110 --> 00:21:58,929
Copilot AI Migration Assistant, I think
is the current name, also built on that.
401
00:21:59,350 --> 00:22:00,870
And they, they're better together.
402
00:22:00,870 --> 00:22:04,600
I mean, it's, you know, that tool runs,
uh, OpenRewrite to make a bunch of
403
00:22:04,920 --> 00:22:08,180
deterministic changes and then follows
that up with further verification steps.
404
00:22:08,750 --> 00:22:13,490
That's, that is the, the golden path, I
think, is trying to find ways in which
405
00:22:13,860 --> 00:22:17,550
non-determinism is helpful and to
stitch together systems that are
406
00:22:17,550 --> 00:22:19,199
deterministic at their core as well.
407
00:22:19,560 --> 00:22:22,529
I hate to sound like an overwhelming
cynic on this, but it's one of the
408
00:22:22,530 --> 00:22:27,639
things I'm best at. Uh, the Python
two to Python three migration, which beyond
409
00:22:28,100 --> 00:22:33,000
Unicode had no other real discernible
reason, uh, took a decade, in no small
410
00:22:33,040 --> 00:22:36,449
part because the single biggest breaking
change was the way that print statements
411
00:22:36,450 --> 00:22:37,690
were then handled as a function.
412
00:22:37,960 --> 00:22:41,650
And you could get around that by importing
from the __future__ package, uh, which affected a
413
00:22:41,650 --> 00:22:43,260
lot of, uh, two to three migration stuff.
414
00:22:43,590 --> 00:22:48,879
But it still took a decade for the system
tools around the Red Hat ecosystem, for
415
00:22:48,880 --> 00:22:52,370
example, that just run package management, to
be rewritten to take advantage of this.
416
00:22:52,580 --> 00:22:53,530
And that was,
417
00:22:53,830 --> 00:22:56,400
and please correct me if I'm
wrong on this, a relatively
418
00:22:56,400 --> 00:23:00,360
trivial, straightforward uplift
from Python two to Python three.
419
00:23:00,550 --> 00:23:01,989
There was just a lot of it.
420
00:23:03,530 --> 00:23:07,010
Going, looking at that migration,
anything that's even slightly more
421
00:23:07,010 --> 00:23:09,720
complicated than that still feels
like, past a certain point of
422
00:23:09,720 --> 00:23:12,970
scale, an impossibility. You clearly
feel differently given that you've
423
00:23:12,970 --> 00:23:15,850
built a successful company and an
open source project around this.
424
00:23:16,020 --> 00:23:16,410
Yeah.
425
00:23:16,450 --> 00:23:18,580
I think actually one of the
characteristics that was difficult
426
00:23:18,590 --> 00:23:21,310
about that Python two to three migration
is there's things like that that you
427
00:23:21,310 --> 00:23:24,550
described that were fairly simple changes
and that were done at the language level.
428
00:23:24,920 --> 00:23:29,240
But alongside that came a host of
other library changes that were made.
429
00:23:29,810 --> 00:23:32,740
Not really because of Python two
to three, but because there's an
430
00:23:32,740 --> 00:23:34,632
opportunity, they're breaking things,
we'll break things and everybody,
431
00:23:34,632 --> 00:23:36,439
everybody, let's just break things, right?
432
00:23:36,480 --> 00:23:40,290
And so a lot of people got stuck on
not just the language level changes,
433
00:23:40,290 --> 00:23:43,149
but all those library changes
that happened at the same time.
434
00:23:43,720 --> 00:23:47,020
And that's an interesting
problem because it's kind of an
435
00:23:47,060 --> 00:23:48,810
unknown scoped problem, right?
436
00:23:49,020 --> 00:23:52,639
Well, how, how much breakage you have
in your libraries very much depends
437
00:23:52,639 --> 00:23:54,350
on the libraries that you're using.
438
00:23:54,630 --> 00:23:59,340
Right now, um, so I mentioned
earlier like the Spring Boot three
439
00:23:59,359 --> 00:24:01,600
migration, two to three migration OpenRewrite
440
00:24:01,600 --> 00:24:04,360
recipe right now has 3,400 steps.
441
00:24:04,730 --> 00:24:08,000
I promise there's some part of two
to three that we don't cover yet.
442
00:24:08,139 --> 00:24:09,230
I don't know what that is.
443
00:24:09,550 --> 00:24:11,049
Uh, but somebody will encounter it.
444
00:24:11,389 --> 00:24:12,710
And for them
445
00:24:12,719 --> 00:24:13,899
in production, most likely
446
00:24:14,420 --> 00:24:16,200
in, yeah, they're gonna be
trying to run the recipe.
447
00:24:16,200 --> 00:24:19,750
They're gonna find something that, oh, you
don't cover Camel or something. Great, you
448
00:24:19,750 --> 00:24:23,860
know, like, uh, and so, and that's fine,
you know, and we will encounter that.
449
00:24:23,880 --> 00:24:27,030
And probably if they use Camel in one
place, they use it a bunch of places.
450
00:24:27,580 --> 00:24:32,239
And so it'll be worth it then to build
out that additional recipe that deals
451
00:24:32,250 --> 00:24:35,760
with that Camel migration and then,
you know, boom, and then you know that,
452
00:24:35,779 --> 00:24:39,020
and then that's sort of contributed
back for the benefit of everybody else.
453
00:24:39,820 --> 00:24:45,090
I think what makes this approachable
or tractable is really that we're all
454
00:24:45,099 --> 00:24:49,060
sort of building on the same substrate
of third party and open source stuff.
455
00:24:50,750 --> 00:24:54,459
From JP Morgan all the way down
to a tiny, like, you know, 15 person
456
00:24:54,470 --> 00:24:55,950
engineering team, Moderne, like,
457
00:24:56,030 --> 00:24:58,660
oh, oh, we, we can't overemphasize
just how much open source
458
00:24:58,660 --> 00:24:59,940
has changed everything.
459
00:24:59,950 --> 00:25:03,960
Back in the Bell Labs days, seventies
and eighties, it was everyone
460
00:25:03,960 --> 00:25:06,770
had to basically build their own
primitives from the ground up.
461
00:25:06,770 --> 00:25:06,889
It,
462
00:25:07,559 --> 00:25:09,510
yeah, it was all completely bespoke.
463
00:25:10,350 --> 00:25:10,679
Yeah.
464
00:25:10,779 --> 00:25:14,325
Now it's almost become a trope, like,
go implement quicksort on a whiteboard.
465
00:25:14,390 --> 00:25:16,520
Like, why would I ever need to do that?
466
00:25:17,300 --> 00:25:18,000
Okay.
467
00:25:18,700 --> 00:25:22,299
Uh, I, I guess another, another
angle on my skepticism here is
468
00:25:22,330 --> 00:25:27,680
I work with AWS bills and the
AWS billing ecosystem is vast.
469
00:25:27,730 --> 00:25:30,260
Uh, but, but the billing
space is a bounded problem
470
00:25:30,260 --> 00:25:30,650
space.
471
00:25:30,709 --> 00:25:33,200
Unlike programming languages that
are Turing complete, you can
472
00:25:33,200 --> 00:25:34,820
build anything your heart desires.
473
00:25:35,240 --> 00:25:38,850
Uh, even in the billing space, I
just came back from FinOps X in
474
00:25:38,850 --> 00:25:44,030
San Diego and none of the vendors
are really making a strong AI play.
475
00:25:44,040 --> 00:25:47,820
And I'm not surprised by this because I
have done a number of experiments with
476
00:25:47,820 --> 00:25:52,870
LLMs on AWS billing artifacts, and they
consistently make the same types of
477
00:25:52,880 --> 00:25:55,210
errors that seem relatively intractable.
478
00:25:55,400 --> 00:25:57,500
Uh, go ahead and make this optimization.
479
00:25:58,160 --> 00:26:01,969
That optimization is dangerous without
a little more context fed into it.
480
00:26:02,240 --> 00:26:05,860
So I guess my somewhat sophomoric
perspective has been, if you, if you can't
481
00:26:05,860 --> 00:26:09,520
solve these things with AI in a bounded
problem space, how can you begin to tackle
482
00:26:09,520 --> 00:26:11,590
them in these open-ended problem spaces?
483
00:26:12,470 --> 00:26:13,919
I, I'm, I'm with you actually.
484
00:26:13,920 --> 00:26:16,019
And there's a counterpoint to
this, which is that I think
485
00:26:16,420 --> 00:26:19,479
that the, all the large foundation
models are somewhat undifferentiated.
486
00:26:20,420 --> 00:26:22,930
I mean, they kind of take pole
position at any given time, but
487
00:26:23,389 --> 00:26:23,679
Right.
488
00:26:23,690 --> 00:26:25,940
Two weeks later, the whole
ecosystem is different.
489
00:26:26,020 --> 00:26:26,500
Yeah.
490
00:26:26,560 --> 00:26:30,410
If they kind of all roughly have
the same capabilities and there
491
00:26:30,420 --> 00:26:33,430
are some, like we said, there are
very useful things they can do.
492
00:26:33,450 --> 00:26:34,689
There's some utility there.
493
00:26:35,389 --> 00:26:38,420
You know, there are places
where non-determinism is useful
494
00:26:38,910 --> 00:26:41,239
and to the extent that you can
apply that non-determinism,
495
00:26:42,930 --> 00:26:43,440
then great.
496
00:26:43,580 --> 00:26:45,970
You know, like that, that
that's, that's fantastic.
497
00:26:46,430 --> 00:26:51,530
But I'm not in a position where I think
Spring Boot two to three upgrade or
498
00:26:51,540 --> 00:26:55,790
Python two to three upgrade applied to
5 billion lines of code is going to be
499
00:26:56,440 --> 00:27:00,139
deterministically acceptable either now
or six months from now or a year from now.
500
00:27:01,499 --> 00:27:03,830
And maybe I'll be a fool and
wrong, but I don't think so.
501
00:27:04,670 --> 00:27:04,939
Oh yeah.
502
00:27:04,940 --> 00:27:08,370
Honestly, this is, this whole AI
revolution has, uh, turned my entire
503
00:27:08,370 --> 00:27:11,389
understanding of how computers
work on, on its head.
504
00:27:11,389 --> 00:27:14,940
Like short of a rand function,
you, you knew what the output of
505
00:27:14,990 --> 00:27:17,590
a given stanza of code was going
to be, given a certain input.
506
00:27:17,920 --> 00:27:20,050
Now it kind of depends.
507
00:27:20,650 --> 00:27:21,120
It does.
508
00:27:21,410 --> 00:27:22,130
It really does.
509
00:27:23,340 --> 00:27:26,490
Yeah, the problem I run into is no
matter how clever I have been able to
510
00:27:26,490 --> 00:27:29,470
be, and the people I've worked with, who are
far smarter than I am, have been able
511
00:27:29,470 --> 00:27:31,930
to pull off, uh, the, these insights.
512
00:27:31,980 --> 00:27:34,989
There's always migration challenges
and things breaking in production
513
00:27:34,990 --> 00:27:38,050
just because of edge and corner
cases that we simply hadn't
514
00:27:38,090 --> 00:27:42,080
considered. The difference now is,
instead, because there's a culture
515
00:27:42,080 --> 00:27:45,280
in any healthy workplace about
not throwing Steven under the bus.
516
00:27:45,500 --> 00:27:49,140
Well, throwing the robot under the
bus is a very different proposition.
517
00:27:49,140 --> 00:27:51,620
I told you AI was crap, says
the half of your team
518
00:27:51,620 --> 00:27:54,669
that's AI-skeptic, and, it's not
the AI that's
519
00:27:54,670 --> 00:27:57,219
at fault, says the, uh, people
who are big into AI, into AI,
520
00:27:57,219 --> 00:27:59,030
AI, AI business daddy logic.
521
00:27:59,430 --> 00:28:02,450
And the reality is probably
these things are complicated.
522
00:28:02,470 --> 00:28:06,620
Uh, neither computer nor
man nor beast are going to be able to
523
00:28:06,660 --> 00:28:08,390
catch all of these things in advance.
524
00:28:08,790 --> 00:28:10,019
That is why we have jobs.
525
00:28:10,529 --> 00:28:12,659
I've noticed this just in
even managing our team.
526
00:28:12,659 --> 00:28:17,040
You know, I, I catch people when
they say, you know, but Junie
527
00:28:17,050 --> 00:28:18,930
said this, or, but, you know.
528
00:28:19,630 --> 00:28:22,139
Uh, don't pass through to
me what your assistant said.
529
00:28:22,500 --> 00:28:24,889
You, you're the responsible
party when you tell me something.
530
00:28:24,910 --> 00:28:26,769
So you, you, you have a source.
531
00:28:27,049 --> 00:28:28,970
You check that source, you
verify the integrity of the
532
00:28:29,020 --> 00:28:30,160
source, then you pass it to me.
533
00:28:30,250 --> 00:28:30,570
Right?
534
00:28:30,630 --> 00:28:32,850
You can outsource the work,
but not the responsibility.
535
00:28:32,850 --> 00:28:36,140
A number of lawyers are finding this,
uh, uh, uh, to be the case when they're,
536
00:28:36,150 --> 00:28:39,300
they're not checking what paralegals have
done or at least blaming the paralegals
537
00:28:39,349 --> 00:28:40,490
for it, I'm sure.
538
00:28:40,840 --> 00:28:41,510
Exactly.
539
00:28:41,550 --> 00:28:42,340
Always has been.
540
00:28:42,809 --> 00:28:43,729
Always has, but
541
00:28:44,090 --> 00:28:47,849
it's, I, I also do worry that a lot of
the skepticism around this, uh, even
542
00:28:47,850 --> 00:28:52,680
my own, my own aspect of it comes from
a conscious or unconscious level of
543
00:28:52,820 --> 00:28:57,220
defensiveness where I'm worried this
thing is going to take my job away.
544
00:28:57,220 --> 00:29:00,390
So the first thing I do, just to
rationalize it to myself is point out
545
00:29:00,390 --> 00:29:03,000
the things I'm good at and that this
thing isn't good at, at the moment.
546
00:29:03,270 --> 00:29:05,210
Well, that, that's why
I'll always have a job.
547
00:29:05,990 --> 00:29:08,770
Conversely, I don't think computers
are gonna take jobs away from all
548
00:29:08,770 --> 00:29:10,690
of us in the foreseeable future.
549
00:29:10,990 --> 00:29:14,149
The answer is probably a middle
ground, in a similar fashion.
550
00:29:14,160 --> 00:29:18,149
The way the Industrial Revolution sort
of did a whole lot of, uh, did a number on
551
00:29:18,170 --> 00:29:20,210
people who were, uh, independent artisans.
552
00:29:20,620 --> 00:29:25,980
So there's a, it's an evolutionary
process and I, I just worry that I am
553
00:29:26,140 --> 00:29:28,220
being too defensive, even unconsciously.
554
00:29:29,400 --> 00:29:30,850
I, I think that sometimes too.
555
00:29:30,850 --> 00:29:35,180
I, I really do feel like this is
just a continuum of, of productivity
556
00:29:35,180 --> 00:29:38,899
improvement that's been underfoot for a
long time with different technologies.
557
00:29:38,900 --> 00:29:42,960
And I mean, I remember the very
first Eclipse release and the
558
00:29:42,960 --> 00:29:46,870
very first Eclipse release is when
they were providing, you know, uh,
559
00:29:46,880 --> 00:29:49,249
rules-based refactorings inside the IDE.
560
00:29:49,489 --> 00:29:51,850
And I remember being super
excited every two or three
561
00:29:51,850 --> 00:29:52,980
months when they dropped another,
562
00:29:52,980 --> 00:29:55,180
and just looking at the release
notes and seeing all the new
563
00:29:55,690 --> 00:29:56,980
things. And what did that do?
564
00:29:56,980 --> 00:29:59,650
It made me faster at
writing net new code.
565
00:30:00,059 --> 00:30:03,630
And you know, here we've got another thing
that has very different characteristics.
566
00:30:03,640 --> 00:30:07,260
It's like, it's almost good at all
the things that IDE-based refactorings
567
00:30:07,320 --> 00:30:10,790
weren't good at, but I still guide it.
568
00:30:11,059 --> 00:30:15,420
And, you know, unlike a, yeah,
I think the, the drop CEO said,
569
00:30:15,530 --> 00:30:19,260
uh, or CTO said IDEs will be
obsolete by the end of the year.
570
00:30:20,060 --> 00:30:21,080
I don't believe this at all.
571
00:30:21,750 --> 00:30:22,520
I don't believe this at all.
572
00:30:22,530 --> 00:30:23,650
I think we're still driving them.
573
00:30:24,400 --> 00:30:26,610
I am skeptical in the
extreme on a lot of that.
574
00:30:26,690 --> 00:30:32,389
I, I, I, because again, these, let's be
honest here, these people have a thing
575
00:30:32,400 --> 00:30:36,910
they need to sell and they have billions
and billions and billions of other
576
00:30:36,910 --> 00:30:38,800
people's money riding on the outcome.
577
00:30:39,250 --> 00:30:40,360
Yeah.
578
00:30:40,360 --> 00:30:42,680
That would shape my thinking
in a bunch of ways.
579
00:30:42,680 --> 00:30:44,220
Both subtle and gross, too.
580
00:30:44,559 --> 00:30:49,420
I try and take the, a more neutral
stance on this, but who knows?
581
00:30:49,440 --> 00:30:53,299
I think it's not just neutral,
it's a mature stance and it's
582
00:30:53,309 --> 00:30:57,250
one that's, uh, it's, it's got a lot
of experience going behind it.
583
00:30:57,310 --> 00:30:58,186
I, I think that you're right.
584
00:30:58,250 --> 00:31:01,080
I don't think we're, we're
anywhere close to being obsolete.
585
00:31:02,130 --> 00:31:05,989
No, and I, and I also, frankly, I say
this coming from an operations background,
586
00:31:05,990 --> 00:31:08,199
sysadmin-turned-SRE type, where
587
00:31:08,710 --> 00:31:13,899
I have been through enough cycles of
seeing today's magical technology become
588
00:31:13,939 --> 00:31:18,860
tomorrow's legacy shit that I have to
support that I am, I have a natural
589
00:31:18,860 --> 00:31:23,999
skepticism built into almost every
aspect of this just based on history.
590
00:31:24,009 --> 00:31:24,570
If nothing else,
591
00:31:25,000 --> 00:31:26,560
you know what Vibe Coding reminds me of?
592
00:31:26,560 --> 00:31:30,459
It reminds me of model driven
architecture about 25 years ago.
593
00:31:30,480 --> 00:31:33,700
The, like, you know, just
produce a UML diagram and don't
594
00:31:33,700 --> 00:31:36,000
worry, like, the, the code'll.
595
00:31:36,000 --> 00:31:38,630
I'll ship the, I'll just generate
the rest of the application.
596
00:31:38,980 --> 00:31:43,010
Or it reminds me of, uh, behavior driven
development when we said, oh, we'll
597
00:31:43,010 --> 00:31:44,549
just put it in business people's hands.
598
00:31:44,700 --> 00:31:46,709
They write the test and, you
know, don't, you know, we don't
599
00:31:46,709 --> 00:31:47,789
want engineers writing the test.
600
00:31:47,789 --> 00:31:48,230
You want business?
601
00:31:48,740 --> 00:31:53,400
Like, I feel like we've seen this play
out many, many, many times in various
602
00:31:53,410 --> 00:31:55,230
forms, and maybe this time's different.
603
00:31:55,300 --> 00:31:56,250
I, I don't think so.
604
00:31:57,090 --> 00:31:59,694
And to be honest, I, I like to say
that, well, computers used to be
605
00:31:59,830 --> 00:32:03,419
deterministic, but let's, let's be honest
with ourselves, we'd long ago crossed
606
00:32:03,420 --> 00:32:07,179
the threshold where no individual person
can hold the entirety of what even a
607
00:32:07,190 --> 00:32:08,989
simple function is doing in their head.
608
00:32:09,290 --> 00:32:11,649
They, they are putting their
trust in the magic think box.
609
00:32:12,920 --> 00:32:13,370
That's right.
610
00:32:13,870 --> 00:32:14,199
Yes.
611
00:32:14,200 --> 00:32:15,010
That's absolutely right.
612
00:32:15,710 --> 00:32:17,849
So I really wanna thank you for
taking the time to speak with me.
613
00:32:17,860 --> 00:32:20,379
If people, if people wanna
go and learn more, where's the
614
00:32:20,380 --> 00:32:21,939
best place for them to find you?
615
00:32:22,690 --> 00:32:26,929
I think it's, it's easy to find me on
LinkedIn these days or, uh, you know,
616
00:32:26,930 --> 00:32:29,370
go find me on Moderne, M-O-D-E-R-N-E dot ai.
617
00:32:30,130 --> 00:32:31,320
Um, either place.
618
00:32:31,400 --> 00:32:34,230
Always happy to, uh, send me a
DM, happy to answer questions
619
00:32:34,760 --> 00:32:36,689
and we'll of course put
that into the show notes.
620
00:32:36,820 --> 00:32:38,840
Thank you so much for your
time, I appreciate it.
621
00:32:39,080 --> 00:32:40,040
Okay, thank you Corey.
622
00:32:40,820 --> 00:32:43,610
Jonathan Schneider, CEO at Moderne.
623
00:32:43,730 --> 00:32:47,070
I'm Cloud Economist Corey Quinn,
and this is Screaming In the Cloud.
624
00:32:47,320 --> 00:32:50,480
If you've enjoyed this podcast,
please leave a five star review on
625
00:32:50,480 --> 00:32:52,070
your podcast platform of choice.
626
00:32:52,110 --> 00:32:55,820
Whereas if you hated this podcast,
please leave a five star review on
627
00:32:55,820 --> 00:33:00,169
your podcast platform of choice along
with an insulting comment that maybe
628
00:33:00,170 --> 00:33:04,340
you can find an AI system to transform
into something halfway literate.