Cloud Costs vs AI Workloads, The Storage Decisions That Decide Scale

February 9
26 mins

Episode Description

Cloud bills are climbing, AI pipelines are exploding, and storage is quietly becoming the bottleneck nobody wants to own. Ugur Tigli, CTO at MinIO, breaks down what actually changes when AI workloads hit your infrastructure, and how teams can keep performance high without letting costs spiral.


In this conversation, we get practical about object storage, S3 as the modern standard, what open source really means for security and speed, and why “cloud” is more of an operating model than a place.


Key takeaways


• AI multiplies data, not just compute: training and inference create more checkpoints, more versions, and more storage pressure

• Object storage and S3 are simplifying the persistence layer, even as the layers above it get more complex

• Open source can improve security feedback loops because the community surfaces regressions fast; the real risk is running unsupported, outdated versions

• Public cloud costs are often less about storage and more about variable charges like egress; many teams move data on-prem to regain predictability

• The bar for infrastructure teams is rising: Kubernetes, modern storage, and AI workflow literacy are becoming table stakes

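The first takeaway is easy to quantify with back-of-the-envelope arithmetic. A minimal sketch; all the numbers (model size, checkpoint cadence, retention, replication) are illustrative assumptions, not figures from the episode:

```python
# Back-of-the-envelope: how training checkpoints multiply storage.
# Every number below is an illustrative assumption, not a figure from the episode.

def checkpoint_storage_gb(model_size_gb: float,
                          checkpoints_per_run: int,
                          runs_retained: int,
                          replication_factor: int = 1) -> float:
    """Total storage consumed by retained training checkpoints."""
    return model_size_gb * checkpoints_per_run * runs_retained * replication_factor

# A 14 GB model checkpointed 20 times per run, 5 runs kept, replicated 3x:
total = checkpoint_storage_gb(14, 20, 5, replication_factor=3)
print(f"{total:,.0f} GB")  # 4,200 GB — the checkpoints dwarf the model itself
```

Even modest assumptions show why storage pressure grows faster than the model: every extra checkpoint, retained run, and replica is a multiplier.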

Timestamped highlights


00:00 Why cloud and AI workloads force a fresh look at storage, operating models, and cost control

00:00 What MinIO is, and why high performance object storage sits at the center of modern data platforms

01:23 Why MinIO chose open source, and how they balance freedom with commercial reality

04:08 Open source and security, why faster feedback beats the closed source perception, plus the real risk factor

09:44 Cloud cost realities, egress, replication, and why “fixed costs” drive many teams back inside their own walls

15:04 The persistence layer is getting simpler: S3 becomes the standard while the upper stack gets messier

18:00 Skills gap, why teams need DevOps plus AIOps thinking to run modern storage at scale

20:22 What happens to AI costs next, competition, software ecosystem maturity, and why data growth still wins


A line worth keeping


“Cloud is not a destination for us, it’s more of an operating model.”


Pro tips for builders and tech leaders


• If your AI initiative is still a pilot, track egress and data movement early; that is where “surprise” costs tend to show up

• Standardize on containerized deployment where possible; it narrows the gap between public and private environments, but plan for integration friction like identity and key management

• Treat storage as a performance system, not a procurement line item; the right persistence layer can unblock training, inference, and downstream pipelines
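The first tip, tracking egress early, can start as simple arithmetic before any tooling exists. A rough sketch; the per-GB rate is a hypothetical placeholder, since real cloud egress pricing is tiered and varies by provider and region:

```python
# Rough monthly egress cost projection for an AI pilot.
# The rate is a hypothetical flat placeholder; real pricing is tiered.

EGRESS_RATE_PER_GB = 0.09  # USD, placeholder assumption

def monthly_egress_cost(gb_per_day: float,
                        rate_per_gb: float = EGRESS_RATE_PER_GB) -> float:
    """Project a month of egress spend from average daily data movement."""
    return gb_per_day * 30 * rate_per_gb

# Moving 500 GB/day of results out of the cloud:
print(f"${monthly_egress_cost(500):,.2f}/month")  # $1,350.00/month
```

Running this projection before the pilot scales is exactly the kind of early visibility the episode argues for: the variable charges, not the stored bytes, are usually the surprise.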


What's next:

If you’re building with AI, running data platforms, or trying to get your cloud costs under control, follow the show and subscribe so you do not miss upcoming episodes. Share this one with a teammate who owns infrastructure, data, or platform engineering.
