Anthropic Head of Pretraining on Scaling Laws, Compute, and the Future of AI

October 1
1h 4m

Episode Description

Ever wonder what it actually takes to train a frontier AI model? YC General Partner Ankit Gupta sits down with Nick Joseph, Anthropic's Head of Pretraining, to explore the engineering challenges behind training Claude—from managing thousands of GPUs and debugging cursed bugs to balancing compute between pretraining and RL. We cover scaling laws, data strategies, team composition, and why the hardest problems in AI are often infrastructure problems, not ML problems.
