Content Tagged "transformers"
JAX-ML Scaling Book
This book is a comprehensive guide to scaling model training. It covers a wide range of topics, from hardware to software to model design, and surveys the current state of the art. It's a great read for anyone who wants to understand the challenges and solutions involved in scaling training, and it also includes exercises and code examples to reinforce the concepts.
Picotron
Picotron provides a barebones implementation of Nvidia's Megatron-LM. It's a readable, hackable starting point for anyone who wants to build their own language model.
Stanford CS153: Infra at Scale
In this video, Mike Abbot of Anthropic gives a great overview of the challenges and solutions for scaling AI training and inference. He covers a wide range of topics, from hardware to software, and explains why scaling modern language models is such a fascinating engineering problem.
How to train a model on 10K H100 GPUs
Soumith Chintala of Meta AI provides a detailed overview of the challenges and solutions involved in training a model on 10K H100 GPUs. From hardware to software, it's a great read for anyone who wants to understand what it takes to scale training on a large cluster.