Quentin Anthony

Head of HPC at EleutherAI
Quentin Anthony is a prominent figure in AI and machine learning, best known for his work at EleutherAI, where he serves as Head of HPC (High-Performance Computing). He co-authored 'Transformer Math 101,' a clear articulation of rules of thumb for training transformers. His research focuses on high-performance deep learning, training large language models (LLMs), and system tuning, and his work encompasses innovations in distributed training, including strategies such as ZeRO and 3D parallelism.
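As a rough illustration of the kind of rule of thumb 'Transformer Math 101' formalizes, here is a minimal sketch of a training-memory estimate. It assumes mixed-precision Adam training with roughly 16 bytes of state per parameter (2 bytes fp16 weights + 2 bytes fp16 gradients + 12 bytes fp32 optimizer state: master weights, momentum, and variance), the figure popularized by the ZeRO paper; real footprints also include activations and fragmentation, which this ignores.

```python
def training_memory_gb(n_params: float, bytes_per_param: int = 16) -> float:
    """Estimate GPU memory (GB) for model weights, gradients, and Adam
    optimizer state under mixed precision.

    Assumed breakdown per parameter (not counting activations):
      2 bytes fp16 weights + 2 bytes fp16 gradients
      + 12 bytes fp32 optimizer state (master copy, momentum, variance)
    """
    return n_params * bytes_per_param / 1e9


# A hypothetical 7B-parameter model needs on the order of 112 GB of
# state memory -- more than any single GPU holds, which is exactly the
# problem that ZeRO-style sharding and 3D parallelism address.
print(training_memory_gb(7e9))
```

Techniques like ZeRO work by partitioning those per-parameter optimizer, gradient, and weight shards across data-parallel ranks, so each GPU holds only a fraction of the 16 bytes per parameter.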

Explore Quentin's ideas


1. Optimizing Memory in Model Training
2. Scaling Models Across GPUs