
Tri Dao

Chief Scientist at Together AI
Tri Dao is recognized for co-creating FlashAttention, an IO-aware approach to improving the efficiency and speed of Transformers in machine learning, published in May 2022. Winning the ICML 2022 Outstanding Paper runner-up award, FlashAttention is celebrated for its substantial gains in memory efficiency and computational speed: it computes exact attention while reducing the memory footprint from quadratic to linear, O(N), in sequence length. The technique has been widely adopted across the industry in large language models for faster training and inference. Tri Dao recently completed his PhD in Computer Science at Stanford University, jointly...
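
As a rough illustration of the idea (a minimal sketch, not Tri Dao's own implementation), recent PyTorch releases expose FlashAttention-style exact attention through torch.nn.functional.scaled_dot_product_attention; the example below assumes PyTorch 2.x and a CUDA GPU, with illustrative tensor sizes chosen for this sketch.

import torch
import torch.nn.functional as F

# Illustrative sizes; head_dim = 64 is typical for fused attention kernels.
batch, heads, seq_len, head_dim = 2, 8, 4096, 64
q = torch.randn(batch, heads, seq_len, head_dim, device="cuda", dtype=torch.float16)
k = torch.randn_like(q)
v = torch.randn_like(q)

# Exact (not approximate) attention: the fused kernel computes the softmax
# block by block, so the full seq_len x seq_len score matrix is never
# materialized and memory grows linearly with sequence length.
out = F.scaled_dot_product_attention(q, k, v, is_causal=True)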

Explore Tri's ideas


Advancements in AI: Memory and Attention

1. Challenges in AI Hardware and Software
2. Advancements in Attention Mechanisms
3. Exploring Alternatives to Transformers