The TWIML AI Podcast (formerly This Week in Machine Learning & Artificial Intelligence)

Language Modeling With State Space Models with Dan Fu - #630

28 min · 22 May 2023

Today we’re joined by Dan Fu, a PhD student at Stanford University. In our conversation with Dan, we discuss the limitations of state space models in language modeling and the search for alternative building blocks that can help increase context length without becoming computationally infeasible. Dan walks us through the H3 architecture and the FlashAttention technique, which can reduce a model’s memory footprint and make fine-tuning feasible. We also explore his work on improving language models using synthetic languages, how long sequence lengths affect both training and inference, and the hope of finding a sub-quadratic approach that can handle language processing more effectively than the brute-force approach of attention.
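To make the quadratic-versus-linear contrast the episode circles around more concrete, here is a minimal Python sketch (not code from the episode or from Dan Fu's work); the shapes, names, and the simple diagonal recurrence are illustrative assumptions only. Attention materializes an n-by-n score matrix, while a state space recurrence carries only a small state vector from step to step.

import numpy as np

def attention(q, k, v):
    # Scores form an (n, n) matrix, so memory and compute grow
    # quadratically with sequence length n.
    scores = q @ k.T / np.sqrt(q.shape[-1])
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v

def ssm_scan(u, a, b, c):
    # A toy diagonal linear state space recurrence:
    #   x_t = a * x_{t-1} + b * u_t,   y_t = c . x_t
    # One fixed-size state update per step, so cost grows linearly with n.
    x = np.zeros_like(b)
    ys = []
    for u_t in u:
        x = a * x + b * u_t
        ys.append(np.sum(c * x))
    return np.array(ys)

n, d = 512, 64
rng = np.random.default_rng(0)
q, k, v = (rng.standard_normal((n, d)) for _ in range(3))
u = rng.standard_normal(n)
a = rng.uniform(0.5, 0.99, d)
b, c = rng.standard_normal(d), rng.standard_normal(d)
print(attention(q, k, v).shape)   # (512, 64), after a 512x512 score matrix
print(ssm_scan(u, a, b, c).shape) # (512,), carrying only a d-dimensional state

This is only a sketch of the asymptotics being discussed; the models covered in the episode (H3, FlashAttention-optimized Transformers) are considerably more involved.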

The complete show notes for this episode can be found at https://twimlai.com/go/630


The TWIML AI Podcast (formerly This Week in Machine Learning & Artificial Intelligence) with Sam Charrington is available on multiple platforms. The information on this page comes from public podcast feeds.