Today we continue our AI Trends 2024 series with a conversation with Thomas Dietterich, distinguished professor emeritus at Oregon State University. As you might expect, large language models figured prominently in our conversation. We covered a vast array of papers and use cases exploring current research into topics such as monolithic vs. modular architectures, hallucinations, the application of uncertainty quantification (UQ), and the use of RAG as a memory module for LLMs. Lastly, don't miss Tom's predictions for the coming year, as well as his words of encouragement for those new to the field.
The complete show notes for this episode can be found at twimlai.com/go/666.