In this episode, we're joined by Sebastian Ruder, a PhD student studying NLP at the National University of Ireland and a research scientist at text analysis startup Aylien. We discuss recent milestones in neural NLP, including multi-task learning and pretrained language models. We also look at the use of attention-based models, tree RNNs and LSTMs, and memory-based networks. Finally, Sebastian walks us through his ULMFiT paper, which he co-authored with Jeremy Howard of fast.ai, whom I interviewed in episode 186.
