Members of the research community at Microsoft work continuously to advance their respective fields. Abstracts brings its audience to the cutting edge with them through short, compelling conversations about new and noteworthy achievements.
In this episode, Principal Researcher Alessandro Sordoni joins host Gretchen Huizinga to discuss “Joint Prompt Optimization of Stacked LLMs using Variational Inference.” In the paper, which was accepted at the 2023 Conference on Neural Information Processing Systems (NeurIPS), Sordoni and his coauthors introduce Deep Language Networks, or DLNs, an architecture that treats large language models as layers within a network and natural language prompts as each layer’s learnable parameters.
