Intellectually Curious

Mixture of Experts Unpacked: The Sparse Engine Behind Today's Giant AI Models

6 min · 13 October 2025

A deep dive into Mixture of Experts (MoE): how sparse routing activates only a small subset of experts for each input, letting trillion-parameter models run at a fraction of their full compute cost. We trace the idea from the early Meta-Pi network to modern sparse neural architectures, explore the load-balancing tricks that keep experts evenly used, and see how MoE powers NLP, vision, and diffusion models. A practical guide to why selective computation is reshaping scalable AI.
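The routing idea described above can be sketched in a few lines. This is a minimal illustration, not the episode's own code: the gate scores every expert, keeps only the top-k, and runs just those (all names here — `top_k_routing`, `gate_W`, `experts` — are hypothetical):

```python
import numpy as np

def top_k_routing(x, gate_W, experts, k=2):
    """Route one input vector to its top-k experts (illustrative sketch)."""
    logits = x @ gate_W                       # one gating score per expert
    top = np.argsort(logits)[-k:]             # indices of the k highest-scoring experts
    weights = np.exp(logits[top] - logits[top].max())
    weights /= weights.sum()                  # softmax over the selected experts only
    # Only the chosen experts are evaluated; the rest are skipped entirely,
    # which is where the compute savings of sparse MoE come from.
    return sum(w * experts[i](x) for w, i in zip(weights, top))

rng = np.random.default_rng(0)
d, n_experts = 8, 4
gate_W = rng.normal(size=(d, n_experts))
# Toy "experts": each is just a random linear map for demonstration.
experts = [lambda x, W=rng.normal(size=(d, d)): x @ W for _ in range(n_experts)]
y = top_k_routing(rng.normal(size=d), gate_W, experts, k=2)
print(y.shape)  # output has the same dimensionality as the input: (8,)
```

Real MoE layers add the load-balancing losses mentioned in the episode so that training does not collapse onto a handful of favored experts.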


Note: This podcast was AI-generated, and sometimes AI can make mistakes. Please double-check any critical information.

Sponsored by Embersilk LLC


Intellectually Curious with Mike Breault is available on multiple platforms. The information on this page comes from public podcast feeds.