We explore nested learning (NL), a paradigm in which memory and optimization form one integrated system with multiple levels updating at different speeds, giving rise to a continuum memory system (CMS). We look at how traditional optimizers can be viewed as memory modules, how the HOPE architecture uses CMS blocks to handle longer contexts, and what needle-in-a-haystack experiments reveal about memory and language modeling. We also discuss what self-modifying, continually learning AI could unlock in the future.
Note: This podcast was AI-generated, and sometimes AI can make mistakes. Please double-check any critical information.
Sponsored by Embersilk LLC
Intellectually Curious with Mike Breault is available on multiple platforms. The information on this page comes from public podcast feeds.
