What's really happening with AI compute infrastructure? The common story is that supply will catch up to demand—but the reality is more complicated when DRAM prices spike 60% in a quarter and every hyperscaler is hoarding capacity. In this video, I explain why the global inference crisis is not a prediction but an observation of current conditions, covering:
Why enterprise token consumption is scaling from 1 billion to 100 billion per worker annually
How memory, semiconductor, and GPU bottlenecks compound with no relief until 2028
What hyperscalers prioritizing their own products over customers means for enterprise allocation
Where sharp CTOs are securing capacity and building routing layers now
For enterprise leaders navigating the next 24 months, traditional planning frameworks are broken—and the window to act is closing fast.
Subscribe for daily AI strategy and news.
For playbooks and analysis: https://natesnewsletter.substack.com/p/executive-briefing-the-global-inference?
© Nate B. Jones 2026
Hosted on Acast. See acast.com/privacy for more information.
