In this deep dive, we demystify FLOPS—the metric that quantifies a computer's ability to crunch floating-point math for science and AI. We'll unpack why floating-point arithmetic matters, compare FP64/FP32/FP16 precision, explain how peak FLOPS are computed for HPC systems, and trace the astonishing history from ENIAC to Frontier (and beyond), including distributed computing and the collapsing cost of computing power.
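As a taste of the peak-FLOPS topic covered in the episode: the standard back-of-the-envelope formula multiplies node count, cores per node, clock rate, and floating-point operations per core per cycle. Here is a minimal sketch of that formula; all the figures in the example are hypothetical, not specs of any real machine discussed in the episode.

```python
def peak_flops(nodes: int, cores_per_node: int, clock_hz: float, flops_per_cycle: int) -> float:
    """Theoretical peak FLOPS: nodes x cores/node x clock (Hz) x FLOPs per core per cycle."""
    return nodes * cores_per_node * clock_hz * flops_per_cycle

# Hypothetical example: 100 nodes, 64 cores each, 2.0 GHz,
# 16 FP64 FLOPs per cycle (e.g. two 512-bit FMA units per core).
print(peak_flops(100, 64, 2.0e9, 16))  # 2.048e14, i.e. ~204.8 TFLOPS
```

Real sustained performance (e.g. a LINPACK run) typically lands well below this theoretical ceiling.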
Note: This podcast was AI-generated, and sometimes AI can make mistakes. Please double-check any critical information.
Sponsored by Embersilk LLC
Intellectually Curious with Mike Breault is available on multiple platforms. The information on this page comes from public podcast feeds.
