This week, we’re diving into the hidden challenges of AI infrastructure with David Driggers, CEO of Cirrascale. David and I chat about the surprising failure points that traditional data center planning often misses when it comes to large-scale AI deployments, why old assumptions about power, cooling, and networking simply don’t hold up anymore, and why the “just add GPUs” strategy can be a recipe for disaster without a true systems-thinking approach. Also this week, I check out a massive breakthrough where physicists are using AI to uncover entirely new laws of nature!
Links for May 1, 2026
More information about Cirrascale
AI Reveals Unexpected New Physics in Dusty Plasma (Emory University)
Physics-tailored machine learning reveals unexpected physics in dusty plasmas (PNAS)
Click here to check out the Fish Fry Archive.
Click here to subscribe to Fish Fry via Podbean
Click here to get the Fish Fry RSS Feed
Click here to subscribe to Fish Fry via Apple Podcasts
Click here to subscribe to Fish Fry via Spotify
Amelia’s Weekly Fish Fry – Episode 680
Release Date: May 1, 2026
Hello there, everyone! Welcome to episode 680 of Amelia’s Weekly Fish Fry, the electronic engineering podcast brought to you by EEJournal.com and written, produced, and hosted by me, Amelia Dalton.
Folks, the AI boom is here. While everyone seems to be talking about bigger models and faster GPUs, there’s a whole other side of the story happening behind the scenes — the infrastructure making all of it possible. Because it turns out, building real-world AI systems is a whole lot more complicated than just stacking up GPUs and hoping for the best.
This week, we’re diving into the hidden challenges of AI infrastructure with David Driggers, CEO of Cirrascale. David and I chat about the surprising failure points that traditional data center planning often misses when it comes to large-scale AI deployments, why old assumptions about power, cooling, and networking no longer hold up, and why a “just add GPUs” strategy can be a recipe for disaster without a true systems-level approach.
Also this week, I check out a new discovery involving a “fourth state of matter,” uncovered with the help of AI.
So, without further ado, please welcome Dave to Fish Fry.
Amelia Dalton: Hi David, thank you so much for joining me.
David Driggers: Hi, thanks for having me.
Amelia Dalton: Absolutely. First, David, talk to me about common or surprising failure points you’re seeing in real-world AI infrastructure that traditional data center planning often misses.
David Driggers: The main thing is that we’ve really crossed the chasm into requiring liquid-cooled data centers. We’ve been sort of bumbling along for a long time. High-performance computing has used liquid cooling at the rack level for a while — 30 kilowatts and up — but it was still niche compared to traditional enterprise data centers.
Now, water is a requirement. With densities of 50, 70, 100, and soon 400 kilowatts per rack, you simply can’t rely on air cooling anymore. It still surprises me that people are building new data centers focused on air cooling. We just can’t do it.
Amelia Dalton: So, talk to me specifically about Cirrascale. Can you share any recent challenges you’ve solved for large AI deployments?
David Driggers: One of the biggest challenges is speed and scale. Historically, data centers were planned three to five years out, and all of that planned capacity has already been consumed. With the explosion of AI, the size of data centers isn’t just doubling or quadrupling — it’s increasing tenfold compared to just two years ago.
It’s not just about the overall facility size either. Traditionally, you might have had a 50-megawatt campus spread across multiple buildings. Today, we need 20-megawatt single suites — one data hall, one customer, one use case — and that’s ten times larger than what we needed just a couple of years ago.
Our biggest challenge has been finding data center providers nimble enough to adapt mid-build. They might start designing for 2-megawatt halls, then suddenly need to pivot to 4, then 8 megawatts — all within months. That kind of rapid change is unheard of historically.
Amelia Dalton: AI workloads are really shaking up the usual assumptions and designs we’ve relied on, right?
David Driggers: Absolutely. Data center designers tend to be conservative — they build slowly, and facilities are meant to last 30 years. But all bets are off now. Hardware is evolving every 12 to 18 months, and densities are skyrocketing. Two years ago, racks were around 50 kilowatts. Now they’re 100 kilowatts, and soon they’ll be 400.
It’s hard to design long-term infrastructure when the requirements are changing that quickly.
Amelia Dalton: What are the key trade-offs engineers face today when balancing power efficiency, cooling complexity, and network interconnect speed?
David Driggers: The biggest driver of density is backend networking — how GPUs communicate with each other and how servers interconnect. That communication needs to be as fast as possible.
As speeds increase, we have to use technologies that only work over very short distances — measured in inches rather than feet or meters. That’s what’s driving these dense, tightly integrated systems, especially for AI training workloads.
Amelia Dalton: There’s an approach often called “just add GPUs,” but that doesn’t really work without a broader systems perspective, right?
David Driggers: Exactly. These are full-system solutions, much like traditional supercomputers — except they’re the largest we’ve ever built. They require immense power, cooling, and connectivity.
And like any supercomputer, performance is limited by the weakest link. You could have thousands of servers running a single job, but if one node fails, the whole job fails. We’re constantly identifying and fixing bottlenecks to improve overall system performance.
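To make the “weakest link” point concrete, here’s a minimal, illustrative Python sketch (the reliability numbers are hypothetical, not Cirrascale’s): if any single node failure kills a synchronous training job, per-node reliability compounds multiplicatively with cluster size.

```python
def job_survival(p_node: float, n_nodes: int) -> float:
    """Probability a synchronous job finishes its run window when any
    single node failure kills the whole job (independent failures assumed)."""
    return p_node ** n_nodes

# Even 99.9% per-node reliability over a run window erodes quickly
# once thousands of nodes must all stay up at once.
for n in (10, 1000, 5000):
    print(n, job_survival(0.999, n))
```

The point of the sketch: at 10 nodes the job almost always survives, but at 5,000 nodes the same per-node reliability makes an uninterrupted run a rarity, which is why checkpointing and rapid bottleneck hunting matter at this scale.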
Amelia Dalton: When companies start setting up their own sovereign or on-prem AI systems, what new challenges do they face compared to using cloud providers?
David Driggers: It’s extremely difficult to do this alone because everything is evolving so quickly. Power, cooling, networking, density, and system management all have to work together.
Most organizations need to rely on hyperscalers or neocloud providers who understand the full system. Even companies like NVIDIA are trying to simplify things by offering reference architectures, but that doesn’t solve the data center challenges.
Enterprises typically approach this as a one-off project. By the time they figure it out, the technology has changed, capacity requirements have shifted, or the space they planned to use is no longer available. Right now, finding data center space within 12 to 18 months is already a challenge.
That’s why many organizations prefer partners who allow more flexible, just-in-time decisions rather than committing to a 30-year infrastructure investment.
Amelia Dalton: All right, David — it’s time for your off-the-cuff question. If you could have one meal right now, anywhere in the world, what would it be?
David Driggers: I’d probably go to Thailand and have their lemongrass soup.
Amelia Dalton: That sounds lovely — yes, please! Well, this was super cool. Thank you so much for joining me.
David Driggers: Not a problem. Thanks for having me.
Have you heard about the amazing new research coming out of Emory University on dusty plasma?
A team of theoretical and experimental physicists recently published a study in PNAS demonstrating that AI can do far more than just predict outcomes or process data. By combining laboratory observations of dusty plasma with a specialized neural network, they showed that AI can help discover entirely new laws of physics.
Justin Burton, an Emory professor of experimental physics and co-author of the study, explains:
“We showed that we can use AI to discover new physics. Our AI method is not a black box — we understand how and why it works. The framework is universal and could be applied to many-body systems to open new routes to discovery.”
So, what exactly is dusty plasma?
Plasma is often called the fourth state of matter — an ionized gas in which electrons and ions move freely, giving it properties like electrical conductivity. It makes up about 99.9% of the visible universe, from the solar wind to lightning.
Dusty plasma includes additional charged dust particles and appears in environments like Saturn’s rings, Earth’s ionosphere, and even wildfire smoke. On the Moon, low gravity allows charged dust to hover above the surface.
To study this, researchers recreated dusty plasma in vacuum chambers, suspending tiny particles and observing their motion under controlled conditions. Using a tomographic imaging system with high-speed cameras and laser sheets, they reconstructed 3D particle motion over time.
After more than a year of training, their AI model was able to decompose the forces acting on each particle into three contributions: drag, environmental forces like gravity, and inter-particle forces. It even captured complex, asymmetric interactions between particles.
One example: a leading particle attracts a trailing particle, but the trailing particle repels the leader — a non-reciprocal force that’s extremely difficult to model traditionally.
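To illustrate what “non-reciprocal” means here, this is a purely hypothetical toy model in Python — not the researchers’ learned force law — showing a pair interaction that violates Newton’s third law: the force one particle exerts on the other depends on which one is leading.

```python
import numpy as np

def pair_force(r_source, r_target, a_attract=1.0, a_repel=0.5):
    """Toy non-reciprocal interaction: the force a 'source' particle exerts
    on a 'target' depends on their order along the x (flow) axis, so the
    two forces in a pair are NOT equal and opposite."""
    d = r_source - r_target
    dist = np.linalg.norm(d)
    unit = d / dist
    # Hypothetical rule, echoing the observation in the study:
    # a trailing particle is pulled toward the leader, while the
    # leader is pushed away from the trailer.
    if r_target[0] < r_source[0]:          # target trails the source
        return a_attract * unit / dist**2  # attraction toward leader
    else:                                  # target leads the source
        return -a_repel * unit / dist**2   # repulsion away from trailer

leader = np.array([1.0, 0.0, 0.0])
trailer = np.array([0.0, 0.0, 0.0])

f_on_trailer = pair_force(leader, trailer)  # points toward the leader
f_on_leader = pair_force(trailer, leader)   # points away from the trailer
# Both forces point in +x and have different magnitudes, so
# f_on_trailer != -f_on_leader: the interaction is non-reciprocal.
```

Because momentum is not conserved pairwise in such a system, standard symmetric force models can’t capture it — which is part of why these interactions are so hard to model with traditional methods.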
The AI achieved over 99% accuracy in describing these forces and challenged long-standing assumptions about how particle size affects charge and interaction strength.
Looking ahead, the team has developed a physics-based neural network that can run on a standard desktop computer, offering a powerful tool for studying many-body systems across multiple fields.
As co-author Ilya Nemenman puts it:
“We can now see what’s occurring in exquisite detail and correct inaccuracies in long-standing theories.”
And Justin Burton sums it up perfectly:
“Used properly, AI can open doors to entirely new realms of discovery.”
From dusty plasma to Star Trek — I’d say that’s a pretty good leap.
If you’d like more information about Cirrascale or this study from Emory University, I’ve included links below the player on this week’s Fish Fry page on EEJournal.com and in the YouTube description.
Hey, have you checked out EEJournal on social media? You can find us on Facebook, LinkedIn, BlueSky, and Mastodon. And don’t forget our YouTube channel — it’s packed with tech content, including our popular Chalk Talk webcast series.
Thanks for tuning in! If you’ve got a cool new technology to share — or just want to chat — shoot me a line at amelia@eejournal.com or post a comment on our forums.
For the week of May 1, 2026, I’m Amelia Dalton — and you’ve been fried.



