Last week I was back in Las Vegas for Dell Technologies World 2025. The company has a broad portfolio, but it was telling a single-minded story: a relentless focus on AI, centered on the Dell AI Factory and the shift toward on-premises enterprise AI.
Key Takeaways from Vegas
- AI is the new electricity — Michael Dell, quoting every keynote since 2023
- Liquid-cooled racks are now boardroom conversation
- NVIDIA everywhere, but little sign of open alternatives
- AI infrastructure is now a boxed solution — compute, storage, software, and integration, all in one (vendor-branded) bow
- Dell stock popped on the news — Wall Street loves a good AI growth story
AI Factory 2.0
A year after the first iteration, Dell rolled out the Dell AI Factory with NVIDIA 2.0, and this wasn’t just another product suite. It was a full-court press to own the enterprise AI conversation.
The message? Forget the hyperscalers. Build AI on-prem, your way, with Dell:
- New PowerEdge servers with NVIDIA Blackwell GPUs (hello, B100s) and optional liquid cooling for the thermally inclined.
- Project Lightning: a high-speed parallel file system for AI data ops. Think of it as the plumbing for Dell’s AI Factory.
- ObjectScale + PowerScale refreshes: tuned for inferencing workloads, now with S3 over RDMA — because low-latency storage is suddenly sexy again.
- Partnerships galore: Google (Gemini models on-prem), Cohere, Hugging Face — Dell’s calling in the LLM cavalry.
Quick take: Dell’s betting that AI workloads won’t all live in the public cloud. And if enterprise buyers want AI inside the firewall, Michael Dell wants to be their guy.
Storage in the Spotlight
This is the first time in recent memory that storage was featured in the keynote, let alone presented by Michael Dell himself. This year storage wasn’t just a footnote; it was core to the AI Factory narrative.
- ObjectScale enhancements promise up to a 230% throughput improvement via S3 over RDMA. That’s big for inferencing at scale (a minimal access sketch follows this list).
- PowerScale + Lightning = high-performance file access, tailored for agentic and streaming AI workloads.
- Telemetry and power efficiency baked in — Dell’s whispering “sustainability,” but shouting “cost savings.”
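To ground the storage talk in something concrete, here is a minimal sketch of the access pattern these announcements target: an inferencing pipeline pulling model artifacts from an S3-compatible object store such as ObjectScale via boto3. The endpoint, credentials, bucket, and key are hypothetical, and the RDMA-accelerated data path Dell touts sits below this API layer (likely with its own client-side support), so treat this as the baseline S3 pattern such workloads build on, not Dell's implementation.

```python
# Minimal sketch: fetching a model checkpoint from an S3-compatible object
# store (e.g., ObjectScale) ahead of inferencing. Endpoint, credentials,
# bucket, and key are placeholders, not real Dell values. The RDMA data
# path announced at the show is handled below this API and isn't shown.
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://objectscale.example.internal",  # placeholder endpoint
    aws_access_key_id="EXAMPLE_KEY",                       # placeholder credentials
    aws_secret_access_key="EXAMPLE_SECRET",
)

# Pull a checkpoint object and write it to local disk for the inference
# runtime to load.
response = s3.get_object(Bucket="model-artifacts", Key="llm/checkpoint-0001.bin")
with open("checkpoint-0001.bin", "wb") as f:
    f.write(response["Body"].read())
```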
Bottom Line
Dell wants to own the AI infrastructure narrative — not by going head-to-head with hyperscalers, but by rewriting the playbook for on-prem AI. With NVIDIA as co-pilot, it’s betting that the next generation of enterprise AI won’t be built in someone else’s cloud.
The question is: Will CIOs buy the Factory… or piece together their own?
Keep watching: Dell’s now firmly planted at the intersection of infrastructure, enterprise AI, and geopolitical tech policy.