Memory & Flash Crisis Update (March 2026)


The global memory market entered 2026 in a state of structural supply constraint. AI infrastructure demand has reallocated semiconductor manufacturing capacity toward HBM for GPU accelerators, creating scarcity in conventional DRAM and NAND flash products.

How to Think about VAST Data


If you’ve been tracking the enterprise infrastructure space for the past few years, you’ve probably encountered VAST Data. And if you’re like most IT practitioners I talk to, you’ve probably filed them in the “high-performance storage vendor” folder in your brain.

Everpure: Pure Storage’s Rebrand & Evolution to Data Management Platform


Pure Storage has rebranded as Everpure, reflecting a multi-year evolution from its roots in performance flash storage into the broader data management market. The transition is accompanied by the company's intent to acquire 1touch. […]

SUSE Acquires Losant: Extending the Open Edge Stack into Industrial IoT


SUSE announced the acquisition of Losant, a Cincinnati-based industrial IoT (IIoT) platform. The acquisition marks a material expansion for SUSE, moving the company from being primarily an edge infrastructure provider (built around SUSE Linux Micro and K3s) to the application and orchestration layer, where operational data from industrial devices is aggregated, visualized, and acted on.

IBM FlashSystem: Next Generation, Autonomous Storage Meets Agentic AI


IBM recently announced a significant refresh of its FlashSystem all-flash storage portfolio, replacing the 5300, 7300, and 9500 product lines with new 5600, 7600, and 9600 models. The company also introduced its new FlashSystem.ai, an agentic AI administration layer that IBM claims can reduce manual storage management effort by up to 90%.

Research Note: Microsoft Azure Maia 200 Inference Accelerator


Microsoft recently announced its second-generation custom AI accelerator, the Maia 200. The new chip is an inference-optimized alternative to third-party GPUs in its Azure infrastructure. The company says the accelerator delivers 30% better performance per dollar than existing Azure hardware while supporting OpenAI’s GPT-5.2 models and Microsoft’s own synthetic data generation workloads.

Research Note: WD Innovation Day


Western Digital’s February 2026 Innovation Day showed a company fundamentally transformed from its legacy PC-centric storage roots into a critical AI infrastructure provider. The presentations unveiled innovations that challenge long-held assumptions about the limits of hard drive technology.