Research Note: Microsoft Azure Maia 200 Inference Accelerator
Microsoft recently announced its second-generation custom AI accelerator, the Maia 200. The chip is positioned as an inference-optimized alternative to third-party GPUs within Microsoft's Azure infrastructure. According to the company, the accelerator delivers 30% better performance per dollar than existing Azure hardware, and it supports OpenAI's GPT-5.2 models as well as Microsoft's own synthetic data generation workloads.
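For context on how a performance-per-dollar claim like the 30% figure is typically derived, the sketch below works through the arithmetic. All throughput and hourly-cost numbers are hypothetical placeholders chosen only to illustrate the calculation; they are not figures disclosed by Microsoft for Maia 200 or for any Azure GPU SKU.

```python
# Illustrative only: how a performance-per-dollar comparison is computed.
# The throughput and cost figures below are made-up placeholders, not
# numbers from Microsoft's announcement.

def perf_per_dollar(tokens_per_second: float, cost_per_hour: float) -> float:
    """Tokens generated per dollar of instance time."""
    tokens_per_hour = tokens_per_second * 3600
    return tokens_per_hour / cost_per_hour

# Hypothetical baseline GPU instance vs. hypothetical Maia 200 instance.
baseline = perf_per_dollar(tokens_per_second=10_000, cost_per_hour=12.0)
maia_200 = perf_per_dollar(tokens_per_second=11_700, cost_per_hour=10.8)

improvement = (maia_200 / baseline - 1) * 100
print(f"Baseline: {baseline:,.0f} tokens/$")
print(f"Maia 200: {maia_200:,.0f} tokens/$")
print(f"Improvement: {improvement:.1f}%")  # ~30% with these example inputs
```

The point of the sketch is that "performance per dollar" folds together both throughput gains and instance pricing, so a 30% improvement can come from faster silicon, cheaper hourly rates, or a mix of the two.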