Research Note: NetApp AI Data Announcements @ GTC 2025

At the recent GTC 2025 event, NetApp announced, in collaboration with NVIDIA, a comprehensive set of product validations, certifications, and architectural enhancements to its intelligent data products.

The announcements include NetApp’s integration with the NVIDIA AI Data Platform, support for NVIDIA’s latest accelerated computing systems, and expanded availability of enterprise-grade AI infrastructure offerings, including NetApp AFF A90 and NetApp AIPod.

AFF A90 Certification for NVIDIA DGX SuperPOD

NetApp’s AFF A90 storage systems now support validated deployments with NVIDIA DGX SuperPOD. This combination provides a converged infrastructure for large-scale training and inferencing, enabling:

  • Integrated data management via NetApp ONTAP.
  • Support for secure, scalable, multi-tenant AI environments.
  • Enterprise-grade resilience, data protection, and anti-ransomware capabilities.
  • Compatibility with NVIDIA DGX B200, delivering what NetApp claims is 3× training performance and 15× inference performance relative to prior generations.

Certification for NVIDIA Cloud Partner Reference Architectures

NetApp AFF A90 systems are now certified for NVIDIA Cloud Partner reference architectures built on HGX H200- and B200-based platforms. These environments provide the infrastructure that AI-as-a-service providers use to deliver managed AI and private cloud workloads. The certification includes:

  • High-performance block and file storage from NetApp AFF A90 systems.
  • Scalable throughput across secure multi-tenant environments.
  • Full ONTAP feature set including SnapMirror® replication, storage efficiency, and automation tools.

AIPod Integration with NVIDIA-Certified Systems

NetApp AIPod is now validated under the NVIDIA-Certified Storage program, providing pre-validated enterprise storage solutions for a wide range of AI workloads. The latest updates to AIPod include:

  • Support for NVIDIA L40S and HGX H200 compute platforms.
  • Integration with the NVIDIA AI Enterprise software stack, including NIM microservices (a brief usage sketch follows this list).
  • Infrastructure alignment with RAG and agentic AI use cases.
  • An enhanced configuration built on Lenovo infrastructure, supporting Red Hat OpenShift and Lenovo XClarity.
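
Because NIM microservices expose an OpenAI-compatible API, applications running against AIPod-hosted data can call them with standard client libraries. The sketch below is a generic illustration, not part of NetApp's announcement; it assumes a hypothetical locally deployed NIM endpoint at http://localhost:8000/v1 and an example model name.

```python
# Minimal sketch: querying a NIM microservice through its OpenAI-compatible API.
# Assumptions (not from NetApp's announcement): the NIM container is reachable at
# localhost:8000 and serves the example model "meta/llama-3.1-8b-instruct".
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",  # hypothetical local NIM endpoint
    api_key="not-used",                   # local NIM deployments typically ignore the key
)

response = client.chat.completions.create(
    model="meta/llama-3.1-8b-instruct",   # example model; substitute your deployment's model
    messages=[
        {"role": "user", "content": "Summarize last quarter's support tickets."}
    ],
    max_tokens=256,
)

print(response.choices[0].message.content)
```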

NVIDIA AI Data Platform

NetApp aligned its architecture with the NVIDIA AI Data Platform reference design, offering a composable infrastructure for data-intensive AI applications. The integration includes:

  • A global metadata namespace enabling centralized data discovery and feature extraction.
  • An integrated AI data pipeline capable of tracking incremental data changes, classifying unstructured data, and generating compressed vector embeddings for semantic search and RAG inferencing (the retrieval step is illustrated generically after this list).
  • A disaggregated storage architecture optimizing compute and flash storage utilization across hybrid cloud deployments.
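
The pipeline NetApp describes, detecting changed data, embedding it, and retrieving semantically similar content at inference time, follows the general RAG pattern. The sketch below illustrates only that generic embed-and-retrieve step, not NetApp's implementation; it assumes the open-source sentence-transformers library and an example embedding model.

```python
# Generic illustration of the embed-and-retrieve step behind semantic search / RAG.
# This is not NetApp's pipeline; it assumes the sentence-transformers library and
# the example embedding model "all-MiniLM-L6-v2".
import numpy as np
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("all-MiniLM-L6-v2")

# Document chunks that an upstream pipeline might have classified and split.
chunks = [
    "Q3 revenue grew 12% year over year, driven by cloud storage subscriptions.",
    "The incident report describes a failed firmware upgrade on the backup array.",
    "Employee onboarding requires completing security training within 30 days.",
]

# Embed chunks once; normalized vectors make cosine similarity a dot product.
chunk_vecs = model.encode(chunks, normalize_embeddings=True)

def retrieve(query: str, k: int = 2) -> list[str]:
    """Return the k chunks most semantically similar to the query."""
    q = model.encode([query], normalize_embeddings=True)[0]
    scores = chunk_vecs @ q
    top = np.argsort(scores)[::-1][:k]
    return [chunks[i] for i in top]

# The retrieved chunks would then be passed as context to an LLM (e.g., a NIM endpoint).
print(retrieve("What caused the storage outage?"))
```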

Analysis

Its latest announcements show that NetApp can serve as the primary storage and data management layer for enterprise AI workloads. Aligning its product portfolio with NVIDIA’s reference architectures and AI software ecosystem strengthens its relevance across training, inferencing, and RAG-based agentic AI pipelines.

Deep integration with NVIDIA’s AI infrastructure stack, including support for new generative AI agents and microservices, allows NetApp to differentiate through its mature data services, hybrid cloud operability, and enterprise-grade security posture.

As enterprises adopt and deploy AI workloads, NetApp’s ability to unify high-performance storage with mature data management and hybrid cloud capabilities creates a durable advantage. Its announcements at GTC show that it is keeping pace and remains a strong contender for AI-focused enterprise infrastructure needs.

Competitive Outlook & Advice to IT Buyers

These sections are only available to NAND Research clients. Please reach out to info@nand-research.com to learn more.

Disclosure: The author is an industry analyst, and NAND Research is an industry analyst firm, that engages in, or has engaged in, research, analysis, and advisory services with many technology companies, which may include those mentioned in this article. The author does not hold any equity positions with any company mentioned in this article.
