Storage Leaders Build Infrastructure to Fuel AI Agents With NVIDIA AI Data Platform

The world’s leading storage and server manufacturers are combining their design and engineering expertise with the NVIDIA AI Data Platform, a customizable reference design for building a new class of AI infrastructure, to deliver systems that enable a new generation of agentic AI applications and tools.

The reference design is now being harnessed by storage system leaders worldwide to support AI reasoning agents and unlock the value of knowledge stored in the millions of documents, videos and PDFs that enterprises use.

NVIDIA-Certified Storage partners DDN, Dell Technologies, Hewlett Packard Enterprise, Hitachi Vantara, IBM, NetApp, Nutanix, Pure Storage, VAST Data and WEKA are introducing products and solutions built on the NVIDIA AI Data Platform, which incorporates NVIDIA accelerated computing, networking and software.

In addition, AIC, ASUS, Foxconn, Quanta Cloud Technology, Supermicro, Wistron and other original design manufacturers (ODMs) are creating new storage and server hardware platforms that support the NVIDIA reference design. These platforms feature NVIDIA RTX PRO 6000 Blackwell Server Edition GPUs, NVIDIA BlueField DPUs and NVIDIA Spectrum-X Ethernet networking, and are optimized to run NVIDIA AI Enterprise software.

Such integrations allow enterprises across industries to quickly deploy storage and data platforms that scan, index, classify and retrieve large stores of private and public documents in real time. This augments AI agents as they reason and plan to solve complex, multistep problems.

Building agentic AI infrastructure with these new AI Data Platform-based solutions can help enterprises turn data into actionable knowledge using retrieval-augmented generation (RAG) software, including NVIDIA NeMo Retriever microservices and the AI-Q NVIDIA Blueprint.
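To illustrate the idea behind RAG, the toy sketch below retrieves the most relevant documents for a query and prepends them as context to the prompt a language model would receive. The scoring function and all names are illustrative only; production systems such as NeMo Retriever use learned embedding models and vector indexes rather than word overlap.

```python
# Conceptual sketch of retrieval-augmented generation (RAG).
# The corpus, scoring function and prompt format are hypothetical,
# not APIs of any NVIDIA product.

def score(query: str, doc: str) -> int:
    """Toy relevance: count query words that appear in the document."""
    doc_words = set(doc.lower().split())
    return sum(1 for w in query.lower().split() if w in doc_words)

def retrieve(query: str, corpus: list[str], k: int = 2) -> list[str]:
    """Return the k highest-scoring documents for the query."""
    return sorted(corpus, key=lambda d: score(query, d), reverse=True)[:k]

def build_prompt(query: str, corpus: list[str]) -> str:
    """Augment the query with retrieved context before generation."""
    context = "\n".join(retrieve(query, corpus))
    return f"Context:\n{context}\n\nQuestion: {query}"

corpus = [
    "Quarterly revenue grew 12 percent year over year.",
    "The support team resolved 95 percent of tickets within a day.",
    "Employees may carry over up to five vacation days.",
]
print(build_prompt("How quickly does the support team resolve tickets?", corpus))
```

Grounding the model's answer in retrieved enterprise documents, rather than in its training data alone, is what lets agents respond accurately about private content they were never trained on.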

Storage systems built with the NVIDIA AI Data Platform reference design turn data into knowledge, boosting agentic AI accuracy across many use cases. This can help AI agents and customer service representatives provide quicker, more accurate responses.

With more access to data, agents can also generate interactive summaries of complex documents, and even videos, for researchers of all kinds. Plus, they can help cybersecurity teams keep software secure.

Leading Storage Providers Showcase AI Data Platform to Power Agentic AI

Storage system leaders play a critical role in providing the AI infrastructure that runs AI agents.

Embedding NVIDIA GPUs, networking and NIM microservices closer to storage enhances AI queries by bringing compute closer to critical content. Storage providers can integrate their document-security and access-control expertise into content-indexing and retrieval processes, improving security and data privacy compliance for AI inference.

Data platform leaders such as IBM, NetApp and VAST Data are using the NVIDIA reference design to scale their AI technologies.

IBM Fusion, a hybrid cloud platform for running virtual machines, Kubernetes and AI workloads on Red Hat OpenShift, offers content-aware storage services that unlock the meaning of unstructured enterprise data, enhancing inferencing so AI assistants and agents can deliver better, more relevant answers. Content-aware storage enables faster time to insights for AI applications using RAG when combined with NVIDIA GPUs, NVIDIA networking, the AI-Q NVIDIA Blueprint and NVIDIA NeMo Retriever microservices, all part of the NVIDIA AI Data Platform.

NetApp is advancing enterprise storage for agentic AI with the NetApp AIPod solution built on the NVIDIA reference design. NetApp incorporates NVIDIA GPUs in data compute nodes to run NVIDIA NeMo Retriever microservices and connects those nodes to scalable storage with NVIDIA networking.

VAST Data is integrating NVIDIA AI-Q with the VAST Data Platform to deliver a unified, AI-native infrastructure for building and scaling intelligent multi-agent systems. With high-speed data access, enterprise-grade security and continuous learning loops, organizations can now operationalize agentic AI systems that drive smarter decisions, automate complex workflows and unlock new levels of productivity.

ODMs Innovate on AI Data Platform Hardware

Drawing on their extensive experience in server and storage design and manufacturing, ODMs are working with storage system leaders to more quickly bring innovative AI Data Platform hardware to enterprises.

ODMs provide the chassis design, GPU integration, cooling innovation and storage media connections needed to build AI Data Platform servers that are reliable, compact, energy efficient and affordable.

A large share of the ODM industry comprises manufacturers based or colocated in Taiwan, making the region a vital hub for producing the hardware that runs scalable agentic AI, inference and AI reasoning.

AIC, based in Taoyuan City, Taiwan, is building flash storage servers, powered by NVIDIA BlueField DPUs, that enable higher throughput and greater power efficiency than traditional storage designs. These arrays are deployed in many AI Data Platform-based designs.

ASUS partnered with WEKA and IBM to showcase a next-generation unified storage system for AI and high-performance computing workloads, addressing a broad spectrum of storage needs. The RS501A-E12-RS12U, a WEKA-certified software-defined storage solution, overcomes traditional hardware limitations to deliver exceptional flexibility, supporting file, object and block storage, as well as all-flash, tiering and backup capabilities.

Foxconn, based in New Taipei City, builds many of the industry’s accelerated servers and storage platforms used for AI Data Platform solutions. Its subsidiary Ingrasys offers NVIDIA-accelerated GPU servers that support the AI Data Platform.

Supermicro is using the reference design to build its intelligent all-flash storage arrays powered by the NVIDIA Grace CPU Superchip or BlueField-3 DPU. The Supermicro Petascale JBOF and Petascale All-Flash Array Storage Server deliver high performance and power efficiency with software-defined storage vendors and are suited for use with AI Data Platform solutions.

Quanta Cloud Technology, also based in Taiwan, is designing and building accelerated server and storage appliances that include NVIDIA GPUs and networking. They are well-suited to run NVIDIA AI Enterprise software and support AI Data Platform solutions.

Taipei-based Wistron and Wiwynn offer innovative hardware designs compatible with the AI Data Platform, incorporating NVIDIA GPUs, NVIDIA BlueField DPUs and NVIDIA Ethernet SuperNICs for accelerated compute and data movement.

Learn more about the latest agentic AI advancements at NVIDIA GTC Taipei, running May 21-22 at COMPUTEX.