As AI adoption becomes mainstream in organizations, the storage infrastructure that supports the large datasets fueling AI models becomes critical. IBM Storage has been working with NVIDIA on NVIDIA DGX POD reference architectures built with NVIDIA DGX systems and IBM Spectrum Scale storage. IBM is simplifying AI development by delivering high-performance storage for NVIDIA GPUs in enterprise AI workflows. IBM Spectrum Scale combined with ESS 3000 2U NVMe building blocks delivers up to 40 GB/s of throughput, with linear scalability as additional ESS 3000 nodes are attached to an NVIDIA DGX cluster.
IBM Storage for Data and AI now brings data orchestration capabilities that differentiate our solution from other storage options, transforming the ESS 3000 from high-performance storage into a high-performance smart storage tier. It can connect the ESS 3000 to an organization's file and object data lakes and cache the required data so that AI modeling can run in an NVIDIA DGX POD. Only IBM can intelligently cache required data from file and object storage into a global federated namespace that can span up to 8 YB. This approach allows IBM customers to optimize TCO and speed productivity while enjoying superior performance for their AI workloads.
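In IBM Spectrum Scale, this kind of caching from an external data lake is typically set up with Active File Management (AFM) cache filesets. The sketch below is illustrative only, not a definitive configuration: the file system name `ess3000_fs`, the fileset name `training_cache`, and the NFS target `datalake01` are hypothetical, and an existing Spectrum Scale cluster with AFM gateway nodes is assumed.

```shell
# Hypothetical sketch: create a read-only AFM cache fileset on the ESS 3000
# file system that caches data on demand from a data-lake NFS export.
# Assumes an existing IBM Spectrum Scale cluster with AFM gateway nodes;
# all names below are placeholders.

# Create the cache fileset; files are fetched from the home export on first access.
mmcrfileset ess3000_fs training_cache --inode-space new \
  -p afmMode=ro,afmTarget=nfs://datalake01/export/training_data

# Link the fileset into the global namespace so DGX nodes can read it.
mmlinkfileset ess3000_fs training_cache -J /gpfs/ess3000_fs/training_cache

# Optionally prefetch a list of files into the NVMe tier before a training run.
mmafmctl ess3000_fs prefetch -j training_cache --list-file /tmp/train_files.list
```

With a layout like this, training jobs on the DGX nodes read from the local NVMe-backed cache at ESS 3000 speeds, while cold data remains on the lower-cost data lake until it is referenced or prefetched.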