DDN Unleashes New AI400X2 Turbo for Superior Data Throughput

Key takeaways:

* DDN has launched the AI400X2 Turbo, a new version of its renowned high-end storage appliance that speeds up training of advanced AI models.
* The AI400X2 Turbo delivers a 30% performance boost over its predecessor, enabling more efficient workflows for customers.
* The new AI400X2 Turbo is designed to streamline backup and checkpoint processes during training of large language models (LLMs).
* AI400X2 storage powers Nvidia's Eos, the ninth-largest supercomputer in the world.

DataDirect Networks (DDN) has launched the AI400X2 Turbo, an enhanced version of its high-end storage appliance for advanced AI and high-performance computing. Its flagship customer, Nvidia, uses AI400X2 systems to run Eos, one of the world's top supercomputers.

AI400X2 Turbo: A Game Changer in High-Speed Storage Solutions

The upgraded AI400X2 Turbo delivers 30% better performance than the previous model, significantly improving the user experience. The new version helps customers efficiently train large language models on Nvidia GPUs, enhancing their data management and analysis. With a long-standing heritage in storage development, DDN further solidifies its market position by addressing the rising demand for high-speed storage in the modern AI era.

Unlike traditional big data processing, the volume of training data for large language models (LLMs) is relatively modest. The crucial storage requirement is instead the constant backup and checkpointing that takes place during LLM training. To meet this need, Nvidia adopted AI400X2 units roughly two years ago. Collectively, these storage systems can deliver 1 TB/sec for reads and 500 GB/sec for writes.
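A back-of-envelope calculation shows why aggregate write bandwidth is the figure that matters for checkpointing. The model size and bytes-per-parameter below are illustrative assumptions, not figures from DDN or Nvidia:

```python
# Rough estimate: time to flush one full model checkpoint at a given
# aggregate write rate. Model size is a hypothetical example.

def checkpoint_seconds(params_billions: float, bytes_per_param: int,
                       write_gb_per_sec: float) -> float:
    """Seconds to write one checkpoint of the model weights."""
    checkpoint_gb = params_billions * bytes_per_param  # 1e9 params * bytes = GB
    return checkpoint_gb / write_gb_per_sec

# A hypothetical 175B-parameter model, 2 bytes/param (fp16 weights only),
# written at the 500 GB/sec aggregate rate cited for Nvidia's AI400X2 fleet:
t = checkpoint_seconds(175, 2, 500)
print(f"{t:.1f} s per checkpoint")  # 0.7 s
```

In practice optimizer state multiplies the checkpoint size severalfold, so sustained write bandwidth directly determines how often a training run can safely checkpoint.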

The AI Power Behind Nvidia’s EOS Supercomputer

Nvidia adopted the AI400X2 to power its Eos supercomputer with DDN's storage. Eos, unveiled in March 2022, houses 48 AI400X2 appliances. The SuperPOD design, packed with over 4,600 H100 GPUs and 576 DGX systems, can read at 4.3 TB/sec and write at 3.1 TB/sec. These storage systems proved instrumental for Nvidia, which surpassed its initial target of roughly 2 TB/sec.
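Dividing the aggregate Eos figures by the 48 appliances gives a rough per-appliance rate, and applying the quoted 30% uplift gives a projected Turbo figure. These are estimates derived from the numbers above, assuming the uplift applies uniformly:

```python
# Per-appliance throughput implied by the Eos figures, plus a projected
# Turbo rate from the quoted 30% uplift (estimates, not vendor specs).

EOS_APPLIANCES = 48
EOS_READ_TBPS = 4.3   # aggregate read, TB/sec
EOS_WRITE_TBPS = 3.1  # aggregate write, TB/sec

per_appliance_read_gb = EOS_READ_TBPS * 1000 / EOS_APPLIANCES
per_appliance_write_gb = EOS_WRITE_TBPS * 1000 / EOS_APPLIANCES
turbo_read_gb = per_appliance_read_gb * 1.3  # assumes uniform 30% uplift

print(f"read:  {per_appliance_read_gb:.0f} GB/sec per appliance")   # 90
print(f"write: {per_appliance_write_gb:.0f} GB/sec per appliance")  # 65
print(f"turbo read (est.): {turbo_read_gb:.0f} GB/sec")             # 116
```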

All of these capabilities are packed into a 2U appliance with the new AI400X2 Turbo. The 30% improvement benefits customers in several ways: accomplishing more work in the same time frame, finishing tasks sooner, or completing the same tasks with fewer storage systems.

An Investment Beyond Hardware

The total throughput of the AI400X2 Turbo scales further when multiple appliances are paired with Nvidia DGX systems or SuperPODs. Although the hardware is essential, DDN's pitch extends beyond the equipment itself. Given the large expenditures on infrastructure, data scientists, cooling, networks, and data centers, productivity gains become paramount: by spending roughly 5% of the budget on DDN's storage, customers can expect higher overall productivity.

Despite uncertain market conditions, DDN has seen a sales boom driven by the rise of GenAI: the company reported that its 2023 AI storage sales doubled its 2022 numbers. The AI400X2 Turbo will be available soon. Each appliance holds 2.5-inch NVMe drives, with capacities scaling from 30 TB to 500 TB.

In summary, the new AI400X2 Turbo marks a substantial step forward for both DDN and Nvidia. It amplifies the capabilities of high-performance computing and AI infrastructure, positioning both companies to serve a wider and more demanding customer base.

Jonathan Browne
Jonathan Browne is the CEO and Founder of Livy.AI
