AI infrastructure development is shifting from compute-intensive training toward scalable inference workloads, and this shift is fundamentally changing the requirements placed on storage architectures. Traditional combinations of HBM, DRAM, and conventional enterprise SSDs are reaching their technical and economic limits when large model parameters must remain persistently accessible. Against this backdrop, consistent reports indicate that NVIDIA and SK hynix are working on a new storage approach that integrates NAND flash far more closely into the AI data path.
The focus is an internal project called "Storage Next", aimed at a purpose-built AI SSD. The device is not intended to function as classic mass storage but as an inference-optimized intermediate storage tier, offering high parallelism, very low latency, and a significantly higher IOPS rate than today's enterprise SSDs. Figures of up to 100 million IOPS are being quoted, many times what current high-end NVMe drives deliver; these numbers cannot currently be independently verified.
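To put the quoted figure in perspective, a short back-of-envelope calculation (illustrative only; the access sizes are assumptions, not from the report) shows what sustained throughput 100 million IOPS would imply:

```python
# Illustrative arithmetic, not from the report: the throughput implied
# by a 100 million IOPS figure at a few hypothetical access sizes.
IOPS = 100_000_000  # claimed target, unverified

def implied_gbps(iops: int, block_bytes: int) -> float:
    """Equivalent throughput in GB/s for a given access size."""
    return iops * block_bytes / 1e9

for size in (512, 4096):
    print(f"{size} B accesses -> {implied_gbps(IOPS, size):.1f} GB/s")
# prints: 512 B accesses -> 51.2 GB/s
#         4096 B accesses -> 409.6 GB/s
```

At 4 KiB accesses, the claim would correspond to roughly 400 GB/s, an order of magnitude beyond what a single PCIe 5.0 x4 SSD can physically deliver, which underlines why the reports assume new controller architectures and a closer GPU connection rather than a conventional NVMe design.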
The technical motivation lies in the growing size of modern AI models. Even generously sized HBM on GPUs cannot hold all parameters of the largest models, DRAM-based extensions are costly and power-hungry, and conventional SSDs respond too slowly. A purpose-built NAND solution with adapted controllers, optimized access paths, and a close GPU connection could create an intermediate tier that combines capacity, throughput, and efficiency.
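The tiering idea behind such an intermediate layer can be sketched in a few lines. The following is a minimal, hypothetical model (not NVIDIA's or SK hynix's design): hot parameter shards stay in a small fast tier standing in for HBM/DRAM, while misses are served from a flash-backed store, and the miss counter makes the cost of the slow path visible.

```python
# Minimal sketch (hypothetical, for illustration only) of a two-tier
# parameter store: a small LRU "fast" tier in front of a flash tier.
from collections import OrderedDict

class TieredParamStore:
    def __init__(self, fast_capacity: int, flash: dict):
        self.fast = OrderedDict()           # models the HBM/DRAM tier (LRU)
        self.capacity = fast_capacity       # shards the fast tier can hold
        self.flash = flash                  # models the NAND tier
        self.flash_reads = 0                # counts slow-path accesses

    def get(self, shard_id):
        if shard_id in self.fast:           # hit in the fast tier
            self.fast.move_to_end(shard_id)
            return self.fast[shard_id]
        self.flash_reads += 1               # miss: pay the flash latency
        value = self.flash[shard_id]
        self.fast[shard_id] = value
        if len(self.fast) > self.capacity:  # evict least-recently-used shard
            self.fast.popitem(last=False)
        return value
```

In this toy model, every flash read stands for the latency gap the reported AI SSD is meant to narrow: the smaller that gap, the less it matters how often the working set spills out of the fast tier.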
SK hynix is reportedly planning to present a market-ready prototype by 2027 at the latest, with first demonstrations possibly taking place as early as the end of 2026. NVIDIA is expected to contribute its experience in GPU and interconnect design in particular, while SK hynix is responsible for the NAND technology and controller architecture. To date there has been no official confirmation of concrete specifications, interfaces, or software stacks.
Beyond the technical opportunities, the potential market impact is also coming into focus. The global NAND market is already under pressure from the growing storage demands of cloud service providers and large AI operators. Should an AI SSD establish itself as the standard for inference clusters, demand for high-quality NAND could accelerate further. Observers draw parallels to the current DRAM market, where AI-driven demand has led to supply shortages and rising prices. Whether and to what extent this development will reach the consumer market cannot be reliably predicted at present.
Conclusion
The planned AI SSD from NVIDIA and SK hynix makes clear that storage architectures have become a key bottleneck in modern AI systems. An inference-optimized NAND solution could make technical sense as a way to serve large model parameters efficiently and keep GPU resources better utilized. At the same time, the approach risks adding tension to the NAND market, especially if demand from large AI players keeps accelerating. In the absence of verified technical data and official announcements, the actual performance and market significance of this technology remain to be seen.
| Source | Key message | Link |
|---|---|---|
| Chosun Biz | Report on the internal project “Storage Next”, in which NVIDIA and SK hynix are jointly developing an inference-optimized AI SSD based on NAND, possible prototype presentation by 2026, market maturity from around 2027 | https://biz.chosun.com/it-science/ict/2025/12/16/AI-SSD-NVIDIA-SKHYNIX |
| The Korea Economic Daily | Analysis of the strategic cooperation between NVIDIA and SK hynix in the field of AI storage, focus on new NAND controller architectures and high IOPS figures for inference workloads | https://www.kedglobal.com/semiconductor/newsView/ked202512160012 |
| TrendForce | Classification of the potential impact of special AI SSDs on the global NAND market, increasing demand from AI data centers, parallels to current DRAM market development | https://www.trendforce.com/news/2025/12/16/ai-ssd-nand-demand-impact |