
The Memory Wall: Why AI’s Next Bottleneck Isn’t Compute

Written by Dave Mazza | February 17, 2026

For most of the last two years, the AI investment story revolved around a single question: can we build the models fast enough? That was the compute phase of the trade. GPUs, accelerators, and raw processing power were the constraint, and the market priced that reality aggressively.

That constraint is shifting.

As AI moves from experimentation to deployment at scale, the bottleneck is no longer just compute. The industry is now running into what engineers have long warned about: the memory wall. Processing power has advanced faster than the ability to move, access, and store data efficiently. At this stage, AI systems are increasingly limited by memory bandwidth and capacity rather than raw compute.

In short, AI is becoming memory bound, not compute bound. That evolution has meaningful implications for how the next phase of the AI trade plays out.
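What "memory bound" means is concrete enough to sketch. Engineers compare a workload's arithmetic intensity (useful math per byte of data moved) against a chip's ratio of peak compute to memory bandwidth; below that line, the processor sits idle waiting on memory. A back-of-the-envelope illustration in Python, using hardware numbers that are assumptions rather than any specific product's specs:

```python
# Roofline-style check: is a workload limited by compute (FLOPs)
# or by memory bandwidth (bytes moved)? Hardware numbers are
# illustrative assumptions, not any specific product's specs.

PEAK_FLOPS = 1000e12      # assumed accelerator peak: 1,000 TFLOP/s
MEM_BANDWIDTH = 4e12      # assumed HBM bandwidth: 4 TB/s

# Ridge point: the arithmetic intensity (FLOPs per byte) needed to
# keep the compute units busy instead of waiting on memory.
RIDGE = PEAK_FLOPS / MEM_BANDWIDTH   # 250 FLOPs per byte here

def bound_by(flops_per_byte: float) -> str:
    """Classify a workload by its arithmetic intensity."""
    return "compute bound" if flops_per_byte >= RIDGE else "memory bound"

print(bound_by(600))   # large-batch training reuses weights: compute bound
print(bound_by(2))     # token-by-token inference: memory bound
```

Training at large batch sizes reuses model weights heavily and can stay compute bound. Generating tokens one at a time reads essentially the full model for very little math per byte, which is why inference at scale keeps running into the memory wall.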

How Memory Went From Commodity to Gating Factor

Memory has historically been one of the most unforgiving corners of the semiconductor market. DRAM (Dynamic Random Access Memory) and NAND (NOT-AND) flash were treated as commodities: cycles were driven by supply expansions, and periods of strong profitability were often followed by overinvestment and price collapses. Investors learned to be skeptical.

AI is changing that framework.

Modern AI workloads behave very differently from traditional computing workloads. Training large language models requires massive datasets. Running them in production generates constant streams of inference data, logs, and telemetry. None of that data disappears. It is accessed repeatedly, stored indefinitely, and often duplicated across systems.

That accumulation effect matters. Memory and storage are no longer just inputs. They are increasingly the gating factor for scale, cost, and performance.
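A simple model shows how quickly that accumulation compounds. The figures below are illustrative assumptions, not forecasts:

```python
# The accumulation effect: AI data behaves like a stock, not a flow.
# If deployments generate new data each year, creation accelerates,
# and most of it is retained, the installed storage base compounds.
# All numbers below are illustrative assumptions.

def stored_after(years: int, first_year_pb: float,
                 growth: float, retention: float) -> float:
    """Cumulative petabytes retained after `years` of deployment."""
    total, generated = 0.0, first_year_pb
    for _ in range(years):
        total += generated * retention
        generated *= 1 + growth   # data creation itself accelerates
    return total

# 100 PB generated in year one, creation growing 30%/yr, 80% retained:
print(f"{stored_after(5, 100, 0.30, 0.80):.0f} PB after 5 years")
# ~723 PB stored, versus 400 PB if creation stayed flat -- retained
# data compounds even when nothing about the models changes.
```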

The AI Memory Stack, Explained

AI systems rely on several distinct layers of memory and storage, each playing a different role in the data lifecycle. What stands out in this cycle is that demand is tightening across all of them at once, not just in one isolated segment.

High Bandwidth Memory

High bandwidth memory, or HBM, sits physically next to AI accelerators and GPUs. It is designed for extremely high throughput and low latency, which makes it essential for training and inference workloads where speed and power efficiency matter more than capacity.

HBM is also difficult to manufacture. Advanced packaging, stacking, and yield challenges mean supply cannot be expanded quickly. That has turned HBM into one of the most strategically important components in the AI supply chain. HBM pricing and availability today look far more structural than cyclical.
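One way to see why supply responds so slowly: stacking multiplies yield loss. A simplified sketch, assuming each die and bonding step succeeds independently (the 95% per-step figure is purely illustrative):

```python
# Why HBM supply expands slowly: yield compounds across the stack.
# If each die and bonding step succeeds independently with the same
# probability, an n-high stack survives only if every layer does.
# The 95% per-step yield below is an illustrative assumption.

def stack_yield(per_step_yield: float, stack_height: int) -> float:
    """Compound yield of a die stack under independent steps."""
    return per_step_yield ** stack_height

for height in (4, 8, 12, 16):
    print(f"{height}-high stack at 95% per step: "
          f"{stack_yield(0.95, height):.0%}")
# 4-high: ~81%, 8-high: ~66%, 12-high: ~54%, 16-high: ~44%.
# Each added layer multiplies the loss, so taller stacks get
# disproportionately harder and more expensive to build.
```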

SK Hynix (000660 KS) has emerged as the clear leader in HBM and has become a critical supplier to the AI ecosystem. Samsung (005930 KS) participates across HBM, DRAM, and NAND, bringing scale and breadth that few competitors can match.

DRAM

DRAM serves as the system memory in AI servers. It supports data preparation, CPU-side workloads, and inference pipelines that operate alongside accelerators.

As manufacturers divert capacity toward higher-value HBM production, traditional DRAM supply tightens. At the same time, AI-driven data center demand continues to grow. That combination has restored pricing power in a segment that historically struggled to sustain it. This reallocation dynamic is one of the underappreciated drivers of earnings leverage in memory today.
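The mechanics of that reallocation are worth a quick sketch. HBM is frequently estimated to consume on the order of three times the wafer area per bit of commodity DRAM; treating that ratio, and the figures below, as assumptions, shifting wafers to HBM shrinks total bit supply:

```python
# The reallocation squeeze: HBM is widely estimated to consume
# roughly 3x the wafer area per bit of commodity DRAM. Treat that
# ratio, and all figures here, as illustrative assumptions.

TRADE_RATIO = 3.0       # assumed wafer area per bit, HBM vs. DRAM
TOTAL_WAFERS = 100.0    # normalized monthly wafer starts

def bit_supply(hbm_share: float) -> tuple[float, float]:
    """Relative bit output (commodity DRAM, HBM) at a given HBM share."""
    dram_bits = TOTAL_WAFERS * (1 - hbm_share)           # 1 bit-unit/wafer
    hbm_bits = TOTAL_WAFERS * hbm_share / TRADE_RATIO    # 1/3 per wafer
    return dram_bits, hbm_bits

for share in (0.0, 0.2, 0.4):
    dram, hbm = bit_supply(share)
    print(f"HBM wafer share {share:.0%}: DRAM bits {dram:.0f}, HBM bits {hbm:.1f}")
# Shifting 20% of wafers to HBM removes 20 bit-units of commodity
# DRAM but adds only ~6.7 of HBM -- total supply tightens even as
# the revenue mix improves.
```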

Among memory chip manufacturers, Micron (MU) sits at the center of this shift, with exposure to both AI-driven DRAM demand and advanced memory products.

NAND Flash and Enterprise SSDs

If HBM and DRAM determine how fast AI models run, NAND flash and solid-state drives (SSDs) determine where AI data lives.

Training datasets, embeddings, inference outputs, and enterprise AI deployments all rely on fast, reliable storage. As AI adoption scales, data creation accelerates, and storage requirements grow accordingly.

SanDisk (SNDK) is directly exposed to this trend through its focus on NAND flash and enterprise SSDs. Kioxia (285A JP) remains one of the largest NAND producers globally, supplying flash and SSDs across cloud and enterprise markets.

This layer benefits not only from performance needs, but from the sheer volume of data AI generates.

Hard Disk Drives and Long-Term Storage

Despite advances in flash storage, hard disk drives (HDDs) remain essential at scale. HDDs offer the lowest cost per bit for storing massive datasets, backups, and archival data. AI does not eliminate traditional storage. It increases demand for it.
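The cost math explains the persistence. Using illustrative dollar-per-terabyte figures (assumptions, not market quotes):

```python
# Why HDDs anchor long-term AI storage: cost per bit at scale.
# The dollar-per-terabyte figures are illustrative assumptions,
# not market quotes.

HDD_PER_TB = 15.0    # assumed enterprise HDD cost, $/TB
SSD_PER_TB = 60.0    # assumed enterprise SSD cost, $/TB

ARCHIVE_PB = 100                  # a 100 PB archival tier
archive_tb = ARCHIVE_PB * 1000

print(f"HDD tier: ${archive_tb * HDD_PER_TB / 1e6:.1f}M")   # $1.5M
print(f"SSD tier: ${archive_tb * SSD_PER_TB / 1e6:.1f}M")   # $6.0M
# At these assumptions flash carries a 4x premium. For cold data
# that is read rarely, capacity cost dominates performance, and
# HDDs win.
```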

Western Digital (WDC) provides exposure to both enterprise SSDs and high-capacity HDDs that are critical for long-term AI data retention. Seagate Technology (STX) plays a similar role, with revenue tied closely to hyperscale data growth rather than short-term semiconductor pricing cycles.

These companies sit at the far end of the AI data lifecycle, where persistence matters more than speed.

Specialty and Embedded Memory

Beyond the data center, AI also drives demand for specialty memory used in automotive systems, industrial equipment, networking, and edge devices.

GigaDevice (3986 HK) focuses on NOR flash and embedded memory with long design cycles. Winbond Electronics (2344 TT) and Nanya Technology (2408 TT) provide exposure to specialty and commodity DRAM markets tied to global supply-demand dynamics.

These segments tend to be less volatile and benefit from long-term customer commitments rather than spot pricing.

Why This Memory Cycle Looks Different

Investors are right to be cautious given memory’s history. But this cycle differs in important ways.

First, AI-driven demand is structural. Once AI systems are deployed, they require ongoing inference, retraining, monitoring, and data retention. Memory demand becomes recurring rather than episodic.

Second, supply discipline has improved. Years of poor returns forced manufacturers to rationalize capex and prioritize higher-value products instead of flooding the market with capacity.

The Bottom Line

The AI bottleneck has moved.

The industry is transitioning from a phase defined by compute to one constrained by memory bandwidth and storage capacity. That shift is pulling demand across every layer of the memory stack, from high-bandwidth chips next to accelerators to the drives storing petabytes of training data.

Memory has long been viewed as a commodity. In the age of AI, it is increasingly becoming infrastructure.



This information is provided solely as general investment education. None of the information provided should be regarded as a suggestion to engage in or refrain from any investment-related course of action. Investing involves risk; loss of principal is possible.

Not an offer: This document does not constitute advice or a recommendation or offer to sell or a solicitation to deal in any security or financial product. It is provided for information purposes only and on the understanding that the recipient has sufficient knowledge and experience to be able to understand and make their own evaluation of the proposals and services described herein, any risks associated therewith and any related legal, tax, accounting or other material considerations. To the extent that the reader has any questions regarding the applicability of any specific issue discussed above to their specific portfolio or situation, prospective investors are encouraged to contact 1-855-561-5728 or consult with the professional advisor of their choosing.

Forward-looking statements: Certain information contained herein constitutes “forward-looking statements,” which can be identified by the use of forward-looking terminology such as “may,” “will,” “should,” “expect,” “anticipate,” “project,” “estimate,” “intend,” “continue,” or “believe,” or the negatives thereof or other variations thereon or comparable terminology. Due to various risks and uncertainties, actual events, results or actual performance may differ materially from those reflected or contemplated in such forward-looking statements. Nothing contained herein may be relied upon as a guarantee, promise, assurance or a representation as to the future.

Use of Third-party Information: Certain information contained herein has been obtained from third party sources and such information has not been independently verified by Roundhill Financial Inc. No representation, warranty, or undertaking, expressed or implied, is given to the accuracy or completeness of such information by Roundhill Financial Inc. or any other person. While such sources are believed to be reliable, Roundhill Financial Inc. does not assume any responsibility for the accuracy or completeness of such information. Roundhill Financial Inc. does not undertake any obligation to update the information contained herein as of any future date.

Any indices and other financial benchmarks shown are provided for illustrative purposes only, are unmanaged, reflect reinvestment of income and dividends and do not reflect the impact of advisory fees. Investors cannot invest directly in an index.

Except where otherwise indicated, the information contained in this presentation is based on matters as they exist as of the date of preparation of such material and not as of the date of distribution or any future date. Recipients should not rely on this material in making any future investment decision. The performance data quoted represents past performance. Past performance does not guarantee future results. Current performance may be lower or higher than the performance data quoted.