16 March 2026, 06:01, by Samir Bashir
The memory industry is currently raking in huge profits thanks to the AI boom, but behind the scenes, nervousness is growing. While Samsung and SK hynix are benefiting massively from the current demand for DRAM and HBM, that is precisely the problem: anyone who expands too aggressively today could find themselves tomorrow sitting on a newly built billion-dollar factory producing for a declining market. And that wouldn’t be the first time. The current report paints a picture that is as old in the semiconductor industry as the price decline following every phase of euphoria: As long as AI servers, HBM orders, and data center infrastructure are soaking up everything, additional investments seem unavoidable. But as soon as demand normalizes, that same expansion shifts from a growth driver to a burden. This is precisely why Samsung and SK hynix are apparently proceeding more cautiously with the expansion of their DRAM capacities than the current market situation would actually suggest.
The fact that DRAM is currently in short supply is no longer a bold claim, but has long been evident in contract and spot prices. Price trends over the past few quarters clearly show that classic DRAM and, above all, HBM are not just performing well, but have in some cases already fallen into a structural shortage. To outsiders, this may initially sound like a luxury problem. For manufacturers, however, it is a double-edged sword. After all, memory is not a business where you can just “produce a little more.” New production lines, new cleanroom space, new packaging capacities, and the repurposing of existing manufacturing facilities cost time, capital, and, above all, nerves. Anyone investing in a high-price environment always does so with the risk that these very prices will collapse before the new capacity even begins to yield significant returns.
And this is precisely where the current caution makes sense. Internally, Samsung appears to assume that the current phase of scarcity is not a permanent state. A market that emerges from the acute bottleneck by around 2028 would not be a shock, but rather almost the baseline scenario for anyone who has ever experienced a memory cycle. The real problem runs deeper than simply “too little DRAM.” The industry is currently not just building more memory, but shifting priorities in favor of HBM. And that is precisely what is further narrowing the supply of conventional DRAM products.
HBM is currently the cash cow of the memory industry. AI accelerators from NVIDIA and AMD—and, in the future, custom ASICs as well—are devouring high-bandwidth memory in quantities that, just a few years ago, sounded like the stuff of lab projects. The catch: HBM ties up not only wafer capacity but, above all, highly specialized packaging and testing resources. This involves TSV processes, stack assembly, thermal validation, and yield management at a level of complexity unknown to traditional PC or smartphone DRAM. The consequence is grim but logical: the more Samsung and SK hynix prioritize their most profitable HBM lines, the more pressure classic DRAM segments come under. This is precisely why costs are rising not only in the server sector but are increasingly spilling over into consumer devices. Smartphones are already feeling this particularly acutely, as memory’s share of the bill of materials is once again gaining significant weight.
One need only look at the period following the pandemic peak to understand why the current caution is not a sign of weakness, but pure self-preservation. After the post-COVID slump in demand, manufacturers were left with excess capacity, weak PC demand, and sluggish enterprise purchases. The result was predictable: inventories rose, prices fell, margins were squeezed, and suddenly the supposedly stable market had turned back into a classic semiconductor rollercoaster. Samsung, in particular, knows this game all too well. While the company can afford to invest countercyclically, even a giant of this size doesn’t expand capacity out of sheer nostalgia. SK hynix, on the other hand, has signaled on multiple occasions in the past that it plans expansion very strategically and with an eye on actual demand, not wishful thinking. This is not restraint born of fear, but of experience. Or to put it another way: No one in Seoul wants to sink billions into concrete, equipment, and cleanrooms in 2026, only to wage discount wars against their own inventory again in 2028.
For device manufacturers, the situation is grim. When memory remains scarce and the most profitable products are prioritized internally, unit costs rise along the entire supply chain. Smartphones, notebooks, SSD-based storage solutions, and even mainstream server platforms come under pressure as a result. This isn’t immediately passed on to customers on a one-to-one basis everywhere, but OEMs’ margin buffers aren’t infinite. In the smartphone segment in particular, a rise in DRAM costs can quickly become problematic. If the memory share of the BoM increases by double-digit percentages, this becomes immediately apparent in aggressively priced mid-range and volume models. Then there are only three options: lower margins, inferior features, or higher retail prices. Of those three, manufacturers rarely choose the most customer-friendly one.
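The arithmetic behind that squeeze is simple enough to sketch. The numbers below are purely illustrative assumptions for a hypothetical mid-range phone, not figures from the article or any real BoM:

```python
# Hypothetical illustration: how a DRAM cost increase propagates to the
# retail price of a mid-range phone. All figures are assumptions.

def retail_price_needed(bom: float, other_costs: float, margin: float) -> float:
    """Retail price that preserves a target gross margin for a given cost base."""
    return (bom + other_costs) / (1.0 - margin)

non_memory_bom = 170.0   # assumed non-memory BoM (USD)
dram_cost = 30.0         # assumed DRAM share of the BoM (USD)
other_costs = 40.0       # assumed assembly, logistics, etc. (USD)
margin = 0.25            # assumed target gross margin

before = retail_price_needed(non_memory_bom + dram_cost, other_costs, margin)
after = retail_price_needed(non_memory_bom + dram_cost * 1.5, other_costs, margin)

print(f"retail before DRAM hike: {before:.2f}")   # 320.00
print(f"retail after 50% hike:   {after:.2f}")    # 340.00
print(f"retail increase:         {after - before:.2f}")  # 20.00

# Alternatively, if the retail price is held at the old level, the margin
# absorbs the hit instead:
squeezed_margin = 1.0 - (non_memory_bom + dram_cost * 1.5 + other_costs) / before
print(f"margin if price is held: {squeezed_margin:.1%}")  # 20.3%
```

Under these assumed numbers, a 50 percent jump in DRAM cost adds 15 USD to the cost base but 20 USD to the margin-preserving retail price; swallowing it instead cuts the gross margin from 25 to roughly 20 percent. Either way, the effect is far more visible on a 320 USD device than on a flagship.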
The current trend is neither a panic reaction nor an operational mishap. Samsung and SK hynix are doing exactly what rational memory manufacturers must do in an overheated market: they are profiting from the shortage without blindly rushing into the next round of overcapacity. This may be unwelcome for OEMs and become more expensive for end customers, but from the suppliers’ perspective, it is simply sensible. The crucial point, therefore, is not whether more capacity is being built, but how much, how quickly, and for which segment. As long as HBM delivers the highest margins, traditional DRAM will remain, in many areas, the product that is needed but not necessarily prioritized. And this is precisely what gives rise to what we are currently seeing: not a genuine shortage in the traditional sense, but a strategically managed scarcity driven by clear profit logic. So anyone expecting memory prices to soon fall back into their old comfort zones would be wise to let go of that notion. The industry is not banking on a relaxation of the situation, but on discipline. And in the memory market, discipline is often just a more elegant way of saying: We’re supplying just enough to cause maximum pain, but not enough to end the party.
Samsung and SK hynix are scaling back their DRAM expansion plans: Is there concern about potential overcapacity? | igor´sLAB
