Weekly: Memory is cyclical
It’s the Lunar New Year holiday here in Asia, so I will be off for all of next week. If something urgent happens, I may be back, as I was when the DeepSeek news broke during last year’s LNY holiday. Happy New Year!
Highlights
Memory is cyclical. Lots on the memory supercycle today. SemiAnalysis does a comprehensive review of the memory cycles in recent history, noting that “while there are clear similarities to prior cycles, this supercycle is shaping up to be both larger and longer in duration.” Robert Armstrong of the FT writes similarly, reminding investors that the memory industry is famously cyclical. And Daniel Tudor profiles memory leader SK Hynix for the FT.
Middle powers in the AI race. Sam Winter-Levy and Anton Leicht of the Carnegie Endowment for International Peace write of the trend towards AI bipolarity, in which the US and China emerge as winners in the race to the detriment of middle and smaller powers. They consider how middle powers might extricate themselves from this trap and how countries may respond to this seemingly impending future.
State of the CPU. SemiAnalysis offers another in-depth technical report on the state of the CPU. They describe how the CPU lost its status in the era of AI data centres, where GPUs and networking chips became the most important components. The research firm writes that the data centre CPU is becoming relevant again, with implications for vendors such as Intel.
Thanks for reading.
Table of Contents
Dylan Patel, Ray Wang, Myron Xie, Doug, and Jeff Koch, “Memory Mania: How a Once-in-Four-Decades Shortage Is Fueling a Memory Boom,” SemiAnalysis, 02/06/2026.
Robert Armstrong, “Memory investors have forgotten the last cycle,” FT, 02/10/2026.
Daniel Tudor, “How a ‘zombie’ chipmaker became Nvidia’s vital AI ally,” FT, 02/11/2026.
Sam Winter-Levy and Anton Leicht, “The AI Divide: How U.S.-Chinese Competition Could Leave Most Countries Behind,” Foreign Affairs, 02/10/2026.
David Sacks and Steve Honig, “U.S.-Taiwan Trade Agreement Leaves Major Questions Open,” CFR, 02/12/2026.
Gerald Wong and Dylan Patel, “CPUs are Back: The Datacenter CPU Landscape in 2026,” SemiAnalysis, 02/10/2026.
Ian King and Maggie Eastland, “Nvidia Is Heading Back to China. Here’s Why It’s Fraught,” Bloomberg, 02/07/2026.
The Economist, “Arm wants a bigger slice of the chip business,” The Economist, 02/12/2026.
1.
Dylan Patel, Ray Wang, Myron Xie, Doug, and Jeff Koch, “Memory Mania: How a Once-in-Four-Decades Shortage Is Fueling a Memory Boom,” SemiAnalysis, 02/06/2026.
Prices of memory are going crazy. SemiAnalysis has been calling this out for over a year, since late 2024. The scariest thing is that we aren’t even close to the peak. We go through fab-by-fab production and expansion versus detailed end-market demand by memory type to forecast memory revenue, pricing, and margin better than anyone else. This has all been detailed in the SemiAnalysis memory model for a while, but we will share it more publicly today.
The memory industry has been defined by commoditization, which comes with cyclicality. This outcome reflects a combination of industry-wide competitive behavior, recurring lapses in capital discipline, and the nature of DRAM scaling we explained earlier.
At its core, memory’s cyclicality is driven by timing mismatches between demand changes and corresponding supply responses. Aside from the buffer of short-term inventories, DRAM supply is not very flexible. It can take years to bring meaningful new DRAM supply online, trying to meet demand that fluctuates daily.
For those who have lived through multiple memory cycles, the central question when it comes to this supercycle is the same: when will this cycle peak? It is natural that both investors and the supply chain remain cautious, particularly as memory stocks rally sharply over short periods. In our view, however, while there are clear similarities to prior cycles, this supercycle is shaping up to be both larger and longer in duration, driven by dynamics that are very much unique to this cycle.
Currently, the DRAM industry is operating in a deeply supply-constrained environment, and based on our Memory Industry Model, we believe the supply–demand imbalance is deteriorating rather than normalizing.
2.
Robert Armstrong, “Memory investors have forgotten the last cycle,” FT, 02/10/2026.
These are all companies that make computer memory. Western Digital and Seagate make hard disc drives (slow, low-cost storage). Micron, SK Hynix and Samsung focus on Dram (fast, expensive memory for use in active applications) and SanDisk specialises in Nand (middle-priced, solid-state memory for quick retrieval). It is easy to be hypnotised by the staggering 1,200 per cent rise in SanDisk’s shares in the past six months. But the others are all up 180 to 280 per cent over the same period. SanDisk, Western and Micron are the three best-performing stocks in the S&P 500 over the past year.
It’s been a blistering run, driven by the AI boom’s apparently bottomless thirst for memory. Memory prices have been on a tear. SanDisk has been the star recently because it has only just become clear how much Nand AI applications will require. The company has gone from burning cash in 2024 to generating almost a billion dollars in free cash flow in the past quarter alone.
The memory industry is intensely cyclical. Demand rises and falls quickly and supply is slow to respond, leading to shortages, gluts and wild price swings. Between mid-2020 and January of 2022, shares in Western, Seagate, Micron and SK Hynix all rose 100 per cent or more, only to give it all back in the next nine months.
There were cycles of equal or even greater violence that peaked in 2014 and 2018. But the sheer speed of the most recent run in share prices throws previous iterations into the shade. So: will the decline be as violent as the rise? When will it come? Or is this cycle different?
Still, everyone expects the expansionary stage of this cycle to be longer than usual — potentially another several years — because demand is so extraordinary. “You look at history, you look back five years, it’s going to correct,” says Jonathan Goldberg of Digits to Dollars Advisory. “The amplitude of this cycle is greater, so it can go up from here. [But] there are a lot of [investors] in semiconductors who weren’t around five years ago, who will say this time is different. But the fact is the cycle has not changed.”
Others argue that the emergence of High Bandwidth Memory means that things will be different this time. HBM is a specialised form of Dram for high-performance computing made by Samsung, SK Hynix and Micron. “So much of this cycle is being driven by HBM,” says Ben Bajarin of Creative Strategies. “There is more differentiation; it does not become a commodity any time soon . . . I think there is a new floor for memory revenues.”
3.
Daniel Tudor, “How a ‘zombie’ chipmaker became Nvidia’s vital AI ally,” FT, 02/11/2026.
For years the most typical career aspiration for a young Korean would have been to snag a job at Samsung. But a recent survey of young jobseekers showed the country has a new most sought-after employer: SK Hynix.
The chipmaker is enjoying its most successful period thanks to dominance of one of the global economy’s most critical technologies — the high-bandwidth memory chips that are powering AI development.
SK Hynix has outmuscled better known chipmakers, including its great rival Samsung, to claim more than half of the global market for HBM chips, which allow the massive volumes of data required for AI to flow at high speeds. The company is the primary HBM supplier to Nvidia and was recently selected by Microsoft for its own proprietary AI chips.
Shortages of HBM as well as less powerful Dram and NAND memory are propelling prices higher, with SK Hynix reaping the benefit. Its fourth-quarter revenues were 66 per cent higher than a year earlier while its operating margins were 58 per cent, better than even TSMC, the world’s biggest chipmaker. SK Hynix’s market capitalisation is up 340 per cent over the past 12 months to Won640tn ($438bn).
Once “a follower”, SK Hynix was “now a shaper” of the chip sector, said Kwon Seok-joon of Sungkyunkwan University in Seoul. “SK Hynix has made the memory constraint its advantage.”
The company is preparing to double down on AI, moving away from just making chips by committing $10bn of capital to an “AI solutions firm”.
AI would be a “fourth quantum leap” for SK Group, the sprawling chaebol, or large family-owned conglomerate, to which SK Hynix belongs, group chair Chey Tae-won has said.
4.
Sam Winter-Levy and Anton Leicht, “The AI Divide: How U.S.-Chinese Competition Could Leave Most Countries Behind,” Foreign Affairs, 02/10/2026.
The future of artificial intelligence will be controlled by the United States and China. The two countries employ 70 percent of the world’s top machine learning researchers, command 90 percent of global computing power, and attract the vast majority of AI investment—more than twice the total of every other state combined. In past technological revolutions, powers that were not at the frontier could gradually adopt new capabilities and catch up. But the AI revolution will be different, locking those countries into a strategic trap that could consign much of the world to technological vassalage.
This trap particularly affects what might be called the AI middle powers: countries such as France, India, and the United Kingdom, which have substantial state capacity and economic resources but lack the scale, capital, energy, and computing power to build frontier AI systems on their own. These powers face three principal challenges. First, their access to frontier AI capabilities is subject to the whims of policymakers in Washington and Beijing. Second, they remain exposed to AI’s disruptive effects—including job losses, social upheaval, and the expansion of AI-enhanced cybercrime—whether or not they share in its benefits. Third, they lack the leverage and the policy tools necessary to shape AI’s development or manage its consequences.
Enduring marginalization is not inevitable. But avoiding it will require the middle powers to retain access to frontier AI capabilities and identify what economic and strategic value they can offer to a world transformed by AI systems. Different paths to these goals remain open: some middle powers may choose to align themselves with the United States or China, some may attempt to play Washington and Beijing against each other to extract concessions, and others may mount an ambitious attempt at technological sovereignty. But all of them will ultimately have to reckon with what a global AI economy might look like—and where they might find leverage within it.
5.
David Sacks and Steve Honig, “U.S.-Taiwan Trade Agreement Leaves Major Questions Open,” CFR, 02/12/2026.
Taiwan has become only the seventh U.S. trading partner to reach a Reciprocal Trade Agreement with the Trump administration.
Taiwan’s leading role in semiconductor production and information and communication technology (ICT) goods is largely driving this sharp rise in two-way trade. Taiwanese firms account for 60 percent of all foundry revenue and produce over 90 percent of the most advanced chips, while also exporting other ICT hardware to the United States. If the AI boom continues and the U.S.-China trade war intensifies, Taiwan may leapfrog China and become America’s third-largest trading partner.
One throughline from President Trump’s first term to the Biden administration and now to President Trump’s second term is the perceived need to reshore chip production to the United States. Undergirding this push is the desire to not be reliant on any single foreign source for chips that are vital to the U.S. economy and to national security. Put bluntly, the United States does not want to find itself in the position of being unable to fight a war over Taiwan because it cannot get the chips it needs for its weapons. The CHIPS and Science Act, signed into law in 2022, attempted to turbocharge these efforts by providing billions in funding for semiconductor manufacturing, workforce development, and research.
The Trump administration has continued to prioritize semiconductor manufacturing, but at the same time has rolled back funding for research and development and worker training (President Trump has called the CHIPS and Science Act “a horrible, horrible thing” and urged Congress to “get rid” of it). Due largely to the threat of tariffs, Taiwan Semiconductor Manufacturing Company (TSMC) increased its investment pledge to $165 billion, which will go toward chip fabrication and processing plants as well as a research and development facility in Arizona. Secretary of Commerce Howard Lutnick has shared that the Trump administration is seeking to bring 40 percent of Taiwan’s semiconductor supply chain to the United States. Lutnick added, “We’re going to bring it all over so we become self-sufficient in the capacity of building semiconductors.”
Such talk of self-sufficiency has raised concerns in Taiwan that its “silicon shield” is eroding and that the United States is seeking to make the island expendable. Wu Cheng-wen, the head of Taiwan’s National Science and Technology Council, has insisted Taiwan will not allow its chip industry to be “hollowed out.” Taiwan’s vice premier, Cheng Li-chiun, has stated that Lutnick’s goal is not realistic and that the island’s “most advanced R&D and manufacturing processes must be carried out first in Taiwan.”
Achieving the Trump administration’s onshoring goals will prove difficult. Over the course of decades, Taiwan has built and developed an ecosystem of critical supplier networks and human capital, which will be hard to replicate in the United States. In addition, manufacturing costs in the United States are higher than they are in Taiwan. TSMC has noted that talent shortages, equipment maintenance issues, and labor laws have presented challenges for its operations in Arizona. Nonetheless, TSMC recently announced that it intends to build up to 12 fabs in Arizona, up from the initial six announced in early 2025. The company is also introducing its advanced 2nm process a year earlier than planned.
6.
Gerald Wong and Dylan Patel, “CPUs are Back: The Datacenter CPU Landscape in 2026,” SemiAnalysis, 02/10/2026.
Since 2023, the datacenter story has been simple. GPUs and networking are king. The arrival and subsequent explosion of AI Training and Inference have shifted compute demands away from the CPU. This meant that Intel, the primary supplier of server CPUs, failed to ride the wave of datacenter buildout and spending. Server CPU revenue remained relatively stagnant as hyperscalers and neoclouds focused on GPUs and datacenter infrastructure.
At the same time, the same hyperscalers have been rolling their own ARM-based datacenter CPUs for their cloud computing services, closing off a significant addressable market for Intel. And within its own x86 turf, Intel’s lackluster execution and uncompetitive performance relative to rival AMD have further eroded its market share. Without a competent AI accelerator offering, Intel was left to tread water while the rest of the industry feasted.
Over the last six months this has changed massively. We have posted multiple reports to Core Research and the Tokenomics Model about soaring CPU demand. The primary drivers we have shown and modeled are the incredible CPU demand generated by reinforcement learning and vibe coding. We have also covered major CPU cloud deals between multiple vendors and AI labs, and modeled how many CPUs of which types are being deployed.
However, Intel’s recent rallies and changing demand signals in the latter part of 2025 have shown that CPUs are now relevant again. In its latest Q4 earnings, Intel reported an unexpected uptick in datacenter CPU demand in late 2025, and is increasing its 2026 capex guidance on foundry tools and shifting wafer allocation from PC to server to alleviate supply constraints in serving this new demand. This marks an inflection point in the role of CPUs in the datacenter, with AI model training and inference using CPUs more intensively.
2026 is an exciting year for the datacenter CPU, with many new generations launching this year from all vendors amid the boom in demand. As such, this piece serves to paint the CPU landscape in 2026. We lay the groundwork, covering the history of the datacenter CPU and the evolving demand drivers, with deep dives on datacenter CPU architecture changes from Intel and AMD over the years.
We then focus on the 2026 CPUs, with comprehensive breakdowns on Intel’s Clearwater Forest, Diamond Rapids and AMD’s Venice and their interesting convergence (and divergence) in design, discussing the performance differences and previewing our CPU costing analysis.
Next, we detail the ARM competition, including NVIDIA’s Grace and Vera, Amazon’s Graviton line, Microsoft’s Cobalt, Google’s Axion CPU lines, Ampere Computing’s merchant ARM silicon bid and its acquisition by SoftBank, ARM’s own Phoenix CPU design, and Huawei’s home-grown Kunpeng CPU efforts.
7.
Ian King and Maggie Eastland, “Nvidia Is Heading Back to China. Here’s Why It’s Fraught,” Bloomberg, 02/07/2026.
Why was Nvidia locked out of China?
Citing national security concerns, Washington limited the export of high-performance semiconductors to China during Trump’s first presidency, a policy that was upheld by his successor Joe Biden and tightened again when Trump returned to the White House. Policymakers still worry the same chips that power AI services such as ChatGPT and Claude risk accelerating the development of China’s nuclear weapon systems, military intelligence and cyberwarfare capabilities.
US export license requirements have effectively blocked Nvidia and US rival Advanced Micro Devices Inc. from supplying their best products to Chinese customers. The two California-based companies lead the market for AI accelerators — the chips that create and then run AI software. Both tried to offer China-only versions of their product lines. These parts are less capable and are designed not to trigger the rules. Even these offerings were later caught in a tightening of the restrictions or largely rejected by Chinese customers under instruction from Beijing.
Why is China crucial for Nvidia?
In the first place, it’s the world’s biggest single market for chips, accounting for $229 billion in annual revenue — a third of the industry total — in 2024. Huang expects Chinese companies will be spending $50 billion annually on AI chips alone within two to three years.
For Huang, it’s not just about securing a share of that business. Nvidia’s decisive technological edge has made it the world’s preeminent provider of the equipment powering the AI boom. By forcing China’s army of AI developers to seek out alternatives to Nvidia for their computation needs, the US chip export restrictions have energized China’s efforts to create an AI chip ecosystem that could ultimately threaten Nvidia’s leadership. The longer Nvidia is excluded from China, the bigger the potential market opportunity for Chinese rivals such as Huawei Technologies Co. and Cambricon Technologies Corp.
8.
The Economist, “Arm wants a bigger slice of the chip business,” The Economist, 02/12/2026.
Arm’s current model captures only a sliver of the value it creates. Analysts expect revenue this fiscal year to be around $5bn, with half from royalties and the rest from licensing fees. That is up by about 20% from 2025, but still dwarfed by the revenues of bigger chipmakers such as Nvidia, Broadcom or Intel. According to Visible Alpha, a data provider, last year Arm earned royalties of $0.86 per mobile chip, or 2.5-5% of the price.
The company is eager to extract more value. But how? To illustrate, Mr Haas uses an analogy. For most of its history, Arm sold designs for individual processors. Think of them as “Lego bricks”. Recently it has also started selling blueprints for pre-assembled blocks of processors known as “subsystems”. Bloomberg Intelligence, a research group, estimates that these bring the company three times as much revenue per chip. Arm believes that subsystems could make up over half of its royalties within a couple of years.
At some point, however, subsystems become whole toys. One option is to develop custom chips for cloud providers. That has proved lucrative for Broadcom: making bespoke chips for Google and Amazon has helped push its market value above $1.6trn (Arm is worth $135bn). Some analysts think Arm will go further and design and sell its own chips. Rumours suggest that Meta, a social-media giant, will be the first customer. Mr Haas is careful not to be drawn.
Either route would bring Arm a bigger cut from its designs, but would entail risks. Creating finished chips, or moving in that direction, would undermine the claim that it does not compete with its customers.
Arm’s ownership could affect its choices. SoftBank, the Japanese conglomerate that owns over 85% of the firm, has been assembling its own chip portfolio, buying Ampere, which makes server processors, and Graphcore, which designs AI chips. In August it bought 2% of Intel for $2bn. Masayoshi Son, SoftBank’s boss, is said to be keen to build an AI champion to rival Nvidia. Mr Haas, who sits on SoftBank’s board, talks up synergies across the group’s chip businesses. But all this may push Arm away from being a neutral supplier of designs.
Mr Haas says his biggest worry is whether Arm is investing fast enough to take advantage of the AI opportunity. Chips take years to design and build; AI models evolve in months. Whether the company can move quickly enough is one question. Whether it can make the most of AI without undermining the model that put its designs everywhere is another.
–