The Microchip Era Is About to End
The future is in wafers. Data centers will be the size of a box, not vast energy-hogging structures.
By George Gilder
Nov. 3, 2025 1:34 pm ET

Jensen Huang in Washington, Oct. 28. Kent Nishimura/Bloomberg News
We are in the microchip era, which promises an industrial revolution that will bring artificial intelligence to almost all human activity.
The exemplar of this era is Nvidia Corp. Its market capitalization of around $5 trillion makes it the world’s most valuable company. Jensen Huang, Nvidia’s founder and CEO, dazzled the audience at the company’s AI conference in Washington last week. In his keynote address, Mr. Huang detailed the advances Nvidia’s chips have wrought. He thanked President Trump for bringing chip fabrication back to the U.S. from Asia and for energy policies that enable domestic AI microchip production.
Nvidia’s latest chips are mostly encased in plastic packages and resemble an ant or a beetle with copper wires for legs. Each chip holds as many as 208 billion transistor switches and costs about $30,000. In a revolutionary breakthrough, these data-center chips no longer act independently like the central processing unit in your laptop. Instead, enmeshed by the thousands, and even millions, in data centers, they function as a single “hyperscale” computer whose collective thinking is designated AI. The world’s supreme data center is Colossus 2 in Memphis, Tenn., the engine of Elon Musk’s xAI. As the source for Grok and self-driving cars, Colossus 2 integrates an estimated one million Nvidia chips into one vast computer.
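A rough back-of-envelope sketch conveys the scale implied by those figures. It simply multiplies the numbers cited above; the totals are illustrative, not xAI’s or Nvidia’s own accounting.

```python
# Back-of-envelope scale of a million-chip data center, using the figures
# cited above. Illustrative only; not official xAI or Nvidia accounting.
chips = 1_000_000                 # estimated Nvidia chips in Colossus 2
transistors_per_chip = 208e9      # up to 208 billion transistor switches per chip
price_per_chip = 30_000           # roughly $30,000 per chip

total_transistors = chips * transistors_per_chip
total_chip_cost = chips * price_per_chip

print(f"Total transistors: {total_transistors:.1e}")              # ~2.1e+17
print(f"Chip cost alone: ${total_chip_cost / 1e9:,.0f} billion")  # ~$30 billion
```

Roughly 200 quadrillion transistors, and some $30 billion in chips alone, wired together to behave as one machine.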
The “chip” has so captivated the minds of our time that even makers of new devices call its potential successor a “giant chip” or “superchip.” But the new device is in fact the opposite of a microchip, lacking separate processing units or memories in plastic packages with wire “legs.”
The U.S. government considers chips vital and strategic. The 2022 Chips Act authorized more than $200 billion to support chip fabrication in the U.S. and keep it away from China. Microchips shape U.S. foreign policy from the Netherlands, home of ASML, the No. 1 maker of chip-fabrication tools, to Taiwan and its prodigious Taiwan Semiconductor Manufacturing Co. TSMC commands more than 95% of the market for the leading-edge chips that power cellphones and other advanced equipment.
By cutting off the Chinese chip market, home to the majority of the world’s semiconductor engineers, U.S. industrial policies have hampered American producers of wafer-fabrication equipment—essential for making chips—without slowing China’s ascent. In the wake of these protectionist policies, launched around 2020, Chinese semiconductor capital-equipment production has risen by 30% to 40% annually, compared with annual growth of about 10% in the U.S.
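To see what those growth rates imply, here is a simple compounding sketch. The 35% figure is a midpoint of the range cited above, and the five-year horizon is an arbitrary illustration.

```python
# Compounding the growth rates cited above: Chinese semiconductor
# capital-equipment output rising ~30%-40% a year versus ~10% in the U.S.
# Starting levels are normalized to 1; the five-year horizon is arbitrary.
china_rate, us_rate = 0.35, 0.10
years = 5

china = (1 + china_rate) ** years   # ~4.5x the starting level
us = (1 + us_rate) ** years         # ~1.6x the starting level

print(f"After {years} years: China ~{china:.1f}x, U.S. ~{us:.1f}x")
```

At those rates the gap widens every year the policies remain in place.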
This change echoes the effect of the U.S. ban, begun in May 2019, on telecom gear made by Chinese powerhouse Huawei. The ban lowered U.S. companies’ sales to Huawei by $33 billion between 2021 and 2024, even as Huawei’s global market share expanded.
Industrial policies and protectionism nearly always favor incumbent industries facing obsolescence. In this respect, the Chips Act and related bans and tariffs are no different from subsidies for ethanol in gasoline or sugar cane in Louisiana, or now for rare-earth mining at a time when rare earths can be profitably harvested from electronic waste using new technology developed at Rice University. All the efforts to save microchip production in the U.S. come amid undeniable portents of the end of microchips.
The signs are clear in the exquisite physics of the crucial machine that defines and limits the size and density of chips. Some of us call it the “Extreme Machine.” The latest version, made by ASML, performs high-numerical-aperture extreme-ultraviolet lithography, or high-NA EUV. If you aren’t Chinese, you can buy an Extreme Machine for about $380 million. Roughly 44 have been sold so far. It comes in about 250 crates and takes hundreds of specialized engineers about six months to install. IBM’s research director, Darío Gil, calls it “the world’s most complicated machine.”
The Extreme Machine is a kind of camera. It projects patterns of light through a quartz-and-chrome photo mask bearing the chip design onto what might be called the “films,” or “photoresists,” on the surface of 12-inch silicon wafers.
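The resolving power of that camera is conventionally estimated with the Rayleigh criterion, the smallest printable feature being roughly k1·λ/NA. The sketch below uses the standard EUV wavelength of 13.5 nanometers and the usual numerical apertures of 0.33 and 0.55; the k1 factor of 0.35 is a typical process assumption, not a figure from this article.

```python
# Rayleigh criterion for the smallest printable feature of a lithography
# "camera": CD ~ k1 * wavelength / NA. EUV light is 13.5 nm; high-NA EUV
# raises the numerical aperture from 0.33 to 0.55. The k1 of 0.35 is a
# typical process assumption, not a figure from the article.
def min_feature_nm(wavelength_nm=13.5, na=0.33, k1=0.35):
    return k1 * wavelength_nm / na

print(f"Standard EUV (NA 0.33): ~{min_feature_nm(na=0.33):.1f} nm")  # ~14.3 nm
print(f"High-NA EUV (NA 0.55):  ~{min_feature_nm(na=0.55):.1f} nm")  # ~8.6 nm
```

A higher numerical aperture prints finer features, which is the whole point of the newest, most expensive version of the machine.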
Governing everything that happens in the Extreme Machine is a convergence of physical laws and engineering constraints summed up as the reticle limit. The reticle defines the size of chips, and chip size in turn defines the granularity of AI computation. Thus the reticle limit determines how many graphics processing units—mostly from Nvidia—must be linked to perform some AI task. Beyond a certain point—roughly 800 square millimeters, or 1.25 square inches—the laws of light and light speed prohibit larger designs.
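Simple geometry shows how little of a wafer any single reticle-limited die can occupy, which is the gap wafer-scale designs later exploit. The die limit below is the roughly 800 square millimeters cited above; the rest is arithmetic that ignores dicing lanes, edge loss and yield.

```python
import math

# How much of a 12-inch (300 mm) wafer one reticle-limited die can use,
# taking the ~800 square-millimeter limit cited above. Pure geometry:
# dicing lanes, edge loss and yield are ignored.
wafer_diameter_mm = 300
reticle_limit_mm2 = 800

wafer_area_mm2 = math.pi * (wafer_diameter_mm / 2) ** 2   # ~70,700 mm^2
dies_per_wafer = wafer_area_mm2 / reticle_limit_mm2       # ~88 reticle-sized dies

print(f"Wafer area: ~{wafer_area_mm2:,.0f} mm^2")
print(f"Largest possible die uses ~{reticle_limit_mm2 / wafer_area_mm2:.1%} of the wafer")
print(f"Reticle-limited dies per wafer: ~{dies_per_wafer:.0f}")
```

Each die occupies little more than 1% of the wafer it is cut from; whatever those dozens of dies must say to one another afterward has to travel over packages, wires and optical links.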
You can see the effects of the reticle limit in the ever-mounting complexity of the vast hyperscale data centers Nvidia defines. Because chips and “chiplets” must stay small and dense, each in its own elaborate packaging, the work they do has to be reintegrated into a coherent whole. The calculation first has to be dispersed among many chips, then recompiled. The effect is more communications overhead between chips, requiring ever more complex packages and ever more wires and fiber-optic links.
The result of the inexorable reticle limit is the end of chips. What’s next? A wafer-scale integration model, which bypasses chips altogether. Mr. Musk pioneered this concept at Tesla with his now-disbanded Dojo computer project; the effort has been recreated as DensityAI.
Cerebras of Palo Alto, Calif., used the concept in its WSE-3 wafer-scale engine. The WSE-3 boasts some four trillion transistors—14 times as many as Nvidia’s Blackwell chip—with 7,000 times the memory bandwidth. Cerebras inscribed the memory directly onto the wafer rather than relegating it to distant chips and chiplets in high-bandwidth memory mazes. The company stacked its wafer-scale engines 16-fold, thereby reducing a data center to a small box with 64 trillion transistors.
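The arithmetic behind that box follows from the figures above; the sketch below is only a consistency check on the numbers as stated, not a Cerebras specification.

```python
# Consistency check on the figures above: 16 wafer-scale engines of roughly
# four trillion transistors each. These are the article's numbers, not a
# Cerebras data sheet.
transistors_per_engine = 4e12
engines = 16

total = transistors_per_engine * engines
print(f"Transistors in the box: {total / 1e12:.0f} trillion")   # 64 trillion
```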
Also working on a full wafer-scale future is David Lam, founder of Lam Research Corp., the world’s third-largest wafer-fabrication equipment company. In 2010, Mr. Lam founded Multibeam Corp., which created a machine that performs multi-column e-beam lithography. The technology allows manufacturers to bypass the reticle limit. Multibeam has already demonstrated the capability to inscribe 8-inch wafers. Look Ma, no chips! No China! (Or even Taiwan.) No elaborate packaging in the Philippines or Shenzhen.
The post-microchip era, with data centers reduced to a box of wafer-scale processors, is coming. America, not China, should lead the way.
Mr. Gilder is author of “Life After Capitalism: The Information Theory of Economics.”