By Kamal Khan
The semiconductor world has always been the beating heart of tech innovation, powering everything from our smartphones to the latest AI breakthroughs. However, as chip complexity increases and market demands accelerate, rigid adherence to traditional development cycles may be leaving design teams stagnant and slowing … Read More
Synopsys and NVIDIA Forge AI-Powered Future for Chip Design and Multiphysics Simulation
In a landmark announcement at NVIDIA’s GTC Washington, D.C. conference, Synopsys unveiled a deepened collaboration with NVIDIA to revolutionize semiconductor design and engineering through agentic AI, GPU-accelerated computing, and AI-driven physics simulations. This partnership, building on over three decades… Read More
Yes, Intel Should Go Private
Lip-Bu Tan started as Intel CEO on March 18th of this year, and some very impressive changes have already taken place. Intel started the year with more than 100,000 employees and will finish the year with around 75,000. Reporting structures have been flattened and the Intel culture is being transformed back into an innovation-driven… Read More
Webinar: From Theory to Reality: NVIDIA GPUs in Modern Simulations
Join our webinar to see how AI-driven CFD-DEM workflows are transforming downhole plugging design. Learn to simulate particle–fluid interactions, automate exploration, and speed up decisions in R&D and design.
Date/Time: September 4, 2025, 1 PM EDT
Venue: Virtual
Overview
Join us for an exciting series of live events dedicated to exploring…
Can RISC-V Help Recast the DPU Race?
ARM’s Quiet Coup in DPUs
The datacenter is usually framed as a contest between CPUs (x86, ARM, RISC-V) and GPUs (NVIDIA, AMD, custom ASICs). But beneath those high-profile battles, another silent revolution has played out: ARM quietly displaced Intel and AMD in the Data Processing Unit (DPU) market.
DPUs — also called SmartNICs… Read More
A Big Step Forward to Limit AI Power Demand
By now everyone knows that AI has become the all-consuming driver in tech and that NVIDIA GPU-based platforms are the dominant enabler of this revolution. Datacenters worldwide are stuffed with such GPUs, serving AI workloads from automatically drafting emails and summarizing meetings to auto-creating software and controlling… Read More
Semiconductors Still Strong in 2025
The global semiconductor market in 2Q 2025 was $180 billion, up 7.8% from 1Q 2025 and up 19.6% from 2Q 2024, according to WSTS. 2Q 2025 marked the sixth consecutive quarter with year-to-year growth of over 18%.
The table below shows the top twenty semiconductor companies by revenue. The list includes companies which sell devices… Read More
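For readers who want to back out what those growth rates imply, here is a minimal Python sketch that derives the implied 1Q 2025 and 2Q 2024 totals from the quoted 2Q 2025 figure and percentages; the $180 billion value and growth rates come from the excerpt above, and the rest is simple arithmetic, rounded for readability.

```python
# Back out implied prior-period totals from the WSTS figures quoted above.
# Inputs are the 2Q 2025 total and the stated growth rates; nothing else is assumed.

q2_2025_billion = 180.0   # global semiconductor market, 2Q 2025 ($B)
qoq_growth = 0.078        # growth vs. 1Q 2025
yoy_growth = 0.196        # growth vs. 2Q 2024

implied_q1_2025 = q2_2025_billion / (1 + qoq_growth)
implied_q2_2024 = q2_2025_billion / (1 + yoy_growth)

print(f"Implied 1Q 2025 market: ${implied_q1_2025:.1f}B")  # ~ $167.0B
print(f"Implied 2Q 2024 market: ${implied_q2_2024:.1f}B")  # ~ $150.5B
```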
Google Cloud: Optimizing EDA for the Semiconductor Future
On July 9, 2025, a DACtv session featured a Google product manager discussing the strategic importance of electronic design automation (EDA) and how Google Cloud is optimizing it for the semiconductor industry, as presented in the YouTube video. The talk highlighted Google Cloud’s role in addressing the escalating complexity… Read More
Siemens EDA and Nvidia: Pioneering AI-Driven Chip Design
On July 18, 2025, Siemens EDA and Nvidia presented a compelling vision for the future of electronic design automation (EDA) at a DACtv event, emphasizing the transformative role of artificial intelligence (AI) in semiconductor and PCB design. Amit Gupta, Vice President and General Manager at Siemens EDA, and John Lynford, head… Read More
New Cooling Strategies for Future Computing
Power densities on chips increased from 50–100 W/cm² in 2010 to 200 W/cm² in 2020, creating a significant challenge in removing and spreading heat to ensure reliable chip operation. The DAC 2025 panel discussion on new cooling strategies for future computing featured experts from NVIDIA Research, Cadence, ESL/EPFL, the University… Read More
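To put those power densities in perspective, the short Python sketch below converts W/cm² into total heat load for a few die sizes. The power-density values are the ones quoted above; the die areas are hypothetical examples chosen for illustration, not figures from the DAC 2025 panel.

```python
# Convert power density (W/cm²) into total heat load for a few die sizes.
# Densities come from the excerpt above; die areas are illustrative assumptions.

power_densities = {"2010 (low)": 50, "2010 (high)": 100, "2020": 200}  # W/cm²
die_areas_cm2 = [1.0, 4.0, 8.0]  # hypothetical die/package areas in cm²

for label, density in power_densities.items():
    loads = ", ".join(f"{density * area:.0f} W @ {area:.0f} cm²" for area in die_areas_cm2)
    print(f"{label}: {loads}")
```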
