Synopsys Enables AI Advances with UALink
The evolution of hyperscale data center infrastructure to support the processing of trillions of parameters for large language models has created substantial design challenges. These massive processing facilities must scale to hundreds of thousands of accelerators with highly efficient and fast connections.… Read More
Perforce Webinar: Can You Trust GenAI for Your Next Chip Design?
GenAI is certainly changing the world. Every day there are new innovations in the use of highly trained models to do things that seemed impossible just a short while ago. As GenAI models take on more tasks that used to be the work of humans, there is always a nagging concern about accuracy and bias. Was the data used to train the model … Read More
Weebit Nano Moves into the Mainstream with Customer Adoption
Disruptive technology typically follows a path of research, development, early deployment and finally commercial adoption. Each of these phases is difficult and demanding in different ways. No matter how you measure it, getting to the finish line is a significant milestone for any company. Weebit Nano is disrupting the way… Read More
PDF Solutions and the Value of Fearless Creativity
PDF Solutions has been around for over 30 years. The company began with a focus on chip manufacturing and yield. Since the beginning, PDF Solutions has anticipated many shifts in the semiconductor industry and has expanded its impact with enhanced data analytics and AI. Today, the company’s impact is felt from design to manufacturing,… Read More
DAC TechTalk – A Siemens and NVIDIA Perspective on Unlocking the Power of AI in EDA
AI was everywhere at DAC. Presentations, panel discussions, research papers and poster sessions all had a strong dose of AI. At the DAC Pavilion on Monday, two heavyweights in the industry, Siemens and NVIDIA, took the stage to discuss AI for design, both present and future. What made this event stand out for me was the substantial… Read More
Synopsys Webinar – Enabling Multi-Die Design with Intel
As we all know, the age of multi-die design has arrived, and along with it many new design challenges. There is a lot of material discussing the obstacles to achieving more mainstream access to this design architecture, and some good strategies to conquer those obstacles. Synopsys recently published a webinar that took this discussion… Read More
CAST Webinar About Supercharging Your Systems with Lossless Data Compression IPs
Much of advanced technology is data-driven. From the cloud and AI accelerators to automotive processing and edge computing, data storage and transmission efficiency are of critical importance. It turns out that lossless data compression is a key ingredient in meeting these requirements.
While there are both software and hardware… Read More
Scaling 3D IC Technologies – Siemens Hosts a Meeting of the Minds at DAC
3D IC was a very popular topic at DAC. The era of heterogeneous, multi-chip design is here. There were a lot of research results and practical examples presented. What stood out for me was a panel at the end of day two of DAC that was hosted by Siemens. This panel brought together an impressive group of experts to weigh in on what was really… Read More
DAC News – proteanTecs Unlocks AI Hardware Growth with Runtime Monitoring
As AI models grow exponentially, the infrastructure supporting them is struggling under the pressure. At DAC, one company stood out with a solution that doesn’t just monitor chips but empowers them to adapt in real time to these new workload requirements.
Unlike traditional telemetry or post-silicon debug tools, proteanTecs… Read More
Perforce at DAC, Unifying Software and Silicon Across the Ecosystem
As the new name reflects, chip and system design were a major focus at DAC. So was the role of AI to enable those activities. But getting an AI-enabled design flow to work effectively across chip, subsystem and system-level design presents many significant challenges. One of the most important is effectively managing the vast amount of… Read More
AI RTL Generation versus AI RTL Verification