Generative AI is all the rage, with systems like ChatGPT, Google Bard, and DALL-E being introduced with great fanfare in the past year. The EDA industry has also been keen to adopt AI techniques to assist IC engineers across many disciplines. Saugat Sen, Product Marketing at Cadence, did a video call with me to explain… Read More
Takeaways from SNUG 2023
Synopsys pulled out all the stops for this event. I attended the first full day, tightly scripted from Aart’s keynote kickoff to 1:1 interviews with Synopsys executives to a fireside chat between Sassine Ghazi (President and COO) and Rob Aitken (ex-Fellow at Arm, now Distinguished Architect at Synopsys). That’s a lot of … Read More
Interconnect Under the Spotlight as Core Counts Accelerate
In the march to more capable, faster, smaller, and lower-power systems, Moore’s Law gave software a free ride for 30 years or so, purely on semiconductor process evolution. Compute hardware delivered improved performance/area/power metrics every year, allowing software to expand in complexity and deliver more capability… Read More
AI is Ushering in a New Wave of Innovation
Artificial intelligence (AI) is transforming many aspects of our lives, from the way we work and communicate to the way we shop and travel. Its impact is felt in nearly every industry, including the semiconductor industry, which plays a crucial role in enabling the development of AI technology.
One of the ways AI is affecting our… Read More
AI in Verification – A Cadence Perspective
AI is everywhere, or so it seems, though it is often promoted with insufficient detail to understand the methods. I now look for substance: not trade secrets, but how exactly they are using AI. Matt Graham (Product Engineering Group Director at Cadence) gave a good and substantive tutorial pitch at DVCon, with real examples of goal-centric optimization… Read More
Full-Stack, AI-driven EDA Suite for Chipmakers
Semiconductor technology is among the most complex of technologies, and the semiconductor industry is among the most demanding of industries. Yet the ecosystem has delivered incredible advances over the last six decades, from which the world has benefited tremendously. Yes, of course, the markets want that breakneck speed… Read More
Narrow AI vs. General AI vs. Super AI
Artificial intelligence (AI) is a term used to describe machines that can perform tasks that normally require human intelligence, such as visual perception, speech recognition, decision-making, and language translation. AI is classified into three main types: Narrow AI, General AI, and Super AI. Each type of AI has its unique… Read More
Scaling AI as a Service Demands New Server Hardware
While I usually talk about AI inference on edge devices, for ADAS or the IoT, in this blog I want to talk about inference in the cloud or an on-premises datacenter (I’ll use “cloud” below as a shorthand to cover both possibilities). Inference throughput in the cloud is much higher today than at the edge. Think about support in financial… Read More
MIPI D-PHY IP brings images on-chip for AI inference
Edge AI inference is getting more and more attention as demand grows for AI processing across an increasing number of diverse applications, including those requiring low-power chips in a wide range of consumer and enterprise-class devices. Much of the focus has been on optimizing the neural network processing engine for these… Read More
Deep thinking on compute-in-memory in AI inference
Neural network models are advancing rapidly and becoming more complex. Application developers using these new models need faster AI inference but typically can’t afford more power, space, or cooling. Researchers have put forth various strategies in an effort to wring more performance out of AI inference architectures,… Read More
Flynn Was Right: How a 2003 Warning Foretold Today’s Architectural Pivot