The semiconductor industry has gone through several major transitions driven by different dynamics, such as shifts in business models (fab-centric to fabless), product segmentation (system design houses, IP developers) and end-market applications (PC to cloud and, more recently, to both automotive and the Internet of Things, IoT)… Read More
Artificial Intelligence
Processing Power Driving Practicality of Machine Learning
Despite their recent rise to prominence, the fundamentals of AI, specifically neural networks and deep learning, were established as far back as the late 1950s and early 1960s. The first neural network, the Perceptron, had a single layer and was good at certain types of recognition. However, the Perceptron was unable to learn how to… Read More
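For the curious, here is a minimal sketch of that single-layer Perceptron (plain NumPy; the function name and hyperparameters are ours, not from the article). Trained with the classic Rosenblatt update rule, it picks up a linearly separable function like AND, but no single layer can learn a non-linearly-separable function like XOR, the sort of limitation the teaser alludes to.

```python
import numpy as np

def train_perceptron(X, y, epochs=20, lr=0.1):
    """Classic perceptron rule: w += lr * (target - prediction) * x."""
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(epochs):
        for xi, target in zip(X, y):
            pred = 1 if xi @ w + b > 0 else 0
            w += lr * (target - pred) * xi
            b += lr * (target - pred)
    return w, b

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
for name, y in [("AND", np.array([0, 0, 0, 1])),
                ("XOR", np.array([0, 1, 1, 0]))]:
    w, b = train_perceptron(X, y)
    preds = [1 if xi @ w + b > 0 else 0 for xi in X]
    print(name, "learned:", preds == list(y))  # AND: True, XOR: False
```

That XOR failure is, in a nutshell, why multi-layer networks and learning algorithms for them became necessary.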
Connecting Coherence
If the CPU or CPU cluster is the brain of an SoC, then the interconnect is the rest of the central nervous system, connecting all the other processing and IO functions to that brain. This interconnect must enable these functions to communicate with the brain, with multiple types of memory, and with each other as quickly and predictably… Read More
First Line of Defense for Cybersecurity: AI
The year 2017 wasn’t a great one for cybersecurity; we saw a large number of high-profile cyber attacks, including Uber, Deloitte, Equifax and the now-infamous WannaCry ransomware attack, and 2018 started with a bang too with the hacking of the Winter Olympics. The frightening truth about increasingly frequent cyber attacks is… Read More
An AI assist for 5G enhanced Mobile Broadband for mobile platforms
If you’re not up to speed on 5G, there are three use cases: eMBB (enhanced mobile broadband) for mobile platforms (Gbps rates, immersive gaming, VR, AR; spectrum usage also extends up to mmWave, but that’s a different topic), mMTC for massive machine-type communication (ultra-low cost, ultra-low power, very dense networks)… Read More
What Does a Deep Learning Chip Look Like?
There’s been a lot of discussion of late about deep learning technology and its impact on many markets and products. Much of the technology under discussion is essentially a hardware implementation of neural networks, a concept that’s been around for a while.
What’s new is the compute power that advanced semiconductor technology… Read More
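To make that concrete: a neural network’s inner loop is multiply-accumulate arithmetic, and that is what these chips commit to silicon. A toy forward pass (illustrative NumPy, with made-up layer sizes) shows the matrix multiplies an accelerator would hard-wire, often at reduced precision:

```python
import numpy as np

# Two-layer forward pass: each `@` below is a matrix multiply, i.e. a
# grid of multiply-accumulate operations, the unit that deep learning
# chips replicate thousands of times in hardware.
rng = np.random.default_rng(0)
x  = rng.standard_normal(8)          # input activations
W1 = rng.standard_normal((16, 8))    # layer-1 weights
W2 = rng.standard_normal((4, 16))    # layer-2 weights

h = np.maximum(W1 @ x, 0)            # matmul + ReLU
y = W2 @ h                           # output logits
print(y.shape)                       # (4,)
```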
Data Security – Why It Might Matter to Design and EDA
According to the Economist, “The world’s most valuable resource is no longer oil, but data”. Is this the case? Data is the by-product of many aspects of recent technology dynamics and is becoming the currency of today’s digital economy. All categories in Gartner’s Top 10 Strategic Technology Trends for 2018 (Figure… Read More
Unexpected Help for Simulation from Machine Learning
I attend a lot of events on machine learning and write about it regularly. However, I learned some exciting new information about machine learning in a very surprising place recently. Every year for the last few years I have attended the HSPICE SIG dinner hosted by Synopsys in Santa Clara. This event starts with a vendor fair featuring… Read More
Quantum computers may be more of an imminent threat than AI
Elon Musk, Stephen Hawking and others have been warning about runaway artificial intelligence, but there may be a more imminent threat: quantum computing. It could pose a greater burden on businesses than the Y2K computer bug did toward the end of the ’90s.
Quantum computers are straight out of science fiction. Take the “traveling… Read More
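Assuming the teaser is headed toward the traveling salesman problem, a favorite illustration in quantum computing pitches, a brute-force sketch (city names and distances made up here) shows why: the number of tours grows factorially with the number of cities, exactly the kind of blow-up quantum machines are hoped to tame.

```python
from itertools import permutations

# Brute-force traveling salesman over made-up distances. With n cities
# there are (n-1)! tours to check, which is hopeless classically once
# n gets large.
dist = {("A", "B"): 2, ("A", "C"): 9, ("A", "D"): 10,
        ("B", "C"): 6, ("B", "D"): 4, ("C", "D"): 8}

def d(u, v):
    return dist.get((u, v)) or dist[(v, u)]

cities = ("B", "C", "D")  # tours start and end at "A"
best = min((sum(d(a, b) for a, b in zip(("A",) + p, p + ("A",))), p)
           for p in permutations(cities))
print(best)  # (23, ('B', 'D', 'C')): cheapest tour length and order
```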
Machine Learning And Design Into 2018 – A Quick Recap
How do we differentiate between deep learning and machine learning, given the many ways of describing them? A simple definition of these software terms can be found here. Let’s start with Artificial Intelligence (AI), a term coined back in 1956. AI can be defined as human intelligence exhibited by machines… Read More
An Insight into Building Quantum Computers