When I first started working in the semiconductor industry back in 1982, I realized that there was a race going on between the complexity of the systems being designed and the capabilities of the tools used to design them. The technology used to design the next generation of hardware was always lagging… Read More
Looking Ahead: What is Next for IoT
Over the past several years, the number of devices connected via the Internet of Things (IoT) has grown exponentially, and that number is expected to keep growing. By 2020, 50 billion connected devices are predicted to exist, thanks to the many new smart devices that have become standard tools for people and businesses… Read More
Being Intelligent about AI ASICs
The progression from CPU to GPU, FPGA and then ASIC affords an increase in throughput and performance, but comes at the price of decreasing flexibility and generality. Like most new areas of endeavor in computing, artificial intelligence (AI) began with implementations based on CPUs and software. And, as have so many other applications,… Read More
Managing Your Ballooning Network Storage
As companies scale by adding more engineers, there is a tendency to spread across multiple design sites as they strive to hire the best available talent. Multi-site development also affects startups as they try to minimize their burn rate by setting up an offsite design center in a location such as India, China or Vietnam.
Both the IoT and automotive… Read More
Semiconductor, EDA Industries Maturing? Wally Disagrees
Wally Rhines (President and CEO of Mentor, A Siemens Business) has been advancing a contrarian view against the conventional wisdom that the semiconductor business, and by extension EDA, is slowing down. He pitched this at DVCon and more recently at U2U, where I got to hear the pitch and talk to him afterward.
What causes maturing is… Read More
UBER car accident: Verifying more of the same versus the long-tail cases
The recent fatal accident involving an UBER autonomous car was reportedly not caused, as initially assumed, by a failure of the many sensors on the car to recognize the cyclist. It was instead caused by a failure of the software to make the right decision regarding that “object”. The system apparently… Read More
Machine Learning Drives Transformation of Semiconductor Design
Machine learning is transforming how information processing works and what it can accomplish. The push to design hardware and networks to support machine learning applications is affecting every aspect of the semiconductor industry. In a video recently published by Synopsys, Navraj Nandra, Sr. Director of Marketing, takes… Read More
AI processing requirements reveal weaknesses in current methods
The traditional ways of boosting computing throughput are either to increase operating frequency or to use multiprocessing. The industry has done a good job of applying these techniques to maintain a steady increase in performance. However, there is a discontinuity in the need for processing power. Artificial Intelligence… Read More
Webinar: ASICs Unlock Deep Learning Innovation
In March, an AI event entitled “ASICs Unlock Deep Learning Innovation” was held at the Computer History Museum. Along with Samsung, Amkor Technology and Northwest Logic, eSilicon explored how these companies form an ecosystem to develop deep learning chips for the next generation of AI applications. There was also a keynote … Read More
Open-Silicon, Credo and IQ-Analog Provide Complete End-to-End Networking ASIC Solutions
The end-to-end principle, as defined by Wikipedia, is a design framework in computer networking. In networks designed according to this principle, application-specific features reside in the communicating end nodes of the network, rather than in intermediary nodes, such as gateways and routers, that exist to establish the … Read More
Facing the Quantum Nature of EUV Lithography