Maximize Bandwidth in your Massively Parallel AI SoCs?
by Daniel Nenni on 07-20-2018 at 12:00 pm

Artificial Intelligence is one of the most talked-about topics on the conference circuit this year, and I don’t expect that to change anytime soon. AI is also one of the trending topics on SemiWiki, with organic search bringing us a wealth of new viewers. You may also have noticed that AI is a hot topic for webinars like the one I am writing… Read More


Platform ASICs Target Datacenters, AI
by Bernard Murphy on 07-17-2018 at 7:00 am

There is a well-known progression in the efficiency of different platforms for certain targeted applications such as AI, as measured by performance and performance/Watt. The progression is determined by how much of the application can be run with specialized hardware-assist rather than software, since hardware can be faster… Read More


Deep Learning: Diminishing Returns?
by Bernard Murphy on 07-12-2018 at 7:00 am

Deep learning (DL) has become the oracle of our age – the universal technology we turn to for answers to almost any hard problem. This is not surprising; its strengths in image and speech recognition, language processing and multiple other domains amaze and shock us, to the point that we’re now debating AI singularities. But then,… Read More


Leveraging AI to help build AI SoCs
by Tom Simon on 06-25-2018 at 12:00 pm

When I first started working in the semiconductor industry back in 1982, I realized that there was a race going on between the complexity of the systems being designed and the capabilities of the tools and systems used to design them. The technology used to design the next generation of hardware was always lagging… Read More


Looking Ahead: What is Next for IoT
by Ahmed Banafa on 06-13-2018 at 12:00 pm

Over the past several years, the number of devices connected via the Internet of Things (IoT) has grown exponentially, and it is expected that this number will only continue to grow. By 2020, 50 billion connected devices are predicted to exist, thanks to the many new smart devices that have become standard tools for people and businesses… Read More


Being Intelligent about AI ASICs
by Tom Simon on 06-06-2018 at 12:00 pm

The progression from CPU to GPU, FPGA and then ASIC affords an increase in throughput and performance, but comes at the price of decreasing flexibility and generality. Like most new areas of endeavor in computing, artificial intelligence (AI) began with implementations based on CPUs and software. And, as have so many other applications,… Read More


Managing Your Ballooning Network Storage
by Alex Tan on 05-24-2018 at 12:00 pm

As companies scale by adding more engineers, there is a tendency to spread across multiple design sites as they strive to hire the best available talent. Multi-site development also affects startups as they try to minimize their burn rate by setting up offsite design centers in locations such as India, China or Vietnam.

Both the IoT and automotive… Read More


Semiconductor, EDA Industries Maturing? Wally Disagrees
by Bernard Murphy on 05-22-2018 at 7:00 am

Wally Rhines (President and CEO of Mentor, A Siemens Business) has been pushing a contrarian view against the conventional wisdom that the semiconductor business, and by extension EDA, is slowing down. He pitched this at DVCon and more recently at U2U, where I got to hear the pitch and talk to him afterwards.

What causes maturing is… Read More


UBER car accident: Verifying more of the same versus the long-tail cases
by Moshe Zalcberg on 05-21-2018 at 12:00 pm

The recent fatal accident involving an UBER autonomous car was reportedly not caused – as initially assumed – by a failure of the many sensors on the car to recognize the cyclist. It was instead caused by a failure of the software to make the right decision regarding that “object”. The system apparently… Read More


Machine Learning Drives Transformation of Semiconductor Design
by Tom Simon on 05-14-2018 at 12:00 pm

Machine learning is transforming how information processing works and what it can accomplish. The push to design hardware and networks to support machine learning applications is affecting every aspect of the semiconductor industry. In a video recently published by Synopsys, Navraj Nandra, Sr. Director of Marketing, takes… Read More