
How Deep Learning Works, Maybe
by Bernard Murphy on 01-04-2018 at 7:00 am

Deep learning, modeled (loosely) on the way living neurons interact, has achieved amazing success in automating recognition tasks, from recognizing images, in some cases more accurately than we, or even trained experts, can, to recognizing speech and written text. The engineering behind this technology revolution continues to advance… Read More
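
To make the neuron analogy a little more concrete (a minimal sketch of my own, not anything from the article): an artificial "neuron" simply takes a weighted sum of its inputs and applies a nonlinearity, and a "deep" network is just several layers of these stacked together. The weights below are random stand-ins; in a real system they would be learned from training data.

    import numpy as np

    def relu(x):
        # Nonlinearity: pass positive values through, zero out the rest
        return np.maximum(0.0, x)

    def layer(inputs, weights, bias):
        # Each "neuron" computes a weighted sum of its inputs plus a bias,
        # then fires through the nonlinearity -- a loose analogue of a
        # biological neuron responding to its upstream neighbors
        return relu(weights @ inputs + bias)

    # Toy three-layer "deep" network: 4 inputs -> 5 hidden -> 3 hidden -> 2 outputs
    rng = np.random.default_rng(0)
    x = rng.random(4)  # stand-in for image or audio features
    h1 = layer(x, rng.random((5, 4)), rng.random(5))
    h2 = layer(h1, rng.random((3, 5)), rng.random(3))
    scores = layer(h2, rng.random((2, 3)), rng.random(2))
    print(scores)  # e.g. scores for two recognition classes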


What’s old is new again – Analog Computing
by Bernard Murphy on 01-02-2018 at 7:00 am

Once in a while I like to write on a fun, off-beat topic. My muse today is analog computing, a domain that some of us antiques in the industry recall with fondness, though sadly in my case without hands-on experience. Analog computers exploit the continuous nature of analog signals together with a variety of transforms to represent… Read More
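
For readers who never got hands-on time either, the core trick (my illustration, not the article's) is that an analog computer wires integrators, built from op-amps and capacitors, into feedback loops so that a continuously varying voltage is itself the solution of a differential equation. A digital program can only mimic that continuous integration in small discrete steps:

    # Mimicking a single analog integrator solving dy/dt = -y with y(0) = 1.
    # On an analog computer a capacitor integrates the signal continuously;
    # digitally we approximate it with small Euler steps.
    dt = 0.001
    y, t = 1.0, 0.0
    while t < 5.0:
        dydt = -y        # the feedback wiring: output inverted and fed back
        y += dydt * dt   # the integrator: accumulate the signal over time
        t += dt
    print(y)  # ~0.0067, close to the exact solution exp(-5)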


IBM Plays With The AI Giants With New, Scalable And Distributed Deep Learning Software
by Patrick Moorhead on 01-01-2018 at 11:00 am

I’ve been following IBM’s AI efforts with interest for quite a while now. In my opinion, the company jump-started the current cycle of AI with the introduction of Watson back in the 2000s and has steadily been ramping up its efforts since then. Most recently, I wrote about the launch of PowerAI, IBM’s software toolkit solution to… Read More
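
The article covers IBM's own software, but the general idea behind scaling deep learning across many nodes is data parallelism: each worker computes gradients on its own shard of the data, the gradients are averaged across workers (an all-reduce), and every node then applies the identical update. A generic toy sketch of that pattern, not IBM's actual API:

    import numpy as np

    def allreduce_mean(local_grads):
        # Stand-in for the cross-node all-reduce: average every worker's
        # gradient so all nodes apply the same weight update
        return np.mean(local_grads)

    # Toy linear model y = w*x with training data sharded across 4 "workers"
    shards = np.array_split(np.arange(1.0, 9.0), 4)  # 2 samples per worker
    targets = [3.0 * x for x in shards]              # the true w is 3.0
    w = 0.0

    for step in range(200):
        local_grads = [np.mean(2 * (w * x - y) * x)  # local dLoss/dw per shard
                       for x, y in zip(shards, targets)]
        w -= 0.01 * allreduce_mean(local_grads)      # synchronized update
    print(w)  # converges toward 3.0

Because every worker applies the same averaged gradient, the replicas never drift apart; the engineering challenge in real systems is making that all-reduce step fast enough that it doesn't erase the speedup from adding nodes.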


Neural Networks Leverage New Technology and Mimic Ancient Biological Systems
by Tom Simon on 12-26-2017 at 12:00 pm

Neural networks make it possible to use machine learning for a wide variety of tasks, removing the need to write new code for each new task. They allow computers to use experiential learning instead of explicit programming to make decisions. The basic concepts behind neural networks were first proposed in the 1940s,… Read More
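
As a tiny illustration of "experiential learning instead of explicit programming" (my sketch, not from the article): the loop below never encodes the OR rule directly. A single-layer perceptron, Rosenblatt's 1950s refinement of those 1940s ideas, finds weights that reproduce it purely from example input/output pairs.

    # A perceptron learning the logical OR function from examples alone.
    examples = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
    w1, w2, b = 0.0, 0.0, 0.0
    lr = 0.1

    def predict(x1, x2):
        return 1 if w1 * x1 + w2 * x2 + b > 0 else 0

    for _ in range(20):                     # repeated exposure to "experience"
        for (x1, x2), target in examples:
            err = target - predict(x1, x2)  # learn only from mistakes
            w1 += lr * err * x1
            w2 += lr * err * x2
            b += lr * err

    for (x1, x2), _ in examples:
        print((x1, x2), "->", predict(x1, x2))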


CES Preview with Cadence!
by Daniel Nenni on 12-18-2017 at 7:00 am

The Consumer Electronics Show (CES) is in its 50th year, believe it or not! The first one was in New York (1967) with 250 exhibitors and 17,500 attendees. Portable radios and TVs were all the rage, followed by VCRs in 1970 and camcorders and compact discs in 1981. This year there will be 3,900+ exhibits and an estimated 170,000 attendees… Read More


CEVA and Local AI Smarts
by Bernard Murphy on 11-28-2017 at 7:00 am

When we first started talking about “smart”, as in smart cars, smart homes, smart cities and the like, our usage of “smart” was arguably over-generous. What we really meant was that these aspects of our daily lives were becoming more computerized and connected. Not to say those directions weren’t useful and exciting, but we weren’t… Read More


Advanced ASICs – It Takes an Ecosystem
by Mike Gianfagna on 11-26-2017 at 2:00 pm

I remember the days of the IDM (integrated device manufacturer). For me, it was RCA, where I worked for 15 years as the company changed from RCA to GE and then ultimately to Harris Semiconductor. It’s a bit of a cliché, but life was simpler then, from a customer point of view at least. RCA did it all. We designed all the IP, did the physical… Read More


ASIC and TSMC are the AI Chip Unsung Heroes
by Daniel Nenni on 11-20-2017 at 7:00 am

One of the more exciting design start market segments that we track is Artificial Intelligence related ASICs. With NVIDIA making billions upon billions of dollars repurposing GPUs as AI engines in the cloud, the Application Specific Integrated Circuit business was sure to follow. Google now has its Tensor Processing Unit, Intel… Read More


Deep Learning and Cloud Computing Make 7nm Real
by Daniel Nenni on 11-05-2017 at 7:00 am

The challenges of 7nm are well documented. Lithography artifacts cause design rule complexity, mask costs and cycle times to explode. Noise and crosstalk get harder to deal with, as does timing closure. The types of applications that demand 7nm performance will often introduce HBM memory stacks and 2.5D packaging, and that creates… Read More


Design Data Intelligence
by Bernard Murphy on 11-02-2017 at 7:00 am

We have an urge to categorize companies, and when our limited perspective is of a company that helps with design, we categorize it as an EDA company. That was my view of Magillem, but I have commented before that my view is changing. I’m now more inclined to see them as the design equivalent of a business intelligence organization… Read More