Being Intelligent about AI ASICs
by Tom Simon on 06-06-2018 at 12:00 pm

The progression from CPU to GPU, FPGA and then ASIC affords an increase in throughput and performance, but comes at the price of decreasing flexibility and generality. Like most new areas of endeavor in computing, artificial intelligence (AI) began with implementations based on CPUs and software. And, as have so many other applications,… Read More


Machine Learning Drives Transformation of Semiconductor Design
by Tom Simon on 05-14-2018 at 12:00 pm

Machine learning is transforming how information processing works and what it can accomplish. The push to design hardware and networks to support machine learning applications is affecting every aspect of the semiconductor industry. In a video recently published by Synopsys, Navraj Nandra, Sr. Director of Marketing, takes… Read More


AI processing requirements reveal weaknesses in current methods
by Tom Simon on 05-07-2018 at 12:00 pm

The traditional ways of boosting computing throughput are either to increase operating frequency or to use multiprocessing. The industry has done a good job of applying these techniques to maintain a steady increase in performance. However, there is now a discontinuity in the need for processing power. Artificial Intelligence… Read More
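
To make the second lever concrete, here is a deliberately toy Python sketch of my own (the workload and names are illustrative, not from the article) that splits a compute kernel across worker processes. Clock frequency is invisible to software, so multiprocessing is the lever code can actually pull.

# Toy sketch: with clock frequency fixed, multiprocessing is the
# remaining traditional lever for throughput.
from multiprocessing import Pool

def partial_sum(bounds):
    # Stand-in compute kernel: sum of squares over [lo, hi).
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))

if __name__ == "__main__":
    n, workers = 10_000_000, 4
    step = n // workers
    chunks = [(i * step, (i + 1) * step) for i in range(workers)]
    with Pool(workers) as pool:
        total = sum(pool.map(partial_sum, chunks))
    print(total)  # matches the serial answer, at roughly 4x the throughput

The article's point is that AI workloads outgrow even this kind of scaling, which is the discontinuity it describes.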


My advice to the world’s entrepreneurs: Copy and steal the Silicon Valley way
by Vivek Wadhwa on 05-07-2018 at 7:00 am

In a videoconference hosted by Indian startup media publication Inc42, I gave entrepreneurs some advice that startled them. I said that instead of trying to invent new things, they should copy and steal all the ideas they can from China, Silicon Valley and the rest of the world. A billion Indians coming online through inexpensive… Read More


Will the Rise of Digital Media Forgery be the End of Trust?
by Matthew Rosenquist on 04-22-2018 at 12:00 pm

Technology is reaching a point where it can create fake video and audio content nearly in real time from handheld devices like smartphones.

In the near future, you will be able to FaceTime someone and have a real-time conversation with more than just bunny ears or some other cartoon overlay. Imagine appearing as a person of your… Read More


Artificial Intelligence calls for Smart Interconnect
by Tom Simon on 04-18-2018 at 7:00 am

Artificial Intelligence-based systems are driving a metamorphosis in computing and, consequently, precipitating a large shift in SoC design. AI training is often done in the cloud and must handle huge amounts of data over forward and backward data connections. Inference usually occurs at the edge and must be… Read More
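
To see why training implies two-way data movement while inference is one-way, here is a minimal, self-contained sketch (toy numbers of my own; nothing here comes from the article):

# Toy one-weight model. Training sends activations forward and
# gradients backward; inference only ever moves data forward.
w = 0.0                            # single trainable weight
x, target, lr = 2.0, 6.0, 0.1

for _ in range(50):
    y = w * x                      # forward: activation flows ahead
    grad = 2 * (y - target) * x    # backward: gradient of (y - target)**2
    w -= lr * grad                 # weight update closes the loop

print(round(w, 3))                 # converges to ~3.0
print(w * x)                       # inference: forward-only, ~6.0

In a real network that backward traffic is per-layer and per-weight, which is why training interconnect is the harder problem.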


FPGA, Data and CASPA: Spring into AI (2 of 2)
by Alex Tan on 03-23-2018 at 12:00 pm

Adding color to the talks, Dr. Jeff Welser, VP and IBM Almaden Research Lab Director, showed how AI and recent computing resources could be harnessed to contain the data explosion. Unstructured data growth by 2020 would be on the order of 50 zettabytes (a zettabyte is 10^21 bytes). One example, the Summit supercomputer developed by IBM for use at… Read More
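
As a quick check on the "21 zeros" aside: a zettabyte is 10^21 bytes, so the projection works out as below (trivial arithmetic of my own, not from the talk).

# One zettabyte = 10**21 bytes, hence the "21 zeros".
ZETTABYTE = 10 ** 21
projected = 50 * ZETTABYTE         # the ~50 ZB unstructured-data projection
print(f"{projected:.1e} bytes")    # 5.0e+22 bytes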


Don’t believe the hype about AI in business
by Vivek Wadhwa on 03-18-2018 at 7:00 am

To borrow a punch line from Duke professor Dan Ariely, artificial intelligence is like teenage sex: “Everyone talks about it, nobody really knows how to do it, everyone thinks everyone else is doing it, so everyone claims they are doing it.” Even though AI systems can now learn a game and beat champions within hours, they are hard … Read More


First Line of Defense for Cybersecurity: AI
by Ahmed Banafa on 02-25-2018 at 7:00 am

The year 2017 wasn’t a great year for cybersecurity; we saw a large number of high-profile cyber attacks, including Uber, Deloitte, Equifax and the now-infamous WannaCry ransomware attack, and 2018 started with a bang too with the hacking of the Winter Olympics. The frightening truth about increasing cyber-attacks is … Read More


What does a Deep Learning Chip Look Like?
by Daniel Nenni on 02-16-2018 at 12:00 pm

There’s been a lot of discussion of late about deep learning technology and its impact on many markets and products. A lot of the technology under discussion is basically hardware implementations of neural networks, a concept that’s been around for a while.

What’s new is the compute power that advanced semiconductor technology… Read More
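
For readers wondering what such hardware actually computes, the kernel is essentially a multiply-accumulate (MAC) reduction followed by a nonlinearity. The sketch below is an illustrative Python model of one neuron, not any vendor's design:

# The operation deep-learning chips harden into silicon: a
# multiply-accumulate (MAC) chain followed by a nonlinearity.
def neuron(inputs, weights, bias):
    acc = bias
    for x, w in zip(inputs, weights):  # one MAC per input
        acc += x * w
    return max(0.0, acc)               # ReLU activation

print(round(neuron([1.0, 2.0, 3.0], [0.5, -0.25, 0.1], 0.05), 2))  # 0.35

Roughly speaking, a deep-learning chip replicates MAC units like this many thousands of times in parallel, which is where the compute-power story above comes from.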