Neural nets are a hot topic these days and encourage us to think of solutions to complex tasks like image recognition in terms of how the human brain handles them. But today’s model for this neuromorphic computing is several steps removed from how neurons actually work. We’re still using conventional digital computation … Read More
Tag: Bernard Murphy
The Higgs Boson and Machine Learning
Technology in and around the LHC can sometimes be a useful exemplar for how technologies may evolve in the more mundane world of IoT devices, clouds and intelligent systems. I wrote recently on how LHC teams manage Big Data; here I want to look at how they use machine learning to study and reduce that data.
The reason high-energy physics… Read More
What’s the Biggest Number?
Time for a little fun again. Most of us played this game when we were kids. It fairly quickly degenerates into “infinity plus one” or the even more preemptive “whatever you say next plus one”. But if you’re not allowed to use infinity and you have to name the number and demonstrate how you get to it, is this still interesting? For mathematicians… Read More
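(Not from the post itself, which is cut off here: a minimal sketch of one standard way mathematicians name very large numbers without invoking infinity, Knuth's up-arrow notation, where each extra arrow iterates the operation below it. The function name and the choice of example are illustrative assumptions, not the post's own construction.)

```python
def knuth_up_arrow(a, n, b):
    """Compute a ↑^n b in Knuth's up-arrow notation.

    One arrow (n=1) is ordinary exponentiation; each additional arrow
    iterates the previous operation, so values explode extremely fast.
    Only tiny inputs are feasible to evaluate.
    """
    if n == 1:
        return a ** b          # base case: a ↑ b = a**b
    if b == 0:
        return 1               # by convention, a ↑^n 0 = 1
    # a ↑^n b = a ↑^(n-1) (a ↑^n (b-1))
    return knuth_up_arrow(a, n - 1, knuth_up_arrow(a, n, b - 1))

print(knuth_up_arrow(3, 1, 3))  # 3^3 = 27
print(knuth_up_arrow(3, 2, 3))  # 3↑↑3 = 3^(3^3) = 7625597484987
```

Already at three arrows, 3↑↑↑3 is a tower of 7,625,597,484,987 threes, far beyond anything computable here, which is exactly why constructions like this make the "name the number" game interesting.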
Radio Integration – the Benefits of Built-In
It’s always a pleasure when a vendor gives a really informative, vendor-independent presentation on what’s happening in some domain of the industry and wraps up with (by that point) a well-deserved summary of that vendor’s solutions in that space. Ron Lowman did just that at the Linley conference on Mobile and Wearables, where … Read More
A Credible Player at the Power Table
For a while it seemed like Mentor lived on the margins of the (RTL) design-for-power game. They had interesting micro-architectural optimization capabilities through their Calypto heritage but no real industry chops in power estimation, a must-have when you are claiming to reduce power. Better known offerings in RTL power … Read More
Limits to Deep Reasoning in Vision
If you are a regular reader, you’ll know I like to explore the boundaries of technology. Readers I respect sometimes interpret this as a laughable attempt to oppose the inevitable march of progress, but that is not my purpose. In understanding the limits of a particular technology, it is possible to envision what properties a successor… Read More
Dragging RTL Creation into the 21st Century
When I was at Atrenta, we always thought it would be great to do as-you-type RTL linting. It’s the natural use model for anyone used to writing text in virtually any modern application (especially on the Web, thanks to Google’s spell and grammar checks). You may argue that you create your RTL in Vi or EMACS and you don’t need no stinking… Read More
The Appeal of a Multi-Purpose DSP
When you think of a DSP IP, you tend to think of very targeted applications – for baseband signal processing or audio or vision perhaps. Whatever the application, sometimes you want a solution optimally tuned to that need: best possible performance and power in the smallest possible footprint. These needs will continue,… Read More
Formally Crossing the Chasm
Formal verification for hardware was stuck for a long time with a reputation of being interesting but difficult to use, and consequently limited to niche applications. Jasper worked hard to change this, particularly with their apps for JasperGold, and I have been seeing more anecdotal evidence that mainstream adoption is growing.… Read More
Big Data Lessons from the LHC
Big Data techniques have become important in many domains, not just to drive marketing strategies but also to guide semiconductor design, as evidenced by Ansys’ recent announcements around their use of Big Data analytics. And they should become even more important in the brave new world of the IoT. So it makes sense to look at an organization… Read More