
Network generator embeds TensorFlow, more CNNs

by Don Dingee on 06-27-2016 at 4:00 pm

Research on deep learning and convolutional neural networks (CNNs) is on the rise – and embedding new algorithms is drawing more attention. At CVPR 2016, CEVA is launching their second-generation Deep Neural Network (CDNN2) software with new support for Google TensorFlow.


Two New Announcements from Tanner EDA at #53DAC

by Daniel Payne on 06-27-2016 at 12:00 pm

Most mergers and acquisitions in the EDA world simply don’t work out financially a year or two after the deal is done, so I was pleasantly surprised to learn at #53DAC this year that Tanner EDA is doing quite well after its acquisition by Mentor Graphics back in March 2015. Everyone I’ve been meeting with at Tanner EDA for several years now is still around, and in fact they are in growth mode. This year at DAC they had two announcements that should delight both IC designers and IoT designers. Tanner EDA even had their own booth at DAC, separate from the Mentor booth.


Massimo Sivilotti, Greg Lebsack, Jeff Miller

IC Designers
Running DRC and LVS within the Tanner EDA tools has always been easy to do, but you would typically also have to run the industry-standard Calibre tool for signoff. Now there’s a special version of Calibre just for Tanner EDA users, dubbed Calibre One. No more importing Calibre design rules and running two different DRC/LVS tools; now you can add Calibre One to your existing L-Edit tool from Tanner EDA.

This integration is lower-priced than a full seat of Calibre, so it should be easier on the wallet. Even the licensing within Tanner EDA tools now uses the industry-standard FlexLM, so it’s a bit easier for your CAD group to get all the tools up and running from a centralized license file.

So the name Calibre One refers to the idea that one user can access the tool, running on one CPU, which is a perfect fit for the typical AMS designs done with Tanner. Calibre One includes all four tools:

  • Calibre nmDRC (Design Rule Checking)
  • Calibre nmLVS (Layout Versus Schematic)
  • Calibre xRC (eXtraction of Resistance and Capacitance)
  • Calibre RVE (Results Viewing Environment)

IoT Designers
Many IoT chip designs require a processor, sensors, actuators and a way to model everything quickly and simply before production. In this second announcement we have Tanner EDA partnering with the leading processor IP company, ARM, on their Cortex-M0 core. ARM calls this their DesignStart program, and it makes it easier for IoT designers to get their products designed and verified on time and within budget. To make the deal even sweeter, they’re offering the complete IC design tool suite and IP for a reference design at no cost. Once you’ve proven your concept, you can then buy a fast-track license for commercial use.

With this Tanner design flow you can create IoT designs with digital, Cortex-M0, analog, AMS and MEMS content in an integrated environment. For your free reference design you access the design tools through a web-based virtual lab.

Summary
Tanner EDA is on a roll, and they are still an affordable EDA vendor to consider for AMS, MEMS and IoT designs. Now with the new Calibre One and ARM DesignStart program announcements they are opening up their tools for broader adoption. Give Massimo, Greg or Jeff a call and find out how to take the next step.


eSilicon Offers Free Semiconductor IP For Universities!

by Daniel Nenni on 06-27-2016 at 7:00 am

It is easy to forget the importance of academia’s role in the semiconductor ecosystem, but it is important not to. If you look at the DNA of the semiconductor industry you will see how dependent we are on academic research for innovation and the necessary disruption that keeps us all gainfully employed. FinFETs are the first thing that comes to mind, since the term has been trending on SemiWiki for the past three years, but there are many more.

Unfortunately for modern-day academics, semiconductor research presents more expensive hurdles than before, one of those being multi-project wafers (MPWs), which is why I’m happy to write again about eSilicon and their eMUSe University Program:

eMUSe stands for eSilicon MPW University Services. It is an umbrella term that describes a complete experience for semiconductor researchers – advanced Internet technology, world-class service and a community of fellow researchers. eMUSe is changing the way semiconductor research is done. Our online technology makes finding the right MPW process, schedule and cost simple and intuitive. No phone calls or paperwork, just free access to answers, 24/7. World-class service provides the right guidance, IP and on-time delivery. And our growing community allows for some significant cost reduction and information sharing.

Visit the eSilicon eMUSe landing page for more information, which includes blogs by some very experienced people.

There is also a call for authors if you want to share your experience, observations, and opinions with the eMUSe community: Join the community – change the world!

What’s new at eMUSe since I last wrote, you ask? Excellent question:

New: Free IP for University Research MPWs
eSilicon takes great pride in supporting university research organizations around the world in their development of custom integrated circuits. Support includes:

  • The eSilicon® STAR platform, a free, automated online secure environment that provides a self-service, transparent, accurate, real-time experience from IC design through volume ASIC production. Request a STAR account.
  • Universities around the world have found MPW Explorer to be particularly useful in streamlining their multi-project wafer (MPW) quote and manufacturing processes
  • STAR Navigator allows users to search, select and try eSilicon semiconductor IP online

Free IP

As of June 6, 2016, eSilicon standard, off-the-shelf IP listed in STAR Navigator (below) is free to universities that are creating MPWs for research purposes (no production or commercial use).

Memory IP:

  • Ternary CAM (TCAM)
  • Binary CAM (BCAM)
  • Pseudo two-port SRAM
  • Dual-port SRAM
  • Single-port SRAM
  • Two-port asynchronous register file
  • Four-port register file
  • Two-port register file
  • One-port register file
  • Single-port fast cache
  • ROM

Interface IP: I/O Libraries

  • General-purpose I/O libraries
  • Specialty I/O libraries
  • DDR I/O libraries

Foundries: Dongbu, GLOBALFOUNDRIES, LFoundry, Samsung, SMIC, TSMC and UMC

Technologies:
14nm, 16nm, 28nm, 40nm, 65nm, 90nm, 110nm, 130nm, 150nm, 180nm

I will close with the eMUSe mantra because I could not have said it better myself:

We are in awe of the research being done by our university clients. We have no doubt that some of it will indeed change the world someday. Your job is to innovate and not negotiate the semiconductor technology access maze. This is where we can help. eMUSe from eSilicon lets you get your idea realized in silicon with minimal cost and effort – leaving you more time to work on the truly hard stuff.


Whose IoT devices were breached in 2015?

by Bill McCabe on 06-26-2016 at 4:00 pm

IoT, as we all know, is not without issues, though we have become reliant upon it in many ways. In 2015, there was some very real and tangible proof that the IoT field is fraught with peril and that we as IoT designers, developers and companies need to be paying more attention to security. Just how many different IoT companies and arenas were breached? The answer might surprise you – not to mention terrify you.

Most of us read about the car that was taken over and driven into a ditch. The ramifications of that were clear to all of us, but some even more frightening things have taken place this year.

Did you know that a flight was taken over – and the man who took it over bragged that he had also manipulated the space station?

In the past year, the following hacks have taken place.

Medical devices – The FDA ordered that specific drug pumps no longer be used. The software was bad enough that hackers could change the dosage being delivered to the patients using them. So we have the possibility of murder by internet? http://www.securityweek.com/fda-issues-alert-over-vulnerabl…

The DOE – According to a June 2015 Congressional Research Service (CRS) report, hackers successfully compromised U.S. Department of Energy computer systems more than 150 times between 2010 and 2014. “Records show 53 of the 159 successful intrusions were ‘root compromises.’” http://www.usatoday.com/…/cyber-attacks-doe-energy/71929786/

A Steel Mill – An entire steel mill was breached, resulting in “massive destruction of equipment.” http://www.wired.com/…/…/german-steel-mill-hack-destruction/

The US National Nuclear Security Administration – The people responsible for managing and securing the entire nation’s nuclear weapons stockpile experienced 19 successful cyber attacks during the four-year period of 2010–2014.

Firearms – TrackingPoint makes a smart rifle that digitally “tags” a target, then locks the trigger until the gun is perfectly positioned to hit it – from up to half a mile away. But a serious flaw has now been found in the software, such that a hacker could make a law enforcement officer’s shot hit the hostage rather than the intended target. http://money.cnn.com/2015/07/29/technology/hack-smart-rifle/

Offshore Oil Rigs – Hackers have also shut down an oil rig by tilting it sideways. They hit another rig so hard with malware that it was not seaworthy for 19 days.

Government Buildings – The Department of Homeland Security recently disclosed that hackers had managed to penetrate a state government facility and a manufacturing plant in New Jersey. All they did was change the temperature, but think about what they COULD have done.

Last but not least, go ahead and buy that cool toaster and refrigerator. A funny thing happened to hundreds of kitchens in the UK: all of them were hacked, and the resulting hack wouldn’t allow them to make certain kinds of food in their toasters or store it in their fridges. http://www.cbronline.com/…/iot-security-breach-forces-kitch…

IoT is a time saver and offers us incredible convenience, but as we’re beginning to find out, there are some real ramifications to the use of IoT devices that we need to be aware of. More to the point, the companies and industries offering these devices need to take full responsibility for assuring the security of the devices they offer. IoT security workers and developers are more important than ever before.

For more information about IoT and security check out our new website www.internetofthingsrecruting.com – Need to update your IoT Security Team? Click Here to schedule a free IoT Needs Assessment Call.


Getting Maximum Performance Bang for Your Buck through Parallelism

by Bernard Murphy on 06-26-2016 at 12:00 pm

Finding a way to optimally parallelize linear code for multi-processor platforms has been a holy grail of computer science for many years. The challenge is that we think linearly and design algorithms in the same way, but then want to speed up our analysis by adding parallelism to the algorithms we have already designed.

But the “first design, then parallelize” approach is intrinsically hard because you’re trying to impose parallel structure onto sequential code, which still leaves algorithm designers with the burden of deciding where and how best to do that. They do so with varying degrees of success, facing the twin risks of missing potential race conditions and overlooking bigger and further opportunities to parallelize.

This is not to say that there hasn’t been progress in at least simplifying the description task. OpenMP is a directive-based approach, where pragmas are layered on top of existing code. This works insofar as it simplifies describing where and how you want to parallelize. Pragmas can be used to mark functions that can run in parallel, which is certainly simpler than hand-written threading. But you still have to be sure that the underlying functions are thread-safe, and an intuitive understanding of the algorithm often becomes significantly harder as you move pieces around to accommodate those pragmas.
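
To make the directive style concrete, here is a minimal sketch of my own (not taken from any OpenMP tutorial) of a loop parallelized with a single pragma. It works only because we can see that each iteration touches a distinct array element; the compiler does not verify that for us.

```c
#include <stddef.h>

/* Minimal OpenMP sketch: one pragma asks the compiler to split the
 * loop iterations across threads. Compiled without OpenMP support
 * the pragma is simply ignored and the loop runs serially, which is
 * part of the directive style's appeal. */
void scale(double *a, size_t n, double factor)
{
    #pragma omp parallel for
    for (long i = 0; i < (long)n; i++)
        a[i] *= factor;   /* each iteration writes a distinct element: race-free */
}
```

The burden the paragraph above describes remains: the pragma is safe here only because the iterations are visibly independent, and that judgment stays with the programmer.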

For many applications this may be good enough. But no one would claim it comes anywhere near the aspirational goal: describe what you want in some kind of algorithm and let the compiler take care of optimizing for maximum parallelism with race-free safety. And for many applications, best possible performance is a non-negotiable top priority. Finite element analysis for stress, thermal, flow and EMI problems is a good example: higher performance means more accurate results, and the highest possible accuracy and quality of result is the only thing that matters.

That means algorithm designers are often willing to re-write core algorithms, even in new languages, especially when code needs to be re-factored anyway. And that opens up opportunities to consider very different approaches to coding, including switching from imperative programming (describing what calculation to perform and how to perform it) to declarative programming (describing what calculation to perform and letting the compiler figure out the best way to perform it).

One such approach, designed originally by Texas Tech in partnership with NASA, subsequently transferred through and now marketed by Texas Multicore Technologies (TMT), is based on a language called SequenceL™. The product and language don’t aim to be yet another general purpose language. This is designed for serious math and science algorithms. The compiler optimizes from SequenceL into C++ (with optional OpenCL for GPU targeting), which can coexist with algorithms for other purposes built through more pedestrian paths.

As a very simple example, you can multiply two matrices in a single statement, with the parallelism inferred from the structure:
MatrixMultiply(A(2),B(2))[i,j]:= sum( A[i,all] * B[all,j] );

This illustrates the objective to express mathematical intent and not have it become entangled and made opaque by implementation details, which in SequenceL are left to the compiler.
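
For contrast, here is roughly what that one statement expands to when written by hand in C with an OpenMP directive. This is an illustrative sketch of mine, not TMT's generated code: the loop nest, the index bookkeeping, the accumulator and the parallelization decision are exactly the implementation details the SequenceL compiler absorbs.

```c
/* Hand-written C counterpart of the SequenceL one-liner, for
 * illustration only. Matrices are stored row-major in flat arrays.
 * Each (i,j) output element is independent, so the outer loop can
 * be parallelized with a directive. */
void matrix_multiply(int n, const double *A, const double *B, double *C)
{
    #pragma omp parallel for
    for (int i = 0; i < n; i++)
        for (int j = 0; j < n; j++) {
            double sum = 0.0;                  /* sum( A[i,all] * B[all,j] ) */
            for (int k = 0; k < n; k++)
                sum += A[i * n + k] * B[k * n + j];
            C[i * n + j] = sum;
        }
}
```

Every line beyond the inner sum is overhead relative to the declarative form, and it is also where race conditions and missed optimizations hide.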

How well this works is illustrated by multiple customer results. One industrial application was Emerson Process Management’s need to improve software for building network graphs for plants and oilfields. Their existing Java-based solution was estimated to require unreasonable runtimes to build a graph for 1,000 nodes. Worse still, after 5 months of redesign the authors of the original code failed to improve the Java implementation enough to reach target performance. Then they reached out to TMT, who completed a SequenceL solution in 3 weeks – a solution which also happened to beat target performance by 10X.

Another customer was able to get a 26X speed-up in a core Fortran-based computational fluid dynamics solver with 25% less code, turning a solution that used to take 2 weeks into overnight runs. Yet another customer, very experienced in parallelizing code, commented that SequenceL was like MATLAB on steroids – pretty high praise. A lot of this is apparently due not just to automating obvious parallelism but also to finding and automating finer-grained opportunities to optimize that would be beyond human patience (and schedules) to explore – and to doing so with guaranteed safety against race conditions.

As the IoT takes off, problems like this are going to become increasingly important. It isn’t all going to be about Big Data analytics. It’s also going to be about hard science and engineering analytics. I suspect you’re going to be hearing more about TMT in the near future. You can learn more about TMT and SequenceL HERE.

More articles by Bernard…


Industry 4.0 Challenges Should Sound Familiar to Tech Companies

by Raman Chitkara on 06-26-2016 at 7:00 am

Following a series of acquisitions, one manufacturer is planning to pivot from a business model based on hardware sales to one based on monthly fees for bundled hardware, software, service, and connectivity. Sensors, the cloud, wireless connectivity and data analytics all play a role in this new model. It is a massive undertaking for an industrial company, and a preview of the coming business transformation to Industry 4.0 that most industrial enterprises are likely to face.

The Industry 4.0 transformation is not just about a high level of automation of the factory but also about the digitization and integration of manufacturing operations and the supply chain, the digitization of product and service offerings, and the implementation of digital business models and direct customer contact, something generally non-existent today. It is about redesigning capabilities and operating models to take advantage of an array of digital technologies now reaching full maturity.

At the heart of Industry 4.0 is the Industrial Internet of Things (IIoT), which Gartner Research estimates will become a significantly larger market than the consumer IoT.

But if you think Industry 4.0 only applies to manufacturing companies, think again. Many of the issues that confront manufacturers in Industry 4.0 are the same ones that confront software and other tech sector companies when they move to cloud-based, anything-as-a-service (XaaS) subscription models.

In PwC’s recent “The Global Industry 4.0 Survey,” reported in the whitepaper, “Industry 4.0: Building the digital enterprise,” we found that industrial companies, like the one cited above, are just beginning to invest and take steps to transform themselves into digital enterprises in the Industry 4.0 mold. We know from public statements that some well-known names are moving fast in this direction—General Electric, Boeing, Honeywell and Caterpillar, to name a few.

Embracing the Digital Enterprise
According to the PwC survey of more than 2,000 industrial participants in nine sectors and 26 countries, these companies are going to look a lot different in five years. For example:

  • Respondents expect to significantly increase their portfolios of digital products and services; more than twice as many expect to be at an advanced level in this area by 2020 as today.
  • Almost three-quarters of companies expect to have highly digitised horizontal and vertical value-chain processes in five years.

Do these trends ring true for tech sector companies? I think they do. The survey also sheds light on some of the key hurdles to accomplishing digitization, including at least two that should resonate with the tech sector:

  • Survey respondents say the biggest implementation challenge isn’t finding the right technology but rather a lack of digital culture and skills. This finding is also consistent with PwC’s Digital IQ research, which for nine years has explored how organizations across industries can derive value from digital investments.
  • Despite the fact that sharpened data analytics skills are essential to Industry 4.0, 38% of respondents currently rely on selective, ad-hoc data analytics capabilities of single employees; another 9% have no significant capabilities at all.

The skills and talent issue was also a key finding of PwC’s Annual Global CEO Survey for 2016, published earlier this year – especially among the tech CEO sub-sample in that survey. In the 2010 survey, 58% of tech CEOs said they were concerned about the availability of key skills. In this year’s survey, 80% said availability of talent is their top concern.


Some tech sector companies, especially software vendors that have or are transitioning to the XaaS model, are ahead of the industrial sector in becoming digital enterprises. Tech sector companies with hardware products – semiconductor and other electronics OEMs of all kinds – face essentially the same issues as industrials. (In fact, 10% of the participants in the survey were from the electronics industry segment.)

Tech Faces Similar Issues
The survey findings are worth a look by anyone in the tech sector, not only for parallels to their own experiences but also to understand the issues their industrial customers face. The tech sector can identify multiple opportunities for new or enhanced business in those findings.

The similarities between the tech sector and the industrial sector go beyond the obvious. There’s a significant convergence between the two worlds, along with specific inter-linkages and inter-dependencies – for example, via IoT.

Whatever the sector, Industry 4.0 is based on interoperability, information transparency, technical assistance, decentralization and autonomous decision-making. These imply multiple transitions for industrial companies – across sales, go-to-market, and products and offerings that become enriched through the addition of sensors and networking – and an associated impact on the end-to-end, order-to-cash cycle.

For tech sector companies, these transitions likely sound familiar, as they are all being driven by the evolution of business models towards XaaS. For the foreseeable future, we anticipate that current business models will coexist alongside XaaS; this also means that technology companies must decide carefully how much to invest in sustaining current offerings and operating models while also building for the future.

Tech companies need to consider the same actions recommended for industrial companies in the study. Specifically, they will want to map out their digital strategy.


10 signs on the neural-net-based ADAS road

by Don Dingee on 06-24-2016 at 12:00 pm

Every day I read stuff about the coming of fully autonomous vehicles, but it’s not every day we get a technologist’s view of the hurdles faced in getting there. Chris Rowen, CTO of Cadence’s IP group, gave one of the best presentations I’ve seen on ADAS technology and convolutional neural networks (CNNs) at #53DAC, pointing toward 10 signs on the road ahead.


ARM and Mentor Enabling the Ecosystem for the Backbone of IoT

by Daniel Nenni on 06-24-2016 at 7:00 am

Charlene Marini (VP of ARM Segment Marketing) gave a nice presentation at the ARM/Mentor Summit last month at the Mentor HQ in Fremont. I just got the slides, so let me give you a quick summary from my notes. It was a very good presentation on IoT and emulation, which in my mind is the new simulation. I also attended an IoT panel at #53DAC that included Mentor, ARM, and Samsung, so this will be a combo blog.

IoT has been the top trending term on SemiWiki for the past year, mostly because everyone is trying to figure out what IoT really is and, more importantly, where the profits will come from. Please remember that IoT is a rebranded version of the embedded market. Intel changed their Embedded Group to IoT last year, right? So that is what IoT really is.


I think we can all agree that IoT will be a very large and diverse market (like embedded has always been) but one thing I think we are missing is that it will be a HUGELY competitive market. For perspective, let’s say that today 1 out of 10 chip designs make it into high volume production, which may be a bit optimistic. For the Internet of Things it will be closer to 1 out of 100 and I think I’m being very optimistic here. So when people say that mature process nodes are ideal for IoT I laugh out loud because your “mature node” design is going to fail against one using FD-SOI or FinFETs if power and performance are at all a consideration.

Quick question: when was the last time you were inside a 300mm or a 200mm fab? Just last year I toured a new 300mm fab and a “mature” node fab (130nm), and it was like time traveling. There is no way I’m putting an IoT design that my job or company may depend on through a back-in-time machine, but I digress…

Charlene’s presentation starts with the growth chart above. Whether you agree on the specific numbers or not, the growth rate between 15 billion and 28 billion devices 5 years from now is very believable in my opinion.

I think we can also agree with ARM that IoT will require:

  • Capacity and Latency (Automotive)
  • Scalability (Diversity)
  • Velocity (data transfer)
  • Agility (updatable)
  • Efficient Compute (everywhere)
  • Ecosystem (diversity and choice)


The most important point here is ecosystem, of course, and nobody does ecosystem better than ARM. That brings us to the second part of the Summit: verification with Mentor Graphics. Mentor and verification are like peanut butter and jelly; you rarely have one without the other.

Mentor and ARM have a long history of collaboration across a number of technology areas: embedded, simulation, emulation, test, etc. ARM also uses the Mentor Enterprise Verification Platform, including the Veloce and Questa platforms, to verify new processor IP and system IP designs. Don’t forget, Mentor is also a fabless chip company and is responsible for the core silicon inside the Veloce emulators, so they “walk the walk” as well as “talk the talk.”

Also Read: Army of Engineers on Site Only Masks Weakness

The verification challenge is well documented but, for effect, I will end with these three slides from Jean-Marie Brunet’s presentation: Shift Left: Networking, Mobile and Multimedia SoC Verification:



Why is the IoT Catnip to Hackers?

by Bill McCabe on 06-23-2016 at 4:00 pm

The latest developments in IoT security will protect the companies that use them from disastrous hacks. Rob Enderle, writing in CIO Magazine on May 20 about a new security certification for IoT products, lauded the new offering and cited other measures that responsible IoT businesses must take to secure the future of their companies. His opinion piece couldn’t come at a better time.

Those of us watching the IoT “back door” swing open to hackers have been wondering how and when a product certification like this would become an industry standard. Underwriters Laboratories’ Cybersecurity Assurance Program (CAP) just might work. But it’s only a start.

The three-level certification process, according to Enderle, will work fine as long as it’s subject to a “rigorous audit process.” However, he also agrees that using a remote network hub with security stopgaps in place (which is what most are doing now) won’t do a thing to protect wireless devices.

Where we are now, where we need to go

During the NXP/FTF Technology Forum 2016, a group of panelists was asked if the Internet of Things was secure yet. What do you think they answered? They said no.

Here’s the rub – and the same thing that Enderle writes about: the connected devices in cars, homes and phones need specialty security hardware to stop many attacks. Another missing link, according to Damon Kachur, Global Business Development Manager at Symantec, is the need to institute “a massive education process compelling security providers to educate consumers on how to operate their devices securely.”

Using cryptography, requiring several rounds of authentication per day, and manufacturers hiring hackers to break into their IoT devices before they put them on the assembly line—these were also solutions that Forum panelists came up with to secure the IoT.

Horror stories averted?
The stories with the highest profiles are those of connected cars taken over and crashed; cell phones hijacked and set on fire; and the Target breach, when hackers stole credit cards from Target headquarters using the building’s HVAC systems to get in. What else do we need to do, besides work on certification processes and make sure that before we build the next IoT device, we’ve protected it from hackers?

It’s clear that businesses engaged in the IoT revolution need to make security “job one”. There are heartening signs that this indeed is the case. A recent Accenture paper on IOT security claimed that “businesses surveyed by the World Economic Forum identified cyber-attack vulnerabilities as their most important IoT concern.” And an article last month in Forbes reported that venture capitalists are now “following the money” to underwrite cybersecurity start-ups: “Boston-based Lux Research says investment in “cyberphysical” security startups rose 78% to $228 million in 2015, and will increase to $400 million this year. The report cites rapid adoption of IoT tech, with the potential threats it brings in the area of internet connectivity in cars, homes and factories.”

Businesses that are eager to make money on the IoT without being willing to spend the money on securing it will be increasingly prone to customer data breaches and other high-profile disasters that will close their doors – and slow the adoption of, and spending on, IoT devices for years to come. Smart companies need to invest in securing their latest game-changing IoT use case or product, or their customers and partners won’t want to invest in them.


Bridging the Gap between Foundry and IC Design at #53DAC

by Daniel Payne on 06-23-2016 at 12:00 pm

In our semiconductor ecosystem we often separate engineers, and therefore EDA tools, into silos like foundry, front-end design, back-end design, tapeout, etc. What I discovered at #53DAC a few weeks ago was that some EDA companies actually bridge the gap between foundry engineers and IC designers with their tools. ProPlus Design Solutions is one such company, and I had the pleasure of talking with Lianfeng Yang about this.