
Take a drive on the IoT with V2V
by Bill Boldt on 07-08-2014 at 11:00 pm

What platform has become the most sophisticated and intimate personal electronic environment ever? The car. To paraphrase a famous automotive company’s top executive, car companies are transforming the car into a powerful smartphone that allows drivers to carry around, customize, and interact with their digital world. Automotive electronics are currently centered around people (infotainment and communications) and the machine itself (to run the car and provide safety and convenience). Now a third element is emerging; namely, Vehicle-to-Vehicle (V2V) communications.

Just like that sounds, cars will soon “talk and listen” to one another — automatically. They will share information such as proximity, speed, direction, and road conditions, as well as other things that have yet to be imagined. The chief driver of V2V is signaling impending collisions so that cars can automatically take countermeasures. That, of course, means the V2V network will become a critical technology for self-driving and assisted-driving cars.

While it may seem revolutionary, V2V is really an evolutionary branch of Internet of Things (IoT) technologies, which are creating a world where smart, secure, and communicating sensors will become ubiquitous in planes, trains, and automobiles; inside homes; inside commercial buildings; on highways; in cities and towns; in agriculture; in factories; in retail spaces; and worn by and implanted in humans and animals. The Internet of Things could eventually connect everything from cars to cats.

A term being used to describe the technologies making such a smart, sensor-saturated world is “sensor dust,” which captures the Zeitgeist that super-tiny, smart, communicating sensors will be everywhere, like dust. Sensors, of course, are never just sensors. They are always connected to other things, mainly microcontrollers (MCUs). With the advent of ultra-low-power and energy-harvesting technology, the sensor-MCU combination has become an ideal, clear, and present foundation for widespread sensor roll-out. Sensing, by its very nature, often implies detection and communication from a distance, and that is where wireless communication comes into play.

The dark side is that remote sensing and communication open the door wide for bad actors who want to intercept, spoof, and misuse the data streaming freely through the air. So, security (encryption and/or authentication) becomes the final piece of the picture, and arguably the element that makes wide adoption of the IoT possible at all. Huge amounts of information are already being collected every day about traffic flow from phone users worldwide (without their knowing it). Such storehouses of data can be mined in real time and used to provide personal traffic reports to subscribers while driving. At least that is the story. As the car moves from one place to another, social networking can happen in real time to locate friends or certain activities and happenings (automotive flash-mob, anyone?). But do consumers really want their whereabouts and other information out in the open in a completely uncontrolled way? No. People are becoming extremely sensitive to data insecurity, and there is a growing need to trust how the information being collected will be used. Without some type of trust, the IoT could be doomed. Maybe the term “Internet of Trust” should be coined to make that point obvious.
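To make the authentication point concrete, here is a minimal sketch of authenticating a broadcast V2V-style payload using HMAC-SHA256 from Python's standard library. It is purely illustrative: real V2V security (e.g., IEEE 1609.2) relies on certificate-based digital signatures rather than a pre-shared symmetric key, and the field names below are invented for the example.

```python
import hashlib
import hmac
import json

# Illustrative pre-shared key only; real V2V uses certificate-based
# digital signatures, not a symmetric secret shared by all cars.
SECRET_KEY = b"demo-key-not-for-production"

def sign_message(payload: dict) -> dict:
    """Attach an HMAC-SHA256 tag so receivers can detect tampering or spoofing."""
    body = json.dumps(payload, sort_keys=True).encode()
    tag = hmac.new(SECRET_KEY, body, hashlib.sha256).hexdigest()
    return {"payload": payload, "tag": tag}

def verify_message(message: dict) -> bool:
    """Recompute the tag over the payload and compare in constant time."""
    body = json.dumps(message["payload"], sort_keys=True).encode()
    expected = hmac.new(SECRET_KEY, body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, message["tag"])

msg = sign_message({"speed_kph": 87, "heading_deg": 312, "braking": True})
assert verify_message(msg)

# A spoofed payload fails verification.
msg["payload"]["speed_kph"] = 10
assert not verify_message(msg)
```

The point is only that a receiver can reject altered or forged messages; the key management problem is exactly why a certificate infrastructure is needed at V2V scale.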

V2V & IoT
The evolution of V2V and that of the IoT are intimately related because both will be composed of the very same technological blocks. The overlap is easy to see. The foundational components of each are miniaturized MCUs, sensors, wireless technology, and security devices that operate using ultra-low power. Described as equations, IoT and V2V could be expressed in the following way:

IoT = (MCU + Sensor + Security + Wireless) [SUP]Low Power[/SUP]
V2V = IoT + Car

Equation one might imply that companies that can integrate the factors will lead in the build-out of the IoT market. Equation two effectively states that V2V is the IoT on wheels. In any case, there are certain basic blocks that must be integrated, and they must be integrated in the right way for the particular use-case. IoT and V2V design flexibility and time to market will matter, a lot. (But that is a topic for another time.) The growth of the connected car platform is expected to be remarkable. That makes sense since the car is the one place that GPS/NAV systems, smart phones, tablets, DVDs, CDs, MP3s, Bluetooth, satellite radio, high power stereo amps, speakers, voice control, and the Internet can all come together and interact with each other.

Such convergence is making the car into an advanced personal hub. Market researchers have estimated that revenue for the connected car market will grow from $17 billion in 2012 to $54.5 billion in 2018 for hardware and services (telematics, telecom, and in-vehicle). Unit sales of embedded, tethered, and smartphone equipped cars are expected to grow from around 10 million units in 2012 to 67 million by 2018, with over 50% of that volume being embedded systems that are controlled by media and sensor control systems.
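As a quick sanity check on those forecasts, the implied compound annual growth rates can be computed directly (assuming six years of compounding between the 2012 and 2018 figures):

```python
def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate implied by two endpoint values."""
    return (end / start) ** (1 / years) - 1

# Revenue: $17B (2012) -> $54.5B (2018), six years of growth.
revenue_cagr = cagr(17, 54.5, 6)
# Units: ~10M cars (2012) -> 67M (2018).
unit_cagr = cagr(10, 67, 6)

print(f"Revenue CAGR: {revenue_cagr:.1%}")   # roughly 21% per year
print(f"Unit CAGR:    {unit_cagr:.1%}")      # roughly 37% per year
```

In other words, the researchers are forecasting unit volume to grow noticeably faster than revenue, consistent with connected-car hardware moving down-market.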

Media control systems are not only becoming a standard feature in new cars, but according to consumer electronics and auto industry researchers, a chief reason that people are selecting certain cars over others. Electronics are becoming a main forethought rather than a minor afterthought for car buyers. Sophisticated electronic systems are becoming mandatory, and this powerful dynamic will only accelerate as more electronics products, features, and services are sped to the market by the car makers, consumer electronics companies, smartphone makers, and software providers.

However, all this electronic stuff has presented a huge challenge: safety. Using products such as the cell phone in the car interferes badly with driving. Anyone who has placed a call, or even worse tried to text while driving (and who hasn’t?), can testify that dial-driving is a bad idea. So, what can be done to get car electronics, phones, and humans to play well together in a safe way? The solution has been summed up succinctly by the CEO of a major auto maker, who refers to in-car control systems as freeing the user from the tyrannies and dangers of messing with that little phone while driving. Rather than the car and the phone (and other electronics) being at odds with each other, the car is transforming into the newest electronic platform: one that is highly integrated, easy to use, and distinct from anything else to date. It is easy to see that the emerging alloyed car-plus-consumer platform is primed for cars to talk to one another without the need for human intervention.

The list of electronics functions in cars is evolving fast and will likely include multi-person gaming; GPS with location-based services such as real-time traffic and road condition updates; vehicle monitoring for maintenance status, performance, and eco-friendliness; vehicle and personal security; connection to home control/security systems; social networking opportunities related to location; and especially safety. In fact, the US Department of Transportation (DOT) and the National Highway Traffic Safety Administration (NHTSA) are partnering with research institutions and auto companies to collaborate on technology development and interoperability of V2V to promote traffic safety. V2V can transform the automotive experience more than anything since Henry Ford’s assembly line made cars available to the working class. The notion of a car driving itself still sounds like pure science fiction, but prototypes are already driving themselves. So, it is just a question of time before we have auto-automobiles (auto[SUP]2[/SUP]mobiles), where you simply tell your personal digital assistant where you want to go, then take a seat in your personal infotainment pod until you get there.

But, well before that happens we will see significant improvements in safety due to V2V. It is clear that the lucrative auto electronics platform is already right in the sights of all car makers, and they clearly plan to take it to the next level and the next level after that, with no end in sight. As noted, electronic things sell cars, and more advanced electronics will show up in the more advanced cars. Then, last year’s advanced systems will naturally move down-market, so even more advanced systems will be needed for next year’s up-market cars. This endless cycle of innovation will drive automotive companies to create V2V and self-driving ecosystems sooner rather than later. As we move towards the self-driving omega-point we will see V2V and IoT showing up very early in the journey.

V2V (the IoT on wheels) will make it hard to tell where the car ends and the phone, tablet, computer, and sensors begin.

Interested in learning more about Atmel’s automotive portfolio? Check out our automotive-qualified category breakdown.


Bill Boldt, Sr. Marketing Manager, Crypto Products Atmel Corporation


Modeling and Analysis of Single Event Effects (SEE)
by Daniel Payne on 07-08-2014 at 4:00 pm

Single Event Effects (SEE) are important because we depend on our consumer, industrial, and aerospace products to work reliably. Protons, electrons, neutrons, or alpha particles may perturb MOS or bipolar device operation in either a destructive or non-destructive fashion. Galactic cosmic rays are one source of these particles, and by the time they reach Earth the flux amounts to tens of particles per square centimeter. Even packaging materials can be a source of the alpha particles that cause SEE.

Continue reading “Modeling and Analysis of Single Event Effects (SEE)”


IMEC Technology Symposium
by Paul McLellan on 07-08-2014 at 12:52 pm

Yesterday I attended the IMEC Technology Forum at Semicon West. As always with IMEC, they present so much information it is like drinking from a firehose. I’ll say more about the future of process technology in a blog later this week, but this blog is about IMEC itself. It is an amazing success story. Let’s face it: if you were going to guess where the world’s most advanced semiconductor R&D is being coordinated, you probably wouldn’t have picked Belgium if you didn’t already know the answer.

IMEC is celebrating its 30[SUP]th[/SUP] anniversary this year. It is located in Leuven, Belgium. Originally it was called the Interuniversity Microelectronics Centre, but now IMEC is just its name. Luc van den Hove, the CEO of IMEC, gave a presentation on Creative Business Models in a Consolidating Semiconductor Landscape. Over the years its way of operating has changed. Originally it worked in individual partnerships with companies, and then it brought companies together to solve specific problems. In its current incarnation it partners with semiconductor manufacturers and equipment manufacturers at the pre-competitive stage, currently focused on 7nm and beyond, two or three generations beyond what is in current volume manufacturing.

The cost of semiconductor R&D has been rising faster than semiconductor revenue and has become so expensive that nobody can really go it alone. One way to get the costs back down is to share them. IMEC has gradually come to be the place where this sharing gets done. They are partnered with all four major logic manufacturers (Intel, TSMC, Samsung and GF) and with all four major memory manufacturers (Samsung, SK Hynix, Micron and Toshiba/SanDisk). They are also partnered with almost all the equipment manufacturers.


They have a lot of clean-room space (with more being built) and so can run experimental wafers, experimental equipment, and so on. The current semiconductor R&D model, then, is that everyone cooperates at IMEC on the basic R&D. There is currently a lot of work on new transistor architectures such as gate-all-around (GAA, silicon nanowires), vertical versions where the source sits on top of the gate on top of the drain with a silicon nanowire running vertically, and other futuristic approaches. There is also work on new materials for the BEOL metal fabric, not to mention lots of lithography work, especially on the EUV roadmap and directed self-assembly (DSA).

I went to the Samsung Healthcare announcement about a month ago and was surprised to find that they were partnering with IMEC, not just in semiconductor technology but in medical technology too. It turns out IMEC is a world leader in health innovation as well, in what they call the Internet of Healthy Things: starting with fitness gadgets, then medical-grade ambulatory monitors, consumer-grade lab testing (selling for $10-20 in drugstores), on to DNA analysis, single-cell analysis, and even brain probes.

In fact, using the foundation of their semiconductor work, they have built ecosystems of partners in areas beyond medical: low-power wireless, energy, sensor systems, and automotive.

IMEC is the largest ecosystem in the world related to semiconductor technology.


More articles by Paul McLellan…


From the ARM7 to Such a Large CPU Core Portfolio
by Eric Esteve on 07-08-2014 at 3:13 am

I heard about the ARM processor for the very first time in 1990, when I was interviewed by the ES2 Design Center manager before being hired to subcontract an ASIC design for ES2. I don’t know why, but I remember very well that he told me about two of the ES2 partners: ARM as a processor IP core provider and TSMC as a foundry partner if, by chance, one of the ASIC sockets had to run into high production volume (at that time, ES2 was known for fast prototyping, thanks to an e-beam based processing flow; good for prototypes, not really suited to large production volumes). Then, in 1995, ES2 was bought by Atmel, and CEO George Perlegos decided to build a brand-new fab supporting 350nm to 180nm in Rousset, France, in 1999. I joined Atmel at that time as (standard cell) ASIC PMM, reporting to the same person. Our job was not only to win ASIC designs against the traditional competitors, but also to face no fewer than three internal competitors: Gate Array (US-based group), MHS ASIC (France), and TEMIC ASIC (Germany). Thus, we had to brainstorm hard to highlight our differentiators! The most important was clearly our ability to support ASICs integrating an ARM core, our group managing all the ARM-related developments in Rousset. This positioning was good, but the decline of the ASIC business was inevitable… That’s why my former boss is no longer in charge of the ASIC business, but of the ARM-based microcontroller family. Within Atmel, the ARM-based microcontroller product line has grown from less than $10 million in 2003 to several hundred million dollars in 2013…

Another short story, to provide another perspective: in 1998, working as an ASIC PME at TI, I saw the explosion of the Wireless BU. The WBU's success was clearly based on the baseband IC, namely the winning team of the TI DSP and the ARM7TDMI IP core, evolving into the OMAP application processor of the 2000s. The selection of the ARM7 core by Ericsson, Nokia, Alcatel, Motorola (and many more), initially to support the GSM standard in the mid-1990s and later all the following standards, including CDMA, 2G, 3G, and 4G, established the ARM IP core as the “de facto” wireless application processor CPU. ARM's position in such an exploding market created a huge royalty flow, fueling ARM Ltd's R&D and allowing the company to build and develop the complete IP core portfolio we know today, covering almost all potential applications, something we called ubiquity in one of the very first (ARM vs Intel) blogs on SemiWiki.

  • Embedded: Automotive Infotainment, Embedded Computing, General Purpose MCU, IoT, Smart Card, Smart Meter.
  • Mobile: Computing, Smartphone, Feature Phone, Connectivity and Modem, Mobile Payment
  • Home: Blu-Ray and DVD, Digital Set Top Box, Digital Still Camera, Digital TV, Gaming
  • Enterprise: HDD/SSD, Flash cards and UFD, Home Networking, Network Infrastructure

This long list of applications supported by ARM processor cores explains why ARM Holdings (ARMH) stock has grown 10x within 10 years!

If you take a look at the ARM Cortex portfolio, you will face a dilemma: which Cortex processor to choose among the Cortex-A57 or A53, the Cortex-A family (7 cores), Cortex-R (R7, R5 or R4), Cortex-M (5 cores), or the SecureCore families? A family like the Cortex-A53 and Cortex-A57 seems to be dedicated to the smartphone. But if you take a look at the above picture (taken during ARM TechCon 2012), you can see a certain evolution, moving from a smartphone-dedicated application processor counting four Cortex-A53 cores to the superphone/tablet, integrating two Cortex-A57 cores on top of the four A53s. The next step, according to ARM, is “naturally” the mobile computer (four A53 + four A57). If you look at the last application, an MPU integrating 16 Cortex-A57 cores plus a Cache Coherent Network (CCN) and the Level 3 cache, the target is clearly the server market, which today delivers most of Intel's margin in dollars, thanks to high chip ASPs. As you can see, a family counting only two cores allows targeting multiple applications! If you consider that the ARM CPU IP core portfolio is built around four more families, you will probably need some help to find the optimized solution…
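The progression described above can be captured in a toy lookup, sketched below. The target-to-configuration mapping paraphrases the TechCon 2012 slide as described in the text; the function name and target labels are invented for illustration, and this is of course no substitute for ARM's actual selector tool.

```python
# Toy selector mirroring the big.LITTLE progression described above.
# Mapping paraphrases the ARM TechCon 2012 slide discussed in the text;
# it is illustrative only, not ARM's real selection tool.
CONFIGS = {
    "smartphone":      {"Cortex-A53": 4, "Cortex-A57": 0},
    "superphone":      {"Cortex-A53": 4, "Cortex-A57": 2},
    "mobile_computer": {"Cortex-A53": 4, "Cortex-A57": 4},
    "server":          {"Cortex-A53": 0, "Cortex-A57": 16},  # plus CCN + L3 cache
}

def suggest_cores(target: str) -> str:
    """Return a readable core configuration for a target application."""
    cfg = CONFIGS[target]
    parts = [f"{n}x {core}" for core, n in cfg.items() if n]
    return " + ".join(parts)

print(suggest_cores("superphone"))  # 4x Cortex-A53 + 2x Cortex-A57
```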

Why not use the specific selector tool available on the ARM web site?

We have mentioned the wireless or mobile market as the “cash cow” generating huge cash flow, thanks to the high production volumes of ARM-based application processors and the related royalty payments. ARM Ltd. has been clever enough to collect this cash… and even smarter in investing it, first into R&D and portfolio development, and also to build a very strong ecosystem with over 1,000 partners delivering silicon, development tools, and software. Such a strategy may appear to be just common sense or good business practice today in 2014, but we have to remember that ARM was a real innovator, the first company to really understand how important it would be for the future of its business to develop such an ecosystem. Companies like Intel, for example, could have done it 20 years ahead of ARM, but securing and reinforcing their MPU monopoly was their main concern, not building an ecosystem…

Eric Esteve from IPNEST –

More Articles by Eric Esteve…..


Intel Custom Foundry Explained!
by Daniel Nenni on 07-07-2014 at 7:00 pm

The exciting news is that Intel landed their first big SoC customer with Panasonic’s System LSI Business Division. These 14nm SoCs will be targeted to audio visual equipment markets. The significance here to me is that Intel not only has a big SoC customer, Intel now has a non-Silicon Valley based foundry customer. It is critical for a foundry to be able to operate world-wide and Japan, as a country, is an important market as they are leading the transition from IDM to the fabless business model.

Sunit Rikhi presented at SEMICON West today, Intel made the slides available (HERE) and I do greatly appreciate the transparency. They are an interesting read and I highly recommend you browse them. Before the presentation Sunit asked for a copy of “Fabless: The Transformation of the Semiconductor Industry” which I happily gave him with a custom inscription. Hopefully we can meet again and discuss the book in more detail. The better Intel understands the dynamics of the fabless semiconductor ecosystem the better the return on investment they will get and the more investment they will make, for the greater good, right?

Out of the 42 slides, here are my 6 favorite:


  • The Value of Better Transistors (#4). Certainly a valid point but a better transistor does not directly translate into better chips. The FinFET versions of competing FPGAs and SoCs due out next year will have the final word on this.
  • Expect More From Moore (#7). This is a knock on the TSMC’s “More than Moore” slogan which I found quite amusing. I was the only one who laughed but the subtle point was well taken.
  • Intel Custom Foundry Ecosystem (#14). Synopsys for foundation IP was news to me. This is significant if that IP is optimized for Intel processes. Intel is Synopsys’ biggest customer of course. Will ARM be added to that list? ARM’s foundation IP is optimized for ARM processors so probably not.
  • IDM Advantage: Foundry Plus (#17). This is a great list of services, design to tested chips, meaning that Intel competes with the ASIC companies such as eSilicon and Global Unichip. I do like the Foundry Plus sound bite.
  • Response: Reflect the Marketplace in our Workforce (#32). I’m guessing that this is in response to me pointing out that the majority of the Intel Custom Foundry employees are from inside Intel. The search I did on LinkedIn contradicts this slide based on years of experience but LinkedIn search has failed me before.
  • IDM Challenge: Separation of Intel Business Unit and Customer IP (#34). Intel uses a firewall to ensure “Separation by Infrastructure Design”. Samsung did this by building separate fabs in Texas for Apple and now licensing 14nm to the GlobalFoundries NY fab. TSMC does not have to do this of course.

    All in all it was a good presentation, absolutely. Intel is now in a quiet period for the Q2 2014 conference call so more detailed information was not available. The most interesting piece of information that I gleaned from this presentation is that Intel started 22nm CUSTOMER shuttles in 2011, 14nm in 2013, and 10nm will start in 2015 (slide #11). This means, according to my calculations, Intel 22nm was 2 years ahead of TSMC 20nm, Intel 14nm is less than 1 year ahead of TSMC and Samsung, and 10nm will be too close to call.


    S-engine Moves up the Integration of IPs into SoCs
    by Pawan Fangaria on 07-07-2014 at 8:30 am

    As the semiconductor design community moves to higher and higher levels of abstraction, with standard IPs and other complex, customized IPs and sub-systems integrated together at the system level, sooner or later we will find SoCs to be just assemblies of numerous IPs selected off-the-shelf according to the design needs and specifications. Does that sound simple? No, it’s harder than we think. A major burden will fall on the SoC integration and verification engineers to explore optimum connectivity, find the best NoC (Network-on-Chip), check every connection, debug the design, and do chip finishing by editing at the system level. And all of that has to be done within a short span of time to meet the ever-shrinking time-to-market window. I am sure the focus of EDA will shift toward seamless integration of IPs into SoCs: on-the-fly, correct by construction, and well tested, which can provide better yield.

    During the 51[SUP]st[/SUP] DAC, Concept Engineering introduced S-engine[SUP]TM[/SUP], a specialized schematic generation capability at the system level, which is a step in the right direction, and at an opportune time, for IP development and SoC integration. Design houses can integrate this capability into their design tools, allowing them to visualize, debug, and edit schematics at higher levels of abstraction when they are working at the top level of an SoC, trying to integrate various IPs together. It’s natural that a fair amount of schematic editing will be required at the top level while stitching the IPs together; S-engine provides that smart editing feature at the system level.

    S-engine automatically generates schematics that allow visualization at any desired level, typically the system-level interfaces that are used to configure the IP blocks and their assembly. Smart editing at this juncture, combined with high performance and capacity, allows on-the-fly management of IPs with interactive visualization, assembly, and architecture design of the system, thus enabling the creation of complex, high-quality SoCs, NoCs, and IP sub-systems. A powerful P&R technology can handle today's complex SoC designs and produce clean schematics at the system level.

    The S-engine capability from Concept provides an opportunity for EDA vendors to integrate it into their tools for such high level editing, visualization and debugging needed for system level integration. As a matter of fact, the S-engine can be easily integrated into any HLS (High Level Synthesis) tool to provide it the required control and visibility over the entire synthesis process in order to produce an optimized design at a desired level. It supports a single schematic to have components at different levels such as system, RTL, and gate, thus supporting the inherent heterogeneity in IPs from various third parties.

    There are many features in S-engine which enhance its flexibility and capability to analyze and edit at different localized regions of a schematic as desired by the user. For example an IGEN symbol can have collapsible or expandable interface pins as per requirement for ease of viewing and analyzing particular buses in a schematic. Similarly there is a provision for priority routing for interface nets.

    A designer can have a separate toolbar for certain selected schematic objects for ease of viewing, analyzing and editing those objects. There are native images which can be used as graphical attributes. Also, there are specific comment graphic objects that can be used for easy reference.

    Since design complexity is growing at a rapid pace, with multiple functions being accommodated on the same chip, enabled by multiple IPs developed separately and integrated together, it becomes evident that the IPs must be visualized, analyzed, and accommodated at the system level to produce a correct and optimized SoC. This drives the need for a capability like S-engine, which can be easily integrated into EDA tools through simple and robust APIs. It provides two-way communication with the host application for cross probing, highlighting, ballooning, and other operations. Concept's proprietary algorithms enhance performance for on-the-fly schematic creation with excellent interactive editing. Interactive modification of schematic fragments is allowed for incremental schematic editing. The built-in system- and IP-level symbols enable the application to work without specific symbol libraries.

    The S-engine with its appropriate API interfaces is available on multiple platforms with customizable GUI environments such as Tcl/Tk, Qt, MFC, Java SDK, Perl/Tk and wxWidgets. If you are thinking of upgrading your tools for SoC designs involving multiple IPs, it’s worth exploring S-engine and integrating it into your tools.

    More Articles by Pawan Fangaria…..



    Is Now the Time to Buy Bitcoin?
    by mbriggs on 07-06-2014 at 9:00 pm

    I have to admit I, thus far, have been the ultimate Bitcoin cynic. Watching the price go from $2 in the fall of 2011 to $1132 in December 2013 was dizzying. It seemed reminiscent of Dutch tulip mania. A bitcoin that is not backed by anything physical such as gold, or by a government, strikes me as only slightly less valuable than a tulip.

    My son is a big Bitcoin fan. I told him that for old guys, like me, the bank tax is worthwhile. My favorite example is my Visa card. I pay it off every month so I’m immune to the usurious interest rates. I really like the fact that I get fraud protection for free. I like the fact that I get an email every time something is charged to my account. I like getting miles.

    However, I am changing my tune. This is mostly because of my dislike of the banks.

    Why I hate the banks

    I recently applied for a home refi. I get the fact that regulations are tight and banks can’t make risky loans. I did a refi a couple of years ago with Quicken. The amount of information they require you to provide is painful. My expectation was set, and I was willing to grin and bear it. However, the refi with Chase took unreasonable to grand new levels. I can’t imagine a less risky loan proposition than I presented. I finally said “no more” after the third round of requested documentation.

    I read about Wall Street traders complaining because their bonuses are only in the small millions of dollars. This irritates me because these people add zero value to society. Why can’t these brilliant people actually make things?

    Am I alone in thinking, and hoping, that the banks in their current form go away? Have you heard of peer-to-peer lending sites such as Lending Club or Prosper? Why not borrow from your peers at a cheaper rate than you can get from a bank?

    How about IPOs? Investment bankers (underwriters) make millions. Morgan Stanley, and others, made $100M on the Facebook IPO. It’s too bad Facebook didn’t use the Dutch auction that Google used for its IPO, and pay the bankers zip.

    Back to Bitcoin

    It continues to amaze me that Bitcoin has persevered through so many obstacles. These obstacles are many, but a few include:

    “China has barred all financial institutions, such as Baidu, from handling Bitcoin transactions. In addition, the Russian prosecutor general announced on Feb. 6, 2014, that the use of Bitcoins and other digital currencies is illegal under its current law.”

    You also can’t stop reading about the stolen, lost, or confiscated Bitcoins from the likes of Mt. Gox and Silk Road. See Business Insider’s article, “$500 Million Worth Of Bitcoin Has Been Stolen Since 2010.”

    However, in the face of all this adversity, it seems the price of Bitcoin is starting to stabilize. California has lifted its ban on Bitcoin. Some very smart VCs, such as Andreessen Horowitz, Union Square Ventures, and Tim Draper, are into Bitcoin in a big way. Coinbase has come out with a secure Bitcoin wallet, so hopefully many of the thefts are behind us.

    If you find getting started with Bitcoin a little daunting, check out https://trybtc.com/. It will walk you through the process, and make it a little fun.

    If the price of a Bitcoin dips back down below $600 (it’s currently $635) I may buy some.


    MIMO, Always On, 3D Imaging and Computer Vision…
    by Eric Esteve on 07-06-2014 at 12:08 pm

    You can read all these articles in the latest CEVA newsletter, if you didn’t read them first on SemiWiki! The blog describing the “Maximum Likelihood MIMO Implementation” certainly goes deep technically, as it introduces a complex digital signal processing technique, Multiple Input Multiple Output (MIMO). MIMO is just like magic, as it can allow a 4x bandwidth multiplication, both for transmission and reception. This DSP technique is anything but trivial, but with a good DSP engineer developing the right algorithm on the right piece of hardware, here a DSP core from CEVA, it’s possible to boost a base station and reach such a bandwidth multiplication. The reader will discover why a linear algorithm, easy to implement, cannot fully exploit the MIMO benefits, while an optimal Maximum A Posteriori (MAP) approximation MIMO algorithm generates high latency penalties. Finally, the non-linear MIMO receiver implementation known as the Maximum Likelihood Detector (MLD), more demanding on processing than a linear receiver, offers significantly higher bit rates for the same channel conditions. You can also find a white paper going deeper into MIMO analysis.
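For readers who want to see what Maximum Likelihood Detection actually computes, here is a minimal NumPy sketch for a 2x2 MIMO channel with QPSK symbols. It illustrates only the general MLD idea (exhaustive search for the transmit vector minimizing the residual norm), not CEVA's implementation; the channel matrix and noise values are made up for the example.

```python
import itertools

import numpy as np

# Unit-energy QPSK constellation.
QPSK = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / np.sqrt(2)

def mld_detect(y: np.ndarray, H: np.ndarray) -> np.ndarray:
    """Maximum Likelihood Detection by exhaustive search: return the
    constellation vector x that minimizes ||y - H @ x||^2."""
    n_tx = H.shape[1]
    best, best_metric = None, np.inf
    for combo in itertools.product(QPSK, repeat=n_tx):
        x = np.array(combo)
        metric = np.linalg.norm(y - H @ x) ** 2
        if metric < best_metric:
            best, best_metric = x, metric
    return best

# Made-up, well-conditioned 2x2 channel and a small noise sample.
H = np.array([[1.0, 0.4], [0.3, 0.9]], dtype=complex)
x_true = QPSK[[0, 3]]                      # the two transmitted symbols
y = H @ x_true + np.array([0.05 + 0.02j, -0.03 + 0.04j])

x_hat = mld_detect(y, H)
assert np.allclose(x_hat, x_true)          # low noise: detection is exact
```

The search visits 4^n_tx candidates, which is exactly why MLD demands more processing than a linear receiver, as the article notes.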

    The article “Bluetooth on CEVA-TeakLite-4: it’s All about Always-on” will certainly please those convinced that the IoT is the next big thing for the semiconductor industry! Here is an extract from the blog written on SemiWiki about always-on: “The Internet of Things comprises a multitude of devices, technologies and form factors, with many use cases and requirements. The CEVA-TeakLite-4 specifically targets user-centric IoT devices, where natural user interface, audio playback and voice communication represent key attributes of the device. This can include for example, voice activation, face triggering and other ‘always-on’ functionality in a smartphone, smart watch, smart home controller or wireless speakers. The ultra-low power nature of the CEVA-TeakLite-4 DSP ensures that these ‘always-on’ features consume minimal battery life. All of this functionality can run concurrently on the DSP without the need for a host CPU, reducing the die size and lowering power consumption of the overall device. Illustrating this, a real-life use case implementing Bluetooth Low Energy, always-on UI and sensor fusion on the CEVA-TeakLite-4 DSP requires less than 150K gates and consumes less than 150uW when implemented in a 28nm process.” Take a look at the CEVA newsletter too…

    Accelerating Computer Vision Applications? Thanks to CEVA’s ADK for the CEVA-MM3101, the image processing platform gains new capabilities such as gesture recognition, emotion detection and augmented reality.
    Meanwhile, CEVA continues to build new resources into the libraries and examples available through the ADK. The following new kernels have recently been added to the library:

    • Matrix inversion
    • Feature detection: FAST9, HOG, SURF
    • New filters: bilinear, bicubic
    • Object detection: LBP, HAAR, SVM, ORB
    • Image processing: Histogram, gamma
    • Optical flow: FLT, block matching
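    The kernels above are hand-optimized for the CEVA-MM3101 vector architecture; as a rough illustration of what one of them computes, here is a scalar Python sketch of block-matching motion estimation (the function name and parameters are my own, not the ADK’s API):

```python
import numpy as np

def block_match(prev, curr, y, x, block=8, search=4):
    """Estimate motion of the block at (y, x) in `prev` by exhaustively
    searching a +/-`search` pixel window in `curr` for the candidate
    with the smallest sum of absolute differences (SAD)."""
    ref = prev[y:y + block, x:x + block].astype(np.int32)
    best, best_sad = (0, 0), np.inf
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            yy, xx = y + dy, x + dx
            if yy < 0 or xx < 0 or yy + block > curr.shape[0] or xx + block > curr.shape[1]:
                continue
            cand = curr[yy:yy + block, xx:xx + block].astype(np.int32)
            sad = np.abs(ref - cand).sum()
            if sad < best_sad:
                best, best_sad = (dy, dx), sad
    return best

# Synthetic check: shift a random image right by 2 pixels, recover the motion
prev = np.random.default_rng(1).integers(0, 256, (64, 64), dtype=np.uint8)
curr = np.roll(prev, shift=2, axis=1)   # scene moved 2 px in +x
print(block_match(prev, curr, 24, 24))  # expect (0, 2)
```

    The inner SAD loop is exactly the kind of regular, data-parallel workload that maps well onto a vector DSP, which is why such kernels ship pre-optimized in the ADK library.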

    In addition, new sample algorithms have been provided, demonstrating the capabilities of the CEVA-MM3101 for:

    • Face detection and recognition
    • Gesture recognition
    • Palm tracking
    • Augmented reality
    • Object detection and tracking
    • Emotion detection

    You will learn in the Newsletter how the partnership between CEVA and nViso was key to developing “emotion detection”.

    One article in this Newsletter did not appear on SemiWiki: “CEVA Targets Wearable” is extracted from a Linley Group report, which explains that successful wearable devices will require processors custom-designed for this application. On CEVA’s web page, you can download the complete Linley Group report in addition to reading the CEVA Newsletter.

    Eric Esteve from IPNEST




    The Grand Folly of India’s Foundry Plans

    by Peter Gasperini on 07-06-2014 at 10:10 am

    At the beginning of the year, New Delhi’s outgoing government launched an initiative intended to drive the nation’s technology independence and reduce the current account deficit on electronics imports. The initiative describes a partnership between New Delhi and two industrial consortia to build semiconductor manufacturing plants – one outside of the capital and the other in Western Gujarat.

    “Every age has its peculiar folly: Some scheme, project, or fantasy into which it plunges, spurred on by the love of gain, the necessity of excitement, or the force of imitation.” – Charles Mackay, “Extraordinary Popular Delusions and the Madness of Crowds”

    The plan is ambitious. The 28nm fab near New Delhi and the 22nm foundry in Western Gujarat will each support 40,000 wafer starts per month. The plan is also very expensive: together, the facilities are projected to cost just under $11B USD. The government’s role in these two separate efforts is essentially that of a zero-cost banker. Each consortium will receive an interest-free loan of approximately $900M USD. Furthermore, the central government is on the hook for 25% of the total capital outlays, and is providing further incentives in the form of a bundle of tax deductions and duty exemptions. Thus, New Delhi’s expenses in this effort, not counting the tax incentive bundle, are just shy of $4B. What did the previous Cabinet expect to receive as a benefit to the nation for such an industrial policy?

    Well, India’s electronics imports reached $31B in 2013, and that number is expected to explode to a whopping $400B by 2020. The contention is that laws requiring local content will translate to a drop in the net import value of electronics and help ameliorate the balance of trade. More directly, though, the Cabinet advertised the economic benefits it expected from the project – an estimated 22,000 direct jobs and another 100,000 spawned from enterprises that would sprout up to support foundry operations and employees in their immediate locations. There are other hoped-for benefits of a more strategic nature. The policy echoes sentiments that call for greater Indian participation in the Information Technology sector of the global economy, along with all the economic clout and revenues which follow. Furthermore, as India grows in its expected role as a world power, efforts need to be made to provide local sources for the components that are used in the increasingly sophisticated navigation and communications suites, avionics and targeting systems used in the aircraft, naval vessels, ground vehicles and missiles of India’s armed forces. But will this really be of benefit to India? A very quick look at basic math provides a distinctly negative answer to this question.

    Consider the following: the central government in New Delhi will be responsible for subsidies and credit to the efforts of the two industrial consortia amounting to $4B (and recall that this is not counting an additional un-quantified bundle of tax and customs subsidies.) If the estimates of consequent employment are correct (and that’s a big ‘if’, folks), each direct job will cost the Indian taxpayer over $180,000. In a country where the per capita income is $4300, the cost-benefit ratio is atrocious. Adding the ‘indirect’ job creation estimate for a total of 122,000 expected positions (also a very questionable figure), the cost is still at least $33,000/job. The gross inefficiency of such government ‘investment’ is glaring. The balance of trade benefits are even more dubious.
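    The cost-per-job arithmetic is easy to reproduce; this snippet simply performs the divisions using the article’s figures (the rounding in the comments is mine):

```python
subsidy = 4_000_000_000   # New Delhi's outlay in USD, excluding the tax/duty bundle
direct_jobs = 22_000      # estimated direct employment
total_jobs = 122_000      # direct plus estimated indirect employment

# Cost to the taxpayer per job created
print(f"per direct job: ${subsidy / direct_jobs:,.0f}")  # ≈ $181,818
print(f"per total job : ${subsidy / total_jobs:,.0f}")   # ≈ $32,787
```

    Against a per capita income of $4,300, either figure makes the point about the inefficiency of the subsidy.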

    Let’s assume that both fabs were fully functional today and produced a complete range of semiconductor devices for 3C (communications, computing and consumer) electronics. This would, by necessity, have to include analog, RF, mixed signal, optical and TTL components, microprocessors, DRAM, SRAM, Flash, programmable logic, SoCs, discretes, MCUs and so on. The combined 80,000 wafers/month capacity of the two fabs amounts to roughly 1.5% of worldwide market share in 200mm wafer equivalents. In 2013, the semiconductor market was $315B on a global basis. Thus, one could theoretically produce $4.8B of semiconductor value to offset the 2013 electronics import tab, reducing it to roughly $26B through regulations requiring 100% local content.
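    The import-offset arithmetic can be checked the same way. The 1.5% share is the article’s rough estimate, so this sketch lands just under the quoted $4.8B; the exact figure depends on how the share is rounded:

```python
world_share = 0.015     # ≈ share of worldwide capacity, 200mm wafer equivalents
market_2013 = 315e9     # global semiconductor market in 2013, USD
imports_2013 = 31e9     # India's electronics imports, 2013
imports_2020 = 400e9    # projected electronics imports, 2020

# Theoretical local semiconductor output, and its effect on the import bill
local_value = world_share * market_2013
print(f"local output ≈ ${local_value / 1e9:.1f}B")                           # ≈ $4.7B
print(f"2013 imports after offset ≈ ${(imports_2013 - local_value)/1e9:.1f}B")
print(f"2020 imports after offset ≈ ${(imports_2020 - local_value)/1e9:.0f}B")
```

    However generously the share is rounded, the 2020 number barely moves, which is the heart of the argument that follows.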

    However, in 2020 the effect is more or less inconsequential, as fab capacity limitations would reduce a $400B deficit merely to $395B. India’s fab capacity would have to expand by an order of magnitude to keep pace, with a consequent explosive growth in cost.

    These considerations just scratch the surface of what is becoming increasingly identified as a historic debacle in India’s central government industrial development planning. There are many other factors involved in assessing the short and long term utility and efficacy of this initiative, including the role of the central government, tax policy, technology considerations, social issues, environmental aspects and broader implications to the national economy.

    These issues and more are explored in greater depth at the Vigil Futuri blog and the Pune Chips website.



    Sonics and Qualcomm Make a Deal

    by Paul McLellan on 07-06-2014 at 9:00 am

    Some background: Sonics has been in the network-on-chip (NoC) business for a long time, nearly 18 years. When Arteris launched their products, Sonics figured Arteris was infringing Sonics’ patents and in 2011 brought a complaint against them. Details are here. Arteris looked at a couple of their own patents (if you are really that interested in the details, the patents are: 7,574,629, “Method and device for switching between agents”; and 7,769,027, “Method and device for managing priority during the transmission of a message”) and decided that Sonics was infringing them. But Arteris’ claims have been dismissed.

    Qualcomm then bought most of Arteris: the engineering team, all the source code, the patents and so on. The old Arteris retains limited rights to use the patents and the source code, and to build a new engineering team. But Qualcomm now owns the two patents above.

    That caused a problem for Sonics’ customers, especially those that compete directly with Qualcomm. What if Qualcomm came after them for infringing the patents? In some ways, with the US legal system, it doesn’t matter much whether you win or lose, since the cost, time and distraction of a lawsuit mean everyone loses.

    So yesterday Sonics announced that it had signed a patent non-assert agreement with Qualcomm covering these two patents. All remaining claims in the case involving these patents have also been dismissed, pursuant to a joint motion by Qualcomm and Sonics. The agreement is actually effective from March 24th.


    Grant Pierce, the CEO of Sonics, emphasized that this is not the end of litigation. “As we have said before, Sonics will continue to seek protection for its broad patent portfolio, including patents that were invented and granted in the U.S., as outlined in Sonics’ complaint, Complaint for Patent Infringement (Sonics v. Arteris), of November 1, 2011. We are committed to seeing this time consuming process through to its completion.”

    Talking of Grant Pierce, he was just elected to the EDAC board last month, when elections were held during #DAC51. He is the only new board member; the others continue from before.

    There are a lot of moving parts in this story between Sonics, Arteris, Qualcomm and their respective customers and competitors. And of course the original Sonics complaint against Arteris is still to be decided. The Sonics press release is here.