
Author Interview: Bernard Murphy on his latest book

by Daniel Nenni on 01-01-2020 at 10:00 am


Over the last 40 years, Bernard has worked with semiconductor and EDA companies in hands-on, management and consulting roles in engineering, sales and marketing. He most recently co-founded Atrenta where he created and led the development of SpyGlass, retiring as CTO when Atrenta was acquired by Synopsys. Post-retirement, he’s been an active blogger for SemiWiki. He’s also written a couple of books under the SemiWiki label and he independently advises a number of clients on marketing content.

Why did you decide to write The Tell-Tale Entrepreneur?

Over the last 40 years, I’ve created, suffered and edited more than my fair share of biz and tech communication. Which has reinforced my view, widely shared, that our communication is pretty bad. Pitches, blogs and white papers aiming to convince are at best unconvincing, at worst painful. What’s curious is that, from engineers to CEOs, we think we’re good at PowerPoint. Yet we’re terrified at the thought of writing. PowerPoint feels like a familiar template, communicate-by-numbers. Word has no template; we have to start from a blank sheet, hence the terror. Which suggests the format is beside the point; we suck at communication either way. PowerPoint just lulls us into thinking we don’t.

I started out just as bad, but I worked hard to improve. Through a lot of trial and error I believe I figured out the problem. I want to pass this on, not through a boring how-to book but through an entertaining set of short stories. Designed to show you how to make your communication just as engaging.

How is storytelling different and why is it better than other ways to communicate?

Take PowerPoint as a reference. It’s well established and has great value in structured contexts where efficient information transfer is the goal. Status updates, project planning, training, technical due diligence. But it’s weak in persuasion. Where you need to convince a client, prospect, investor that you have the best product for their needs. Or you are their best possible partner. Or you failed to deliver what you promised and now must rescue the relationship.

These are times when slides are the wrong answer. You have to connect emotionally with your audience – building excitement around a new direction, dealing with fear of possible failure, maybe pointing out pitfalls that you already understand well. Or convincing them there will be no more mistakes. Eyeball to eyeball conversations.

The best way to guide that conversation is through a story. You’re looking at them (not slides), they’re looking at you, and you’re telling them a story, appealing to their emotions.

Storytelling isn’t a new idea. What makes your approach different?

Storytelling is a very old idea. Today we relegate stories to entertainment, expecting that business communication needs a more professional approach. Our brains don’t agree. We’ve been telling stories since the beginning of time. Not just to entertain but also to pass on wisdom, culture, beliefs, laws. Our brains are wired to receive stories efficiently, motivating us to action. Not so for data and logic dumps. That’s why we find PowerPoints so boring and start scrolling through texts, emails, anything rather than listen to the speaker.

Instead tell a story. Stories are naturally engaging, especially when they sound roughly relevant to the audience’s goals. We want to know what’s going to happen next. Calling the hero to adventure. Facing tests together, proving ourselves a worthy mentor. The big challenge where it could all go wrong, but somehow our hero makes it through, now stronger, more capable. And the final challenge. Much more interesting than texts and emails. They learn what you can do for them along the way, in a context they recognize.

Storytelling is big in marketing now. Tons of advice online on how to do it – blogs, whitepapers, companies who want to advise you (for a fee). But there’s something a bit odd about this advice. It all seems to come in the same business communication standard format: explanations, bullet lists and charts. You want to learn how to tell stories. Wouldn’t it be better to do that by reading stories?

That’s my innovation – I explain how to tell stories by telling you stories.

Who do you think will find value in this book?

Anyone in tech who must communicate with customers, prospects, investors. Who wants to reach markets through blogs and white papers. Those who aspire to sell their company. All will relate to experiences in these stories. And, I hope, will be inspired to re-imagine and improve their own stories, based on these examples.

The book is also written for a general audience. Anyone interested in real stories drawn from different phases in the lives of tech ventures. Technology plays a role in these stories but isn’t primary, so I’ve simplified quite a lot. People, opportunities, challenges, growth are the most important elements.

These stories are for everyone, but especially for us communicators in tech. Most important, I hope you will begin to understand why our audiences are thirsting for stories, not more death-by-PowerPoint.

Where can we find the book?

The Tell-Tale Entrepreneur is available for pre-order on Amazon and will be released on January 26th, 2021.


ANSYS, TSMC Document Thermal Reliability Guidelines

by Bernard Murphy on 01-01-2020 at 6:00 am


Advanced IC technologies, 5nm and 7nm FinFET design and stacked packaging, are enabling massive levels of integration of super-fast circuits. These in turn enable much of the exciting new technology we hear so much about: mobile gaming and ultra-high definition mobile video through enhanced mobile broadband in 5G, which requires support for millimeter wave frequencies; high-speed networking in hyperscale datacenters through 100G connectivity; blazing fast AI accelerators in those same datacenters; and fusion of multiple sensor sources to build environment-aware intelligence for automotive safety and autonomy, building security, autonomous drones and many more capabilities.

With new technologies we always find new challenges. ANSYS and others have been hearing from chip and system builders supporting these domains that they are seeing increasing post-silicon failures in the devices they are building. These devices are nominally perfectly fine, passing standard testing, but fail in system operation, primarily related to voltage, timing and process variations. Tianhao Zhang (Dir. Foundry Relations at ANSYS) says that between what they are hearing from customers and industry reviews, 75% of these product failures can be attributed to thermal or vibration effects.

Thermal effects also increase cost through the need for more advanced cooling, reduce performance through increased interconnect resistance and degraded transistor performance, and increase noise, leading to random failures. They also decrease reliability, on chip through electromigration and device aging, and in the package and system through mechanical stress due to warping.

This is not a problem that can be dealt with later. One chip design VP has said that self-heating (related to FinFETs) and thermal analysis are now absolute requirements for automotive and high-performance computing applications. Another noted that compared to planar designs they are now seeing temperature increases in metal of 10 to 20 degrees, and that is making design for reliability much more challenging.
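The impact of those 10-20 degree metal temperature increases on reliability can be sketched with Black's equation for electromigration lifetime, MTTF = A · J⁻ⁿ · exp(Ea/kT). The activation energy below (0.9 eV) is an illustrative textbook-style value, not a foundry-quoted number:

```python
import math

# Black's equation for electromigration mean time to failure (MTTF):
#   MTTF = A * J**(-n) * exp(Ea / (k * T))
# The prefactor A, current density J and exponent n are held constant,
# so only the temperature term matters for the ratio computed here.
# Ea = 0.9 eV is an assumed, illustrative activation energy.

K_BOLTZMANN = 8.617e-5  # Boltzmann constant, eV/K
EA = 0.9                # assumed activation energy, eV

def mttf_ratio(t_ref_c: float, t_hot_c: float) -> float:
    """Relative electromigration MTTF at t_hot_c versus t_ref_c (Celsius)."""
    t_ref = t_ref_c + 273.15
    t_hot = t_hot_c + 273.15
    return math.exp(EA / K_BOLTZMANN * (1.0 / t_hot - 1.0 / t_ref))

# A 15 C rise in metal temperature (mid-range of the 10-20 degrees
# quoted above) cuts the computed electromigration lifetime to roughly
# a third, which is why these increases make reliability sign-off so
# much harder.
print(f"MTTF at 120C relative to 105C: {mttf_ratio(105, 120):.2f}")
```

The exponential temperature dependence is the point: modest local heating produces outsized lifetime degradation, so averaged thermal numbers can badly understate the risk.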

TSMC has been hearing all the same issues and has been increasing the number of checks they require, particularly thermal checks, to offset these types of problems. TSMC has worked closely with ANSYS to prove and document a thermal solution they jointly support. This includes an ANSYS reference flow for transistor, chip and package/3D-IC levels, from 20nm down to 5nm. These can be downloaded from the TSMC portal.

They are also working together on solution guides for specific application flows. For example, ANSYS now provides solution guides for automotive development on 16nm and 7nm. These cover electromigration, thermal and ESD topics. In the thermal analysis section, the document details multiple areas including the flow, and also provides test cases and case studies.

The ANSYS analysis is not based on a simple averaging of thermal effects. They analyze all the way down to the physical implementation of transistors and interconnect systems under representative activity scenarios, to estimate local heating, interconnect heating and heat dissipation. They do this using analytics from RedHawk, together with finite-element analysis applied at the die, stacked die, package and board level. And they’re computing temperature profile by looking at (thermal) conduction, radiation and convection flows, the last of these through detailed fluidics analysis. This is a true bottom-up multi-physics solution. You can learn more in THIS WEBINAR, presented by Tianhao and Karthik Srinivasan (Sr Prod Mgr at ANSYS).


Ten Trends of Blockchain in 2020

by Ahmed Banafa on 12-31-2019 at 6:00 am


It’s clear that blockchain will revolutionize operations and processes in many industries and government agencies if adopted, but its adoption requires time and effort. In addition, blockchain technology will stimulate people to acquire new skills, and traditional businesses will have to completely reconsider their processes to harvest the maximum benefits from using this promising technology. [2]

The following 10 trends will dominate blockchain technology in 2020:

1. Blockchain as a Service (BaaS) By Big Tech Companies
One of the promising blockchain trends in 2020 is BaaS, short for Blockchain as a Service. It is a new blockchain trend that is currently being adopted by a number of startups as well as enterprises. BaaS is a cloud-based service that enables users to develop their own digital products by working with blockchain. These digital products may be smart contracts, decentralized applications (Dapps), or other services that work without any setup requirements of a complete blockchain-based infrastructure.

Among the companies developing blockchains and providing a BaaS service are Microsoft and Amazon, consequently shaping the future of blockchain applications. [1]

2. Federated Blockchain Moves to The Center Stage
Blockchain networks can be classified as Private, Public, Federated or Hybrid. Federated Blockchain is one of the most promising recent trends in the industry. It is an upgraded form of the basic blockchain model, which makes it more ideal for many specific use cases.

In this type of blockchain, instead of one organization, multiple authorities control pre-selected nodes of the blockchain. This selected group of nodes then validates each block so that transactions can be processed further. In 2020, there will be a rise in the usage of federated blockchains, as they give private blockchain networks a more customizable outlook. [1]

3. Stablecoins Will Be More Visible
Bitcoin, the best-known cryptocurrency, is highly volatile in nature. Stablecoins entered the picture to avoid that volatility, with a stable value associated with each coin. As of now, stablecoins are in their initial phase, and it is predicted that 2020 will be the year when blockchain stablecoins achieve their all-time high. [1]

One driving force for using stablecoins is the planned introduction of Facebook’s cryptocurrency “Libra” in 2020, even with all the challenges facing this new cryptocurrency and the shrinking circle of partners in libra.org. [4]

4. Social Networking Problems Meet Blockchain Solution
There were around 2.77 billion social media users around the globe in 2019.

The introduction of blockchain in social media will be able to solve the problems related to notorious scandals, privacy violations, data control, and content relevance. Therefore, the blockchain blend in the social media domain is another emerging technology trend in 2020.

With the implementation of blockchain, it can be ensured that all the social media published data remain untraceable and cannot be duplicated, even after its deletion. Moreover, users will get to store data more securely and maintain their ownership. Blockchain also ensures that the power of content relevance lies in the hands of those who created it, instead of the platform owners. This makes users feel more secure, as they can control what they want to see. One daunting task is to convince social media platforms to implement it; this could happen on a voluntary basis or as a result of privacy laws similar to GDPR. [1]

5. Interoperability and Blockchain Networks
Blockchain interoperability is the ability to share data and other information across multiple blockchain systems and networks. This function makes it simple for the public to see and access data across different blockchain networks. For example, you could send data from an Ethereum blockchain to another blockchain network. Interoperability is a challenge, but the benefits are vast. [5]

6. Economy and Finance Will Lead Blockchain Applications
Unlike other traditional businesses, the banking and finance industries don’t need to introduce radical transformation to their processes to adopt blockchain technology. After blockchain was successfully applied to cryptocurrency, financial institutions began seriously considering blockchain adoption for traditional banking operations.

According to a PwC report, 77 percent of financial institutions are expected to adopt blockchain technology as part of an in-production system or process by 2020.

Blockchain technology will allow banks to reduce excessive bureaucracy, conduct faster transactions at lower cost, and improve confidentiality. One of the blockchain predictions made by Gartner is that the banking industry will derive 1 billion dollars of business value from the use of blockchain-based cryptocurrencies by 2020.

Moreover, blockchain can be used for launching new cryptocurrencies that will be regulated or influenced by monetary policy. In this way, banks want to reduce the competitive advantage of standalone cryptocurrencies and achieve greater control over their monetary policy. [2]

7. Blockchain Integration into Government Agencies
The idea of the distributed ledger is also very attractive to government authorities that have to administrate very large quantities of data. Currently, each agency has its separate database, so they have to constantly require information about residents from each other. However, the implementation of blockchain technologies for effective data management will improve the functioning of such agencies.

According to Gartner, by 2022, more than a billion people will have some data about them stored on a blockchain, but they may not be aware of it. National cryptocurrencies will also appear; it’s inevitable that governments will have to recognize the benefits of blockchain-derived currencies. Digital money is the future, and nothing will stop it. [3]

8. Blockchain Combines with IoT
The IoT tech market will see a renewed focus on security as complex safety challenges crop up. These complexities stem from the diverse and distributed nature of the technology. The number of Internet-connected devices has breached the 26 billion mark. Device and IoT network hacking will become commonplace in 2020. It is up to network operators to stop intruders from doing their business.

The current centralized architecture of IoT is one of the main reasons for the vulnerability of IoT networks. With billions of devices connected and more to be added, IoT is a big target for cyber-attacks, which makes security extremely important.

Blockchain offers new hope for IoT security for several reasons. First, blockchain is public: everyone participating in the network of nodes can see the blocks and the stored transactions and approve them, although users can still have private keys to control transactions. Second, blockchain is decentralized, so there is no single authority that can approve transactions, eliminating the Single Point of Failure (SPOF) weakness. Third, and most importantly, it’s secure: the database can only be extended and previous records cannot be changed. [7]
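That append-only, tamper-evident property comes from each block embedding a hash of its predecessor. A minimal sketch (illustrative only: no consensus, signatures, or proof-of-work, so not a real blockchain) makes the mechanism concrete:

```python
import hashlib
import json

# Minimal hash-chained ledger. Each block records the SHA-256 hash of
# the previous block, so records can be appended but editing an earlier
# record invalidates every link that follows it.

def block_hash(block: dict) -> str:
    """Deterministic hash of a block's contents."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def append_block(chain: list, data: str) -> None:
    """Append a new block linked to the current tail of the chain."""
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"index": len(chain), "data": data, "prev_hash": prev})

def is_valid(chain: list) -> bool:
    """Check every block's stored prev_hash against its predecessor."""
    return all(
        chain[i]["prev_hash"] == block_hash(chain[i - 1])
        for i in range(1, len(chain))
    )

chain: list = []
for tx in ["alice->bob: 5", "bob->carol: 2", "carol->dave: 1"]:
    append_block(chain, tx)

print(is_valid(chain))                    # True: untouched chain verifies
chain[1]["data"] = "bob->mallory: 999"    # tamper with an old record
print(is_valid(chain))                    # False: later links now broken
```

In a deployed blockchain the chain is replicated across many nodes and extended by consensus, which is what turns this local tamper-evidence into the distributed security property described above.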

Many IoT-based companies are adopting blockchain technology for their business solutions. The International Data Corporation (IDC) expects that 20 percent of IoT deployments will enable blockchain services by 2020. [3]

9. Blockchain with AI 
Integrating AI (Artificial Intelligence) with blockchain technology will make for better development. This integration will bring a level of improvement to blockchain technology, with an adequate number of applications.

The International Data Corporation (IDC) suggests that global spending on AI will reach $57.6 billion by 2020 and 51% of businesses will be making the transition to AI with blockchain integration.

Additionally, blockchain can also make AI more coherent and understandable, and we can trace and determine why decisions are made in machine learning. Blockchain and its ledger can record all data and variables that go through a decision made under machine learning.

Moreover, AI can boost blockchain efficiency far better than humans, or even standard computing, can. A look at the way blockchains are currently run on standard computers proves this, with a lot of processing power needed to perform even basic tasks.

Examples of applications of AI in Blockchain: Smart Computing Power, Creating Diverse Data Sets, Data Protection, Data Monetization, Trusting AI Decision Making. [6]

10. Demand for Blockchain Experts 
Blockchain is a new technology, and only a small percentage of individuals are skilled in it. As blockchain becomes a fast-growing and wide-spreading technology, many will have the opportunity to develop blockchain skills and experience.

Even though the number of experts in blockchain fields is increasing, the implementation of this technology is growing so rapidly that it will create strong demand for blockchain experts by 2020. [3]

It’s worth saying that there are genuine efforts by universities and colleges to catch up with this need, but the rate of graduating students with enough skills to deal with blockchain technology is not enough to fill the gap. Also, companies are taking steps to build on their existing talent by adding training programs for developing and managing blockchain networks.

Ahmed Banafa, author of the books:

Secure and Smart Internet of Things (IoT) Using Blockchain and AI

Blockchain Technology and Applications

Read more articles at: https://medium.com/@banafa

References

[1] https://www.mobileappdaily.com/top-emerging-blockchain-trends

[2] https://www.aithority.com/guest-authors/blockchain-technology-in-the-future-7-predictions-for-2020/

[3] https://www.bitdeal.net/blockchain-technology-in-2020

[4] https://medium.com/altcoin-magazine/to-libra-or-not-to-libra-e2d5ddb5455b

[5] https://blockgeeks.com/guides/cosmos-blockchain-2/

[6] https://medium.com/altcoin-magazine/blockchain-and-ai-a-perfect-match-e9e9b7317455

[7] https://medium.com/@banafa/ten-trends-of-iot-in-2020-b2


TSMC, Huawei, the US Government, and China

by Daniel Nenni on 12-30-2019 at 6:00 am


The media is trying to disparage the semiconductor industry again. It’s hard not to take this type of desperate journalism personally. Semiconductor people are the smartest and hardest working people in the world and we deserve better, absolutely.


TSMC founder sees trade dispute as ‘reality show with no script’ July 2018

The latest media scam is that the US Government is pressuring TSMC to stop wafer shipments to Huawei (HiSilicon). The Financial Times started it with “US urges Taiwan to curb chip exports to China” and the cut/paste media sites jumped all over it and “made it their own”.

TSMC responded with:

“We did not have any discussion with either the Taiwan or the U.S. governments regarding shipping wafers to HiSilicon, nor have we received any instruction from either government not to make the shipments,” TSMC spokesperson Elizabeth Sun told Caixin in an email, adding that it will continue shipments while complying with trade regulations.

Remember, TSMC has two fabs in China and plenty of room for expansion. The US accounts for 61% of TSMC’s revenue and China is a growing 17%. Taiwan is 8%, Japan 6% and others are 1%. The question is: What would happen if TSMC cut wafer shipments to the US or China? Answer: The end of modern life as we now know it.

Another ignorant quote:

“Last month, a U.S. official informed Taiwanese diplomats that the semiconductors produced by TSMC and then procured by Huawei, were ending up in Chinese missile guidance systems aimed at Taiwan, as per the reporting by Financial Times.”

I can assure you TSMC knows more about what their customers are doing than politicians in any country including Taiwan. There are very few secrets inside the fabless semiconductor ecosystem and TSMC knows more than most. And does it really matter who made what, when, and where in the case of war? It doesn’t matter because there is nothing you can do about it. That ship sailed a long time ago.

Bottom line: TSMC is the new Switzerland and has the full support of the US, Taiwan, and China Governments.

Another interesting headline:

“Samsung is pouring $116 billion towards beating TSMC in the race to 5nm and beyond”

First and foremost, TSMC has already won the race to 5nm and EUV if the finish line is high volume manufacturing versus press releases or “leaked” road maps.

In order for Apple to ship millions of iProducts in Q4 2020, the 5nm EUV process must be frozen by the end of 2019, with production starting in Q1 2020. In fact, TSMC recently outlined their 5nm process at IEDM.

I remember when SMIC launched in 2000 and suggested that they would compete with TSMC. It was believable to me because the China Government was strongly behind them and the China consumer market was theirs for the taking. Unfortunately, competing with TSMC proved too hard for SMIC who then resorted to stealing trade secrets. The resulting litigation cost SMIC hundreds of millions of dollars and 10% of their stock.

To say that SMIC is a trailing edge foundry is quite generous. SMIC has just now released a 14nm process four years after TSMC who is now at 5nm with full EUV. SMIC doesn’t even have an EUV machine yet and they may not get one if the current political turmoil is not properly addressed.

According to reports, the SMIC 14nm was co-developed with Qualcomm who also worked with TSMC and Samsung on 14/16nm processes. I’m sure the TSMC and Samsung legal staff already have SMIC 14nm die under review.

GlobalFoundries also had their sights set on competing with TSMC but that never really happened, not even close.

Samsung officially became a pure-play foundry in 2017 when they reorganized all of their logic fabs under Samsung Foundry. Samsung Electronics is Samsung Foundry’s biggest customer of course but they do have a long history of external foundry business. Apple was the big start with the introduction of the iProducts and other big fabless companies (Qualcomm) have followed.

Samsung certainly is a leader in connectivity and IoT now that all Samsung appliances, TVs, and other electronic gadgets have WiFi so they can talk to you throughout the day. You should see the Samsung booth at CES. It’s more of a connected city than a trade show booth but I digress.

Bottom line: While Samsung’s “pouring $116 billion towards beating TSMC” is impressive, you have to understand that the TSMC ecosystem of partners and customers has poured trillions of dollars into keeping TSMC ahead of all foundry comers, right?


Learning to Love Lyft Again

by Roger C. Lanctot on 12-29-2019 at 6:00 am

When I landed at San Francisco International Airport last Tuesday morning around 1 a.m. I was determined to locate the airport taxi rank and take a cab to my hotel in Santa Clara. The idea of hailing an Uber or Lyft seemed essentially nonsensical to me since I knew professional taxi drivers would be waiting as they usually are at most airports. Why call for a ride when drivers are waiting?

As convenient and compelling as the ride hailing business model is, the airport ride hailing pickup is the single use case that makes no sense to me. As I approached the taxi stand in the wee hours last week, the last of three cabs pulled away from the curb leaving no taxis for me in my mildly frazzled state.

The taxi “concierge” manning the taxi rank post (a pointless job if there ever was one) suddenly appeared to me to be a valuable source of information. “Excuse me,” I said. “What is the fare to Santa Clara?”

Consulting a spreadsheet of rates posted on a pillar nearby he told me: “$147.”

That monumental sum dictated a change in plans. Like LAX and many other airports, SFO has shifted the ride hailing app pickup area to a nearby parking garage – something like level 4 if I was at Terminal 3 – of which I was unsure.

Using the Lyft app and making my way to pickup spot F6 I was soon on my way to Santa Clara in a Lyft-designated vehicle as the clock approached 1:30 a.m. More importantly, the Lyft ride was quoted, in advance, as $47 — one third the price of the posted taxi fare.

There were several decisions reflected here, including the decision to not rent a car. At that hour of night and with plans for only a 1-2-day visit to Silicon Valley, renting a car seemed unnecessary.

My decision was validated immediately as the combination of construction on Highway 101 and crash-related closures of both North- and South-bound traffic lanes taxed my Lyft driver’s creativity in routing me around the late-night incidents. By 2 a.m. I was checking in at my hotel and happily noting that I would not need to pay for overnight parking for a car I did not rent.

My visit to Silicon Valley ultimately extended for three more days and around eight more Lyft rides – all super convenient, relatively inexpensive, and with pleasant drivers who, like that first Lyft driver, skillfully dodged traffic jams by cleverly resorting to surface streets.

The moral of the story: It’s true that airport taxi ranks trump ride hailing apps for convenience – at least for immediate availability if not for app-based payment. But exorbitant fares are a severe buzzkill. On an ominous note, upon my return home, the news arrived of the shuttering of the SuperShuttle franchise that once dominated shared rides to and from airports. With SuperShuttle gone, the reliable airport taxi rank will be the next transportation option to disappear, if fares fail to meet the fairness test. I can thank SFO taxis for restoring my support for Lyft.


Computing with Light

by Daniel Nenni on 12-27-2019 at 6:00 am


I recently wrote about this year’s Cadence Photonics Summit. As I mentioned in that post, it was a fascinating event with several companies providing useful and informative presentations. You can access some of the presentations on the event site. One presentation, given by Jose Capmany of iPronics, was especially interesting to me, so I will dive into it a bit.

Evolution of programmable photonics

The current commercial efforts to utilize photonics have typically focused (pun intended) on data transmission. Most of these efforts utilize the same pulse-amplitude modulation with four levels (PAM4) technology that is used in high-speed (e.g., SerDes) copper data transmission. But the field is growing much faster than this as more optical circuits become available. These functions—filters, delay lines, RF phase shifters, switches, MUXs, beamformers, arbitrary waveform generators, and optoelectronic oscillators—are enabling a new class of photonics, RF/mm photonics. One interesting circuit discussed in the presentation was the European Research Council’s ERC ADG 2016 UMWP Chip Project. As stated in the presentation, “The main objective of UMWP CHIP is the design, implementation and validation of a universal integrated microwave photonics programmable signal processor capable of performing the most important MWP functionalities featuring unique broadband and performance metrics.” This is a serious piece of engineering work.
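For readers unfamiliar with PAM4: it packs two bits into each transmitted symbol using four amplitude levels, doubling throughput per unit bandwidth versus two-level NRZ. A small sketch, using the conventional Gray-coded level mapping (shown for illustration, not any particular standard's exact levels):

```python
# PAM4 encodes 2 bits per symbol with 4 amplitude levels. Gray coding
# is the usual choice: adjacent amplitude levels differ by only one
# bit, so the most likely receiver error (mistaking a level for its
# neighbor) corrupts a single bit.

GRAY_MAP = {(0, 0): -3, (0, 1): -1, (1, 1): +1, (1, 0): +3}

def pam4_encode(bits):
    """Map a flat bit sequence (even length) to PAM4 amplitude levels."""
    assert len(bits) % 2 == 0, "PAM4 consumes bits two at a time"
    return [GRAY_MAP[(bits[i], bits[i + 1])] for i in range(0, len(bits), 2)]

# Eight bits become four symbols: the same wire time carries twice
# the data of NRZ, at the cost of a tighter signal-to-noise budget.
print(pam4_encode([0, 0, 0, 1, 1, 1, 1, 0]))  # [-3, -1, 1, 3]
```

That tighter amplitude spacing is why PAM4 links lean so heavily on equalization and error correction, in copper and optics alike.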

Jose’s presentation took a bit of a diversion here, and at first, I did not understand where he was headed. He reviewed the history and evolution of the field-programmable gate array (FPGA). But then he went on to describe a new form of gate array, the field-programmable photonics gate array (FPPGA). The fabric in an FPPGA is not populated with look-up tables (LUTs). Instead, the fabric consists of reversible 2×2 unitary gates. These gates work with analog signals and the 2×2 unitary matrix algebra U(2). Reversible gates are built by transforming the Pauli matrices, which are as well known in quantum information as in quantum computing (QC)!
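To make the 2×2 unitary gate concrete, here is a sketch modeling one cell as a tunable coupler followed by a phase shifter, a common Mach-Zehnder-style parameterization. This is an illustrative model of the concept, not iPronics' actual cell design:

```python
import numpy as np

# One FPPGA-style cell: a tunable 2x2 coupler (angle theta) followed by
# a phase shifter (angle phi) on the upper arm. Both factors are unitary
# matrices, so their product is too.

def coupler(theta: float) -> np.ndarray:
    """Directional coupler: splits light between the two waveguides."""
    return np.array([[np.cos(theta), -1j * np.sin(theta)],
                     [-1j * np.sin(theta), np.cos(theta)]])

def phase(phi: float) -> np.ndarray:
    """Phase shifter applied to the upper waveguide only."""
    return np.diag([np.exp(1j * phi), 1.0])

def fppga_cell(theta: float, phi: float) -> np.ndarray:
    """Transfer matrix of one programmable cell."""
    return phase(phi) @ coupler(theta)

U = fppga_cell(np.pi / 4, np.pi / 3)

# Unitarity (U @ U^dagger = I) is what makes the gate reversible and
# lossless: total optical power out equals total optical power in,
# however the two programmable angles are set.
print(np.allclose(U @ U.conj().T, np.eye(2)))  # True
```

Cascading a mesh of such cells, each with its own (theta, phi) setting, is what lets the fabric be "programmed" into different analog signal-processing functions.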

FPPGA Basics

Some call QC the study of a non-classical model of computation. That, to me, is an over-simplification. It deals with functions that transform states rather than using binary math and traditional logical operators.

If you attended SEMI this summer, you had a chance to see the IBM Quantum Computer that was on display there. It looks like an exotic piece of hardware. But how do you program it? It has its challenges, as Bernard Murphy pointed out in a blog early this year, Contrarian Views on Quantum Computing. But QC holds great promise in cryptography, simulation, simulated annealing, solving linear equations, and more. So QC is not going away. It will just have to chip away at digital logic design, algorithm by algorithm, and that will take a long time before it becomes more widely adopted. However, it is very likely to capture some niche markets very quickly with its higher efficiencies for certain problems.

Cadence, in its collaboration with Lumerical, has delivered the photonics design tools to support the implementation of photonics designs (PICs) now. No need to wait for that. I hope to see the day when Cadence also produces a QC simulator, or maybe Lumerical — we will just have to see.


Cryptocurrency Exchange Hacks are on the Rise

by Matthew Rosenquist on 12-26-2019 at 10:00 am

Seven major cryptocurrency exchanges were victimized in 2019, totaling over $160 million in financial theft. As predicted, cybercriminal hackers targeted crypto exchanges in 2019 and the trend will continue into 2020.

Crypto exchanges are relatively new compared to those in the traditional financial markets. The sector is a hotbed of competition, which drives innovation and attracts criminals. Over 400 cryptocurrency exchanges exist and all are vying for a piece of the growing $200+ billion market. New features and updates are constantly modifying the software and technology infrastructure. Over six thousand unique digital coin and token assets exist and the scope of management complexity continues to grow for these online markets. With constant change, vulnerabilities are inadvertently introduced.

Many of the exchanges have not matured, from a cybersecurity perspective, to properly validate, maintain, and defend their online services. Most of the sites focus on maintaining services and growing the user base, with little attention to security. The race to establish themselves and be competitive has blinded them from investing in the necessary cybersecurity controls. In comparison, the brick-and-mortar banking sector is well versed in the risks of cyber-attacks. With decades of experience, they spend considerably more than other industries on security.

Wherever there is value, the risk of theft exists. Digital tokens and coins are different than dollars and government-issued currencies, but they have value and can be transformed into just about any desirable form of money on the planet, which makes them a desirable target.

Additionally, the risks of being caught are small. Crypto assets can be easily stored, hidden, transferred, and laundered. Law enforcement’s effectiveness is less than optimal and not a significant deterrent. Their tools lack refinement, international cooperation is weak, and cybercrime laws are poorly defined. Investigation and recovery of crypto assets are problematic at best, which increases the lure to attackers. Improvements and new capabilities for pursuing criminals in the digital landscape are being made, but progress is slow.

The combination of significant wealth, online accessibility, numerous vulnerabilities, and a plausible exit strategy for stolen assets makes for attractive targets. The result is that cybercriminals are beginning to explore and invest in targeting cryptocurrency exchanges, where vast amounts are consolidated in one place. The results have been staggering, with some hacks netting over $40 million to the digital thieves.

  1. Upbit          $49M   November 26th
  2. Bitpoint       $32M   July 12th
  3. Bitrue         $4M    June 27th
  4. Binance        $40M   May 7th
  5. Bithumb        $13M   March 30th
  6. DragonEx       $7M    March 24th
  7. Cryptopia      $16M   January 15th
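The listed figures are easy to sanity-check; a quick sum of the amounts above (in $ millions) confirms the "over $160 million" total:

```python
# Sum the seven exchange hacks listed above (amounts in $ millions)
hacks = {
    "Upbit": 49, "Bitpoint": 32, "Bitrue": 4, "Binance": 40,
    "DragonEx": 7, "Bithumb": 13, "Cryptopia": 16,
}
total = sum(hacks.values())
print(total)  # 161, i.e. just over $160M across seven incidents
```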

The successful heists embolden and encourage more attempts to target this industry. Until cybersecurity measures rise to match the threats, attacks will continue to increase and a wider range of targets will fall victim. It is a self-reinforcing cycle.

I predict that 2020 will see even greater numbers of attacks on, and losses from, cryptocurrency exchanges, product vendors, service providers, and asset holders. Cybercriminals will find new ways to exploit, defraud, and steal from the cryptocurrency ecosystem at a scale never seen before. This trend is here to stay for the foreseeable future.


AAA: Killer Automotive Safety Systems

AAA: Killer Automotive Safety Systems
by Roger C. Lanctot on 12-26-2019 at 6:00 am

AAA is out with a new study, conducted on its behalf by the Virginia Tech Transportation Institute, that purports to show, among other things, that advanced automotive safety systems may lull drivers into a false sense of security leading to distracted driving or worse. The takeaway from this impressively elaborate study is that car makers should take greater care and responsibility in deploying these systems and training dealers and drivers.

Understanding the Impact of Technology: Do Advanced Driver Assistance and Semi-Automated Vehicle Systems Lead to Improper Driving Behavior? – VTTI/AAA

AAA has increasingly built its auto safety brand on widening the awareness and mitigating the impact of distracted driving – estimated by the USDOT to take upwards of 3,000 lives on U.S. highways annually. For years, AAA led the charge against smartphone use while driving, going so far as to assert that even hands-free smartphone use was hazardous and should be sanctioned – based on studies of the cognitive load on drivers.

If you are starting to get the impression that the AAA is the last organization you want in the backseat of your car on your next long trip, then you are in tune with my sentiments. At the very moment that the automotive industry is being transformed by active safety systems designed to avoid collisions, keep drivers in their lanes, or alert drivers to objects in their blind spots – AAA is sounding the alarm that drivers may be becoming over-reliant on these systems and taking their eyes off the road.

This AAA position is perfectly aligned with that of the Insurance Institute for Highway Safety, which for years claimed that blindspot detection systems, lane keeping assistants, and automatic emergency braking solutions were failing to reduce claims rates because consumers were turning them off. So you are damned if you do (turn them on), according to AAA, and damned if you don't, according to IIHS.

More importantly, these two great advocates of driving safety were speaking out against the proliferation of safety systems rather than embracing them and describing how they might be enhanced.

Car companies such as Subaru, Nissan, Ford, Volkswagen, BMW, Volvo, and Hyundai that have taken the lead in deploying safety systems across their vehicle lineups and leveraging safety in their branding ought to be recognized, praised, and rewarded for having done so. The reality is that these active safety systems are being deployed by auto makers in the absence of regulatory mandates and in recognition of the fact that consumers highly value safety in their vehicles and are willing to pay for it.

In fact, consumers are so willing to pay more for safety systems in their cars that most have ignored the fact that auto insurers generally fail to provide insurance incentives for adding these systems – with some exceptions. Kudos to USAA.

The fact of the matter is that cars should not hit things! Car crashes are a product flaw and any technology designed to prevent crashes – such as lane keeping, blindspot detection, and automatic emergency braking – should be on a path to universal industry adoption.

The AAA/VTTI study makes note of a variety of valuable insights ranging from the differing levels of driving capability of study participants to the varying behaviors reflected in the process of developing familiarity with new safety systems. The study also identifies the various challenges associated with different types of user experiences and interfaces for indicating when systems are turned on or off and when and how alerts are communicated.

Of course, the study is based on cars currently in the market, meaning the results of the study are nearly useless or irrelevant in the context of constantly evolving automotive safety systems. Some of the systems on the road today have driver information displays that are either too small or hidden behind the steering wheel, or may lack audible cues to go with visible indicators. And there is an unfortunate lack of consistency between car brands.

(It’s worth noting the growing adoption of driver monitoring systems globally – including the recently proposed Euro NCAP percent eye closure standard – intended to ensure future driver attentiveness.)

The revolution of active auto safety systems washing over the automotive industry is arriving in the form of increasingly inexpensive camera- and radar-based systems capable of identifying roadway obstacles and anticipating their movements. Ever more powerful on-board processing technology is allowing safety systems to deliver the kind of collision avoidance capability consumers should expect.

Over the past 10 years automotive safety advocates from the National Highway Traffic Safety Administration to AAA have taken to blaming drivers for 94% or more of all vehicle crashes. For them, it’s nearly always the fault of the nut behind the wheel.

In the absence of fully-functioning automated driving systems to remove the nut from this proposition I believe it is reasonable to expect auto makers to do their best to enable their products to avoid collisions leveraging widely available technologies. Studies like the AAA/VTTI project are useful in identifying the scope of the challenge – but pointless for arriving at a solution.

The solution lies in enhanced user interfaces, increased on-board processing capabilities, and the proliferation of vehicle sensors. All of this could be aided by a coordinated effort within the insurance industry to reward drivers that adopt, pay for, and use these systems – and that includes rewarding the auto makers that develop and deliver the systems.

It’s going to be decades before we remove that nut from behind the wheel. It’s time that we, as an industry, did our utmost to help him or her out.


No Coal in This Stocking: VCs and Nuclear Fusion

No Coal in This Stocking: VCs and Nuclear Fusion
by Bernard Murphy on 12-25-2019 at 6:00 am

Is fusion energy close?

Tis the time of year when product pitches are aimed 100% at consumers. No-one in their right mind wants to push the nerdy behind-the-scenes stuff we usually talk about. This is a chance for me to go off the rails a little and consider unusual directions in innovation. We know all about VCs underwriting self-driving cars, intelligent everything and anything, clouds, blockchain, and so on ad nauseam, but what about nuclear fusion? Everything we're building these days needs electricity. How about non-polluting (in principle) and unbounded (ditto) power generation? Stack all the unicorns on top of each other and they still can't compete with a value proposition like that.

Fusion involves banging hydrogen nuclei together (there are variants) to create helium nuclei. This process generates net energy and powers stars. Nuclear fission also generates energy, when uranium or plutonium nuclei break up into smaller, energetic fragments. Fission works just fine as a power source but has us all worried about long-lived radioactive by-products and the harm they can cause. Fusion generates no such radioactive by-products (again, in principle) and actually releases more energy per unit mass than fission. Also its fuel is hydrogen. We have quite a bit of that around, in water. And fusion generates helium, not carbon. So far at least we have no issues with excess helium (maybe we'll all start talking funny). Seems like a no-brainer.
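A back-of-envelope comparison makes the energy point concrete. A single D-T fusion releases about 17.6 MeV and a typical U-235 fission about 200 MeV, but per nucleon (which tracks energy per unit mass) fusion comes out well ahead; the figures below are standard textbook values:

```python
# Back-of-envelope: energy per nucleon, which is proportional to energy per unit mass
DT_FUSION_MEV = 17.6      # D + T -> He-4 + n releases ~17.6 MeV
DT_NUCLEONS = 5           # 2 (deuterium) + 3 (tritium)

U235_FISSION_MEV = 200.0  # a typical U-235 fission releases ~200 MeV
U235_NUCLEONS = 236       # U-235 plus the absorbed neutron

fusion_per_nucleon = DT_FUSION_MEV / DT_NUCLEONS        # ~3.5 MeV per nucleon
fission_per_nucleon = U235_FISSION_MEV / U235_NUCLEONS  # ~0.85 MeV per nucleon

print(round(fusion_per_nucleon / fission_per_nucleon, 1))  # 4.2: roughly 4x more per unit mass
```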

Except it isn’t. We’ve been working on fusion reactors since the 1940s, without a commercial reactor to show for it yet. You have to bang the nuclei together really hard to overcome electrostatic repulsion. And once they fuse, you have to contain the energy to sustain continued fusion. Containing a plasma at hundreds of millions of degrees is far from trivial. But we’re still trying, certainly in universities and national labs. Lockheed and Microsoft Research have even gotten in on the act.

When government-funded research is struggling, maybe it’s time to encourage more private involvement. That’s the view of a number of VCs, apparently dreaming of trillion-dollar, Aramco-scale IPOs. There’s a good article in the Economist on private ventures in fusion: Commonwealth Fusion Systems (a spin-out of MIT), Tokamak Energy (a spin-out of the UK Atomic Energy Authority), General Fusion in Canada, TAE Technologies in California and First Light Fusion (a spin-out of my alma mater). Between them they have raised close to $1B in funding. Government programs inevitably have richer sponsors; the ITER reactor in Europe is already a $20B program and aims to be fully operational by 2045.

There’s a more detailed article on Commonwealth Fusion Systems, which has a much more aggressive goal – to be operational by 2025 (try getting VCs to underwrite a program that will start to deliver in 2045). They intend to start with a 50MW reactor and scale over time to 200MW, the kind of plant that could take the place of a wind or solar farm.

The CEO makes a good point – renewables will never be able to completely replace carbon-based energy sources. The numbers don’t work to scale to that level. But fusion just might. The risks associated with fusion are similar to those associated with regular industrial plants, nowhere near the risks we associate with fission. The carbon footprint will be tiny, though we have yet to determine if a helium footprint is something we have to worry about.

There’s an old joke that fusion is just 30 years away from reality, and always will be. AI was the butt of similar jokes until quite recently. Perhaps with all this public and private attention, fusion will become a reality sooner than we expected.


Avoiding Fines for Semiconductor IP Leakage

Avoiding Fines for Semiconductor IP Leakage
by Daniel Payne on 12-24-2019 at 10:00 am

Percipient IPLM

In my semiconductor and EDA travels I’ve enjoyed visiting engineers across the USA, Canada, Europe, Japan, Taiwan and South Korea. I’ll never forget one trip to South Korea where I was visiting a semiconductor company: upon reaching the lobby, a security officer asked me to take out my laptop and issue the dir command at the C: prompt so that he could write down how many files were on my computer, and the exact number of bytes. After my visit and presentation with the customer, the same security officer checked my laptop again to make certain that there were no extra files on it, keeping his facility safe from IP theft by a visiting EDA vendor. That got me thinking about how semiconductor IP is used, shared and protected, because in the USA we have the “deemed export rule,” under which the release of controlled technology and information to a non-U.S. person is considered an export.

Releasing sensitive IP to a non-U.S. person while not having a deemed export license is a violation, and the fines can cost you dearly, thousands to millions of dollars. One semiconductor company paid a $10M fine for violating export control laws in 2014.

When I designed chips at Intel we would share IP between design groups in California, Oregon, Japan and Israel, but all of our IP tracking was done by simple email exchanges, nothing really traceable and certainly not enforceable. So the industry today faces the issue of IP leakage: IP that is inadvertently shipped to a country where it is not allowed. Here are four examples of IP leakage to consider:

  1. Access controls where anyone can view and download IP without any enforcement.
  2. Using tar balls to share IP.
  3. An IP block embedded inside other IP, but without any visibility or traceability.
  4. A traveling engineer unaware of IP restrictions in a visited geography.

The IP Lifecycle Management (IPLM) experts at Methodics are on top of this issue of IP leakage and have designed a tool called Percipient that helps engineers prevent accidental IP leakage. The approach with Percipient is to use a centralized, traceable management system in which an admin sets up permissions at the very start of a project. Three levels of permissions are defined: Read, Write and Owner.
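A three-level Read/Write/Owner model can be sketched in a few lines. This is purely illustrative Python, not the actual Percipient API; the class and method names are hypothetical:

```python
# Hypothetical sketch of a three-level IP permission model (Read/Write/Owner);
# names are illustrative, not the real Percipient API.
from enum import IntEnum

class Permission(IntEnum):
    READ = 1
    WRITE = 2
    OWNER = 3

class IPBlock:
    def __init__(self, name):
        self.name = name
        self.acl = {}  # user -> highest Permission granted

    def grant(self, user, perm):
        self.acl[user] = perm

    def can(self, user, perm):
        # Higher levels imply lower ones (Owner can write, writers can read)
        return self.acl.get(user, 0) >= perm

ip = IPBlock("serdes_phy")
ip.grant("alice", Permission.OWNER)
ip.grant("bob", Permission.READ)
print(ip.can("bob", Permission.WRITE))  # False: read-only users cannot modify the IP
```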

Percipient works with existing infrastructure like the Unix file system and your favorite Data Management (DM) system: Perforce, Subversion, Git, etc. All of your IC design data stays in its native format, and Percipient integrates with the data sources, connecting IP producers to IP users.

With this approach an engineer can quickly build a workspace using a native DM system, and each workspace is traceable and tracked, so no more email messages and manual methods to keep track of everything by hand.

IP is often organized as a hierarchy containing many smaller IP blocks; if one embedded IP block several levels deep is restricted, how would a user at the top level know about it?

The architecture of Percipient understands and preserves all of your IP hierarchy, so there’s never a chance of accidentally sharing a restricted IP block buried deep inside of any hierarchy.
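Catching a restricted block buried deep in a hierarchy amounts to walking the whole IP tree rather than checking only the top level. A minimal sketch of that idea, with hypothetical names and data structures (not the Percipient implementation):

```python
# Hypothetical sketch: walk an IP hierarchy depth-first so a restricted block
# several levels down is still caught. Names and structures are illustrative.
def find_restricted(ip, restricted, path=""):
    """Return the full paths of every restricted block in the hierarchy."""
    here = f"{path}/{ip['name']}"
    hits = [here] if ip["name"] in restricted else []
    for child in ip.get("children", []):
        hits += find_restricted(child, restricted, here)
    return hits

soc = {"name": "top_soc", "children": [
    {"name": "pcie_ctrl", "children": [
        {"name": "crypto_core"}  # restricted block, two levels deep
    ]},
    {"name": "ddr_phy"},
]}

print(find_restricted(soc, {"crypto_core"}))  # ['/top_soc/pcie_ctrl/crypto_core']
```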

There is also the challenge of an engineer traveling to a new geography, starting to build a workspace, and inadvertently pulling in restricted IP. Percipient addresses this with a feature called ‘geo-fencing’: there is a self-managed IP cache, and fencing enforces a “Do Not Download” list for all IPs in the cache. An admin marks each restricted IP block. Here’s a diagram of how the “Do Not Download” feature is enforced:


In this methodology a user is blocked from loading any restricted IP for their geography, and the admin can show through traceability that no sensitive IP was accidentally leaked.
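The enforcement check itself amounts to consulting a per-IP restricted-geography list before any download is allowed. A minimal sketch with hypothetical names and example data (the real feature is admin-configured inside Percipient):

```python
# Hypothetical sketch of geo-fenced "Do Not Download" enforcement.
# Restricted geographies per IP block; data and names are illustrative.
DO_NOT_DOWNLOAD = {
    "crypto_core": {"CountryA"},
    "radar_dsp":   {"CountryA", "CountryB"},
}

def may_download(ip_name, user_geo):
    """Block the fetch if the user's geography is on the IP's restricted list."""
    return user_geo not in DO_NOT_DOWNLOAD.get(ip_name, set())

print(may_download("crypto_core", "CountryA"))  # False: download blocked
print(may_download("crypto_core", "CountryC"))  # True: geography not restricted
```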

Summary

The semiconductor industry has spread worldwide, and yet protecting semiconductor IP remains a looming concern, as ITAR (International Traffic in Arms Regulations) and the Deemed Export Rules carry steep financial penalties for companies that leak restricted IP. Instead of using manual methods to track IP and risking IP leakage, why not use something like Percipient, which helps to automate and enforce IP reuse in a safe, legal manner?

In this blog I’ve summarized the features and methodology of the Percipient IPLM tool that block accidental IP leakage, so that your engineers can concentrate on bringing to market new SoCs and products that satisfy export rules with the least manual overhead.

To read the complete 11-page paper on this topic, visit the Methodics site and register.

Related Blogs