
CES 2020: Dashboard Distraction Dilemma

by Roger C. Lanctot on 01-05-2020 at 10:00 am


Tesla Motors did it again. In September 2019 Tesla launched its “Software Version 10.0,” a software update targeted primarily at in-vehicle infotainment systems. Chief among the extensive menu of updates was something called “Tesla Theater,” which added streaming video sources, including Netflix, Hulu, YouTube, and vehicle tutorials, to the in-dash system.

Before you safety mavens get your regulations in a bunch, the streaming content is only accessible when the vehicle is in park. Still, the streaming sources join an expanding roster of games, audio sources, and even a karaoke app, complete with singalong lyrics, in the dashboard of Tesla vehicles.

While Tesla always seems to be well out in front of automotive technology trends, the increased quantity and variety of in-dash content reflects a trend sweeping the industry in which infotainment screens are becoming much more lively and informative. I can’t say I am a fan of video content in the dashboard, but video was already there before Tesla launched Software 10.0, and I believe the entire industry is missing some obvious and more driving-oriented applications for in-dash video.

The National Highway Traffic Safety Administration (NHTSA) mandated in-dash live back-up camera video implementations beginning with new cars manufactured in 2018 onward. This means, as I am fond of noting, that the NHTSA doesn’t want you staring at your dashboard screen – unless you’re driving in reverse. (It’s worth noting that the U.S. is the only automotive regulatory authority in the world that has chosen this mandatory path to safer driving.)

The backup camera mandate in the U.S. is creating a generation of befuddled drivers who now believe it is best to drive their cars backwards while looking at the in-dash display. It is just as horrible as it sounds. I still remember my first backup camera driving experience, shortly after I took delivery of my new 2007 Infiniti G35, when I immediately proceeded to back into the garbage cans at the end of my driveway. (More a reflection on me, I suppose.)

Before the Cameron Gulbransen Kids Transportation Safety Act was signed into law, though, moving images on dashboard screens had already become common in the form of embedded navigation systems. The CGKTSA was intended to save approximately 200 lives a year lost to backovers.

During the time preceding enactment of CGKTSA, the NHTSA was actively pursuing safety regulations for automatic emergency braking and targeting distracted driving with infotainment system design guidelines. The transition from the Obama Administration to the Trump Administration, though, saw a further erosion of the NHTSA’s regulatory leadership.

The CGKTSA was brought about by legislation (following extensive research by the NHTSA) and not by regulatory oversight. The adoption of infotainment system design guidelines was voluntary within the automotive industry as was the latest effort to broaden the adoption of automatic emergency braking technology.

Tesla’s flirtation with in-dash distractions arrives with the NHTSA at an ebb in its regulatory authority at precisely the moment that issues of automated driving, privacy, and vehicle cybersecurity are crying out for governmental guidance. CES 2020 in Las Vegas next week will find auto makers and their suppliers plunging into this regulatory vacuum with dazzling and distracting in-dash displays spreading across dashboards and designed to enable everything from singing karaoke while driving to buying a café latte on the go.

The evolution of the in-dash experience started with the onset of digital radio: adding metadata for artists and radio stations; integrating digital content such as traffic, weather, fuel, and parking information; enabling search for broadcast content; and setting the stage for integration with streaming audio sources, i.e. hybrid radio (see: Audi/RadioDNS).

In the U.S., the range of digital content now available in dashboards is staggering and includes satellite radio, broadcast radio, podcasts, content from connected mobile devices, and even audio books. In this regard it is important to note the pre-CES 2020 merger of TiVo and Xperi – two kings of metadata and content management – and the potential acquisition of iHeartMedia by Liberty Media (which owns a majority stake in SiriusXM). The objective of all of this content delivery is to reach a captive, seatbelted audience more than likely in transit to a destination for economic activity – cue timely advertising messages.

The contenders for this audience, who will be in attendance at CES 2020, include SiriusXM, Xevo, Telenav, TomTom, HERE, iHeartMedia, Cumulus, Apple, Amazon, Google, Microsoft, and, of course, every auto maker and Tier 1 supplier. (To be honest, Visa, Mastercard, PayPal, and multiple other payment players are in on this game as well.) What all of these organizations are missing, though, is the need for critical driving information in the dash.

Over the Christmas/New Year’s break I had the occasion to view Alex Roy’s “The Secret Race,” about his record-setting drive across the U.S.

Apple and Amazon links: https://www.apexthesecretrace.com/

Car enthusiast Roy set the record with the help of extensive route planning – taking into account traffic, weather, and local law enforcement – and aerial reconnaissance in the form of a following spotter plane. In crossing the country in less than 28 hours, Roy overcame various surprises in the form of construction, some traffic, weather, and some very slow drivers – some of which might have been avoided with tools available today that were not available when he made his record-setting journey.

Not every driver can be assisted by an eye in the sky, but it is precisely this sort of capability that existing technologies can now approximate.

To this day car makers have failed to find a way to integrate traffic camera information into in-dash systems. TrafficLand, a traffic camera service provider based near my home in Northern Virginia that helps local departments of transportation manage their hundreds of traffic cameras, has an app that lets users view live traffic camera feeds from the road ahead.

To me, an in-dash traffic camera application accessible via voice interface to display traffic conditions is a more relevant and essential in-dash video experience. But it doesn’t end there. I expect consumers will soon want to be able to access their Ring and home webcams in their car dashboards.

Ever marching to a different drummer, Tesla has added Twitch to its in-dash infotainment lineup to go with the Tesla Arcade. Tesla also enables video clips created by its cars’ Sentry Mode to be shared via the company’s mobile app – so that, too, is a means to leverage vehicle-based video.

A further step might be full integration with other Tesla drivers to share live dashcam footage – or clips – in the event of roadside incidents that may be impacting traffic conditions. It won’t be long before Teslas – and other so-equipped vehicles – are able to report traffic violations to local law enforcement live. Whether on a long drive or a daily commute, what driver wouldn’t want to actually SEE what is happening along the intended route – from live cameras, be they built into cars or installed along the roadside?

Suffice it to say that there is ample space in the increasingly ample dashboard displays being deployed in the automotive industry for companies to capitalize on advertising, customer engagement, collision mitigation, traffic management, and brand differentiation opportunities. CES 2020 in Las Vegas is the perfect place to assess the progress and efficacy of these gambits and place your bets.


China in 2020: Trade, technology, and the path ahead for US-China relations

by Gordon Orr on 01-05-2020 at 6:00 am


This is the first of a five-part series.

2019 in China brought together long-running challenges, such as uncertainty over US-China tariff levels and ever more intrusive regulation of business in China, with a few unexpected ones as well: the crisis in Hong Kong and the flare-up triggered by a tweet from an NBA team executive, to mention just two. Yet for many businesses, opportunities flourished throughout the year as China’s economy grew roughly 6 percent. And in multiple key industries, the government’s commitment to global leadership started to pay dividends.

2020 will offer a similar mix of evolving, often worsening, challenges. Growing separation between the US and China in technology sectors seems inevitable. While some companies will evolve to remain relevant in both markets, others will choose to focus on one. In 2020 this separation may become broader, impacting financial markets much more directly. China’s economic momentum will continue in 2020 with domestic consumption leading the way, selectively creating opportunities. If China’s priority sectors match those of your business, 2020 will be a good year to step up as the taps of government funding remain open for now.

US-China relations

Multiple areas of growing separation between the US and Chinese economies predicted in last year’s note were largely realized: investment flows, supply chains, data flows, people flows, technology procurement, and standards. In all these areas, further separation will occur in 2020. One example: US government agencies, such as the National Institutes of Health and the Department of Energy, not just the Department of Defense, have been presenting US university administrators with hundreds of case examples in which they believe non-US academics, largely Chinese, failed to disclose parallel funding for their research from overseas governments, along with commitments to share their IP discoveries with those governments. Those academics will likely be proactively excluded from US universities; many others will self-select out or simply not come to the US in the first place. Restrictions on investment from China into the US will shift from a focus on larger deals, which have shrunk to almost zero, to direct and indirect (i.e. through funds) investment into technology startups.

I did anticipate a year ago that we would have clarity about tariffs by now, not the ongoing uncertainty that holds back investment plans in supply chains and factories. Looking into 2020, if there is finally an agreement, it seems likely to be narrow and not long lasting. Multinationals have suffered least from tariff volatility. They typically send no more than 15 percent of their China production to the US and have multiple factories around the world to which they can shift production destined for the US. Almost none of these factories are or will be in the US. Smaller businesses, often foreign-owned, that focus solely on exports to the US have been hurt most.

Factories do continue to move out of China. Manufacturers are also consolidating in China, doubling down on technology in their remaining factories. Indeed, China is rapidly becoming the world center for the Internet of Things in factories. These trends preceded the US tariffs and have only been marginally accelerated by them. More non-Chinese companies than Chinese are shutting down factories in China, but not all move production out of China as they close. A good number outsource their manufacturing to a Chinese owned company producing in China, believing that the Chinese company will be lower cost than the foreign-owned factory, and just as good quality.

New areas of US-China separation will come into focus in 2020. Financial markets will be front and center. The U.S.-China Economic and Security Review Commission’s 2019 report to Congress has as its first recommendation to delist Chinese companies on US exchanges that do not meet four criteria. No Chinese company listed in the US meets all four, many won’t meet any. This threat covers around 500 companies with a cumulative market capitalization of about US$ 1 trillion (dominated by Alibaba). It was smart of Alibaba to get its secondary listing in Hong Kong in place in November 2019. Companies such as Ping An’s fintech subsidiary, OneConnect, which has announced plans to list in New York, may reconsider. After all, less than US$2 billion has been raised by Chinese companies on the NYSE and Nasdaq so far this year, down 74 percent from last year. Some Chinese tech companies may list domestically within China where listings generally achieve higher earnings multiples and Chinese regulators have quietly made it possible for companies using the Variable Interest Entity (VIE) structure to list domestically.

Technology tensions

The US and Chinese governments continue their rush to embrace greater technology separation. 2020 may be a tipping point. On one side the US government excludes Chinese companies from buying US sourced technology components (at least from being able to do so with certainty), from investing in US technology companies, and from supplying their technology products into the US.

On the other, the Chinese government has launched an over US$20 billion fund to support Chinese independence in a broad range of manufacturing technologies to go alongside its similar sized fund to support developments in semiconductors.

China’s “secure and control” initiative is encouraging government departments and state-owned enterprises to buy technology without US content – perhaps 25 percent of the traditional PC and server market. Chinese manufacturers’ share of the server and storage market had already risen from around 30 percent in 2012 to 70-80 percent in 2018. It is set to rise higher. In smartphones, four Chinese brands hold more than 85 percent of the Chinese market and less than 1 percent of the US. China’s internet giants, with the exception of TikTok, are absent from the US (TikTok may not retain its presence in the US for long if US legislators sustain their focus on the company); the US giants have long been absent from China.

The pinch point in semiconductors – Taiwanese contract manufacturers who play a key role for both US and Chinese companies – will become much more visible in 2020, with greater levels of government-to-government pressure exerted on the key companies.

One of the few business-focused outcomes from the recent 4th Plenum was a plan to establish a “new national system for making breakthroughs in core technologies under socialist market economy conditions.” This feels very similar to the state-driven industrial policies contained in “Made in China 2025”, if not yet with the quantitative targets. In some areas, China is likely to achieve its goals quickly; for example, as China still represents nearly a quarter of global manufacturing output, taking leadership in smart factories should be a no-brainer. China is turning its cities into large-scale pilots for 5G-enabled smart cities at a pace that will allow China to set de facto standards. Chinese products, though, will not be accepted in the US, especially as many will require access to large-scale data sets that dwarf those that Chinese companies have been blocked from.

All Chinese and US tariffs could be eliminated tomorrow and it would have only a marginal impact on these trends. Both governments have embraced growing separation; the only questions are how fast it proceeds and how much pain is incurred along the way.


Semiconductor Review 2019 into 2020!

by Daniel Nenni on 01-03-2020 at 6:00 am


Semiconductors continue to surge and lead technology sectors all over the world. TSMC has always been my economic bellwether and 2019 was another great year as the TSM share price almost doubled. But it looks like the best is yet to come with TSMC significantly increasing CAPEX to cover 7nm and 5nm demand.

TSMC CEO C.C. Wei increased CAPEX from $10.5B in 2018 to more than $14B in 2019, with a big Q4 spend. Remember, TSMC builds capacity based on customer orders, not a dartboard forecast. With TSMC winning both the 7nm and 5nm popular vote, 2020 should be another blockbuster CAPEX year to backfill demand.

There are two VERY disruptive semiconductor trends to watch over the next year or three: large systems companies (including Google, Facebook, Amazon, Microsoft, etc.) taking control of their silicon, and China taking control of its silicon as well.

Apple started it and now all systems companies in competitive markets will follow. It will be interesting to see who the big players are at CES 2020 next week and, more importantly, how many of them are making their own chips. My bet would be the majority of them, including the automakers. It’s not really a fair bet since SemiWiki.com is the leading semiconductor design enablement portal with more than 3.25 million unique views, and we get to see who reads what, when, and where they are from.

Why are systems companies dominating semiconductor design? Because they can use prototyping and emulation to get a jump on verification and software development and really tune the silicon to the system. Systems companies are also VERY competitive and can write some VERY big checks and they will not miss tape-outs or product ship dates. This is so un-fabless-like it isn’t even funny. It really is a new semiconductor world order.

Speaking of a new world order, China is also disrupting the semiconductor industry with billions of dollars invested in the “Made in China 2025” semiconductor supply chain initiative.

Think about it: China consumes more than 50% of semiconductor production worldwide but produces only about 20% of those chips. My guess is 2020 and 2021 will see unprecedented China chip manufacturing growth as increased memory (DRAM and NAND) manufacturing capacity comes online. Add in the motivation of political turmoil and memory-hungry mobile, 5G, and AI applications, and the Made in China 2025 initiative will, in my opinion, get a major boost.

I will be in China again this month and am excited to see what’s new. You can Google around all you want but there is nothing like being there.

2019 was also a big year for SemiWiki.com. We unleashed SemiWiki 2.0 in June with many new cloud-based features and more to come. Traffic and member registration are again growing double digits and we are already working on SemiWiki 3.0.

I would truly like to thank all of our bloggers, partners, readers, and registered members for your continued support. SemiWiki has been an exciting 10 year adventure and I’m looking forward to working with you all in the coming years. After spending my entire 35+ year career in semiconductors I can say that without a doubt the best is yet to come, absolutely!


Photonics Come into Focus: 2020 Predictions

by Rich Goldman on 01-02-2020 at 10:00 am

Photonics 2020 Predictions
Credit: Jeffrey Tseng / Intel via ACM

In the past, I’ve focused my annual predictions on electronics – ICs and EDA – but recently I’ve turned my focus to photonics, so my 2020 predictions are primarily in this area.

Historically, photonics has been the Gallium Arsenide of technologies: it was, is, and always will be the technology of the future. Analysts have forever predicted the rise of photonics – that next year, as Moore’s Law ends or slows in electronics, photonics will assert itself, entering the hockey-stick phase of its growth. While photonics is playing a constantly greater role in our technology ecosystem, with an impressive growth rate, the predicted explosive growth hasn’t yet happened.

Why not? There are a few reasons. First, photonics doesn’t adhere to Moore’s Law: The wavelength of light is the wavelength of light. It is a constant; it just doesn’t halve every two years, so the phenomenal gains in electronics driven by Moore’s Law just don’t apply to photonics.

Next, as always, engineers are clever. Engineers constantly breach barriers in electronics once considered insurmountable. So, applications where photonics were projected to replace electronics remain filled by more and more clever electronic designs instead. The replacement of electronics by photonics may still be inevitable, but the timeline keeps moving out.

Finally, the photonics ecosystem has not yet evolved sufficiently to support a large-scale commercial market. Whereas electronics has evolved over more than half a century into the sophisticated, well-oiled design and manufacturing ecosystem of today, this evolution has not yet occurred in photonics. Today’s photonics ecosystem still most closely resembles the electronics ecosystem of the early 1980s.

2019 saw the acquisition of major photonics suppliers by even larger players, such as Cisco’s acquisition of Luxtera, and now Acacia Communications, along with II-VI’s acquisition of Finisar, the world’s leading supplier of optical communications products, and Broadcom’s reacquisition of its optical transceiver assets from Foxconn. Many see this trend as an acknowledgment of the coming of age of photonics, and the need for major telecommunications providers to offer leading photonics solutions.

What about this year? Will photonics come into focus in 2020? I’ll leave the hockey stick inflection point predictions to analysts who are paid for this sort of thing, but I will predict several trends with impact to the timing of that inflection point.

Photonics is becoming relevant, then prevalent, and finally dominant at shorter and shorter distances. Today, telecommunications delivered over kilometers to homes and businesses travels via fiber optics, an application dominated by photonics. Now photonics has moved into the data center. Massive hyperscale data centers across the globe struggle with power consumption and cost, heat, bandwidth, and data latency. Replacing copper wire with fiber optics addresses all these issues. Compared to copper, fiber is cheaper, faster, lower latency, and higher bandwidth, and it consumes less power, thus lowering heat and power costs.

The takeover by fiber optics between racks in the data center is largely complete, and fiber has moved on to interconnecting servers in the same rack. So photonics has already moved from dominance at kilometer distances, to prevalence at tens-of-meters distances, to relevance at single-meter distances. In 2020, Photonic Integrated Circuits (PICs) will become more commonly commercially available, making photonics relevant at millimeter distances. Work is underway to integrate the photonics, including the laser, on-chip with electronics, moving photonics relevance down to microns.

As Ethernet data transmission speed continues its migration from 100G to 200G in 2020, photonics becomes more attractive in the transceivers required at either end of the optical fiber. The build-out for 100G is largely complete. In 2020 the transition to 200G will be well underway, with early adopters moving on to 400G. With clever engineering, electronics can deliver at 100G, but photonics is competitive here, and many 100G photonics transceiver designs are available on the market. We’ll see more cleverness from IC designers, but at 400G, electronics will lose more of its grip on the transceiver market and photonics will begin its move from relevance to prevalence. By the time we reach 800G and 1T (well past 2020), photonics will be in full dominance, with nary an electronic transceiver to be found.

The dominance of photonics in the data center will be hastened by the FANGs (Facebook, Amazon/Apple, Netflix, Google) building photonic design teams focused on photonic transceivers tuned to their own specifications, a new trend in 2020. Operating massive data centers, they will benefit tremendously from photonic designs crafted to their own specific needs, just as we have seen as the FANGs design their own ICs. We may not see fruit of this activity in 2020, but it will significantly increase the world’s photonic design capacity and hasten the evolution of the commercial photonics ecosystem.

Dominated by small niche commercial or R&D fabs such as SMART Photonics, LioniX, Ligentec, imec, Leti, and AIM, today’s photonics foundries are generally geared toward R&D or providing MPWs rather than large commercial runs. While they are solid foundries, advanced in their photonics offerings (such as Indium Phosphide for lasers), these fabs do not today have the capacity to drive a large commercial market, and they have not yet had the opportunity to develop the extreme customer-support processes that large semiconductor foundries have built over decades.

The FANGs are key customers of the world’s leading semiconductor foundries. These foundries are taking notice of the increased photonic design activity and have entered, or are contemplating entering, the photonics business. This will hasten its commercialization as these fabs apply their production knowledge, experience, and skills in building mature ecosystems.

Over the last couple of years, leading semiconductor foundries such as TowerJazz and GlobalFoundries commenced servicing the photonics business. 2020 will see other major semiconductor foundries (beyond those whose name is two words run together) entering the photonics business. The new entry of these foundries will serve to legitimize silicon photonics as a commercial business. One attractive advantage that photonics offers semiconductor foundries is the lack of need for leading-edge technology, so there is no requirement for the massive R&D and capital investment of electronics. Instead, photonics makes use of fully capitalized semiconductor fabrication equipment to return impressive margins.

One indicator of the maturing of photonics is the emergence of process design kits (PDKs) from foundries. The first photonic PDKs emerged about two years ago, and they are starting to become prevalent as foundries deliver PDKs targeting a variety of design tools. These PDKs tend to be primitive in comparison to the libraries delivered by semiconductor foundries, but they are a significant and important step in the maturing of the commercial photonics ecosystem, one requiring a strong partnership between foundries and photonic design automation (PDA) companies. Together, they are producing new PDKs and advancing the state of existing PDKs. As more PICs are manufactured and measured, sufficient data is becoming available for statistical analysis. 2020 will see the emergence of statistics-based PDKs, enabling Monte Carlo and corner analysis in more advanced PDA simulation tools. This will lead to more robust designs, with a new focus on manufacturability, another requirement for the commercialization of photonics.
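As a rough illustration of what a statistical PDK enables, the sketch below runs a toy Monte Carlo and corner analysis over a single process parameter. The loss model, nominal waveguide width, sigma, and yield threshold are all invented for illustration; a real PDK would supply measured distributions and calibrated component models.

```python
import random
import statistics

random.seed(42)

# Hypothetical process variation for a waveguide width:
# nominal 500 nm with a 5 nm one-sigma spread (illustrative numbers only).
NOMINAL_WIDTH_NM = 500.0
SIGMA_NM = 5.0

def insertion_loss_db(width_nm: float) -> float:
    """Toy model: loss grows quadratically as width deviates from nominal."""
    deviation = width_nm - NOMINAL_WIDTH_NM
    return 0.1 + 0.002 * deviation ** 2

# Monte Carlo: sample many "fabricated" widths and collect the loss spread.
samples = [insertion_loss_db(random.gauss(NOMINAL_WIDTH_NM, SIGMA_NM))
           for _ in range(10_000)]

mean_loss = statistics.mean(samples)
# Yield: fraction of devices meeting a hypothetical 0.3 dB spec.
yield_frac = sum(1 for s in samples if s < 0.3) / len(samples)

# Corner analysis: evaluate the same model at the +/-3 sigma process corners.
corners = {k: insertion_loss_db(NOMINAL_WIDTH_NM + k * SIGMA_NM)
           for k in (-3, 0, 3)}
```

The same pattern extends to multiple correlated parameters, which is where statistics-based PDKs and Monte Carlo support in PDA simulators earn their keep.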

The established EDA vendors are taking note of the emerging photonic-electronic market. They are delivering design tools targeted to this market and forming key alliances with the leading PDA companies to provide a complete, integrated design flow. Last year, Mentor introduced LightSuite Photonic Compiler, leveraging its Tanner tools to provide schematics and layout. Cadence introduced curvilinear capability via CurvyCore to enable its industry-leading custom design platform, Virtuoso, for photonics. Both Mentor and Cadence have integrated their design flows with the leading photonic simulation provider, Lumerical. For example, Cadence has provided co-simulation capabilities, enabling the entire design flow to be driven through the Virtuoso cockpit. Synopsys has pursued more of a go-it-alone strategy.

I predict that the entry of the major EDA vendors portends higher price tags for PDA tools. Average selling prices for popular EDA tools are orders of magnitude higher than for PDA tools. This imbalance is not sustainable long term, as it will hold back needed investment in PDA and the ability of the PDA companies to compete in the new EPDA environment. Though the change won’t be sudden, it will be consistent.

While integrated Electronic-Photonic Design Automation (EPDA) flows emerged last year, in 2020 they will become more sophisticated with the addition of statistical and design-for-manufacturing (DFM) capabilities. Statistical analysis will require far more compute power, so in 2020 we will also see High Performance Computing applied to PDA, with Amazon AWS and Microsoft Azure becoming significant players, delivering photonic design in the cloud while making use of all those photonics in their data centers.

In contrast to electronics, a photonic design consists of just a few meticulously crafted components. Many of these components can be found in the PDKs delivered by foundries, but each leading-edge photonic design will always include some critical components that the more generalized foundry PDKs cannot deliver. This creates an opportunity for a few well-positioned companies to establish a photonic IP (PIP) business. Well-managed companies with extreme photonic design capability and focus, and inexpensive access to design tools, are likely to spearhead the emergence of this market. With their unrelenting focus on photonic design, these companies will deliver superior designs. Companies looking to deliver leading-edge photonic designs will engage these PIP providers to outsource their component design in order to focus their own resources on other value-added areas such as the overall PIC design. In its infancy, the PIP business will likely hold several similarities to custom design services.

Breakthroughs in photonics design methodology will provide higher quality, more manufacturable designs and will lower the barrier so that photonic design no longer requires a PhD in physics. Better designs will push forward the applications that photonics can compete in and win at. More qualified designers will result in a greater ability for companies to staff their photonics design team, resulting in greater competition leading to better products and faster evolution.

Already we are seeing the impact of Photonic Inverse Design from sources like Stanford, and Lumerical cooperating with the open-source community. We are starting to see component design completed much more simply, with much-improved figures of merit, over much-shortened design cycles (days compared to months).

We are seeing improvements even on the best published designs, often completed in a matter of days. We are seeing orders-of-magnitude improvements in components. Photonic Inverse Design’s simplified, automated design methodology will replace today’s manual, iterative process and will be applied to a wide variety of photonic components in 2020. Photonic Inverse Design’s impact on photonics will be similar to the impact of logic synthesis on IC design in the 1980s. It will widen the circle of qualified photonics designers and hasten time to market for photonics designs. I think of the transition to Photonic Inverse Design as analogous to raising the level of abstraction that designers work at. Just as raising the level of abstraction of IC design unleashed a torrent of IC designer productivity, I expect to see similar improvements in the productivity of photonics designers.
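To make the abstraction analogy concrete: inverse design treats a component’s geometry as free parameters and lets an optimizer, rather than a human iterating by hand, maximize a figure of merit. The sketch below is a deliberately toy version of that loop. The transmission function, its optimum, and the parameter names are invented; real inverse-design tools use adjoint methods and full electromagnetic simulation, not finite differences on a closed-form model.

```python
# Toy figure of merit: transmission of a hypothetical coupler as a function
# of two geometry parameters. By construction it peaks at gap=0.2, length=10.
def transmission(gap_um: float, length_um: float) -> float:
    return 1.0 / (1.0 + (gap_um - 0.2) ** 2 + 0.01 * (length_um - 10.0) ** 2)

def gradient(f, x, eps=1e-6):
    """Central finite-difference gradient of f at point x."""
    g = []
    for i in range(len(x)):
        hi, lo = list(x), list(x)
        hi[i] += eps
        lo[i] -= eps
        g.append((f(*hi) - f(*lo)) / (2 * eps))
    return g

# Gradient ascent: nudge the geometry toward higher transmission each step.
params = [0.5, 5.0]  # deliberately poor starting geometry
for _ in range(5000):
    g = gradient(transmission, params)
    params = [p + 0.05 * gi for p, gi in zip(params, g)]

fom = transmission(*params)  # figure of merit of the "designed" component
```

The point is not the optimizer, which is trivial here, but the workflow: the designer specifies what good means (the figure of merit) and the tool searches the geometry space, which is the sense in which inverse design raises the level of abstraction.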

Applications:
Transceivers: 2020 will continue the trend of photonics’ takeover in the data center. In 2020, this will become more pronounced as we move from 100G to 200G and on to 400G Ethernet transmission speeds.

LiDAR: In 2020, we will see the introduction of multiple photonics-driven LiDAR designs. LiDAR is a key technology for autonomous vehicles, but it’s not feasible to mount on everyday passenger cars the rotating cans seen on today’s prototype autonomous vehicles, and it’s not feasible to pass the thousands of dollars those cans cost on to the future everyday buyers of autonomous passenger cars. A large number of startups are focused on reducing the size (to that of a deck of cards) and cost (by an order of magnitude) of LiDAR, and several of them will reveal their designs in 2020. Additionally, we will see a photonics-based LiDAR design from at least one established, leading LiDAR company.

LiDAR is another application impacted by the cleverness of engineers. The position of Tesla’s Elon Musk (or is it Elon Musk’s Tesla?) is that radar+cameras will be good enough for autonomous vehicles, so there is no need for LiDAR. The driving question is a race between lowering the cost & size of LiDAR vs. improving the capabilities of radar+camera. The winner of this race will determine the fate of a volume driver of photonics in autonomous vehicles. The checkered flag will be waved well past 2020.

5G: In 2020, we will see the buildout of 5G begin in earnest. This will drive volume in PICs as new photonics-friendly technologies such as NG-PON2 are deployed in both the fronthaul and the backhaul. A further hockey-stick inflection will come when the second phase of 5G, millimeter wave for short range within buildings, is deployed; that more extensive buildout will not occur in earnest in 2020.

Sensors: Boring perhaps, but an application where photonics is making steady progress, and that progress will continue through 2020. Medical is a particularly interesting area for sensors with strong opportunity for photonics. The progress in medical will be paced more by legal regulations than by technology, and 2020 will not see a breakthrough in this area.

AR/VR: With some credibility, it is said that the easiest way to predict our technology future is to watch Star Trek. All of Star Trek’s technology will eventually come to pass. That’s good news for photonics. If we are ever to cavort in the holodeck, photonics will play a big role.

Quantum computing: Quantum is another application that will drive photonics adoption. Quantum is challenging to predict (I can’t tell whether it is here or there. . . ). 2020 will NOT be the year of Quantum, but I do predict there will be at least one important quantum announcement that will blow everyone’s mind. The announcement will exist at both a large established company and a startup, in both places at the same time.

Summary:
Photonics, the technology of the future, will see solid advancement in 2020. Growth rate will be impressive, with abundant applications coming into focus. Growth will be tethered by the cleverness of engineers extending electronics, and the evolution of the photonics ecosystem. Signs of maturity are becoming more prevalent as commercial foundries join the fray and design automation matures. 2020 is the year that the commercialization of photonics comes into focus.


STT MRAM Highlights from IEDM 2019

by Don Draper on 01-02-2020 at 6:00 am

IEDM 2019 Logo

IEDM 2019 had the theme “Innovative Devices for an Era of Connected Intelligence,” of which MRAM is a leading contributor. Following a very informative Plenary Session, Monday afternoon led off with Session 2: Memory Technology – STT-MRAM. This session had seven important STT-MRAM papers describing the progress of this technology; they are summarized below. Especially highlighted are two papers from the IBM-Samsung MRAM Alliance showing high-performance devices suitable for Last Level Cache implementation, including reliable 2 ns switching at the Write Error Rate (WER) floor and a single device with a WER of 1e-11. Very high endurance of 1e12 cycles with 4 ns read time and a retention time of 1 second at 110C was achieved by Intel. MRAM pioneer Everspin demonstrated a 1Gb stand-alone DDR4-compatible MRAM product in 28nm. Samsung achieved a 1Gb embedded eMRAM in 28nm FDSOI. GlobalFoundries demonstrated a device capable of 125C operation and magnetic immunity of 600 Oe. Samsung developed a process capable of implementing either high-speed or high-retention memories in a single chip. TSMC’s eMRAM supports -40C to 150C operation with magnetic shielding. In addition, there were several other MRAM-related papers in other sessions, and an MRAM poster session jointly sponsored by IEDM and the IEEE Magnetics Society.

 

2.1:  Demonstration of a Reliable 1Gb Standalone Spin-Transfer Torque MRAM for Industrial Applications

Sanjeev Aggarwal, et al, Everspin Technologies, Inc.

Long an MRAM product-development leader, Everspin demonstrated its stand-alone 1Gb STT-MRAM chip in 28nm. This paper describes the productization and superior performance of the 1Gb 1.2V DDR4 STT-MRAM in 28nm CMOS, shown in Fig. 1, with capability for industrial-temperature-range applications of -35C to 110C.

Fig. 1. Top down images of the Everspin 40nm 1.5 V DDR3 256 Mb (top) and the 1.2V DDR4 28nm 1 Gb (bottom) STT-MRAM product dies.

MRAM devices are implemented as magnetically-programmable resistors between two BEOL metal layers as shown in Fig. 2.

Fig. 2.  Schematic diagrams showing integration of the pMTJ bits in the 1 Gb array and adjacent logic areas in the chip’s BEOL metallization.

The Magnetic Tunnel Junction (MTJ) consists of a fixed magnetic layer with high perpendicular magnetic anisotropy, an MgOx tunnel barrier, and a magnetic free layer. Upon application of a critical voltage, a current of spin-polarized electrons tunnels through the MgOx barrier to flip the polarization of the free layer into a parallel or anti-parallel magnetic state, showing low or high resistance, respectively, to a read current. The free layer can be optimized for different applications. During write, no backhopping or switching abnormalities were observed, indicating a large window for switching reliability over the industrial application temperature range from -35C to 110C. DIMM cycling indicated an endurance lifetime greater than 2e11 cycles. Fig. 3 shows data retention as a function of temperature: 10 years at 85C or 3 months at 100C.

Fig. 3.  Time to Failure vs. temperature for Data Retention (DR) bakes of a collection of 1 Gb dies.  Solid line fit indicates DR of 10 years at 85°C and 3 months at 100°C.
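The two retention points quoted from Fig. 3 imply an activation energy under a simple Arrhenius model, which is the standard way such retention bakes are extrapolated. The sketch below (an illustration, not Everspin's analysis) recovers the activation energy from those two points and extrapolates retention to another temperature:

```python
import math

K_B = 8.617e-5                            # Boltzmann constant, eV/K

# Retention points read off Fig. 3: 10 years at 85C, 3 months at 100C
t1, T1 = 10 * 365 * 24.0, 85 + 273.15     # hours, kelvin
t2, T2 = 3 * 30 * 24.0, 100 + 273.15

# Arrhenius model: t = t0 * exp(Ea / (K_B * T)); solve for Ea from two points
Ea = K_B * math.log(t1 / t2) / (1.0 / T1 - 1.0 / T2)

# Extrapolate retention at a lower temperature, e.g. 70C
t0 = t1 / math.exp(Ea / (K_B * T1))
t_70C = t0 * math.exp(Ea / (K_B * (70 + 273.15)))
```

Under this simple model the two points give an activation energy of roughly 2.8 eV, and retention grows rapidly as temperature drops below 85C; the paper's solid-line fit in Fig. 3 is the rigorous version of this extrapolation.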

 

2.2:  1 Gb High density Embedded STT-MRAM in 28nm FDSOI Technology

Lee et al, R&D Center, Samsung Electronics Co.

Based on its already-shipping 8Mb 28nm FD-SOI eMRAM product, Samsung announces an embedded 1Gb product demonstrating read and write operation from -40C to 105C. For high performance and a stable yield of over 90%, a temperature-compensated write driver and write assistor were implemented. Improved endurance up to 1e10 cycles was achieved to broaden eMRAM applications to eDRAM replacement, and 2b ECC was implemented to guarantee high yield. The MTJ stack is based on MgO/CoFeB. With an operating voltage of 1.0V and a 50ns read pulse, over an operating temperature range from -40C to 105C, 10-year retention at 105C and endurance of 1e6 cycles are demonstrated. The unit cell size is 0.036 um2. MTJ stack engineering gave a higher TMR of over 200% and an improvement in MTJ efficiency (retention divided by switching current). Fig. 4 shows the vertical architecture and the TEM picture of the MTJ cell array.

Fig. 4. Vertical structure and TEM images of MTJ cell array with Bottom Electrode Contact (BEC) embedded in 28nm FDSOI logic process.

Performance is illustrated by the room temperature shmoo plot in Fig. 5, showing the product spec VDD of 1.00V and the read pulse of 50ns.

Fig. 5. Shmoo plot for 1Gb chip as a function of read condition at room temperature.

The tuneability of the process to yield different products with 10-year data retention temperature and corresponding endurance is shown in Fig. 6.

Fig. 6. Correlation between endurance and 10 year data retention temperature properties. With improved efficiency, retention temperature can be enhanced for the same endurance cycle.

 

2.3:  22nm FD-SOI Embedded MRAM Technology for Industrial-grade MCU and IOT Applications

B. Naik, et al, GlobalFoundries

The 40Mb, 0.8V embedded MRAM with 2b ECC achieved reliable operation from -40C to 125C with 5x solder reflows, 400C BEOL flows, 1e6 endurance cycles, and stand-by magnet immunity of 600 Oe at 105C for 10 years. A high magnetoresistance (MR) ratio, (Rap-Rp)/Rp, was achieved, where Rp is the parallel resistance (state “0”) and Rap is the anti-parallel resistance (state “1”); the resistance distributions and the figure of merit MR/σ(Rp) are shown in Fig. 7.

Fig. 7. Bit-cell resistance distributions of Rp and Rap showing a separation of 28 σ(Rp).
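The MR ratio and the read-window figure of merit quoted above are straightforward to compute. The sketch below shows both, using made-up resistance samples purely for illustration (the actual distributions appear only in Fig. 7):

```python
import statistics

def mr_ratio(r_p, r_ap):
    """Magnetoresistance ratio (Rap - Rp) / Rp, as defined in the paper."""
    return (r_ap - r_p) / r_p

def read_window_sigmas(rp_samples, rap_samples):
    """Separation of the two resistance distributions in units of sigma(Rp),
    the figure of merit quoted for the bit-cell read window."""
    mu_p = statistics.mean(rp_samples)
    mu_ap = statistics.mean(rap_samples)
    return (mu_ap - mu_p) / statistics.stdev(rp_samples)

# Invented numbers for illustration only (not from the paper):
rp = [10.0, 10.2, 9.8, 10.1, 9.9]      # kOhm, parallel ("0") state
rap = [25.0, 25.3, 24.8, 25.1, 24.9]   # kOhm, anti-parallel ("1") state
```

A wide separation in units of σ(Rp) is what keeps the two states distinguishable by a read current even in the distribution tails of a 40Mb array.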

Write shmoo data for AP->P at 37 ticks and P->AP at 28 ticks for 200ns write pulse at 0.8V at -40C is shown in Fig. 8.

Fig. 8.  Write shmoo for AP->P at 37 ticks and P->AP at 28 ticks for 200ns write pulse.

Read shmoo is shown in Fig. 9 showing operation at 19ns read pulse.

Fig. 9.  Read shmoo showing read operation at 19ns.

Projected standby magnetic field immunity at 105C for 10 years is 600 Oe. Standby magnetic field immunity at 10 years as a function of temperature is shown in Fig. 10.

Fig. 10. Standby magnetic field immunity as a function of temperature.

In active mode, the magnetic immunity of 500 Oe is limited by the endurance margin.

 

2.4:  2 MB Array-Level Demonstration of STT-MRAM Process and Performance Towards L4 Cache Applications

Juan G. Alzate, et al, Intel  Corporation

L4 cache-level application performance and reliability is shown for a 2 MB STT-MRAM array. This requires high density, high bandwidth and high endurance across industrial temperatures of operation.  The required specifications for L4 cache application of an STT-MRAM are shown in Table I.

Table I.  Target specs for STT-MRAM in an L4  cache application.

A bandwidth of >256 GB/sec and an array density of >10Mb/mm2 are needed for an SRAM or eDRAM replacement. The density requirement, as shown in Fig. 11, limits the bitcell pitch and access transistor size, and consequently restricts the maximum current available for STT write, thus limiting the data retention time to one second at the maximum operating temperature of 110C.

Fig. 11. Tighter bitcell pitch required for L4 cache compared to the eNVM application.

The write endurance requirement of 1e12 cycles, on the other hand, limits the maximum write current to ensure endurance fails remain within ECC-correctable limits. To achieve an acceptable ECC-correctable 1 Gb array Bit Error Rate (BER) of <100 dpm (probability of 1Gb array fail of 1e-4), the required fixed and random Write Error Rate (WER) limits are shown in Fig. 12 for two different architectures: 128b words with Triple Error Correction (TEC) and 512b words with Double Error Correction (DEC). The random BER needs to be 1e-8 to 1e-10 for 1e12 write events.

Fig. 12.  ECC calculation for allowed BER of both fixed location fails (dashed) and random fails (solid) vs 1Gb array fail probability (ECC uncorrectable) assuming either 128b words with Triple Error Correction (TEC) (blue) or 512b words with Dual Error Correction(DEC) (orange).
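The relationship between a per-bit error rate and the array fail probability can be illustrated with a simple binomial model (an illustration of the reasoning, not the authors' exact calculation): a word fails when more bit errors land in it than the ECC can correct, and the array fails if any word fails.

```python
from math import comb, log1p, expm1

def word_fail_prob(p_bit, n_bits, t_correct, terms=10):
    """P(more than t_correct bit errors in one n_bits word), binomial model.
    Summing the tail directly avoids cancellation when p_bit is tiny."""
    return sum(comb(n_bits, k) * p_bit**k * (1.0 - p_bit)**(n_bits - k)
               for k in range(t_correct + 1, t_correct + 1 + terms))

def array_fail_prob(p_bit, n_bits, t_correct, array_bits):
    """P(at least one uncorrectable word in the array)."""
    n_words = array_bits // n_bits
    p_word = word_fail_prob(p_bit, n_bits, t_correct)
    # log1p/expm1 preserve precision when p_word is astronomically small
    return -expm1(n_words * log1p(-p_word))

# Example: 512b words with double error correction (t=2) in a 1 Gb array
p = array_fail_prob(p_bit=1e-9, n_bits=512, t_correct=2, array_bits=2**30)
```

This toy model captures why multi-bit ECC relaxes the per-bit requirement so dramatically: three random errors must coincide in one 512b word before DEC is defeated, so even a modest per-bit rate yields a tiny array fail probability.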

The 55nm MTJ needs a reliable stack optimization and reactive ion etch (RIE) process.  Defective fails were found to be shorting modes (hard shorts and soft shorts) that reduce the resistance and TMR.  Failing bits at time=0 are fused out.  Acceptable WER levels and shorter write pulses require overdriving the MTJ, limited by the available drive current and endurance considerations as shown in Fig. 13.

Fig. 13.  Write current distributions limited by available drive current and endurance requirements and read disturb requirements.

The minimum current is that required by read disturb considerations and improves as temperature decreases, hence read disturb is measured at 95C by hammering full words with 1e7 reads.  Write Error Rate curves are shown in Fig. 14 for MTJs scaled from the NVM application and the optimized L4 cache device with 10ns write pulse shown in blue.

Fig. 14. Write Error Rates (WER) for different devices, showing the optimized L4 cache MTJ in blue.

The critical condition for WER is at -10C, but as temperature increases, MTJs become easier to write, and at higher temperatures VCC can be reduced. Endurance measurements are done at 105C due to thermal activation of the defects that cause MgO dielectric breakdown.

 

2.5:  A novel integration of STT-MRAM for on-chip hybrid memory by utilizing non-volatility modulation

J.-H Park, et al, Semiconductor R&D Center, Samsung Electronics co. Ltd.

Samsung illustrates that it is possible to have both high-retention and high-speed STT-MRAM zones in a single 8Mb hybrid-memory chip in 28nm FD-SOI logic, as illustrated in Fig. 15.

Fig. 15. Illustration of on-chip hybrid memory which can have two different sub-zones having MTJ arrays of modulated non-volatility: Zone I has relaxed non-volatility for high speed operation and Zone II has strict non-volatility for high retention requirements.

Retention was demonstrated at 10 years at 220C. For high-speed operation, improvements were made in TMR, short-fail probability, overdrive, and write error rate. By tailoring the magnitude of the perpendicular magnetic anisotropy (PMA) of MTJs without modifying the deposition process, the non-volatility in selected areas can be manipulated. Fig. 16 shows 10-year data retention temperature as a function of MTJ switching current.

Fig. 16. 10-year data retention temperature as a function of MTJ switching current.

To enable high-speed operation, wide read and write margins are required. Read margin is increased by higher TMR at low RA, achieved by minimizing short failures. Two different MTJ processes, Process A and Process B, are compared. Wider write margin is achieved by higher breakdown voltage (shown in Fig. 17), lower switching voltage, a wider voltage margin between read and write, and tighter distributions.

Fig. 17. Breakdown voltage as a function of MTJ resistance.

Fig. 18 shows write shmoo plots for 8Mb eMRAM macros integrated with the two types of MTJs of Process A and Process B, respectively. MTJs of Process A pass with much-reduced write fails at the shorter pulse-width condition.

Fig. 18.   Room temperature write shmoo plots  as a function of pulse width and bitline voltage for two different processes, Process A (a) and Process B (b).

By modulating the PMA energy to manipulate the non-volatility of the MTJs, a highly tunable diversity of performance can be implemented in a single chip, as if multiple heterogeneous memories were embedded. Both high-performance and high-retention memories can thus be implemented in the same chip, forming the hybrid memory.

 

2.6:  Spin-transfer torque MRAM with reliable 2 ns writing for last level cache applications

Hu, et al, IBM-Samsung MRAM Alliance

Reliable 2 ns and 3 ns switching with two-terminal devices, as opposed to low-density, three-terminal SOT (Spin-Orbit Torque) devices, enables fast and dense MRAM products for Last Level Cache (LLC) applications. Reliable 2 ns switching was achieved for an STT-MRAM with 100% WER yield at a 1e-6 write-error floor using 49nm CD MTJs.

In Fig. 19, switching current increases as pulse width decreases for two different free-layer designs, Stack1 and Stack2, annealed at 400C for 60 minutes.

Fig. 19. Switching current vs pulse-width curves of two stacks  with different free-layer materials each showing the thermally activated longer pulse width regime and the shorter pulse width of the precessional switching regime.

For long write pulses of 10 ns and above, the switching is thermally activated, but for pulses shorter than 10 ns it is in the precessional switching regime, governed by the conservation of electron spin angular momentum. LLC applications requiring write pulses <10 ns therefore operate in the precessional switching regime, which is determined by the free-layer material properties. Shorter pulse widths show a steep increase in switching current, degradation of the WER slope, and the occurrence of WER anomalies, all of which are addressed through materials optimization.
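The two regimes described above can be sketched with a simple composite model: a thermally activated term that falls off logarithmically with pulse width, plus a ~1/τ precessional term that dominates for short pulses. The functional forms follow the standard macrospin picture, but the parameter values below are illustrative only, not fitted to the Alliance's data.

```python
import math

def switching_current(tau_ns, ic0=1.0, delta=55.0, tau0_ns=1.0, c_ns=2.0):
    """Sketch of switching current vs write pulse width tau (ns).
    thermal: thermally activated regime, falling logarithmically with
             pulse width (delta = Eb/kT; 55 is the paper's median barrier).
    precessional: ~1/tau contribution dominating for short pulses.
    ic0, tau0_ns and c_ns are illustrative placeholders."""
    thermal = ic0 * (1.0 - math.log(tau_ns / tau0_ns) / delta)
    precessional = ic0 * c_ns / tau_ns
    return thermal + precessional
```

The steep 1/τ rise below ~10 ns is exactly why 2 ns LLC writes demand both extra drive current and the free-layer materials optimization the paper describes.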

254 devices fabricated with free-layer type I having a nominal size of 49nm and median energy barrier Eb=55kT reached the required 1e-6 WER floor with 2 ns write pulses, illustrated in Fig. 20.  A single device with CD=49nm and 2 ns write pulses reached the 1e-11 WER floor.

Fig. 20. (a) WER as a function of write voltage, reaching the required 1e-6 error floor; also shown are the shape and duration of the 2ns pulse, with a FWHM of 1.7ns.

In a test of smaller 36nm MTJs, all 256 devices tested with 3ns write pulses reached the 1e-6 error floor and 242 of 256 devices tested with 2 ns write pulses reached the 1e-6 error floor for W0 operation while 228 reached the required error floor for W1 operation.  Reference layer WER anomalies known as backhopping were observed.

 

2.7:  22nm STT-MRAM for Reflow and Automotive Uses with High Yield, Reliability and Magnetic Immunity with Performance and Shielding Options

J. Gallagher, et al, Taiwan Semiconductor Manufacturing Company

A 32Mb embedded STT-MRAM in 22nm was produced using a cell area of 0.046 um2, accommodating MTJs of varying CDs for different retention and performance requirements. The technology supports 6x solder-reflow capability and -40C to 150C operation with data retention >10 years. The most recent process gave zero median time-zero die bit fails per wafer, the main improvement being the elimination of MTJ shorting defects. The main challenge for high yield at 150C is the reduction of the read window due to the falloff of TMR with temperature, as shown in Fig. 21.

Fig. 21. Read window reduction due to falloff of TMR with temperature.

Due to the stochastic nature of magnetic switching, write-verify-write is used, where the first shots use lower-amplitude write pulses both for power savings and for minimization of endurance stress. If multiple low-amplitude shots do not result in a successful write, final high-amplitude write pulses may be needed to achieve high yields. At 25C all cells were written successfully with one shot, whereas at -40C, 0-15% of the dice needed a second shot. Solder reflow reliability was demonstrated through six simulated reflow cycles, equivalent to 10-year retention at 225C. Since endurance has the highest failure rates under low-temperature cycling, 1e6 write cycles were tested at -40C; the resulting 0.029 ppm fails were within the 1 ppm margin for ECC. There was no change in parallel or anti-parallel cell read current distribution after 100K cycles at -40C, as shown in Fig. 22.

Fig. 22 Showing no change in either parallel (Rp) or anti-parallel (Rap) cell read current after 100K cycles
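The multi-shot write-verify-write scheme described above reads naturally as a small control loop. The sketch below is a hypothetical illustration (the success probabilities are invented, not TSMC's): low-amplitude shots are tried first, with a high-amplitude pulse only as a fallback.

```python
import random

def attempt_write(cell, value, amplitude):
    """One stochastic write shot followed by a verify (read-back).
    The success probability is invented for illustration; in this toy
    model a shot at amplitude >= 1.0 always succeeds."""
    if random.random() < min(1.0, 0.5 + 0.5 * amplitude):
        cell["state"] = value
    return cell["state"] == value

def write_verify_write(cell, value, low_shots=3, low_amp=0.4, high_amp=1.0):
    """Low-amplitude shots first (power savings, less endurance stress),
    then one high-amplitude pulse only if they all fail to verify."""
    for _ in range(low_shots):
        if attempt_write(cell, value, low_amp):
            return True
    return attempt_write(cell, value, high_amp)
```

The design choice mirrors the paper's data: at 25C the first low-amplitude shot almost always verifies, so most cells never see the stressful high-amplitude pulse, which is what preserves endurance while keeping yield high.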

Read disturb rates showed < 1ppm for 1e12 cycles, as shown in Fig. 23 as a function of bitline bias voltage.

Fig. 23 Read disturb rates showed < 1ppm for 1e12 cycles, as a function of bitline bias voltage.

Investigations of magnetic immunity showed stand-by bit error rates for packaged MRAM arrays to be below 1ppm BER for 10-year exposures of 1100, 750 and 600 Oe at 25C, 85C and 125C respectively as shown in Fig. 24.

Fig. 24.  Packaged MRAM arrays below 1ppm BER for 10-year exposures of 1100, 750 and 600 Oe at 25C, 85C and 125C respectively.

In-package shielding was used to protect against a tampering attack with a 3.5kOe magnet. Failure rates of an unshielded sample were ~30% after ~1 second, whereas the shielded part had <1 ppm fails after 80 hours at 25C, a >1e6 reduction in sensitivity.

Parts with smaller CDs were used for higher performance, trading off solder-reflow capability but still retaining very high retention of >10 years at >150C. Tables II and III show read and write performance for a 0.038um2 cell. Table II shows the read time and voltage shmoo at 125C, demonstrating a 6ns read cycle.

Table II.  Shmoo showing read pulse width and bitline voltage at 125C

Table III shows bit line write voltage and programming pulse width shmoo for multi-shot programming at -40C. The smaller CDs achieved endurance of better than one ppm after 1e9 write cycles at -40C.

Table III.  Shmoo showing bitline write voltage with pulse width for multi-shot programming at -40C.

 


Who is Driving This Car Anyway?

by Roger C. Lanctot on 01-01-2020 at 10:00 am


My Lyft driver in San Jose thought his Hyundai had “autopilot,” alluding, I suspected, to Tesla Motors’ feature of the same name, which has placed that company at the forefront of driving automation development and the focal point of fatal crash investigations. Before either of us got hurt, I gently disabused my driver of his dangerous delusion, pointing out that his car was likely equipped with lane keeping technology and, possibly, adaptive cruise control and/or automatic emergency braking.

All the driver knew was that on different occasions his car, on its own, had avoided colliding with other cars, primarily by slowing or stopping.

This is the conundrum facing the automotive industry on the cusp of a new decade and another Consumer Electronics Show (2020) opening within a week in Las Vegas. How to make cars safer without making drivers less careful? My Lyft driver was a newly-minted fan of collision avoidance technology without really understanding why or how it worked.

The issue seems relatively benign on the surface, but it touches the core marketing challenges of making cars safer without making them too expensive, and of defining an evolutionary path to fully autonomous driving. Strategy Analytics research has routinely shown that safety technology is in demand from consumers. It is something consumers are looking for, and willing to pay for, in a new car.

But safety is being redefined as auto makers and regulators shift the focus from passive safety (airbags, seatbelts, child restraints, etc.) to active safety specifically designed to avoid collisions, by allowing on-board vehicle systems to seize control of the car – under appropriate circumstances.

Nvidia opened this Pandora’s box at CES 2019 with the introduction of its DRIVE Autopilot system, described by the company as “Level 2+.” The DRIVE Autopilot is intended to integrate multiple sensor suites to deliver a variety of assisted driving functions including lane keeping, driver monitoring, and adaptive cruise control while being scalable to higher levels of automated driving.

In its own words, Nvidia described the DRIVE AutoPilot as integrating “for the first time high-performance NVIDIA Xavier system-on-a-chip (SoC) processors and the latest NVIDIA DRIVE Software to process many deep neural networks (DNNs) for perception as well as complete surround camera sensor data from outside the vehicle and inside the cabin. This combination enables full self-driving autopilot capabilities, including highway merge, lane change, lane splits and personal mapping. Inside the cabin, features include driver monitoring, AI copilot capabilities and advanced in-cabin visualization of the vehicle’s computer vision system.”

The announcement reflected the desperation of companies like Nvidia, Intel, Qualcomm, Renesas, and, yes, Tesla itself – to deliver an affordable, mass market self-driving or near self-driving experience. Nvidia’s choice of “Level 2+” terminology was an effort to distinguish the product from competing system-on-chip (SoC) solutions and define a “new” market segment.

The reality is that there is no such thing as Level 2+. Nvidia is attempting to suggest a value proposition that is more than just an advanced driver assist system (ADAS) which requires the driver to remain engaged and vigilant at the steering wheel. DRIVE Autopilot is “something” more.

There are two problems with this Nvidia marketing proposition. First of all, it perpetuates the perplexity brought on by Tesla’s own Autopilot offering which is decidedly NOT an autonomous driving system and definitely DOES require drivers to pay attention and keep their hands on the wheel.

The second problem with Nvidia’s Level 2+ nomenclature, aside from the fact that it lacks an endorsement from standards-setting or regulatory bodies, is that it is not a single thing. While it highlights the limitations of existing ADAS systems, it fails to remedy these shortcomings completely and fails to define a marketable consumer value proposition.

My colleague, Ian Riches, vice president of the global automotive practice at Strategy Analytics, summed up the issue in a seminar in Tokyo nearly a month ago when he asked the attendees (car makers and their suppliers): “How many consumers will pay for this technology?” The operative term: “consumers.”

The great virtue of Nvidia’s messaging and positioning is that the company emphasizes the integration of external sensing systems with driver monitoring technology. This is the value proposition that every car maker is wrestling with: How to assist drivers while at the same time insisting that drivers continue to pay attention to the driving task?

General Motors is something of a leader, along with Tesla Motors, in bringing what could be described as Level 2+ systems to market in the form of Super Cruise and Autopilot, respectively. Of course, these two systems work in different ways – and Super Cruise, a hands-free adaptive cruise control system, is a $2,500 option available on a limited range of Cadillacs. (General Motors has yet to set a date for the launch of Ultra Cruise – and has been forced to reconfigure Super Cruise to compensate for sunlight interfering with the original Super Cruise sensors.)

German auto makers Daimler and Audi have been advancing their driver assist portfolios toward automation, with Audi flirting with Level 3 automation in Europe. Nissan has brought ProPILOT to market to mixed reviews and Toyota is preparing a 2020 launch for its Team Mate driver assistant reputedly capable of lane changing, merging, and passing.

Nvidia’s DRIVE Autopilot helps to deliver all of these value propositions, but it does so at considerable cost. Next week at CES 2020 there will no doubt be many more demonstrations and announcements addressing assisted driving. The question remains as to whether and what kind of market there is for these solutions. In the words of my colleague, Ian, “How on earth will we get a return on these investments this side of 2030?”

The question is much simpler for me. Drivers should pay attention when driving and cars should not collide with other cars, pedestrians, or inanimate objects. The fact that cars DO collide with things quite routinely and with catastrophic results is but one indication that we are failing as an industry. Given the societal cost of 1.3M annual highway fatalities globally, the ongoing effort to enhance vehicle safety is worth the short-term confusion of misleading nomenclature and the high cost of research. This is the highest and most important calling in today’s automotive industry.


Author Interview: Bernard Murphy on his latest book

by Daniel Nenni on 01-01-2020 at 10:00 am

Book Cover

Over the last 40 years, Bernard has worked with semiconductor and EDA companies in hands-on, management and consulting roles in engineering, sales and marketing. He most recently co-founded Atrenta where he created and led the development of SpyGlass, retiring as CTO when Atrenta was acquired by Synopsys. Post-retirement, he’s been an active blogger for SemiWiki. He’s also written a couple of books under the SemiWiki label and he independently advises a number of clients on marketing content.

Why did you decide to write The Tell-Tale Entrepreneur?

Over the last 40 years, I’ve created, suffered through, and edited more than my fair share of biz and tech communication. That has reinforced my view, widely shared, that our communication is pretty bad. Pitches, blogs, and white papers aiming to convince are at best unconvincing, at worst painful. What’s curious is that all of us, from engineers to CEOs, think we’re good at PowerPoint, yet we’re terrified at the thought of writing. PowerPoint feels like a familiar template, communicate-by-numbers. Word has no template; we have to start from a blank sheet, hence the terror. Which suggests the format is beside the point: we suck at communication either way. PowerPoint just lulls us into thinking we don’t.

I started out just as bad, but I worked hard to improve. Through a lot of trial and error I believe I figured out the problem. I want to pass this on, not through a boring how-to book but through an entertaining set of short stories. Designed to show you how to make your communication just as engaging.

How is storytelling different and why is it better than other ways to communicate?

Take PowerPoint as a reference. It’s well established and has great value in structured contexts where efficient information transfer is the goal. Status updates, project planning, training, technical due diligence. But it’s weak in persuasion. Where you need to convince a client, prospect, investor that you have the best product for their needs. Or you are their best possible partner. Or you failed to deliver what you promised and now must rescue the relationship.

These are times when slides are the wrong answer. You have to connect emotionally with your audience – building excitement around a new direction, dealing with fear of possible failure, maybe pointing out pitfalls that you already understand well. Or convincing them there will be no more mistakes. Eyeball to eyeball conversations.

The best way to guide that conversation is through a story. You’re looking at them (not slides), they’re looking at you, and you’re telling them a story, appealing to their emotions.

Storytelling isn’t a new idea. What makes your approach different?

Storytelling is a very old idea. Today we relegate stories to entertainment, expecting that business communication needs a more professional approach. Our brains don’t agree. We’ve been telling stories from the beginning of time, not just to entertain but also to pass on wisdom, culture, beliefs, and laws. Our brains are wired to receive stories efficiently, motivating us to action. Not so for data and logic dumps. That’s why we find PowerPoints so boring and start scrolling through texts, emails, anything rather than listen to the speaker.

Instead tell a story. Stories are naturally engaging, especially when they sound roughly relevant to the audience’s goals. We want to know what’s going to happen next. Calling the hero to adventure. Facing tests together, proving ourselves a worthy mentor. The big challenge where it could all go wrong, but somehow our hero makes it through, now stronger, more capable. And the final challenge. Much more interesting than texts and emails. They learn what you can do for them along the way, in a context they recognize.

Storytelling is big in marketing now. Tons of advice online on how to do it – blogs, whitepapers, companies who want to advise you (for a fee). But there’s something a bit odd about this advice. It all seems to come in the same business communication standard format: explanations, bullet lists and charts. You want to learn how to tell stories. Wouldn’t it be better to do that by reading stories?

That’s my innovation – I explain how to tell stories by telling you stories.

Who do you think will find value in this book?

Anyone in tech who must communicate with customers, prospects, or investors; who wants to reach markets through blogs and white papers; or who aspires to sell their company. All will relate to the experiences in these stories and, I hope, will be inspired to re-imagine and improve their own stories, based on these examples.

The book is also written for a general audience. Anyone interested in real stories drawn from different phases in the lives of tech ventures. Technology plays a role in these stories but isn’t primary, so I’ve simplified quite a lot. People, opportunities, challenges, growth are the most important elements.

These stories are for everyone, but especially for us communicators in tech. Most important, I hope you will begin to understand why our audiences are thirsting for stories, not more death-by-PowerPoint.

Where can we find the book?

The Tell-Tale Entrepreneur is available for pre-order on Amazon and will be released on January 26th, 2021.


ANSYS, TSMC Document Thermal Reliability Guidelines

ANSYS, TSMC Document Thermal Reliability Guidelines
by Bernard Murphy on 01-01-2020 at 6:00 am

Automotive Reliability Guide

Advanced IC technologies, 5nm and 7nm FinFET design and stacked packaging, are enabling massive levels of integration of super-fast circuits. These in turn enable much of the exciting new technology we hear so much about: mobile gaming and ultra-high definition mobile video through enhanced mobile broadband in 5G, which requires support for millimeter wave frequencies; high-speed networking in hyperscale datacenters through 100G connectivity; blazing fast AI accelerators in those same datacenters; and fusion of multiple sensor sources to build environment-aware intelligence for automotive safety and autonomy, building security, autonomous drones and many more capabilities.

With new technologies we always find new challenges. ANSYS and others have been hearing from chip and system builders in these domains that they are seeing increasing post-silicon failures in the devices they are building. These devices are nominally perfectly fine and pass standard testing, but fail in system operation, primarily in connection with voltage, timing and process variations. Tianhao Zhang (Dir. Foundry Relations at ANSYS) says that between what they are hearing from customers and industry reviews, 75% of these product failures can be attributed to thermal or vibration effects.

Thermal problems also increase cost through the need for more advanced cooling; they reduce performance through increased interconnect resistance and degraded transistor behavior; and they increase noise, leading to random failures. They also decrease reliability, on chip through electromigration and device aging, and in the package and system through mechanical stress due to warping.

This is not a problem that can be dealt with later. One chip design VP has said that self-heating (related to FinFETs) and thermal analysis are now absolute requirements for automotive and high-performance computing applications. Another noted that compared to planar designs they are now seeing temperature increases in metal of 10 to 20 degrees, and that is making design for reliability much more challenging.

TSMC has been hearing all the same issues and has been increasing the number of checks they require, particularly thermal checks, to offset these types of problem. TSMC has worked closely with ANSYS to prove and document a thermal solution they jointly support. This includes an ANSYS reference flow for transistor, chip and package/3D-IC levels, from 20nm down to 5nm. These can be downloaded from the TSMC portal.

They are also working together on solution guides for specific application flows. For example, ANSYS now provides solution guides for automotive development on 16nm and 7nm. These cover electromigration, thermal and ESD topics. In the thermal analysis section, the document details multiple areas including the flow, and also provides test cases and case studies.

The ANSYS analysis is not based on a simple averaging of thermal effects. They analyze all the way down to the physical implementation of transistors and interconnect systems under representative activity scenarios, to estimate local heating, interconnect heating and heat dissipation. They do this using analytics from RedHawk, together with finite-element analysis applied at the die, stacked-die, package and board level. And they compute the temperature profile by looking at (thermal) conduction, radiation and convection flows, the last of these through detailed fluidics analysis. This is a true bottom-up multi-physics solution. You can learn more in THIS WEBINAR, presented by Tianhao and Karthik Srinivasan (Sr Prod Mgr at ANSYS).
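To get an intuition for the conduction part of such an analysis, the toy sketch below solves 1D transient heat conduction with an explicit finite-difference scheme. This is purely illustrative and not the ANSYS flow: real tools build full 3D finite-element models of the die, package and board, and the material constants and power density used here are assumed values.

```python
import numpy as np

def simulate_temperature(power_w_per_m3, alpha=9e-5, length=1e-3,
                         n=50, dt=1e-6, steps=2000,
                         rho_cp=1.6e6, t_ambient=25.0):
    """Temperature profile along a 1D silicon bar with local heating.

    power_w_per_m3: per-node volumetric heat source (e.g. a hot transistor row).
    alpha: thermal diffusivity (~9e-5 m^2/s for silicon, assumed value).
    rho_cp: volumetric heat capacity (J/m^3/K, assumed value).
    Solves dT/dt = alpha * d2T/dx2 + q / (rho*cp) with fixed-temperature ends.
    """
    dx = length / (n - 1)
    T = np.full(n, t_ambient)
    for _ in range(steps):
        # Central-difference Laplacian; interior points only matter since
        # the boundary values are overwritten each step.
        lap = (np.roll(T, -1) - 2 * T + np.roll(T, 1)) / dx**2
        T += dt * (alpha * lap + power_w_per_m3 / rho_cp)
        T[0] = T[-1] = t_ambient  # ends held at ambient (ideal heat sink)
    return T

# A localized hotspot in the middle of the bar (hypothetical activity profile):
q = np.zeros(50)
q[20:30] = 5e9  # W/m^3
profile = simulate_temperature(q)
print(f"peak temperature: {profile.max():.1f} C")  # hottest node sits over the source
```

Note that the explicit scheme is only stable when alpha·dt/dx² ≤ 0.5; production solvers use implicit or adaptive time stepping to avoid that constraint, and couple conduction with the radiation and convection terms mentioned above.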


Ten Trends of Blockchain in 2020

Ten Trends of Blockchain in 2020
by Ahmed Banafa on 12-31-2019 at 6:00 am

Ten Trends of Blockchain in 2020

It’s clear that blockchain will revolutionize operations and processes in many industries and government agencies if adopted, but adoption requires time and effort. Blockchain technology will also stimulate people to acquire new skills, and traditional businesses will have to completely reconsider their processes to harvest the maximum benefit from this promising technology. [2]

The following 10 trends will dominate blockchain technology in 2020:

1. Blockchain as a Service (BaaS) By Big Tech Companies
One of the promising blockchain trends in 2020 is BaaS, short for Blockchain as a Service. It is a new offering currently being adopted by a number of startups as well as enterprises. BaaS is a cloud-based service that enables users to develop their own digital products by working with blockchain. These digital products may be smart contracts, decentralized applications (Dapps), or other services that can work without requiring setup of a complete blockchain-based infrastructure.

Among the companies providing BaaS, and consequently shaping the future of blockchain applications, are Microsoft and Amazon. [1]

2. Federated Blockchain Moves to Center Stage
Blockchain networks can be classified as private, public, federated or hybrid. Federated blockchain is one of the latest trends in the industry: an upgraded form of the basic blockchain model that makes it better suited to many specific use cases.

In this type of blockchain, instead of one organization, multiple authorities control the pre-selected nodes, and this selected group of nodes validates each block so that transactions can be processed. In 2020 there will be a rise in the usage of federated blockchains, as they give private blockchain networks a more customizable outlook. [1]

3. Stablecoins Will Be More Visible
Bitcoin, like most cryptocurrencies, is highly volatile in nature. Stablecoins entered the picture to avoid that volatility, with a stable value associated with each coin. As of now, stablecoins are in their initial phase, and it is predicted that 2020 will be the year when blockchain stablecoins achieve their all-time high. [1]

One driving force for stablecoins is the planned introduction of Facebook’s cryptocurrency “Libra” in 2020, even with all the challenges facing Facebook’s proposal and the shrinking circle of partners in libra.org. [4]

4. Social Networking Problems Meet Blockchain Solution
There were around 2.77 billion social media users around the globe in 2019.

The introduction of blockchain in social media will be able to solve the problems related to notorious scandals, privacy violations, data control, and content relevance. Therefore, the blockchain blend in the social media domain is another emerging technology trend in 2020.

With the implementation of blockchain, it can be ensured that published social media data remains untraceable and cannot be duplicated, even after its deletion. Moreover, users will get to store data more securely and maintain ownership of it. Blockchain also ensures that the power of content relevance lies in the hands of those who created the content, instead of the platform owners, which makes users feel more secure as they can control what they want to see. One daunting task is convincing social media platforms to implement it; this can happen on a voluntary basis or as a result of privacy laws similar to GDPR. [1]

5. Interoperability and Blockchain Networks
Blockchain interoperability is the ability to share data and other information across multiple blockchain systems as well as networks. This function makes it simple for the public to see and access the data across different blockchain networks. For example, you can send your data from one Ethereum blockchain to another specific blockchain network. Interoperability is a challenge but the benefits are vast [5].

6. Economy and Finance Will Lead Blockchain Applications
Unlike other traditional businesses, the banking and finance industries don’t need to introduce radical transformation to their processes to adopt blockchain technology. After blockchain was successfully applied to cryptocurrency, financial institutions began seriously considering it for traditional banking operations.

According to a PwC report, 77 percent of financial institutions are expected to adopt blockchain technology as part of an in-production system or process by 2020.

Blockchain technology will allow banks to reduce excessive bureaucracy, conduct faster transactions at lower cost, and improve confidentiality. One blockchain prediction made by Gartner is that the banking industry will derive 1 billion dollars of business value from the use of blockchain-based cryptocurrencies by 2020.

Moreover, blockchain can be used for launching new cryptocurrencies that will be regulated or influenced by monetary policy. In this way, banks want to reduce the competitive advantage of standalone cryptocurrencies and achieve greater control over their monetary policy. [2]

7. Blockchain Integration into Government Agencies
The idea of the distributed ledger is also very attractive to government authorities that have to administrate very large quantities of data. Currently, each agency has its separate database, so they have to constantly require information about residents from each other. However, the implementation of blockchain technologies for effective data management will improve the functioning of such agencies.

According to Gartner, by 2022 more than a billion people will have some data about them stored on a blockchain, but they may not be aware of it. National cryptocurrencies will also appear; it is inevitable that governments will have to recognize the benefits of blockchain-derived currencies. Digital money is the future, and nothing will stop it. [3]

8. Blockchain Combines with IoT
The IoT tech market will see a renewed focus on security as complex safety challenges crop up. These complexities stem from the diverse and distributed nature of the technology. The number of Internet-connected devices has breached the 26 billion mark. Device and IoT network hacking will become commonplace in 2020. It is up to network operators to stop intruders from doing their business.

The current centralized architecture of IoT is one of the main reasons for the vulnerability of IoT networks. With billions of devices connected and more to be added, IoT is a big target for cyber-attacks, which makes security extremely important.

Blockchain offers new hope for IoT security for several reasons. First, blockchain is public: everyone participating in the network of nodes can see the blocks and the transactions stored in them and approve them, although users can still have private keys to control their own transactions. Second, blockchain is decentralized, so there is no single authority that can approve transactions, eliminating the single-point-of-failure (SPOF) weakness. Third, and most importantly, it is secure: the database can only be extended, and previous records cannot be changed. [7]
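The append-only, tamper-evident property described above can be sketched in a few lines. This is a hypothetical minimal example, not a real blockchain: it shows hash chaining only, with no consensus, signatures, or peer-to-peer replication.

```python
import hashlib
import json

def block_hash(block):
    # Deterministic SHA-256 over the block's contents, including prev_hash.
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def append_block(chain, data):
    # Each new block commits to the hash of its predecessor.
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"data": data, "prev_hash": prev})
    return chain

def chain_is_valid(chain):
    # Every block must reference the actual hash of the block before it.
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

chain = []
for reading in ["sensor:21.5C", "sensor:21.7C", "door:open"]:
    append_block(chain, reading)

print(chain_is_valid(chain))        # True
chain[1]["data"] = "door:locked"    # tamper with a past record...
print(chain_is_valid(chain))        # False -- later links no longer match
```

Because each block commits to its predecessor's hash, changing any past record invalidates every later link; replicated across many nodes, this is what makes the ledger effectively append-only.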

Many IoT-based companies are adopting blockchain technology for their business solutions. The International Data Corporation (IDC) expects that 20 percent of IoT deployments will enable blockchain services by 2020. [3]

9. Blockchain with AI 
Integrating AI (artificial intelligence) with blockchain technology will benefit both: the combination promises to improve blockchain technology itself and to open up a wide range of applications.

The International Data Corporation (IDC) suggests that global spending on AI will reach $57.6 billion by 2020 and 51% of businesses will be making the transition to AI with blockchain integration.

Additionally, blockchain can make AI more coherent and understandable: we can trace and determine why decisions are made in machine learning, because the ledger can record all the data and variables that go into a decision.

Moreover, AI can boost blockchain efficiency far better than humans, or even standard computing, can. The way blockchains currently run on standard computers proves this, with a lot of processing power needed to perform even basic tasks.

Examples of applications of AI in Blockchain: Smart Computing Power, Creating Diverse Data Sets, Data Protection, Data Monetization, Trusting AI Decision Making. [6]

10. Demand for Blockchain Experts 
Blockchain is a new technology, and only a small percentage of individuals are skilled in it. As blockchain becomes a fast-growing and widespread technology, many people will have the opportunity to develop blockchain skills and experience.

Even though the number of blockchain experts is increasing, the implementation of the technology is growing so rapidly that demand for blockchain experts will outstrip supply in 2020. [3]

It’s worth saying that there are genuine efforts by universities and colleges to catch up with this need, but the rate of graduating students with enough blockchain skills is not enough to fill the gap. Companies are also taking steps to build on their existing talent by adding training programs for developing and managing blockchain networks.

Ahmed Banafa, author of the books:

Secure and Smart Internet of Things (IoT) Using Blockchain and AI

Blockchain Technology and Applications

Read more articles at: https://medium.com/@banafa

References

[1] https://www.mobileappdaily.com/top-emerging-blockchain-trends

[2] https://www.aithority.com/guest-authors/blockchain-technology-in-the-future-7-predictions-for-2020/

[3] https://www.bitdeal.net/blockchain-technology-in-2020

[4] https://medium.com/altcoin-magazine/to-libra-or-not-to-libra-e2d5ddb5455b

[5] https://blockgeeks.com/guides/cosmos-blockchain-2/

[6] https://medium.com/altcoin-magazine/blockchain-and-ai-a-perfect-match-e9e9b7317455

[7] https://medium.com/@banafa/ten-trends-of-iot-in-2020-b2


TSMC, Huawei, the US Government, and China

TSMC, Huawei, the US Government, and China
by Daniel Nenni on 12-30-2019 at 6:00 am

Morris Chang TSMC

The media is trying to disparage the semiconductor industry again. It’s hard not to take this type of desperate journalism personally. Semiconductor people are the smartest and hardest working people in the world and we deserve better, absolutely.

Morris and Sophie Chang TSMC

TSMC founder sees trade dispute as ‘reality show with no script’ July 2018

The latest media scam is that the US Government is pressuring TSMC about stopping wafer shipments to Huawei (HiSilicon). The Financial Times started it with “US urges Taiwan to curb chip exports to China” and the cut/paste media sites jumped all over it and “made it their own”.

TSMC responded with:

“We did not have any discussion with either the Taiwan or the U.S. governments regarding shipping wafers to HiSilicon, nor have we received any instruction from either government not to make the shipments,” TSMC spokesperson Elizabeth Sun told Caixin in an email, adding that it will continue shipments while complying with trade regulations.

Remember, TSMC has two fabs in China and plenty of room for expansion. The US accounts for 61% of TSMC’s revenue and China is a growing 17%. Taiwan is 8%, Japan 6% and others are 1%. The question is: What would happen if TSMC cut wafer shipments to the US or China? Answer: The end of modern life as we now know it.

Another ignorant quote:

“Last month, a U.S. official informed Taiwanese diplomats that the semiconductors produced by TSMC and then procured by Huawei, were ending up in Chinese missile guidance systems aimed at Taiwan, as per the reporting by Financial Times.”

I can assure you TSMC knows more about what their customers are doing than politicians in any country including Taiwan. There are very few secrets inside the fabless semiconductor ecosystem and TSMC knows more than most. And does it really matter who made what, when, and where in the case of war? It doesn’t matter because there is nothing you can do about it. That ship sailed a long time ago.

Bottom line: TSMC is the new Switzerland and has the full support of the US, Taiwan, and China Governments.

Another interesting headline:

“Samsung is pouring $116 billion towards beating TSMC in the race to 5nm and beyond”

First and foremost, TSMC has already won the race to 5nm and EUV if the finish line is high volume manufacturing versus press releases or “leaked” road maps.

In order for Apple to ship millions of iProducts in Q4 2020, the 5nm EUV process must be frozen by the end of 2019, with production starting in Q1 2020. In fact, TSMC recently outlined their 5nm process at IEDM.

I remember when SMIC launched in 2000 and suggested that they would compete with TSMC. It was believable to me because the China Government was strongly behind them and the China consumer market was theirs for the taking. Unfortunately, competing with TSMC proved too hard for SMIC who then resorted to stealing trade secrets. The resulting litigation cost SMIC hundreds of millions of dollars and 10% of their stock.

To say that SMIC is a trailing edge foundry is quite generous. SMIC has just now released a 14nm process, four years after TSMC, which is now at 5nm with full EUV. SMIC doesn’t even have an EUV machine yet, and they may not get one if the current political turmoil is not properly addressed.

According to reports, the SMIC 14nm was co-developed with Qualcomm who also worked with TSMC and Samsung on 14/16nm processes. I’m sure the TSMC and Samsung legal staff already have SMIC 14nm die under review.

GlobalFoundries also had their sights set on competing with TSMC but that never really happened, not even close.

Samsung officially became a pure-play foundry in 2017 when they reorganized all of their logic fabs under Samsung Foundry. Samsung Electronics is Samsung Foundry’s biggest customer of course but they do have a long history of external foundry business. Apple was the big start with the introduction of the iProducts and other big fabless companies (Qualcomm) have followed.

Samsung certainly is a leader in connectivity and IoT now that all Samsung appliances, TVs, and other electronic gadgets have WiFi so they can talk to you throughout the day. You should see the Samsung booth at CES. It’s more of a connected city than a trade show booth but I digress.

Bottom line: While Samsung’s “pouring $116 billion towards beating TSMC” is impressive, you have to understand that the TSMC ecosystem of partners and customers has poured trillions of dollars into keeping TSMC ahead of all foundry comers, right?