Lately, I’ve been cataloging the growing number of impossible-to-verify technologies we face. All forms of machine learning and inference applications fall into this category. I’ve yet to see a regression test that proves a chip for an autonomous driving system will do the right thing in all cases. Training data bias is another interesting… Read More
The Story of Ultra-WideBand – Part 4: Short latency is king
How Ultra-wideband aligns with 5G’s premise
In Part 3, we discussed time-frequency duality, or how time and bandwidth are interchangeable: to compress a wireless transmission in time, more frequency bandwidth is needed. As we saw there, this property can be used to increase the accuracy of ranging. Another very… Read More
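As a rough illustration of why more bandwidth improves ranging (my own sketch, not taken from the article), the snippet below applies the standard radar range-resolution approximation ΔR ≈ c / (2B). The 20 MHz and 500 MHz channel widths are assumptions chosen only to contrast a narrowband channel with a typical UWB channel.

```python
# Illustrative only: range resolution improves as bandwidth grows,
# following the standard approximation delta_R ~ c / (2 * B).
C = 299_792_458.0  # speed of light, m/s

def range_resolution_m(bandwidth_hz: float) -> float:
    """Approximate achievable range resolution for a given RF bandwidth."""
    return C / (2.0 * bandwidth_hz)

# Assumed example bandwidths (not from the article):
for label, bw in [("20 MHz narrowband channel", 20e6),
                  ("500 MHz UWB channel", 500e6)]:
    print(f"{label}: ~{range_resolution_m(bw):.2f} m resolution")
# 20 MHz  -> ~7.5 m
# 500 MHz -> ~0.3 m
```

The 25x wider channel yields a roughly 25x finer range resolution, which is the time-bandwidth trade the article describes.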
The Story of Ultra-WideBand – Part 3: The Resurgence
In Part 2, we discussed the second false start of Ultra-WideBand (UWB), which leveraged over-engineered orthogonal frequency-division multiplexing (OFDM) transceivers, launched at the dawn of the Great Recession, and was surpassed by a new generation of Wi-Fi transceivers. These circumstances spelled the end of the proposed applications… Read More
The Story of Ultra-WideBand – Part 2: The Second Fall
Over-engineered to perfection, outmaneuvered by Wi-Fi
In Part 1 of this series, we recounted the birth of wideband radio at the turn of the 20th century, and how the superheterodyne radio killed wideband radios for messaging after 1920. But RADAR kept wideband research alive through World War II and the Cold War. Indeed, the story of… Read More
The Story of Ultra-WideBand – Part 1: The Genesis
In the middle of the night of April 14, 1912, the R.M.S. Titanic sent a distress message. It had just hit an iceberg and was sinking. Even though broadcasting an emergency wireless signal is common today, it was cutting-edge technology at the turn of the 20th century, made possible by the invention of a broadband radio developed… Read More
Edge Computing – The Critical Middle Ground
Ron Lowman, product marketing manager at Synopsys, recently posted an interesting technical bulletin on the Synopsys website entitled How AI in Edge Computing Drives 5G and the IoT. There’s been a lot of discussion recently about the emerging processing hierarchy of edge devices (think cell phone or self-driving car), cloud… Read More
FPGAs in the 5G Era!
FPGAs, today and throughout the history of semiconductors, play a critical role in design enablement and electronic systems. That is why we included the history of FPGAs in our book “Fabless: The Transformation of the Semiconductor Industry” and added a new chapter on the history of Achronix in the 2019 edition.
In a recent blog… Read More
ANSYS, TSMC Document Thermal Reliability Guidelines
Advanced IC technologies such as 5nm and 7nm FinFET design and stacked packaging are enabling massive levels of integration of super-fast circuits. These in turn enable much of the exciting new technology we hear so much about: mobile gaming and ultra-high-definition mobile video through enhanced mobile broadband in 5G, which requires… Read More
The First Must-Have in 5G
If I were asked about must-have needs for 5G, I’d probably talk about massive MIMO and a lot of exotic parallel DSP processing, and perhaps the need for intelligent new approaches to link adaptation and network slicing in the infrastructure. But there’s something that comes before all that digital cleverness, in … Read More
Shipments of 5G Smartphones Will Surge to 900 Million Units in 2024
5G smartphone shipments will increase from just 13 million units in 2019 to 900 million in 2024, while shipments of earlier 2G/3G/4G smartphones decline slightly over the 2019-2024 period, reaching parity with 5G smartphones in 3Q 2023, as shown in the chart below.
According to The Information Network’s report “Hot ICs: … Read More
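As a back-of-the-envelope check on the magnitude of that surge (my own arithmetic, not figures from the report), the sketch below computes the compound annual growth rate implied by the 13 million (2019) and 900 million (2024) endpoints, assuming simple compound growth between them.

```python
# Implied compound annual growth rate (CAGR) between the two endpoints
# quoted above; the shape of the intermediate years is assumed, not
# taken from The Information Network's report.
units_2019 = 13e6    # 5G smartphone shipments in 2019
units_2024 = 900e6   # forecast 5G smartphone shipments in 2024
years = 2024 - 2019

cagr = (units_2024 / units_2019) ** (1 / years) - 1
print(f"Implied CAGR 2019-2024: {cagr:.0%}")   # roughly 133% per year
```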
IEDM 2025 – TSMC 2nm Process Disclosure – How Does it Measure Up?