Compute and Communications Perspectives on Automotive Trends
by Bernard Murphy on 03-19-2025 at 6:00 am

Automotive electronics is a fast-moving space, especially around sensing and distilling intelligence from that sensing. This serves three main pillars: autonomy, electrification, and advances in the car cockpit. Autonomy at multiple levels remains an important goal and continues to advance, technically and geographically. Now the automotive cockpit has become a focus for innovation, with advances in infotainment, connectivity, driver and occupancy monitoring systems, even health monitors. There is plenty of demand for new and advanced functionality, yet OEMs are determined to minimize bill of materials and to emphasize software-defined functionality for flexibility and growth. Which, coming back to sensing, demands higher levels of integration in multi-modal sensing, fusion, and connectivity from the edge to zonal controllers to central compute. Cadence just released a webinar on this topic, hosted by Amit Kumar (Director, Product Marketing and Management, Tensilica Product Group) and William Chen (Group Director, Product Marketing for Protocol Interface IPs).


Trends and Solutions in Sensing

Modern cars now run to 40 sensors or more: cameras, radar, lidar, with even thermal on the horizon. Some sensors are narrow-focus and long-range for forward and rear collision avoidance at speed, some are medium-range, perhaps for more detailed scene analysis, and side sensors are shorter-range still for detection in blind spots. Since each has its strengths and weaknesses, the most reliable inferencing often depends on fusion between two or more sensing streams, say between vision and radar for complementary object detection (and velocity info) in good or poor light conditions.
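To make the fusion idea concrete, here is a minimal late-fusion sketch (my own illustration, not from the webinar): the camera supplies the class label, the radar supplies range and radial velocity, and the combination is more trustworthy than either alone. All names, fields, and thresholds are assumptions for the example.

```python
# Illustrative late-fusion sketch (not Cadence code): a camera detection and a
# radar detection for the same region are merged so the camera contributes
# classification and the radar contributes kinematics. Thresholds are made up.
from dataclasses import dataclass
from typing import Optional

@dataclass
class CameraDetection:
    label: str          # e.g. "pedestrian", "vehicle"
    confidence: float   # 0..1 classifier score

@dataclass
class RadarDetection:
    range_m: float      # distance to object
    velocity_mps: float # radial velocity (negative = closing)
    snr_db: float       # detection quality

@dataclass
class FusedObject:
    label: str
    range_m: float
    velocity_mps: float
    confidence: float

def fuse(cam: Optional[CameraDetection],
         rad: Optional[RadarDetection]) -> Optional[FusedObject]:
    """Complementary fusion: accept either sensor alone, prefer both together."""
    cam_ok = cam is not None and cam.confidence > 0.5
    rad_ok = rad is not None and rad.snr_db > 10.0
    if cam_ok and rad_ok:
        # Best case: camera classifies, radar measures range and velocity.
        return FusedObject(cam.label, rad.range_m, rad.velocity_mps,
                           min(1.0, cam.confidence + 0.2))
    if rad_ok:
        # Poor light: trust the radar return but with an unknown class.
        return FusedObject("unknown", rad.range_m, rad.velocity_mps, 0.6)
    if cam_ok:
        # No radar return: keep the camera's class, kinematics unknown.
        return FusedObject(cam.label, float("nan"), float("nan"),
                           cam.confidence * 0.8)
    return None

print(fuse(CameraDetection("pedestrian", 0.9),
           RadarDetection(range_m=42.0, velocity_mps=-1.2, snr_db=18.0)))
```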

In-cabin, driver monitoring systems (DMS) check the driver’s gaze and head pose (are they looking at the road ahead or falling asleep). These sensors are typically camera-based, sometimes augmented by radar or other methods. Occupant monitoring systems (OMS) detect other occupants in the car to provide additional input for warnings (someone not wearing a seatbelt, or you left a child in the car) or corrective actions. These may also use camera sensors, though radar provides better coverage to detect objects out of visual range (small children, pets, packages in the footwell). There is even work now on monitoring driver health, through radar or seatbelt sensors. Finally, sensor-based voice and gesture control help keep a driver’s eyes on the road, though I won’t touch on these topics here.
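As a rough illustration of how a DMS turns per-frame gaze and head-pose estimates into a warning, the sketch below accumulates "eyes off road" time and raises an alert past a threshold. The frame rate, threshold, and function names are my assumptions, not anything specified in the webinar.

```python
# Illustrative DMS alert logic (hypothetical constants and interfaces).
FRAME_PERIOD_S = 1.0 / 30.0   # assume a 30 fps in-cabin camera
ALERT_AFTER_S = 2.0           # warn after roughly 2 s of inattention

def run_dms(gaze_on_road_frames):
    """gaze_on_road_frames: iterable of booleans, one per camera frame."""
    off_road_time = 0.0
    for frame_idx, on_road in enumerate(gaze_on_road_frames):
        # Reset the timer whenever the driver looks back at the road.
        off_road_time = 0.0 if on_road else off_road_time + FRAME_PERIOD_S
        if off_road_time >= ALERT_AFTER_S:
            yield frame_idx  # frames at which a warning would fire

# Example: driver looks away for 90 frames (~3 s) starting at frame 30.
frames = [True] * 30 + [False] * 90 + [True] * 30
print(list(run_dms(frames))[:1])  # first frame index that triggers an alert
```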

Each of these sensor streams requires specialized conditioning before an AI stage: image signal processing for a camera, a radar pipeline for radar systems (FFTs for range/Doppler, beamforming, multiple other steps on the way to building a point cloud), a different pipeline for lidar, and (I’m guessing) something like a camera pipeline for IR/thermal. Then comes fusion, blending these inputs together in an ML analysis to refine detection/classification. Cadence Tensilica has been strong in this space for many years, through their Vision and ConnX families of DSPs for vision/radar pipelines. The Neo family of NPU cores handles AI processing, though some simpler neural net functions can also be hosted in the Vision/ConnX cores.
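As one concrete example of the DSP work a radar front end does before any AI stage, the sketch below computes a range-Doppler map from one antenna's chirp data with two windowed FFT passes. This is the standard FMCW processing step in NumPy, not Cadence code, and the array shapes and synthetic target are assumptions for illustration.

```python
# Illustrative FMCW radar front-end step: 2D FFT over fast time (range) and
# slow time (Doppler) for one antenna's chirp data cube.
import numpy as np

def range_doppler_map(adc_cube: np.ndarray) -> np.ndarray:
    """adc_cube: complex samples shaped (num_chirps, samples_per_chirp)."""
    # Window then FFT along fast time -> range bins.
    win_r = np.hanning(adc_cube.shape[1])
    range_fft = np.fft.fft(adc_cube * win_r, axis=1)
    # Window then FFT along slow time -> Doppler bins (fftshift centers 0 m/s).
    win_d = np.hanning(adc_cube.shape[0])[:, None]
    doppler_fft = np.fft.fftshift(np.fft.fft(range_fft * win_d, axis=0), axes=0)
    return 20.0 * np.log10(np.abs(doppler_fft) + 1e-12)  # magnitude in dB

# Synthetic cube: 128 chirps x 256 samples of noise plus one moving target tone.
rng = np.random.default_rng(0)
chirps, samples = 128, 256
noise = rng.standard_normal((chirps, samples)) + 1j * rng.standard_normal((chirps, samples))
t = np.arange(samples) / samples
c = np.arange(chirps) / chirps
target = np.exp(2j * np.pi * (40 * t[None, :] + 10 * c[:, None]))  # range bin ~40, Doppler bin ~10
rd = range_doppler_map(0.1 * noise + target)
print(np.unravel_index(np.argmax(rd), rd.shape))  # peak's (Doppler, range) bin indices
```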

Levels of Autonomy

You can see above how these products map to different SAE levels from L1 to L5. Hardware products come with deep support libraries and SDKs, together with an extensive software ecosystem. Naturally all of this is functional-safety qualified to ASIL B or ASIL D as appropriate.

Trends and Solutions in Connectivity

I must thank the webinar hosts for helping me tie together an important trend behind electronic design for automotive. Not just the point concepts – sensing, AI, zonal controllers – but the grand plan and needs that emerge from that plan. Sensing and AI give us intelligent autonomy at various levels. At the same time, auto OEMs want to keep cars affordable. That motivates a trend to a multi-purpose SoC which can serve edge, zonal or central compute needs with (as mentioned earlier) software-defined behavior to adapt to different objectives.

Such big multi-purpose systems must pack more functionality into a package, hence the need for chiplet integration. Chiplet subsystems will support multiple sensor interfaces for video/radar streams (connecting through MIPI), DSP functions for image and other signal processing, GPU functions for infotainment, an AI subsystem for recognition/classification, a CPU cluster for multithreaded compute, and an interface subsystem for the wide range of protocols that must be supported, from UCIe to connect between chiplets, to Ethernet for longer range, PCIe for fast point-to-point, CAN for engine control, and so on. (When you consolidate everything, you must also consolidate communications interfaces.)
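To make the decomposition easier to picture, here is a hypothetical sketch of how such a multi-chiplet SoC might be described as data: each chiplet carries its function and the interfaces it exposes, with UCIe as the die-to-die link. The partitioning and names are illustrative assumptions, not the webinar's reference design.

```python
# Hypothetical chiplet partition for an automotive central-compute SoC; the
# split and interface lists are illustrative, not a Cadence reference design.
from dataclasses import dataclass, field

@dataclass
class Chiplet:
    name: str
    function: str
    die_to_die: str = "UCIe 2.0"                 # assumed die-to-die link
    external_ifaces: list = field(default_factory=list)

soc = [
    Chiplet("sensor_io", "camera/radar stream aggregation",
            external_ifaces=["MIPI CSI-2", "Ethernet"]),
    Chiplet("dsp", "image signal processing, radar/lidar pipelines"),
    Chiplet("npu", "AI recognition/classification"),
    Chiplet("cpu_gpu", "multithreaded compute + infotainment graphics",
            external_ifaces=["PCIe", "CAN", "Ethernet"]),
    Chiplet("memory_io", "external memory controllers",
            external_ifaces=["LPDDR", "GDDR/HBM"]),
]

for chip in soc:
    ext = ", ".join(chip.external_ifaces) or "-"
    print(f"{chip.name:10s} {chip.function:45s} D2D={chip.die_to_die} ext={ext}")
```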

ADAS Chiplet decomposition

Here also Cadence has strength in their family of interface solutions, from PCIe (Gen 3.1 through Gen 7), MIPI CSI-2, Ethernet (up to 224G), and UCIe 2.0, together with the latest DDR, LPDDR, GDDR and HBM memory interfaces. To highlight these and to help accelerate chiplets and chiplet-based systems, they recently released a reference chiplet that chiplet and full-system designers can use as an aid in prototyping and testing their own products. With very active deployment across a wide range of Tier 1s and OEMs, these guys are serious about continuing to push the boundaries on automotive connectivity.

Very instructive webinar. You can watch it HERE.
