The second panel addressed system coverage and big data. Coverage metrics have long been used to gauge the quality of verification efforts during development, but at the system level there are still no standardized metrics for measuring full coverage. The emergence of PSS, better formal verification, and enhanced emulation and prototyping techniques may provide some answers.
Functional verification — From the vendor's standpoint, verification is a risk-analysis problem that can be addressed with graph-based analysis: one can reason about the data space and decide which areas to probe further. On the other hand, some panelists believe that functional verification as practiced today still offers value. Hierarchy and abstraction inject complexity over time; while calculus and algebra deal well with abstraction, hierarchy has posed several challenges to verification.
Portable Stimulus Standard (PSS) role — The vendor's view is that adopting PSS does not require giving up functional coverage, and that it should enable pushing system-level checks down to block level. A user countered that the issues checked at the lower levels (such as registers and data paths) differ from the top-level kinds of checks (such as transition-related behavior). Still, the intent of checking is to find bugs. The vendor agreed that PSS can help catch at block level bugs that would otherwise only be found at system level, and pointed out that the DV engineers at the lower levels and the software engineers at the top level need a better handshake.
Functional coverage — Bug planning has shifted work further left, better preparing everyone; system-level coverage should be taken down to the block level. Coverage heat maps, clustering, and identifying which tests are useful (and which are not) are all good; we also need to understand how to run functional coverage post-silicon. A few panelists agreed that the coverage model needs to be rethought, and that clever people are needed to do it. A comment from the floor: if everything is placed into a graph, how do we catch what is missed? Would that be a decision graph working over the entire space, or coverage closure assisted by ML?
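The heat-map and test-usefulness ideas above can be sketched as a greedy set-cover pass over per-test coverage data; a test that adds no new bins is a candidate to retire. The test names and coverage bins below are hypothetical; a real flow would read them from a coverage database.

```python
# Sketch: rank regression tests by the coverage they add (greedy set cover).
# Test names and bin sets are hypothetical illustrations.

def rank_tests(coverage: dict[str, set[str]]) -> list[tuple[str, int]]:
    """Greedily pick the test hitting the most not-yet-covered bins.

    Returns (test, new_bins_added) pairs; a test adding 0 bins is
    redundant with respect to this coverage model.
    """
    covered: set[str] = set()
    ranking = []
    remaining = dict(coverage)
    while remaining:
        # Pick the test contributing the most new bins.
        best = max(remaining, key=lambda t: len(remaining[t] - covered))
        gain = len(remaining[best] - covered)
        covered |= remaining.pop(best)
        ranking.append((best, gain))
    return ranking

coverage = {
    "smoke":      {"b0", "b1"},
    "random_42":  {"b0", "b1", "b2", "b3"},
    "directed_x": {"b3", "b4"},
    "legacy":     {"b1"},  # adds nothing new -> candidate to retire
}
for test, gain in rank_tests(coverage):
    print(test, gain)
```

The same ranking, run over real merged coverage data, is one way to build the "which tests are useful" view the panel described.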
Big data and ML — Big data implies an inference problem (statistical inference): how to extract meaning, using techniques such as graph analytics and machine learning. The vendor's view is that inference with ML can be used to reach full coverage and to identify the checks that cause failures, anticipating that fewer combinations of logic paths will be exercised as systems grow more complex and data gets bigger. A user panelist believes logical inference is within reach, but the space is so big that we need models that make sense; ML aside, statistical methods should already be a known approach. How do we merge data coming from block-level IP, system level, emulators, and FPGA, and does big data help there? Another vendor stated that PSS may not be the answer; we need to know how to compact data, which is an infrastructure issue. An audience comment: "Not handling data conditioning may translate into a big data issue."
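As one illustration of the "statistical methods should already be known" remark, a Good-Turing style estimator gives a rough sense of how much coverage remains: treating each coverage-bin hit in a regression as a sample, the fraction of bins seen exactly once estimates the probability that the next sample lands on an as-yet-unseen bin. This is a textbook estimator, not anything the panel endorsed, and the bin names are invented.

```python
# Sketch: Good-Turing style estimate of "how much coverage is left".
# P(next hit is a new bin) is estimated as singletons / total samples.

from collections import Counter

def unseen_mass_estimate(hits: list[str]) -> float:
    """Fraction of probability mass estimated to sit on unseen bins."""
    counts = Counter(hits)
    singletons = sum(1 for c in counts.values() if c == 1)
    return singletons / len(hits)

# Hypothetical bin-hit log: b1 and b3 were each hit exactly once.
hits = ["b0", "b0", "b1", "b2", "b2", "b2", "b3"]
print(unseen_mass_estimate(hits))  # 2/7, i.e. roughly 0.286
```

A value near zero suggests the regression is saturating its coverage model; a large value suggests the space is still far from closed.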
Formal verification falls into two camps: the deep-FPV advocates, who favor writing more custom assertions, and the broad formal-apps advocates, who favor applications such as connectivity checking, unreachability analysis, and register verification. The third panel explored the reasoning behind each preference.
Go broad arguments — Designers care about usability and ease of debug, and the return on investment (ROI) of formal can still be questioned. Formal experts are usually hard to find, whereas anyone who is RTL-literate and knows DV can ramp up on the formal apps. Applications worth considering include sequential equivalence checking (SEQ), connectivity, X-propagation, CSR, security, and deadlock checks.
Go deep arguments — A Formal 2.0 methodology is key to success: understand the value proposition and scope, and put training and process in place. On the growth of Formal Property Verification (FPV): why is it not used more? It is true that deep FPV still needs a few heroes before it can reach wider usage. One user cited a key principle: do unit testing. The individual design engineer needs to understand the local RTL prior to turn-in, find bugs at the lowest level, and so on. Setting clear unit-testing expectations for designers, with supporting tasks to make it happen, is key to growing designers who exercise FPV. Formal 3.0 should have wider endorsement: it is not hard, and using formal to identify post-silicon bugs has value. Hence we need to go deep: write assertions to identify bugs and verify the design in depth. With formal this is not resource-intensive; it just needs a different mindset. One user's example of resources versus usage:
2015: 2 blocks — 1 person
2018: 30+ blocks — 3 persons
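The "write an assertion, check it exhaustively" mindset can be illustrated with a toy model. Real deep FPV writes SVA properties against RTL and hands them to a formal tool; the sketch below instead explores a tiny hypothetical req/ack handshake FSM exhaustively in Python and checks a deadlock-freedom assertion over every reachable state.

```python
# Toy sketch of assertion-based exhaustive checking (not real FPV):
# a hypothetical req/ack handshake explored by breadth-first search.

from collections import deque

def successors(state):
    """Transition relation for a four-phase handshake, state = (req, ack)."""
    req, ack = state
    nxt = set()
    if not req and not ack:
        nxt.add((True, False))    # requester raises req
    if req and not ack:
        nxt.add((True, True))     # responder raises ack
    if req and ack:
        nxt.add((False, True))    # requester drops req
    if not req and ack:
        nxt.add((False, False))   # responder drops ack
    return nxt

def check_invariant(init, invariant):
    """Exhaustive reachability: return a violating state, or None if safe."""
    seen, queue = {init}, deque([init])
    while queue:
        s = queue.popleft()
        if not invariant(s):
            return s
        for t in successors(s) - seen:
            seen.add(t)
            queue.append(t)
    return None

# Assertion: no reachable state is a deadlock (every state has a successor).
violation = check_invariant((False, False), lambda s: bool(successors(s)))
print(violation)  # None -> the assertion holds on this toy model
```

The point of the exercise is the workflow, not the model: state an assertion about the design, let exhaustive search (or a formal engine, on real RTL) either prove it or hand back a concrete violating state to debug.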
The final panel was about smarter and faster verification: what trends are emerging, and what impact data analytics and ML may have.
Pre- vs. post-silicon data — Data recycling and reuse in formal verification raise the question of how to align pre- and post-silicon data. A usage scenario from a mobile phone can be run during pre-silicon, but not 1:1, for lack of infrastructure. Customers deal with large volumes of data, and we need to be smart about how to collapse and aggregate it as we move forward. There are opportunities in connecting the pre- and post-silicon domains: for example, identifying post-silicon data and recapturing it in emulation (along with testbenches). One vendor's opinion is to use Portable Stimulus to feed post-silicon findings back upstream into the pre-silicon space; the open question is how to capture post-silicon behavior as a pre-silicon model, and PSS is currently the vehicle being used to solve it.
Smartness in debug — Significant resources are spent on debug, which usually requires an adequate level of failure signatures. The panel discussed documenting the debug process, automating it, and potentially applying ML to it: if a pattern emerges during debug, can we automate it? Can we detect design changes and correlate them with design problems? The goal is to reduce debug time; we are not there yet, although one panelist commented that he had already seen some facilities that assist debug.
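A first automatable step in that direction is signature extraction: normalizing the volatile details out of failure messages so that recurring failures collapse into one bucket, which is then a unit of "pattern" that tooling or ML can act on. The log lines below are invented, loosely UVM-flavored examples.

```python
# Sketch: group regression failures by a normalized "signature" so that
# recurring debug patterns surface automatically. Log lines are invented.

import re
from collections import defaultdict

def signature(msg: str) -> str:
    """Strip volatile details (addresses, counts, timestamps) from a log line."""
    msg = re.sub(r"0x[0-9a-fA-F]+", "0xADDR", msg)  # hex addresses
    msg = re.sub(r"\b\d+\b", "N", msg)              # counts and timestamps
    return msg

failures = [
    "UVM_ERROR @ 1200: scoreboard mismatch at 0x3f80, exp 17 got 19",
    "UVM_ERROR @ 5400: scoreboard mismatch at 0x1a2c, exp 3 got 4",
    "UVM_FATAL @ 9001: watchdog timeout after 100000 cycles",
]

groups = defaultdict(list)
for f in failures:
    groups[signature(f)].append(f)

for sig, members in groups.items():
    print(len(members), sig)
```

The two scoreboard mismatches collapse into one signature despite different timestamps and addresses, leaving the watchdog timeout as a separate bucket; triaging by bucket rather than by individual failure is where the time savings come from.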
Hidden barriers in ecosystems — Several panelists concurred that silos do exist. One example of a remedy is using IP-XACT as a bridge between DV and logic designers. Formalizing the specification and generating test cases from it also helps, as does spending more time on the complex cases. Continuous integration, catching corner cases, and collaboration between DV and logic designers are key.
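The spec-capture idea can be sketched concretely: parse a machine-readable register description once, and generate the same check stubs for both the DV and designer sides, so neither silo retypes the spec. Real IP-XACT (IEEE 1685) uses namespaced schemas and far richer register semantics; the fragment, register names, and the `check_reset` helper below are heavily simplified illustrations.

```python
# Sketch: generate register reset-value check stubs from a (heavily
# simplified) IP-XACT style description. Fragment and names are illustrative.

import xml.etree.ElementTree as ET

SPEC = """
<memoryMap>
  <register><name>CTRL</name><addressOffset>0x00</addressOffset><reset>0x0</reset></register>
  <register><name>STATUS</name><addressOffset>0x04</addressOffset><reset>0x1</reset></register>
</memoryMap>
"""

def reset_checks(xml_text: str) -> list[str]:
    """Emit one 'read register, compare to reset value' stub per register."""
    root = ET.fromstring(xml_text)
    checks = []
    for reg in root.iter("register"):
        name = reg.findtext("name").strip()
        offset = reg.findtext("addressOffset").strip()
        reset = reg.findtext("reset").strip()
        # check_reset is a hypothetical test-side helper.
        checks.append(f"check_reset('{name}', addr={offset}, expect={reset})")
    return checks

for line in reset_checks(SPEC):
    print(line)
```

Because the spec is the single source, a register change regenerates both the checks and the documentation, which is exactly the kind of bridge between silos the panel described.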
Overall, there is a gradual but growing interest in considering new solutions (PSS, localized verification methods, spec capture, etc.) for addressing design verification and validation, while the envelope continues to be pushed on existing verification flows.