How do I know when my hardware design is correct and meets all of the specifications? For many years the answer was simple: simulate as much as you can in the time the schedule allows, then hope for the best when silicon arrives for testing. There is a complementary method for ensuring that a hardware design meets its specifications: formal verification, a mathematical technique that proves the design functions properly under all possible cases. Nicolae Tusinschi of Siemens EDA wrote a white paper on this topic, so this blog shares what I learned about formal verification.

Simulation tests only the input sequences you provide to a design, in the hope of getting close to covering all of its states. Formal verification is an exhaustive approach: it analyzes all possible states and input combinations, finding any potential violations of the intended behavior.
For formal verification, engineers write assertions, assumptions and cover properties in languages like SystemVerilog Assertions (SVA) or the Property Specification Language (PSL), depending on preference. An assertion states the expected behavior of a design. Assumptions (constraints) limit the formal analysis to valid input sequences or states. Cover properties tell you how completely your design has been verified. These are the pieces that direct the formal verification methodology, showing that the design will work as specified under all conditions.
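As a minimal sketch of these three property types in SVA (the FIFO interface and property names here are my own illustration, not taken from the white paper):

```systemverilog
// Illustrative only: a property module for a hypothetical FIFO.
// Assertions check design outputs; assumptions constrain inputs;
// cover properties confirm interesting states are reachable.
module fifo_props
  (input logic clk, rst_n,
   input logic push, pop,     // environment inputs
   input logic full, empty);  // design outputs

  // Assertion: intended behavior -- the FIFO is never simultaneously
  // full and empty.
  a_not_full_and_empty: assert property
    (@(posedge clk) disable iff (!rst_n) !(full && empty));

  // Assumption (constraint): restrict analysis to legal stimulus --
  // the environment never pushes when full or pops when empty.
  m_legal_inputs: assume property
    (@(posedge clk) disable iff (!rst_n)
     !(push && full) && !(pop && empty));

  // Cover property: show the tool can actually reach the full state,
  // giving a measure of how thoroughly the design is exercised.
  c_reach_full: cover property (@(posedge clk) full);
endmodule
```

Such a module would typically be attached to the design with a `bind` statement so the RTL itself stays unmodified.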
Pre-built formal applications (apps) make formal usable without first gaining deep formal expertise. One formal app can check for clock domain crossing issues, identifying synchronization problems without any special knowledge. Control logic, with its limited state space, was an early use for formal verification, yet newer formal tools can also verify data paths with their larger number of data values and operations. Siemens offers a full suite of formal verification solutions with Questa One Static and Formal Verification (SFV).
Yes, formal verification has limits in terms of memory usage and total state space, so on large designs it is recommended to use formal selectively on critical components, then rely on simulation for the remaining components.
Formal analysis can get stuck and report inconclusive results if the design is highly complex or the assertions are beyond the tool's capacity to compute in one run. Limiters to formal analysis include a large state space, deep sequential depth and property complexity. Using bounded proofs, which check an assertion only within a certain number of clock cycles, produces results in a more feasible amount of time. Formal tools also help identify limits by reporting the "cone of influence", the logic that affects each property being verified. Questa One SFV shows the logic cone of influence, listing assumptions and signals, allowing you to address the complexity and perhaps remove or add some assumptions.
For best results it's recommended that assertions be written simply and with short sequential depth. The white paper provides decomposition examples for modular partitioning across multiple sub-modules. Counters create large sequential depth but can be abstracted by replacing them with smaller counters or with a non-deterministic model. Large memories can create a huge number of state bits, so you can either black-box the memory model or reduce the memory size.
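As a sketch of the counter abstraction idea (the module and signal names are my own illustration, not from the white paper), a wide counter can be replaced under formal by a non-deterministic pulse that the tool is free to fire at any cycle:

```systemverilog
// Illustrative sketch only. A 32-bit timeout counter forces up to
// 2^32 cycles of sequential depth before "timeout" can assert.
module timeout_counter(input logic clk, rst_n, output logic timeout);
  logic [31:0] count;
  always_ff @(posedge clk or negedge rst_n)
    if (!rst_n) count <= '0;
    else        count <= count + 32'd1;
  assign timeout = (count == 32'hFFFF_FFFF);
endmodule

// Non-deterministic abstraction: "free_fire" is left undriven, so the
// formal tool may pulse "timeout" on any cycle. Every behavior of the
// original is included, but the sequential depth drops to one cycle.
module timeout_abstract(input logic clk, rst_n,
                        input logic free_fire,  // undriven free input
                        output logic timeout);
  assign timeout = free_fire && rst_n;
endmodule
```

Because the abstraction over-approximates the original counter, any property proven against it also holds for the real design.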
An example of abstracting memory: a memory with 128 entries and a 64-bit data width holds 8,192 state bits. Every address other than 100 can be abstracted away using the netlist cutpoint command below, reducing the state bits from 8,192 to just 64, which improves formal runtime.
```tcl
# Cut every memory entry except address 100, leaving 64 state bits.
# The square brackets are backslash-escaped so Tcl treats them as part
# of the signal name rather than as command substitution.
for {set i 0} {$i < 128} {incr i} {
    if {$i != 100} {
        netlist cutpoint memory_instance.mem\[$i\]
    }
}
```
Summary
This white paper on formal verification shares many examples to explain where and how to use the technology most effectively. Best practices include applying formal strategically, combining formal with simulation, iteratively refining where formal is used, and documenting the assertions, assumptions and abstractions for other team members. Formal verification provides a mathematical proof that your design is correct, instead of relying on it probably being correct. Safety-critical and high-reliability designs benefit greatly from the higher levels of assurance that formal verification provides.
The larger and more complex the design, the greater the role formal verification plays. Formal apps cut down the learning curve and produce verification results faster.
Read the entire 19 page white paper here.
Related Blogs
- Siemens Proposes Unified Static and Formal Verification with AI
- Transforming Functional Verification through Intelligence
- Siemens Fleshes out More of their AI in Verification Story
- Something new in High Level Synthesis and High Level Verification

