
Constrained Random Verification and Simulators

In response to my earlier post on CRV, Daniel Payne raised an interesting question on SemiWiki – "Is there any difference in choosing a VHDL, Verilog, SystemC or SystemVerilog simulator from Cadence, Synopsys or Mentor and then using the CRV approach?"

Before answering this, let's quickly recap the CRV approach.

Constrained Random Verification involves choosing an HVL, defining the test bench architecture and developing constraints to generate legal random stimuli. When this test bench is used to simulate the DUT, a random seed value and a simulator become part of the verification environment. The seed helps in reproducing a failure only if the other inputs, i.e. the test bench architecture (component hierarchy) and the set of constraints, remain constant. Any change to these inputs may lead to different results even with the same seed value. The random seed value and the constraints are fed to the constraint solver integrated in the simulator to generate random values.

Let's assume the HVL used is SystemVerilog (IEEE 1800). The standard details the data types, how to randomize, how to define constraints and the outcome of constraint expressions. It does not specify the algorithms involved or the order in which the constraint solver should generate the outcome. The implementation of the solver is left to the sole discretion of the simulator R&D teams. Simulators from different vendors may also support multiple solvers that trade off between memory usage and performance.
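As a purely illustrative sketch of these IEEE 1800 mechanics – a randomizable class, constraint blocks, and per-object seeding – consider the following; the names (packet, len, payload_kind) and values are invented for the example:

```systemverilog
// Hypothetical example of an IEEE 1800 constrained-random class.
class packet;
  rand bit [7:0] len;
  rand bit [1:0] payload_kind;

  // Legal-stimulus constraints: the standard defines what a
  // solution must satisfy, not how the solver finds it.
  constraint c_len  { len inside {[16:64]}; }
  constraint c_kind { payload_kind != 2'b11; }
endclass

module tb;
  initial begin
    packet p = new();
    // Seeding makes a run reproducible only if the test bench
    // hierarchy and the constraints are unchanged.
    p.srandom(32'd1263);
    repeat (3) begin
      if (!p.randomize())
        $error("randomize() failed");
      $display("len=%0d kind=%0d", p.len, p.payload_kind);
    end
  end
endmodule
```

Even with the same seed, the values printed here may differ across simulators, because each vendor's solver is free to enumerate solutions in its own order.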

Therefore, a given input may not give the same results when run on –

- Simulators from different vendors, OR
- Different versions of the same simulator, OR
- Different solvers within the same simulator.

All the constraint solvers claim to solve standard puzzles like Sudoku, the N-Queens problem etc. Some even claim to solve these standard problems in limited or minimum iterations. Although these claims are correct, their reproducibility depends entirely upon how the constraints are defined. The capability of any constraint solver to return results in a given time depends primarily upon the representation of the constraints. For a given equation, the solver may sometimes time out, yet if the constraints expressing that equation are restructured, it returns results instantly.

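To illustrate the restructuring point – a sketch with invented names, not a benchmark – the same relationship can be written in a solver-hostile form and a solver-friendly form:

```systemverilog
// Hypothetical example: one relation, two formulations.
class area_cfg;
  rand int unsigned w, h;

  // Harder form (commented out): a non-linear product constraint
  // over a wide range can force the solver to search a huge space.
  // constraint c_area { w * h == 4096; w inside {[1:4096]}; }

  // Restructured form: pin one operand to a small legal set and
  // derive the other, shrinking the search space dramatically.
  constraint c_w    { w inside {1, 2, 4, 8, 16, 32, 64}; }
  constraint c_area { h == 4096 / w; }
  constraint c_ord  { solve w before h; }  // guide solution order
endclass
```

Both formulations describe the same legal space of (w, h) pairs over the chosen widths; only the representation changes, which is exactly the knob the verification engineer controls.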
So, what should one look for in a simulator when implementing the CRV approach?

All the industry-standard simulators have stable solvers that vary in performance, and there is no standard way to benchmark them. However, one should also consider other important aspects –

- Support of the data types planned for implementing the test bench.
- Ease of debugging the constraint failures.
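For the second point, a common debug pattern – sketched below with hypothetical names and a deliberately planted conflict – is to check the return value of randomize() and use constraint_mode() to switch individual constraint blocks off until the conflicting one is isolated:

```systemverilog
// Hypothetical example of localizing a constraint conflict.
class cfg;
  rand int unsigned addr;
  constraint c_lo { addr >= 'h1000; }
  constraint c_hi { addr <  'h0800; }  // deliberately conflicts with c_lo
endclass

module tb;
  initial begin
    cfg c = new();
    if (!c.randomize()) begin
      $display("randomize() failed; disabling c_hi to isolate it");
      c.c_hi.constraint_mode(0);  // turn off one named constraint block
      if (c.randomize())
        $display("conflict involves c_hi (addr='h%0h)", c.addr);
    end
  end
endmodule
```

This style of narrowing works only when constraints are written as small named blocks, which is another argument for the splitting advice below.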

The next important part is how to define clean and clear constraints that help the solver return results quickly. Following the KISS (Keep It Simple Silly) principle sets the right ground. Split the constraints into smaller equations, particularly if the expressions involve multiple mathematical operators and data types. This provides better control over the constraint blocks and helps the solver resolve them easily. Defining easy-to-solve constraints improves the predictability of the outcome, i.e. whether the generated value falls in the legal range and how fast it can be generated.
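A small hypothetical example of the splitting idea – three short named blocks instead of one compound expression (all names invented) – might look like:

```systemverilog
// Hypothetical example: intent split into small named blocks.
class burst;
  rand bit [3:0] size;
  rand bit [9:0] addr;
  rand bit       wrap;

  // Each block states one rule; blocks are easy to read, easy to
  // override with constraint_mode(), and simpler for the solver.
  constraint c_size  { size inside {1, 2, 4, 8}; }
  constraint c_align { addr % size == 0; }
  constraint c_wrap  { wrap -> size > 1; }
endclass
```

The equivalent single expression (size legality AND alignment AND the wrap implication, all joined in one block) would solve to the same values but is harder to debug and impossible to disable piecewise.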

Coming back to the main question, the answer is YES – different simulators will give different results (values and speed) when using the CRV approach. This will also affect the turnaround time of regressions and coverage closure. However, there is no standard benchmark available for the constraint solvers in these simulators. It is up to the verification engineer to define the constraints in a way that keeps the output as solver-independent as possible.

NOTE – This doesn't mean the solver will never hit a bottleneck. Even with well-defined constraints, constraint solvers sometimes fail to deliver results.

http://whatisverification.blogspot.com
 

simguru

There is always the question: is simulation the best way to verify anything?

If you are talking about logic design then it seems like the best approach is to use fast/cheap logic simulation with something like SystemC for verifying functionality pre-synthesis, and then formal verification to verify that your synthesis was correct. As a simulation expert I don't think any of the Verilog/VHDL simulation tools are up to the job of simulating modern power-managed SoC designs in high-variance Silicon - a fast behavioral analog simulator is probably a better bet.

There is of course some motivation on the part of the EDA companies to sell methodologies that require more licenses, and there's nothing like burning cpu cycles on compute farms doing exhaustive simulation for that.

Personally I avoid VHDL for a number of reasons, and I think the jury is still out on SV.
 
The size of the logic design is so humongous that a cheap simulator with limited technical support may not fulfill the requirements of executing the test plan. The choice of methodology & HVL to realize the test plan is another issue. SystemC has been good for modeling but still lags behind e & SV. As for power-aware verification, some of the simulators have built-in support for this and are able to generate most of the possible cases. LEC can help check that the synthesis is correct, but GLS is still practiced for the multiple reasons listed in my earlier post on GLS.

I agree that EDA companies have been successful in driving HVL/methodology selection based on business needs instead of engineering. My experience with e, Vera, SystemC & SV makes me choose 'e' over any other HVL. It's the evolution of methodologies like UVM, rather than the language itself, that will keep SV alive.
 