Why I stopped "writing" RTL

joconnor

Guest
Don’t get me wrong, I still create RTL; I just don’t “write it” anymore. OK, most of it, anyway. I stopped “writing it” because I can’t hide from verification engineers. When I looked at my customers, they were mostly EDA tools. The only time another engineer ever looks at my code is when I pull them into a review. Truth is, I’d rather they just reviewed my architectural specification and made sure my approach was solid. They’d probably prefer that too. Does anybody wake up on Monday morning thinking, “Oh great! I get to review someone else’s code today”?


Most of my customers change my code into something else. Again, nobody really wants to see my code. They’re happy with the new format. That is, until something bad happens. That’s when everyone wants to review it. Now that my code is under attack, I don’t want to show it to anybody. I just want to fix it quickly and silence my critics.


Seasoned engineers fly under the radar by creating code with reasonable area, timing, power and testability. All the stuff EDA tools can produce in volumes. Some tools give you a quick peek at what the code is going to look like further down the tool chain before you have to show it to anyone. Great stuff! Why can’t I hide from verification engineers? It's their life’s work to find something wrong with my precious code.


I spent some time thinking about verification engineers and what they really want from me. They convert my code into a seemingly infinite set of signals that only I can decode on a huge display (or two). It’s all gibberish to them. If I let them watch me decode it, they’re all the more convinced I have no idea what I’m doing. Verification engineers call the failures they find “bugs” and put descriptions of them in Bugzilla for everyone to see; I call them “corner cases”. Still, nobody wants to see my code.


So I decided to give the verification guys what they really want: a functional coverage model in a nice neat package they could attack instead of my code. This solves two problems: first, they’re focused only on the design intent model, and second, when the model is covered, they can go away and pester someone else, satisfied with the knowledge that everything I gave them is working properly.


So I started a business and created an EDA tool to generate the code and the functional coverage model from one source: the timing diagrams and flow diagrams I include in my architecture specification. Solid Oak’s CoverAll generates RTL, assertions, path coverage, test bench templates and formal scripts from those same diagrams. Now the playing field is leveled. The verification engineers can find “corner cases” in my design intent and I can find “holes” in their testbenches. It’s a win-win.


Jim O’Connor
Solid Oak Technologies

Jim,

Thanks for the info. Who else is using your EDA tools over the past two years? Are you mostly a consulting business or an EDA product business?
 
Info on Solid Oak Technologies

Jim,

Thanks for the info. Who else is using your EDA tools over the past two years? Are you mostly a consulting business or an EDA product business?

Daniel,

We have mainly been a consulting/contract business, using the CoverAll tool for those deliverables. We are now engaging with customers as an EDA product business and have ongoing evaluations in progress.

Jim
 
Jim,

Thanks for the update. Most EDA companies spend about 30% of their revenue on sales and marketing, so it takes some effort to migrate from a services business to a product business. It's small companies like yours that take the risk to actually innovate and create something new and different.

All the best to you in 2013.
 
Jeff Gruszynski • For any newbies, this is a very specific case where this is true.

We still write RTL but it's because it's our products and our firmware which no one else will ever touch. It's not generally true that the average engineer won't write RTL and only rely on CAD IMO.
 
What about functional coverage?

Jeff Gruszynski • For any newbies, this is a very specific case where this is true.

We still write RTL but it's because it's our products and our firmware which no one else will ever touch. It's not generally true that the average engineer won't write RTL and only rely on CAD IMO.

Hi Jeff,

I think the difference here is that CoverAll only generates what you specify, the same as if you "wrote it" in a text editor...the code just comes from flow diagrams you create. You spend your time creating the diagrams, not writing if...then...else. The added benefit is that CoverAll can generate assertions and functional coverage automatically from those same diagrams, which you would also have to do by hand. It's this coverage model that adds value to the design and helps the verification community do their jobs more efficiently.

How do you guarantee coverage now?

If you're still skeptical, send me an RTL module (something around 1500 lines) and testbench and I'll convert it to the CoverAll format and give you the coverage model back as well as the testbench coverage numbers (both code and functional coverage).

Jim
 
How about all Linux?

Jim,

Thanks for the info. Who else is using your EDA tools over the past two years? Are you mostly a consulting business or an EDA product business?

Interesting new tool. I need to look more into what you offer. I'm wondering why in your flow diagram you don't show Eclipse
and libreoffice-Draw to match the rest of the Linux tool set. CAD tools, except for PCB design, are pretty much all Linux for ASIC
design/verification.

Maybe you could get John Cooley to investigate and report on your new addition for verification.

Lots of luck with your new CAD tool offering,

-Frank
 
Sergei Steshenko • OK, we were creating a lot of autogenerated (from a higher-level representation) code way back in the nineties.

I am still waiting for a paradigm shift though. I mean specs are still written in a non-formal language, so in practice their incompleteness and contradictions are often discovered during the RTL coding stage.
 
Mikko Kivioja • "I am still waiting for a paradigm shift though. I mean specs are still written in a non-formal language", writes Sergei, and I understand that a bit wrongly on purpose.

But hey, what was the idea behind VHDL? There was something about a specification that can, for example, be simulated, if I remember it right. Or could that VHDL, or Verilog as well, already be the language of specification? Now it is the year 2013 and VHDL has been available since 1987, so are we all using it in a wrong way?

I totally agree that something should be done about those incomplete/incompetent/late/misleading/written-with-wrong-MS-Word/FrameMaker-version/etc. (add your words of blame here) specifications, i.e. those files that are written in 'spoken' language instead of formal languages. I have been using 'VHDL code generation' for quite many years already, creation of VHDL with Perl from Excel sheets saved as CSV files being the simplest example.
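A toy version of that kind of CSV-to-HDL generation (sketched here in Python rather than Perl, with made-up column names and a minimal port-only entity) could look like this:

```python
import csv
import io

def vhdl_entity_from_csv(entity_name, csv_text):
    """Emit a VHDL entity declaration from CSV rows of
    name,direction,width (a hypothetical column layout)."""
    ports = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        width = int(row["width"])
        # 1-bit ports become std_logic, wider ones std_logic_vector
        vtype = ("std_logic" if width == 1
                 else f"std_logic_vector({width - 1} downto 0)")
        ports.append(f"    {row['name']} : {row['direction']} {vtype}")
    body = ";\n".join(ports)
    return (f"entity {entity_name} is\n"
            f"  port (\n{body}\n  );\n"
            f"end entity {entity_name};\n")

# Example spreadsheet export: one row per port
spec = "name,direction,width\nclk,in,1\ndin,in,8\ndout,out,8\n"
print(vhdl_entity_from_csv("regbuf", spec))
```

The real flows people describe do far more (generics, clock domains, comments), but the principle is the same: the spreadsheet is the single source, and the HDL is derived from it.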
 
Sergei Steshenko • My colleagues and I had developed a Perl-based hierarchical whole-chip build system way back in the nineties.

Of course, part of the system was, as I wrote, generation of Verilog code when and where desired.

Among other things we had port autoconnect features - with manual overrides where desired.

We even had auto portlist generation when/where desired - I mean, suppose we need an envelope block E for blocks A, B, and C.

A, B, and C are autoconnected, and whatever got autoconnected was automatically converted into E's portlist, so the whole of E was autogenerated.
 
Which Specs? High-Level Specs or Module-level Specs?

Hi Sergei,

Sergei Steshenko • OK, we were creating a lot of autogenerated (from a higher-level representation) code way back in the nineties.

Just curious, but what was the generation tool and are you still using high level auto generation? And auto generation of what? RTL structural code?

Sergei Steshenko • I mean specs are still written in a non-formal language, so in practice their incompleteness and contradictions are often discovered during the RTL coding stage.

Which specs are you referring to: the high level architectural spec or the low-level module spec? If you include the flow diagrams and timing diagrams I mentioned above, there should be no ambiguity. Writing the spec is equivalent to writing the RTL.

I agree that high level specs are often ambiguous, but it is the designer's responsibility to remove the ambiguities in the lower-level module specs. A simple example: if I ask you to create an up/down counter with separate "up" and "down" controls, I'm being ambiguous because I haven't explicitly told you what to do when "up" and "down" occur at the same time. However, when you implement the counter and test it you have to make a decision: give priority to "up", give priority to "down", or do nothing when "up" and "down" occur together.

And how do you capture this design choice you've made? By explicitly stating, via an assertion, what your implementation will do when both "up" and "down" are active at the same time. This assertion can be generated from the flow diagram, but if it's not it can be captured manually and added to the generated assertions. This is what CoverAll does. It generates RTL and assertions from the specified flow diagrams and allows designers to manually add design assumptions to capture the complete design intent.
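The counter choice above can be sketched behaviorally (a Python model of my own for illustration, not CoverAll output; the `priority` argument and the modulo-width wrap are assumptions) so the simultaneous-control decision is explicit and checkable:

```python
def updown_counter(count, up, down, priority="up", width=8):
    """One cycle of an up/down counter. When both controls are
    asserted, the 'priority' argument makes the design choice
    explicit instead of leaving it ambiguous in the spec."""
    if up and down:
        step = 1 if priority == "up" else -1  # or 0 for a "hold" policy
    elif up:
        step = 1
    elif down:
        step = -1
    else:
        step = 0
    return (count + step) % (1 << width)  # wrap at the register width

# The design assumption, captured as an executable check -- the
# behavioral analogue of an assertion on the up & down case:
assert updown_counter(5, up=True, down=True, priority="up") == 6
assert updown_counter(5, up=True, down=True, priority="down") == 4
```

In an RTL flow the same intent would be written as an SVA-style assertion on the simultaneous "up" and "down" condition; the point is that the choice is stated somewhere a tool can check it.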

This accomplishes two things: First, it completely describes the design's intent and removes the ambiguity. Second, it gives quantitative evidence that the condition has been tested. If it's tested and passes the verification expectation for the "up" and "down" condition, you know your implementation choice was correct.

Look at the alternative. If you don't generate the assertions, you have not captured your design intent. When simulation is complete, you cannot say for sure the condition was met without manually checking all simulation runs until you find one where "up" and "down" occur together.

Jim
 
But hey, what was the idea behind VHDL? There was something about a specification that can, for example, be simulated, if I remember it right. Or could that VHDL, or Verilog as well, already be the language of specification? Now it is the year 2013 and VHDL has been available since 1987, so are we all using it in a wrong way?

The guy writing the spec is most of the time not an RTL programmer. And I would prefer informal specs over a guy who thinks of VHDL as the next iteration of Visual Basic.
 