
A Decade of SystemVerilog: Unifying Design and Verification?

It’s hard for me to believe that SystemVerilog 3.1 was released just over 10 years ago. The 3.1 version added Object-Oriented Programming features for testbench development to a language predominantly used for RTL design synthesis. Making debug easier was one of the driving forces behind unifying testbench and design features into a single language. The semantics for evaluating expressions and executing statements would be the same in the testbench and the design. Setting breakpoints and stepping through the code would be seamless. That should have made it easier for either a verification or a design engineer to understand a complete verification environment, or at least to understand enough of it to isolate a particular problem.

Ten years later, I have yet to see that promise fulfilled. Most design engineers still debug their simulations the same way they debug in the lab: they look at waveforms. During simulation, they rarely look at the design source code, and certainly never look at the testbench code (unless it’s just basic pin wiggling that reads like a waveform). Verification engineers are not much different. They rely on waveform debugging because that is what they were brought up on, and many do not even realize source-level debugging is available to them. However, the test/testbench is more like a piece of software than a hardware description, and there are many things about a modern testbench that are difficult to display in a waveform (e.g., call stacks, local variables, and random constraints). And methodologies like the UVM add many layers of source-level complexity that most users do not have the time to wade through.
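To make that concrete, here is a minimal, made-up sketch of the kind of class-based testbench code I mean (the class and task names are purely illustrative, not from any particular methodology). None of the state below (the object handle, the local variables, or the constraint) ever appears as a signal in a waveform; it lives only on the simulator’s call stack and heap.

class packet;
  rand bit [7:0]  addr;
  rand bit [31:0] data;
  // Constraint solved at randomization time; invisible in a waveform
  constraint legal_addr { addr inside {[8'h10:8'hEF]}; }
endclass

class driver;
  task run();
    packet p = new();            // local handle, exists only on this call stack
    repeat (10) begin
      if (!p.randomize())
        $error("randomization failed");
      drive_pins(p);             // hypothetical task that wiggles the DUT pins
    end
  endtask

  task drive_pins(packet p);
    // Pin wiggling goes here; this is the only part a waveform can show.
  endtask
endclass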

Next week I will be presenting as part of an Industry Special Session during the Forum on Specification & Design Languages (FDL, September 24-26, 2013) that will discuss these issues and try to get more involvement from the academic and user communities to help resolve them. Was combining constructs from many languages into one a success? Can tools present source-level constructs in an easier graphical form? Hopefully, we will not need another decade.

Dave Rich

Dave Rich is a Verification Technologist at Mentor Graphics and is one of the authors of Mentor’s Advanced Verification Methodology cookbook. He began his career as a design and verification engineer in 1981 at Data General. In 1987, he joined Gateway Design Automation as one of the first application engineers to support Verilog-XL. At Gateway, he helped design many of the early features of the Verilog Hardware Description Language (HDL), and after Cadence acquired Gateway, he helped prepare the Language Reference Manual (LRM) that would eventually be donated to the newly formed Open Verilog International. In 1995, he joined another Verilog simulation company, Frontline Design Automation, as an AE manager and later as a Product Manager after it was acquired by Avant!. In 1998, he joined Ambit Design and worked as a consulting engineer for both synthesis and simulation products after it was acquired by Cadence. In 2000, he joined Co-Design Automation as Director of Application Engineering, where the Superlog HDL was being developed that eventually became the basis of the Accellera SystemVerilog 3.0 standard. Co-Design Automation was acquired by Synopsys in 2002. Dave began work on numerous technical committees within Accellera and later the IEEE P1800 working group, work which he continues today.


Comments

8 thoughts about “A Decade of SystemVerilog: Unifying Design and Verification?”
  • Pingback: AgileSoC
  • “the test/testbench is more like a piece of software than a hardware description”

    I couldn’t agree with this more.

    Why do we continue to attempt to bolt newer software constructs onto Verilog rather than just using an existing software language for what is essentially just software? Why try to compete with C, C++, Python, etc.? These languages have vast amounts of cumulative investment behind them, resulting in efficient execution and thousands of open-source third-party libraries, and they are more accessible. If SV was any good as a software language in its own right, it would be adopted outside of EDA.

  • The questions raised by Neil and Chris are good questions to be asking. They go much deeper than the debug-specific area I was focused on.
    Coincidentally, I’m in the middle of taking a full C++ class after years of sporadically using SystemC. One of the topics is comparing the OOP model of C++ versus languages like Java, C#, and others. The question that always comes up is: “What is the best language?” and the answer is: “The one you are not using!”
    HDLs certainly do handle single bit manipulations and massive amounts of parallelism more efficiently than most other programming languages (the small made-up fragment at the end of this comment illustrates what I mean).
    And although testbenches are more like software than hardware, the people writing the code are more disciplined in hardware than software. If I ever thought the error messages coming out of a SystemVerilog compiler like Questa were unhelpful, I’m now thankful, after taking this class, that I don’t have to use a C++ compiler daily.

    Do read Neil’s post. It’s not really about the language the code is written in, but about the development process used to write it. http://www.agilesoc.com/2013/09/19/a-decade-of-systemverilog-driving-design-and-verification-apart/
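    Here is a rough, made-up RTL fragment (illustrative only, not from any real design) of what I mean about bit manipulation and parallelism: the reduction operator and part-select are one-liners, and the two always blocks describe logic that runs concurrently.

    module parity_and_byte (
      input  logic        clk,
      input  logic [31:0] data_in,
      output logic        parity,
      output logic [7:0]  low_byte
    );
      always_ff @(posedge clk) parity   <= ^data_in;      // reduction XOR across all 32 bits
      always_ff @(posedge clk) low_byte <= data_in[7:0];  // byte part-select
    endmodule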

  • Dave,

    Off topic a bit here, but I always find it interesting when I hear people say that hardware developers are more disciplined than software developers. I wonder why we think that’s the case? My opinion (for the little it’s worth) is that we *think* we’re more disciplined because we have to be more disciplined… in ASIC dev anyway, with the cost of failure supposedly being higher than it is with catastrophic software failure.

    From what I’ve learned the last few years, I feel comfortable suggesting high-end software teams are far more disciplined than high-end hardware teams. Example: teams deploying features several times a day without issue. Another example: teams that operate without bug tracking tools because defects are fixed on the spot. Then there are the teams that use ATDD/BDD/TDD. The structure and discipline required to do those types of things is quite incredible and something I’ve not seen or heard of from hardware organizations. Certainly, I’ve also heard the stories at the lower end of manual testing and regression, so obviously there’s no blanket statement to be made. On average, I think we’re probably equally disciplined. But in terms of structure, discipline, and overall effectiveness of top-end SW vs. top-end hardware teams, it sure feels like they’re leading us as opposed to the other way around.

    Thanks for posting the agilesoc link. One suggestion I forgot to add in there is pulling in psychologists along with academia and user communities. Seems the software folks have also figured out there’s a human component to design (and usage). Seems we’re so fixated on the technical that the human aspects that drive productivity are completely underestimated. The human component eventually trickles down to language.

    -neil

  • > The question that always comes up is: “What is the best language?” and the answer is: “The one you are not using!”

    For any given task we should be asking some basic questions and evaluating what the best tool for the job is. Do we favour development speed or execution speed? What skillsets are available? How much are the licenses? What third-party libraries are available? Do we have to interface to other systems? For most software tasks it is extremely unlikely that SystemVerilog is the optimal language. For verification of hardware designs it may or may not be the best option, but it seems that a full evaluation of the options is a rarity.

    > HDLs certainly do handle single bit manipulations and massive amounts of parallelism more efficiently than most other programming languages.

    I’m not sure the software industry would agree with that statement. Let’s take a common massively parallel task like a web server. There are many companies with deep pockets and highly talented engineers continually researching ways to serve greater numbers of simultaneous connections efficiently, and I’m not aware of anybody using SV as a web server!

    > And although testbenches are more like software than hardware, the people writing the code are more disciplined in hardware than software.

    In the same way that 93% of drivers consider themselves to be above average 😉

    I don’t think SW or HW can claim to be more disciplined; there are all types of engineer (slow, brilliant, sloppy, methodical) in both hardware and software. We should bring HW and SW teams together more often to compare notes on how best to tackle various problems; both sides might learn a thing or two.

  • Chris,

    There will always be domain-specific languages, and there will always be some overlap between those languages. The key is to make sure those domains can communicate, and that the people working in those different domains communicate as well. That is the crux of Neil’s comments.
    BTW, there is a big difference between a web server servicing many independent connections and the highly synchronized threads involved in describing hardware.

  • Dave, Chris,

    The language of choice is quite important, as you have all recognized. I’d like to make the point, though, that although the HDLs have the programming constructs to describe parallelism in the hardware, this is quite different from what occurs when the design is simulated. At that point it seems the default is to drop down to a linear processing route. As such, a language with a more efficient, usable, and well-tested ecosystem brings greater benefits to the verification problem.

    Stu

  • Dave,

    I agree that domain-specific languages are necessary. I suppose my question is: why do we assume we should use the same language for verification that we use to describe the hardware? I don’t think this is a rational decision; it just happens to be the default industry position. Personally I think this hinders our progress.

    I think there are other drawbacks to blurring the HDL/HVL distinction which aren’t mentioned: for example, some features are added to the language to assist verification and happen to be synthesisable, but it’s hit-and-miss whether a given feature is actually supported. We have a well-defined and all-encompassing language but varying (and sometimes unpredictable) levels of support in different tools.

    > BTW, there is a big difference between a web server servicing many independent connections and the highly synchronized threads involved in describing hardware.

    But we’re discussing verification here, not describing hardware (for which I do not doubt HDLs and a modern simulator are very efficient). Unless you’re delving into the realms of synthesisable protocol checkers or similar, verification is entirely software.


This article first appeared on the Siemens Digital Industries Software blog at https://blogs.stage.sw.siemens.com/verificationhorizons/2013/09/19/a-decade-of-systemverilog-unifying-design-and-verification/