Thursday 30 July 2015

“Verification or Validation? What do you think?”

Would you kindly help me clear something up?
A colleague and I were having a lively argument about the difference between Validation and Verification and it got me thinking. What’s the difference anyway? What do we mean by these terms and do you and I mean the same thing?
Winning the argument seemed pretty important to me at the time, so I did some deep and extensive research (oh, alright, I Googled it) and found the Wikipedia definitions, as follows . . .
  • “Verification is a quality control process that is used to evaluate whether a product, service, or system complies with regulations, specifications, or conditions imposed at the start of a development phase.”
  • “Validation is a quality assurance process of establishing evidence that provides a high degree of assurance that a product, service, or system accomplishes its intended requirements.”
These definitions appear to say the same thing, but if you dig into the semantics there IS a difference. A friend at ARM put it very nicely in a presentation to a DVClub Conference in the UK late last year. He said the difference was highlighted in these two questions . . .
  • Verification asks: “Are we building the product right?”
  • Validation asks: “Are we building the right product?”
I’m pretty sure I’ve heard these before, and they do appear in many places on the web, so my ARM friend may not be totally original here, but they are good questions nonetheless (if you know where they originate, then please let me know).
What we infer from the questions is the following . . .
Verification is a matching of results to specification. It is a methodical process of proving that something does what you asked for, and nothing else. The specification is taken as golden. The aim: a proof that the design meets the specification.
Validation, on the other hand, is the exercising of the design to check that it is fit for purpose. It is a subjective process of using the design, perhaps in situ, definitely with the embedded software, to see if it does what you need. The specification is NOT golden and in effect is under test along with the design. The aim: a proof that the design AND the specification meet purpose.
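To make the distinction concrete, here is a minimal Python sketch. It is purely illustrative: the function names, the latency limit and the playback report are hypothetical stand-ins, not part of any real flow. The verification check is an objective pass/fail against a number taken from the specification, whereas the validation check exercises the intended use case and leaves room for the engineer’s judgement.

```python
# A minimal, illustrative sketch: the names and numbers are hypothetical,
# not taken from any real testbench or specification.

SPEC_MAX_LATENCY_CYCLES = 128   # a limit written down in the specification


def verify_latency(observed_cycles: int) -> bool:
    """Verification: objective comparison against the spec, which is taken as golden."""
    return observed_cycles <= SPEC_MAX_LATENCY_CYCLES


def validate_playback(playback_report: dict) -> str:
    """Validation: judge whether the design is fit for purpose.

    The spec itself is also under test; even a spec-compliant result
    may be judged unacceptable once the real use case is exercised.
    """
    if playback_report["dropped_frames"] == 0 and playback_report["stutter_events"] == 0:
        return "looks fit for purpose"
    return "needs a closer look in the lab"


if __name__ == "__main__":
    # Verification: a hard pass/fail against the specification.
    print("verification pass:", verify_latency(observed_cycles=96))
    # Validation: a fitness judgement based on exercising the real workload.
    print("validation verdict:",
          validate_playback({"dropped_frames": 3, "stutter_events": 1}))
```

Notice that the validation verdict depends on what we decide “fit for purpose” means, which is exactly why the specification itself ends up under test alongside the design.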
So where does FPGA-based Prototyping come in? Well, here’s what you told us. In answer to a question in the FPMM download survey, nearly 2000 users kindly shared the following data about their reasons to use FPGA-based Prototyping. . .
Wait a minute; this says that people use FPGA-based Prototyping to “Verify” the RTL, or to “verify” the hardware and software co-design. So, is FPGA-based Prototyping a verification technology? Clearly, looking at the most popular answer, prototypes do expose RTL bugs that sneak through the normal verification process. How is that possible?

A Safety Net for Verification?

Verification is impacted by the classic speed-accuracy trade-off. We can have high accuracy in an RTL simulator and even go to the gate level, but speed is so far below real-time that some tests simply take too long, even on an accelerator or an emulator. On the other hand, high-level modelling in SystemC and other virtual prototypes gives us much better performance, but we no longer have cycle-accurate results. Only FPGA-based Prototyping offers the unique combination of high speed AND cycle accuracy, allowing longer and more complex tests to be run on the prototype than in a simulator, catching otherwise unseen RTL bugs.
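To put some rough numbers on that trade-off, here is a back-of-envelope sketch in Python. The throughput figures are assumed, order-of-magnitude values, not benchmarks of any particular tool, and the one-billion-cycle test length is simply a stand-in for something like an OS boot:

```python
# Back-of-envelope only: the cycles-per-second figures are assumed,
# order-of-magnitude values, not benchmarks of any particular tool.

TEST_LENGTH_CYCLES = 1_000_000_000          # e.g. a long boot-and-run test

platform_throughput_hz = {
    "RTL simulator":        1_000,          # ~kHz of design cycles (assumed)
    "emulator":             1_000_000,      # ~MHz (assumed)
    "FPGA-based prototype": 20_000_000,     # tens of MHz (assumed)
}

for platform, cycles_per_second in platform_throughput_hz.items():
    hours = TEST_LENGTH_CYCLES / cycles_per_second / 3600
    print(f"{platform:22s}: {hours:8.2f} hours of wall-clock time")
```

Even with generous assumptions, the simulation run is measured in days while the prototype run is measured in seconds; that gap is what lets the prototype run the long, complex tests that expose otherwise unseen RTL bugs.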
So, is FPGA-based Prototyping a verification technology? No; it lacks the necessary observability, controllability and determinism required for the objective testing of RTL. However, we could quite rightly consider it a “safety net” for verification.

Objective vs. Subjective

Because Verification is an objective comparison of results against the specification, there is massive scope for automation, for example in the UVM and VMM methodologies. Validation, however, is more subjective and so less easy to automate, relying more on the expertise of the prototypers themselves. Prototypers need to see the system running in the real environment, actually performing the task for which the specification was created. We may also choose to exercise the design outside the specification’s envelope in order to explore further optimisation, or to improve upon the specification. There is an emphasis on debug skills and in-lab investigation.
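As a sketch of why objective checking automates so well, here is a toy regression loop in plain Python. It is only an illustration: the saturating adder, the reference model and the vector count are hypothetical, and a real flow would use something like a UVM scoreboard driving RTL in a simulator rather than two Python functions.

```python
import random

# Illustrative only: two Python functions stand in for the spec-derived
# golden model and the RTL design under test.

def reference_model(a: int, b: int) -> int:
    """Golden behaviour as the specification describes it (a saturating 8-bit add)."""
    return min(a + b, 255)


def design_under_test(a: int, b: int) -> int:
    """Hypothetical stand-in for the RTL implementation."""
    return min(a + b, 255)


def run_regression(num_vectors: int = 10_000, seed: int = 1) -> int:
    """Automated, objective checking: every mismatch is a failure, no judgement needed."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(num_vectors):
        a, b = rng.randrange(256), rng.randrange(256)
        if design_under_test(a, b) != reference_model(a, b):
            failures += 1
    return failures


if __name__ == "__main__":
    print("mismatches out of 10,000 random vectors:", run_regression())
```

There is no equivalent loop for validation, because there is no single golden answer to compare against; someone has to watch the system do its real job and judge the result.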

FPGA-based Prototyping is a Validation Technology

Looking back at your survey responses, we see that “System Validation”, “System Integration” and “Software Development” are also popular uses for FPGA-based Prototyping. These use modes are definitely in the validation camp. Here we are using the FPGA-based Prototype as a substitute for first silicon and, in effect, running early acceptance tests on the design and its software. Once again, we are taking advantage of its unique combination of high speed and accuracy.
In many cases the FPGA-based Prototype is used as a real-world platform upon which to exercise the software, especially software at the lower (physical) levels of the stack. Of course, we may find some RTL bugs when we run the real software at speed (kudos indeed to the verification team if we don’t!) and this is an excellent by-product; however, it was not the prototype’s original purpose. Simulation and emulation are better for verification, while FPGA-based Prototyping is better for validation. Virtual prototyping also falls into the validation camp, with emphasis on the higher levels of the software stack and the pre-RTL stages of the design.
If finding RTL bugs is your purpose, then simulation and a good verification methodology will be your best bet. If exercising software and validating the system is your purpose, then prototyping is a far better choice than any verification technology.
In the end, most SoC teams will use both and value the contribution of each equally.
