verification vs validation

pradeep

Guest
Hi,

Can anyone give the difference between verification and validation?

What is system-level validation?

What is emulation?

With regards,
pradeep.g
 
pradeepg@vlsi1.sastra.edu (pradeep) wrote
Can anyone give the difference between verification and validation?
You validate an expression by verifying it is true for several
examples.
You verify an expression by doing this for every possible example.

"All prime numbers are odd" is validatable with 3, 5, 7, 41, 43 but not
verifiable, because it is wrong for 2 *g*.
It is very easy to validate that your 32-bit adder will do the right
thing, but verifying that your adder is correct for all 2^64
possible input pairs is a hard job.
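The adder point can be sketched in a few lines of Python (a toy model, not any real tool flow; the 8-bit width and all names are made up so that exhaustive checking stays feasible):

```python
# Toy illustration: validation spot-checks a few inputs, verification
# covers all of them. An 8-bit adder has 2**16 input pairs, which is
# enumerable; a 32-bit adder would need 2**64 runs.
import random

WIDTH = 8
MASK = (1 << WIDTH) - 1

def adder_model(a, b):
    # Stand-in for the design under test.
    return (a + b) & MASK

def validate(n_samples=100):
    # Validation: spot-check a handful of randomly chosen inputs.
    for _ in range(n_samples):
        a = random.randrange(1 << WIDTH)
        b = random.randrange(1 << WIDTH)
        if adder_model(a, b) != (a + b) & MASK:
            return False
    return True

def verify():
    # Verification: every possible input pair, 2**(2*WIDTH) cases.
    return all(adder_model(a, b) == (a + b) & MASK
               for a in range(1 << WIDTH)
               for b in range(1 << WIDTH))
```

A passing `validate()` builds confidence; only `verify()` rules out every counterexample, and the cost of doing that grows exponentially with the input width.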

bye Thomas
 
Thomas Stanka wrote:
pradeepg@vlsi1.sastra.edu (pradeep) wrote

Can anyone give the difference between verification and validation?


You validate an expression by verifying it is true for several
examples.
You verify an expression by doing this for every possible example.

"All prime numbers are odd" is validatable with 3, 5, 7, 41, 43 but not
verifiable, because it is wrong for 2 *g*.
It is very easy to validate that your 32-bit adder will do the right
thing, but verifying that your adder is correct for all 2^64
possible input pairs is a hard job.

bye Thomas
This is true and is the reason we now have formal verification tools that can
verify the functionality of code using mathematical proof techniques, rather
than trying to empirically test using all possible inputs. They are generally
orders of magnitude faster, but of course, this is yet another tool to buy/license.

--

Regards,

Brent Hayhoe.

Aftonroy Limited
Email: Brent.Hayhoe@Aftonroy.com
 
Larry Doolittle wrote:

In article <bpkl83$ujp$1@newsg2.svr.pol.co.uk>, Brent Hayhoe wrote:

... we now have formal verification tools that can
verify the functionality of code using mathematical proof techniques, rather
than trying to empirically test using all possible inputs. They are generally
orders of magnitude faster, but of course, this is yet another tool to buy/license.


The software people have generally given up on the "formal proof
of correctness" concept. It turns out to be just as hard and
error-prone to set up the proof as it is to write good code.
This may be true for software in general, but not for HDLs. Synopsys' Formality
and Mentor Graphics FormalPro are but 2 examples of commercial formal
verification tools. Also the development and standardization of PSL is another
example of the commitment to this concept.

The latter is to do with property checking, i.e. stating what you think your code
should do.

They also have a very wide use in equivalence checking, used to verify post-synthesis
and post-place-&-route gate-level code, essential if you don't
really trust your tools!
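Equivalence checking can be illustrated with a toy exhaustive comparison (real equivalence checkers use BDD/SAT engines rather than enumeration; all names here are made up for the sketch):

```python
# Toy equivalence check: compare a "behavioural" expression with its
# NAND-only "gate-level" rewrite over every input combination.
from itertools import product

def behavioural(a, b, c):
    return (a and b) or c              # the design intent

def nand(x, y):
    return not (x and y)

def gate_level(a, b, c):
    # (a AND b) OR c built from NAND gates only:
    # NAND(NAND(a, b), NAND(c, c))
    return nand(nand(a, b), nand(c, c))

def equivalent(f, g, n_inputs):
    # Exhaustive comparison is only practical for small input counts.
    return all(bool(f(*bits)) == bool(g(*bits))
               for bits in product([False, True], repeat=n_inputs))
```

Here `equivalent(behavioural, gate_level, 3)` holds, which is the kind of fact an equivalence checker establishes between RTL and a post-synthesis netlist, just at industrial scale.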

From my "attitudes" page:

"There are two ways of constructing a software design. One way is
to make it so simple that there are obviously no deficiencies
and the other is to make it so complicated that there are no
obvious deficiencies."
- C A R Hoare

.. and of course this applies just as well to HDL as software.
This is not about making design code complicated. Good code needs to be
understandable in order to be maintainable, which is often overlooked. Moreover, this is
about capturing 'design intent' and providing a method of verifying it.

The second use comes from a deep-seated mistrust of application tools, which are
of course written in software!

And the need for this? Well, a lot of HDL code is targeted at ASICs, and once
committed it is neither easy nor cheap to patch up errors. Software, on
the other hand, is more forgiving.

--

Regards,

Brent Hayhoe.

Aftonroy Limited
Email: Brent.Hayhoe@Aftonroy.com
 
In article <bpkl83$ujp$1@newsg2.svr.pol.co.uk>, Brent Hayhoe wrote:
... we now have formal verification tools that can
verify the functionality of code using mathematical proof techniques, rather
than trying to empirically test using all possible inputs. They are generally
orders of magnitude faster, but of course, this is yet another tool to buy/license.
The software people have generally given up on the "formal proof
of correctness" concept. It turns out to be just as hard and
error-prone to set up the proof as it is to write good code.

From my "attitudes" page:

"There are two ways of constructing a software design. One way is
to make it so simple that there are obviously no deficiencies
and the other is to make it so complicated that there are no
obvious deficiencies."
- C A R Hoare

... and of course this applies just as well to HDL as software.

- Larry
 
In article <bplfqa$h29$1@newsg2.svr.pol.co.uk>, Brent Hayhoe wrote:
Larry Doolittle wrote:

The software people have generally given up on the "formal proof
of correctness" concept. It turns out to be just as hard and
error-prone to set up [formal correctness] proof as it is to
write good code.

This may be true for software in general, but not for HDLs. Synopsys'
Formality and Mentor Graphics FormalPro are but 2 examples of commercial
formal verification tools. Also the development and standardization of
PSL is another example of the commitment to this concept.
Just because companies sell products and standards committees hold
meetings doesn't mean there is any lasting substance behind the concepts.

They also have a very wide use in equivalence checking, used to verify post-synthesis
and post-place-&-route gate-level code, essential if you
don't really trust your tools!
If you don't trust your tools, _they_ did not understand the Hoare quote,
and you should switch tools. If all tools for a language are unreliable,
then the language itself is defective. How do you prove that your
equivalence checking software itself is not buggy?

This is not about making design code complicated. Good code needs to be
understandable in order to be maintainable, which is often overlooked. Moreover,
this is about capturing 'design intent' and providing a method of verifying
it.
Both conventional HDL and formal correctness proofs have to capture
design intent. And at some level I can understand that two different
and cross-checkable ways of expressing design intent are a Good Thing [TM].
But if the design cannot be described in a way that is "so simple that
there are obviously no deficiencies" in either paradigm, you're sunk.
Once you come up with a sensible paradigm with which to capture the design
intent, synthesize it and go on to the next project!

Of course, none of what I say argues against test benches, spot checks,
regression tests, or any other technique to exercise your code. But
the simpler the original design (or even, the simpler the initial
representation of the original design), the easier it is for such tests
to build confidence that the design _always_ does the right thing.

And the need for this? Well, a lot of HDL code is targeted at ASICs, and
once committed it is neither easy nor cheap to patch up errors.
Software, on the other hand, is more forgiving.
Depends. Read about the Space Shuttle software effort sometime.

- Larry
 
A synonym for "verification" is revising. You ensure that your design at a
lower level of abstraction conforms to its specification (a design, or model,
at a higher level of abstraction). You prove they match: you look at one,
look at the other, and conclude that they are equal (do the same job, have
the same behaviour and function). The lower-level model just shows more
structural detail, but the response to the same input vectors should be the same.
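The idea of matching a lower-level model against a higher-level one over the same input vectors can be sketched as (toy Python models; the 4-bit width and names are illustrative):

```python
# Two models of the same 4-bit adder at different abstraction levels:
# the structural model shows more detail, but must give the same
# response to every input vector as the specification-level model.

WIDTH = 4

def high_level(a, b):
    # Specification-level model: plain integer addition, truncated.
    return (a + b) % (1 << WIDTH)

def bit_level(a, b):
    # Lower-level model: ripple-carry chain of per-bit full adders.
    carry, result = 0, 0
    for i in range(WIDTH):
        x = (a >> i) & 1
        y = (b >> i) & 1
        result |= (x ^ y ^ carry) << i          # sum bit
        carry = (x & y) | (carry & (x ^ y))     # carry out
    return result

# Drive both models with the same input vectors; responses must match.
assert all(high_level(a, b) == bit_level(a, b)
           for a in range(1 << WIDTH)
           for b in range(1 << WIDTH))
```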

Validation is a test where you demonstrate (to your teacher, boss, customer
or yourself) that the feature you have implemented does what it was intended to
do. You do not match one design against another.






I would be happy if anybody explained the differences between emulation vs.
simulation vs. imitation once again. I have the following synonyms for them:

imitation = fake, falsification;

simulation = modelling. We develop a device but we are not ready to put
our model (most likely described at a high level of abstraction) into real HW,
because of the high cost and the little information we can get from HW. We
model its behaviour in SW instead. Waveforms give complete
information;

emulation = an environment. The device (or program) we have developed
cannot exist without communication with the external world. When our design is
ready, we can connect an emulator to it, rather than the real target environment,
to test all the parameters of our device. In-circuit emulators emulating a CPU can
be used to test RAM and all the peripherals. Then the roles can be reversed,
and emulators of peripherals can test a real CPU. Other types of emulators
are FPGAs and SW platforms. Programs, as well as devices, cannot exist alone;
they need a target execution device (like a CPU or OS). You can imitate the
environment your program expects to see using an emulator. You can run Excel
designed for x86/Windows on Sparc/Solaris, or an IP core designed for an ASIC
technology on an FPGA. Emulators are used to replace real target execution
devices (or their prototypes) when the target device is not available, or to
collect more information from your design (whether a program or a device).

I feel that these explanations are right but I cannot find exact
definitions.
 
Larry Doolittle <ldoolitt@recycle.lbl.gov> wrote:
In article <bplfqa$h29$1@newsg2.svr.pol.co.uk>, Brent Hayhoe wrote:
They also have a very wide use in equivalence checking, used to verify post-synthesis
and post-place-&-route gate-level code, essential if you
don't really trust your tools!

If you don't trust your tools, _they_ did not understand the Hoare quote,
and you should switch tools. If all tools for a language are unreliable,
then the language itself is defective. How do you prove that your
equivalence checking software itself is not buggy?
The problem is that there are some statements whose synthesis is
tool-dependent.
A good example is state machines. You can't trust any synthesis tool
to generate the logic _you_ expect. Of course it will likely
generate logic that behaves "correctly" for the "normal" set of
inputs, but what if your state machine enters an illegal state? What if
you intended a special encoding?
We just migrated some VHDL code from Synopsys to Synplify, and the
testbenches didn't report any differences; the structural equivalence
check did.
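The state-machine point can be illustrated with a toy model (the encodings below are hypothetical, not any particular tool's output): two implementations that agree on every legal state can still diverge once an illegal state is entered.

```python
# Toy 3-state machine with one unused encoding (state 3). The two
# "tool outputs" implement identical transitions for the legal states
# but handle the illegal state differently, as a synthesis tool is
# free to do.

def next_state_a(state, go):
    # Tool A: the unused state falls through to the reset state.
    table = {0: (0, 1), 1: (1, 2), 2: (2, 0), 3: (0, 0)}
    return table[state][go]

def next_state_b(state, go):
    # Tool B: the unused state is a trap with no recovery.
    table = {0: (0, 1), 1: (1, 2), 2: (2, 0), 3: (3, 3)}
    return table[state][go]

LEGAL = (0, 1, 2)

# A testbench that only ever drives legal states sees no difference...
assert all(next_state_a(s, g) == next_state_b(s, g)
           for s in LEGAL for g in (0, 1))
# ...but the implementations diverge once the illegal state is entered.
assert next_state_a(3, 1) != next_state_b(3, 1)
```

This is why a structural equivalence check can flag a difference that a functional testbench never exercises.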

But you've picked the right point: you can't trust structural
equivalence checking if you can't trust your VHDL code *g*.

bye Thomas
 

Welcome to EDABoard.com
