Driving "X"

  • Thread starter pranavtailor@gmail.com
Hi,
I have one query regarding driving X. For some protocols there is an
"inter frame gap" during which the DUT (design under test) is not
concerned about the values coming on its data bus. So is it OK or not
OK to drive "X" instead of some random "1" and "0" patterns from the
testbench?
Thanks in advance,
Pranav
 
On 26 Feb 2006 12:05:12 -0800, "pranavtailor@gmail.com"
<pranavtailor@gmail.com> wrote:

It depends on what is supposed to happen during the inter-frame gap.
Some communication systems use an "idle" state with a strict
definition. Some systems allow you to put the output drivers into
common mode, which can be modeled with X, but you have to start with a
reset or a known idle before actual communication starts. In any case,
it is probably a good idea not to let X get out of your "chip". Define
a bus state (encoding) which you can use when the channel is idle.
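A minimal sketch of that idea in Verilog (the clock period, `IDLE_PATTERN`, and all names here are illustrative, not from any particular protocol):

```verilog
// Sketch: a testbench that drives a defined idle encoding during the
// inter-frame gap instead of letting X out onto the bus.
module tb;
  localparam [7:0] IDLE_PATTERN = 8'h7E;  // hypothetical idle code
  reg       clk = 0;
  reg [7:0] data_bus;

  always #5 clk = ~clk;

  task drive_idle(input integer cycles);
    integer i;
    begin
      for (i = 0; i < cycles; i = i + 1)
        @(posedge clk) data_bus <= IDLE_PATTERN;  // defined idle, never X
    end
  endtask

  initial begin
    drive_idle(4);                        // gap before the first frame
    @(posedge clk) data_bus <= 8'hA5;     // first byte of a frame
    drive_idle(4);                        // inter-frame gap
    $finish;
  end
endmodule
```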
 
Hi,
I wanted to know whether it's a good idea to drive X from the
testbench into the chip (DUT)...and not the DUT driving X.
Pranav
 
Depends upon your protocol spec and what you want to test. If your
protocol spec implies that the other device (not the DUT) might be
driving garbage at certain points and you want to ensure that your
chip doesn't read/react to the garbage, then you might want to drive X
from your testbench. You might also want to drive random values in
addition to make sure your DUT isn't "special casing" X inputs.

However, you need to read the protocol spec carefully and be prepared
for arguments, since there may be ways to interpret the spec that
require the non-DUT not to drive garbage in those periods (perhaps
requiring it to drive Z for instance).
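One way to sketch that mix of X and random garbage in a Verilog testbench task (assuming `clk` and an 8-bit `data_bus` already exist in the testbench; the names are illustrative):

```verilog
// Sketch: during the inter-frame gap, alternate randomly between
// driving all-X and driving random 0/1 garbage, so the test covers
// both "the DUT ignores unknowns" and "the DUT isn't special-casing X".
task drive_gap(input integer cycles);
  integer i;
  begin
    for (i = 0; i < cycles; i = i + 1) begin
      @(posedge clk);
      if ($random % 2)
        data_bus <= 8'hxx;     // unknown garbage
      else
        data_bus <= $random;   // random known garbage
    end
  end
endtask
```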

Hope this helps,
-Chris

*****************************************************************************
Chris Clark               Internet : compres@world.std.com
Compiler Resources, Inc.  Web Site : http://world.std.com/~compres
23 Bailey Rd              voice    : (508) 435-5016
Berlin, MA 01503 USA      fax      : (978) 838-0263 (24 hours)
------------------------------------------------------------------------------
 
On 27 Feb 2006 09:22:34 -0500, Chris F Clark
<cfc@shell01.TheWorld.com> wrote:

I've also used this as a crude way of checking for timing violations.
Many RAMs, for example, have a period during which they are driving
their data lines (so Z isn't appropriate) but the data isn't
guaranteed to be valid.
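A behavioral model of that trick might look like the following (the port names and `T_ACCESS` value are illustrative, not from any real part's datasheet):

```verilog
// Sketch: a RAM model that drives X on its data outputs during the
// window where the bus is driven but the data isn't yet valid. A DUT
// that samples too early captures X, making the violation visible.
module ram_model (input rd, input [7:0] addr, output reg [7:0] dout);
  parameter T_ACCESS = 15;      // hypothetical access time (ns)
  reg [7:0] mem [0:255];

  always @(posedge rd) begin
    dout = 8'hxx;               // bus driven, data not yet guaranteed
    #T_ACCESS dout = mem[addr]; // valid data after the access time
  end
endmodule
```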

Regards,
Allan
 
Driving X out of the DUT could indicate a contention problem. Ideally,
a DUT should be a well-defined machine whose output is always
predictable (well, except for random-generator logic) at any time.
Driving X from a DUT is confusing at best: is it the intention of the
design to output an "X", or is it a design error? The test vectors
should be able to determine whether an error has occurred based solely
on the outputs.
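A simple testbench monitor can enforce that at the boundary (`dut_out` is an illustrative name for any DUT output):

```verilog
// Sketch: flag any X or Z on a DUT output, so an internal contention
// or uninitialized register is caught at the chip boundary.
always @(dut_out)
  if (^dut_out === 1'bx)   // reduction XOR is X if any bit is X or Z
    $display("%0t: X/Z seen on dut_out = %b", $time, dut_out);
```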

~jz

 
