SystemVerilog: How to get wall clock time delays w/o knowledge of the timescale

mrfirmware

I'm trying to testbench a PCIe endpoint device. Each PCIe packet that
requires a completion from the endpoint device must receive the
completion packet within a specified time limit. When I initiate the
request (send the packet to the endpoint) I would like to record the
current simulation time, e.g. req[trn_id].give_up_time = $realtime +
20ms, and then spawn a task waiting for the completion to come back or
else bail out when simulator-time exceeds give_up_time.

My problem is, I cannot programmatically determine how many realtime
ticks make up 20ms. I suppose I could grab the output string of
$printtimescale and parse it for the units, but that seems a bit
clunky. Am I missing something obvious here?

Thanks,
- Mark
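
As a sketch of the watchdog pattern described above (purely illustrative:
the req[] array, its done event, and the task name are hypothetical), a
fork/join_any timeout in SystemVerilog could look like this:

  // Sketch only: wait for a completion or give up after 20 ms.
  // req[] and its done event are hypothetical testbench structures.
  task automatic wait_for_completion(input int trn_id);
    fork
      begin : timeout
        #20ms;                       // time literal, scaled to the time unit in effect
        $error("TLP %0d: no completion within 20 ms", trn_id);
      end
      begin : got_completion
        @(req[trn_id].done);         // hypothetical per-transaction completion event
      end
    join_any
    disable fork;                    // kill whichever branch is still pending
  endtask

Each outstanding request would spawn this in the background as soon as its
TLP is sent.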
 
On Jan 31, 1:25 pm, mrfirmware <mrfirmw...@gmail.com> wrote:
> I'm trying to testbench a PCIe endpoint device. Each PCIe packet that
> requires a completion from the endpoint device must receive the
> completion packet within a specified time limit. When I initiate the
> request (send the packet to the endpoint) I would like to record the
> current simulation time, e.g. req[trn_id].give_up_time = $realtime +
> 20ms, and then spawn a task waiting for the completion to come back or
> else bail out when simulator-time exceeds give_up_time.
>
> My problem is, I cannot programmatically determine how many realtime
> ticks make up 20ms. I suppose I could grab the output string of
> $printtimescale and parse it for the units, but that seems a bit
> clunky. Am I missing something obvious here?
>
> Thanks,
> - Mark
Yeah, put a `timescale directive in your file.
 
On Feb 1, 12:39 am, Ryan <Ryan.Warner...@gmail.com> wrote:
> Yeah, put a `timescale directive in your file.
Don't use 'em. I set the timescale once at simulator invocation, so all
modules are forced to that timescale, which gives a value I can then
use throughout my testbench a priori. However, it seems a bit odd that
the language doesn't have a concept of wall clock time, or timers for
that matter.
- Mark
 
On Feb 1, 8:25 am, mrfirmware <mrfirmw...@gmail.com> wrote:
> Don't use 'em. I set the timescale once at simulator invocation, so all
> modules are forced to that timescale, which gives a value I can then
> use throughout my testbench a priori. However, it seems a bit odd that
> the language doesn't have a concept of wall clock time, or timers for
> that matter.
I agree that this is odd. I find it strange that the delays
themselves are unitless, and that the units are specified separately
for the entire module by a compiler directive mechanism.

But I find it even stranger that you are not using this available
mechanism to specify the units in your design, instead waiting until
you invoke the simulator to decide the units. That is a much greater
separation between the delays and the units, since your design doesn't
contain the units at all.

In SystemVerilog, you can specify time literals with units. They are
still just real numbers, but are reverse-scaled by the timescale so
that timescaling will give the desired value. That means that you can
not only get the delay you want (such as 20ms), but you can also
extract the timescale. For example, the value (1.0/1fs) will tell you
the ratio between a delay of 1.0 and a delay of 1fs, which is the
timescale as a number of femtoseconds.
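
To illustrate that reverse-scaling (an editor's sketch, not from the post;
note that time literals are themselves rounded to the time precision, so
the 1fs trick needs a precision at least that fine):

  `timescale 1ns / 1fs

  module tb;
    initial begin
      // A time literal is reverse-scaled, so 1fs becomes the fraction of one
      // time unit that a femtosecond represents; dividing it out recovers the
      // time unit itself, in femtoseconds.  (Time literals are rounded to the
      // time precision, so this particular trick needs a 1fs precision.)
      real fs_per_unit = 1.0 / 1fs;
      $printtimescale(tb);                        // "Time scale of (tb) is 1ns / 1fs"
      $display("one time unit = %0.0f fs", fs_per_unit);
      $display("20 ms = %g time units", 20ms);    // the reverse-scaled literal
    end
  endmodule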
 
On Feb 1, 1:24 pm, sh...@cadence.com wrote:
> I agree that this is odd. I find it strange that the delays
> themselves are unitless, and that the units are specified separately
> for the entire module by a compiler directive mechanism.
>
> But I find it even stranger that you are not using this available
> mechanism to specify the units in your design, instead waiting until
> you invoke the simulator to decide the units. That is a much greater
> separation between the delays and the units, since your design doesn't
> contain the units at all.
Hmm... I thought since timescale only meant something at simulation
time it would make sense to specify it at sim-time and not in the
many, many source files I have. This way, if I wish to change from
1ns/1ps to 1us/1ns I don't have to hunt the timescale directive down in
each and every file. I realize I could include a my_timescale.vh file
to provide this feature, but it seems clunky when you can just specify
the timescale when invoking the simulator. Maybe I'm missing something
since I'm just a C programmer learning to write testbenches for other
people's RTL.

> In SystemVerilog, you can specify time literals with units. They are
> still just real numbers, but are reverse-scaled by the timescale so
> that timescaling will give the desired value. That means that you can
> not only get the delay you want (such as 20ms), but you can also
> extract the timescale. For example, the value (1.0/1fs) will tell you
> the ratio between a delay of 1.0 and a delay of 1fs, which is the
> timescale as a number of femtoseconds.
Oh. Then:

real tlp_timeout_time = $realtime + 20ms;

is the answer I'm looking for.

Thank you very much!

- Mark
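
Putting it together with the bookkeeping from the first post (give_up_time
and req[] are the hypothetical fields from that message): because $realtime
is reported in the same time unit that the 20ms literal is scaled to, the
addition and the later comparison stay consistent:

  // When the request goes out, record the deadline:
  req[trn_id].give_up_time = $realtime + 20ms;

  // Later, while waiting for the completion:
  if ($realtime > req[trn_id].give_up_time)
    $error("TLP %0d: completion overdue", trn_id);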
 
On Feb 1, 2:29 pm, mrfirmware <mrfirmw...@gmail.com> wrote:
> Oh. Then:
>
> real tlp_timeout_time = $realtime + 20ms;
>
> is the answer I'm looking for.
>
> Thank you very much!
>
> - Mark
The more obvious question to me would be what you expect the delay to
be when you write:

  a <= #2 b;

I would think that has to be known when you write the testbench,
and therefore the timescale directive would be appropriate.

Or do you always write

  a <= #2.0ns b;

?
Regards,
Gabor
 
On Feb 2, 10:36 am, gabor <ga...@alacron.com> wrote:
> The more obvious question to me would be what you expect the delay to
> be when you write:
>
>   a <= #2 b;
>
> I would think that has to be known when you write the testbench,
> and therefore the timescale directive would be appropriate.
>
> Or do you always write
>
>   a <= #2.0ns b;
>
> ?
Good question but different problem. I'm in testbench land. I needed a
20ms delay from now for a watchdog timer (of sorts) so "+ 20ms" was
the answer I needed.

- Mark
 
On Sat, 2 Feb 2008 07:36:06 -0800 (PST),
gabor <gabor@alacron.com> wrote:

> The more obvious question to me would be what you expect the delay to
> be when you write:
>
>   a <= #2 b;
Two time units of whatever the `timescale is at the point where
the compiler sees this code. Note that SystemVerilog permits
the use of this form:

timeunit 1ns;
timeprecision 100ps;

*within* a module (in fact, it must be the very first statement
in the module) so that your module's code is not at the mercy of
compilation order, capricious use of `include and so forth.

> Or do you always write
>
>   a <= #2.0ns b;
That's another way to do it. Of course, that still relies on
the timeprecision being 1ns or better (if the global timeprecision
is 10ns or greater, then "2.0ns" will get rounded to zero).
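
A small sketch of that form (illustrative, not from the post), where the
module carries its own unit and precision so the delays mean the same thing
regardless of any `timescale seen earlier in the compile:

  module tb_timing;
    timeunit 1ns;         // delays in this module are interpreted as ns
    timeprecision 100ps;  // and rounded to 100 ps precision

    logic done = 0;

    initial begin
      #2.0ns done = 1;                          // 2 ns, never rounded to zero here
      #20ms  $display("%t: still waiting", $realtime);
    end
  endmodule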
--
Jonathan Bromley, Consultant

DOULOS - Developing Design Know-how
VHDL * Verilog * SystemC * e * Perl * Tcl/Tk * Project Services

Doulos Ltd., 22 Market Place, Ringwood, BH24 1AW, UK
jonathan.bromley@MYCOMPANY.com
http://www.MYCOMPANY.com

The contents of this message may contain personal views which
are not the views of Doulos Ltd., unless specifically stated.
 
On Feb 2, 10:36 am, gabor <ga...@alacron.com> wrote:
> The more obvious question to me would be what you expect the delay to
> be when you write:
>
>   a <= #2 b;
>
> I would think that has to be known when you write the testbench,
> and therefore the timescale directive would be appropriate.
Gabor's point here is very similar to mine. Someone writing a delay
presumably has in mind what real-world delay it is supposed to
represent, and will therefore specify a timescale in the design to
make sure that happens. It is odd that the language makes it so
clunky to do what you would generally want to do.

The use of #2 instead of #1 in this example was presumably
deliberate. There might be cases where you would use #1 without a
timescale, to mean "a small nonzero delay, I don't care exactly how
much." But it is unlikely you would use #2 without a specific
timescale in mind.

On the other hand, with zero-delay modeling styles, there may not be
many explicit delays in a design anyway. Delays to generate a
particular clock period might be the only ones.
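
For instance (an illustrative sketch), the one explicit delay left in such a
testbench is often the clock generator, where the intended real-world period
is exactly why the units matter:

  `timescale 1ns/1ps

  module clkgen (output bit clk);
    // 250 MHz clock: #2 means 2 ns only because of the `timescale above
    // (writing #2ns would make the intent explicit instead).
    always #2 clk = ~clk;
  endmodule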
 
