Verilog inline delay style question


Lawrence Nospam

Guest
I have a coworker who uses a very annoying style
of writing Verilog. Unfortunately, he does this
for a reason, and I cannot think of a better way
to write it.

I hope someone can help me get rid of his style.

The bad style:

always @(posedge clk)
  reg_var <= #CtoD expression;
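
For context, here is roughly what one of his registers looks like as a
self-contained module (the wrapper, the port names, and the default
value of CtoD are my guesses, not his actual code):

module stage #(parameter CtoD = 3) (
    input  wire clk,
    input  wire d,
    output reg  reg_var
);
    // Intra-assignment delay: reg_var is scheduled to update CtoD
    // time units after the clock edge, instead of in the same
    // nonblocking-assignment region as every other flop.
    always @(posedge clk)
        reg_var <= #CtoD d;
endmodule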

The reason he does this is that he (sometimes)
mixes behavioral code and gate-level code in
verification on a module-by-module basis.

The gate-level code has a clock tree with
different non-zero delays to its flops. The #CtoD
is needed so that the behavioral output changes after
the latest gate-level flop has captured. This avoids
hold-time failures that don't happen in real life.
Without this, a mixed gate/Verilog system will FAIL.
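
To make the race concrete, here is a minimal sketch (the names and the
2-unit skew are invented) of what goes wrong when a zero-delay
behavioral stage feeds a gate-level stage whose clock arrives late:

module race_demo;
    reg clk = 0;
    always #5 clk = ~clk;

    // Modeled clock-tree delay: the "gate-level" flop sees the clock
    // edge 2 time units after the behavioral flop does.
    wire clk_gate;
    assign #2 clk_gate = clk;

    reg beh_q  = 0;   // behavioral stage output
    reg gate_q = 0;   // stands in for the gate-level stage

    // No #CtoD: beh_q updates in the same time step as the early
    // clock edge, so when clk_gate rises 2 units later the old value
    // is already gone and gate_q captures NEW data -- a hold failure
    // the real silicon would never see.
    always @(posedge clk)      beh_q  <= ~beh_q;
    always @(posedge clk_gate) gate_q <= beh_q;

    // Writing 'beh_q <= #3 ~beh_q;' instead pushes the data change
    // past the latest clock arrival, and the race disappears.
    initial #40 $finish;
endmodule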

Plus, this lets him see something like gate delays
when simulating behavioral code.


I want those delays (strewn throughout ALL of his
code) to go away. Can anyone help me out?


Is there any way to do something, like a specparam
per file, or ANYTHING on a command-line, module, or
file basis, so that he can satisfy his need for a
delay from clock to data without cluttering the code?

(I want to stay with always blocks. I can't talk
him into instantiating a flop module with internal
delay.)
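
The closest I can come up with myself is hiding the delay behind a
text macro defined once per compile (a sketch; the GATE_SIM and CtoD
macro names are invented, and most simulators can set GATE_SIM with
something like +define+GATE_SIM on the compile command line):

`ifdef GATE_SIM
    `define CtoD #3   // non-zero only for mixed gate/behavioral runs
`else
    `define CtoD      // expands to nothing for pure-RTL runs
`endif

always @(posedge clk)
    reg_var <= `CtoD expression;

But that still touches every assignment. Is there anything cleaner?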

Thanks for any ideas.

Lawrence NoSpam
 
