[DC] Determine parameter in set_input_delay?

Davy
Hi all,

When using set_input_delay/set_output_delay, how do you determine the -max/-min
parameters? Are they calculated by hand, calculated by a tool, or given by
some standard specification?

Code:
set_input_delay -max 498 -clock EXTSCL [find port ddc_sda_i]
set_input_delay -min 0 -clock EXTSCL [find port ddc_sda_i]

set_output_delay -max 498 -clock CLK1MHZ [find port ddc_sda_o]
set_output_delay -min 0 -clock CLK1MHZ [find port ddc_sda_o]
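
For context on where a number like 498 might come from: the -max input delay is usually worked out by hand from the external driver's datasheet plus the board delay. A hedged sketch of that arithmetic, with hypothetical values (Tco_ext and Tboard are assumptions, not from the post):

Code:
# Hypothetical external timing, in the same unit as the library (e.g. ns):
#   Tco_ext = 300  ;# external device's clock-to-output delay (datasheet)
#   Tboard  = 198  ;# board trace delay into ddc_sda_i
# -max input delay = Tco_ext + Tboard = 498
set_input_delay -max 498 -clock EXTSCL [find port ddc_sda_i]
# -min models the fastest external path; 0 is a conservative lower bound
set_input_delay -min 0 -clock EXTSCL [find port ddc_sda_i]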


Best regards,
Davy
 
"Davy" <zhushenli@gmail.com> wrote in message
news:1172478762.294069.14980@q2g2000cwa.googlegroups.com...
[quoted text snipped]
It depends. If you're doing hierarchical synthesis, some tools can help you
budget your internal I/O timing constraints. When going off chip, either you
decide what the user gets, or the other way around: the user tells you what
they want.
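
For the off-chip output side, the -max value typically comes from the receiving device's setup requirement plus the board delay. A sketch with hypothetical numbers (Tsu_ext and Tboard are assumptions, not taken from the original post):

Code:
# Hypothetical receiver timing, same unit as the library:
#   Tsu_ext = 350  ;# receiver's setup-time requirement (datasheet)
#   Tboard  = 148  ;# board trace delay from ddc_sda_o
# -max output delay = Tsu_ext + Tboard = 498
set_output_delay -max 498 -clock CLK1MHZ [find port ddc_sda_o]
# -min reflects the receiver's hold requirement minus the fastest board
# delay; 0 here assumes the receiver has no hold requirement
set_output_delay -min 0 -clock CLK1MHZ [find port ddc_sda_o]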

Alvin.
 
