Kelvin Tsai @ Singapore
Guest
Hi all,
Why does simulation clock period affect power consumption?
I am performing power analysis based on unit-delay simulation on a 0.18 um process. My library uses a 1 ns delay for each cell by default. The combinational logic is more than 5 levels deep while my clock period is 5 ns, so I simulated with a 10 ns clock period instead and ended up with a SAIF file whose duration, TC0, and TC1 counts are all twice what I intended, although their ratios are correct.

Now when I run the analysis, the power consumption is halved compared to the number obtained with the SAIF file from a zero-delay simulation...

In DC I use the same 5 ns clock constraint for both analyses. The other analysis was performed separately with a SAIF file from a zero-delay simulation, which itself was run at a 5 ns clock period.
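
To make the arithmetic concrete, here is a rough sketch of how I understand the toggle-rate computation. I am assuming the tool derives the toggle rate from the SAIF file as (TC0 + TC1) / duration; the capacitance, toggle counts, and cycle counts below are made up purely for illustration:

# A rough sketch of the halving effect (hypothetical numbers throughout).
# Assuming toggle rate = (TC0 + TC1) / duration, doubling the simulated
# clock period doubles the SAIF duration while the toggle counts stay the
# same, so the computed toggle rate, and with it the dynamic power, halves.

C_LOAD = 20e-15   # hypothetical net capacitance: 20 fF
VDD = 1.8         # typical 0.18 um core supply voltage

def dynamic_power(toggles, duration_s):
    """P_dyn = 0.5 * C * Vdd^2 * toggle_rate, with toggle_rate in 1/s."""
    toggle_rate = toggles / duration_s
    return 0.5 * C_LOAD * VDD ** 2 * toggle_rate

# Zero-delay run: 1000 toggles over 1000 cycles at a 5 ns clock period.
p_5ns = dynamic_power(toggles=1000, duration_s=1000 * 5e-9)

# Unit-delay run: the same 1000 toggles, but at a 10 ns clock period,
# so the SAIF duration is twice as long.
p_10ns = dynamic_power(toggles=1000, duration_s=1000 * 10e-9)

print(f"5 ns run:  {p_5ns * 1e6:.2f} uW")   # 6.48 uW
print(f"10 ns run: {p_10ns * 1e6:.2f} uW")  # 3.24 uW (exactly half)

If this is what happens inside the tool, it would explain the factor of two I am seeing, since the doubled SAIF duration dilutes the toggle rate even though the 5 ns clock constraint in DC is unchanged.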
Is there anything wrong with my procedure? Does the
Best Regards,
Kelvin.