Calculating the delay

Syed Huq
Hi,

I've been trying to implement logic to store the samples from an ADC, specifically the LM97600. The ADC is interfaced to a Virtex-5 FPGA and connected through ten high-speed LVDS lanes. The Virtex-5 FPGA uses GTX transceivers to receive the sampled data from the ADC.
I'm trying to store the data samples on a trigger signal, but I'm having a hard time calculating the delay and storing the data samples.

I'm storing the data samples in the FPGA's internal memory using a BRAM. For now, I'm just trying to get a simple trigger to work, where the signal is stored on the arrival of a trigger.

The signal is basically a very short pulse with a rise and fall time of 2.5 ns and a pulse width of 4 ns. This pulse is generated by a signal generator and split in two, with one copy going to the ADC and the other going through a comparator to generate the trigger signal. I used a scope to measure the arrival times, and the trigger signal arrives at the FPGA I/O around 1.2 us later than the pulse arrives at the input of the ADC. I'm not exactly sure of the propagation delay involved in the ADC sampling, or of the delay in the SERDES receiver. The data is also remapped from the 10 lanes to 8-bit data before arriving at the memory.

Since the memory I'm using in the FPGA is pretty small (256 bits x 8192), I'm not exactly sure how to go about calculating all the delays and ensuring that the pulse is captured exactly when the trigger goes high.
 
On Monday, 12 January 2015 12:02:49 UTC-6, rickman wrote:

My recommendation is that you use the IOB FFs on each input to synchronize your FPGA internal signals to the internal clock. Then the timing to the block RAM is an FPGA-internal timing issue and can be handled with timing constraints.

The timing of the I/O signals will be simpler as well, but this is largely a board design issue. The FPGA data sheet should provide timing data between the clock and the I/O pins for the LVDS interface. You will need to do an analysis of the timing of the other components and your board trace delays. There will be a path from the clock source to the ADC clock input, through the ADC and back out the data pins to the FPGA data input pins. This is in parallel with the clock path through the comparator to the FPGA clock input. You will need to be sure the max and min delays provide a wide enough eye opening that the clock edge at the FPGA falls during the stable portion of the data, after accounting for the setup and hold times.

--

Rick

How would I determine the delay from the I/O pad to the register for the trigger, as well as the delay from the LVDS lane inputs to the registers where the data is first stored?
 
Oh, no, sorry. The trigger that I'm talking about is an external TTL input in my experiment.

So basically, I'm wondering about the delay from the I/O pad to the register, and similarly the delay from the GTX transceivers receiving the data to it being stored in the BRAM.
 
On Monday, 12 January 2015 18:38:20 UTC-6, rickman wrote:

You need to explain what you are doing. Are you using the trigger as a
clock, a clock enable or something else?

--

Rick

Sorry if I'm being confusing. Basically, my trigger signal is an external LVTTL signal, which is similar to an enable signal. Analog data comes into the ADC, is sampled, and is then received by the FPGA. I don't want the BRAM to store the data until the FPGA receives the trigger signal (i.e. an active-high signal on a certain I/O pin). The trigger signal is generated by an external trigger circuit when it receives a square wave.

I am trying to synchronize the square pulse to arrive at the BRAM at the same time as the trigger signal going high, so that I store the correct data. Right now, my timing seems to be off: I seem to lose the ADC data for the square pulse and end up storing invalid data in the BRAM.

I hope I've explained it better this time.
 
On 1/12/2015 5:39 PM, Syed Huq wrote:

How would I determine the delay from the I/O pad to the register for the trigger, as well as the delay from the LVDS lane inputs to the registers where the data is first stored?

The data sheet has that info. As long as you use a dedicated clock input, the clock is hard-wired, so the timing does not depend on routing. Just like any other device, the FFs in the IOBs have setup and hold times. I assume what you are calling "trigger" is what the rest of us call a clock.

--

Rick
 
On 1/12/2015 7:01 PM, Syed Huq wrote:
Oh, no, sorry. The trigger that I'm talking about is an external TTL input in my experiment.

So basically, I'm wondering about the delay from the I/O pad to the register, and similarly the delay from the GTX transceivers receiving the data to it being stored in the BRAM.

You need to explain what you are doing. Are you using the trigger as a
clock, a clock enable or something else?

--

Rick
 
On 1/12/2015 8:27 PM, Syed Huq wrote:

Sorry if I'm being confusing. Basically, my trigger signal is an external LVTTL signal, which is similar to an enable signal. Analog data comes into the ADC, is sampled, and is then received by the FPGA. I don't want the BRAM to store the data until the FPGA receives the trigger signal (i.e. an active-high signal on a certain I/O pin). The trigger signal is generated by an external trigger circuit when it receives a square wave.

I am trying to synchronize the square pulse to arrive at the BRAM at the same time as the trigger signal going high, so that I store the correct data. Right now, my timing seems to be off: I seem to lose the ADC data for the square pulse and end up storing invalid data in the BRAM.

I hope I've explained it better this time.

If you are using the trigger signal as an enable, just run that through
an IOB FF as well to keep it synchronized to the data and use it to
enable the block RAM.

--

Rick
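A minimal Verilog sketch along the lines rickman suggests (illustrative only; the signal names and the IOB attribute are assumptions, not code from the thread, and the second register stage is an extra precaution against metastability):

// Illustrative sketch -- names are made up; not code from the thread.
module trigger_enable (
    input  wire clk,       // capture clock, same domain as the parallel ADC data
    input  wire trig_in,   // asynchronous LVTTL trigger pin
    output wire bram_we    // write enable for the capture BRAM
);
    // Ask the tools to pack the first register into the IOB
    // (XST accepts an IOB attribute; a UCF constraint works as well).
    (* IOB = "TRUE" *) reg trig_iob  = 1'b0;
    reg                    trig_sync = 1'b0;

    always @(posedge clk) begin
        trig_iob  <= trig_in;   // IOB register: fixed pad-to-FF timing
        trig_sync <= trig_iob;  // second stage lets any metastability settle
    end

    assign bram_we = trig_sync;
endmodule

The BRAM write port then just uses bram_we together with a free-running address counter, and the path from the deserialized data to the BRAM becomes an ordinary internal timing constraint.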
 
Syed Huq wrote:

If I understand correctly, you're trying to calculate the difference in
the arrival time to your internal logic between the analog path and the
digital path given a trigger that happens at the same time as the analog
pulse to the ADC?

The ADC data sheet should show the latency from sampling to the data
output. Simulation should show the latency in the serial data
reception.

The LVTTL input delay is also easily seen in simulation.

On the other hand, it sounds like what you're trying to do is something like a digital storage oscilloscope, and they typically go about it
differently. Normally they would store analog data into RAM in a
circular fashion until triggered. Then they continue to store for
some number of samples after the trigger based on the desired position
of the trigger within the data buffer (usually this defaults to centered
in the buffer, so 1/2 the buffer size of additional samples would be
captured). Capturing before the trigger as well as after allows you
to see the input even if the trigger happens after the interesting
event.

--
Gabor
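For reference, a rough Verilog sketch of the circular capture Gabor describes (a sketch under assumptions, not code from the thread; the names, widths and the half-buffer split are illustrative):

// DSO-style circular capture buffer (illustrative names and sizes).
module circ_capture #(
    parameter DATA_W         = 256,
    parameter ADDR_W         = 13,                // 8192 words
    parameter POSTTRIG_WORDS = (1 << ADDR_W) / 2  // keep about half the buffer after the trigger
)(
    input  wire              clk,
    input  wire              rst,        // synchronous reset / re-arm
    input  wire              trig_sync,  // trigger already synchronized to clk
    input  wire [DATA_W-1:0] adc_word,   // one word of remapped ADC samples per clock
    output reg               done,       // capture finished, buffer frozen
    output reg  [ADDR_W-1:0] trig_addr   // write address at the moment of the trigger
);
    reg [DATA_W-1:0] buffer [0:(1<<ADDR_W)-1];
    reg [ADDR_W-1:0] wr_addr;
    reg              triggered;
    reg [ADDR_W:0]   post_cnt;

    always @(posedge clk) begin
        if (rst) begin
            wr_addr   <= 0;
            triggered <= 1'b0;
            post_cnt  <= 0;
            done      <= 1'b0;
        end else if (!done) begin
            buffer[wr_addr] <= adc_word;          // write continuously, address wraps
            wr_addr         <= wr_addr + 1'b1;

            if (trig_sync && !triggered) begin    // remember where the trigger landed
                triggered <= 1'b1;
                trig_addr <= wr_addr;
            end

            if (triggered) begin                  // count out the post-trigger portion
                post_cnt <= post_cnt + 1'b1;
                if (post_cnt == POSTTRIG_WORDS - 1)
                    done <= 1'b1;
            end
        end
    end
endmodule

On readout, the window is the POSTTRIG_WORDS words written after trig_addr plus whatever pre-trigger history sits in front of it, read back with wrapping addresses.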
 
On Tuesday, 13 January 2015 09:22:47 UTC-6, gabor wrote:

Gabor,

You are exactly right about how I wish to store the data. I'm using the same technique you've suggested, the one used in digital storage oscilloscopes. I'm using a circular buffer, and I've implemented logic to store half of the data from before the trigger and the other half from after the trigger.

So my circular buffer is 200 ns long: 100 ns of pre-trigger data and 100 ns of post-trigger data. But since the pulse itself is no more than 10 ns, my timing seems to be way off, in the sense that the pulse doesn't even fall into the 200 ns window, so I'm completely missing the pulse that I want to capture.

So that's the reason I'm trying to minimize the difference between when the digital trigger signal arrives and when the corresponding ADC data arrives at the FPGA.

Can you elaborate a bit on the simulation part? I've never attempted to simulate the delay in the LVTTL input or the serial data arriving at the GTX transceivers.
 
On 1/16/2015 9:44 AM, Syed Huq wrote:

I don't recommend that you simulate delays in general. In FPGAs delays
are typically analyzed using static timing analysis. If you want to
analyze delays you should do that on paper by adding up the external
delays. You should *not* be trying to use internal FPGA logic and
routing delays to compensate for external delays because the internal
delays will change with every iteration of place and route.

Your external delays can be calculated using the various chip timing
from the data sheets. This timing should pinpoint the clock cycle in which all of the external signals arrive at the inputs to the FPGA, with margin to account for the setup and hold times. Once inside the FPGA you should
be counting clock cycles since everything will be synchronous. The
internal timing analysis is only needed to make sure you are meeting the
period spec of the clock. There is no need to deal with detailed logic
or routing timing inside the FPGA in a synchronous design.

What is the clock speed in the FPGA? If you are trying to fill a buffer
in 200 ns it must be a very fast clock.

--

Rick
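One concrete way to apply the "count clock cycles" point to a circular capture buffer (illustrative; SKEW_WORDS is a hypothetical parameter to be filled in from the paper budget, not a value from the thread): if the trigger is observed N capture clocks after the data word it belongs to was written, the samples of interest sit N words behind the recorded trigger address, so the readout pointer can be corrected rather than trying to tune any physical delays.

// Illustrative readout-pointer correction; the skew value comes from the
// external + internal delay budget, it is not a number from the thread.
module trig_position #(
    parameter ADDR_W     = 13,
    parameter SKEW_WORDS = 0    // capture clocks by which the trigger lags its data
)(
    input  wire [ADDR_W-1:0] trig_addr,  // write address latched when the trigger was seen
    output wire [ADDR_W-1:0] pulse_addr  // where the corresponding samples actually sit
);
    localparam [ADDR_W-1:0] SKEW = SKEW_WORDS;
    // The subtraction wraps exactly the way the circular write address wraps.
    assign pulse_addr = trig_addr - SKEW;
endmodule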
 
rickman wrote:

Rick,

I think you are misunderstanding the point. He's looking for delays
in clock cycles, which can be determined in behavioral simulation. My
suggestion was to get the pipeline delays from the ADC data sheet (this
can be many clock cycles depending on the architecture of the ADC) and
then build a simulation test bench that matches the expected arrival
times of both the ADC data corresponding to the trigger event and the
LVTTL trigger signal.

--
Gabor
 
On 1/16/2015 10:57 AM, GaborSzakacs wrote:

Why would you need to do a simulation to count clock cycles? That would
be part of your design and should be determined before you write one
line of code. We can all count, right?

--

Rick
 
On 1/16/2015 11:25 AM, rickman wrote:

Sometimes it's part of your design and sometimes it's part of someone
else's design included as a black-box in your system. If you don't
have that source code, or don't want to try to decipher its intent,
then simulation is a simple way to see the delay. Using the SERDES
blocks in Xilinx parts it's not so obvious what delay you get from
the start of a serial word to the parallel word coming out at the
user interface on a separately derived clock. In any case it makes
sense to simulate a design, even a simple one, to make sure it does
what you expected.

--
Gabor
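A bare-bones testbench along the lines Gabor describes (a sketch under assumptions: the 42-stage behavioral pipe is only a stand-in for the real ADC/GTX/remap path, which you would instantiate instead, and the 1.2 us trigger offset is the value scoped earlier in the thread):

`timescale 1ns/1ps
// Measure, in capture-clock cycles, how far apart the trigger and the
// corresponding data word show up at the capture logic.
module tb_latency;
    reg clk = 0;
    always #2.5 clk = ~clk;        // 200 MHz capture clock

    reg        stim = 0;           // marks the instant of the "analog" pulse
    reg        trig = 0;           // external trigger, driven ~1.2 us later
    wire       data_ev;

    // Stand-in for ADC + GTX + remap latency; replace with the real wrapper
    // and its simulation models to get the true figure.
    reg [63:0] pipe = 0;
    always @(posedge clk) pipe <= {pipe[62:0], stim};
    assign data_ev = pipe[41];     // e.g. 42 cycles of pipeline

    integer cycle = 0, data_cycle = -1, trig_cycle = -1;
    always @(posedge clk) begin
        cycle = cycle + 1;
        if (data_ev && data_cycle < 0) data_cycle = cycle;
        if (trig    && trig_cycle < 0) trig_cycle = cycle;
    end

    initial begin
        #100  stim = 1; #5  stim = 0;   // the event to be captured
        #1200 trig = 1; #20 trig = 0;   // trigger roughly 1.2 us behind it
        #500;
        $display("data at cycle %0d, trigger at cycle %0d, skew = %0d cycles",
                 data_cycle, trig_cycle, trig_cycle - data_cycle);
        $finish;
    end
endmodule

Replacing the stand-in with the actual deserializer wrapper and its GTX simulation model would give the real cycle count.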
 
The clock speed of the FPGA is 200 MHz, to capture the 5 GS/s data from the ADC. I looked at the delay of the LM97600 ADC, and it shows the propagation delay through the ADC as 42 clock cycles.

Gabor is right about the design being someone else's. Besides the delay in the ADC, I still need to figure out the delay in the GTX transceivers. I believe I should be able to count the number of clock cycles from the output of the GTX transceivers to when the data is ready to be stored in the BRAM.

How would I go about simulating the delay in the GTX transceivers?

On Friday, 16 January 2015 09:36:53 UTC-6, rickman wrote:
What is the clock speed in the FPGA? If you are trying to fill a buffer
in 200 ns it must be a very fast clock.

--

Rick
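Putting rough numbers on the delay budget discussed above (back-of-the-envelope only: it assumes the 42 cycles are cycles of the 200 MHz capture clock, which the LM97600 data sheet needs to confirm, and it uses a placeholder for the SERDES and board delays):

// Back-of-the-envelope delay budget using the numbers quoted in the thread.
module delay_budget;
    localparam integer CLK_NS      = 5;            // 200 MHz capture clock
    localparam integer ADC_NS      = 42 * CLK_NS;  // 210 ns, if the 42 cycles are at 200 MHz
    localparam integer SERDES_NS   = 100;          // placeholder: from simulation / data sheet
    localparam integer TRIG_LAG_NS = 1200;         // trigger ~1.2 us behind the pulse (scoped)
    // How far behind its own data the trigger shows up inside the FPGA:
    localparam integer SKEW_NS     = TRIG_LAG_NS - (ADC_NS + SERDES_NS);  // ~890 ns
    localparam integer SKEW_CYCLES = SKEW_NS / CLK_NS;                    // ~178 cycles

    initial $display("trigger lags its data by about %0d ns (%0d capture clocks)",
                     SKEW_NS, SKEW_CYCLES);
endmodule

If those figures are even roughly right, the trigger arrives hundreds of nanoseconds after the samples it refers to, so a 100 ns pre-trigger depth cannot still hold the pulse; the pre-trigger portion of the buffer (or a pointer correction like the earlier sketch) has to cover that skew.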
 
On 1/20/2015 1:34 PM, Syed Huq wrote:
The clock speed of the FPGA is 200 MHz, to capture the 5 GS/s data from the ADC. I looked at the delay of the LM97600 ADC, and it shows the propagation delay through the ADC as 42 clock cycles.

Gabor is right about the design being someone else's. Besides the delay in the ADC, I still need to figure out the delay in the GTX transceivers. I believe I should be able to count the number of clock cycles from the output of the GTX transceivers to when the data is ready to be stored in the BRAM.

How would I go about simulating the delay in the GTX transceivers?

You might simulate the circuit to understand the code written by someone else, although that is not the way I would do it. But simulating the circuit to understand the SERDES seems overkill. Why not read the data sheet for the SERDES? Surely they explain all the details, no?

Then once you have designed your circuit to account for the latencies in
the ADC and the rest of the circuitry, you can simulate the FPGA to
verify it is doing what you expect and then test the board to verify the
ADC is doing what you expect.

I looked at the ADC data sheet and found it difficult to figure out what
the delay would be from the analog input to the LVDS outputs. I hope
you have a better understanding of this than what I could get quickly.


--

Rick
 
