trmatthe
Guest
Hi. My hobby is working on old arcade boards - generally fixing dead
ones. At the moment I have a Hameg scope with 20 MHz bandwidth. I've
noticed a number of times that when looking at data lines on higher
speed boards, I get a confusing picture. For example, watching a data
line on a 12 MHz 68000 gives me what looks like two pictures superimposed:
a clear, bright wave showing the changing logic states, with a fainter
set of logic transitions overlaid on top of it.
I have a feeling that this is related to the bandwidth of my scope -
I've searched the archive here and have learnt some more information,
but I'm still not knowledgeable enough to shell out cash yet. I have a
feeling I need a higher-bandwidth scope. To confuse myself further, I
have somehow convinced myself that I also need a faster timebase. I can
go down to 0.5 µs/div, but was thinking that even faster would give me a
more accurate display. As a final thought, I did wonder if the "echo" I'm
seeing on the screen is actually a badly terminated bus and I'm just
seeing reflections.
My understanding is that a faster timebase lets me see less on the
screen, but with more accuracy - e.g. seeing the slope on a square
wave. Having been brought up in a digital world, I equate bandwidth
with resolution and accuracy.
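To put some rough numbers on it - these are my own back-of-envelope figures using the rules of thumb I've read about (5th-harmonic content in square waves, and bandwidth ≈ 0.35 / rise time), so please correct me if I've got them wrong:

```python
# Back-of-envelope sanity check (my own assumptions, using common
# rules of thumb, not anything from the scope's manual).

clock_mhz = 12.0  # 68000 bus clock on the board in question

# A digital square wave carries significant energy up to roughly
# its 5th harmonic, so a scope should ideally have ~5x the clock
# frequency in bandwidth to show the edges reasonably faithfully.
needed_bw_mhz = 5 * clock_mhz

# Conversely, a scope's bandwidth limits the fastest rise time it
# can display: t_rise ≈ 0.35 / bandwidth.
scope_bw_mhz = 20.0
fastest_rise_ns = 0.35 / (scope_bw_mhz * 1e6) * 1e9

print(f"suggested bandwidth for {clock_mhz} MHz logic: {needed_bw_mhz} MHz")
print(f"fastest rise time a {scope_bw_mhz} MHz scope resolves: "
      f"{fastest_rise_ns:.1f} ns")
```

If those rules of thumb hold, my 20 MHz Hameg is well short of the ~60 MHz you'd want for a 12 MHz bus, and anything switching faster than about 17.5 ns will look slower and more rounded than it really is.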
Have I made enough sense to allow you to give me advice? As I learn
more I'll be progressing to CPUs with faster clock speeds, so I can
only assume the problem will get worse in that I could miss logic
transitions.
If none of this makes sense, I'd very much appreciate somebody
explaining the relationship between bandwidth, timebase, and resolution
(or a pointer to a website?).
thanks,
Tim