Care And Feeding of LEDs and DC is Better Than Pulsed by HP

  • Thread starter Watson A.Name "Watt Sun
Thanks, I guess......
Sorry I omitted a detail, I REALLY thought since I specified the
(NTSC) it would be clear.
I have only designed displays since 1963, so I could use a LOT of
help.... I'm sure!
Pretty good credentials. The stuff I am talking about is not normally
stuff you needed to know in the 1960s, 1970s, or even 1980s.

However, these days it is important to be familiar with the fact that
a single TV frame can contain two different images in its two fields
(even scanlines and odd scanlines). It's common knowledge among 21st
century designers. It also complicates the design of line doublers,
which have to interpolate between the lines ("guess the pixels," so to
speak).

You've probably heard of Sony/Toshiba/Whatever's marketing
terminology for things like DRC, PureProgressive, XBR, and whatnot.
Some of these technologies digitally process NTSC material, upconvert
the 480i image (525 lines including the vertical blanking interval) to
HDTV modes such as 720p and 1080i, and do all the interpolation
between scanlines. If you study the C/C++ source code of one of the
open-source versions of these interpolators (at least 480i -> 480p),
http://www.dscaler.org , you quickly realize how much more complicated
things are now, given how much internal processing goes on inside a
typical HDTV set. Don't get me started on the digital equipment that
cable companies and broadcasters use these days, either...
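
To give a rough feel for the "guess the pixels" part, here is a
deliberately simplified line doubler in the spirit of DScaler's
simplest modes. This is my own sketch, not DScaler's actual code, and
the bob_double name is made up: it takes one field and builds a
double-height progressive frame by averaging the scanlines above and
below each missing line. Real deinterlacers are far smarter than this
(motion-adaptive, edge-directed, and so on).

#include <cstdint>
#include <vector>

// One field: every other scanline of the original frame, stored as
// 'lines' scanlines of 'width' pixels (field[y * width + x]).
std::vector<uint8_t> bob_double(const std::vector<uint8_t>& field,
                                int width, int lines, bool top_field) {
    std::vector<uint8_t> frame(width * lines * 2);
    for (int y = 0; y < lines; ++y) {
        // Copy each real scanline into its interlaced position.
        int dst_y = y * 2 + (top_field ? 0 : 1);
        for (int x = 0; x < width; ++x)
            frame[dst_y * width + x] = field[y * width + x];
    }
    for (int y = 0; y < lines * 2; ++y) {
        // Fill each missing scanline by averaging its neighbours
        // ("guess the pixels" between the real lines).
        bool is_real = ((y % 2 == 0) == top_field);
        if (is_real) continue;
        int above = (y > 0) ? y - 1 : y + 1;
        int below = (y < lines * 2 - 1) ? y + 1 : y - 1;
        for (int x = 0; x < width; ++x)
            frame[y * width + x] =
                (frame[above * width + x] + frame[below * width + x]) / 2;
    }
    return frame;
}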

This two-fields-in-one-frame fact is a necessary piece of knowledge
for upconverter designers, HDTV televisions, digital editing (since
you're importing interlaced video onto a progressive-scan computer
display), and so on.

This knowledge is also mandatory among the chip designers of DVD
players -- DVD video is normally stored as interlaced video, and it
gets even more complex for these chip designers when that interlaced
video has to be played back as progressive scan (in progressive-scan
DVD players).
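
To sketch one of the decisions those chip designers face (again my own
generic illustration, not any vendor's implementation): when the two
fields of a frame came from the same instant of a film frame, they can
simply be "woven" back together into a full progressive frame; when
they came from different instants (video-sourced material), weaving
causes combing artifacts, and something like the bob interpolation
sketched earlier is used instead.

#include <algorithm>
#include <cstdint>
#include <vector>

// Weave two fields that were captured at the same instant (typical of
// film-sourced DVD material) back into one progressive frame.  Each
// field has 'lines' scanlines of 'width' pixels; the output frame has
// 2 * lines scanlines.
std::vector<uint8_t> weave(const std::vector<uint8_t>& top_field,
                           const std::vector<uint8_t>& bottom_field,
                           int width, int lines) {
    std::vector<uint8_t> frame(width * lines * 2);
    for (int y = 0; y < lines; ++y) {
        // Top field supplies the even scanlines, bottom field the odd.
        std::copy(top_field.begin() + y * width,
                  top_field.begin() + (y + 1) * width,
                  frame.begin() + (2 * y) * width);
        std::copy(bottom_field.begin() + y * width,
                  bottom_field.begin() + (y + 1) * width,
                  frame.begin() + (2 * y + 1) * width);
    }
    return frame;
}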

Anyway, all NTSC TVs since the invention of NTSC have always been
capable of 60 distinct images per second via alternating fields (even
vs odd fields -- although some say high vs low fields, pick your own
preferred terminology), provided the video camera captures a new image
between fields (nearly all of those used for news and sports TV do, at
least). It's also why fast motion typically looks much smoother in
broadcast news/sports than in 24fps film -- to the human eye,
temporally, you're really comparing 24 versus 60 ... not 24 versus 30.
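
Here is a tiny back-of-the-envelope sketch of that 24-versus-60
comparison (illustrative arithmetic only, using the nominal 60 Hz
rather than the exact 59.94 Hz field rate): 24 fps film on a 60-field
display needs 3:2 pulldown, so each film frame is held for an uneven
3 or 2 fields, while live interlaced video delivers a fresh image
every single field.

#include <cstdio>

int main() {
    const double field_rate = 60.0;   // fields per second (nominally 59.94)
    const double film_rate  = 24.0;   // film frames per second

    // Live interlaced video: a fresh image arrives every field.
    printf("video: new image every %.1f ms\n", 1000.0 / field_rate);

    // 24 fps film via 3:2 pulldown: frames are held for 3, 2, 3, 2, ...
    // fields, so on-screen hold times alternate between ~50 ms and ~33 ms.
    int cadence[] = { 3, 2 };
    for (int i = 0; i < 2; ++i)
        printf("film frame held for %d fields = %.1f ms\n",
               cadence[i], cadence[i] * 1000.0 / field_rate);

    // On average that is 60 / 24 = 2.5 fields per film frame.
    printf("average: %.2f fields per film frame\n", field_rate / film_rate);
    return 0;
}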

Thanks,
Mark Rejhon
 
don@manx.misty.com (Don Klipstein) wrote in message news:<slrnc646a6.qen.don@manx.misty.com>...
And the screen is illuminated something like 85% of the time. And, the
way I hear it, movie projectors add an extra interruption of the light in
the middle of showing a frame so that the flicker rate is 48 Hz. That's
roughly 17.8 milliseconds on, 3 milliseconds off.
That's true. I watched some films on an old film projector before the
days when each movie frame was automatically re-projected. Man, those
were very flickery -- 24 fps flicker is REALLY BAD! (That's why they
call them "flicks" -- because everybody can see the flicker.)

Today's movie projectors do a good job of eliminating flicker, despite
the 24fps source material.
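
For the curious, those on/off numbers fall straight out of the
arithmetic. This is a back-of-the-envelope sketch only; the exact
shutter geometry varies from projector to projector. A two-blade
shutter flashes each 24 fps frame twice, so the flicker rate is 48 Hz
with a period of about 20.8 ms, and subtracting roughly 3 ms of dark
time per flash leaves about 17.8 ms of light.

#include <cstdio>

int main() {
    const double film_fps          = 24.0;  // film frames per second
    const double flashes_per_frame = 2.0;   // two-blade shutter: each frame shown twice
    const double dark_ms           = 3.0;   // approximate dark time per flash (from the post above)

    double flicker_hz = film_fps * flashes_per_frame;  // 48 Hz
    double period_ms  = 1000.0 / flicker_hz;           // ~20.8 ms per flash cycle
    double lit_ms     = period_ms - dark_ms;           // ~17.8 ms of light per cycle

    printf("flicker rate: %.0f Hz\n", flicker_hz);
    printf("each cycle: %.1f ms lit, %.1f ms dark\n", lit_ms, dark_ms);
    return 0;
}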


Most CRTs have the phosphor fade quite a bit within a millisecond, so
they would have flicker that is usually noticeable if they used 48 Hz.
It definitely is. When using PowerStrip at
http://www.entechtaiwan.com
to assign a custom 48 Hz refresh rate to my Radeon 9600 Pro, I can
REALLY see the flicker. (PowerStrip can set the refresh rate in
0.001 Hz increments on Radeon cards.) My multisync monitor starts to
lose sync at approximately 45 Hz; your monitor may vary.

Actually, it also works on nVidia Geforce series cards, but at
somewhat lower precision (I think 0.1 or 0.05 Hz increments). I've
done 50 Hz on Geforce2 and Geforce2 MX AGP graphics cards before.


NTSC TV sets are interlaced and the vertical scan rate is 60 Hz. If I
look at one with a strong magnifier so that individual horizontal lines
really stand out, I see the lines flickering.
Yeah, that's the individual scanlines flickering at 30 Hz. I see that
too. But if you step back, you can't tell anymore, because your eyes
blend the whole-screen image from the even scanlines with the
whole-screen image from the odd scanlines, and you just see 60 Hz; it
no longer flickers at a distance. These two separate alternating 30 Hz
flickers (offset in cycle by 1/60th of a second) integrate into one
60 Hz flicker to the human eye, so you can no longer see the flicker
easily. In normal TV-watching conditions, the 60 Hz flicker is not
noticeable to most people.

(Except whenever a weather broadcaster uses graphics that are too
fine-grained and unfiltered -- flickery lines and edges in weather
graphics used to be a problem in the '80s and '90s.)
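
Here is a quick timeline sketch of why those two offset 30 Hz flickers
read as 60 Hz from a distance (purely illustrative, using the nominal
60 Hz rather than 59.94 Hz): each field repaints every 1/30 s, but the
two are staggered by 1/60 s, so the screen as a whole gets a refresh
event every 1/60 s.

#include <cstdio>

int main() {
    const double field_period = 1.0 / 30.0;  // each field repaints at 30 Hz
    const double offset       = 1.0 / 60.0;  // odd field is half a cycle later

    // List the first few whole-screen refresh events as seen from a distance.
    printf("%-12s %s\n", "time (ms)", "event");
    for (int i = 0; i < 4; ++i) {
        printf("%-12.2f even-field repaint\n", i * field_period * 1000.0);
        printf("%-12.2f odd-field repaint\n",
               (i * field_period + offset) * 1000.0);
    }
    // Merged, that is one repaint every 16.67 ms -- i.e. 60 Hz overall.
    return 0;
}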


Mark Rejhon
 
don@manx.misty.com (Don Klipstein) wrote in message news:<slrnc646a6.qen.don@manx.misty.com>...
Most CRTs have the phosphor fade quite a bit within a millisecond, so
they would have flicker that is usually noticeable if they used 48 Hz.
I should add -- older color televisions (1950s and 1960s color TVs)
had longer-persistence phosphors, but that had some nasty side effects
like ghosting during fast motion. That's good for things like radar
screens, but bad for things like color TV video. So they switched to
medium- and shorter-persistence phosphors over the years, which gave
much more crystal-clear motion. However, it does heighten flicker for
those who are abnormally sensitive to it (thankfully, almost nobody
notices the 60 Hz flicker in regular living-room lighting).
 
hmurray@suespammers.org (Hal Murray) wrote:
Why would the leads on LEDs be silver plated?
For ease of soldering.

--
William Smith
ComputerSmiths Consulting, Inc. www.compusmiths.com
 
