120Hz versus 240Hz

I don't know how a 240Hz "scan rate" would be achieved.
It's probably a sort of trick that the set's electronics use
to make the picture seem just that much more stable.
Actually, it's a frame rate. It can be done by interpolation, by inserting
blank frames, or a combination of the two.
 
On Fri, 26 Feb 2010 10:20:29 -0800, William Sommerwerck <grizzledgeezer@comcast.net> wrote:
I don't know how a 240Hz "scan rate" would be achieved.
It's probably a sort of trick that the set's electronics use
to make the picture seem just that much more stable.

Actually, it's a frame rate. It can be done by interpolation, by inserting
blank frames, or a combination of the two.

or perhaps it's just a flash rate, the pulses powering the LED backlight
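
As a rough illustration of those two approaches, here is a toy Python sketch; the 60 Hz source, the 4x multiplication and the simple blend are illustrative assumptions only, since real sets use motion-compensated interpolation rather than a plain cross-fade.

import numpy as np

def expand_to_240hz(frames_60hz, mode="interpolate"):
    """Emit 4 output frames per 60 Hz input frame, either by blending
    toward the next frame (a crude stand-in for motion interpolation)
    or by inserting black frames (black frame insertion)."""
    out = []
    for i, frame in enumerate(frames_60hz):
        nxt = frames_60hz[min(i + 1, len(frames_60hz) - 1)]
        for k in range(4):                       # 4 x 60 Hz = 240 Hz
            if mode == "interpolate":
                alpha = k / 4.0                  # simple linear blend
                out.append((1 - alpha) * frame + alpha * nxt)
            else:                                # black frame insertion
                out.append(frame if k == 0 else np.zeros_like(frame))
    return out

# toy source: a bright dot stepping one pixel per 60 Hz frame
frames = [np.roll(np.eye(1, 8), i) for i in range(8)]
print(len(expand_to_240hz(frames)), "frames at 240 Hz")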
 
On 2/26/2010 8:46 AM Meat Plow spake thus:

On Thu, 25 Feb 2010 15:47:16 -0800, "William Sommerwerck"
<grizzledgeezer@comcast.net> wrote:

Ignoring the fact that colour displays are finely tuned
to the way that human colour vision works, and an alien
would likely wonder what we'd been smoking.

This has nothing whatever to do with color rendition.

Who is Sylvia, anyway?

A troll?
Why would you jump to that conclusion?

Oh, forgot; that's what you do.


--
You were wrong, and I'm man enough to admit it.

- a Usenet "apology"
 
On Fri, 26 Feb 2010 10:15:56 -0800, "William Sommerwerck"
<grizzledgeezer@comcast.net> wrote:

** And if you put the remark back into its context --
what it IS relevant to becomes obvious.

No it doesn't.

Agreed. It seemed unrelated, even out of left field. I suspect Sylvia didn't
properly express what she wanted to say.
You have your posting style. No big deal to me. Those who don't want
to read you have their choice to or not to.
 
Hi!

Yes, that is how it was explained to me by a salesman as
well as what I gathered from online info. So, apparently, it is
still an LCD screen.
Yes, I'm sure it is. The only thing that's changed is the way the
panel is illuminated so you can see a picture. It used to be that
practically all LCD panels were backlit by a fluorescent tube (or a
set of tubes).

For a variety of reasons, this has changed. (These reasons would be
mercury in fluorescent tubes, lifetime of said tubes as compared to
LEDs, complexity of the driving electronics and energy efficiency.)

The "120Hz" refresh rate would not be hard to achieve. An interlaced
scanning method produces a picture that (in many cases) appears to
flicker much less than a non-interlaced one.

(IBM used to use a similar trick with their 8514 display. It used a
44Hz interlaced vertical scan rate that IBM called an "88Hz" scan
rate. It worked reasonably well, as long as you used an IBM monitor
with longer persistence phosphors and didn't have anything like a
fluorescent light fixture illuminating the room. If you did, the two
tended to "beat" against one another and the effect is annoying. And
if you didn't use an IBM monitor with the special phosphors, that
increased the apparent "flicker" level.)
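
Putting rough numbers on that beat effect (my own back-of-envelope arithmetic, assuming 60 Hz US mains; the 44/88 Hz figures are taken from the description above):

frame_rate = 44.0                 # complete interlaced frames per second
field_rate = 2 * frame_rate       # two fields per frame -> the "88 Hz" figure

mains = 60.0                      # assumed US mains frequency
light_flicker = 2 * mains         # fluorescent tubes flicker at twice mains

# The visible "beat" between room lighting and the display is roughly the
# difference between the two flicker rates: low enough to show as shimmer.
beat = abs(light_flicker - field_rate)
print(f"field rate {field_rate:.0f} Hz, light flicker {light_flicker:.0f} Hz, beat ~{beat:.0f} Hz")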

I don't know how a 240Hz "scan rate" would be achieved. It's probably
a sort of trick that the set's electronics use to make the picture
seem just that much more stable.

William
 
"Phil Allison" <phil_a@tpg.com.au> wrote in message
news:7uoqa3Fit1U1@mid.individual.net...
"Arfa Daily"
"Phil Allison"
"William Sommerwanker IDIOT "

First, the only televisions that use LEDs use OLEDs. There are none
using
conventional LEDs.


** Fraid " LED TVs " are on sale all over the world right now.

FUCKWIT !!

http://en.wikipedia.org/wiki/LED-backlit_LCD_television



Your Wiki reference says it all. These are NOT LED televisions,

** But they are called " LED TVs " by their makers and so are

*KNOWN BY THAT NAME* to members of the public.


Fools like YOU and Sommerwanker would complain that a bottle of "Steak
Sauce" contained no steak.



.... Phil
I guess we should refer to all LCD sets by their backlight type. That makes
the one on my wall a CCFL TV. And I guess all of those DLP, LCD, DiLA,
SXRD, and LCoS projection sets should be called mercury vapor or whatever
type of lamp they use. And the new projectors could be called LED
projectors as well, even if they are DLP devices.

The point is that referring to the set by the type of backlight it uses is
very misleading and is causing much confusion in the marketplace.

Leonard
 
<Meat Plow>


** And if you put the remark back into its context - what it IS relevant
to becomes obvious.

No it doesn't.

** Yes it does - you ASD FUCKED TENTH WIT !
 
<Meat Plow> wrote in message news:3i5job.5qb.19.13@news.alt.net...
On Fri, 26 Feb 2010 02:32:38 -0000, "Arfa Daily"
<arfa.daily@ntlworld.com> wrote:


"William Sommerwerck" <grizzledgeezer@comcast.net> wrote in message
news:hm7241$21g$1@news.eternal-september.org...
First, the only televisions that use LEDs use OLEDs. There are none
using
conventional LEDs.

Second, there are no strict definitions of what these refresh rates
mean.
In
some cases, the set generates an interpolated image at that rate, in
others,
a blank (black) raster is inserted. Some sets combine both.

I don't like this enhancement (which was one of the reasons I bought a
plasma set). It has a nasty side-effect -- it makes motion pictures look
like video. This might be fine for a TV show; it isn't when you're
watching
movies. Be sure that whatever set you purchase has some way of defeating
the enhancement.

You need to actually look at the sets you're considering with program
material you're familiar with.



Seconded on all counts, and also the reason that I recently bought a
plasma
TV (Panasonic, 50" full HD panel, 400Hz). I have not seen a single thing
about this TV that I don't like so far, unlike the LCD TVs that I have in
the house, and the LCDs that cross my bench for repair, all of which
suffer
from motion artifacts, scaling artifacts, and motion blur ...

This plasma TV has produced absolutely stunning HD pictures from the
Winter
Olympics, with not the slightest sign of motion artifacts of any
description,
even on the fastest content like downhill skiing, and bobsleigh etc. In
contrast, the same content that I have seen on LCDs, has been perfectly
dreadful.

Arfa


Maybe I'm not picky but those motion artifacts just aren't important
enough for me to want to spend thousands on something that doesn't
produce them. I have a fairly cheap 32" and while it does produce some
artifacts they are insignificant to the overall performance.

But the point is that you no longer have to pay thousands to get that
performance. The plasma that I recently bought was little more to buy than a
'good' LCD, but the performance is easily a whole order of magnitude higher.
CRT sets did not suffer from motion artifacts, and I wasn't prepared to
'downgrade' my viewing experience by buying something which did. The LCD
that I have in my kitchen, also 32" and also 'fairly cheap', does suffer
from motion artifacts which are particularly noticeable on high speed stuff
like the winter olympics. I actually do find these significant and annoying,
and I would not consider having to put up with such a picture on my main TV.
Fortunately, the latest generation of affordable plasmas, means that I don't
have to :)

Arfa
 
On 27/02/2010 1:22 AM, William Sommerwerck wrote:
LCDs don't flicker anyway, regardless of their framerate. The frame
rate issue relates to addressing the judder you get as a result of
the image consisting of a sequence of discrete images, rather than
one that continuously varies.

Not quite, otherwise the issue would occur with plasma displays. Indeed, it
would with any moving-image recording system.

The problem is that LCDs don't respond "instantaneously". They take a finite
time to go from opaque to the desired transmission level, and then back
again. The result is that the image can lag and "smear". (25 years ago, the
first pocket LCD color TVs from Casio had terrible smear, which added an
oddly "artistic" quality to sports.)

For reasons not clear to me, adding interpolated images reduces the smear.
This makes absolutely no sense whatever, as the LCD now has /less/ time to
switch. I've never gotten an answer on this.
Many years ago (using a Sinclair Spectrum no less) I noticed an effect
whereby if a small character-sized square was moved across the screen in
character-sized steps, the eye perceived phantom squares at intervening
positions. Since the computer was not displaying these intermediate
squares, their appearance must have been due to the eye. The likely
explanation was that the eye was traversing the screen smoothly to
follow the square, but the square itself was moving in discrete steps.
So the eye was causing the image of the square to be smeared across the
retina. I was seeing this effect on a CRT screen, but the longer the
persistence of the image on the screen the worse the effect would be.
Interpolating the position of the image on the screen would reduce that
effect.

However, I can't explain why this would be less pronounced on a plasma
screen.
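
A toy model of that explanation (the speed and update rates below are made-up numbers, purely to show the trend): if the eye tracks the expected smooth motion while the square is only drawn at discrete positions, the worst-case gap between the gaze point and the drawn position, and hence the smear across the retina, shrinks as more intermediate positions are shown.

def max_retinal_error(speed_px_per_s, positions_per_s):
    """Worst-case distance (pixels) between the smoothly tracking gaze
    point and the most recently drawn position of the object."""
    return speed_px_per_s / positions_per_s

speed = 480.0                         # on-screen motion, pixels per second
for rate in (25, 50, 100, 200):       # drawn positions per second
    print(f"{rate:>3} positions/s -> up to {max_retinal_error(speed, rate):5.1f} px of smear")
# Doubling the number of drawn positions (e.g. by interpolating extra
# frames) halves the worst-case smear, which is why interpolation helps.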

It doesn't help that much TV material that was recorded on film is
transmitted with odd and even interlaced frames that are scans of
the same underlying image (or some variation thereon), so that the
effective refresh rate is considerably lower than the interlaced rate.

Interlaced images can be de-interlaced. Note that most product reviews test
displays for how well they do this.
They have to be deinterlaced for display on any screen with significant
persistence, but deinterlacing doesn't increase the underlying frame rate.

Sylvia.
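
For anyone unfamiliar with what de-interlacing actually does, here is a bare-bones sketch of the two classic approaches, weave and bob; real de-interlacers are motion-adaptive and far more elaborate, and, as noted above, neither raises the underlying frame rate of film-sourced material.

import numpy as np

def weave(top_field, bottom_field):
    """Interleave two fields back into one frame. Perfect when both
    fields came from the same film frame; combs badly when they did not."""
    h, w = top_field.shape
    frame = np.empty((2 * h, w), dtype=top_field.dtype)
    frame[0::2] = top_field          # which field owns the odd/even lines
    frame[1::2] = bottom_field       # is an assumption here
    return frame

def bob(field):
    """Stretch a single field to full height by line doubling: no combing,
    but only half the vertical resolution per displayed frame."""
    return np.repeat(field, 2, axis=0)

top = np.arange(12).reshape(3, 4)
bottom = top + 100
print(weave(top, bottom).shape, bob(top).shape)   # (6, 4) (6, 4)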
 
<snip>


"Sylvia Else" <sylvia@not.at.this.address> wrote in message
news:4b888aaa$0$32078$c3e8da3@news.astraweb.com...
Many years ago (using a Sinclair Spectrum no less) I noticed an effect
whereby if a small character-sized square was moved across the screen in
character-sized steps, the eye perceived phantom squares at intervening
positions. Since the computer was not displaying these intermediate
squares, their appearance must have been due to the eye. The likely
explanation was that the eye was traversing the screen smoothly to follow
the square, but the square itself was moving in discrete steps. So the eye
was causing the image of the square to be smeared across the retina. I was
seeing this effect on a CRT screen, but the longer the persistence of the
image on the screen the worse the effect would be. Interpolating the
position of the image on the screen would reduce that effect.

However, I can't explain why this would be less pronounced on a plasma
screen.


Because LCD cells are painfully slow at switching, which equates to a long
persistence phosphor on a CRT, which as you say yourself, makes the effect
worse. Plasma cells are very fast, particularly now that 'pre-fire'
techniques are used to 'ready' them for switching. This is the equivalent of
having a very short persistence phosphor on a CRT. If you arrange for the
drive electronics to be able to deliver the cell drives with no more delay
than the cells themselves are contributing, then the result will be smooth
motion without any perceivable blur, which is pretty much how it was with a
standard domestic CRT based CTV.

Arfa


It doesn't help that much TV material that was recorded on film is
transmitted with odd and even interlaced frames that are scans of
the same underlying image (or some variation thereon), so that the
effective refresh rate is considerably lower than the interlaced rate.

Interlaced images can be de-interlaced. Note that most product reviews
test
displays for how well they do this.


They have to be deinterlaced for display on any screen with significant
persistence, but deinterlacing doesn't increase the underlying frame rate.

Sylvia.
 
On Sat, 27 Feb 2010 01:30:10 -0000, "Arfa Daily"
<arfa.daily@ntlworld.com> wrote:

<Meat Plow> wrote in message news:3i5job.5qb.19.13@news.alt.net...
On Fri, 26 Feb 2010 02:32:38 -0000, "Arfa Daily"
<arfa.daily@ntlworld.com> wrote:


"William Sommerwerck" <grizzledgeezer@comcast.net> wrote in message
news:hm7241$21g$1@news.eternal-september.org...
First, the only televisions that use LEDs use OLEDs. There are none
using
conventional LEDs.

Second, there are no strict definitions of what these refresh rates
mean.
In
some cases, the set generates an interpolated image at that rate, in
others,
a blank (black) raster is inserted. Some sets combine both.

I don't like this enhancement (which was one of the reasons I bought a
plasma set). It has a nasty side-effect -- it makes motion pictures look
like video. This might be fine for a TV show; it isn't when you're
watching
movies. Be sure that whatever set you purchase has some way of defeating
the enhancement.

You need to actually look at the sets you're considering with program
material you're familiar with.



Seconded on all counts, and also the reason that I recently bought a
plasma
TV (Panasonic, 50" full HD panel, 400Hz). I have not seen a single thing
about this TV that I don't like so far, unlike the LCD TVs that I have in
the house, and the LCDs that cross my bench for repair, all of which
suffer
from motion artifacts, scaling artifacts, and motion blur ...

This plasma TV has produced absolutely stunning HD pictures from the
Winter
Olympics, with not the slightest sign of motion artifacts of any
description,
even on the fastest content like downhill skiing, and bobsleigh etc. In
contrast, the same content that I have seen on LCDs, has been perfectly
dreadful.

Arfa


Maybe I'm not picky but those motion artifacts just aren't important
enough for me to want to spend thousands on something that doesn't
produce them. I have a fairly cheap 32" and while it does produce some
artifacts they are insignificant to the overall performance.


But the point is that you no longer have to pay thousands to get that
performance. The plasma that I recently bought was little more to buy than a
'good' LCD, but the performance is easily a whole order of magnitude higher.
CRT sets did not suffer from motion artifacts, and I wasn't prepared to
'downgrade' my viewing experience by buying something which did. The LCD
that I have in my kitchen, also 32" and also 'fairly cheap', does suffer
from motion artifacts which are particularly noticeable on high speed stuff
like the winter olympics. I actually do find these significant and annoying,
and I would not consider having to put up with such a picture on my main TV.
Fortunately, the latest generation of affordable plasmas, means that I don't
have to :)
I went from a 35" Toshiba CRT back in 1998 to the 53" Panasonic
progressive scan projector. Really enjoyed the 480p resolution. And
after one repair of the STK chips and realignment by yours truly it
still has about 90% of its original performance. But the cheapo Olevia
32" LCD still blows it away. Nice accurate color, high contrast ratio.
Good refresh rate. I can use it as a 1680x1050 computer monitor and it
has much better contrast and brightness than my LG Flatron L222WT
22" monitor. The only thing wrong is the sound sucks. The internal
speakers buzz on bass notes of the right frequency and there is a
slight hum when the background is dimmed on the low ambient light
setting. So I added a subwoofer and use the set's speakers for mid
and high range audio. The set does have excellent spatial stereo
resolution. On the whole the Olevia is a good set and I forget about
its shortcomings. I don't watch TV much, just a couple of hours in the
evening after 6. And I usually end up falling asleep in my recliner
after two hours or so :)
 
On 28/02/2010 12:12 AM, Arfa Daily wrote:
snip


"Sylvia Else"<sylvia@not.at.this.address> wrote in message
news:4b888aaa$0$32078$c3e8da3@news.astraweb.com...

Many years ago (using a Sinclair Spectrum no less) I noticed an effect
whereby if a small character-sized square was moved across the screen in
character-sized steps, the eye perceived phantom squares at intervening
positions. Since the computer was not displaying these intermediate
squares, their appearance must have been due to the eye. The likely
explanation was that the eye was traversing the screen smoothly to follow
the square, but the square itself was moving in discrete steps. So the eye
was causing the image of the square to be smeared across the retina. I was
seeing this effect on a CRT screen, but the longer the persistence of the
image on the screen the worse the effect would be. Interpolating the
position of the image on the screen would reduce that effect.

However, I can't explain why this would be less pronounced on a plasma
screen.



Because LCD cells are painfully slow at switching, which equates to a long
persistence phosphor on a CRT, which as you say yourself, makes the effect
worse. Plasma cells are very fast, particularly now that 'pre-fire'
techniques are used to 'ready' them for switching. This is the equivalent of
having a very short persistence phosphor on a CRT. If you arrange for the
drive electronics to be able to deliver the cell drives with no more delay
than the cells themselves are contributing, then the result will be smooth
motion without any perceivable blur, which is pretty much how it was with a
standard domestic CRT based CTV.

Arfa
It seems to me that the effect would be visible on any display that has
any degree of persistence. Even if LCDs switched instantaneously, they'd
still be displaying the image for the full frame time and then
instantaneously switching to the next image. This would produce the
smearing effect in the way I've described. To avoid it, one needs a
display that produces a short bright flash at, say, the beginning of the
display period, and remains dark for the rest of the time. As I
understand plasma displays, that's not how they work.

Sylvia.
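
A rough way to put numbers on that argument (illustrative figures only): approximate the perceived smear as the distance the tracked object moves across the retina while the image is actually lit, and compare a full-frame "hold" with a short flash.

speed_px_per_s = 960.0                       # assumed on-screen motion being tracked

hold_times = {
    "full-frame hold (60 Hz LCD)": 1 / 60,   # ~16.7 ms lit per frame
    "half-frame hold":             1 / 120,
    "short flash (CRT-like)":      0.001,    # ~1 ms of useful light
}

for name, t in hold_times.items():
    blur = speed_px_per_s * t                # distance swept while the image is lit
    print(f"{name}: ~{blur:.1f} px of perceived smear")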
 
"Sylvia Else" <sylvia@not.at.this.address> wrote in message
news:006912a3$0$2891$c3e8da3@news.astraweb.com...
On 28/02/2010 12:12 AM, Arfa Daily wrote:
snip


"Sylvia Else"<sylvia@not.at.this.address> wrote in message
news:4b888aaa$0$32078$c3e8da3@news.astraweb.com...

Many years ago (using a Sinclair Spectrum no less) I noticed an effect
whereby if a small character-sized square was moved across the screen in
character-sized steps, the eye perceived phantom squares at intervening
positions. Since the computer was not displaying these intermediate
squares, their appearance must have been due to the eye. The likely
explanation was that the eye was traversing the screen smoothly to
follow
the square, but the square itself was moving in discrete steps. So the
eye
was causing the image of the square to be smeared across the retina. I
was
seeing this effect on a CRT screen, but the longer the persistence of
the
image on the screen the worse the effect would be. Interpolating the
position of the image on the screen would reduce that effect.

However, I can't explain why this would be less pronounced on a plasma
screen.



Because LCD cells are painfully slow at switching, which equates to a
long
persistence phosphor on a CRT, which as you say yourself, makes the
effect
worse. Plasma cells are very fast, particularly now that 'pre-fire'
techniques are used to 'ready' them for switching. This is the equivalent
of
having a very short persistence phosphor on a CRT. If you arrange for the
drive electronics to be able to deliver the cell drives with no more
delay
than the cells themselves are contributing, then the result will be
smooth
motion without any perceivable blur, which is pretty much how it was with
a
standard domestic CRT based CTV.

Arfa

It seems to me that the effect would be visible on any display that has
any degree of persistence. Even if LCDs switched instantaneously, they'd
still be displaying the image for the full frame time and then
instantaneously switching to the next image. This would produce the
smearing effect in the way I've described. To avoid it, one needs a
display that produces a short bright flash at, say, the beginning of the
display period, and remains dark for the rest of the time. As I understand
plasma displays, that's not how they work.

Sylvia.

I think you are mis-understanding the principles involved here in producing
a picture perceived to have smooth smear-free movement, from a sequence of
still images. Any medium which does this, needs to get the image in place as
quickly as possible, and for a time shorter than the period required to get
the next picture in place. This is true of a cinema picture, a CRT
television picture, an LCD television picture, or a plasma or OLED picture.
Making these still images into a perceived moving image, has nothing to do
with the persistence of the phosphor, but is a function of retinal
retention, or 'persistence of vision'. Black and white TV CRTs used a
phosphor blend known as 'P4', and tricolour CRTs typically used 'P22'. Both
of these are designated as being short persistence types. The green and blue
phosphors used in a colour CRT have persistence times of typically less
than 100 µs, and the red around 200-300 µs.

The switching time of modern LCD cells is around 1-2 ms, and plasma cells
can switch in around 1 µs. This means that the plasma cell can be switched
very quickly, and then allowed to 'burn' for as long or short a period as
the designer of the TV decides is appropriate - typically, I would think, of
the same order of time as the persistence of a P22 phosphor, thus allowing
the plasma panel to closely match the fundamental display characteristics of
a typical P22 CRT.

A good description of why the slow switching time of LCD cells is still a
problem in terms of motion blur, and what the manufacturers do to try to
overcome this, can be found at

http://en.wikipedia.org/wiki/LCD_television#Response_time

Arfa
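
To put those switching-time figures alongside the frame time, here is a toy calculation; treating the cell response as a single exponential is an assumption, since real LCD response curves depend heavily on the particular grey-to-grey transition.

import math

frame_time_ms = 1000 / 60                  # a 60 Hz frame lasts ~16.7 ms

for name, resp_ms in [("LCD cell", 2.0), ("plasma cell", 0.001)]:
    tau = resp_ms / math.log(9)            # 10-90% rise of an exponential = tau * ln(9)
    fraction = resp_ms / frame_time_ms     # share of the frame spent mid-transition
    print(f"{name}: 10-90% response {resp_ms} ms (tau ~{tau:.2f} ms), "
          f"~{fraction:.1%} of a 60 Hz frame in transition")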
 
On 1/03/2010 12:17 PM, Arfa Daily wrote:
"Sylvia Else"<sylvia@not.at.this.address> wrote in message
news:006912a3$0$2891$c3e8da3@news.astraweb.com...
On 28/02/2010 12:12 AM, Arfa Daily wrote:
snip


"Sylvia Else"<sylvia@not.at.this.address> wrote in message
news:4b888aaa$0$32078$c3e8da3@news.astraweb.com...

Many years ago (using a Sinclair Spectrum no less) I noticed an effect
whereby if a small character-sized square was moved across the screen in
character-sized steps, the eye perceived phantom squares at intervening
positions. Since the computer was not displaying these intermediate
squares, their appearance must have been due to the eye. The likely
explanation was that the eye was traversing the screen smoothly to
follow
the square, but the square itself was moving in discrete steps. So the
eye
was causing the image of the square to be smeared across the retina. I
was
seeing this effect on a CRT screen, but the longer the persistence of
the
image on the screen the worse the effect would be. Interpolating the
position of the image on the screen would reduce that effect.

However, I can't explain why this would be less pronounced on a plasma
screen.



Because LCD cells are painfully slow at switching, which equates to a
long
persistence phosphor on a CRT, which as you say yourself, makes the
effect
worse. Plasma cells are very fast, particularly now that 'pre-fire'
techniques are used to 'ready' them for switching. This is the equivalent
of
having a very short persistence phosphor on a CRT. If you arrange for the
drive electronics to be able to deliver the cell drives with no more
delay
than the cells themselves are contributing, then the result will be
smooth
motion without any perceivable blur, which is pretty much how it was with
a
standard domestic CRT based CTV.

Arfa

It seems to me that the effect would be visible on any display that has
any degree of persistence. Even if LCDs switched instantaneously, they'd
still be displaying the image for the full frame time and then
instantaneously switching to the next image. This would produce the
smearing effect in the way I've described. To avoid it, one needs a
display that produces a short bright flash at, say, the beginning of the
display period, and remains dark for the rest of the time. As I understand
plasma displays, that's not how they work.

Sylvia.


I think you are mis-understanding the principles involved here in producing
a picture perceived to have smooth smear-free movement, from a sequence of
still images. Any medium which does this, needs to get the image in place as
quickly as possible, and for a time shorter than the period required to get
the next picture in place. This is true of a cinema picture, a CRT
television picture, an LCD television picture, or a plasma or OLED picture.
Making these still images into a perceived moving image, has nothing to do
with the persistence of the phosphor, but is a function of retinal
retention, or 'persistence of vision'.
The fact that a sequence of still images is perceived as a moving
picture is clearly a consequence of visual persistence. And it's obvious
that things will look bad if the images actually overlap. But that's not
what we're discussing.

We're discussing why certain types of display don't do such a good job
despite having a reasonably sharp transition from one image to the next.

The Wikipedia article you cited said that even LCD switching times of
2ms are not good enough "because the pixel will still be switching while
the frame is being displayed." I find this less than convincing as an
explanation. So what if the pixel is switching while the frame is being
displayed? It's not as if the eye has a shutter, and the transition time
is much less than the eye's persistence time anyway.

Sylvia.
 
In article <PrNhn.74162$_W6.55448@newsfe30.ams2>,
Arfa Daily <arfa.daily@ntlworld.com> wrote:
You should also be aware that there are several 'resolutions' of screen and
drive to take into consideration. Almost all TV showrooms both here and in
the US, tend to have the sets running on at least an HD picture, and often a
BluRay picture. This makes them look very good at first glance. Problem is
that in normal day to day use when you get it back home, you are going to be
watching standard resolution terrestrial broadcasts on it, and on many sets,
these look pretty dreadful, and it is the reason that so many people are
disappointed with their purchase when they get it home, and think that it is
not what they saw in the store.
At least in my area (Seattle), all the mainstream over the air stations
are HD, now. (Each station has its own transmitter, so they don't have
shared "multiplexes" like in the UK). The typical situation is an
HD subchannel with the main program and one SD subchannel with some
secondary service (sports, weather, old movies, old TV show, or an
SD copy of the main signal to feed analog cable).

There's a smaller number of stations that trade selection for resolution,
running four or five SD subchannels. Christian broadcasting and
speciality stuff like the secondary channels on the other stations.

The only way you'd get stuck with standard definition on the national
networks is to still be using analog cable. (I'm not familiar with
what you get with the two (subscription only) national direct broadcast
satellite providers).

Mark Zenier mzenier@eskimo.com
Googleproofaddress(account:mzenier provider:eskimo domain:com)
 
"Sylvia Else" <sylvia@not.at.this.address> wrote in message
news:4b8b1bc5$0$5339$c3e8da3@news.astraweb.com...
On 1/03/2010 12:17 PM, Arfa Daily wrote:
"Sylvia Else"<sylvia@not.at.this.address> wrote in message
news:006912a3$0$2891$c3e8da3@news.astraweb.com...
On 28/02/2010 12:12 AM, Arfa Daily wrote:
snip


"Sylvia Else"<sylvia@not.at.this.address> wrote in message
news:4b888aaa$0$32078$c3e8da3@news.astraweb.com...

Many years ago (using a Sinclair Spectrum no less) I noticed an effect
whereby if a small character-sized square was moved across the screen
in
character-sized steps, the eye perceived phantom squares at
intervening
positions. Since the computer was not displaying these intermediate
squares, their appearance must have been due to the eye. The likely
explanation was that the eye was traversing the screen smoothly to
follow
the square, but the square itself was moving in discrete steps. So the
eye
was causing the image of the square to be smeared across the retina. I
was
seeing this effect on a CRT screen, but the longer the persistence of
the
image on the screen the worse the effect would be. Interpolating the
position of the image on the screen would reduce that effect.

However, I can't explain why this would be less pronounced on a plasma
screen.



Because LCD cells are painfully slow at switching, which equates to a
long
persistence phosphor on a CRT, which as you say yourself, makes the
effect
worse. Plasma cells are very fast, particularly now that 'pre-fire'
techniques are used to 'ready' them for switching. This is the
equivalent
of
having a very short persistence phosphor on a CRT. If you arrange for
the
drive electronics to be able to deliver the cell drives with no more
delay
than the cells themselves are contributing, then the result will be
smooth
motion without any perceivable blur, which is pretty much how it was
with
a
standard domestic CRT based CTV.

Arfa

It seems to me that the effect would be visible on any display that has
any degree of persistence. Even if LCDs switched instantaneously, they'd
still be displaying the image for the full frame time and then
instantaneously switching to the next image. This would produce the
smearing effect in the way I've described. To avoid it, one needs a
display that produces a short bright flash at, say, the beginning of the
display period, and remains dark for the rest of the time. As I
understand
plasma displays, that's not how they work.

Sylvia.


I think you are mis-understanding the principles involved here in
producing
a picture perceived to have smooth smear-free movement, from a sequence
of
still images. Any medium which does this, needs to get the image in place
as
quickly as possible, and for a time shorter than the period required to
get
the next picture in place. This is true of a cinema picture, a CRT
television picture, an LCD television picture, or a plasma or OLED
picture.
Making these still images into a perceived moving image, has nothing to
do
with the persistence of the phosphor, but is a function of retinal
retention, or 'persistence of vision'.

The fact that a sequence of still images is perceived as a moving picture
is clearly a consequence of visual persistence. And it's obvious that
things will look bad if the images actually overlap. But that's not what
we're discussing.

We're discussing why certain types of display don't do such a good job
despite having a reasonably sharp transition from one image to the next.

The Wikipedia article you cited said that even LCD switching times of 2ms
are not good enough "because the pixel will still be switching while the
frame is being displayed." I find this less than convincing as an
explanation. So what if the pixel is switching while the frame is being
displayed? It's not as if the eye has a shutter, and the transition time
is much less than the eye's persistence time anyway.

Sylvia.
Well, I dunno how else to put it. I'm just telling it as I was taught it
many years ago at college. I was taught that short persistence phosphors
were used on TV display CRTs, to prevent motion blur, and that the
individual images were integrated into a moving image, by persistence of
vision, not phosphor. I was also taught that the combination of POV and the
short decay time of the phosphor, led to a perceived flicker in the low
frame-rate images, so the technique of splitting each frame into two
interlaced fields, transmitted sequentially, was then born, which totally
overcame this shortcoming of the system. Always made sense to me. Always
made sense also that any replacement technology had to obey the same 'rule'
of putting the still image up very quickly, and not leaving it there long,
to achieve the same result.

If you think about it, the only 'real' difference between an LCD panel, and
a plasma panel, is the switching time of the individual elements. On the LCD
panel, this is relatively long, whereas on the plasma panel, it is short.
The LCD panel suffers from motion blur, but the plasma panel doesn't. Ergo,
it's the element switching time which causes this effect ... ??

Actually, looking into this a bit further, it seems that POV is nothing like
as simple as it would seem. I've just found another Wiki article

http://en.wikipedia.org/wiki/Persistence_of_vision

which would seem to imply that flat panel display techniques leave the first
image in place until just before the second image is ready to be put up, and
then the third, fourth and so on. If this is the case, and the reasoning
behind the fast 'frame rates' that are now being used by manufacturers, then
it would seem reasonable (to me at least) that in order for this technique
to work correctly, the element switching time would have to be as short as
possible, which would explain why a device with a switching time 1000 times
faster than another, would produce 'sharper' pictures with smoother
blur-less motion.

Arfa
 
Arfa Daily wrote:
Making these still images into a perceived moving image, has nothing to do
with the persistence of the phosphor, but is a function of retinal
retention, or 'persistence of vision'. Black and white TV CRTs used a
phosphor blend known as 'P4', and tricolour CRTs typically used 'P22'. Both
of these are designated as being short persistence types. The green and blue
phosphors used in a colour CRT have persistence times of typically less
than 100 µs, and the red around 200-300 µs.
Short and long persistence are relative terms. Compared to the P1 phosphors
of radar screens and oscilloscopes, P4 phosphors are relatively short
persistence. Compared to an LED they are long persistence.

Note that there is a lot of "wiggle room" in there; supposedly the human
eye can only see at 24 frames per second, which is about 42 ms per frame.

Also note that there are relatively few frame rates in source material,
NTSC TV is 30000/1001 (about 29.97) frames per second, PAL TV is 25. Film is
24, which was stretched to 25 for PAL TV and reduced to 24000/1001 (about
23.976) for NTSC TV.

Film shot for direct TV distribution (MTV really did have some technological
impact) was shot at 30000/1001 frames per second.

Digital TV could be any frame rate, but they have stuck with the old standards,
US digital TV is still the same frame rate as NTSC and EU, etc. digital TV is
still 25 FPS.
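
The arithmetic behind those rates, plus the usual 2:3 pulldown cadence used to fit 24 fps film into roughly 60 fields per second, sketched in Python (the cadence shown is the common one, not the only one):

from fractions import Fraction
from itertools import cycle, islice

ntsc_frame_rate = Fraction(30000, 1001)    # ~29.97 frames/s
ntsc_field_rate = 2 * ntsc_frame_rate      # ~59.94 fields/s
film_on_ntsc    = Fraction(24000, 1001)    # film slowed by 0.1% for NTSC

print(float(ntsc_frame_rate), float(ntsc_field_rate), float(film_on_ntsc))

# 2:3 pulldown: alternate film frames are held for 2 and 3 video fields,
# so 4 film frames fill 10 fields, i.e. 5 interlaced video frames.
fields = sum(islice(cycle([2, 3]), 4))
print(f"4 film frames -> {fields} fields -> {fields // 2} video frames")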

Lots of video files online are compressed at lower frame rates because of
the way they are shown. The screens still operate at their regular frame
rate, the computer decoding them just repeats them as necessary.

Geoff.

--
Geoffrey S. Mendelson, Jerusalem, Israel gsm@mendelson.com N3OWJ/4X1GM
New word I coined 12/13/09, "Sub-Wikipedia" adj, describing knowledge or
understanding, as in he has a sub-wikipedia understanding of the situation.
i.e possessing less facts or information than can be found in the Wikipedia.
 
Arfa Daily wrote:
Well, I dunno how else to put it. I'm just telling it as I was taught it
many years ago at college. I was taught that short persistence phosphors
were used on TV display CRTs, to prevent motion blur, and that the
individual images were integrated into a moving image, by persistence of
vision, not phosphor. I was also taught that the combination of POV and the
short decay time of the phosphor, led to a perceived flicker in the low
frame-rate images, so the technique of splitting each frame into two
interlaced fields, transmitted sequentially, was then born, which totally
overcame this shortcoming of the system. Always made sense to me. Always
made sense also that any replacement technology had to obey the same 'rule'
of putting the still image up very quickly, and not leaving it there long,
to achieve the same result.
It's more complicated than that. You only see one image, which has been
created in your brain from several sources. The most information comes
from the rods in your eyes; they are light-level (monochromatic) sensors,
as it were, and they are the most prevalent. This means most of what you see
is from the combination of two sets of monochrome images with slightly
to wildly different information.

Then there are the cones, or color sensors. There are far fewer of them
and they are less sensitive to light, which is why night vision is black
and white.

There are also blind spots where the optic nerves attach to the retina.

None of these show up on their own; they are all integrated into the one
image you see. You never notice that you have two blind spots, you don't
notice the lack of clarity in colors (due to the smaller number of cones),
and rarely, if ever, do you notice the difference between your eyes.

If you were, for example, to need glasses in one eye and not the other, or have
not quite properly prescribed lenses, your image will still appear sharp, not
blurred on one side and sharp on the other.

Lots of tricks have been used over the years to take advantage of the
limitations of the "equipment" and the process. For example, anything faster
than 24 frames a second is not perceived as being discrete images, but one
smooth image.

The 50 and 60 fields per second (a field being half an interlaced frame) were
chosen not because they needed to be that fast (48 would have done), but to
eliminate interference effects from electrical lights.

Color is another issue. NTSC (whose approach was later adopted by the BBC for
PAL) determined that a 4:1 color system was good enough, i.e. color information
only needed to be changed (and recorded) at 1/4 the speed of the light level.

In modern terms, it means that for every 4 pixels, you only have to have
color information once. Your eye can resolve the difference in light levels,
but not in colors.

This persists to this day: MPEG-type encoding is based on that. It's not
redgreenblue, redgreenblue, redgreenblue, redgreenblue of a still
picture or a computer screen, it's the lightlevel, lightlevel,
lightlevel, lightlevel colorforallfour encoding that was used by NTSC
and PAL.
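
A rough digital-domain illustration of that idea (this is 4:2:0-style subsampling as used by common MPEG profiles; the analogue NTSC/PAL case is about chroma bandwidth rather than discrete pixels, so treat it as an analogy):

import numpy as np

def subsample_chroma_420(y, cb, cr):
    """Keep full-resolution luma but average each 2x2 block of chroma,
    so four luma samples end up sharing one pair of colour samples."""
    def pool(c):
        h, w = c.shape
        return c.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
    return y, pool(cb), pool(cr)

y  = np.random.rand(4, 4)    # luma ("light level") samples
cb = np.random.rand(4, 4)    # blue-difference chroma
cr = np.random.rand(4, 4)    # red-difference chroma

y2, cb2, cr2 = subsample_chroma_420(y, cb, cr)
print(y2.shape, cb2.shape, cr2.shape)    # (4, 4) (2, 2) (2, 2)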

In the end, IMHO, it's not frame rates or color encoding methods at all, as
they were fixed around 1960 and not changed, but display technology as
your brain perceives it.

No matter what anyone says here, it's the combination of the exact implementation
of the display technology and your brain that matters. If the combination
looks good, and you are comfortable watching it, a 25 fps CRT, or a 100 FPS
LED screen, or even a 1000 FPS display, if there were such a thing, would look
good if everything combined produces good images in YOUR brain, and
bad if some combination produces something "wrong".



If you think about it, the only 'real' difference between an LCD panel, and
a plasma panel, is the switching time of the individual elements. On the LCD
panel, this is relatively long, whereas on the plasma panel, it is short.
The LCD panel suffers from motion blur, but the plasma panel doesn't. Ergo,
it's the element switching time which causes this effect ... ??
There is more to that too. An LCD is like a shutter. It pivots on its
axis and is either open or closed. Well, not quite: there is a finite
transition time from closed (black) to open (lit), and therefore a gradual
build-up of brightness.

Plasma displays are gas discharge devices; they only glow when there is enough
voltage to "fire" them, until it drops below the level needed to sustain
the glow. That depends more upon the speed of the control electronics than
any (other) laws of physics (viscosity of the medium the crystals are in,
temperature, etc.).

That's the aim of LED backlit TV screens (besides less power consumption, heat,
etc). They are lit only when the crystals are "open", so there is no time
when you see partially lit "pixels".

Geoff.


--
Geoffrey S. Mendelson, Jerusalem, Israel gsm@mendelson.com N3OWJ/4X1GM
New word I coined 12/13/09, "Sub-Wikipedia" adj, describing knowledge or
understanding, as in he has a sub-wikipedia understanding of the situation.
i.e possessing less facts or information than can be found in the Wikipedia.
 
Lots of tricks have been used over the years to take advantage
of the limitations of the "equipment" and the process. For example,
anything faster than 24 frames a second is not perceived as being
discrete images, but one smooth image.
Actually, it's 16 frames a second. However, that rate is not fast enough to
prevent flicker -- which is why silent films were sometimes called
"flickers". This is one of the reasons the frame rate was increased to 24
with the introduction of sound.


The 50 and 60 fields per second (a field being half an interlaced frame)
were chosen not because they needed to be that fast (48 would have
done), but to eliminate interference effects from electrical lights.
That's new to me.


Color is another issue. NTSC (whose approach was later adopted by the BBC for PAL)
determined that a 4:1 color system was good enough, i.e. color information
only needed to be changed (and recorded) at 1/4 the speed of the light
level.

NTSC is actually 4.2/1.5, or roughly 2.8 to 1. PAL is closer to 5:1.


That's the aim of LED backlit TV screens (besides less power consumption,
heat, etc). They are lit only when the crystals are "open", so there is no
time when you see partially lit "pixels".
I hate to spoil things, Geoff, but liquid crystals are quite capable of
taking intermediate positions -- that is, forming a continuous gray scale.
 
On 1/03/2010 11:13 PM, William Sommerwerck wrote:
Lots of tricks have been used over the years to take advantage
of the limitations of the "equipment" and the process. For example,
anything faster than 24 frames a second is not perceived as being
discrete images, but one smooth image.

Actually, it's 16 frames a second. However, that rate is not fast enough to
prevent flicker -- which is why silent films were sometimes called
"flickers". This is one of the reasons the frame rate was increased to 24
with the introduction of sound.


The 50 and 60 fields per second (a field being half an interlaced frame)
were chosen not because they needed to be that fast (48 would have
done), but to eliminate interference effects from electrical lights.

That's new to me.
Well, the story I heard way back when is that it was to synchronise the
picture's vertical frequency with the mains frequency, so that
inadequacies in power smoothing produced static distortions in the
picture rather than much more noticeable rolling distortions.

Sylvia.
 
