120 Hz versus 240 Hz

The 50 and 60 fields per second (a field being half an interlaced frame)
were chosen not because they needed to be that fast (48 would have
done), but to eliminate interference effects from electrical lights.

That's new to me.

Well, the story I heard way back when is that it was to synchronise
the picture's vertical frequency with the mains frequency, so that
inadequacies in power smoothing produced static distortions in the
picture rather than much more noticeable rolling distortions.

That's what I heard, too. But that's not "interference effects from
electrical lights".
 

You are assuming that all interference would be on the screen
itself and none would be visual. Since fluorescent and to some
extent incandescent lights blink (what is the persistence of an
incandescent light?) at 60 Hz, there is a strobing effect if there
are lights on in the room with the TV.

Incandescent lights have almost no flicker, due to the thermal inertia of
the filament. Fluorescent lighting was not common in living rooms at the
time the standards were set.
 
"Geoffrey S. Mendelson" <gsm@cable.mendelson.com> wrote in message
news:slrnhon6oi.uvt.gsm@cable.mendelson.com...
Arfa Daily wrote:
Well, I dunno how else to put it. I'm just telling it as I was taught it
many years ago at college. I was taught that short persistence phosphors
were used on TV display CRTs, to prevent motion blur, and that the
individual images were integrated into a moving image by persistence of
vision, not phosphor. I was also taught that the combination of POV and the
short decay time of the phosphor led to a perceived flicker in the low
frame-rate images, so the technique of splitting each frame into two
interlaced fields, transmitted sequentially, was then born, which totally
overcame this shortcoming of the system. Always made sense to me. Always
made sense also that any replacement technology had to obey the same 'rule'
of putting the still image up very quickly, and not leaving it there long,
to achieve the same result.
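
A rough sketch of the field-splitting idea described above (illustrative
Python only; the frame size and names are made up, not taken from any
broadcast standard):

# Sketch: split one progressive frame into two interlaced fields,
# odd lines in one field and even lines in the other.
import numpy as np

def split_into_fields(frame):
    top_field = frame[0::2, :]      # lines 0, 2, 4, ...
    bottom_field = frame[1::2, :]   # lines 1, 3, 5, ...
    return top_field, bottom_field

# A 576-line frame at 25 frames/s becomes two 288-line fields at 50 fields/s.
frame = np.arange(576 * 720).reshape(576, 720)
top, bottom = split_into_fields(frame)
assert top.shape == (288, 720) and bottom.shape == (288, 720)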

It's more complicated than that. You only see one image, which has been
created in your brain from several sources. The most information comes
from the rods in your eyes; they are light-level (monochromatic) sensors,
as it were, and they are the most prevalent. This means most of what you see
is from the combination of two sets of monochrome images with slightly
to wildly different information.

Then there are the cones, or color sensors. There are far fewer of them
and they are less sensitive to light, which is why night vision is black
and white.

There are also blind spots where the optic nerves attach to the retina.

None of these show up on their own; they are all integrated into the one
image you see. You never notice that you have two blind spots, you don't
notice the lack of clarity in colors (due to the smaller number of color
sensors), and rarely, if ever, do you notice the difference between your eyes.

If you were, for example, to need glasses in one eye and not the other, or
have not quite properly prescribed lenses, your image will appear sharp
overall, not blurred on one side and sharp on the other.

Lots of tricks have been used over the years to take advantage of the
limitations of the "equipment" and the process. For example, anything faster
than 24 frames a second is not perceived as being discrete images, but one
smooth image.

The difference in resolution between the brightness and colour receptors in
human eyes is well known and understood, but I don't think that this, or
any other physical aspect of the eye's construction, has any effect on the
way that motion is perceived from a series of still images.


The 50 and 60 fields per second (a field being half an interlaced frame)
were chosen not because they needed to be that fast (48 would have done),
but to eliminate interference effects from electrical lights.

Color is another issue. The NTSC (and later the BBC, for PAL) determined
that a 4:1 color system was good enough, i.e. color information only needed
to be changed (and recorded) at 1/4 the speed of the light level.

In modern terms, it means that for every 4 pixels, you only have to have
color information once. Your eye can resolve the difference in light levels,
but not in colors.

This persists to this day; MPEG-type encoding is based on that. It's not the
redgreenblue, redgreenblue, redgreenblue, redgreenblue of a still
picture or a computer screen, it's the lightlevel, lightlevel,
lightlevel, lightlevel, colorforallfour encoding that was used by NTSC
and PAL.
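
As a rough illustration of that "brightness per pixel, colour shared across
a group" idea (a sketch only; real NTSC/PAL encoding and MPEG 4:2:0
subsampling differ in detail):

# Sketch: keep luma (Y) at full resolution, average chroma (Cb, Cr) over
# each 2x2 block, i.e. one colour sample shared by four pixels.
import numpy as np

def subsample_chroma(y, cb, cr):
    h, w = y.shape
    cb_sub = cb.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
    cr_sub = cr.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
    return y, cb_sub, cr_sub   # luma unchanged, chroma at a quarter the samples

The luma array keeps one value per pixel, while each chroma value covers four
pixels, which is the point being made above.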

In the end, IMHO, it's not the frame rates or color encoding methods at all,
as they were fixed around 1960 and not changed, but the display technology
as your brain perceives it.

Yes, I was not sure exactly why you were going into all of the colour
encoding issues in the context of LCD motion blur. This has nothing to do
with it. It is the display technology that is causing this. It is simply not
as good as other technologies in this respect, despite all of the efforts of
the manufacturers to make it otherwise ...


No matter what anyone says here, it's the combination of the exact
implementation of display technology and your brain that matters. If the
combination looks good, and you are comfortable watching it, a 25 fps CRT, a
100 fps LED screen, or even a 1000 fps display, if there was such a thing,
would look good if everything combined produces good images in YOUR brain,
and bad if some combination produces something "wrong".

But this isn't so. A crap picture may, I agree, look 'ok' to someone who
knows no better, but that doesn't alter the fact that it is still a crap
picture that those who *do* know better, will see for what it is. LCD panels
produce crap images in terms of motion blur, and when compared for this
effect to CRTs, plasma panels, and OLEDs.


If you think about it, the only 'real' difference between an LCD panel and
a plasma panel is the switching time of the individual elements. On the LCD
panel, this is relatively long, whereas on the plasma panel, it is short.
The LCD panel suffers from motion blur, but the plasma panel doesn't. Ergo,
it's the element switching time which causes this effect ... ??

There is more to it than that, too. An LCD is like a shutter: it pivots on
its axis and is either open or closed. Well, not really; there is a finite
time from closed (black) to open (lit), and therefore a build-up of
brightness.

I was talking in terms of the fundamental visual principle, in that they are
both matrixed cell-based displays requiring similar frame buffering and
driving techniques in signal terms. I was not referring to the way that each
technology actually produces coloured light from the individual cells, which
is clearly entirely different in both cases from the raster-based CRT
principle, which, like plasma panels, doesn't suffer from motion blur.


Plasma displays are gas discharge devices; they only glow when there is
enough voltage to "fire" them, until it drops below the level needed to
sustain the glow. That depends more upon the speed of the control electronics
than any (other) laws of physics, viscosity of the medium the crystals are
in, temperature, etc.


It doesn't really rely on the speed of the drive electronics, since there are
techniques used to bring the plasma cells to a 'pre-fire' condition just
below the point at which the gas actually ionises. This allows the cells to
be fired with a small drive voltage, and without having to wait for the cell
to build up to the point where it actually fires. This is how they can get
the switching speed of the cells down to as little as 1 µs.


That's the aim of LED-backlit TV screens (besides less power consumption,
heat, etc.). They are only lit when the crystals are "open", so there is no
time where you see partially lit "pixels".

Geoff.
Hmmm. That's not the way I've seen it described. Most of the hype about this
development seems to concentrate on producing dynamic contrast enhancement
by modulating the LEDs' brightness in an area-specific way, depending on the
picture content in front of them.

Arfa
 
William Sommerwerck wrote:
Well, the story I heard way back when is that it was to synchronise
the picture's vertical frequency with the mains frequency, so that
inadequacies in power smoothing produced static distortions in the
picture rather than much more noticeable rolling distortions.

That's what I heard, too. But that's not "interference effects from
electrical lights".

You are assuming that all interference would be on the screen itself and none
would be visual. Since fluorescent and to some extent incandescent lights
blink (what is the persistence of an incandescent light?) at 60 Hz, there is
a strobing effect if there are lights on in the room with the TV.

While some people (me) like to watch TV in the dark, many people watch TVs
with lights on. Some manufacturers went as far as to include light sensors
in their TV sets, automatically adjusting the brightness to compensate for
room lighting as it changes.

Since some people live in places where only fluorescent lights are allowed,
they have no choice if there is interference: either turn off the lights
entirely, or live with it.

I guess that could be a new tourism slogan for this summer, "Visit Israel,
and bring home real light bulbs." :)

Geoff.

--
Geoffrey S. Mendelson, Jerusalem, Israel gsm@mendelson.com N3OWJ/4X1GM
New word I coined 12/13/09, "Sub-Wikipedia" adj, describing knowledge or
understanding, as in he has a sub-wikipedia understanding of the situation.
i.e possessing less facts or information than can be found in the Wikipedia.
 
On Mon, 1 Mar 2010 01:17:03 -0000, "Arfa Daily"
<arfa.daily@ntlworld.com> wrote:

"Sylvia Else" <sylvia@not.at.this.address> wrote in message
news:006912a3$0$2891$c3e8da3@news.astraweb.com...
On 28/02/2010 12:12 AM, Arfa Daily wrote:
snip


"Sylvia Else"<sylvia@not.at.this.address> wrote in message
news:4b888aaa$0$32078$c3e8da3@news.astraweb.com...

Many years ago (using a Sinclair Spectrum, no less) I noticed an effect
whereby if a small character-sized square was moved across the screen in
character-sized steps, the eye perceived phantom squares at intervening
positions. Since the computer was not displaying these intermediate
squares, their appearance must have been due to the eye. The likely
explanation was that the eye was traversing the screen smoothly to follow
the square, but the square itself was moving in discrete steps. So the eye
was causing the image of the square to be smeared across the retina. I was
seeing this effect on a CRT screen, but the longer the persistence of the
image on the screen, the worse the effect would be. Interpolating the
position of the image on the screen would reduce that effect.

However, I can't explain why this would be less pronounced on a plasma
screen.



Because LCD cells are painfully slow at switching, which equates to a long
persistence phosphor on a CRT, which, as you say yourself, makes the effect
worse. Plasma cells are very fast, particularly now that 'pre-fire'
techniques are used to 'ready' them for switching. This is the equivalent of
having a very short persistence phosphor on a CRT. If you arrange for the
drive electronics to be able to deliver the cell drives with no more delay
than the cells themselves are contributing, then the result will be smooth
motion without any perceivable blur, which is pretty much how it was with a
standard domestic CRT-based CTV.

Arfa

It seems to me that the effect would be visible on any display that has
any degree of persistence. Even if LCDs switched instantaneously, they'd
still be displaying the image for the full frame time and then
instantaneously switching to the next image. This would produce the
smearing effect in the way I've described. To avoid it, one needs a
display that produces a short bright flash at, say, the beginning of the
display period, and remains dark for the rest of the time. As I understand
plasma displays, that's not how they work.

Sylvia.
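
Sylvia's point can be put into rough numbers: if the eye pans smoothly with a
moving object while the display holds each still image for a whole frame, the
held image slides across the retina by roughly the object's speed times the
hold time. A back-of-envelope sketch (figures purely illustrative):

# Sketch: smear from eye tracking on a sample-and-hold display.
def smear_pixels(speed_px_per_s, hold_time_s):
    return speed_px_per_s * hold_time_s

frame_hold_60hz = 1.0 / 60                    # ~16.7 ms full-frame hold
print(smear_pixels(600.0, frame_hold_60hz))   # ~10 px smear at 600 px/s
print(smear_pixels(600.0, 0.0003))            # ~0.2 px if lit for only ~300 us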


I think you are misunderstanding the principles involved here in producing
a picture perceived to have smooth, smear-free movement from a sequence of
still images. Any medium which does this needs to get the image in place as
quickly as possible, and for a time shorter than the period required to get
the next picture in place. This is true of a cinema picture, a CRT
television picture, an LCD television picture, or a plasma or OLED picture.
Making these still images into a perceived moving image has nothing to do
with the persistence of the phosphor, but is a function of retinal
retention, or 'persistence of vision'. Black and white TV CRTs used a
phosphor blend known as 'P4', and tricolour CRTs typically used 'P22'. Both
of these are designated as being short persistence types. The green and blue
phosphors used in a colour CRT have persistence times of typically less
than 100 µs, and the red around 200 - 300 µs.

The switching time of modern LCD cells is around 1 - 2 ms, and plasma cells
can switch in around 1 µs. This means that the plasma cell can be switched
very quickly, and then allowed to 'burn' for as long or short a period as
the designer of the TV decides is appropriate - typically, I would think, of
the same order of time as the persistence of a P22 phosphor, thus allowing
the plasma panel to closely match the fundamental display characteristics of
a typical P22 CRT.
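
Taking the figures quoted above purely as illustrations, the fraction of each
frame period for which the image is actually being painted is very different
between the technologies (a sketch, not measured data):

# Sketch: how long each display keeps the image lit per 20 ms frame (50 Hz),
# using the rough figures quoted above.
frame_period = 1.0 / 50                       # 20 ms

displays = {
    "CRT (P22 phosphor decay)": 300e-6,       # ~100-300 us persistence
    "Plasma (brief sustain burst)": 300e-6,   # ~1 us switching, short 'burn'
    "LCD (sample-and-hold)": frame_period,    # image held for the whole frame
}

for name, on_time in displays.items():
    print(name, round(100 * on_time / frame_period, 1), "% of the frame")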

A good description of why the slow switching time of LCD cells is still a
problem in terms of motion blur, and what the manufacturers do to try to
overcome this, can be found at

http://en.wikipedia.org/wiki/LCD_television#Response_time

Arfa
So it can be compared abstractly to progressive scanning in a CRT set?
 
On Mon, 1 Mar 2010 04:13:20 -0800, "William Sommerwerck"
<grizzledgeezer@comcast.net> wrote:

Actually, it's 16 frames a second. However, that rate is not fast enough to
prevent flicker -- which is why silent films were sometimes called
"flickers".

Or shortened to "Flicks" as is widely still used today.
 
On Mon, 1 Mar 2010 14:03:27 -0000, "Arfa Daily"
<arfa.daily@ntlworld.com> wrote:

But this isn't so. A crap picture may, I agree, look 'ok' to someone who
knows no better, but that doesn't alter the fact that it is still a crap
picture that those who *do* know better, will see for what it is. LCD panels
produce crap images in terms of motion blur, and when compared for this
effect to CRTs, plasma panels, and OLEDs.
I've seen plasma and OLED. I can see differences between those and
standard LCD panels, but not in my wildest dreams would I call them
crap. Most of my viewing is done in standard 480p 4:3 aspect cropped
to fill the screen. I don't need a high-dollar plasma set for that; it
would be overkill. I own a Sony 720i/1080i HDMI upscaling DVDR that
produces sharp, clear video. Once in a while I do notice a scan wave
because of the upscaling, but in the grand scheme of things those
things are very forgettable.
 
The 32" Vizio LCD in my den has a very wide viewing angle and does not show
significant smearing or blurring with rapid motion. (I paid about $380 for
it.)

With respect to scaling... People here and elsewhere have said they see no
point to Blu-ray disks, as they see little or no difference with upscaled
DVDs. Ergo, Blu-rays are a ripoff. I watched the Blu-ray of "The Sixth
Sense" yesterday, which threw this issue into sharp perspective.

The transfer is typical Disney -- extremely sharp and detailed, with rich
colors. It's close to demo quality.

Some of the supplemental material includes scenes from the Blu-ray transfer
that have been letterboxed into a 4:3 image. (Got that?) When I select ZOOM
on my Kuro, that section is blown up to full screen. ("The Sixth Sense" was
shot at 1.85:1.) Viewed in isolation, these images look fine.
They're slightly soft, but one might believe it's the fault of the source
material. They don't look upscaled -- until you compare them with
full-resolution Blu-ray. There is no comparison!
 
William Sommerwerck wrote:
The 50 and 60 fields per second (a field being half an interlaced frame)
were chosen not because they needed to be that fast (48 would have
done), but to eliminate interference effects from electrical lights.

That's new to me.

Well, the story I heard way back when is that it was to synchronise
the picture's vertical frequency with the mains frequency, so that
inadequacies in power smoothing produced static distortions in the
picture rather than much more noticeable rolling distortions.

That's what I heard, too. But that's not "interference effects from
electrical lights".

60 Hz was used in the US to prevent hum bars from rolling up or down
the screen due to the difference between the line & scan frequencies. A faint
bar would be hard to spot if it was not moving, but very annoying if it
did. People have to remember that the standards were set when
electronics was fairly new, and rather crude designs were used.
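
The hum-bar point is just a beat-frequency calculation: a brightness ripple at
the mains frequency drifts through the picture at the difference between the
mains and field rates, so locking the two leaves the bar standing still. For
example (numbers illustrative):

# Sketch: a hum bar rolls through the picture at |mains - field| Hz.
def hum_bar_roll_hz(mains_hz, field_hz):
    return abs(mains_hz - field_hz)

print(hum_bar_roll_hz(60.0, 60.0))    # 0.0  -> bar stands still (mains-locked)
print(hum_bar_roll_hz(60.0, 59.94))   # 0.06 -> bar drifts past every ~17 s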


--
Greed is the root of all eBay.
 
In article <4b8bb33b$0$14745$c3e8da3@news.astraweb.com>,
Sylvia Else <sylvia@not.at.this.address> wrote:
On 1/03/2010 11:13 PM, William Sommerwerck wrote:
Lots of tricks have been used over the years to take advantage
of the limitations of the "equipment" and the process. For example,
anything faster than 24 frames a second is not perceived as being
discrete images, but one smooth image.

Actually, it's 16 frames a second. However, that rate is not fast
enough to prevent flicker -- which is why silent films were sometimes
called "flickers". This is one of the reasons the frame rate was
increased to 24 with the introduction of sound.


The 50 and 60 fields per second (a field being half an interlaced
frame) were chosen not because they needed to be that fast (48 would
have done), but to eliminate interference effects from electrical
lights.

That's new to me.

Well, the story I heard way back when is that it was to synchronise the
picture's vertical frequency with the mains frequency, so that
inadequacies in power smoothing produced static distortions in the
picture rather than much more noticeable rolling distortions.

UK TV went off mains lock many, many years ago. Something like the early
'60s - before colour arrived here, when sets were still valve.

--
*Always drink upstream from the herd *

Dave Plowman dave@davenoise.co.uk London SW
To e-mail, change noise into sound.
 
"Arfa Daily" <arfa.daily@ntlworld.com> wrote in message
news:ROPin.69472$Mk3.66033@newsfe20.ams2...
"Geoffrey S. Mendelson" <gsm@cable.mendelson.com> wrote in message
news:slrnhon6oi.uvt.gsm@cable.mendelson.com...
That's the aim of LED backlit TV screens (besides less power consumption,
heat,
etc). They only are lit when the crystals are "open", so there is no time
where you see partially lit "pixels".

Geoff.

Hmmm. That's not the way I've seen it described. Most of the hype about
this development seems to concentrate on producing dynamic contrast
enhancement by modulating the LEDs' brightness in an area-specific way,
depending on the picture content in front of them.

Arfa
The way it was described to me is there are several hundred LEDs which are
each assigned specific "areas" of the screen. So it would seem that if you
have a bright AND dark area within an individual LED's jurisdiction, there
would be some sort of conflict. Unless, of course, such jurisdictions are
actually blended into the others. But they would still have to average their
brilliance. Either way, I could see how there would be a contrast
improvement across the screen as a whole, since more lights are always better
than ONE.
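
A crude sketch of that zoned-backlight idea (the zone size, levels and
compensation rule here are purely illustrative, not how any particular set
actually does it):

# Sketch: each LED "zone" is driven from the brightest content in its block,
# and the LCD pixel values are scaled up to compensate.
import numpy as np

def local_dimming(frame, zone=60):
    h, w = frame.shape
    backlight = np.zeros((h, w))
    for y in range(0, h, zone):
        for x in range(0, w, zone):
            block = frame[y:y + zone, x:x + zone]
            backlight[y:y + zone, x:x + zone] = block.max() / 255.0
    lcd = np.clip(frame / np.maximum(backlight, 1e-3), 0, 255)
    return backlight, lcd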
 
"Mark Zenier" <mzenier@eskimo.com> wrote in message
news:hmgpb8027j7@enews1.newsguy.com...
In article <PrNhn.74162$_W6.55448@newsfe30.ams2>,
Arfa Daily <arfa.daily@ntlworld.com> wrote:
You should also be aware that there are several 'resolutions' of screen and
drive to take into consideration. Almost all TV showrooms, both here and in
the US, tend to have the sets running on at least an HD picture, and often a
BluRay picture. This makes them look very good at first glance. Problem is
that in normal day to day use when you get it back home, you are going to be
watching standard resolution terrestrial broadcasts on it, and on many sets,
these look pretty dreadful, and it is the reason that so many people are
disappointed with their purchase when they get it home, and think that it is
not what they saw in the store.

At least in my area (Seattle), all the mainstream over the air stations
are HD, now. (Each station has its own transmitter, so they don't have
shared "multiplexes" like in the UK). The typical situation is an
HD subchannel with the main program and one SD subchannel with some
secondary service (sports, weather, old movies, old TV show, or an
SD copy of the main signal to feed analog cable).

There's a smaller number of stations that trade selection for resolution,
running four or five SD subchannels. Christian broadcasting and
speciality stuff like the secondary channels on the other stations.

The only way you'd get stuck with standard definition on the national
networks is to still be using analog cable. (I'm not familiar with
what you get with the two (subscription only) national direct broadcast
satellite providers).

Mark Zenier mzenier@eskimo.com
Googleproofaddress(account:mzenier provider:eskimo domain:com)
So what band are we talking here ? Are these UHF digital transmissions ? How
many OTA HD channels would you typically have available in any given area ?
Do you know what compression scheme they are using ?

The digital terrestrial TV being provided here in the UK currently
carries no HD content, despite ongoing promises. This is due to some extent
to the government reneging on a promise to make more of the UHF band
available to the broadcasters. Having now told them that they can't have any
more, and the broadcasters having already filled up what they have got
available with multiplexes carrying 'proper' channels and crap channels in a
ratio of about 1 to 5, the only option that they are now left with is to use
another different and non-standard variant of MPEG-4 compression.

The situation via direct broadcast satellite is much clearer. Here, they
have so much bandwidth available that they are able to carry many HD
channels, so this is where people here get their HD content from.
Unfortunately, the satellite operator charges us another tenner ($15) a
month for the privilege of receiving these transmissions ... :-(

Arfa
 
"Geoffrey S. Mendelson"
The 50 and 60 fields per second (a field being half an interlaced frame)
were chosen not because they needed to be that fast (48 would have done),
but to eliminate interference effects from electrical lights.

** But the lights concerned were those being used to illuminate the TV
studio.

When frame rates are not locked to the AC supply frequency, faint shadows
can be seen moving up or down studio images on a monitor or home TV set -
due to the twice per cycle dip in brightness of incandescent lamps.

Other fixes include using lamps with sufficient thermal inertia or groups of
lamps on different phases to eliminate the light modulation.



...... Phil
 
In article <td_in.163148$Dy7.138444@newsfe26.ams2>,
Arfa Daily <arfa.daily@ntlworld.com> wrote:
The digital terrestrial TV being provided here in the UK now, currently
carries no HD content, despite ongoing promises.
Not so. BBC HD is transmitted on FreeView as is ITV HD. CH4 and 5 will be
added shortly. This is from the London transmitter. Not sure about
everywhere.

--
*Who are these kids and why are they calling me Mom?

Dave Plowman dave@davenoise.co.uk London SW
To e-mail, change noise into sound.
 
In article <7v3f4mFldkU1@mid.individual.net>,
Phil Allison <phil_a@tpg.com.au> wrote:

"Geoffrey S. Mendelson"

The 50 and 60 fields per second (a field being half an interlaced
frame) were chosen not because they needed to be that fast (48 would
have done), but to eliminate interference effects from electrical
lights.

** But the lights concerned were those being used to illuminate the TV
studio.
Studio luminaires are commonly filament lamps, to allow easy control of
level, and because of their continuous-spectrum light output.

When frame rates are not locked to the AC supply frequency, faint
shadows can be seen moving up or down studio images on a monitor or
home TV set - due to the twice per cycle dip in brightness of
incandescent lamps.
In the UK TV hasn't been mains locked for about 40 years. I'd guess other
countries the same. The mains frequency varies too much for modern
requirements.

Other fixes include using lamps with sufficient thermal inertia or
groups of lamps on different phases to eliminate the light modulation.
Fluorescent types are used on location these days, but use high frequency
ballasts. HID types don't run at mains frequency either.

Only time I've seen a phased array used was for a boxing ring - before
high frequency ballasts became common.

--
*I pretend to work. - they pretend to pay me.

Dave Plowman dave@davenoise.co.uk London SW
To e-mail, change noise into sound.
 
"Dave Plowman (News)" <dave@davenoise.co.uk> wrote in message
news:50f1a111e3dave@davenoise.co.uk...
In article <td_in.163148$Dy7.138444@newsfe26.ams2>,
Arfa Daily <arfa.daily@ntlworld.com> wrote:
The digital terrestrial TV being provided here in the UK now, currently
carries no HD content, despite ongoing promises.

Not so. BBC HD is transmitted on FreeView as is ITV HD. CH4 and 5 will be
added shortly. This is from the London transmitter. Not sure about
everywhere.

--
*Who are these kids and why are they calling me Mom?

Dave Plowman dave@davenoise.co.uk London SW
To e-mail, change noise into sound.
None available on FV here in my East Midlands location. Just looked at my
"TV Times" (national) listings mag, and it claims that BBC HD is available
on Freesat CH 108, Sky CH 143 and Virgin Cable CH 108. Likewise, it says
that ITV HD is only available on Freesat via the 'red button' service. In
any case, BBC HD is hardly a useful service, as they just stick a mixture of
their total network output on there at random times. I was recording
"Survivors" on BBC HD via sat on series link. Suddenly, the series finale
has disappeared from the recording list. I check the schedules, and it's
just not on there. Some random Olympics programme or something. So I hastily
set it to record on SD BBC. Then, a couple of days later, it randomly
appears again on BBC HD at some obscure time when they had a slot to fill.
ITV HD, from what I've seen of it on the Freesat service, seems to be just
for football matches, once in a while. Either service is hardly inspiring
for people with HD TV sets and a built-in DTTV tuner, as most have.

So I would have to conclude that at the moment, the London area is possibly
unique in carrying these services. Just as a matter of interest, what
equipment is required to receive these FreeView HD transmissions, and has
the compression scheme now been finalised then, to allow manufacturers to
produce necessary equipment in bulk ?

Interesting that you say that CH5 is shortly going to be placing HD content
onto FreeView. At the moment, they have no HD output at all, and I would
have thought that if they were about to start, then the first places would
have been on the Sky satellite service, and Virgin cable, where there is an
existing customer base, with fully operational equipment to allow them to
access and view the service.

Channel Four I can understand wanting to provide a FreeView service as they
already produce an HD mirror of their SD service on Sky and Virgin.

Just as a matter of interest, do you know what cameras they use for
producing their HD content (or their programme makers / suppliers) ? Just
that their HD output is stunningly good compared to some other efforts by
other stations. And I'm talking original 'native' HD here, not just content
that was shot in standard res, and then placed on the station's HD channel.
Take, for instance, Phil and Kirsty's "Relocation, Relocation" (Wednesday
8pm) programme on CH4. The image quality is absolutely cracking, and
everything you would expect HD to be. Likewise, "Extreme Engineering" on
NatGeo I think it is, and "American Chopper" on Discovery. OTOH, "Lost" and
"24" from Sky 1 both claim to be 'originals' in HD format, but although they
look better in HD than they do in SD, they still seem to lack that
'pin-sharp' quality that the other programmes I've cited, have. As you are
'in the business' so to speak, just wondered if you had any insights into
this ?

Arfa
 
"Dave Plowman (Fucking Nut Case Pommy Cunt )

Studio luminaries are commonly filament lamps.
** DUUUUUHHHHHHHHHH !!!!!!!!!!

WRONG CONTEXT - you fucking STUPID MORON !



In the UK TV hasn't been mains locked for about 40 years.
** WRONG CONTEXT - you fucking STUPID MORON !


Fluorescent types are used on location these days,

** WRONG CONTEXT - you fucking STUPID MORON !


Only time I've seen a phased array used was for a boxing ring

** WRONG CONTEXT - you fucking STUPID MORON !

Someone PLEEEEASE go and SHOOT this imbecile through the head !!!



..... Phil
 
"Geoffrey S. Mendelson" <gsm@cable.mendelson.com> wrote in message
news:slrnhopt25.sj.gsm@cable.mendelson.com...
Arfa Daily wrote:

The digital terrestrial TV being provided here in the UK currently
carries no HD content, despite ongoing promises. This is due to some extent
to the government reneging on a promise to make more of the UHF band
available to the broadcasters. Having now told them that they can't have any
more, and the broadcasters having already filled up what they have got
available with multiplexes carrying 'proper' channels and crap channels in a
ratio of about 1 to 5, the only option that they are now left with is to use
another different and non-standard variant of MPEG-4 compression.

It's not nonstandard. MPEG4 is one of those "evolving standards", so that
they can sell you a decoder box or TV that supports the current variants
and next week turn around and sell you a new one.

Or if you have a computer, provide a firmware update.

It gets rid of the problem that CRT TVs had: they did not change fast enough
to get people buying new ones in a fast enough cycle to keep the companies
in business.

I have a spare TV that I bought in 1986 and AFAIK, it still works. We have
not yet switched to digital over the air here (Israel).

Speaking of MPEG4, Israel chose H.264 with AAC audio, a combination no one
had ever used before. The idea was to squeeze as many regular (520p 4:3)
channels as possible into one 8 MHz DVB-T channel.


The situation via direct broadcast satellite is much clearer. Here, they
have so much bandwidth available that they are able to carry many HD
channels, so this is where people here get their HD content from.
Unfortunately, the satellite operator charges us another tenner ($15) a
month for the privilege of receiving these transmissions ... :-(

Same here, but it's 40 NIS ($25).

BTW, where do those HDTV BBC programs come from? They are not over the
air?

Geoff.
--
BBC HD is currently available by direct broadcast satellite, and from the
Virgin cable service. I receive it via the former. It would seem that in a
few areas, it is now available via the FreeView DTTV service which is
replacing our current analogue service over the next couple of years.
However, although I receive FreeView from one of the 'main' national
transmitter sites, the FreeView HD service will not be available to me for
some long time yet, according to

http://www.radioandtelly.co.uk/freeviewhd.html

A different DTTV receiver is required, and it looks as though the only one
currently available is 180 quid ($270 ish). I can't see many people wanting
to hang yet another receiver on the end of their 'HD Ready' TV sets, for
that sort of money, and to receive just a few HD services. There is never
going to be the bandwidth available to put more on there, alongside the
other services.

I'm not sure what exactly you mean by "an evolving standard". That seems an
oxymoron if ever I heard one. Either it's a standard, or it's an evolving
system. It can't be both. The sat broadcasters have been using the same
transmission standards for years, and don't seem to suffer problems with
compatibility of receiving equipment. The DTTV service, OTOH, seems to be a
mish-mash compromise system, which has changed 'standards' and names several
times, in an effort to make it do what was, in truth, never going to be
practically possible ...

Arfa
 
 
I'm not sure what exactly you mean by "an evolving standard".
That seems an oxymoron if ever I heard one. Either it's a standard,
or it's an evolving system. It can't be both.
To the best of my understanding, all audio and video codecs carry with them
the information needed to correctly decode the transmission. This allows (for
example) DVDs and Blu-rays to use varying bitrates and different codecs. (If
this isn't right, please correct me.)
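
Those parameters do travel with the stream and can be read back from a
recording; for instance (a sketch assuming ffprobe is installed, with
"recording.ts" as a placeholder filename):

# Sketch: list the codec parameters carried in a recorded stream's headers.
import json, subprocess

out = subprocess.run(
    ["ffprobe", "-v", "error", "-print_format", "json",
     "-show_streams", "recording.ts"],
    capture_output=True, text=True, check=True,
).stdout

for s in json.loads(out)["streams"]:
    print(s.get("codec_type"), s.get("codec_name"), s.get("profile"),
          s.get("width"), s.get("height"))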
 
