Bit of a Con Really - Follow-up ...

On 24 May,
tony sayer <tony@bancom.co.uk> wrote:

I don't have the time to discuss this at length, but NTSC's unfortunate
reverse-acronym was the result of poor studio standards, and is not
inherent in the system. PAL incorporated phase alternation to partly
compensate for transmission problems (non-linear group delay) in Europe.


Wasn't something done to either the NTSC transmission spec or the sets that
largely alleviated that .. sometime after the original system started?..
Wasn't it the improved standards in receivers following the introduction of
solid state technology? The transistorised sets didn't drift as much.

--
B Thumbs
Change lycos to yahoo to reply
 
I don't have the time to discuss this at length, but NTSC's
unfortunate reverse-acronym was the result of poor studio
standards, and is not inherent in the system.

It is. Multipath effects caused unacceptable phase and color shifts.
This is like saying that the design of eggs is fundamentally flawed, because
if you drop them, they break.
 
"Dave Plowman (News)" <dave@davenoise.co.uk> wrote in message
news:5060acb8fbdave@davenoise.co.uk...
In article <gvcars$tjr$1@news.eternal-september.org>,
William Sommerwerck <grizzledgeezer@comcast.net> wrote:
I don't have the time to discuss this at length, but NTSC's unfortunate
reverse-acronym was the result of poor studio standards, and is not
inherent in the system. PAL incorporated phase alternation to partly
compensate for transmission problems (non-linear group delay) in Europe.

IIRC, nowt to do with studios, but the transmission process. Hence the
tint control on NTSC sets which is absent on PAL ones.
The implication of "never twice the same color" was that there was something
inherently unstable in the system.

The US had high-quality microwave transmission systems with excellent timing
and group delay characteristics. Europe did not.

To those in the US... When was the last time you adjusted the hue control on
your analog TV?



If I remember my BBC training correctly, NTSC gives theoretically better
'studio' pictures than PAL.
Yes, because it has wider chroma bandwidth. Other than that, they are
essentially the same system.
 
<me9@privacy.net> wrote in message news:5060ADB913%brian13434@lycos.co.uk...
On 24 May,
tony sayer <tony@bancom.co.uk> wrote:

I don't have the time to discuss this at length, but NTSC's unfortunate
reverse-acronym was the result of poor studio standards, and is not
inherent in the system. PAL incorporated phase alternation to partly
compensate for transmission problems (non-linear group delay) in Europe.


Wasn't something done to either the NTSC transmission spec or the sets
that largely alleviated that .. sometime after the original system started?..

Wasn't it the improved standards in receivers following the introduction
of solid state technology? The transistorised sets didn't drift as much.
No, tube sets were stable. Remember, the demodulator is locked to the burst
signal.
 
In article <gvcr87$l8n$1@news.eternal-september.org>, William
Sommerwerck <grizzledgeezer@comcast.net> scribeth thus
"Dave Plowman (News)" <dave@davenoise.co.uk> wrote in message
news:5060acb8fbdave@davenoise.co.uk...
In article <gvcars$tjr$1@news.eternal-september.org>,
William Sommerwerck <grizzledgeezer@comcast.net> wrote:
I don't have the time to discuss this at length, but NTSC's unfortunate
reverse-acronym was the result of poor studio standards, and is not
inherent in the system. PAL incorporated phase alternation to partly
compensate for transmission problems (non-linear group delay) in Europe.

IIRC, nowt to do with studios, but the transmission process. Hence the
tint control on NTSC sets which is absent on PAL ones.

The implication of "never twice the same color" was that there was something
inherently unstable in the system.

The US had high-quality microwave transmission systems with excellent timing
and group delay characteristics. Europe did not.
Are you referring to the studio to transmitter links?...



To those in the US... When was the last time you adjusted the hue control on
your analog TV?



If I remember my BBC training correctly, NTSC gives theoretically better
'studio' pictures than PAL.

Yes, because it has wider chroma bandwidth. Other than that, they are
essentially the same system.
--
Tony Sayer
 
Dave Plowman (News) wrote:
In article <gvcds3$oon$2@news.albasani.net>,
The Natural Philosopher <tnp@invalid.invalid> wrote:
But then different makes of transparencies give different results...

And transparencies are usually used for top quality magazine prints not
'projected onto a screen' anyway.

And are adjusted as part of the printing process.

Precisely.
 
William Sommerwerck wrote:
I don't have the time to discuss this at length, but NTSC's
unfortunate reverse-acronym was the result of poor studio
standards, and is not inherent in the system.

It is. Multipath effects caused unacceptable phase and color shifts.

This is like saying that the design of eggs is fundamentally flawed, because
if you drop them, they break.


It's more akin to saying that if you want to play handball with eggs,
don't do it on a concrete patio.
 
I don't have the time to discuss this at length, but NTSC's unfortunate
reverse-acronym was the result of poor studio standards, and is not
inherent in the system. PAL incorporated phase alternation to partly
compensate for transmission problems (non-linear group delay)
in Europe.

IIRC, nowt to do with studios, but the transmission process. Hence
the tint [sic] control on NTSC sets which is absent on PAL ones.

The implication of "never twice the same color" was that there was
something inherently unstable in the system.

The US had high-quality microwave transmission systems with
excellent timing and group delay characteristics. Europe did not.

Are you referring to the studio to transmitter links?...
No, when I say "poor studio standards", I'm talking about such things as the
failure to set up cameras correctly, keep a close eye on burst phase, etc,
etc, etc. Garbage in, garbage out.
 
And transparencies are usually used for top-quality
magazine prints not 'projected onto a screen' anyway.

And are adjusted as part of the printing process.

Precisely.
Many years ago I read about the work at National Geographic that was put
into making color separations and printing plates to produce extremely
high-quality images in the magazine. It was not simple.
 
"William Sommerwerck" <grizzledgeezer@comcast.net> wrote in message
news:gv6k6o$62e$1@news.eternal-september.org...
The LCD only filters light from the backlight. If you don't have a full
spectrum white in the first place then you can't expect decent colour.

Not so. All you have to do is hit the defined points in CIE diagram. The
Pioneer plasma sets hit them dead-on.
There is a little more to it than that, Bill. The points in the CIE diagram
are only part of what makes for a proper image. If you are referring to the
triangle of points on the colorimetry plot, you are only seeing the color of
the points, not the luminance. The resulting image is a matter of
saturation, hue, and luminance. You only see the first two with the CIE
chart that shows gamut.

The other aspect of getting the points on the colorimetry chart right is
that it tells you nothing about the colors in between and at different
levels. All it tells you is the color of the points you measure. How they
are mixed and create intermediate colors depends on the spectrum of each of
the primaries and the color decoding scheme in the display. The underlying
assumption in video is that we have a spectrum for each primary that is
similar to the CIE standard observer curves, which are approximations of how
we perceive color. When you deviate from those spectra you have to
compensate, or you will get intermediate colors that have too much or too
little energy in a particular primary. There are no standards for how this
is done, because there are so many variations in backlighting and filters in
the displays. There are not even good metrics for getting to the bottom of
the problem yet.
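
To make the "points versus spectra" distinction concrete, here is a rough
Python sketch of how an x,y chromaticity is just an integral of the spectrum
against the observer curves (my own illustration, with crude Gaussian
stand-ins for the real CIE 1931 curves), which is why a single point on the
chart says nothing about the spectrum behind it:

import numpy as np

wl = np.arange(380.0, 781.0, 5.0)   # wavelength grid in nm

def gauss(mu, sigma):
    return np.exp(-0.5 * ((wl - mu) / sigma) ** 2)

# Very crude Gaussian stand-ins for the CIE 1931 standard observer curves
x_bar = 1.06 * gauss(599, 38) + 0.36 * gauss(446, 19)
y_bar = 1.01 * gauss(556, 47)
z_bar = 1.78 * gauss(449, 22)

def xy_from_spd(spd):
    # Integrate the spectral power distribution against the observer curves
    X = np.trapz(spd * x_bar, wl)
    Y = np.trapz(spd * y_bar, wl)
    Z = np.trapz(spd * z_bar, wl)
    s = X + Y + Z
    return X / s, Y / s

narrow_red = gauss(630, 10)   # narrow-band LED-style red primary
broad_red  = gauss(615, 35)   # broad filtered-CCFL-style red primary
print(xy_from_spd(narrow_red))
print(xy_from_spd(broad_red))
# Two rather different 'red' spectra, each collapsing to a single x,y pair;
# the chart point alone cannot tell you how either will mix with the other
# two primaries once the decoder gets involved.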

The result is that the LED backlit displays can look very good, but
sometimes have slightly strange color reproduction.

As for the black level and contrast ratios, they are only an improvement to
the degree that they can control backlighting locally. As the number of
controlled areas increases, the useful contrast ratio in an actual image may
begin to approach the on/off numbers that they advertise, but with fewer
zones of control, those numbers are simply meaningless to real video.
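
For what it's worth, the zone-count point is easy to see with a toy
simulation (my own sketch, not any vendor's algorithm): drive each backlight
zone to the brightest pixel it has to serve and let the LCD attenuate the
rest, and the average residual black level only falls as the zones get
smaller.

import numpy as np

def average_black(image, zones, native_contrast=3000.0):
    """Average residual black level across the frame with zoned dimming."""
    h, w = image.shape
    zh, zw = h // zones, w // zones
    leak = np.zeros_like(image)
    for i in range(zones):
        for j in range(zones):
            block = image[i*zh:(i+1)*zh, j*zw:(j+1)*zw]
            # each zone's backlight is driven to the brightest pixel it serves
            leak[i*zh:(i+1)*zh, j*zw:(j+1)*zw] = block.max() / native_contrast
    return leak.mean()

frame = np.zeros((240, 240))
frame[10, 10] = 1.0                      # a single bright highlight
for z in (1, 4, 16, 80):                 # zones per axis
    print(z, "zones/axis -> mean black ~", average_black(frame, z))
# With one zone the whole frame sits at the panel's native leakage; only as
# the zone count rises does in-picture contrast approach the advertised
# on/off figure.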

Leonard
 
<meow2222@care2.com> wrote in message
news:2fecfc9d-7718-425b-94f4-ed50311ac00a@n4g2000vba.googlegroups.com...
William Sommerwerck wrote:
The LCD only filters light from the backlight. If you don't have a full
spectrum white in the first place then you can't expect decent colour.

Not so. All you have to do is hit the defined points in CIE diagram. The
Pioneer plasma sets hit them dead-on.

Indeed. None of the major display technologies deliver full spectrum,
nor do they need to.


NT

This is true only if you have custom LUTs or decoding algorithms for a
display based on the relationship between the spectra of the lighting and
the CIE standard observer functions that cameras are generally aligned to
approximate. The other thing that no one mentions is that trying to make up
for spectral shortcomings with different filters and decoding reduces the
efficiency of the lighting system.

There is usually a "rest of the story" beyond the naive assumptions that get
thrown around about reproducing color. This thread is full of examples.

Leonard
 
tony sayer wrote:
In article <gvcars$tjr$1@news.eternal-september.org>, William
Sommerwerck <grizzledgeezer@comcast.net> scribeth thus
That may be a different story because PAL TV sets never had them. NTSC
sets needed them because the phase of the color carrier wandered and
often shifted to the green, while PAL sets reset the phase each line and
therefore were always "correct".

NTSC does not, and never had, an inherent problem with phase stability.

I can't conclude anything, but I know 2 things:
1. NTSC is widely known as Never The Same Color twice
2. The PAL system includes measures to counter phase shift causing
colour issues, so I can only conclude that the system engineers
thought this was a problem with NTSC.

I don't have the time to discuss this at length, but NTSC's unfortunate
reverse-acronym was the result of poor studio standards, and is not inherent
in the system. PAL incorporated phase alternation to partly compensate for
transmission problems (non-linear group delay) in Europe.


Wasn't something done to either the NTSC transmission spec or the sets
that largely alleviated that .. sometime after the original system
started?..

VIR was introduced decades ago. It inserted reference signals into
the vertical interval, near the start of each field of video. That
allowed automatic adjustment of the equipment, and eliminated the video
gain, black level, chroma gain, and phase controls that each operator
could adjust, to 'their' preference. NTSC wasn't the problem, it was
that everyone along the signal path could play with it. A system that
had VIR from the cameras to the transmitter had no problems. Of course,
that doesn't stop opinionated people from bashing a system they don't
understand.
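
A crude sketch of the idea (my own simplification in Python, not the actual
broadcast hardware): the VIR line carries chroma at a known amplitude and
phase, so whatever gain and phase error the path has introduced can be
measured on that line and undone on the programme video.

import cmath

REFERENCE_PHASE_DEG = 90.0     # nominal chroma phase on the VIR line
REFERENCE_AMPLITUDE = 0.40     # nominal chroma amplitude (arbitrary units)

def correction_from_vir(measured_amplitude, measured_phase_deg):
    """Return a complex factor that restores the nominal VIR chroma."""
    measured = cmath.rect(measured_amplitude,
                          cmath.pi * measured_phase_deg / 180.0)
    nominal = cmath.rect(REFERENCE_AMPLITUDE,
                         cmath.pi * REFERENCE_PHASE_DEG / 180.0)
    return nominal / measured

def correct_chroma(chroma_samples, factor):
    """Apply the same gain/phase correction to the demodulated chroma."""
    return [c * factor for c in chroma_samples]

# Suppose the path introduced a 10 degree phase error and a 5% gain loss:
k = correction_from_vir(0.38, 100.0)
print(abs(k), cmath.phase(k) * 180.0 / cmath.pi)   # ~1.05 gain, ~-10 degrees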


--
You can't have a sense of humor, if you have no sense!
 
meow2222@care2.com wrote:
William Sommerwerck wrote:

That may be a different story because PAL TV sets never had them. NTSC
sets needed them because the phase of the color carrier wandered and
often shifted to the green, while PAL sets reset the phase each line and
therefore were always "correct".

NTSC does not, and never had, an inherent problem with phase stability.

I can't conclude anything, but I know 2 things:
1. NTSC is widely known as Never The Same Color twice
Only by idiots.

2. The PAL system includes measures to counter phase shift causing
colour issues, so I can only conclude that the system engineers
thought this was a problem with NTSC.

And fwiw, IIUC PAL rendered colours are designed to alternate the
error from line to line rather than get each line's colour correct, so
like many such measures it usually solves the problem, but not always.

I have yet to be impressed by an LCD/PLASMA TV. Every single one of them
I have seen is oversaturated and too bright.

isn't that just an adjustment thing? And yes, I agree many won't go dim
enough, but some do.

NT

--
You can't have a sense of humor, if you have no sense!
 
"Arfa Daily" <arfa.daily@ntlworld.com> wrote in message
news:guPRl.35856$TE3.29513@newsfe13.ams2...
<meow2222@care2.com> wrote in message
news:1ec4979e-c73a-4dc0-88a4-2bb1ba1d7a17@s12g2000yqi.googlegroups.com...
Arfa Daily wrote:

The LCD only filters light from the backlight. If you don't have a full
spectrum white in the first place then you can't expect decent colour.
White LEDs aren't quite there yet are they?

Archie


Absolutely true, except that this particular TV doesn't use white LEDs in
its 'revolutionary' backlighting scheme. It uses small RGB arrays, which is
why I was questioning whether there was any control over the individual
elements in each array, such that the colour temperature of the nominally
white light that they produce, could be varied. Which would then, of course,
have a corresponding effect on the displayed colour balance. It just seemed
to me that given they have gone to the trouble of using RGB arrays, rather
than white LEDs, the reason for that might have been to get a full(er)
spectrum white.

Arfa

colour temp can be controlled using the LEDs or the LCD, I'm not sure
it makes any big difference which one.

RGB LEDs would give the same white as a triphosphor & UV white LED, but
with more colour control. The standard 2 colour white LED would be
useless on a 3 channel display. And fwiw bichromic white LEDs have
huge colour balance variation, way outside of what's acceptable for a
display.


NT

Which is why, given that they've put these LEDs under at least some kind
of control in order to implement their (claimed) enhanced black
reproduction scheme, I was questioning whether the scheme maybe
allowed for a degree of user intervention under the guise of "tint" or
whatever, and which might have accounted for why on this particular TV -
the only example that I've seen on and working so far - the flesh tones
were so poor compared to Pan and Sony offerings in the same display stack,
showing the same picture. I'm trying to get a handle on why a company with
the products and reputation of Sammy is a) using advertising terminology
that appears to be questionable in the context that it appears, and b)
producing a set, claiming it to be the dog's bollocks of display
technology, which does not appear - to my eye at least - to be as good as
their traditionally CCFL backlit offerings, or those of other
manufacturers.

I saw the latest all singing and dancing LCD HD Pan, just released, in my
friend's shop yesterday. Uses conventional CCFL backlighting. Not as thin
as the Sammy, but getting there. Apart from the usual slight gripes that
you could direct at any LCD panel when examined closely, the picture was
quite stunning, and the colour rendition was as close to 'perfect' as you
could reasonably expect. Certainly, flesh tones *appeared* accurate, but I
accept that is subjective. Anyway, whichever-whatever, more accurate than
they appeared on the LED backlit Sammy ...

Arfa


The why is pretty clear. Samsung is a whore, like all of the other vendors,
only a little more so than some others. They are interested in market share
and will create whatever hype they think will help them sell sets. The
degree to which it is actually better only has to matter up to the point
that too many people figure it out and it hurts sales. As we get better at
quantifying why the sets look a little weird on certain colors they will
improve the spectrum of the backlighting and improve the color decoding to
compensate. It won't happen if people keep spewing the nonsense that all
you have to do is hit the primary and secondary colorimetry targets to get
perfect color. That is just a starting point, and for some sets that do not
have proper color decoding or gamut, may actually be the wrong compromise.

Leonard
 
tony sayer wrote:
In article <gvcr87$l8n$1@news.eternal-september.org>, William
Sommerwerck <grizzledgeezer@comcast.net> scribeth thus
"Dave Plowman (News)" <dave@davenoise.co.uk> wrote in message
news:5060acb8fbdave@davenoise.co.uk...
In article <gvcars$tjr$1@news.eternal-september.org>,
William Sommerwerck <grizzledgeezer@comcast.net> wrote:
I don't have the time to discuss this at length, but NTSC's unfortunate
reverse-acronym was the result of poor studio standards, and is not
inherent in the system. PAL incorporated phase alternation to partly
compensate for transmission problems (non-linear group delay) in Europe.

IIRC, nowt to do with studios, but the transmission process. Hence the
tint control on NTSC sets which is absent on PAL ones.

The implication of "never twice the same color" was that there was something
inherently unstable in the system.

The US had high-quality microwave transmission systems with excellent timing
and group delay characteristics. Europe did not.

Are you referring to the studio to transmitter links?...

The cross-country network feeds, which were owned & operated by AT&T.
Those were replaced by C & Ku band satellite feeds in the '80s. Some TV
stations now feed CATV headends via fiber optics. They maintain the off-air
equipment as a backup, in case of a failure in the F-O path.

I was a TV Broadcast Engineer in the '70s - '90s in the US.

--
You can't have a sense of humor, if you have no sense!
 
"William Sommerwerck" <grizzledgeezer@comcast.net> wrote in message
news:gv7qup$af6$1@news.eternal-september.org...
The LCD only filters light from the backlight. If you don't have a full
spectrum white in the first place then you can't expect decent colour.
White LEDs aren't quite there yet are they?

Absolutely true, except that this particular TV doesn't use white LEDs in
its 'revolutionary' backlighting scheme. It uses small RGB arrays, which is
why I was questioning whether there was any control over the individual
elements in each array, such that the colour temperature of the nominally
white light that they produce, could be varied. Which would then, of course,
have a corresponding effect on the displayed colour balance. It just seemed
to me that given they have gone to the trouble of using RGB arrays, rather
than white LEDs, the reason for that might have been to get a full(er)
spectrum white.

In a very broad sense, the last thing you want is a "full-spectrum" light.
The standard primaries are diluted with too much white as it is.

In a very broad sense you are correct, but in terms of understanding what is
going on with color reproduction in LCD displays ( and others) you are
making a point that is the equivalent of trying to make D65 with an
incandescent lamp.

White is a rather useless term. All "white" has a color and is a mix of
other colors.

Primaries do not get diluted with white. They get desaturated by adding the
other colors. What you want is a spectrum that is correct, not "white," nor
"full spectrum," nor narrow band RGB. Correct depends upon the assumptions
that are made in recording the image, as well as upon the filters and color
decoding that you implement in the display.
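
To put a number on "the colour decoding has to change with the primaries",
here is the standard textbook construction (generic colorimetry, not the
internals of any set in this thread): derive the RGB-to-XYZ matrix from the
display's primary chromaticities and its white point. Swap in different
primaries from a different backlight and filter stack and the matrix, i.e.
the decoding, necessarily changes with them.

import numpy as np

def rgb_to_xyz_matrix(xy_r, xy_g, xy_b, xy_white):
    """Build the RGB->XYZ matrix for given primary and white chromaticities."""
    def to_xyz(xy):
        x, y = xy
        return np.array([x / y, 1.0, (1.0 - x - y) / y])
    P = np.column_stack([to_xyz(xy_r), to_xyz(xy_g), to_xyz(xy_b)])
    W = to_xyz(xy_white)
    scale = np.linalg.solve(P, W)   # per-channel scaling so RGB=(1,1,1) hits white
    return P * scale                # scale each primary column

# Rec.709 primaries with a D65 white, purely for illustration:
M = rgb_to_xyz_matrix((0.640, 0.330), (0.300, 0.600),
                      (0.150, 0.060), (0.3127, 0.3290))
print(M)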

As I have said many times, the underlying assumption that video has used is
that RGB spectral densities should follow the standard observer curves.
When you violate that assumption on the display end, you get some unusual
results with some colors and you have to compensate in your color decoder.
The degree to which they do, and the techniques used, are unclear in these
sets. The results are mixed. Given the sloppy nature of color decoding and
color management in consumer displays in general over the years, however,
these sets are likely to be perfectly acceptable to most consumers. They are
likely better than much of what they have been viewing in the past by quite
a margin. That does not mean that they are going to accurately reproduce
color. Most consumers and most of the posters here would not likely know
(and may not prefer) accurate color reproduction in a display if they were
to happen to see it.

Leonard
 
In article <gve0go$q22$1@news.eternal-september.org>,
William Sommerwerck <grizzledgeezer@comcast.net> wrote:
No, when I say "poor studio standards", I'm talking about such things as
the failure to set up cameras correct, keep a close eye on burst phase,
etc, etc, etc. Garbage in, garbage out.
I'd be most surprised if any but the very smallest station with only one
camera made mistakes like this.

--
*Constipated People Don't Give A Crap*

Dave Plowman dave@davenoise.co.uk London SW
To e-mail, change noise into sound.
 
"William Sommerwerck" <grizzledgeezer@comcast.net> wrote in message
news:gv8lqs$626$1@news.eternal-september.org...
I guess it comes down to definitions and how 'full spectrum' is perceived.
Rightly or wrongly, I tend to think of it as a spectrum which contains the
same component colours in the same ratios, as natural daylight...

That's a reasonable definition for a video display, but it's not sufficient
for source lighting. It's difficult to make a "full spectrum" fluorescent
lamp, especially one that produces good color rendition for photography.


but I guess even that varies depending on filtering effects of cloud
cover and haze and so on. Even so, I'm sure that there must be some
definition of 'average spectrum daylight', and I would expect that any
display technology would aim to reproduce any colour in as closely
exact a way as it would appear if viewed directly under daylight.

The standard is D6500, a 6500K continuous spectrum from a black-body source.
What you suggest is, indeed, the intent.

There is no such standard as D6500. One standard, the one used for most
video for the color of white, is D65. D65 specifies NOTHING about the
spectrum, only the x,y coordinates of the COLOR of light. It happens to be
approximately 6504K. The term D6500 is slang and sloppy use that confuses
the issues of colorimetry and correlated color temperature.

There are other standards for the color of white that are used for purposes
other than video. Some specific purposes in film and cinema, as well as in
video use other standards, but for the most part D65 is accepted as the
color of white for modern video. The truth is that virtually no consumer
displays come out of the box set anywhere near D65, nor producing correct
color for any color, including white. What you see in showrooms and when
you take a set out of the box is likely a color temp for white that is
nearly twice what it should be.
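
For anyone who wants to check the "approximately 6504K" figure, McCamy's
well-known approximation recovers a correlated colour temperature from the
x,y coordinates alone (a quick sketch, nothing more), which is consistent
with the point that D65 is a pair of chromaticity coordinates, not a
spectrum:

def mccamy_cct(x, y):
    """McCamy's approximation: correlated colour temperature from CIE x,y."""
    n = (x - 0.3320) / (0.1858 - y)
    return 449.0 * n**3 + 3525.0 * n**2 + 6823.3 * n + 5520.33

print(mccamy_cct(0.3127, 0.3290))   # D65 -> roughly 6500 K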

Leonard
 
"William Sommerwerck" <grizzledgeezer@comcast.net> wrote in message
news:gv9m1j$2ib$1@news.eternal-september.org...
Every LCD TV I have seen has colour temp adjustments.

What, readily user accessible?

It depends on what you define as a color temperature adjustment. Many (if
not most) sets do not have the detailed adjustments that make possible both
correct color temperature and good grayscale tracking. When they do, these
are not usually available to the customer.

You have not looked at many modern displays carefully. Many, actually most
of the better sets, have these controls in the user menus now. Some even
have color management that goes far beyond gray scale and let you adjust the
colorimetry (and perhaps luma) of the primaries and secondaries. Most
professional calibrations these days never involve going into a service
menu.
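
In practice those user-menu controls boil down to something like a two-point
(or multi-point) white-balance loop. A rough sketch of the idea follows; the
measure_xyY probe function is hypothetical, and real calibration software
uses proper correction matrices rather than these crude proportional nudges.

TARGET_XY = (0.3127, 0.3290)   # D65 white point

def adjust(controls, measured_xy, stimulus):
    """Nudge the red/blue controls for one stimulus level toward the target."""
    dx = TARGET_XY[0] - measured_xy[0]
    dy = TARGET_XY[1] - measured_xy[1]
    key = "gain" if stimulus >= 70 else "offset"   # gains near white, offsets near black
    controls["red_" + key] += 50 * dx              # crude proportional steps
    controls["blue_" + key] -= 50 * dy
    return controls

def calibrate(controls, measure_xyY, passes=5):
    """Alternate high and low grey windows until the ramp tracks the target."""
    for _ in range(passes):
        for stimulus in (80, 20):                  # percent of full white
            x, y, _ = measure_xyY(stimulus, controls)   # hypothetical probe read
            controls = adjust(controls, (x, y), stimulus)
    return controls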

Leonard
 
<meow2222@care2.com> wrote in message
news:5d37b31f-e42c-4ba4-9bfc-6e8a4e03ee11@m17g2000vbi.googlegroups.com...
Arfa Daily wrote:
<meow2222@care2.com> wrote in message
news:1e56875d-3af4-4041-832c-c511a21147dc@n8g2000vbb.googlegroups.com...
William Sommerwerck wrote:
I guess it comes down to definitions and how 'full spectrum' is perceived.
Rightly or wrongly, I tend to think of it as a spectrum which contains the
same component colours in the same ratios, as natural daylight...

That's a reasonable definition for a video display, but it's not sufficient
for source lighting. It's difficult to make a "full spectrum" fluorescent
lamp, especially one that produces good color rendition for photography.


but I guess even that varies depending on filtering effects of cloud
cover and haze and so on. Even so, I'm sure that there must be some
definition of 'average spectrum daylight', and I would expect that any
display technology would aim to reproduce any colour in as closely
exact a way as it would appear if viewed directly under daylight.

The standard is D6500, a 6500K continuous spectrum from a black-body source.
What you suggest is, indeed, the intent.


TBH I think this is overplaying the significance of daylight. Almost
any monitor is adjustable to suit preferences of anything from 5000K
to 10,000K, and some go lower. None make any attempt to copy the
colour spectrum of daylight, they merely include the same colour temp
as daylight as one of the options. None of the major display types
have any ability to copy a daylight spectrum, as they're only RGB
displays.


NT

But take account of the fact that we're talking domestic television sets
here, not computer monitors. For the most part, TV sets do not display the
same type of content as a computer monitor, and do not include user
accessible colour temperature presets or adjustments,

fwiw my main set does, and I'm sure it's not unique. Generally though a
TV is a much lower quality animal than a monitor, and displays much
lower quality data.


which is why I made the point earlier that in general, LCD TVs are set
correctly 'out of the box'.

because they can be. CRTs are more variable, and the circuits used to
drive them a lot less precise, partly because CRT sets are generally
older, and the sort of standards expected in monitors have only begun
crossing over to TVs in recent years.


As far as overplaying the significance of daylight goes, I'm not sure that I
follow what you mean by that. If I look at my garden, and anything or
anybody in it, the illumination source will be daylight, and the colours
perceived will be directly influenced by that. If I then reproduce that
image on any kind of artificial display, and use a different reference for
the white, then no other colour will be correct either,

what makes you think that just one specific colour temp is 'correct'?
Real daylight is all over the place colour temp wise, and the end user
experiences those changes without any problem. Also any self
respecting monitor offers a range of colour temps, since it's nothing
but a matter of taste


which was ever the
case when CRTs were set up to give whites which were either too warm or too
cold, even by a fraction.

but that's down to historic reasons, customers never expected precise
colour temp, and screens were routinely set up by eye. The circuits
involved couldn't set themselves up the way a modern LCD set can, there
was normally no feedback on colour channels, just open loop CRT gun
drive on top of a massive DC offset, so the systems were inherently
variable. Plus the fact that CRT gamma was often way off from the real
world made it hard, or should I say impossible, to set such sets to
give a faithful reproduction in other respects anyway.


Maybe we're talking at cross purposes here, or I'm
not understanding something properly, but it seems to me that the colour
temperature and CRI of the backlighting on an LCD TV, would be crucially
important to correct reproduction of colours.

It has almost nothing to do with it, because the level of each colour
channel output on the screen depends on both the light source and the
settings of the LCD R,G,B channels. Within reason, any temperature
colour backlight can produce any temperature colour picture.


All I know is that the flesh tones were poor on the example that I saw,
compared to other LCD TVs which were showing the same picture. The
fundamental difference between those sets and the Sammy was the CCFL vs LED
backlighting, so it seems reasonable to draw from that the inference that
the backlighting scheme may well be the cause, no?

Arfa

It's just a guess. In fact any desired flesh tone can be reproduced
using almost any colour temp backlight, certainly anything from 3,000K
to 10,000K. Think about the process, you've got 3 colour channels,
each of which has a given level of light from the backlight, which is
then attenuated to any desired degree by the LCD pixel.


NT

While this is true, it would be virtually impossible to get all colors right
with some arbitrary color backlight. You could get a subset right and get
all the others completely wrong.
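
A toy illustration of both halves of that (my own numbers, not anything
measured on the sets discussed here): the LCD can only attenuate each
channel, so a skewed backlight can still be balanced to a target white, but
only by throwing light away, and any colour that needs more of a channel
than the backlight supplies at that pixel simply cannot be produced.

def balance_to_white(backlight_rgb, target_rgb=(1.0, 1.0, 1.0)):
    """Per-channel LCD transmittance (<= 1) giving the target white ratio."""
    # Scale so the scarcest channel runs wide open; the rest are throttled.
    ratios = [t / b for t, b in zip(target_rgb, backlight_rgb)]
    k = 1.0 / max(ratios)
    return [min(1.0, k * r) for r in ratios]

cool_backlight = (0.8, 1.0, 1.4)     # bluish 'white' from the LED array
print(balance_to_white(cool_backlight))
# -> roughly [1.0, 0.8, 0.57]: white is achievable, but red sets the ceiling
# on peak brightness, and the correction costs green and blue output.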

Leonard
 
