Bit of a Con Really - Follow-up ...

There is usually a "rest of the story" beyond the naive assumptions
that get thrown around about reproducing color. This thread is full of
examples.
This is not a simple subject. I have Mees' "The Reproduction of Color"
(which is, what, 40+ years old?) and it's tough sledding. I had less trouble
with integral calculus.
 
"Dave Plowman (News)" <dave@davenoise.co.uk> wrote in message
news:5060ff694ddave@davenoise.co.uk...
In article <gve0go$q22$1@news.eternal-september.org>,
William Sommerwerck <grizzledgeezer@comcast.net> wrote:
No, when I say "poor studio standards", I'm talking about such things as
the failure to set up cameras correctly, keep a close eye on burst phase,
etc, etc, etc. Garbage in, garbage out.

I'd be most surprised if any but the very smallest station with only one
camera made mistakes like this.

--
*Constipated People Don't Give A Crap*

Dave Plowman dave@davenoise.co.uk London SW
To e-mail, change noise into sound.

You would be very surprised.

Leonard
 
Primaries do not get diluted with white.
What I was implying was that you could reproduce a wider range of colors if
the primaries weren't as close to the center of the chart. Radial movement
represents changes in saturation -- dilution with white.
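Put another way: on the chart, the radial distance from the white point can
be read as a rough saturation number, 0 at white and 1 at the primary. A
minimal sketch in Python (Rec.709 red and D65 white are the standard values;
the straight-line ratio is a simplification of true CIE excitation purity):

import math

D65 = (0.3127, 0.3290)      # CIE xy white point
RED_709 = (0.640, 0.330)    # Rec.709 red primary

def radial_saturation(sample, primary, white=D65):
    """Fraction of the way from white out to the primary (1.0 = fully saturated)."""
    return math.dist(white, sample) / math.dist(white, primary)

# Moving a colour radially inward toward the white point lowers the
# value: dilution with white, exactly as described above.
print(round(radial_saturation((0.48, 0.33), RED_709), 2))   # ~0.51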

As for not knowing accurate color reproduction when you see it... What sorts
of preferences does the average viewer have? If you don't have the original
for comparison, it can be difficult to judge.

I never considered myself an expert in color reproduction, but your comments
have encouraged me to dig out Mees and give him another try. (I'm not
promising anything.)
 
You have not looked at many modern displays carefully. Many, actually
most of the better sets, have these controls in the user menus now.
My Pioneer does, but heck if I'm touching them without instrumentation.


Some even have color management that goes far beyond gray scale and
let you adjust the colorimetry (and perhaps luma) of the primaries and
secondaries.
The Pioneer has six adjustments, for RGB and CMY.
 
"Dave Plowman (News)" <dave@davenoise.co.uk> wrote in message
news:5060ff694ddave@davenoise.co.uk...
In article <gve0go$q22$1@news.eternal-september.org>,
William Sommerwerck <grizzledgeezer@comcast.net> wrote:

No, when I say "poor studio standards", I'm talking about such things as
the failure to set up cameras correctly, keep a close eye on burst phase,
etc, etc, etc. Garbage in, garbage out.

I'd be most surprised if any but the very smallest station with only one
camera made mistakes like this.
If you look at the first season of "Barney Miller", you'll see poor camera
convergence, and slight color shifts between the cameras. And this was in
the 1970s, and at ABC's studios.
 
"William Sommerwerck" <grizzledgeezer@comcast.net> wrote in message
news:gve8eh$r7u$1@news.eternal-september.org...
Primaries do not get diluted with white.

What I was implying was that you could reproduce a wider range of colors if
the primaries weren't as close to the center of the chart. Radial movement
represents changes in saturation -- dilution with white.

As for not knowing accurate color reproduction when you see it... What sorts
of preferences does the average viewer have? If you don't have the original
for comparison, it can be difficult to judge.

I never considered myself an expert in color reproduction, but your comments
have encouraged me to dig out Mees and give him another try. (I'm not
promising anything.)


Most modern consumers have been conditioned to higher and higher color temps
for white and oversaturated color over the last thirty years or so.
Manufacturers realized years ago that in the first few seconds of viewing,
where most impressions are made in showrooms, the impression is dominated by
contrast and color saturation. This has nothing to do with perceiving color
naturally, but everything to do with marketing and competing with a wall of
other sets. It is not uncommon for displays to be sold with factory
settings that have color temps in the 13000K range, completely crushed
blacks and whites, and far too saturated color. Many consumers like this
more VIVID look. Others prefer to see a more accurate reproduction of the
product as it was produced, and a more realistic portrayal of color. This
requires substantial changes from OOB settings for most consumer displays,
at least in the USA.

Leonard
 
"William Sommerwerck" <grizzledgeezer@comcast.net> wrote in message
news:gve8ej$r7u$2@news.eternal-september.org...
You have not looked at many modern displays carefully. Many, actually
most of the better sets, have these controls in the user menus now.

My Pioneer does, but heck if I'm touching them without instrumentation.


Some even have color management that goes far beyond gray scale and
let you adjust the colorimetry (and perhaps luma) of the primaries and
secondaries.

The Pioneer has six adjustments, for RGB and CMY.

I would agree. Most consumers are unlikely to be able to do more than make
a mess. Even someone like myself, having calibrated displays for 30
years, can't do much to align a color management system without a GOOD
meter. I can get gray scale improved, but not really accurate.

Most sets now have RGB gains and cuts for gray scale in the user menu. Some
have far more available.
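For anyone following along: the gains scale each channel and mostly set the
bright end of the gray scale, while the cuts are offsets that mostly set
the dark end. A minimal two-point model in Python (the numbers are
illustrative, not taken from any real set):

def channel_out(video_in, gain, cut):
    """video_in in 0..1; gain scales the signal, cut shifts it."""
    return min(max(gain * video_in + cut, 0.0), 1.0)

# Taming a set that runs too blue: trim the blue gain, leave the cuts
# alone, then re-check both a dark and a bright gray window.
for level in (0.1, 0.5, 1.0):                    # near-black, mid, white
    r = channel_out(level, gain=1.00, cut=0.00)
    g = channel_out(level, gain=1.00, cut=0.00)
    b = channel_out(level, gain=0.93, cut=0.00)
    print(f"{level:.1f}: R={r:.3f} G={g:.3f} B={b:.3f}")

A gain change barely moves the near-black reading but pulls white strongly,
which is why gray-scale alignment alternates between the two controls.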

Leonard
 
William Sommerwerck wrote:
Many years ago I read about the work at National Geographic that was put
into making color separations and printing plates to produce extremely
high-quality images in the magazine. It was not simple.
At that time they only accepted Kodachrome slides.

Geoff.


--
Geoffrey S. Mendelson, Jerusalem, Israel gsm@mendelson.com N3OWJ/4X1GM
 
Most modern consumers have been conditioned to higher and higher color temps
for white and oversaturated color over the last thirty years or so.
Manufacturers realized years ago that in the first few seconds of viewing,
where most impressions are made in showrooms, the impression is dominated by
contrast and color saturation. This has nothing to do with perceiving color
naturally, but everything to do with marketing and competing with a wall of
other sets. It is not uncommon for displays to be sold with factory
settings that have color temps in the 13000K range, completely crushed
blacks and whites, and far too saturated color. Many consumers like this
more VIVID look. Others prefer to see a more accurate reproduction of the
product as it was produced, and a more realistic portrayal of color. This
requires substantial changes from OOB settings for most consumer displays,
at least in the USA.
You will be pleased to hear that my Pioneer is set to PURE, with all the
controls at their default settings (except for a bit of Sharpness goosing).
The image is just plain gaw-juss.

I considered having a $350 calibration performed, but decided that I wasn't
going to pay that much for a technician who knows even less about
colorimetry than I do. The Pioneers are supposedly nearly correct
out of the box.

If you want a demo disk, get the Blu-ray of "The Searchers". I don't care
for the movie, but the VistaVision photography is jaw-dropping. "Amadeus"
and "2001" are almost as good. With the best material, you sometimes think
you're looking through a sheet of glass at the thing itself.
 
"Geoffrey S. Mendelson" <gsm@mendelson.com> wrote in message
news:slrnh1la3l.d4s.gsm@cable.mendelson.com...
William Sommerwerck wrote:

If you look at the first season of "Barney Miller", you'll see poor
camera
convergence, and slight color shifts between the cameras. And this was in
the 1970s, and at ABC's studios.

Look at the early shows of Star Trek: The Next Generation. They were
lit and photographed as if they were films. There are many scenes where
there is action in the shadows.

You would have seen what was happening if you were watching it on film,
on TV it was just a grayish blur.

If I remember correctly, they were shot on film.
I don't understand what you're talking about. "Barney Miller" was videotape,
"ST TNG" was film.

Regardless of whether tape or film is used, the cinematographer is likely to
light the scenes according to what he thinks the average TV is able to
reproduce.
 
Most consumers are unlikely to be able to do more than make a mess.
Even someone like myself, having calibrated displays
for 30 years, can't do much to align a color management system
without a GOOD meter. I can get gray scale improved, but not really
accurate.
Does anyone make cheap-but-good instrumentation? I could justify a $500
investment.

(I can hear you laughing now.)
 
In article <jvxSl.69962$9w4.39673@newsfe08.iad>,
Leonard Caillouet <nospam@noway.com> wrote:
No, when I say "poor studio standards", I'm talking about such things
as the failure to set up cameras correctly, keep a close eye on burst
phase, etc, etc, etc. Garbage in, garbage out.

I'd be most surprised if any but the very smallest station with only
one camera made mistakes like this.

You would be very surprised.
Perhaps standards are higher in the UK, then.

--
*Laugh alone and the world thinks you're an idiot.

Dave Plowman dave@davenoise.co.uk London SW
To e-mail, change noise into sound.
 
"William Sommerwerck" <grizzledgeezer@comcast.net> wrote in message
news:gve9d0$48t$1@news.eternal-september.org...
Most modern consumers have been conditioned to higher and higher color temps
for white and oversaturated color over the last thirty years or so.
Manufacturers realized years ago that in the first few seconds of viewing,
where most impressions are made in showrooms, the impression is dominated by
contrast and color saturation. This has nothing to do with perceiving color
naturally, but everything to do with marketing and competing with a wall of
other sets. It is not uncommon for displays to be sold with factory
settings that have color temps in the 13000K range, completely crushed
blacks and whites, and far too saturated color. Many consumers like this
more VIVID look. Others prefer to see a more accurate reproduction of the
product as it was produced, and a more realistic portrayal of color. This
requires substantial changes from OOB settings for most consumer displays,
at least in the USA.

You will be pleased to hear that my Pioneer is set to PURE, with all the
controls at their default settings (except for a bit of Sharpness goosing).
The image is just plain gaw-juss.

I considered having a $350 calibration performed, but decided that I wasn't
going to pay that much for a technician who knows even less about
colorimetry than I do. The Pioneers are supposedly nearly correct
out of the box.

If you want a demo disk, get the Blu-ray of "The Searchers". I don't care
for the movie, but the VistaVision photography is jaw-dropping. "Amadeus"
and "2001" are almost as good. With the best material, you sometimes think
you're looking through a sheet of glass at the thing itself.

There are lots of calibration techs out there who know little more than how
to point a probe at the set and adjust gray scale. There are a few dozen,
perhaps, who really understand what it takes to make an accurate display.
I suggest you look at the list at ISF Forum. The couple of hundred members
who subscribe there are among the best in the world, and all but a handful
of the elite calibration pros are found there.

Leonard
 
"Geoffrey S. Mendelson" <gsm@mendelson.com> wrote in message
news:slrnh1la3l.d4s.gsm@cable.mendelson.com...
William Sommerwerck wrote:
If you look at the first season of "Barney Miller", you'll see poor
camera
convergence, and slight color shifts between the cameras. And this was in
the 1970s, and at ABC's studios.

Look at the early shows of Star Trek: The Next Generation. They were
lit and photographed as if they were films. There are many scenes where
there is action in the shadows.

You would have seen what was happening if you were watching it on film,
on TV it was just a grayish blur.

If I remember correctly, they were shot on film.

Geoff.

--
Geoffrey S. Mendelson, Jerusalem, Israel gsm@mendelson.com N3OWJ/4X1GM

Your memory is incorrect, in this case.

Leonard
 
"William Sommerwerck" <grizzledgeezer@comcast.net> wrote in message
news:gve9nt$74v$1@news.eternal-september.org...
Most consumers are unlikely to be able to do more than make a mess.
Even someone like myself, having calibrated displays
for 30 years, can't do much to align a color management system
without a GOOD meter. I can get gray scale improved, but not really
accurate.

Does anyone make cheap-but-good instrumentation? I could justify a $500
investment.

(I can hear you laughing now.)

The cheapest I would even consider for most current displays is the i1 Pro.
None of the tristimulus colorimeters will be able to measure the narrow
spectrum of many modern displays, nor are their filters likely to match the
standard observer even on wider-spectrum lighted systems. Even the i1 Pro
is marginal for the LED and laser sets, from what I understand. Better
meters will be many thousands of dollars.

The best pricing that you will find is packaged with the CalMAN software.
It is also one of the few software packages that is versatile enough to do
just about everything that you might need with most meters.

Leonard
 
"Dave Plowman (News)" <dave@davenoise.co.uk> wrote in message
news:506105475adave@davenoise.co.uk...
In article <jvxSl.69962$9w4.39673@newsfe08.iad>,
Leonard Caillouet <nospam@noway.com> wrote:
No, when I say "poor studio standards", I'm talking about such things
as the failure to set up cameras correctly, keep a close eye on burst
phase, etc, etc, etc. Garbage in, garbage out.

I'd be most surprised if any but the very smallest station with only
one camera made mistakes like this.

You would be very surprised.

Perhaps standards are higher in the UK, then.

--
*Laugh alone and the world thinks you're an idiot.

Dave Plowman dave@davenoise.co.uk London SW
To e-mail, change noise into sound.

Probably. Not as many stations, either.

Leonard
 
William Sommerwerck wrote:
If you look at the first season of "Barney Miller", you'll see poor camera
convergence, and slight color shifts between the cameras. And this was in
the 1970s, and at ABC's studios.
Look at the early shows of Star Trek: The Next Generation. They were
lit and photographed as if they were films. There are many scenes where
there is action in the shadows.

You would have seen what was happening if you were watching it on film,
on TV it was just a grayish blur.

If I remember correctly, they were shot on film.

Geoff.

--
Geoffrey S. Mendelson, Jerusalem, Israel gsm@mendelson.com N3OWJ/4X1GM
 
William Sommerwerck wrote:
Regardless of whether tape or film is used, the cinematographer is likely to
light the scenes according to what he thinks the average TV is able to
reproduce.
That was my point. They lit (and photographed it) as if it were going to be
shown in theaters and not on TV.

Geoff.

--
Geoffrey S. Mendelson, Jerusalem, Israel gsm@mendelson.com N3OWJ/4X1GM
 
Leonard Caillouet wrote:
<meow2222@care2.com> wrote in message
news:2fecfc9d-7718-425b-94f4-ed50311ac00a@n4g2000vba.googlegroups.com...
William Sommerwerck wrote:
The LCD only filters light from the backlight. If you don't have a full
spectrum white in the first place then you can't expect decent colour.

Not so. All you have to do is hit the defined points in CIE diagram. The
Pioneer plasma sets hit them dead-on.
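Those "defined points" are just chromaticity coordinates, and a handful of
them pin down the whole colour space. A minimal sketch in Python (assuming
the standard Rec.709 primaries and D65 white) of how the xy points alone
determine a display's RGB-to-XYZ matrix, with no spectral data needed:

import numpy as np

# Rec.709 primaries and D65 white as xy chromaticities -- the "defined
# points" on the CIE diagram.
xy = {"R": (0.640, 0.330), "G": (0.300, 0.600), "B": (0.150, 0.060),
      "W": (0.3127, 0.3290)}

def xy_to_XYZ(x, y, Y=1.0):
    """Lift an xy chromaticity to a full XYZ vector at luminance Y."""
    return np.array([x * Y / y, Y, (1.0 - x - y) * Y / y])

# Columns are the primaries' XYZ directions, scaled so that R=G=B=1
# lands exactly on the white point.
P = np.column_stack([xy_to_XYZ(*xy[c]) for c in "RGB"])
S = np.linalg.solve(P, xy_to_XYZ(*xy["W"]))
M = P * S                  # RGB -> XYZ; matches the published sRGB matrix

print(np.round(M, 4))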

Indeed. None of the major display technologies deliver a full spectrum,
nor do they need to.


NT


This is true only if you have custom LUTs or decoding algorithms for a
display based on the relationship between the spectra of the lighting and
the CIE standard observer functions that cameras are generally aligned to
approximate.
What that has to do with it I don't know. If you find an RGB display
with violet output, I'm all ears.


NT
 
Leonard Caillouet wrote:
<meow2222@care2.com> wrote in message
news:5d37b31f-e42c-4ba4-9bfc-6e8a4e03ee11@m17g2000vbi.googlegroups.com...
Arfa Daily wrote:
<meow2222@care2.com> wrote in message
news:1e56875d-3af4-4041-832c-c511a21147dc@n8g2000vbb.googlegroups.com...
William Sommerwerck wrote:
I guess it comes down to definitions and how 'full spectrum' is perceived.
Rightly or wrongly, I tend to think of it as a spectrum which contains the
same component colours in the same ratios, as natural daylight...

That's a reasonable definition for a video display, but it's not sufficient
for source lighting. It's difficult to make a "full spectrum" fluorescent
lamp, especially one that produces good color rendition for photography.


but I guess even that varies depending on filtering effects of cloud
cover and haze and so on. Even so, I'm sure that there must be some
definition of 'average spectrum daylight', and I would expect that any
display technology would aim to reproduce any colour as closely as
possible to the way it would appear if viewed directly under daylight.

The standard is D65, a nominally 6500K daylight spectrum that runs close
to the black-body curve.
What you suggest is, indeed, the intent.
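In fact the CIE defines the daylight series by a published polynomial in
the correlated colour temperature, so the D65 white point can be computed
directly. A minimal sketch in Python (the coefficients are the standard
CIE daylight-locus formula; note D65 is specified at roughly 6504K, not
exactly 6500K, owing to a later revision of the radiation constant):

def daylight_xy(T):
    """CIE daylight-locus chromaticity for 4000K <= T <= 25000K."""
    if T <= 7000.0:
        x = 0.244063 + 0.09911e3/T + 2.9678e6/T**2 - 4.6070e9/T**3
    else:
        x = 0.237040 + 0.24748e3/T + 1.9018e6/T**2 - 2.0064e9/T**3
    y = -3.000*x*x + 2.870*x - 0.275
    return x, y

print(daylight_xy(6504))   # ~(0.3127, 0.3290), the familiar D65 white point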


TBH I think this is overplaying the significance of daylight. Almost
any monitor is adjustable to suit preferences of anything from 5000K
to 10,000K, and some go lower. None make any attempt to copy the
colour spectrum of daylight, they merely include the same colour temp
as daylight as one of the options. None of the major display types
have any ability to copy a daylight spectrum, as they're only RGB
displays.


NT

But take account of the fact that we're talking domestic television sets
here, not computer monitors. For the most part, TV sets do not display the
same type of content as a computer monitor, and do not include user
accessible colour temperature presets or adjustments,

fwiw my main set does, and I'm sure it's not unique. Generally though a
TV is a much lower quality animal than a monitor, and displays much
lower quality data.


which is why I made
the point earlier that in general, LCD TVs are set correctly 'out of the
box'.

because they can be. CRTs are more variable, and the circuits used to
drive them a lot less precise, partly because CRT sets are generally
older, and the sort of standards expected in monitors have only begun
crossing over to TVs in recent years.


As far as overplaying the significance of daylight goes, I'm not sure that
I follow what you mean by that. If I look at my garden, and anything or
anybody in it, the illumination source will be daylight, and the colours
perceived will be directly influenced by that. If I then reproduce that
image on any kind of artificial display, and use a different reference for
the white, then no other colour will be correct either,

what makes you think that just one specific colour temp is 'correct'?
Real daylight is all over the place colour-temp-wise, and the end user
experiences those changes without any problem. Also, any self
respecting monitor offers a range of colour temps, since it's nothing
but a matter of taste.


which was ever the
case when CRTs were set up to give whites which were either too warm or
too cold, even by a fraction.

but that's down to historic reasons; customers never expected precise
colour temp, and screens were routinely set up by eye. The circuits
involved couldn't set themselves up the way a modern LCD set can; there
was normally no feedback on the colour channels, just open-loop CRT gun
drive on top of a massive DC offset, so the systems were inherently
variable. Plus the fact that CRT gamma was often way off from the real
world made it hard, or should I say impossible, to set such sets to
give a faithful reproduction in other respects anyway.
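To see why a gamma error hurts more than contrast alone: broadcast video is
pre-corrected assuming a display exponent near 2.2, so a tube actually
running at, say, 2.8 (an illustrative figure, not a measurement) crushes
every mid-tone, and three guns drifting differently will tint the grays as
well. A minimal sketch in Python:

ASSUMED_GAMMA, ACTUAL_GAMMA = 2.2, 2.8

for scene_light in (0.1, 0.3, 0.5, 0.8):
    encoded = scene_light ** (1 / ASSUMED_GAMMA)   # camera-side correction
    displayed = encoded ** ACTUAL_GAMMA            # what the tube delivers
    print(f"intended {scene_light:.2f} -> displayed {displayed:.2f}")

# intended 0.30 comes out near 0.22, and the error varies with level, so
# no single brightness or contrast tweak can undo it.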


Maybe we're talking at cross purposes here, or I'm
not understanding something properly, but it seems to me that the colour
temperature and CRI of the backlighting on an LCD TV would be crucially
important to correct reproduction of colours.

It has almost nothing to do with it, because the level of each colour
channel output on the screen depends on both the light source and the
settings of the LCD R, G, B channels. Within reason, a backlight of any
colour temperature can produce a picture of any colour temperature.
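A minimal sketch of that argument in Python, treating each channel's screen
output as backlight level times LCD transmission (the backlight numbers are
made up for illustration, not measured from any panel):

# A bluish backlight can still give a neutral, equal-RGB white if the
# blue channel's transmission is scaled down to compensate.
backlight = {"R": 0.8, "G": 1.0, "B": 1.3}   # cool, blue-heavy source

def white_transmissions(backlight):
    """Per-channel LCD transmission that equalizes the three outputs."""
    dimmest = min(backlight.values())
    return {ch: dimmest / level for ch, level in backlight.items()}

t = white_transmissions(backlight)
for ch in "RGB":
    print(ch, round(backlight[ch] * t[ch], 3))   # all 0.8: a neutral white

The price is headroom: every channel you attenuate is brightness thrown
away, and as the reply below notes, colours off the gray axis can still
land in the wrong place.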


All I know is that the flesh tones were poor on the example that I saw,
compared to other LCD TVs which were showing the same picture. The
fundamental difference between those sets and the Sammy was the CCFL vs
LED backlighting, so it seems reasonable to draw from that the inference
that the backlighting scheme may well be the cause, no?

Arfa

It's just a guess. In fact any desired flesh tone can be reproduced
using almost any colour temp backlight, certainly anything from 3,000K
to 10,000K. Think about the process: you've got 3 colour channels,
each of which has a given level of light from the backlight, which is
then attenuated to any desired degree by the LCD pixel.


NT


While this is true, it would be virtually impossible to get all colors right
with some arbitrary color backlight. You could get a subset right and get
all the others completely wrong.

Leonard
With each colour channel you've got everything available from
backlight output x LCD max down to backlight output x LCD minimum.
AFAIK that covers every flesh tone on this planet, unless one goes
down to 2000K backlight or some other very extreme value.


NT
 
