NTSC versus PAL

In article <isw-55C6A6.19545706042010@[216.168.3.50]>,
isw <isw@witzend.com> wrote:
In article <4bbbee16$0$24357$c3e8da3@news.astraweb.com>,
Sylvia Else <sylvia@not.at.this.address> wrote:

--snippety-snip--

I'm left wondering what exactly was the *real* problem that PAL was
intended to fix.

Political. The Europeans didn't want US companies selling sets there.
Didn't stop the Japanese, etc. But US companies would have to do other
mods to their products for European sales anyway. Like mains voltage and
frequency. Most couldn't be bothered - even when that's all which had to
be changed.

--
*Letting a cat out of the bag is easier than putting it back in *

Dave Plowman dave@davenoise.co.uk London SW
To e-mail, change noise into sound.
 
I'm left wondering what exactly was the *real* problem that PAL
was intended to fix. It appears that the NTSC tint control could
only address a fixed phase offset between the colour burst and
the subcarrier, with both transmitters and TV sets able to
maintain that offset sufficiently closely that the hue wouldn't
vary from left to right of the picture.
Correct.


Other issues, such as non-linear phase shift would have been
a problem for NTSC viewers, regardless of the tint control.
Also correct.


So were NTSC viewers tolerating colour pictures that couldn't
be set right even with the tint control? Or is there something
else that I've missed?
You /have/ missed something, which I explained "long ago and far away".
<grin>

The US TV-distribution system DID NOT generally suffer from non-linear
group-delay problems, whereas the European system DID. That's it.

Even without the extra delay line, there is some degree of visual color
averaging, which tends to mitigate the phase error.
 
Political. The Europeans didn't want US companies selling
sets there.

Didn't stop the Japanese, etc. But US companies would have
to do other mods to their products for European sales anyway.
Like mains voltage and frequency. Most couldn't be bothered --
even when that's all which had to be changed.
I don't buy that. US sets would have been fairly expensive in Europe, even
in the mid-60s. Not to mention the strong competition from Thomson, Philips,
etc.
 
On Wed, 07 Apr 2010 12:29:40 +1000, Sylvia Else
<sylvia@not.at.this.address> wrote:

On 7/04/2010 10:08 AM, Arfa Daily wrote:
"Sylvia Else"<sylvia@not.at.this.address> wrote in message
news:4bba994c$0$15459$c3e8da3@news.astraweb.com...
On 6/04/2010 12:53 AM, William Sommerwerck wrote:
However... If the burst phase is wrong, then there is no cancellation of
errors, because there are no "errors" /in the signal itself/. (Right? (???))
Therefore, I don't see how line averaging can be used to eliminate the need
for a manual hue control.

Think of the chroma signal as a vector with its y coordinate equal to the
red difference component, and the x coordinate equal to the blue
difference component. A phase error rotates that vector about the z
axis. Effectively, the blue difference component receives a bit of the
red difference component, and vice versa.

On alternate lines the phase of the red difference component *only* is
inverted. In our view, this has the effect of reflecting the vector in
the x axis - what was a positive y value becomes negative.

The same phase error causes this vector to rotate in the same direction
about the z axis, but because of the reflection, the mixing of the
components has the opposite sign.

If you then negate the resulting red difference component of the second
line, and average with the red difference component of the first line,
the parts received from the blue difference component cancel out,
leaving a red difference component that equals the original, multiplied
by the cosine of the phase error. The same applies to the blue
component. The result is that the hues are correct, but not as saturated
as they should have been.
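
A quick numerical sketch of that averaging (Python), assuming an ideal
demodulator and a single U/V pair per line; the function name and the
20-degree error are just for illustration:

import numpy as np

def received_chroma(u, v, phase_err_deg, v_switched):
    """Demodulated colour-difference pair after a phase error.
    u = blue difference (U), v = red difference (V). On a PAL
    'switched' line the transmitter inverts V and the receiver
    re-inverts it after demodulation."""
    e = np.radians(phase_err_deg)
    if v_switched:
        v = -v                                # inverted at the transmitter
    u_rx = u * np.cos(e) - v * np.sin(e)      # phase error rotates the vector
    v_rx = u * np.sin(e) + v * np.cos(e)
    if v_switched:
        v_rx = -v_rx                          # re-inverted in the receiver
    return u_rx, v_rx

u, v = 0.3, 0.5        # transmitted pair
err = 20.0             # degrees of phase error

u1, v1 = received_chroma(u, v, err, v_switched=False)   # line n
u2, v2 = received_chroma(u, v, err, v_switched=True)    # line n+1

print("single line (NTSC-style):", u1, v1)               # hue rotated 20 deg
print("two-line average (PAL):  ", (u1 + u2) / 2, (v1 + v2) / 2)
# the average comes out as (u*cos(err), v*cos(err)): correct hue, reduced saturation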

No argument. That's always been my understanding. But...

If the burst phase gets screwed up somewhere along the line, no amount of
line averaging will fix the problem, because there's nothing "wrong" with
the subcarrier to fix.

If the burst has a random phase relationship to the colour subcarrier on
each line, then my analysis falls apart because the vectors would have
random orientations. In such a situation a PAL receiver would do no better
than NTSC, and they'd both perform awfully.

If the burst just has a fixed phase offset from the true colour
subcarrier, then the averaging will work.

Indeed it will work if the colour subcarrier drifts in a consistent way
relative to the burst - or if the receiver's oscillator similarly drifts.
The effect of such a drift on an NTSC picture would be a variation of tint
from left to right. However, a tint control wouldn't be able to address
that problem - it would simply move the horizontal position on the screen
where the colours are accurate - suggesting that it doesn't occur in
practice except in equipment that is recognisably broken.
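
To put rough numbers on that, a minimal sketch (Python) treating the error as
either a constant burst/subcarrier offset or a linear drift across the line;
the 15-degree figures are arbitrary:

import numpy as np

def residual_hue_error(x, burst_offset_deg, drift_deg_across_line, tint_deg):
    """Hue error left over at horizontal position x (0 = left, 1 = right)
    after the viewer dials in a constant tint correction."""
    error = burst_offset_deg + drift_deg_across_line * x
    return error - tint_deg

x = np.linspace(0.0, 1.0, 5)

# fixed 15-degree burst/subcarrier offset: tint = 15 nulls it everywhere
print(residual_hue_error(x, 15.0, 0.0, 15.0))    # [0. 0. 0. 0. 0.]

# 15 degrees of drift across the line: tint = 7.5 is right only at mid-screen,
# so the hue still varies from left to right of the picture
print(residual_hue_error(x, 0.0, 15.0, 7.5))     # [-7.5 -3.75 0. 3.75 7.5]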


Many years back, Bush in the UK produced a colour decoder which was
'revolutionary' compared to other manufacturers' efforts, in that the
subcarrier was regenerated in the decoder directly from the burst, rather
than being a free-running oscillator just locked to the burst with a PLL.
They did this by deriving a phase-adjustable pulse from the H-flyback, and
using this to 'notch out' the burst from the back porch period. The 10
cycles of burst thus recovered were then applied directly to the 4.43 MHz
crystal, which caused it to ring at exactly the same frequency and in
exactly the same phase as the original subcarrier. Always seemed to work
pretty well, and they continued to use this system over a period of probably
10 years or more, covering three chassis designs / revisions.

Arfa
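
For anyone curious what 'ringing' the crystal with the gated burst amounts to,
here is a crude discrete-time sketch (Python); the second-order resonator is
only a stand-in for a high-Q crystal, and the sample rate and pole radius are
arbitrary:

import numpy as np

fs = 8 * 4.43361875e6      # sample rate: 8 samples per subcarrier cycle
f0 = 4.43361875e6          # PAL colour subcarrier / crystal frequency
w0 = 2 * np.pi * f0 / fs
r = 0.9995                 # pole radius; stands in for a very high-Q crystal

n_burst = 80               # ten cycles of burst gated out of the back porch
n_total = 320

x = np.zeros(n_total)
x[:n_burst] = np.sin(w0 * np.arange(n_burst))    # the gated burst

# second-order resonator: y[n] = 2r*cos(w0)*y[n-1] - r^2*y[n-2] + x[n]
y = np.zeros(n_total)
for n in range(2, n_total):
    y[n] = 2 * r * np.cos(w0) * y[n - 1] - r ** 2 * y[n - 2] + x[n]

# after the drive stops, the resonator keeps oscillating at f0 with its phase
# set by the burst that rang it, decaying only slowly - the regenerated subcarrier
print(np.round(y[n_burst:n_burst + 16], 3))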

I'm left wondering what exactly was the *real* problem that PAL was
intended to fix. It appears that the NTSC tint control could only
address a fixed phase offset between the colour burst and the
subcarrier, with both transmitters and TV sets able to maintain that
offset sufficiently closely that the hue wouldn't vary from left to
right of the picture.

Other issues, such as non-linear phase shift would have been a problem
for NTSC viewers, regardless of the tint control.

So were NTSC viewers tolerating colour pictures that couldn't be set
right even with the tint control? Or is there something else that I've
missed?

Sylvia.
Part of the difficulty in understanding is that perhaps you don't
have experience with early American color televisions... I certainly
remember how in the 60s we had to adjust the tint control on a regular
(show by show) basis, because of lack of consistency.

Today, with predominantly digital systems, it has been so long since
I've touched a tint control, that I wonder if they still exist!

Anyone who had one of those old, tube (valve) color sets, with the 21"
round color CRT, will remember seeing green skies, and blue grass
while having skin colors set to the proper shade. Get the sky blue,
and the skin turned red, or blue, or green!
 
Part of the difficulty in understanding is that perhaps you
don't have experience with early American color televisions...
I certainly remember how in the 60s we had to adjust the tint
control on a regular (show by show) basis, because of lack
of consistency.
Yes -- a lack of consistency. That was not the fault of NTSC, but of the
broadcasters.


Anyone who had one of those old, tube (valve) color sets,
with the 21" round color CRT, will remember seeing green
skies, and blue grass while having skin colors set to the
proper shade. Get the sky blue, and the skin turned red,
or blue, or green!
I don't think that's correct. The cameras (and/or encoders) would have had
to have been very badly set up for that to happen.


On a related subject... I remember reading long, long ago that the first RCA
color TV had /four/ controls for adjusting the color, which the author
described as a "combination lock"! Anyone know anything about this?
 
On 7/04/2010 10:12 PM, William Sommerwerck wrote:
I'm left wondering what exactly was the *real* problem that PAL
was intended to fix. It appears that the NTSC tint control could
only address a fixed phase offset between the colour burst and
the subcarrier, with both transmitters and TV sets able to
maintain that offset sufficiently closely that the hue wouldn't
vary from left to right of the picture.

Correct.


Other issues, such as non-linear phase shift would have been
a problem for NTSC viewers, regardless of the tint control.

Also correct.


So were NTSC viewers tolerating colour pictures that couldn't
be set right even with the tint control? Or is there something
else that I've missed?

You /have/ missed something, which I explained "long ago and far away".
<grin>
OK, I vaguely remember your saying that now.

In the UK, colour was only transmitted on a new 625 line service
(newish, in the case of BBC2), in parallel for a long time with a
monochrome 405 line service (except BBC2), and I'd have thought the new
transmission infrastructure could have been built to obviate the
non-linear group-delay, given that it existed in the USA.

And, as I commented before, the Sony Trinitron sets, which didn't
implement PAL, performed acceptably according to my memory.

Sylvia.
 
On 7/04/2010 11:23 PM, William Sommerwerck wrote:
Part of the difficulty in understanding is that perhaps you
don't have experience with early American color televisions...
I certainly remember how in the 60s we had to adjust the tint
control on a regular (show by show) basis, because of lack
of consistency.

Yes -- a lack of consistency. That was not the fault of NTSC, but of the
broadcasters.
I have to wonder what the broadcasters were doing to achieve that.
Contriving to get the colour burst phase consistent amongst cameras in a
studio (so that the tint stayed the same for a show), but inconsistent
with the actual colour subcarrier, would take some doing.

Sylvia.
 
In the UK, colour was only transmitted on a new 625-line
service (newish, in the case of BBC2), in parallel for a long
time with a monochrome 405 line service (except BBC2),
and I'd have thought the new transmission infrastructure
could have been built to obviate the non-linear group-delay,
given that it existed in the USA.
You're probably correct.
 
Yes -- a lack of consistency. That was not the fault
of NTSC, but of the broadcasters.

I have to wonder what the broadcasters were doing to achieve
that. Contriving to get the colour burst phase consistent amongst
cameras in a studio (so that the tint stayed the same for a show),
but inconsistent with the actual colour subcarrier, would take
some doing.
There is no subcarrier or burst signal in the cameras. They aren't needed at
that point, and are added during the encoding process.

Setting them up is another matter. The early episodes of "Barney Miller"
provide a good example of poor setup, with inconsistent color, and poor
convergence.
 
On 8/04/2010 12:21 AM, William Sommerwerck wrote:
Yes -- a lack of consistency. That was not the fault
of NTSC, but of the broadcasters.

I have to wonder what the broadcasters were doing to achieve
that. Contriving to get the colour burst phase consistent amongst
cameras in a studio (so that the tint stayed the same for a show),
but inconsistent with the actual colour subcarrier, would take
some doing.

There is no subcarrier or burst signal in the cameras. They aren't needed at
that point, and are added during the encoding process.
Ok, so the separate colour signals (and luminance?) are sent from the
cameras. Still, at some point the colour signals have to be encoded
using the colour subcarrier, and a bit of the latter has to be included
as the burst. Failing to keep them in phase would require a considerable
amount of indifference.

Which I think you've also said ;)


Setting them up is another matter. The early episodes of "Barney Miller"
provide a good example of poor setup, with inconsistent color, and poor
convergence.
Poor convergence? The mind boggles.

Sylvia.
 
William Sommerwerck wrote:
Part of the difficulty in understanding is that perhaps you
don't have experience with early American color televisions...
I certainly remember how in the 60s we had to adjust the tint
control on a regular (show by show) basis, because of lack
of consistency.

Yes -- a lack of consistency. That was not the fault of NTSC, but of the
broadcasters.

And AT&T who provided the coaxial cables that fed the video to all
the stations on a network. The tint and chroma level could be adjusted
at every facility in the system. I knew someone who worked for AT&T at
the time, and he told me what a pain it was to compensate for the
cable. When the network switched to a different studio or city for a
show, it threw everything out of calibration.


Anyone who had one of those old, tube (valve) color sets,
with the 21" round color CRT, will remember seeing green
skies, and blue grass while having skin colors set to the
proper shade. Get the sky blue, and the skin turned red,
or blue, or green!

I don't think that's correct. The cameras (and/or encoders) would have had
to have been very badly set up for that to happen.

On a related subject... I remember reading long, long ago that the first RCA
color TV had /four/ controls for adjusting the color, which the author
described as a "combination lock"! Anyone know anything about this?

He may be talking about the three 'drive' controls that set the gain
for each channel. These are set up to provide equal gain to get a white
line during setup. They are service adjustments on TVs, but on an early
design they may have been easier to get to. Some TVs still had hollow
plastic shaft extenders that passed through the rear of floor model
cabinets to adjust these and other pots.

The fourth would be the actual color intensity control.


--
Lead free solder is Belgium's version of 'Hold my beer and watch this!'
 
Setting them up is another matter. The early episodes of
"Barney Miller" provide a good example of poor setup, with
inconsistent color, and poor convergence.

Poor convergence? The mind boggles.
Oh, yes. The pickups had to be aligned. The "modern" system, in which
solid-state sensors are attached to a prism/beamsplitter, was not practical
with vidicons and Plumbicons.
 
On a related subject... I remember reading long, long ago
that the first RCA color TV had /four/ controls for adjusting
the color, which the author described as a "combination lock"!
Anyone know anything about this?

He may be talking about the three 'drive' controls that set the
gain for each channel. These are set up to provide equal gain
to get a white line during setup. They are service adjustments
on TVs, but on an early design they may have been easier to get to.
No, these were supposedly user controls. Anybody got a photo of the user
controls for a CT-100?
 
In article <hpi4c4$tk4$1@news.eternal-september.org>,
William Sommerwerck <grizzledgeezer@comcast.net> wrote:
There is no subcarrier or burst signal in the cameras. They aren't
needed at that point, and are added during the encoding process.

Setting them up is another matter. The early episodes of "Barney Miller"
provide a good example of poor setup, with inconsistent color, and poor
convergence.
So camera setup was poor - as were the later stages of transmission?

This certainly wasn't the case in the UK - despite the transmitters being
fed with land lines.

--
*Where do forest rangers go to "get away from it all?"

Dave Plowman dave@davenoise.co.uk London SW
To e-mail, change noise into sound.
 
In article <hpi7k4$nnh$1@news.eternal-september.org>,
William Sommerwerck <grizzledgeezer@comcast.net> wrote:
Setting them up is another matter. The early episodes of
"Barney Miller" provide a good example of poor setup, with
inconsistent color, and poor convergence.

Poor convergence? The mind boggles.

Oh, yes. The pickups had to be aligned. The "modern" system, in which
solid-state sensors are attached to a prism/beamsplitter, was not
practical with vidicons and Plumbicons.
Registration on cameras. Convergence on monitors?

Did you have vidicon colour cameras? First UK ones were Plumbicon. Apart
from the ancient IO RCA ones used for tests.

--
*24 hours in a day ... 24 beers in a case ... coincidence? *

Dave Plowman dave@davenoise.co.uk London SW
To e-mail, change noise into sound.
 
Registration on cameras. Convergence on monitors?
Yes. Thanks for the correction.


Did you have vidicon colour cameras? First UK ones
were Plumbicon.
Yes, because you started so late.

The first RCA cameras used vidicons (I think) -- though they might have used
image orthicons.

They later had a four-pickup camera that used an image orthicon to generate
a perfectly registered (by definition) luminance signal, plus three
vidicons.
 
William Sommerwerck wrote:
Setting them up is another matter. The early episodes of
"Barney Miller" provide a good example of poor setup, with
inconsistent color, and poor convergence.

Poor convergence? The mind boggles.

Oh, yes. The pickups had to be aligned. The "modern" system, in which
solid-state sensors are attached to a prism/beamsplitter, was not practical
with vidicons and Plumbicons.

Local stations weren't immune, either. Some locally produced shows
in Dayton, Ohio aired from poorly converged cameras in the '70s & '80s.


--
Lead free solder is Belgium's version of 'Hold my beer and watch this!'
 
In article <hpibre$pjs$1@news.eternal-september.org>,
William Sommerwerck <grizzledgeezer@comcast.net> wrote:
Registration on cameras. Convergence on monitors?

Yes. Thanks for the correction.

Did you have vidicon colour cameras? First UK ones
were Plumbicon.

Yes, because you started so late.

The first RCA cameras used vidicons (I think) -- though they might have
used image orthicons.
Three 3 inch IO were the ones I remember. Being used for tests long before
colour broadcasting started in the UK.

They later had a four-pickup camera that used an image orthicon to
generate a perfectly registered (by definition) luminance signal, plus
three vidicons.
That's a configuration I never saw. The first colour cameras here were all
four-tube Plumbicons. I was taught the colour response of a vidicon
wasn't suitable.

BTW I'm not surprised your setup engineers had problems - with a mixture
of IO and vidicon. ;-)

--
*Santa's helpers are subordinate clauses*

Dave Plowman dave@davenoise.co.uk London SW
To e-mail, change noise into sound.
 
"Dave Plowman (News)" wrote:
In article <hpibre$pjs$1@news.eternal-september.org>,
William Sommerwerck <grizzledgeezer@comcast.net> wrote:
Registration on cameras. Convergence on monitors?

Yes. Thanks for the correction.

Did you have vidicon colour cameras? First UK ones
were Plumbicon.

Yes, because you started so late.

The first RCA cameras used vidicons (I think) -- though they might have
used image orthicons.

Three 3 inch IO were the ones I remember. Being used for tests long before
colour broadcasting started in the UK.

They later had a four-pickup camera that used an image orthicon to
generate a perfectly registered (by definition) luminance signal, plus
three vidicons.

That's a configuration I never saw. The first colour cameras here were all
four-tube Plumbicons. I was taught the colour response of a vidicon
wasn't suitable.

RCA built their TK44 color studio cameras with Vidicons. They
changed the model number to TK46 when they switched to Plumbicons. Most
of the parts were interchangeable, so I used a pair of TK44 cameras for
spare modules & as a test jig to keep three TK46 cameras working the way
we wanted. The TK44s were used by TV stations for years, but needed
brighter studio lighting.


--
Lead free solder is Belgium's version of 'Hold my beer and watch this!'
 
In article <510423273edave@davenoise.co.uk>,
"Dave Plowman (News)" <dave@davenoise.co.uk> wrote:

In article <isw-55C6A6.19545706042010@[216.168.3.50]>,
isw <isw@witzend.com> wrote:
In article <4bbbee16$0$24357$c3e8da3@news.astraweb.com>,
Sylvia Else <sylvia@not.at.this.address> wrote:

--snippety-snip--

I'm left wondering what exactly was the *real* problem that PAL was
intended to fix.

Political. The Europeans didn't want US companies selling sets there.

Didn't stop the Japanese, etc.
But *they* wanted to sell sets *here*.

Isaac
 
