From: isw on
In article <0095e510$0$2343$c3e8da3(a)news.astraweb.com>,
Sylvia Else <sylvia(a)not.at.this.address> wrote:

> On 2/04/2010 9:50 PM, William Sommerwerck wrote:
> >> I think you'll find that was the intent. However, if the phase error is
> >> too great, the eye averaging doesn't work so well, hence the
> >> introduction of the delay line.
> >
> >> At which point you wonder why bother sending two colour signals in
> >> quadrature if you're just going to average them with the next scan line
> >> anyway.
> >
> > But you don't have to average them. NTSC doesn't. And the delay line can be
> > used for comb filtering.
> >
> >
> >> SECAM avoids that complexity by just going straight to the delay
> >> line. I lived in Paris for 18 months. If there was a quality difference
> >> between a SECAM and a PAL picture, it was far from obvious.
> >
> > The problem is, SECAM /requires/ the delay line because the system transmits
> > only the red or blue color-difference signal at any time. This is what I was
> > talking about -- it keeps the transmission side cheap, while making the user
> > pay more for their TV.
> >
> > For most images, you won't see a difference. But in an image with strong
> > vertical color transitions, you'll see aliasing, especially when the image
> > moves vertically.
> >
> >
>
> If we were building an analogue colour TV transmission infrastructure
> now, then maybe we'd go the NTSC route, since it eliminates the delay
> line. But it's undoubtedly true that, for whatever reasons, in earlier
> times, NTSC didn't perform that well, whereas those whose systems were
> PAL or SECAM got good colour pictures from day one.

And had high-brightness flicker for just as long...

Isaac
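
To put numbers on the averaging being discussed above: a minimal numpy
sketch, with an invented 20-degree phase error in the path, showing why
a line averaged with its delayed, re-inverted neighbour comes out at
the right hue in PAL, while the same error lands straight on the hue in
NTSC.

import numpy as np

# One chroma vector C = U + jV and a fixed 20-degree phase error in the path.
U, V = 0.3, 0.2                         # invented colour-difference values
phi = np.deg2rad(20.0)                  # invented path phase error
rot = np.exp(1j * phi)
C = U + 1j * V

# NTSC-style: every line carries U + jV, so the rotation lands on the hue.
ntsc_rx = C * rot
print("NTSC hue error:", np.rad2deg(np.angle(ntsc_rx) - np.angle(C)), "deg")

# PAL-style: V is sent inverted on alternate lines and re-inverted in the
# receiver (the conjugate does that job here, since U is real and V imaginary).
line_a = (U + 1j * V) * rot             # normal line, as received
line_b = np.conj((U - 1j * V) * rot)    # alternate line after re-inversion
pal_rx = 0.5 * (line_a + line_b)        # the delay-line average of the pair

print("PAL hue error: ", np.rad2deg(np.angle(pal_rx) - np.angle(C)), "deg")
print("PAL saturation:", abs(pal_rx) / abs(C), "of transmitted, about cos(phi)")

The average keeps the transmitted hue and turns the error into a small
saturation loss of roughly cos(phi); simple PAL sets without the delay
line leave that same averaging to the eye, which is what breaks down
when the error gets large.
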
From: Sylvia Else on
On 2/04/2010 12:13 AM, William Sommerwerck wrote:

> If the transmission network has constant group delay, the hue setting should
> be set 'n forget, and never need to be changed.

It's not clear to me why that wasn't the case anyway. Whatever phase
error was introduced to the colour signal by the transmission system
would also affect the colour burst. If the problem could be addressed by
means of a tint control with a setting that remained stable even over
the duration of a program, it rather seems to imply that a phase error
between the colour burst and the colour subcarrier was built into the
signal at the studio.
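
A toy demodulation sketch of that reasoning, assuming the burst and the
chroma pick up one and the same fixed phase shift (all values
invented):

import numpy as np

U, V = 0.3, 0.2                           # some transmitted hue
chroma_tx = U + 1j * V
burst_tx  = np.exp(1j * np.deg2rad(180))  # burst at its nominal phase

phi = np.deg2rad(35.0)                    # one fixed shift hits burst and chroma alike
rot = np.exp(1j * phi)
chroma_rx, burst_rx = chroma_tx * rot, burst_tx * rot

# The receiver locks its demodulation axes to the *received* burst, so the
# common rotation drops straight back out.
correction = burst_rx / burst_tx          # how far the burst has rotated
recovered  = chroma_rx / correction

print("sent U, V:     ", U, V)
print("recovered U, V:", round(recovered.real, 6), round(recovered.imag, 6))

With the common rotation cancelling out like this, a hue error that
needed correcting at all points to something other than a constant
shift applied equally to burst and chroma.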

Sylvia.
From: David on


"Sylvia Else" <sylvia(a)not.at.this.address> wrote in message
news:4bb6f524$0$1483$c3e8da3(a)news.astraweb.com...
> On 2/04/2010 12:13 AM, William Sommerwerck wrote:
>
>> If the transmission network has constant group delay, the
>> hue setting should
>> be set 'n forget, and never need to be changed.
>
> It's not clear to me why that wasn't the case anyway.
> Whatever phase error was introduced to the colour signal
> by the transmission system would also affect the colour
> burst. If the problem could be addressed by means of a
> tint control with a setting that remained stable even over
> the duration of a program, it rather seems to imply that a
> phase error between the colour burst and the colour
> subcarrier was built into the signal at the studio.
>
> Sylvia

One big problem was differential phase and gain in the
transmission path. In that case both the amplitude and phase
of the color information were influenced by the total
amplitude of the signal, including the luminance. Since the
burst sat at 0 IRE while the average picture content was
around 50 IRE, differential phase shifted the color hue.
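
A rough numerical sketch of that mechanism, using an invented
0.1-degree-per-IRE differential-phase slope, shows why the burst
cannot be used to correct it:

import numpy as np

def diff_phase(ire, deg_per_ire=0.1):
    """Toy differential phase: the path's phase error grows with signal level."""
    return np.deg2rad(deg_per_ire * ire)

U, V = 0.3, 0.2
chroma_tx = U + 1j * V
burst_tx  = np.exp(1j * np.deg2rad(180))

burst_rx  = burst_tx  * np.exp(1j * diff_phase(0))    # burst sits at blanking, 0 IRE
chroma_rx = chroma_tx * np.exp(1j * diff_phase(50))   # picture chroma rides ~50 IRE

# Burst-locked demodulation only removes the error the *burst* saw.
recovered = chroma_rx / (burst_rx / burst_tx)
print("residual hue error:",
      np.rad2deg(np.angle(recovered) - np.angle(chroma_tx)), "deg")   # ~5 deg

The receiver locks to the burst at blanking level, so it removes none
of the shift the picture chroma actually sees at around 50 IRE; a tint
control can at best trim out the typical error.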

David


From: William Sommerwerck on
> If we were building an analogue colour TV transmission
> infrastructure now, then maybe we'd go the NTSC route,
> since it eliminates the delay line.

PAL doesn't /require/ a delay line.


> But it's undoubtedly true that, for whatever reasons, in earlier
> times, NTSC didn't perform that well, whereas those whose
> systems were PAL or SECAM got good colour pictures from
> day one.

NTSC has always "performed well". Poor NTSC image quality was always due to
bad studio practice.


From: William Sommerwerck on
>> If the transmission network has constant group delay,
>> the hue setting should be set 'n forget, and never need
>> to be changed.

> It's not clear to me why that wasn't the case anyway. Whatever
> phase error was introduced to the colour signal by the transmission
> system would also affect the colour burst. If the problem could be
> addressed by means of a tint control with a setting that remained
> stable even over the duration of a program, it rather seems to imply
> that a phase error between the colour burst and the colour subcarrier
> was built into the signal at the studio.

We're talking about group delay that is not constant with frequency.
This is not a simple phase error in the burst, but a delay that varies
across the bandwidth of the chroma signal. Any such non-flat group
delay introduces varying color errors that cannot be corrected with a
single hue setting.
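
A small sketch of that point, using NTSC's subcarrier frequency and a
made-up channel phase response (the quadratic "bump" and the delay
figure are invented): a constant delay drops out after burst-locked
demodulation, but the non-flat part leaves a hue error that changes
with the fineness of the color detail, so no single hue setting can
remove it.

import numpy as np

fsc  = 3.579545e6      # NTSC color subcarrier, Hz
tau0 = 100e-9          # flat (constant) part of the group delay, seconds

def channel_phase(f):
    """Toy channel phase: a constant delay plus a quadratic bump around the
    subcarrier, i.e. group delay that is NOT constant across the chroma band.
    The 25-degree bump at 500 kHz off-carrier is an invented figure."""
    bump = np.deg2rad(25.0) * ((f - fsc) / 0.5e6) ** 2
    return -2 * np.pi * f * tau0 + bump

def hue_error_deg(detail_hz):
    """Hue rotation of double-sideband chroma detail at detail_hz after
    burst-locked demodulation (the burst sits at fsc itself)."""
    sidebands = 0.5 * (channel_phase(fsc + detail_hz) +
                       channel_phase(fsc - detail_hz))
    return np.rad2deg(sidebands - channel_phase(fsc))

for detail in (50e3, 250e3, 500e3):      # coarse to fine color detail
    print(f"{detail/1e3:5.0f} kHz detail: {hue_error_deg(detail):5.1f} deg hue error")

The tau0 term cancels in the comparison, which is the constant-group-
delay case where set 'n forget would work; only the non-flat part
survives, and it is different for every chroma frequency.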