Difference in luma-chroma delay of C64/C128 compared to standard S-video


Difference in luma-chroma delay of C64/C128 compared to standard S-video

MiaM
Hi!

As many people already know, the C64 is older than the consumer S-video
signal format and doesn't comply completely with that standard.

The result is that if you make an adaptor from the C64 to an S-video
input, the luminance will be slightly more to the left than the
chrominance. If you on the other hand feed an S-video signal to an old
Commodore CRT monitor, the opposite will happen, i.e. luminance
slightly to the right of the chrominance.

Is the exact timing of this some kind of known fact, or is it still
something that needs to be calculated or measured?

Some 15-20 years ago I modified a CRT TV set to give correct picture
both with a standard S-video source (i.e. CD32) and with a C64, using a
thumb wheel switch to select one of the 8 possible different delays in
a TDA4565 IC that were used in that TV. I don't remember which setting
I used with C64 and with CD32 though.

I see five approaches:

1: Compare the schematics of some TV sets with S-video input to the
schematics of a Commodore CRT monitor with "C64 S-video" input, and
figure out the difference

2: Find some spec on S-video timing. I've googled but haven't found
anything.

3: Measure colour bars from a C64 and a known S-video source (for
example CD32)

4: Extrapolate from looking at a picture (the mistiming seems to be
approximately 2 pixels, so two cycles of the pixel clock would be about
the mistiming)

5: Experiment with different delays.

My idea is to figure out the optimal delay and then just calculate what
cable length gives such delay (afaik it depends on what kind of
insulation the cable uses so maybe a few different lengths could be
calculated for different common types of 75 ohm coax cable). Then
anyone who wants a perfect picture could just route the luminance
through a cable of the correct length.

If I'm not mistaken it would probably be a cable length of about 30
metres (100 feet) +/- 50% or so. That seems like a rather long cable,
but it's not that bad to hide under a desk or behind a TV.
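
A back-of-the-envelope check of that figure (just a sketch: the
~2-pixel mistiming is my estimate, the velocity factor is a typical
value for solid-PE coax, and the PAL dot clock is taken as the
17.734475 MHz crystal divided by 2.25):

  # Cable length needed to delay luma by ~2 PAL C64 pixel clocks.
  C = 299_792_458                      # speed of light, m/s
  dot_clock = 17.734475e6 / 2.25       # PAL C64 pixel clock, ~7.882 MHz
  delay_s = 2 / dot_clock              # two pixel periods, ~254 ns
  vf = 0.66                            # typical solid-PE 75 ohm coax
  print(f"{delay_s * 1e9:.0f} ns -> {delay_s * vf * C:.0f} m")  # ~50 m

which lands in roughly the same ballpark as the guess above.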

The point is that it would be a simple thing anyone with a soldering
iron could do, without any need for some fancy electronics. Just pick
up a spool of enough tv antenna coax cable and solder it in, and get a
real picture improvement.

IIRC the picture with a correct delay and a TV/monitor with CTI (Colour
Transient Improvement, or some similar function) looks just as good as
with an RGB signal. (For signals with higher resolution, like 640
pixels wide on the CD32, anything that is only one pixel wide will be
black and white; otherwise that will also look as good as an RGB
signal.)

--
(\_/) Copy the bunny to your mails to help
(O.o) him achieve world domination.
(> <) Come join the dark side.
/_|_\ We have cookies.


Re: Difference in luma-chroma delay of C64/C128 compared to standard S-video

silverdr-2

> On 2017-08-31, at 20:06, Mia Magnusson <[hidden email]> wrote:
>
> Hi!
>
> As many people already know, the C64 is older than the consumer S-video
> signal format and doesn't comply completely with that standard.

Well, it doesn't comply to /any/ video standard if we want to tell the truth :-)

> 1: Compare the schematics of some TV sets with S-video input to the
> schematics of a Commodore CRT monitor with "C64 S-video" input, and
> figure out the difference

The differences may be rather big, and please note that it might not be obvious where the timing difference is handled, as well as that it might come from some analogue "tweaks" rather than a pure design difference.

> 2: Find some spec on S-video timing. I've googled but haven't found
> anything.

There are "Rec."s about the video timings but I don't see how this alone could help. You still need to measure things.

> 3: Measure colour bars from a C64 and a known S-video source (for
> example CD32)

I just (few weeks ago) wrote a small proggy for the 64 to display the quasi-standard colour bars over the whole screen. I wrote it to measure some other aspects but can be used to measure the difference. Just connect the 64 displaying the bars to a good waveform/vectorscope measurement set and compare it to a known standard source (like the broadcast test signal generator). As I wrote a minute ago in another thread I once ran a studio and I still have all of those (scopes and generators) if needed.

> 4: Extrapolate from looking at a picture (the mistiming seems to be
> approximately 2 pixels, so two cycles of the pixel clock would be about
> the mistiming)

Chroma has lower resolution than luma so there will always be some mistiming when we talk about one pixel, for example. I remember doing those things (finding the best relation between luma and chroma using the studio equipment while looking at both the picture and the WFM). It was required especially when working with non-Betacam material.

> 5: Experiment with different delays.

Until your WFM shows what you want it to.

> My idea is to figure out the optimal delay and then just calculate what
> cable length gives such delay (afaik it depends on what kind of
> insulation the cable uses so maybe a few different lengths could be
> calculated for different common types of 75 ohm coax cable). Then
> anyone who wants a perfect picture could just route the luminance
> through a cable of the correct length.

I am not sure what you mean. The propagation time over a properly matched line is close to negligible. You may get some quasi-impedance-matching by trimming the cable to a specific length, but a) it's a mother of bad ideas when it comes to impedance matching, and b) we talk relatively low frequencies here, where it doesn't work that well.

> If I'm not mistaken it would probably be a cable length of about 30
> metres (100 feet) +/- 50% or so. That seems like a rather long cable,
> but it's not that bad to hide under a desk or behind a TV.

Ah... with tens of metres of length difference you might get some propagation time difference, but you'd still need to do this mostly by trial and error, and you'll get different results with different cables. Bear in mind that there is also attenuation.

> The point is that it would be a simple thing anyone with a soldering
> iron could do, without any need for some fancy electronics. Just pick
> up a spool of enough tv antenna coax cable and solder it in, and get a
> real picture improvement.

I think it's still better to make a small circuit for that :-) and you might get better connectors on the output side while you're at it ;-)

--
SD! - http://e4aws.silverdr.com/



Re: Difference in luma-chroma delay of C64/C128 compared to standard S-video

MiaM
On Thu, 31 Aug 2017 21:03:18 +0200, [hidden email] wrote:

>
> > On 2017-08-31, at 20:06, Mia Magnusson <[hidden email]> wrote:
> >
> > Hi!
> >
> > As many people already know, the C64 is older than the consumer S-video
> > signal format and doesn't comply completely with that standard.
>
> Well, it doesn't comply to /any/ video standard if we want to tell
> the truth :-)

Yes, I know that the level for the chroma is a bit off and that the
output stages don't really have the correct impedance either, but
that usually works fine as it is. Almost (?) every TV has AGC for the
chroma, adjusted by the level of the color burst, so it's probably not
a big problem. :)

> > 1: Compare the schematics of some TV sets with S-video input to the
> > schematics of a Commodore CRT monitor with "C64 S-video" input, and
> > figure out the difference
>
> The differences may be rather big, and please note that it might not
> be obvious where the timing difference is handled, as well as that it
> might come from some analogue "tweaks" rather than a pure design
> difference.

Well, at least the CRT TV I modified had an "s-video" control signal
that just selected between two different delays. Selecting or not
selecting delays in a Commodore CRT would probably be easy to spot.

> > 2: Find some spec on S-video timing. I've googled but haven't found
> > anything.
>
> There are "Rec."s about the video timings but I don't see how this
> alone could help. You still need to measure things.

Well, the composite video seems to have standard timing, and we know
that the modulator just mixes the luma and chroma signals (with an RC
filter which might change the timing slightly, but that's easy to
measure), so we could probably assume that the wanted delay is the
difference between the specs for composite video and s-video.

> > 3: Measure colour bars from a C64 and a known S-video source (for
> > example CD32)
>
> I just (few weeks ago) wrote a small proggy for the 64 to display the
> quasi-standard colour bars over the whole screen. I wrote it to
> measure some other aspects but can be used to measure the difference.
> Just connect the 64 displaying the bars to a good
> waveform/vectorscope measurement set and compare it to a known
> standard source (like the broadcast test signal generator). As I
> wrote a minute ago in another thread I once ran a studio and I still
> have all of those (scopes and generators) if needed.

I don't have that kind of equipment, just a (modern digital)
oscilloscope so I'd have to look at the waveform. The chroma-luma
timing should be visible that way too.

> > 4: Extrapolate from looking at a picture (the mistiming seems to be
> > approximately 2 pixels, so two cycles of the pixel clock would be
> > about the mistiming)
>
> Chroma has lower resolution than luma so there will always be some
> mistiming when we talk about one pixel, for example. I remember doing
> those things (finding the best relation between luma and chroma using
> the studio equipment while looking at both the picture and the WFM).
> It was required especially when working with non-Betacam material.

Yes, but with my current setup it's obvious that there is mistiming.
And many modern color decoders can do tricks to detect a transient and
improve the bandwidth of the transient, so it looks better.

> > 5: Experiment with different delays.
>
> Until your WFM shows what you want it to.
>
> > My idea is to figure out the optimal delay and then just calculate
> > what cable length gives such delay (afaik it depends on what kind of
> > insulation the cable uses so maybe a few different lengths could be
> > calculated for different common types of 75 ohm coax cable). Then
> > anyone who wants a perfect picture could just route the luminance
> > through a cable of the correct length.
>
> I am not sure what you mean. The propagation time over a properly
> matched line is close to negligible. You may get some
> quasi-impedance-matching by trimming the cable to a specific length,
> but a) it's a mother of bad ideas when it comes to impedance matching,
> and b) we talk relatively low frequencies here, where it doesn't work
> that well.

A 75 ohm coax shouldn't have any impedance matching problems if the
TV/monitor has a proper 75 ohm termination. (I've seen 82 ohms in many
cases, probably because that was cheaper, but it's rather close.) As
long as the receiver has a correct match, the transmitter might be a
bit off without any real problem.

> > If I'm not mistaken it would probably be a cable length of about 30
> > metres (100 feet) +/- 50% or so. That seems like a rather long cable,
> > but it's not that bad to hide under a desk or behind a TV.
>
> Ah... with tens of metres of length difference you might get some
> propagation time difference, but you'd still need to do this mostly by
> trial and error, and you'll get different results with different
> cables.

It seems like for most RG cables there are three different propagation
speeds, so it would be easy to have three different lengths for a user
to try:

https://cdn.shopify.com/s/files/1/0986/4308/files/Cable-Delay-FAQ.pdf
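
Turning those into concrete lengths for a ~254 ns target delay (a
sketch; the velocity factors are typical class values from such tables,
not from the datasheet of any specific cable):

  # Luma cable lengths for a ~254 ns delay at common velocity factors.
  C = 299_792_458                      # speed of light, m/s
  delay_s = 254e-9                     # ~2 PAL C64 pixel clocks
  for name, vf in [("solid PE", 0.66), ("foam PE", 0.78), ("air-spaced", 0.82)]:
      print(f"{name}: {delay_s * vf * C:.0f} m")
  # -> roughly 50 m, 59 m and 62 m respectively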

> Bear in mind that there is also attenuation.

Yes, but that could probably be ignored at such low frequencies. At
least TV aerial cables would only attenuate a few dB over tens of
metres at more than 100 times the frequency, and the cables sold for
parabolic dishes would have even less attenuation.


> > The point is that it would be a simple thing anyone with a soldering
> > iron could do, without any need for some fancy electronics. Just
> > pick up a spool of enough tv antenna coax cable and solder it in,
> > and get a real picture improvement.
>
> I think it's still better to make a small circuit for that :-) and
> you might get better connectors on the output side while you're at
> it ;-)
 
Well, the point of using a long cable is that the cable is available in
shops almost everywhere so you don't have to order stuff.


--
(\_/) Copy the bunny to your mails to help
(O.o) him achieve world domination.
(> <) Come join the dark side.
/_|_\ We have cookies.


Re: Difference in luma-chroma delay of C64/C128 compared to standard S-video

peter
We could probably wire up one of the Rhombus TZB-series passive delay
chips with a switch and do pretty well to fix this.

http://pdf1.alldatasheet.com/datasheet-pdf/view/268382/RHOMBUS-IND/TZB12-10.html

--
Pete Rittwage



Re: Difference in luma-chroma delay of C64/C128 compared to standard S-video

silverdr-2
In reply to this post by MiaM

> On 2017-08-31, at 21:26, Mia Magnusson <[hidden email]> wrote:
>
>>> As many people already know, the C64 is older than the consumer S-video
>>> signal format and doesn't comply completely with that standard.
>>
>> Well, it doesn't comply to /any/ video standard if we want to tell
>> the truth :-)
>
> Yes, I know that the level for the chroma is a bit off and that the
> output stages don't really have the correct impedance either, but
> that usually works fine as it is. Almost (?) every TV has AGC for the
> chroma, adjusted by the level of the color burst, so it's probably not
> a big problem. :)

The levels are off specs, the timings are off specs, the impedance is off specs, the sync pulses are off specs, the signal is non-interlaced, and so on :-)

Some time ago I was intensely brainstorming and pushing (even here) an idea to design an interface that would "legalise" all the above, so that every piece of standard-compliant equipment (expecting a standard signal) could handle the video from the VIC-II equipped CBM machines. That is, until I realised that it can't really be done.

> Well, at least the CRT TV I modified had an "s-video" control signal
> that just selected between two different delays. Selecting or not
> selecting delays in a Commodore CRT would probably be easy to spot.
> [...]
> Well, the composite video seems to have standard timing, and we know
> that the modulator just mixes the luma and chroma signals (with an RC
> filter which might change the timing slightly, but that's easy to
> measure), so we could probably assume that the wanted delay is the
> difference between the specs for composite video and s-video.

Now, I might be missing something, but to me composite video is a way of 2-to-1 encoding/muxing, so that two signals can be transmitted over one line rather than two. Unless I am mistaken, there should be no differences (other than unintended ones) in the luma/chroma timing between the two. Now that you mention your CRT TV, I start wondering whether I am really missing something... like the time needed to decode composite... hmm... but that still shouldn't affect the interrelation between the two. I always assumed that what I was doing in the studio was due to unintended differences caused by the non-broadcast quality of the equipment used, rather than inherent differences.

>>> 3: Measure colour bars from a C64 and a known S-video source (for
>>> example CD32)
>>
>> I just (few weeks ago) wrote a small proggy for the 64 to display the
>> quasi-standard colour bars over the whole screen. I wrote it to
>> measure some other aspects but can be used to measure the difference.
>> Just connect the 64 displaying the bars to a good
>> waveform/vectorscope measurement set and compare it to a known
>> standard source (like the broadcast test signal generator). As I
>> wrote a minute ago in another thread I once ran a studio and I still
>> have all of those (scopes and generators) if needed.
>
> I don't have that kind of equipment, just a (modern digital)
> oscilloscope so I'd have to look at the waveform. The chroma-luma
> timing should be visible that way too.

Yes. Good oscilloscope will be fine.

> Yes, but with my current setup it's obvious that there is mistiming.

Interesting. I tested my "colour bars" program on real hardware and s-video monitors and didn't notice anything obvious. Might it be that the reason is somewhere else? Like that your current s-video equipped display is not driven by analogue video circuitry anymore and therefore has problems with the non-standard-compliant signal from the 64?

> [...]
> Well, the point of using a long cable is that the cable is available in
> shops almost everywhere so you don't have to order stuff.

I see. But still, your "obvious" timing problem smells somewhat fishy to me. I never really noticed that kind of issue when using analogue s-video displays, including the truly hi-res, broadcast studio types. My bet is that yours is not analogue, is it? And if it is - might it be miscalibrated somehow? OTOH you say you did this before for the same reason...

Well - then I probably don't have anything clever to say on that. I might help with the measurement and comparison between a reliable, standard signal source (the CD32 might not count as such) and the 64. BTW - which board version do you have? There were big differences between the modulators and the resulting output - from tack sharp to fully soapy, for example. So this might also have an effect.

--
SD! - http://e4aws.silverdr.com/



Re: Difference in luma-chroma delay of C64/C128 compared to standard S-video

HÁRSFALVI Levente-2
In reply to this post by MiaM
Hi!


On 2017-08-31 20:06, Mia Magnusson wrote:
> Hi!
>
> As many people already know, the C64 is older than the consumer S-video
> signal format and doesn't comply completely with that standard.
> ...

I don't know all the answers, but I might add something (or what)...

- In a TV set / composite monitor, group delay between luma and the
demodulated U' and V' components is inevitable, due to the need to
first separate two signals of different bandwidth (chroma from luma),
and then to demodulate the color difference signals (whilst luma, by
itself, doesn't need to be demodulated). To compensate for that, the
devices usually have a small analog delay line, of some tens of ns, in
the luma signal path. How well this compensation works depends on the
analog components and setup in earlier PAL TV sets and monitors. (Later
models might have that integrated into some chip, just as the earlier
1H glass electroacoustic delay line later made it into sampled
mixed-signal integrated circuits, and so they suffer less from the
degradation and/or variance of analog component values.)

- I've never noticed this unusual color delay problem with C64s on late
(high resolution, small pixel mask) CRT TVs and monitors myself. (On
the other hand, it was long ago that I last played around with a C64.)
Nor did I ever notice such a delay with the 264 series machines. I
definitely did notice the problems that originate from the unusually
high pixel clock and luma bandwidth of the C64 (--> pseudo-colors and
color artifacts as a result), but that one might be unrelated to the
color shift problem you're describing.

- Some years ago, I had a long (and still not finished :-) ) fight
fixing and correcting a PAL Commodore 1702 monitor (a 1702-T / a
Toshiba CRT version, to be precise). All I can say is that the luma
delay was anything but perfect - after that many years, or by design,
or I don't really know how or why. The delay even differed between
composite and separate luma-chroma input mode; nevertheless it was way
off for both modes (about 1...1.5 pixels difference on the screen),
both with Commodore machines, a recent DVD player that I used to
generate test signals, and a standard TV picture generator instrument
that I purchased and used later. I've never noticed that problem on my
other "old style" (composite / separate luma-chroma / large raster
mask) Commodore color monitor (an 1801, that is). All in all I'd say:
if the question is about some strange luma-chroma delay value of some
particular Commodore display make, I'd much rather suspect a by-design
poor luma delay circuit than some third-party supplier purposely
tailoring the luma delay value of their product to the task.

(As a side-note, AFAIK basically all Commodore monitors have been
manufactured by contractors... JVC, Sharp, Philips, Samsung, and so on;
and also, the picture of the C64 would need to be consistently shifted
differently on standard CRT TVs than on early Commodore displays if
Commodore had had the manufacturers tailor the luma delay value of
their products; which AFAIK has not been confirmed.)

- For the 1702 fix, I think I found something promising: some
manufacturers offer tapped analog delay lines (that is, the usual
coil-type delay lines, with many taps). With that I think I can replace
the original delay line in the display and select a tap that provides
the optimal delay. I'll definitely calibrate the thing against some
standard signal source, and not a Commodore machine (but as said, ATM I
don't expect significant differences). Such a delay line might also be
a suitable base component for your case (or, I don't know... with that,
impedance matching might become difficult).


Best regards,

Levente


Re: Difference in luma-chroma delay of C64/C128 compared to standard S-video

MiaM
In reply to this post by silverdr-2
On Thu, 31 Aug 2017 22:14:56 +0200, [hidden email] wrote:

>
> > On 2017-08-31, at 21:26, Mia Magnusson <[hidden email]> wrote:
> >
> >>> As many people already know, the C64 is older than the consumer
> >>> S-video signal format and doesn't comply completely with that
> >>> standard.
> >>
> >> Well, it doesn't comply to /any/ video standard if we want to tell
> >> the truth :-)
> >
> > Yes, I know that the level for the chroma is a bit off and that the
> > output stages don't really have the correct impedance either, but
> > that usually works fine as it is. Almost (?) every TV has AGC for
> > the chroma, adjusted by the level of the color burst, so it's
> > probably not a big problem. :)
>
> The levels are off specs, the timings are off specs, the impedance is
> off specs, the sync pulses are off specs, the signal is
> non-interlaced, and so on :-)

:)

> Some time ago I was intensely brainstorming and pushing (even here)
> an idea to design an interface that would "legalise" all the above so
> that every piece of standard compliant (and expecting standard
> signal) piece of equipment could handle the video from the VIC-II
> equipped CBM machines. That is until I realised that it can't really
> be done.

Well, the slightly off sync frequencies (and the non-interlaced
signal) would require a TBC to correct, but the levels, impedance and
the timing difference between chroma and luma could easily be
corrected. Also any improper sync signal (length of pulses, waveform,
etc.) could be corrected. But as the sync frequencies are non-standard
there is probably not much use for such correction, as all the
non-standard stuff works on most displays anyway (except that the
chroma-luma delay gives a worse picture quality than what is possible
to achieve).

> > Well, at least the CRT TV I modified had an "s-video" control signal
> > that just selected between two different delays. Selecting or not
> > selecting delays in a Commodore CRT would probably be easy to spot.
> > [...]
> > Well, the composite video seems to have standard timing, and we know
> > that the modulator just mixes the luma and chroma signals (with an RC
> > filter which might change the timing slightly, but that's easy to
> > measure), so we could probably assume that the wanted delay is the
> > difference between the specs for composite video and s-video.
>
> Now, I might be missing something but to me the composite video is a
> way of 2to1 encoding/muxing so that two signals can be transmitted
> over one line rather than two. Unless I am now mistaken, there should
> be no differences (other than unintended) in the luma/chroma timing
> between the two. Now that you say about your CRT TV, I start
> wondering whether I am really missing something.. like time needed to
> decode composite.. hmm.. but that still shouldn't affect
> interrelation between the two. I always assumed that what I was doing
> in the studio was due to unintended differences caused by
> non-broadcast quality of the equipment used rather than inherent
> differences.

For some reason the decoding process in a TV needs a delay anyway.
It's likely that the s-video standard was set to make an S-VHS player
as simple as possible, i.e. bypassing any delay that's needed for a VHS
to do composite -> separate chroma/luma -> FM modulate luma and
frequency shift chroma -> record to tape, playback from tape -> FM
demodulate luma and frequency shift chroma -> combine luma and chroma.

So it makes sense that the signals could have different timing specs
for composite vs. s-video in general.

> >>> 3: Measure colour bars from a C64 and a known S-video source (for
> >>> example CD32)
> >>
> >> I just (few weeks ago) wrote a small proggy for the 64 to display
> >> the quasi-standard colour bars over the whole screen. I wrote it to
> >> measure some other aspects but can be used to measure the
> >> difference. Just connect the 64 displaying the bars to a good
> >> waveform/vectorscope measurement set and compare it to a known
> >> standard source (like the broadcast test signal generator). As I
> >> wrote a minute ago in another thread I once ran a studio and I
> >> still have all of those (scopes and generators) if needed.
> >
> > I don't have that kind of equipment, just a (modern digital)
> > oscilloscope so I'd have to look at the waveform. The chroma-luma
> > timing should be visible that way too.
>
> Yes. Good oscilloscope will be fine.

Although cheap, I find my Rigol rather good (even better than what I
remember from the digital scopes we had at a previous job about 15-20
years ago, even though those were HP and other "good brands"). I've
successfully used the "view one line" function on the composite output
from a VIC-20 and was able to identify where on the screen it says
"READY.", so it seems to work with Commodore semi-non-standard
signals :)

> > Yes, but with my current setup it's obvious that there is mistiming.
>
> Interesting. I tested my "colour bars" program on real hardware and
> s-video monitors and didn't notice anything obvious. Might it be that
> the reason is somewhere else? Like that your current s-video equipped
> display is not driven by analogue video circuitry anymore and
> therefore has problems with the non-standard-compliant signal from
> the 64?

My current setup uses a perhaps 10 year old 40" Samsung plasma flat
screen TV with its s-video input. But I remember that the timing issue
was the same when using a CRT TV, and I especially remember that the
timing was different on a C64 and a CD32: the C64 did fit a Commodore
CRT monitor while the CD32 fit the CRT TV, but using the CD32 on a
Commodore CRT monitor or the C64 on the CRT TV gave the wrong timing.

> > [...]
> > Well, the point of using a long cable is that the cable is
> > available in shops almost everywhere so you don't have to order
> > stuff.
>
> I see. But still, your "obvious" timing problem smells somewhat fishy
> to me. I never really noticed that kind of issue when using analogue
> s-video displays, including the truly hi-res, broadcast studio types.
> My bet is that yours is not analogue, is it? And if it is - might it
> be miscalibrated somehow? OTOH you say you did this before for the
> same reason...

Could it maybe be that the professional displays compensate for timing
issues (by looking at the timing of the color burst vs. the sync
pulses)?

It might also be that my digital TV set tries to do something with the
timing but as the C64 timing is a bit too off the TV takes an incorrect
guess and displays an even worse picture than on most other displays.

In practice the light green border on the C128 default screen has a
white stripe right next to the right edge of the visible area, and
there is corresponding bleed from the green border onto the text area
to the left of the visible area. The white stripe is about as wide as
two pixels, which is why I'm guessing about 30 metres of cable delay
would probably make the picture better. (Sorry if I'm writing the same
stuff all over, I tend to forget what I've already told or not :) )

I assume from your e-mail address that you too live in the 50Hz/PAL
area :)

> Well - then I probably don't have anything clever to say on that. I
> might help with the measurement and comparison between a reliable,
> standard signal source (the CD32 might not count as such) and the 64.
> BTW - which board version do you have? There were big differences
> between the modulators and the resulting output - from tack sharp to
> fully soapy, for example. So this might also have an effect.

Currently I actually have a PAL C128 (not CR); the C64 that I had
while I used a CRT TV was sold off long ago. Not sure which revision of
the C128 it is, I haven't looked inside yet. :) I have another C128
waiting for repair (it starts up with the correct background and border
color but the screen is filled with incorrect characters, so it's
actually good enough to see this timing issue). The picture seems to be
rather sharp, but not as sharp as the 80-column mode (using a simple
resistor network to feed the analogue RGB input of a SCART socket on
the TV). I would at least say that the picture is sharp enough to make
the two-pixel-wide strokes in the characters unnecessary; the font just
looks fat.

I have my old CRT TV and also a bunch of Commodore CRT monitors
somewhere (underneath, behind...) so I could probably try them too. If
the CRT TV still works it would be interesting to check the timing
thumb wheel. I don't have any perfect reference signal generator but
have a few different DVD players, digital set-top boxes, etc. that I
could use as s-video "references".



--
(\_/) Copy the bunny to your mails to help
(O.o) him achieve world domination.
(> <) Come join the dark side.
/_|_\ We have cookies.


Re: Difference in luma-chroma delay of C64/C128 compared to standard S-video

silverdr-2

> On 2017-08-31, at 23:19, Mia Magnusson <[hidden email]> wrote:
>
> Well, the slightly off sync frequencies (and the non-interlaced
> signal) would require a TBC to correct, but the levels, impedance and
> the timing difference between chroma and luma could easily be
> corrected. Also any improper sync signal (length of pulses, waveform,
> etc.) could be corrected.

When I was pushing that, I was almost sure everything could be handled and corrected. The eventual show-stopper was when I realised that odd and even fields are actually of different duration, and there is no way to make the VIC-II generate every other field longer/shorter.

> But as the sync frequencies are non-standard there is
> probably not much use for such correction, as all the non-standard
> stuff works on most displays anyway (except that the chroma-luma delay
> gives a worse picture quality than what is possible to achieve).

The non-standard C64 signal works on analogue displays as they are both analogue-driven and also (in big part because of that) accommodate much wider differences. Current circuitry rarely works acceptably well with strongly non-standard signals like those from the 64.

> For some reason the decoding process in a TV anyway needs a delay. It's
> likely that the s-video standard were set to make a S-VHS player as
> simple as possible, i.e. bypassing any delay that's needed for a VHS to
> do composite -> separate chroma/luma -> FM modulate luma and frequency
> shift chroma -> record to tape, playback from tape -> FM demodulate
> luma and frequency shift chroma -> combine luma and chroma.
>
> So therefore it makes sense that the signals could have different
> timing specs for composite v.s. s-video in general.

I /think/ the difference in the delays in the decoders is caused by different signal paths, not by a difference in the specs of CVBS vs. Y/C. The delay lines are needed for other reasons.

To make sure (I believe so, but am happy to verify) I can take a test signal generator (I should still have a higher-end Fluke somewhere) and put both on the scope to see if there is any noticeable difference. You said you estimated the delay you wanted to introduce. What would that be? In other words - what time difference am I supposed to look for?

>> Yes. Good oscilloscope will be fine.
>
> Although cheap, I find my Rigol rather good

I used the wrong wording. For that purpose, virtually any working oscilloscope will be fine. By "good" I meant something more like "working correctly", which is not always the case with some old ones. And - BTW - I have also been using Rigols for quite some time. They are good value for money IMHO. But they could be less noisy, for sure.

> (even better than what I
> remember from the digital scopes we had at a previous job about 15-20
> years ago, even though those were HP and other "good brands"). I've
> successfully used the "view one line" function on the composite output
> from a VIC-20 and was able to identify where on the screen it says
> "READY.", so it seems to work with Commodore semi-non-standard
> signals :)

Yes, oscilloscope will always work with that :-)

> [...]
> My current setup uses a perhaps 10 year old 40" Samsung plasma flat
> screen TV with its s-video input.

Normally, I would easily bet the problem is this one, but then you write:

> But I remember that the timing issue
> was the same when using a CRT TV, and I especially remember that the
> timing was different on a C64 and a CD32: the C64 did fit a Commodore
> CRT monitor while the CD32 fit the CRT TV, but using the CD32 on a
> Commodore CRT monitor or the C64 on the CRT TV gave the wrong timing.

... and that's the confusing part.

>> I see. But still, your "obvious" timing problem smells somewhat fishy
>> to me. I never really noticed that kind of issue when using analogue
>> s-video displays, including the truly hi-res, broadcast studio types.
>> My bet is that yours is not analogue, is it? And if it is - might it
>> be miscalibrated somehow? OTOH you say you did this before for the
>> same reason...
>
> Could it maybe be that the professional displays compensate for timing
> issues (by looking at the timing of the color burst vs. the sync
> pulses)?

No, the exact opposite is more true. They show things "as they are". That's one of the main reasons for using them in studios in the first place.

> It might also be that my digital TV set tries to do something with the
> timing but as the C64 timing is a bit too off the TV takes an incorrect
> guess and displays an even worse picture than on most other displays.

As before - that would be my first bet. And it really is a hit-and-miss game as to which of the TVs/adapters will produce an acceptable result and which will not. More often it is "not" than "yes".

> In practice the light green border on the C128 default screen has a
> white stripe right next to the right edge of the visible area, and
> there is corresponding bleed from the green border onto the text area
> to the left of the visible area. The white stripe is about as wide as
> two pixels, which is why I'm guessing about 30 metres of cable delay
> would probably make the picture better. (Sorry if I'm writing the same
> stuff all over, I tend to forget what I've already told or not :) )

Ah, now I forgot what you already wrote before - so that's the timing difference you're looking at: two pixel clock cycles of a C64 PAL signal, right?

> I assume from your e-mail address that you too live in the 50Hz/PAL
> area :)

I do. Although there is hardly any PAL left outside of the retro stuff circles.

> Currently I actually have a PAL C128 (not CR)

I don't know that much about the C128 modulator differences, but with C64s the differences were really big.

--
SD! - http://e4aws.silverdr.com/



Re: Difference in luma-chroma delay of C64/C128 compared to standard S-video

Pasi 'Albert' Ojala
In reply to this post by MiaM
On 01.09.2017 00:19, Mia Magnusson wrote:

>
> For some reason the decoding process in a TV needs a delay anyway.
> It's likely that the s-video standard was set to make an S-VHS player
> as simple as possible, i.e. bypassing any delay that's needed for a
> VHS to do composite -> separate chroma/luma -> FM modulate luma and
> frequency shift chroma -> record to tape, playback from tape -> FM
> demodulate luma and frequency shift chroma -> combine luma and chroma.
>
> So it makes sense that the signals could have different timing specs
> for composite vs. s-video in general.
Hi Mia,

The delay is only needed for PAL (Phase-Alternating Line), and some of
the first PAL receivers didn't bother with having the delay. In PAL the
phase of the color-difference signal V is inverted on every line. The
delay is used to "perfectly separate" U and V by calculating
(u+v)-(u'-v') = 2V and (u+v)+(u'-v') = 2U.

"perfectly" only if the two successive lines of a field contain the same
color information. They are also separated by the line from the other
interlaced field in the final frame, so it's seldom the case. However,
the main feature of PAL is that it corrects the phase issues of color,
so no tint control is needed.
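
A toy numeric illustration of that sum/difference trick (just a sketch
of the idealised baseband maths, not how a real decoder is built; the
amplitudes are made up):

  import numpy as np

  fsc = 4.43361875e6                     # PAL colour subcarrier, Hz
  t = np.arange(0, 10e-6, 1 / (8 * fsc))
  U, V = 0.3, 0.2                        # colour-difference amplitudes
  # V flips sign on every line; assume two lines with identical colour:
  line_n  = U * np.sin(2*np.pi*fsc*t) + V * np.cos(2*np.pi*fsc*t)
  line_n1 = U * np.sin(2*np.pi*fsc*t) - V * np.cos(2*np.pi*fsc*t)
  two_u = line_n + line_n1               # = 2U*sin(...), V cancels
  two_v = line_n - line_n1               # = 2V*cos(...), U cancels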

S-VHS stores luminance and chrominance separately on the tape, so it is
easy to provide them also separately in the S-VIDEO connector.

There can still be a timing difference between luma and chroma,
causing the color information to shift, in addition to it having a
smaller bandwidth and thus not naturally coinciding with the edges of
the luminance signal. Digital televisions have methods to look for the
edge in the luminance and then correct the edge of the chrominance to
match.

In theory composite is produced by summing the Y and C of S-VIDEO. In
practice you may get time shift whenever they are combined or separated.

-Pasi



Re: Difference in luma-chroma delay of C64/C128 compared to standard S-video

smf
In reply to this post by silverdr-2
On 31/08/2017 22:58, [hidden email] wrote:

> When I was pushing that, I was almost sure everything could be handled
> and corrected. The eventual show-stopper was when I realised that odd
> and even fields are actually of different duration, and there is no way
> to make the VIC-II generate every other field longer/shorter.

If you could adjust the VIC-II timing externally, then it would likely
upset a game or demo.

It's weird that some modern TVs have problems with 240p from analogue
inputs (including component). However, they appear to be OK when fed HDMI.

I wonder how we could persuade the people churning out the Wii & PS2
component-to-HDMI adapters that there is demand for a C64 Y/C to HDMI
adapter.




Re: Difference in luma-chroma delay of C64/C128 compared to standard S-video

smf
In reply to this post by HÁRSFALVI Levente-2


On 31/08/2017 22:04, HÁRSFALVI Levente wrote:

> (As a side-note, AFAIK basically all Commodore monitors have been
> manufactured by contractors... JVC, Sharp, Philips, Samsung, and so on;
> and also, the picture of the C64 would need to be consistently shifted
> differently on standard CRT TVs than on early Commodore displays if
> Commodore had had the manufacturers tailor the luma delay value of
> their products; which AFAIK has not been confirmed.)

Supposedly the monitor was tailored to the 64.

Commodore wouldn't have a problem selling you a monitor that would only work properly with the C64, because they wanted a reason to sell you another monitor.

But Commodore quality control was only interested in a vague sense of working, and not catching fire too often.

I never could afford a monitor back then, so I couldn't tell you whether it was misaligned or not.


Re: Difference in luma-chroma delay of C64/C128 compared to standard S-video

Gerrit Heitsch
In reply to this post by smf
On 09/01/2017 10:46 AM, smf wrote:

> On 31/08/2017 22:58, [hidden email] wrote:
>
>> When I was pushing that, I was almost sure everything could be handled
>> and corrected. The eventual show-stopper was when I realised that odd
>> and even fields are actually of different duration, and there is no way
>> to make the VIC-II generate every other field longer/shorter.
>
> If you could adjust the VIC-II timing externally, then it would likely
> upset a game or demo.
>
> It's weird that some modern TVs have problems with 240p from analogue
> inputs (including component). However, they appear to be OK when fed
> HDMI.
>
> I wonder how we could persuade the people churning out the Wii & PS2
> component-to-HDMI adapters that there is demand for a C64 Y/C to HDMI
> adapter.

... or see if you can make a 'simple' circuit that takes Y/C on the
input and provides component on the output. The TDA4510 does that if I
read the datasheet right. Not sure if it's still available.

  Gerrit



Re: Difference in luma-chroma delay of C64/C128 compared to standard S-video

silverdr-2
In reply to this post by smf

> On 2017-09-01, at 10:46, smf <[hidden email]> wrote:
>
> On 31/08/2017 22:58, [hidden email] wrote:
>
>> When I was pushing that, I was almost sure everything could be handled and corrected. The eventual show-stopper was when I realised that odd and even fields are actually of different duration, and there is no way to make the VIC-II generate every other field longer/shorter.
>
> If you could adjust the VIC-II timing externally, then it would likely upset a game or demo.

Depends. It's academic now, but SCROLY/RASTER register driven stuff shouldn't be affected, except if CPU cycle-counted across the VBI. But - in any case - once I eventually realised that it would require alternating the field duration, it became clear that it can't really be done the way I wanted it. Namely, fully standard compliant and with no perceivable latency.

> It's weird that some modern TVs have problems with 240p from analogue inputs (including component). However, they appear to be OK when fed HDMI.

It's the digitising circuitry that has problems with those. 240p existed in the digital domain for a long time, but it wasn't really much of a thing in analogue, and "typical" analogue input circuitry is not prepared to handle it.

> I wonder how we could persuade the people churning out the Wii & PS2 component-to-HDMI adapters that there is demand for a C64 Y/C to HDMI adapter.

? AFAIR it's not /that/ difficult, and there are adapters that do this kind of thing, including ones that specialise in the signals of old computers and consoles.

--
SD! - http://e4aws.silverdr.com/



Re: Difference in luma-chroma delay of C64/C128 compared to standard S-video

silverdr-2
In reply to this post by Pasi 'Albert' Ojala

> On 2017-09-01, at 10:44, Pasi 'A1bert' Ojala <[hidden email]> wrote:
>
> On 01.09.2017 00:19, Mia Magnusson wrote:
>>
>> For some reason the decoding process in a TV needs a delay anyway.
>> It's likely that the s-video standard was set to make an S-VHS player
>> as simple as possible, i.e. bypassing any delay that's needed for a
>> VHS to do composite -> separate chroma/luma -> FM modulate luma and
>> frequency shift chroma -> record to tape, playback from tape -> FM
>> demodulate luma and frequency shift chroma -> combine luma and chroma.
>>
>> So it makes sense that the signals could have different timing specs
>> for composite vs. s-video in general.
> Hi Mia,
>
> The delay is only needed for PAL (Phase-Alternating Line), and some of the first PAL receivers didn't bother with having the delay.

Just for completeness - a delay line is not *only* for PAL, where it could theoretically be omitted, leading to Hannover bars effect ;-) it is also required for SECAM, where it acts as temporary storage between two consecutive lines. True there were no SECAM outputting CBM machines :-)

> In theory composite is produced by summing the Y and C of S-VIDEO. In practice you may get time shift whenever they are combined or separated.

Which is why studio equipment (TBCs and Co.) has adjustments for keeping them in sync.

--
SD! - http://e4aws.silverdr.com/



Re: Difference in luma-chroma delay of C64/C128 compared to standard S-video

HÁRSFALVI Levente-2
In reply to this post by HÁRSFALVI Levente-2
Ps.


In the Poynton book (Digital Video and HDTV: Algorithms and Interfaces)
one can find the following sentences about the NTSC standard (and also
for a couple of other references, including PAL):

"...The Y’ and C components should be time-coincident
within ±25 ns. Error in chroma timing is known as
chroma-luma delay. "

(2003 edition, page 514)

So, to answer one of the original questions: the luma-chroma delay of
standard composite and separate Y/C video should be 0 ns, within a ±25
ns margin.
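
For scale, comparing that tolerance to the mistiming reported earlier
in the thread (a sketch; assumes the ~2-pixel estimate and a PAL C64
dot clock of crystal/2.25):

  dot_clock = 17.734475e6 / 2.25       # PAL C64 pixel clock, ~7.882 MHz
  error_ns = 2 / dot_clock * 1e9       # ~254 ns observed mistiming
  print(error_ns / 25)                 # ~10x the +/-25 ns tolerance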




Re: Difference in luma-chroma delay of C64/C128 compared to standard S-video

Marko Mäkelä
In reply to this post by silverdr-2
On Fri, Sep 01, 2017 at 11:45:35AM +0200, [hidden email] wrote:
>Just for completeness - a delay line is not *only* for PAL, where it
>could theoretically be omitted, leading to Hannover bars effect ;-)

This was a new term for me. For completeness, here is the Wikipedia
page:

https://en.wikipedia.org/wiki/Hanover_bars

(For some reason the English name is one letter shorter than the German
name.)

Be sure to read the linked articles; very interesting stuff on dot
crawl, chroma dots and so on.

>it is also required for SECAM, where it acts as temporary storage
>between two consecutive lines. True there were no SECAM outputting CBM
>machines :-)

Were there actually any SECAM outputting home computers, or were they
all RGB or RGBI through SCART? I wonder how/if the French people used
computer-generated graphics or title pages in their VHS home videos. How
would you record from a C64 or Amiga to SECAM VHS, for example?

        Marko


Re: Difference in luma-chroma delay of C64/C128 compared to standard S-video

HÁRSFALVI Levente-2
On 2017-09-01 21:51, Marko Mäkelä wrote:
> ...
> Were there actually any SECAM outputting home computers, ...

Some of the late Russian ZX Spectrum clones come to mind first. (Most of
them produced SECAM "by hand"; that is, by building the thing from TTLs
and discretes).


Re: Difference in luma-chroma delay of C64/C128 compared to standard S-video

silverdr-2
In reply to this post by Marko Mäkelä

> On 2017-09-01, at 21:51, Marko Mäkelä <[hidden email]> wrote:
>
> On Fri, Sep 01, 2017 at 11:45:35AM +0200, [hidden email] wrote:
>> Just for completeness - a delay line is not *only* for PAL, where it could theoretically be omitted, leading to Hannover bars effect ;-)
>
> This was a new term for me.

Probably because you haven't lost too many hours because somebody (sometimes me) forgot to re-patch cables correctly on the panel ;-)

>> it is also required for SECAM, where it acts as temporary storage between two consecutive lines. True there were no SECAM outputting CBM machines :-)
>
> Were there actually any SECAM outputting home computers, or were they all RGB or RGBI through SCART?

I don't recall any. At least not of the original brands. Maybe some "clones" made in the USSR, as Levente mentioned.

> I wonder how/if the French people used computer-generated graphics or title pages in their VHS home videos.

In France they probably used greyscale for title pages :-)

> How would you record from a C64 or Amiga to SECAM VHS, for example?

I can't say for France, but outside of there, there were hardly any purely SECAM recorders. I don't remember a single one that was SECAM but not PAL at the same time. Even those that were SECAM-enabled were usually ME-SECAM, which was basically a PAL machine with slightly modified PAL circuitry that did not reject the signal as not conforming to the PAL standard. The disadvantage (a non-scientifically-confirmed but possible, subjective impression) was that such recordings were more prone to colour distortion. While the SECAM broadcast signal was generally good, home recordings made on ME-SECAM equipment showed degradation substantially faster than PAL recordings.

--
SD! - http://e4aws.silverdr.com/



Re: Difference in luma-chroma delay of C64/C128 compared to standard S-video

silverdr-2
In reply to this post by HÁRSFALVI Levente-2

> On 2017-09-01, at 20:55, HÁRSFALVI Levente <[hidden email]> wrote:
>
> In the Poynton book (Digital Video and HDTV: Algorithms and Interfaces)
> one can find the following sentences about the NTSC standard (and also
> for a couple of other references, including PAL):
>
> "...The Y’ and C components should be time-coincident
> within ±25 ns. Error in chroma timing is known as
> chroma-luma delay. "
>
> (2003 edition, page 514)
>
> So, to answer one of the original questions: the luma-chroma delay of
> standard composite and separate Y/C video should be 0 ns, within a ±25
> ns margin.

Interestingly, I recall some of the equipment having adjustments for the chroma/luma relation in something as seemingly strange as 37 ns steps. No idea where it came from. Maybe the rationale was: if the error is less than 25 ns, don't bother; but if it's bigger, then shifting by 37 ns (instead of 25) will surely bring it into the "0 ±25 ns" range, even if not exactly to "0", whether the error is 26 or 50 or 60 ns...

--
SD! - http://e4aws.silverdr.com/



Re: Difference in luma-chroma delay of C64/C128 compared to standard S-video

Ingo Korb
[hidden email] writes:

> Interestingly, I recall some of the equipment having adjustments
> for the chroma/luma relation in something as seemingly strange as
> 37 ns steps. No idea where it came from.

That's half of a pixel at the usual 13.5 MHz BT.601 luma sampling
frequency, or a quarter of a pixel for chroma.
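
A quick check of that (assuming 4:2:2 sampling, i.e. chroma at half the
luma rate):

  luma_period_ns = 1e9 / 13.5e6        # ~74.1 ns per luma sample
  print(luma_period_ns / 2)            # ~37.0 ns -> the step size
  chroma_period_ns = 1e9 / 6.75e6      # ~148.1 ns per chroma sample
  print(chroma_period_ns / 4)          # also ~37.0 ns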

-ik
