[comp.sys.atari.8bit] Chroma/Luma on 800XL

fisher@star.dec.com (Burns Fisher ZKO1-1/D42 DTN 381-1466) (01/07/87)

A raging (well, mild) debate has been going on around here regarding the 
presence or absence of a chroma signal on the 800XL video output.  Could
someone give a reasonably technical explanation of what the situation is?

Synopsis of the argument:

An article a few years ago in Antic says that XLs don't have chroma.
However, it appears that when you hook up the luma/chroma output pins to
the luma/chroma inputs of a monitor, it works.  There is some confirmation
for the Antic side if you look at the board: there are only four connections
to the video jack, which is just right for Composite/Luma/Audio/Ground,
leaving no room for a chroma signal.

There are also references floating around claiming that it is possible to
get the chroma signal out with a magic capacitor or two, but nothing definitive
has surfaced.

Questions:

1.  Is it true that there is no chroma on an 800XL?

2.  If so, what IS happening when we make connections to the chroma input?
    (For example, does composite into the chroma input have the same effect?)

3.  Can we get chroma via hacking?

4.  Does it matter?

Thanks.

Burns

jhs@MITRE-BEDFORD.ARPA (01/08/87)

1.  Correct; there is no chroma output on the stock 800XL.

2.  Yes, it can be brought out, since it is present internally.  One group
    that did this reports that a 200-ohm resistor from the junction of R68
    and R67 to Pin 5 of monitor jack J2 will do it.  On the other hand, this
    obviously gives a somewhat higher-than-standard source impedance, which
    could cause loss of resolution if the far end of the monitor video cable
    is not properly terminated.  You might want to fiddle with the circuit to
    get the correct source impedance while keeping the chroma level right.
    If you need help, I can probably provide it or find someone who can.  At
    most you would need to change the resistor value and perhaps add another
    resistor to ground to get it right.

    The source impedance is closely approximated by 20 ohms plus the series
    resistance you use (200 ohms in the above example).  Most of the 20 ohms
    comes from the emitter resistance of Q5, which is about 26 ohms at room
    temperature for most silicon small-signal transistors; R66 shunts this
    to bring it down to roughly 20 ohms.  R75 should have little effect since
    nothing will be connected to Pin 4 of J2.  To get a 75-ohm source you
    would therefore need about 56 ohms in series (a standard value), but that
    would raise the chroma level quite a bit.  You should be able to figure
    out a series/shunt combination that gives 75 ohms (or 50, if that is what
    your monitor wants) and still delivers a reasonable chroma signal level
    at the monitor; see the rough worked example after these answers.

3.  Yes, it does matter.  If you use luminance and chroma separately in an
    NTSC decoder that uses the full chroma bandwidth, you can get resolution
    essentially as good as the better RGB monitors give.  Some monitors such
    as the Commodore 1802 (?) do this and give excellent results -- more than
    good enough for 80-column word processing with CDY's OMNIVIEW upgrade
    chip, for example.  Other graphics applications also look much better
    with the enhanced resolution, I am told.  It's a very simple mod --
    "go for it"!

-John Sangster
jhs@mitre-bedford.arpa