[net.micro.atari16] 520ST composite output

BillHolland.ES@XEROX.COM (07/02/86)

I connected a monochrome monitor (non-Atari) to my ST composite output,
and the resolution is worse than the low or medium resolution going
through the RF modulator on my color TV.  I tried all kinds of colors
using the control panel, but the display looks terrible.  I know the
monitor is OK because I connected it to a mono source and it looked
great.  Any solutions, anyone?

Bill 

jhs@disunix.UUCP (07/02/86)

Composite signals are made by black magic to fit into a nominal 6 MHz TV
channel -- in fact I believe the video portion has to fit into 4.5 MHz.
To do this it is necessary to low-pass filter the luminance signal to leave
room for the chroma signal, and also to bandpass filter the chroma (or lowpass
filter it in baseband form before putting it on the color subcarrier).
Anyway, these lowpass operations amount to smearing of detail in the picture
or intentionally limiting resolution.  Apparently the chroma suffers the
worst, which is why in color TV pictures the red areas sometimes appear to be
painted on sloppily so they spill outside the area that was supposed to be
red.
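
A rough back-of-the-envelope on what those bandwidths buy you (the
4.2 MHz luma and 0.6 MHz chroma figures are nominal NTSC numbers I'm
assuming, not anything measured off the ST):

    # One cycle of the highest frequency that survives the lowpass can
    # paint one light/dark pair, i.e. two "pixels" across the line.

    ACTIVE_LINE_US = 52.7      # visible part of an NTSC line, microseconds

    def h_resolution(bandwidth_mhz):
        """Approximate distinct horizontal samples per line."""
        cycles = bandwidth_mhz * ACTIVE_LINE_US   # MHz * us = cycles/line
        return int(2 * cycles)                    # two pixels per cycle

    print(h_resolution(4.2))   # luma:   ~442 -- marginal for 640 pixels
    print(h_resolution(0.6))   # chroma:  ~63 -- hence the red smear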

If you could get the baseband luminance signal out of the Atari before it has
been filtered, you might get the resolution you desire.  I have no data on the
circuit, but maybe somebody out there knows if this signal is available to be
brought out.  I speculate that it may be available somewhere over on the
monochrome monitor output.  If it is still active somewhere even when you have
plugged into the other jack, you may be able to get access to it.

Too bad they didn't provide an option of 60-Hz standard monochrome output as
well as composite.

The other trick that comes to mind is to run your monochrome monitor off
the GREEN line of the RGB output.  That is supposed to give a reasonable
monochrome picture for most normal scenes.
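
The reason green alone works is that green carries most of the
luminance.  A little sketch using the standard NTSC luma weights (the
test colors are just examples of mine):

    # Luminance is a weighted sum of R, G, B; green contributes ~59%,
    # so the G line alone approximates brightness for typical scenes.

    def luma(r, g, b):
        return 0.30 * r + 0.59 * g + 0.11 * b

    for name, (r, g, b) in {"white": (1.0, 1.0, 1.0),
                            "skin":  (0.9, 0.7, 0.6),
                            "red":   (1.0, 0.0, 0.0)}.items():
        print(name, round(luma(r, g, b), 2), "vs green-only:", g)

    # Pure red is the failure case: true luma 0.30, green-only 0.0,
    # so saturated reds go black on a green-fed monochrome monitor.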

Good luck.

-John Sangster / jhs@mitre-bedford.arpa

rb@cci632.UUCP (07/11/86)

In article <8607022017.AA02840@mitre-bedford.ARPA> jhs@disunix.UUCP writes:
>Composite signals are made by black magic to fit into a nominal 6 MHz TV
>channel -- in fact I believe the video portion has to fit into 4.5 MHz.
>To do this it is necessary to low-pass filter the luminance signal to leave
>room for the chroma signal, and also to bandpass filter the chroma (or lowpass
>filter it in baseband form before putting it on the color subcarrier).
>Anyway, these lowpass operations amount to smearing of detail in the picture
>or intentionally limiting resolution.  Apparently the chroma suffers the
>worst, which is why in color TV pictures the red areas sometimes appear to be
>painted on sloppily so they spill outside the area that was supposed to be
>red.

One factor in this filtering (for monochrome) is that the "filter" is
actually an analog filter, which means it is rated in terms of dB or
mV/ns.  Some people have been able to get higher resolution from
monochrome simply by using less signal contrast and letting the TV
provide the contrast enhancement.

The problem with using a standard color TV is that the tube has "stripes"
of phosphor, one dyed for each color.  If you try to double the
resolution, two pixels of different colors get fired at the same phosphor
stripe.  The effect can be quite unusual :-).  High resolution monitors
running in low resolution mode simply cycle the same three color "bits"
twice.
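
You can estimate where that breakdown happens from the tube's dot pitch.
A sketch under assumed numbers (the 0.6 mm pitch and 250 mm visible width
are my guesses for a consumer color TV, not ST specifics):

    # A color tube resolves at most one color sample per phosphor triad.
    # Push more pixels per line than there are triads and adjacent
    # pixels of different colors land on, and mix within, the same triad.

    DOT_PITCH_MM = 0.6     # assumed triad pitch, consumer color TV
    SCREEN_MM    = 250.0   # assumed visible line width

    triads = SCREEN_MM / DOT_PITCH_MM      # ~416 triads per line
    for pixels in (320, 640):
        fits = pixels <= triads
        print(pixels, "pixels/line:",
              "resolved" if fits else "pixels share triads")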

>If you could get the baseband luminance signal out of the Atari before it has
>been filtered, you might get the resolution you desire.  I have no data on the
>circuit, but maybe somebody out there knows if this signal is available to be
>brought out.  I speculate that it may be available somewhere over on the
>monochrome monitor output.  If it is still active somewhere even when you have
>plugged into the other jack, you may be able to get access to it.

>Too bad they didn't provide an option of 60-Hz standard monochrome output as
>well as composite.

The problem here is that a television actually runs at a 30 Hz frame
rate, using interlace to produce 60 Hz fields.  On the ST, the mono mode
is normally 70 Hz non-interlaced.  This means you need roughly 2.2 times
the bandwidth of a normal TV.  It would be possible to get a relatively
flicker-free signal by running 70 Hz interlaced into a normal black and
white TV.
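
Rough arithmetic behind that factor: bandwidth scales with total lines
per frame times frames per second (a sketch that ignores blanking
overhead; the ~500 total lines for ST mono is my assumption):

    # Dot-rate comparison, ST mono vs. broadcast TV.

    tv_rate = 525 * 30   # NTSC: 525 lines/frame, 30 full frames/s, interlaced
    st_rate = 500 * 70   # ST mono: ~500 lines/frame (assumed), 70 Hz progressive

    print(st_rate / tv_rate)   # ~2.2, the ratio quoted above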

>The other trick that comes to mind is to run your monochrome monitor off the
>GREEN output of the RGB output.  That is supposed to give a reasonable
>monochrome picture for most normal scenes.

The question here is: does the green output provide sufficient blanking
and retrace levels to keep the mono monitor in sync?

I would suggest that you take a look at "The Cheap Video Cookbook" and
"Son of Cheap Video".  In addition to some of the ideas mentioned
above, they have a nice "recipe" for an RGB to composite monochrome
adapter using mostly discrete logic and one or two transistors.
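
For the flavor of it, the signal-level idea behind such an adapter, as
code (a conceptual sketch only, not Lancaster's actual circuit; the
levels are the standard 1 V peak-to-peak composite values):

    # Conceptual model of an RGB -> composite monochrome adapter:
    # mix R, G, B into one brightness level, then splice in standard
    # blanking and sync levels so the monitor can lock on.

    SYNC_V, BLANK_V, WHITE_V = 0.0, 0.3, 1.0   # 1 V p-p composite levels

    def composite_mono(r, g, b, blanking=False, sync=False):
        """One sample of the output signal; r, g, b in 0..1."""
        if sync:                    # sync tip sits below blanking
            return SYNC_V
        if blanking:                # retrace interval: no video
            return BLANK_V
        video = (r + g + b) / 3.0   # crude equal-weight mix (resistor network)
        return BLANK_V + video * (WHITE_V - BLANK_V)

    print(composite_mono(1, 1, 1))             # white -> 1.0 V
    print(composite_mono(0, 0, 0))             # black -> 0.3 V (blanking)
    print(composite_mono(0, 0, 0, sync=True))  # sync tip -> 0.0 V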

You may also want to look at the spec sheets for the Motorola MC137X
line for an RGB to TV converter.