[net.micro.amiga] Technically speaking.. Flicker

spencer@oberon.UUCP (Randy Spencer) (01/23/86)

I really need a good description of exactly why the Amiga flickers in
the 400-line mode.  I understand television reasonably well, enough
to know that a television picture is a set of 60 fields per second, each
of 262 lines of video, extending from above the top of the visible portion
of the screen to just below the bottom.  There is 1/60th of a second's worth
of motion and then the next field is drawn, this time with its lines drawn
between the previous field's lines.  This is done to keep the screen from
fading before the electron beam gets to the bottom of 512 lines, 1/30th
of a second later.  When the Amiga is in 200 mode I assume that it is drawing
on the first field and then not on the second?  Or is it drawing every
fourth line, and in the second field it picks up the middle of the
remaining 3?  If the IBM PC only has 200 lines, why is there not such a
noticeable dark line on the undrawn lines, and how about the C64?
I had always assumed that it was just repeating the lines to fill up
400 lines' worth of space.
Perhaps the real problem that I could not figure out in my own head is
this: if the Amiga had to flicker to remain compatible with NTSC, why is it
that even the highest-resolution video of the day does not flicker like
the Amiga?  In fact, no one ever mentioned to me before that NTSC flickered;
PAL, sure, the whole 625 lines are drawn before it starts over again,
sort of like a strobe.
Any chance it has to do with unstable sync signals from the Amiga?  When
I run Setlace it sure seems to start shaking the formerly rock-steady
lines around on the screen.

If this becomes a popular question to respond to feel free to send me
mail, I will post what I learn.

==============================================================================
Randal Spencer      Student DEC Consulting - University of Southern California
Home: 937 N. Beverly Glen Bl. Bel Air California 90077          (213) 470-0428
Arpa: Spencer@USC-ECL  or  Spencer@USC-Oberon          Bitnet: Spencer@USCVAXQ
------------------------------------------------------------------------------

keithd@cadovax.UUCP (Keith Doyle) (01/24/86)

In article <187@oberon.UUCP> spencer@oberon.UUCP (Randy Spencer) writes:
>I really need a good description of exactly why is it that the Amiga will
>flicker in the 400 mode.   ......... 

>.......       If the IBM pc only has 200 lines why is it that there is
>not such a noticeable dark line on the undrawn line, how about the C64?

Jeez, I always thought the black lines between the lines on the IBM PC
were VERY noticeable.  A color-filled rectangle on the screen looks to me
like it was filled with a horizontal stripe pattern.  But I have
been using better-than-average monitors; I've noticed the standard PC
monitors have such a large dot pitch that the color dots just seem to
smudge the video all together, somewhat covering up this black-line
problem.

>I had always assumed that it was just repeating the lines to fill up
>400 lines worth of space.

Wrong.  The lines are interlaced so the apparent overall screen flicker
is at 60 Hz, not at 30 Hz.  Actually, though, really small details ARE
flickering at 30 Hz, not 60 Hz, if they are smaller than 2 lines high.
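The rule Keith describes (a detail confined to one field is only repainted at the frame rate, not the field rate) can be sketched in a few lines. This is purely my own illustration of the arithmetic, not Amiga hardware code:

```python
FIELD_RATE_HZ = 60   # NTSC: 60 fields per second
FRAME_RATE_HZ = 30   # two interlaced fields make one full frame

def field_of(line):
    """Even-numbered display lines are painted in one field,
    odd-numbered lines in the other."""
    return line % 2

def feature_refresh_hz(top_line, height):
    """A feature is repainted at 60 Hz only if it has lines in BOTH
    fields; one confined to a single field repaints at only 30 Hz."""
    fields = {field_of(top_line + i) for i in range(height)}
    return FIELD_RATE_HZ if len(fields) == 2 else FRAME_RATE_HZ

print(feature_refresh_hz(100, 1))  # 30 -> a one-line detail flickers
print(feature_refresh_hz(100, 2))  # 60 -> two lines tall looks steady
```

So anything at least two lines tall straddles both fields and gets refreshed every field, which is why thickening horizontal features tames the flicker.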

>Perhaps the real problem that I could not figure out in my own head is,
>if the Amiga had to flicker to remain compatible with NTSC, why is it
>that even the highest resolution video of the day will not flicker like
>the Amiga?  Infact, no one ever mentioned that NTSC flickered to me before,

This is because most video programs are made up of signals that do not have
a lot of horizontal hard edges and/or lots of small detail with lots of
contrast.  By the same token, depending on what you are displaying on the
Amiga screen, you may or may not notice the flicker.  Using Deluxe Paint
in hi-res mode, the menu bars and such flicker like a SOB, but I've painted
several pictures that use a lot of grey scale and don't have many
horizontal edges, and they don't flicker any more than the lower-res pictures.
It all depends on what you are displaying.  Computers tend to be worst
case, character displays have a lot of small details, and other computer
pictures can be inclined to have a lot of skinny horizontal lines.  Video
images don't have this problem much, except sometimes you might notice such
flicker on nightly news/weather programs where they are using some kind of
display generator, or color keying or something that produces a contrasty
horizontal edge.

>Any chance it has to do with unstable sync signals from the Amiga, when
>I run Setlace it sure seems to start shaking the formerly rock steady
>lines around on the screen.
>
>Randal Spencer      Student DEC Consulting - University of Southern California

Again, when you run Setlace, the horizontal edges flicker at 30 Hz instead of
60 Hz.  Depending on what you are displaying (and window boxes like to present
plenty of horizontal edges), this problem is noticeable to a varying degree.
Adjusting the Preferences colors can minimize this effect somewhat.  Using
all pastel-type colors, for both background and foreground, usually helps.

I expect the upcoming camera digitizer add-on will provide a ready
means of getting video-like images into the hi-res mode, and those should
show little flicker, just like the usual video images we are all used to.
Can't wait for this little bugger to become available, though they better
save the pictures in EA's IFF format so I can use Deluxe Paint to monkey
with them.

Keith Doyle
#  {ucbvax,ihnp4,decvax}!trwrb!cadovax!keithd
#  cadovax!keithd@ucla-locus.arpa

hull@hao.UUCP (Howard Hull) (01/26/86)

If you know about television, then I presume you know that with interlaced
television, one field of 262 lines starts at the upper left of the screen,
whereas the other field (1/60 second later) starts at the upper center of
the screen and goes for half a line before continuing with full-length lines.
When the Amiga is in 320 or 640 by 200 mode, it simply lays the second field
directly on top of the first, all starting at the upper left.  Ordinary TV
usually blanks at least 1.3 milliseconds' worth of lines (approximately 20
lines per field, or a total of 40 lines per frame) to black.  The Amiga blanks 62 lines
to black during vertical retrace (31 at the top, and 31 at the bottom) and
distorts the NTSC standard 3:4 vertical to horizontal aspect ratio by an
"adjustment" of the picture vertical height.  There is a distinct gap that
can be seen between the lines on the Amiga lo-res screen.

When the Amiga goes to 320 or 640 by 400 mode, it interlaces just like the
NTSC television standard does.  Now there is a barely perceptible darkening
between the lines, (their "Gaussian" profiles overlap somewhat) and there
is a distinct "jitter" in the general background.  The jitter is introduced
by the famous 1/30-second persistence problem.  On a television set there
is also the imprecision with which the set positions the interlaced field
lines (or times each field, depending on how you think of it).
As a general rule, you don't notice the flicker in television images because
the transmission bandwidth doesn't allow resolution of much better than 400
lines at the MTF limit (where a fine black and white checkerboard would be
rendered as gray); so features that would be on one line and would flicker
in the "clean" Amiga monitor are somewhat spread out and/or moving on the
television image.  I have the Amiga video modulator, and I notice far less
flicker in the interlace mode on the television set than on the Amiga 1080
monitor.  The resolution, color fidelity, purity, and stability on the 1080,
however, are VASTLY superior to what finally gets to the TV screen.  On the
TV, 80 column characters are just barely tolerable, and even so only close
up and for less than an hour - else even Meditating Guru Eyes will pop right
out of their sockets.
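Howard's blanking figures can be checked with some quick arithmetic. The sketch below uses the round numbers from the post (exact NTSC is 262.5 lines per field at 59.94 fields per second, which is ignored here):

```python
# Back-of-the-envelope raster arithmetic from the figures in the post.
LINES_PER_FIELD = 262
FIELDS_PER_SEC = 60

line_rate_hz = LINES_PER_FIELD * FIELDS_PER_SEC
print(line_rate_hz)          # 15720 lines/s, close to NTSC's 15734 Hz

# A TV blanks roughly 20 lines per field during vertical retrace:
tv_visible = LINES_PER_FIELD - 20
print(tv_visible)            # ~242 potentially visible lines per field

# The Amiga blanks 62 lines (31 top + 31 bottom), which is exactly
# what leaves its familiar 200 display lines per field:
amiga_visible = LINES_PER_FIELD - 62
print(amiga_visible)         # 200
print(2 * amiga_visible)     # 400 lines when the two fields interlace
```

The 20 blanked lines also agree with the "1.3 milliseconds" figure: 20 lines at roughly 63.5 microseconds per line is about 1.27 ms.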
                                                                Howard Hull
[If yet unproven concepts are outlawed in the range of discussion...
                   ...Then only the deranged will discuss yet unproven concepts]
        {ucbvax!hplabs | allegra!nbires | harpo!seismo } !hao!hull

grr@cbm.UUCP (George Robbins) (01/29/86)

> Randal Spencer      Student DEC Consulting - University of Southern California

> I really need a good description of exactly why is it that the Amiga will
> flicker in the 400 mode.  I understand Television reasonably well, enough
> to understand that television is a set of 60 fields of 262 lines of video
> which extend from the above the top of the visible portion of the screen
> to just below the bottom. There is 1/60th of a second worth of motion and
> then the next frame is drawn, this time the lines are drawn between the
> previous lines.  This process is done so as to keep the screen from fading
> before the electron beam gets to the bottom of 512 lines after a 1/30th
> of a second.  When the Amiga is in 200 mode I assume that it is drawing
> on the first field and then not on the second?  

In 200 mode, the same image is displayed twice during the 1/30-second period
on the very same lines.  The scan lines are thick enough that you do not
notice any gaps.

> Perhaps the real problem that I could not figure out in my own head is,
> if the Amiga had to flicker to remain compatible with NTSC, why is it
> that even the highest resolution video of the day will not flicker like
> the Amiga?

The basic flicker is not really the fault of the Amiga; it is caused by the
relatively short persistence of the color phosphors.  Monochrome phosphors
are available in varying persistences, from very long (like those used on
medical heartbeat displays) to very short (for oscilloscope photography).

Since only standard color phosphors are readily available, when two alternating
scan lines are very different, each of them is fading out 30 times a second,
which is perceived as an alternating or flickering effect.  Because a
normal TV image has, on average, very little contrast between scan lines,
you do not notice this while watching the A-Team.

As has been pointed out before, you can minimize the effect by dimming the
ambient lighting and turning down the brightness.  Many of the commercial
color graphics displays run at 30 Hertz, non-interlaced, and must be used
in darkened rooms to minimize eyestrain.
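George's observation, that interlace flicker tracks the contrast between adjacent scan lines, suggests a toy metric. The code below is purely illustrative (an "image" here is just a list of per-scan-line brightness values):

```python
def interlace_flicker_score(scanlines):
    """Toy metric: the largest brightness jump between neighboring
    scan lines.  Adjacent lines land in opposite fields, so a big
    jump is a spot that alternates light/dark at 30 Hz."""
    return max(abs(a - b) for a, b in zip(scanlines, scanlines[1:]))

menu_bar = [0, 0, 0, 255, 255, 255]      # hard horizontal edge
grey_ramp = [0, 50, 100, 150, 200, 250]  # smooth, video-like gradient

print(interlace_flicker_score(menu_bar))   # 255 -> flickers badly
print(interlace_flicker_score(grey_ramp))  # 50  -> much steadier
```

This is why the earlier advice about pastel Preferences colors works: it lowers the line-to-line contrast, not the refresh rate.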

> I run Setlace it sure seems to start shaking the formerly rock steady
> lines around on the screen.

I'm not sure what the problem is here, but anyway if you find a source of cheap
long persistence color monitors, please let Amiga and the world know about it!!!
-- 
George Robbins - now working with,	uucp: {ihnp4|seismo|caip}!cbm!grr
but no way officially representing	arpa: cbm!grr@seismo.css.GOV
Commodore, Engineering Department	fone: 215-431-9255 (only by moonlite)

breuel@h-sc1.UUCP (thomas breuel) (01/30/86)

> The basic flicker is not really the fault of the amiga, it is caused by the
> relatively short persistance of the color phosphors.  Monochrome phosphors
> are available in varying persistences - from very long like used on medical
> heartbeat displays to very short for oscilloscope photography.

No, using high-persistence monitors is not a solution. When you
scroll text or move your mouse-pointer around, it looks terrible
on a high-persistence monitor (just look at the IBM screen).
To make this perfectly clear: the reason why 70Hz interlaced monitors
do not appear to flicker is NOT that the persistence of the monitor
is matched with the refresh-rate, it is that the human eye/brain
cannot perceive flicker above 30Hz.

The way to get good high-resolution displays is not to use high-persistence
monitors, it is to use higher frequency displays. The LISA, the Mac, and
the Atari ST show that this is possible economically, at least for
black and white displays. Call me spoiled or whatever, but I have
gotten very used to my 700x500, flicker-free, low-persistence LISA
screen, and I'll not switch to a computer that doesn't have a similar
display quality.

Now, again, the reason why I am posting this is not to annoy Amiga owners
or to encourage the purchase of Atari ST's, but simply the hope that
Commodore will add a 640x400 70Hz mode to their otherwise great machine
if there is sufficient demand for it. I just can't see how people can
reasonably argue that using a computer in a dimmed room, sitting back
3 feet, using a high-persistence monitor, or drawing pixels on top
of one another can be more than a bad compromise, given that a real flicker
free high-resolution display is not all that hard or expensive to make.
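One way to see why a 640x400 70 Hz mode is more than a software change: the horizontal scan rate the monitor must handle nearly doubles. A rough sketch (the 5% vertical-retrace overhead is my assumption, not a measured figure):

```python
# The Amiga 1080, like any TV-compatible monitor, scans at NTSC's rate:
NTSC_LINE_RATE_HZ = 15734

lines, refresh_hz = 400, 70
retrace_overhead = 1.05    # assumed ~5% of frame time lost to retrace
needed_line_rate = lines * refresh_hz * retrace_overhead

print(round(needed_line_rate))               # ~29400 lines/s
print(needed_line_rate / NTSC_LINE_RATE_HZ)  # ~1.9x the NTSC scan rate
```

So the request amounts to a different class of monitor, which is part of the cost and compatibility trade-off debated below.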

						Thomas.

keithd@cadovax.UUCP (02/04/86)

In article <898@h-sc1.UUCP> breuel@h-sc1.UUCP (thomas breuel) writes:

>The way to get good high-resolution displays is not to use high-persistence
>monitors, it is to use higher frequency displays. The LISA, the Mac, and
>the Atari ST show that this is possible economically, at least for
>black and white displays. Call me spoiled or whatever, but I have
>gotten very used to my 700x500, flicker-free, low-persistence LISA
>screen, and I'll not switch to a computer that doesn't have a similar
>display quality.

Funny, I've gotten used to the advantages of NTSC compatibility and multiple
colors at 640x400x4, and for most of the images I am interested in at that
resolution (I have no interest in 80x44 text screens, however) the flicker
is virtually non-existent.  (If you don't believe me, send me an Amiga
disk and I'll send you some DeluxePaint files at 640x400x4 where you won't
notice the flicker at all except on the DeluxePaint menus.  I can directly
videotape these images, so if you'd rather see it on video, send me a
blank tape.)

>Now, again, the reason why I am posting this is not to annoy Amiga owners
>or to encourage the purchase of Atari ST's, but simply the hope that
>Commodore will add a 640x400 70Hz mode to their otherwise great machine
>if there is suficient demand for it. I just can't see how people can
>reasonably argue that using a computer in a dimmed room, sitting back
>3 feet, using a high-persistence monitor, or drawing pixels on top
>of one another can be more than a bad compromise, given that a real flicker
>free high-resolution display is not all that hard or expensive to make.
>						Thomas.

Now, again, the reason why I am posting this is not to annoy Atari owners
or to encourage the purchase of Amigas, but simply the hope that you
all will realize that everything is relative.  I *could* also say that
Atari will add a 640x400 NTSC mode to their otherwise great machine
if there is sufficient demand for it (but I won't).  How expensive it
is to make is only part of the issue; how compatible and flexible it is
with other real-world devices is part of it too.

Keith Doyle
#  {ucbvax,ihnp4,decvax}!trwrb!cadovax!keithd
#  cadovax!keithd@ucla-locus.arpa

pwp@iuvax.UUCP (02/06/86)

One approach to the resolution question is to ask how many patterns you
can have subject to various constraints. In particular, the logarithm base
two of the number of patterns is an interesting measure of the effective
resolution. On a 200-line monitor you can have 2^200 patterns (when using
two colors, such as black and white); on a 400-line monitor you can have
2^400 patterns. The logarithm is 200 and 400 in the two cases. If you require
one of the colors to occur in groups of at least two (as was suggested for
reducing flicker) you can have about 2^324 patterns in 400 lines, for an
effective resolution of 324 lines. If you require both colors to occur
in groups of at least two, then the effective resolution is about 278 lines.
This suggests you should be able to control the flicker on the Amiga and
still get somewhat better resolution than the Atari, but only a little
better.
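pwp's counts can be verified with a short dynamic-programming sketch. This is my own reconstruction of the counting argument (counting two-color patterns down a single 400-line column), not pwp's calculation:

```python
from math import log2

def count_one_color_min2(n):
    """Binary colorings of n lines where every run of color '1' is at
    least 2 lines tall (color '0' is unconstrained)."""
    # DP on how the string ends: in '0' (valid so far), in exactly one
    # '1' (pending), or in two-or-more '1's (valid).
    end0, end1, end2 = 1, 1, 0      # the length-1 strings "0" and "1"
    for _ in range(n - 1):
        end0, end1, end2 = end0 + end2, end0, end1 + end2
    return end0 + end2              # exclude a dangling 1-line run

def count_both_colors_min2(n):
    """Binary colorings where EVERY run (either color) is >= 2 lines."""
    # pending = last run is 1 line so far; valid = last run is >= 2.
    pending, valid = 2, 0           # "0" and "1" are both pending
    for _ in range(n - 1):
        pending, valid = valid, pending + valid
    return valid

# Effective resolution = log2(number of patterns), as in the post:
print(round(log2(count_one_color_min2(400))))    # ~324, as stated
print(round(log2(count_both_colors_min2(400))))  # ~277 (post says ~278)
```

The second count works out to about 277 rather than 278 by my arithmetic, but either way the conclusion stands: constrained 400-line patterns carry only modestly more information than an unconstrained 200-line screen.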

The above calculations, of course, are just suggestive, and if you have a
particular application such as displaying text, there is no substitute for
displaying the text the best way you can and then looking at how good a job
you were able to do.

By the way, I have an Atari. I could not see paying twice as much for a
computer that was only slightly better.