[net.micro.amiga] flicker, flicker, flicker!

prindle@nadc@caip.RUTGERS.EDU (01/30/86)

From: prindle@NADC

I've been watching all this discussion of flicker with some interest, and
I still see a generally confused state of mind.  So let's, once and for all,
define what causes flicker and what can be done about it.

In the NTSC standard, a frame is drawn every 1/30 second, and each frame
consists of 2 fields.  Each field draws about 262 horizontal lines, of which
about 240 are visible (i.e. not obliterated by the vertical sync pulse), and
typically 200 are used for information (to allow for overscan).  I say
"about" because, if the picture is not to be interlaced, an integral number
of lines is drawn in each field, and the second field (assuming the monitor
sync is adjusted right) is drawn directly on top of the first, giving an
nX200 resolution display in which every pixel is re-drawn every 1/60 second
(e.g. the IBM PC Color Display, the Commodore 64/128, etc, etc, etc.).  If
the picture is to be interlaced, something like 262.5 lines are drawn per
field.  Thus, if the first field began in the upper left corner of the
screen, the second field begins at the center of the top of the screen (the
vertical sweep in a monitor is a sawtooth wave - thus the scan lines move
down ever so slightly as they move to the right), and all the lines of the
second field lie directly between the lines of the previous field.  Since
the height of the electron beam is about 1/500 the height of the screen,
this effectively fills in the picture with solid color.  The information
content of the second field of a frame is typically different from that of
the first field.  Thus, individual pixels (in nX400 resolution mode) are
redrawn only every 1/30 second.  Unless the monitor has a quite
high-persistence phosphor (which has other drawbacks, such as
smearing/ghosting), the human eye can detect the reduction in intensity of
a pixel during the 1/30 second until that point is again struck by the
electron beam (note that the eye cannot detect the reduction that occurs
during the 1/60 second between refreshes on a non-interlaced display).
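
To put the timing arithmetic in one place, here is a tiny C sketch (purely
illustrative; these are the nominal figures quoted above, not measured
hardware numbers):

#include <stdio.h>

int main(void)
{
    double field_rate  = 60.0;   /* fields per second (nominal NTSC)  */
    double frame_rate  = 30.0;   /* frames per second (2 fields each) */
    double lines_field = 262.5;  /* lines per field when interlaced   */

    /* horizontal line rate implied by the figures above */
    printf("line rate: %.0f lines/sec\n", lines_field * field_rate);

    /* how long any one pixel waits before it is redrawn */
    printf("non-interlaced pixel refresh: every %.4f sec (1/60)\n",
           1.0 / field_rate);
    printf("interlaced pixel refresh:     every %.4f sec (1/30)\n",
           1.0 / frame_rate);
    return 0;
}
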
Normally, the eye is a very forgiving device, as it performs an integration
function: if one pixel is changing in this fashion, but the pixel below it
is the same color and is changing the same way but 1/60 second later, the
pair is perceived as one solid non-flickering blob of color, two pixels
high.  Only when a pixel stands by itself, flanked by a different color
above and below, does the eye fail to integrate; how badly it fails depends
on many subtle factors (intensity, background lighting, distance,
eye-to-screen angle, etc.).  (Other flicker conditions exist; for example,
three pixels stacked vertically somewhat defeat the integration effect,
since two flicker in one phase while the third, out of phase, cannot quite
cancel the nearby flicker.)  The worst case comes when a horizontal line
more than one pixel wide is only one pixel high (as it might be in a small,
high resolution character such as a capital E).  Here, the eye is annoyed
to the hilt by a whole bunch of those flickering pixels.  If you wonder why
you've never seen flicker on your home TV, go home and watch one of those
sports or news shows where tiny computerized titles are displayed, and
you'll see it; most analog real-life scenes just don't contain narrow,
exactly horizontal lines.
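
To make the isolated-line case concrete, here is a small C sketch that
flags the pixels an interlaced display flickers worst on - lit pixels with
background directly above and below.  The one-byte-per-pixel bitmap format
is my own assumption for illustration, not any real bitplane layout:

#include <stdio.h>

#define W 8
#define H 5

/* A pixel flickers worst when it is lit but the pixels directly above
   and below are background, so the eye has nothing nearby to integrate
   with.  Screen edges are treated as background here. */
static int flicker_prone(const unsigned char bm[H][W], int x, int y)
{
    if (!bm[y][x]) return 0;
    if (y > 0     && bm[y-1][x]) return 0;  /* lit above: eye integrates */
    if (y < H - 1 && bm[y+1][x]) return 0;  /* lit below: eye integrates */
    return 1;
}

int main(void)
{
    /* a tiny capital E - its horizontal strokes are one pixel high */
    const unsigned char e[H][W] = {
        {1,1,1,1,0,0,0,0},
        {1,0,0,0,0,0,0,0},
        {1,1,1,0,0,0,0,0},
        {1,0,0,0,0,0,0,0},
        {1,1,1,1,0,0,0,0},
    };
    int x, y, n = 0;

    for (y = 0; y < H; y++)
        for (x = 0; x < W; x++)
            n += flicker_prone(e, x, y);
    printf("%d flicker-prone pixels in the E\n", n);
    return 0;
}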

To have no flicker (without high-persistence phosphors), there is really
only one choice - updating each pixel of the hi-res screen more often than
once every 1/30 second, while still refreshing the whole picture at least
50 times per second (another eye requirement, so the picture doesn't appear
to flash).  To do this, a computer must 1) have a faster video chip, and
2) abandon NTSC compatibility.  The IBM PC monochrome display system and
the terminal on which I am now typing (Direct 800B) use this technique.  I
believe the Atari 520ST also uses this on the monochrome output.  The net
result, in the case of the PC monochrome display, is a 50 Hz non-interlaced
display with 369 horizontal lines per field/frame (350 visible).  Now, try
to do this in a color display chip with more data to fetch and process, try
not to infringe on the memory bandwidth (i.e. make the chip buffer a whole
bit plane or two so it doesn't have to access memory at the faster rate),
and you'll understand why it hasn't yet been feasible to implement this in
a low cost computer display.  Add to this the desire for NTSC compatibility
(someone will buy an Amiga to let the kids play games on the family tube,
or "what good is a color monitor if I can't play my VCR through it?") and
this solution quickly becomes a non-solution.
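
To put rough numbers on the bandwidth problem, here is one more
back-of-the-envelope C sketch.  The 640-pixel row width and the
1-bit-per-pixel bitplane are assumptions for illustration only:

#include <stdio.h>

int main(void)
{
    double bytes_per_line = 640.0 / 8.0;  /* one bitplane, 1 bit/pixel */

    /* NTSC interlace: only 200 of the 400 lines are fetched per field,
       60 fields per second */
    double interlaced = bytes_per_line * 200.0 * 60.0;

    /* flicker-free alternative: all 400 lines, 60 times per second */
    double progressive = bytes_per_line * 400.0 * 60.0;

    printf("interlaced fetch:     %.0f bytes/sec per plane\n", interlaced);
    printf("non-interlaced fetch: %.0f bytes/sec per plane\n", progressive);
    printf("ratio: %.1fx\n", progressive / interlaced);
    return 0;
}

Multiply that by four or five bit planes for color, and the need to either
steal memory cycles or buffer whole planes on-chip becomes obvious.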

So flicker she will; individual perception will vary from person to person,
monitor to monitor, and environment to environment, but it will always be
there.
 -- try squinting!
 
Not an Amiga Owner,
Frank Prindle
Prindle@NADC.arpa