[comp.sys.ibm.pc.hardware] 1024x768 interlaced vs non-interlaced

mardiros@dtoa3.dt.navy.mil (Mardiros) (04/25/91)

Hi,

I need some advice very fast.

Question: What are the benefits of buying a non-interlaced monitor
versus an interlaced one?

My choices: interlaced - WYSE

	    non-interlaced - ALR

Thanks, your help is appreciated.

marty

bjorn@valhalla.esd.sgi.com (Bjorn Lindstrom) (04/25/91)

Interlaced monitors have a noticeable "flickering" effect, particularly at
the higher resolutions (greater than 800x600).  Combine that with greater
detail at those higher resolutions, and you have a situation that is VERY
hard on the eyes.  The term comes from (inter-)"lacing" the scan lines, so
that every other scan line is displayed, taking two passes over the screen
to display them all (taking twice as long, obviously).  

A non-interlaced display, on the other hand, displays all the scan lines in
the same pass, creating none of the "flickering" found on interlaced displays.
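
If it helps to picture the ordering, here is a toy Python sketch of an
imaginary 8-line screen; the numbers are made up and have nothing to do
with any real hardware timing:

# Toy scan order for an imaginary 8-line screen (illustration only).
LINES = 8
interlaced = list(range(0, LINES, 2)) + list(range(1, LINES, 2))  # two passes
noninterlaced = list(range(LINES))                                # one pass
print("interlaced    :", interlaced)      # [0, 2, 4, 6, 1, 3, 5, 7]
print("non-interlaced:", noninterlaced)   # [0, 1, 2, 3, 4, 5, 6, 7]

The first half of the interlaced list is drawn on one pass (field), the
second half on the next.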

Good luck!

Bjorn

crs@lanl.gov (Charlie Sorsby) (04/27/91)

In article <1991Apr24.230046.7851@odin.corp.sgi.com>, bjorn@valhalla.esd.sgi.com (Bjorn Lindstrom) writes:
> The term comes from (inter-)"lacing" the scan lines, so
> that every other scan line is displayed, taking two passes over the screen
> to display them all (taking twice as long, obviously).  

Well, yes and no.  Interlace scanning was originally used to
*reduce* flicker in television displays.  To understand this, one
must consider global as opposed to local flicker and constraints on
scanning speed.  More in a moment.

Cost and technical constraints *always* limit, to one degree or
another, the maximum video frequency that can be used.  This limits
the maximum possible horizontal scanning speed which, in turn,
limits the maximum vertical scanning speed that will still produce
the required number of scanning lines per frame.
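
To put rough numbers on that chain of limits, here is a small Python
sketch.  The function, the 45 MHz figure, and the blanking-overhead
factors are just my own illustrative assumptions, not any particular
standard:

# Rough chain: video bandwidth -> horizontal line rate -> vertical frame rate.
# The 20%/5% blanking overheads below are illustrative guesses only.

def max_frame_rate(video_bandwidth_hz, pixels_per_line, lines_per_frame,
                   h_overhead=1.20, v_overhead=1.05):
    line_rate = video_bandwidth_hz / (pixels_per_line * h_overhead)  # lines/s
    return line_rate / (lines_per_frame * v_overhead)                # frames/s

# Example: a 45 MHz video rate driving 1024x768, non-interlaced.
print(round(max_frame_rate(45e6, 1024, 768), 1), "frames/s")  # roughly 45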

> A non-interlaced display, on the other hand, displays all the scan lines in
> the same pass, creating none of the "flickering" found on interlaced displays.
			  ^^^^ ^^ ^^^ ^^^^^^^^^^^^

Not necessarily true, at least in principle.  But it does depend on
your definition of flickering.

The reason that television uses interlace scanning is that scanning
*speed* is (or was) constrained by cost and technical considerations
when the standard was defined.  For a given, relatively low,
scanning speed, interlace scanning will produce less perceived
*global* flicker.  The flicker of any *single* pixel will be the
same whether interlace is used or not.

For given vertical and horizontal scanning speeds any specific
pixel will be illuminated exactly the same number of times per
second whether scanning is interlace or noninterlace.

In the case of television, the vertical scanning frequency is
approximately 60 fields per second and the horizontal frequency is
approximately 15,750 lines per second.  With this relationship
between the vertical and horizontal scanning periods, a 60-per-second
sweep of the screen is possible only with interlace scanning.  With
noninterlace scanning *and* the same limit of 15,750 horizontal scans
per second, vertical scanning would be limited to approximately 30
scans per second (given the number of scanning lines needed) and the
viewer would perceive global flicker.
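
Worked out with the usual figure of about 525 scan lines per complete
picture (and ignoring blanking details), the arithmetic looks roughly
like this in Python; it also illustrates the point above that any
particular line is refreshed the same number of times either way:

# NTSC-ish numbers: ~15,750 horizontal scans/s, ~525 lines per complete frame.
H_RATE = 15750            # horizontal scans per second
LINES_PER_FRAME = 525     # scan lines in one complete picture

# Non-interlaced: one pass draws every line of the frame.
frames_per_sec = H_RATE / LINES_PER_FRAME          # 30 full sweeps per second

# Interlaced: each pass (field) draws only half the lines, so the screen
# is swept twice as often even though the line rate is unchanged.
fields_per_sec = H_RATE / (LINES_PER_FRAME / 2)    # 60 sweeps per second

# Either way, any *particular* scan line is redrawn once per complete frame.
refreshes_per_line = frames_per_sec                # ~30 per second in both cases

print(frames_per_sec, fields_per_sec, refreshes_per_line)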

*My* interpretation of the problem with character displays as
opposed to image displays is that with the former, one is
concentrating intently on a small area of the screen at any one
time (i.e. the area occupied by a character).  With interlace
scanning, vertically adjacent pixels are illuminated at
significantly different times (one field period apart), which
probably causes the perception of *local* flicker.

I believe that, if character-display monitors were constrained to
the same maximum video frequency (and therefore the same maximum
horizontal and vertical scan rates) as television displays, the
apparent global flicker resulting from noninterlace scanning would
be much more noticeable and annoying than the local flicker due to
interlace.

Once one can afford a high enough video frequency (the cost of the
electronics goes up with increasing bandwidth), interlace is no
longer necessary.
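
A rough sketch of what "high enough" means at the resolution Marty
asked about (1024x768); the ~5% vertical and ~25% horizontal blanking
overheads are just my assumptions:

# Very rough estimate for 1024x768 non-interlaced at a flicker-free rate.
visible_lines = 768
refresh_hz = 70                                   # non-interlaced frames/s
line_rate = visible_lines * 1.05 * refresh_hz     # ~56,000 lines per second
video_rate = line_rate * 1024 * 1.25              # ~72 MHz
print(round(line_rate), "lines/s,", round(video_rate / 1e6, 1), "MHz")

That is several times the roughly 15,750 scans per second of a
TV-class display, which is where the extra electronics cost comes in.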

But it isn't simply the case that interlace scanning is bad and
noninterlace scanning is good.  There are trade-offs that must be
made.

Best,

Charlie Sorsby						"I'm the NRA!"
	crs@lanl.gov