[net.micro.amiga] Interlaced monitor

dimare@ucla-cs.UUCP (09/05/85)

I can't figure out what an interlaced monitor is.
What will look better in the Amiga: 640x400, or
640x200? Does it matter?

	Adolfo
	      ///

gary@cirl.UUCP (Gary Girzon) (09/10/85)

> I can't figure out what an interlaced monitor is.
> What will look better in the Amiga: 640x400, or
> 640x200? Does it matter?

	In theory, all monitors should be capable of interlace. During
interlace mode, only half the scan lines are drawn for a particular
sweep. The other scan lines are drawn during the next sweep. In the
AMIGA, 400 line resolution can be achieved this way. The price one pays
is flicker, since screen sweeps are only done 30 times a second, as
opposed to 60 times a second during non-interlace. Thus a higher
persistence monitor is needed to eliminate flicker.
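
	For concreteness, here is a small C sketch of that scheme. The
line counts and the even/odd split are illustrative only, not a
description of the actual hardware:

/* Which scan lines each vertical sweep paints, and how often any one
 * line is repainted.  Line counts are illustrative, not hardware facts. */
#include <stdio.h>

#define LINES_HIRES    400   /* interlaced: two sweeps of 200 lines each */
#define LINES_LORES    200   /* non-interlaced: every sweep paints all 200 */
#define SWEEPS_PER_SEC  60   /* vertical sweep rate in both modes */

int main(void)
{
    int sweep, line;

    /* Interlaced 400-line mode: sweep 0 paints the even-numbered lines,
     * sweep 1 paints the odd-numbered lines, then the pattern repeats. */
    for (sweep = 0; sweep < 2; sweep++) {
        printf("interlaced sweep %d paints lines:", sweep);
        for (line = sweep; line < 8; line += 2)      /* first few only */
            printf(" %d", line);
        printf(" ... %d\n", LINES_HIRES - 2 + sweep);
    }
    printf("so any one of the %d lines is repainted %d times a second\n",
           LINES_HIRES, SWEEPS_PER_SEC / 2);

    /* Non-interlaced 200-line mode: every sweep paints every line. */
    printf("non-interlaced: each of the %d lines is repainted %d times a second\n",
           LINES_LORES, SWEEPS_PER_SEC);
    return 0;
}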

	I would hope that 640 by 400 would look much better. I have
noticed some flicker in the AMIGA interlace graphics demo, where the
spatial resolution gets a bit high. I would like to see text in 400 line
mode - it is too bad that most of the operating system seems to be
locked in 200 line mode right now. It should be possible
to change that since the display resolution is transparent to the
application. 

	One thing that looks quite bad is the color Textcraft display.
It is very hard to look at the screen, which uses black characters with
a white background. The problem is that with a non-interlace display,
the white background is lined with black lines (since the actual
scan lines are not next to each other). The display would look much better
with white on black, or light blue on blue. I have not seen Textcraft
run in interlace mode.

	Has anyone seen the AMIGA with a monochrome display? Since color,
at low resolution (640 by 200 is low resolution for color!) is very hard
to work with, it may be worthwhile investing in a second monochrome
monitor with some sort of scan doubling hardware to get a tighter display.

						Gary

jpexg@mit-hermes.ARPA (John Purbrick) (09/11/85)

> I can't figure out what an interlaced monitor is.
> What will look better in the Amiga: 640x400, or
> 640x200? Does it matter?
> 	Adolfo

This isn't net.video, but maybe that's why the question was asked.
A conventional TV, and almost all computer monitors, use an interlaced
scan.  This means that 30 times a second the screen is drawn, first the odd-
numbered lines and then the even ones. Thus each scan takes 1/60 second. The
reason for this is that even though the information only has to be presented 
30 times a second (easier for the memory) the flicker rate of the 
display will reflect the fact that two scans of the screen are being
done. You may ask "Why not just write the screen all in one sweep 30
times a second?" but unfortunately 30 flashes a second is quite notice-
able, hence the need to increase the flicker rate.
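
To put a rough number on "easier for the memory", here is a
back-of-the-envelope C sketch comparing how many pixels per second have
to be fetched when a 640x400 picture is presented 30 times a second
versus 60 times a second (the 640x400 figure comes from the question
above; bit depth and blanking are ignored):

/* Pixel fetch rates when a 640x400 picture is presented 30 times a
 * second (interlaced) versus 60 times a second (progressive).
 * Bit depth and blanking intervals are ignored. */
#include <stdio.h>

int main(void)
{
    long pixels_per_frame = 640L * 400L;

    printf("interlaced, 30 complete pictures/sec:  %ld pixels/sec\n",
           pixels_per_frame * 30);
    printf("progressive, 60 complete pictures/sec: %ld pixels/sec\n",
           pixels_per_frame * 60);
    return 0;
}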

Now, to the Amiga (and other displays with > 200 lines). 200 lines is 
a common resolution because what can be done is repeat each pixel on 
(say) line 69 in the same place on line 70. Then the pixel will be
shown 60 times a second in one line or the other, and the objectionable
flicker is not seen; the eye doesn't readily notice that the pixel is 
actually jumping back and forth between lines. If you leave this 
system and go to a true 400 lines, you risk seeing the flicker of the
individual pixels, an especially nasty bug if you draw horizontal or 
near-horizontal lines of only 1 pixel's width. One solution to the
problem is to use a display with long-persistence phosphors, which 
continue to glow until the  next time they are written. Unfortunately
this plays havoc with moving objects (they leave a trail behind them)
and the Amiga has extensive animation capability, so this isn't really
a viable idea. It is certainly possible to get non-interlaced displays 
of 400, or many more, lines, which write at 60Hz and demand fast memories,
but they cost a bundle and are essentially a specialty item. Since the 
computer and display have to agree on what the scanning procedure and 
bit rate are, the computer would have to be modified to use a true
high-resolution monitor, even if you could afford one. If anyone knows 
how to get around this dilemma, I'd love to hear about it, as my ambition
is to do CAD on the cheap. Oh yes, you can improve a display's apparent
resolution by use of anti-aliasing, but that's another story.
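
As a sketch of the pixel-doubling idea (the particular line numbers and
the convention of which field paints which lines are assumptions, for
illustration only): data repeated on an even/odd pair of raster lines is
painted in every 1/60-second field, while data living on just one of the
400 interlaced lines is painted only every other field.

/* How often a piece of picture data is painted on a 400-line interlaced
 * raster.  Assumed convention: even raster lines are painted in field 0,
 * odd raster lines in field 1.  Line numbers are for illustration only. */
#include <stdio.h>

static int painted(int line, int field)   /* is this line drawn this field? */
{
    return (line % 2) == field;
}

int main(void)
{
    int field;

    for (field = 0; field < 2; field++) {
        /* Data doubled onto raster lines 68 and 69, versus data that
         * lives on raster line 68 only (true one-line detail). */
        printf("field %d: doubled data %s, one-line data %s\n",
               field,
               (painted(68, field) || painted(69, field)) ? "painted" : "dark",
               painted(68, field) ? "painted" : "dark");
    }
    /* Doubled data is hit in both fields (60 times a second); one-line
     * data is hit in only one field (30 times a second). */
    return 0;
}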

John Purbrick					jpexg@mit-hermes.ARPA
{...decvax!genrad!  ...allegra!mit-vax!}  mit-eddie!mit-hermes!jpexg

jow@unccvax.UUCP (Jim Wiley) (09/11/85)

> A conventional TV, and almost all computer monitors, use an interlaced
> scan.  This means that 30 times a second the screen is drawn, first the odd-
> numbered lines and then the even ones. Thus each scan takes 1/60 second....

> Now, to the Amiga (and other displays with > 200 lines). 200 lines is 
> a common resolution because what can be done is repeat each pixel on 
> (say) line 69 in the same place on line 70. Then the pixel will be
> shown 60 times a second in one line or the other, and the objectionable
> flicker is not seen; the eye doesn't readily notice that the pixel is 
> actually jumping back and forth between lines. If you leave this 

Actually, in non-interlaced video, the pixel is not jumping between
lines.  The SAME lines are updated each 1/60 sec.  No jumping.  If
there is jumping then the display is indeed interlaced with the even
and odd fields displaying the same data but interlacing.  To reiterate,
in non-interlaced video it is as if the odd lines are displayed every
1/60 of a second and no even lines are displayed so that the lines are
placed over one another and not between.
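
A small C sketch of that distinction (the line positions are
illustrative only): in non-interlaced scanning every 1/60-second sweep
lands on the same set of positions, directly on top of the previous
sweep, while interlaced sweeps alternate between two offset sets.

/* Which physical line positions successive 1/60-second sweeps land on.
 * Positions are in 400ths of the screen height, purely for illustration. */
#include <stdio.h>

int main(void)
{
    int sweep, i;

    /* Non-interlaced: every sweep lands on the same positions, directly
     * on top of the previous sweep (here, the "odd" positions). */
    for (sweep = 0; sweep < 2; sweep++) {
        printf("non-interlaced sweep %d:", sweep);
        for (i = 0; i < 4; i++)
            printf(" %d", 2 * i + 1);
        printf(" ...\n");
    }

    /* Interlaced: alternate sweeps are offset by one position, so the
     * odd field lands between the lines of the even field. */
    for (sweep = 0; sweep < 2; sweep++) {
        printf("interlaced sweep %d:    ", sweep);
        for (i = 0; i < 4; i++)
            printf(" %d", 2 * i + (sweep % 2));
        printf(" ...\n");
    }
    return 0;
}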

If characters are displayed on a non-interlaced monitor, there is no
flicker.  If the same characters are displayed interlaced with the
even and odd frames displaying the same data, there is noticeable
flicker.  Here at DataSpan we have done such tests and the only way
to go for characters is non-interlaced.  A high persistence phosphor
helps reduce the flicker but does not remove it entirely.  Any flicker,
after looking at a display for more than 30 minutes, will drive you nuts.
Commodore did the right thing by going non-interlaced for the character
display!

James Wiley
DataSpan, Inc.

crs@lanl.ARPA (09/13/85)

> 	In theory, all monitors should be capable of interlace. During
> interlace mode, only half the scan lines are drawn for a particular
> sweep. The other scan lines are drawn during the next sweep. In the
> AMIGA, 400 line resolution can be achieved this way. The price one pays
> is flicker, since screen sweeps are only done 30 times a second, as
> opposed to 60 times a second during non-interlace. Thus a higher
> persistence monitor is needed to eliminate flicker.

Is interlace *that* different in a video monitor than it is in a TV?

The reason that interlace is used in TV is to *reduce* flicker.
Because of limited bandwidth, a full TV image can only be produced
once every 33.33 milliseconds (i.e., 1/30 of a second).  At this scanning
rate, a given area of the screen is illuminated only 30 times per
second and there would be a noticeable top to bottom moving flicker.
By using interlace scanning, *every other* scanning line is scanned
from top to bottom in 1/60 of a second, then the "missing" lines are
filled in during the second "field" of the frame.  Thus, the entire
screen is illuminated from top to bottom 60 times per second rather
than 30 and *apparent* flicker is reduced because no *large area* is
left unexcited for more than 16.67 ms, which is accommodated by
persistence of vision.  While it still takes 1/30 of a second to
produce a full picture or "frame," the entire screen is scanned 60
times per second by breaking the frame up into two interlaced fields.
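
A tiny C sketch of that timing argument (the numbers are just 1/60 and
1/30 of a second expressed in milliseconds; the bright area is assumed
to span both even and odd lines):

/* Longest stretch a large bright area goes without any of its scan lines
 * being repainted.  The area is assumed to span both even and odd lines. */
#include <stdio.h>

int main(void)
{
    double field_ms = 1000.0 / 60.0;   /* one vertical sweep (field) */
    double frame_ms = 1000.0 / 30.0;   /* one complete picture (frame) */

    /* Interlaced: every field repaints half of the area, so the area as
     * a whole is never left entirely unexcited for longer than about one
     * field time (individual lines still wait a full frame). */
    printf("interlaced, 60 fields/sec:   dark for at most about %.2f ms\n",
           field_ms);

    /* Hypothetical 30-per-second progressive scan: the whole area is
     * repainted once per frame and then left alone until the next one. */
    printf("progressive, 30 frames/sec:  dark for up to %.2f ms\n",
           frame_ms);
    return 0;
}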

If system bandwidth is adequate, the entire image or frame could be
scanned 60 times per second *noninterlaced* and the synchronizing
system would be considerably simplified (especially the sync generator
at the source), but it is questionable (to me, at least) whether (apparent)
flicker would be less.

What is different in the case of a video display monitor?

-- 
All opinions are mine alone...

Charlie Sorsby
...!{cmcl2,ihnp4,...}!lanl!crs
crs@lanl.arpa

BILLW@SU-SCORE.ARPA (09/14/85)

From: William "Chops" Westfield <BillW@SU-SCORE.ARPA>


Yeah.  If interlaced displays cause unacceptable flicker, then how
come commercial television isn't bothered by this?

I think it boils down to the following:

 1) TV pictures are low contrast - almost the entire screen is
    emitting some light.
 2) TV pictures are moving.
 3) TV Pictures have pretty low resolution - no way does your TV
    have 640 horizontal dots in a typical picture.

So, for full screen animated graphics, I'd expect the Amiga to perform
OK using an interlaced display, but it probably wouldn't be a good
idea for text.  (Hmm.  We have some displays here that use interlacing.
Maybe I'll try hooking up a monitor without high persistence phosphor,
and see how it looks...)

BillW

jerem@tekgvs.UUCP (Jere Marrs) (09/16/85)

In article <204@cirl.UUCP> gary@cirl.UUCP (Gary Girzon) writes:
>
>	Has anyone seen the AMIGA with a monochrome display? Since color,
>at low resolution (640 by 200 is low resolution for color!) is very hard
>to work with, it may be worthwhile investing in a second monochrome
>monitor with some sort of scan doubling hardware to get a tighter display.
>
>						Gary

	Yes, I have seen it work with an NEC JB-1201M amber monitor (20MHz)
and it looks quite good. I saw no flicker whatever and I'm sure that is due to
the longer persistence phosphor used in those monitors.

						Jere

^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Tektronix, Inc., Beaverton, Oregon tektronix!tekcrl!tekgvs!jerem

henry@utzoo.UUCP (Henry Spencer) (09/17/85)

The reason why interlace is a more serious compromise for a computer
display than for normal video is that normal video doesn't usually have
a lot of one-pixel detail in it.  The picture on any interlaced display
is made up of pixels flashing at 30 Hz.  If a whole area of pixels is
(roughly) the same color, then the eye will average out the alternating
30-Hz flashing into a 60-Hz flash frequency for the whole area.  When
color changes drastically from one pixel to the next, this averaging
can't happen.  TV pictures generally are composed of substantial areas
of continuous color.  Computer displays often have one-pixel-wide lines
and characters with one-pixel-wide strokes.

Consider a pattern of well-separated white one-pixel dots on a black
background.  If the dots are all in even-numbered scan lines, then
in the first 60th of a second all the dots are refreshed, and in the
second 60th of a second nothing happens.  So the net refresh rate is
30 Hz, and the interlace is useless.  This is an extreme case, mind you.
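
A short C sketch of that worst case (the parity bookkeeping and the
60-fields-per-second figure are the usual interlace assumptions):

/* Effective flash rate of a pattern, given which scan-line parities its
 * lit pixels fall on.  Assumes the usual 60 fields a second, with even
 * lines in one field and odd lines in the other. */
#include <stdio.h>

static void report(const char *what, int on_even_lines, int on_odd_lines)
{
    /* If the pattern has pixels in both fields, some of it lights up
     * every field (60 times a second) and the eye averages that out.
     * If it sits in only one field, it lights up just 30 times a second. */
    int flashes = (on_even_lines && on_odd_lines) ? 60 : 30;
    printf("%-35s lights up %d times a second\n", what, flashes);
}

int main(void)
{
    report("dots on even-numbered lines only", 1, 0);
    report("dots on odd-numbered lines only",  0, 1);
    report("area covering both parities",      1, 1);
    return 0;
}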
-- 
				Henry Spencer @ U of Toronto Zoology
				{allegra,ihnp4,linus,decvax}!utzoo!henry

hr@uicsl.UUCP (09/19/85)

	"Is interlace *that* different in a video monitor than it is in a TV?
	..What is different in the case of a video display monitor?"

In a TV image, there are few horizontal lines that are 1 scan line high.
Actually, many sets don't do interlace well, so this might help keep
down flicker.

On computer generated displays, single height lines occur more
frequently. We used to use a 30 Hz color system that had 480 lines displayable.
When a horizontal line was at least 2 scan lines high, flicker wasn't too bad.
When it was only 1, flicker got worse. About the worst viewing came from
alternate black and white lines. My eyes would start watering after a
minute or two.

(These 'lines' may be only 1 dot wide.)
I was once doing some work with dithered images. The dithering algorithm
maximizes the difference between adjacent pixels. With 30 Hz
refresh, dithered images were pretty painful to watch.
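
As an illustration of why dithered images and interlace mix so badly,
here is a small C sketch using a 2x2 ordered-dither threshold matrix
(one common scheme, not necessarily the algorithm used on that system):
a uniform mid-gray comes out as a checkerboard, so adjacent scan lines
carry complementary patterns.

/* A 2x2 ordered-dither threshold applied to a uniform 50% gray: the
 * output is a checkerboard, so adjacent scan lines carry complementary
 * patterns -- the worst case for an interlaced display.  The matrix and
 * image size are illustrative. */
#include <stdio.h>

int main(void)
{
    int bayer[2][2] = { {  32, 160 },    /* 2x2 Bayer thresholds, 0..255 */
                        { 224,  96 } };
    int gray = 128;                      /* uniform mid-gray input */
    int x, y;

    for (y = 0; y < 4; y++) {            /* a few scan lines */
        for (x = 0; x < 16; x++)
            putchar(gray > bayer[y % 2][x % 2] ? '#' : '.');
        putchar('\n');
    }
    return 0;
}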

						harold ravlin
					{ihnp4,pur-ee}!uiucdcs!uicsl!hr