[comp.graphics] Frame rate

nagle@well.UUCP (John Nagle) (08/14/89)

In article <12027@ulysses.homer.nj.att.com> ggs@ulysses.homer.nj.att.com 
(Griff Smith) complains about TV frame rates.  This is a significant point.
The Showscan film work indicates that humans notice an improvement in 
motion representation as frame rates are increased above 24FPS, and that
humans cease to notice an improvement somewhere between 60FPS and 80FPS.
In Showscan, frame rates in the 60-80 range are combined with 70mm film
and a wide screen to provide what might be called "improved definition film."
The various HDTV systems use frame rates well below this threshold, and it
may turn out that a second round of HDTV improvement will be necessary
at some later date.

Even "large screen" television, though, tends to occupy only a small percentage
of the visual field.  Showscan does attempt to provide a screen large enough
to fill the visual field, given a stationary head position.  Motion strobing
effects are most noticeable near the edge of the visual field, where the
motion-detection functions of peripheral vision dominate.  So the proposed
HDTV standards will probably be good enough for the small screen.

What they won't be good enough for, though, are video goggles.  Video goggles,
as used by the virtual reality types, attempt to fill the entire visual
field.  When the wearer turns their head, the image must pan accordingly.
That panning operation should not generate visible artifacts.  We can thus
expect that virtual reality systems will require frame rates in the Showscan
range before the wearer is comfortable moving at high speed in the virtual
environment.
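
To put rough numbers on that last point, here is a small C sketch (my own
back-of-the-envelope illustration; the 200 degree/second head-turn speed is
simply an assumed value for a brisk head movement):

/* Rough sketch: how far the image must jump between successive frames
 * while the wearer pans their head, for a few frame rates.  The 200
 * degrees/second head-turn speed is an assumed, illustrative figure. */
#include <stdio.h>

int main(void)
{
    double head_turn = 200.0;                /* degrees per second, assumed */
    int rates[] = { 24, 30, 60, 80 };
    int i;

    for (i = 0; i < 4; i++)
        printf("%3d FPS -> image jumps %5.1f degrees per frame\n",
               rates[i], head_turn / rates[i]);
    return 0;
}

At 24FPS the whole scene lurches by more than eight degrees per frame; at
80FPS the jump is down to two and a half degrees, which is much kinder to the
peripheral motion detectors mentioned above.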

					John Nagle

hades@pbinfo.UUCP (Detlef Siewert) (08/16/89)

In article <13130@well.UUCP> nagle@well.UUCP (John Nagle) writes:
> [...] Motion strobing
>effects are most noticeable near the edge of the visual field, where the
>motion-detection functions of peripheral vision dominate.  So the proposed
>HDTV standards will probably be good enough for the small screen.
>
:-) :-) :-) :-) :-) :-) :-) :-) :-) :-) :-) :-) :-) :-) :-) :-) :-)

I personally think that human visual attention is attracted by fast
motion at the edges of the visual field. So when you notice motion
beside you, you tend to look at it (an old reflex to protect yourself
from dangerous animals). That is also why television attracts your
gaze. So without flicker, nobody would look at that old stuff on the
telly anymore! That's why the frame rate of HDTV is not being raised.

:-) :-) :-) :-) :-) :-) :-) :-) :-) :-) :-) :-) :-) :-) :-) :-) :-)


	  /\
	 /  \
	/\   \
       /  \   \		Hans-Detlef Siewert
      /\ HaDeS \
     /  \   \   \
    /____\___\___\

Please don't start serious discussions about this!
It wasn't meant seriously!

myers@hpfcdj.HP.COM (Bob Myers) (08/19/89)

>Interlace seems to be an excellent way to cut the bandwidth in half
>while avoiding flicker and motion artifacts.  Other than pressure
>from the movie industry, what are the other arguments for eliminating
>interlace?

Interlace does NOT avoid flicker and motion artifacts; flicker (actually
a host of effects with various names, but let's just do what everybody
does and lump them together under "flicker") is MUCH worse on an
interlaced display than on a non-interlaced display of the same size and
brightness at the same *frame* rate.  You can't avoid it - the price you
pay for reducing the required bandwidth is that each pixel gets
refreshed only half as often as before.  This is particularly annoying
in images containing narrow horizontal or near-horizontal lines, but even
a plain white raster, if interlaced, will be obviously worse than a
non-interlaced raster, all else being equal.

The effects you mention - regarding panned images, etc. - are certainly
artifacts of the slower refresh rate, and to some degree the 3:2 pulldown
used in going from film to television.  (I also believe that part of the
problem with a pan is psychological - things which you expect to be
"stationary" suddenly start moving, but you're NOT moving your head.
You tend to track differently than you would if you were "panning your eyes.")
But non-interlaced TV would definitely provide a significant performance
improvement over interlaced, assuming that you can afford the bandwidth.
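
For readers who haven't run into 3:2 pulldown, here is a small C sketch of
the usual cadence (my own illustration, not specific to any particular piece
of equipment): four film frames get spread over ten video fields, so alternate
film frames hang around for an extra field, and that uneven cadence is one
source of judder on pans.

/* Sketch of the standard 3:2 (2-3) pulldown cadence: four film frames at
 * 24 frames/sec are spread over ten video fields (about 60 fields/sec),
 * so alternate film frames linger for an extra field. */
#include <stdio.h>

int main(void)
{
    char frames[] = "ABCD";                /* four consecutive film frames */
    int i, f;

    for (i = 0; i < 4; i++) {
        int repeat = (i % 2 == 0) ? 2 : 3; /* 2-3-2-3 field pattern */
        for (f = 0; f < repeat; f++)
            printf("%c ", frames[i]);
    }
    printf("\n");     /* prints: A A B B B C C D D D  (10 fields) */
    return 0;
}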


Bob Myers  KC0EW   HP Graphics Tech. Div.|  Opinions expressed here are not
                   Ft. Collins, Colorado |  those of my employer or any other
myers%hpfcla@hplabs.hp.com               |  sentient life-form on this planet.

king@dciem.dciem.dnd.ca (Stephen King) (08/23/89)

In article <17400006@hpfcdj.HP.COM> myers@hpfcdj.HP.COM (Bob Myers) writes:
> [...]     <flicker> is MUCH  worse in an
>interlaced display over a non-interlaced display of the same size and
>brightness at the same *frame* rate.  [...]
                         ^^^^^
This is a new one on me, Bob; how can 30Hz (frame rate) non-interlaced be
any better than 30Hz (frame rate) interlaced video, such as NTSC? Surely
both will have the same horizontal scan rate and will refresh lines at the
same rate, so why is there this difference? I think you have the relationship
backwards: the interlaced display will flicker LESS than the non-interlaced
one AT THE SAME FRAME RATE. (all other things being equal) If what you
state were true, there would be no need for interlace, n'est-ce pas? 

-- 
                       Se non e` vero, e` ben trovato 
     ...{utzoo|mnetor}!dciem!dretor!king        king@dretor.dciem.dnd.ca

myers@hpfcdj.HP.COM (Bob Myers) (08/25/89)

>> [...]     <flicker> is MUCH  worse in an
>>interlaced display over a non-interlaced display of the same size and
>>brightness at the same *frame* rate.  [...]
                         ^^^^^
>This is a new one on me, Bob; how can 30Hz (frame rate) non-interlaced be
>any better than 30Hz (frame rate) interlaced video, such as NTSC? Surely


Looks like the old rule of "open mouth - insert foot" still applies.  You are,
of course, correct; what I was *trying* to say - and stumbling over my tongue
badly in the process - was that there's a distinct difference between a
"60Hz non-interlaced" display, and a "60Hz interlaced" display.  In one
case, we're talking frame rate; in the other, the "60Hz" is, of course, the
*field* rate.  I should've said "vertical sweep frequency", but was trying to
make the distinction for those who only see the specs "interlaced/non-int."
both attached to the same "60Hz" number.
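
To make the distinction concrete, here is a little C sketch (my own, using
the usual definitions of the terms):

/* Sketch of the distinction above: for a non-interlaced display "60Hz"
 * is the frame rate; for a 2:1 interlaced display "60Hz" is normally the
 * field (vertical sweep) rate, so the complete frame repeats at only 30Hz. */
#include <stdio.h>

static void rates(const char *label, double sweep_hz, int interlace)
{
    printf("%-22s field rate %5.1f Hz, frame rate %5.1f Hz\n",
           label, sweep_hz, sweep_hz / interlace);
}

int main(void)
{
    rates("60Hz non-interlaced:", 60.0, 1);   /* frame rate == field rate   */
    rates("60Hz 2:1 interlaced:", 60.0, 2);   /* frame rate is half of 60Hz */
    return 0;
}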

Bob "Gotta keep remembering to READ it before I send it off..." M.

sparks@corpane.UUCP (John Sparks) (08/26/89)


In article <2434@dciem.dciem.dnd.ca> king@dretor.dciem.dnd.ca (Stephen King)
writes:
>In article <17400006@hpfcdj.HP.COM> myers@hpfcdj.HP.COM (Bob Myers) writes:
>> [...]     <flicker> is MUCH  worse in an
>>interlaced display over a non-interlaced display of the same size and
>>brightness at the same *frame* rate.  [...]
>                         ^^^^^
>This is a new one on me, Bob; how can 30Hz (frame rate) non-interlaced be
>any better than 30Hz (frame rate) interlaced video, such as NTSC? Surely
>both will have the same horizontal scan rate and will refresh lines at the
>same rate, so why is there this difference? I think you have the relationship
>backwards: the interlaced display will flicker LESS than the non-interlaced
>one AT THE SAME FRAME RATE. (all other things being equal) If what you
>state were true, there would be no need for interlace, n'est-ce pas? 

Well, interlace lets you get twice the information on the screen (at least in
the case of my Amiga). The flicker is not because of the frame rate, but
because of the *difference* in the information presented in each field (1/2
frame). 

An interlaced 30Hz 1K x 1K screen is probably delivering 1K x 500 pixels each
1/60th of a second. The monitor draws the first 500 lines, then shifts down
slightly and draws the second 500 lines in the next 1/60th of a second, making
a 30Hz frame with 1000 lines. This will flicker because your eye notices the
differences between one field and the other, especially in pixels that are lit
in one field but not in the other.

A non-interlaced display at 30Hz must display all 1000 lines in each frame;
there are no 'fields', or rather the fields are the frames. Since each frame
displays the same information, there will be less flicker. The better solution
would be to make the monitor 60Hz non-interlaced.
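
To put those numbers side by side, here is a small C sketch (assuming 1000
active lines and ignoring blanking; the figures are only illustrative):

/* Sketch of the numbers above: what painting the screen looks like in each
 * scheme, and how often any given line (and pixel) gets repainted. */
#include <stdio.h>

static void show(const char *label, double frame_hz, int interlace, int lines)
{
    printf("%-20s %4.0f Hz vertical sweep, %4d lines per sweep, "
           "any one line repainted %2.0f times/sec\n",
           label, frame_hz * interlace, lines / interlace, frame_hz);
}

int main(void)
{
    show("30Hz 2:1 interlaced", 30.0, 2, 1000);  /* 60 sweeps/s of 500 lines  */
    show("30Hz non-interlaced", 30.0, 1, 1000);  /* 30 sweeps/s of 1000 lines */
    show("60Hz non-interlaced", 60.0, 1, 1000);  /* the "better solution"     */
    return 0;
}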

The advantage of using interlace is that it lets you get twice the information
onto equipment that is only rated to handle half of it. In other words, it
lets you save money by using cheaper monitors. 

-- 
John Sparks   |  {rutgers|uunet}!ukma!corpane!sparks | D.I.S.K. 24hrs 1200bps
|||||||||||||||          sparks@corpane.UUCP         | 502/968-5401 thru -5406 
You are in a maze of twisty little passages, all alike.

sleat@sim.ardent.com (Michael Sleator) (08/26/89)

In article <17400006@hpfcdj.HP.COM> myers@hpfcdj.HP.COM (Bob Myers) writes:

>Interlace does NOT avoid flicker and motion artifacts; flicker (actually
>a host of effects with various names, but let's just do what everybody
>does and lump them together under "flicker) is MUCH  worse in an
>interlaced display over a non-interlaced display of the same size and
>brightness at the same *frame* rate.  You can't avoid it - the price you
>pay for reducing the bandwidth required is the fact that each pixel gets
>refreshed only half as often as before.  This is particulrly annoying
>in images containing narrow horizontal or near-horizontal lines, but even
>a plain white raster, if interlaced, will be obviously worse than a
>non-interlaced raster, all else being equal.

I disagree.  Have you ever compared a 60Hz progressive scan display with
a 60Hz frame rate (120Hz field rate) 2:1 interlaced display?  Under some
circumstances I find the flicker from a 60Hz progressive scan display with
a P4 or similar speed phosphor annoying.  By comparison, a 60Hz (frame) 2:1
interlaced display looks rock solid.  This is *especially* true with a plain
white raster.

There is something else confused in your paragraph above, in that at the
same frame rate, an interlaced and a non-interlaced display require the
same video bandwidth.  Perhaps you meant to describe an interlaced field
rate equal to a progressive frame rate???

One of my pet peeves in this area is that people, by and large, do not
distinguish between interlace per se and refresh rate.  (Witness the much-
used but ill-defined term, "interlace flicker".)  When most people hear
"interlace", they think "30Hz frame rate".  The two are not synonymous.
For example, I can imagine circumstances (somewhat unusual, I'll admit)
in which it would make sense to run at 60Hz frame, 180Hz field, 3:1 interlace.
This might be appropriate in a display where you had a large number of scan
lines, a very small spot size, and for some reason needed a very fast phosphor
yet the screen had to be viewed by humans.
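
To put some arithmetic behind this, here is a small C sketch (my own
illustrative numbers, blanking ignored): at a fixed frame rate and line count,
raising the interlace ratio raises the field rate but leaves the line rate,
and hence the video bandwidth, untouched.

/* Sketch: at a fixed frame rate and total line count, the interlace ratio
 * changes the field rate but not the line rate, so the required video
 * bandwidth stays the same.  The 1200-line total is an assumed figure. */
#include <stdio.h>

int main(void)
{
    double frame_hz = 60.0;
    int total_lines = 1200;                    /* assumed, for illustration */
    int ratio;

    for (ratio = 1; ratio <= 3; ratio++)
        printf("60Hz frame, %d:1 %-13s %3.0f fields/sec, %6.0f lines/sec\n",
               ratio, ratio == 1 ? "(progressive)" : "interlaced",
               frame_hz * ratio, frame_hz * total_lines);
    return 0;
}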

This example points out the reason interlacing "works":  loosely speaking
(someone else can probably do a better job of speaking tightly to this point),
your eye (brain?) integrates over a small spatial and temporal region.  If
the scan lines were very far apart, interlacing would not help.

One of the frequent objections to interlaced displays is that one can easily
construct pathological cases, such as fully illuminating all of the scan
lines on one field with the other field completely dark, which look terrible.
Note, however, that at equal *frame* rates, this case is indistinguishable
in the interlaced and progressive formats.  (See below re the practical
realities of this.)  So again, this is principally an objection to low
refresh rates, not to interlace.

So why would you want to interlace at all, if at the same frame rate you
are not buying any bandwidth relief?  Personally, at 50-60Hz frame rates,
for more or less static displays (e.g., text), I would rather look at an
interlaced display than a non-interlaced one.  These rates are marginally
acceptable in terms of flicker, and interlacing generally helps.  It does
have its costs, however, and they can be high.

The two biggest obstacles to making an acceptable interlaced CRT display
are high-voltage regulation and vertical deflection stability.  In order
to maintain accurate geometry between fields, the electron beam acceleration
voltage must be very tightly regulated.  If the beam current (proportional
to brightness) on one field is different from that on the other field, the
high-voltage power supply will tend to droop, and there will be a geometrical
expansion of the raster.  Clearly, if the two fields are not exactly the same
size, they won't interlace properly.  In a non-interlaced display, it's not
nearly so critical because the image tends to expand as a whole.

The second obstacle, vertical deflection stability, is not hard to understand.
In order to avoid line pairing, the start of each vertical scan must be
controlled to within a small part of one horizontal period.  With say 600
lines per field, this amounts to a small fraction of a percent of the
vertical period.  This is mostly a matter of careful design and doesn't
necessarily imply a parts cost penalty of the same order as the high-voltage
regulation problem.
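
To put a rough number on that tolerance, here is a one-line calculation in C
(assuming the 600 lines per field above and, purely for illustration, an
allowable error of one tenth of a horizontal period):

/* Rough arithmetic on the vertical-start tolerance.  600 lines per field is
 * the figure mentioned above; the 1/10-line error budget is an assumption. */
#include <stdio.h>

int main(void)
{
    double lines_per_field = 600.0;
    double allowed_error   = 0.1;    /* fraction of one horizontal period */

    printf("allowed vertical-start error = %.3f%% of the vertical period\n",
           100.0 * allowed_error / lines_per_field);
    return 0;
}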

There is a small cost in bandwidth, since more time is spent in vertical
retrace.  For 2:1 interlace, this generally amounts to less than five
percent of the total frame time.

Another problem is magnetic interference.  Just as in the high-voltage
problem, any distortion which affects one field differently than the other
will be very noticeable.  A constant magnetic field, such as the Earth's
field, will not cause a problem.  Changing fields, such as from transformers,
adjacent displays, and power wiring, can cause vertically adjacent pixels,
on different fields, to be deflected in opposite directions.  This can be
very objectionable.

Even at equal frame rates, there will still be motion artifacts in an
interlaced display.  Or rather, the motion artifacts on an interlaced
display will be different from those on a non-interlaced display.  It's
not clear to me that one is always better or worse than the other.



I realize that most or all of these points have been made in this group at
one time or another, but confusion still abounds.  Despite its abuses,
interlacing is a valid technique which can be successfully applied to a variety
of real-world needs.  (Can you imagine watching European television, with
its 25Hz frame rate, without interlacing?)  If people were to look carefully
at the real issues here, we might find that there are a variety of
applications where 2:1 or higher ratio interlaced displays running at
field rates of 100 to 200Hz provide better results than progressive scan
displays.  Interlace is not, or at least should not be, a dirty word.


Michael Sleator
Ardent Computer
880 W. Maude Ave.
Sunnyvale, CA  94086
408-732-0400
{apple, decwrl, hplabs, ubvax, uunet}!ardent!sleat

dar@telesoft.telesoft.com (David Reisner) (08/30/89)

In article <17400008@hpfcdj.HP.COM>, myers@hpfcdj.HP.COM (Bob Myers) writes:
> ... there's a distinct difference between a
> "60Hz non-interlaced" display, and a "60Hz interlaced" display.  In one
> case, we're talking frame rate; in the other, the "60Hz" is, of course, the
> *field* rate. ...

I agree with the above.  The discussion about interlaced and non-interlaced
systems with the same frame rate may be true, but is largely irrelevant.
A common interlaced computer monitor would have a 60Hz field rate and a 30Hz
frame "rate".  A common non-interlaced monitor would have a 60Hz field and
frame rate (since they are the same).  Given these numbers, the non-interlaced
monitor will demonstrate less flicker and other objectionable artifacts in
almost all cases.

If someone disagrees significantly with my "common frame rate" estimates,
I am of course interested in hearing about it, but please include specific
information about some specific display systems (e.g. VGA).

-David
ucsd!telesoft!dar, dar@sdcsvax.ucsd.edu

sleat@sim.ardent.com (Michael Sleator) (08/31/89)

In article <470@telesoft.telesoft.com> dar@telesoft.telesoft.com (David Reisner) writes:
>In article <17400008@hpfcdj.HP.COM>, myers@hpfcdj.HP.COM (Bob Myers) writes:
>> ... there's a distinct difference between a
>> "60Hz non-interlaced" display, and a "60Hz interlaced" display.  In one
>> case, we're talking frame rate; in the other, the "60Hz" is, of course, the
>> *field* rate. ...
>
>I agree with the above.  The discussion about interlaced and non-interlaced
>systems with the same frame rate may be true, but is largely irrelevent.

What is it that you agree with?  That the meanings of "60Hz non-interlaced"
and "60Hz interlaced" are, "of course," universally and unambiguously
understood? I hope not, because I think that's far from true.  (The
predecessor to the quoted posting clearly demonstrates this.)  Far better
to say what you mean.

Irrelevant to whom?  There are certainly issues here which are relevant to
those who design, specify, or evaluate video monitors or the hardware which
drives them; a class of people with a significant representation in this
newsgroup.  By extension, these issues are relevant to those who use such
systems.

To say, in effect, "non-interlaced displays run at 60 frames/sec and
interlaced displays run at 30 frames/sec and that's the way the world is,
why talk about it"  may indeed reflect the preponderance of current practice,
but it does nothing to further the state of the art.  Rather, mindsets
like this inhibit advancement of the state of the art.  This was the point
of my "interlace as a dirty word" diatribe a few days ago.  The designer
who thinks that interlacing is a guaranteed cheap way to cut his bandwidth
requirements in half is no better than the customer who thinks "Interlace?
Ick.  I'm not going to touch it."  Neither one is critically appraising the
real issues.  

>A common interlaced computer monitor would have a 60Hz field rate and a 30Hz
>frame "rate".  A common non-interlaced monitor would have a 60Hz field and
>frame rate (since they are the same).  Given these numbers, the non-interlaced
>monitor will demonstrate less flicker and other objectionable artifacts in
>almost all cases.
>
>If someone disagrees significantly with my "common frame rate" estimates,
>I am of course interested in hearing about it, but please include specific
>information about some specific display systems (e.g. VGA).

If you happen to have an Ardent Titan with a stereo display, you may notice
that in certain modes it runs at 120 fields/sec 2:1 interlaced.  Common?
No.  Relevant?  To some of us.  (By the way, the Sony GDM-1950 monitor does
a very nice job with interlaced video.  I hope the replacement for it does
as well.  (You guys down there in San Diego listening??? :-) ))


Michael Sleator
Ardent Computer ("Stardent"?  Well, it's better than "Arellar"!)
880 W. Maude
Sunnyvale, CA  94086
408-732-0400
...!{decwrl | hplabs | ubvax | uunet}!ardent!sleat

P.S.:	I'll be gone for a couple of weeks and follow-ups will probably
	have been purged by the time I get back, so if you want to flame
	me, you should probably mail me a copy.  Actually, that goes for
	low-temperature replies also.

pepke@loligo (Eric Pepke) (09/01/89)

When you're talking about the difference between interlace and non-interlace
per se, it doesn't make any sense to let every other variable differ as well.
If you do, it's never clear just what you are comparing, and the result is
that you can never say anything meaningful about interlace.

Interlace was a kludge, way back in the early days when emulsions were coarse
and amplifiers were slow, to minimize perceived flicker GIVEN A CERTAIN
BANDWIDTH.  You don't need to worry about comb filters and aliasing when
you make your amplifiers out of tubes with Bakelite bases and all your
images come from an image orthicon that is about as grainy as cream of
wheat.  Now, of course, you can build tiny solid-state current feedback 
amplifiers with transitions sharp enough to pierce the firmament.  However, 
bandwidth is still one of the most important numbers that you have to 
engineer around.  In CG terms it translates roughly to horizontal scan 
frequency times number of dots in a line and directly affects

1) Response time of the phosphor
2) The rate at which you have to grab memory
3) The quality of the fastest amplifiers in the entire loop
4) How long you can make your cable and what you have to make it out of

These are the MAJOR engineering decisions.  The question is: given the 
bandwidth you can handle, the potentially high vertical visual frequencies,
the resolution of the bits of phosphor, etc., is interlace itself going to
make the picture better or worse, or will it even matter?  Numbers for existing
systems put on spec sheets by marketing departments have to do more with 
history than with anything else, and freeing one from these bad 40-year-old
decisions that made sense at the time is the whole point of HDTV in the first 
place.

So, when considering interlace vs. non-interlace, keep the bandwidth
constant, as well as the number of lines displayed and pixels per line.  
The kind of niggling about whether 30 Hz is frame rate or field rate is part 
of the reason that cps was abandoned for Hz, so Hz is obviously not a useful 
concept in this context and causes more trouble than it's worth.
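
In that spirit, here is a small C sketch of the comparison I have in mind
(all of the numbers are illustrative assumptions, and blanking is ignored):
hold the pixel bandwidth, the line count, and the pixels per line constant,
and see what frame and field rates an interlaced versus a progressive choice
gives you.

/* Sketch: fix the pixel bandwidth, line count, and pixels per line, then
 * compare progressive and 2:1 interlaced scanning.  All figures assumed. */
#include <stdio.h>

int main(void)
{
    double bandwidth = 40e6;         /* pixels per second, assumed budget */
    double per_line  = 1280.0;       /* pixels per line, assumed          */
    double lines     = 1024.0;       /* lines per frame, assumed          */
    double frame_hz  = bandwidth / per_line / lines;  /* fixed by the above */
    int ratio;

    for (ratio = 1; ratio <= 2; ratio++)
        printf("%-16s %4.1f frames/sec, %5.1f vertical sweeps/sec\n",
               ratio == 1 ? "progressive:" : "2:1 interlaced:",
               frame_hz, frame_hz * ratio);
    return 0;
}

Same pixel budget, same number of lines and dots either way; the only thing
interlace changes is how that fixed budget is laid out in time, leaving the
frame rate alone while doubling the vertical sweep rate.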

Eric Pepke                                     INTERNET: pepke@gw.scri.fsu.edu
Supercomputer Computations Research Institute  MFENET:   pepke@fsu
Florida State University                       SPAN:     scri::pepke
Tallahassee, FL 32306-4052                     BITNET:   pepke@fsu

Disclaimer: My employers seldom even LISTEN to my opinions.
Meta-disclaimer: Any society that needs disclaimers has too many lawyers.