[comp.graphics] Color perception and Re: Virtual Reality

mjones@stdc01.UUCP (Michael Jones) (11/21/89)

COLOR PERCEPTION:

In <391@ucsvc.ucs.unimelb.edu.au> U5569462@ucsvc.ucs.unimelb.edu (DAVID CLUNIE) writes:

  >Which brings me to the point that I have read somewhere recently (can't
  >remember where) that the human eye CAN'T distinguish any more than 64 
  >different shades of grey. Is this so ? Do people believe it ?

It is not so, and those "in-the-know" do not believe it. My professional work
in graphics involves real-time out-the-window flight simulation, and this is
an area where 30-Hz update, 4000-polygon, smooth-shaded, texture-mapped, anti-
aliased 24-bit (8-bit R, G, and B) output is "good", and 60-Hz update,
8000-polygon, 36-bit (12-bit R, G, and B) is "real good". We are also expected to 
provide as many as 12 channels (displays) of the data-base (containing 10^5 
to 10^7 polygons) at the same time. 

The LSBs are needed in low-light situations, so that you can have both
dynamic range and precision. Otherwise, Mach banding will be _very_ obvious 
in textured and/or smooth-shaded objects.
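A back-of-the-envelope sketch (not from the post; it assumes the common
rule of thumb that the eye's just-noticeable relative luminance difference
is roughly 1%) shows why 64 gray levels band while 8 or 12 bits fare better
near mid-gray:

```python
# Sketch: relative step between adjacent gray levels at mid-scale.
# Assumption (not from the post): a ~1% Weber fraction, i.e. luminance
# steps above roughly 1% of the current level are visible.

def step_contrast(bits):
    """Relative luminance step between adjacent levels near mid-gray."""
    mid = (2 ** bits) // 2
    return 1.0 / mid  # ((mid + 1) - mid) / mid

for bits in (6, 8, 12):
    c = step_contrast(bits)
    note = "likely visible banding" if c > 0.01 else "below the ~1% rule of thumb"
    print(f"{bits}-bit gray: step = {c:.3%}  ({note})")
```

Near the dark end of the scale the relative step is far larger still (going
from level 1 to level 2 is a 100% jump), which is exactly why the low-order
bits matter most in low-light scenes.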

Those extra bits (plus the fractional bits required for good sub-pixel blending
in transparent and multi-coverage situations) tend to be _really_ expensive to
come by, and yet they are often required -- so I know I am not alone in my
appreciation of them.
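To illustrate the point about fractional bits with a toy model (my own
hypothetical sketch, not the actual simulator pipeline): compositing many
translucent layers while rounding to whole values at every blend accumulates
error that a few extra fractional bits in the accumulator largely avoid.

```python
# Toy model (assumption, not real hardware): repeated src-over-dst blends
# of a 10%-alpha layer, keeping `frac_bits` extra fractional bits of
# fixed-point precision in the accumulator between blends.

def composite(n_layers, src=200, dst=0, alpha=0.1, frac_bits=0):
    scale = 1 << frac_bits              # fixed-point scale factor
    acc = dst * scale
    for _ in range(n_layers):
        # blend, then round to the available precision
        acc = round(alpha * src * scale + (1 - alpha) * acc)
    return acc / scale

exact = 200 * (1 - 0.9 ** 20)           # closed-form float reference
err_0 = abs(composite(20, frac_bits=0) - exact)
err_4 = abs(composite(20, frac_bits=4) - exact)
print(f"error with 0 fractional bits: {err_0:.3f}")
print(f"error with 4 fractional bits: {err_4:.3f}")
```

With no fractional bits the rounding error after 20 blends is on the order
of a whole gray level; four fractional bits keep it well under one.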

VIRTUAL REALITY:

In response to the many postings on this hot, new topic:

"Virtual Reality" is only a new idea to people moving up from workstation
graphics. NASA astronauts could all tell you that they've been in space before
they ever left the ground. This is also true for airline pilots, military
pilots, truck and tank drivers, ship's pilots, cargo crane operators, and many
others. Many of the people at LLNL and LASL have "been there" (at least for a
few nanoseconds.)


EMOTIONAL CLAIM:

## step .. step .. step --- we can hear the sound of someone surmounting the ##
## soap-box.  There is an almost electric anticipation throughout the crowd. ##

  Don't fall into the trap of thinking that the IRIS and TAAC systems define 
  some kind of "extreme high-end graphics hardware". They are nice, but like 
  all engineering endeavors, they involve many trade-offs, which, in their 
  case, appear to be aimed at good performance within low-cost constraints.

## thunderous applause is heard as the speaker is held aloft by the many who ##
## have heard and understood the wisdom of his words. Much has been learned! ##


-- 
-- Michael T. Jones          Email:            ...!mcnc!rti!stdc01!mjones --
-- The wise man will pursue  Paper: 3101-H Aileen Drive, Raleigh NC 27606 --
-- excellence in all things  Voice: W:(919)361-3800  and  H:(919)851-7979 --

rick@hanauma.stanford.edu (Richard Ottolini) (11/22/89)

In article <578@stdc01.UUCP> mjones@stdc01.UUCP (Michael Jones) writes:
>
>"Virtual Reality" is only a new idea to people moving up from workstation
>graphics. NASA astronauts could all tell you that they've been in space before
>they ever left the ground. This is also true for airline pilots, military
>pilots, truck and tank drivers, ship's pilots, cargo crane operators, and many
>others. Many of the people at LLNL and LASL have "been there" (at least for a
>few nanoseconds.)

I somewhat disagree.  These are experiments along a continuum of development
heading toward virtual reality.  Two significant changes in the past couple
of years are greater creative control over your "synthetic reality" and
increased accessibility (lower cost).  In most of the above examples, the
human is there only for the ride.  There are some neat results when the human
can radically alter the environment and interact with other humans in the
synthetic reality.  Also, the prices of the I/O helmets, gloves, and suits
have dropped from six figures to four figures, and will soon drop lower.