[comp.sys.sgi] real-time and 3D workstations.

aries@XP.PSYCH.NYU.EDU (Aries Arditi) (09/08/89)

I've been following the discussion prompted by your request to info-iris for
a machine on which to conduct your experiments.  I'm afraid that most of
the discussants don't quite understand what the demands of an RT
experiment are.  The IRIS machines, and most other 3D workstations, are
UNIX-based, which almost by definition means they are NOT real-time in the
sense you need for accurate measurement of time.  Depending on what is
happening at any given moment, any number of processes are competing for the
CPU's time.  This means that you will always have some uncertainty as to
the onset of your stimulus.  Additionally, you will have some uncertainty
as to the button-press time, unless you have an external real-time clock hung
on the back (Personal IRISes have only one VME bus slot, so only one such
device can be on your system at once), and even then, since you don't know
exactly when the stimulus onset was, you still have error.

I think the IRIS sounds like the machine in your price range to generate
your stimuli, but it may or may not be adequate, depending on how accurately
you need to measure RT.  Here are two possible solutions:

1. There are versions of "real-time" UNIX available (I know not where, but
someone at AT&T Murray Hill developed one of them for DEC PDP-11s).
Possibly an SGI sales rep will do some legwork to find one for you if he
thinks that's necessary to make the sale.  In the PDP-11 market, these
real-time versions of UNIX varied quite extensively both in quality, I
understand, and in the extent to which they were "real-time."  So be careful
here.  You will need to know that you can still run the software for your
experiment under this operating system.

2. The solution I have used with some success, on another multi-tasking
(i.e. not real-time) machine, is to count video frames when measuring
time, rather than waiting for an interrupt from a clock or other device.
It's very easy, but your accuracy is limited to +/- 16.66667 msec, if that's
okay for your application.  Sometimes you can degrade your stimulus
or task in some irrelevant way in order to raise all your RTs, so that 17 msec
isn't such a big deal.
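
As a rough illustration (and only that: this is an untested sketch which
assumes double-buffered IRIS GL, where swapbuffers() does not return until
the next vertical retrace, and uses the left mouse button as the response
key; check the GL manual before trusting any of it), frame counting looks
something like this:

    /* Untested sketch: measure RT in whole video frames on an IRIS.
     * Assumes double-buffered IRIS GL, so that each swapbuffers()
     * call takes one video frame.  The stimulus-drawing code is
     * left as a placeholder.
     */
    #include <gl.h>
    #include <device.h>

    long
    frames_until_response()
    {
        long frames = 0;

        /* ... draw the stimulus into the back buffer here ... */
        swapbuffers();                  /* stimulus becomes visible */

        while (!getbutton(LEFTMOUSE)) { /* poll the response button */
            swapbuffers();              /* one pass per video frame */
            frames++;
        }
        return frames;                  /* RT ~ frames * frame time */
    }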

Anyway, computer graphics people usually think that real-time just means
animated, so watch out for that.

Good luck.
-Aries Arditi
 Vision Research Laboratory
 The Lighthouse
 111 E 59th Street
 New York, NY 10022

jmb@patton.sgi.com (Jim Barton) (09/11/89)

In article <8909081349.AA26475@cmcl2.NYU.EDU>, aries@XP.PSYCH.NYU.EDU (Aries Arditi) writes:
> 
	[ ... discussion of some aspects of RT UNIX ... ]
> 
> I think the IRIS sounds like the machine in your price range to generate
> your stimuli, but it may or may not be adequate, depending on how accurately
> you need to measure RT.  Here are two possible solutions:
> 
> 1. There are versions of "real-time" UNIX available (I know not where, but
> someone at AT&T Murray Hill developed one of them for DEC PDP-11s).
> Possibly an SGI sales rep will do some legwork to find one for you if he
> thinks that's necessary to make the sale.  In the PDP-11 market, these
> real-time versions of UNIX varied quite extensively both in quality, I
> understand, and in the extent to which they were "real-time."  So be careful
> here.  You will need to know that you can still run the software for your
> experiment under this operating system.

Actually, IRIX has some real-time features which may be adequate for this
application.  First, it is possible to fix a process's priority and thus
exempt it from normal priority degradation.  There are various priorities
supported, and of course the highest is guaranteed control of the processor
when it wants it.  Fine-grained memory locking, which is probably not needed
for this application, allows total control over what's in memory.
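
Something along these lines (a sketch only, not tested; I'm assuming the
schedctl() non-degrading-priority interface and the System V plock() call,
so check the IRIX manual pages for the exact constants and the privilege
required):

    /* Sketch: run at a fixed (non-degrading) priority and lock the
     * process in memory.  Assumes schedctl(NDPRI, ...) and plock();
     * both normally require superuser privilege.
     */
    #include <sys/types.h>
    #include <sys/schedctl.h>       /* schedctl(), NDPRI, NDPHIMAX */
    #include <sys/lock.h>           /* plock(), PROCLOCK */
    #include <stdio.h>

    int
    main()
    {
        if (schedctl(NDPRI, 0, NDPHIMAX) < 0)   /* highest non-degrading priority */
            perror("schedctl");
        if (plock(PROCLOCK) < 0)                /* lock text and data in core */
            perror("plock");

        /* ... time-critical part of the experiment goes here ... */
        return 0;
    }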

> 2. The solution I have used with some success, on another multi-tasking
> (i.e. not real-time) machine, is to count video frames when measuring
> time, rather than waiting for an interrupt from a clock or other device.
> It's very easy, but your accuracy is limited to +/- 16.66667 msec, if that's
> okay for your application.  Sometimes you can degrade your stimulus
> or task in some irrelevant way in order to raise all your RTs, so that 17 msec
> isn't such a big deal.

The accuracy of gettimeofday() is about 1 millisecond, which should be
more than adequate for human/machine interaction (which is on the order of
tenths of seconds).  Using itimers gives interrupts at a 100 Hz rate,
or, as pointed out, you can use the graphics system's frame rate as a clock
as well.
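
For instance, interval timing with gettimeofday() looks roughly like this
(an untested sketch; wait_for_button() is just a stand-in for whatever
collects the subject's response):

    /* Sketch: time a response interval with gettimeofday().
     * The structure reports microseconds, though the useful accuracy
     * is on the order of a millisecond.
     */
    #include <sys/time.h>
    #include <stdio.h>

    int
    main()
    {
        struct timeval t0, t1;
        long usec;

        gettimeofday(&t0, (struct timezone *)0);   /* (approximate) stimulus onset */
        wait_for_button();                         /* stand-in: block until response */
        gettimeofday(&t1, (struct timezone *)0);   /* response time */

        usec = (t1.tv_sec - t0.tv_sec) * 1000000L + (t1.tv_usec - t0.tv_usec);
        printf("RT = %ld.%03ld msec\n", usec / 1000, usec % 1000);
        return 0;
    }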

> Anyway, computer graphics people usually think that real-time just means
> animated, so watch out for that.

Some of us know what REAL real-time means, even though we do graphics too.
IRIX is a "soft" real-time system, in that the mechanisms for guaranteed
response and resource control are good down to ~1 millisecond.  A "hard"
real-time system doesn't run UNIX and can get down to tens of microseconds.
Human/machine interaction is definitely "soft".

> Good luck.
> -Aries Arditi
>  Vision Research Laboratory
>  The Lighthouse
>  111 E 59th Street
>  New York, NY 10022

-- Jim Barton
Silicon Graphics Computer Systems    "UNIX: Live Free Or Die!"
jmb@sgi.sgi.com, sgi!jmb@decwrl.dec.com, ...{decwrl,sun}!sgi!jmb

SCCR50::DSLKRM%rsgate.rsg.hac.com@BRL.MIL ("Kevin R. Martin" -578-4316, 213) (09/15/89)

     Please excuse my delay in accessing and contributing to the engaging
discussion on realtime unix/graphics.  All the comments were great.  I
couldn't resist adding a thing or two.

     First, on the discussion of realtime.  Realtime is what you need it to be,
no more and no less.  From James Martin, Design of Real-Time Computer Systems:
"A real-time computer system may be defined as one which controls an
environment by receiving data, processing them, and taking action or returning
results **sufficiently** quickly to affect the functioning of the environment
at that time."  Milliseconds and microseconds are interesting, but they don't
define realtime.  They just define our current limits in quantizing time and
reacting to it via computers.  (Of course the existence of "real-time
features" makes the job easier and helps to "classify" an operating system.)

     Let's not forget that the Voyager sent back 'realtime' video, traveling at
the speed of light, which was received several hours after it encountered its
subjects :-)!

     Second, out of interest in the subject of realtime workstations, let me
refer to a case involving realtime unix/graphics and the measurement of time.

     In any discussion on measuring time we usually refer to these terms (among
others):

Resolution:	Minimum time interval that can be measured.
Accuracy:	The degree of conformity of the measure to a true value.
Precision:	The degree of refinement with which a measurement is stated.

     Now let's say the unix/graphics task is to measure the amount of time it
takes for an observer to react to an event (stimulus) in the graphics.
Ignoring the speed of light (a valid assumption this time :-)), we would want
to measure the time from when the event occurs on the screen until the
observer taps a response button.  To get a feeling for the time and distances
involved, imagine a car appearing out of the fog, in your lane, heading in
your direction.  If you're both traveling at about 35 mph, ten milliseconds
represents about one foot of closing distance.
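
     (Worked out: the 70 mph closing speed is roughly 103 feet per second, so
0.010 sec covers just over one foot.)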

     Let's assume the event (say a traffic light changing this time) takes
place in the middle of a non-interlaced display.  Since I presume I can't call
gettimeofday() at the exact point in time on the display when the event becomes
visible (say when half the pixels representing the event are turned on, or even
when the middle scan line of those representing the event comes on), I'll end
up calling it some time nearby.  The value I read may very well have
microsecond resolution, and accuracy of 1 millisecond compared to a 'real'
world clock.  However, because of the latency (I couldn't access the clock at
the desired point in time), the value I have to work with may need to be stated
with a precision somewhere around half a frame time (+/- 16.667 msec on a
30 Hz noninterlaced monitor).  And this is just the time measurement needed to
represent the start of the interval being measured!  Similar difficulties
arise in measuring the end of this interval (and any interim points), and they
only compound the precision loss.  Despite these limitations, we ARE measuring
these intervals with significantly better precision, WITH Silicon Graphics
machines and some custom hardware.
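
     (To put rough numbers on the compounding: if the start and the end of the
interval each carry a +/- 16.7 msec uncertainty, the worst-case error in the
measured RT is about +/- 33 msec.)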

     And yes, it is true that human/machine interaction can be considered on
the order of tenths of seconds.  But we would like to know just how good we
humans are and how much time we're losing.  Perhaps we can't always rely on
humans to be our only interface to machines...


Kevin R. Martin                         Internet: dslkrm@sccr50.dnet.hac.com
Radar Systems Group                     U.S. Mail:
Hughes Aircraft Company                 Loc: RC, Bldg: R50, M/S: 2723
GM Hughes Electronics Corp.             P.O. Box 92426
General Motors Corp.                    Los Angeles, CA 90009