mmachlis@athena.mit.edu (Matthew A Machlis) (03/01/91)
I am working on a realtime graphics simulation for which it is critical that every loop through the program take the same amount of time to run. I realize that the drawing routines will take variable amounts of time to execute depending on what exactly is being drawn on the screen, and I can account for this by forcing the program to pause a bit if the loop runs faster than usual. However, in timing the program I have noticed that occasionally a loop will take much longer (up to 2-3 times as long) than the average loop time.

All I can think of is that some background process is interrupting my program to do something and taking up this extra time. Is there any way I can stop this from happening? I can use "nice," but only the super-user can execute programs with higher priority than usual, and running my program as the super-user all the time is not an option. Is there any way to boot up the machine such that it doesn't run all of the background filekeeping tasks?

I would appreciate any info anybody might have.

--
----------------------------------------------------------------------------
Matt Machlis
MIT Space Systems Laboratory
(617)253-2272
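[A minimal sketch of the pacing scheme described above (time each pass and sleep away whatever is left of the frame), assuming gettimeofday(2) for timing and select(2) with no descriptors as a portable sub-second sleep. FRAME_USEC, update_state(), and draw_frame() are placeholder names, not part of the original program.]

/* Pacing sketch: measure how long one pass took and sleep away the
 * remainder of a fixed frame period. */
#include <sys/types.h>
#include <sys/time.h>
#include <stddef.h>

#define FRAME_USEC 66667L                 /* target period: ~15 Hz */

extern void update_state(void);           /* hypothetical simulation step */
extern void draw_frame(void);             /* hypothetical drawing routine */

static long elapsed_usec(struct timeval *t0, struct timeval *t1)
{
    return (t1->tv_sec - t0->tv_sec) * 1000000L
         + (t1->tv_usec - t0->tv_usec);
}

void sim_loop(void)
{
    struct timeval start, now, rest;
    long left;

    for (;;) {
        gettimeofday(&start, NULL);
        update_state();
        draw_frame();
        gettimeofday(&now, NULL);

        left = FRAME_USEC - elapsed_usec(&start, &now);
        if (left > 0) {                   /* pause for the unused time */
            rest.tv_sec  = left / 1000000L;
            rest.tv_usec = left % 1000000L;
            select(0, NULL, NULL, NULL, &rest);
        }
    }
}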
marks@AIVAX.RADC.AF.MIL (David Marks) (03/01/91)
Two ideas:

1. Check out the TIMER0 and TIMER1 pseudo-devices in GL. In browsing through some code we have, I have seen them used to generate events at regular intervals to update the screen of a flight simulator. They are used in conjunction with the noise() routine, which controls how often the timers generate events.

2. UNIX provides the interval-timer routine setitimer(2), which you can use to trigger a SIGALRM signal after a specified interval.

Idea #1 is probably the way to go, due to its functional simplicity and its direct availability through GL. Sketches of both approaches follow.

Dave Marks
Rome Laboratory
marks@aivax.radc.af.mil
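[A minimal sketch of idea #1, pacing the loop off the TIMER0 pseudo-device, assuming IRIS GL's qdevice(), noise(), and qread(). The timer's tick rate is hardware-dependent, so check the GL reference pages for your machine; update_state() and draw_frame() are placeholders.]

/* Idea #1: wait for TIMER0 events to pace the loop. */
#include <gl/gl.h>
#include <gl/device.h>

#define TICKS_PER_FRAME 4                 /* e.g. every 4th timer tick    */

extern void update_state(void);           /* hypothetical per-frame work  */
extern void draw_frame(void);             /* hypothetical drawing routine */

void sim_loop(void)
{
    short val;
    long  dev;

    qdevice(TIMER0);                      /* queue timer events           */
    noise(TIMER0, TICKS_PER_FRAME);       /* report only every Nth tick   */

    for (;;) {
        do {                              /* block until the next tick    */
            dev = qread(&val);
        } while (dev != TIMER0);

        update_state();
        draw_frame();
    }
}

[And a sketch of idea #2, waking the loop with setitimer(2) and SIGALRM at a fixed rate; the ~15 Hz period and the work routines are again placeholders.]

/* Idea #2: a SIGALRM every frame period; pause() until it fires. */
#include <signal.h>
#include <unistd.h>
#include <sys/time.h>

extern void update_state(void);
extern void draw_frame(void);

static void wakeup(int sig)
{
    signal(SIGALRM, wakeup);              /* re-arm (SysV-style signals)  */
}

int main(void)
{
    struct itimerval it;

    signal(SIGALRM, wakeup);

    it.it_interval.tv_sec  = 0;
    it.it_interval.tv_usec = 66667L;      /* ~15 Hz; adjust as needed     */
    it.it_value = it.it_interval;
    setitimer(ITIMER_REAL, &it, (struct itimerval *)0);

    for (;;) {
        update_state();
        draw_frame();
        pause();                          /* sleep until the next SIGALRM */
    }
}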
micah@flobb4.csd.sgi.com (Micah Altman) (03/05/91)
In <1991Feb28.204654.26010@athena.mit.edu> mmachlis@athena.mit.edu (Matthew A Machlis) writes:

>I am working on a realtime graphics simulation for which it is critical that
>every loop through the program take the same amount of time to run. I
>realize that the drawing routines will take variable amounts of time to
>execute depending on what exactly is being drawn on the screen, and can
>account for this by forcing the program to pause a bit if the loop runs
>faster than usual. However, in timing the program I have noticed that
>occasionally a loop will take much longer (up to 2-3 times as long) than the
>average loop time. All I can think of is that some background process is
>interrupting my program to do something and taking up this extra time. Is

This is definitely a possibility. Other possibilities include:

+ you are accessing a memory location not loaded, and need to read a page in from disk
+ you are occasionally running into overflow/underflow, which generates hidden traps and slows things down a lot
+ you are trapped in some kernel/networking service (another process, but not necessarily a background process)

>there any way I can stop this from happening? I can use "nice," but only

See "Using Real Time Programming" in the current IRIS-4D Programmer's Guide for complete details and a sample realtime program.

>the super-user can execute programs with higher priority than usual, and
>running my program as the super-user all the time is not an option. Is
>there any way to boot up the machine such that it doesn't run all of the
>background filekeeping tasks?

There are some daemons you can turn off, especially if you aren't planning to do networking and disk I/O, but unfortunately you can't turn off everything. The only way to run in true realtime is to be able to do some configuration of the machine and processes running as superuser, and to be running on a multiprocessor machine.

--
"Entia non sunt multiplicanda sine necessitate." - William of Ockham
Micah Altman, "Computational Juggler"    micah@csd.sgi.com
Phone (415) 335-1866    FAX (415) 965-2309
Disclaimer: Everything in this document is a lie.
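[For illustration, the kind of superuser-only setup the realtime guide covers might look roughly like the sketch below. It assumes IRIX's plock(2) for locking the process in core and schedctl(2) with the NDPRI command for a non-degrading priority; the constants (PROCLOCK, NDPHIMAX) come from sys/lock.h and sys/schedctl.h and may differ between releases, so treat this as an outline to check against the manual pages rather than a recipe. Both calls normally require superuser privilege.]

/* Sketch (IRIX, superuser): lock the process in memory and request a
 * non-degrading CPU priority before entering the simulation loop.
 * Verify call forms against plock(2), schedctl(2) and the Real Time
 * Programming chapter of the Programmer's Guide. */
#include <sys/types.h>
#include <sys/lock.h>                     /* plock(), PROCLOCK            */
#include <sys/schedctl.h>                 /* schedctl(), NDPRI, NDPHIMAX  */
#include <stdio.h>

int main(void)
{
    /* Keep text and data resident so page faults can't stall a frame. */
    if (plock(PROCLOCK) == -1)
        perror("plock");

    /* Ask for a high, non-degrading priority so the scheduler does not
       age the process down over time. */
    if (schedctl(NDPRI, 0, NDPHIMAX) == -1)
        perror("schedctl");

    /* ... run the paced simulation loop here ... */
    return 0;
}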