[comp.windows.x] Server out of memory error?

warren@cbnewsh.att.com (warren.a.montgomery) (12/26/90)

I am seeing an error I can't explain while running an application
under X11R4 on Sun workstations (it happens on both sun3 and
sun4, using, I believe, the standard X server).  My application
is an animation that runs directly on Xlib; it issues a large
number of CopyArea requests from off-screen bitmaps onto a
window, flushing the buffer to the server after each one.  If it
runs in this mode long enough (where "long enough" is pretty
short, a minute or two), it gets the error:
fatal IO error 12 (Not enough memory)
from the server.  The error seems to be triggered by the timing
of the requests rather than their absolute number: if I pause it
after every 30 seconds or so and then continue, it never gets
the error.
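
Schematically, the drawing loop looks like this (names are
illustrative, setup and error handling omitted):

    #include <X11/Xlib.h>

    /* Copy pre-rendered off-screen frames onto the window, */
    /* flushing after every request, as described above.    */
    void animate(Display *dpy, Window win, GC gc,
                 Pixmap frames[], int nframes,
                 unsigned int w, unsigned int h)
    {
        int i;
        for (;;) {
            for (i = 0; i < nframes; i++) {
                /* One CopyArea per frame, pixmap -> window */
                XCopyArea(dpy, frames[i], win, gc,
                          0, 0, w, h, 0, 0);
                /* Push the request out to the server now */
                XFlush(dpy);
            }
        }
    }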

The message I get also indicates that there are gobs of
unprocessed requests pending when it dies.  It looks like the
server is giving priority to reading and queueing new requests
over executing them, even as it runs out of memory.  If so, this
seems pretty silly; why not give priority to executing requests,
letting the client block when the channel to the server fills up
with unread requests?  Does anyone out there know if this is
indeed what is happening?  If so, how does an application defend
itself against this suicidal server behavior?

Reply by mail if possible.

Thanks

Warren Montgomery (iexist!warren@att.com)
-- 

	Warren Montgomery
	att!ihlpf!warren

warren@cbnewsh.att.com (warren.a.montgomery) (12/28/90)

Thanks to the several folks who suggested the cause of my server
out-of-memory error.  The graphics operations produce gobs of
NoExpose events that never get read, since the application is in
a tight loop drawing, and the unread events pile up and eat
memory.  It's easily remedied by disabling graphics exposures on
the GC used for these operations.
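
For anyone who hits the same thing, it's one Xlib call on the GC
used for the copies (dpy and gc as in the sketch above):

    /* Stop the server from generating GraphicsExpose/NoExpose */
    /* events for CopyArea/CopyPlane requests through this GC. */
    XSetGraphicsExposures(dpy, gc, False);

or equivalently set graphics_exposures to False in the XGCValues
passed to XCreateGC.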
-- 

	Warren Montgomery
	att!ihlpf!warren