[comp.windows.interviews] urgent delete world problem

cathryn@UUNET.UU.NET (Cathryn Szekely) (06/05/91)

Hello,

I am looking for a very quick answer to a question.  I only have a week to
fix this problem so if anyone could have mercy on me and help me I would
appreciate it.

I am running InterViews version 2.6 on SCO/UNIX.

I have a program that spawns a child process, and I am having trouble
dealing with deleting the World in the child process, then returning to 
the parent.


Program Description:
--------------------
I have a World with a display of graphics inserted into it.  The display
of graphics consists of various Interactors including a MenuBar at
the top and a GraphicBlock containing a Picture with various Graphic
objects in it.

One of the menu options on my display of graphics is "Edit".  Choosing
the Edit option causes a process to be spawned.  The spawned process
is an exact copy of the original (in other words, I don't exec()
anything after I fork()).  Then the following occurs (a rough sketch
of this control flow appears after the list):

   - The parent process removes its graphic display from the world
     before fork() is called, and then just goes to sleep.

   - The child process enters the editor (by inserting a new graphic
     display into the world) and allows the user to edit the graphic 
     display.

   - If the user quits the editor without saving changes, the child process
     kills itself and signals the parent to wake up so that the user
     will be returned to the original graphic display.

   - If the user saves changes and quits the editor, then the child process 
     kills the parent and takes over and returns to the graphic display with 
     the new changes.
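
For reference, here is a rough sketch of that control flow.  The
routines RemoveDisplayFromWorld(), EditGraphics(), ShowEditedDisplay(),
and RestoreDisplay() are placeholders for my real code, and the choice
of SIGUSR1 is just an example:

    #include <signal.h>
    #include <unistd.h>

    extern void RemoveDisplayFromWorld();   // placeholder routines --
    extern int EditGraphics();              // EditGraphics() returns
    extern void ShowEditedDisplay();        // non-zero if the user
    extern void RestoreDisplay();           // saved changes

    static void wakeup(int) { }             // parent's pause() returns here

    void SpawnEditor() {
        RemoveDisplayFromWorld();           // parent removes display first
        signal(SIGUSR1, wakeup);
        if (fork() == 0) {
            if (EditGraphics()) {           // saved: kill the parent
                kill(getppid(), SIGTERM);   // and take over
                ShowEditedDisplay();
            } else {                        // not saved: wake the parent
                kill(getppid(), SIGUSR1);
                _exit(0);
            }
        } else {
            pause();                        // parent sleeps until signaled
            RestoreDisplay();               // re-insert original display
        }
    }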


Problem:
--------
When the user quit without saving changes, the child process exited and
woke up the parent.  The parent tried to re-insert the original graphic
display, but then it exited (why? did something weird happen to the world?).
The error message displayed started with:
  X Error: BadIDChoice, invalid resource ID chosen for this connection....


Kludgey Solution:
-----------------
I figured that even though the child was a separate process, it must
have done something (external to the process) to the world (perhaps
involving the X server?), so that when the parent woke up, the world
was no longer in a state that it recognized.

So now, when I spawn the editor process, I create a new world for it,
and when the user quits without saving changes, the parent has no
problem re-inserting the graphic display into its own world.  (This
works fine!)
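
In rough code, the child's side now looks like this.  This is a sketch
from memory: the World constructor arguments may not match the 2.6
headers exactly, and MakeEditorDisplay() stands in for my real setup
code:

    #include <InterViews/world.h>
    #include <InterViews/event.h>

    extern int saved_argc;                  // argc/argv stashed away at
    extern char** saved_argv;               // startup (my own convention)
    extern Interactor* MakeEditorDisplay(); // placeholder

    void EnterEditor() {
        // Child only: build a brand-new world (a second server
        // connection) for the editor instead of reusing the parent's.
        World* editorWorld = new World("editor", saved_argc, saved_argv);
        editorWorld->Insert(MakeEditorDisplay());

        // Standard 2.6 event loop:
        Event e;
        do {
            editorWorld->Read(e);
            e.target->Handle(e);
        } while (e.target != nil);
    }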


Problem:
--------
When the editor process is spawned, it creates its own world.  It then
has to delete the old world so that the old world's memory is not left
hanging around in the child process.  If I do this and the user quits
without saving changes, the return to the parent's original graphic
display produces a "Bad Font" error and the program exits.

The error starts with:

X Error: Bad Font ...

If I do not delete the old world, everything works fine, but obviously
a major memory leak every time I enter and exit the editor process is
not acceptable!


Possible Solution:
------------------
I looked in the X11-world.C code and saw that the destructor for
World calls XCloseDisplay() which, according to my X reference manual,
destroys all of the resources the client created on that connection.
That explains the "Bad Font" error.

The X manual also describes a call to XSetCloseDownMode(display, mode),
which is supposed to stop this from happening.  I called this routine
from the editor process just before deleting the world, as follows:

    // Ask the server to retain this client's resources after the
    // connection closes, instead of destroying them:
    XSetCloseDownMode(world->Rep()->display(), RetainTemporary);
    delete world;
    world = new World(...);


Problem:
--------
When the user quits without saving changes and the parent process tries
to re-insert its graphic display into its world, everything hangs
miserably and X gets totally trashed.  If I try it a couple of times,
my whole machine gets hosed and I have to re-boot, so obviously there
is more to it than I thought.


Questions:
----------
1. How can I free the world space in the child process without trashing the
   parent's world?

2. Do I have to create yet another new world in the parent process before
   re-inserting the graphic display?

3. Do I have to somehow call XOpenDisplay to get the world back to normal?

4. Is there some way to do this without having to create a new world at all?

5. (Unrelated to the above questions)  When I enter the editor process
   and insert the new graphic display, there is a period in between
   where the original display has already gone away and the new one has
   not yet appeared.  Is there a way to smooth this out, so that the old
   display does not disappear until the new one appears?  (I tried
   removing the old one after inserting the new one, but the old one
   stays in the background.  Calls to Flush() and Sync() did not fix
   this problem.)

linton@marktwain.rad.sgi.com (Mark Linton) (06/06/91)

In article <9106042350.AA15775@chipcom.CHIPCOM.COM>, chipcom!cathryn@UUNET.UU.NET (Cathryn Szekely) writes:
|> I have a program that spawns a child process, and I am having trouble
|> dealing with deleting the World in the child process, then returning to 
|> the parent.
|> The error message displayed started with:
|>   X Error: BadIDChoice, invalid resource ID chosen for this connection....

The problem is that the child gets a copy of the X Display information, which includes
things like the "next" resource id to allocate (for things like windows), while the parent
keeps the original.  The child allocates resource ids over the (shared) connection, but the
parent's copy of the counter never sees them.  When the parent tries to allocate a resource
id later, it picks one the child has already used.
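
For example, in plain Xlib terms (a made-up fragment, not your code),
suppose the child happens to create a window first:

    #include <X11/Xlib.h>
    #include <unistd.h>

    int main() {
        Display* dpy = XOpenDisplay(0);  /* one connection, one id counter */
        if (fork() == 0) {
            /* Child: Xlib hands out the next id from its copy of the
               counter, and the server marks that id in use. */
            XCreateSimpleWindow(dpy, DefaultRootWindow(dpy),
                                0, 0, 100, 100, 1, 0, 0);
            XSync(dpy, 0);
        } else {
            sleep(1);  /* let the child go first */
            /* Parent: its copy of the counter still says the same id is
               free, so this create reuses the child's id and the server
               answers with BadIDChoice. */
            XCreateSimpleWindow(dpy, DefaultRootWindow(dpy),
                                0, 0, 100, 100, 1, 0, 0);
            XSync(dpy, 0);
        }
        return 0;
    }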

|> X Error: Bad Font ...

In 2.6, we do font caching that does not work properly across multiple displays.
So the child looks up a font name in the cache, gets back a font id that belongs
to the old (now closed) display connection, and tries to use it on its new
display connection.
This is fixed in 3.0, but I don't know how easy it would be for you to convert.
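
In plain Xlib terms, the failure looks like this (a made-up fragment,
not the InterViews cache code):

    #include <X11/Xlib.h>

    int main() {
        Display* dpy1 = XOpenDisplay(0);  /* the old world's connection */
        Display* dpy2 = XOpenDisplay(0);  /* the new world's connection */

        /* A font id is a resource owned by the connection that loaded
           it; a name->id cache shared across connections hands back a
           stale id once the first connection is gone. */
        Font f = XLoadFont(dpy1, "fixed");
        XCloseDisplay(dpy1);              /* like deleting the old world */

        GC gc = XCreateGC(dpy2, DefaultRootWindow(dpy2), 0, 0);
        XSetFont(dpy2, gc, f);            /* stale id -> Bad Font */
        XSync(dpy2, 0);                   /* the error is reported here */
        return 0;
    }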