deadman@garnet.berkeley.edu (Ben Haller) (01/28/91)
Howdy, folks. I'm working on a DA that allows display & modification of color tables (sort of like the Klutz DA, but much better). I've run into a few questions that seem to be difficult to answer with IM.

1. How do you find out the "standard" CLUT for a given bit depth and color/grayscale setting? Klutz has a "Default CLUT" button, but tracing through in MacsBug reveals that Klutz gets the default CLUT from resources inside itself, and in fact sets the *wrong* CLUT in grayscale modes, since the standard CLUT for grayscale modes has changed since Klutz came out. There must be some way to get this information out of CQD - how?

2. Inside Mac talks about "protected" and "reserved" color table entries. The mechanism for reserving entries is documented, but doesn't seem to work exactly as documented. The mechanism for protecting entries is not documented - the ProtectEntries call seems to be a black box, and it is *very* slow. What I need is to determine whether a given entry is protected. According to IM, apparently the only way to do this is to try to protect it myself: if I get a protection error, it was already protected; if I don't, it wasn't. This is very, very slow (why??) and really seems like a kludge. Does anyone know how this rather arcane stuff works (or, indeed, whether it really works at all)? Klutz seems to at least determine whether a given entry is protected or reserved, although it is very slow and seems to make mistakes. I can't even determine that stuff correctly, much less quickly or reliably.

3. I would like to patch SetEntries when my DA opens, so that the CLUT may still be changed by the running application, but so that nothing my DA does changes the CLUT. For example, with many CLUTs, opening a dialog box will modify the CLUT so that it contains certain colors the Dialog Manager likes to have around. Ditto for windows, and probably other stuff as well.
My DA needs to prevent these changes from occurring, since a CLUT-displaying DA that modifies the CLUT is really sort of stupid. Klutz doesn't even attempt to address this issue. The big problem is: how do you patch SetEntries under MultiFinder? (This may not seem like a problem, given that I only want to intercept SetEntries calls generated by my DA, which will presumably be in my layer. Unfortunately, I'm not at all sure that this is the case - for example, if my window exists *at all*, the Palette Manager will try to give it CLUT entries, regardless of which layer is currently running. And so on. It seems to me that my DA will cause the CLUT to change in many ways, not always in my layer.) Any ideas?

4. Finally, a request for input. What features would *you* like to see in a CLUT displaying/modifying DA? What do you dislike about Klutz? Etc., etc. This DA will be done quite soon - it's high-priority for me - so reply soon. I really don't care whether you reply personally or post - use common sense to determine how interesting your opinion/input is to the net in general.

-Ben Haller (deadman@garnet.berkeley.edu)
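[For reference, the probe described in question 2 - try to protect the entry yourself and see whether the Color Manager complains - might be sketched like this in Toolbox C. This is classic Mac OS only and will not compile elsewhere; note that the Color Manager call documented in IM V is actually ProtectEntry (singular), which operates on the current GDevice. That reading, and the use of QDError for the result, are my assumptions, not something confirmed in the thread.]

```c
#include <QuickDraw.h>

/* Probe whether entry `index` of the current GDevice's CLUT is
   protected, using the only documented mechanism: attempt to
   protect it and check whether the Color Manager reports an error. */
static Boolean EntryIsProtected(short index)
{
    Boolean wasProtected;

    ProtectEntry(index, true);            /* try to grab protection     */
    wasProtected = (QDError() != noErr);  /* error => already protected */
    if (!wasProtected)
        ProtectEntry(index, false);       /* undo our probe             */
    return wasProtected;
}
```

[As the post says, this is a kludge - it briefly takes ownership of an unprotected entry just to learn its state - but it is the only mechanism IM describes.]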
deadman@garnet.berkeley.edu (Ben Haller) (01/28/91)
In article <1991Jan28.023112.29714@agate.berkeley.edu> deadman@garnet.berkeley.edu (Ben Haller) (ME) writes:
> 1. How do you find out the "standard" CLUT for a given bit depth and
> color/grayscale setting? Klutz has a "Default CLUT" button, but
> tracing through in MacsBug reveals that Klutz gets the default
> CLUT from resources inside it, and, in fact, sets to the *wrong*
> CLUT in grayscale modes, since the standard CLUT for grayscale
> modes has been changed since Klutz came out. There must be some
> way to get this information out of CQD - how?

I should clarify: IM-5, page 81, says: "There are several default color tables that are in the Macintosh II ROMs. There is one for each of the standard pixel depths. The resource ID for each table is the same as the depth. For example, the default color table used when you switch your system to 8 bits per pixel mode is stored with resource ID = 8."

This doesn't solve my problem, because things have changed since that was written. There is now one default color table per depth for color modes, and a *different* color table per depth for grayscale modes. I need to know how this works. Is there a more complex resource ID numbering scheme now? Did 32-Bit QuickDraw patch something in an attempt to use the correct table for whatever display was being "defaulted"? Is there some call that I'm not aware of that returns the correct table? Etc.

-Ben Haller (deadman@garnet.berkeley.edu)