[comp.windows.x] Do Servers Perform Gamma Correction?

jdp@polstra.UUCP (John Polstra) (02/17/90)

The description of the XColor structure in the XLIB book seems to imply
that the RGB values correspond linearly to the intensities on the
display.  ("On full ... is a value of 65535 ... Half brightness is a
value of 32767 ...")  Do the various servers attempt to achieve
linearity via gamma correction or some similar scheme?  I looked in the
obvious places in the server sources but did not find any such
intensity correction.

Somebody (server or client) has to perform intensity correction, and it
seems clear that the server, which knows the characteristics of the
display, is the obvious choice.
-- 

-  John Polstra               jdp@polstra.UUCP
   Polstra & Co., Inc.        ...{uunet,sun}!practic!polstra!jdp
   Seattle, WA                (206) 932-6482

rws@EXPO.LCS.MIT.EDU (Bob Scheifler) (02/20/90)

    The description of the XColor structure in the XLIB book seems to imply
    that the RGB values correspond linearly to the intensities on the
    display.

The implication is wrong (Xlib probably should be reworded).  The server does
not do gamma correction.


    Somebody (server or client) has to perform intensity correction, and it
    seems clear that the server, which knows the characteristics of the
    display, is the obvious choice.

There is some discussion going on in the ximage mailing list about gamma
correction and where it should be done.  Saying it should *always* be done
isn't necessarily the right answer either.  The real problem, I think, for most
applications, stems from the use of an inherently device-dependent color model
(RGB) as the only interface.  A device independent color model (e.g. the HVC
model that Tektronix presented at a past X conference) would be a better
solution than introducing gamma correction.

mccoy@pixar.UUCP (Daniel McCoy) (02/27/90)

In article <9002192154.AA01862@expire.lcs.mit.edu> rws@EXPO.LCS.MIT.EDU (Bob Scheifler) writes:
>There is some discussion going on in the ximage mailing list about gamma
>correction and where it should be done.  Saying it should *always* be done
>isn't necessarily the right answer either.  The real problem, I think, for most
>applications, stems from the use of an inherently device-dependent color model
>(RGB) as the only interface.  A device independent color model (e.g. the HVC
>model that Tektronix presented at a past X conference) would be a better
>solution than introducing gamma correction.

It *is* always done.  Currently what happens is that people choose
colors that look good on the monitor they are using.  So the colors
in places like rgb.txt have been manually corrected for somebody's
monitor.  Since monitors don't differ too greatly, people don't notice.

That is, until they try to display image data, whether scanned or synthetic.
Applications that care end up having to guess the monitor characteristics
and do the correction themselves.

I don't agree that a new color model is required.  
All that's needed are some correction tables (r, g, and b per screen) and 
a way for applications that care to specify DoCorrection to XStoreColors.
That would handle most things quite well, without breaking any existing
clients.

If you try to do really high quality correction, like color matching for
the print world, I doubt you will be able to satisfy everybody.  
For applications that need that level of quality, some queryable 
monitor characteristics, such as white point, should help them out. 

Dan McCoy  ...!ucbvax!pixar!mccoy

(Sorry if you think that ximage is the place for this kind of stuff,
things have gotten pretty quiet there.  There were only two of us making
up that discussion.)