[rec.video] Gamma correction

dave@imax.com (Dave Martindale) (06/06/91)

In article <14070@exodus.Eng.Sun.COM> srnelson@nelsun.Eng.Sun.COM (Scott R. Nelson) writes:
>
>The broadcast television industry has settled on a standard gamma value of
>2.222222 (1.0/0.45).  This has been built into television sets for
>decades.  This value happens to look correct on all properly adjusted
>monitors that I have seen.

Well, actually, NTSC specifies a monitor gamma of 2.2 exactly, giving a
camera gamma (or lookup-table gamma, since your frame buffer mimics a
camera) of 0.4545.  For practical purposes, 0.45, but it is the 2.2
number that is
defined, not 0.45.  This was done with full knowledge that the
"average" receiver has a gamma of about 2.8, because they wanted the
image on screen to have a gamma higher than is "realistic" because
people prefer that when looking at images in a room with dimmed
lighting.  The result of gamma correction for 2.2 followed by display
on a monitor with a real gamma of 2.8 is an image with a gamma of
2.8/2.2 = 1.27 relative to the original scene.
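
For concreteness, here is a minimal sketch of such a gamma-correction
lookup table in C, assuming an 8-bit frame buffer; the function and
parameter names are mine, not part of any standard:

    #include <math.h>

    /* Fill an 8-bit lookup table that applies camera-style gamma
     * correction: output = input ^ (1/display_gamma).  For NTSC,
     * display_gamma is 2.2; PAL nominally specifies 2.8. */
    void build_gamma_lut(unsigned char lut[256], double display_gamma)
    {
        int i;
        for (i = 0; i < 256; i++) {
            double v = pow(i / 255.0, 1.0 / display_gamma);
            lut[i] = (unsigned char)(v * 255.0 + 0.5);
        }
    }

Feeding your calculated (linear) intensities through
build_gamma_lut(lut, 2.2) and then displaying on a receiver with a
real gamma of 2.8 gives exactly the 2.8/2.2 = 1.27 overall gamma
described above.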

PAL specifies a monitor gamma of 2.8 instead of 2.2.  This should give
more accurate tone reproduction on sets with a real gamma of 2.8.
However, I have heard at least one report that European broadcasters
adjust cameras for a gamma of 0.45 anyway, not 0.36 (1/2.8), because
they want
the same contrast increase that the NTSC standard gives.

(Is there anyone out there who works in a European television studio?
Are cameras set up for a gamma of 0.45 or 0.36, or something else
entirely?)

The HDTV production standard specifies a more complex "opto-electronic
transfer characteristic" that is basically gamma correction with a
factor of 0.45 in bright areas, spliced to a linear function at low
brightnesses.  They did this to have a characteristic that camera
gamma-correctors could adhere to tightly; the pure power-law NTSC and
PAL gamma-correction functions have infinite slope at zero intensity,
so they are not physically realizable for dark areas of the picture.
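
As an illustration of what such a spliced characteristic looks like,
here is a sketch in C.  The constants (a 4.5x linear segment below
0.018, spliced to 1.099 * L^0.45 - 0.099 above it) follow the
Rec. 709-style formulation and are only illustrative; the actual HDTV
production standard's numbers may differ slightly:

    #include <math.h>

    /* Opto-electronic transfer characteristic: a linear segment near
     * black (finite slope, so a real camera gamma-corrector can
     * realize it), spliced to a 0.45-exponent power law above the
     * knee.  Constants are illustrative, not normative. */
    double oetf(double L)        /* L = scene luminance, 0.0 to 1.0 */
    {
        if (L < 0.018)
            return 4.5 * L;
        return 1.099 * pow(L, 0.45) - 0.099;
    }

The point of the linear segment is that it limits the gain required
near black, which is what makes the characteristic realizable in
hardware.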

So, the "gamma correction" function you should use is standardized, but
the standard you should use depends on the television system you are
encoding your picture for.

On the other hand, if you just want to display your picture on a
monitor, you should gamma-correct for that particular monitor.  If you
want to accurately reproduce tones, you should fully correct for the
monitor gamma, to give an on-screen picture with a gamma of 1.0
relative to the calculated intensities.  If you want to see what the
image will look like on TV, though, you might want to use a lookup
table that gives the on-screen picture a gamma of about 1.25 relative
to the calculated intensities, since NTSC encoding will do that.  And
if you want to see what it will look like when recorded on film and
projected in a theatre, you might want to adjust the on-screen picture
for a gamma of 1.5, since that's what film does.
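
A minimal sketch of that kind of partial correction, again assuming an
8-bit frame buffer and that you know (or have measured) your monitor's
gamma; the names are mine:

    #include <math.h>

    /* Fill an 8-bit LUT so that calculated intensities appear on a
     * monitor of gamma monitor_gamma with an overall gamma of
     * target_gamma relative to the original values:
     *   target_gamma = 1.0   accurate tone reproduction
     *   target_gamma = 1.25  roughly what NTSC encoding yields
     *   target_gamma = 1.5   roughly film recording + projection  */
    void build_display_lut(unsigned char lut[256],
                           double monitor_gamma, double target_gamma)
    {
        int i;
        double e = target_gamma / monitor_gamma;

        for (i = 0; i < 256; i++)
            lut[i] = (unsigned char)(pow(i / 255.0, e) * 255.0 + 0.5);
    }

Since the monitor raises everything to the power monitor_gamma,
applying the exponent target_gamma / monitor_gamma first leaves an
on-screen gamma of exactly target_gamma.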

Of course, as Scott points out, not doing any gamma correction at all
will get you an extremely dark picture that bears little relationship
to the intensities you calculated.  It's better to gamma-correct using
almost any value of gamma than to ignore the problem.  Picking the
*precise* value of gamma that is appropriate to your circumstances
is less important.

	Dave Martindale