[comp.graphics] Cleaning up frame-grabbed images: "Fourier plane"?

adam@ste.dyn.bae.co.uk (Adam Curtin) (09/27/90)

Hi

I'm considering buying a colour digitiser (video frame grabber) as a cheap
substitute for a colour scanner.

However, all of the images I've seen have black lines across them, something
to do with the video signal and scanning and such, which spoil the image.

My wife remembers from her physics undergraduate days that an image overlaid
with a grating could be optically processed to remove the grating - light from
the image was focused through a lens to come to a point at the Fourier plane.
Then, somehow, the image on the other side was a magically cleaned-up version of
the original. (You can tell that I don't understand it very well :-).
She can't describe it well enough for me to code it (even if I could understand
it), but she tells me that the technique is often used for cleaning up images
from satellites.

Can anyone illuminate me?

Adam
-- 
A. D. Curtin			  .	Tel  : +44 438 753430
British Aerospace (Dynamics) Ltd. .	Email: adam@ste.dyn.bae.co.uk
PB 230, PO Box 19, Six Hills Way, . <This disclaimer conforms to RFC 1188>
Stevenage, SG1 2DA, UK.		  .	"My other car is an FJ1200"

tomg@hpcvlx.cv.hp.com (Thomas J. Gilg) (09/29/90)

> My wife remembers from her physics undergraduate days that an image overlaid
> with a grating could be optically processed to remove the grating - light from
> the image was focused through a lens to come to a point at the Fourier plane.
> Then, somehow, the image on the other side was a magically cleaned-up version of
> the original. (You can tell that I don't understand it very well :-).

There are many image processing/cleanup methods based on frequency domain
filters.  The usual computational sequence is:

    1. Convert the Spatial Domain Image (ie, your digitized image) to the
       Frequency Domain.  The Fast Fourier Transform (FFT) is popular here.

    2. Analyze the Frequency Domain representation of your image, and then
       apply some selected mask to it.

    3. Convert the Frequency Domain result back into the Spatial Domain.
       Magically, you'll have your cleaned image iff you did the right
       things in step #2.   BTW - the inverse FFT is popular here.

While the above is a standard sequence, and is usually more effective/faster
than pure Spatial Domain methods, it does take time.
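
A minimal sketch of that three-step sequence, assuming Python with NumPy
(the mask itself is left to the caller, since choosing it is the whole of
step #2):

    import numpy as np

    def frequency_domain_filter(image, mask):
        # Step 1: spatial domain -> frequency domain (FFT), with the zero
        # frequency shifted to the center so the mask is easier to design.
        spectrum = np.fft.fftshift(np.fft.fft2(image))

        # Step 2: apply the selected mask (same shape as the image;
        # 1 = keep that frequency, 0 = suppress it).
        spectrum *= mask

        # Step 3: frequency domain -> spatial domain (inverse FFT).
        cleaned = np.fft.ifft2(np.fft.ifftshift(spectrum))
        return np.real(cleaned)   # the imaginary part is round-off noise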

It's my understanding that "optical" implementations of the computational
sequence described above are possible.  One lens setup deals with step #1,
a few optical filters modify the image for step #2, and another lens setup
converts the image back for step #3.  While this allows for real-time image
processing, I'm not sure it's too handy for pre-digitized images.

Thomas Gilg
tomg@cv.hp.com

Great reference: "Digital Image Processing"
                 Rafael C. Gonzalez/Paul Wintz
                 Addison-Wesley Publishing Company

ph@miro.Berkeley.EDU (Paul Heckbert) (09/29/90)

In article <1990Sep27.085647.13944@ste.dyn.bae.co.uk>,
adam@ste.dyn.bae.co.uk (Adam Curtin) writes:
>My wife remembers from her physics undergraduate days that an image overlaid
>with a grating could be optically processed to remove the grating...

This sounds like "deconvolution".

If you've got an image a(x,y) that's been convolved with a
linear, space-invariant filter h(x,y) to arrive at another image b(x,y)
then you can un-filter using the following approach.

Convolution in the spatial domain corresponds to multiplication in
the frequency domain:

SPATIAL DOMAIN					FREQUENCY DOMAIN

b(x,y) = a(x,y) * h(x,y)			B(wx,wy) = A(wx,wy) . H(wx,wy)

a(x,y) = b(x,y) deconvolved with h(x,y)		A(wx,wy) = B(wx,wy) / H(wx,wy)

    where
	'*' denotes convolution, '.' denotes multiplication
	(wx,wy) are x and y frequencies
	F(wx,wy) is the Fourier transform of f(x,y)
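
A quick numerical check of that correspondence, assuming Python with NumPy
(circular convolution, which is what the FFT relation holds for exactly; a
brute-force double loop stands in for '*'):

    import numpy as np

    def circular_convolve(a, h):
        # b(x,y) = sum over (u,v) of a(u,v) h(x-u, y-v), indices taken
        # modulo the image size -- the '*' of the table above, in circular form.
        M, N = a.shape
        b = np.zeros_like(a)
        for x in range(M):
            for y in range(N):
                for u in range(M):
                    for v in range(N):
                        b[x, y] += a[u, v] * h[(x - u) % M, (y - v) % N]
        return b

    rng = np.random.default_rng(0)
    a = rng.random((8, 8))
    h = rng.random((8, 8))
    b = circular_convolve(a, h)
    # B(wx,wy) = A(wx,wy) . H(wx,wy), frequency by frequency
    assert np.allclose(np.fft.fft2(b), np.fft.fft2(a) * np.fft.fft2(h))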

So if you have the degraded image b, and the point spread function
(impulse response) of the filter h, you take their Fourier transforms to
compute B and H, divide the first by the second frequency-by-frequency,
and then compute the inverse Fourier transform to compute a.

This works assuming that you know the degradation filter h, that it's
linear and shift-invariant, that H is nonzero everywhere, and that the
noise is zero or small.
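
A rough sketch of that recipe, again assuming Python with NumPy and the
caveats just listed (h is zero-padded to the image size, so this is really
undoing a circular convolution):

    import numpy as np

    def deconvolve(b, h):
        # B and H: Fourier transforms of the degraded image and the
        # point spread function (h zero-padded to the image size).
        B = np.fft.fft2(b)
        H = np.fft.fft2(h, s=b.shape)

        # A = B / H, frequency by frequency -- valid only if H is nonzero
        # everywhere and the noise is small, as noted above.
        A = B / H

        # Inverse transform gives the estimate of the original image a.
        return np.real(np.fft.ifft2(A))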

There are more sophisticated methods that don't assume knowledge of h
(called blind deconvolution) and methods that minimize the amplification
of noise during deconvolution, but I don't know them well enough to
describe them.

As a starter book on image processing, try:

    Anil K. Jain,
    Fundamentals of Digital Image Processing
    Prentice Hall, Englewood Cliffs, NJ, 1989.

I don't know how the optical method you mentioned would work, except that
lenses effectively compute a Fourier transform.  I don't know how you
would do the division optically, however.  Maybe other netters can answer that.

Paul Heckbert, Computer Science Dept.
570 Evans Hall, UC Berkeley		INTERNET: ph@miro.berkeley.edu
Berkeley, CA 94720			UUCP: ucbvax!miro.berkeley.edu!ph

turk@media-lab.media.mit.edu (Matthew Turk) (09/30/90)

 > I don't know how the optical method you mentioned would work, except that
 > lenses effectively compute a Fourier transform.  I don't know how you
 > would do the division optically, however.  Maybe other netters can answer that.

Yep, you can do multiplications in the Fourier domain easily with
optics.  And the division by H(wx,wy) is implemented as a
multiplication by (1 / H(wx,wy)).

But note that you can do this digitally as well.  The impulse response
of the filter whose frequency response is (1/H(wx,wy)) is itself a
spatial filter h'(x,y), so once you find h'(x,y) you just convolve
the image with it.
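
As a sketch of that digital route, assuming Python with NumPy (and
scipy.signal.fftconvolve purely for the final convolution; that choice
and the 'same' boundary handling are mine, not something named in the post):

    import numpy as np
    from scipy.signal import fftconvolve

    def spatial_deconvolve(b, h):
        # Design h'(x,y): the spatial filter whose frequency response is
        # 1/H(wx,wy).  Assumes H is nonzero everywhere.
        H = np.fft.fft2(h, s=b.shape)
        h_prime = np.real(np.fft.ifft2(1.0 / H))

        # Convolving the degraded image with h' approximately undoes h.
        return fftconvolve(b, h_prime, mode='same')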

In practice, though, this is kinda difficult, both because you
probably don't know H(wx,wy) very accurately and because designing FIR
filters (which is what we're talking about!) is not as straightforward
as it sounds.  Another good reference is Jae S. Lim's "Two-Dimensional
Signal and Image Processing" (Prentice-Hall, 1990).

But if I remember the original post properly, you're finding black
lines which spoil your digitized images.  I'd check your hardware,
because these aren't normally part of the video signal!  (Are you by
any chance mixing 50 and 60 Hz components?)

	Matthew Turk
	MIT Media Lab			turk@media-lab.media.mit.edu
	20 Ames St., E15-391		uunet!mit-amt!turk
	Cambridge, MA  02139		(617)253-0381

p560fgr@mpirbn.mpifr-bonn.mpg.de (Frank Grieger) (10/01/90)

In article <1990Sep27.085647.13944@ste.dyn.bae.co.uk> adam@ste.dyn.bae.co.uk (Adam Curtin) writes:
>Hi
>
>I'm considering buying a colour digitiser (video frame grabber) as a cheap
>substitute for a colour scanner.
>
>However, all of the images I've seen have black lines across them, something
>to do with the video signal and scanning and such, which spoil the image.
>
>My wife remembers from her physics undergraduate days that an image overlaid
>with a grating could be optically processed to remove the grating - light from
>the image was focused through a lens to come to a point at the Fourier plane.
>Then, somehow, the image on the other side was a magically cleaned-up version of
>the original. (You can tell that I don't understand it very well :-).
>She can't describe it well enough for me to code it (even if I could understand
>it), but she tells me that the technique is often used for cleaning up images
>from satellites.
>
>Can anyone illuminate me?
>
>Adam

Hello Adam,

the physical explanation of spatial filtering is very simple:
Consider an optical setup where a lens forms an image in its
focal plane, which is then projected onto a detector by a second lens.  We call
this a telescope looking at an image at infinity.  If there is no dirt on the
lens or any other negative influence on the image formation, a parallel light
beam will be focused to a light spot in the focal plane.  This spot is a
diffraction pattern called an Airy disc.  If there are some dirt particles on
the first lens, light will be scattered out of the light spot in the focal
plane and will degrade the image formed on the detector.
If you now center a pinhole in the focal plane of the first lens so that it
blocks the light scattered by the dirt particles, the information
about the dirt is lost and the result is a clean image on the detector.  This
technique is used, for example, to produce clean laser beams for
holography.  For more complicated setups and more general images you need more
complicated masks in the focal plane of the lens.

If you now would like to clean images in the computer, you have to know
something about the mathematics of this physical effect:

In the experiment described above, we had an image at infinity, a lens
which projected the image into its focal plane where the image quality was
improved by positioning a mask, and a second lens which showed
us the result on a detector.  In terms of mathematics the lenses perform
a Fourier transform on the original image.  The Fourier transform is described
by:

          H(f) = int { h(t) exp [ -i 2 PI f t ] } dt

H(f) is the image in the Fourier domain, 
f    is its coordinate vector,
int  denotes an integration,
h(t) is the original image and
t    is the vector in the image plane.

To clean images in the computer, you take the image, perform a Fourier transform,
and multiply the result by a binary mask - or, better, replace the "bad" regions
with values fitted from their surroundings.  Finally you perform an inverse
Fourier transform to obtain the clean image.

For your specific problem:

The stripes in your original image will be represented by peaks in the
Fourier domain, which you will have to remove.

On the computer you should use the fast Fourier transform algorithm, which
might be in your math library or is described in the literature.
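
A rough sketch of that peak removal, assuming Python with NumPy (the
threshold and the width of the notches are made-up placeholders; the real
peak positions depend on your grabber):

    import numpy as np

    def remove_stripes(image, half_width=1):
        spectrum = np.fft.fftshift(np.fft.fft2(image))
        rows, cols = spectrum.shape
        cr, cc = rows // 2, cols // 2

        # Horizontal stripes put their energy on the vertical-frequency
        # axis, i.e. the column through the center of the shifted spectrum.
        column = np.abs(spectrum[:, cc])
        column[cr - 2:cr + 3] = 0          # protect the DC / low-frequency part

        # Crude, made-up threshold: anything well above the typical magnitude.
        peaks = np.where(column > 10 * np.median(np.abs(spectrum)))[0]

        # Notch out a small neighborhood around each peak.
        for r in peaks:
            spectrum[max(r - half_width, 0):r + half_width + 1,
                     max(cc - half_width, 0):cc + half_width + 1] = 0

        return np.real(np.fft.ifft2(np.fft.ifftshift(spectrum)))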

Literature:

Alsop, L. E., and A. A. Nowroozi, "Fast Fourier analysis," J. Geophys. Res.,
   Vol. 71, pp. 5482-5483, November 15, 1966.

Goodman, J. W., "Introduction to Fourier Optics," McGraw-Hill Book Company.

Bracewell, R., "The Fourier Transform and its Applications," McGraw-Hill Book
   Company.


I hope this is the information you wanted.

Frank