[sci.space] Beating the diffraction limit

jwl@ernie.Berkeley.EDU (James Wilbur Lewis) (01/14/88)

In article <531@srs.UUCP> lee@srs.UUCP (Lee Hasiuk) writes:
>>    From Jenkins & White, Fundamentals of Optics p331
>>    Minimum angle of resolution in radians = 1.220 * ( lambda / D )
>>    Where D is the diameter of the aperture and lambda is the wavelength
>> of the light. It is physically impossible, using 1 image, to get below 
>> this limit. 
>
>In a complex analysis class, we were told that the diffractive 'limits' of
>lenses and mirrors could be bypassed to a certain degree through the use
>of analytic continuation.  Anyone care to comment?

The diffraction component of the point spread function for a given wavelength
and aperture is known; it should be possible to beat the diffraction limit
by deconvolving the image with this function.  I've seen this done for
out-of-focus images, and the results are remarkable.
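
To make that concrete, here is a minimal sketch of the frequency-domain
arithmetic in Python with numpy (my choice of notation, nothing from the
posters' actual setups; the function name and the noise-to-signal constant
are made up for illustration).  A naive inverse filter divides by the
PSF's transform and blows up wherever that transform is near zero, so the
usual hedge is a Wiener-style regularizing term:

    import numpy as np

    def wiener_deconvolve(image, psf, nsr=0.01):
        # Pad the PSF out to the image size and transform both.
        psf_padded = np.zeros(image.shape)
        psf_padded[:psf.shape[0], :psf.shape[1]] = psf
        H = np.fft.fft2(psf_padded)
        G = np.fft.fft2(image)
        # Wiener filter: divide by H, but damp the division where
        # |H| is small, so noise at spatial frequencies the
        # aperture barely passes is not amplified without bound.
        F = G * np.conj(H) / (np.abs(H) ** 2 + nsr)
        # (The result may come back circularly shifted by the PSF
        # offset; centering the PSF first avoids that.)
        return np.real(np.fft.ifft2(F))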

The real problem, it seems to me, is noise introduced by the atmosphere
(and other factors, I suppose...).  Since you can't remove the noise
analytically, information is truly lost.  It is not clear (to me) how this
effect varies with aperture; amateur astronomers often prefer a small
aperture/high f-ratio instrument to larger (and theoretically better
resolution) "light bucket" type 'scopes for planetary observations where
light grasp isn't the limiting factor.  Are larger apertures really more
sensitive to "seeing", or is this an artifact of the difference in focal
ratios/optical quality?  Would this effect be irrelevant for a telescope 
above the atmosphere, where one doesn't have to worry about air boiling 
around inside the tube?

-- Jim Lewis
   U.C. Berkeley

brucec@orca.UUCP (01/17/88)

In article <22572@ucbvax.BERKELEY.EDU> jwl@ernie.Berkeley.EDU.UUCP
(James Wilbur Lewis) writes:
>The diffraction component of the point spread function for a given wavelength
>and aperture is known; it should be possible to beat the diffraction limit
>by deconvolving the image with this function.  I've seen this done for
>out-of-focus images, and the results are remarkable.
>

Similar computations can also be done to remove motion blurring.  If the
distortion characteristics of the intervening medium are known (or can be
approximated to some reasonable degree), they can also be removed.
This makes it easy to look through wavy glass, but leaves something to be
desired when looking through the atmosphere, because our knowledge of the
medium is imperfect.  Still, some enhancement is possible.
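
For the motion-blur case specifically, the kernel for uniform linear
motion is just a short line of equal weights, which can then be divided
out exactly as in the deconvolution sketch earlier in the thread (the
nine-pixel length below is an arbitrary example, not a measured value):

    import numpy as np

    def horizontal_motion_psf(length=9):
        # Uniform linear motion smears each point into a
        # horizontal streak of 'length' pixels, equally weighted.
        return np.full((1, length), 1.0 / length)

    # e.g. wiener_deconvolve(image, horizontal_motion_psf(9))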

>The real problem, it seems to me, is noise introduced by the atmosphere
>(and other factors, I suppose...).  Since you can't remove the noise
>analytically, information is truly lost.  It is not clear (to me) how this
>effect varies with aperture; amateur astronomers often prefer a small
>aperture/high f-ratio instrument to larger (and theoretically better
>resolution) "light bucket" type 'scopes for planetary observations where
>light grasp isn't the limiting factor.  Are larger apertures really more
>sensitive to "seeing", or is this an artifact of the difference in focal
>ratios/optical quality?  Would this effect be irrelevant for a telescope 
>above the atmosphere, where one doesn't have to worry about air boiling 
>around inside the tube?

Yes, larger apertures are more sensitive.  As I recall (I may very well be
wrong; this information comes from many years back in my memory), there is
a critical size of roughly 10 cm, set by the size of the average
convection cell in the air.  Distortion is least when all of your image
goes through a single cell (on average).
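
Plugging in numbers makes the point: for a 10 cm aperture at 550 nm,
1.22 * lambda / D comes to about 6.7 microradians, or roughly 1.4
arc-seconds -- about the same as typical seeing.  So apertures much
beyond 10 cm are buying theoretical resolution the atmosphere won't
deliver.  (If memory serves, this critical scale is what the turbulence
literature calls the Fried parameter, but don't hold me to the exact
definition.)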

There are some things which can be done to remove atmospheric distortion
even though it varies with time in an unpredictable fashion.  First,
recognize that there are several sources of distortion:

	1) Unpredictable translations of the image caused by changes in
the refractive index of the air you look through as a function of time.
This is mostly of concern when taking moving pictures, or trying to
compare one picture to another (although taking pictures of the janitor
next to the general you want to know about can be embarrassing).  In
single images this becomes a problem when the motion takes place on a
time scale of the same order as the exposure length (or the integration
time of the CCD).  This can be handled as motion deblurring; estimating
the shift itself is sketched after this list.

	2) Arbitrary affine geometric distortions caused by what you might
call the "funhouse mirror" effect:  Changes in the path of the light rays
over the field of the image.  Since this affects only the large-scale
geometry of the image, it can be removed by applying an inverse transform,
which can be determined interactively if need be (twiddle the knobs 'til
it looks right; see the resampling sketch after this list).  This gets
harder if the distortion changes on the same
time-scale as the exposure time.  I haven't read of any research on this
problem (guess who's most interested in it), but I would guess that
applying motion-deblurring with different parameters in each of several
regions of the image would be useful.  This might also help when the image
is viewed through several convection cells, since the distortion
transformations will change abruptly at the edges of a cell.

	3) Haze. This is equivalent to a loss of image contrast.  Since
most of the human image recognition capability is based on boundaries
(high-spatial-frequency components of an image), edge enhancement helps
here (one standard form is sketched after this list).  It is also a
useful preprocessing step for the first two effects.
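
For item 1, the frame-to-frame translation can be estimated by phase
correlation before any deblurring is attempted.  A sketch in Python with
numpy (textbook cross-correlation, nothing specific to the systems
discussed here):

    import numpy as np

    def estimate_shift(frame_a, frame_b):
        # The cross-power spectrum of two frames differing by a
        # pure translation is a complex exponential; its inverse
        # transform is a spike located at the shift.
        Fa = np.fft.fft2(frame_a)
        Fb = np.fft.fft2(frame_b)
        cross = Fa * np.conj(Fb)
        cross /= np.abs(cross) + 1e-12      # whiten the spectrum
        corr = np.real(np.fft.ifft2(cross))
        peak = np.unravel_index(np.argmax(corr), corr.shape)
        # Peaks past the halfway point wrap around to negative shifts.
        return tuple(p if p <= n // 2 else p - n
                     for p, n in zip(peak, corr.shape))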
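
For item 2, once the knob-twiddling has produced an estimate of the
geometric distortion, undoing it is a resampling job.  A sketch assuming
an affine estimate and scipy's off-the-shelf resampler (the 2 percent
shear is a made-up example):

    import numpy as np
    from scipy import ndimage

    def undo_affine(observed, A, b):
        # affine_transform does "pull" resampling:
        # output[x] = input[A @ x + b].  If the medium moved the
        # scene point at x to A @ x + b on the detector, this
        # samples the observed image back at that spot.
        return ndimage.affine_transform(observed, A, offset=b, order=1)

    # hypothetical distortion estimate: a slight horizontal shear
    # A = np.array([[1.0, 0.02], [0.0, 1.0]]); b = np.zeros(2)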
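
For item 3, the standard form of edge enhancement is unsharp masking:
subtract a blurred copy to isolate the high spatial frequencies, then
add them back in with a gain.  A sketch (sigma and amount are the knobs
to twiddle):

    import numpy as np
    from scipy import ndimage

    def unsharp_mask(image, sigma=2.0, amount=1.0):
        # The difference between the image and a Gaussian-blurred
        # copy is the high-frequency (edge) content lost to haze;
        # adding it back boosts boundary contrast.
        blurred = ndimage.gaussian_filter(image, sigma)
        return image + amount * (image - blurred)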

Theoretically, taking a number of images of the same area and correcting
for angular changes due to the flight path of the observer could allow
some averaging out of distortion.  I suspect this technique isn't all that
useful, since the improvement should go as the square root of the number
of images used in the average, and you just don't have that long a time
during which a low-orbit bird is over any one place.
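
The square-root law is easy to check numerically (a toy sketch; the
noise level of 10 units is arbitrary):

    import numpy as np

    rng = np.random.default_rng(0)
    for n in (1, 4, 16, 64):
        # 10,000 trials of averaging n noisy frames of a constant scene
        frames = 100.0 + rng.normal(0.0, 10.0, size=(n, 10000))
        print(n, np.std(frames.mean(axis=0)))   # ~10, ~5, ~2.5, ~1.25

Halving the residual error costs four times the frames.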

By the bye, the newspaper article I read stated that these incredible
feats of imaging could be done through cloud cover, which I very much
doubt.  Even infrared doesn't see perfectly through clouds, and since IR
wavelengths are longer than those of visible light, the diffraction limit
on angular resolution is coarser for a given aperture.
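
For instance (with a made-up 2-meter aperture): 1.22 * lambda / D is
about 0.07 arc-second at 550 nm but about 1.3 arc-seconds at a 10-micron
thermal-IR wavelength -- nearly twenty times coarser.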

---------------------------------------------------------------------------
	"The galaxy-spanning luminous arcs reported by M. Mitchell
	Waldrop in Research News on 6 February have a very simple
	explanation.  They are part of the scaffolding that was not
	removed when the contractor went bankrupt owing to cost
	overruns."
					"Arthur C. Clarke, Sri Lanka"

My opinions are my own; no-one else seems to want them.

Bruce Cohen
UUCP:        {the real world}...!tektronix!ruby!brucec
ARPA/CS-NET: brucec@ruby.TEK.COM
overland:    Tektronix Inc., M/S 61-028, P.O. Box 1000, Wilsonville, OR  97070

howard@COS.COM (Howard C. Berkowitz) (01/22/88)

In article <8801192122.AA06886@ames-pioneer.arpa>, eugene@PIONEER.ARC.NASA.GOV (Eugene Miya N.) writes:
> Bruce, you made some excellent comments about this problem!
> However, I would like to add one comment about what you said about
> removing motion blur.  Rather than do it computationally or
> optically, it's just much simpler to move the recording instrument or
> media. (If I had a quarter for every roll of film I've hunched over,
> I'd be rich.)

A number of photorecon satellites do exactly that, according to Dino
Brignoli, a retired senior CIA recon expert who has a rather interesting
"road show" on photoreconnaissance and history.  I heard him a few
years ago at the Washington chapter of the Society for Photographic
Scientists and Engineers.

He said that one of the major breakthroughs in imaging satellites,
which was classified for some time, was using moving film backs both
to cancel image motion and to allow much longer exposure times.
-- 
-- howard(Howard C. Berkowitz) @cos.com
 {uunet,  decuac, sun!sundc, hadron, hqda-ai}!cos!howard
(703) 883-2812 [ofc] (703) 998-5017 [home]
DISCLAIMER:  I explicitly identify COS official positions.