scott@applix.UUCP (Scott Evernden) (12/01/87)
I have a 300dpi monochromatic image from a MicroTek scanner. The image is a halftone, meaning the scanner has imposed a regular dither pattern (variable: 2x2 to 8x8) to achieve grey scale. I want to display this image on an approx. 75dpi screen.

Currently, I take the simple approach of resampling the original image according to the scale factor (300/75) and reading the value of the closest pixel. As you might guess, some very bad artifacting occurs. I suppose I could average the pixels in the vicinity of the sample location and perhaps reduce my problem.

Question is: does anyone have any neat ideas or references on how to display hi-res dithered images on a low-res screen at any scale factor? I'm looking for image scaling algorithms which can avoid the artifacting problem. Ideally, the algorithm should be amenable to rotation and shearing transforms as well.

-scott
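A rough sketch of the averaging idea, assuming a 4x4 dither cell matching the 300/75 scale factor (the cell size and the NumPy helper name are mine, not from the post): averaging each block the size of the dither cell recovers an approximate grey level, where nearest-neighbor sampling just picks up one dithered dot.

```python
import numpy as np

def downsample_box(halftone, factor=4):
    """Downsample a binary halftone by averaging factor x factor blocks.

    Averaging over a block the size of the dither cell recovers an
    approximate grey level, instead of reading a single dithered pixel
    as nearest-neighbor resampling does.
    """
    h, w = halftone.shape
    h, w = h - h % factor, w - w % factor   # trim to a multiple of factor
    blocks = halftone[:h, :w].reshape(h // factor, factor, w // factor, factor)
    return blocks.mean(axis=(1, 3))          # grey values in [0, 1]

# Example: a 4x4 half-on dither cell tiled over an 8x8 image
cell = np.array([[0, 1, 0, 1],
                 [1, 0, 1, 0],
                 [0, 1, 0, 1],
                 [1, 0, 1, 0]])
img = np.tile(cell, (2, 2))
print(downsample_box(img, 4))  # every output pixel is 0.5 (mid grey)
```

This only helps when the averaging window lines up with (or exceeds) the dither cell; for arbitrary scale factors, rotation, or shearing, a proper lowpass filter before resampling is needed.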
jbm@aurora.UUCP (Jeffrey Mulligan) (12/04/87)
in article <627@applix.UUCP>, scott@applix.UUCP (Scott Evernden) says:

> Question is: does anyone have any neat ideas or references on
> how to display hi-res dithered images on a low-res screen at any
> scale factor? I'm looking for image scaling algorithms which can
> avoid the artifacting problem.

The following approach is probably a little expensive computationally (i.e. slow), but it should work:

1) Low-pass filter the original dithered image. Now you have a continuous-tone image which approximates the original. The filter should be designed to cut out everything above the Nyquist frequency of the final sampling rate.

2) Subsample this intermediate image. Since you have filtered, the subsampling will not introduce any aliases (presumably the type of "artifacts" you observed?).

3) Dither the subsampled image.

--
Jeff Mulligan (jbm@ames-aurora.arpa)
NASA/Ames Research Ctr., Mail Stop 239-3, Moffett Field CA, 94035
(415) 694-5150
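The three steps above can be sketched as follows. This is only an illustration under stated assumptions: a Gaussian stands in for the ideal lowpass at the new Nyquist rate (a windowed-sinc would be closer to the text), the re-dither uses an ordered 4x4 Bayer matrix, and the function and parameter names are mine.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def resample_halftone(halftone, factor=4):
    """Lowpass, subsample, then re-dither a binary halftone image."""
    # 1) Lowpass filter: attenuate energy above the new Nyquist
    #    frequency (sigma ~ factor/2 is a rough, not ideal, cutoff).
    grey = gaussian_filter(halftone.astype(float), sigma=factor / 2.0)
    # 2) Subsample: safe now that high frequencies are suppressed,
    #    so no new aliases appear.
    small = grey[::factor, ::factor]
    # 3) Re-dither for the 1-bit display with a 4x4 Bayer matrix.
    bayer = np.array([[ 0,  8,  2, 10],
                      [12,  4, 14,  6],
                      [ 3, 11,  1,  9],
                      [15,  7, 13,  5]]) / 16.0
    h, w = small.shape
    thresh = np.tile(bayer, (h // 4 + 1, w // 4 + 1))[:h, :w]
    return (small > thresh).astype(np.uint8)

# Example: a 2x2 checkerboard halftone representing mid grey
img = np.tile(np.array([[0, 1], [1, 0]]), (32, 32))  # 64x64 input
out = resample_halftone(img, 4)                       # 16x16 1-bit output
```

Because the filtering happens in the continuous-tone domain, the same intermediate grey image could also be rotated or sheared before step 3, which addresses the "amenable to rotation and shearing" requirement in the original question.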