[comp.lang.postscript] Moire patterns in bitmap rendering

jeffhi@microsoft.UUCP (Jeff HINSCH) (11/06/90)

I'm having a problem eliminating moire patterns from screendumps
that I'm imaging on an L300/RIP 2.  Has anyone else dealt with
this before?  The problem arises in dithered images where there
are many different colored pixels packed closely together.
A LaserWriter renders the moire as blocks of varying
grey-scale, but the L300 (running at 1270 dpi) produces lovely
plaids and abstract patterns.

The best solution I've come up with so far is to run the images
at a screen frequency of 134+ lines per inch, but the offset people
can't handle more than 130 and prefer ~120.  Next best is to use a
90 degree screen angle, but the output is not very appealing to the
eye--everything appears to be marching down the page in columns, and
some combinations of dithered color still produce moire at 90.  My
next effort was to try a line screen at a flat angle, but that is
truly ugly.
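
In case it helps frame the question, here is roughly the kind of
call involved (a minimal sketch only; the 120 lpi figure and the
round-dot spot function are examples rather than what production
actually uses, and the RIP quantizes whatever you ask for):

    % Request a screen frequency (lpi) and angle before imaging.
    % The device snaps both to what the marking engine can do.
    120 45 { 180 mul cos exch 180 mul cos add 2 div } setscreen

    % A line screen is just a different spot function -- this one
    % depends on a single coordinate, so the dots fuse into lines.
    120 45 { pop } setscreen

The 90 degree experiment is the same call with 90 in place of 45.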

I suppose I could use traditional separation methods as a model:
strip all the yellow out of the bitmap into a new bitmap and image
it at its own screen angle (keeping the screens 30 degrees apart is
the usual recommendation, I believe), return to the image origin and
lay down the stripped-out blue portion of the image at a different
angle, and so on.  But before I cripple my current production
methods I thought I'd ask you to share your experiences along
these lines.
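
For the record, if the RIP has the color extensions, setcolorscreen
would let all four component screens be requested in one shot.  A
sketch, assuming the conventional offset angles and a single 120 lpi
frequency; a plain Level 1 RIP without the extensions would need one
setscreen-plus-image pass per stripped-out plate instead:

    /dotspot { 180 mul cos exch 180 mul cos add 2 div } def

    120 105 /dotspot load    % red screen   (cyan separation)
    120  75 /dotspot load    % green screen (magenta separation)
    120   0 /dotspot load    % blue screen  (yellow separation)
    120  45 /dotspot load    % gray screen  (black separation)
    setcolorscreen

Whatever the request, an older RIP will snap those angles and
frequencies to the halftone cells it can actually build, so the
numbers are approximate at best.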

Thanks for any help you can provide.
------------
only UNIX cripples don't have a sig.