[comp.fonts] Smoothing enlarged bitmap fonts

doug@eris (Doug Merritt) (04/11/88)

By popular demand :-) here's my article on smoothing enlarged bitmap
fonts, originally posted as two articles to comp.sys.amiga.

Please note that I'm just talking about *how* to do so. I am certainly
not going to claim that enlarging bitmaps is the best thing to do
in all cases. But it can be handy when done well.

	Doug Merritt		doug@mica.berkeley.edu (ucbvax!mica!doug)
			or	ucbvax!unisoft!certes!doug

Newsgroups: comp.sys.amiga
Subject: Re: Font Smoothing Algorithm
Summary: Low pass spatial filtering does the trick
References: <6065@elroy.Jpl.Nasa.Gov>
Keywords: FFT, smoothing, maps, zoom, spatial filter, averaging, bogus Dpaint2

In article <6065@elroy.Jpl.Nasa.Gov> rgd059@Mipl3.JPL.Nasa.Gov writes:
>
>Does anyone have, or know where to find, an algorithm for smoothing out
>bitmapped fonts when they are scaled up?  Specifically, I want to enlarge
>standard Amiga fonts and not get a blocky appearance.  Failing that, how
>about a general bitmap smoothing routine?

The standard conceptual model for doing this with *any* bitmap images,
including but not limited to fonts, is to do a spatial lowpass filtering
pass after the enlargement/zooming pass.  No colors or grey scales
considered here, but the model is easy to extend.

A simple pixel-duplicating enlargement preserves hard edges/lines, which
corresponds to adding extraneous high resolution details. This is obvious
once you consider that, if you enlarge by (say) a factor of two, then your
smallest resolved detail is in blocks of 2 by 2 pixels, each of which will
be either entirely black or entirely white (depending on the single pixel
that was magnified into a two by two block).
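To make that concrete, here is a minimal sketch of pixel duplication in present-day Python (plain lists of 0/1 values; the function name is just for illustration):

```python
def enlarge2x(bitmap):
    """Enlarge a bitmap (a list of rows of 0/1 pixels) by duplication.

    Every source pixel becomes a 2 by 2 block of the same value, so the
    result can contain no detail smaller than 2 by 2 -- the hard, blocky
    edges discussed above.
    """
    out = []
    for row in bitmap:
        wide = []
        for px in row:
            wide.extend([px, px])    # duplicate each pixel horizontally
        out.append(wide)
        out.append(list(wide))       # duplicate each row vertically
    return out
```

A 2 by 2 checkerboard, for instance, blows up into four solid 2 by 2 blocks.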

Obviously the magnified bitmap could be smoother if individual pixels
could be turned on or off as well as 2 by 2 blocks. The pixels that
end up with a different value than they *should* have constitute
high resolution noise...an artifact created by magnification. In
Fourier optics, different resolution scales correspond to different
spatial frequencies; high resolution noise equals high spatial frequency
noise. By analogy with acoustical spectra, consider that the obvious
way to get rid of high frequency noise is with a low-pass filter.

So what you do is run a two-dimensional low-pass filter over the
bitmap. The "overkill" way to do this is to take the Fourier transform
of the bitmap (w/ a public domain FFT algorithm), delete the highest
frequency (note that this is the filtering step, and in general could
be a much more complex filter), and do an inverse FFT again to get your
smoothed bitmap back.
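For illustration, here is what that "overkill" route might look like in present-day Python, with NumPy's FFT standing in for the public domain FFT routine; the box mask that zeroes everything above a cutoff frequency is just the simplest choice of filter:

```python
import numpy as np

def fft_lowpass(bitmap, keep=0.5):
    """Low-pass filter a 2-D image by deleting high spatial frequencies.

    keep is the fraction of the frequency range (measured out from DC)
    that survives; everything beyond it is zeroed -- the filtering step.
    """
    spec = np.fft.fftshift(np.fft.fft2(bitmap))   # move DC to the center
    rows, cols = spec.shape
    # Box mask: 1 near the center (low frequencies), 0 elsewhere.
    r = np.abs(np.arange(rows) - rows // 2) <= keep * rows / 2
    c = np.abs(np.arange(cols) - cols // 2) <= keep * cols / 2
    mask = np.outer(r, c)
    # Inverse transform of the filtered spectrum is the smoothed image.
    return np.fft.ifft2(np.fft.ifftshift(spec * mask)).real
```

A one-pixel checkerboard is the highest spatial frequency the grid can hold plus a DC term, so this filter flattens it to uniform grey.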

The fast, smart, easy way to do it is with a special purpose filter...
Low pass filtering corresponds to an averaging operation. If you
average each successive pair of values in a sampled one dimensional
signal (like digitized sound), you are effectively filtering out the
highest frequency component. You can do the same thing in a bitmap
by making each pixel equal to the average value of its nearest neighbors.

That's all you have to do to smooth it, but you probably will have to
map the resulting grey scale (result of averaging) into pure black and
white by thresholding (picking a grey value below which the pixel is
considered black, and above which it's considered white).
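A sketch of that fast route, again in present-day Python: average each pixel with its four up/down/left/right neighbors (edge pixels just average whatever neighbors exist, one arbitrary boundary choice among several), then threshold the grey result back to pure on/off:

```python
def smooth_and_threshold(bitmap, cutoff=0.5):
    """Neighbor-average a 0/1 bitmap, then threshold back to 0/1."""
    h, w = len(bitmap), len(bitmap[0])
    out = []
    for y in range(h):
        row = []
        for x in range(w):
            total, count = 0, 0
            # The pixel itself plus its up/down/left/right neighbors.
            for dy, dx in ((0, 0), (-1, 0), (1, 0), (0, -1), (0, 1)):
                ny, nx = y + dy, x + dx
                if 0 <= ny < h and 0 <= nx < w:
                    total += bitmap[ny][nx]
                    count += 1
            grey = total / count                     # the low-pass step
            row.append(1 if grey >= cutoff else 0)   # the threshold step
        out.append(row)
    return out
```

An isolated single pixel, for instance, averages to 1/5 and vanishes below the threshold, exactly the high-frequency speck removal you'd expect.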

If your zoom factor is something other than two, the
algorithm still works as long as you create intermediate grey scale
pixels during enlargement, followed by a *weighted* average over the
number of pixels involved in the magnification.
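For instance, a weighted average by bilinear interpolation (one common choice) maps each output pixel back to a fractional source coordinate and mixes the four surrounding source pixels; this sketch returns the intermediate grey values, ready for the thresholding described above:

```python
def zoom(bitmap, factor):
    """Enlarge a 0/1 bitmap by an arbitrary factor, returning greys.

    Each output pixel is a weighted (bilinear) average of the four
    source pixels surrounding its fractional source position.
    """
    h, w = len(bitmap), len(bitmap[0])
    oh, ow = int(h * factor), int(w * factor)
    out = []
    for y in range(oh):
        sy = min(y / factor, h - 1)          # fractional source row
        y0 = int(sy); y1 = min(y0 + 1, h - 1)
        fy = sy - y0
        row = []
        for x in range(ow):
            sx = min(x / factor, w - 1)      # fractional source column
            x0 = int(sx); x1 = min(x0 + 1, w - 1)
            fx = sx - x0
            grey = (bitmap[y0][x0] * (1 - fy) * (1 - fx) +
                    bitmap[y0][x1] * (1 - fy) * fx +
                    bitmap[y1][x0] * fy * (1 - fx) +
                    bitmap[y1][x1] * fy * fx)
            row.append(grey)
        out.append(row)
    return out
```

Enlarging the one-row bitmap [[0, 1]] by a factor of two, for example, yields the greys [0, 0.5, 1, 1] in each output row.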

Left as an exercise for the reader is the question of whether to include
diagonal neighbors in the averaging process, and if so, how much to
weight them relative to up/down/right/left neighbors.

Ta-da. You're done. Easy, right? The only reason I went into so much
detail, rather than just sketching the algorithm, is to make it clear
that this is an implementation of a clean mathematical model, not just
a hack that happens to work. Also it's a good example of why FFTs are
so indispensable in optical or image analysis of (almost) any sort.

As far as I can tell from the (lack of) features in this area in
commercially available Amiga software, far too few people are aware
of this stuff. For instance, Dpaint 2 does not use this method when
changing the size of brushes...it simply selectively deletes/replicates
pixels, which is often highly undesirable since it introduces a lot
of strange artifacts into many types of images.


Subject: Re: Font Smoothing Algorithm
Summary: Also tends to create serifs; more on image processing
Reply-To: doug@eris.UUCP (Doug Merritt)
Keywords: FFT, smoothing, spatial filter, serif/sans serif fonts

In article <8310@agate.BERKELEY.EDU> doug@eris.berkeley.edu (Doug Merritt) writes:
>The standard conceptual model for doing this with *any* bitmap images,
>including but not limited to fonts, is to do a spatial lowpass filtering

I forgot to mention that this also tends to *add* serifs to sans serif
fonts, which has some interesting implications about why serif fonts
look pleasing to the eye, considering that the visual system does
something akin to an FFT during processing. (Actually a Gabor transform,
according to Dr. Karl Pribram [NeuroPsychology chairman at Stanford], which
is a finite rather than infinite equivalent of the Fourier transform with
implications of a sort of Uncertainty Principle of resolvable details. Not
to digress or anything...)

The reason for the serif embellishment is that creating a perfectly
straight line requires infinitely high spatial frequencies (up to the
resolution of the display, anyway). The Fourier transform of a square wave
(which is analogous to a straight line/rectangle) is composed of an infinite
series of spatial sinusoids. This means that to draw a perfect square wave,
or a perfectly straight line/rectangle, you need all the high frequency
components you can display.
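Numerically, the square wave's series is square(t) = (4/pi) * [sin(t) + sin(3t)/3 + sin(5t)/5 + ...]; a short Python sketch of its partial sums shows the corners rounding off as fewer harmonics are kept:

```python
import math

def square_partial_sum(t, n_terms):
    """Partial Fourier series of a unit square wave: odd harmonics only.

    With n_terms = 1 this is just a plain sinusoid (all corners rounded
    away); as n_terms grows, the flat tops and sharp edges come back.
    """
    return sum(4 / (math.pi * k) * math.sin(k * t)
               for k in range(1, 2 * n_terms, 2))
```

At the middle of the flat top (t = pi/2), one term gives only 4/pi of a sinusoid's peak, while a couple hundred terms sit within a percent of the true value 1.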

If you filter out any of the high frequency components (for instance
in order to accomplish the smoothing I describe), then sharp square
edges will tend to get more rounded, which in this application means
that, at a sufficiently large magnification level, the fonts will be
more curvaceous, with serifs.

If you wanted to end up with a sans serif font with horizontal and
vertical straight lines preserved, modify the lowpass filter a bit
to preserve the purely vertical and purely horizontal high spatial
frequencies, but filter out the ones with both a vertical and horizontal
component. Do this by averaging only over diagonal neighbors, not
including the horizontal and vertical neighbors.
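A sketch of that modified filter, with an assumed weighting (half the weight on the pixel itself, the rest split evenly among the four diagonal neighbors; out-of-bounds neighbors count as background):

```python
def smooth_diagonal_only(bitmap, cutoff=0.5):
    """Average each 0/1 pixel with its diagonal neighbors only.

    Assumed kernel: 1/2 on the center pixel, 1/8 on each diagonal
    neighbor, with neighbors outside the bitmap treated as 0.
    Horizontal and vertical runs of pixels are left alone.
    """
    h, w = len(bitmap), len(bitmap[0])
    out = []
    for y in range(h):
        row = []
        for x in range(w):
            diag = 0
            for dy, dx in ((-1, -1), (-1, 1), (1, -1), (1, 1)):
                ny, nx = y + dy, x + dx
                if 0 <= ny < h and 0 <= nx < w:
                    diag += bitmap[ny][nx]
            grey = 0.5 * bitmap[y][x] + 0.125 * diag
            row.append(1 if grey >= cutoff else 0)
        out.append(row)
    return out
```

A two-pixel-wide vertical bar (the width a 2x enlargement produces) passes through this filter unchanged.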

If you want your font to have straight diagonal lines along with horizontal
and vertical lines, but still to have no curves, then the filter gets
even more complex. In general you can draw an image composed of all
of the types of features you are concerned with, and take an FFT of it.
The result can be used directly as a filter to delete those features,
or its complement can be used to preserve only those features.

For further details see any text on one dimensional signal processing, or
on Fourier optics for two dimensional signal processing, such as
"Introduction to Fourier Optics" by Joseph Goodman (rigorous), or
"Optical Information Processing" by Francis Yu (more accessible,
and with photos, still mathematical), or "Array Signal Processing"
by Justice/Owsley/Yen/Kak (more general, e.g. includes phased array
radar and CT techniques).

The fact that you have a choice like this as to the appearance of the
magnified font is a consequence of the fact that there are several
different ways to introduce new high resolution (high spatial frequency)
detail where there was none to begin with.

Interestingly enough, it is possible to model most different styles
of fonts as appropriately filtered versions of vector fonts, keeping
in mind that there are several *entirely different* ways of visually
symbolizing the same letter (e.g. look at "A" versus "a") in the
vector font.

-------------------- end of included articles ------------------------

	Doug Merritt		doug@mica.berkeley.edu (ucbvax!mica!doug)
			or	ucbvax!unisoft!certes!doug

rsalz@bbn.com (Rich Salz) (04/11/88)

Most conventional typesetter manufacturers refuse to scale UP, but will
scale DOWN.

About five years ago I was in charge of a project to buy a new production
system for the MIT student newspaper.

At that time it was common to have four font masters for digital typesetters
with about 1200dpi resolution.  I vaguely remember 12, 36, 72 and 144
point for one manufacturer.

The CRS/Alphatype company was a standout at the time because they had a
different master for every point size, and an overall resolution about
twice the norm, 2576dpi or so.  The Alphatype was the first commercial
typesetter used by the TeX project.  (We were gonna get a copy of their
hacks, but we didn't buy an Alphatype; we later got a Cg8400.)
	/rich $alz
-- 
Please send comp.sources.unix-related mail to rsalz@uunet.uu.net.