[comp.graphics] "Random" compression

schear@ttidca.TTI.COM (Steve Schear) (12/10/87)

 I have been considering an alternative image compression method which
 utilizes randomness (actually pseudo-randomness) rather than chaos to
 do its dirty work.  If workable, this algorithm promises to be capable
 of exactly compressing arbitrarily sized images with a fixed-length code
 based only upon the number of different pixel values in the image.
 Though the compression phase could be exceedingly complex, decompression
 should be fast and straightforward.

 The idea is that a selected and constrained "random" sequence can
 in fact code for images.  In particular, any given image can, in the
 worst case, be considered a "random", that is uncorrelated, arrangement
 of pixels.  If this is so, then a particular selected random pattern of
 pixels can be used to code for any image, given of course sufficient
 computing power and time (the catch).

	First, the pixels in an image are sorted according to their
	values and locations to form a specialized sort of histogram.

	Next, a random number generator is used to select pixels (it
	can start at any pixel location in the image).  This generator
	is constrained so that it can never choose the same pixel twice.
	Furthermore, once a pixel is chosen, all pixels of the same value
	must be selected before a pixel of another value is picked.  If
	the generator selects a pixel of the wrong value, a different
	random seed (and hence sequence) is tried (yes, I know that this
	could require an incredible number of permutations).

	If and when a random seed is produced which correctly "codes"
	for the image, the image is then stored as simply the list of
	pixel runs plus the seed.

	If one now assumes that neighboring pixels in most images have a high
	correlation, a good general assumption, specifically weighted "random"
	sequences can be selected which tend to match the image under
	compression, reducing the computational effort (though perhaps not
	enough).
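
One hypothetical way to do the weighting (my own guess, continuing the
sketch above, with the bias factor and 4-neighbourhood chosen arbitrarily):
draw the next location with extra weight on the immediate neighbours of the
last pixel chosen, so a spatially correlated image is more likely to come
out grouped by value.  Since the draw is deterministic for a given seed,
decompression can replay it the same way:

    def weighted_permutation(seed, width, height, bias=8.0):
        # Like permutation() above, but the next location is drawn with
        # extra weight (an arbitrary 'bias' factor) on the 4-neighbours
        # of the previously chosen pixel.  Candidates are kept in sorted
        # order so the same seed always replays the same sequence.
        rng = random.Random(seed)
        n = width * height
        remaining = set(range(n))
        current = rng.randrange(n)
        order = [current]
        remaining.discard(current)
        while remaining:
            x, y = current % width, current // width
            neighbours = set()
            if x > 0:
                neighbours.add(current - 1)
            if x < width - 1:
                neighbours.add(current + 1)
            if y > 0:
                neighbours.add(current - width)
            if y < height - 1:
                neighbours.add(current + width)
            candidates = sorted(remaining)
            weights = [bias if c in neighbours else 1.0 for c in candidates]
            current = rng.choices(candidates, weights=weights)[0]
            order.append(current)
            remaining.discard(current)
        return order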

Ok guys, flame away!

kaufman@Shasta.STANFORD.EDU (Marc Kaufman) (12/12/87)

In article <1531@ttidca.TTI.COM> schear@ttidca.TTI.COM (Steve Schear) writes:

> I have been considering an alternative image compression method which
> utilizes randomness (actually pseudo-randomness) rather than chaos to
> do its dirty work.  If workable, this algorithm promises to be capable
> of exactly compressing arbitrarily sized images with a fixed-length code
> based only upon the number of different pixel values in the image.
> Though the compression phase could be exceedingly complex, decompression
> should be fast and straightforward.

...

>Ok guys, flame away!

Actually, since at most a finite number of graphic images have ever been
created (or will be in the foreseeable future), we can just use a Godel
numbering scheme to identify them.  48 bits should suffice for the next
few years. :-)

Marc Kaufman (kaufman@Shasta.stanford.edu)