panisset@thunder.mcrcim.mcgill.edu (Jean-Francois Panisset ) (04/17/91)
I have been asked to look into importance sampling as a way to implement some digital filters in hardware (more specifically, 2D filters for anti-aliasing and texture filtering).

The idea of an importance-sampled filter is that instead of specifying equi-spaced weighted samples as an approximation to the continuous filtering function, all the samples are given the same weight (unity), but are distributed so that each sample occupies an equal area under the function (or an equal volume in the 2D case). The good part is that you can now perform your filtering simply by adding up the values of your signal at the sampling points, no multiplication required (this of course assumes that you can specify non-regular sampling easily).

The problem is that although this technique is alluded to in several papers (in particular, Rob Cook's papers on stochastic sampling), I have yet to find a rigorous frequency-domain analysis of it. I would appreciate it if anyone could point me to a reference on this topic.

Thanks in advance,
JF Panisset
--
Jean-Francois Panisset      INET: panisset@mcrcim.mcgill.ca
                                  panisset@larry.mcrcim.mcgill.edu
                            UUCP: ...!mcgill-vision!panisset
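To make the idea concrete, here is a minimal sketch in C (my own illustration, not taken from any of the papers mentioned) of a 1D importance-sampled tent filter: the sample offsets come from the inverse CDF of the kernel, so each offset covers an equal area under the filter, and the filter output is just an unweighted average of the signal at those offsets. The names tent_inv_cdf and signal are placeholders I made up for the example.

/* Sketch: importance sampling a 1D tent filter h(x) = 1 - |x| on [-1, 1].
 * Each of the N sample positions covers an equal area under h, so filtering
 * reduces to summing the signal at those positions (plus one final scale by
 * 1/N, which is a shift in hardware when N is a power of two). */
#include <stdio.h>
#include <math.h>

#define N_SAMPLES 8

/* Inverse CDF of the tent filter.  The CDF is H(x) = (x+1)^2/2 for x <= 0
 * and H(x) = 1 - (1-x)^2/2 for x > 0, so:
 *   u in [0, 0.5]: x = -1 + sqrt(2u)
 *   u in (0.5, 1]: x =  1 - sqrt(2(1-u))            */
static double tent_inv_cdf(double u)
{
    if (u <= 0.5)
        return -1.0 + sqrt(2.0 * u);
    else
        return 1.0 - sqrt(2.0 * (1.0 - u));
}

/* Placeholder signal to be filtered. */
static double signal(double x)
{
    return cos(10.0 * x);
}

int main(void)
{
    double pos[N_SAMPLES];
    double acc = 0.0;
    int i;

    /* Precompute sample offsets: one per equal-area strip of h. */
    for (i = 0; i < N_SAMPLES; i++)
        pos[i] = tent_inv_cdf((i + 0.5) / N_SAMPLES);

    /* Filter the signal at t = 0: sum the signal at the offsets, no
     * per-sample multiplications by filter weights. */
    for (i = 0; i < N_SAMPLES; i++)
        acc += signal(0.0 + pos[i]);

    printf("filtered value = %f\n", acc / N_SAMPLES);
    return 0;
}

The same construction extends to 2D by inverting the marginal and conditional CDFs of the 2D kernel, though (as noted above) I have not seen a frequency-domain analysis of the resulting sampling pattern.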
bobc@hplsla.HP.COM (Bob Cutler) (04/18/91)
This sounds like a problem that could be addressed by Papoulis' Generalized Sampling Theory. The theory deals not only with non-uniform sampling, but also with other forms of sampling, such as derivative sampling. He gives a brief description of the theory in his text "Signal Analysis", where he also references his paper published in the November '77 issue of IEEE Trans. on Circuits and Systems. I don't know if the theory has been extended to 2D transforms.

Bob Cutler
Lake Stevens Instrument Division
Hewlett-Packard
Everett, WA 98205