[comp.dsp] Creating Samples by stuffing & filtering

mcmahan@netcom.UUCP (Dave Mc Mahan) (11/06/90)

The Problem:

I have been given the task of smoothing an ECG (electro-cardiogram) signal that
needs to be sent to a strip chart recorder.  Currently, the system outputs
250 samples per second, although the next generation will provide only 125
samples per second on two channels.  The goal is to provide two benefits:
smoother strip chart recordings and peak restoration.  The current strip chart
output shows definite steps for each sample, especially when viewed at a high
paper speed.  The current sampling does not always land exactly on a peak
(obviously), and it is desirable to re-create it.  The signal is low-pass
filtered before sampling to avoid aliasing.  Although it isn't a great rolloff,
the low-pass filter starts cutting off at about 52 Hz and is 8 dB down at 62 Hz.


What I have done:

To create more samples, I assumed that I could just read in one sample,
duplicate it to the output the desired number of times for the chosen
expansion factor (currently, an expansion factor of 4 samples out for each
sample in seems to be about right), and then pass the expanded data through
a digital low-pass filter to smooth the final output.  I am using a 33-tap
filter for this.  I need to use an FIR filter because I need to ensure
linear phase response (and constant time delay) for all frequencies
in the signal.  The final signal does look much better and the original peaks
are restored, although I do get a bit of the Gibbs phenomenon showing up in
areas where the input signal has a very high rate of change.  I assume
this is due to using such a short FIR filter length and could be made to go
away if I increased the number of taps.  The process will eventually have to
run on a 68000 in real time, so the number of taps (and the resulting number
of multiplications per second) is a concern.
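The expand-and-filter approach above can be sketched as follows.  This is a
minimal illustration, not the actual implementation: the function names are
made up, and a 4-point moving average stands in for the real 33-tap FIR design.

```python
def hold_expand(samples, factor=4):
    """Duplicate each input sample 'factor' times (zero-order hold)."""
    out = []
    for s in samples:
        out.extend([s] * factor)
    return out

def fir_filter(x, taps):
    """Direct-form FIR: y[n] = sum over k of taps[k] * x[n-k]."""
    y = []
    for n in range(len(x)):
        acc = 0.0
        for k, h in enumerate(taps):
            if n - k >= 0:
                acc += h * x[n - k]
        y.append(acc)
    return y

# Placeholder taps: a simple 4-point average instead of the 33-tap design.
taps = [0.25, 0.25, 0.25, 0.25]
expanded = hold_expand([1.0, 3.0, 2.0], factor=4)
smoothed = fir_filter(expanded, taps)
```

Note the drawback Tom Parks points out below applies here: duplicating samples
is itself a crude filter, so the FIR taps end up compensating for its droop.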


My Question:

Does this make sense, or is there a better way?  The results I get now look
pretty good, but I wonder whether there is a better or more efficient way
to generate the desired data samples.  I plan to use as many tricks as
possible in software to implement the filter with minimum math
overhead.  I thought about doing straight-line interpolation between
samples, but then I wouldn't get any peak restoration.


All thoughts or ideas are appreciated.


    -dave

-- 
Dave McMahan                            mcmahan@netcom.uucp
					{apple,amdahl,claris}!netcom!mcmahan

parks@bennett.berkeley.edu (Tom Parks) (11/07/90)

In article <16311@netcom.UUCP>, mcmahan@netcom.UUCP (Dave Mc Mahan) writes:
> The Problem:
> 
> I have been given a task of smoothing an ECG (electro-cardiogram) signal that
> needs to be sent to a strip chart recorder.

[stuff deleted]

> What I have done:
> 
> To create more samples, I assumed that I could just read in one sample,
> duplicate it to the output the desired number of times for the proper
> expansion factor (currently, an expansion factor of 4 samples out for each
> sample in seems to be about right) and then pass the expanded data through
> a digital filter to low pass filter it and smooth the final output.

Instead of duplicating the samples, insert zeros between the samples. 
This way your low pass filter can be flat in the pass band, instead of
needing to compensate for the bowing introduced by duplicating samples. 
(Duplicating samples is equivalent to passing your signal through a
system that has a rectangular pulse for its impulse response.  The
Fourier transform of a rectangular pulse is sinc-like.)  Also, this
means that many of the samples you are filtering will be zero, and
multiplication by zero can be done very quickly, in zero time in fact.
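A sketch of this zero-stuffing approach (illustrative names, not from the
post).  Two details worth noting: the filter's passband gain must equal the
expansion factor to restore the original amplitude, and since only one sample
in four is nonzero, the inner loop can skip most of the multiplies.

```python
def zero_stuff(samples, factor=4):
    """Insert (factor - 1) zeros after each input sample."""
    out = []
    for s in samples:
        out.append(s)
        out.extend([0.0] * (factor - 1))
    return out

def fir_skip_zeros(x, taps):
    """Direct-form FIR that skips multiplies against zero-stuffed samples.
    With 1-in-4 samples nonzero, only ~1/4 of the taps do real work."""
    y = []
    for n in range(len(x)):
        acc = 0.0
        for k, h in enumerate(taps):
            if n - k >= 0 and x[n - k] != 0.0:
                acc += h * x[n - k]
        y.append(acc)
    return y

# Placeholder taps with a DC gain of 4 (the expansion factor); this
# particular choice reproduces a zero-order hold, purely for illustration.
taps = [1.0, 1.0, 1.0, 1.0]
y = fir_skip_zeros(zero_stuff([1.0, 2.0], factor=4), taps)
```

In a real implementation this "skip the zeros" observation is usually
formalized as a polyphase filter: the 33 taps split into 4 sub-filters, each
applied only to the original (nonzero) samples.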


Tom Parks
Electronics Research Laboratory
University of California, Berkeley
----------------------------------
Life is a sexually transmitted disease.

chris@ncmicro.lonestar.org (Chris Arps) (11/08/90)

I did some real-time data smoothing for an oil well tool that measured
resistance.  What we did was place the samples into a queue whose length
was a convenient number for a bell curve average.  We just used a simple array
that was either 9 or 27 elements long, giving low and high amounts of
smoothing.  The array was used as a queue: fill it with the first 9 data
values, multiply each value by a pre-set array of bell curve weights
(which all add up to 1.0), and sum the products to get a weighted average,
which is output to the strip recorder.  When the next data word comes in,
just move all the array elements up one position in the array and add the
new data sample to the end.  This algorithm works well, and on machines with
a block memory move, the queue works fast enough for real-time computation.


I hope that this is a good enough description of the algorithm.  It
produced nice curves and can be scaled as low as 3 elements or
as high as you need.
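The queue-and-weights scheme can be sketched like this.  The Gaussian weight
table is an assumption on my part; the post describes "bell curve values
which all add up to 1.0" but does not give the actual table.

```python
import math
from collections import deque

def bell_weights(n, sigma=None):
    """Assumed Gaussian-shaped weights, normalized to sum to 1.0."""
    if sigma is None:
        sigma = n / 6.0  # assumed spread: queue covers about +/- 3 sigma
    mid = (n - 1) / 2.0
    w = [math.exp(-0.5 * ((i - mid) / sigma) ** 2) for i in range(n)]
    total = sum(w)
    return [v / total for v in w]

def smooth_stream(samples, n=9):
    """FIFO queue of the last n samples; each new sample yields one
    weighted average (deque stands in for the block memory move)."""
    weights = bell_weights(n)
    queue = deque(maxlen=n)
    out = []
    for s in samples:
        queue.append(s)
        if len(queue) == n:
            out.append(sum(w * v for w, v in zip(weights, queue)))
    return out
```

Note this is just an FIR low-pass filter with Gaussian taps, so Dave's
linear-phase requirement is satisfied here too: the weights are symmetric,
giving a constant delay of (n - 1) / 2 samples.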