[sci.electronics] Heath "Most Accurate" Clock

gordonl@microsoft.UUCP (Gordon LETWIN) (03/03/90)

A few months ago there were several postings discussing Heath's "Most
Accurate Clock" - an electronic clock that uses a built in receiver to
synchronize itself to the WWV broadcasts.

Quite a few people criticized the receiver section, saying that their
clock didn't pick up WWV very well and was rarely in "high spec".

I've just built one of these and have some observations.  First, I should
state that mine is obviously the latest model; it's possible that Heath
has tweaked the design between this and earlier versions.

Second, the clock is not intended to receive WWV correctly most of the
time.  The WWV time code is sent once a minute, at one bit per second (!).
There's no CRC, so the clock has to use a "voting" procedure to determine
whether a data frame was correct.  This requires three minutes of flawless
reception.  The clock adjusts its internal oscillator based upon the
divergence between its internal time and the broadcast time so that, after
a day or two of operation, the clock has a "stand alone" accuracy of about
3 seconds a year.  That's roughly 10 ms a day, so the design intent is that
if the clock can hear WWV successfully at least once a day then all is cool.
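
To make the voting idea concrete, here's a rough sketch in C of how such a
scheme could work - this is my own guess at the logic, not Heath's firmware:

    /* Sketch of a "voting" acceptance scheme: require three consecutive
     * minutes to decode cleanly and agree before trusting the frame. */

    #include <stdio.h>

    struct frame { int hour, min, yday; };   /* one decoded WWV minute */

    /* advance a frame by n minutes (simplified: ignores year rollover) */
    static struct frame advance(struct frame f, int n)
    {
        f.min += n;
        f.hour += f.min / 60;   f.min  %= 60;
        f.yday += f.hour / 24;  f.hour %= 24;
        return f;
    }

    static int same(struct frame a, struct frame b)
    {
        return a.hour == b.hour && a.min == b.min && a.yday == b.yday;
    }

    /* feed one decoded minute (ok == 0 means the minute was garbled);
     * returns 1 once three flawless, consistent minutes have been seen */
    static int vote(struct frame f, int ok)
    {
        static struct frame hist[3];
        static int count = 0;

        if (!ok) { count = 0; return 0; }  /* a bad minute restarts the vote */
        hist[count % 3] = f;
        if (++count < 3) return 0;
        /* the frames are a minute apart, so advanced copies must agree */
        return same(advance(hist[(count - 3) % 3], 2), f) &&
               same(advance(hist[(count - 2) % 3], 1), f);
    }

    int main(void)
    {
        struct frame m0 = {12, 34, 100}, m1 = {12, 35, 100},
                     m2 = {12, 36, 100};
        printf("%d %d %d\n", vote(m0, 1), vote(m1, 1), vote(m2, 1));
        return 0;                                          /* prints 0 0 1 */
    }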

The clock has a "high spec" light which comes on for (half an hour?) after
the clock successfully synchronizes to WWV.  This light is expected to
be off most of the time, but so long as the tenths-of-seconds digit is not
dimmed the clock is still accurate to within 10 ms.  The only time you're
truly getting bad reception is when the clock can't sync for 24 hours,
whereupon it dims the tenths-of-seconds digit in its display as a warning.
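
Spelled out in code, my reading of the indicator logic is roughly the
following (the half-hour window is a guess, and this is obviously not the
real firmware):

    /* "high spec" stays lit for a while after a good sync; the tenths
     * digit dims once the clock has gone a full day without syncing. */

    #define HIGH_SPEC_SECS  (30L * 60L)         /* assumed ~half an hour */
    #define STALE_SECS      (24L * 60L * 60L)   /* 24 hours without sync */

    void indicators(long secs_since_sync, int *high_spec_lit,
                    int *tenths_dimmed)
    {
        *high_spec_lit = (secs_since_sync < HIGH_SPEC_SECS);
        *tenths_dimmed = (secs_since_sync > STALE_SECS);
    }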

One problem that people may have is confusing receiver sensitivity with
the clock's ability to capture the WWV digital data.  The clock uses
tone decoders - much like PLLs - to detect the 100 Hz and 1000 Hz tones.
These decoders are set by the user in a calibration procedure that isn't
super precise.  The further off these are, the stronger the signal has to
be and the longer it takes the chip to detect the tone.  I found that by
using a frequency counter to set these, in place of the Heath calibration
procedure, my clock's ability to synchronize improved markedly.  My clock
is in "high spec" for more than 12 hours a day now.

In fact, I wanted to test out the "dim digit after 24 hours w/o sync"
feature, so I collapsed the telescoping antenna to cut off reception,
but it still picked up WWV just fine.  I then
unscrewed the antenna and removed it.  Coming back later, I found that the
clock was STILL receiving WWV at least a couple of times a day!  I finally
had to ground the antenna input connector in order to break contact.
Although the reception at my house is probably average, all my homebrew
computer gear makes my house "FCC Class Z" - I can't use walkie-talkies
or portable phones because of the high RFI - so this makes the clock's
receiver performance even more impressive.

As a P.S., the clock's design is kind of schizo.  On one hand, it has an
adjustment for the speed-of-light propagation delay between the transmitter
and the clock, and it tells you (via the "high spec" light) when the time
is within one millisecond, but on the other hand, the only way to use this
super accuracy is via the RS-232 connector, and the protocol pretty much
prevents you from obtaining that accuracy!  In other words, the clock
design is hyper accurate, but there's no way to get times out more accurate
than, say, +/- about 10 ms.
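
Some back-of-the-envelope numbers on why (the baud rate and message length
below are assumptions, not the clock's actual figures - plug in whatever
your unit really uses): at slow RS-232 rates just clocking the characters
out takes tens of milliseconds, so unless the protocol nails down exactly
which bit edge marks the second, the receiving end can't do much better
than about 10 ms.

    #include <stdio.h>

    int main(void)
    {
        double baud = 1200.0;         /* assumed serial rate */
        double bits_per_char = 10.0;  /* start + 8 data + stop */
        double chars = 20.0;          /* assumed length of one time string */

        double char_ms = bits_per_char / baud * 1000.0;
        printf("%.1f ms per character\n", char_ms);        /* ~8.3 ms */
        printf("%.0f ms per message\n", chars * char_ms);  /* ~167 ms */
        return 0;
    }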

If you're a radio astronomer or whatever then this is a problem, but if you
just want to synchronize your computers and other gear to an accurate time,
this clock is plenty good enough.

	Gordon Letwin

P.P.S. - I worked for Heath 12 years ago, but I saw enough things there,
	good and bad, that I'm neither "fer 'em" nor "agin 'em".