[news.software.b] Delayed ihave

heiby@mcdchg.UUCP (Ron Heiby) (11/05/87)

One of the sites I feed is actually supposed to be just a "backup"
feed to them.  They are a long distance call away, so the intention
was to let them get the majority of their news locally, but to have
an ihave/sendme link with me to make sure they didn't miss anything.
We'd like to be able to have the message-ids go into a file that
basically sits around for about three days before becoming the contents
of an ihave message to their site.  I think I can kludge something
into the script I run each hour, between the processing of incoming
news with "rnews -U" and the batching process, but that seems kinda
ugly.  Is there a simple way to do this?
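
Something along these lines is what I have in mind -- just a rough
sketch, nothing actually running here, and the holding-file path is
invented: a little filter the hourly script could run on each batch of
new Message-IDs, stamping every one with the time it arrived so a later
pass can pick out the ones that have aged three days.

/*
 * Rough sketch only: read one Message-ID per line on stdin and append
 * "epoch-seconds message-id" lines to a holding file.  The path below
 * is made up for illustration.
 */
#include <stdio.h>
#include <string.h>
#include <time.h>

#define HOLDFILE "/usr/spool/news/ihave.hold"   /* invented location */

int main(void)
{
    FILE *hold = fopen(HOLDFILE, "a");
    char id[BUFSIZ];
    long now = (long) time(NULL);

    if (hold == NULL) {
        perror(HOLDFILE);
        return 1;
    }
    while (fgets(id, sizeof id, stdin) != NULL) {
        id[strcspn(id, "\n")] = '\0';           /* strip the newline */
        if (id[0] != '\0')
            fprintf(hold, "%ld %s\n", now, id);
    }
    return fclose(hold) == EOF ? 1 : 0;
}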
-- 
Ron Heiby, heiby@mcdchg.UUCP	Moderator: comp.newprod & comp.unix
"I know engineers.  They love to change things."  McCoy

steve@nuchat.UUCP (Steve Nuchia) (11/12/87)

In article <2293@mcdchg.UUCP>, heiby@mcdchg.UUCP (Ron Heiby) writes:
> One of the sites I feed is actually supposed to be just a "backup"
> feed to them.  They are a long distance call away, so the intention
> was to let them get the majority of their news locally, but to have
> an ihave/sendme link with me to make sure they didn't miss anything.
> We'd like to be able to have the message-ids go into a file that
> basically sits around for about three days before becoming the contents
> of an ihave message to their site.  I think I can kludge something
> into the script I run each hour, between the processing of incoming
> news with "rnews -U" and the batching process, but that seems kinda
> ugly.  Is there a simple way to do this?

I've cooked up a solution that works quite well, though I'm
still running one of the steps manually - haven't settled
the question of how many days to delay yet.

The big change is in the routine ihave in control.c - mine
formats lines with the article id and the date into a file
named after the machine we got the ihave from.
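
In outline the change looks something like the routine below.  The
argument names and the spool directory are invented for illustration --
the real ihave() in control.c has a different interface -- but the idea
is just to log the offer and get out instead of answering it on the
spot.

/*
 * Sketch of the idea, not the actual patch: instead of answering an
 * ihave right away, append "message-id date" to a file named after
 * the machine that sent it.  Argument names and IHAVEDIR are invented.
 */
#include <stdio.h>
#include <time.h>

#define IHAVEDIR "/usr/spool/news/ihave"        /* invented directory */

void log_ihave(const char *msgid, const char *sender)
{
    char path[256];
    FILE *fp;
    long now = (long) time(NULL);

    sprintf(path, "%s/%s", IHAVEDIR, sender);   /* one file per sending machine */
    if ((fp = fopen(path, "a")) == NULL)
        return;                                 /* can't log; drop the offer */
    fprintf(fp, "%s %ld\n", msgid, now);
    fclose(fp);
}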

Then there's this program that slurps up all of the history
into "compact" memory structures - still takes a lot - and
then grinds through the ihave file, disposing of redundant stuff,
ordering what it hasn't seen yet, and re-spooling anything
that is still too new to order.  It takes 2 1/4 CPU minutes
to process a couple of days' worth of ihaves, mostly reading
the history files.  If you let the default ihave in control.c
do it, it reads the history database on average once for every
15 or so articles!
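
The aging pass goes roughly like the sketch below, simplified quite a
bit: here the history is assumed to be a flat file of Message-IDs (the
real thing is a dbm database), and the per-sender ihave file holds
"message-id epoch-seconds" lines as in the sketch above.  Old-enough
offers we haven't seen go to stdout as the sendme list; anything too
new gets rewritten for the next run.

/*
 * Sketch of the aging pass, under assumptions of mine (not the real
 * news layout): the history dump is one Message-ID per line, and the
 * ihave file holds "message-id epoch-seconds" lines.
 * Hypothetical usage:  age-ihave history.ids ihave.SENDER > sendme.list
 * Offers still too young are rewritten to ihave.SENDER.new.
 */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <time.h>

#define NBUCKET 8192
#define DELAY   (3L * 24 * 60 * 60)             /* three days, in seconds */

struct entry {
    char *id;
    struct entry *next;
};
static struct entry *bucket[NBUCKET];

/* simple chained hash over Message-IDs -- the "compact" in-core history */
static unsigned hash(const char *s)
{
    unsigned h = 0;
    while (*s)
        h = h * 31 + (unsigned char) *s++;
    return h % NBUCKET;
}

static void remember(const char *id)
{
    struct entry *e = (struct entry *) malloc(sizeof *e);
    unsigned h = hash(id);

    if (e == NULL || (e->id = strdup(id)) == NULL) {
        free(e);
        return;                                 /* out of memory: skip */
    }
    e->next = bucket[h];
    bucket[h] = e;
}

static int seen(const char *id)
{
    struct entry *e;

    for (e = bucket[hash(id)]; e != NULL; e = e->next)
        if (strcmp(e->id, id) == 0)
            return 1;
    return 0;
}

int main(int argc, char **argv)
{
    FILE *hist, *ihave, *respool;
    char line[BUFSIZ], id[BUFSIZ], newname[BUFSIZ];
    long stamp, now = (long) time(NULL);

    if (argc != 3) {
        fprintf(stderr, "usage: %s history-ids ihave-file\n", argv[0]);
        return 2;
    }

    /* Read the history once, up front, instead of once per article. */
    if ((hist = fopen(argv[1], "r")) == NULL) { perror(argv[1]); return 1; }
    while (fgets(line, sizeof line, hist) != NULL) {
        line[strcspn(line, "\n")] = '\0';
        if (line[0] != '\0')
            remember(line);
    }
    fclose(hist);

    if ((ihave = fopen(argv[2], "r")) == NULL) { perror(argv[2]); return 1; }
    sprintf(newname, "%s.new", argv[2]);
    if ((respool = fopen(newname, "w")) == NULL) { perror(newname); return 1; }

    while (fscanf(ihave, "%s %ld", id, &stamp) == 2) {
        if (seen(id))
            continue;                           /* redundant: we already have it */
        if (now - stamp >= DELAY)
            printf("%s\n", id);                 /* old enough: order it */
        else
            fprintf(respool, "%s %ld\n", id, stamp);    /* too new: re-spool */
    }
    fclose(ihave);
    fclose(respool);
    return 0;
}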

I've just patched up to level 12 (from 5!) and I don't know
if it's all still running yet, but I can send my code to
anyone who asks.  If there's enough interest, I'll send it to
the archives.

-- 
Steve Nuchia	    | [...] but the machine would probably be allowed no mercy.
uunet!nuchat!steve  | In other words then, if a machine is expected to be
(713) 334 6720	    | infallible, it cannot be intelligent.  - Alan Turing, 1947