[news.admin] Backup news server.

pjg@lictor.acsu.buffalo.edu (Paul Graham) (08/22/89)

In article <9433@cs.Buffalo.EDU>, I wrote
>I'd like to maintain a backup/clone/copy news system.  Normally it would
>share the news load.  In case of failure users would be able to switch 
>servers without discontinuity.  We use nntp and B news 2.11.
>I thought I'd be able to do it by having one feed the other based upon
>history file differences.  That doesn't seem to work.  Has someone done
>this?
>I'm deferring the problem of posting to both machines until I get them cloned.
>Please mail any ideas to me.  (pjg@acsu.buffalo.edu)

First off, in that article my From: line was hosed, but the Reply-To: is OK.
It appears I wasn't clear enough.  I want to have two (or more) machines with
identical (modulo dates in history) news spools.  These machines will be
nntp servers.  With regard to this problem they won't know or care about
uucp.  I want my users to be able to switch from one machine to the other
as a source of articles (using rrn etc.) without being aware of any difference.
Or I may want to move them without them knowing about it.  rdist was mentioned,
but I've found pawing through 100 or so megabytes with rdist to be a bit on
the slow side.  ihave/sendme and backup uucp machines are of course out of
the picture.

Assume that I'll be able to control the newsfeed and work out something about
users posting.  It seems that keeping track of differences in the history
files and using them to generate nntp batches would be sufficient, but my
first attempt (using diff) failed to keep the machines synchronized.  Cancel
messages seemed to get out of synch and then all following postings (in that
group) would have the wrong article numbers.
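For what it's worth, the history-difference idea can be sketched roughly as
below.  The file names and the tab-separated "<message-id> date group/artnum"
history layout are my assumptions about B news 2.11, not a tested recipe, and
as noted it says nothing about cancels:

```shell
# Rough sketch: articles present in the newer history snapshot but not
# the older one are the ones to batch to the other server.
# Fabricate two history snapshots for illustration:
printf '<a@x>\t123\tnews.test/1\n<b@x>\t124\tnews.test/2\n' > old.history
printf '<a@x>\t123\tnews.test/1\n<b@x>\t124\tnews.test/2\n<c@x>\t125\tnews.test/3\n' > new.history

sort old.history > old.s
sort new.history > new.s
# comm -13 keeps only lines unique to the newer snapshot; turn the
# group.name/number field into a relative spool pathname for the batch.
comm -13 old.s new.s | awk -F'\t' '{ gsub(/\./, "/", $3); print $3 }' > batchfile
cat batchfile
```

With the sample data above, batchfile ends up containing just news/test/3,
the one article that appeared between the two snapshots.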

So if you know how to keep multiple NNTP servers in exact synch (preferably
without running expire more than once a day) please let me know.

Thanks for your time.

woods@ncar.ucar.edu (Greg Woods) (08/22/89)

In article <9460@cs.Buffalo.EDU> pjg@urth.cc.buffalo.edu (Paul Graham) writes:
>I want to have two (or more) machines with
>identical (modulo dates in history) news spools.  

  I do this by setting the backup server up as a normal batch news feed (so 
that the /usr/spool/batch file with article file names is created), but then
I wrote a C program (ftpxmit) which reads the normal batch file and FTP's
the articles directly into /usr/spool/news on the backup server. The active
file is transmitted every time ftpxmit starts up, so that it is always
up to date. My backup server does not allow posting, and since it gets no
real "feed" I don't even need a history file. Aside from the posting 
restriction, this works quite well and preserves article numbers for
.newsrc files. The backup server is, of course, still set up as an NNTP
server; NNTP isn't used there for news transmission, but it is used for
remote news reading.
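In rough outline (this is only my guess at the shape of ftpxmit, with a local
directory standing in for the FTP destination and every name made up), the
batch-driven transfer looks like:

```shell
# Toy, self-contained analogue of the batch-driven transfer: walk the
# batch file of article pathnames and copy each article, plus the active
# file, into the backup spool.  All paths here are fabricated for the demo.
mkdir -p spool/news/news/test lib backup/news
echo 'an article' > spool/news/news/test/1
echo 'news.test 00001 00001 y' > lib/active
echo 'news/test/1' > batchfile

cp lib/active backup/news/active          # active file goes on every run
while read art; do                        # one article pathname per line
  mkdir -p "backup/news/$(dirname "$art")"
  cp "spool/news/$art" "backup/news/$art"
done < batchfile
```

Because the articles land under the same relative pathnames, the article
numbers (and hence readers' .newsrc files) carry over unchanged.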
  "expire" on the backup server is even simpler: I just run find(1) and
delete anything over two weeks old in /usr/spool/news.
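Something along these lines, presumably (the spool path is the classic B news
location, guarded so the sketch is a no-op if it doesn't exist):

```shell
# find(1)-based "expire": remove anything in the backup spool more than
# two weeks old.  SPOOL can be overridden for testing.
SPOOL=${SPOOL:-/usr/spool/news}
if [ -d "$SPOOL" ]; then
  find "$SPOOL" -type f -mtime +14 -exec rm -f {} \;
fi
```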

--Greg