[news.software.b] Resending news using history file...

barry@walrus.UUCP (01/25/87)

Does anyone have a shell script that will re-queue news articles using the
history file? From time to time it is necessary for me to re-send the
last couple of days' worth of news to one of my leaf sites. Typically,
I'll run a simple find(1) on the /usr/spool/news directory, sending all
files modified within the last `n' days. However, this is blatantly
inefficient, as cross-posted articles are thus sent multiple times (only
to be dropped as duplicates on the receiving end).

A script that walked the history file pulling out the article pathnames
for the past "n" days and sticking those back into
/usr/spool/batch/<system> would be much more efficient, and I think a
quite useful addition to the news 2.11 distribution.

If anyone has one of these, please send me a copy--I'll post the best
(first?) to this newsgroup (so's to avoid the inevitable multiple
posting). If I don't get one Within A Reasonable Time, I'll write one
myself (I promise ;-).

Thanks!

-- 
LIVE: Barry A. Burke, ``working'' at home
UUCP: barry@walrus.Adelie.COM | {harvard | ll-xn | mirror}!adelie!walrus!barry
ARPA: barry%adelie@harvard.HARVARD.EDU

sl@van-bc.UUCP (01/27/87)

In article <437@walrus.Adelie.COM> barry@walrus.Adelie.COM (Barry A. Burke) writes:
>
>Does anyone have a shell script that will re-queue news articles using the
>history file? From time to time it is necessary for me to re-send the
>last couple of days' worth of news to one of my leaf sites. Typically,
>I'll run a simple find(1) on the /usr/spool/news directory, sending all
>files modified within the last `n' days. However, this is blatantly
>inefficient, as cross-posted articles are thus sent multiple times (only
>to be dropped as duplicates on the receiving end).
>
>A script that walked the history file pulling out the article pathnames
>for the past "n" days and sticking those back into
>/usr/spool/batch/<system> would be much more efficient, and I think a
>quite useful addition to the news 2.11 distribution.
>

There are two solutions to this. The first is what you proposed: a shell
script to walk through the history file, find the articles in question, and
re-queue them. A script to do just that was posted recently (Nov/Dec, I
think).

Second, a simpler solution requiring only minimal setup is to add the
following line to your sys file:

	backup:world,comp,sci,news,rec,soc,talk,misc,net,mod,na,usa,can:F

This will create a file called "backup" in /usr/spool/batch containing the
names of all the articles received. You should arrange for the following to
run in your daily news cleanup script:

	mv backup.7.Z backup.8.Z
	mv backup.6.Z backup.7.Z
	...
	mv backup.1.Z backup.2.Z
	mv backup backup.1
	compress backup.1
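
The rotation above can also be written as a short loop. This is only a
sketch (the function name, eight-generation depth, and directory default
are assumptions to adjust for your site):

```shell
#! /bin/sh
# rotate_batches: loop equivalent of the explicit mv sequence above.
# $1 = batch directory (default /usr/spool/batch); keeps 8 generations.
rotate_batches () {
	dir=${1-/usr/spool/batch}
	i=8
	while [ $i -gt 1 ]; do
		j=`expr $i - 1`
		# slide backup.j.Z up to backup.i.Z, oldest first
		[ -f "$dir/backup.$j.Z" ] && mv "$dir/backup.$j.Z" "$dir/backup.$i.Z"
		i=$j
	done
	if [ -f "$dir/backup" ]; then
		mv "$dir/backup" "$dir/backup.1"
		compress "$dir/backup.1"
	fi
}
```

Called from the daily cleanup script, this leaves backup.1.Z through
backup.8.Z just as the explicit mv list does.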

To resend a day's feed, simply:

	zcat /usr/spool/batch/backup.n.Z >> /usr/spool/batch/sitename

Compress does a reasonably good job of keeping the files down to a minimal
size, thanks to the regularity of the filenames. I keep two weeks' worth at
a cost of about 125k of file space.



-- 
Stuart Lynne	ihnp4!alberta!ubc-vision!van-bc!sl     Vancouver,BC,604-937-7532
Today's feature: Perry Mason Solves the Case of the Buried Clock, Erle
Stanley Gardner, 1943. A bank clerk boasted brazenly about a $90,000
embezzlement, and an alarm clock ticked away cheerfully underground.

barry@walrus.UUCP (01/29/87)

In article <437@walrus.Adelie.COM> barry@walrus.Adelie.COM (Barry A. Burke) writes:
>
>Does anyone have a shell script that will re-queue news articles using the
>history file? 
	.
	.
	.
>If anyone has one of these, please send me a copy--I'll post the best
>(first?) to this newsgroup (so's to avoid the inevitable multiple
>posting). If I don't get one Within A Reasonable Time, I'll write one
>myself (I promise ;-).

I received several responses, most of which suggested that I edit the
end of the history file so that what I had was a list of article IDs,
then put these into an "ihave" file (/usr/spool/batch/<system>.ihave),
and then do a "sendbatch -i -c <system>".  Using the "ihave" strategy
has the benefit that the remote side will only request those files that
it didn't already have, and cross-posted articles will only be
transmitted once.
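
The procedure these responses describe can be sketched as follows. This
is a hypothetical helper, not part of any posted script; it assumes
2.11-style history lines whose first field is the message-ID and whose
second is an mm/dd/yy date (check your own history file before trusting
it):

```shell
#! /bin/sh
# make_ihave: build an "ihave" list for a site from the history file.
# $1 = site, $2 = date (mm/dd/yy), $3 = history file (optional).
# BATCH may be set to override /usr/spool/batch (handy for testing).
make_ihave () {
	hist=${3-/usr/lib/news/history}
	batch=${BATCH-/usr/spool/batch}
	# first field of each matching history line is the article ID
	awk '$2 == d { print $1 }' d="$2" "$hist" >> "$batch/$1.ihave"
	# then run: sendbatch -i -c $1
}
```

For example, `make_ihave floozle 01/25/87` followed by
`sendbatch -i -c floozle` would offer that day's articles to floozle.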

HOWEVER, there is one BIG, BAD side effect.  Culling the history file to
send only article IDs will result in sending articles in *all*
newsgroups received locally- even things the remote side doesn't want,
or shouldn't have.  Among the "shouldn't have" category is a real
gotcha: if you run ihave/sendme with any sites, the control messages for
these will ALSO get sent, which results in the remote site sending you
BACK loads of articles you already have (because it sees the "sendme"
messages).

SO- I looked for a strategy that would easily resend ONLY those
articles that had originally been destined for the site in question (my
original need was to be able to re-send articles when the remote site
dropped several days' worth of news into the "bucket" due to software
problems).  What I came up with is the following simple Bourne shell/awk
script (requeue) that walks the "log" file(s) that news creates as it
receives articles (/usr/lib/news/log*).  I keep the last seven days'
worth of logs around (log.0-5), so I can do

	requeue floozle /usr/lib/news/log*

to resend the last week's worth of stuff, or just select any desired
day(s). Since the log files contain a list of the sites each article was
sent to, the script ensures that only the proper articles are
re-transmitted.
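
For reference, this sketch shows the shape of log line that requeue's
awk program matches: field 5 the message-ID, field 6 the word "sent",
and fields 8 onward the destination sites, possibly comma-terminated.
The sample line is fabricated purely to demonstrate the match; the real
2.11 log format may differ in detail:

```shell
#! /bin/sh
# Fabricated sample line in the shape requeue's awk expects; check the
# actual /usr/lib/news/log format on your own system.
printf 'Jan 28 23:00 walrus <100@a.UUCP> sent to floozle, wombat\n' > /tmp/newslog.$$
# same selection as requeue: print the message-ID of every article
# logged as "sent" to the named site
ids=`awk '$6=="sent"{for(i=8;$i!="";i++){if((name==$i)||(name","==$i)){print $5;break}}}' \
	name=floozle /tmp/newslog.$$`
echo "$ids"	# prints <100@a.UUCP>
rm -f /tmp/newslog.$$
```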

I also received a C-Shell script that automates the effort of resending using
the /usr/lib/news/history file. In light of the aforementioned problems,
it is probably best used IFF you don't run ihave/sendme with any other
sites, OR as a tool to supply an initial swamp-load for a new "leaf"
node.
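
Stripped of its interactive trimmings, the history-file approach boils
down to something like this sketch (a hypothetical helper assuming
history lines of the form `<message-id> mm/dd/yy hh:mm newsgroup/artnum`,
the same layout the included resend script relies on):

```shell
#! /bin/sh
# hist_requeue: append the spool pathnames of one day's articles to a
# site's batch file, mimicking what resend generates via awk and sed.
# $1 = site, $2 = date (mm/dd/yy), $3 = history file (optional).
# BATCH and SPOOL may be overridden for testing.
hist_requeue () {
	hist=${3-/usr/lib/news/history}
	batch=${BATCH-/usr/spool/batch}
	spool=${SPOOL-/usr/spool/news}
	# select the day, turn net.group/123 into a spool pathname,
	# and drop cancelled articles
	awk '$2 == d { print $4 }' d="$2" "$hist" |
		sed -e 's;\.;/;g' -e "s;^;$spool/;" -e '/cancelled$/d' \
		>> "$batch/$1"
}
```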

I include both scripts for general consumption:

-----Chop-Chop----------Buzz-Buzz------Oh what a relief that wuz!-------
#! /bin/sh
# This is a shell archive, meaning:
# 1. Remove everything above the #! /bin/sh line.
# 2. Save the resulting text in a file.
# 3. Execute the file with /bin/sh (not csh) to create:
#	requeue
#	resend
# This archive created: Wed Jan 28 23:19:31 1987
export PATH; PATH=/bin:/usr/bin:$PATH
echo shar: "extracting 'requeue'" '(509 characters)'
if test -f 'requeue'
then
	echo shar: "will not over-write existing file 'requeue'"
else
sed 's/^	X//' << \SHAR_EOF > 'requeue'
	X#! /bin/sh
	X#
	X# Original-Author: Barry A. Burke <barry@adelie.Adelie.COM>
	X#				   harvard!adelie!barry
	Xif ( test -z "$1" -o -z "$2" )
	Xthen
	X	echo "Usage: requeue <system> <logfiles>"
	X	exit 1;
	Xfi
	Xecho "Building \"ihave\" list for $1 from $2 $3 $4 $5 $6 $7 $8 ... "
	Xawk '$6=="sent"{for(i=8;$i!="";i++){if((name==$i)||(name","==$i)){print$5;break;}}}' name=$1 $2 $3 $4 $5 $6 $7 $8  >> /usr/spool/batch/$1.ihave
	Xecho ""
	Xecho "Done!"
	Xecho "	Remember to run \"/usr/lib/sendbatch -c -i $1\" to send the list"
	Xexit 0;
	X
SHAR_EOF
if test 509 -ne "`wc -c < 'requeue'`"
then
	echo shar: "error transmitting 'requeue'" '(should have been 509 characters)'
fi
chmod +x 'requeue'
fi
echo shar: "extracting 'resend'" '(1355 characters)'
if test -f 'resend'
then
	echo shar: "will not over-write existing file 'resend'"
else
sed 's/^	X//' << \SHAR_EOF > 'resend'
	X#! /bin/csh -f
	X#
	X# Original-Author: Tony Birnseth <peewee.uss.tek.csnet!tonyb@RELAY.CS.NET>
	X# $Header: resend.csh,v 1.9 86/10/09 13:07:45 news Exp $
	X#
	X# Resend news articles for dates given. (actually build a batch file)
	X# Dates must be of the form "mm/dd/yy"
	X#
	X
	Xset Usage = "Usage: resend date ..."
	X
	Xset HISTORY = /usr/lib/tek/news/history
	Xset tmp = /tmp/resend.$$
	Xset batch = /tmp/batch.$$
	X
	Xonintr done
	X
	Xif ( $#argv < 1 ) then
	X	echo $Usage
	X	exit 1
	Xendif
	X
	X@ i = 1
	Xwhile ( $i <= $#argv )
	X	if ( "$argv[$i]" !~ [0-9][0-9]/[0-9][0-9]/[0-9][0-9] ) then
	X		echo "Invalid date format <$argv[$i]> must be of form mm/dd/yy"
	X		exit 1
	X	endif
	X	@ i++
	Xend
	X
	X# write out an awk script
	Xecho "{" > $tmp
	X@ i = 1
	Xecho -n "	if ( " >> $tmp
	Xwhile ( 1 )
	X	echo -n '$2 == ' >> $tmp
	X	echo -n '"' >> $tmp
	X	echo -n $argv[$i] >> $tmp
	X	echo -n '" ' >> $tmp
	X	@ i++
	X	if ( $i > $#argv ) then
	X		echo ") " >> $tmp
	X		break
	X	else
	X		echo -n " || " >> $tmp
	X	endif
	Xend
	Xecho '		print $4 ' >> $tmp
	Xecho "}" >> $tmp
	X
	Xawk -f $tmp $HISTORY | sed 	-e 's;\.;/;g' \
	X				-e 's;^;/usr/spool/news/;' \
	X				-e '/cancelled$/d' > $batch
	Xset cnt = `wc $batch`
	Xecho "";echo Total == $cnt[1] news articles  batched.
	X
	Xecho "";echo -n "Which machines? "
	Xset hosts = ($<)
	Xforeach i ( $hosts )
	X	if ( "$i" == "" ) break
	X	cat $batch >> /usr/spool/batch/$i
	X	ls -l /usr/spool/batch/$i
	Xend
	X
	Xdone:
	Xrm -f $batch
	Xrm -f $tmp
	X
SHAR_EOF
if test 1355 -ne "`wc -c < 'resend'`"
then
	echo shar: "error transmitting 'resend'" '(should have been 1355 characters)'
fi
fi
chmod +x 'resend'
fi
exit 0
#	End of shell archive