joe@auspyr.UUCP (Joe Angelo) (06/06/87)
Our 2.11 news software had a problem sending batched/compressed [-b12] news
to a Xenix 2.10 site. Batched news would always fail or cause something
to core dump. (Sorry, I don't remember why...)
So, what next? We started feeding the Xenix site unbatched news. This
worked out just fine, with the minor exception that the UUCP overhead
was simply too high! If a call didn't go through, for whatever reason, we'd
have a couple of hundred (read: too many) C./D./X. files sitting around.
Somewhat of a problem.
What we deduced was that the unbatched_[by_news_software_]news interface
worked fine, and that a higher-level batcher/compressor could be installed
to feed the 2.10 rnews program. Attached is a copy of the horrible scripts.
Yeah, they could use some work, but so long as it works...
The idea is to batch the articles without using sendbatch. So, in
your news/sys file, mark the site for batched news, i.e.:
xenix:groups:F:
Novice folks: the names of the files containing the articles will appear
in /usr/spool/batch/xenix (or whatever)...
The special_batcher script below reads the file names from the news
batchdir/site file and then converts them into large files (each under
100k) in the form of:
START
X
Xcontents of news file
STOP
START
X
Xcontents of next news file...
X
STOP
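If you're wondering what that framing looks like in practice, it can be
sketched in a couple of lines (the file names here are made up for the
example):

```shell
# Toy sketch of the per-article framing the batcher produces.
# "article" and "framed" are throwaway names for this example.
printf 'Path: site!user\nBody line one\n' > article

{
echo 'START'
sed 's/^/X/' < article      # prefix every line of the article with X
echo 'STOP'
} > framed

cat framed
# framed now holds:
#   START
#   XPath: site!user
#   XBody line one
#   STOP
```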
This file is then uux'd to the remote unbatcher script, which converts the
above into a shell script, such as:
sed 's/^X//' <'STOP' | rnews
X
Xcontents of news file
STOP
sed 's/^X//' <'STOP' | rnews
X
Xcontents of next news file...
X
STOP
This script is then fed to the shell. If you're heavy on security,
you might have a problem here... (After all, if someone edits the D.
file, you'd be in some trouble on the remote end... but since UUCP prevents
that (haha) from happening, no problem...)
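The batch-to-script conversion on the remote end is a single sed pass;
here's a stand-alone sketch of it (anchored to whole START lines, since
article text always carries the X prefix):

```shell
# Feed a tiny one-article batch through the unbatching transformation:
# each START line becomes a here-document header that strips the X
# prefixes and pipes the article into rnews.
printf 'START\nXPath: site!user\nXBody\nSTOP\n' |
sed 's;^START$;sed '\''s/^X//'\'' <<'\''STOP'\'' | rnews;'
# prints:
#   sed 's/^X//' <<'STOP' | rnews
#   XPath: site!user
#   XBody
#   STOP
```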
Excuse me for being incoherent, today is a horrid day...and I wanted to
post this now, otherwise I'd never do it...
---special_batcher script----
#! /bin/sh
# special_batcher: collect queued articles for a site into framed,
# compressed batches and ship them with uux.
cd /usr/spool/batch
BITS=-b12
REMOTE=jmr
# scan args
for arg in $*
do
case "$arg" in
-b*) BITS=$arg;;	# keep the -b prefix; compress wants the flag as-is
*) REMOTE="$arg";;
esac
done
BFILE=$REMOTE
QFILE=X${BFILE}.$$
# nothing queued for this site?  then nothing to do
test -f $REMOTE || exit 0
endoffile=true
didsome=no
COUNT=1
DFILE=$QFILE.$COUNT
LIST="$DFILE"
: > $DFILE	# truncate without writing a stray blank line
while $endoffile
do
read line
if test $? = 1
then
endoffile=false
if test $didsome = no
then
rm -f $DFILE
fi
break
fi
echo 'START' >> $DFILE
sed 's/^/X/' < $line >> $DFILE
echo 'STOP' >> $DFILE
didsome=yes
# crude size check; ls -l's size column is $4 on some systems, $5 on others
SIZE=`ls -l $DFILE | awk '{print $4}'`
if test $SIZE -gt 100000
then
COUNT=`expr $COUNT + 1`
DFILE=$QFILE.$COUNT
LIST="$LIST $DFILE"
: > $DFILE
didsome=no
fi
done < $BFILE
for file in $QFILE.*
do
test -f $file || continue	# glob may not match if nothing was queued
compress ${BITS} $file
if test -f $file.Z
then
rargs="$BITS -c"
OFILE=$file.Z
else
rargs=
OFILE=$file
fi
uux -r - ${REMOTE}!special_unbatcher $rargs < $OFILE
rm -f $OFILE
done
rm -f $BFILE
--special unbatcher, to be installed on the remote; let's call it special_unbatcher--
#! /bin/sh
# special_unbatcher: turn a shipped batch (possibly compressed) back
# into articles and hand each one to rnews.
BITS=
COMPRESS=cat
for f in $*
do
case "$f" in
-b*) BITS=$f;;
-c) COMPRESS="compress";;
esac
done
# stdin is file...
if test "$COMPRESS" != cat
then
COMPRESS="$COMPRESS -d $BITS"
fi
# anchor the match: article lines are X-prefixed and may contain "START"
$COMPRESS | sed 's;^START$;sed '\''s/^X//'\'' <<'\''STOP'\'' | rnews;' >> /tmp/newsp$$
sh < /tmp/newsp$$
rm -f /tmp/newsp$$
---
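As a sanity check, here's a toy round trip through both halves; cat
stands in for rnews so the sketch is self-contained, and every file
name is made up:

```shell
# Frame an article, convert the batch with the unbatcher's sed,
# then run the generated script with cat standing in for rnews.
printf 'Subject: test\n\nhello\n' > article
{ echo START; sed 's/^/X/' < article; echo STOP; } > batch
sed 's;^START$;sed '\''s/^X//'\'' <<'\''STOP'\'' | cat;' < batch > script
sh script > out
cmp article out && echo 'round trip ok'
# prints: round trip ok
```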
Of course, you'll need to add "special_unbatcher" to the L.cmds/Permissions
file on the remote, and a cron entry locally to run
"special_batcher remote"
every so often.
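For example, a crontab entry along these lines (the schedule and the
script's path are guesses, adjust to taste):

```
# batch and ship for site "xenix" every few hours
0 0,4,8,12,16,20 * * * /usr/lib/news/special_batcher xenix
```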
--
"No matter Joe Angelo, Sr. Sys. Engineer @ Austec, Inc., San Jose, CA.
where you go, ARPA: aussjo!joe@lll-tis-b.arpa PHONE: [408] 279-5533
there you UUCP: {sdencore,cbosgd,amdahl,ptsfa,dana}!aussjo!joe
are ..." UUCP: {styx,imagen,dlb,jmr,sci,altnet}!auspyr!joe