bernie@metapro.DIALix.oz.au (Bernd Felsche) (02/05/91)
In <1991Feb04.033031.2714@convex.com> tchrist@convex.COM (Tom Christiansen) writes:

>Look at /usr/local/lib/news/newsbin/expire/recovact for another good
>candidate.  Here's a loop in it:
>
>    while read group max min fourth
>    do
>        dir=`echo $group | tr . / `    # map ng name to directory name
>        new=
>        if test -d $NEWSARTS/$dir
>        then
>            new=`ls $NEWSARTS/$dir | egrep '^[0-9]+$' | sort -n | tail -1`
>        fi
>        case "$new" in
>        "") new=$max ;;                 # no files -- preserve old value
>        *)  if test "$new" -lt "$max"   # old value more recent (!)
>            then
>                new="$max"
>            fi
>            ;;
>        esac
>        dots="`echo $max | tr 0123456789 ..........`"
>        max="`expr 0000000000$new : '.*\('$dots'\)$'`"  # preserve length
>        echo $group $max $min $fourth
>    done <active >active.new
>
>I'm quite certain that would run faster in perl.

Sure it would.  One fork-exec is a *lot* faster than seven (7) per
newsgroup (I'm assuming lots of built-in shell bits).  There are over
700 newsgroups on our machine.

The use of "ls" to generate filenames is a trifle silly, especially as
it's only being used to find the maximum article number.  A simple perl
routine could read the directory and simply look for a maximum.  It
doesn't even have to sort.

(btw: if people didn't screw around with timestamps so badly, then

    new="`ls -tr [0-9]*|tail -1`"

would have done the same trick.)

I'm only now starting to play with Perl, but I can easily see why it's
so popular for things like this.
-- 
 _--_|\   Bernd Felsche       #include <std/disclaimer.h>
/      \  Metapro Systems, 328 Albany Highway, Victoria Park, Western Australia
\_.--._/  Fax: +61 9 472 3337  Phone: +61 9 362 9355  TZ=WST-8
      v   E-Mail: bernie@metapro.DIALix.oz.au | bernie@DIALix.oz.au
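A minimal sketch, not from the original thread, of the Perl routine the
post describes: read the spool directory once, keep only the all-digit
article filenames, and track the running maximum.  No sort, no child
processes.  The function name max_article and its argument are
hypothetical; in recovact's terms the path would be $NEWSARTS/$dir.

```perl
# Hypothetical sketch: find the highest-numbered article file in a
# news spool directory with a single opendir/readdir pass.
sub max_article {
    my ($path) = @_;
    opendir(my $dh, $path) or return undef;   # no such group directory
    my $max;
    while (defined(my $name = readdir($dh))) {
        next unless $name =~ /^\d+$/;         # article files are all digits
        $max = $name if !defined($max) || $name > $max;
    }
    closedir($dh);
    return $max;                              # undef if no articles found
}
```

One process total, however many newsgroups there are, versus the four
forks of the ls | egrep | sort | tail pipeline for each group.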