david@indetech.com (David Kuder) (09/11/90)
To minimize disk usage here we use a "reaper".  This is a script that
runs late at night to get rid of files that can be safely removed.  For
instance, we remove all core files that are more than one day old and
all GNU Emacs litter that is more than 5 days old.  If it were that
simple, then a "find" would do the trick.  But recently it has gotten
more complicated, and maintaining a collection of
\( -name '*pat1*' -mtime +5 \) -o stuff is getting old.

Thus, the direct question:  Does anyone have a generic, readily
configurable reaper?

Or:  Does anyone have a skeleton for a find equivalent written in Perl?
I think I could do what I want from that.
-- 
David A. Kuder             Looking for enough time to get past patchlevel 1
415 438-2003  david@indetech.com  {uunet,sun,sharkey,pacbell}!indetech!david
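[Editor's note: the kind of table-driven reaper David asks for can be
sketched as a thin wrapper around find.  This is a hypothetical sketch,
not David's actual script; the patterns and ages in the table are
illustrative examples only.]

```shell
#!/bin/sh
# Sketch of a configurable reaper: each table line is
# "glob-pattern  age-in-days".  Files under the given root that match
# the pattern and are older than the age are printed.  Once the list
# looks right, swap -print for -exec rm -f {} \; to actually reap.
# (The table entries below are examples, not a real site policy.)
reap() {
    root=$1
    while read pat days; do
        [ -z "$pat" ] && continue              # skip blank table lines
        find "$root" -name "$pat" -mtime +"$days" -type f -print
    done <<'EOF'
core   1
*~     5
EOF
}

# reap /home     # example invocation
```

Adding a new file class is then a one-line table edit instead of
another hand-maintained `\( -name ... -mtime ... \) -o` clause.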
lwall@jpl-devvax.JPL.NASA.GOV (Larry Wall) (09/11/90)
In article <1990Sep10.230742.9600@indetech.com> david@indetech.com (David Kuder) writes:
: Or:  Does anyone have a skeleton for a find equivalent written in Perl?
: I think I could do what I want from that.

This one will beat find in elapsed time on my machine, as long as you're
primarily examining just the names.  If you have to stat every file,
it'll run a bit slower, but you can do away with the $nlink == 2 business.

#!/usr/local/bin/perl

&dodir('.');

sub dodir {
    local($dir,$nlink) = @_;
    local($dev,$ino,$mode);

    # On the first call $nlink is not supplied, so stat the starting dir.
    ($dev,$ino,$mode,$nlink) = stat('.') unless $nlink;

    opendir(DIR,'.') || die "Can't open $dir";
    local(@filenames) = readdir(DIR);
    closedir(DIR);

    if ($nlink == 2) {          # this dir has no subdirectories
        for (@filenames) {
            next if $_ eq '.';
            next if $_ eq '..';
            print "$dir/$_\n";
        }
    }
    else {                      # this dir has subdirectories
        for (@filenames) {
            next if $_ eq '.';
            next if $_ eq '..';
            $name = "$dir/$_";
            print $name,"\n";
            ($dev,$ino,$mode,$nlink) = lstat($_);
            next unless -d _;
            chdir($_) || die "Can't cd to $name";
            &dodir($name,$nlink);
            chdir '..';
        }
    }
}

Larry Wall
lwall@jpl-devvax.jpl.nasa.gov