ubi@ginger.sri.com (Ron Ueberschaer x4399) (03/03/90)
I'll admit it--I'm a packrat. I'm also the local radar imaging guru, so I've got a lot of freeware sources, data files, images, etc. cluttering up our disks. When it comes time to free up space, I realize that, according to someone's famous law, 80-90% of my disk usage is in 10-20% of the files. I usually resort to something like:

> cd
> du -s */ | sort -rn | head -20
> cd <biggest dir>
> du -s */ | sort -rn | head -20
> cd <biggest subdir>
> ls -sl | sort -rn | head -20

to get to the big non-directory files that I might decide I can do without. This is time-consuming and often confusing (du doesn't show which files are directories; ls doesn't show the grand-total for directories). What would be really nice is some sort of report script called, say, "bigoldfiles", which would do all this automatically. Perhaps something using du and find. I remember a program called "byteyears" which seemed pretty useful... Is there anything better?

Well how 'bout it, wizards?

--Ron Ueberschaer
  SRI International
  ubi@unix.sri.com
  ...!{hplabs,rutgers}!sri-unix!ubi
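[A minimal sketch of the kind of "bigoldfiles" report described above, built from stock find(1), sort(1), and head(1). The script name comes from the post; the default directory, age cutoff, and report length are placeholder assumptions, not anything given in the thread.]

    #!/bin/sh
    # bigoldfiles -- report the largest files that haven't been touched
    # in a while (sketch).  Usage: bigoldfiles [dir] [days] [count]
    dir=${1-.}        # directory tree to scan (default: current dir)
    days=${2-28}      # "old" = not modified within this many days
    count=${3-20}     # how many files to report

    # -type f skips directories; "ls -s" prints the size in blocks ahead
    # of the name, so a plain numeric reverse sort ranks biggest first.
    find "$dir" -type f -mtime +"$days" -exec ls -s {} \; |
        sort -rn |
        head -"$count"

[Running "ls -s" once per file is slow on a big tree; piping the names through xargs would speed it up, at the cost of a slightly messier pipeline.]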
merlyn@iwarp.intel.com (Randal Schwartz) (03/04/90)
In article <9733@unix.SRI.COM>, ubi@ginger (Ron Ueberschaer x4399) writes:
| What would be really nice is some sort of report script
| called, say, "bigoldfiles", which would do all this automatically.
| Perhaps something using du and find.

I just decide on my own "how old" and "how big" thresholds and use out-of-the-box find(1), a la:

$ find . -mtime +28 -size +1000 -ls

and then process the info in my head. Does that not get you the info you are looking for?

Just another UNIX hacker,
--
/=Randal L. Schwartz, Stonehenge Consulting Services (503)777-0095 ==========\
| on contract to Intel's iWarp project, Beaverton, Oregon, USA, Sol III      |
| merlyn@iwarp.intel.com ...!any-MX-mailer-like-uunet!iwarp.intel.com!merlyn |
\=Cute Quote: "Welcome to Portland, Oregon, home of the California Raisins!"=/
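[If eyeballing the raw listing gets tedious, the same one-liner can feed sort(1) and head(1) to rank the hits by size. The field offset below assumes the byte count sits in the seventh column of find -ls output, which varies between systems and may need adjusting.]

    $ find . -mtime +28 -size +1000 -ls | sort -rn +6 -7 | head -20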