lwv@n8emr.UUCP (Larry W. Virden) (02/21/89)
I want to take the output of a find dir -criteria_of_various_sort -print and
massage it into a smaller list of file names which have no overlap.  Files
which have two or more hard links would resolve into a single file name;
files with soft links would resolve into their hard link path name (before
the previous resolution takes place?).  Does anyone know of a program or an
easy set of filters to do this sort of thing?  I would think it would be
quite useful if one does backups using tar or cpio and doesn't want
duplication of various files - my use is for manipulating these files via
grep, etc., after the reduction of the name space.
-- 
Larry W. Virden    674 Falls Place, Reynoldsburg, OH 43068    (614) 864-8817
75046,606 (CIS) ; LVirden (ALPE) ; osu-cis!n8emr!lwv (UUCP)
osu-cis!n8emr!lwv@TUT.CIS.OHIO-STATE.EDU (INTERNET)
The world's not inherited from our parents, but borrowed from our children.
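The soft-link half of the request can be handled before any hard-link
reduction by resolving each name to its real path.  The sketch below is not
from the thread: it assumes GNU coreutils (readlink -f), which postdate this
posting, and dir / -criteria_of_various_sort are the poster's placeholders.
The remaining hard-link duplicates can then be collapsed by inode, as in the
reply that follows.

    # Hypothetical sketch: resolve soft links onto their targets, then
    # drop names that resolve to the same path.  Assumes GNU coreutils;
    # readlink -f follows every symbolic-link component of the path.
    find dir -criteria_of_various_sort -print |
    while IFS= read -r name; do
        readlink -f -- "$name"
    done |
    sort -u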
tif@cpe.UUCP (02/22/89)
Written 2:10 pm Feb 20, 1989 by n8emr.UUCP!lwv in cpe:comp.sources.w
>I want to take the output of a find dir -criteria_of_various_sort -print and
>massage it into a smaller list of file names which have no overlap.  Files

Off the top of my head, try something like this:

    find dir -whatever -print | xargs ls -i | sort -n -u -2 | sed 's/^[^ ]*//'

At least this eliminates the System V style links (I don't know anything
about other types of links) by sorting uniquely on the inode number.
Note: it won't work quite right if find goes across to a different
filesystem.

Paul Chamberlain
Computer Product Engineering, Tandy Corp.
{killer | texbell}!cpe!tif
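One possible way around the cross-filesystem caveat, sketched here with GNU
find's -printf (a later extension, not part of the 1989 System V find), is to
include the device number in the sort key so that identical inode numbers on
different filesystems are not merged.

    # Hypothetical sketch, assuming GNU find: print device, inode and path,
    # keep one line per (device, inode) pair, then strip the numbers.
    find dir -whatever -printf '%D %i %p\n' |
    sort -k1,1n -k2,2n -u |
    cut -d' ' -f3-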