[comp.binaries.ibm.pc.d] Zoo questions

malpass@vlsi.ll.mit.edu (Don Malpass) (09/05/88)

I just started to kick the tires of msdos-zoo to see if it might be
THE ANSWER to the need for a compressing backup system since it
saves path names.
   1.  Page 4 of zoo.man describes the "I" modifier which I thought
might allow a file of SUBDIRECTORIES to be fed in via standard input.
I seem to find that if complete file names are listed without
wildcards, things work fine, but wildcard expansion after the path
doesn't take place:
  e:/diskram/l.bat        works fine, but...
  f:/desmet/include/*     or
  g:/vi/bugs/*.*          result in "Zoo: ERROR: Could not open ..."
What am I missing?  Generating the file of all subdirectories
is trivial with "ls -R1c" and "grep" (I've made my dos look as much
like UNIX as possible), but prepending each file name with its complete
path is more of a pain.  Hacking up my CATADD file-catalog program
to generate such a giant file would probably be the easiest way to go.
But is that necessary?
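For what it's worth, here is a rough sketch of how I'd generate the complete-path list without hacking CATADD, assuming a Unix-style "find" is available in the toolkit (the script name is made up, and I haven't tried this against msdos-zoo itself):

```shell
#!/bin/sh
# fullpaths.sh -- print every plain file under the named directories
# with its complete path, one per line: the shape of list that zoo's
# "I" modifier reads from standard input.
for dir in "$@"; do
    find "$dir" -type f -print
done
```

Something like "sh fullpaths.sh e:/diskram f:/desmet/include | zoo aI backup" ought to sidestep the wildcard problem entirely, since every name arrives already expanded.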
   2.  Page 9, under "zoo P{EQq} archive", says "A unique temporary file
in the current directory is used ...", but the temporary file was actually
created in the same directory as the .zoo file.  (I happened to run zoo
from the directory containing the files I wanted archived to a floppy.)
I think this is probably a documentation error.  Page 10 explains that the "."
modifier of the Pack command answers the need for packing .zoo files
larger than half the size of the disk containing them.  (I'd be happier
if the .bak were in the current directory and the packed .zoo ended
up where its father had been.)
   This dilemma of backup automation vs. the finite space of floppy
disks needs serious thought in any extension of zoo, along with its
habit of making its own .bak files for safety.  Even if I made a
"path-name file", I'd have to estimate resulting .zoo file size and
chop things into diskette-size pieces.  The creation of 10K "files"
distributed over 7 floppies has always set my teeth on edge.
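As a stopgap for the size estimation, something along these lines might do, assuming Unix-like awk and a file list with "size pathname" on each line (e.g. cut down from "ls -l" output).  The 360000 limit is just my 360K-floppy guess, and pathnames with spaces would break it:

```shell
#!/bin/sh
# chunklist.sh -- read "size pathname" lines on stdin and split them
# into numbered list files (list.1, list.2, ...) whose raw sizes each
# stay under LIMIT bytes.  Compression shrinks the real .zoo pieces
# further, so this only guarantees a fit; it does not pack each
# floppy tightly.
LIMIT=${1:-360000}
awk -v limit="$LIMIT" '
    BEGIN { n = 1 }
    { if (total + $1 > limit && total > 0) { n++; total = 0 }
      total += $1
      print $2 > ("list." n) }
'
```

Each list.N could then be fed back through the "I" modifier to build one diskette-sized .zoo apiece.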
      The reason I don't simply use Fastback is that I have a Heath
Z-100, and FB isn't available for the Z-100.  At work I have an
Ethernet connection from my AT-clone to a Sun (= infinite storage), and
zoo would provide a nicer backup scheme than the one PC-NFS has,
although that one is painless except for the file space a full backup
takes.  Having the ability in Unix to extract from the MS-DOS .zoo
backup files would be a bonus, though.
   BTW, hats off to Rahul Dhesi for zoo.  I'll still use arc for lots
of things, but I like zoo.  [PLEASE folks, don't take this as an excuse
to fire up the lengthy arc/zoo/pk-arcument again!]
-- 
Don Malpass   [malpass@LL-vlsi.arpa],  [malpass@spenser.ll.mit.edu] 
  My opinions are seldom shared by MIT Lincoln Lab, my actual
    employer RCA (known recently as GE), or my wife.