[comp.protocols.tcp-ip.ibmpc] PC -> Unix backups???

hudgens@sun13.SCRI.FSU.EDU (Jim Hudgens) (11/02/90)

Does anyone know an easy way to back up DOS filesystems 
to a UNIX host using redistributable software like the 
Clarkson packet drivers, NCSA Telnet, etc.?  Something like PC-NFS 
would make this fairly easy, but it's not cheap.  

My current thought is to run an ftp-driver program on the Unix side, 
which connects to the PC via FTP, walks the DOS file system, 
extracts each file it finds in turn, tars it onto an archive (using
append mode), and then deletes the extracted copy from the UNIX side.
Not pretty, but it might work.  Someone posted a Perl program recently 
which does a good portion of the above.  
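For concreteness, the core of that ftp-driver could look something like the
following Python sketch (the host, login, and the directory walk are left out;
the helper names here are hypothetical, not from the posted Perl program):

```python
import ftplib
import io
import tarfile

def append_to_tar(tar_path, member_name, data):
    """Append one file's bytes to tar_path, creating the archive if
    needed ("a" is tarfile's append mode, like tar -r)."""
    info = tarfile.TarInfo(name=member_name)
    info.size = len(data)
    with tarfile.open(tar_path, "a") as tar:
        tar.addfile(info, io.BytesIO(data))

def archive_one(ftp, remote_name, tar_path):
    """RETR one file from the PC and fold it into the growing archive;
    nothing is left on the Unix side except the tar itself."""
    buf = io.BytesIO()
    ftp.retrbinary("RETR " + remote_name, buf.write)
    append_to_tar(tar_path, remote_name.lstrip("/"), buf.getvalue())
```

The driver would call archive_one() for each file it discovers while walking
the PC's directory tree over the FTP control connection.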

Does anyone know of a free (or very cheap) way to do something like
this easily?

--
Disclaimer:  I didn't do it.
Jim Hudgens		Supercomputer Computations Research Institute
hudgens@sun13.scri.fsu.edu

dpz@dimacs.rutgers.edu (David Paul Zimmerman) (11/08/90)

hudgens@sun13.SCRI.FSU.EDU (Jim Hudgens) writes:

>Does anyone know an easy way to back up DOS filesystems 
>to a UNIX host using redistributable software like the 
>Clarkson packet drivers, NCSA Telnet, etc.?  Something like PC-NFS 
>would make this fairly easy, but it's not cheap.  

I'm working on a scheme to do weekly backups of my administrative PCs' data.
Pros: It's simple and will be cheap to implement.  Cons: it uses 1:1 disk
space.

I'm only interested in backing up the data that they create locally, so I'm
configuring all of their programs to write data in a C:\DATA directory tree.
Once a week, they will run a script (actually, they will click on a Windows
3.0 icon that runs a script) called BACKUP.BAT.

BACKUP.BAT uses a freely available TAR.EXE (part of a PAX package that came
over comp.binaries.ibm.pc many moons ago grasshopper) to "tar" up C:\DATA into
C:\BACKUP\BACKUP.TAR (or some such name).  BACKUP.BAT then calls KA9Q,
configured to use the packet driver (I use the SLIP, NI5210, and NI9210 PDs),
with an AUTOEXEC.NET script that includes the line "start ftp".  This starts
the KA9Q FTP server.  The PC then sits there all night and waits for an
incoming FTP connection (using permissions that I've set up in the KA9Q
FTPUSERS file).

That much I have finished.  My next task in this project is to set things up
on a Unix box to pick up the tar file in the wee hours of the morning.  With a
batch FTP script (using "bftp", from USC), that should be trivial.
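That Unix-side pickup might look roughly like this (Python's ftplib stands in
for bftp here; the host, login, and paths are placeholders, and the sanity
check is my own addition, not part of the scheme above):

```python
import ftplib
import tarfile

def fetch_backup(host, user, password, remote, local):
    """Pull the night's BACKUP.TAR from the PC's KA9Q FTP server,
    in binary mode, into a local file."""
    ftp = ftplib.FTP(host)
    ftp.login(user, password)
    with open(local, "wb") as f:
        ftp.retrbinary("RETR " + remote, f.write)
    ftp.quit()

def transfer_ok(local):
    """Reject truncated or ASCII-mangled transfers before trusting
    the archive (or overwriting last week's copy)."""
    return tarfile.is_tarfile(local)
```

Run from cron in the wee hours; checking transfer_ok() first guards against a
PC that rebooted or dropped the link mid-transfer.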

All of this should ultimately cost me a total of nothing.

						David
-- 
David Paul Zimmerman                                     dpz@dimacs.rutgers.edu
Systems Programmer						    rutgers!dpz
Rutgers Univ Center for Discrete Math and Theoretical Computer Science (DIMACS)

HAROLD@UGA.CC.UGA.EDU (Harold Pritchett) (11/08/90)

On Thu, 8 Nov 90 02:33:03 GMT David Paul Zimmerman said:
>
>I'm working on a scheme to do weekly backups of my administrative PCs' data.
>Pros: It's simple and will be cheap to implement.  Cons: it uses 1:1 disk
>space.
>
>I'm only interested in backing up the data that they create locally, so I'm
>configuring all of their programs to write data in a C:\DATA directory tree.
>Once a week, they will run a script (actually, they will click on a Windows
>3.0 icon that runs a script) called BACKUP.BAT.
>
>BACKUP.BAT uses a freely available TAR.EXE (part of a PAX package that came
>over comp.binaries.ibm.pc many moons ago grasshopper) to "tar" up C:\DATA into
>C:\BACKUP\BACKUP.TAR (or some such name).  BACKUP.BAT then calls KA9Q,

It seems to me that the right solution here would be to use ZOO in place of
TAR.EXE.  ZOO provides the same function (packaging files together, retaining
the original directory structure, etc.) with the added advantage of compressing
the data, so that 1:1 storage space is no longer required.  ZOO is also
available for UNIX systems, so if necessary a single file can be extracted on
the backup server and downloaded, without having to send the whole archive back
to the PC just to pull out the one file desired.
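ZOO itself isn't shown here, but a quick stand-in illustration with Python's
zlib shows why any compressing archiver breaks the 1:1 storage requirement on
typical text data (while random data gains nothing):

```python
import zlib

def ratio(data):
    """Compressed size as a fraction of the original size."""
    return len(zlib.compress(data)) / len(data)
```

Repetitive files (word-processor documents, logs, source code) routinely
shrink to well under half their size; already-compressed or random data can
even grow slightly from the format's own overhead.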

>All of this should ultimately cost me a total of nothing.

Since ZOO is also in the public domain, this cost doesn't change...

Harold C Pritchett         |  BITNET:  HAROLD@UGA
BITNET TechRep             |    ARPA:  harold@uga.cc.uga.edu
The University of Georgia  |
Athens, GA 30602           |    fido:  1:370/60
(404) 542-3135             |     Bbs:  SYSOP at (404) 354-0817

david@WUBIOS.WUSTL.EDU ("David J. Camp") (11/09/90)

In Reply to this Note From: <Harold Pritchett>
>
>On Thu, 8 Nov 90 02:33:03 GMT David Paul Zimmerman said:
>>
>>I'm working on a scheme to do weekly backups of my administrative PCs' data.
>>Pros: It's simple and will be cheap to implement.  Cons: it uses 1:1 disk
>>space.
>>
[text deleted]
>It seems to me that the right solution here would be to use ZOO in place of
>TAR.EXE.  ZOO will provide the same function (package files together, retaining

I used to be a zoo aficionado until I learned about the public domain tar
program available in wuarchive:mirrors/msdos/filutl/tar.zip .  It will
do both tar and compress in one pass on MS-DOS.  Although I have not used
its compress facility much, the concept makes zoo rather obsolete.  I
have used the utility without compression, and it works fine.  -David-

david@wubios.wustl.edu             ^     Mr. David J. Camp
david%wubios@wugate.wustl.edu    < * >   +1 314 382 0584
...!uunet!wugate!wubios!david      v     "God loves material things."

dpz@dimacs.rutgers.edu (David Paul Zimmerman) (11/16/90)

Harold Pritchett writes:

> It seems to me that the right solution here would be to use ZOO in place of
> TAR.EXE.  ZOO will provide the same function (package files together,
> retaining the original directory structure, etc) with the added advantage of
> compressing the data so that a 1:1 storage space is no longer required.

Thanks for the suggestion... you are quite correct.  I've got about a 66%
overall compression rate now.  I also looked at PKZIP, which got me about a
75% compression rate.  I prefer PD to shareware, though, so I'll be sticking
with ZOO.  It also seems to win over both simple tar and tar-then-compress,
since it compresses in-stream.

				David
-- 
David Paul Zimmerman                                     dpz@dimacs.rutgers.edu
Systems Programmer						    rutgers!dpz
Rutgers Univ Center for Discrete Math and Theoretical Computer Science (DIMACS)

jbvb@FTP.COM ("James B. Van Bokkelen") (11/18/90)

    ....  I prefer PD to shareware, though, so I'll be sticking with ZOO.  It
    also seems to win over both simple tar and tar-then-compress, since it
    compresses in-stream.

Our 2.05 TAR has a switch which compresses the data on the fly, reading or
writing a .Z file on the server.  

James B. VanBokkelen		26 Princess St., Wakefield, MA  01880
FTP Software Inc.		voice: (617) 246-0900  fax: (617) 246-0901