jones@uv4.eglin.af.mil (Calvin Jones, III) (10/16/89)
Terminal Entry <aliu@ohe-sun2.usc.edu> writes:

>PS: Why don't people use entire-disk archivers when archiving a program
>   that's a full disk, instead of making our life harder with zoo? -> ;)

ZOO has the capability to archive an entire disk, retaining the complete
directory structure within the archive.  The archive should then be
expanded with the X// switch to create the original structure.

Doesn't anyone read the docs anymore?  But my kid tells me, "If it's a
GOOD video game ya don't need the instructions, Dad!"  8-)

 --- Cal

 //  Cal Jones - Internet: <Jones@UV4.Eglin.AF.Mil> or
\X/  BBS: 904-243-6219  1200-9600HST  340Meg, all Amiga
---------------------------------------------------------------------
NW Florida's first Amiga BBS running on NW Florida's FIRST AMIGA!
petersen@uicsrd.csrd.uiuc.edu (Paul Petersen) (10/16/89)
In article <1737@nigel.udel.EDU> jones@uv4.eglin.af.mil (Calvin Jones, III) writes:
>Terminal Entry <aliu@ohe-sun2.usc.edu> writes:
>
>>PS: Why don't people use entire-disk archivers when archiving a program
>>  that's a full disk, instead of making our life harder with zoo? -> ;)
>
>ZOO has the capability to archive an entire disk, retaining the complete
>directory structure within the archive.  The archive should then be
>expanded with the X// switch to create the original structure.

I've done this with several disks I've put up for anonymous FTP, and I
almost always get requests to break up the file into more "manageable"
chunks such as 250K or less per file.  The rationale usually is a less
than perfect tcp/ip link, or the need to get only one archive member, or
disk quotas on the receiving end that preclude transferring large
amounts.  It seems similar reasons would apply to other places such as
BBSs.

-Paul Petersen

University of Illinois, Urbana-Champaign
Center for Supercomputing Research and Development
UUCP:     {uunet,convex}!uiucuxc!uicsrd!petersen
INTERNET: petersen@uicsrd.csrd.uiuc.edu
ARPANET:  petersen%uicsrd@uxc.cso.uiuc.edu
CSNET:    petersen%uicsrd@uiuc.csnet
BITNET:   petersen@uicsrd.csrd.uiuc.edu
Sullivan@cup.portal.com (sullivan - segall) (10/17/89)
>Terminal Entry <aliu@ohe-sun2.usc.edu> writes:
>
>>PS: Why don't people use entire-disk archivers when archiving a program
>>  that's a full disk, instead of making our life harder with zoo? -> ;)
>
>ZOO has the capability to archive an entire disk, retaining the complete
>directory structure within the archive.  The archive should then be
>expanded with the X// switch to create the original structure.

Whole disk archivers obviously must save the original directory
structure as well.

>Doesn't anyone read the docs anymore?  But my kid tells me, "If it's a
>GOOD video game ya don't need the instructions, Dad!"  8-)

Total non-sequitur.  My gripes about whole disk archivers are somewhat
more pragmatic.  I hate archives that I can't pull individual files out
of.  I also hate archives that I can't dump to hard disk.  I would
personally prefer that all archives that didn't have to be put in WARP
form, weren't.

____ Mail: Sullivan Segall  Sullivan@cup.portal.com ____
maniac@arrakis.nevada.edu (ERIC SCHWERTFEGER) (10/17/89)
>PS: Why don't people use entire-disk archivers when archiving a program
>   that's a full disk, instead of making our life harder with zoo? -> ;)

Several reasons, actually.  First, I know some disk archivers will warn
you if there is a bootblock virus on the disk being warped, but that can
be overridden, and I haven't seen a disk archiver that warns the
un-archiving person.

Second, I have a 32 Meg hard disk and 2 Megs of ram.  I do not like
being told that I have to unarchive to a floppy rather than ram or hard
disk.

Finally, you can do a lot more with zoo.  You can list the files in the
archive or extract specific files.  All in all, I prefer the versatility
of zoo to a disk archiver.  And yes, when I archive an entire disk, I do
set up the file(s) so that the x// option will put everything back where
it belongs.

Eric Schwertfeger
maniac@arrakis.nevada.edu
ba@m2-net.UUCP (Bill Allen) (10/25/89)
-- 
---------------------------------------------------------
Reply-To: ba@m2-net.UUCP (Bill Allen Beogelein)
M-NET, Ann Arbor, MI
or call Amiga Shareware HQ at 313-473-2020
24hrs, 130meg, 100% Amiga
---------------------------------------------------------
antunes@ASTRO.PSU.EDU (Alex Antunes) (11/06/89)
Hi!  Okay, I can handle .lzh, .zuu, .zoo, .arc, .wrp, .pak, "a few
others"...  My problem is the ".Z" files -- what do you use to unpack
them?  Actually, does anybody have a complete list of all the standard
appendages for all the archiving programs out there?  Sorry if this is a
repeat question, but some stuff isn't covered in the intro documents!

------------
Sandy Antunes  "the Waupelani Kid"  'cause that's where I live...
antunes@astro.psu.edu  Penn State Astronomy Dept
------------
"sleep is for the weak and sickly"
------------
jac@muslix.llnl.gov (James Crotinger) (11/06/89)
  The .Z files are compressed with the UNIX compress command, and can
be uncompressed with the same (or with uncompress).  Unfortunately I'm
not aware of a P.D. version of compress for the Amiga that can handle
16 bit compression, which is the default if the file was compressed on
a UNIX system.  I have a version that does this, but it came with
A-Talk III and is probably not PD.

  Jim
aliu@girtab.usc.edu (Terminal Entry) (11/06/89)
In article <8911052127.AA03986@astro.psu.edu> antunes@ASTRO.PSU.EDU (Alex Antunes) writes:
>Hi! Okay, I can handle .lzh,.zuu,.zoo,.arc,.wrp,.pak, "a few others"...
>My problem is the ".Z" files-- what do you use to unpack them?
>Actually, does anybody have a complete list of all the standard appendages
>for all the archiving programs out there?  Sorry if this is a repeat
>question, but some stuff isn't covered in the intro documents!

Ending          Use
---------------------------------------------------------
.zoo            Zoo
.arc            Arc
.zip            Zip
.lzh            LHarc
.sit            Only available in "unsit" for Amiga.
.Z              Compress (Lempel-Ziv)
.C              Compact (Huffman).  Rather outdated.
.sh/.shr/.shar  SHell ARchiver (SHAR).
.wrp            Amiga disk-archiver
.tar            Tape archiver.
.sq             Sq/Unsq.  (forgot the proper name)
.bak            Matt Dillon's HD->Flippy backup util.

Can't think of any more.  To answer the question, the files with the .Z
ending are single files that have been compressed using the COMPRESS(1)
utility on Unix, which is also available on the Amiga.  (Faster to do it
on Unix tho!)
hjanssen@cbnewse.ATT.COM (hank janssen) (11/06/89)
This message is empty.
bdb@becker.UUCP (Bruce Becker) (11/06/89)
In article <37713@lll-winken.LLNL.GOV> jac@muslix.UUCP (James Crotinger) writes:
|
|  The .Z files are compressed with the UNIX compress command, and can
|be uncompressed with the same (or with uncompress).  Unfortunately I'm
|not aware of a P.D. version of compress for the Amiga that can handle
|16 bit compression, which is the default if the file was compressed on
|a UNIX system.  I have a version that does this, but it came with
|A-Talk III and is probably not PD.

The compress which comes with the Loftus Uucp/News system handles 16
bits.  It is a PD distribution as far as I know.

-- 
 .::.    Bruce Becker    Toronto, Ont.
 w \@@/  Internet: bdb@becker.UUCP, bruce@gpu.utcs.toronto.edu
 `/c/-e  BitNet:   BECKER@HUMBER.BITNET
_/  \_   Your Agrarian Distress Card - Don't heave loam without it...
mikes@lakesys.lakesys.com (Mike Shawaluk) (11/06/89)
In article <37713@lll-winken.LLNL.GOV> jac@muslix.UUCP (James Crotinger) writes:
>  The .Z files are compressed with the UNIX compress command, and can
>be uncompressed with the same (or with uncompress).  Unfortunately I'm
>not aware of a P.D. version of compress for the Amiga that can handle
>16 bit compression, which is the default if the file was compressed on
>a UNIX system.  I have a version that does this, but it came with
>A-Talk III and is probably not PD.

I had a program that is either PD or ShareWare, called "Mash", which was
a UN*X compress compatible program, and would either compress or expand
files.  Unfortunately, I don't know where it is anymore (I am about 6
months behind in cataloging my various download disks; one of these
days, for sure .. :-)

Also, if my memory serves me correctly, MRBackup (which is on at least
one Fish disk) compresses its files via the same format that UN*X
compress uses, and even uses the .Z extension (at least an older version
that I once looked at did this; I don't know about the current version,
which I haven't looked at).

BTW, I have no idea whether either of these Amiga programs supports all
of the possible options that UN*X compress does, especially relating to
the number of bits of Ziv-Lempel codes (i.e., 14 vs. 15 or 16, which
require more memory).

-- 
 - Mike Shawaluk                            "Rarely have we seen a mailer
 -> DOMAIN: mikes@lakesys.lakesys.com        fail which has thoroughly
 -> UUCP:   ...!uunet!marque!lakesys!mikes   followed these paths."
 -> BITNET: 7117SHAWALUK@MUCSD
tadguy@cs.odu.edu (Tad Guy) (11/06/89)
In article <37713@lll-winken.LLNL.GOV> jac@muslix.llnl.gov (James Crotinger) writes:
Unfortunately I'm not aware of a P.D. version of compress for the
Amiga that can handle 16 bit compression, which is the default if
the file was compressed on a UNIX system.
The source for the UNIX compress is freely distributable, and compiled just
fine for me with Lattice (this was a long time ago). The limitation of 12 bits
is due to memory size. If you have plenty-o-ram, you can compile uncompress to
handle 16 bits... (Other uncompress programs get around this in other ways).
...tad
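[The memory cost of wider codes, mentioned in several of these posts, is
easy to see: an LZW decoder keeps one string-table entry per possible
code, so each extra bit doubles the table.  A rough sketch follows; the
3-bytes-per-entry figure is an illustrative assumption, not taken from
the compress source.]

```shell
# Each extra bit of LZW code width doubles the string table.
# The 3-bytes-per-entry size is a rough assumption for illustration.
for bits in 12 13 14 15 16; do
  entries=$((1 << bits))
  echo "$bits-bit codes: $entries table entries (~$((entries * 3 / 1024))K)"
done
```

This is why 12-bit builds fit a small machine while a 16-bit table wants
an order of magnitude more memory, as Tad notes above.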
esker@abaa.uucp (Lawrence Esker) (11/07/89)
In article <37713@lll-winken.LLNL.GOV> jac@muslix.UUCP (James Crotinger) writes:
>  The .Z files are compressed with the UNIX compress command, and can
>be uncompressed with the same (or with uncompress).  Unfortunately I'm
>not aware of a P.D. version of compress for the Amiga that can handle
>16 bit compression, which is the default if the file was compressed on
>a UNIX system.
>
>  Jim

Mark Rinfret's (sp?) program called MRBackup is a backup utility that
uses the unix compress format.  All backed up files that are not
precompressed can be saved as .Z files on the floppy.  The restore
operation can do single files from any source, which will decompress any
.Z file.  It uses selectable 12 to 16 bit compression codes.

It is freely redistributable (maybe shareware).  The latest version I
know of is v3.3C, supposed to be minor bug fixes from v3.3, which is the
latest I have.  Quite a bargain: a compress program and HD backup all
rolled into one.

-- 
---------- Lawrence W. Esker ----------
Modern Amish: Thou shalt not need any computer that is not IBM compatible.
UseNet Path: __!mailrus!sharkey!itivax!abaa!esker  ==  esker@abaa.UUCP
swan@jolnet.ORPK.IL.US (Joel Swan) (11/07/89)
In article <37713@lll-winken.LLNL.GOV> jac@muslix.UUCP (James Crotinger) writes:
:
: The .Z files are compressed with the UNIX compress command, and can
:be uncompressed with the same (or with uncompress). Unfortunately I'm
:not aware of a P.D. version of compress for the Amiga that can handle
:16 bit compression, which is the default if the file was compressed on
:a UNIX system. I have a version that does this, but it came with
:A-Talk III and is probably not PD.
:
: Jim
(pulling from the memory bin)
There is an Amiga program called MASH by Justin V. McCormic (sp?) that is
similar to the Unix compress.  It also renames files with a .Z suffix.
Look for it at your local Amiga BBS, or I could maybe mail it to you.
(Should I post it?)  Oh, also forgot to say that it decompresses as well.
Joel
d87-khd@sm.luth.se (Karl-Gunnar Hultland) (11/08/89)
In article <6297@merlin.usc.edu> aliu@girtab.usc.edu (Terminal Entry) writes:
>In article <8911052127.AA03986@astro.psu.edu> antunes@ASTRO.PSU.EDU (Alex Antunes) writes:

######## Lines Deleted ##########

> Ending          Use
> ---------------------------------------------------------
> .lzh            LHarc
  ^
  |
Where can one get LHarc, and how does it compare to arc, zoo, warp, etc.?

Karl

<$><$><$><$><$><$><$><$><$><$><$><$><$><$><$><$><$><$><$><$><$><$><$><$><$><$>
<  Karl 'Kalle' Hultland   <$>                                               >
<  Dept. of Comp. sci.     <$>  d87-khd@sm.luth.se                           >
<  University of Lulea     <$>  {uunet,mcvax}!sunic.se!sm.luth.se!d87-khd    >
<  Sweden                  <$>                                               >
<$><$><$><$><$><$><$><$><$><$><$><$><$><$><$><$><$><$><$><$><$><$><$><$><$><$>
<  If two people agree on EVERYTHING, one of them is OBSOLETE!!              >
<$><$><$><$><$><$><$><$><$><$><$><$><$><$><$><$><$><$><$><$><$><$><$><$><$><$>
maniac@arrakis.nevada.edu (ERIC SCHWERTFEGER) (11/08/89)
> Where can one get LHarc and how is it compared to arc, zoo, warp etc.
>
> Karl

You can find LHarc 1.00 at xanth.cs.odu.edu (128.82.8.1) in the
incoming/amiga directory.  As far as how it compares with the others, it
is a standard file archiver rather than a disk archiver, and it has most
of the features of zoo, including long file names and (as of 1.00) full
path names.  It is slower than most archivers, but more effective than
anything else like it.  I have a 15 Meg subdirectory of zoo files that
is now only 11 Meg of .lzh files.

One important feature of LHarc 1.00 as opposed to zoo is that LHarc has
an option to recursively archive directories.

Eric Schwertfeger, UNLV, maniac@arrakis.nevada.edu
auyeung@iris.ucdavis.edu (Tak (UlTech) AuYeung) (11/08/89)
In article <984@unsvax.NEVADA.EDU> maniac@arrakis.nevada.edu.uucp (ERIC SCHWERTFEGER) writes:
> [... deleted]
> One important feature of LHarc 1.00 as opposed to zoo is that
>LHarc has an option to recursively archive directories.
>
>Eric Schwertfeger, UNLV, maniac@arrakis.nevada.edu

Zoo can recursively compress / decompress; use the "/" option.

--Tak
"everything's only what it seems to be"
>>>>> auyeung@iris.ucdavis.edu
>>>>> Tak-Ying (UlTech) AuYeung
>>>>> Logical Entity in Netland
maniac@arrakis.nevada.edu (ERIC SCHWERTFEGER) (11/08/89)
In article <5859@ucdavis.ucdavis.edu> auyeung@iris.ucdavis.edu (Tak (UlTech) AuYeung) writes:
>In article <984@unsvax.NEVADA.EDU> maniac@arrakis.nevada.edu.uucp (ERIC SCHWERTFEGER) writes:
>> [... deleted]
>> One important feature of LHarc 1.00 as opposed to zoo is that
>>LHarc has an option to recursively archive directories.
>
>Zoo can recursively compress / decompress, use the "/" option.
>
>--Tak

What version of Zoo do you have?  I have 2.01, and I have to specify
every directory myself.  Yes, zoo will store the full path name, but in
order to zoo the fonts directory and all subdirectories, for example,
you would need to "zoo a foobar fonts:*", then "zoo a foobar
fonts:topaz/*", etc., for each individual subdirectory.  lharc, on the
other hand, will automatically lharc all the directories if you
"lharc -r -x a foobar fonts:*"

Eric Schwertfeger, UNLV, maniac@arrakis.nevada.edu
aliu@girtab.usc.edu (Terminal Entry) (11/08/89)
In article <987@unsvax.NEVADA.EDU> maniac@arrakis.nevada.edu.uucp (ERIC SCHWERTFEGER) writes:
>
>What version of Zoo do you have?  I have 2.01, and I have to specify every
>directory myself.  Yes, zoo will store the full path name, but in order to
>zoo the fonts directory, and all subdirectories, for example, you would
>need to "zoo a foobar fonts:*" then "zoo a foobar fonts:topaz/*", etc, for
>each individual subdirectory.  lharc on the other hand will automatically
>lharc all the directories if you "lharc -r -x a foobar fonts:*"

I have 2.01 running on Unix (at least I think it is 2.01), and to store
whole directories I move to the top directory and send a:

    find * -print | zoo aI archivename

'find' will output a list with all the paths, which gets piped to zoo.

However, I haven't figured out how to do this on the Amiga yet.  Does
anyone know of a 'find' program for the Amiga?  I'm running Cshell
v3.03a.  Any other way?

 aliu@nunki.usc.edu
 Forwarded (MD)
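[The Unix-side pipeline described above can be sketched end to end.  The
demo tree below is made up for illustration, and the final zoo step is
left commented since zoo may not be installed; the find(1) part is
standard and shows exactly what zoo's "aI" (add, reading names from
stdin) mode receives.]

```shell
# Build a small demo tree, then list every file under it the way
# "find * -print" would feed zoo's aI (add-from-stdin) mode.
mkdir -p demo/fonts/topaz
touch demo/fonts/diamond.font demo/fonts/topaz/8
cd demo
find fonts -type f -print | sort      # the path list zoo would receive
# find * -print | zoo aI archivename  # hypothetical archiving step
```

Because zoo records each name exactly as piped in, extracting later with
"zoo x// archivename" recreates the same relative paths.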
tadguy@cs.odu.edu (Tad Guy) (11/08/89)
In article <984@unsvax.NEVADA.EDU> maniac@arrakis.nevada.edu (ERIC SCHWERTFEGER) writes:
> > Where can one get LHarc...
>
> you can find LHarc 1.00 at xanth.cs.odu.edu (128.82.8.1) in the
> incoming/amiga directory.  ...

You've been peeking!  That version of LHarc was contributed by Calvin
Jones <jones@eglin.af.mil> and has been moved to its correct home on
xanth as /amiga/lharc100.zoo.  It's no longer in /incoming/amiga (which
is a temporary location for stuff).

 ...tad
tadguy@cs.odu.edu (Tad Guy) (11/09/89)
In article <6337@merlin.usc.edu> aliu@girtab.usc.edu (Terminal Entry) writes:
> I have 2.01 running on Unix, (at least I think it is 2.01), and to
> store whole directories I move to the top directory and send a:
>
> find * -print | zoo aI archivename
>
> 'find' will output a list with all the paths, which gets piped to zoo.
>
> However, I haven't figured out how to do this on Amiga yet.  Anyone knows of
> a 'find' program for the Amiga?  I'm running Cshell v3.03a.  Any other way?

I had to find the same thing in order to package the BKDC#2 entries for
upload to xanth (more on the status of that in another letter...)  My
roommate (Hi Dave!) pointed out the ``fnams'' program from the early
days of comp.binaries.amiga.  It takes a directory name (with ending
slash) as an argument and prints a list of all the files under that
directory, recursively.  On xanth.cs.odu.edu, it's available as

    /usenet/comp.binaries.amiga/volume2/dos/fnams.uu1.Z

Since I have a lot of zoo'ing to do in order to get the demos packaged,
I made an alias (using the Dillon/Drew shell):

    alias zoodir "%i fnams >pipe:names $i/ ; zoo <pipe:names aI ram:$i.zoo"

Of course, in another window I'm using DNet to upload the previously
completed zoo.  Ahh, multitasking...

 ...tad
hamilton@intersil.uucp (Fred Hamilton) (11/09/89)
In article <5859@ucdavis.ucdavis.edu>, auyeung@iris.ucdavis.edu (Tak (UlTech) AuYeung) writes:
> In article <984@unsvax.NEVADA.EDU> maniac@arrakis.nevada.edu.uucp (ERIC SCHWERTFEGER) writes:
>>LHarc has an option to recursively archive directories.
>>
>>Eric Schwertfeger, UNLV, maniac@arrakis.nevada.edu
>
> Zoo can recursively compress / decompress, use the "/" option.

It (Zoo V2.00) has never RECURSIVELY packed files for me.  In fact, a
utility called "fnams" was created to allow you to zoo an entire disk or
subdirectory.

Fred Hamilton                Any views, comments, or ideas expressed here
Harris Semiconductor         are entirely my own.  Even good ones.
Santa Clara, CA
fnf@estinc.UUCP (Fred Fish) (11/09/89)
In article <984@unsvax.NEVADA.EDU> maniac@arrakis.nevada.edu.uucp (ERIC SCHWERTFEGER) writes:
>> Where can one get LHarc and how is it compared to arc,zoo warp etc.
>
> It is slower than most archivers, but more effective than anything
>else like it.  I have a 15 Meg subdirectory of zoo files that is now only
>11 meg of .lzh files.

Well, it looks like I will be sticking with zoo for my library, since
the copyright restrictions preclude my distributing the lharc program
(the restriction that no fee be charged except the cost of magnetic
media).  'Tis a pity, because there are times when I could have used the
extra compression to get something to fit on a single disk.

-Fred
-- 
# Fred Fish, 1835 E. Belmont Drive, Tempe, AZ 85284, USA
# 1-602-491-0048   asuvax!{nud,mcdphx}!estinc!fnf
vv77076@tut.fi (Vanhatupa Vesa) (11/09/89)
In article <6337@merlin.usc.edu> aliu@girtab.usc.edu (Terminal Entry) writes:
>I have 2.01 running on Unix, (at least I think it is 2.01), and to
>store whole directories I move to the top directory and send a:
>
>find * -print | zoo aI archivename
>
>'find' will output a list with all the paths, which gets piped to zoo.
>
>However, I haven't figured out how to do this on Amiga yet.  Anyone knows of
>a 'find' program for the Amiga?  I'm running Cshell v3.03a.  Any other way?

There is a very simple way to do it by using the ARP 1.3 "search"
command.  First you define an alias for find:

    alias find search FILES ALL []

And now you can do it like this:

    find . * | zoo aI archive[.zoo]

or like this:

    find dh0:usr/src/foo/ * | zoo aI archive[.zoo]

There is one thing I miss: a Unix-like (tcsh) foreach command.  Is there
such a program already?

-- 
-- Vesa Vanhatupa,                  ! Internet: vv77076@tut.fi
Tampere University of Technology,   ! UUCP:     vv77076@tut.uucp
Finland                             ! Bitnet:   vv77076@fintut.bitnet
blgardne@esunix.UUCP (Blaine Gardner) (11/10/89)
From article <6337@merlin.usc.edu>, by aliu@girtab.usc.edu (Terminal Entry):
> I have 2.01 running on Unix, (at least I think it is 2.01), and to
> store whole directories I move to the top directory and send a:
>
> find * -print | zoo aI archivename
>
> 'find' will output a list with all the paths, which gets piped to zoo.
>
> However, I haven't figured out how to do this on Amiga yet.  Anyone knows of
> a 'find' program for the Amiga?  I'm running Cshell v3.03a.  Any other way?

Thanks for the Unix tip!  On the Amiga I use a program called Fnams
(that's NOT a typo).  It was written specifically to feed a directory
structure to Zoo.  Typical use is:

    fnams >ram:tempfile df0:
    zoo <ram:tempfile aI archivename

The case is important, capital "I" meaning read stdin.  Of course pipes
could be used instead.  There is also a simple script file included with
Fnams called ZooAll that does it in one step.  With ZooAll in my S:
directory and the S protection bit set, I just:

    cd df0:
    zooall dh1:archivename

It would be nicer if Zoo did recursive directories itself, but this does
work well.  And of course to extract the files and create the
directories:

    zoo x// archivename

Fnams is on one of the Fish disks.  I'm not near my Amiga, so I can't be
more specific.

-- 
Blaine Gardner @ Evans & Sutherland   580 Arapeen Drive, SLC, Utah 84108
Here:  utah-cs!esunix!blgardne  {ucbvax,allegra,decvax}!decwrl!esunix!blgardne
There: uunet!iconsys!caeco!i-core!worsel!blaine (My Amiga running uucp)
OPUS LIVES!!!
GORRIEDE@UREGINA1.BITNET (Dennis Robert Gorrie) (11/10/89)
I just downloaded MRBackup 3.3.  The docs say that the utilities menu
provides the Amiga equivalent of unix COMPRESS, supporting the full
range from 12 to 16 bit compression.  It warns that 16 bit compression
takes a lot of time, and a LOT of memory.

MRBackup stores files in the .Z format when it does backups.  You can
then use MRBackup to extract them or compress others.

+-----------------------------------------------------------------------+
|Dennis Gorrie                            'Chain-Saw Tag...              |
|GORRIEDE AT UREGINA1.BITNET               Try It, You'll Like It!'      |
+-----------------------------------------------------------------------+
magik@sorinc.PacBell.COM (Darrin A. Hyrup) (11/10/89)
In article <987@unsvax.NEVADA.EDU> maniac@arrakis.nevada.edu (ERIC SCHWERTFEGER) writes:
>In article <5859@ucdavis.ucdavis.edu> auyeung@iris.ucdavis.edu (Tak (UlTech) AuYeung) writes:
>>Zoo can recursively compress / decompress, use the "/" option.
>
>What version of Zoo do you have?  I have 2.01, and I have to specify every
>directory myself.
> [... deleted]

Yes, Eric is right.  Under Amiga Zoo (as of the 2.0X releases) there is
no support for recursive directory storage.  One must either specify
directories by hand one at a time, or use a program such as pname or
something like that to store a recursive list of filenames with full
path extensions into a file which can be read by Zoo.

I have no idea if they are planning to confront this problem in a later
release (especially since 2.01 was released 2-3 years ago), but most
folks I know have switched over to LHarc for all purposes for this
reason, among the various other excellent reasons to do so.  The funny
thing is that UNIX Zoo handles recursive directory trees just fine, and
it's not hard at all to do directory tree traversal on the Amiga, so
there really isn't any excuse.

We now return you to your previously scheduled discussion.
-- 
Darrin A. Hyrup  // AMIGA Enthusiast      rencon!esfenn!dah
magik@sorinc.PacBell.COM  \X/  & Software Developer   pacbell!sorinc!magik
==========================================================================
"Speak little and well, if you wish to be considered as possessing merit."
paquette@cpsc.ucalgary.ca (Trevor Paquette) (11/16/89)
In article <VV77076.89Nov9133314@naakka.tut.fi>, vv77076@tut.fi (Vanhatupa Vesa) writes:
> In article <6337@merlin.usc.edu> aliu@girtab.usc.edu (Terminal Entry) writes:
> >I have 2.01 running on Unix, (at least I think it is 2.01), and to
> >store whole directories I move to the top directory and send a:
> >find * -print | zoo aI archivename
> >'find' will output a list with all the paths, which gets piped to zoo.

Talk about doing things the hard way...  On the Unix end, to get a dir
and all of its sub-dirs and files, do the following:

    zoo a zoofile * */* */*/* */*/*/*   etc.

    *       = all files in current directory (including actual dirs)
    */*     = all files in dirs 1 level below this one
    */*/*   = all files in dirs 2 levels below this one
    etc.

The same can be done on the Amiga, just use a shell that supports '*'.
Most seem to work with this incantation.  On the Amiga end, to extract
the whole thing, just give the following command:

    zoo x// zoofile

The same command on the Unix end will work as well.

________________________________/Luminous beings we are, not this crude matter
Trevor Paquette  ICBM:51'03"N/114'05"W|'She flies like darkness in the night,
{ubc-cs,utai,alberta}!calgary!paquette|  no mortal is safe within her sight. '
paquette@cpsc.ucalgary.ca             |     - ancient myth(?) of Esalon
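[The per-level globbing trick above can be seen in isolation with any
POSIX shell.  The demo tree names below are made up for illustration,
and the zoo step is left commented since zoo may not be installed; the
echoes show exactly which paths each pattern picks up.]

```shell
# Show what each glob level expands to in a small demo tree.
mkdir -p demo/a/b demo/c
touch demo/a/f1 demo/a/b/f2 demo/c/f3
cd demo
echo *        # prints: a c              - top-level entries (incl. dirs)
echo */*      # prints: a/b a/f1 c/f3    - one level down
echo */*/*    # prints: a/b/f2           - two levels down
# zoo a zoofile * */* */*/*              # hypothetical archive step
```

The catch, of course, is that you must supply one pattern per directory
depth, which is why a tree of unknown depth still wants find(1) or a
recursive archiver.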