chen@digital.sps.MOT.COM (Jinfu Chen) (01/10/90)
I did a quick and dirty comparison of four compression programs on a UNIX box. Here is the result:

          Program    Size    Cmp Size  Ratio  Comp. Time  Decmp. Time  Comp. Speed  Decomp. Speed
                     byte    byte      %      second      second       byte/second  byte/second
(Text)    lharc      961244  253799    73.6   353.8       63.8         2717         15067
          compress   961244  302096    68.6   29.4        22.6         32695        42533
          zoo        961244  368158    61.7   85.9        48.2         11190        19943
          arc        961244  425193    55.8   163.4       111.0        5883         8660
(Binary)  lharc      293489  130670    55.5   101.7       32.3         2886         9086
          compress   293489  167001    43.1   13.5        9.2          21740        31901
          zoo        293489  170255    42.0   33.8        17.2         8683         17063
          arc        293489  180114    38.6   55.7        37.8         5269         7764

Text file:   xlib document (nroff output)
Binary file: mush (Apollo m68k COFF)

Program versions: compress 5.1, lharc V0.03 (Beta), arc 5.21, zoo 2.01

Observations:

o In terms of compressed size, lharc ranks on top: 5-12% better than compress, 12-13% better than zoo, and 17-18% better than arc (5.21).

o In terms of compression speed, compress rates much higher than the rest: on the order of 2.5-3 times faster than zoo, 4-6 times faster than arc (5.21), and 8-12 times faster than lharc.

o In decompression mode, the speed differences are smaller than in compression mode. Compress still rates the fastest: 2 times faster than zoo, 2.5-3 times faster than lharc (not bad), and 4-5 times faster than arc (5.21).

There's no doubt that people who use online services, for whom transfer time is the main concern, will love lharc, given that it compresses over 10% better than the other methods. In situations where storage space and processing time must be balanced, compress is probably the choice.

I read in the README file of arc 6.0 (ST version) that 6.0 is quite fast compared to 5.21. However, the compression ratio changes very little. I doubt even arc 6.0 can beat the other three programs in both speed and compression.
Of course there are other issues involved when comparing these programs, such as support for comments and the ability to include multiple files and/or directories. For now I simply ignore them. In case anyone is interested, the tests were run on an Apollo DN3000 workstation (12 MHz 68020/68881) with 8M memory (OS SR10.2), which is approximately a 1.9 MIPS machine (compared to the 0.8-0.9 MIPS 8 MHz 68000 ST). Run times are reported as user time.
cr1@beach.cis.ufl.edu (Christopher Roth) (01/10/90)
What we have here is a lot of different archive methods being used. It would be nice if we could use one standard, and I would vote for Zoo.

--
=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=
* Christopher Roth                       * "Machines have no
* InterNet : cr1@beach.cis.ufl.edu       *  Conscience..."
=-=-=-=-=-=-=-=-=-=-=-=-=-Post No Bills-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=
jvance@ics.uci.edu (Joachim Patrick Vance) (01/11/90)
In article <21701@uflorida.cis.ufl.EDU> cr1@beach.cis.ufl.edu (Chris Roth) writes:
>What we have here is a lot of different archive methods being used.
>It would be nice if we could use one standard, and I would vote for
>Zoo.

I would rather have someone develop a universal archiving tool which could dearchive many different formats, and perhaps archive in all those formats as well. There are trade-offs among all the compression methods, and I would rather not lose the freedom of choice. Does someone want to develop such a universal archiving tool? Is it worth it? I for one would have a use for such a program.

--
Joachim
~~~~~~~~~~~~~~~~~~~~~~ What do my .sig and UCI have in common?       |
| jvance@ics.uci.edu | - - - - - - - - -                             |
~~~~~~~~~~~~~~~~~~~~~~ They're both Under Construction Indefinitely. |
tim@brspyr1.BRS.Com (Tim Northrup) (01/11/90)
I have come across a somewhat serious deficiency with LHARC: archives which contain directories cannot be transported from UNIX to the ST (the ST versions of LHARC and UNLZH do not convert from / to \). I assume things do not work in the other direction either, but I have not tested that yet. I will forward this to the folks who handle the ST and UNIX versions that I have, but I think everyone here should be aware of the problem. Other than that, the disk savings of lharc are pretty impressive.

-- Tim

--
Tim Northrup          +------------------------------------------+
GEnie: T.Northrup     | UUCP: uunet!crdgw1!brspyr1!tim           |
Air Warrior: "Duke"   | ARPA: tim@brspyr1.BRS.Com                |
                      +------------------------------------------+
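[The separator mismatch above is a one-line fix in an extractor. A hypothetical helper, not taken from the actual LHARC or UNLZH source, sketches the idea: translate the separator stored in the archive header into the host's convention before creating the file.]

```c
#include <string.h>

/* Hypothetical helper: rewrite every occurrence of the archive's
 * path separator with the host's.  A UNIX-built archive stores '/',
 * so an ST extractor would call fix_separators(name, '/', '\\')
 * before opening the output file (and the reverse going the other
 * way). */
static void fix_separators(char *path, char from, char to)
{
    for (; *path != '\0'; path++)
        if (*path == from)
            *path = to;
}
```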
rs0@beach.cis.ufl.edu (Bob Slaughter) (01/11/90)
In article <21701@uflorida.cis.ufl.EDU> cr1@beach.cis.ufl.edu (Chris Roth) writes:
>What we have here is a lot of different archive methods being used.
>It would be nice if we could use one standard, and I would vote for
>Zoo.

I would _really_ love to see a super-smart program that can use any of these formats, and allow you to unarchive _anything_. Then the choice of archive format would become less of a problem in the present and the future.

--
* Bob Slaughter                            * This space for rent *
* InterNet#1: rs0@beach.cis.ufl.edu        * Call 1-800-FOR-RENT *
* InterNet#2: Haldane@Pine.Circa.Ufl.Edu   * Model Railroading   *
* Bitnet:     Haldane@UFPine               * is Fun!!            *
chen@digital.sps.mot.com (Jinfu Chen) (01/12/90)
In article <21723@uflorida.cis.ufl.EDU> rs0@beach.cis.ufl.edu () writes:
>
>I would _really love to see a super-smart program that can use any of
>these formats, and allow you to unarchive _anything_. Then the choice
>of archive format would become less of a problem in the present and
>the future.
>

It's possible to write such a program. However, it's pretty hard to write one program that handles multiple formats itself at a comfortable speed. It's much easier to write a program that identifies the compression format and calls the appropriate compression program to handle the data file.

Under UNIX this is trivial, as each of these four compression schemes has a unique 'magic' number. The UNIX command `file' can be used for this. If you add the following lines to the 'magic' file (/etc/magic), the command should be able to identify the format of a compressed file:

    0       short   0x1f9d          compress(l) output
    0       byte    0x1a            arc(l) archive output
    2       string  -l              lharc(l) archive output
    20      long    0xdca7c4fd      zoo(l) archive output

Note that the magic number given for zoo is for Motorola 680x0 chips; the Intel chips and the VAX family (?) store the long integer in the opposite byte order, so there the number should be 0xfdc4a7dc instead.

I'm not 100% sure the lharc magic string covers every format of lharc. Inside lharc.c, four of them are listed:

    #define LZHUFF0_METHOD  "-lh0-"
    #define LZHUFF1_METHOD  "-lh1-"
    #define LARC4_METHOD    "-lz4-"
    #define LARC5_METHOD    "-lz5-"

Alternatively, one can add this information to the magic file so the `file' command can even tell you which archive method lharc uses.
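[The same checks the /etc/magic entries above perform can be done directly in C. A minimal sketch, using only the offsets and values from the post; the zoo test compares the long at offset 20 byte-by-byte so it works regardless of host byte order, and the lharc test only checks the "-l" prefix, so it matches all four method strings listed:]

```c
#include <stdio.h>
#include <string.h>

/* Return a short description of the compression format of an open
 * file, based on the magic numbers quoted in the post.  Checks are
 * ordered so the single-byte arc signature cannot shadow compress. */
static const char *identify(FILE *fp)
{
    unsigned char buf[24] = {0};
    size_t n = fread(buf, 1, sizeof buf, fp);

    if (n >= 2 && buf[0] == 0x1f && buf[1] == 0x9d)
        return "compress output";
    if (n >= 1 && buf[0] == 0x1a)
        return "arc archive";
    if (n >= 4 && buf[2] == '-' && buf[3] == 'l')   /* -lh?- / -lz?- */
        return "lharc archive";
    if (n == 24 && buf[20] == 0xdc && buf[21] == 0xa7 &&
        buf[22] == 0xc4 && buf[23] == 0xfd)         /* 0xdca7c4fd */
        return "zoo archive";
    return "unknown";
}
```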
covertr@force.UUCP (Richard E. Covert) (01/12/90)
In article <21723@uflorida.cis.ufl.EDU>, rs0@beach.cis.ufl.edu (Bob Slaughter) writes:
> In article <21701@uflorida.cis.ufl.EDU> cr1@beach.cis.ufl.edu
> (Chris Roth) writes:
> >What we have here is a lot of different archive methods being used.
> >It would be nice if we could use one standard, and I would vote for
> >Zoo.
>
> I would _really love to see a super-smart program that can use any of
> these formats, and allow you to unarchive _anything_. Then the choice
> of archive format would become less of a problem in the present and
> the future.

Actually, a SuperSmart Do It All Archiver isn't needed. A friend of mine named Gerry Szekely of Detroit, Michigan wrote a shareware program called "ARCIT SHELL" which handles ARC, LHARC, ZOO, and ZIP, all transparently from the same dialogue box. You select a button for the desired archiving format, and then ARCIT SHELL uses other buttons to build a command line for the desired program. I use it all the time. And then there is the famous Charles Johnson's ARC SHELL program, which is great for ARC and LHARC files. And there is a new LZH dearchiver called UNLZH12 which is super fast at extracting files.

So, as long as there are excellent shells (GEM-based shells, I mean), I don't see why we need an overly complex do-it-all archiver. I would rather see a sharp programmer spend time improving the existing LZH program. That would be of more immediate benefit to more users than a do-it-all program.

Another point is that a do-it-all program would freeze the archivers into a static state. I don't think the ARC, LZH, ZOO, and ZIP updates would be as easily incorporated into a do-it-all program as into a shell. A shell can normally use a newer archiver without change: witness the fact that ARC SHELL 2.0 worked fine with ARC 6.02, though without the new features of ARC 6.02. Then Charles Johnson updated ARC SHELL to ARC SHELL 2.1b, and now it works fine with ARC 6.02. Pretty simple and quick.
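[The shell approach described above amounts to a table lookup plus a command line. A minimal sketch, assuming an illustrative extension-to-command table that is not taken from ARCIT SHELL or ARC SHELL:]

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Map a file name to the command template of the matching external
 * archiver.  The table entries are illustrative assumptions. */
static const char *tool_for(const char *file)
{
    static const struct { const char *ext, *cmd; } tools[] = {
        { ".arc", "arc x %s"   },
        { ".lzh", "lharc x %s" },
        { ".zoo", "zoo x %s"   },
    };
    const char *dot = strrchr(file, '.');
    size_t i;

    if (dot != NULL)
        for (i = 0; i < sizeof tools / sizeof tools[0]; i++)
            if (strcmp(dot, tools[i].ext) == 0)
                return tools[i].cmd;
    return NULL;                        /* unknown format */
}

/* Build the command line and hand the work to the real archiver. */
static int unpack(const char *file)
{
    char cmdline[256];
    const char *cmd = tool_for(file);

    if (cmd == NULL || strlen(file) > 200)
        return -1;
    sprintf(cmdline, cmd, file);        /* e.g. "zoo x game.zoo" */
    return system(cmdline);
}
```

Because the shell never decodes an archive itself, swapping in a newer archiver is just an edit to the table, which is exactly the update-friendliness argued for above.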
So, for all of the above reasons, I don't see a burning need for a do-it-all archiver. With the two fine shells mentioned above, we can work with any archiver with ease now!!

--
Richard E. Covert, Lead Engineer of Software Tools Group
AG Communications Systems, Phoenix AZ
(602) 581-4652
TCP/IP: covertr@gtephx
UUCP: {ncar!noao!asuvax | uunet!zardoz!hrc | att}!gtephx!covertr
perand@nada.kth.se (Per Andersson) (01/13/90)
In article <21723@uflorida.cis.ufl.EDU> rs0@beach.cis.ufl.edu () writes:
>In article <21701@uflorida.cis.ufl.EDU> cr1@beach.cis.ufl.edu
>I would _really love to see a super-smart program that can use any of
>these formats, and allow you to unarchive _anything_. Then the choice
>of archive format would become less of a problem in the present and
>the future.

And of course it should work under Unix (BSD and SysV), the Amiga, the ST, the PC, and the Mac. This is unfortunately a utopia. But things are getting better: Zoo works on all these systems, arc does too, and lharc at least works on BSD, the ST, and the PC. You can then build a menu system around them to get your program. If this supersmart program had been feasible, there would have been enough coordination not to have umpteen compression formats in the first place.

Just ramblin' on,
Per
--
---
Per Andersson
Royal Institute of Technology, Stockholm, Sweden
perand@admin.kth.se, @nada.kth.se