[comp.sys.amiga] Compressed archive format

xanthian@zorch.SF-Bay.ORG (Kent Paul Dolan) (08/26/90)

Several years back, we made the transition from .arc to .zoo files,
and for a while there was a lot of trouble with stuff being archived
as .zoo's while the zoo program was not widely distributed and not
often available on host computer systems.  Eventually, it all got
straightened out, and .arc's gave way to .zoo's.

We're going through it again, but this time it's worse.

The .zoo format is being replaced by _two_ formats, .lhw and .lzh,
(and also by whatever PKAZIP writes, I suppose), but the archive
executables are not widely distributed, and host computer versions
(at least here) won't even compile and are several changes behind
the Amiga versions.

This is leading to a lot of confusion and reduced utility, which
is a shame, since the new archivers do seem to save a significant
amount of space.

Could some kind soul 1) do a good set of benchmarks to see which is
the most time/space efficient of the new archivers, and publish it
here; 2) find and distribute the widely ported source code for the
host executable; 3) find and distribute the source code and executables
for the Amiga versions; and, most important, 4) pick _one_ new
archiver/format we can all agree upon?

Maintaining a Chinese menu of archive software is not too useful,
and archivers without source to fix the obvious bugs are less than
stunningly helpful as well.

If there is someone really into pain, going back through the
abcfd20 archives and repacking everything into this common
format would really be helpful, but from my own experience
with a much more modest earlier form of those archives, the
task could take several _months_ of someone's free time, so
that's a lot to ask.

Thanks; I'm getting really frustrated trying to keep up with the
sudden profusion of archive formats.  Too much is coming past or
being put up for ftp that I couldn't unpack even if I could download
it, and not having a working version on my host system is the pits:
most of the file management stuff I do with zoo is suddenly not
available, so I have to download the headers and the uudecoded
contents separately rather than sticking my usual POSTER.source,
MAILER.source, or similar archival copy of the publication header
into the archive, as I consistently do with .zoo archives.

Kent, the man from xanth.
<xanthian@Zorch.SF-Bay.ORG> <xanthian@well.sf.ca.us>

JKT100@psuvm.psu.edu (JKT) (08/26/90)

In article <1990Aug25.180851.4401@zorch.SF-Bay.ORG>, xanthian@zorch.SF-Bay.ORG
(Kent Paul Dolan) says:
>
>Could some kind soul 1) do a good set of benchmarks to see which is
>the most time/space efficient of the new archivers, and publish it
>here; 2) find and distribute the widely ported source code for the
>host executable; 3) find and distribute the source code and executables
>for the Amiga versions; and, most important, 4) pick _one_ new
>archiver/format we can all agree upon?

No need to duplicate the effort of doing benchmarks and publishing
the results.... Look at the August 1990 issue of AmigaWorld, page 46.
There's an entire article comparing ARC, ZOO, LHARC, and PKAZIP.

                                                            Kurt
--
 -----------------------------------------------------------------------
|| Kurt Tappe   (215) 363-9485  || Amigas, Macs, IBM's, C-64's, NeXTs, ||
|| 184 W. Valley Hill Rd.       ||  Apple ]['s....  I use 'em all.     ||
|| Malvern, PA 19355-2214       ||  (and in that order too!   ;-)      ||
||  jkt100@psuvm.psu.edu         --------------------------------------||
||  jkt100@psuvm.bitnet  jkt100%psuvm.bitnet@psuvax1  QLink: KurtTappe ||
 -----------------------------------------------------------------------

plouff@kali.enet.dec.com (08/27/90)

In article <90237.220415JKT100@psuvm.psu.edu>, JKT100@psuvm.psu.edu (JKT) writes...
> 
>No need to duplicate the effort of doing benchmarks and publishing
>the results.... Look at the August 1990 issue of AmigaWorld, page 46.
>There's an entire article comparing ARC, ZOO, LHARC, and PKAZIP.
> 
>                                                            Kurt

Errr, the article has many flaws.  Chief among these are the author's 
ignorance of the software distribution methods used on the net (i.e. the 
importance of archiving tools on large systems), his obvious bias toward 
PKAZIP and his failure to mention the number of files and directories in 
his test packages.

He also claims that "one prominent developer [found] that LHARC sometimes
lost pieces of archived text files," but offers no details.  Anybody got 
an explanation for this _AmigaWorld_ howler?

-- 
Wes Plouff, Digital Equipment Corp, Maynard, Mass.
plouff@kali.enet.dec.com

Networking bibliography:  _Islands in the Net_, by Bruce Sterling
			  _The Matrix_, by John S. Quarterman

d88-mbe@sm.luth.se (Michael Bergman) (08/28/90)

xanthian@zorch.SF-Bay.ORG (Kent Paul Dolan) writes:

>We're going through it again, but this time it's worse.

>The .zoo format is being replaced by _two_ formats, .lhw and .lzh,
>(and also by whatever PKAZIP writes, I suppose), but the archive
>executables are not widely distributed, and host computer versions
>(at least here) won't even compile and are several changes behind
>the Amiga versions.

Well, .lhw and .lzh aren't comparable. Lhwarp is only used to compress
whole Amiga disks and is only available under AmigaDOS. Lharc
is available for mainframe (UNIX) machines - latest version 1.00 is
100% compatible with all other versions (including Amiga Lharc) The
source can be downloaded from a usenet archive: comp.sources.misc.
Compiles "as is" under SunOS and BSD 4.3.

>Could some kind soul 1) do a good set of benchmarks to see which is
>the most time/space efficient of the new archivers, and publish it
>here; 2) find and distribute the widely ported source code for the
>host executable; 3) find and distribute the source code and executables
>for the Amiga versions; and, most important, 4) pick _one_ new
>archiver/format we can all agree upon?

Benchmarks have been done. I have them on paper, but not in electronic format
I'm afraid so I can't repost them. Because lharc is so slow, the guy who
did this concluded that Zip is the overall most effective algorithm (I *think*)

However, Lharc generally packs better and you can always run it in the
background (as Amiga owners we're all familiar with that, right? :-)
If there really is a demand for these "benchmarks" I can type them in and
post them. Mail me if you want to see them. Or better still, the guy who
did them in the 1st place might read this and repost them :-)

To my knowledge there is no complete zip tool that runs under UNIX or VMS.
There is a BSD unzip though, I think I have the source to that on tape.

For those of you who want to convert .zoo to .lzh, I include a script written
by me (copy this as you wish and spread it!)

Michael


----cut here----cut here----cut here----cut here----cut here----
#! /bin/sh
#
# Reconstructs zoo archive(s) to lharc archive(s)
# using /tmp/ for temporary storage.
# The .zoo extension should not be given explicitly.
# Before use, check that the zoo archive(s) aren't damaged: zoo Lm *
#
# M. Bergman  1990-03-28
#
if [ $# -lt 1 ]
then
   echo "Usage: zoo2lzh <zoo archive file {zoo archive file}...>"
   echo "	You must have rw permission both to the zoo archive(s)"
   echo "	*and* the catalog(s) they are in."
   echo "	The .lzh file(s) will have the default permissons."
   echo ""
   exit 1
fi
for ARG do
   if [ -w $ARG.zoo -a -r $ARG.zoo ]
   then
      WD=`pwd`
      ARCHFILE=$ARG
      if [ `echo $ARG | awk '{print substr($1,0,1)}'` != / ]
      then
         ARCHFILE=$WD/$ARG
      fi
      TMPDIR=/tmp/.z2l#$$
      mkdir $TMPDIR
      chmod og-rwx $TMPDIR
      cd $TMPDIR
      echo ""
      echo "Reconstructing $ARCHFILE.zoo"
      zoo e// $ARCHFILE.zoo
      \rm $ARCHFILE.zoo
      FILES=`ls -A`
      echo $FILES
      lharc a $ARCHFILE.lzh $FILES
      cd $WD
      \rm -R $TMPDIR
   else
      echo ""; echo "zoo2lzh: $ARG.zoo nonexistent or limited permission"
   fi
done
echo ""


-- 
      Michael Bergman         Internet: d88-mbe@sm.luth.se
  //  Undergrad. Comp. Eng.   BITNET:   d88-mbe%sm.luth.se@kth.se
\X/   U of Lulea, SWEDEN      ARPA:     d88-mbe%sm.luth.se@ucbvax.berkeley.edu
			      UUCP:  {uunet,mcvax}!sunic.se!sm.luth.se!d88-mbe

sparks@corpane.UUCP (John Sparks) (08/28/90)

JKT100@psuvm.psu.edu (JKT) writes:

>>Could some kind soul 1) do a good set of benchmarks to see which is
>>the most time/space efficient of the new archivers, and publish it
>>here; 2) find and distribute the widely ported source code for the
>>host executable; 3) find and distribute the source code and executables
>>for the Amiga versions; and, most important, 4) pick _one_ new
>>archiver/format we can all agree upon?

>No need to duplicate the effort of doing benchmarks and publishing
>the results.... Look at the August 1990 issue of AmigaWorld, page 46.
>There's an entire article comparing ARC, ZOO, LHARC, and PKAZIP.

Other points besides speed are cost and usability.
Zoo and Lharc are freeware but
PKAZIP is shareware.  Remember the big stink about PKAZIP a while back
with Phil Katz saying he wasn't going to revise Amiga PKAZIP because no
one was registering it?  Also, it is only Workbench-runnable.  It takes up
a full window and can't be run from a command line (i.e. Pkazip -x file
won't work).

Lharc seems speed comparable with Zoo, maybe a bit slower at packing.
But it packs quite a bit smaller. I also have versions of Lharc that
run under MSDOS and Unix.

LHW is Lharc Warp, which is like Warp in that it packs disk tracks rather
than files.  This is good for packing entire floppies, but its disadvantage
is that there are no versions of LHW that work under Unix, so you can't get
an archive listing on a Unix host.  You have to download it to the Amiga
first to find out what is in the LHW archive.

When Lharc first became available, I hesitated to use it because I was used
to Zoo and I did not have a version of Lharc for Unix (where I store all
my archives).  But now that I have Lharc for Unix (it came across
comp.sources.unix a few months back) I am starting to like it.  It has all
of the advantages of Zoo, but packs smaller.  And judging from all of the
BBS's I call, it is fast becoming the standard in the Amiga world.


-- 
John Sparks         |D.I.S.K. Public Access Unix System| Multi-User Games, Email
sparks@corpane.UUCP |PH: (502) 968-DISK 24Hrs/2400BPS  | Usenet, Chatting,
=-=-=-=-=-=-=-=-=-=-|7 line Multi-User system.         | Downloads & more.
A door is what a dog is perpetually on the wrong side of----Ogden Nash

crazyrat@disk.UUCP (@jap) (08/28/90)

Hello, csamiga world.  Again I call for information from the net.  I am
trying to use MessyDOS, but am having problems mounting a 360K disk drive
as an MSH: device.  The drive mounts perfectly as DF3:, but MessyDOS
eats it.
What I need is help (if possible), but what I'd like even more is a way
to contact Olaf on the net.  I have his mailing address, but I'd like to
be able to fix this (or find out otherwise) without going to that extent.

Thanks for the help!

@jap

-- 
Joel C. Justen      Crazyrat Productions Ltd.
disc:  who cares?   CRAZYRAT@DISK.UUCP

xanthian@zorch.SF-Bay.ORG (Kent Paul Dolan) (08/29/90)

d88-mbe@sm.luth.se (Michael Bergman) writes:
> xanthian@zorch.SF-Bay.ORG (Kent Paul Dolan) writes:

>> We're going through it again, but this time it's worse.

>> The .zoo format is being replaced by _two_ formats, .lhw and .lzh,
>> (and also by whatever PKAZIP writes, I suppose), but the archive
>> executables are not widely distributed, and host computer versions
>> (at least here) won't even compile and are several changes behind
>> the Amiga versions.

> Well, .lhw and .lzh aren't comparable. Lhwarp is only used to compress
> whole Amiga disks and is only available under AmigaDOS.

Well, that makes it pretty useless for working in the larger context
of building archives/swapping stuff across non-Amiga hosts, and I
hope folks will stop submitting archives in that format, then.

> Lharc
> is available for mainframe (UNIX) machines - latest version 1.00 is
> 100% compatible with all other versions (including Amiga Lharc) The
> source can be downloaded from a usenet archive: comp.sources.misc.
> Compiles "as is" under SunOS and BSD 4.3.

Well, perhaps, but it is rife with BSDisms (and SunOS is a child of BSD),
and won't come close to compiling under (a fairly primitive) SYSV system;
it also assumes lots of ANSI C stuff is available, such as concatenating
string literals simply by putting them on separate lines:

	"garbage one"
	"garbage two"

which doesn't work in older C compilers.

I put a couple of hours into trying to port it to my current system, and
decided I was growing old a lot faster than the code was growing smart,
and trashed my work-in-progress.

Also, at least the Amiga version is buggy, in that it promises not to
store duplicates of the same pathname, but in fact does do so.

>> Could some kind soul 1) do a good set of benchmarks to see which is
>> the most time/space efficient of the new archivers, and publish it
>> here; 2) find and distribute the widely ported source code for the
>> host executable; 3) find and distribute the source code and executables
>> for the Amiga versions; and, most important, 4) pick _one_ new
>> archiver/format we can all agree upon?

> Benchmarks have been done. I have them on paper, but not in electronic
> format I'm afraid so I can't repost them.

Seeing them would be nice; another poster noted that (some) benchmarks
exist in the August AmigaWorld, but I'd rather not trust stuff printed
in that rag; it's mostly good for the ads.

> Because lharc is so slow, the guy who did this concluded that Zip is
> the overall most effective algorithm (I *think*)

I have the impression Zip is a proprietary format, and also not widely
ported to mainframes; is either of those right?  If so, that argues
against their use.

> However, Lharc generally packs better and you can always run it in the
> background (as Amiga owners we're all familiar with that, right? :-)

Well, I'll give up a _lot_ of time efficiency (like three or four to one)
for 5% better packing, since, as you note, I can background and ignore the
packing time, but I have to pay for the storage space in cold, hard cash,
and I pay for it in perpetuity.

> If there really is a demand for these "benchmarks" I can type them in and
> post them. Mail me if you want to see them. Or better still, the guy who
> did them in the 1st place might read this and repost them :-)

Well, you at least are following the conversation, so if no one beats
you to reposting them, and you find the time free, I for one would
appreciate the effort on your part.

> To my knowledge there is no complete zip tool that runs under UNIX or VMS.
> There is a BSD unzip though, I think I have the source to that on tape.

Bummer.  Lots of the stuff I pack up and download is not Amiga specific;
it is generic or Unix oriented code that I might like to port "someday".
A tool that works across a wide range of platforms, and is available in
source code form is a necessity if the current multiplicity of archivers
is to be replaced by a single tool.

Maybe I'll find time to rip the Huffman code out of lharc and replace it
with the more efficient arithmetic encoding, for another 5% gain, if I
can ever make sense out of the currently undocumented and uncommented
lharc 1.0 sources I have available from comp.sources.misc.  I suppose I
should descend to a paper copy, since the context on my screen isn't
enough to follow the code.  Shake a bit of the dust off my printer.  Wish
I had source for the Amiga version, so I could just drop in the arithmetic
encoding I already have working without having to reinvent all the file
walking stuff and flag processing.  Bleah.  Then again, maybe I won't
find the time.

> For those of you who want to convert .zoo to .lzh, I include a script written
> by me (copy this as you wish and spread it!)

Thanks for your comments and for the script, Michael.

Kent, the man from xanth.
<xanthian@Zorch.SF-Bay.ORG> <xanthian@well.sf.ca.us>

d88-mbe@sm.luth.se (Michael Bergman) (08/29/90)

xanthian@zorch.SF-Bay.ORG (Kent Paul Dolan) writes:

[about LHwarp]
>Well, that makes it pretty useless for working in the larger context
>of building archives/swapping stuff across non-Amiga hosts, and I
>hope folks will stop submitting archives in that format, then.

I agree.  LHwarp has its uses, but *not* for building archives on non-
Amiga hosts.  It is convenient when you want to pack the entire AmigaDOS
file structure on a disk 'as is' and put it on other media, though.

> Lharc
>I put a couple of hours into trying to port it to my current system, and
>decided I was growing old a lot faster than the code was growing smart,
>and trashed my work-in-progress.

I know, I know. I wish it was available for a wider range of systems.
On the other hand I made a lot of people *very* happy when I posted a
note here in c.s.a saying that a UNIX lharc compatible with Amiga versions
existed! I guess we have to begin somewhere...

>Also, at least the Amiga version is buggy, in that it promises not to
>store duplicates of the same pathname, but in fact does do so.

Hmm, I *think* this is fixed in the new v1.20. Check it out. 1.20 is also
a lot faster (35% unpacking & 15% packing) than 1.10.

>I have the impression Zip is a proprietary format, and also not widely
>ported to mainframes; is either of those right?  If so, that argues
>against their use.

Yes, yes.  I agree again.  Sure, PkaZip is nice and has a very high "overall"
performance, but... the tool we want to use *has* to be available for
mainframe systems.  Can someone port it?  (Just kidding.)

>Well, I'll give up a _lot_ of time efficiency (like three or four to one)
>for 5% better packing, since, as you note, I can background and ignore the
>packing time, but I have to pay for the storage space in cold, hard cash,
>and I pay for it in perpetuity.

Once again we share the same opinion. Strangely enough, not all people do.

>Well, you at least are following the conversation, so if no one beats
>you to reposting them, and you find the time free, I for one would
>appreciate the effort on your part.

Ok. I'll dig them up and type them in tomorrow (oops, it's 1:15 AM already,
read today :-). BUT these might be the same figures that appeared in Amiga
World! I can't check, I'm afraid.

>A tool that works across a wide range of platforms, and is available in
>source code form is a necessity if the current multiplicity of archivers
>is to be replaced by a single tool.

Well, the reason for the increased use of LHarc in spite of its flaws (e.g.
not being available for most systems) is the efficiency.  It appears to me
that people are willing to give up a lot for better packing.  I think that
LHarc will win in the end and that new versions for other systems will be
written.  As I said before, the archive tool has to meet a demand from the
users, otherwise no one will use it.  It *seems* as if packing efficiency
is the most important aspect here.  It is also possible to make the archiver
programs pack and unpack faster.  The fastest versions for the Amiga are
much faster than the UN*X version 1.00.

>Maybe I'll find time to rip the Huffman code out of lharc and replace it
>with the more efficient arithmetic encoding, for another 5% gain

Hmm.. If you ever do, send it to me so I can try it, will you?  I'm not sure
it's a good thing to break another "standard" by using this arithmetic
encoding for just another 5%, though.  Then we would have yet another
archiver, and versions for different computers would have to
be written before people could use it.  Unless of course you would take it
upon yourself to write Amiga/IBM PC/Mac/Xenix/VMS/UNIX versions all
at once..:-)  Lharc isn't nearly as flawless as zoo yet and the UNIX version
lacks *lots* of options that a good archiver must have, but it's a beginning.
(There was more than one version of zoo too, right? :-)

>Thanks for your comments and for the script, Michael.

About the script - I gave you an old version, sorry! This is the latest.

Mike
 
#! /bin/sh
#
# Reconstructs zoo archive(s) to lharc archive(s)
# using /tmp/ for temporary storage.
# Before use, check that the zoo archive(s) aren't damaged: zoo Lm *
#
# M. Bergman  Last changed 1990-06-06
#
if [ $# -lt 1 ]
then
   echo "Usage: zoo2lzh <zoo archive file {zoo archive file}...>"
   echo "	You must have rw permission both to the zoo archive(s)"
   echo "	*and* the catalog(s) they are in."
   echo "	.lzh file(s) will have the default permissons."
   echo "	The .zoo extension can be omitted."
   echo ""
   exit 1
fi
TMPDIR=/tmp/.z2l#$$
mkdir $TMPDIR
chmod 700 $TMPDIR
for ARG do
   if [ `echo $ARG | awk -F. '{print $NF}'` = zoo ]
   then
      # strip the trailing .zoo extension
      EXT=`echo $ARG | awk -F. '{ORS="."} {for (i=1; i < NF-1; i++) print $i}
				{ORS=""} {print $(NF-1)}'`
   else
      # the .zoo extension was omitted; use the argument as given
      EXT=$ARG
   fi
   if [ -w $EXT.zoo -a -r $EXT.zoo ]
   then
      WD=`pwd`
      ARCHFILE=$EXT
      if [ `echo $EXT | awk '{print substr($1,0,1)}'` != / ]
      then
         ARCHFILE=$WD/$EXT
      fi
      echo ""
      echo "Reconstructing $ARCHFILE.zoo"
      cd $TMPDIR
      zoo e// $ARCHFILE.zoo
      rm $ARCHFILE.zoo
      FILES=`ls -A`
      echo $FILES
      lharc a $ARCHFILE.lzh $FILES
      rm -fr $FILES
      cd $WD
   else
      echo ""; echo "zoo2lzh: $EXT.zoo nonexistent or limited permission"
   fi
done
rmdir $TMPDIR
echo ""
-- 
      Michael Bergman         Internet: d88-mbe@sm.luth.se
  //  Undergrad. Comp. Eng.   BITNET:   d88-mbe%sm.luth.se@kth.se
\X/   U of Lulea, SWEDEN      ARPA:     d88-mbe%sm.luth.se@ucbvax.berkeley.edu
			      UUCP:  {uunet,mcvax}!sunic.se!sm.luth.se!d88-mbe

bscott@nyx.UUCP (Ben Scott) (08/29/90)

In article <14937@shlump.nac.dec.com> plouff@kali.enet.dec.com writes:
>He also claims that "one prominent developer [found] that LHARC sometimes
>lost pieces of archived text files," but offers no details.  Anybody got 
>an explanation for this _AmigaWorld_ howler?

Well, I have a POSSIBLE explanation... I mean, I don't remember who the 
author is or necessarily know who he's referring to, but a reasonably
prominent developer HAS found a repeatable circumstance in which LHarc
can lose bits of files (text files, I think, possibly WordPerfect files,
don't remember for sure).  Reid Bishop, co-author of B.A.D. and several
other (_STILL_) upcoming products, is the person I'm talking about. 

Since the bug didn't interest me too much at the time and it was a while
ago, I've forgotten the exact details but he was quite certain.  For those
of you who are interested, he can be reached through the net via the 
Compuserve gateway at 72017.1744@compuserve.com (for now, Arvada 68K,
the board he and I run listed in my .sig, is down).  He is quite busy 
at this time and so allow some time for a reply, if any is even possible
or reliable through that gateway (Larry Phillips:  Did you ever get the
messages he sent you?).  If he wants I'll post more details as he reports
them to me.

Note that in no way am I trying to defend AmigaWorld's reliability in the
general case, which I have found to be "variable".  

.                            <<<<Infinite K>>>>

--
.---------------------------------------------------------------------------.
|Ben Scott, professional goof-off and consultant at The Raster Image, Denver|
|Amiga UUCP node domain: bscott@vila.denver.co.us Else: bscott@nyx.cs.du.edu|
|FIDO point address 1:104/421.2, or call the Arvada 68K BBS at (303)424-9831|
|"Don't embarrass us..."  "Have I ever?" - Buckaroo Banzai  | *AMIGA POWER* |
`---------------------------------------------------------------------------'

sparks@corpane.UUCP (John Sparks) (08/30/90)

d88-mbe@sm.luth.se (Michael Bergman) writes:


>Benchmarks have been done. I have them on paper, but not in electronic format
>I'm afraid so I can't repost them. Because lharc is so slow, the guy who
>did this concluded that Zip is the overall most effective algorithm (I *think*)

The first version of Lharc for the Amiga was slow, but the latest versions
by Jonathan Forbes seem pretty fast.  I have seen a version that just
unpacks, called Lhunarc, and a program that packs and unpacks called LZ.
Inside a doc file Jonathan gives this benchmark:

----excerpt from LZ .82 doc---

How fast is it [LZ .82]?
   Lightning fast.  If you've been using Lharc to compress or decompress files,
then you're in for a big surprise.  If you've been using Lhunarc 0.96 (also
written by me), and thought that was fast, then prepare yourself for some major
speed increases, because Lz is faster still!  Lz is between 16% and 25% faster
than Lhunarc 0.96, and that's no joke in terms of speed!

   I have pushed the decompression algorithm almost to its limits.  No other
.LZH extractor can even come -close- to Lz's decompression times.  Things
simply can't get much faster.  Yes, I know I said that in the Lhunarc 0.96
documentation too, but this time I'm pretty sure I'm almost at the limit!

   Here follows a small speed comparison.  The files being compressed and
decompressed were those on Fred Fish disk #245.

                      * - Different algorithm/encoding scheme

                                           *        *
             | (0.82)  | (1.0) | (.99d) | (1.01) | (1.40)
             |   LZ    | Lharc | LharcA | PkaZip | Lhwarp
Fish245.LZH  |  11:33  | 26:29 |    ?   |  10:31 |  13:43  <-- compressing
             |   1:47  |  5:12 |   2:34 |   2:53 |   2:51  <-- decompressing

   LZ certainly kills everything else dead, but is it faster than PkaZip?
Well, sometimes it is, but usually it isn't.  However, LZ (and Lharc, etc.)
almost always compresses files better than Zip.  For example, "MegRyan.lzh"
(a complete 655k digitised sound sample of the famous restaurant scene from
When Harry Met Sally), compresses from 655302 bytes to 488679 when LZ is used,
while the equivalent .zip file is 536773 bytes long.  LZ compresses the file
faster than Zip, too.
---end excerpt---

LZ .82 is shareware, but Lhunarc is freeware.  Oh, BTW, LZ and Lhunarc
beat the pants off both Zoo and Arc at uncompressing.


-- 
John Sparks         |D.I.S.K. Public Access Unix System| Multi-User Games, Email
sparks@corpane.UUCP |PH: (502) 968-DISK 24Hrs/2400BPS  | Usenet, Chatting,
=-=-=-=-=-=-=-=-=-=-|7 line Multi-User system.         | Downloads & more.
A door is what a dog is perpetually on the wrong side of----Ogden Nash

new@ee.udel.edu (Darren New) (08/30/90)

In article <1056@tau.sm.luth.se> d88-mbe@sm.luth.se (Michael Bergman) writes:
>at once..:-) Lharc isn't nearly as flawless as zoo yet and the UNIX version
>lacks *lots* of options that a good archiver must have, but it's a beginning.
>(There was more than one version of zoo too, right? :-)

Well, why not take the packing algorithms out of lharc and incorporate
them into zoo? Why keep switching to another archiver that has to be
ported across a bunch of machines instead of incorporating new features
into an archiver that already works and has been ported?  -- Darren
-- 
--- Darren New --- Grad Student --- CIS --- Univ. of Delaware ---
----- Network Protocols, Graphics, Programming Languages, 
      Formal Description Techniques (esp. Estelle), Coffee -----

sparks@corpane.UUCP (John Sparks) (08/31/90)

xanthian@zorch.SF-Bay.ORG (Kent Paul Dolan) writes:


>> Lharc
>> is available for mainframe (UNIX) machines - latest version 1.00 is
>> 100% compatible with all other versions (including Amiga Lharc) The
>> source can be downloaded from a usenet archive: comp.sources.misc.
>> Compiles "as is" under SunOS and BSD 4.3.

>Well, perhaps, but it is rife with BSDisms (and SunOS is a child of BSD),
>and won't come close to compiling under (a fairly primitive) SYSV system;


We have gotten Lharc to compile on our SYS V unix system.  We are running
Interactive Unix 2.2 (SYS V 3.2).

Usually we have all sorts of trouble getting things to compile, but lharc
wasn't so bad.  The compiler complained about some stuff (I forget exactly
what; it was a while back, but I think it had something to do with setting
file dates in some function), so we just took out that section and it
compiled right up.  It seems to work just fine and is compatible with
my Amiga at home and the MSDOS version I have here at work.

I can see if we still have the butchered version on line here and mail it
to anyone who wants to try it out.



-- 
John Sparks         |D.I.S.K. Public Access Unix System| Multi-User Games, Email
sparks@corpane.UUCP |PH: (502) 968-DISK 24Hrs/2400BPS  | Usenet, Chatting,
=-=-=-=-=-=-=-=-=-=-|7 line Multi-User system.         | Downloads & more.
A door is what a dog is perpetually on the wrong side of----Ogden Nash

warren@hpindda.cup.hp.com (Warren Burnett) (08/31/90)

/ hpindda:comp.sys.amiga / sparks@corpane.UUCP (John Sparks) /  2:52 pm  Aug 27, 1990 /

> Lharc seems speed comparable with Zoo, maybe a bit slower at packing.

HA! That's a laugh.  Even on my 68030 workstation lharc is slow.  Lharc 
takes about five times as long as zoo does on compression and two to 
three times as long as zoo on decompression, at least on the kinds of 
files that I archive.

Personally, I am sticking with zoo.  I would rather have my archive files
be slightly larger than sit around waiting for ten minutes to decompress
this nifty new screen hack I just downloaded.

judd@boulder.Colorado.EDU (JUDD STEPHEN L) (09/02/90)

In article <2793@corpane.UUCP> sparks@corpane.UUCP (John Sparks) writes:
>
>----excerpt from LZ .82 doc---
>
>almost always compresses files better than Zip.  For example, "MegRyan.lzh"
>(a complete 655k digitised sound sample of the famous restaurant scene from
>When Harry Met Sally),

Ah yes, I meant to post something about this.  Is this perhaps available to
the masses anywhere?  Maybe I could time my startup-sequence to finish at the
right time...

On the subject of sampled sounds, last time I was on NewXanth the sampled
sound directories were unreadable.  Why?  More importantly, will they ever be
available?

					-Steve
--
judd@tramp.colorado.edu         //  "We had separate hotel rooms.  We used hers
...!ncar!boulder!tramp!judd   \X/ the first night and mine the second." - K.A.


sparks@corpane.UUCP (John Sparks) (09/05/90)

warren@hpindda.cup.hp.com (Warren Burnett) writes:

|/ hpindda:comp.sys.amiga / sparks@corpane.UUCP (John Sparks) /  2:52 pm  Aug 27, 1990 /

|> Lharc seems speed comparable with Zoo, maybe a bit slower at packing.

|HA! That's a laugh.  Even on my 68030 workstation lharc is slow.  Lharc 
|takes about five times as long as zoo does on compression and two to 
|three times as long as zoo on decompression, at least on the kinds of 
|files that I archive.

|Personally, I am sticking with zoo.  I would rather have my archive files
|be slightly larger than sit around waiting for ten minutes to decompress
|this nifty new screen hack I just downloaded.

Well after downloading files to my amiga then starting to uncompress them,
I usually stay at the console when uncompressing a Lharc file with Lhunarc
but I can go watch about 15 minutes of TV when unzooing a zoo file. And
that is on any kind of file. You must have a really lousy version of
Lharc. 


-- 
John Sparks         |D.I.S.K. Public Access Unix System| Multi-User Games, Email
sparks@corpane.UUCP |PH: (502) 968-DISK 24Hrs/2400BPS  | Usenet, Chatting,
=-=-=-=-=-=-=-=-=-=-|7 line Multi-User system.         | Downloads & more.
A door is what a dog is perpetually on the wrong side of----Ogden Nash

olson@uhunix1.uhcc.Hawaii.Edu (Todd Olson) (09/07/90)

In article <2909@corpane.UUCP> you write:
>warren@hpindda.cup.hp.com (Warren Burnett) writes:
>
>|/ hpindda:comp.sys.amiga / sparks@corpane.UUCP (John Sparks) /  2:52 pm  Aug 27, 1990 /
>
>|> Lharc seems speed comparable with Zoo, maybe a bit slower at packing.
>
>|HA! That's a laugh.  Even on my 68030 workstation lharc is slow.  Lharc 
>|takes about five times as long as zoo does on compression and two to 
>|three times as long as zoo on decompression, at least on the kinds of 
>
>Well after downloading files to my amiga then starting to uncompress them,
>I usually stay at the console when uncompressing a Lharc file with Lhunarc




John, for an even better improvement over Lhunarc, try LZ (el zed) by
Jonathan Forbes, the same guy who wrote Lhunarc.  It is even faster and
it also does compression; all in all, a Lharc replacement.  The most
current version I have is .90.  It is really quick on a 2500/30!




				Todd Olson

BTW it is shareware, and it looks as though there will be a check sent to
Jonathan from Hawaii, unless he likes Mac nuts or some other guy :-)



-- 
                       olson@uhunix.uhcc.hawaii.edu
____________________________________________________________________________
"Take your work seriously, but never take yourself seriously and do not take
what happens to either yourself or your work seriously." --Booth Tarkington



cseaman@sequent.UUCP (Chris "The Bartman" Seaman) (09/08/90)

olson@uhunix1.uhcc.Hawaii.Edu (Todd Olson) writes:
< In article <2909@corpane.UUCP> you write:
< >warren@hpindda.cup.hp.com (Warren Burnett) writes:
< >|/ hpindda:comp.sys.amiga / sparks@corpane.UUCP (John Sparks) /  2:52 pm  Aug 27, 1990 /
< >|> Lharc seems speed comparable with Zoo, maybe a bit slower at packing.

< >|HA! That's a laugh.  Even on my 68030 workstation lharc is slow.  Lharc 
< >|takes about five times as long as zoo does on compression and two to 
< >|three times as long as zoo on decompression, at least on the kinds of 

Last night I was running some fairly exhaustive (though non-scientific)
benchmarks of lharc versus zoo versus lharca versus lz.  The most recent
version of lharc (1.21) compressed 480K worth of IFF 8SVX files down to
397K in 3:02, while zoo compressed the same files down to a whopping 450K
in :48.  Lharc took just under 4 times as long as zoo.  This is on a 5MB
2500/20, archiving to/from vd0:.

< John, for an even better improvement over Lhunarc, try LZ (el zed) by
< Jonathan Forbes, the same guy who wrote Lhunarc.  It is even faster and
< it also does compression; all in all, a Lharc replacement.  The most
< current version I have is .90.  It is really quick on a 2500/30!

Lz is **VERY** quick.  The archive I mentioned above was built in 2:12,
and de-archived in (are you ready?) 37 SECONDS!  However, there is
still the zero length file bug (which the documentation implies was
fixed).  Any zero length files cause a corrupt archive, which lz cannot
extract, although lharc can (to some extent).  It also has a peculiar
habit of generating a slightly different compressed size for some files
(usually only one or two bytes difference).  I know this is a nit, but,
if it is based on the same algorithm as lharc, it should generate
identical archives (lharca does).

< BTW it is shareware, and it looks as though there will be a check sent to
< Jonathon from Hawaii, unless he likes Mac nuts some some other guy :-)

If he can fix the bugs, I would register.  Until then I can't use it.

-- 
Chris (Insert phrase here) Seaman |  /o  -- -- --
cseaman@sequent.com <or>          |||    -- -- -     I'm Outta Here, Man!
...!uunet!sequent!cseaman         |vvvv/  -- -- -
The Home of the Killer Smiley     |___/  -- -- --

xanthian@zorch.SF-Bay.ORG (Kent Paul Dolan) (09/08/90)

[Crossposted to alt.sources.d to provide the data toward the end, and to
news.admin to show the possible telecomm savings of changing compressors;
the rest is Amiga specific.  Please select the appropriate followup group;
I've pointed it back to comp.sys.amiga, where this thread is ongoing.]

sparks@corpane.UUCP (John Sparks) writes:
>warren@hpindda.cup.hp.com (Warren Burnett) writes:
>|sparks@corpane.UUCP (John Sparks) writes:
>
>|> Lharc seems speed comparable with Zoo, maybe a bit slower at packing.
>
>|HA! That's a laugh.  Even on my 68030 workstation lharc is slow.  Lharc 
>|takes about five times as long as zoo does on compression and two to 
>|three times as long as zoo on decompression, at least on the kinds of 
>|files that I archive.

Hmm, on my A68000, I just unarchived an 803Kbyte file from lharc in
under 60 seconds.  I can easily live with that speed for the better
compression.  No question, lharc is slow to compress, but see below.

>|Personally, I am sticking with zoo.  I would rather have my archive files
>|be slightly larger than sit around waiting for ten minutes to decompress
     ^^^^^^^^
>|this nifty new screen hack I just downloaded.

Well, I just compressed an 803 Kbyte file with lharc, zoo, and compress,
all on the Amiga.  Zoo (12 bit Lempel-Ziv) gave 131 Kbytes, compress (14 bit
Lempel-Ziv) gave 121 Kbytes, and lharc (?? bit Lempel-Ziv cascaded with
(adaptive ?) Huffman) gave 50 Kbytes!  You can try this yourself, it is
the StPauls.dat file in the recent DKBtrace distribution in
comp.binaries.amiga.

The file, although it is in fact a text picture description file, behaves
like raw pixel image data - lots of repeated bytes - so you might want to
really reconsider which compressor you want to use for image data; paying
2.5 times the storage to get the higher speed of zoo is a big hit.
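
If you would rather run the same three-way check on a Unix host than on
the Amiga, a rough equivalent sketch (assuming zoo, compress, and the Unix
lharc port are all installed; StPauls.dat is just the sample file named
above):

#! /bin/sh
# Pack the same file three ways and list the resulting sizes.
FILE=StPauls.dat
zoo a stpauls.zoo $FILE
compress -c $FILE > $FILE.Z
lharc a stpauls.lzh $FILE
ls -l $FILE stpauls.zoo $FILE.Z stpauls.lzh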

The performance of lharc on general c.{s,b}.a distributions is less
spectacular, but still very good compared to zoo or compress.  I almost
always save at least 20%, which is money in the bank for me.

>Well after downloading files to my amiga then starting to uncompress them,
>I usually stay at the console when uncompressing a Lharc file with Lhunarc
>but I can go watch about 15 minutes of TV when unzooing a zoo file. And
>that is on any kind of file. You must have a really lousy version of
>Lharc. 

Well, lharc does something on the screen to keep your attention; I'd be a
bit surprised if it were really faster than zoo at decompression: please
do a couple of timing tests on large files, and, if you find lharc actually
faster, _please_ publish your version number!  ;-)

Still, the Amiga is a multitasking machine, and I just start lharc
going in the background and do something else while it churns; I
really don't much care how fast it is, just how well it compresses.
I'm slowly converting zoo files to lharc files, to recover _lots_
of floppy disk space for reuse.

The _entire_ DKBtrace distribution, source, docs, data, and binaries
lharc-ed like this:

Original file bytes:        1,920,793  (no file system overhead counted)
                                       (as reported by lharc)
Internal compressed bytes:    426,603  (sum of compressed files in .lzh)
                                       (as reported by lharc)
Archive size in bytes:        431,306  (External file size, not counting)
                                       (directory and extension blocks)
                                       (as reported by "list")

Compare to zoo:

Original file bytes:        1,920,793  (no file system overhead counted)
                                       (as reported by zoo)
Internal compressed bytes:    660,523  (sum of compressed files in .zoo)
                                       (as reported by zoo)
Archive size in bytes:        670,146  (External file size, not counting)
                                       (directory and extension blocks)
                                       (as reported by "list")

That's just too good a compression gain for me to ignore, even though it
really does take lharc almost five times as long to compress the files.
[Your data may differ slightly; I use my own arcane directory structure,
throw away the duplicate copies of the Docs files, and keep the header
from the first posting in each distribution group for reference.]

Think how much money that represents in transmission and storage costs for
the net, as well.  Even uuencoded, that 33% data savings will be the same;
perhaps it is time to really think about what we pay to use the compression
algorithm "everybody has" for USENet data transfer.

It is interesting to notice that lharc seems to use only half the overhead
that zoo does to store full directory structures.  I wonder why.

Those intensely interested in compressing image data are referred to the
ongoing discussion in alt.sources.d.

Kent, the man from xanth.
<xanthian@Zorch.SF-Bay.ORG> <xanthian@well.sf.ca.us>

sparks@corpane.UUCP (John Sparks) (09/12/90)

xanthian@zorch.SF-Bay.ORG (Kent Paul Dolan) writes:

|sparks@corpane.UUCP (John Sparks) writes:
|>Well after downloading files to my amiga then starting to uncompress them,
|>I usually stay at the console when uncompressing a Lharc file with Lhunarc
                                                                     ^^^^^^^

|Well, lharc does something on the screen to keep your attention; I'd be a
|bit surprised if it were really faster than zoo at decompression: please
|do a couple of timing tests on large files, and, if you find lharc actually
|faster, _please_ publish your version number!  ;-)

Note, I said I was using "LHUNARC", not lharc.  There is a big difference
between those two programs.  LZ, as has been mentioned, is even faster.

-- 
John Sparks         |D.I.S.K. Public Access Unix System| Multi-User Games, Email
sparks@corpane.UUCP |PH: (502) 968-DISK 24Hrs/2400BPS  | Usenet, Chatting,
=-=-=-=-=-=-=-=-=-=-|7 line Multi-User system.         | Downloads & more.
A door is what a dog is perpetually on the wrong side of----Ogden Nash