[comp.binaries.ibm.pc.d] How about un-compression?

bwwilson@lion.waterloo.edu (Bruce Wilson) (03/29/89)

Hi,
There have been a lot of people comparing the various compression
programs, quoting the speed of *compression*.  But what about the
speed of un-compression?  It seems to me that this is a more 
common operation for most people.  I know that probably 90% of my
time spent with archives is looking at them to see if they are 
useful, or not.  I don't really care if the archive took the author
2 minutes or 20 minutes to create.

Does anyone know if Zoo is just as fast as Zip or Arc in this process?
(I think they are about the same but I haven't done any timing.)
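
For anyone who would rather measure than guess, a crude timing harness
isn't hard to throw together.  The sketch below is Python, and the
extraction command lines in CMDS are only placeholders, not the real
flags for anyone's copies of Zoo, ARC, or PKUNZIP, so substitute
whatever your versions actually take.

import subprocess
import time

# Placeholder extraction commands -- these argument lists are assumptions,
# not documented flags; edit them to match your own archivers.
CMDS = {
    "zoo": ["zoo", "x", "test.zoo"],
    "arc": ["arc", "x", "test.arc"],
    "zip": ["pkunzip", "test.zip"],
}

for name, cmd in CMDS.items():
    start = time.perf_counter()
    subprocess.run(cmd, check=True)       # extract the whole archive
    elapsed = time.perf_counter() - start
    print(f"{name}: {elapsed:.1f} seconds to un-compress")

Pack the same set of files into each format, run this, and the
un-compression question answers itself for your own machine.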

I understand Zip has various levels of compression that take more 
time to compress, but do they also take more time to uncompress?
(My guess is that they don't.)
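
If anyone wants to check that guess directly, here is a minimal sketch
using Python's zlib module as a stand-in -- it is not Zip itself, just
an LZ-style compressor with selectable levels, so treat it as an
illustration of the general pattern rather than a statement about PKZIP.

import time
import zlib

# Any reasonably large test file will do; the name is just a placeholder.
data = open("some_large_file.bin", "rb").read()

for level in (1, 6, 9):
    t0 = time.perf_counter()
    packed = zlib.compress(data, level)   # higher level = more work to compress
    t1 = time.perf_counter()
    zlib.decompress(packed)               # decompression takes no level at all
    t2 = time.perf_counter()
    print(f"level {level}: compress {t1 - t0:.2f}s, "
          f"decompress {t2 - t1:.2f}s, {len(packed)} bytes")

On compressors of this family the level changes compression time and
output size, while decompression time stays roughly flat, which is
consistent with the guess above.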

In trying to sort out the Arc-Zoo-Zip debate, I've realized that 
there are four main factors (there are many more, but I think these
are the main ones) [quotes are from memory]:

	1. COMPRESSION: "I just converted all my Arc's to
	   Zip and I saved 2 meg. and that's nothing to scoff at..."

	2. SPEED: "Zoo takes 2.5 minutes to archive these
	   files whereas Zip only took 2 minutes..."

	3. PORTABILITY: "Zoo works on all three machines I
	   work with but Zip doesn't.."

	4. DISTRIBUTION/FEES: "We need a FREELY DISTRIBUTABLE file
	   archiving system..."

I'm not sure it's possible to find a file archiving system that is the
best at all of the above, so the decision should be based on a weighted
combination.  In other words, we have to decide what the important
features are before choosing an archiving system.  I've read a lot
of messages by various people supporting the various programs (all
with good, valid points), but they don't get us any closer to a 
decision because we all have different priorities.
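
To make the "weighted combination" idea concrete, here is a toy sketch;
every weight and rating in it is invented purely for illustration (my
priorities, not measurements), so plug in your own numbers before
drawing any conclusions.

# All weights and ratings below are made-up illustrative values.
WEIGHTS = {"compression": 0.2, "speed": 0.1, "portability": 0.3, "fees": 0.4}

RATINGS = {  # each program rated 0-10 on each factor (hypothetical)
    "Arc": {"compression": 5, "speed": 6, "portability": 4, "fees": 3},
    "Zoo": {"compression": 6, "speed": 6, "portability": 9, "fees": 9},
    "Zip": {"compression": 8, "speed": 8, "portability": 5, "fees": 6},
}

for program, rating in RATINGS.items():
    total = sum(WEIGHTS[factor] * rating[factor] for factor in WEIGHTS)
    print(f"{program}: weighted score {total:.1f}")

Change the weights and the "winner" changes, which is really the whole
point: we have to agree on the weights first.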

Personally, I think (4) is the most important because of the nature
of the Net: no one can say they don't want to use an arc program
because it costs.  This is followed very closely by (3), because I 
would guess that most people on the net are using more than one system
(I'm reading this group on a Unix machine but downloading for my PC).
The next would be (1), because this saves space for archive sites (and
me), making room for more stuff.  The last is (2); not that speed isn't
important, just that I think it's the *least* important of the four.

If you accept my priority list (I don't expect you to), then it's 
fairly clear that Zoo wins.  Maybe the algorithm from LHarc could
be 'plugged into' Zoo (and the extension changed) to create a new 
program that satisfies the compression fanatics (then they could
only gripe about speed).

While I'm posting: I would definitely like to see something in the
first article indicating a bit about the licensing of the program,
e.g. "this binary is public domain | free but copyrighted | requests a
$10 donation | requires a $10 payment".  Then *I* can decide whether I
want to go to the effort of trying a Rolodex program that wants $100.

Also, Rahul, you are doing a great job and I thank you for your efforts.

Thank you for your time,
bruce

bruce Wilson               | "what you don't spend you don't have to earn..."
bwwilson@lion.waterloo.edu |    from Blake (a film by Bill Mason)

Chris.Maidt@p8.f30.n147.z1.FIDONET.ORG (Chris Maidt) (04/01/89)

 BW> Hi,
 BW> There have been a lot of people comparing the various compression
 BW> programs, quoting the speed of *compression*.  But what about the
 BW> speed of un-compression?  It seems to me that this is a more 
 BW> common operation for most people.  I know that probably 90% of my
 BW> time spent with archives is looking at them to see if they are 
 BW> useful, or not.  I don't really care if the archive took the author
 BW> 2 minutes or 20 minutes to create.

Besides this being a test to see if this inter-domain stuff really works, I
want to comment that for ME, speed and size are the most important parts of
compression.  I write my own stuff and have to constantly update it and send
it off again ... I want it arced NOW and I want to be able to send it in
nothing flat.  De-compression, on the other hand ... I hardly care how long it
takes to uncrunch ... although SEA's 6.01 takes the cake here, unarcing whole
tree structures in 10-15 minutes.



--  
Chris Maidt - via FidoNet node 1:147/10
UUCP: ...!uokmax!metnet!30.8!Chris.Maidt
INTERNET: Chris.Maidt@p8.f30.n147.z1.FIDONET.ORG