[comp.binaries.ibm.pc.d] File Transmission Standards

mike@dhw68k.cts.com (Michael J. Cleary) (05/11/88)

I am surprised at the current methods utilized by the NET for transmitting
files.
  
  The splitting, cutting, pasting.
  The uuencoding, uudecoding.
  The archiving, unarchiving:
      compress
      PKARC
      ZOO
      ARC
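The encoding leg of that pipeline is simple enough to sketch.  Here is a
minimal uuencoder in Python -- purely a modern illustration of what the
uuencode step does, not any of the period tools; the filename and mode
line are placeholders:

```python
import binascii

def uuencode(data: bytes, name: str = "program.arc") -> str:
    """Minimal uuencoder sketch: 45 input bytes per line, begin/end framing.

    Illustration only -- a real posting ran uuencode(1) over an
    already-arc'd file.
    """
    lines = [f"begin 644 {name}\n"]
    for i in range(0, len(data), 45):
        # each line starts with a length character, then the encoded bytes
        lines.append(binascii.b2a_uu(data[i:i + 45]).decode("ascii"))
    lines.append("`\nend\n")   # zero-length line, then the end marker
    return "".join(lines)
```

The poster then splits the resulting text into article-sized pieces, and
the receiver reverses every one of these steps by hand -- which is
exactly the complaint above.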
 
Given the current skill level of the programmers on the NET, I
would think that someone could write a single program that handles all
of these functions in both directions: posting (to NET format) and
deposting (back to MS-DOS format).
 
Besides streamlining the process, it would reduce the number of manual
steps (each a possible point of error).  I do not know how many times I
have seen NETers complain about not being able to run a program
after going through the series of manual steps to extract it from
NET format.  Standardization would also be a side effect of this type
of program, as you would not need to *know* which reverse processes
to apply.
 
Currently, I do not use executables from the NET, as I do not
want to get involved in the manual task of deNETting them.  Instead, I
call local BBS's and simply push a button to retrieve the file.
 
If only the NET was so easy.   :-)
 
I am *NOT* flaming this sequence of manual tasks, passed on from father
to son; I am just pointing out that there must be a more professional
way to approach the issue.
 
I would like everyone to talk about this, because I would like to see 
something done about it. 
-- 
Michael J. Cleary

rjchen@phoenix.Princeton.EDU (Raymond Juimong Chen) (05/12/88)

I sort of got into this discussion after it had gotten started,
so please pardon me if this has been brought up before:

Is there a compelling argument against making each uuencoded file
an independent .ARC/.ZOO file?  That way, when a multipart program
comes through and a part gets munged, we can uudecode and evaluate
whatever parts did arrive immediately, instead of waiting around for
mail/reposting, then gluing the parts together (in order!), then
uudecoding, then un-arcing and evaluating.

: If this has been hashed/rehashed, pretend the line eater got the :
: rest of the file :

This also solves the "what if the parts come in in the wrong order"
and the "I hate cutting and gluing" problem:  since each part is
a separate archive, it doesn't matter what order you get them.
Since each part is an independent archive, you don't have to do any
cutting and pasting at all.  (uudecode ignores headers and trailers.
Then again, what happened to atob?)
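The observation that uudecode skips leading headers and trailing
signatures is the heart of the no-cutting argument: a decoder only has
to look for the begin/end framing.  A tolerant decoder can be sketched
like this (Python, purely illustrative):

```python
import binascii

def uudecode_loose(text: str) -> bytes:
    """Decode a uuencoded article, ignoring any junk before 'begin' and
    after 'end' -- which is why un-trimmed news headers and signatures
    are harmless."""
    out = []
    in_body = False
    for line in text.splitlines():
        if not in_body:
            if line.startswith("begin "):
                in_body = True       # everything before this is skipped
            continue
        if line.strip() == "end":
            break                    # everything after this is skipped
        if not line or line[0] == "`":
            continue                 # terminating zero-length line
        out.append(binascii.a2b_uu(line))
    return b"".join(out)
```

If each part carries its own begin/end framing (i.e. is a separate
archive), each article can be fed to such a decoder untouched.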

One problem would occur if a single file, when arc'd and uuencoded,
is still too large for transmission as a single file.  This is a case in 
which cutting is probably the only viable solution.
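When cutting is unavoidable, the pieces are just consecutive byte
ranges, which is why order matters at reassembly time -- unlike the
independent-archive scheme.  A sketch of naive splitting and rejoining
(hypothetical function names, Python for illustration):

```python
def split_bytes(data: bytes, max_part: int) -> list:
    """Cut data into consecutive pieces of at most max_part bytes."""
    return [data[i:i + max_part] for i in range(0, len(data), max_part)]

def join_parts(parts) -> bytes:
    """Reassemble the pieces; they MUST be in their original order."""
    return b"".join(parts)
```

Gluing the parts back in the wrong order yields garbage, which is the
failure mode the independent-archive proposal avoids.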

A possible counter-argument is that it would create lots of archives
with similar names (like dsz1 dsz2 dsz3... dsz9) which would then have
to be re-combined by the receiver.  Or, if you can't de-archive on
your host machine, it means that the number of files you have to
transfer is ten instead of one.  To the first, I say "Gosh, that's
too bad."  There do exist programs which can combine archives without
extracting/recompressing each file.  (I believe that SEA arc comes
with one [MARC I believe it's called], and I presume that ZOO has
a similar capability.)  To the second, I argue that it actually makes
life easier:  the total transfer time is marginally longer, but
[1] it placates people who say "Gosh, how am I going to fit a 700K
archive onto my 360K floppy?", and [2] if something goes wrong
with your connection, you only lose the last file you were transferring
instead of losing the entire file.

Such is my $0.02 worth.  (And when is c.b.i.p going to get moderated?)

-- 
Raymond Chen	UUCP: ...allegra!princeton!{phoenix|pucc}!rjchen
		BITNET: rjchen@phoenix.UUCP, rjchen@pucc
		ARPA: rjchen@phoenix.PRINCETON.EDU
"Say something, please!  ('Yes' would be best.)" - The Doctor