[comp.sources.d] Why "shar: Shell Archive

rsalz@bbn.com (Rich Salz) (09/20/89)

I'm finishing up the "final" release of my cshar tools.  This is what
I generally use to pack things up for posting to comp.sources.unix.
Working on that, plus the metrics posting that just went out, has led
me to make the following request:
    If at all possible, please don't submit sources to comp.sources.unix
    that are generated with the above tool (I don't know its common name,
    it's not the RogueMonster one, it's not my cshar) -- the one that
    uses s2_seq.tmp files to unpack things in sequence.
    
There are two reasons for this:
	1.  It generates complex /bin/sh constructs:
		( read Scheck
		  if test "$Scheck" != $CurArch
		  then echo "Please unpack part $Scheck next!"
		       exit 1;
		  else exit 0; fi
		) < s2_seq_.tmp || exit 1
	    my shell parser can't parse this, and I doubt any non-sh
	    parser can.

	2.  If a piece is missing, the whole archive is useless until
	    it shows up.  This is often a waste of time; you might as
	    well let folks start examining the other files.
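
For comparison, the same sequence check can be written without the redirected subshell, in a form even a simple line-oriented parser could follow. A rough sketch (variable and file names taken from the example above; the unpack step is reduced to a placeholder):

```shell
# simulate the state after part 1: it recorded that part 2 comes next
echo 2 > s2_seq_.tmp

CurArch=2                        # this piece claims to be part 2
Scheck=`cat s2_seq_.tmp`
if test "$Scheck" != "$CurArch"
then
    echo "Please unpack part $Scheck next!"
    exit 1
fi
echo "part $CurArch unpacks here..." > unpack.log
```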

/r$
-- 
Please send comp.sources.unix-related mail to rsalz@uunet.uu.net.
Use a domain-based address or give alternate paths, or you may lose out.

storm@texas.dk (Kim F. Storm) (09/21/89)

rsalz@bbn.com (Rich Salz) writes:

>    If at all possible, please don't submit sources to comp.sources.unix
>    that are generated with the above tool (I don't know its common name,
>    it's not the RogueMonster one, it's not my cshar) -- the one that
>    uses s2_seq.tmp files to unpack things in sequence.

The tool in question is the `shar2' archiver:

comp.sources.misc: Volume 3, Issue 14
Submitted-By: "Wm E. Davidsen" <davidsen@crdos1.UUCP>
Archive-Name: shar2
Date: 10 May 88 15:46:51 GMT

>There are two reasons for this:
>	1.  It generates complex /bin/sh constructs:
>	    my shell parser can't parse this, and I doubt any non-sh
>	    parser can.

True, and this should be corrected - even some Bourne shells have
problems unpacking these archives!

>	2.  If a piece is missing, the whole archive is useless until
>	    it shows up.  This is often a waste of time; you might as
>	    well let folks start examining the other files.

This is actually a very nice feature when you regard it from the
sender's point of view:  shar2 will make almost evenly sized articles
by splitting source files across archive parts.  This has two benefits:

a)  It generally produces fewer parts, and the fewer parts there are,
    the less risk there is that a part is lost.

b)  It enables you to send source files that are larger than 50-60 kbyte
    (or whatever you may think is a safe maximum for articles)
    *automatically* - and they will be concatenated automatically on
    the recipient end as well (when unpacking with /bin/sh).
    
The drawback is of course that parts must be unpacked in sequence.
The good thing about the shar2 archives is that this is enforced.
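
The mechanics on the receiving end are simple. A rough sketch of what two consecutive parts might emit for one file that spans them (the X prefix and the sed incantation are the usual shar conventions; contents are illustrative):

```shell
# --- emitted in part 1: create the file with its first chunk
sed 's/^X//' > bigfile.txt << 'SHAR_EOF'
Xfirst half of bigfile.txt
SHAR_EOF
echo 2 > s2_seq_.tmp                # part 2 must be unpacked next

# --- emitted in part 2 (a separate posting): append the rest
sed 's/^X//' >> bigfile.txt << 'SHAR_EOF'
Xsecond half of bigfile.txt
SHAR_EOF
```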

I have seen postings specifying that "You must unpack this part last",
because they contain commands to concatenate some files to yield
some large files, e.g.

	cat man.* > xyzzy.1
	rm man.*

Imagine that man.2 is missing because part 4 has not arrived, and you
choose to unpack part 7 despite the warning (if given at all)...
Where do man.1 and man.3 go, and will xyzzy.1 be a complete manual?
(the answer is left as an exercise to the reader :-)

I would like to see a replacement for shar2 which could

1) Reorder files according to their size to produce as few parts as
   possible (within a given maximum size).

2) Facilitate automatic split and concatenation of large files.

3) Allow "secure" unpacking, i.e. with a simple shell parser.

Is `cshar' the answer to my prayers?
-- 
Kim F. Storm        storm@texas.dk        Tel +45 429 174 00
Texas Instruments, Marielundvej 46E, DK-2730 Herlev, Denmark
	  No news is good news, but nn is better!

davidsen@crdos1.crd.ge.COM (Wm E Davidsen Jr) (09/21/89)

In article <1979@prune.bbn.com>, rsalz@bbn.com (Rich Salz) writes:
|  I'm finishing up the "final" release of my cshar tools.  This is what
|  I generally use to pack things up for posting to comp.sources.unix.
|  Working on that, plus the metrics posting that just went out, has led
|  me to make the following request:
|      If at all possible, please don't submit sources to comp.sources.unix
|      that are generated with the above tool (I don't know its common name,
|      it's not the RogueMonster one, it's not my cshar) -- the one that
|      uses s2_seq.tmp files to unpack things in sequence.

  It's my "shar2" tool.
|      
|  There are two reasons for this:
|  	1.  It generates complex /bin/sh constructs:
		[ example deleted ]
|  	    my shell parser can't parse this, and I doubt any non-sh
|  	    parser can.
  It was intended to work with the UNIX /bin/sh program, not a subset
which some people don't have. It works with versions back to at least V7.
|  
|  	2.  If a piece is missing, the whole archive is useless until
|  	    it shows up.  This is often a waste of time; you might as
|  	    well let folks start examining the other files.
  After unpacking many things which had "all arrived but one" I think
that "waste of time" may be a matter of opinion. It's a consequence of
splitting files to make equal-size archives rather than doing an
approximation with whole files.

  If you want to force people to use your versions of all the software
involved, why not say so. I have received only thanks and a few helpful
suggestions for improvements. I assume that if other people were
bothered by this they would say so (they certainly talk about my other
software).
-- 
bill davidsen	(davidsen@crdos1.crd.GE.COM -or- uunet!crdgw1!crdos1!davidsen)
"The world is filled with fools. They blindly follow their so-called
'reason' in the face of the church and common sense. Any fool can see
that the world is flat!" - anon

tytso@athena.mit.edu (Theodore Y. Tso) (09/22/89)

In article <444@crdos1.crd.ge.COM> davidsen@crdos1.UUCP (bill davidsen) writes:
>|  There are two reasons for this:
>|  	1.  It generates complex /bin/sh constructs:
>		[ example deleted ]
>|  	    my shell parser can't parse this, and I doubt any non-sh
>|  	    parser can.
>  It was intended to work with the UNIX /bin/sh program, not a subset
>which some people don't have. It works with versions back to at least V7.

Even if some people have /bin/sh, they may not want to use it.  After
all, shar archives are such a huge potential security hole.  I would
feel much safer using a parser that didn't allow such constructs as
"(/bin/rm -rf /) &".  
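
Short of a full non-sh parser, one cheap precaution is to scan an archive for anything outside the small vocabulary a shar normally uses before handing it to /bin/sh. A very rough sketch (the accepted-prefix list is a guess and would need tuning for each shar flavor):

```shell
# build a tiny archive containing one line no shar should ever need
cat > demo.shar << 'EOF'
# demo shell archive
echo extracting...
rm -rf $HOME
EOF

# crude pre-flight: every line must be blank, a comment, shar body
# text (the X prefix), or start with one of a few expected words
suspect=`egrep -v '^$|^#|^X|^sed|^cat|^echo|^wc|^test|^if |^then|^else|^fi$|^exit' demo.shar`
if test -n "$suspect"
then
    echo "suspicious; inspect by hand: $suspect" > verdict
else
    sh demo.shar
    echo clean > verdict
fi
```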

This is a fair request to make, in any case.  What about people running
MS-DOS that want to unshar a source kit?  Why force them to suffer more
than they already have to?  (Not running a real operating system should
be punishment enough!  :-) 

>|  
>|  	2.  If a piece is missing, the whole archive is useless until
>|  	    it shows up.  This is often a waste of time; you might as
>|  	    well let folks start examining the other files.
>  After unpacking many things which had "all arrived but one" I think
>that "waste of time" may be a matter of opinion. It's a consequence of
>splitting files to make equal-size archives rather than doing an
>approximation with whole files.
Of course, the other obvious question is: why is it so important to
have equal-size archives?  The cost of having a couple of 47k or 48k
archive files instead of all 50k seems to be a small price to pay for
being able to get at the files before you get all of the archives.  This
is particularly true for packages which get broken up into a large
number of pieces.

Another suggestion:  someone should write a shar which can break up
files into several pieces (although it should try very hard to
rearrange files to make things the right length) and which generates
shar files that unwrap partial files and, when a shar file detects
that all the pieces of a particular partial file have been unwrapped,
puts it together automatically.  
=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=
Theodore Ts'o				bloom-beacon!mit-athena!tytso
3 Ames St., Cambridge, MA 02139		tytso@athena.mit.edu
   Everybody's playing the game, but nobody's rules are the same!

dave@galaxia.Newport.RI.US (David H. Brierley) (09/22/89)

In article <1979@prune.bbn.com> rsalz@bbn.com (Rich Salz) writes:
...
>/r$

Like, wow!  Rich Salz Returns From the Dead!     ;)

Nice to have you back.  I had heard you had been eaten by a giant butterfly.
-- 
David H. Brierley
Home: dave@galaxia.Newport.RI.US   {rayssd,xanth,lazlo,mirror,att}!galaxia!dave
Work: dhb@rayssd.ray.com           {sun,uunet,gatech,necntc,ukma}!rayssd!dhb

rick@pcrat.uucp (Rick Richardson) (09/22/89)

In article <444@crdos1.crd.ge.COM> davidsen@crdos1.UUCP (bill davidsen) writes:
>In article <1979@prune.bbn.com>, rsalz@bbn.com (Rich Salz) writes:
>|  I'm finishing up the "final" release of my cshar tools.  This is what
>  It's my "shar2" tool.
>It's a consequence of
>splitting files to make equal size archives rather than doing an
>aproximation with whole size.

OK guys, let's get together so that the next 'shar' coming from
whomever does some much-needed things, like:

1) handle automatic compression followed by {uuencoding,btoa'ing}
   so that those odd bits of binary junk (like icons, etc.) can be
   automatically handled.

2) handle large (>50) input files.  Recently, I had to resort to
   the ugliness of invoking 1) on a document file that was too large
   to fit into a 'cshar' archive.  The document was palatable
   to the network, but not to cshar.

I think it's important that the *.sources moderator realize that
there are times when these things are needed.  If Rich produces
a new shar program as *.sources moderator, it is tantamount to becoming
the 'standard'.  I know that when I was selecting a shar program among
the many available, I picked the current cshar simply because the
moderator of *.sources had written it.  And then had to hack it
to do the binary stuff.  I hope the 'standard' covers the needs of the
posters.

-- 
Rick Richardson |       Looking for FAX software for UNIX ??????        mention
PC Research,Inc.|                  WE'RE SHIPPING			 your
uunet!pcrat!rick|    Ask about FaxiX - UNIX Facsimile System (tm)        FAX #
(201) 389-8963  | Or JetRoff - troff postprocessor for the HP {Laser,Desk}Jet

rsalz@bbn.com (Rich Salz) (09/23/89)

I apologize to Bill Davidsen.  I knew he was the author of the v1.22 shar,
I was just (obliquely) leaving his name off to avoid having the discussion
get personal.  If you're shipping files around to Unix sites, then v1.22
is a great tool:  it's easy to use, the auto-split is nice, etc.

For world-wide posting in places like comp.sources.unix, however, I find
that features like the above (cf. my original article with its two primary
complaints) tend to get in the way much more than they benefit.  I base
this on my reactions when I'm sent postings, and from the tons of email I
get from readers who've dropped a piece, read news on one machine and use
the cshar shell parser to unpack things on a PC, and so on.

Folks, by all means, please use the tool that suits you best.  I was just
pointing out some problems that I have with a very common one.  There's
nothing wrong with saying "this doesn't do the job for me."  In
retrospect, I probably should have written a more subdued title.
	/r$
-- 
Please send comp.sources.unix-related mail to rsalz@uunet.uu.net.
Use a domain-based address or give alternate paths, or you may lose out.

allbery@NCoast.ORG (Brandon S. Allbery) (09/23/89)

As quoted from <1989Sep22.005247.11518@pcrat.uucp> by rick@pcrat.uucp (Rick Richardson):
+---------------
| OK guys, let's get together so that the next 'shar' coming from
| whomever, does some much needed things, like:
| 
| 1) handle automatic compression followed by {uuencoding,btoa'ing}
|    so that those odd bits of binary junk (like icons, etc.) can be
|    automatically handled.
| 
| 2) handle large (>50) input files.  Recently, I had to resort to
|    the ugliness of invoking 1) on a document file that was too large
|    to fit into a 'cshar' archive.  The document was palatable
|    to the network, but not to cshar.
+---------------

This is, so to speak, "in the queue" -- I'll wait for the next release of
cshar, then add whatever is necessary.  I *do* run into these kinds of things,
since c.s.misc is inherently less structured than .unix.

++Brandon
-- 
Brandon S. Allbery, moderator of comp.sources.misc	     allbery@NCoast.ORG
uunet!hal.cwru.edu!ncoast!allbery		    ncoast!allbery@hal.cwru.edu
bsa@telotech.uucp, 161-7070 BALLBERY (MCI), ALLBERY (Delphi), B.ALLBERY (GEnie)
Is that enough addresses for you?   no?   then: allbery@uunet.UU.NET (c.s.misc)

wcs@cbnewsh.ATT.COM (Bill Stewart) (09/23/89)

In article <14502@bloom-beacon.MIT.EDU> tytso@athena.mit.edu (Theodore Y. Tso) writes:
]Even if some people have /bin/sh, they may not want to use it.  After
]all, shar archives are such a huge potential security hole.  I would

	While we're at it, can I put in a plug for trashing the
		PATH=/bin:/usr/ucb:/usr/bin
	Some of us don't run Berkeley (gasp!) and get very annoyed
	at having to go edit shar files to delete the line.

		PATH=/bin:/usr/ucb:/usr/bin:$PATH
	would do just fine, and will accommodate people whose
	filesystems are arranged differently.

]Another suggestion:  someone should write a shar which can break up
]files into several pieces (although it should try very hard to
]rearrange files to make things the right length) and which generate
]shar files that unwrap partial files and, when a shar file detects

	Has anyone written something like this?  The general case is
	a knapsack / bin-packing problem that takes a list of items
	and outputs a bunch of lists each containing less than N KB.
	Implementation issues include output formats, and whether sizes
	belong in the input or should be determined by the bin-packer.
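
	A first cut is easy even in the shell: sort by size, then pack
	greedily.  A rough sketch of next-fit decreasing (not true
	first-fit; the limit and demo file names are invented):

```shell
#!/bin/sh
# next-fit-decreasing packing of files into parts of at most LIMIT
# bytes (limit shrunk here so the demo files overflow it)
LIMIT=100

# three demo files of known sizes: 80, 60 and 30 bytes
dd if=/dev/zero of=a.demo bs=1 count=80 2>/dev/null
dd if=/dev/zero of=b.demo bs=1 count=60 2>/dev/null
dd if=/dev/zero of=c.demo bs=1 count=30 2>/dev/null

# emit "size name" pairs, biggest first, then assign parts greedily:
# open a new part whenever the current one would overflow
for x in a.demo b.demo c.demo
do
    echo "`wc -c < $x` $x"
done | sort -rn | {
    part=1
    used=0
    while read size f
    do
        if test $used -ne 0 && test `expr $used + $size` -gt $LIMIT
        then
            part=`expr $part + 1`
            used=0
        fi
        used=`expr $used + $size`
        echo "part $part: $f ($size bytes)"
    done
} > packing.out
cat packing.out
```

	True first-fit (re-trying every open part) buys little more, which
	may be why, as noted later in the thread, the savings are small.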
-- 
# Bill Stewart, AT&T Bell Labs 4M312 Holmdel NJ 201-949-0705 ho95c.att.com!wcs
# also found at 201-271-4712 tarpon.att.com!wcs Somerset 4C423 Corp. Park 3

# More Colombians die from American tobacco than Americans from Colombian coke.

john@chance.UUCP (John R. MacMillan) (09/25/89)

In article <4155@cbnewsh.ATT.COM> wcs@cbnewsh.ATT.COM (Bill Stewart 201-949-0705 ho95c.att.com!wcs) writes:
|	Has anyone written something like this?  The general case is
|	a knapsack / bin-packing problem that takes a list of items
|	and outputs a bunch of lists each containing less than N KB.
|	Implementation issues include output formats, and whether sizes
|	belong in the input or should be determined by the bin-packer.

I hacked around with this a while ago, and decided it's not really
worth the effort.  Usually I saw either no reduction in the number of
parts, or one fewer, so the savings in net bandwidth isn't very big
(one set of news headers and one shar heading).
-- 
John R. MacMillan           "Don't you miss it...don't you miss it...
john@chance.UUCP             Some of you people just about missed it."
...!utcsri!hcr!chance!john        -- Talking Heads

tom@mims-iris.waterloo.edu (Tom Haapanen) (09/26/89)

Bill Stewart <wcs@cbnewsh.ATT.COM> writes:
> 	While we're at it, can I put in a plug for trashing the
> 		PATH=/bin:/usr/ucb:/usr/bin
> 	Some of us don't run Berkeley (gasp!) and get very annoyed
> 	at having to go edit shar files to delete the line.
> 
> 		PATH=/bin:/usr/ucb:/usr/bin:$PATH
> 	would do just fine, and will accommodate people whose
> 	filesystems are arranged differently.

Shells on MS-DOS also choke on this; you might be on d: while your /bin
is on c:, and then the shell script can't do anything at all...  :-(
Please, let's change this!

					\tom haapanen
"now, you didn't really expect          tom@mims-iris.waterloo.edu
 my views to have anything to do        watmims research group
 with my employer's, did you?"          university of waterloo

davidsen@crdos1.crd.ge.COM (Wm E Davidsen Jr) (09/26/89)

In article <4155@cbnewsh.ATT.COM>, wcs@cbnewsh.ATT.COM (Bill Stewart 201-949-0705 ho95c.att.com!wcs) writes:

|  ]Another suggestion:  someone should write a shar which can break up
|  ]files into several pieces (although it should try very hard to
|  ]rearrange files to make things the right length) and which generate
|  ]shar files that unwrap partial files and, when a shar file detects
|  
|  	Has anyone written something like this?  The general case is
|  	a knapsack / bin-packing problem that takes a list of items
|  	and outputs a bunch of lists each containing less than N KB.
|  	Implementation issues include output formats, and whether sizes
|  	belong in the input or should be determined by the bin-packer.

  The default behavior of shar2 (up to v1.25 now) is to break up the
total collection of data into M files of size S each. I usually roll
mine to be about 50k, allowing headers, etc, to be added without hitting
the magic 64k limit.

  shar2 handles text, binary, or a mixture thereof, using automatic
recognition or manual specification of binary files.
-- 
bill davidsen	(davidsen@crdos1.crd.GE.COM -or- uunet!crdgw1!crdos1!davidsen)
"The world is filled with fools. They blindly follow their so-called
'reason' in the face of the church and common sense. Any fool can see
that the world is flat!" - anon

mcdonald@aries.uiuc.edu (Doug McDonald) (09/26/89)

If one is going to distribute things in a shar-like format, it
is really important that the protocol to shar and un-shar be described
in a way that is easily implemented as ordinary C (or whatever)
code.  Using a UNIX shell is not satisfactory IF the operations
cannot be conveniently done by a relatively simple program.
This is so people who don't use UNIX can use these files.  Binary
files should be clearly distinguished from text files. 

Doug McDonald(mcdonald@uxe.cso.uiuc.edu)

davidsen@crdos1.crd.ge.COM (Wm E Davidsen Jr) (09/26/89)

In article <1989Sep25.195540.18104@ux1.cso.uiuc.edu>, mcdonald@aries.uiuc.edu (Doug McDonald) writes:

|  code. Using a UNIX shell is not satisfactory IF the operations
|  cannot be conveniently done by a relatively simple program.
|  This is so people who don't use UNIX can use these files. Binary
|  files should be clearly distinguished from text files. 

  We're talking about comp.sources.unix. Making a compromise in error
checking to allow non-unix people to use unix sources isn't a really
good idea. Since everyone using unix has /bin/sh, it's the unpacker of
choice. The paranoid can use a chroot script to prevent possible side
effects. Unpackers are nice, useful, etc., but not everybody has one,
so common sense dictates that we use something everyone has.

-- 
bill davidsen	(davidsen@crdos1.crd.GE.COM -or- uunet!crdgw1!crdos1!davidsen)
"The world is filled with fools. They blindly follow their so-called
'reason' in the face of the church and common sense. Any fool can see
that the world is flat!" - anon

slocum@hi-csc.UUCP (Brett Slocum) (09/27/89)

Well, I too dislike the 'continued'-style shar files.  Besides the
previously mentioned problems, my site doesn't get multi-part postings
in order, so I must either save the pieces and then unshar them (I
usually use |unshar straight from 'rn'), or 'unread' all the parts
until they all arrive and then |unshar them.  If not all the parts
come, I have to save them all (after having skipped them for the week
or so of waiting for missing parts) and request the missing part from
the archive site.  With the other kind of shar, I can |unshar from
'rn' in any order (usually), and if there are missing parts, I can
simply write to the archive site immediately.


-- 
Brett Slocum, Honeywell SSDC, Golden Valley, Minnesota
<uunet!hi-csc!slocum>              | AIDS is a virus; George Bush
  or <hi-csc!slocum@uunet.uu.net>  | is a punishment from God.

chip@vector.Dallas.TX.US (Chip Rosenthal) (09/27/89)

In article <495@crdos1.crd.ge.COM> davidsen@crdos1.UUCP (bill davidsen) writes:
>  The default behavior of shar2 (up to v1.25 now) is to break up the
>total collection of data into M files of size S each.

As others have asked, I'd also like to see the feature which checks that
parts 1 through N-1 have been processed before part N is removed.

Here is the scenario.  I have an 8-part shar archive compressed and
stored in Part1.Z through Part8.Z.  I want to unshar it, using my
"safe unshar" program.  So I say:

	% foreach file ( /archive_dir/Part?.Z )
	zcat $file | unshar
	end

Braphz.  Outa luck.  Parts 2 through 8 won't extract because the required
stuff isn't in the directory where the unshar happens.
-- 
Chip Rosenthal / chip@vector.Dallas.TX.US / Dallas Semiconductor / 214-450-5337
Someday the whole country will be one big "Metroplex" - Zippy's friend Griffy

mcdonald@uxe.cso.uiuc.edu (09/27/89)

In article <1989Sep25.195540.18104@ux1.cso.uiuc.edu>, mcdonald@aries.uiuc.edu (Doug McDonald) writes:

|  code. Using a UNIX shell is not satisfactory IF the operations
|  cannot be conveniently done by a relatively simple program.
|  This is so people who don't use UNIX can use these files. Binary
|  files should be clearly distinguished from text files. 
Bill davidsen replies:

>  We're talking about comp.sources.unix. Making a compromise in error
>checking to allow non-unix people to use unix sources isn't a really
>good idea. Since every one using unix has /bin/sh it's the unpacker of
>choice. The paranoid can use a chroot script to prevent possible side
>effects. Unpackers are nice, useful, etc, but everybody doesn't have
>one, so common sense dictates that we use something you have.

But not everything posted to comp.sources.unix really IS unix specific!!!
And, even if it is, somebody might want to port it to some more
common system. The old-fashioned "shar" that seems to have been standard
up to now can be undone by relatively simple portable tools. Let's
keep it that way.

Besides- there is another aspect - people seem to post shar files
to other sources groups too. If some new "shar" starts out only in
comp.sources.unix, it might spread like a plague to other groups.
There is no need to compromise anything - just be sure that
what is done in one place can be undone in another.

Doug McDonald

dg@lakart.UUCP (David Goodenough) (09/27/89)

tytso@athena.mit.edu (Theodore Y. Tso) sez:
> Even if some people have /bin/sh, they may not want to use it.  After
> all, shar archives are such a huge potential security hole.  I would
> feel much safer using a parser that didn't allow such constructs as
> "(/bin/rm -rf /) &".  

Agreed. I _NEVER_, repeat _NEVER_, repeat _NEVER_ pass shar files
anywhere near /bin/sh. Instead, I use a heavily hacked version of
Craig Noborg's unshar, written in C. I do this for the maps, and
for any shell archive that arrives in *.sources. By and large it does
the job admirably, even to the point of being able to handle mkdir,
cd (i.e. for multiple directory shars), and split source shell archives.
I take the following approach to Bill D.'s shars: if the output redirection
construct is '>' then just unwrap the file. If it's '>>' and there is no
file already present, then say so, but carry on. That way, you can get bits
out, even if you're missing files. Of course, ugly things will happen if
a huge file is split into three shar files, and you unpack the first and
third w/o doing the second.

However, I'm going to agree with Rich Salz by way of saying that I wish I
hadn't needed to hack it to the degree that I did to be able to handle
some of the stuff that's out there. Remember the KISS principle.
-- 
	dg@lakart.UUCP - David Goodenough		+---+
						IHS	| +-+-+
	....... !harvard!xait!lakart!dg			+-+-+ |
AKA:	dg%lakart.uucp@xait.xerox.com			  +---+

guy@auspex.auspex.com (Guy Harris) (09/28/89)

 >The paranoid can use a chroot script to prevent possible side
 >effects.

The paranoid may not have the root privileges that this requires.

davidsen@crdos1.crd.ge.COM (Wm E Davidsen Jr) (09/28/89)

In article <755@vector.Dallas.TX.US>, chip@vector.Dallas.TX.US (Chip Rosenthal) writes:

|  Here is the scenario.  I have an 8-part shar archive compressed and
|  stored in Part1.Z through Part8.Z.  I want to unshar it, using my
|  "safe unshar" program.  So I say:
|  
|  	% foreach file ( /archive_dir/Part?.Z )
|  	zcat $file | unshar
|  	end
|  
|  Braphz.  Outa luck.  Parts 2 through 8 won't extract because the required
|  stuff isn't in the directory where the unshar happens.

  Here's the scenario. There is a broken unshar which can't handle shell
commands. It fails and the shell script is blamed.

  I just tried exactly this, substituting /bin/sh for unshar and it
works fine. The sequence file is kept in the current directory. The
unpack is done in the current directory. No problems, no failure.
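
In other words, the working Bourne-shell version of the loop can be shown end to end. A toy sketch, with two uncompressed parts standing in for the Part?.Z files (the zcat step elided; the sequence-file convention modeled on shar2's):

```shell
#!/bin/sh
# two toy shar2-style parts unpacked in order by /bin/sh itself;
# the sequence file lives in the current directory in between
cat > Part1.sh << 'EOF'
echo "hello from part 1" > demo.out
echo 2 > s2_seq_.tmp
EOF

cat > Part2.sh << 'EOF'
test "`cat s2_seq_.tmp`" = 2 || { echo "unpack part 2 in order!"; exit 1; }
echo "hello from part 2" >> demo.out
rm -f s2_seq_.tmp
EOF

for file in Part1.sh Part2.sh
do
    sh "$file" || exit 1
done
```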

  If the problem is that your unshar can't handle the files, that's a
legitimate problem, but it will be better solved if the cause is
understood. shar2 produces complex shell scripts which do error
checking. If someone wants the ability to bypass the error checking,
that's a good idea, and I will probably put it in the new version via
an environment variable, such as ERRCHK={yes,no,warn} or some such. I
don't intend to produce a version which generates no error-checking
code at all.
-- 
bill davidsen	(davidsen@crdos1.crd.GE.COM -or- uunet!crdgw1!crdos1!davidsen)
"The world is filled with fools. They blindly follow their so-called
'reason' in the face of the church and common sense. Any fool can see
that the world is flat!" - anon

davidsen@crdos1.crd.ge.COM (Wm E Davidsen Jr) (09/29/89)

In article <45400009@uxe.cso.uiuc.edu>, mcdonald@uxe.cso.uiuc.edu writes:

|  Besides- there is another aspect - people seem to post shar files
|  to other sources groups too. If some new "shar" starts out only in
|  comp.sources.unix, it might spread like a plague to other groups.
|  There is no need to compromise anything - just be sure that
|  what is done in one place can be undone in another.

  Well, shar2 has been out for over two years now, if that's new. It has
certainly been popular in some groups where files are large or binary
data must be sent (data or files with control characters).

  If there was a way to make all of the unshar programs work, I would do
it. But there are programs in Basic, C, etc, which simply can't do error
checking on archive order. I want error checking and therefore use
shar2. It's my idea of "safe shar," and I really am not offended if
other people want to use non-error checking versions, I just don't want
to do it myself.

  I think diversity is great, and there is cross-pollination. I believe
that the new version Rich is making will break files to limit posting
size (I thought I saw that mentioned). In the meantime I will smile and
keep on using what works for me, with the only unshar I'm sure you have
(/bin/sh). 
-- 
bill davidsen	(davidsen@crdos1.crd.GE.COM -or- uunet!crdgw1!crdos1!davidsen)
"The world is filled with fools. They blindly follow their so-called
'reason' in the face of the church and common sense. Any fool can see
that the world is flat!" - anon

chip@vector.Dallas.TX.US (Chip Rosenthal) (09/30/89)

In article <691@lakart.UUCP> dg@lakart.UUCP (David Goodenough) writes:
>Remember the KISS principle.

Absolutely.  I mentioned something to Bill in private email which
warrants consideration:

    The shar archive format should not be built for the convenience of
    the sender, but rather for the convenience of the receiver.

From this, follows the notion that the simpler the format the better.

There are all sorts of nice things which a shar archive can do
automatically:  concatenate files too big to fit reasonably in one
archive, automatically uudecode binary files, etc.  However, there is
no reason why these things need to be built into the archive.  Why not
just provide a "Runme.first" script which contains the few simple shell
commands to do these things?  A fancy shar program might build such a
file automatically.  This is more difficult than building these
commands on the fly and making them part of the archive, but that is
trading off sender (and shar author's) convenience for receiver
convenience.
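
A hypothetical Runme.first along those lines might contain nothing more than a cat and an rm. A toy sketch (all file names invented; the setup lines stand in for what the unpacked archive parts would have left behind):

```shell
#!/bin/sh
# set up two stand-in pieces, as unpacked archive parts would have
# left them
echo "first piece"  > bigfile.aa
echo "second piece" > bigfile.ab

# --- what a generated Runme.first might contain ---
cat bigfile.aa bigfile.ab > bigfile || exit 1
rm -f bigfile.aa bigfile.ab
echo "bigfile rebuilt from 2 pieces"
```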

BTW, this isn't an absolute.  My own shar script does build in a
couple of things, like the clobber check and unpack check via "wc".
Like any engineering activity, there are tradeoffs when you decide
which features to use and which to leave out.
-- 
Chip Rosenthal / chip@vector.Dallas.TX.US / Dallas Semiconductor / 214-450-5337
Someday the whole country will be one big "Metroplex" - Zippy's friend Griffy