derosa@motcid.UUCP (John DeRosa) (03/15/91)
I have just finished un-binhexing, unstuffing, and uncompacting 70 files from
the Sumex archives, and have endured just about every style of archived
Macintosh file that could exist.  I have come to the conclusion that there
should be a standard for how items are archived before being placed in the
Sumex archives.  At present the only standard appears to be choice-du-jour:
whatever the latest tool of choice is, used however the submitter feels like
using it.  In the dim dark past it was PackIt, then came StuffIt, and now
Compactor is all the rage.  BinHex appears to be the only constant.

The point is that having all of the archived documents in one format will
ease the job of the receiving user.  Even though MOST newly archived
programs use Compactor, even this pseudo-standard has variations:
self-extracting or not, folders or not.  By setting a standard and by
streamlining the whole recovery process, many man-years of effort will be
saved.  Consider how many people recover a particular document, while only
one person is involved in archiving it.  Adding a few steps at the front end
will benefit those of us at the receiving end.  Ultimately, I would like to
see a single tool and a single dialog window perform the complete recovery,
BinHex decode and decompress, with batch processing as well.

I would like to propose the standard outlined below.  I know that this will
generate many comments, and possibly settle nothing ;-), but maybe, just
maybe, the world will be a better place.

Sumex Archival Standard for Macintosh - Revision 1.0
====================================================

1) If there is more than one document to be in the new archive, put them in
   a folder.  This eases the recovery process: it does not necessitate the
   independent creation of a folder prior to recovery, nor does it litter
   the hard drive with multiple "read me" files.

2) Try as best as possible to have the name of each file in the archive
   refer to the main file in some manner.
   This will facilitate grouping "lost" files with their parent file or
   application.

3) If there is only a single document to be in the new archive and it is
   less than 20K in size, DO NOT compress the document.  A document of this
   size benefits very little from compression, and compressing it only adds
   another step to the recovery process.

4) If the document or folder is 20K or larger, compress the document/folder
   with Compactor as a self-extracting archive (.sea).

   Why not StuffIt, you might ask?  In this area I am bending to popular
   demand: Compactor seems to be more widely used and is purported to be
   faster.  I personally prefer StuffIt, as it incorporates BinHex decoding.
   A large thank you to Raymond Lau for StuffIt, which has been so popular
   for so long.  He set the standards.  I have been told that a later
   version of Compactor will have BinHex decoding also.  How about batch
   modes and .hqx file recombining?

   Why self-extracting, you might ask?  If the file is self-extracting,
   then the receiver does not need Compactor to decompress it.  A simple
   double click and, poof, a folder is born.

5) Encode the file in BinHex format.  Make sure that the name of this final
   archive file reflects the content and, if applicable, the version number
   of the program.  Names such as file1.hqx are no help at all.  A name
   like acmetool.1.3.6b8.hqx makes things perfectly clear.

6) When you mail the compressed and binhexed file to the archives, be sure
   to add a short paragraph telling what the program/file is all about,
   complete with version number.  MAKE SURE THAT YOU USE 80 COLUMNS OR LESS
   for the paragraph width.  If you go beyond 80 columns, the file is
   converted to a Lpunch format, which adds another step to the recovery
   process.  A simple way of making sure this does not happen is to add the
   .hqx file to the mailer first, then use the right edge of the file as a
   guide: do not type to the right of the right edge of the file.
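The 80-column rule in point 6 is easy to check mechanically before mailing.
A minimal sketch of such a check (the function name and the sample
description are illustrative, not part of the standard):

```python
# Minimal sketch: verify that a description paragraph obeys the 80-column
# limit from point 6 before it is mailed to the archives.

def check_columns(text, limit=80):
    """Return a list of (line_number, length) pairs for offending lines."""
    offenders = []
    for num, line in enumerate(text.splitlines(), start=1):
        if len(line) > limit:
            offenders.append((num, len(line)))
    return offenders

# Hypothetical description paragraph for an archive submission:
description = (
    "AcmeTool 1.3.6b8 -- a desk accessory that frobs resource forks.\n"
    "Requires System 6.0.5 or later.\n"
)
print(check_columns(description))  # [] when every line fits in 80 columns
```

An empty list means the paragraph is safe to paste above the .hqx file.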
I hope that I have added to the general good with this long-winded
diatribe.  I am sure that I will find out.
-- 
= John DeRosa, Motorola, Inc, Cellular Infrastructure Group           =
= e-mail: ...uunet!motcid!derosaj, motcid!derosaj@uunet.uu.net        =
= Applelink: N1111                                                    =
= I do not hold my employer responsible for any information in this message =
lfk@eastman1.mit.edu (Lee F. Kolakowski) (03/20/91)
I too have been downloading stuff for our new machines, and the kinds of
problems I have hit are as described in the article above.  I think the
need for a standard is obvious, but we may not all have the same needs.  On
Unix systems the standard for binary transmission is a compressed tar file:
it retains directory structure and handles both text and binary files.  For
mailing sources there are other standards.  But since the machines
connected to the net are largely not Macintoshes, and there are Unix tools
to dehex and unsit files, I recommend the following modifications to the
suggested standard.

On 14 Mar 91 22:55:12 GMT, derosa@motcid.UUCP (John DeRosa) said:

> Sumex Archival Standard for Macintosh - Revision 1.0
> ====================================================
> [points 1-3 deleted to save bandwidth]
> 4) If the document or folder is 20K or larger then compress
>    the document/folder with Compactor as a self-extracting
>    archive (.sea).

Use Stuffit instead, so we can peer at the contents on a Unix machine.

> [rationale paragraphs and points 5-6 deleted to save bandwidth]

7) Save all documentation in ascii text files that can be read and printed
   on non-Mac machines.

-- 
Frank Kolakowski
=======================================================================
|lfk@athena.mit.edu or lfk@eastman1.mit.edu or kolakowski@wccf.mit.edu|
| Lee F. Kolakowski                  M.I.T.                           |
| Dept of Chemistry                  Room 18-506                      |
| 77 Massachusetts Ave.              Cambridge, MA 02139              |
| AT&T: 1-617-253-1866               #include <disclaimer.h>          |
=======================================================================
||Desert Storm - Lasers have made this the cleanest *dirty war* ever.||
=======================================================================
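The compressed-tar convention mentioned above (tar to keep the directory
structure, compress(1) to shrink it) can be sketched with Python's standard
library.  gzip stands in here for the LZW-based compress(1), which has no
stdlib equivalent, and all file names are made up for the example:

```python
# Sketch of the Unix "compressed tar" convention: one archive that keeps
# the directory structure and compresses text and binary files alike.
# gzip substitutes for compress(1); names are illustrative.
import os
import tarfile
import tempfile

def pack(folder, archive):
    """Create a compressed tar preserving the folder's structure."""
    with tarfile.open(archive, "w:gz") as tar:
        tar.add(folder, arcname=os.path.basename(folder))

def unpack(archive, dest):
    """Recreate the folder hierarchy under dest."""
    with tarfile.open(archive, "r:gz") as tar:
        tar.extractall(dest)

with tempfile.TemporaryDirectory() as tmp:
    src = os.path.join(tmp, "MyProgram")
    os.makedirs(src)
    with open(os.path.join(src, "ReadMe"), "w") as f:
        f.write("AcmeTool 1.3.6b8 documentation\n")
    pack(src, os.path.join(tmp, "MyProgram.tar.gz"))
    unpack(os.path.join(tmp, "MyProgram.tar.gz"), os.path.join(tmp, "out"))
    print(sorted(os.listdir(os.path.join(tmp, "out", "MyProgram"))))  # ['ReadMe']
```

The round trip lands the files back inside a folder named after the
archive, which is exactly the behavior point 1 of the standard asks for.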
johnston@oscar.ccm.udel.edu (Bill Johnston) (03/20/91)
In article <6008@crystal9.UUCP>, derosa@motcid.UUCP (John DeRosa) writes...

>I came to the conclusion that there should be a standard
>for how items should be archived prior to being placed
>in the Sumex archives.

Yes, indeed.  When this was last debated, StuffIt 1.5.1 was chosen because
the format is OPEN.  Many folks in the world-wide net.community rely on
Unix machines, for which "sit" and "unsit" utilities are available as
freeware.  Running "mcvert" and "unsit" on one's Unix host allows the user
to download binaries, which are significantly smaller than the
corresponding .hqx files (and thus faster to download).  These open-format
utilities are available in "shar.Z" format in the /mac/unix-utilities
directory of rascal.ics.utexas.edu [128.83.138.20].

Because the format is open, the files can be accessed and used by non-Mac
people; for example, the PC-clone folks are now downloading sound files
from the Mac archives!  These folks know about "unsit", but using Compact
Pro, DiskDoubler, Diamond, and other Mac-only formats gets in the way of a
good thing: the beginnings of genuine platform independence for computer
data.  This problem even affects Mac users running A/UX; some of the "new,
fast" compression tools aren't 32-bit clean.  (The new StuffIts are.)

Yes, several of the new utilities run faster and compress files more
efficiently (on the Macintosh only) than any member of the StuffIt family.
But using them puts a wall up around the Mac community that we don't need.

A similar situation exists in the PC camp.  Good archive formats like .zip
and .arc have been introduced, supported by powerful shareware applications
and rudimentary freeware dearchivers.  Nevertheless, the
comp.binaries.ibm.pc group is sticking with the less efficient .zoo format
because EVERYBODY can use it.  Why take a step in the wrong direction?
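For a sense of why the raw binaries beat their .hqx counterparts in size:
BinHex-style encoding recodes 8-bit data into a 64-character alphabet,
emitting 4 characters per 3 input bytes, plus a newline every 64 columns.
A rough estimate of that expansion (this ignores BinHex 4.0's run-length
pre-compression and its header/CRC overhead, so it is only a sketch):

```python
# Rough estimate of the .hqx size of a binary: 8-bit-to-6-bit recoding
# emits 4 characters per 3 input bytes, plus one newline per 64 columns.
# (Real BinHex 4.0 also run-length-compresses first; ignored here.)

def hqx_size_estimate(nbytes, columns=64):
    chars = -(-nbytes * 4 // 3)      # ceil(nbytes * 4/3) encoded characters
    newlines = -(-chars // columns)  # one newline per line of output
    return chars + newlines

size = 100_000                        # a 100K binary...
print(hqx_size_estimate(size))       # 135418: roughly 35% larger as text
```

So downloading the decoded binary saves about a quarter of the transfer,
before any modem-level considerations.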
The future of computing can't POSSIBLY lie in a host of incompatible and
proprietary archival formats.  Until the Deluxes, the Compact Pros, and the
DiskDoublers of the world realize that the whole world doesn't have a Mac
(or the time to keep up with forty million formats), let's stick with
something that WORKS for everybody.

Bill (johnston@oscar.ccm.udel.edu)
jeffb.bbs@shark.cs.fau.edu (Jeffrey Boser) (03/20/91)
These posts are for the most part forgetting one important aspect of
Macintosh implementation: FREEDOM!

1) Putting multiple files in a folder to be compressed inhibits *my*
   freedom to extract them where I want them to go.

2) Using text files for docs does not let the creator use graphics and
   PICTs (as complicated programs sometimes require) as he wishes.
   Besides, MacWrite, Word, and WriteNow have fairly standard formats and
   can be handled by most word processors.

3) Some compression techniques work better than others, and although 5%
   might seem like very little, on a 1M file it means 50K of download time
   (and if you are paying for connect time, every little bit helps).

4) Compatibility with other machines is NOT the responsibility of the
   format creator.  I.e., is it IBM's fault that Macs and other machines
   cannot read their disks?  If you have Unix, why don't *you* write a
   program to let you peek into compressed files?

But basically, Macs have the extreme advantage of having VERY standard
tools and file formats.  I have seen over twenty IBM word processors, each
not giving a damn about the others' formats.  By owning StuffIt and
Compactor, a person has direct and uninhibited access to 99.99% of all Mac
software.  A common word processor would be nice for docs, too.

.....Jeff
dorner@pequod.cso.uiuc.edu (Steve Dorner) (03/20/91)
In article <oHi4y2w163w@shark.cs.fau.edu> jeffb.bbs@shark.cs.fau.edu (Jeffrey Boser) writes:

>4) Compatability with other machines is NOT the responsability of the
>   format creator. ie: is it IBM's fault that macs and other machines
>   cannot read their disks? if you have unix, why dont *you* write
>   a program to let you peek into compressed files?

It is the responsibility of the author of the compression software to make
his or her format public, so we CAN write programs to peek into their
files.  That's the beef with some of these newer archivers; they won't pony
up their format, so nobody can do diddly with their archives except on a
Mac, using their program.

"Oop! Ack! Phhthhh!," in the words of one past Presidential candidate.
-- 
Steve Dorner, U of Illinois Computing Services Office
Internet: s-dorner@uiuc.edu  UUCP: uunet!uiucuxc!uiuc.edu!s-dorner
lrm3@ellis.uchicago.edu (Lawrence Reed Miller) (03/20/91)
In article <oHi4y2w163w@shark.cs.fau.edu> jeffb.bbs@shark.cs.fau.edu (Jeffrey Boser) writes, in part:

>4) Compatability with other machines is NOT the responsability of the
>   format creator. ie: is it IBM's fault that macs and other machines
>   cannot read their disks? if you have unix, why dont *you* write
>   a program to let you peek into compressed files?
>[stuff deleted to save bandwidth]
>.....Jeff

The StuffIt 1.5.1 format is published.  The Compactor format is not.  How
am I supposed to write a UNIX program to decompress Compactor archives if I
can't get hold of the format and then freely distribute the source code so
that the utility can be compiled on other UNIX machines?  Am I supposed to
"figure out the format for myself" or something?

I don't care whose "responsibility" it is to make public archives readable
on non-Mac platforms.  The fact is that if your archives are in Compactor
(or StuffIt Deluxe) format, they _can't_ be read.  Compactor isn't even
32-bit clean.  Ever seen it run under the A/UX 32-bit Finder?  If the
author of the program can't even go to the trouble of writing his program
to Apple specs (i.e., 32-bit cleanliness), do you really think it will make
a good archive format?  Macs running A/UX are still Macs... shouldn't we be
allowed to use the archive format, too?

Lawrence Miller

PS: Macs with SuperDrives _can_ read IBM disks.  So can NeXTs (with
floppies).  VAXen can't, but who cares?  Of course, you need special
hardware if the disk is 5.25", but we are talking about formats, right
(_not_ media)?  I guess the IBM disk format is public :^)
llvvll@mixcom.COM (James R. Macak) (03/20/91)
In article <6008@crystal9.UUCP> derosa@motcid.UUCP (John DeRosa) writes:

>Sumex Archival Standard for Macintosh - Revision 1.0
>====================================================
>4) If the document or folder is 20K or larger then compress
>   the document/folder with Compactor as a self-extracting
>   archive (.sea).
>
>   [reasons for choosing Compactor deleted]
>
>   Why self-extracting, you might ask?  If the file is
>   self extracting, then the receiver does not
>   need Compactor when decompressing.  A simple double
>   click and, poof, a folder is born.

I think it would be better _not_ to use self-extracting archives as the
standard.

First of all, why add those extra K of self-extracting code to every single
file that is posted to sumex, the net, etc.?  Multiply that 15-20K (or
whatever it is) by hundreds of files, and that redundant self-extracting
code starts adding up rather quickly.

Secondly, Compact Pro is shareware, and there is even a freeware extractor
application.  The same double click that launches a self-extracting archive
will launch the application that extracts the file in question.  Remember,
this is the Mac, and it can do that for you.  Let's take advantage of its
capabilities... ;-)

Finally, I'm always more than a bit wary about launching an application
that I've just downloaded to my Mac without doing a virus check first.  In
the case of self-extracting archives, I have to check the self-extracting
archive, launch it, and then check the products of the self-extraction.
Seems like a real waste of effort to me.
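A back-of-envelope illustration of that first point; the stub size is the
15-20K figure guessed above, and the file count is purely an assumption:

```python
# Back-of-envelope cost of the redundant self-extractor stub.
# Both numbers are assumptions, not measurements of any real archive site.
stub_k = 18    # assumed size of the self-extraction code, in K
files = 500    # assumed number of .sea archives at a site
print(stub_k * files)   # 9000 -- roughly 9 MB of duplicated code
```

At 1991 disk and modem prices, nine megabytes of identical code stored and
transmitted over and over is not a trivial cost.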
Hence I'd recommend that self-extracting archives be banned under the
proposed standard.  There are too many drawbacks to them, and the proposed
advantage is questionable.

Jim
-- 
macak@mixcom.UUCP (James R. Macak)      "I'm curious, Doctor, why is it
uunet!uwm!mixcom!macak                   called 'M-5' and not 'M-1'?"
<< All my own opinions. >>
d88-jwa@byse.nada.kth.se (Jon Wätte) (03/21/91)
In article <6008@crystal9.UUCP> derosa@motcid.UUCP (John DeRosa) writes:
> 4) If the document or folder is 20K or larger then compress
>    the document/folder with Compactor as a self-extracting
>    archive (.sea).
>    Why not Stuffit, you might ask?  In this area I am
Popular demand most probably still is StuffIt because:
a) it's still more widespread and used in many sites
b) there exist tools for unpacking of stuffit archives on
non-mac platforms.
I personally de-binhex files on the unix box I ftp from,
then unsit them and compress them instead, easing the
process of transferring to A/UX and saving a lot of modem
or disk space.
> Why self-extracting, you might ask?  If the file is
Non-mac boxes can't run self-extractors. They are out.
It also takes space...
Maybe, just maybe, I'll throw something together that
a) supports files & folders
b) runs on unix _and_ mac
c) is compatible with compress(1)
The File & Folder stuff would by necessity be special, but
I would provide source for the world...
(I have most of these tools already working under Mac & A/UX,
but not stable nor streamlined enough for release)
Happy hacking,
h+@nada.kth.se
Jon Wätte
--
"The IM-IV file manager chapter documents zillions of calls, all of which
seem to do almost the same thing and none of which seem to do what I want
them to do." -- Juri Munkki in comp.sys.mac.programmer
resnick@cogsci.uiuc.edu (Pete Resnick) (03/21/91)
jeffb.bbs@shark.cs.fau.edu (Jeffrey Boser) writes:

> These posts are for the most part forgetting one important
>aspect of macintosh implementation: FREEDOM!
>1) Putting multiple files in a folder to be compressed inhibits *my*
>   freedom to extract them where I want them to go.

You've missed the point: If someone gets an archive without everything in a
folder, and then wants to just extract everything (as is easiest on a Unix
box), you end up with a mess of files all over your current directory
unless you first create a new directory for the un-archived files and move
the archive there to extract things.  When there is only one file in the
archive, this is not necessary.  I like the idea of the person putting them
all in one directory if there is more than one.  I can always move them
later, and current versions of Stuffit now allow me to go through the
folders in the archive anyway.

>2) Using Text files for docs does not let the creator to use graphics
>   and picts (as complicated programs sometimes require) as he wishes.
>   Besides, MacWrite, Word, and WriteNow have fairly standard formats
>   and can be handled by most word processors.

Using programs which create mini-applications for docs is much nicer than
MacWrite, Word, or some other thing that at least *someone* is bound not to
have.  You can also put graphics and PICTs in TeachText, though it is less
fun.

pr
-- 
Pete Resnick (...so what is a mojo, and why would one be rising?)
Graduate assistant - Philosophy Department, Gregory Hall, UIUC
System manager - Cognitive Science Group, Beckman Institute, UIUC
Internet/ARPAnet/EDUnet : resnick@cogsci.uiuc.edu
BITNET (if no other way) : FREE0285@UIUCVMD
cycy@isl1.ri.cmu.edu (Cowboy) (03/21/91)
In article <oHi4y2w163w@shark.cs.fau.edu>, jeffb.bbs@shark.cs.fau.edu (Jeffrey Boser) writes:

First, I vote for keeping Stuffit 1.5.1.

Second:

> all mac software. a common word processor would be nice for docs too.

Well, there is TeachText.

-- Chris.
-- 
-- Chris. (cycy@isl1.ri.cmu.edu)     "People make me pro-nuclear."
                                                -- Margarette Smith
jba@gorm.ruc.dk (Jan B.Andersen) (03/21/91)
d88-jwa@byse.nada.kth.se (Jon Wätte) writes:

>Maybe, just maybe, I'll throw something together that
>a) supports files & folders
>b) runs on unix _and_ mac
>c) is compatible with compress(1)

I don't know if they are 32-bit clean or whether they work under A/UX, but
MacTar and MacCompress are already available.  I also happen to like the
(original) Unix philosophy of creating small, independent filters, and with
the MultiFinder-only System 7.0, I see no need to combine BinHex, tar, and
compress into one ugly application.

---
Jan B. Andersen (jba@dat.ruc.dk)                     Hurra for Broendby!!!!
jba@gorm.ruc.dk (Jan B.Andersen) (03/21/91)
First jeffb.bbs@shark.cs.fau.edu (Jeffrey Boser) writes:

>>2) Using Text files for docs does not let the creator to use graphics
>>   and picts (as complicated programs sometimes require) as he wishes.
>>   Besides, MacWrite, Word, and WriteNow have fairly standard formats
>>   and can be handled by most word processors.

Then resnick@cogsci.uiuc.edu (Pete Resnick) replies:

>Using programs which create mini-applications for docs is much nicer
>than MacWrite, Word, or some other thing that at least *someone* is
>bound not to have. You can also put graphics and picts in TeachText,
>though it is less fun.

While everybody seems to agree that it's _a good thing_ to keep the
compression formats open, so one is able to examine the stuff on Unix
boxes or PCs, why doesn't the same argument hold for documentation?
Correct me if I'm wrong, but formats like [nt]roff, tbl, and pic are open
and available on several platforms (except the Mac??).  Why can't we stick
with these?

---
Jan B. Andersen (jba@dat.ruc.dk)                        Hurra for Broendby!!
minich@unx2.ucc.okstate.edu (Robert Minich) (03/22/91)
by resnick@cogsci.uiuc.edu (Pete Resnick):

|>1) Putting multiple files in a folder to be compressed inhibits *my*
|>   freedom to extract them where I want them to go.
|
| You've missed the point: If someone gets an archive without everything
| in a folder, and then wants to just extract everything (as is easiest
| on a Unix box), you end up with a mess of files all over your current
| directory unless you first create a new directory for the un-archived
| files and move the archive there to extract things.  When there is only
| one file in the archive, this is not necessary.  I like the idea of the
| person putting them all in one directory if there is more than one.
| I can always move them later, and current versions of Stuffit now
| allow me to go through the folders in the archive anyway.

How about a compromise?  Personally, I think the archiver programs should
have an option to create a new folder to unstuff into, for those who
don't/can't use MultiFinder.  However, I think a reasonable work-around is
to put an empty folder in the archive (with a name similar to that of the
archive, not "Empty Folder" :-) that could be "unstuffed" first, with
everything else then unstuffed into it if we like.

The extra folder won't help batch unstuffing, but then again I consider it
a REAL pain when I have to unstuff a whole bunch of files at once and waste
disk space and my time just to look at the Read_Me file, only to find I
can't make use of that download in the first place!  (I like the method at
rascal.ics.utexas.edu of having an intro file, so I don't have to waste net
bandwidth either.)  So, what does the net think of an extra folder in every
.sit archive?

As for the other ideas presented in the original proposal, I agree with the
"Stuffit forever" crowd until code to extract files from Compactor Pro (or
any other archiver) is available.
-- 
|_ /|    | Robert Minich             |
|\'o.O'  | Oklahoma State University | "I'm not discouraging others from using
|=(___)= | minich@d.cs.okstate.edu   |  their power of the pen, but mine will
|   U    | - "Ackphtth"              |  continue to do the crossword."  M. Ho
francis@zaphod.uchicago.edu (03/22/91)
In article <1991Mar21.123303.9895@gorm.ruc.dk> jba@gorm.ruc.dk (Jan B.Andersen) writes:

>While everybody seems to agree, that its _a good thing_ to keep the
>compression formats open, so one is able to examine the stuff on Unix-
>boxes or PC's, why don't the same argument hold for documentation?
>Correct me if I'm wrong, but formats like [nt]roff, tbl and pic are
>open and available on several platforms (except Mac??). Why can't we
>stick with these?

'Cause they stink. :-)  (Or, at least, [nt]roff does; I don't know the
others.)  Some flavor of TeX might be good.  The main reason is that our
Mac apps are *much* nicer to create with.  Anybody know of an app that'll
create TeX output with a Mac user interface?

-- 
/==========================================================================\
| Francis Stracke             | My opinions are my own. I don't steal them.|
| Department of Mathematics   |============================================|
| University of Chicago       | Until you stalk and overrun,               |
| francis@zaphod.uchicago.edu | you can't devour anyone. -- Hobbes         |
\==========================================================================/
werner@cs.utexas.edu (Werner Uhrig) (03/22/91)
> How about a compromise?  Personally, I think the archiver programs should
> have an option to create a new folder to unstuff into for those who
> don't/can't use MultiFinder

Personally, I think the functionality to create a folder "on the fly"
should be part of Apple's standard file-dialog window.  But then, I use
QuickFolder, and I am aware of half a dozen other INITs which provide
exactly "that"...  Putting this functionality into every darn program is
the wrong way to go...

-- 
(Internet) werner@cs.utexas.edu
(BITnet)   werner@UTXVM
(UUCP)     ..!uunet!cs.utexas.edu!werner
chai@hawk.cs.ukans.edu (Ian Chai) (03/22/91)
In article <FRANCIS.91Mar21141151@daisy.zaphod.uchicago.edu> francis@zaphod.uchicago.edu writes:

>The main reason is that our Mac apps our *much* nicer to create with.
>Anybody know of an app that'll create TeX output with a Mac user
>interface?

Yeah, I would like to see a TeX <=> RTF translator, be it UNIX- or
Mac-based.

Ian Chai
eck@eniac.seas.upenn.edu (Hangnail Whipperwill) (03/22/91)
Just a note about MacCompress: it's not exactly the most reliable program.
I can't get it to work with compressed files over 20K or so.  I tried
upping the RAM, and instead of crashing on >20K files, it produced output
files on the order of five megs in size(!)  And the author has moved on to
bigger and better things...

Brian