bear@spl1.UUCP (Harry Skelton) (12/23/87)
(This message is being cross-posted to comp.binaries.ibm.pc and comp.sys.ibm.pc - sorry for that last posting to sys.ibm.pc....)

For those of you who are posting and making replies, please keep it to comp.sys.ibm.pc. I'm trying to get my act together for my postings, and seeing fights over pirated software on this group (see the aquarium program) does not help matters.

A new newsletter (with proper form feeds) will soon be posted to this group (comp.sys.ibm.pc), and the uploads of software will begin soon. PLEASE CLEAR THIS CHANNEL! (comp.binaries.ibm.pc)

Many thanks for your understanding.

-----------------------------------------------------------------------
Harry Skelton - Moderator - COMP.BINARIES.IBM.PC
-----------------------------------------------------------------------
bobmon@iuvax.UUCP (Bobmon) (12/24/87)
I guess this is another request for moderate moderation... I like those shaded banners Harry uses, and I would like to see the (source) code posted. But at 1200 bps they take a while to display, so I hope he uses them with a light hand.

Another, unrelated comment -- I'm sure the question of source-vs.-executables has come up or will. My view is that, for the MSDOS world (and sometimes for the UNIX world), posting ARChived material confers HUGE benefits in terms of organization, not to mention the compression. (I don't mention it because of the situation of usenet mailers recompressing and thereby expanding the postings...) Even though the original file may have been source code, the ARChive itself is clearly binary -- just not executable binary. For this reason, I hope ARChived source code will be considered suitable for comp.binaries.ibm.pc.

This isn't meant as a flame, just "constructive criticism". I'm glad Harry's moderating, and more power to him. Also, Happy Holidays!

Bob Montante
w8sdz@brl-smoke.ARPA (Keith B. Petersen ) (12/27/87)
I hope that any binary files posted will be ARCs. There is no error checking in uuencoded files OR in files compacted with the Unix "compress" program. ARCed files provide a built-in CRC which is checked when extracting the file members.

Also of concern is the version of uuencode used. The latest Unix version does NOT use the space character; it uses the "`" character instead. This gets around problems of truncated trailing spaces.

--
Keith Petersen
Arpa: W8SDZ@SIMTEL20.ARPA
Uucp: {bellcore,decwrl,harvard,lll-crg,ucbvax,uw-beaver}!simtel20.arpa!w8sdz
GEnie: W8SDZ
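The space-vs-backtick issue Keith describes is easy to demonstrate with Python's `binascii` module (a modern stand-in for the original uuencode tool, used here purely as an illustration): classic uuencode maps a six-bit group of zeros to the space character, which mail software may strip from line ends; newer encoders substitute "`" (value 96, which decodes modulo 64 back to 0) so the line survives transit intact.

```python
import binascii

data = b"\x00\x00\x00"  # three zero bytes: every 6-bit group encodes value 0

# Classic behaviour: zero groups become spaces, which trailing-space
# stripping in a mailer can silently destroy.
print(binascii.b2a_uu(data))

# Newer behaviour: zero groups become backticks instead.
print(binascii.b2a_uu(data, backtick=True))

# The decoder accepts both forms and recovers the same bytes.
assert binascii.a2b_uu(b"#    \n") == data
assert binascii.a2b_uu(b"#````\n") == data
```

The leading "#" in each encoded line is the length character (32 + 3 data bytes); only the data characters change between the two schemes.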
farren@gethen.UUCP (Michael J. Farren) (12/27/87)
In article <6902@brl-smoke.ARPA> w8sdz@brl.arpa (Keith B. Petersen (WSMR|towson) <w8sdz>) writes:
>I hope that any binary files posted will be ARCs. There is no error
>checking in uuencoded OR files compacted with the Unix "compress"
>program. ARCed files provide a built-in CRC which is checked when
>extracting the file members.

If AND ONLY IF the ARCs contain no text files whatsoever. To restate what I've said about eight times so far: when you compress and uuencode text files, and then send them over a typical Usenet link (which tries to compress them once again), you force every site on the net to pay for the transmission of a much larger file than they would have had to if you had just sent straight text.

These costs may not come out of your own personal pocket, but somebody's gotta pay, and there are too many sites that refuse to carry binary groups already - we don't need to cause more sites to refuse because it just plain costs too much.

I've posted facts and figures based on test runs before; I don't particularly want to do it again. Take my word for it: posting ARCed text files loses badly - very, very badly - and should NEVER be done. Post ARCed binaries if you must, but strip out the text files first, and send them separately, please?

--
Michael J. Farren              | "INVESTIGATE your point of view, don't just
{ucbvax, uunet, hoptoad}!      | dogmatize it! Reflect on it and re-evaluate
unisoft!gethen!farren          | it. You may want to change your mind someday."
gethen!farren@lll-winken.arpa  |     Tom Reingold, from alt.flame
w8sdz@brl-smoke.ARPA (Keith B. Petersen ) (12/28/87)
One of the *most* frustrating things about Usenet is the frequent errors in newsgroup postings. I can't understand why no one has addressed that problem. I see large numbers of notes about truncated postings. It's very likely that the very thing that saves money, the use of "compress" to handle net news, is the cause of this problem. I REPEAT: THERE IS NO ERROR CHECKING IN COMPRESS/UNCOMPRESS.

It is very likely that the re-postings required because of truncation nullify most of the savings!

--
Keith Petersen
Arpa: W8SDZ@SIMTEL20.ARPA
Uucp: {bellcore,decwrl,harvard,lll-crg,ucbvax,uw-beaver}!simtel20.arpa!w8sdz
GEnie: W8SDZ
farren@gethen.UUCP (Michael J. Farren) (12/28/87)
In article <6903@brl-smoke.ARPA> w8sdz@brl.arpa (Keith B. Petersen (WSMR|towson) <w8sdz>) writes:
>One of the *most* frustrating things about Usenet is the frequent errors
>in newsgroup postings. I can't understand why no one has addressed that
>problem. I see large numbers of notes about truncated postings. It's
>very likely that the very thing that saves money, the use of "compress"
>to handle net news, is the cause of this problem. I REPEAT: THERE IS NO
>ERROR CHECKING IN COMPRESS/UNCOMPRESS.

I would suggest that you learn a little bit about what you are talking about before you go talking about it. While there is, indeed, no error checking in compress, there really doesn't need to be. There IS error checking in every protocol I'm aware of that is used to send the files from machine to machine. The chances of an error cropping up in file transmission are minuscule; the chances of an error happening in decompressing a correctly-transmitted file are zero. It isn't 'very likely' that compress is the problem - it is as close to certain as makes no difference that it is NOT.

Much more likely is that articles are not passed from machine to machine unprocessed - each article is processed in some way by each site before it's sent on to the next. (At least, in 99% of the cases - there are a very few sites that just send them on and never touch them.) The truncation problems are almost always due to the fact that any article appearing on the Net has gone through any number of systems, each of which may handle a file in a different way. This is particularly true if an article has passed through an inter-net gateway before it got to you, or if it has passed through one of the many different mailer programs, each of which seems to have its own set of peculiarities. Often, as well, the original poster turns out to be the one who screwed it up, without noticing. After all, HE didn't have to decode and decompress it!
If you're really concerned about finding out how the truncation happened, trace the article back to its source - somewhere in there, you'll find the site that messed it up, and I'll bet you any amount that the culprit won't be compress.

--
Michael J. Farren              | "INVESTIGATE your point of view, don't just
{ucbvax, uunet, hoptoad}!      | dogmatize it! Reflect on it and re-evaluate
unisoft!gethen!farren          | it. You may want to change your mind someday."
gethen!farren@lll-winken.arpa  |     Tom Reingold, from alt.flame
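Both halves of this exchange turn on checksums: ARC stores a 16-bit CRC for each archive member, and the link protocols Michael mentions checksum each packet on the wire. A minimal sketch of the CRC-16 variant commonly known as CRC-16/ARC (reflected polynomial 0xA001, zero initial value; the standard check string "123456789" yields 0xBB3D):

```python
def crc16_arc(data: bytes) -> int:
    """CRC-16/ARC: reflected polynomial 0xA001, initial value 0."""
    crc = 0
    for byte in data:
        crc ^= byte
        for _ in range(8):
            # Shift right; XOR in the polynomial when a bit falls off.
            crc = (crc >> 1) ^ 0xA001 if crc & 1 else crc >> 1
    return crc

# Standard check value for this CRC variant.
assert crc16_arc(b"123456789") == 0xBB3D

# A single changed byte produces a different CRC, which is how
# extraction catches corruption that slipped past the transport.
assert crc16_arc(b"123456788") != crc16_arc(b"123456789")
```

This is why an ARC member that extracts with a matching CRC is very likely intact, while a truncated compress stream simply decodes to garbage with no complaint.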