mark@cbosgd.UUCP (Mark Horton) (01/10/85)
This is in response to a proposal misposted to net.sources for a net.sources.repost group, for repostings of things that some people didn't get. I think this is a bad idea.

If you don't get a particular article, it's probably because either (1) somebody upstream from you isn't letting that newsgroup through, or (2) somebody along your path had their /usr/spool/news filesystem fill up, preventing the whole thing from making it. In either case, reposting won't help; it will probably fail the same way again.

There is clearly a need for a solution to this problem, however. I have two ideas.

(1) We set up some machines as public repositories of "things worth saving", such as mod.sources and net.sources. We publish UUCP dialup info for them (probably a very restricted special uucp login) and anyone who wants can call them up and get what's there. We'd have to index it somehow; that is, either publish lists of what's available regularly, or else find a way to get a listing back to the requestor (who is probably not in the L.sys file on the repository). What we really want here is something like ARPANET FTP with anonymous logins (which not only works better than UUCP and is faster, but puts less load on the repository machine). Too bad we can't all connect into an ARPA-like network.

(2) We post such things regularly to a newsgroup which is not transmitted over phone lines, but only over StarGate. Presumably this implies screening, but I think we'd need that anyway to keep the "who has the source to rogue" messages off.

Mark
msk@afinitc.UUCP (Morris Kahn) (01/11/85)
I think this one is a winner, especially for sites which archive sources and may only wish to save one copy!
--
From the terminal of Morris Kahn  (...ihnp4!wucs!afinitc!msk)
Affinitec, Corp., 2252 Welsch Ind. Ct., St. Louis, MO 63146  (314)569-3450
wls@astrovax.UUCP (William L. Sebok) (01/11/85)
> ..... In either case, reposting won't help, it probably will
> do the same thing again.

Not necessarily: the obstructions (systems with full disk partitions and the like) will probably be different for a second posting. The number of sites that successfully receive at least one of the two postings is going to be larger than the number of sites that successfully receive each individual posting.
--
Bill Sebok  Princeton University, Astrophysics
{allegra,akgua,burl,cbosgd,decvax,ihnp4,noao,princeton,vax135}!astrovax!wls
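Sebok's argument can be illustrated with a back-of-the-envelope calculation. Assuming (hypothetically, this is not in the original posts) that each posting fails to reach a given site independently with some probability p, posting twice raises the chance that a site gets at least one copy:

```python
# Sketch of the "post it twice" argument under an assumed model:
# each posting independently fails to reach a site with probability p_fail.

def receive_probability(p_fail: float, postings: int) -> float:
    """Probability that a site receives at least one of `postings` copies,
    assuming independent per-posting failures."""
    return 1.0 - p_fail ** postings

p = 0.2  # hypothetical per-posting failure rate at a site
one_copy = receive_probability(p, 1)   # 0.80
two_copies = receive_probability(p, 2)  # 0.96

assert two_copies > one_copy
```

The independence assumption is exactly what Sebok is claiming: a full disk partition on one day is unlikely to recur at the same site on a second posting. If failures were perfectly correlated (an upstream site filtering the newsgroup, as Horton notes), reposting would gain nothing.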
hokey@gang.UUCP (Hokey) (01/12/85)
Far be it from me to suggest that all of the stuff sent across the Net is worthwhile, but there is a good chance that if some articles in *.sources (or a descendant) were truncated, others are missing as well. The best fix would seem to be xfernews, mentioned by spaf@gatech a while ago. I'm hoping it shows up in 2.10.3, but I don't know if that is planned.

Once I suggested that if the line count on an article was off, the posting be rejected and a new copy retransmitted. This way, "mangled" articles would not propagate across the net (due to a line-eater bug or a truncation caused by a full disk partition). If this retransmission were accompanied by a mail message to the usenet administrator on the sending machine, there would be an extra incentive to keep the software in good running order.

Needless to say, this idea has not been met with wild enthusiasm, mostly because some early versions of news add or delete an extra blank line at the end of an article, and this was viewed as a benign problem.
--
Hokey           ..ihnp4!plus5!hokey
314-725-9492
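Hokey's line-count check can be sketched as follows. This is a hypothetical illustration, not actual 2.10 news code; it assumes an article laid out as a header block, a blank line, then the body, with a `Lines:` header declaring the body length, and it allows a one-line tolerance for the benign trailing-blank-line behavior he mentions:

```python
# Hypothetical sketch of the line-count sanity check described above.
# Assumed article format: headers, one blank line, then the body.

def check_line_count(article: str, tolerance: int = 1) -> bool:
    """Return True if the Lines: header matches the body length,
    within `tolerance` lines (some early news versions add or drop
    a trailing blank line, so an exact match is too strict)."""
    head, _, body = article.partition("\n\n")
    declared = None
    for line in head.splitlines():
        if line.lower().startswith("lines:"):
            declared = int(line.split(":", 1)[1])
            break
    if declared is None:
        return True  # nothing to check against
    actual = len(body.splitlines())
    return abs(actual - declared) <= tolerance

intact = "From: hokey@gang.UUCP\nLines: 2\n\nhello\nworld\n"
truncated = "From: hokey@gang.UUCP\nLines: 5\n\nhello\nworld\n"
assert check_line_count(intact)
assert not check_line_count(truncated)
```

A receiving site that rejected articles failing this check, and mailed the sender's usenet administrator, would stop a line-eater-mangled or disk-truncated copy at the first hop instead of letting it flood downstream.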