[comp.sys.amiga] Filtering the source/binaries without bottleneck

denbeste@bbn.com (Steven Den Beste) (01/07/89)

Our two recent experiences with moderators have each had good sides and bad
sides.

Bob Page is doing a great job in expediting postings to the group, but recently
he posted some things in excess of 2 megabytes to the binaries groups
which were useless to the vast majority of readers of the group. People have
begun clamoring for a pre-filter.

The only way that Bob, working ON HIS OWN TIME (let's not forget this),
has been able to expedite things through has been for him to do
minimal checking.

Before, Pat and Justin (and another guy whose name I've forgotten) used to
really test everything, but since this takes a lot of time and they didn't have
much to spare, there were delays of weeks, and the complaints were "Come on,
guys, expedite!". (If the crowd had had its way, Justin and Pat would have
worked full time on moderation.)

I'd like to express my opinion that all of them have done a good job. If there
has been a problem, I think it is with the process, not the
people. The purpose of this report is to propose a different process,
which I think is better than either of the two we've seen before.


Throughout this, I'll use the term "Page" to refer to the primary moderator,
though Bob Page's direct involvement in such a process is entirely up to him...

Page keeps a list of volunteer testers, and for each he has a list of hardware
and major software, and interests and expertise.

When he receives a submission, that submission must begin with the following
information: what the environment requires for it to run, and, for source, what
language tools are needed to build it. (So things like amount of RAM, amount of
disk space, NTSC/PAL and suchlike go here.)

Anyone sending in a submission without this information gets a form letter in
the email, and the submission is stored until that information is provided.
If the submitter can't be reached, or after 30 days without an answer,
the submission goes in the byte bucket.
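
To make the bookkeeping concrete, here is a rough sketch of what that intake
step could look like. The field names, the form letter, and the helpers are all
made up for illustration; this is a sketch of the rule above, not a tool anyone
actually runs.

# Sketch of the submission-intake rule above: hold incomplete submissions,
# nag the author with a form letter, and drop them after 30 days of silence.
# REQUIRED_FIELDS and send_form_letter() are hypothetical.
from datetime import datetime, timedelta

REQUIRED_FIELDS = ["ram", "disk_space", "video_standard", "run_requirements", "build_tools"]
EXPIRY = timedelta(days=30)

def missing_fields(header):
    """Return the required fields the submitter left blank."""
    return [f for f in REQUIRED_FIELDS if not header.get(f)]

def process_intake(submission, now=None):
    now = now or datetime.utcnow()
    missing = missing_fields(submission["header"])
    if not missing:
        return "ready-for-testing"
    if now - submission["received"] > EXPIRY:
        return "byte-bucket"                          # dropped after 30 days
    send_form_letter(submission["author"], missing)
    return "held-pending-info"

def send_form_letter(author, missing):
    # Stand-in for the real form letter.
    print(f"Dear {author}: please resend with {', '.join(missing)} filled in.")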

With this information in hand, Page checks his list of testers to see who can
test it and who hasn't been as busy as the others. (By which I mean that Page
tries to spread the load fairly evenly within the limits of requirements.)

He ships off a copy of the submission to that tester. The tester must either
test it within a week, or send back a letter within that week saying how long
it will take to test it and give a reason for the delay. Page might reassign it
if he chooses.
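
Again purely as illustration (nothing Bob has signed up for), the assignment
rule above might look something like this in script form; the record fields are
invented, and only the one-week deadline comes from the proposal itself.

# Sketch of the tester-assignment rule: filter by requirements, pick the
# least-loaded volunteer, and note the one-week reporting deadline.
# The "hardware" and "open_jobs" fields are invented for illustration.
from datetime import datetime, timedelta

def pick_tester(testers, requirements):
    """Return the least-busy tester whose setup covers the requirements, or None."""
    able = [t for t in testers if requirements <= t["hardware"]]
    if not able:
        return None        # Page falls back to the testers' mailing list
    return min(able, key=lambda t: t["open_jobs"])

def assign(tester, submission, now=None):
    now = now or datetime.utcnow()
    tester["open_jobs"] += 1
    submission["tester"] = tester["name"]
    submission["report_due"] = now + timedelta(days=7)   # one week, per the proposal

testers = [
    {"name": "volunteer-1", "hardware": {"2MB", "harddisk", "NTSC"}, "open_jobs": 2},
    {"name": "volunteer-2", "hardware": {"512K", "NTSC"}, "open_jobs": 0},
]
job = {"name": "Nifty.Neato", "requirements": {"512K", "NTSC"}}
tester = pick_tester(testers, job["requirements"])
if tester:
    assign(tester, job)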

If Page can't readily find an appropriate tester, he sends a description of the
submission and required environment out to an exclusive mailing list of all the
testers (and no-one else) asking for a volunteer. If none can be found, Page
informs the original submitter, and the project is dropped. (Alternatively,
it might be posted with a caveat.)

[Another possibility is that Page posts the description and requirements to the
net and asks for a volunteer to join the tester corps.]

[Oh, yeah: And whenever Page sees someone out there bitch and moan about
comp/[sources|binaries]/amiga speed, he sends a test package to that person -
put up or shut up. :-)]

As the tester works, he/she keeps notes, and when done returns the following
testing report:

     Name of program:
     Date of submission:
     Author name and net address:
     Tester name and net address:
     Function of program:

     Tools and versions needed for compilation:
     Hints and tricks in compilation:

     Environment requirements to run it:
     Observed bugs:
     Unusual behavior, perhaps attributable to it:
     Hints and tricks for use:

     Do you intend to keep using it?
        If so, for what?
        If not, why not?


This report (plus maybe the test package itself if the tester had to change
anything to get it to work) is sent back to Page.

If there is a decision to let the submission through, Page posts the test
report to comp.sys.amiga, and packages the test report into the ".zuu" which is
posted to the source and/or binary group.

If Page and the tester decide to block the submission, the test report is
returned to the submitter, to allow correction of problems.


To make this work, the tester group needs to be 5-20 knowledgeable users with
large machines.
(Though it might not be bad to have a couple of people with minimal systems on
the list, such as an A500/512K/1 drive and nothing else...) I volunteer to be on
this list.

I believe this gives us the best of both worlds. Justin and Pat gave us
quality moderation, but at a big loss in speed because the testing process
was a bottleneck. But if it is
divided among 10 or 20 people, it isn't too bad. Page's job gets only a little
bigger under this process, so he should be able to work just as fast as he has.

I would like to hear discussion of this.


Steven C. Den Beste,   BBN Communications Corp., Cambridge MA
denbeste@bbn.com(ARPA/CSNET/UUCP)    harvard!bbn.com!denbeste(UUCP)

sean@ms.uky.edu (Sean Casey) (01/07/89)

Frankly, I don't care if the sources/binaries are pre-tested or not.
All I really care about is that the group be moderated to keep the noise level
at zero. I'd rather have faster article propagation.

If I may suggest a change, I would like to have the files placed in a
directory before zooing so that they will extract to a directory. Thus
I can easily keep all the files for a package in one place. If I
extract more than one package in one shot, all the readme files won't
overwrite each other, etc.
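
For what it's worth, here is a rough sketch of what I mean on the packaging
side: copy the files under a directory named after the package before building
the archive, so the member names carry the prefix. The "zoo a" invocation is my
guess at the archiver's usual add syntax, so treat the whole thing as a sketch.

# Sketch only: package files under their own directory before zooing, so
# extraction lands everything in one place.  The "zoo a" syntax is assumed.
import os, shutil, subprocess

def package(name, files, workdir="."):
    pkgdir = os.path.join(workdir, name)
    os.makedirs(pkgdir, exist_ok=True)
    for f in files:
        shutil.copy(f, pkgdir)            # members become "name/README" etc.
    members = [os.path.join(name, os.path.basename(f)) for f in files]
    subprocess.run(["zoo", "a", name + ".zoo"] + members, cwd=workdir, check=True)

# package("NiftyNeato", ["README", "NiftyNeato", "NiftyNeato.doc"])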

Sean
-- 
***  Sean Casey                        sean@ms.uky.edu,  sean@ukma.bitnet
***  Who sometimes never learns.       {backbone site|rutgers|uunet}!ukma!sean
***  U of K, Lexington Kentucky, USA  ..where Christian movies are banned.
***  ``My name is father. You killed my die. Prepare to Inigo Montoya.''

louie@trantor.umd.edu (Louis A. Mamakos) (01/07/89)

I don't think that Bob Page should have to test submissions for 
comp.sources.amiga before posting them.  A quick glance at it, and a simple
"garbage" filter is all I expect him to do.  Anything more extensive will
only slow down the propagation.  And we know how bent out of shape we
all were with slow propagation before.

If there are bugs in the sources, the end user has the option of fixing
the problem himself.  Much of the value in sources is examining the 
algorithms and techniques used, not necessarily in running them. Or so it
seems to me.

I don't care what you do with the binaries.  Personally, I don't run any
binaries on my system from USENET or other non-commercial source.  You
don't know *where* they've been.  With sources, at least I can examine
them for any suspicious code.

I would hate to increase the burden on the moderator.  It's enough that he
packages things up and keeps archives; I don't expect him to test code for
me.  What do you want for free, anyway?




Louis A. Mamakos  WA3YMH    Internet: louie@TRANTOR.UMD.EDU
University of Maryland, Computer Science Center - Systems Programming

mikes@lakesys.UUCP (Mike Shawaluk) (01/07/89)

In article <10831@s.ms.uky.edu> sean@ms.uky.edu (Sean Casey) writes:
>
>If I may suggest a change, I would like to have the files placed in a
>directory before zooing so that they will extract to a directory. Thus
>I can easily keep all the files for a package in one place. If I
>extract more than one package in one shot, all the readme files won't
>overwrite each other, etc.
>
Just a note for you; if you makedir a new directory for the files to go in,
and cd to that directory before unZOOing the archive, then you have solved
the above problem without screwing up the rest of the people who might
already have a directory of the name which would have been embedded in the
ZOO file.
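
In script form, the tip amounts to something like the sketch below; the "zoo x"
extraction syntax is from memory, so check it against your own copy of zoo.

# Sketch of "make a fresh directory, then extract there".  The "zoo x"
# extract syntax is an assumption; adjust for your archiver.
import os, subprocess

def extract_into_own_dir(archive):
    name = os.path.splitext(os.path.basename(archive))[0]
    os.makedirs(name, exist_ok=True)
    subprocess.run(["zoo", "x", os.path.abspath(archive)], cwd=name, check=True)

# extract_into_own_dir("NiftyNeato.zoo")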

And, by the way, I'd like to second the original poster's motion by
offering to be a pre-tester, or whatever term he used.  If Bob (or
someone else) decides to implement the idea, they can contact me for more
information, etc.

-- 
   - Mike Shawaluk
     ...!uunet!marque!lakesys!mikes

lphillips@lpami.wimsey.bc.ca (Larry Phillips) (01/08/89)

In <3054@haven.umd.edu>, louie@trantor.umd.edu (Louis A. Mamakos) writes:
> I don't think that Bob Page should have to test submissions ... [more deleted]

  I agree. 

> I don't care what you do with the binaries.  Personally, I don't run any
> binaries on my system from USENET or other non-commercial source.  You
> don't know *where* they've been.  With sources, at least I can examine
> them for any suspicious code.

  I will run binaries (or at least some), but I realize that by doing so, the
responsibility is mine, at least to the degree that I should take reasonable
precautions with it. I do not expect the moderator to do my job for me. I do
expect anyone who has taken on the job of moderator to do pretty much as Bob
has been doing, and appreciate the time and effort required to do it.

> I would hate to increase the burden on the moderator.  It's enough that he
> packages things up and keeps archives; I don't expect him to test code for
> me.  What do you want for free, anyway?

  There is another consideration here as well. Having Bob check out every
package, or even having a team check them out, is still no guarantee of a fully
'safe' program. There are just too many variables to test for to make sure that
a program doesn't inadvertently trash something on a particular hardware
combination running a particular set of programs. There is also no way of
knowing that a program does not contain deliberate nastiness, as it might be in
there biding its time until run #x, or a certain date.

  In the latter case, if Bob were testing all submissions, and one got through
that turned out to be damaging a month down the road, what would the
net.reaction be? Ask yourself if  you would like to be in the position of
having tested and approved The Binary That Ate The Amiga Community. Perhaps it
is better that we each test in our own environment, leaving no doubt as to
where the responsibility lies, and absolving the already hard working moderator
of the additional responsibility, implied, if not explicit.

  As a sysop on CompuServe's Amiga forums, I download and test submissions
constantly. Some I test more thoroughly than others, since often I do not have
the appropriate hardware or software to perform a proper test. This has made me
very aware that should something propagate that is less than friendly, I have
helped to propagate it, and worse, have helped to propagate it with at least an
implied seal of approval. Consequently, some binaries that are approved for
public data libraries are tested well, while others are tested for copyright
and the fact that they will un(ARC|Zoo) properly, and not much more.

  Of course CIS is not Usenet, and there are a lot of differences in the
perception of the two services, so it stands to reason that there might be a
difference in perception toward the binaries available on each. Perhaps we need
something in the 'monthly newcomers package' (if that ever gets going),
detailing what sort of testing is and isn't done, and outlining the end user's
responsibilities in this area.

-larry

--
Frisbeetarianism: The belief that when you die, your soul goes up on
                  the roof and gets stuck.
+----------------------------------------------------------------------+ 
|   //   Larry Phillips                                                |
| \X/    lphillips@lpami.wimsey.bc.ca or uunet!van-bc!lpami!lphillips  |
|        COMPUSERVE: 76703,4322                                        |
+----------------------------------------------------------------------+

richard@gryphon.COM (Richard Sexton) (01/08/89)

In article <34235@bbn.COM> denbeste@BBN.COM (Steven Den Beste) writes:
>
>Our two recent experiences with moderators have each had good sides and bad
>sides.
>
>Bob Page is doing a great job in expediting postings to the group, but recently
>he posted some things in excess of 2 megabytes to the binaries groups
>which were useless to the vast majority of readers of the group. People have
>begun clamoring for a pre-filter.
>

Yes, and let's look at the overall net.picture.  Last year somebody
posted a 770K PSPICE demo to the net, and there was a major outrage
over in news.* that went on for quite a while, got a lot of people
really annoyed, and prompted all sorts of unreasonable suggestions, many of
which were physically impossible.

Fortunately however, when the RGB demo slid through, the denizens of
news.* were bickering over atheist women in SIGPLAN, and it seems
to have gone completely unnoticed. Phew.

Soooo, as much as we'd all like to see huge animations appear
every day, we're just gonna have to accept the fact that this
will not be; at least in this incarnation of USENET.

And second, Bob Page has been exemplary in his moderation of
the source and binary groups. The self imposed moratorium on
big animations, although an unpopular decision, was a good
move.

And as for testing vs. delays, I think I for one would
rather have the stuff NOW, with minimal testing, rather
than wait 3 months for the next file to burp up. And after
all, if it's not well tested and has bugs, one of you 
people will generally have reported that fact long before
*I* find it out :-)


-- 
                              Hotel USENET
richard@gryphon.COM   {...}!gryphon!richard   gryphon!richard@elroy.jpl.nasa.gov

papa@pollux.usc.edu (Marco Papa) (01/08/89)

In article <10428@gryphon.COM> richard@gryphon.COM (Richard Sexton) writes:
...
|Soooo, as much as we'd all like to see huge animations appear
|every day, we're just gonna have to accept the fact that this
|will not be; at least in this incarnation of USENET.
...
|And as for testing vs. delays, I think I for one would
|rather have the stuff NOW, with minimal testing, rather
|than wait 3 months for the next file to burp up. And after
|all, if it's not well tested and has bugs, one of you 
|people will generally have reported that fact long before
|*I* find it out :-)

I second the motion: no animations, no testing. We want the stuff NOW!
Great job so far, Bob.

-- Marco Papa 'Doc'
-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=
uucp:...!pollux!papa       BIX:papa       ARPAnet:pollux!papa@oberon.usc.edu
 "There's Alpha, Beta, Gamma and Diga!" -- Leo Schwab [quoting Rick Unland]
-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=

hansb@ariel.unm.edu (Hans Bechtel) (01/09/89)

I have an idea...

After the files are downloaded from the src/bin section, have whoever
downloads one post a message to the net, after about 2 weeks of use, saying
that it is OK.  Of course, the person doing so should read the many messages
to make sure that nobody else has already posted one.  This way everybody
can help the group.

Hans Bechtel
hansb@ariel.unm.edu

(Will somebody from Commodore-Amiga please contact me?  Thanks...
I have a few questions to ask.
Please write..

Hans Bechtel
12353 Mountain NE #G
Albuquerque, NM  87112 )

limonce@pilot.njin.net (Tom Limoncelli) (01/09/89)

I think things are running just fine.  The only addition needed is to settle
the question of posting graphics/songs/etc. on the comp.binaries.amiga
group.  How about we make a comp.binaries.amiga.graphics?  Maybe a
comp.binaries.amiga.big?  The .big could be really big things that
required a certain amount of RAM.  Actually, a .graphics would be
better since certain sites could just plain not get that to ease the
net.bandwidth problems that some places are having.

Any comments?

-Tom
-- 
   Tom Limoncelli   Drew University    Madison NJ    201-408-5389
       tlimonce@drunivac.Bitnet          limonce@pilot.njin.net
 "Fences make good neighbors" -Frost       "I want an MMU" -Me
    Standard disclaimer?  No, we're still on the dpANS disclaimer.

richard@gryphon.COM (Richard Sexton) (01/09/89)

In article <4225@charon.unm.edu> hansb@ariel.unm.edu.UUCP (Hans Bechtel) writes:
>
>I have an idea...

Uh oh. Hans has an idea.

>After the files are downloaded from the src/bin section, have whoever
>downloads one post a message to the net, after about 2 weeks of use, saying
>that it is OK.  Of course, the person doing so should read the many messages
>to make sure that nobody else has already posted one.  This way everybody
>can help the group.

Sheer genius. Now, backwater sites that are 2 days from a backbone
connection will ALL post: ``xxxyyy is safe'' because they haven't
seen the post yet, because of the dreaded SLOW NEWS FEED.


>(Will somebody from Commodore-Amiga please contact me?  Thanks...
>I have a few questions to ask.
>Please write..
>
>Hans Bechtel
>12353 Mountain NE #G
>Albuquerque, NM  87112 )

Yes, please help Hans. His phone doesn't work, he can't afford
a stamp, and he can't spell ``cbmvax''.


-- 
    ``Why don't you crawl back under your rock, Repto!'' - Space Ghost
richard@gryphon.COM   {...}!gryphon!richard   gryphon!richard@elroy.jpl.nasa.gov

UH2@PSUVM.BITNET (Lee Sailer) (01/09/89)

In article <14519@oberon.USC.EDU>, papa@pollux.usc.edu (Marco Papa) says:
>
>I second the motion: no animations, no testing. We want the stuff NOW!

There is a more moderate position possible.  Part of the moderator's
function is to smooth out the delivery of incoming material.  If
something really important comes in, it should be posted right away.
If there doesn't happen to be anything very interesting in the queue (say
Matt Dillon takes the afternoon off 9-), then why not send out
an animation?

Personally, I trust Bob Page to judge which things need to be rushed out,
and which can wait a few days or weeks (or years).

                                                  lee

sean@ms.uky.edu (Sean Casey) (01/10/89)

In article <279@lakesys.UUCP> mikes@lakesys.UUCP (Mike Shawaluk) writes:
|In article <10831@s.ms.uky.edu> sean@ms.uky.edu (Sean Casey) writes:
|>If I may suggest a change, I would like to have the files placed in a
|>directory before zooing so that they will extract to a directory. Thus

|Just a note for you; if you makedir a new directory for the files to go in,
|and cd to that directory before unZOOing the archive, then you have solved
|the above problem without screwing up the rest of the people who might
|already have a directory of the name which would have been embedded in the
|ZOO file.

I appreciate the help, but give me a little credit! My Amy1000 is nearly
two and a half years old. I don't LIKE mkdir-ing new directories all the
time. I imagined others would like that convenience too.

Sean
-- 
***  Sean Casey                        sean@ms.uky.edu,  sean@ukma.bitnet
***  Who sometimes never learns.       {backbone site|rutgers|uunet}!ukma!sean
***  U of K, Lexington Kentucky, USA  ..where Christian movies are banned.
***  ``My name is father. You killed my die. Prepare to Inigo Montoya.''

cjp@antique.UUCP (Charles Poirier) (01/10/89)

In article <10831@s.ms.uky.edu> sean@ms.uky.edu (Sean Casey) writes:
>Frankly, I don't care if the sources/binaries are pre-tested or not.
>All I really care about is that the group be moderated to keep the noise level
>at zero. I'd rather have faster article propagation.

I'm undecided about the pretesting issue, but let me point out, for the sake
of completeness, that posted binaries that plain don't work have an S/N of
0.000.  I downloaded over 2 megabytes of that animation noise, for example.

By the way, I vote NO on posting images and animations, even in a
separate subgroup.  Though I am a big fan of Amiga images, I feel they
have much less utility per byte than sources, binaries, or general
discussion.  Plus, I feel that the sheer volume of images and animations
we would see would inevitably cause problems.  Though it is true that
individual Usenet sites can refuse to carry individual groups, I fear
that many sites don't administer Usenet with a sufficiently fine touch
but will instead take out their wrath on the whole Amiga subtree or on
Usenet in general.

Some kind of direct-distribution scheme should be used for images.

-- 
	Charles Poirier   (decvax,ucbvax,mcnc,attmail)!vax135!cjp

   "Docking complete...       Docking complete...       Docking complete..."

pds@quintus.uucp (Peter Schachte) (01/10/89)

In article <3054@haven.umd.edu> louie@trantor.umd.edu (Louis A. Mamakos) writes:
>I don't think that Bob Page should have to test submissions for 
>comp.sources.amiga before posting them.

Agreed.

>I don't care what you do with the binaries.  Personally, I don't run any
>binaries on my system from USENET or other non-commercial source.  You
>don't know *where* they've been.

The only other thing I believe a moderator should do is to make sure
that a program really comes from the person it claims to have come
from, and that that person is reachable.  That way if a program turns
out to have a virus or some other nasty, people can get their hands on
the perpetrator and throttle him or her.
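
One cheap way to make that check concrete would be a mail-back confirmation:
mail a one-time token to the claimed address and hold the posting until a reply
quotes it.  The sketch below is purely illustrative; the token scheme and the
send_mail() helper are hypothetical, not anything the moderators actually use.

# Illustrative mail-back check: "the submission really comes from the claimed
# author, and that author is reachable".  send_mail() is a hypothetical helper.
import secrets

pending = {}   # token -> submission id

def challenge(submission_id, claimed_address, send_mail):
    """Mail a one-time token to the claimed author; posting waits for the reply."""
    token = secrets.token_hex(8)
    pending[token] = submission_id
    send_mail(claimed_address,
              f"Please reply quoting token {token} to confirm submission {submission_id}.")
    return token

def confirm(token):
    """Called when a reply arrives; returns the submission id if the token matches."""
    return pending.pop(token, None)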

Just compiling a program yourself is no guarantee that it doesn't do
something nasty.  Not unless you really read it cover to cover first.
It might not contain a virus, but it could do something like deleting a
random file somewhere.  That wouldn't take much code, so such a thing
could be concealed in a program without much trouble.  (God, I hope I
haven't given anyone any ideas!)

-Peter Schachte
pds@quintus.uucp
..!sun!quintus!pds

andrew@bhpese.bhpese.oz (Andrew Steele) (01/10/89)

	My Comments on the subject :-

I think Bob Page is doing a great job, keep it up.

The frustration with things not working would be alleviated if the time
	it takes to cut above a non-standard cut line, unshar, join, uudecode
	and unzoo a file could be reduced.  If this whole process could
	be automated then people would not waste large amounts of time getting
	files to a usable form.  I realise that some things to do this have
	come over the net of late but they either were OS dependent
	(BSD/SysV), or only worked for some cases, or both.

I generally get all the parts for a program and put them into a temporary
	directory before I start undoing them.  All programs I would think
	should only assume that they are to go into the current directory
	and if a program requires subdirectories it should only assume that
	they are to be created below the current directory.

The comment made that a piece of source code should be valued for the ideas
	it teaches is a good one; if it works, then that is a bonus.

The idea of a test site already exists; it's called the net.  If something
	doesn't work it doesn't take long to hear about it.  In the case of
	a program like rgb, if you don't know whether it's worth the time to
	download, wait a week ( usually less ); if there are no comments
	about it and you are still not sure, try asking for comments on it.
	There is no need to inflict this work on the moderator.

Bit bucketing anything that hasn't come back from a test site
	within a week or so is not a good idea.

MY SUGGESTION :
	Someone ( Probably Bob Page as he is in the position to do it ),
	should come up with a standard packaging method for files and
	an accompanying algorithm to undo them.  The shar format is almost
	right but it could be improved.
		The cut line should always be EXACTLY the same to allow
			for it to be searched for.
		A header to the first/only file should list the parts
			needed for the complete package, where they are
			( comp.binaries.amiga/comp.sources.amiga ), and 
			what format they are in ( .uu/.zuu/.zoo/need joining
			before undoing/etc. ).
		A simple algorithm (sketched below) to go from one/many news items to
			one .zoo file that contains all parts (binaries,
			sources,docs,etc) assuming all required news items
			are present in the same place.  Actual code for
			such an undoer would be nice but may be at the
			expense of portability (need at least a version
			for U*NIX BSD & Sys V and the Amiga).
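
As a rough sketch of what such an undoer could look like (the cut-line text,
the "partNN" filename convention, and the call out to uudecode are all
assumptions made for illustration, not part of any agreed standard):

# Sketch of an "undoer": strip everything above a standard cut line, join the
# parts in order, and hand the result to uudecode to recover the .zoo file.
# CUT_LINE and the part-numbering convention are assumed, not standardised.
import re, subprocess, sys

CUT_LINE = "---- Cut Here ----"          # assumed standard marker

def payload(article_text):
    """Return only the text below the cut line (or the whole article if none)."""
    head, sep, tail = article_text.partition(CUT_LINE)
    return tail.lstrip("\n") if sep else article_text

def part_number(filename):
    m = re.search(r"part(\d+)", filename)    # assumed "part01", "part02" naming
    return int(m.group(1)) if m else 0

def undo(article_files, joined_name="package.uu"):
    with open(joined_name, "w") as out:
        for fname in sorted(article_files, key=part_number):
            with open(fname) as f:
                out.write(payload(f.read()))
    subprocess.run(["uudecode", joined_name], check=True)   # produces the .zoo

if __name__ == "__main__":
    undo(sys.argv[1:])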

Any other comments ?

Andrew Steele                      _____     Spengat Technologies, 
                                  /_   _\    c/o Electrical Computer Services,
ACSnet  : andrew@bhpese.oz        __| |__    BHP Rod & Bar Products Division,
INTERNET: andrew@bhpese.oz.au    /__| |__\   Newcastle, NSW, Australia.
UUCP    : ...!{uunet,mcvax}!munnari!bhpese.oz!andrew

karl@sugar.uu.net (Karl Lehenbauer) (01/10/89)

In article <3054@haven.umd.edu> louie@trantor.umd.edu (Louis A. Mamakos) writes:
>I don't care what you do with the binaries.  Personally, I don't run any
>binaries on my system from USENET or other non-commercial source.  You
>don't know *where* they've been.

Sadly, of course, commercial software isn't safe either.  At least one 
commercial product shipped with a virus -- sorry, I don't have the name.

-- 
-- uunet!sugar!karl  | "We've been following your progress with considerable 
-- karl@sugar.uu.net |  interest, not to say contempt."  -- Zaphod Beeblebrox IV
-- Usenet BBS (713) 438-5018

lphillips@lpami.wimsey.bc.ca (Larry Phillips) (01/11/89)

In <66717UH2@PSUVM>, UH2@PSUVM.BITNET (Lee Sailer) writes:
> There is a more moderate position possible.  Part of the moderator's
> function is to smooth out the delivery of incoming material.  If
> something really important comes in, it should be posted right away.

Agreed. And if there is something not so important, there is no rush to post
it. If there is something not important at all, why bother posting it? The big
question is: "Are animations important enough to justify their bandwidth?"

>If there doesn't happen to be anything very interesting in the queue (say
>Matt Dillon takes the afternoon off 9-), then why not send out
>an animation?

Ahh...  something like a NOP in a time dependent program, or sync bytes in a
synchronous data link? Something to let us know that the moderator is still
alive and well and thinking about us and hasn't crashed? Perhaps to keep us all
practiced up? Unsharing, uudecoding, and unZooing are awfully easy to forget.
And then of course we must be careful to keep the net.bandwidth high during
those times when nothing useful is being propagated.

What if there are no animations in the queue? Well, he said, that's easy. Bob
can just send out DMCS scores, or DPaint pics, or essays written with
WordPerfect. Why I bet there are some really neat mortgages out there in
Analyze files we could all look over. How about it people? Would you all like
some directory listings I made with DirMaster?

>Personally, I trust Bob Page to judge which things need to be rushed out,
>and which can wait a few days or weeks (or years).

Personally, I resent the cost, both to the end users and to the backbone sites,
of propagating animations, especially just as 'filler'.

Yes, I think we can trust him to do the right thing.

-larry

--
Frisbeetarianism: The belief that when you die, your soul goes up on
                  the roof and gets stuck.
+----------------------------------------------------------------------+ 
|   //   Larry Phillips                                                |
| \X/    lphillips@lpami.wimsey.bc.ca or uunet!van-bc!lpami!lphillips  |
|        COMPUSERVE: 76703,4322                                        |
+----------------------------------------------------------------------+

wbt@cbnews.ATT.COM (William B. Thacker) (01/14/89)

In article <14519@oberon.USC.EDU> papa@pollux.usc.edu (Marco Papa) writes:
>In article <10428@gryphon.COM> richard@gryphon.COM (Richard Sexton) writes:
>...
>|Soooo, as much as we'd all like to see huge animations appear
>|every day, we're just gonna have to accept the fact that this
>|will not be; at least in this incarnation of USENET.
>...
>|And as for testing vs. delays, I think I for one would
>|rather have the stuff NOW, with minimal testing, rather
>|than wait 3 months for the next file to burp up. And after
>|all, if it's not well tested and has bugs, one of you 
>|people will generally have reported that fact long before
>|*I* find it out :-)
>
>I second the motion: no animations, no testing. We want the stuff NOW!
>Great job so far, Bob.

First, I have to agree; no animations. They're just too darned big.

Maybe someone would volunteer to set up an animation mailing-list,
though. I don't think that would be offensive to anyone, would it ?

As for testing; no, I don't think it's reasonable to ask Bob to go 
line-by-line through the code looking for Trojan Hearses.  But certainly
I'd like to see someone at least test the code to see if it *does*
something.

The recent animations were a case in point.  After spending quite some time
downloading and unpacking the things, I found they only worked on PAL
Amigas.  It wouldn't have taken much testing to determine that, and a 
one-line note in the posting would have saved lots of people the effort.

Moreover, not everything has to be tested. If Leo Schwab submits a screen 
hack, I'm willing to believe that it works. Likewise, one of Matt Dillon's
projects is virtually guaranteed to perform as specified.  From what I've
seen, most of the submissions come from a reputable group of programmers.
It's just the occasional "Nifty.Neato" submitted by "Ima Pseudonym", with
the description "Try this, you'll like it !" that needs closer attention.

Anyway, Bob, do what you feel is right, and what you can afford the time to
do.  I support you 99.9%  (.1% taken off for the Italian anims... let that
be a lesson to you ! 8-)



------------------------------ valuable coupon -------------------------------
Bill Thacker						att!cbnews!wbt
	"C" combines the power of assembly language with the
	 flexibility of assembly language.
Disclaimer: Farg 'em if they can't take a joke !
------------------------------- clip and save --------------------------------