[news.groups] How long would it take to actually read all of news?

webber@aramis.rutgers.edu (Bob Webber) (09/10/87)

[In this message I present stats on reading 40 hours' worth of news and
then discuss software to cut 20% off read time.]

Many people seem to think that Usenet has grown too big to read.  I
disagree.  So, I let 40 hours of news pile up unread and then sat down
and timed myself reading it.

I subscribed to all 415 news groups available locally rather than my
usual 401 to eliminate the impact of not reading os.vms on the results.
I made a typescript of the entire session.  I read it on a Sun III
where my own files were local but news was NFS mounted.  Although I
was subscribed to 415 groups, new messages had appeared in only 201 of
them over the 40-hour period.

At 20:57 I initiated vnews -A.  I did not read any of the messages but
rather used the header information (subject and author and length) as
a basis for determining whether or not I wanted to look at the message
later.  The messages I did want to look at later, I saved into
Articles (the default save file for vnews).  At 21:52, after scanning
the headers on 2132 messages, I was finally able to type vnews -A and
be told that there was no new news available.

I then proceeded to invoke gnu emacs over the file Articles and use
control-V and control-S to find my way through the Articles.  There
were 386 articles in Articles, including all the digests (since the
Subject line on a digest doesn't tell you if there is anything
interesting inside -- sigh).  The entire Articles file was 927,274 bytes.
I was done reading it at 23:25.

In summary, it took an hour to find the articles I actually wanted to
look at and an hour and a half to look at them (saving a couple that I
wanted to refer to later and noting which ones called for a reply,
sent later through mail and postnews as appropriate).
The typescript of the entire session, which allowed gathering the
above statistics afterward, was 1,249,028 bytes.
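For reference, the throughput those figures imply can be worked out
directly (a quick sketch in Python; the numbers are just the timings
and counts reported above):

```python
# Back-of-envelope rates from the session reported above.
scan_minutes = 55        # 20:57 to 21:52, scanning headers with vnews -A
read_minutes = 93        # 21:52 to 23:25, reading the saved Articles file
headers_scanned = 2132
articles_saved = 386
bytes_saved = 927_274

print(f"{headers_scanned / scan_minutes:.1f} headers scanned per minute")
print(f"{articles_saved / read_minutes:.1f} saved articles read per minute")
print(f"{bytes_saved / read_minutes / 1024:.1f} KB read per minute")
print(f"kept {100 * articles_saved / headers_scanned:.0f}% of the articles")
```

In other words, roughly one header every second and a half during the
scan phase.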

What do I conclude from this:
   1) It takes 3 hours to read 2 days worth of news when one ignores
      the news groups boundaries.
   2) It is a lot faster to separate the scanning and reading phases
      of news.
   3) Current news software doesn't support 2).  Although emacs handled
      the read phase quite well, vnews -A was awkward for building the
      Articles file.  Specifically:
      a) Slowness of saves;
      b) Awkwardness of the typing sequence ``s CR n'', occasionally
         typed out of order.  It would be nice to type one key to save
         to the default file and advance to the next message.
      c) Slowness of transitions between news groups and skipping over
         groups with no messages.

Proposed solution:
   Run 3 processes.  The first process builds a list of all message
headers, piping them to the second process.  The second process shows
headers to the user, a screenful at a time (not a screen per group at
a time), and then accepts save/ignore requests.  The second process
forwards save requests to a separate process that handles the actual
append to the Articles file.  The news message number is also saved to
aid interaction with postnews when appropriate.

[Note: readnews -c mail -f % looks plausible, but somehow I think 4
meg tmp files are worse than 1 meg Articles files.  Also, it doesn't
do the discussion grouping that is convenient in vnews -A (although
that grouping is somewhat messed up by news group boundaries when a
discussion spreads off a cross-posted article).]

Cheap solution:
   Use vnews -A as the first process, feeding it a sequence of ``n''s
and passing its output on to the second process.  Leave parsing of the
file name for a save request to background process 3.  Forward news
group transitions to background process 3 as well.  Allow the option
of saving a message in a named file to handle sources and RFC-type
postings.
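As a sketch of the keystroke-feeding part only (vnews needs a real
terminal, likely a pty rather than a plain pipe, so /bin/cat stands in
for it here; the idea, not the binary, is the point):

```python
import subprocess

# Feed the reader a canned stream of ``n'' keystrokes and capture its
# output for the second process.  /bin/cat stands in for vnews, which
# would really need a pty rather than a plain pipe.
keystrokes = "n\n" * 5
result = subprocess.run(["cat"], input=keystrokes,
                        capture_output=True, text=True)
print(result.stdout, end="")
```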

Expected result: 
   20% reduction in total time spent to read news.

Make sense?  Is there a simpler, more efficient way to put it
together?  Would anyone be interested in seeing the result?

[Note: this solution doesn't support all the frills of normal vnews,
such as figuring reply addresses automatically -- which don't actually
seem important to me from the point of view of how long one spends
with news.]

------ BOB (webber@aramis.rutgers.edu ; rutgers!aramis.rutgers.edu!webber)

rsk@s.cc.purdue.edu (Whitewater Wombat) (09/10/87)

In article <1475@aramis.rutgers.edu> webber@aramis.rutgers.edu (Bob Webber) writes:
>Make sense?  Is there a simpler, more efficient way to put it
>together?  Would anyone be interested in seeing the result?

Actually, "rn", and possibly other news-reading programs that I'm not
as familiar with, have already attacked this problem and have made good
progress towards solving it.  The use of rn's "kill files" and 'j' and
'k' commands at the article selection level allows quick weeding of
many articles.
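For those who haven't seen one, an rn KILL file is a per-newsgroup (or
global) file of commands that rn applies to incoming articles; a
made-up example (the patterns are invented, and the exact syntax can
vary between rn versions -- check your local man page):

```
THRU 12345
/modem tax/:j
/naive question/:j
```

The `j` after the colon junks (marks as read) every article whose
subject matches the pattern; the THRU line records the highest article
number already processed.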
-- 
Rich Kulawiec, rsk@s.cc.purdue.edu, s.cc.purdue.edu!rsk
PUCC News Administrator

brad@looking.UUCP (09/11/87)

In article <1475@aramis.rutgers.edu> webber@aramis.rutgers.edu (Bob Webber) writes:
>Many people seem to think that Usenet has grown too big to read.  I
>disagree.  So, I let 40 hours of news pile up unread and then sat down
>and timed myself reading it.

You disqualify yourself right here.  The mere fact that you would want
to try to read all of it, knowing it will take a few hours, removes you
from the category of people who claim that the net is too big to read.

Most people have other things to do, and consider the net as an extra
activity to be read if there is time.  I thought I read too much news back
when it was only 200K/day instead of the 3 megs/day it is now.

Almost nobody has 1 hour just to scan headers without actually getting any
reading done.  Or 1/2 hour per day.

----------
Now this isn't to say that better news reading software can't help
you read news faster.  I read news on a terminal that prints at
10,000 chars/second, and I find 4800 baud too slow these days.
I would like to be presented a menu of subjects, and click off the ones
I want with arrow keys or a mouse, and then see those articles.

But most of all nothing would substitute for good moderators, better
subject lines and more specific, lower volume newsgroups.
-- 
Brad Templeton, Looking Glass Software Ltd. - Waterloo, Ontario 519/884-7473

burton@parcvax.UUCP (09/11/87)

>But most of all nothing would substitute for good moderators, better
>subject lines and more specific, lower volume newsgroups.
>-- 
>Brad Templeton, Looking Glass Software Ltd. - Waterloo, Ontario 519/884-7473
I agree.  And it would also help if people stopped posting those articles which
do nothing but include a long quote and then say things like, "I don't know,
but I'd like to know the answer too." or "Yeah, I'd like that." or "please send
me a copy of that too."  People should learn to send mail, and to think before
posting.

I would also like a news filter that did a K(ill) operation on specific
individuals.

-- 
Philip Burton       burton@parcvax.COM   ...!hplabs!parcvax!burton
Xerox Corp.         preferred path: burton.osbunorth@xerox.COM
408 737 4635   ... usual disclaimers apply ...

billw@ncoast.UUCP (Bill Wisner) (09/12/87)

Cheaper solution:
	Get vn. It's a news reader that prints the subject of each article
and lets you mark which ones you want to read. Quite convenient, although it
can be a bit slow on startup.

webber@brandx.rutgers.edu (Webber) (09/13/87)

In article <923@s.cc.purdue.edu>, rsk@s.cc.purdue.edu (Whitewater Wombat) writes:
> Actually, "rn", and possibly other news-reading programs that I'm not
> as familiar with, have already attacked this problem and have made good
> progress towards solving it.  The use of rn's "kill files" and 'j' and
> 'k' commands at the article selection level allows quick weeding of
> many articles.

I think you missed the point.  I am not interested in having decisions
about what news I read made for me (by either people or programs) -- I
am interested in being able to make a maximum number of those as
quickly and efficiently as possible.  To the extent that such
decisions get forced upon me by those at other sites, I would prefer
that they were made randomly than that they were made by ``people who
think they know what is best for me''.  And above all, I would prefer
they were made minimally, i.e., that as much as possible would be sent.

Incidentally, a few people have remarked that spending a few hours
every few days reading news is excessive.  Well, if every few days I
read the last few issues of the New York Times, it would take longer.
And ultimately, I find a day's worth of netnews much more interesting
than a day's worth of the New York Times.  For the New York Times is
mostly raw data (or as raw as they know how to make it) whereas the
net is mostly raw thought (most of which would otherwise be lost --
sometimes it is a miracle that such thoughts are retained long enough
to be posted).

----- BOB (webber@aramis.rutgers.edu ; rutgers!aramis.rutgers.edu!webber)

welty@sunup.steinmetz (richard welty) (09/13/87)

In article <954@looking.UUCP> brad@looking.UUCP (Brad Templeton) writes:
>Now this isn't to say that better news reading software can't help
>you read news faster.  I read news on a terminal that prints at
>10,000 chars/second, and I find 4800 baud too slow these days.
>I would like to be presented a menu of subjects, and click off the ones
>I want with arrow keys or a mouse, and then see those articles.

Ahh -- a friend of mine has written a mouse-oriented news reader for
Symbolics 36xx lisp machines, and it is marvelous.  It is quite amazing
how much faster news reading goes with such a system.  I'd dearly love
to see an equivalent for a Sun, but I don't have time to write one and
I have no idea whether the available Sun window tools provide equivalent
support for such things anyway.  But it really is a beautiful news reader.
    -- Richard Welty         Phone H: 518-237-6307      W: 518-387-6346
(some return paths are of questionable value -- use one of these addresses:)
Internet:  welty@ge-crd.ARPA      Usenet:    uunet!steinmetz!crd!welty

``Now remember kids, these are trained professionals posting these
  articles; don't try this at home''

klundin@softg.UUCP (Kristin Lundin) (09/19/87)

In article <954@looking.UUCP>, brad@looking.UUCP (Brad Templeton) writes:
> In article <1475@aramis.rutgers.edu> webber@aramis.rutgers.edu (Bob Webber) writes:
> 
> Most people have other things to do, and consider the net as an extra
> activity to be read if there is time.  I thought I read too much news back
> Brad Templeton, Looking Glass Software Ltd. - Waterloo, Ontario 519/884-7473


 Now many people have heard (and been bored with) all the arguments about
what should be where and who should tell them what should be there, etc.

 I'd now like to introduce you (those who think the net is too big)
 to the U key.  It makes the net seem sooooo much smaller than it really
 is.
     Try it, you'll like it.