[comp.society.futures] Finding a needle in a haystack

mark@intek01.UUCP (Mark McWiggins) (12/25/88)

From article <75@sopwith.UUCP>, by snoopy@sopwith.UUCP (Snoopy):
> 
>                                     ... There is a lot of useful information
> in usenet, but it is often buried in garbage.  How much time do you want
> to spend skimming through articles hitting 'k' or 'n'?  KILL files help,
> but not enough.  Moderation sometimes helps, but often the moderator bends
> a group far too strongly in the direction of their own interests.

Have you tried the "vn" newsreader?  It presents a screenful of message
titles, and you just pick the ones you want.  It makes picking just a few
articles out of a huge batch about as quick as it can be.

There's an option in rn that gives somewhat the same ability; hit = and you
get a screenful of article titles with their numbers.  You then type in the
number to see the article.  Not as easy as vn, but it does work.

Vn has a few rough edges (especially with Xenix) and not as many bells and
whistles as rn, but it's still my favorite.  I don't know if Larry Wall is
continuing development on rn, but if he is I'd be surprised if his next
version didn't offer a vn-ish approach as an option.

-- 

Mark McWiggins			UUCP:		uunet!intek01!mark
DISCLAIMER: I could be wrong.	INTERNET:	intek01!mark@uunet.uu.net
						(206) 455-9935

koreth@ssyx.ucsc.edu (Steven Grimm) (12/25/88)

In article <377@intek01.UUCP> mark@intek01.UUCP (Mark McWiggins) writes:
>From article <75@sopwith.UUCP>, by snoopy@sopwith.UUCP (Snoopy):
>>                                     ... There is a lot of useful information
>> in usenet, but it is often buried in garbage.  How much time do you want
>> to spend skimming through articles hitting 'k' or 'n'?
>Have you tried the "vn" newsreader?  It presents a screenful of message

Actually, it seems to me that Usenet is really archaic as it exists now.
Netnews has many aspects of a hypertext system, but it's not quite there.
Why not scrap all the news software and start anew?  It should be possible
to retain all of the old functionality, but in a new context.  For instance,
quoting part of an article and commenting on it is VERY hypertext-ish --
but why retransmit most of the original article every time it's quoted?
How about saying "this part of my message is a response to bytes 150 through
402 of message 108832@foo.bar?"  The news software could optionally insert
the proper part of the original, thus mimicking the behavior of the news
system we have now; or it could allow the reader to jump around the links
between messages.  (Readers of this newsgroup/mailing list should already
be familiar with the other aspects of hypertext; it should be apparent that
reading messages organized in this way would make it much easier to get
at the information you want.)
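Grimm's byte-range scheme is concrete enough to sketch. Here is a minimal resolver, assuming a hypothetical local article store keyed by message-ID; the function name and reference format are illustrative, not part of any actual news software:

```python
def resolve_quote(message_id, start, end, store):
    """Expand a reference like 'bytes 150 through 402 of message
    108832@foo.bar' into the quoted text itself.

    store: a hypothetical mapping from message-ID to raw article body.
    Returns a placeholder when the referenced article has expired or
    never arrived -- the failure mode admins worry about.
    """
    body = store.get(message_id)
    if body is None:
        return "[referenced article %s does not exist here]" % message_id
    return body[start:end]

# A reader could either splice this text inline, mimicking today's
# quoting, or present it as a followable link between messages.
```

The choice between splicing and linking is exactly the "optionally insert" behavior described above.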

As well as making news a lot easier (and more fun) to peruse, this scheme
could dramatically reduce the bandwidth eaten by net messages.  I haven't
seen any hard numbers on this, but I imagine that a good 25% of the text
that gets sent around is in the form of quotes from older articles.

Naturally, all this would take some getting used to, and a good many people
would cry out in confusion and ask for the old, simple software again.  But
hypertext is obviously going to replace what we have now at some point; why
not get all the kinks out of the system now, before it becomes even more
widespread?

---
It's 5:10 on Christmas morning and still no sign of Santa...
Steven Grimm		Moderator, comp.{sources,binaries}.atari.st
koreth@ssyx.ucsc.edu	uunet!ucbvax!ucscc!ssyx!koreth

eric@snark.UUCP (12/27/88)

>Have you tried the "vn" newsreader?  It presents a screenful of message
>titles, and you just pick the ones you want.  It makes picking just a few
>articles out of a huge batch about as quick as it can be.

>There's an option in rn that gives somewhat the same ability; hit = and you
>get a screenful of article titles with their numbers.  You then type in the
>number to see the article.  Not as easy as vn, but it does work.

All the interactive readers in 3.0 news (including vnews) will have this
capability; it is already implemented and has been working fine
for months.

eric@snark.uu.net = eric%snark@uunet.uu.net     -->     Eric S. Raymond

freedman@cpsc.ucalgary.ca (Daniel Freedman) (12/29/88)

In article <8812262241.AA24943@snark.UUCP>, eric@snark.UUCP writes:
> >Have you tried the "vn" newsreader?  It presents a screenful of message
> >titles, and you just pick the ones you want.  It makes picking just a few
> >articles out of a huge batch about as quick as it can be.
> > ... (deleted)
> 
> All the interactive readers in 3.0 news (including vnews) will have this
> capability, it is already implemented and has been working fine
> for months.

Has anyone thought of allowing news readers to do the following things:

1) view all new messages that have the same subject together, rather than
   having to juggle many subjects in his head while reading news.

2) easily apply the 'n' function to not only the current article, but also
   to all followups to it.

It seems that these ideas are basically an extension of the current
newsgroup organization.  That is, currently, one can view all messages
in a newsgroup consecutively, and can choose to ignore all further
messages in a newsgroup.  The above new functions extend this from the
newsgroup level down to a new level, perhaps to be called the "chain"
level.  A chain would be a "group" of messages that were derived from
a common ancestor.  Currently, news is completely unmanageable.  I know
many people who won't read news because the volume is too high.  Just
as having topic-oriented newsgroups is a volume managing facility, so
would having articles organized into chains.  Multics Forum has this
facility (although you have to type "swn seen [aref]" to get it! ;-)).
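Freedman's "chain" level falls out of the References: header that articles already carry. A rough sketch of the grouping, assuming a hypothetical in-memory layout (real readers would parse headers from the spool; names here are invented for illustration):

```python
def group_into_chains(references):
    """Group articles into 'chains' sharing a common ancestor.

    references: hypothetical mapping from message-ID to the list of
    IDs in its References: header (oldest first).  Returns a mapping
    {root ID: [member IDs]}.  An article whose ancestor never arrived
    still chains under that (missing) root ID.  Assumes no cycles.
    """
    # An article's parent is the last entry in its References: header.
    parent = {mid: refs[-1] for mid, refs in references.items() if refs}

    def root(mid):
        while mid in parent:        # climb until an article with no parent
            mid = parent[mid]
        return mid

    chains = {}
    for mid in references:
        chains.setdefault(root(mid), []).append(mid)
    return chains
```

A reader could then present one chain at a time, and an "ignore this chain" command becomes a single kill entry on the root ID.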

Sorry if this has all been hashed out before.


Dan Freedman
University of Calgary Computer Science Department
2500 University Drive N.W.			      freedman@cpsc.UCalgary.CA
Calgary, Alberta, T2N 1N4	                   ...!alberta!calgary!freedman

janssen@titan.sw.mcc.com (Bill Janssen) (12/30/88)

Usenet has this, too, if the References: header is included.  MCC STP
has a newsreader that looks at all the References: of all the articles
in a newsgroup, and generates a graph of the group, which is displayed
in one panel of the newsreader.  This is very handy for following a
discussion, as most articles are posted as followups when a thread of
discussion is underway.

I'm sure that an article on the newsreader, called `gnews', has been
published somewhere, but I'm not sure where.

Bill

peter@ficc.uu.net (Peter da Silva) (12/31/88)

In article <5859@saturn.ucsc.edu>, koreth@ssyx.ucsc.edu (Steven Grimm) writes:
> Why not scrap all the news software and start anew?  It should be possible
> to retain all of the old functionality, but in a new context.  For instance,
> quoting part of an article and commenting on it is VERY hypertext-ish --
> but why retransmit most of the original article every time it's quoted?

Because people are lazy. They don't want to summarize, summarize, summarize.
For example, many people would have included the first 10 lines of your
article, including all the quotes. They're no longer necessary and are
available through the 'p' key in vnews.

Most of the time. You're really asking "why quote at all?", are you not? Well
that's easy. Articles get lost. Articles arrive before their parents. And so
on... and there's really no way to fix this without wrecking usenet.

> But hypertext is obviously going to replace what we have now at some point...

I don't see that this follows. Hypertext doesn't seem compatible with such a
radically distributed environment as we have here.
-- 
Peter da Silva, Xenix Support, Ferranti International Controls Corporation.
Work: uunet.uu.net!ficc!peter, peter@ficc.uu.net, +1 713 274 5180.   `-_-'
Home: bigtex!texbell!sugar!peter, peter@sugar.uu.net.                 'U`
Opinions may not represent the policies of FICC or the Xenix Support group.

eric@snark.UUCP (12/31/88)

>Has anyone thought of allowing news readers to do the following things:
>
>1) view all new messages that have the same subject together, rather than
>   having to juggle many subjects in his head while reading news.
>
>2) easily apply the 'n' function to not only the current article, but also
>   to all followups to it.

Yes. 3.0 news supports what you really wanted when you were describing "chain
level" -- article-tree navigation primitives that understand `conversation
structure'. That is, the Followup-To relation and its inverse define a tree
structure that you want to traverse in depth-first order. This is the first
step towards full hypertext functionality for netnews.

And there is a `ud' unsubscribe-from-discussion command.
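The depth-first order Raymond describes is simple to state as code. A sketch, assuming each article's followups have already been collected (the names are hypothetical; this is not the 3.0 news implementation):

```python
def reading_order(followups, root):
    """Visit a conversation tree depth-first: each article is followed
    immediately by its replies, then its replies' replies, and so on.

    followups: hypothetical mapping from message-ID to the list of
    direct followups, in arrival order.
    """
    order = [root]
    for child in followups.get(root, []):
        order.extend(reading_order(followups, child))
    return order
```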


eric@snark.uu.net = eric%snark@uunet.uu.net     -->     Eric S. Raymond

jellinghaus-robert@CS.YALE.EDU (Rob Jellinghaus) (12/31/88)

In article <420@cs-spool.calgary.UUCP> freedman@cpsc.ucalgary.ca (Daniel Freedman) writes:
>Has anyone thought of allowing news readers to do the following things:
>
>1) view all new messages that have the same subject together, rather than
>   having to juggle many subjects in his head while reading news.

Try ^N in rn.  If you don't have rn as your reader, get it for this
functionality alone.  There's a switch which, if set, will change the default
end-of-article action from n to ^N, which is vastly more useful.

>2) easily apply the 'n' function to not only the current article, but also
>   to all followups to it.

Try k in rn.  It kills all articles with the same subject.

>Dan Freedman

Lotsa neat stuff in rn, huh?

Rob Jellinghaus                | At the tone, the time will be...
jellinghaus-robert@CS.Yale.EDU |        later than you think?
ROBERTJ@{yalecs,yalevm}.BITNET |        too late?
{everyone}!decvax!yale!robertj |        Miller time?

barnett@grymopire.steinmetz.ge.com (Bruce Barnett) (12/31/88)

>1) view all new messages that have the same subject together, rather than
>   having to juggle many subjects in his head while reading news.

The Gnews package for GNU Emacs allows this.  Gnews has most of the
features of rn, plus a lot more.

You can select, on a per-newsgroup basis:
	An index of articles shown when first entering
	The sort algorithm used for that index
	The display format of the index
	Automatic article junking based on author, topic, keywords, etc.,
		using regular expressions, Lisp, etc.
	Automatic digest mode
	Automatic "rot13"

A simple sequence lets you edit the per-newsgroup "hook", used to
permanently kill topics, subjects, etc.
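The regular-expression junking described above can be sketched in a few lines. The header layout and rule format here are illustrative only, not Gnews's actual hook syntax:

```python
import re

def junked(article, kill_rules):
    """Return True if an article matches any kill rule.

    article: hypothetical mapping of header name to value.
    kill_rules: list of (header, regex) pairs, e.g. loaded from a
    per-newsgroup kill hook.
    """
    return any(
        re.search(pattern, article.get(header, ""), re.IGNORECASE)
        for header, pattern in kill_rules
    )
```

Everything from "junk this author" to "junk this whole topic" is then just one more (header, regex) pair in the hook.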

Since the environment is GNU Emacs, you can skip around several articles
while composing a response.

Since it is Emacs, you can use spelling correctors, etc., on articles
while composing them.

When replying by mail, the system can provide you with several
different addresses/paths, and you can select the one that has the
best chance of working.

I am using it now, and love it.  It is sort of the ultimate
programmable news reader.  Be prepared to pay a small amount of CPU load,
and to invest in learning GNU Emacs Lisp.

--
	Bruce G. Barnett 	barnett@ge-crd.ARPA, barnett@steinmetz.ge.com
				uunet!steinmetz!barnett

bgun@trwind.UUCP (Bill Gunshannon) (01/03/89)

While we are on the subject of "future public networks", has anyone here
considered the concept of trying to get your local public library to set
up a small machine with a handful of terminals in order to allow access
to USENET news for people who would otherwise not only not have
that access, but probably have never even heard of it?
The idea may seem strange up front, but I don't think it is any stranger than
all the other magazines that most libraries tend to carry.

As an interesting side effect, if the idea caught on it could even lead to
a new backbone consisting of libraries all across the country.  


Any comments???

bill gunshannon

bgun@trwind.ind.TRW.COM

doug@isishq.FIDONET.ORG (Doug Thompson) (01/04/89)

 
 SG>From: koreth@ssyx.ucsc.edu.ucsc.edu (Steven Grimm) 
 
 SG>Actually, it seems to me that Usenet is really archaic as it exists now. 
 SG>Netnews has many aspects of a hypertext system, but it's not quite there. 
 SG>Why not scrap all the news software and start anew?  It should be possible 
 SG>to retain all of the old functionality, but in a new context. 
 
Reason 1: just try to get 15,000 system administrators to do the same 
software mod on the same day with their 15 different varieties of 
hardware.  
 
Listen, you write the software, port it to all 15 hardware 
environments, debug it, and give it to all of us, and hey, sure! 
 
 SG> For instance, 
 SG>quoting part of an article and commenting on it is VERY hypertextish -- 
 SG>but why retransmit most of the original article every time it's quoted? 
 SG>How about saying "this part of my message is a response to bytes 150 through 
 SG>402 of message 108832@foo.bar?"  The news software could optionally insert 
 SG>the proper part of the original, thus mimicing the behavior of the news 
 SG>system we have now; or it could allow the reader to jump around the links 
 SG>between messages.  (Readers of this newsgroup/mailing list should already 
 SG>be familiar with the other aspects of hypertext; it should be apparent that 
 SG>reading messages organized in this way would make it much easier to get 
 SG>at the information you want.) 
 SG> 
 SG>As well as making news a lot easier (and more fun) to peruse, this scheme 
 SG>could dramatically reduce the bandwidth eaten by net messages. 
 SG> I haven't seen any hard numbers on this, but I imagine that a good 25% 
 SG>of the text  
 SG>that gets sent around is in the form of quotes from older articles. 
 
 
Yeah.  We could reduce the data transmitted this way, and that would be 
good.  But we would immediately run into some other problems.  In some of 
the high-traffic newsgroups I maintain a two-day expire.  Anything more 
than 48 hours old just isn't there anymore.  Reason?  There are higher 
priority demands on finite disk space. 
 
So, your suggestion would require that I keep those messages longer (a 
lot longer) or dump the newsgroup.  
 
Which is to say that the current set-up balances out the costs of 
moving data against the cost of storing data in a fashion that allows 
a lot of flexibility. While your solution would be great for those 
systems that have unlimited disk space but don't like phone bills, 
those of us who can cope with phone bills but are squeezed for disk 
space would be out of luck. 
 
Of course the reason that we can cope with the phone bills is that 
usenet is a local call, and costs no cash at all. 
 
Then there is the cpu time to go looking for those 150 lines from a 
two-week-old article.  Combined with the fact that conversation threads 
often last for *months*, with quotes within quotes within quotes, I 
think I would have to increase my disk allocation for net news by a 
factor of five or ten in order to avoid the situation where a user 
sees: 
 
"sorry, referenced article does not exist". 
 
Then there are the unreliabilities which periodically leave us viewing 
quotes from messages we have never seen because they never made it to 
this site. 
 
While optimizing transmission by 25%, I think you'd lose a 
lot elsewhere.  In the past two years usenet has largely shifted from 
2400 baud telephone connections to 9600 baud connections.  That has 
resulted in a reduction in connect time (read: cost) to 1/5th of what 
it was.  ISDN promises 64 kbps transfer rates in the foreseeable 
future. 
 
I think there is more to be gained in pumping faster than there is to 
be gained by pumping less, for now. 
 
Anyway, if you want to write the software and give it away, I'll be 
happy to install it and tell you about all its bugs :-) 
 
Don't take this as discouragement. I'd really *like* it if you would 
write the software! Just make sure it  includes some (highly portable) 
device drivers to make a 60 Mb hard disk store 300 Mb of data :-). 
 
=Doug 
 


--  

------------------------------------------------------------------------
Fido      1:221/162 -- 1:221/0                         280 Phillip St.,  
UUCP:     !watmath!isishq!doug                         Unit B-4-11
DAS:      [DEZCDT]doug                                 Waterloo, Ontario
Bitnet:   fido@water                                   Canada  N2L 3X1
Internet: doug@isishq.math.fidonet.org                 (519) 746-5022
------------------------------------------------------------------------

doug@isishq.FIDONET.ORG (Doug Thompson) (01/04/89)

 
 DF>From: freedman@cpsc.ucalgary.ca (Daniel Freedman) 
 DF>Has anyone thought of allowing news readers to do the following things: 
 DF> 
 DF>1) view all new messages that have the same subject together, rather than 
 DF>   having to juggle many subjects in his head while reading news. 
 
Try ^N (if I recall correctly) in rn. 
 
 DF> 
 DF>2) easily apply the 'n' function to not only the current article, but also 
 DF>   to all followups to it. 
 
See above. 
 
Even humble Fidonet PC news-reading software lets you search for any 
keyword(s) in the subject line and display headers accordingly, or 
move through a conference based on commonality in the subject line. 
 
It's a great idea. It was implemented long ago. 
   


--  

------------------------------------------------------------------------
Fido      1:221/162 -- 1:221/0                         280 Phillip St.,  
UUCP:     !watmath!isishq!doug                         Unit B-4-11
DAS:      [DEZCDT]doug                                 Waterloo, Ontario
Bitnet:   fido@water                                   Canada  N2L 3X1
Internet: doug@isishq.math.fidonet.org                 (519) 746-5022
------------------------------------------------------------------------

gary@percival.UUCP (Gary Wells) (01/10/89)

In article <434@trwind.UUCP> bgun@trwind.UUCP (Bill Gunshannon) writes:
>
>While we are on the subject of "future public networks" has anyone here
>considered the concept of trying to get your local public library to set
>up a small machine with a handfull of terminals in order to allow access
>to USENET NEWS information to people who would otherwise not only not have
>that access but probably have never even heard of it.
>The idea may seem strange up front but I don't think it is any stranger than
>all the other magazines that most libraries tend to carry.
>
>As an interesting side effect, if the idea caught on it could even lead to
>a new backbone consisting of libraries all across the country.  
>
>Any comments???

Well, just a couple.
1) Just this last Sunday I was involved in a several-hour-long debate 
among 10-15 people regarding the quality/quantity of usenet postings and the
attendant costs of transmission, storage, and reading all that stuff.   I don't
want to speak for the group, but the general consensus was that we should be
thinking about ways to limit the volume.  I think putting terminals in the 
library would tend to vastly increase the volume.  Probably also lower the 
quality, if your library harbors the same "dubious" characters that mine does.
I'm not saying this is a bad idea; in fact I think it's a good idea.  But first
let's figure out some way to handle the _existing_ traffic gracefully.

2) My library is so strapped for operating funds that the last time we 
approached them about the local users group donating a computer to them, they 
turned us down.  Couldn't afford the power and possible maintenance costs.


-- 
--------------------------------------------------------------------------------
Still working on _natural_ intelligence.

gary@percival   (...!tektronix!percival!gary)

snoopy@sopwith.UUCP (Snoopy) (01/15/89)

In article <377@intek01.UUCP> mark@intek01.UUCP (Mark McWiggins) writes:
|From article <75@sopwith.UUCP>, by snoopy@sopwith.UUCP (Snoopy):
|> 
|>                                     ... There is a lot of useful information
|> in usenet, but it is often buried in garbage.  How much time do you want
|> to spend skimming through articles hitting 'k' or 'n'?  KILL files help,
|> but not enough.  Moderation sometimes helps, but often the moderator bends
|> a group far too strongly in the direction of their own interests.
|
|Have you tried the "vn" newsreader?  It presents a screenful of message
|titles, and you just pick the ones you want.  It makes picking just a few
|articles out of a huge batch about as quick as it can be.

Haven't tried vn.  I've used readnews and vnews, and currently use rn.
The '=' command is sometimes helpful.  I currently have it presenting
articles grouped by subject; however, this slows things down, as rn needs
lots of disk i/o bandwidth to go hunting for matching subjects in a
couple hundred articles scattered around on the disk.  (Even the Fast File
System isn't perfect.)  And that's after it has already hunted down all the
subjects to do the kill-file processing!

But these and other suggestions people have posted miss the point.  News
volume is increasing exponentially, our time to read news isn't.  To
continue using usenet effectively, we MUST build better tools to screen
articles.  Not just *slightly* better tools, but tools that are as much
an improvement over rn as rn was over readnews/vnews.  This won't be easy,
as Larry has already done all the obvious stuff and more.  (Thanks Larry!)
Or find good moderators for 95% of the newsgroups.  And figure out some way
to keep the moderators from getting buried.
    _____     
   /_____\    Snoopy	"You are lost in a maze of newsgroups, all different."
  /_______\   
    |___|     tektronix!tekecs!sopwith!snoopy
    |___|     sun!nosun!illian!sopwith!snoopy