[comp.ai.neural-nets] Neuron Digest V5 #24

neuron-request@HPLABS.HP.COM ("Neuron-Digest Moderator Peter Marvit") (05/31/89)

Neuron Digest	Tuesday, 30 May 1989
		Volume 5 : Issue 24

Today's Topics:
			       Administrivia
			   neuron update function
		      wanted: neurobiology references
		    Re: wanted: neurobiology references
		    Re: wanted: neurobiology references
		    Re: wanted: neurobiology references
		    Re: wanted: neurobiology references
		    Re: wanted: neurobiology references
		    Re: wanted: neurobiology references
		  Re: ART and non-stationary environments
		  Re: ART and non-stationary environments
		  Re: ART and non-stationary environments
      Size limits of BP (Was Re: ART and non-stationary environments)
    Re: Size limits of BP (Was Re: ART and non-stationary environments)
    Re: Size limits of BP (Was Re: ART and non-stationary environments)
  Summary of AI and Machine Learning applications in Information Retrieval


Send submissions, questions, address maintenance and requests for old issues to
"neuron-request@hplabs.hp.com" or "{any backbone,uunet}!hplabs!neuron-request"
ARPANET users can get old issues via ftp from hplpm.hpl.hp.com (15.255.16.205).

------------------------------------------------------------

Subject: Administrivia
From:    "Neuron-Digest Moderator -- Peter Marvit" <neuron@hplabs.hp.com>
Date:    Tue, 30 May 89 23:26:17 -0700 

Well, the response to having a Birds-of-a-Feather session at IJCNN in
Washington has been less than overwhelming; *no one* has responded.  Ah
well....

A reminder to *please* tell me if your account will be disabled this summer. 

	-Peter Marvit

------------------------------

Subject: neuron update function
From:    eghbalni@spectra.COM (Hamid Eghbalnia)
Organization: Spectragraphics, Corp., San Diego, CA
Date:    Wed, 19 Apr 89 23:43:33 +0000 


	I am looking for literature references on neural update functions
	other than sigmoids and thresholds.  Specifically, I would appreciate
	references to material from both a neuroscience and a systems
	engineering point of view.

	If you've figured out what I want by now, you don't need to read
	further.

	There is supposedly an argument that functions such as the sigmoid
	are neither biologically accurate nor optimal from a dynamical-systems
	convergence point of view.  I am trying to find out from what source,
	if any, I can substantiate that - or refute it.

				Thanks.

	===============================   Reply:  ...!nosc!spectra!eghbalni
	===============================   or   :  eghbalni@spectra.com
	Disclaimer: standard.

------------------------------

Subject: wanted: neurobiology references
From:    Ian Parberry <omega.cs.psu.edu!ian@PSUVAX1.CS.PSU.EDU>
Organization: Penn State University
Date:    20 Apr 89 19:57:10 +0000 


At NIPS last year, one of the workshop attendees told me that, assuming one
models neurons as performing a discrete or analog thresholding operation on
weighted sums of their inputs, the summation appears to be done in the axons
and the thresholding in the soma.  This interested me because typical neural
network models don't take into account the hardware separation of these
operations, and Berman, Schnitger and I had discovered (without
realizing the biological connection) that a new neural network model which
allows this separation appears to be much more fault-tolerant than the old
ones.
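
[[ Editor's note: for concreteness, a minimal sketch (in Python; purely
illustrative, not the Berman-Schnitger-Parberry model itself) of the unit
described above, with summation and thresholding written as explicitly
separate stages, mirroring the hardware separation the attendee reported:

    def weighted_sum(inputs, weights):
        # Summation stage (in the axons, per the attendee's account).
        return sum(x * w for x, w in zip(inputs, weights))

    def threshold(activation, theta=0.0):
        # Thresholding stage (in the soma); the discrete version is shown.
        return 1 if activation >= theta else 0

    def unit(inputs, weights, theta=0.0):
        return threshold(weighted_sum(inputs, weights), theta)

A model that treats the two stages as physically distinct components can
then consider faults in each stage independently. ]]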

It's now time to write up the fault-tolerance result.  I'd like to include
some references to "accepted" neurobiological sources which back up the
attendee's observation.  Trouble is, I am not a neurobiologist, and do not
know where to look.  Can somebody knowledgeable please advise me?

Thanks,
Ian.

			Ian Parberry
    "Bureaucracy is expanding to meet the needs of an expanding bureaucracy"
  ian@theory.cs.psu.edu  ian@psuvax1.BITNET  ian@psuvax1.UUCP  (814) 863-3600
 Dept of Comp Sci, 333 Whitmore Lab, Penn State Univ, University Park, Pa 16802

------------------------------

Subject: Re: wanted: neurobiology references
From:    Mark Robert Thorson <portal!cup.portal.com!mmm@uunet.uu.net>
Organization: The Portal System (TM)
Date:    22 Apr 89 00:13:57 +0000 

I was taught, 10 years ago, that action potentials are believed to originate
at the axon hillock, which might be considered the transition between the
axon and the soma (cell body).  See FROM NEURON TO BRAIN by Kuffler and
Nicholls (Sinauer, 1976), page 349.

I would expect synaptic weights to be proportional to the axon circumference
where it joins the cell body, but I have no evidence to support that belief.

------------------------------

Subject: Re: wanted: neurobiology references
From:    mmm@cup.portal.com (Mark Robert Thorson)
Organization: The Portal System (TM)
Date:    Sat, 22 Apr 89 22:08:20 +0000 

> I would expect synaptic weights to be proportional to the axon circumference
> where it joins the cell body, but I have no evidence to support that belief.

Oops, I meant "dendrite circumference", of course.  And now that I think
about it, that's wrong too.  I was taught that there are two kinds of
conduction in nerve cells, "electrotonic" and "propagative".  The former
might be described as an electrolytic and resistive form of conduction,
while the latter involves action potentials originating in the axon hillock.

When the professor said this, I immediately asked, "Do you ever see
propagative conduction in dendrites?"  He said yes, and drew a diagram of a
neuron with a long axon and several dendrites, one of which was as long as
the axon.  He then proceeded to shade in both the axon and the long dendrite
with colored chalk to indicate where propagative conduction took place.

------------------------------

Subject: Re: wanted: neurobiology references
From:    boothe@mathcs.emory.edu (Ronald Boothe {guest})
Organization: Emory University
Date:    Sun, 23 Apr 89 14:17:12 +0000 

For most neurons in the brain you can probably ignore propagative conduction
by dendrites and just consider the effects of electrotonic conduction.  This
conduction is attenuated according to the space constant of the cell
membrane, so each synaptic input needs to be weighted by its distance from
the axon hillock.  In addition, many dendrites have branches and
varicosities which can alter the resistance to current flow along the
dendrite, so the geometry of the dendrites must also be taken into account.
Finally, a majority of excitatory synapses onto dendrites contact
specialized anatomical structures called spines.  These spines are shaped
like mushrooms, with a thin stalk projecting from the dendrite and the
synaptic input arriving on the head of the spine.  This long thin stalk
provides resistance to current flow, so the weight of each synaptic input is
also influenced by the length and diameter of the stalk (some think a good
biological mechanism for altering the weights of specific inputs is to
change the shapes of the spines).
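
[[ Editor's note: a rough numerical illustration of the distance weighting
(a deliberately crude Python sketch: it uses the idealized exponential decay
of an infinite passive cable and ignores the branch, varicosity, and spine
effects just described; the space constant is an arbitrary choice):

    import math

    def effective_weight(w, distance_um, space_constant_um=300.0):
        # Electrotonic attenuation: a passive postsynaptic potential decays
        # roughly exponentially with distance from synapse to axon hillock,
        # with length scale lambda, the membrane space constant.
        return w * math.exp(-distance_um / space_constant_um)

    def net_input_at_hillock(synapses):
        # synapses: list of (raw synaptic weight, distance to hillock in um).
        return sum(effective_weight(w, d) for w, d in synapses)

]]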

There is lots of recent work on this topic in the neuroscience literature.
I don't recall specific references right now, but some of the influential
early work was done by W. Rall.  A check of the citation index to see who is
making reference to the old Rall papers should turn up current literature.

Ronald Boothe {guest}
Emory University, Atlanta, GA 
ARPA, CSNET:	boothe@emory.ARPA			BITNET: boothe@emory
UUCP: { sun!sunatl, gatech }!emory!boothe	

------------------------------

Subject: Re: wanted: neurobiology references
From:    carter@sloth.gatech.edu (Carter Bullard)
Organization: ICS Department, Georgia Institute of Technology
Date:    Mon, 24 Apr 89 14:58:09 +0000 

well,

   The idea of synaptic weights emerged principally from neuropharmacology.
It attempted to explain such phenomena as the changes in the way neurons
responded to GABA (gamma-aminobutyric acid) in the presence of Valium; the
dopaminergic theory of psychosis; why some antipsychotic drugs
(chlorpromazine) seemed to work best during the morning; altered responses
to visual stimuli, at the cerebellar level, in the presence of amphetamine,
in cats ..... the list goes on.

   The basic idea is that the transfer of information from one neuron to the
next is chemically based.  To summarize: as the nerve action potential
reaches the "terminal bouton" (the collection of synapses that represents
the "end" of a neuron), the electrical gradient change on the membrane of
the presynaptic neuron sets off a series of reactions that result in the
release of chemicals, "neurotransmitters", into the synaptic cleft.
Because the recipient (postsynaptic) neuron has receptors on its outer
membrane that respond to the neurotransmitter, small deformations in the
electrical potential of the target neuron occur.  These are called miniature
excitatory (or inhibitory) postsynaptic potentials (MEPPs).  These
electrical changes propagate along the membrane, similar to ripples on a
water's surface.  The axon hillock, which is a specialized area on the
surface of the cell body of a neuron, can act as a capacitor, of sorts, in
that it can "summate" the potential changes over time.  It is thought that
the threshold for excitation originates at the axon hillock, but this is not
always the case, as the entire membrane of the neuron has the ability to
start a nerve action potential.  The axon hillock is generally responsible
for summating MEPPs.
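
[[ Editor's note: a caricature of that summating behavior as a leaky
integrator with reset (a Python sketch under strong simplifying assumptions;
the constants are invented, and real hillock dynamics are far richer, as
this discussion makes clear):

    def hillock(psps, threshold=1.0, tau=10.0):
        # psps[t] = summed postsynaptic potential arriving at time step t.
        v, spike_times = 0.0, []
        for t, psp in enumerate(psps):
            v = v - v / tau + psp   # old potential leaks, new input adds
            if v >= threshold:      # summated potential crosses threshold
                spike_times.append(t)
                v = 0.0             # reset after the action potential
        return spike_times

]]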

   But the ability of a MEPP to cause a change at the axon hillock is
dependent on the distance between the locus of the chemical reaction to the
neurotransmitter and the axon hillock, the strength of the MEPP, and the
properties of the cell membrane that facilitate the propagation of the MEPP
along the membrane's surface.  This is determined by many factors, but the
topology of the neuron is, indeed, important.

  However, the principal contributors to synaptic weight are generally
thought to be biochemically based.  These include such properties as the
amount of neurotransmitter that is released from the presynaptic neuron, the
number of receptors that are available on the postsynaptic neuron, the
effectiveness of the transmitter in creating a MEPP, the duration of the
neurotransmitter/receptor association, and the effectiveness of the
postsynaptic membrane in propagating the MEPP.

  The amount of neurotransmitter released with any given nerve action
potential is not constant over time, as the transmitter pool that is
available for release is limited.  The history of excitation of a neuron is
important, since neurotransmitters can be depleted by repeated excitation.
This is transmitter exhaustion, a real phenomenon that can be demonstrated
experimentally and clinically.  The factors that determine presynaptic
neurotransmitter availability are generally described by 4th- or 5th-order
nonlinear differential equations, depending on whether or not you consider
variations in diet.
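
[[ Editor's note: a first-order caricature of transmitter exhaustion (a
Python sketch; the real models, as noted above, are 4th- or 5th-order and
nonlinear, and every constant here is invented for illustration):

    def transmitter_release(spike_train, pool0=1.0, release_frac=0.2,
                            recovery=0.05):
        # Each spike releases a fraction of the available pool; the pool
        # recovers toward its resting size between spikes.
        pool, released = pool0, []
        for spiked in spike_train:
            r = release_frac * pool if spiked else 0.0
            pool += recovery * (pool0 - pool) - r
            released.append(r)
        return released  # release per spike shrinks under rapid firing

]]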

  The number of receptors that are available on the postsynaptic neuron,
their effectiveness in responding to chemical stimuli, and the rate of
receptor turnover have been the subject of pharmacological study for over 50
years, and are rather complicated.  The best models are 3rd- and 4th-order
differential equations, where the history of excitation is a prominent
factor.

  The ability of the postsynaptic nerve membrane to propagate the MEPPs to
the axon hillock is also dependent on the history of excitation.

  Sooooooo, the number of historical dependencies of synaptic weight can be
considered rather high.  The topology of the nerve is not that variable, but
the biochemical aspects of nerve function are extremely variable.  It is
probably this, along with a great many other factors, such as the role of
glial cells in neuronal functionality, that contributes the most to the
"weights" of a particular neuronal event.
  
Carter Bullard
School of Information & Computer Science, Georgia Tech, Atlanta GA 30332
uucp:	...!{decvax,hplabs,ihnp4,linus,rutgers}!gatech!carter
Internet:	carter@gatech.edu

------------------------------

Subject: Re: wanted: neurobiology references
From:    cs012133@brunix (Jonathan Stone)
Organization: Brown University Department of Computer Science
Date:    Mon, 24 Apr 89 19:06:17 +0000 

In article <545@hydra.gatech.EDU> carter@sloth.gatech.edu (Carter Bullard) writes:
>into the synaptic cleft.  Because the recipient (post synaptic) neuron
>has receptors on its outer membrane that respond to the neurotransmitter,
>small deformations in the electrical potential of the target neuron occur.
>These are called miniature excitatory (or inhibitory) postsynaptic
>potentials (MEPPs).  

I think there is a little confusion here.  I learned that MEPP stands for
Miniature End-Plate Potential, in reference to the variation in potential
of the MEP (Motor End-Plate, where a neuron joins a muscle fiber) caused by
the release of a packet of acetylcholine by a motor neuron.  What the writer
meant is EPSP (and IPSP), which means what he said MEPP means, minus the
"miniature".

Also, the specialization necessary to initiate (or sustain) a
self-propagating action potential is the presence of voltage-gated sodium
channels, which I do not believe are located anywhere but along the axon
(and at its start).  To say that the hillock summates over time is
inaccurate because I don't think it waits... it simply samples the potential
as soon as it is able (a set time period after the previous AP) and fires
whenever the potential rises above threshold.  The summation is done at the
INPUT site, in that if a second input arrives before the effect of the first
has dissipated, the effect of the second will be added to that of the first.
It is obvious if you understand the underlying mechanisms.

As for synaptic weight, there is presently much debate over the biochemical
mechanism, with several recent advances.  It will probably be settled when
the mechanism by which weights are changed is discovered.  Previously, the
hot answer was a change in the shape of the dendritic spine, but now it
seems that the NMDA receptor as well as the molecule CaM-kinase are the
mediating factors (though their effect may simply be to change the shape of
the spine).  The big debate now, though, is whether anti-Hebbian learning is
pre-not-post or post-not-pre.

Hebbian learning occurs when the presynaptic neuron effectively causes the
postsynaptic neuron to fire--both are depolarized (active) simultaneously:
the connection between the two is strengthened. However, ANTI-Hebbian
learning, or weakening of the synapse, occurs under uncertain conditions.
Whether this occurs when the presynaptic cell fires but not the post, or
when the postsynaptic cell fires but not the pre, is the topic that most
interests my teacher, Mark Bear, who currently favors pre-not-post.  Crap,
I'm late for class--sorry, but I hope this much helps the discussion.
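
[[ Editor's note: in update-rule form, the distinction reads roughly as
follows (a schematic Python sketch; the binary activity model and the fixed
learning rate are simplifications, not anyone's proposed mechanism):

    def update_weight(w, pre, post, lr=0.01, anti_rule="pre_not_post"):
        # Hebbian strengthening: pre- and postsynaptic cells active together.
        if pre and post:
            return w + lr
        # Anti-Hebbian weakening: the open question is which mismatch
        # drives it, pre-not-post or post-not-pre.
        if anti_rule == "pre_not_post" and pre and not post:
            return w - lr
        if anti_rule == "post_not_pre" and post and not pre:
            return w - lr
        return w

]]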

------------------------------

Subject: Re: wanted: neurobiology references
From:    g523116166ea@deneb.ucdavis.edu (0040;0000008388;0;327;142;)
Organization: University of California, Davis
Date:    Wed, 26 Apr 89 01:00:13 +0000 

A useful review of electrotonic ("cable-core") models for neural conduction
is J. Jack et al., _Electric Current Flow in Excitable Cells_, Oxford U.
Press, 1975.  Only a little dated, and the clarity of its presentation more
than compensates.

A current review of the dendritic spine phenomenon and some of the modelling
is R. Coss and D. Perkel, "The Function of Dendritic Spines: A Review of
Theoretical Issues", Behavioral and Neural Biology (44) 151-185, 1985.  Coss
was my PhD advisor, and I did most of the modelling in his lab.  The major
problem wasn't technical but interpretive, I thought: we don't know a neural
system in which the spine swelling could be meaningful (analytically, that
is; there's lots of qualitative speculation).  Spine swelling is suggestive,
and we did fascinating natural-history studies: e.g., we let young honeybees
out of their hive for the first time to conduct their first flight and to
learn all the cues for returning home safely.  We recovered them and popped,
freeze-dried, and sliced their brains for neuromorphometry.  The spine
population after this one-trial learning was significantly skewed toward
more swollen shapes.  Other studies have used cichlid fish and ground
squirrels.  Marion Diamond (UCB) probably could comment on any human data.

Hopefully helpfully...

====
R. Goldthwaite             rogoldthwaite@{deneb.}ucdavis.edu
Psychology and Animal Behavior/evolution, U.California, Davis, 95616

"Genetic algorithms: designer genes make new niches"

new PhD: postdocs/job possibilities welcome!

------------------------------

Subject: Re: ART and non-stationary environments
From:    myke@gatech.edu (Myke Reynolds)
Organization: School of Information and Computer Science, Georgia Tech, Atlanta
Date:    Fri, 28 Apr 89 23:56:08 +0000 

In article <2503@bucsb.UUCP> adverb@bucsb.bu.edu (Josh Krieger) writes:
>I think it's important to say one last thing about ART:
>
>ART is primarily useful in a statistically non-stationary environment
>because its learned categories will not erode with the changing input.
>If your input environment is stationary, then there may be little reason
>to use the complex machinery behind ART; your vanilla backprop net will
>work just fine.

BAM is the stationary version of ART, and blows backprop out of the water
in both power and simplicity.  It's no more than a linear equation solver,
but that's enough to outperform backprop.  The claim that backprop is not
much worse is not only wrong, it makes for a skimpy last-ditch effort to
argue for a model that has no other defense.
 
Myke Reynolds
School of Information & Computer Science, Georgia Tech, Atlanta GA 30332
uucp:	...!{decvax,hplabs,ncar,purdue,rutgers}!gatech!myke
Internet:	myke@gatech.edu

------------------------------

Subject: Re: ART and non-stationary environments
From:    kavuri@cb.ecn.purdue.edu (Surya N Kavuri )
Organization: Purdue University Engineering Computer Network
Date:    Sun, 30 Apr 89 19:57:48 +0000 

> BAM is the stationary version of ART, and blows backprop out of the
> water in both power and simplicity.  It's no more than a linear equation
> solver, but that's enough to outperform backprop.

   I do not understand what you mean by "power", but if you look at the
memory capacity, BAMs look pathetic.

   I do not speak for BP, but I have heard explanations that the hidden
layers serve as feature detectors (4-2-4 decoder), which bear an intuitive
likeness to pattern classification methods.


                                             Surya Kavuri
                                             (FIAT LUX)

  P.S.: What I despise in relation to BP is the apparent tendency people
have to romanticize it.  (I should say that the problem is not with BP but
with its researchers.)  I have seen sinful explanations of what the hidden
units stand for.  I have seen claims that they stand for concepts that could
be given physical meanings (sic!).  These are baseless dreams that people
come up with.  This is a disgrace to the serious scientific community, as it
indicates a degeneration.

   BP is not even a steepest-descent approach, strictly speaking.  It does
minimization of an error measure.

   (1) There are no measures of its convergence time.

------------------------------

Subject: Re: ART and non-stationary environments
From:    myke@gatech.edu (Myke Reynolds)
Organization: School of Information and Computer Science, Georgia Tech, Atlanta
Date:    Sun, 30 Apr 89 22:12:46 +0000 

Surya N Kavuri writes:
>   I do not understand what you mean by "power" but if you look at the 
>  memory capacity, BAM's look pathetic.  

Its memory capacity is no less than that of a linear filter, and its size is
not limited, unlike BP's. Since size = memory capacity, its memory capacity
is limited only by your implementation of a linear equation solver. If you
don't take the obvious step of using a sparse solver, then it will be
pathetic.
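
[[ Editor's note: for readers unfamiliar with BAM, a minimal Python sketch
of the standard Kosko-style correlation-matrix formulation (the textbook
version, not necessarily the linear-equation-solver variant the poster has
in mind):

    import numpy as np

    def bam_store(pairs):
        # Weight matrix as a sum of outer products of bipolar (+1/-1) pairs.
        return sum(np.outer(x, y) for x, y in pairs)

    def bam_recall(W, x, steps=10):
        # Bounce activity between the two layers until it stabilizes.
        sign = lambda v: np.where(v >= 0, 1, -1)
        for _ in range(steps):
            y = sign(W.T @ x)
            x = sign(W @ y)
        return x, y

]]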
 
Myke Reynolds
School of Information & Computer Science, Georgia Tech, Atlanta GA 30332
uucp:	...!{decvax,hplabs,ncar,purdue,rutgers}!gatech!myke
Internet:	myke@gatech.edu

------------------------------

Subject: Size limits of BP (Was Re: ART and non-stationary environments)
From:    frey@eecae.UUCP (Zachary Frey)
Organization: Michigan State University, ERDL
Date:    Tue, 02 May 89 18:21:52 +0000 

In article <18589@gatech.edu> myke@gatech.UUCP (Myke Reynolds) writes:
>[ART's] memory capacity is no less than that of a linear filter, 
>and its size is
>not limited, unlike BP. Since size = memory capacity, its memory capacity
>is limited only by your implementation of a linear equation solver.

I am not familiar with ART, but I am familiar with back-propagation from the
Rumelhart & McClelland PDP volumes, and I don't remember ever seeing
anything about a size limit on networks implemented with back-propagation.
Could you elaborate?

I am currently working on implementing a simulation for feedforward networks
using BP as a learning rule that should work for arbitrarily large networks
(limited by computer resources, of course).  Since the equations involved
are recursively defined, I don't see why there should be a size limit on the
net.
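
[[ Editor's note: the recursion in question is the delta rule passed
backward layer by layer; a minimal Python sketch, assuming sigmoid units and
squared error.  The depth and widths are fixed only by the shapes of the
weight matrices, which is the sense in which the equations themselves impose
no size limit:

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def backprop_step(weights, x, target, lr=0.5):
        # Forward pass through a fully connected net of arbitrary depth.
        acts = [x]
        for W in weights:
            acts.append(sigmoid(W @ acts[-1]))
        # Output-layer delta, then the recursive backward pass.
        delta = (acts[-1] - target) * acts[-1] * (1.0 - acts[-1])
        for i in reversed(range(len(weights))):
            grad = np.outer(delta, acts[i])
            delta = (weights[i].T @ delta) * acts[i] * (1.0 - acts[i])
            weights[i] -= lr * grad   # update after delta uses old weights
        return weights

]]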

Zach Frey

* U.S.nail:  Zachary Frey           || e-mail:  frey@frith.egr.msu.edu    *
*            326 Abbot Hall         ||          frey@eecae.ee.msu.edu     *
*            E. Lansing, MI  48825  || voice:   (517)355-6421             *
* DISCLAIMER: My opinions, my responsibility.                             *

------------------------------

Subject: Re: Size limits of BP (Was Re: ART and non-stationary environments)
From:    myke@gatech.edu (Myke Reynolds)
Organization: School of Information and Computer Science, Georgia Tech, Atlanta
Date:    Wed, 03 May 89 18:07:47 +0000 

Try increasing the number of internal nodes without changing the
input/output you train it on.  If you were to simulate more complex
input/output, an increased number of internal nodes would be necessary to
learn the greater complexity.  But even without greater complexity you will
notice a rapid decrease in learning rate as a function of the number of
internal nodes, and at a certain point it stops learning altogether.

 
Myke Reynolds
School of Information & Computer Science, Georgia Tech, Atlanta GA 30332
uucp:	...!{decvax,hplabs,ncar,purdue,rutgers}!gatech!myke
Internet:	myke@gatech.edu

------------------------------

Subject: Re: Size limits of BP (Was Re: ART and non-stationary environments)
From:    mbkennel@phoenix.Princeton.EDU (Matthew B. Kennel)
Organization: Princeton University, NJ
Date:    Thu, 04 May 89 00:38:20 +0000 

)Try increasing the number of internal nodes without changing the input/output
)you train it on....

Yes, it starts to memorize once you have significantly more free parameters
than examples.  I would think, though, that this is a fundamental
limitation--- you can always fit a 5th-degree polynomial through 5 points,
for example.  The same sort of thing should apply in networks... (plug for
my adviser's research).  If you need significantly more free weights than
examples, then you're wasting weights, or the functional representation is
poor.  And if you don't have enough examples, you won't be able to learn a
complicated function, because the examples don't map out the input space
well enough.
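
[[ Editor's note: the polynomial analogy is easy to check numerically
(illustrative Python only; with 5 examples, degree 4 already supplies as
many coefficients as data points, so the fit interpolates the samples
exactly and is free to wiggle between them):

    import numpy as np

    rng = np.random.default_rng(0)
    x = np.linspace(0.0, 1.0, 5)                  # 5 training examples
    y = np.sin(2.0 * x) + 0.05 * rng.standard_normal(5)

    coeffs = np.polyfit(x, y, 4)                  # 5 free parameters
    print(np.polyval(coeffs, x) - y)              # ~0: pure memorization

]]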

Something that I'd like to explore further is learning with a radial-basis
function network, i.e. a one-hidden-layer network where the first layer's
input function is:

 	n_j = sum_i (o_i - w_ij)^2 ;   o_j = 1 / (1 + n_j^2),  for example,

instead of the conventional  n_j = sum_i o_i * w_ij.

You can learn the output layer of this network using a guaranteed
conventional algorithm (linear least squares, via singular value
decomposition) once you've selected the centers, i.e. the first layer of
weights, with k-means clustering for example.
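
[[ Editor's note: a compact Python sketch of that two-step recipe, assuming
the centers have already been chosen (by k-means, for example) and using the
squashing function suggested above:

    import numpy as np

    def rbf_layer(X, centers):
        # n_j = sum_i (o_i - w_ij)^2 ;  o_j = 1 / (1 + n_j^2)
        n = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        return 1.0 / (1.0 + n ** 2)

    def fit_output_weights(X, Y, centers):
        # Output layer by linear least squares; numpy's lstsq is SVD-based,
        # matching the "guaranteed conventional algorithm" described above.
        H = rbf_layer(X, centers)
        W, *_ = np.linalg.lstsq(H, Y, rcond=None)
        return W

    # Prediction:  Y_hat = rbf_layer(X_new, centers) @ W

]]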

With one hidden layer, this network can perform complicated nonlinear
transformations, unlike the simple perceptron.

For predicting chaotic time series, where the inherent locality of the
functional representation is an advantage, this method is more accurate and
faster to converge, I've found.

Matt Kennel
mbkennel@phoenix.princeton.edu

------------------------------

Subject: Summary of AI and Machine Learning apps in Information Retrieval
From:    knareddy@umn-d-ub.D.UMN.EDU (Krishna)
Organization: U of Minnesota-Duluth, Information Services
Date:    Mon, 24 Apr 89 01:52:15 +0000 

[[ Editor's note: In addition to this excellent list, I know of at least one
major effort at using a connectionist superstructure with a frame-based
architecture and traditional discourse analysis to categorize scientific
abstracts for later retrieval.  Any other you know of? -PM ]]

Hi Netters,

Earlier I posted a request seeking information about AI techniques and
Machine Learning applications in Information Retrieval.  Based on the
responses received and the information I had with me, I've prepared a list
of conference proceedings and publications.

There have been many requests to summarize the info, so here it is.  The
following list is far from exhaustive.  Any additions may be mailed to me or
posted here.

SIGIR 89 has tutorials on related topics, and one may look forward to the
conference proceedings.  SIGIR 89 is to be held in Boston.

Thanks,
krishna

Note: RIAO 88 is the conference on "USER-ORIENTED CONTENT-BASED TEXT AND
IMAGE HANDLING" held at MIT, March 21-24, 1988.

				BIBLIOGRAPHY

[Guntzer, Juttner, Seegmuller, Sarre 88] "Automatic Thesaurus Construction
by Machine Learning from Retrieval Sessions", RIAO 88, Ullrich Guntzer,
G. Juttner, G. Seegmuller, F. Sarre

[Gauch, Smith 88] "Intelligent Search of Full-Text Databases", RIAO 88,
Susan Gauch, John B.Smith

[Liddy 88] "Towards a Friendly Adaptable Information Retrieval System",
RIAO 88, Elizabeth D. Liddy

[Driscoll 88] "An Application of Artificial Intelligence Techniques to
Automated Key-Wording", RIAO 88, James R. Driscoll

[Fox, Weaver, Chen, France 88] "Implementing a Distributed Expert-Based
Information Retrieval System", RIAO 88, Edward A. Fox, Marybeth T. Weaver,
Qi-Fan Chen, Robert K.France

[Harman, Benson, Fitzpatrick, Huntzinger, Goldstein 88] "IRX : An
Information Retrieval System for Experimentation and User Application", RIAO
88, Donna Harman, Dennis Benson, Larry Fitzpatrick, Rand Huntzinger, Charles
Goldstein

[Kuhn 88] "DoD Gateway Information System (DGIS) Common Command Language;
The Decision for Artificial Language", RIAO 88, Allan D. Kuhn

[Humphrey 88] "Interactive Knowledge-Based Indexing : The MedIndEx System",
RIAO 88, Susanne M. Humphrey

[Tong, Applebaum 88] "Conceptual Information Retrieval from full text", RIAO
88, Richard M. Tong, Lee A. Applebaum

[Diel, Schukat 88] "An Intelligent System for Text Processing Applications",
RIAO 88, Hans Diel, H. Schukat

[Jacob, Rau 88] "Natural Language Techniques for intelligent Information
Retrieval", SIGIR 88, P.S.Jacob, L.F.Rau

[Case 88] "How do Experts do it ? The use of Ethnographic Methods as an aid
to Understanding the Cognitive Processing and Retrieval of Large Bodies of
Text", SIGIR 88, D.O.Case

[Belkin 88]"On the Nature and Function of Explanation in Intelligent
Information Retrieval", 	SIGIR 88, N.J.Belkin

[Brachman, McGuinness 88]"Knowledge Representation, Connectionism and
Conceptual Retrieval", SIGIR 88, R.J.Brachman, D.L. McGuinness

[Jones, deBessonet, Kundu 88] "ALLOY : An Amalgamation of Expert, Linguistic
and Statistical Indexing Methods", SIGIR 88, L.P.Jones, C. deBessonet, S.
Kundu

[Brajnik, Guida, Tasso 88] "IR-NLI II : Applying Man-Machine Interaction and
Artificial Intelligence Concepts to Information Retrieval", SIGIR 88,
G.Brajnik,G. Guida, C. Tasso

[Teskey 88] "Intelligent Support for Interface Systems", SIGIR 88, F.N. Teskey

[Furnas, Deerwester, Dumais, Landauer, Harshman, Streeter, Lochbaum 88]
"Information Retrieval Using a Singular Value Decomposition Model of Latent
Semantic Structure", SIGIR 88, G.W.Furnas, S. Deerwester, S.T. Dumais, T.K.
Landauer, R.A. Harshman, L.A.Streeter, K.E.Lochbaum

[Croft, Lucia, Cohen 88] "Retrieving Documents by Plausible Inference : A
Preliminary Study", SIGIR 88, W.B.Croft, T.J.Lucia, P.R. Cohen

[Barthes, Glize 88] "Planning in an Expert System for Automated Information
Retrieval", SIGIR 88, C.Barthes, P.Glize

[Zarri 88] "Conceptual Representation for Knowledge Bases and << Intelligent
>> Information Retrieval Systems", SIGIR 88, G.P.Zarri

[Borko 87] "Getting Started in Library Expert Systems Research", Info. Proc.
Management Vol.23, No. 2, pp 81-87, 1987, Harold Borko

[Pollitt 87] "CANSEARCH : An Expert Systems Approach to Document Retrieval",
Info. Proc. Management, Vol. 23, No. 2,pp 119-138, 1987, Steven Pollitt

[Rada 87]"Knowledge-Sparse and Knowledge-Rich Learning in Information
Retrieval", Info. Proc. Management Vol. 23, No. 3, pp. 195-210, 1987, Roy
Rada

[Croft 87] "Approaches to Intelligent Information Retrieval", Info. Proc.
Management Vol. 23, No.4, pp. 249-254, 1987, W.B.Croft

[Rau 87] "Knowledge Organization and Access in A Conceptual Information
System", Info. Proc.  Management, Vol. 23, No. 4, pp. 269-283, 1987, Lisa F.
Rau

[Chiaramella, Defude 87] "A Prototype of an Intelligent System for
Information Retrieval: IOTA", Info. Proc. Management, Vol. 23, No. 4, pp.
285-303, 1987, Y.Chiaramella, B. Defude

[Brajnik, Guida, Tasso 87] "User Modeling In Intelligent Information
Retrieval", Info. Proc.  Management, Vol. 23, No. 4, pp. 305-320, 1987,
Giorgio Brajnik, Giovanni Guida, Carlo 	Tasso

[Fox 87]"Development of the CODER system : A testbed for Artificial
Intelligencemethods in Information Retrieval", Info. Proc. Management, Vol.
23, No.4, pp.  341-366, 1987, Edward A. Fox

[Brooks 87] "Expert Systems and Intelligent Information Retrieval", Info.
Proc. Management, Vol. 23, No. 4, pp. 367-382, 1987, H.M.Brooks

[Many Authors 87] "Distributed Expert-Based Information Systems : An
Interdisciplinary approach", Vol. 23, No. 5, pp. 395-409, 1987

[Rada, Forsyth 86] "Machine Learning - applications in expert systems and
Information Retrieval", Published by Ellis Horwood Limited, 1986, Richard
Forsyth, Roy Rada

[Benigno, Cross, deBessonet 86] "COREL - A Conceptual Retrieval System", M.
Kathryn Di Benigno, George R. Cross, Cary G. deBessonet

[Croft, Thompson 85] "An Expert Assistant for Document Retrieval", COINS
Technical Report 85-05, Dept. of Comp. Sc., Univ. of Massachusetts, 1985
(?), W. Bruce Croft, Roger H. Thompson

[McCune, Tong, Dean, Shapiro 85] "RUBRIC : A System for Rule-Based
Information Retrieval", IEEE Trans. on Software Engg., Vol. SE-11, No. 9,
Sept. 85, pp. 939-945, Brian P. McCune, Richard M. Tong, Jeffrey S. Dean,
Daniel G. Shapiro

[Lebowitz 85] "An experiment in Intelligent Information Systems :
RESEARCHER", Columbia univ. comp. sc. dept. tech. report CUCS-171-85, 1985,
Michael Lebowitz

[Zarri 84] "Expert Systems and Information Retrieval : An Experiment in the
domain of biographical data management", Int. J. Man-Machine Studies (1984)
20, pp. 87-106, Gian Piero Zarri

[Danilowicz 84] "Users and Experts in the Document Retrieval system model",
Int. J. Man-Machine Studies (1984) 21, 245-252, Czeslaw Danilowicz

[Jones 83] "Intelligent Retrieval", Proceedings Intelligent Information
Retrieval, pp. 136-142, Aslib, London, March 1983, Karen Sparck Jones

[Lebowitz 83] "Intelligent Information Systems", SIGIR 83, Michael Lebowitz

[DeJong 83] "Artificial Intelligence Implications for Information
Retrieval", SIGIR 83, Garry DeJong

[Brooks 83] "Using Discourse Analysis for the Design of Information
Retrieval Interaction Mechanisms", SIGIR 83, H.M.Brooks

[Zarri 83] "RESEDA, an Information Retrieval system using Artificial
Intelligence and Knowledge Representation Techniques", An ACM publ. (I do
not know which), copyrighted as "1983 ACM 0-89791-107-5/83/006/0189"

------------------------------

End of Neuron Digest
*********************