[comp.ai.neural-nets] Neuron Digest V6 #59

neuron-request@HPLMS2.HPL.HP.COM ("Neuron-Digest Moderator Peter Marvit") (10/10/90)

Neuron Digest   Tuesday,  9 Oct 1990
                Volume 6 : Issue 59

Today's Topics:
                  NIPS*90 WORKSHOP PRELIMINARY PROGRAM


Send submissions, questions, address maintenance and requests for old issues to
"neuron-request@hplabs.hp.com" or "{any backbone,uunet}!hplabs!neuron-request"
Use "ftp" to get old issues from hplpm.hpl.hp.com (15.255.176.205).

------------------------------------------------------------

Subject: NIPS*90 WORKSHOP PRELIMINARY PROGRAM
From:    jose@learning.siemens.com (Steve Hanson)
Date:    Wed, 26 Sep 90 15:17:02 -0400



        NIPS*90 WORKSHOP PRELIMINARY PROGRAM
       ____________________________________________________________
      !       POST  CONFERENCE  WORKSHOPS  AT  KEYSTONE            !
      !  THURSDAY,  NOVEMBER  29  -  SATURDAY,  DECEMBER  1,  1990 !
      !____________________________________________________________!




     Dear Colleague,

       I am pleased to send you a preliminary description of the workshops
     to be held during our annual "NIPS Post-conference Workshops".  From
     the many workshop proposals we received, we believe we have selected
     a program of central topics that we hope will cover most of your
     interests and concerns.  As you know from previous years, our NIPS
     Post-conference Workshops are an opportunity for scientists actively
     working in the field to gather in an informal setting and discuss
     current issues in Neural Information Processing.

       The Post-conference workshops will meet in Keystone, right after the
     IEEE  conference on Neural Information Processing Systems, on November
     30 and December 1.  You should be receiving an advance program, travel
     information   and   registration   forms   for   both   NIPS  and  the
     Post-conference workshops.  Please use these  forms  to  register  for
     both  events.  Please also indicate which of the workshop topics below
     you may be most interested in attending.  Your preferences are  in  no
     way  binding  or limiting you to any particular workshop but will help
     us in allocating suitable meeting rooms and minimizing overlap between
     workshop sessions.  Please mark your three most preferred workshops
     (1, 2, and 3) on the corresponding form in your registration package.

       I look forward to seeing you soon at NIPS and its Post-conference
     Workshops.  Please don't hesitate to contact me with any questions
     you may have about the workshops in general (phone: 412-268-7676;
     e-mail: waibel@cs.cmu.edu).  Should you like to discuss a specific
     workshop, please also feel free to contact the individual workshop
     leaders listed below.

       Sincerely yours,


             Alex Waibel
             NIPS Workshop Program Chairman
             School of Computer Science
             Carnegie Mellon University
             Pittsburgh, PA 15213


     ----------------------------------------------------------------------




        Thursday,  November  29,  1990
            5:00 PM:  Registration and Reception at Keystone

        Friday,  November  30,  1990
            7:30 -  9:30 AM: Small Group Workshops
            4:30 -  6:30 PM: Small Group Workshops
            7:30 - 10:30 PM: Banquet and Plenary Discussion

        Saturday,  December  1,  1990
            7:30 -  9:30 AM: Small Group Workshops
            4:30 -  6:30 PM: Small Group Workshops
            6:30 -  7:15 PM: Plenary Discussion, Summaries
            7:30 - 11:00 PM: Fondue Dinner, MountainTop Restaurant


     ----------------------------------------------------------------------



                         NIPS '90 WORKSHOP DESCRIPTIONS

                   Workshop Program Coordinator: ALEX WAIBEL
                           Carnegie Mellon University
                              phone: 412-268-7676
                           E-mail:  waibel@cs.cmu.edu


                      1.  OSCILLATIONS IN CORTICAL SYSTEMS
                                  Ernst Niebur
                         Computation and Neural Systems
                                 Caltech 216-76
                               Pasadena, CA 91125
                             Phone:  (818) 356-6885
                             Bitnet:  ernst@caltech
                     Internet: ernst@aurel.cns.caltech.edu

       40-60 Hz oscillations have long been reported in the rat and rabbit
     olfactory bulb and cortex on the basis of single- and multi-unit
     recordings as well as EEG activity.  Periodicities in eye-movement
     reaction times, as well as oscillations in the auditory evoked
     potential in response to a single click or a series of clicks, all
     support a 30-50 Hz framework for aspects of cortical activity and
     possibly cortical function. Recently,  highly  synchronized,  stimulus
     specific oscillations in the 35-85 Hz range were observed in areas 17,
     18 and PMLS of anesthetized as well  as  awake  cats.    Neurons  with
     similar  orientation  tuning  up  to  10  mm apart and even across the
     vertical meridian (i.e. in different  hemispheres)  show  phase-locked
     oscillations.

       The functional importance of these oscillations, as well as the
     underlying mechanisms, remains a matter of debate.  For the time being, the
     field  is  characterized  by  relatively few experiments and a certain
     abundance of theories, some coming from biology, some from the  neural
     network  community,  and  also  from physics (coupled oscillators have
     been discussed by physicists in  many  circumstances).  This  workshop
     will   be  an  opportunity  to  bring  together  experimentalists  and
     theoreticians.
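
     The coupled-oscillator picture mentioned above can be made concrete
     with a small simulation.  The sketch below uses the standard Kuramoto
     phase model from the physics literature (my choice for illustration,
     not a model from any workshop presentation): two oscillators with
     slightly different natural frequencies near 40 Hz phase-lock once
     the coupling strength exceeds their frequency mismatch.

```python
import math

def kuramoto_step(theta, omega, K, dt):
    """One Euler step of the Kuramoto model:
       dtheta_i/dt = omega_i + (K/N) * sum_j sin(theta_j - theta_i)."""
    n = len(theta)
    return [
        theta[i] + dt * (omega[i] + (K / n) * sum(
            math.sin(theta[j] - theta[i]) for j in range(n)))
        for i in range(n)
    ]

# Two oscillators near 40 Hz with a 2 Hz mismatch (frequencies in rad/s).
theta = [0.0, 1.0]
omega = [2 * math.pi * 40.0, 2 * math.pi * 42.0]
K, dt = 50.0, 1e-4           # coupling exceeds the mismatch (~12.6 rad/s)

for _ in range(20000):       # simulate 2 seconds
    theta = kuramoto_step(theta, omega, K, dt)

# Phase-locked: the phase difference settles where
# sin(phi) = (omega_1 - omega_2) / K.
phi = theta[0] - theta[1]
print(math.sin(phi))         # close to (40 - 42) * 2*pi / 50 ~ -0.251
```

     In this model, locking requires K to exceed the frequency mismatch
     |omega_1 - omega_2|; below that threshold the phase difference
     drifts indefinitely instead of settling.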


                               2.  BIOLOGICAL SONAR
                              Herbert L. Roitblat
                            Department of Psychology
                              University of Hawaii
                                2430 Campus Road
                               Honolulu, HI 96822
                             Phone:  (808) 956-6727
                   E-mail:  herbert@uh.cc.uk.uhcc.hawaii.edu

                     Patrick W. B. Moore & Paul E. Nachtigall
                           Naval Ocean Systems Center
                               Hawaii Laboratory
                                 P. O. Box 997
                             Kailua, Hawaii, 96734
                    Phone:  (808) 257-5256 & (808) 257-1648
                   E-mail:  pmoor@nosc.mil & nachtig@nosc.mil

     Several  species,  most  notably  bats  and dolphins, are known to use
     biological sonar to obtain information about the  world  around  them.
     These  animals  have  evolved  solutions to severe processing problems
     that  can  be  exploited  in  the  development  of  artificial  signal
     processing mechanisms, including intelligent sonar and radar.  We call
     the process of using  biological  studies  to  inform  the  design  of
     artificial  systems  "biomimetics"  because the artificial systems are
     designed as mimics of biological ones.  This  workshop  will  consider
     the  neural  network  and  neurobiological  models  of  animal  signal
     processing that have recently been  advanced.    It  will  attempt  to
     integrate   studies   of   dolphins,  bats,  and  other  species  with
     computational  models.    The  workshop  will  be   of   interest   to
     investigators  of  biological  signal  processing  as  well  as  those
     interested in the development of artificial signal processing systems.

                              3.  NETWORK DYNAMICS
                                 Richard Rohwer
                     Centre for Speech Technology Research
                              Edinburgh University
                                80, South Bridge
                               Edinburgh EH1 1HN
                                    Scotland
                      Phone:  (44 or 0) (31) 225-8883 x261
                    E-mail:  rr%ed.eusip@nsfnet-relay.ac.uk
     The 1990 network dynamics workshop is to consist of mini-presentations
     to nucleate discussions about temporal  patterns  in  real  and  model
     neural  networks, much as was done in 1989.  The subject area includes
     the  description,  interpretation  and  engineering  design  of  these
     patterns.    An  effort  will  be  made  to  arrange  presentations on
     different specific subjects than were discussed  last  year,  although
     priority  will  be given to new developments in any area.  An issue of
     particular interest is the functional  or  cognitive  significance  of
     temporal  patterns.    Another  is the diversity of temporal behaviour
     produced by specific classes of  models,  with  implications  for  the
     evaluation  of  biological  models  and  the  selection  of models for
     engineering.  Training algorithms are also of interest.


               4.  CONSTRUCTIVE AND DESTRUCTIVE LEARNING ALGORITHMS
                                Scott E. Fahlman
                           School of Computer Science
                           Carnegie-Mellon University
                              Pittsburgh, PA 15213
                              Phone: (412) 268-257
                          Internet: fahlman@cs.cmu.edu
     Most  existing  neural  network  learning algorithms work by adjusting
     connection weights in a fixed network.   Recently  we  have  seen  the
     emergence of new learning algorithms that alter the network's topology
     as they learn.  Some of these algorithms start with excess connections
     and remove any that are not needed; others start with a sparse network
     and add hidden units as needed, sometimes in  multiple  layers.    The
     user  is  relieved  of  the burden of guessing in advance what network
     topology will best fit a given problem.  In addition, many of these
     algorithms are claimed to achieve improved learning speed and
     generalization.

       In this workshop we will review what is known about the relationship
     between  network  topology,  expressive  power,  learning  speed,  and
     generalization.    Then  we  will examine a number of constructive and
     destructive algorithms,  attempting  to  identify  the  strengths  and
     weaknesses  of  each.    Finally,  we  will look at open questions and
     possible future developments.
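
     To make the "destructive" idea concrete, here is a minimal sketch of
     magnitude-based connection pruning, the simplest such scheme (the
     function name and pruning fraction are my own illustrative choices;
     the algorithms examined at the workshop are more refined):

```python
import numpy as np

def prune_smallest(W, fraction):
    """Zero out the given fraction of connections with the smallest
    magnitudes -- weights near zero are assumed to carry little
    information, so the network topology shrinks as they are removed."""
    k = int(W.size * fraction)
    if k == 0:
        return W.copy()
    cutoff = np.sort(np.abs(W), axis=None)[k - 1]
    return np.where(np.abs(W) > cutoff, W, 0.0)

rng = np.random.default_rng(0)
W = rng.normal(size=(4, 8))          # a 4x8 weight matrix
W_pruned = prune_smallest(W, 0.75)
print(np.count_nonzero(W_pruned))    # 8 of 32 connections survive
```

     A real destructive algorithm would retrain after pruning and use a
     better saliency measure than raw magnitude, but the topology-altering
     step looks like this.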


            5.  COMPARISONS BETWEEN NEURAL NETWORKS AND DECISION TREES
                                Lorien Y. Pratt
                          Computer Science Department
                               Rutgers University
                            New Brunswick, NJ 08903
                             Phone:  (201) 932-4634
                        E-mail:  pratt@paul.rutgers.edu

                                 Steven W. Norton
                        Siemens Corporate Research, Inc.
                             755 College Road East
                              Princeton, NJ 08540
                             Phone:  (609) 734-3365
                      E-mail:  nortonD@learning.siemens.com

     The  fields  of  Neural  Networks  and  Machine  Learning have evolved
     separately in many ways.  However,  close  examination  of  multilayer
     perceptron learning algorithms (such as Back-Propagation) and decision
     tree induction methods (such as ID3 and CART) reveals  that  there  is
     considerable  convergence  between  these  subfields.    They  address
     similar problem classes (inductive classifier  learning)  and  can  be
     characterized  by  a  common  representational formalism of hyperplane
     decision regions.  Furthermore, topical subjects  within  both  fields
     are  related, from minimal trees and brain-damaged nets to incremental
     learning.
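
     The shared hyperplane formalism is easy to state in code.  In the
     sketch below (function names are mine), a CART/ID3-style split is a
     hyperplane whose normal lies along a coordinate axis, while a
     perceptron unit may use an arbitrary, oblique normal:

```python
import numpy as np

def perceptron_decision(x, w, b):
    """An MLP unit's hard decision: which side of the hyperplane
    {x : w.x + b = 0} does x fall on?"""
    return np.dot(w, x) + b > 0

def tree_decision(x, feature, threshold):
    """An ID3/CART-style split: an axis-aligned hyperplane with
    normal e_feature and offset -threshold."""
    return x[feature] > threshold

x = np.array([2.0, -1.0])
w = np.array([1.0, 0.0])             # normal along the first axis
# The tree test "x[0] > 1.5" is exactly the perceptron hyperplane
# with w = e_0 and b = -1.5:
print(tree_decision(x, 0, 1.5), perceptron_decision(x, w, -1.5))  # True True
```

     Every decision-tree split is thus a special case of a perceptron
     decision region; the converse does not hold, which is one source of
     the empirical differences the speakers will discuss.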

       In this workshop, invited  speakers  from  the  Neural  Network  and
     Machine  Learning communities (including Les Atlas and Tom Dietterich)
     will discuss their empirical and theoretical comparisons  of  the  two
     areas.    In a discussion period, we'll then compare and contrast them
     along the dimensions  of  representation,  learning,  and  performance
     algorithms.    We'll debate the ``strong convergence hypothesis'' that
     these two research areas are really studying the same problem.


                            6.  GENETIC ALGORITHMS
                                  David Ackley
                                   MRE-2B324
                                    Bellcore
                                  445 South St.
                           Morristown, NJ 07962-1910
                             Phone:  (201) 829-5216
                          E-mail:  ackley@bellcore.com

     "Genetic  algorithms"  are optimization and adaptation techniques that
     employ an evolving population of candidate solutions.   "Recombination
     operators" exchange information between individuals, creating a global
     search  strategy  quite  different  from  ---   and   in   some   ways
     complementary  to  --- the gradient-based techniques popular in neural
     network learning.  The first segment of this workshop will survey  the
     theory  and  practice  of  genetic  algorithms,  and then focus on the
     growing body of research efforts that combine genetic  algorithms  and
     neural  networks.    Depending  on  the  participants'  interests  and
     backgrounds, possible discussion topics  range  from  "So  what's  all
     this, then?" to "How should networks best be represented as genes?" to
     "Is the increased schema disruption inherent in  uniform  crossover  a
     feature or a bug?"
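
     The schema-disruption question in the last topic above can be seen
     directly in a few lines of code.  In this illustrative sketch (mine,
     not from any workshop material), one-point crossover tends to keep
     contiguous blocks of genes intact, while uniform crossover decides
     each gene independently:

```python
import random

def one_point_crossover(a, b, rng):
    """Cut both parents at one random point and splice; contiguous
    'building blocks' (schemata) tend to survive intact."""
    cut = rng.randrange(1, len(a))
    return a[:cut] + b[cut:]

def uniform_crossover(a, b, rng, p=0.5):
    """Each gene drawn independently from either parent; short and
    long schemata are disrupted at the same rate."""
    return [x if rng.random() < p else y for x, y in zip(a, b)]

rng = random.Random(1)
a, b = [1] * 8, [0] * 8
print(one_point_crossover(a, b, rng))  # a run of 1s followed by a run of 0s
print(uniform_crossover(a, b, rng))    # each gene from one parent or the other
```

     A full genetic algorithm would wrap operators like these in a
     selection-mutation-recombination loop over a population of candidate
     solutions.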

       As   natural  neurons  provide  inspiration  for  artificial  neural
     networks, and  natural  selection  provides  inspiration  for  genetic
     algorithms,   other   aspects  of  natural  life  can  provide  useful
     inspirations for studies in "artificial life".  In  artificial  worlds
     simulated  on  computers,  experiments  can be performed whose natural
     world analogues would be inconvenient or  impossible  for  reasons  of
     duration,  expense,  danger,  observability,  or ethics.  Interactions
     between genetic evolution and neural learning can be studied over many
     hundreds  of  generations.    The  consensual  development  of simple,
     need-based lexicons among tribes of artificial beings can be observed.
     The  second  segment of this workshop will survey several such "alife"
     research projects.  A discussion of prospects and  problems  for  this
     new, interdisciplinary area will close the workshop.


           7.  IMPLEMENTATIONS OF NEURAL NETWORKS ON DIGITAL, MASSIVELY
                             PARALLEL COMPUTERS
                       S. Y. Kung and K. Wojtek Przytula
                       Hughes Research Laboratories, RL69
                            3011 Malibu Canyon Road
                            Malibu, California 90265
                             Phone:  (213) 317-5892
                         E-mail: wojtek@csfvax.hac.com

     Implementations of neural networks span a full spectrum from  software
     realizations  on general-purpose computers to strictly special-purpose
     hardware realizations.    Implementations  on  programmable,  parallel
     machines,  which are to be discussed during the workshop, constitute a
     compromise between the two  extremes.    The  architectures  of  these
     machines reflect the structure of neural network models better than do
     those of sequential machines,  thus  resulting  in  higher  processing
     speed.     The  programmability  provides  more  flexibility  than  is
     available in specialized hardware implementations and opens a way  for
     realization  of  various  models, including future modifications, on a
     single machine.  We will discuss the degree to which the architectures
     of  the  machines  should  mimic  the  structure of the neural network
     models  versus  the  degree  of  the   match   to   be   obtained   by
     programmability.

                             8.  VLSI NEURAL NETWORKS
                                    Jim Burr
                          Starlab, Stanford University
                               Stanford, CA 94305
                        Phone:  (415) 723-4087 (office)
                              (415) 725-0480 (lab)
                             (415) 574-4655 (home)
                        E-mail:  burr@mojave.stanford.edu

     This one day  workshop  will  address  the  latest  advances  in  VLSI
     implementations  of  neural  nets. How successful have implementations
     been so far? Are dedicated neurochips being used in real applications?
     What  algorithms  have been implemented? Which ones have not been? Why
     not?  How important is on chip learning? How much arithmetic precision
     is  necessary?  Which is more important, capacity or performance? What
     are the issues in constructing  very  large  networks?  What  are  the
     technology scaling limits?  Any new technology developments?

       Several invited speakers will address these and other questions from
     various points of view in discussing their current research.  We  will
     try  to  gain  better  insight  into  the strengths and limitations of
     dedicated hardware solutions.

                          9.  OPTICAL NEURAL NETWORKS
                                Kristina Johnson
                        University of Colorado, Boulder
                                 Campus Box 425
                               Boulder, CO 80309
                             Phone:  (303) 492-1835
                    E-mail:  kris%fred@boulder.colorado.edu
                           kris@boulder.colorado.edu

     This workshop will address issues in the implementation of neural
     networks in parallel optical hardware, including issues in
     scalability, speed, bipolar neurons and weights, the influence of
     component characteristics on system performance, and methods for
     all-optical learning.  The goal of the workshop will be to identify
     algorithms and applications of neural networks that are particularly
     suited for optoelectronic implementation.  Novel devices and systems
     that can implement neural processes will also be highlighted, such as
     recent advances in custom VLSI/modulator technology.

                          10.  NEURAL NETWORKS IN MUSIC
                                Samir I. Sayegh
                             Department of Physics
                               Purdue University
                           Fort Wayne, IN 46805-1499
                             Phone:  (219) 481-6157
                       E-mail:  sayegh@ed.ecn.purdue.edu

     The workshop is to address aspects of perception, composition, and
     motor-skill performance in music and their modeling using Neural
     Networks.

       Although music applications can be dismissed as not "technical," the
     topic is of great importance precisely because most of  the  knowledge
     involved  occurs  at  a  preverbal  cognitive  level.  In music, as in
     neural networks, teaching by example is predominant.

       The recent surge in interest is indicated by the increasing number
     of presentations at major conferences and the publication of special
     issues of the Computer Music Journal (Fall 1989, Winter 1990)
     dedicated to Neural Networks and Connectionism.  A special volume,
     with contributions from and additions to these articles, is being
     edited.

       A large spectrum of applications  as  well  as  mature  and  refined
     developments will be represented at the workshop.  These include pitch
     perception, composition, quantization of  musical  time,  performance,
     chord classification and fully developed systems.

                            11.  SPEECH RECOGNITION
                         Nelson Morgan and John Bridle
                    International Computer Science Institute
                         1947 Center Street, Suite 600
                               Berkeley, CA 94704
                             Phone:  (415) 643-9153
                       E-mail: morgan@icsib4.berkeley.edu

     In  the  early,  heady days of the most recent neural-network revival,
     many devotees felt  that  undifferentiated  masses  of  simple  neural
     models  could solve any classification problem.  More recently, it has
     been widely accepted that constraints on connections,  structure,  and
     sometimes  the form of the input representation are necessary for good
     performance in complex domains such as speech.

       In the workshop, we will discuss perspectives on this  issue.    The
     emphasis  will  be on the value and risk of using application-specific
     knowledge  to  constrain  network  topologies  and  to  integrate  ANN
     algorithms into systems which recognize speech.

                         12.  NATURAL LANGUAGE PROCESSING
                                  Robert Allen
                                   MRE-2A367
                                    Bellcore
                                  445 South St.
                           Morristown, NJ 07962-1910
                             Phone:  (201) 829-4315
                         E-mail: rba@flash.bellcore.com

     It is possible to imagine fully integrated network-based speech
     processing systems, massive connectionist knowledge bases, and swarms
     of communicating connectionist agents.  Indeed, networks have many
     properties which seem to make them suitable for processing natural
     language.  Activations can readily integrate context ranging from
     phoneme interactions, to reference, to semantics and pragmatics.
     Likewise, temporal processing may accommodate syntax, and learning in
     networks can model acquisition.  Nonetheless, even some basic issues
     stir controversy, such as the ability to handle context-free
     grammars, compositionality, the need for rule-like behavior, and
     generating past-tense verb forms.  This workshop will consider both
     conceptual and practical limitations in bridging the gap between
     existing demonstrations and potential applications.


                      13.  HAND-PRINTED CHARACTER RECOGNITION
                             Gale Martin & Jay Pittman
                                       MCC
                           3500 Balcones Center Drive
                               Austin, Texas 78759
                    Phone:  (512) 338-3334, (512) 338-3363
                    E-mail:  galem@mcc.com, pittman@mcc.com

     Over the past several years, backpropagation techniques have been
     successfully applied to isolated hand-printed character recognition.
     This workshop will consider what has been learned and where the field
     is headed next.  The issues to be addressed include: 1) black-art
     issues (what works, what doesn't, what matters); 2) the current
     important research problems and approaches (segmentation,
     incorporation of higher-level constraints, very large symbol sets);
     and 3) how the field can foster collaborations and comparisons of
     techniques.  The format will include very brief talks by interested
     participants and subsequent discussion periods.  The target audience
     includes those interested in and/or working on handwriting
     recognition and related visual recognition problems.  Individuals
     interested in giving brief talks are invited to contact Gale or Jay
     at the above addresses.


        14.  NN PROCESSING TECHNIQUES FOR REAL-WORLD MACHINE VISION PROBLEMS
                       Murali M. Menon & Paul J. Kolodzy
                             MIT Lincoln Laboratory
                            Lexington, MA 02173-9108
                             Phone:  (617) 863-5500
                          E-mail:  Menon@LL.LL.MIT.EDU
                             Kolodzy@LL.LL.MIT.EDU

     Our proposed workshop will discuss the application of neural  networks
     to   vision   applications,  including,  but  not  limited  to,  image
     restoration and pattern recognition.  Participants will present  their
     specific applications for discussion to highlight the relevant issues.
     Since the discussions will be driven by actual applications,  we  will
     place  an  emphasis  on the advantages of using neural networks at the
     system level in addition to the individual processing steps.

       To focus the discussions in the workshop  we  plan  to  present  the
     following  applications:    a  medical  screening  system using neural
     networks for identification of Pap smears, a description of a software
     and  hardware  implementation of a neural network for multi-scale edge
     extraction, a  large  scale  software  implementation  of  Grossberg's
     Boundary  Contour  System (BCS), and a Markov Random Field (MRF) based
     image restoration system.
       The proposed workshop  invites  participation  from  researchers  in
     machine  vision,  neural  network  modeling,  pattern  recognition and
     biological vision.


            15.  INTEGRATION OF ARTIFICIAL NEURAL NETWORKS WITH EXISTING
      TECHNOLOGIES:  EXPERT SYSTEMS AND DATABASE MANAGEMENT SYSTEMS
                                 Veena Ambardar
                               15251 Kelbrook Dr
                               Houston, TX 77062
                             Phone:  (713) 663-2264
                           E-mail:  veena@shell.uucp

     Integration of Artificial Neural Networks (ANNs) with Expert Systems
     (ES) and Database Management Systems (DBMS) is already being
     researched in several ways, such as: a) extraction of rules from a
     given neural network; b) development of hybrid systems using both
     ANNs and ESs; and c) information retrieval and query processing using
     ANNs.  Possible benefits of this integration are: a) facilitated
     acquisition of knowledge bases; b) automated extension of existing
     expert systems (without knowing the rules explicitly); c) better
     understanding of the dynamics of ANNs; d) reduction in the size of
     existing expert systems by using threshold logic functions; e) better
     retrieval of information and queries embedded with partial cues or
     noise; and f) facilitated preprocessing of data before it is fed to
     ANNs.  The workshop shall focus on this integration with the
     following specific questions in perspective: a) the benefits of this
     integration; b) theoretical and practical issues that ought to be
     addressed for this bridging; c) the current status of research
     efforts and commercial packages in this direction; d) discussion of
     the different techniques for the extraction of rules from a given
     ANN, and whether they could be extended to Hopfield networks, ART,
     etc.; e) different ways to improve the accuracy rate for the
     processing of queries with partial cues or noise using neural nets;
     and f) future directions for both researchers and vendors.  Both
     researchers and commercial vendors will be invited.

                                 16.  BIOPHYSICS
                                  J. J. Atick
                          Institute for Advanced Study
                               Princeton, NJ 08540
                              Phone:  609-734-8000

                                     W. Bialek
                            Department of Physiology
                      University of California at Berkeley
                               Berkeley, CA 94720
                      E-mail:  bialek@candide.berkeley.edu


       The workshop will proceed through organized discussion, with as
     little formal presentation as possible.  The discussion will focus on
     the development of new theoretical ideas in neurocomputing that could
     be tested with real biological experiments.  The workshop will review
     some of the recent progress, discuss its implications, and try to
     come up with new directions to pursue.  The emphasis will be on
     sensory systems, where the signal processing problems involved can be
     sharply defined and the performance well quantified.  Some of the
     specific questions that the workshop will address are:

        - Are  there  theories  of  what  the  nervous  system  should
          compute?

        - Can we rate the observed performance of the  nervous  system
          on  some  absolute scale?  How optimal are its computational
          strategies?

        - What is the nature of the encoding  of  information  in  the
          nervous  system?  Is the encoding for trigger features or is
          there general purpose encoding?    To  what  extent  is  the
          information distributed?

         - How universal are the computations involved?  Could the same
           theoretical principle account for the variety of neural
           computations observed both within and across species?

        - What  is  the  role  of  adaptation  and what is its precise
          computational formulation?  What is  kept  constant  in  the
          process of adaptation?

         - Are there critical experimental tests of recent theories in
           neurocomputing?

       Participants who have  done  work  related  to  the  theme  of  this
     workshop are encouraged to provide preprints of their work for general
     distribution.

                       17.  ASSOCIATIVE LEARNING RULES
                                 David Willshaw
                            University of Edinburgh
                          Centre for Cognitive Science
                               2 Buccleuch Place
                               Edinburgh EH8 9LW
                         Phone:  031 667 1011 ext 6252
                          E-mail:  David@uk.ac.ed.cns


       <Abstract Unavailable>


        18.  RATIONALES FOR, AND TRADEOFFS AMONG, VARIOUS NODE FUNCTION SETS
                                J. Stephen Judd
                        Siemens Corporate Research, Inc.
                             755 College Road East
                              Princeton, NJ 08540
                             Phone:  (609) 734-6500
                       E-mail:  Judd@learning.siemens.com

     Linear threshold functions and sigmoid functions have become the
     standard node functions for our neural network studies, but the
     reasons for using them are not very well founded.  Are there other
     types of functions that might be more justifiable, or work better, or
     make learning more tractable?  This workshop will explore various
     issues that might help answer such questions.  Come hear the experts
     tell us what matters and what doesn't matter; then tell the experts
     what *really* matters.

------------------------------

End of Neuron Digest [Volume 6 Issue 59]
****************************************