[comp.ai.neural-nets] Neuron Digest V6 #48

neuron-request@HPLMS2.HPL.HP.COM ("Neuron-Digest Moderator Peter Marvit") (08/15/90)

Neuron Digest	Tuesday, 14 Aug 1990
		Volume 6 : Issue 48

Today's Topics:
    Special Session on AI in Communications - '91 Phoenix Conference
		      Re: universe and intelligence
		   neural network generators in Ada??
		     NN Approach to Inverse Problems
	   Re - conjugate gradient method and other things ...
	    conjugate gradient optimization program available
				abstract
			 research programmer job
		  Research associate position available
		 Job opening at Intel for NN IC designer
			SIEMENS Job Announcement
 Re: Call for Participation in Connectionist Natural Language Processing
	       special issue of Connection Science Journal
			  THIRD ISAI IN MEXICO
		 International Journal of Neural Systems


Send submissions, questions, address maintenance and requests for old issues to
"neuron-request@hplabs.hp.com" or "{any backbone,uunet}!hplabs!neuron-request"
Use "ftp" to get old issues from hplpm.hpl.hp.com (15.255.176.205).

------------------------------------------------------------

Subject: Special Session on AI in Communications - '91 Phoenix Conference
From:    "Doug Bigwood" <dbigwood@umdars.umd.edu>
Date:    02 Aug 90 10:58:00 -0400

I am organizing a special session on Artificial Intelligence Applications
in Communications for the 10th Annual IEEE Phoenix Conference on
Computers and Communications to be held March 27-30, 1991 in Scottsdale,
Arizona.  I am in the process of gathering a list of potential
contributors to this session.  Anyone interested in contributing a paper
to this session should contact me (preferably via e-mail) by the middle
of August or so (or ASAP).  All I need at this time is the title of the
proposed submission, the authors' names, and an address (again,
preferably an e-mail address).  The draft manuscripts will need to be
submitted in September.  I will provide further details to those who
express interest. Note that the term "Artificial Intelligence" as used
here is being interpreted in its loosest sense; i.e. topics may include
expert systems, neural networks, robotics, genetic algorithms, etc.
Thank you for your time.

Dr. Douglas W. Bigwood
B.E. Technologies
12906 Old Chapel Place
Bowie, MD  20720

Internet: dbigwood@umdars.umd.edu
Bitnet: dbigwood@umdars


------------------------------

Subject: Re: universe and intelligence
From:    Douglas G. Danforth <danforth@riacs.edu>
Date:    Fri, 03 Aug 90 16:19:30 -0700

In comp.ai.neural-nets you write:

>There is a problem I have been thinking about for several years: What is the
>relationship between the universe and the intelligence (or physics and
>biology)?

>If the universe (the physical laws) can evolve to provide an environment
>for the intelligent systems to survive, can we say that the universe is
>not intelligent (or just a blind watchmaker)?

>Why are biological systems more intelligent than physical
>systems? Are there some intelligences in physics?  Can we build a
>universal model of intelligence where the intelligence evolves like the
>universe?

>Any comments are very welcome.


>Chung-Chih Chen
>Artificial Intelligence Laboratory
>(Building K, 4th Floor)
>Free University of Brussels
>Pleinlaan 2
>1050 Brussels, BELGIUM
>(email: chen@arti.vub.ac.be)

I have an answer which you may not like, namely: 
Intelligence is in the mind of the intelligent AND why do you consider
intelligence significant?

No matter how complex humans or other species evolve to become, the
complexity is neither good nor bad, neither supreme nor trivial. It just
is.  But then rocks and trees just are. It is the aware mind that gives
value to the universe BUT that does not mean that the value has any
significance in any absolute sense. The fact that we focus on a very
small class of behavior and give it great value simply says that is what
we do. Nothing more.

By the way, the universe is aware and intelligent for we are of the
universe.


	Douglas Danforth
	Research Institute for Advanced Computer Science (RIACS)
	NASA Ames Research Center
	Moffett Field, CA 94035
	U.S.A.


------------------------------

Subject: neural network generators in Ada??
From:    fritz_dg%ncsd.dnet@gte.com
Date:    Mon, 06 Aug 90 17:27:35 -0400


Are there any non-commercial Neural Network "generator programs" or the
like written in Ada? (i.e., programs that generate suitable NN code from
a set of user-designated specifications, code suitable for embedding,
etc.)

I'm interested in

	- experience developing and using same, lessons learned
	- to what uses they have been put, and with what success
	- their nature: internal use of lists, arrays; what can be
	  user-specified, what can't; built-in limitations; level of HMI
	  attached; compilers used; etc.
	- and other relevant info on developing and applying such tools
	  from those who have tried developing and using them

Am also interested in opinions on:

	If you were going to design a NN Maker _today_, how would you
	  design it?
	If Ada were the language, what special things might be done?

Motive should be transparent.  My sincere thanks to all who respond.  If
there is interest, I'll turn the info (if any) around to the list in
general.

Dave Fritz	fritz_dg%ncsd@gte.com
		(301) 738-8932		

------------------------------

Subject: NN Approach to Inverse Problems
From:    mhe%neuron@augean.ua.oz.au (mingyi)
Date:    Wed, 08 Aug 90 15:49:27 -0600

        I am interested in the neural network approach to inverse
problems such as deconvolution. It would be greatly appreciated if
someone could provide me with some references on this subject.

Please reply to:
                mhe%neuron@augean.ua.oz.au

or

Mr. Mingyi He
Department of Electrical & Electronic Engineering
The University of Adelaide
G.P.O.Box 498, Adelaide, SA 5001
AUSTRALIA

------------------------------

Subject: Re - conjugate gradient method and other things ...
From:    KRISH%tifrvax.bitnet@pucc.PRINCETON.EDU
Organization: T I F R, Bombay, India
Date:    Fri, 10 Aug 90 01:48:00 +0700

I am working on neural network based speech recognition with temporal
neural networks. The architecture I am using is of the feed-forward MLP
type. I have been using the conventional back-prop algorithm, which I
realize is computationally expensive. Can anyone tell me how the
CONJUGATE GRADIENT based method works for MLP training? I would
appreciate it if someone could outline the algorithm for implementation
and also give me some references to follow up. (Our library does not
get many neural net journals/conf. proc.)
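
In outline, conjugate-gradient training treats MLP learning as plain
unconstrained minimization: the error gradient is still computed by
ordinary back-propagation, but instead of taking fixed-size steps, each
iteration does a line search along a search direction built from the
current and previous gradients. A minimal sketch in NumPy (the
Polak-Ribiere+ direction update, a backtracking line search, and a toy
XOR network; the network and all constants are illustrative, not taken
from any particular paper):

```python
import numpy as np

# Toy problem: XOR with a 2-2-1 MLP (tanh hidden layer, linear output).
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
T = np.array([[0.], [1.], [1.], [0.]])

SHAPES = [(2, 2), (1, 2), (2, 1), (1, 1)]   # W1, b1, W2, b2

def unpack(w):
    mats, i = [], 0
    for r, c in SHAPES:
        mats.append(w[i:i + r * c].reshape(r, c))
        i += r * c
    return mats

def loss_grad(w):
    """Mean-squared error and its gradient, via ordinary back-propagation."""
    W1, b1, W2, b2 = unpack(w)
    h = np.tanh(X @ W1 + b1)
    e = h @ W2 + b2 - T
    dy = e / len(X)                          # d(loss)/d(output)
    dh = (dy @ W2.T) * (1.0 - h ** 2)        # back through tanh
    g = np.concatenate([(X.T @ dh).ravel(), dh.sum(0),
                        (h.T @ dy).ravel(), dy.sum(0)])
    return 0.5 * np.mean(e ** 2), g

def line_search(w, d, L, g):
    """Backtracking (Armijo) line search along direction d."""
    alpha, slope = 1.0, g @ d
    while alpha > 1e-12:
        L2, g2 = loss_grad(w + alpha * d)
        if L2 <= L + 1e-4 * alpha * slope:
            return w + alpha * d, L2, g2
        alpha *= 0.5
    return w, L, g                           # no acceptable step found

rng = np.random.default_rng(0)
w = rng.normal(scale=0.5, size=9)            # 9 free parameters in total
L, g = loss_grad(w)
L_init, d = L, -g
for _ in range(300):
    w, L2, g2 = line_search(w, d, L, g)
    beta = max(0.0, g2 @ (g2 - g) / max(g @ g, 1e-30))  # Polak-Ribiere+
    d = -g2 + beta * d
    if g2 @ d >= 0:                          # restart if not a descent direction
        d = -g2
    L, g = L2, g2

print(f"loss: {L_init:.4f} -> {L:.4f}")
```

Production implementations typically use a more careful line search
(e.g. one satisfying the Wolfe conditions) and periodic restarts, but
the skeleton is the same: back-prop supplies the gradient, and the
conjugate direction plus line search replaces the fixed learning rate.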

I would also like to know if there are any LEARNING METHODS for networks
with FEED-BACK.

Thanks in advance,

Krishnan, 01:40 local time.

------------------------------

Subject: conjugate gradient optimization program available
From:    Mark Fanty <fanty@cse.ogi.edu>
Date:    Fri, 10 Aug 90 10:00:03 -0700

The speech group at OGI uses conjugate-gradient optimization to train
fully connected feed-forward networks.  We have made the program (OPT)
available for anonymous ftp:

1. ftp to cse.ogi.edu

2. login as "anonymous" with any password

3. cd to "pub/speech"

4. get opt.tar

OPT was written by Etienne Barnard at Carnegie-Mellon University.

  Mark Fanty				Computer Science and Engineering 
			       		Oregon Graduate Institute
  fanty@cse.ogi.edu			19600 NW Von Neumann Drive
  (503) 690-1030			Beaverton, OR 97006-1999         



------------------------------

Subject: abstract
From:    Peter Wohl <thsspxw@iitmax.iit.edu>
Date:    Fri, 27 Jul 90 12:45:08 -0600


This is the abstract of a paper to appear in the Proceedings of the
International Conference on Parallel Processing, 08/13-17, 1990:

SIMD Neural Net Mapping on MIMD Architectures
=============================================

Peter Wohl and Thomas W. Christopher
Dept. of Computer Science
Illinois Institute of Technology
IIT Center, Chicago, IL 60616

Abstract
 --------
The massively parallel, communication-intensive SIMD algorithm of
multilayer back-propagation neural networks was encoded for
coarse-grained MIMD architectures using a relatively low-level
message-driven programming paradigm. The computation/communication
ratio was adjusted in software by defining a "temporal window" grain
over the set of training instances. Extensive experiments showed an
improvement in multiprocessor utilization over similar reported
results, and the simulator scaled up well with more complex networks
on larger machines.  The code can easily be modified to accommodate
back-prop variations, such as quickprop or cascade-correlation
learning, as well as other neural network architectures and learning
algorithms.  []
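
In rough terms, the "temporal window" grain trades communication for
computation: weight updates are exchanged once per window of training
instances rather than once per instance, so the arithmetic per epoch is
unchanged while the message count drops. A toy cost model makes the
effect concrete (all numbers and the cost formula here are invented for
illustration; this is not the authors' simulator):

```python
# Illustrative cost model for a "temporal window" grain over training
# instances (hypothetical constants, for intuition only).

N = 10_000                  # training instances per epoch
FLOPS_PER_INSTANCE = 5_000  # assumed cost of one forward+backward pass
MSG_COST = 2_000            # assumed fixed cost of one inter-processor message

def costs(window):
    """Messages sent per epoch and the compute/communication ratio."""
    messages = N // window               # one weight exchange per window
    compute = N * FLOPS_PER_INSTANCE     # total arithmetic is unchanged
    return messages, compute / (messages * MSG_COST)

for k in (1, 10, 100):
    m, r = costs(k)
    print(f"window={k:4d}  messages/epoch={m:6d}  compute:comm = {r:6.1f}")
```

Under this model the compute-to-communication ratio grows linearly with
the window size, which is why a coarse-grained MIMD machine benefits
from batching instances before exchanging updates.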

Copies of the paper can be obtained by writing to the authors at the
above address, or by emailing thsspxw@iitmax.iit.edu (P. Wohl).

	-Peter Wohl

------------------------------

Subject: research programmer job
From:    Carol Plathan <carol@ai.toronto.edu>
Date:    Fri, 03 Aug 90 11:14:14 -0400


RESEARCH PROGRAMMER JOB AT THE UNIVERSITY OF TORONTO

STARTING SALARY: $36,895 - $43,406
STARTING DATE:  Fall 1990

The Connectionist Research Group in the Department of Computer Science at
the University of Toronto is looking for a research programmer to develop
a neural network simulator that uses Unix, C, and X-windows.  The
simulator will be used by our group of about 10 researchers, directed by
Geoffrey Hinton, to explore learning procedures and their applications.
It will also be released to some researchers in Canadian Industry.  We
already have a fast, flexible simulator and the programmer's main job
will be to further develop, document, and maintain this simulator.  The
development may involve some significant re-design of the basic
simulator.  Additional duties (if time permits) will include:

	- Implementing several different learning procedures within the
	  simulator and investigating their performance on various
	  data-sets;
	- Assisting industrial collaborators and visitors in the use of
	  the simulator;
	- Porting the simulator to faster workstations or to boards that
	  use fast processors such as the Intel i860 or DSP chips;
	- Developing software for a project that uses a data-glove as an
	  input device to an adaptive neural network that drives a speech
	  synthesizer;
	- Assisting in the acquisition and installation of hardware and
	  software required for the project.

The applicant should possess a Bachelor's or Master's degree, preferably
in Computer Science or Electrical Engineering, and have at least two
years of programming experience, including experience with Unix and C,
and some experience with graphics.  Knowledge of elementary calculus and
elementary linear algebra is essential.  Knowledge of numerical
analysis, information theory, and perceptual or cognitive psychology
would be advantageous.  Good oral and written communication skills are
required.

Please send a CV and the names of two or three references to Carol
Plathan, Computer Science Department, University of Toronto, 10 King's
College Road, Toronto, Ontario M5S 1A4.  You can also send the
information by email to carol@ai.toronto.edu or call Carol at
416-978-3695 for more details.  The University of Toronto is an equal
opportunity employer.

ADDITIONAL INFORMATION

The job can be given to a non-Canadian if they are better than any
Canadians or Canadian Residents who apply.  In this case, the
non-Canadian would probably start work here on a temporary work permit
while the application for a more permanent permit was being processed.

There are already SEVERAL good applicants for the job.  Candidates who do
not already program fluently in C or have not already done neural network
simulations stand very little chance.  Also, it is basically a
programming job. The programmer may get involved in some original
research on neural nets, but this is NOT the main part of the job, so it
is not suitable for postdoctoral researchers who want to get on with
their own research agenda.

Interviews will be during September.  We will definitely not employ
anybody without an interview and we cannot afford to pay travel expenses
for interviews (except in very exceptional circumstances).  If there are
several good applicants from the west coast of the USA, I may arrange to
interview them in California.

We already have sufficient funding to support the programmer for the next
three years. However, we have applied to the Canadian Government for
additional funding specifically for this work, and if it comes through
(in November 1990) the programmer will be transferred to that source of
funding and the simulator will definitely be supplied to Canadian
Industry.  The job will then require more interactions with industrial
users and more systematic documentation, maintenance and debugging of
the simulator releases.






------------------------------

Subject: Research associate position available
From:    austin@minster.york.ac.uk
Date:    06 Aug 90 12:36:33 +0000

Please post -


		     University of York
  Departments of Computer Science, Electronics and Psychology

		 Research Associate post in
	  Neural Networks and Image Classification

Applications are invited for a three-year research associateship
within the departments of Computer Science, Electronics and
Psychology on a SERC image interpretation research initiative.
Applicants should preferably have programming and research
experience of image interpretation, neural networks and psychology.

The project is aimed at the development of neural models of
classification tasks and involves characterizing the processes
involved in learning and applying classification skills in clinical
screening tasks. A major aim is to develop models based on current
advances in neural networks.

Salaries will be on the 1A scale (£11,399 - £13,495). Informal
enquiries may be made to Dr. Jim Austin (0904 432734, email:
austin@uk.ac.york.minster). Further particulars may be obtained from
The Registrar's Department, University of York, Heslington, York,
YO1 5DD, UK, to whom three copies of a curriculum vitae should be
sent. The closing date for applications is 24 Aug 1990. Please quote
reference number J2.


		       August 6, 1990


------------------------------

Subject: Job opening at Intel for NN IC designer
From:    Bhusan Gupta <bgupta@aries.intel.com>
Date:    Thu, 09 Aug 90 16:19:58 -0700


The neural network group at Intel is looking for an engineer to
participate in the development of neural networks.

A qualified applicant should have a M.S. or PhD in electrical engineering
or equivalent experience. The specialization required is in CMOS circuit
design with an emphasis on digital design. Analog design experience is
considered useful as well. Familiarity with neural network architectures,
learning algorithms, and applications is desirable.

The duties that are specific to this job are:
	Neural network design.
		Architecture definition and circuit design.
		Chip planning, layout supervision and verification.
		Testing and debugging silicon.
	The neural network design consists primarily of digital design with
	both a gate-level and transistor-level emphasis.


The job is at the Santa Clara site and is currently open.

Interested principals can email me at bgupta@aries.intel.com until the
end of August. Resumes in ASCII are preferred. I will pass along all
responses to the appropriate people.

street address:
	Bhusan Gupta
	m/s sc9-40
	2250 Mission College Blvd.
	P.O. Box 58125
	Santa Clara, CA 95052


Intel is an equal opportunity employer, etc.

Bhusan Gupta

------------------------------

Subject: SIEMENS Job Announcement
From:    kuepper@ICSI.Berkeley.EDU (Wolfgang Kuepper)
Date:    Tue, 14 Aug 90 11:28:07 -0700


		IMAGE UNDERSTANDING and ARTIFICIAL NEURAL NETWORKS

	The Corporate Research and Development Laboratories of Siemens AG, 
	one of the largest companies worldwide in the electrical and 
	electronics industry, have research openings in the Computer Vision 
	as well as in the Neural Network Groups. The groups do basic and 
	applied studies in the areas of image understanding (document 
	interpretation, object recognition, 3D modeling, application of 
	neural networks) and artificial neural networks (models, 
	implementations, selected applications). The Laboratory is located 
	in Munich, an attractive city in the south of the Federal Republic 
	of Germany.

	Connections exist with our sister laboratory, Siemens Corporate 
	Research in Princeton, as well as with various research institutes 
	and universities in Germany and in the U.S., including MIT, CMU and 
	ICSI.

	Above and beyond the Laboratory facilities, the groups have a 
	network of Sun and DEC workstations, Symbolics Lisp machines, 
	file and compute servers, and dedicated image processing hardware.

	The successful candidate should have an M.S. or Ph.D. in Computer 
	Science, Electrical Engineering, or any other AI-related or 
	Cognitive Science field. He or she should preferably be able to 
	communicate in German and English.

	Siemens is an equal opportunity employer.

	Please send your resume and a reference list to
		Peter Moeckel
		Siemens AG
		ZFE IS INF 1
		Otto-Hahn-Ring 6
		D-8000 Muenchen 83
		West Germany
	e-mail: gm%bsun4@ztivax.siemens.com
	Tel. +49-89-636-3372
	FAX  +49-89-636-2393

	Inquiries may also be directed to
		Wolfgang Kuepper (on leave from Siemens until 8/91)
		International Computer Science Institute
		1947 Center Street - Suite 600
		Berkeley, CA 94704
	e-mail: kuepper@icsi.berkeley.edu
	Tel. (415) 643-9153
	FAX  (415) 643-7684


------------------------------

Subject: Re: Call for Participation in Connectionist Natural Language Processing
From:    powers@uklirb.informatik.uni-kl.de
Date:    Tue, 31 Jul 90 12:23:19 +0700


I'm organizing the AAAI Symposium on Machine Learning of Natural Language
and Ontology.  Obviously there is a relationship.  I will tack our call
on the end of this.  Maybe we could have a joint session?
 ------------------------------------------------------------------------
David Powers		 +49-631-205-3449 (Uni);  +49-631-205-3200 (Fax)
FB Informatik		powers@informatik.uni-kl.de; +49-631-13786 (Prv)
Univ Kaiserslautern	 * COMPULOG - Language and Logic
6750 KAISERSLAUTERN	 * MARPIA   - Parallel Logic Programming
WEST GERMANY		 * STANLIE  - Natural Language Learning
 ------------------------------------------------------------------------

       Machine Learning of Natural Language and Ontology

Over the last thirty years there has been a trickle of papers addressing
aspects of the Natural Language Learning area.  The 80s have even seen a
few books published on the subject.  These have tended to take
drastically different theoretical approaches, and have drawn in varying
degrees on fields outside Computer Science and Artificial Intelligence.

During this same period, computational and mathematical modelling of
language and learning have increasingly been recognized as relevant to
assessing the validity of a theory of Language Acquisition or the Nature
of Language.  Conversely, researchers in Linguistics, Psycholinguistics
and Philosophy, as well as Computing, have been considering how and where
we can apply our increasing knowledge of the human characteristics and
constraints which determine how we solve problems, learn about the world,
and use language.

The symposium will address all aspects of the relationship between
Machine Learning and Natural Language.  We not only expect input from
researchers in Computer Science and Artificial Intelligence (Machine
Learning, Natural Language, Robotics, Vision, Neural Nets, Parallelism,
etc.) but wish particularly to encourage relevant contributions from
other fields (Linguistics, Psycholinguistics, Philosophy, Neurology,
Mathematics, etc.)

Specific areas of interest include:

Traditional Approaches
 - Applicability of traditional machine learning.
 - Applicability of traditional parsing techniques.

Complexity Theory 
 - Formal results on learning and language constraints.
 - Development of effective classifications of language.

Cognitive Science
 - Psychological results on language and restrictions.
 - Linguistic results on the nature of natural language.

Parallel Networks
 - Neural models of parsing and learning.
 - Parallel models of parsing and learning.

Symbol Grounding
 - Grounding of Natural Language Systems.
 - Interaction between Modalities and Learning of Ontology.

System Development
 - Computable hypotheses and heuristics for language learning.
 - Experimental language learning systems and their rationale.


Prospective participants are encouraged to contact a member of the
symposium committee to obtain a more detailed description of the
symposium goals and issues.  Participants should then submit an extended
abstract of a paper (500-1000 words) and/or a personal bio-history of
work in the area (300-500 words) with a list of (up to 12) relevant
publications.

We will acknowledge your e-mail enquiries or submissions promptly, and
will deal with other forms of communication as quickly as possible.

Submissions should be sent by e-mail to powers=sub@informatik.uni-kl.de
(and/or reeker@cs.ida.org) by November 16th.  If e-mail is impossible,
two copies should be sent to arrive by November 16th to:

    Larry Reeker, Institute for Defense Analyses, C & SE Div.,
    1801 N. Beauregard St, Alexandria, VA 22311-1772

OR, fax a copy (with cover page) by November 16th BOTH to 1-703-820-9680
(Larry Reeker, USA) AND to +49-631-205-3210 (David Powers, FRG).

Program Committee: David Powers (chair - powers@informatik.uni-kl.de),
Larry Reeker (reeker@cs.ida.org), Manny Rayner (manny@sics.se), Chris
Turk (UK - Fax: +44-633-400091).


------------------------------

Subject: special issue of Connection Science Journal
From:    Noel Sharkey <N.E.Sharkey@cs.exeter.ac.uk>
Date:    Sat, 04 Aug 90 16:30:53 +0100



The NATURAL LANGUAGE special issue of CONNECTION SCIENCE will be
on the shelves soon. I thought you might like to see the contents.


CONTENTS

Catherine L Harris
   Connectionism and Cognitive Linguistics

John Rager & George Berg
   A Connectionist Model of Motion and Government in Chomsky's
   Government-binding Theory

David J Chalmers
   Syntactic Transformations on Distributed Representations

Stan C Kwasny & Kanaan A Faisal
   Connectionism and Determinism in a Syntactic Parser

Risto Miikkulainen
   Script Recognition with Hierarchical Feature Maps

Lorraine F R Karen
   Identification of Topical Entities in Discourse: a Connectionist
   Approach to Attentional Mechanism in Language

Mary Hare
   The Role of Similarity in Hungarian Vowel Harmony: a Connectionist
   Account

Robert Port
   Representation and Recognition of Temporal Patterns


Editor: Noel E. Sharkey, University of Exeter

Special Editorial Review Panel

Robert B. Allen, Bellcore
Garrison W. Cottrell, University of California, San Diego
Michael G. Dyer, University of California, Los Angeles
Jeffrey L. Elman, University of California, San Diego
George Lakoff, University of California, Berkeley
Wendy G. Lehnert, University of Massachusetts at Amherst
Jordan Pollack, Ohio State University
Ronan Reilly, Beckman Institute, University of Illinois at Urbana-Champaign
Bart Selman, University of Toronto
Paul Smolensky, University of Colorado, Boulder

We would like to encourage the CNLP community to submit many more papers,
and we would particularly like to see more papers on representational
issues.

noel



------------------------------

Subject: THIRD ISAI IN MEXICO
From:    "Centro de Inteligencia Artificial" <ISAI@tecmtyvm.mty.itesm.mx>
Organization: Instituto Tecnologico y de Estudios Superiores de Monterrey
Date:    Wed, 08 Aug 90 09:13:56 -0600


     To whom it may concern:
          Here you will find the information concerning the
      "THIRD INTERNATIONAL SYMPOSIUM ON ARTIFICIAL INTELLIGENCE".

          Please display it in your department's bulletin board.
     Thank you very much in advance.
              Sincerely,
                        The Symposium Publicity Committee.
====================================================================
          THIRD INTERNATIONAL SYMPOSIUM ON
               ARTIFICIAL INTELLIGENCE:
  APPLICATIONS OF ENGINEERING DESIGN & MANUFACTURING IN
          INDUSTRIALIZED AND DEVELOPING COUNTRIES

             OCTOBER 22-26, 1990
                ITESM, MEXICO

   The Third International Symposium on Artificial Intelligence will
   be held in Monterrey, N.L. Mexico on October 22-26, 1990.
   The Symposium is sponsored by the ITESM (Instituto Tecnologico y
   de Estudios Superiores de Monterrey)  in cooperation with the
   International Joint Conferences on Artificial Intelligence Inc.,
   the American Association for Artificial Intelligence, the Sociedad
   Mexicana de Inteligencia Artificial and IBM of Mexico.

   GOALS:
   * Promote the development and use of AI technology in the
     solution of real world problems. Analyze the state-of-the-art
     of AI technology in different countries. Evaluate efforts
     made in the use of AI technology in all countries.

   FORMAT:
   ISAI consists of a tutorial and a conference.
           Tutorial.- Oct. 22-23
           Set of seminars on relevant AI topics given in two days.
   Topics covered in the Tutorial include:
   "Expert Systems in Manufacturing"
   Mark Fox, Ph.D., Carnegie Mellon University, USA
   "A.I. as a Software Development Methodology"
   Randolph Goebel, Ph.D., University of Alberta, Canada

            Conference.- Oct. 24-26
            Set of lectures given during three days. It consists of
   invited papers and selected papers from the "Call for Papers"
   invitation. Areas of application include: computer aided product
   design, computer aided product manufacturing, use of industrial
   robots, process control and ES, automatic process inspection and
   production planning.
   Confirmed guest speakers:
   Nick Cercone, Ph.D, Simon Fraser University, Canada
   Alan Mackworth, Ph.D, University of British Columbia, Canada
   Mitsuru Ishizuka, Ph.D, University of Tokyo, Japan

   IMPORTANT:
             Computer manufacturers, AI commercial companies,
   universities, and authors of selected papers with working programs
   may present products and demonstrations during the conference.
   In order to encourage an atmosphere of friendship and exchange
   among participants, some social events are being organized.
     For your convenience we have arranged a free shuttle bus
   service between the hotel zone and the ITESM during the three
   day conference.

    FEES: (Valid before August 31)
         Tutorial.-
           Professionals    $ 250 USD + Tx(15%)
           Students         $ 125 USD + Tx(15%)
        Conference.-
           Professionals    $ 180 USD + Tx(15%)
           Students         $  90 USD + Tx(15%)
           Simultaneous Translation   $ 7 USD
           Formal dinner    $ 25 USD *
           *(Includes dinner, open bar, music  (Oct 26))
    Tutorial fee includes:
        Tutorial material.
        Welcoming cocktail party (Oct.22)

    Conference fee includes:
        Proceedings.
        Welcoming cocktail party (Oct.24)
        Cocktail party. (Oct.25)

    HOTELS:
        Call one of the hotels listed below and mention that you
    are going to the 3rd ISAI. Published rates are for single or
    double rooms.
    HOTEL                   PHONE*              RATE
    Hotel Ambassador       42-20-40          $85 USD + Tx(15%)
    Gran Hotel Ancira      42-48-06          $75 USD + Tx(15%)
                           91(800) 83-060
    Hotel Monterrey        43-51-(20 to 29)  $60 USD + Tx(15%)
    Hotel Rio              44-90-40          $48 USD + Tx(15%)
    * The area code for Monterrey is (83).

    REGISTRATION PROCEDURE:
        Send personal check payable to "I.T.E.S.M." to:
              "Centro de Inteligencia Artificial,
               Attention: Leticia Rodriguez,
               Sucursal de Correos "J", C.P. 64849,
               Monterrey, N.L. Mexico"

        INFORMATION:
              CENTRO DE INTELIGENCIA ARTIFICIAL, ITESM.
              SUC. DE CORREOS "J", C.P. 64849 MONTERREY, N.L. MEXICO.
              TEL.    (83) 58-20-00 EXT.5132 or 5143.
              TELEFAX (83) 58-07-71, (83) 58-89-31,
              NET ADDRESS:
                          ISAI AT TECMTYVM.BITNET
                          ISAI AT TECMTYVM.MTY.ITESM.MX

------------------------------

Subject: International Journal of Neural Systems
From:    Benny Lautrup <LAUTRUP@nbivax.nbi.dk>
Date:    Thu, 09 Aug 90 11:19:00 +0200

 

INTERNATIONAL JOURNAL OF NEURAL SYSTEMS 
       
The International Journal of Neural  Systems  is  a  quarterly  journal
which covers information processing in natural  and  artificial  neural
systems. It publishes original contributions on  all  aspects  of  this
broad subject which involves  physics,  biology,  psychology,  computer
science and engineering. Contributions include research papers, reviews
and short communications.  The journal presents a fresh, undogmatic
attitude towards this multidisciplinary field, with the aim of being a
forum for novel ideas and improved understanding of collective and
cooperative phenomena with computational capabilities.

ISSN: 0129-0657 (IJNS) 

 ----------------------------------

Contents of issue number 3 (1990):

1. A. S. Weigend, B. A. Huberman and D. E. Rumelhart: 
   Predicting the future: A connectionist approach.

2. C. Chinchuan, M. Shanblatt and C. Maa: An artificial neural 
   network algorithm for dynamic programming.

3. L. Fan and T. Li: Design of competition based neural networks 
   for combinatorial optimization.

4. E. A. Ferran and R. P. J. Perazzo: Dyslexic behaviour of 
   feed-forward neural networks.

5. E. Milloti: Sigmoid versus step functions in feed-forward  
   neural networks.  

6. D. Horn and M. Usher: Excitatory-inhibitory networks with
   dynamical thresholds.

7. J. G. Sutherland: A holographic model of memory, learning
   and expression. 

8. L. Xu: Adding top-down expectations into the learning procedure 
   of self-organizing maps.

9. D. Stork: BOOK REVIEW     

 ----------------------------------

Editorial board:

B. Lautrup (Niels Bohr Institute, Denmark)  (Editor-in-charge)
S. Brunak (Technical Univ. of Denmark) (Assistant Editor-in-Charge) 

D. Stork (Stanford) (Book review editor)

Associate editors:

B. Baird (Berkeley) 
D. Ballard (University of Rochester) 
E. Baum (NEC Research Institute)
S. Bjornsson (University of Iceland)
J. M. Bower (CalTech)
S. S. Chen (University of North Carolina)
R. Eckmiller (University of Dusseldorf)
J. L. Elman (University of California, San Diego)
M. V. Feigelman (Landau Institute for Theoretical Physics)
F. Fogelman-Soulie (Paris)  
K. Fukushima (Osaka University)
A. Gjedde (Montreal Neurological Institute)
S. Grillner (Nobel Institute for Neurophysiology, Stockholm)
T. Gulliksen (University of Oslo)
D. Hammerstroem (University of Oregon)
J. Hounsgaard (University of Copenhagen) 
B. A. Huberman (XEROX PARC)
L. B. Ioffe (Landau Institute for Theoretical Physics)
P. I. M. Johannesma (Katholieke Univ. Nijmegen)
M. Jordan (MIT)
G. Josin (Neural Systems Inc.)
I. Kanter (Princeton University)
J. H. Kaas (Vanderbilt University)
A. Lansner (Royal Institute of Technology, Stockholm)   
A. Lapedes (Los Alamos)
B. McWhinney (Carnegie-Mellon University)
M. Mezard (Ecole Normale Superieure, Paris)
A. F. Murray (University of Edinburgh)
J. P. Nadal (Ecole Normale Superieure, Paris)
E. Oja (Lappeenranta University of Technology, Finland)
N. Parga (Centro Atomico Bariloche, Argentina)
S. Patarnello (IBM ECSEC, Italy)
P. Peretto (Centre d'Etudes Nucleaires de Grenoble)
C. Peterson (University of Lund)
K. Plunkett (University of Aarhus)
S. A.  Solla (AT&T Bell Labs)
M. A. Virasoro (University of Rome)
D. J. Wallace (University of Edinburgh)
D. Zipser (University of California, San Diego) 

 ----------------------------------


CALL FOR PAPERS  

Original contributions consistent with the scope  of  the  journal  are
welcome.  Complete  instructions  as  well   as   sample   copies   and
subscription information are available from 

The Editorial Secretariat, IJNS
World Scientific Publishing Co. Pte. Ltd.
73, Lynton Mead, Totteridge
London N20 8DH
ENGLAND 
Telephone: (44)1-446-2461

or 

World Scientific Publishing Co. Inc.
687 Hardwell St.
Teaneck
New Jersey 07666
USA  
Telephone: (1)201-837-8858  

or

World Scientific Publishing Co. Pte. Ltd.
Farrer Road, P. O. Box 128
SINGAPORE 9128
Telephone (65)278-6188



------------------------------

End of Neuron Digest [Volume 6 Issue 48]
****************************************