[comp.ai.neural-nets] Neuron Digest V5 #47

neuron-request@HPLABS.HP.COM ("Neuron-Digest Moderator Peter Marvit") (11/26/89)

Neuron Digest   Saturday, 25 Nov 1989
                Volume 5 : Issue 47

Today's Topics:
               CFP: Parallel Computing special issue on NNs
                NEURAL NET SUMMER SCHOOL AT WANG INSTITUTE
       International Symposium on AI and Math (Second Announcement)


Send submissions, questions, address maintenance and requests for old issues to
"neuron-request@hplabs.hp.com" or "{any backbone,uunet}!hplabs!neuron-request"
Use "ftp" to get old issues from hplpm.hpl.hp.com (15.255.176.205).

------------------------------------------------------------

Subject: CFP: Parallel Computing special issue on NNs
From:    Heinz Muehlenbein <gmdzi!muehlen@uunet.UU.NET>
Date:    Tue, 07 Nov 89 13:55:10 -0100 

  Special issue on neural networks
  --------------------------------
Dear colleagues,

I am editing a special issue of the journal Parallel Computing on neural
networks. The following topics will be covered:

  --- introduction to NNs
  --- simulation of NNs
  --- performance of NNs
  --- limitations of current NNs
  --- the next generation

I am looking for papers that describe the limitations of current NNs
and/or outline the next generation. In my opinion, the next generation of
NNs will have the following features, to mention only some important ones:
       - modular
       - recurrent
       - asynchronous
Modular neural networks are networks composed of subnetworks that can be
trained independently. A major problem is finding the modular structure
that fits the specific application problem. I have proposed genetic neural
networks as a long-term research topic.

In these networks, the genes specify the network modules and their
interconnections. By simulating the evolutionary process, these networks
adapt to the application.
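To make the idea concrete, here is a minimal sketch in Python, with
everything invented for illustration (the module count, the fitness
function, and the GA parameters): each gene is one entry of a module
interconnection matrix, and a simple generational genetic algorithm
searches for a wiring that scores well. A real system would obtain fitness
by training and testing the decoded network on the application, not from a
stand-in target pattern as done here.

```python
import random

random.seed(0)

N_MODULES = 4  # hypothetical number of subnetworks

def random_genome():
    # gene[i * N_MODULES + j] == 1 means module i feeds module j
    return [random.randint(0, 1) for _ in range(N_MODULES * N_MODULES)]

def fitness(genome):
    # Stand-in for "simulate the decoded network on the application":
    # here we simply reward a sparse feedforward chain of modules.
    target = [1 if j == i + 1 else 0
              for i in range(N_MODULES) for j in range(N_MODULES)]
    return sum(g == t for g, t in zip(genome, target))

def evolve(pop_size=20, generations=40, p_mut=0.05):
    pop = [random_genome() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[:pop_size // 2]          # elitist selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, len(a))    # one-point crossover
            child = a[:cut] + b[cut:]
            child = [1 - g if random.random() < p_mut else g
                     for g in child]             # point mutation
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

best = evolve()
```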

I believe that many researchers are moving in the same direction. Why not
publish now?

Please contact me by e-mail. The deadline for abstracts is the end of
November; the deadline for the finished paper (10--15 pages) is January 31.
The issue will appear in late summer 1990.

-----Heinz Muehlenbein
     GMD
     P.O. Box 1240
     5205 Sankt Augustin 1
     Germany
muehlen@gmdzi.uucp

------------------------------

Subject: NEURAL NET SUMMER SCHOOL AT WANG INSTITUTE
From:    mike@bucasb.BU.EDU (Michael Cohen)
Date:    Fri, 10 Nov 89 00:57:00 -0500 

BOSTON UNIVERSITY, A WORLD LEADER IN NEURAL NETWORK RESEARCH AND
TECHNOLOGY, PRESENTS TWO MAJOR SCIENTIFIC EVENTS:


MAY 6--11, 1990
NEURAL NETWORKS: FROM FOUNDATIONS TO APPLICATIONS
A self-contained systematic course by leading neural architects who know the 
field as only its creators can.


MAY 11--13, 1990
NEURAL NETWORKS FOR AUTOMATIC TARGET RECOGNITION
An international research conference presenting INVITED and CONTRIBUTED 
papers, herewith solicited, on one of the most active research topics in 
science and technology today.

                                     
                               SPONSORED BY
                      THE CENTER FOR ADAPTIVE SYSTEMS
                                    AND
                            THE WANG INSTITUTE
                                    OF
                             BOSTON UNIVERSITY
                         WITH PARTIAL SUPPORT FROM
                THE AIR FORCE OFFICE OF SCIENTIFIC RESEARCH

- -----------------------------------------------------------------------------


                             CALL FOR PAPERS
                             ---------------

              NEURAL NETWORKS FOR AUTOMATIC TARGET RECOGNITION
                             MAY 11--13, 1990

This research conference at the cutting edge of neural network science and
technology will bring together leading experts in academe, government, and
industry to present their latest results on automatic target recognition in
invited lectures and contributed posters. Automatic target recognition is a
key process in systems designed for vision and image processing, speech and
time series prediction, adaptive pattern recognition, and adaptive
sensory-motor control and robotics. It is one of the areas emphasized by
the DARPA Neural Networks Program, and has attracted intense research
activity around the world. Invited lecturers include:

JOE BROWN, Martin Marietta, "Multi-Sensor ATR using Neural Nets"

GAIL CARPENTER, Boston University, "Target Recognition by Adaptive Resonance: 
ART for ATR"

NABIL FARHAT, University of Pennsylvania, "Bifurcating Networks for Target 
Recognition"

STEPHEN GROSSBERG, Boston University, "Recent Results on Self-Organizing 
ATR Networks"

ROBERT HECHT-NIELSEN, HNC, "Spatiotemporal Attention Focusing by
Expectation Feedback"

KEN JOHNSON, Hughes Aircraft, "The Application of Neural Networks to the
Acquisition and Tracking of Maneuvering Tactical Targets in High Clutter IR
Imagery"

PAUL KOLODZY, MIT Lincoln Laboratory, "A Multi-Dimensional ATR System"

MICHAEL KUPERSTEIN, Neurogen, "Adaptive Sensory-Motor Coordination using
the INFANT Controller"

YANN LECUN, AT&T Bell Labs, "Structured Back Propagation Networks for
Handwriting Recognition"

CHRISTOPHER SCOFIELD, Nestor, "Neural Network Automatic Target Recognition
by Active and Passive Sonar Signals"

STEVEN SIMMES, Science Applications International Co., "Massively Parallel
Approaches to Automatic Target Recognition"

ALEX WAIBEL, Carnegie Mellon University, "Patterns, Sequences and
Variability: Advances in Connectionist Speech Recognition"

ALLEN WAXMAN, MIT Lincoln Laboratory, "Invariant Learning and Recognition
of 3D Objects from Temporal View Sequences"

FRED WEINGARD, Booz-Allen and Hamilton, "Current Status and Results of Two
Major Government Programs in Neural Network-Based ATR"

BARBARA YOON, DARPA, "DARPA Artificial Neural Networks Technology Program:
Automatic Target Recognition"

          ------------------------------------------------------

CALL FOR PAPERS---ATR POSTER SESSION: A featured poster session on ATR
neural network research will be held on May 12, 1990. Attendees who wish to
present a poster should submit 3 copies of an extended abstract (1
single-spaced page), postmarked by March 1, 1990, for refereeing. Include
with the abstract the name, address, and telephone number of the
corresponding author. Mail to: ATR Poster Session, Neural Networks
Conference, Wang Institute of Boston University, 72 Tyng Road, Tyngsboro,
MA 01879. Authors will be informed of abstract acceptance by March 31,
1990.

SITE: The Wang Institute possesses excellent conference facilities on a
beautiful 220-acre rustic setting. It is easily reached from Boston's Logan
Airport and Route 128.

REGISTRATION FEE: Regular attendee--$90; full-time student--$70.
Registration fee includes admission to all lectures and poster session, one
reception, two continental breakfasts, one lunch, one dinner, daily morning
and afternoon coffee service. STUDENTS: Read below about FELLOWSHIP
support.

REGISTRATION: To register by telephone with VISA or MasterCard call (508)
649-9731 between 9:00AM--5:00PM (EST). To register by FAX, fill out the
registration form and FAX back to (508) 649-6926. To register by mail,
complete the registration form and mail with your full form of payment as
directed. Make check payable in U.S. dollars to "Boston University". See
below for Registration Form. To register by electronic mail, use the
address "rosenber@bu-tyng.bu.edu". On-site registration on a
space-available basis will take place from 1:00--5:00PM on Friday, May 11.
A RECEPTION will be held from 3:00--5:00PM on Friday, May 11. LECTURES
begin at 5:00PM on Friday, May 11 and conclude at 1:00PM on Sunday, May 13.

- --------------------------------------------------------------------


             NEURAL NETWORKS: FROM FOUNDATIONS TO APPLICATIONS
                               MAY 6--11, 1990

This in-depth, systematic, 5-day course is based upon the world's leading
graduate curriculum in the technology, computation, mathematics, and
biology of neural networks. Developed at the Center for Adaptive Systems
(CAS) and the Graduate Program in Cognitive and Neural Systems (CNS) of
Boston University, twenty-eight hours of the course will be taught by six
CAS/CNS faculty. Three distinguished guest lecturers will present eight
hours of the course.


COURSE OUTLINE
- --------------

          MAY 7, 1990
          -----------

          MORNING SESSION (PROFESSOR GROSSBERG)

HISTORICAL OVERVIEW: Introduction to the binary, linear, and
continuous-nonlinear streams of neural network research: McCulloch-Pitts,
Rosenblatt, von Neumann; Anderson, Kohonen, Widrow; Hodgkin-Huxley,
Hartline-Ratliff, Grossberg.

CONTENT ADDRESSABLE MEMORY: Classification and analysis of neural network
models for absolutely stable CAM.  Models include: Cohen-Grossberg,
additive, shunting, Brain-State-In-A-Box, Hopfield, Boltzmann Machine,
McCulloch-Pitts, masking field, bidirectional associative memory.

COMPETITIVE DECISION MAKING: Analysis of asynchronous variable-load
parallel processing by shunting competitive networks; solution of
noise-saturation dilemma; classification of feedforward networks: automatic
gain control, ratio processing, Weber law, total activity normalization,
noise suppression, pattern matching, edge detection, brightness constancy
and contrast, automatic compensation for variable illumination or other
background energy distortions; classification of feedback networks:
influence of nonlinear feedback signals, notably sigmoid signals, on
pattern transformation and memory storage, winner-take-all choices, partial
memory compression, tunable filtering, quantization and normalization of
total activity, emergent boundary segmentation; method of jumps for
classifying globally consistent and inconsistent competitive decision
schemes.
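The core of the noise-saturation solution can be stated compactly. For a
feedforward shunting on-center off-surround network,
dx_i/dt = -A*x_i + (B - x_i)*I_i - x_i * SUM(I_k, k != i), and setting
dx_i/dt = 0 gives the equilibrium x_i = B*I_i / (A + SUM_k I_k). A few
lines of Python (with A and B chosen arbitrarily) exhibit the ratio
processing and total-activity normalization described above:

```python
def shunting_steady_state(inputs, A=1.0, B=1.0):
    """Equilibrium activities of a feedforward shunting on-center
    off-surround network:
        dx_i/dt = -A*x_i + (B - x_i)*I_i - x_i * sum(I_k for k != i)
    Setting dx_i/dt = 0 and solving gives x_i = B*I_i / (A + sum(I))."""
    total = sum(inputs)
    return [B * I / (A + total) for I in inputs]

dim    = shunting_steady_state([1, 2, 1])     # a pattern at low energy
bright = shunting_steady_state([10, 20, 10])  # same pattern, 10x energy

# Ratio processing (Weber law): relative activities reflect the input
# ratios regardless of background energy, while total activity remains
# bounded by B (automatic gain control / normalization).
```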

ASSOCIATIVE LEARNING: Derivation of associative equations for short-term
memory and long-term memory.  Overview and analysis of associative
outstars, instars, computational maps, avalanches, counterpropagation nets,
adaptive bidirectional associative memories.  Analysis of unbiased
associative pattern learning by asynchronous parallel sampling channels;
classification of associative learning laws.


          AFTERNOON SESSION (PROFESSORS JORDAN AND MINGOLLA)

COMBINATORIAL OPTIMIZATION

PERCEPTRONS: Adaline, Madaline, delta rule, gradient descent, adaptive
statistical predictor, nonlinear separability.

INTRODUCTION TO BACK PROPAGATION: Supervised learning of multidimensional
nonlinear maps, NETtalk, image compression, robotic control.
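As a concrete illustration of supervised learning of a nonlinear map by
back propagation, here is a minimal two-layer sketch in Python (the network
size, learning rate, and epoch count are arbitrary choices for
illustration). It trains on XOR, the standard example of a map that no
single-layer perceptron can learn:

```python
import math, random

random.seed(1)

# XOR: a small nonlinear map that requires a hidden layer.
DATA = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

H = 4  # hidden units (arbitrary)
w1 = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(H)]  # input+bias -> hidden
w2 = [random.uniform(-1, 1) for _ in range(H + 1)]                  # hidden+bias -> output

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def forward(x):
    h = [sigmoid(w[0] * x[0] + w[1] * x[1] + w[2]) for w in w1]
    o = sigmoid(sum(w2[i] * h[i] for i in range(H)) + w2[H])
    return h, o

def total_error():
    return sum((forward(x)[1] - t) ** 2 for x, t in DATA)

def train(epochs=4000, lr=0.5):
    for _ in range(epochs):
        for x, t in DATA:
            h, o = forward(x)
            delta_o = (o - t) * o * (1 - o)  # error signal at the output
            # hidden deltas: output error propagated back through w2
            delta_h = [delta_o * w2[i] * h[i] * (1 - h[i]) for i in range(H)]
            w2[H] -= lr * delta_o            # gradient-descent weight updates
            for i in range(H):
                w2[i] -= lr * delta_o * h[i]
                w1[i][0] -= lr * delta_h[i] * x[0]
                w1[i][1] -= lr * delta_h[i] * x[1]
                w1[i][2] -= lr * delta_h[i]

before = total_error()
train()
after = total_error()
```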

RECENT DEVELOPMENTS OF BACK PROPAGATION: This two-hour guest tutorial
lecture will provide a systematic review of recent developments of the back
propagation learning network, especially focussing on recurrent back
propagation variations and applications to outstanding technological
problems.


          EVENING SESSION: DISCUSSIONS WITH TUTORS
  


          MAY 8, 1990
          -----------

          MORNING SESSION (PROFESSORS CARPENTER AND GROSSBERG)

ADAPTIVE PATTERN RECOGNITION: Adaptive filtering; contrast enhancement;
competitive learning of recognition categories; adaptive vector
quantization; self-organizing computational maps; statistical properties of
adaptive weights; learning stability and causes of instability.
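The competitive-learning step above can be sketched as winner-take-all
adaptive vector quantization. In this illustrative toy (the cluster
centers, learning rate, and iteration count are all invented), the category
node whose weight vector best matches the input wins the competition, and
only the winner's weights move toward the input:

```python
import random

random.seed(3)

CENTERS = [(0.2, 0.2), (0.8, 0.8)]  # two input "categories" (arbitrary)

def sample():
    cx, cy = random.choice(CENTERS)
    return (cx + random.uniform(-0.05, 0.05),
            cy + random.uniform(-0.05, 0.05))

def dist2(a, b):
    return (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2

# Initialize each category node's weight vector from the input stream.
weights = [list(sample()), list(sample())]

for _ in range(2000):
    x = sample()
    # competitive stage: the closest weight vector wins (winner-take-all)
    j = min(range(2), key=lambda k: dist2(weights[k], x))
    for d in range(2):
        # only the winner learns: its weights track the inputs it codes
        weights[j][d] += 0.05 * (x[d] - weights[j][d])
```

After training, each node's weight vector sits near one cluster center, so
the two nodes have self-organized into recognition categories for the two
input clusters.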

INTRODUCTION TO ADAPTIVE RESONANCE THEORY: Absolutely stable recognition
learning, role of learned top-down expectations; attentional priming;
matching by 2/3 Rule; adaptive search; self-controlled hypothesis testing;
direct access to globally optimal recognition code; control of categorical
coarseness by attentional vigilance; comparison with relevant behavioral
and brain data to emphasize biological basis of ART computations.

ANALYSIS OF ART 1: Computational analysis of ART 1 architecture for
self-organized real-time hypothesis testing, learning, and recognition of
arbitrary sequences of binary input patterns.
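The ART 1 search-and-match cycle can be sketched in a few lines. This is a
drastically simplified fast-learning version (the choice parameter,
vigilance value, and sample patterns below are invented, and the real
architecture's attentional gain control, 2/3 Rule dynamics, and real-time
differential equations are not modeled): each committed category stores a
binary prototype, categories are searched in order of a choice function, a
vigilance test accepts or resets each candidate, and an accepted prototype
learns the intersection with the input.

```python
def art1(patterns, rho=0.7, beta=0.5):
    """Cluster binary vectors with a simplified fast-learning ART 1 cycle.

    rho  -- vigilance: fraction of the input that a category's prototype
            must match (higher rho -> finer categories)
    beta -- choice parameter biasing search toward specific prototypes
    """
    prototypes = []  # one binary prototype (top-down expectation) per category
    labels = []
    for I in patterns:
        def choice(w):  # bottom-up choice function
            overlap = sum(a & b for a, b in zip(I, w))
            return overlap / (beta + sum(w))
        # search committed categories in order of the choice function
        for j in sorted(range(len(prototypes)),
                        key=lambda j: choice(prototypes[j]), reverse=True):
            w = prototypes[j]
            match = sum(a & b for a, b in zip(I, w)) / sum(I)
            if match >= rho:  # vigilance test passed: resonance
                prototypes[j] = [a & b for a, b in zip(I, w)]  # fast learning
                labels.append(j)
                break
        else:  # every committed category was reset: commit a new one
            prototypes.append(list(I))
            labels.append(len(prototypes) - 1)
    return labels, prototypes

labels, prototypes = art1([[1, 1, 1, 0, 0],
                           [1, 1, 1, 1, 0],
                           [0, 0, 1, 1, 1]])
```

With vigilance 0.7, the first two overlapping patterns share a category
while the third, which fails the vigilance test, commits a new one.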


          AFTERNOON SESSION (PROFESSOR CARPENTER)

ANALYSIS OF ART 2: Computational analysis of ART 2 architecture for
self-organized real-time hypothesis testing, learning, and recognition for
arbitrary sequences of analog or binary input patterns.

ANALYSIS OF ART 3: Computational analysis of ART 3 architecture for
self-organized real-time hypothesis testing, learning, and recognition
within distributed network hierarchies; role of chemical transmitter
dynamics in forming a memory representation distinct from short-term memory
and long-term memory; relationships to brain data concerning
neuromodulators and synergetic ionic and transmitter interactions.

SELF-ORGANIZATION OF INVARIANT PATTERN RECOGNITION CODES: Computational
analysis of self-organizing ART architectures for recognizing noisy imagery
undergoing changes in position, rotation, and size.

NEOCOGNITRON: Recognition and completion of images by hierarchical
bottom-up filtering and top-down attentive feedback.


          EVENING SESSION: DISCUSSIONS WITH TUTORS



          MAY 9, 1990
          -----------

          MORNING SESSION (PROFESSORS GROSSBERG & MINGOLLA)

VISION AND IMAGE PROCESSING: Introduction to Boundary Contour System for
emergent segmentation and Feature Contour System for filling-in after
compensation for variable illumination; image compression,
orthogonalization, and reconstruction; multidimensional filtering,
multiplexing, and fusion; coherent boundary detection, regularization,
self-scaling, and completion; compensation for variable illumination
sources, including artificial sensors (infrared sensors, laser radars);
filling-in of surface color and form; 3-D form from shading, texture,
stereo, and motion; parallel processing of static form and moving form;
motion capture and induced motion; synthesis of static form and motion form
representations.

          AFTERNOON SESSION (PROFESSORS BULLOCK, COHEN, & GROSSBERG)

ADAPTIVE SENSORY-MOTOR CONTROL AND ROBOTICS: Overview of recent progress in
adaptive sensory-motor control and related robotics research. Reaching to,
grasping, and transporting objects of variable mass and form under visual
guidance in a cluttered environment will be used as a target behavioral
competence to clarify subproblems of real-time adaptive sensory-motor
control. The balance of the tutorial will be spent detailing neural network
modules that solve various subproblems. Topics include: Self-organizing
networks for real-time control of eye movements, arm movements, and eye-arm
coordination; learning of invariant body-centered target position maps;
learning of intermodal associative maps; real-time trajectory formation;
adaptive vector encoders; circular reactions between action and sensory
feedback; adaptive control of variable speed movements; varieties of error
signals; supportive behavioral and neural data; inverse kinematics;
automatic compensation for unexpected perturbations; independent adaptive
control of force and position; adaptive gain control by cerebellar
learning; position-dependent sampling from spatial maps; predictive motor
planning and execution.

SPEECH PERCEPTION AND PRODUCTION: Hidden Markov models; self-organization
of speech perception and production codes; eighth nerve Average Localized
Synchrony Response; phoneme recognition by back propagation, time delay
networks, and vector quantization.



          MAY 10, 1990
          ------------

          MORNING SESSION (PROFESSORS COHEN, GROSSBERG, & MERRILL)

SPEECH PERCEPTION AND PRODUCTION: Disambiguation of coarticulated vowels
and consonants; dynamics of working memory; multiple-scale adaptive coding
by masking fields; categorical perception; phonemic restoration; contextual
disambiguation of speech tokens; resonant completion and grouping of noisy
variable-rate speech streams.

REINFORCEMENT LEARNING AND PREDICTION: Recognition learning, reinforcement
learning, and recall learning are the 3 R's of neural network learning.
Reinforcement learning clarifies how external events interact with internal
organismic requirements to trigger learning processes capable of focussing
attention upon and generating appropriate actions towards motivationally
desired goals. A neural network model will be derived to show how
reinforcement learning and recall learning can self-organize in response to
asynchronous series of significant and irrelevant events. These mechanisms
also control selective forgetting of memories that are no longer
predictive, adaptive timing of behavioral responses, and self-organization
of goal directed problem solvers.


          AFTERNOON SESSION 
          (PROFESSORS GROSSBERG & MERRILL AND DR. HECHT-NIELSEN)

REINFORCEMENT LEARNING AND PREDICTION: Analysis of drive representations,
adaptive critics, conditioned reinforcers, role of motivational feedback in
focusing attention on predictive data; attentional blocking and unblocking;
adaptively timed problem solving; synthesis of perception, recognition,
reinforcement, recall, and robotics mechanisms into a total neural
architecture; relationship to data about hypothalamus, hippocampus,
neocortex, and related brain regions.

RECENT DEVELOPMENTS IN THE NEUROCOMPUTER INDUSTRY: This two-hour guest
tutorial will provide an overview of the growth and prospects of the
burgeoning neurocomputer industry by one of its most important leaders.


          EVENING SESSION: DISCUSSIONS WITH TUTORS



          MAY 11, 1990
          ------------

          MORNING SESSION (DR. FAGGIN)

VLSI IMPLEMENTATION OF NEURAL NETWORKS: This is a four-hour self-contained
tutorial on the application and development of VLSI techniques for creating
compact real-time chips embodying neural network designs for applications
in technology. Review of neural networks from a hardware implementation
perspective; hardware requirements and alternatives; dedicated digital
implementation of neural networks; neuromorphic design methodology using
VLSI CMOS technology; applications and performance of neuromorphic
implementations; comparison of neuromorphic and digital hardware; future
prospects.

- ----------------------------------------------------------------------------


                    COURSE FACULTY FROM BOSTON UNIVERSITY
                    -------------------------------------

STEPHEN GROSSBERG, Wang Professor of CNS, as well as Professor of
Mathematics, Psychology, and Biomedical Engineering, is one of the world's
leading neural network pioneers and most versatile neural architects;
Founder and 1988 President of the International Neural Network Society
(INNS); Founder and Co-Editor-in-Chief of the INNS journal "Neural
Networks"; an editor of the journals "Neural Computation", "Cognitive
Science", and "IEEE Expert"; Founder and Director of the Center for
Adaptive Systems; General Chairman of the 1987 IEEE First International
Conference on Neural Networks (ICNN); Chief Scientist of Hecht-Nielsen
Neurocomputer Company (HNC); and one of the four technical consultants to
the national DARPA Neural Network Study.  He is author of 200 articles and
books about neural networks, including "Neural Networks and Natural
Intelligence" (MIT Press, 1988), "Neural Dynamics of Adaptive Sensory-Motor
Control" (with Michael Kuperstein, Pergamon Press, 1989), "The Adaptive
Brain, Volumes I and II" (Elsevier/North-Holland, 1987), "Studies of Mind
and Brain" (Reidel Press, 1982), and the forthcoming "Pattern Recognition
by Self-Organizing Neural Networks" (with Gail Carpenter).

GAIL CARPENTER is Professor of Mathematics and CNS; Co-Director of the CNS
Graduate Program; 1989 Vice President of the International Neural Network
Society (INNS); Organization Chairman of the 1988 INNS annual meeting;
Session Chairman at the 1989 and 1990 IEEE/INNS International Joint
Conference on Neural Networks (IJCNN); one of four technical consultants to
the national DARPA Neural Network Study; editor of the journals "Neural
Networks", "Neural Computation", and "Neural Network Review"; and a member
of the scientific advisory board of HNC. A leading neural architect,
Carpenter is especially well-known for her seminal work on developing the
adaptive resonance theory architectures (ART 1, ART 2, ART 3) for adaptive
pattern recognition.

MICHAEL COHEN, Associate Professor of Computer Science and CNS, is a
leading architect of neural networks for content addressable memory
(Cohen-Grossberg model), vision (Feature Contour System), and speech
(Masking Fields); editor of "Neural Networks"; Session Chairman at the 1987
ICNN, and the 1989 IJCNN; and member of the DARPA Neural Network Study
panel on Simulation/Emulation Tools and Techniques.

ENNIO MINGOLLA, Assistant Professor of Psychology and CNS, is holder of one
of the first patented neural network architectures for vision and image
processing (Boundary Contour System); Co-Organizer of the 3rd Workshop on
Human and Machine Vision in 1985; editor of the journals "Neural Networks"
and "Ecological Psychology"; member of the DARPA Neural Network Study panel
of Adaptive Knowledge Processing; consultant to E.I. duPont de Nemours,
Inc.; Session Chairman for vision and image processing at the 1987 ICNN,
and the 1988 INNS meetings.

DANIEL BULLOCK, Assistant Professor of Psychology and CNS, is developer of
neural network models for real-time adaptive sensory-motor control of arm
movements and eye-arm coordination, notably the VITE and FLETE models for
adaptive control of multi-joint trajectories; editor of "Neural Networks";
Session Chairman for adaptive sensory-motor control and robotics at the
1987 ICNN and the 1988 INNS meetings; invited speaker at the 1990 IJCNN.

JOHN MERRILL, Assistant Professor of Mathematics and CNS, is developing
neural network models for adaptive pattern recognition, speech recognition,
reinforcement learning, and adaptive timing in problem solving behavior,
after having received his Ph.D. in mathematics from the University of
Wisconsin at Madison, and completing postdoctoral research in computer
science and linguistics at Indiana University.


                              GUEST LECTURERS
                              ---------------

FEDERICO FAGGIN is co-founder and president of Synaptics, Inc. Dr. Faggin
developed the Silicon Gate Technology at Fairchild Semiconductor. He also
designed the first commercial circuit using Silicon Gate Technology: the
3708, an 8-bit analog multiplexer. At Intel Corporation he was responsible
for designing what was to become the first microprocessor---the 4000
family, also called MCS-4. He and Hal Feeney designed the 8008, the first
8-bit microprocessor introduced in 1972, and later Faggin conceived the
8080 and with M. Shima designed it. The 8080 was the first high-performance
8-bit microprocessor. At Zilog Inc., Faggin conceived the Z80
microprocessor family and directed the design of the Z80 CPU. Faggin also
started Cygnet Technologies, which developed a voice and data communication
peripheral for the personal computer. In 1986 Faggin co-founded Synaptics
Inc., a company dedicated to the creation of a new type of VLSI hardware
for artificial neural networks and other machine intelligence applications.
Faggin is the recipient of the 1988 Marconi Fellowship Award for his
contributions to the birth of the microprocessor.

ROBERT HECHT-NIELSEN is co-founder and chairman of the Board of Directors
of Hecht-Nielsen Neurocomputer Corporation (HNC), a pioneer in
neurocomputer technology and the application of neural networks, and a
recognized leader in the field. Prior to the formation of HNC, he founded
and managed the neurocomputer development and neural network applications
at TRW (1983--1986) and Motorola (1979--1983). He has been active in neural
network technology and neurocomputers since 1961 and earned his Ph.D. in
mathematics in 1974. He is currently a visiting lecturer in the Electrical
Engineering Department at the University of California at San Diego, and is
the author of influential technical reports and papers on neurocomputers,
neural networks, pattern recognition, signal processing algorithms, and
artificial intelligence.

MICHAEL JORDAN is an Assistant Professor of Brain and Cognitive Sciences at
MIT.  One of the key developers of the recurrent back propagation
algorithms, Professor Jordan's research is concerned with learning in
recurrent networks and with the use of networks as forward models in
planning and control. His interest in interdisciplinary research on neural
networks is founded in his training for a Bachelors degree in Psychology, a
Masters degree in Mathematics, and a Ph.D. in Cognitive Science from the
University of California at San Diego. He was a postdoctoral researcher in
Computer Science at the University of Massachusetts at Amherst before
assuming his present position at MIT.

             ----------------------------------------------------------

REGISTRATION FEE: Regular attendee--$950; full-time student--$250.
Registration fee includes five days of tutorials, course notebooks, one
reception, five continental breakfasts, five lunches, four dinners, daily
morning and afternoon coffee service, evening discussion sessions with
leading neural architects.

REGISTRATION: To register by telephone with VISA or MasterCard call (508)
649-9731 between 9:00AM--5:00PM (EST). To register by FAX, fill out the
registration form and FAX back to (508) 649-6926. To register by mail,
complete the registration form and mail with your full form of payment as
directed. Make check payable in U.S. dollars to "Boston University". See
below for Registration Form. To register by electronic mail, use the
address "rosenber@bu-tyng.bu.edu". On-site registration on a
space-available basis will take place from 2:00--7:00PM on Sunday, May 6
and from 7:00--8:00AM on Monday, May 7, 1990. A RECEPTION will be held from
4:00--7:00PM on Sunday, May 6.  LECTURES begin at 8:00AM on Monday, May 7
and conclude at 12:30PM on Friday, May 11.

STUDENT FELLOWSHIPS supporting travel, registration, and lodging for the
Course and the Research Conference are available to full-time graduate
students in a PhD program. Applications must be postmarked by March 1,
1990.  Send curriculum vitae, a one-page essay describing your interest in
neural networks, and a letter from a faculty advisor to: Student
Fellowships, Neural Networks Course, Wang Institute of Boston University,
72 Tyng Road, Tyngsboro, MA 01879.

CNS FELLOWSHIP FUND: Net revenues from the course will endow fellowships
for Ph.D. candidates in the CNS Graduate Program. Corporate and individual
gifts to endow CNS Fellowships are also welcome. Please write: Cognitive
and Neural Systems Fellowship Fund, Center for Adaptive Systems, Boston
University, 111 Cummington Street, Boston, MA 02215.

- -----------------------------------------------------------------------


                  REGISTRATION FOR COURSE AND RESEARCH CONFERENCE

Course: Neural Network Foundations and Applications, May 6--11, 1990

Research Conference: Neural Networks for Automatic Target Recognition, 
May 11--13, 1990 


NAME: _________________________________________________________________

ORGANIZATION (for badge): _____________________________________________

MAILING ADDRESS: ______________________________________________________

                 ______________________________________________________

CITY/STATE/COUNTRY: ___________________________________________________

POSTAL/ZIP CODE: ______________________________________________________

TELEPHONE(S): _________________________________________________________




     COURSE                           RESEARCH CONFERENCE
     ------                           -------------------

     [    ] regular attendee $950     [   ] regular attendee $90 
     [    ] full-time student $250    [   ] full-time student $70
     (limited number of spaces)       (limited number of spaces)

                [   ] Gift to CNS Fellowship Fund 


TOTAL PAYMENT: $________

FORM OF PAYMENT:
     [    ] check or money order (payable in U.S. dollars to Boston University) 
     [    ] VISA  [   ] MasterCard 

Card Number:      ______________________________________________

Expiration Date:  ______________________________________________

Signature:        ______________________________________________



Please complete and mail to: 
Neural Networks 
Wang Institute of Boston University 
72 Tyng Road 
Tyngsboro, MA 01879 USA 

To register by telephone, call: (508) 649-9731. 


HOTEL RESERVATIONS: Room blocks have been reserved at 3 hotels near the
Wang Institute. Hotel names, rates, and telephone numbers are listed below.
A shuttle bus will take attendees to and from the hotels for the Course and
Research Conference.  Attendees should make their own reservations by
calling the hotel. The special conference rate applies only if you mention
the name and dates of the meeting when making the reservations.

Sheraton Tara       Red Roof Inn          Stonehedge Inn 
Nashua, NH          Nashua, NH            Tyngsboro, MA 
(603) 888-9970      (603) 888-1893        (508) 649-4342 
$70/night+tax       $39.95/night+tax      $89/night+tax 

The hotels in Nashua are located approximately 5 miles from the Wang
Institute. A shuttle bus will be provided.




------------------------------

Subject: International Symposium on AI and Math (Second Announcement)
From:    hector@maui.cs.ucla.edu (Hector A Geffner)
Organization: UCLA Computer Science Department
Date:    11 Nov 89 21:23:50 +0000 


                 ... Second Announcement ...

   International Symposium on Artificial Intelligence and Mathematics

                        January 3-5, 1990

                          Pier 66  Hotel
                     Fort Lauderdale, Florida

 Program Chair:  Martin Golumbic    martygo@yktvmh.bitnet

 Organizing Chair:  Frederick Hoffman  hoffmanf@servax.bitnet



 Keynote speaker:

    David Mumford, Harvard University
         "Finding Discrete Structure in a Noisy Analogue World"

 Invited Hour Speakers:

    Martin Davis, Courant Institute, NYU
         "In Defence of First Order Logic"

    Zohar Manna, Stanford University
         "Automated Deduction--Techniques and Applications"

    Drew McDermott, Yale University
         "Numerical Methods in Artificial Intelligence"

    Alan Robinson, Syracuse University
         "Artificial and Natural Proofs in Mathematics"

    Leslie Valiant, Harvard University
         "Computational Learning Theory"

 Special Session on Logic and Artificial Intelligence:

    Howard Blair (Syracuse University)
    Allen Brown (Xerox Webster Center)
    Michael Gelfond (University of Texas)
    Wiktor Marek (Cornell University and University of Kentucky)
    Anil Nerode (Cornell University)
    John Schlipf (University of Cincinnati)
    Mirek Truszczynski (University of Kentucky)
    Duminda Wijesekera (Cornell University)

 Session talks:

    Peter Caines (McGill) and S. Wang
         On the complexity of classical and logic-based
         observer-controllers for partially observed automata

    H. Geffner and J. Pearl (UCLA)
         Ranking and Priorities in Default Reasoning
  
    Horty and Thomason (CMU)
         The Mathematics of Inheritance

    H. Kautz and B. Selman
         The tractability of path-based inheritance

    L. Allison, C.S. Wallace and C.N. Yee (Monash)
         When is a string like a string?

    E. Kounalis and M. Rusinowitch
         A mechanization of conditional reasoning

    A. Tuzhilin and Z. Kedem  (NYU)
         Modelling and querying databases evolving in time

    Ken McAloon
         Small CLP languages and search problems

    E. Boros, P.L. Hammer (Rutgers) and J.N. Hooker (CMU)
         Boolean regression

    William McCune and Larry Wos (Argonne)
         Applications of automated reasoning to problems in
         mathematics and logic

    Michael Kearns (MIT)
         to be announced

    Chengqi Zhang and Maria Olowska (Univ. of Queensland)
         Homomorphic transformation among inexact reasoning
         models in distributed systems

    Michael Sims (NASA AMES)
         Control of mathematical discovery

    Elisha Sacks (Princeton)
         Automatic qualitative analysis of one-parameter planar ODEs
         by intelligent numeric simulation

    J-L. Lassez (IBM Research)
         Querying systems of linear constraints

    Michael Maher (IBM Research)
         to be announced

    G. Manzini (MIT) and M. Somalvico
         Probabilistic performance analysis of heuristic search using
         parallel hash tables

    P. Van Hentenryck and T. Graf
         Standard forms for rational linear arithmetic
         in constraint logic programming

    P. Bhattacharya and K. Qian (Univ. of Nebraska)
         A parallel algorithm for skeletonization of binary or
         gray-scale images

    Xiaodi Sun (705 Xidian University, PR China)
         New Approaches for the Treatment of Uncertainty in AI

    Henry W. Davis (Wright State Univ.) and S.V. Chenoweth
         Discrepancy analysis: A new approach for understanding the
         asymptotic complexity of A* tree search

    Mary McLeish (Univ. of Guelph)
         A theory for the use of production systems with conflicting
         evidence in probabilistic logic

    Brian G. Patrick, Mohammed Almulla and Monroe M. Newborn
         An upper bound on the complexity of iterative-deepening-A*

                       Approach of the Symposium

     The symposium will be of interest to an audience of both
     computer scientists and mathematicians.  The International
     Symposium on Artificial Intelligence and Mathematics is the
     first of a biennial series featuring applications of
     mathematics in artificial intelligence as well as artificial
     intelligence techniques and results in mathematics.  There has
     always been a strong relationship between the two disciplines;
     however, contact between practitioners of each has been
     limited, partly by the lack of a forum in which the
     relationship could grow and flourish.  This symposium,
     alternating with the existing series of Workshops on Statistics
     and AI, represents a small step towards improving contacts and
     promoting cross-fertilization between the two areas.  Full
     length versions of selected papers will be published in the
     series Annals of Mathematics and Artificial Intelligence.

                                Sponsors

     The symposium is sponsored by Florida Atlantic University and
     IJCAII.  Additional funding is pending.  Partial travel
     subsidies may be available to young researchers.


                           Program Committee:

       Jean-Pierre Adam, IBM Paris Scientific Center
       Sanjaya Addanki, IBM Research
       Norman Foo, University of Sydney
       Mark Fox, Carnegie Mellon University
       William Gale, AT&T Bell Laboratories
       Peter Hammer, RUTCOR, Rutgers University
       Jean-Louis Lassez, IBM Research
       Hector Levesque, University of Toronto
       Wiktor Marek, University of Kentucky
       Anil Nerode, Cornell University
       Christos Papadimitriou, Univ. of Calif., San Diego
       Tomaso Poggio, MIT

                     Other members of the Editorial Board of
                "Annals of Mathematics and Artificial Intelligence"

       Woodrow Bledsoe, University of Texas
       Harvey Greenberg, University of Colorado at Denver
       Larry Henschen, Northwestern University
       Robert Hummel, Courant Institute, NYU
       Toshihide Ibaraki, Kyoto University
       R.C.T. Lee, National Tsing Hua University
       Jack Minker, University of Maryland
       Maurice Nivat, Universite de Paris
       Judea Pearl, University of Calif., Los Angeles
       F.J. Radermacher, Universitat Passau
       Michael Richter, Universitat Kaiserslautern
       Ronald Rivest, MIT
       Dana Scott, Carnegie Mellon
       Micha Sharir, Courant Institute, NYU
       Andrew Whinston, University of Texas
       H.P. Williams, University of Southampton

                         Functions and Lodging

      An early arrivals' reception (cash bar) on the evening of
      January 2 and a banquet on Thursday, January 4 are included
      in the registration fee.  Beverage service for morning and
      afternoon breaks is also included.

      A block of rooms has been reserved through December 1, 1989
      at the Pier 66 Hotel, Ft. Lauderdale, FL 33316, USA.  The
      rooms are available at the symposium rate of approximately
      $67.00 per night, single or double.  Reservations for this
      block must be made directly with the hotel by December 1,
      1989, mentioning the name of the symposium.  The hotel is 5
      minutes from the beach and can be reached by courtesy bus.

                             Registration

     Advanced registration can be done by mail, telephone
     (407-367-3099), FAX (407-367-3987) or email by December 15,
     1989.  The advanced registration fee is $130.00 ($60.00
     students) and can be paid by check or major credit card.
     Registration after December 15 will be $160.00 ($100.00
     students).  Fees must be paid in U.S. dollars and made payable
     to "Florida Atlantic University".  Refunds (less $10) will be
     honored only for cancellations received by December 15, 1989.

     For further information, contact:
        Prof. Frederick Hoffman,  Dept. of Mathematics,
        Florida Atlantic University, Boca  Raton,  Florida 33431 U.S.A.
            Telephone:  (407) 367-3345
            email:      hoffmanf@servax.bitnet

  ---------------------------------------------------------------------
                  REGISTRATION FORM   --    Please print

   International Symposium on Artificial Intelligence and Mathematics

         Name:

         Affiliation:

         Address:


         Electronic mail:

                 Advanced registration $130.00 (by Dec. 15, 1989)

                 Advanced student registration $60.00 (by Dec. 15, 1989)

                 Regular registration $160.00 (after Dec. 15, 1989)

                 Student registration $100.00 (after Dec. 15, 1989)

------------------------------

End of Neurons Digest
*********************