[ont.events] Neural Networks for Industry: A two-day tutorial

itrctor@csri.toronto.edu (Ron Riesenbach) (10/24/89)

                     INFORMATION TECHNOLOGY RESEARCH CENTRE

                                      and

                TELECOMMUNICATIONS RESEARCH INSTITUTE OF ONTARIO

                            are pleased to sponsor:


                             A Two-Day Tutorial on

            N E U R A L   N E T W O R K S   F O R   I N D U S T R Y


                                 Presented by:
                              Dr. Geoffrey Hinton


                           Regal Constellation Hotel
               900 Dixon Road (near Pearson International Airport)
                                Toronto, Ontario
                            December 12 and 13, 1989






Why Neural Networks?

     Serial computation has been very successful at tasks that can be character-
ized  by clean logical rules, but it has been much less successful at tasks like
real-world perception or common sense reasoning that typically require a massive
amount  of  uncertain evidence to be combined to reach a reliable decision.  The
brain is extremely good at these computations and there is now  a  growing  con-
sensus that massively parallel "neural" computation may be the best way to solve
these problems.

     The resurgence of interest in neural networks has been fuelled  by  several
factors.   Powerful  new  search  techniques such as simulated annealing and its
deterministic approximations can be embodied very naturally in  these  networks,
so  parallel hardware implementations promise to be extremely fast at performing
the best-fit searches required for  content-addressable  memory  and  real-world
perception.  Recently,  new  learning procedures have been developed which allow
networks to learn from examples. The learning procedures automatically construct
the  internal  representations  that the networks require in particular domains,
and so they may remove the need for explicit programming in ill-structured tasks
that  contain  a  mixture  of regular structure, partial regularities and excep-
tions.
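
     As a rough illustration of what "learning from examples" means here, the
short Python sketch below trains a tiny network with one layer of adaptive
hidden units on the exclusive-or problem using back-propagation.  It is not
taken from the course material; the network size, learning rate and number of
passes are arbitrary choices made only for this illustration.

# Minimal back-propagation sketch (illustrative only): a 2-3-1 network
# learns the exclusive-or function from its four training examples.
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])   # input patterns
t = np.array([[0.], [1.], [1.], [0.]])                   # target outputs (XOR)

W1 = rng.normal(scale=0.5, size=(2, 3))   # input  -> hidden weights
b1 = np.zeros(3)
W2 = rng.normal(scale=0.5, size=(3, 1))   # hidden -> output weights
b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for epoch in range(20000):
    h = sigmoid(X @ W1 + b1)              # forward pass: hidden unit activities
    y = sigmoid(h @ W2 + b2)              # forward pass: output unit activity
    dy = (y - t) * y * (1 - y)            # error signal at the output unit
    dh = (dy @ W2.T) * h * (1 - h)        # error back-propagated to hidden units
    W2 -= lr * (h.T @ dy); b2 -= lr * dy.sum(axis=0)
    W1 -= lr * (X.T @ dh); b1 -= lr * dh.sum(axis=0)

print(np.round(y, 2))   # should be close to 0, 1, 1, 0 once training converges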

     There has also been considerable progress in developing ways of  represent-
ing complex, articulated structures in neural networks. The style of representa-
tion is tailored to the computational abilities of the networks and  differs  in
important ways from the style of representation that is natural in serial
von Neumann machines.  It allows networks to be damage resistant, which makes
it much easier to build massively parallel networks.
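
     As a simple illustration of that damage resistance (again not drawn from
the course material; the sizes and damage level are arbitrary), the sketch
below codes each of ten items as a distributed pattern over one hundred units
and shows that a best-fit match still retrieves the right item after many of
the units are silenced.

# Distributed codes are damage resistant: each item is spread over many units,
# so deleting a sizeable fraction of them rarely changes which stored pattern
# matches best.  Sizes and the damage level are illustrative choices only.
import numpy as np

rng = np.random.default_rng(1)
n_items, n_units = 10, 100
patterns = rng.choice([-1.0, 1.0], size=(n_items, n_units))  # distributed codes

probe = patterns[3].copy()
dead = rng.choice(n_units, size=40, replace=False)   # silence 40% of the units
probe[dead] = 0.0

best = int(np.argmax(patterns @ probe))   # best-fit (most similar) stored item
print(best)   # almost certainly still item 3 despite the damage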




Who Should Attend

     This tutorial is directed at Industry Researchers and  Managers  who  would
like to understand the basic principles underlying the recent progress in neural
network research.  Some impressive applications of neural networks to real-world
problems  already exist, but there are also many over-enthusiastic claims and it
is hard for the non-expert to distinguish between genuine  results  and  wishful
thinking.   The  tutorial will explain the main learning procedures and show how
these are used effectively  in  current  applications.  It  will  also  describe
research  in  progress  at various laboratories that may lead to better learning
procedures in the future.

     At the end of the tutorial attendees will understand the current  state-of-
the-art  in neural networks and will have a sound basis for understanding future
developments in this important technology.  Attendees will also learn the  major
limitations  of existing techniques and will thus be able to distinguish between
real progress and grandiose claims.  They will then be in  a  position  to  make
informed decisions about whether this technology is currently applicable, or may
soon become applicable, to specific problems in their area of interest.



                              Overview of the Tutorial


EARLY NEURAL NETWORKS & THEIR LIMITATIONS

 Varieties of Parallel Computation;  Alternative Paradigms for Computation

 A Comparison of Neural Models and Real Brains:  The Processing Elements and the
 Connectivity

 Major Issues in Neural Network Research

 The Least Mean Squares Learning Procedure: Convergence Rate, Practical Applica-
 tions and Limitations

 The Perceptron Convergence Procedure and the Limitations of Perceptrons

 The Importance of Adaptive "Hidden Units"



BACK-PROPAGATION LEARNING: THE THEORY & SIMPLE EXAMPLES

 The Back-Propagation Learning Procedure

 The NETtalk Example

 Extracting the Underlying Structure of a Domain:  The Family Trees Example

 Generalizing from Limited Training Data:  The Parity Function

 Theoretical guarantees on the generalization abilities of neural nets

 Improving generalization by encouraging simplicity



SUCCESSFUL APPLICATIONS  OF BACK-PROPAGATION LEARNING

 Sonar Signal Interpretation

 Finding Phonemes in Spectrograms Using Time-Delay Nets

 Hand-written character recognition

 Bomb detection

 Adaptive interfaces for controlling complex physical devices

 Promising Potential  Applications




IMPROVEMENTS, VARIATIONS & ALTERNATIVES TO BACK-PROPAGATION

 Ways of Optimizing the Learning Parameters for Back-Propagation

 How the Learning Time Scales with the Size of the Task

 Back-Propagation in Recurrent Networks for Learning Sequences

 Using Back-Propagation with Complex Post-Processing

 Self-Supervised Back-Propagation

 Pre-Processing the Input to Facilitate Learning

 Comparison with Radial Basis Functions



UNSUPERVISED LEARNING PROCEDURES

 Competitive Learning for discovering clusters

 Kohonen's Method of  Constructing  Topographic  Maps:  Applications  to  Speech
 Recognition

 Linsker's method of learning by extracting principal components

 Using spatio-temporal coherence as an internal teacher

 Using spatial coherence to learn to recognize shapes




ASSOCIATIVE MEMORIES, HOPFIELD NETS & BOLTZMANN MACHINES

 Linear Associative Memories:  Inefficient  One-Pass  Storage  Versus  Efficient
 Iterative Storage

 Early Non-Linear Associative Memories:  Willshaw Nets

 Coarse-coding and Kanerva's Sparse Distributed Memories

 Hopfield Nets and their Limitations

 Boltzmann Machines, Simulated Annealing and Stochastic Units

 Relationship of Boltzmann Machines to Bayesian Inference




MEAN FIELD NETWORKS

 Appropriate Languages and Computers for Software Simulators

 Predictions of Future Progress in the Theory and Applications of Neural Nets



GUEST LECTURE

Neural Signal Processing, by Dr. Simon Haykin, Director, Communications Research
Laboratory, McMaster University, Hamilton, Ontario.

     In this talk Dr. Haykin will present the results of neural signal  process-
ing  research  applied  to  radar-related  problems.   The algorithms considered
include (a) the back-propagation algorithm, (b) the Kohonen feature map, and (c)
the Boltzmann machine.  The radar data bases used in the study include ice-radar
as encountered in the Arctic, and air traffic control primary radar.  The neural
processing is performed on the Warp systolic machine, which is illustrative of a
massively parallel computer.



                                Seminar Schedule

    Tuesday, December 12, 1989                  Wednesday, December 13, 1989


 8:00 a.m.  Registration and Coffee          8:00 a.m.  Coffee

 9:00       Opening words: Mike Jenkins,     9:00       Tutorial Session #5
            Exec. Director, ITRC and Peter
            Leach, Exec. Director, TRIO

 9:15       Tutorial Session #1             10:30       Break

10:30       Break                           11:00       Tutorial Session #6

11:00       Tutorial Session #2             12:30 p.m.  Lunch

12:30 p.m.  Lunch                            2:00       Tutorial Session  #7

 2:00       Tutorial Session  #3             3:30       Break

 3:30       Break                            4:00       Guest lecture:  Dr.
                                                        Simon Haykin, "Neural
                                                        Signal Processing"

 4:00       Tutorial Session #4              5:00       Closing words

 5:30       Wine and Cheese reception





Registration and Fees:

     The tutorial fee is $100 for employees of companies that are members of
ITRC's Industrial Affiliates Program or of TRIO.  The fee for non-members is
$375/person.  Payment can be made by Visa, MasterCard, AMEX or by cheque
(payable to: "Information Technology Research Centre").  Due to limited space,
ITRC and TRIO members will have priority in case of over-subscription.  ITRC
and TRIO reserve the right to limit the number of registrants from any one
company.

     The fee includes a copy of the course notes and transparencies, coffee
and light refreshments at the breaks, a luncheon each day, and an informal
wine and cheese reception on Tuesday evening.  Participants are responsible
for their own hotel accommodation, reservations and costs, including hotel
breakfast, evening meals and transportation.  PLEASE MAKE YOUR HOTEL
RESERVATIONS EARLY:

                           Regal Constellation Hotel
                                 900 Dixon Road
                               Etobicoke, Ontario
                                    M9W 1J7
                           Telephone: (416) 675-1500
                                Telex: 06-989511
                              Fax: (416) 675-1737

Registrations will be accepted up to and including the day of the event;
however, due to limited space, attendees who register by December 6th will
have priority over late registrants.  All cancellations after December 6th
will result in a $50 cancellation fee.

     To register, complete the registration form attached to the end of this
message, then mail or fax it to either of the two sponsors.




Dr. Geoffrey E. Hinton

     Geoffrey Hinton is Professor of  Computer  Science  at  the  University  of
Toronto,  a fellow of the Canadian Institute for Advanced Research and a princi-
pal researcher with the Information Technology Research Centre.  He received his
PhD  in  Artificial  Intelligence from the University of Edinburgh.  He has been
working on computational models of neural networks for the  last  fifteen  years
and has published 55 papers and book chapters on applications of neural networks
in vision, learning, and knowledge representation.  These  publications  include
the  book  "Parallel Models of Associative Memory" (with James Anderson) and the
original papers on distributed representations, on Boltzmann machines (with Ter-
rence  Sejnowski), and on back-propagation (with David Rumelhart and Ronald Wil-
liams).  He is also one of the  major  contributors  to  the  recent  collection
"Parallel Distributed Processing" edited by Rumelhart and McClelland.

     Dr. Hinton was formerly an  Associate  Professor  of  Computer  Science  at
Carnegie-Mellon University where he created the connectionist research group and
was responsible for the graduate course on  "Connectionist  Artificial  Intelli-
gence".   He  is on the governing board of the Cognitive Science Society and the
governing council of the American Association for Artificial  Intelligence.   He
is  a  member  of  the editorial boards of the journals Artificial Intelligence,
Machine Learning, Cognitive Science, Neural Computation and Computer Speech  and
Language.

     Dr. Hinton is an expert at explaining neural network  research  to  a  wide
variety of audiences.  He has given invited lectures on the research at numerous
international conferences and workshops, and has twice co-organized  and  taught
at the Carnegie-Mellon "Connectionist Models Summer School".  He has given three
three-day industrial tutorials in the United States for the Technology  Transfer
Institute.   He has also given tutorials at AT&T Bell labs, at Apple, and at two
annual meetings of the American Association for Artificial Intelligence.


Dr. Simon Haykin

     Simon Haykin received his B.Sc. (First-Class Honours)  in  1953,  Ph.D.  in
1956,  and  D.Sc.  in 1967, all in Electrical Engineering from the University of
Birmingham, England.  In 1980, he was elected Fellow of  the  Royal  Society  of
Canada.   He is co-recipient of the Ross Medal from the Engineering Institute of
Canada  and  the  J.J.  Thomson  Premium  from  the  Institution  of  Electrical
Engineers,  London.   He was awarded the McNaughton Gold Medal, IEEE (Region 7),
in 1986.  He is a Fellow of the IEEE.

     He is presently Director of the Communications Research Laboratory and Pro-
fessor  of Electrical and Computer Engineering at McMaster University, Hamilton,
Ontario.  His research interests include  image  processing,  adaptive  filters,
adaptive detection, and spectrum estimation with applications to radar.





 ----------------------------- Registration Form -----------------------------


                         Neural Networks for Industry
                          Tutorial by Geoffrey Hinton
                              December 12-13, 1989
                       Regal Constellation, 900 Dixon Rd.



Name          _________________________________________

Title         _________________________________________

Organization  _________________________________________

Address       _________________________________________

              _________________________________________

              _________________________________________

Postal Code   _______________________

Telephone     __________________       Fax   ___________________

E-mail        _______________________


Registration Fee (check one):


_  ITRC/TRIO Members - $100
_  Non-members       - $375


Method of Payment (check one):

_  Cheque                        (Make cheques payable to "Information
                                  Technology Research Centre")

_  VISA                          Card Number _________________________
_  MasterCard          ==>       Expiration Date _____________________
_  American Express              Surname _____________________________
                                 Signature ___________________________

Please note:  There will be a $50 cancellation charge after December 6/89.


Please fax or mail your registration to ITRC or TRIO:

  ITRC, Rosanna Reid              TRIO, Debby Sullivan
  203 College St., Suite 303      300 March Rd., Suite 205
  Toronto, Ontario, M5T 1P9       Kanata, Ontario, K2K 2E2
  Phone (416) 978-8558            Phone  (613) 592-9211
  Fax   (416) 978-8597            Fax    (613) 592-8163


               PRIORITY REGISTRATION DEADLINE:  DECEMBER 6/89.


------------------------------------------------------------------------------