itrctor@csri.toronto.edu (Ron Riesenbach) (10/19/90)
Neural Networks for Industry
from fundamentals to current research
presented by:
Dr. Geoffrey Hinton
Sponsored by:
Information Technology Research Centre
and
PRECARN Associates Inc.
December 11 and 12, 1990
Regal Constellation Hotel
900 Dixon Rd.
(near the Lester B. Pearson Airport)
Toronto, Ontario
1. Why Neural Networks?
Serial computation has been very successful at tasks that can be
characterized by clean logical rules. It has been much less successful at
tasks like real-world perception or common sense reasoning. These tasks
typically require massive amounts of uncertain evidence to be combined in
complex ways to reach reliable decisions. The brain is extremely good at
these computations and there is now a growing consensus that massively
parallel "neural" computation may be the best way to solve these problems.
The resurgence of interest in neural networks has been fuelled by
several factors. Powerful new search techniques such as simulated annealing
and its deterministic approximations can be embodied very naturally in
these networks. As such, parallel hardware implementations promise to be
extremely fast at performing the best-fit searches required for content-
addressable memory and real-world perception. Recently, new learning pro-
cedures have been developed which allow networks to learn from examples.
The learning procedures automatically construct the internal representa-
tions that the networks require in particular domains, and so they may
remove the need for explicit programming in ill-structured tasks that con-
tain a mixture of regular structure, partial regularities and exceptions.
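The "learning from examples" idea above can be made concrete with the Least Mean Squares procedure that opens the Day 1 tutorial. The sketch below is illustrative only (the toy data and learning rate are invented, not taken from the course): a single linear unit repeatedly nudges its weights toward each training example.

```python
# A single linear unit trained with the LMS (delta) rule: for each example,
# move the weights a small step down the gradient of the squared error.
def lms_train(examples, lr=0.1, epochs=200):
    """examples: list of (input_vector, target) pairs."""
    n = len(examples[0][0])
    w = [0.0] * n        # weights
    b = 0.0              # bias
    for _ in range(epochs):
        for x, t in examples:
            y = sum(wi * xi for wi, xi in zip(w, x)) + b
            err = t - y
            # Weight update proportional to error times input.
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

# Invented toy task: learn t = x1 + x2, which a linear unit can represent.
data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 2)]
w, b = lms_train(data)
```

A task like parity, by contrast, has no linear solution at all; that is why hidden units, and learning procedures such as backpropagation that can train them, are needed.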
2. Who Should Attend?
Day 1 is a tutorial directed at Industry Researchers and Managers who
would like to understand the basic principles underlying neural networks.
The tutorial will explain the main learning procedures and show how these
are used effectively in current applications. No previous exposure to
neural networks is necessary although a degree in computer science or
electrical engineering (or equivalent experience) is desirable.
Day 2 is an overview of advances made in this field in the last two or
three years. Research in progress at various laboratories will be reviewed.
This overview of how recent developments may lead to better learning proce-
dures in the future will be best appreciated by industry researchers and
managers who have some experience in this field or who have attended the
tutorial the previous day.
     Those attending both days can expect to gain an understanding of the
current state of the art in neural networks and be in a position to make
informed decisions about whether this technology is currently applicable,
or may soon become applicable, to specific problems in their area of
interest.
DAY 1
INTRODUCTION
o Computers versus brains
o The hardware of the brain
o Cooperative computation
o The Least Mean Squares learning procedure
o The perceptron paradigm
o Why hidden units are needed
o Varieties of learning procedure
o Competitive learning
o Learning topographic maps
BACKPROPAGATION LEARNING AND SIMPLE APPLICATIONS
o The backpropagation algorithm
o The NetTalk example
o The family trees example
o The parity example
o Theoretical results on generalization
o Simplicity and generalization
o Detecting bombs in suitcases
o Following a road with an autonomous land vehicle
BACKPROPAGATION: COMPLEX APPLICATIONS AND VARIATIONS
o Recognizing phonemes in spectrograms
o Recognizing hand-printed digits
o Alternative error functions
o Converting hand movements into speech
o Medical diagnosis
o What makes an application feasible
o The speed of convergence and ways to improve it
o Coding the input for backpropagation
o Self-supervised backpropagation
HOPFIELD NETS, BOLTZMANN MACHINES, AND MEAN FIELD NETS
o Binary Hopfield nets and their limitations
o Simulated annealing for escaping local energy minima
o Boltzmann Machines
o The Boltzmann machine learning procedure and its limitations
o Mean field networks for faster search
o Application to the travelling salesman problem
o A learning procedure for mean field nets
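As a taste of the backpropagation material above, here is a minimal sketch (not taken from the course notes) of gradient-descent training on the two-bit parity (XOR) task named in the outline. The network size, learning rate, and epoch count are arbitrary illustrative choices.

```python
import math, random

random.seed(0)  # fixed seed so the illustrative run is repeatable

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# A 2-input, 4-hidden-unit, 1-output network of sigmoid units.
# The last entry of each weight list is the unit's bias.
w_h = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(4)]
w_o = [random.uniform(-1, 1) for _ in range(5)]

data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]  # 2-bit parity

def forward(x):
    h = [sigmoid(w[0] * x[0] + w[1] * x[1] + w[2]) for w in w_h]
    y = sigmoid(sum(w_o[i] * h[i] for i in range(4)) + w_o[4])
    return h, y

def total_error():
    return sum((forward(x)[1] - t) ** 2 for x, t in data)

err_before = total_error()
lr = 2.0
for _ in range(5000):
    for x, t in data:
        h, y = forward(x)
        d_o = (y - t) * y * (1 - y)  # error term at the output unit
        # Propagate the error term back through each hidden unit.
        d_h = [d_o * w_o[i] * h[i] * (1 - h[i]) for i in range(4)]
        for i in range(4):
            w_o[i] -= lr * d_o * h[i]
            w_h[i][0] -= lr * d_h[i] * x[0]
            w_h[i][1] -= lr * d_h[i] * x[1]
            w_h[i][2] -= lr * d_h[i]
        w_o[4] -= lr * d_o
err_after = total_error()
```

The hidden units discover an internal representation of parity that no single-layer net can form, which is the central point of the "Why hidden units are needed" session.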
DAY 2
RADIAL BASIS FUNCTIONS AND COMMUNITIES OF LOCAL EXPERTS
o Radial Basis Functions
o Relation to kernel methods in statistics
o Relation to Kanerva memories
o Application to predicting chaotic series
o Application to shape recognition
o Using soft competitive learning to adapt radial basis functions
o The elastic net
o Relation to mixtures of gaussians
o Communities of expert networks
o Relation to mixture models
o Applications to vowel recognition
MORE UNSUPERVISED LEARNING METHODS
o Finding multimodal projections of high dimensional data
o Application to adaptive modems
o Application to discovering important features of vowels
o Preserving information about the input with limited channel capacity
o Using coherence assumptions to discover spatial or temporal invariants
o Applications to stereo fusion and shape recognition
o Implementation in a new type of Boltzmann machine
NETWORKS FOR MODELLING SEQUENCES
o Backpropagation in recurrent networks
o Recurrent networks for predicting the next term in a sequence
o Using predictive networks for data-compression
o Ways to restrict recurrent networks
o Applications of recurrent networks to sentence understanding
o Hidden Markov Models and the Baum-Welch training procedure
o Combining HMM's with feedforward networks
o Implementing HMM recognizers in feedforward networks
o Reinforcement learning and the temporal credit assignment problem
o Recent developments in learning good action sequences
MISCELLANEOUS RECENT DEVELOPMENTS
o Neural networks for solving very large optimization problems
o Neural networks in non-linear controllers
o Better networks for hand-printed character recognition
o Why local minima are not fatal for backpropagation
o Why the use of a validation set improves generalization
o Adding hidden units incrementally
o Polynomial nets
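To give a flavour of the radial basis function material that opens Day 2, the following sketch (illustrative only; the sample points and width are invented) fits Gaussian bumps centred on the training points by solving the exact-interpolation linear system.

```python
import math

def gauss(x, c, width=1.0):
    # A Gaussian "bump" centred at c.
    return math.exp(-((x - c) ** 2) / (2 * width ** 2))

def rbf_fit(xs, ts, width=1.0):
    """Solve Phi w = t (one centre per example) by Gaussian elimination."""
    n = len(xs)
    # Augmented matrix [Phi | t].
    A = [[gauss(xs[i], xs[j], width) for j in range(n)] + [ts[i]]
         for i in range(n)]
    for col in range(n):                       # forward elimination w/ pivoting
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n + 1):
                A[r][c] -= f * A[col][c]
    w = [0.0] * n
    for r in range(n - 1, -1, -1):             # back substitution
        w[r] = (A[r][n] - sum(A[r][c] * w[c]
                              for c in range(r + 1, n))) / A[r][r]
    return w

def rbf_eval(x, xs, w, width=1.0):
    return sum(wi * gauss(x, c, width) for wi, c in zip(w, xs))

xs = [0.0, 1.0, 2.0, 3.0]
ts = [0.0, 0.8, 0.9, 0.1]   # invented samples of some smooth curve
w = rbf_fit(xs, ts)
```

Because each basis function responds only near its centre, the fit is local: this is what relates RBF nets to the kernel methods and Kanerva memories listed above, and what soft competitive learning adapts when the centres themselves are learned.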
3. Seminar Schedule
Tuesday, December 11, 1990 Wednesday, December 12, 1990
8:00-9:00 Registration and Coffee 8:00-9:00 Registration and Coffee
9:00-9:05 Opening words 9:00-9:05 Opening words
9:05-10:30 Tutorial Session #1 9:05-10:30 Advanced Session #1
10:30-11:00 Break 10:30-11:00 Break
11:00-12:30 Tutorial Session #2         11:00-12:30 Advanced Session #2
12:30-2:00 Lunch 12:30-2:00 Lunch
2:00-3:30 Tutorial Session #3 2:00-3:30 Advanced Session #3
3:30-4:00 Break 3:30-4:00 Break
4:00-5:30 Tutorial Session #4 4:00-5:30 Advanced Session #4
5:30-6:30 Wine and Cheese reception 5:30 Closing Words
4. Registration and Fees:
     Fees are based on the attendee's affiliation. Employees of companies
that belong to ITRC's Industrial Affiliates Program or to PRECARN pay a
subsidized fee of $100/day. Non-members' fees are $400/day. There is no
discount for attending both days.
Payment should be made by cheque (Payable to: "Information Technology
Research Centre") and should accompany the registration form where possible.
Due to limited space, ITRC and PRECARN members will have priority in case of
over-subscription. ITRC and PRECARN reserve the right to limit the number
of registrants from any one company.
Fees include a copy of the course notes and transparencies, coffee and
light refreshments at the breaks, a luncheon each day as well as an infor-
mal wine and cheese reception Tuesday evening (open to registrants of
either day). Participants are responsible for their own hotel accommoda-
tion, reservations and costs, including hotel breakfast, evening meals and
transportation. PLEASE MAKE YOUR HOTEL RESERVATIONS EARLY:
Regal Constellation Hotel
900 Dixon Road
Etobicoke, Ontario
M9W 1J7
Telephone: (416) 675-1500
Telex: 06-989511
Fax: (416) 675-1737
When reserving hotel accommodation please mention ITRC/PRECARN for
special room rates.
Registrations will be accepted up to 5:00pm December 6, 1990. Late
registrations may be accepted but, due to limited space, attendees who
register by December 6th will have priority over late registrants.
     To register, complete the registration form below, then mail or fax it
to either of the following two offices:
     ITRC, Rosanna Reid                  PRECARN, Charlene Ferguson
     University of Toronto               30 Colonnade Rd., Suite 300
     D.L. Pratt Bldg., Rm. 286           Nepean, Ontario K2E 7J6
     6 King's College Rd.                Phone: (613) 727-9576
     Toronto, Ontario M5S 1A1            Fax: (613) 727-5672
     Phone: (416) 978-8558
     Fax: (416) 978-7207
5. Biography
Dr. Geoffrey E. Hinton (instructor)
Geoffrey Hinton is Professor of Computer Science at the University of
Toronto, a Fellow of the Canadian Institute for Advanced Research, a prin-
cipal researcher with the Information Technology Research Centre and a pro-
ject leader with the Institute for Robotics and Intelligent Systems (IRIS).
He received his PhD in Artificial Intelligence from the University of Edin-
burgh. He has been working on computational models of neural networks for
the last fifteen years and has published 70 papers and book chapters on
applications of neural networks in vision, learning, and knowledge
representation. These publications include the book "Parallel Models of
Associative Memory" (with James Anderson) and the original papers on dis-
tributed representations, on Boltzmann machines (with Terrence Sejnowski),
and on back-propagation (with David Rumelhart and Ronald Williams). He is
also one of the major contributors to the recent collection "Parallel Dis-
tributed Processing" edited by Rumelhart and McClelland.
Dr. Hinton was formerly an Associate Professor of Computer Science at
Carnegie-Mellon University where he created the connectionist research
group and was responsible for the graduate course on "Connectionist Artifi-
cial Intelligence". He is on the governing board of the Cognitive Science
Society and the governing council of the American Association for Artifi-
cial Intelligence. He is a member of the editorial boards of the journals
Artificial Intelligence, Machine Learning, Cognitive Science, Neural Compu-
tation and Computer Speech and Language.
Dr. Hinton is an expert at explaining neural network research to a
wide variety of audiences. He has given invited lectures on the research
at numerous international conferences, workshops, and summer schools. He
has given industrial tutorials in the United States for the Technology
Transfer Institute, AT&T Bell labs, Apple, Digital Equipment Corp., and
the American Association for Artificial Intelligence.
-------------------------Registration Form -------------------
Neural Networks for Industry
Tutorial by Geoffrey Hinton
December 11-12, 1990
Regal Constellation, 900 Dixon Rd.
Name _________________________________________
Title _________________________________________
Organization _____________________________________
Address _________________________________________
_________________________________________
_________________________________________
Postal Code ___________
Telephone __________________ Fax _________________
Registration Fee (check those that apply):
Day 1 Day 2 Total
------- ------- -------
ITRC or PRECARN member __ $75 __ $75 ________
Non-member __ $250 __ $250 ________
Please make cheques payable to Information Technology Research Centre
REGISTRATION DEADLINE: DECEMBER 6/90.
(Space is limited so register early)
Please fax or mail your registration to ITRC or PRECARN:
ITRC, Rosanna Reid PRECARN, Charlene Ferguson
University of Toronto 30 Colonnade Rd., Suite 300
D.L. Pratt Bldg., Rm. 286 Nepean, Ontario
6 King's College Rd. K2E 7J6
Toronto, Ontario M5S 1A1 phone (613) 727-9576
phone (416) 978-8558 fax (613) 727-5672
fax (416) 978-7207
--------------------------------------------------------------