neuron-request@HPLMS2.HPL.HP.COM ("Neuron-Digest Moderator Peter Marvit") (11/09/90)
Neuron Digest Thursday, 8 Nov 1990
Volume 6 : Issue 65
Today's Topics:
Info Available on AI/CogSci Program
Re: Neuron Digest V6 #64
Transputer Implementations References?
For NIPS*90 Presenters!
JNNS'90 program and the mailing list (long)
Please consider this tutorial announcement for posting
Send submissions, questions, address maintenance and requests for old issues to
"neuron-request@hplabs.hp.com" or "{any backbone,uunet}!hplabs!neuron-request"
Use "ftp" to get old issues from hplpm.hpl.hp.com (15.255.176.205).
------------------------------------------------------------
Subject: Info Available on AI/CogSci Program
From: al@unix.cis.pitt.edu (Alan M Lesgold)
Organization: Univ. of Pittsburgh, Comp & Info Sys
Date: 04 Nov 90 15:50:27 +0000
The University of Pittsburgh's Intelligent Systems Program is a graduate
program leading to a PhD for students interested in artificial
intelligence and cognitive science. An interdisciplinary program, it has
strong ties to faculty in computer science, psychology, medical
informatics, linguistics, history and philosophy of science, and other
departments. Faculty research projects range over a variety of formal
computational approaches to cognitive science, with substantial interest
in applied as well as purely basic research. We welcome
inquiries from potential students interested in artificial intelligence
and cognitive science.
For more information, send a postal mailing address to
isp@unix.cis.pitt.edu
Alan Lesgold and Rich Thomason, Co-Directors
Intelligent Systems Program
University of Pittsburgh
------------------------------
Subject: Re: Neuron Digest V6 #64
From: Russ Eberhart <RCE1%APLVM.BITNET@CORNELLC.cit.cornell.edu>
Date: Tue, 06 Nov 90 17:27:45 -0500
In response to the request that prices be included with announcements of
new books:
Neural Network PC Tools: A Practical Guide
Eberhart & Dobbins, Eds.
Published by Academic Press, October 1990
Price: US $44.95
The book can be ordered toll-free at 1-800-321-5068 from the US or
Canada; from Missouri, Alaska, or Hawaii, call 1-314-528-8110.
The Table of Contents for this book was posted in a previous
issue of Neuron Digest.
------------------------------
Subject: Transputer Implementations References?
From: Ade Miller <ASM%ASTRONOMY.PHYSICS.SOUTHAMPTON.AC.UK@pucc.PRINCETON.EDU>
Date: Wed, 07 Nov 90 12:04:00 +0000
I have just started a PhD on applying neural networks to medical imaging
and to automated data rejection in astronomy. My main interests are back-
propagation networks: optimisation and learning, applications to the
above, and implementations on parallel hardware (especially Transputers).
Does anyone have any useful references on these topics, especially the
Transputer implementation side? (I have already looked at 'PDP', etc.)
Cheers,
Ade.
JANET address: ASM@UK.AC.SOTON.PHASTR
------------------------------
Subject: For NIPS*90 Presenters!
From: nips90 <nips-90@CS.YALE.EDU>
Date: Wed, 07 Nov 90 14:17:37 -0500
The NIPS*90 Conference is now less than 3 weeks away.
I would like to encourage all NIPS*90 presenters to consider bringing
videotapes of their work if available. We will most likely have a video
projection system set up in the main conference room for oral and poster
spotlight presenters. We will also have a limited number of televisions
and video cassette players available in the hall for the poster sessions.
If you would like to present a videotape (VHS format preferred), please
contact me as soon as possible so that I can reserve a machine for your
time slot or for your poster session.
Thanks much,
John Moody
NIPS*90 Program Chair
nips@cs.yale.edu
(203)432-1200 VOICE
(203)432-0593 FAX
------------------------------
Subject: JNNS'90 program and the mailing list (long)
From: kawahara@siva.ntt.jp
Date: Sat, 03 Nov 90 00:12:30 +0200
I have finally compiled a list of the titles presented at the first annual
conference of the Japan Neural Network Society. I hope this will give a
general picture of neural network research activities in Japan.
If you have any questions, please contact neuro-admin@tut.ac.jp, the group
that administers the Japanese neural network researchers' mailing list.
The mailing list is still in its infancy, and the traffic is still very
low. I hope this will change in the near future.
Hideki Kawahara
NTT Basic Research Laboratories
kawahara%siva.ntt.jp@RELAY.CS.NET
---------- cut here ----------
JNNS'90
The first annual conference of the Japan Neural Network Society
Tamagawa University, Tokyo, Japan
10-12 September, 1990
Presentation titles: (Original titles were in Japanese and were translated
into English by the original authors, except those marked '**', which were
translated by the editor of this list. 'O' stands for oral presentation
and 'P' for poster presentation.)
O1-1, A-BP "Another type of back propagation learning", Kazuhisa Niki
(Electrotechnical Lab.)
O1-2, Learning of Affine Invariance by Competitive Neural Network, Shuichi
Kurogi (Division of Control Engineering, Kyushu Institute of Technology)
O1-3, ** An Optimal Network Size Selection for Generalization based on
Cross Validation, Yasuhiro Wada and Mitsuo Kawato (ATR)
O1-4, Generalizing Neural Networks using Mean Curvature, Shin Suzuki and
Hideki Kawahara (NTT Basic Research Labs.)
O1-5, Evolution of Artificial Animal having Perceptron by Genetic
Algorithm, Masanori Ichinose and Tsutomu Hoshino (Institute of
Engineering Mechanics, University of Tsukuba)
O2-1, Neural Network Model of Self-Organization of Walking Patterns in
Insects, Shinichi Kimura, Masafumi Yano and Hiroshi Shimizu (Faculty of
Pharmaceutical Sciences, University of Tokyo)
O2-2, Learning Trajectory and Force Control of Human Arm Using
Feedback-Error-Learning Scheme, Masazumi Katayama and Mitsuo Kawato (ATR
Auditory and Visual Perception Research Laboratories)
O2-3, Formation of Optimal Hand Configuration to grasp an Object, Naohiro
Fukumura, Yoji Uno and Ryoji Suzuki (Department of Mathematical
Engineering and Information Physics, Faculty of Engineering, University
of Tokyo)
O2-4, Hybrid Control of Robotic Manipulator by Neural Network Model
(Variable learning of neural networks by Fuzzy set theory), Takanori
Shibata and Toshio Fukuda (Nagoya University) Masatoshi Tokita and
Toyokazu Mitsuda (Kisarazu Technical College)
O2-5, An Overview of Neurofuzzy System, Akira KAWAMURA, Nobuo WATANABE,
Yuri Owada, Ryusuke Masuoka and Kazuo Asakawa (Computer-based Systems
Laboratory, Fujitsu Laboratories Ltd.)
O3-1, ** Cognitive Effects caused by Random Movement of Wide-Field
Patterns and Response Characteristics of MST Cells of Macaque Monkey,
Masao Kaneko, Hiroshi Nakajima, Makoto Mizuno, Eiki Hida, Hide-aki Saito
and Minoru Tsukada (Faculty of Engineering, Tamagawa University)
O3-2, Extraction of Binocular Parallax with a Neural Network and 3-D
Surface Reconstruction, Mahito Fujii, Takayuki Ito, Toshio Nakagawa (NHK
Science and Technical Research Laboratories)
O3-3, A Model of 3-D Surface Depth Perception from Its Boundary Perceived
with Binocular Viewing, Masanori Idesawa (Riken: The Institute of
Physical and Chemical Research)
O3-4, Theory of Information Propagation, Tetsuya Takahashi (The Institute
of Physical and Chemical Research, Laboratory for Neural Networks)
O3-5, A model of the transformation of color selectivity in the monkey
visual system, Hidehiko Komatsu, Shinji Kaji, Shigeru Yamane
(Electrotechnical Laboratory, Neuroscience Section), Yoshie Ideura
(Komatsu Limited)
O4-1, Magical number in cognitive map and quantization of cognitive map
shape, Terunori Mori (Electrotechnical Laboratory)
O4-2, Generative representation of symbolic information in a pattern
recognition model "holovision", Hiroshi Shimizu and Yoko Yamaguchi
(Faculty of Pharmaceutical Sciences, University of Tokyo)
O4-3, Long-Term Potentiation to Temporal Pattern Stimuli in Hippocampal
Slices, Takeshi Aihara, Minoru Tsukada and Makoto Mizuno (Faculty of
Eng., Tamagawa Univ.), Hiroshi Kato and Haruyoshi Miyagawa (Dept. of
Physiol., Yamagata Univ.)
O4-4, Bidirectional Neural Network Model for the Generation and
Recognition of Temporal Patterns, Ryoko Futami and Nozomu Hoshimiya
(Department of Electrical Communications, Faculty of Engineering, Tohoku
University)
O4-5, Theta rhythm in hippocampus: phase control of information
circulation, Yoko Yamaguchi and Hiroshi Shimizu (Faculty of
Pharmaceutical Sciences, University of Tokyo)
O5-1, Stability and/or instability of limit cycle memories embedded in an
asynchronous neural network model, Toshinao Ishii and Wolfgang Banzhaf
(Central Research Laboratory, Mitsubishi Electric Corporation), Shigetoshi
Nara (Department of Electric and Electronic Engineering, Faculty of
Engineering, Okayama Univ.)
O5-2, Geometric analysis of the dynamics of associative memory networks,
Kenji Doya (Faculty of Engineering, University of Tokyo)
O5-3, On the Integration of Mapping and Relaxation, Kazuyoshi Tsutsumi
(Department of Mechanical and System Engineering, Faculty of Science and
Technology, Ryukoku Univ.)
P1-1, Neural network model on gustatory neurons in rat, Masaharu Adachi,
Eiko Ohshima, Kazuyuki Aihara and Makoto Kotani (Faculty of Engineering,
Tokyo Denki Univ.), Takanori Nagai (Faculty of Medicine, Teikyo Univ.),
Takashi Yamamoto (Faculty of Dentistry, Osaka Univ.)
P1-2, Learning Algorithm based on Temporal Pattern Discrimination in
Hippocampus, Minoru Tsukada and Takeshi Aihara (Tamagawa University), K.
Kato (Yamagata University)
P1-3, A study on Learning by Synapse patch group, Shuji Akiyama, Yukifumi
Shigematsu and Gen Matsumoto (Electrotechnical Laboratory, Molecular and
Cellular Neuroscience Section)
P1-4, A Model of the Mechanisms of Long-Term Depression in the
Cerebellum, Tatsuo Kitajima and Kenichi Hara (Faculty of Engineering,
Yamagata Univ.)
P1-5, ** Self-Organization in Neural Networks with Lateral Inhibition, Y.
Tamori, S. Inawashiro and Y. Musya (Faculty of Engineering, Tohoku
University)
P1-6, ** Receptive Fields by Self-Organization in Neural Networks,
S.Inawashiro, Y. Tamori and Y. Musya (Faculty of Engineering, Tohoku
University)
P1-7, ** Does Backpropagation Exist in Biological Systems? -- Discussions
and Considerations, Shyozo Yasui (Kyusyu Institute of Technology), Eiki
Hida (Tamagawa University)
P1-8, Accelerating the convergence of the error back-propagation
algorithm by deciding effective teacher signal, Yutaka Fukuoka, Hideo
Matsuki, Hidetake Muraoka and Haruyuki Minamitani (Keio Univ.)
P1-9, ** Three-Layered Backpropagation Model with Temperature Parameter,
Yoji Fukuda, Manabu Kotani and Haruya Matsumoto (Faculty of Engineering,
Kobe University)
P1-10, Kalman Type Least Square Error Learning Law for Sequential Neural
Network and its Information Theory, Kazuyoshi Matsumoto (Kansai Advanced
Research Center, CRL, MPT)
P1-11, Learning Surface of Hierarchical Neural Networks and Valley
Learning Method, Kazutoshi Gouhara, Norifusa Kanai, Takeshi Iwata and
Yoshiki Uchikawa (Department of Electronic Mechanical Engineering, School
of Engineering, Nagoya Univ.)
P1-12, A Study of Generalization of Multi-layered Neural Networks,
Katsumasa Matsuura (Mechanical Engineering Research Laboratory, Hitachi
Ltd.)
P1-13, When "Learning" occurs in Machine Learning, Noboru Watanabe
(Department of Biophysics, Kyoto University)
P1-14, A Study on Self-supervised Learning System, Kazushige Saga, Tamami
Sugasaka and Shigemi Nagata (Computer-Based Systems Laboratory, Fujitsu
Laboratories Ltd.)
P1-15, A Learning Circuit for VLSI Analog Neural Network Implementation,
Hiroyuki Wasaki, Yoshihiko Horio and Shogo Nakamura (Department of
Electronic Engineering, Tokyo Denki Univ.)
P1-16, Comparison of Learning Methods for Recurrent Neural Networks,
Tatsumi Watanabe, Kazutoshi Gouhara and Yoshiki Uchikawa (Department of
Electronic Mechanical Engineering, School of Engineering, Nagoya Univ.)
P1-17, A Learning algorithm for the neural network of Hopfield type,
Fumio Matsunari and Masuji Ohshima (Toyota Central Res. & Develop. Labs.
Inc.)
P1-18, Quantitative relationship between internal model of motor system
and movement accuracy, movement distance and movement time, Yoji Uno and
Ryoji Suzuki (Faculty of Engineering, University of Tokyo)
P1-19, Autonomic control of bipedal locomotion using neural oscillators,
Gentaro Taga, Yoko Yamaguchi and Hiroshi Shimizu (Faculty of
Pharmaceutical Sciences, University of Tokyo)
P1-20, Learning Model of Posture Control in Cerebellum, Hiroaki Gomi and
Mitsuo Kawato (ATR Auditory and Visual Perception Research Laboratories)
P1-21, Hybrid Control of Robotic Manipulator by Neural Network Model
(Sensing and Hybrid Control of Robotic Manipulator with Collision
Phenomena), Takanori Shibata, Toshio Fukuda, Fumihito Arai and Hiroshi
Wada (Nagoya University), Masatoshi Tokita and Toyokazu Mituoka (Kisarazu
Technical College), Yasumasa Shoji (Toyo Engineering Corp.)
P1-22, Hybrid Position/Force Control of Robotic Manipulator by
Application of Neural Network (Adaptive Control with Consideration of
Characteristics of Objects), Masatoshi Tokita and Toyokazu
Mituoka (Kisarazu National College of Technology), Toshio Fukuda and
Takanori Shibata (Nagoya Univ.)
P1-23, Reverberation in Chaotic Neural Network with a Fractal Connection,
Masatoshi Hori, Masaaki Okabe and Masahiro Nakagawa (Department of
Electrical Engineering, Faculty of Engineering, Nagaoka University of
Technology)
P1-24, An investigation of correlation possibility in neurocomputing and
quantum mechanics via Matrix Dynamics, Tomoyuki Nishio (Technology
Planning Office, General R&D Laboratory, JUKI Corporation)
P1-25, Knowledge Representation and Parallel Inference with Structured
Networks, Akira Namatame and Youichi Ousawa (National Defense Academy)
P1-26, Simulation of a Spiking Neuron with Multiple Input, G. Bugmann
(Fundamental Research Laboratories, NEC Corporation)
P2-1, Parallel Implementation of Edge detection by Energy Minimization on
QCDPAX machine, Hideki Asoh (Electrotechnical Laboratory), Youichi
Hachikubo and Tsutomu Hoshino (University of Tsukuba)
P2-2, Characteristics of the Marr-Hildreth filter for two particle image,
Shigeharu Toyoda (Department of Chemical Engineering, Faculty of
Engineering Science, Osaka Univ.)
P2-3, A neural network for fixation point selection, Makoto Hirahara and
Takashi Nagano (College of Engineering, Hosei Univ.)
P2-4, A Method for Analyzing Inverse Dynamics of the Retinal Horizontal
Cell Response through the Ionic Current Model, Yoshimi Kamiyama, Hiroyuki
Ishii and Shiro Usui (Information and Computer Science, Toyohashi Univ. of
Technology)
P2-5, Selective Recognition of Plural Patterns by Neural Networks,
Katsuji Imai, Kazutoshi Gouhara and Yoshiki Uchikawa (Department of
Electronic Mechanical Engineering, School of Engineering, Nagoya Univ.)
P2-6, A neural network for size-invariant pattern recognition, Masaichi
Ishikawa and Takashi Nagano (College of Engineering, Hosei Univ.)
P2-7, Human-Face Identification by Neural Network for Mosaic Pattern,
Makoto Kosugi (NTT Human Interface Laboratories)
P2-8, Pattern Classification and Recognition Neural Network
based on the Associative Learning Rules, Hidetoshi Ikeno
(Maizuru College of Technology), Shiro Usui and Manabu
Sakakibara (Toyohashi Univ. of Technology)
P2-9, Learning of A Three-Layer Neural Network with Translated
Patterns, Jianqiang YI, Shuichi Kurogi and Kiyotoshi Matsuoka
(Division of Control Engineering, Kyushu Institute of
Technology)
P2-10, Combining a Neural Network with Template Matching for Handwritten
Numeral Classification, Masahiko Tateishi, Haruaki Yamazaki (Systems
Laboratories, OKI Electric Corporation)
P2-11, Recognition of Continuous Writing of English Words with the
Mechanism of Selective Attention, Taro Imagawa and Kunihiko Fukushima
(Faculty of Engineering Science, Osaka University)
P2-12, Recognition of Handwritten Alphanumeric Characters by the
Neocognitron, Nobuaki Wake and Kunihiko Fukushima (Faculty of Engineering
Science, Osaka University)
P2-13, Target Recognition with Chebyshev Networks, Nobuhisa Ueda and
Akira Namatame (National Defense Academy)
P2-14, A Large Scale Neural Network "Comb NET" for Printed Kanji
Character Recognition (JIS 1st and 2nd Level Character Set), Takashi
Tohma, Akira Iwata, Hiroshi Matsuo and Nobuo Suzumura (Nagoya Institute
of Technology)
P2-15, Discriminative Properties of the Temporal Pattern in the
Mesencephalic Periaqueductal Gray of Rat, Mitsuo Terasawa, Minoru
Tsukada, Makoto Mizuno, Takeshi Aihara (Faculty of Engineering, Tamagawa
Univ.)
P2-16, Spoken Word Recognition using sequential Neural Network, Seiichi
Nakagawa and Isao Hayakawa (Toyohashi University of Technology, Dept. of
Info. & Comp. Science)
P2-17, Learning of the three-vowel sequence by a neural network model and
influence to a middle vowel from preceding and succeeding vowels,
Teruhiko Ohtomo and Ken-ichi Hara (Faculty of Engineering, Yamagata
Univ.)
P2-18, A Self-Organizing Neural Network for The Classification of
Temporal Sequences, Seiichi Ozawa (Nara National College of Technology),
Kazuyoshi Tsutsumi (Faculty of Science and Technology, Ryukoku Univ.),
Haruya Matsumoto (Faculty of Technology, Kobe Univ.)
P2-19, Neural Mechanism for Fine Time-Resolution, Kiyohiko Nakamura
(Graduate School of Science and Engineering, Tokyo Institute of
Technology)
P2-20, Neuromagnetic Image Reconstruction by a Neural Network, Hisashi
Tsuruoka (Fukuoka Institute of Technology) and Kohyu Fukunishi (Advanced
Research Lab., Hitachi, Ltd.)
P2-21, A Sparse Stabilization in Mutually Connected Neural Networks,
Hiroshi Yamakawa (Faculty of Engineering, University of Tokyo), Yoichi
Okabe (Research Center for Advanced Science and Technology, University of
Tokyo)
P2-22, ** Relation between Recollection Ability and Membrane Potential
Threshold in Hopfield Model, Eizo Ohno and Atsushi Yamanaka (Sharp
Laboratory)
P2-23, Weight quantization in learning Boltzmann machine, Masanobu
Takahashi, Wolfgang Balzer, Jun Ohta and Kazuo Kyuma (Solid State Quantum
Electronics Department, Central Research Laboratory, Mitsubishi Electric
Corporation)
P2-24, A cellular organizing approach for the travelling salesman
problem, M. Shigematsu (Electrotechnical Laboratory)
P2-25, Combinatorial Optimization Problems and Stochastic Logic Neural
Network, Yoshikazu Kondo and Yasuji Sawada (Research Institute of
Electrical Communication, Tohoku Univ.)
P2-26, Neural Network Models on SIMD Architecture Computer, Makoto
Hirayama (ATR Auditory and Visual Perception Research Laboratories)
P2-27, Optimal connection of an associative network with non-uniform
elements, Hiro-fumi Yanai (Department of Mathematical Engineering and
Information Physics, University of Tokyo)
------------------------------
Subject: Please consider this tutorial announcement for posting
From: Ron Riesenbach <itrctor@csri.toronto.edu>
Date: Tue, 06 Nov 90 17:24:08 -0500
Sir:
The Information Technology Research Centre (ITRC) is a non-profit Centre
of Excellence of the province of Ontario. We sponsor world-class research
in various areas of information technology and transfer the results of this
research to companies.
The tutorial (announcement below) is given annually by Dr. Hinton and has
in the past attracted people from across North America. I would appreciate
it if you would consider posting it in Neuron Digest.
Thank you.
Ron Riesenbach, Manager
Information Technology Research Centre
----------------------------------------------------------------------
Neural Networks for Industry
from fundamentals to current research
presented by:
Dr. Geoffrey Hinton
Sponsored by:
Information Technology Research Centre
and
PRECARN Associates Inc.
December 11 and 12, 1990
Regal Constellation Hotel
900 Dixon Rd.
(near the Lester B. Pearson Airport)
Toronto, Ontario
1. Why Neural Networks?
Serial computation has been very successful at tasks that can be
characterized by clean logical rules. It has been much less successful at
tasks like real-world perception or common sense reasoning. These tasks
typically require massive amounts of uncertain evidence to be combined in
complex ways to reach reliable decisions. The brain is extremely good at
these computations and there is now a growing consensus that massively
parallel "neural" computation may be the best way to solve these problems.
The resurgence of interest in neural networks has been fuelled by
several factors. Powerful new search techniques such as simulated annealing
and its deterministic approximations can be embodied very naturally in
these networks. As such, parallel hardware implementations promise to be
extremely fast at performing the best-fit searches required for content-
addressable memory and real-world perception. Recently, new learning pro-
cedures have been developed which allow networks to learn from examples.
The learning procedures automatically construct the internal representa-
tions that the networks require in particular domains, and so they may
remove the need for explicit programming in ill-structured tasks that con-
tain a mixture of regular structure, partial regularities and exceptions.
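To make the "best-fit search" idea concrete, here is a minimal sketch (in
modern Python/NumPy; it is illustrative only and is not drawn from the
course material) of a binary Hopfield net acting as a content-addressable
memory. The stored patterns, network size and number of update sweeps are
arbitrary choices for the example.

import numpy as np

# Two orthogonal +/-1 patterns stored in an 8-unit binary Hopfield net.
patterns = np.array([[ 1, -1,  1, -1,  1, -1,  1, -1],
                     [ 1,  1,  1,  1, -1, -1, -1, -1]], dtype=float)

# Hebbian outer-product storage rule; no self-connections.
W = sum(np.outer(p, p) for p in patterns)
np.fill_diagonal(W, 0)

# Probe the memory with a corrupted version of the first pattern.
probe = patterns[0].copy()
probe[:2] *= -1                      # flip two bits

# Asynchronous threshold updates; each flip can only lower the network
# energy, so the state settles into the nearest stored memory.
for _ in range(5):
    for i in range(len(probe)):
        probe[i] = 1.0 if W[i] @ probe >= 0 else -1.0

print(probe)                         # recovers patterns[0]

Replacing the deterministic threshold with a stochastic one gives simulated
annealing, which lets the same energy-descent search escape poor local
minima.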
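In the same spirit, here is a minimal sketch of a learning procedure of the
kind just described: error back-propagation constructing an internal
representation from examples. The task is exclusive-or, which no network
without hidden units can compute; the layer sizes, learning rate and
iteration count are again arbitrary illustrative choices.

import numpy as np

# Training examples for exclusive-or: inputs X, targets T.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

rng = np.random.default_rng(0)
W1, b1 = rng.normal(0, 0.5, (2, 4)), np.zeros(4)   # input -> hidden
W2, b2 = rng.normal(0, 0.5, (4, 1)), np.zeros(1)   # hidden -> output

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for _ in range(10000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)              # hidden-unit activities
    y = sigmoid(h @ W2 + b2)              # network output
    # Backward pass: squared-error gradients through the sigmoids.
    dy = (y - T) * y * (1 - y)            # error signal at the output
    dh = (dy @ W2.T) * h * (1 - h)        # signal propagated back to hidden units
    # Gradient-descent weight updates.
    W2 -= lr * (h.T @ dy); b2 -= lr * dy.sum(axis=0)
    W1 -= lr * (X.T @ dh); b1 -= lr * dh.sum(axis=0)

print(np.round(y, 2))                     # approaches [[0], [1], [1], [0]]

Nothing about the hidden-unit features is programmed in; they emerge from
the gradient signal alone, which is the sense in which such procedures
automatically construct the internal representations a task requires.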
2. Who Should Attend?
Day 1 is a tutorial directed at Industry Researchers and Managers who
would like to understand the basic principles underlying neural networks.
The tutorial will explain the main learning procedures and show how these
are used effectively in current applications. No previous exposure to
neural networks is necessary although a degree in computer science or
electrical engineering (or equivalent experience) is desirable.
Day 2 is an overview of advances made in this field in the last two or
three years. Research in progress at various laboratories will be reviewed.
This overview of how recent developments may lead to better learning proce-
dures in the future will be best appreciated by industry researchers and
managers who have some experience in this field or who have attended the
tutorial the previous day.
Those attending both days can expect to gain an understanding of the
current state-of-the-art in neural networks and be in a position to make
informed decisions about whether this technology is currently applicable,
or may soon become applicable, to specific problems in their area of
interest.
DAY 1
INTRODUCTION
o Computers versus brains
o The hardware of the brain
o Cooperative computation
o The Least Mean Squares learning procedure
o The perceptron paradigm
o Why hidden units are needed
o Varieties of learning procedure
o Competitive learning
o Learning topographic maps
BACKPROPAGATION LEARNING AND SIMPLE APPLICATIONS
o The backpropagation algorithm
o The NetTalk example
o The family trees example
o The parity example
o Theoretical results on generalization
o Simplicity and generalization
o Detecting bombs in suitcases
o Following a road with an autonomous land vehicle
BACKPROPAGATION: COMPLEX APPLICATIONS AND VARIATIONS
o Recognizing phonemes in spectrograms
o Recognizing hand-printed digits
o Alternative error functions
o Converting hand movements into speech
o Medical diagnosis
o What makes an application feasible
o The speed of convergence and ways to improve it
o Coding the input for backpropagation
o Self-supervised backpropagation
HOPFIELD NETS, BOLTZMANN MACHINES, AND MEAN FIELD NETS
o Binary Hopfield nets and their limitations
o Simulated annealing for escaping local energy minima
o Boltzmann Machines
o The Boltzmann machine learning procedure and its limitations
o Mean field networks for faster search
o Application to the travelling salesman problem
o A learning procedure for mean field nets.
DAY 2
RADIAL BASIS FUNCTIONS AND COMMUNITIES OF LOCAL EXPERTS
o Radial Basis Functions
o Relation to kernel methods in statistics
o Relation to Kanerva memories
o Application to predicting chaotic series
o Application to shape recognition
o Using soft competitive learning to adapt radial basis functions
o The elastic net
o Relation to mixtures of gaussians
o Communities of expert networks
o Relation to mixture models
o Applications to vowel recognition
MORE UNSUPERVISED LEARNING METHODS
o Finding multimodal projections of high dimensional data
o Application to adaptive modems
o Application to discovering important features of vowels
o Preserving information about the input with limited channel capacity
o Using coherence assumptions to discover spatial or temporal invariants
o Applications to stereo fusion and shape recognition
o Implementation in a new type of Boltzmann machine
NETWORKS FOR MODELLING SEQUENCES
o Backpropagation in recurrent networks
o Recurrent networks for predicting the next term in a sequence
o Using predictive networks for data-compression
o Ways to restrict recurrent networks
o Applications of recurrent networks to sentence understanding
o Hidden Markov Models and the Baum-Welch training procedure
o Combining HMM's with feedforward networks
o Implementing HMM recognizers in feedforward networks
o Reinforcement learning and the temporal credit assignment problem
o Recent developments in learning good action sequences
MISCELLANEOUS RECENT DEVELOPMENTS
o Neural networks for solving very large optimization problems
o Neural networks in non-linear controllers
o Better networks for hand-printed character recognition
o Why local minima are not fatal for backpropagation
o Why the use of a validation set improves generalization
o Adding hidden units incrementally
o Polynomial nets
3. Seminar Schedule
Tuesday, December 11, 1990 Wednesday, December 12, 1990
8:00-9:00 Registration and Coffee 8:00-9:00 Registration and Coffee
9:00-9:05 Opening words 9:00-9:05 Opening words
9:05-10:30 Tutorial Session #1 9:05-10:30 Advanced Session #1
10:30-11:00 Break 10:30-11:00 Break
11:00-12:30 Tutorial Session #2 11:00-12:30 Advanced Session #2
12:30-2:00 Lunch 12:30-2:00 Lunch
2:00-3:30 Tutorial Session #3 2:00-3:30 Advanced Session #3
3:30-4:00 Break 3:30-4:00 Break
4:00-5:30 Tutorial Session #4 4:00-5:30 Advanced Session #4
5:30-6:30 Wine and Cheese reception 5:30 Closing Words
4. Registration and Fees:
Fees are based on the affiliation of attendees. Employees of companies
that are members of ITRC's Industrial Affiliates Program or of PRECARN pay
a subsidized fee of $75/day. Non-member fees are $250/day. There is no
discount for attending both days.
Payment should be made by cheque (Payable to: "Information Technology
Research Centre") and should accompany the registration form where possible.
Due to limited space, ITRC and PRECARN members will have priority in case
of over-subscription. ITRC and PRECARN reserve the right to limit the
number of registrants from any one company.
Fees include a copy of the course notes and transparencies, coffee and
light refreshments at the breaks, a luncheon each day as well as an infor-
mal wine and cheese reception Tuesday evening (open to registrants of
either day). Participants are responsible for their own hotel accommoda-
tion, reservations and costs, including hotel breakfast, evening meals and
transportation. PLEASE MAKE YOUR HOTEL RESERVATIONS EARLY:
Regal Constellation Hotel
900 Dixon Road
Etobicoke, Ontario
M9W 1J7
Telephone: (416) 675-1500
Telex: 06-989511
Fax: (416) 675-1737
When reserving hotel accommodation please mention ITRC/PRECARN for
special room rates.
Registrations will be accepted up to 5:00pm December 6, 1990. Late
registrations may be accepted but, due to limited space, attendees who
register by December 6th will have priority over late registrants.
To register, complete the registration form below, then mail or fax it
to either one of the following two offices:
ITRC, Rosanna Reid PRECARN, Charlene Ferguson
University of Toronto 30 Colonnade Rd., Suite 300
D.L. Pratt Bldg., Rm. 286 Nepean, Ontario K2E 7J6
Toronto, Ontario M5S 1A1 Phone: (613) 727-9576
Phone: (416) 978-8558 Fax: (613) 727-5672
Fax: (416) 978-7207
5. Biography
Dr. Geoffrey E. Hinton (instructor)
Geoffrey Hinton is Professor of Computer Science at the University of
Toronto, a Fellow of the Canadian Institute for Advanced Research, a prin-
cipal researcher with the Information Technology Research Centre and a pro-
ject leader with the Institute for Robotics and Intelligent Systems (IRIS).
He received his PhD in Artificial Intelligence from the University of Edin-
burgh. He has been working on computational models of neural networks for
the last fifteen years and has published 70 papers and book chapters on
applications of neural networks in vision, learning, and knowledge
representation. These publications include the book "Parallel Models of
Associative Memory" (with James Anderson) and the original papers on dis-
tributed representations, on Boltzmann machines (with Terrence Sejnowski),
and on back-propagation (with David Rumelhart and Ronald Williams). He is
also one of the major contributors to the recent collection "Parallel Dis-
tributed Processing" edited by Rumelhart and McClelland.
Dr. Hinton was formerly an Associate Professor of Computer Science at
Carnegie-Mellon University where he created the connectionist research
group and was responsible for the graduate course on "Connectionist Artifi-
cial Intelligence". He is on the governing board of the Cognitive Science
Society and the governing council of the American Association for Artifi-
cial Intelligence. He is a member of the editorial boards of the journals
Artificial Intelligence, Machine Learning, Cognitive Science, Neural Compu-
tation and Computer Speech and Language.
Dr. Hinton is an expert at explaining neural network research to a
wide variety of audiences. He has given invited lectures on the research
at numerous international conferences, workshops, and summer schools. He
has given industrial tutorials in the United States for the Technology
Transfer Institute, AT&T Bell labs, Apple, Digital Equipment Corp., and
the American Association for Artificial Intelligence.
-------------------------Registration Form -------------------
Neural Networks for Industry
Tutorial by Geoffrey Hinton
December 11-12, 1990
Regal Constellation, 900 Dixon Rd.
Name _________________________________________
Title _________________________________________
Organization _____________________________________
Address _________________________________________
_________________________________________
_________________________________________
Postal Code ___________
Telephone __________________ Fax _________________
Registration Fee (check those that apply):
Day 1 Day 2 Total
------- ------- -------
ITRC or PRECARN member __ $75 __ $75 ________
Non-member __ $250 __ $250 ________
Please make cheques payable to Information Technology Research Centre
REGISTRATION DEADLINE: DECEMBER 6/90.
(Space is limited so register early)
Please fax or mail your registration to ITRC or PRECARN:
ITRC, Rosanna Reid PRECARN, Charlene Ferguson
University of Toronto 30 Colonnade Rd., Suite 300
D.L. Pratt Bldg., Rm. 286 Nepean, Ontario
6 King's College Rd. K2E 7J6
Toronto, Ontario M5S 1A1 phone (613) 727-9576
phone (416) 978-8558 fax (613) 727-5672
fax (416) 978-7207
--------------------------------------------------------------
------------------------------
End of Neuron Digest [Volume 6 Issue 65]
****************************************