[comp.archives] [comp.ai.neural-net...] Re: Adaptive Logic Networks

arms@cs.UAlberta.CA (Bill Armstrong) (05/09/91)

Archive-name: ai/neural-nets/atree/1991-04-26
Archive: menaik.cs.ualberta.ca:pub/atree.tar.Z [129.128.4.241]
Original-posting-by: arms@cs.UAlberta.CA (Bill Armstrong)
Original-subject: Re: Adaptive Logic Networks
Reposted-by: emv@msen.com (Edward Vielmetti, MSEN)

mikew@cutthroat.cs.washington.edu (Mike Williamson) writes:

>In article <1991Apr23.211210.29372@cs.UAlberta.CA> dwelly@saddle-lk.cs.UAlberta.CA (Andrew Dwelly) writes:
>>The algorithm is a radical departure from normal neural-net techniques
>>because it is based on the synthesis of an Adaptive Logic Network (ALN).
>>In the process of learning, the system constructs a boolean digital circuit
>>to perform the task. Training is rapid, and the execution of the resulting
>>network is extremely fast (and could easily be turned into hardware for
>>even more speed).

>Wait.  If the algorithm constructs a boolean digital circuit, how is
>it related to neural nets?  Many traditional inductive learning
>algorithms produce a "concept", which is often expressed as, e.g., a
>restricted conjunctive-normal-form boolean expression.  No real trick
>to make a circuit from that.

We call it a neural net simulation because it fits the usual neural
net paradigm (as outlined on page 22 of the book "Neurocomputing" by
Robert Hecht-Nielsen).  The elements are small circuits with local
state, and the learning of complex functions results from a solution
to the "credit assignment" problem, which we call "heuristic
responsibility".  In fact this is a solution to a problem
that people have been talking about for a long time: how to make
MULTILAYER NETWORKS OF PERCEPTRONS adapt!  Each node, which can
realize any one of the four linear-threshold functions AND, OR, LEFT
(i.e. LEFT(x,y) = x), RIGHT, is just a little perceptron with possible
weight values 0 and 1, acting on boolean inputs.
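To make the perceptron view concrete, here is an illustrative Python
sketch (not code from the atree package) of such a node.  The table in
the comments shows one choice of {0,1} weights and threshold realizing
each of the four functions; names and signatures are my own:

```python
# Sketch of an ALN node: each node computes one of the four two-input
# functions AND, OR, LEFT, RIGHT on boolean inputs.

def node(kind, x, y):
    if kind == "AND":
        return x & y
    if kind == "OR":
        return x | y
    if kind == "LEFT":
        return x              # LEFT(x, y) = x
    if kind == "RIGHT":
        return y              # RIGHT(x, y) = y
    raise ValueError(kind)

# Perceptron view: output = 1 iff w1*x + w2*y >= threshold, with
# weights restricted to 0 and 1:
#   AND  : w = (1, 1), threshold 2     OR   : w = (1, 1), threshold 1
#   LEFT : w = (1, 0), threshold 1     RIGHT: w = (0, 1), threshold 1
def perceptron_node(w1, w2, threshold, x, y):
    return 1 if w1 * x + w2 * y >= threshold else 0
```

Checking all four input pairs confirms that each weight/threshold
setting reproduces the corresponding boolean function.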

The goal is to have a circuit which can perform pattern-recognition
tasks.  This has been tried out on OCR and turns out to be very immune
to noise.  The reason is an inherent property of binary trees built
from the above function types: their output is unlikely to change
when some of the inputs are perturbed.

>No wonder training is rapid, if you have departed so radically from
>neural nets as to have a standard inductive learning algorithm.

Sorry, the training is rapid BECAUSE IT'S BASED ON LOGIC, NOT
ARITHMETIC.  With logic one can take advantage of PARSIMONY (Meisel's
term) or lazy evaluation.  Briefly, if an element is an AND gate, and
if one of its inputs has been determined to be a 0, then there is no
need to evaluate the other input.  This gives orders of magnitude
speedup not only in software simulations but also in any system which
is not 100% parallel.  Since all economically feasible systems for
execution of very large neural networks will reuse the same hardware
for different parts of the computation, parsimony is important in
hardware systems too.
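A rough sketch of the parsimony idea (hypothetical code, not taken
from atree): an AND node whose first input is 0, or an OR node whose
first input is 1, never evaluates its other subtree at all.  The
counter here just makes the savings visible:

```python
# Lazy evaluation of a tree of AND/OR nodes over boolean leaves.
# Trees are tuples: ("LEAF", input_index) or (op, left, right).

def lazy_eval(tree, inputs, counter):
    kind = tree[0]
    if kind == "LEAF":
        counter[0] += 1                # count leaf inspections
        return inputs[tree[1]]
    left = lazy_eval(tree[1], inputs, counter)
    if kind == "AND" and left == 0:
        return 0                       # right subtree skipped entirely
    if kind == "OR" and left == 1:
        return 1                       # right subtree skipped entirely
    return lazy_eval(tree[2], inputs, counter)

# Example: AND(leaf0, OR(leaf1, leaf2)).  When leaf0 is 0, only one
# leaf is inspected instead of three.
tree = ("AND", ("LEAF", 0), ("OR", ("LEAF", 1), ("LEAF", 2)))
```

In a large tree this short-circuiting is what gives the orders of
magnitude speedup described above whenever evaluation is not fully
parallel.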

It would be interesting to see what a standard inductive learning
system would do on the kind of problems we have been working on, like
taking 5,000 20-bit vectors, randomly created, choosing four fixed bit
positions as "control leads" and defining the desired output as the
value of the non-control lead with index determined by the values of
the four control leads. This is a partial "one-of-sixteen" multiplexer
function, defined by samples.  Although the net sees only 5,000 of the
1,048,576 possible inputs, it generalizes correctly to 98.6% of them.
Of course, there is no restriction on the function type a priori.
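For the curious, the test problem described above can be sketched as
follows.  This is illustrative code only: the particular bit positions
chosen as control leads are arbitrary, not the ones used in the
experiments.

```python
import random

# Partial "one-of-sixteen" multiplexer, defined by samples: four fixed
# bit positions act as control leads, and their 4-bit value selects
# which of the remaining 16 positions supplies the desired output.

CONTROL = (0, 1, 2, 3)                 # hypothetical control positions
DATA = tuple(i for i in range(20) if i not in CONTROL)

def mux_target(bits):
    index = sum(bits[c] << k for k, c in enumerate(CONTROL))
    return bits[DATA[index]]

def make_samples(n, seed=0):
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        bits = [rng.randint(0, 1) for _ in range(20)]
        samples.append((bits, mux_target(bits)))
    return samples

training_set = make_samples(5000)      # 5,000 of the 2**20 inputs
```

A learner trained on these samples is then scored on how often it
reproduces mux_target over the full input space.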

Anyway, why not try it out via anonymous ftp from
menaik.cs.ualberta.ca [129.128.4.241] in pub/atree.tar.Z
You have to set type binary, of course.

Thanks for your comments.  Sorry for the delay in replying, our
server was overloaded and the first try bombed.

Bill Armstrong




--
***************************************************
Prof. William W. Armstrong, Computing Science Dept.
University of Alberta; Edmonton, Alberta, Canada T6G 2H1
arms@cs.ualberta.ca Tel(403)492 2374 FAX 492 1071

-- comp.archives file verification
menaik.cs.ualberta.ca
-rw-r--r--  1 556      17         154327 May  1 10:17 pub/atree.tar.Z
found atree ok
menaik.cs.ualberta.ca:pub/atree.tar.Z