[can.ai] Adaptive Logic Networks

dwelly@saddle-lk.cs.UAlberta.CA (Andrew Dwelly) (04/24/91)

comp.ai comp.ai.neural-net comp.database can.ai

Readers of this group may be interested to hear that we shall shortly be
releasing some unusual neural net software on the net. In the meantime, it
is available by anonymous FTP from menaik.cs.ualberta.ca [129.128.4.241].

Applications we have tried up to now include:

Land Use Classification
Census Analysis
Prosthesis Control

...and a number of pattern recognition tasks such as OCR. We feel that the
algorithms are suitable for large-scale data analysis (hence the crossposting
to comp.database), and are now searching for other industrial and business
applications.

The algorithm is a radical departure from normal neural-net techniques
because it is based on the synthesis of an Adaptive Logic Network (ALN).
In the process of learning, the system constructs a boolean digital circuit
to perform the task. Training is rapid, and the execution of the resulting
network is extremely fast (and could easily be turned into hardware for
even more speed).
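
To give a flavour of what "a boolean digital circuit" means here, consider
a toy example (ours for illustration only, not output from the package): a
learned network for a two-of-three majority vote might reduce to an ordinary
gate expression in C:

/* Toy illustration of the kind of circuit an ALN might learn:
   a two-of-three majority function expressed as AND/OR gates. */
int majority3(int a, int b, int c)
{
    return (a && b) || (b && c) || (a && c);
}

Evaluating such an expression costs a handful of machine operations, which
is where the execution speed comes from.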

The software includes a library of routines for the "C" programmer, a small
language for data analysis, documentation (in TeX) and some examples. 

Why are we releasing it? We are hoping for a wider group of users, comments
on the existing software, and partners for future developments. Please send
all mail on the subject to:

arms@cs.ualberta.ca or dwelly@cs.ualberta.ca

Bill Armstrong.
Andrew Dwelly.
Rolf Manderscheid.

--
******************************************************************************
Andy Dwelly : dwelly@cs.ualberta.ca, Tel: 403-492-7591
!@#$%$#, %^&*%, - Listen, who swears ? Christopher Robin has fallen downstairs.
******************************************************************************


mikew@cutthroat.cs.washington.edu (Mike Williamson) (04/24/91)

In article <1991Apr23.211210.29372@cs.UAlberta.CA> dwelly@saddle-lk.cs.UAlberta.CA (Andrew Dwelly) writes:
>The algorithm is a radical departure from normal neural-net techniques
>because it is based on the synthesis of an Adaptive Logic Network (ALN).
>In the process of learning, the system constructs a boolean digital circuit
>to perform the task. Training is rapid, and the execution of the resulting
>network is extremely fast (and could easily be turned into hardware for
>even more speed).

Wait.  If the algorithm constructs a boolean digital circuit, how is
it related to neural nets?  Many traditional inductive learning
algorithms produce a "concept", which is often expressed as, e.g., a
restricted conjunctive-normal-form boolean expression.  No real trick
to make a circuit from that.

No wonder training is rapid, if you have departed so radically from
neural nets as to have a standard inductive learning algorithm.

-Mike

arms@cs.UAlberta.CA (Bill Armstrong) (04/26/91)

mikew@cutthroat.cs.washington.edu (Mike Williamson) writes:

>In article <1991Apr23.211210.29372@cs.UAlberta.CA> dwelly@saddle-lk.cs.UAlberta.CA (Andrew Dwelly) writes:
>>The algorithm is a radical departure from normal neural-net techniques
>>because it is based on the synthesis of an Adaptive Logic Network (ALN).
>>In the process of learning, the system constructs a boolean digital circuit
>>to perform the task. Training is rapid, and the execution of the resulting
>>network is extremely fast (and could easily be turned into hardware for
>>even more speed).

>Wait.  If the algorithm constructs a boolean digital circuit, how is
>it related to neural nets?  Many traditional inductive learning
>algorithms produce a "concept", which is often expressed as, e.g., a
>restricted conjunctive-normal-form boolean expression.  No real trick
>to make a circuit from that.

We call it a neural net simulation because it fits the usual neural
net paradigm (as outlined on page 22 of the book "Neurocomputing" by
Robert Hecht-Nielsen).  The elements are small circuits with local
state, and the learning of complex functions results from a solution
to the "credit assignment" problem, which we call "heuristic
responsibility".  In fact this addresses a problem that people have
been talking about for a long time: how to make MULTILAYER NETWORKS
OF PERCEPTRONS adapt!  Each node can realize any one of the four
linear-threshold functions AND, OR, LEFT (i.e. LEFT(x,y) = x) and
RIGHT (RIGHT(x,y) = y); it is just a little perceptron with possible
weight values 0 and 1, acting on boolean inputs.
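
For concreteness, here is a minimal sketch in C of how such a node might be
represented and evaluated (the type and field names are illustrative
assumptions, not the actual atree library interface):

typedef enum { AND, OR, LEFT, RIGHT } NodeFn;

typedef struct Node {
    NodeFn fn;               /* which of the four functions */
    struct Node *l, *r;      /* children; NULL for a leaf */
    int leaf;                /* input index if this node is a leaf */
} Node;

/* Evaluate a tree of two-input gates on a vector of boolean inputs. */
int eval(const Node *n, const int *in)
{
    int x, y;
    if (n->l == NULL)        /* leaf: read an input bit */
        return in[n->leaf];
    x = eval(n->l, in);
    y = eval(n->r, in);
    switch (n->fn) {
    case AND:  return x && y;
    case OR:   return x || y;
    case LEFT: return x;
    default:   return y;     /* RIGHT */
    }
}

Learning then amounts to changing which of the four functions each node
computes, with "heuristic responsibility" deciding which nodes to blame
for an error.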

The goal is to have a circuit which can perform pattern-recognition
tasks.  This has been tried out on OCR and turns out to be highly
immune to noise.  The reason is an inherent property of binary trees
built from the above function types: the output is unlikely to change
if some of the inputs are perturbed.

>No wonder training is rapid, if you have departed so radically from
>neural nets as to have a standard inductive learning algorithm.

Sorry, the training is rapid BECAUSE IT'S BASED ON LOGIC, NOT
ARITHMETIC.  With logic one can take advantage of PARSIMONY (Meisel's
term) or lazy evaluation.  Briefly, if an element is an AND gate, and
if one of its inputs has been determined to be a 0, then there is no
need to evaluate the other input.  This gives orders of magnitude
speedup not only in software simulations but also in any system which
is not 100% parallel.  Since all economically feasible systems for
execution of very large neural networks will reuse the same hardware
for different parts of the computation, parsimony is important in
hardware systems too.
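
A rough sketch of the lazy evaluation (reusing the Node type from the
sketch above; again an illustration, not the library's actual code):

/* Parsimonious evaluation: an AND whose first input is 0, or an OR
   whose first input is 1, never visits its second subtree. */
int eval_lazy(const Node *n, const int *in)
{
    if (n->l == NULL)
        return in[n->leaf];
    switch (n->fn) {
    case LEFT:  return eval_lazy(n->l, in);   /* right subtree ignored */
    case RIGHT: return eval_lazy(n->r, in);   /* left subtree ignored */
    case AND:
        return eval_lazy(n->l, in) ? eval_lazy(n->r, in) : 0;
    default:    /* OR */
        return eval_lazy(n->l, in) ? 1 : eval_lazy(n->r, in);
    }
}

On average, large parts of the tree are never touched at all, which is
where the speedup comes from.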

It would be interesting to see what a standard inductive learning
system would do on the kind of problems we have been working on.  For
example: take 5,000 randomly created 20-bit vectors, choose four fixed
bit positions as "control leads", and define the desired output as the
value of the non-control lead whose index is determined by the values
of the four control leads.  This is a partial "one-of-sixteen"
multiplexer function, defined by samples.  Although the net sees only
5,000 of the 1,048,576 possible inputs, it generalizes correctly to
98.6% of them.  Of course, there is no restriction on the function
type a priori.
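
For anyone who wants to try this, here is a rough sketch of how such a
training set could be generated (the choice of bits 0-3 as control leads
is an arbitrary assumption for illustration; our experiment may have used
different positions):

#include <stdio.h>
#include <stdlib.h>

#define NBITS    20
#define NSAMPLES 5000

/* Emit NSAMPLES random 20-bit vectors, each labelled by a partial
   one-of-sixteen multiplexer: bits 0..3 form a 4-bit control value
   that selects which of the 16 remaining bits is the desired output. */
int main(void)
{
    int i, j;
    srand(1);                /* fixed seed for reproducibility */
    for (i = 0; i < NSAMPLES; i++) {
        int bits[NBITS], ctrl = 0;
        for (j = 0; j < NBITS; j++)
            bits[j] = rand() & 1;
        for (j = 0; j < 4; j++)
            ctrl = (ctrl << 1) | bits[j];
        for (j = 0; j < NBITS; j++)
            printf("%d", bits[j]);
        printf(" %d\n", bits[4 + ctrl]);   /* selected non-control bit */
    }
    return 0;
}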

Anyway, why not try it out via anonymous ftp from
menaik.cs.ualberta.ca [129.128.4.241] in pub/atree.tar.Z?  You have to
set the transfer type to binary, of course.

Thanks for your comments.  Sorry for the delay in replying; our
server was overloaded and the first try bombed.

Bill Armstrong




--
***************************************************
Prof. William W. Armstrong, Computing Science Dept.
University of Alberta; Edmonton, Alberta, Canada T6G 2H1
arms@cs.ualberta.ca Tel(403)492 2374 FAX 492 1071