[net.ai] Reply to stan the leprechaun hacker

narain@rand-unix@sri-unix.UUCP (08/13/83)

I am responding to two of the points you raised.

Attribute-value pairs are hopeless for any area (including AI areas)
where your "cognitive chunks" are complex structures (like trees). An
example is symbolic algebraic manipulation, where it is natural to
think in terms of general forms of algebraic expressions. Try writing
a symbolic differentiation program in terms of attribute-value pairs.
Another example is the "logic grammars" for natural language, whose
implementation in Prolog is extremely clear and efficient.
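
To make the first point concrete, here is a toy differentiator in
Prolog, working directly on nested terms (predicate names are my own;
no simplification is attempted):

    % d(Expr, Var, Derivative): differentiate Expr with respect to Var.
    d(X, X, 1) :- !.
    d(C, _, 0) :- atomic(C).                     % constants, other variables
    d(U+V, X, DU+DV)       :- d(U, X, DU), d(V, X, DV).
    d(U-V, X, DU-DV)       :- d(U, X, DU), d(V, X, DV).
    d(U*V, X, U*DV + V*DU) :- d(U, X, DU), d(V, X, DV).

    ?- d(x*x + 3*x, x, D).
    D = x*1+x*1+(3*1+x*0)      % unsimplified, but structurally right

The expression is matched and rebuilt as a term; there is no natural
way to state the product rule as a flat list of attribute-value pairs.
Similarly, a minimal logic grammar, in the DCG notation Prolog
translates directly into clauses (a made-up fragment):

    sentence    --> noun_phrase, verb_phrase.
    noun_phrase --> [the], noun.
    verb_phrase --> [runs].
    noun        --> [program].

    ?- sentence([the, program, runs], []).
    yes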

Whether FP, or applicative languages more generally, are useful to
AI depends upon the point of view you take of AI. A useful view is to
consider it as "advanced programming" where you wish to develop 
intelligent computer programs, and so develop powerful computational
methods for them, even if humans do not use those methods. From this
point of view Backus's comments about the "von Neumann bottleneck"
apply as much to AI programming as they do to conventional
programming. Hence applicative languages may offer ideas that could
solve the "software crisis" in AI as well.

This is not just surmise; the Prolog applications to date and underway
are evidence of the power of applicative languages. You may
debate the "applicativeness" of practical Prolog programming,
but in my opinion the best (and also the most efficient) Prolog
programs are in essence "applicative".
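
By "applicative" I mean programs that are pure relations over terms,
with no assignment or destructive update. A trivial sketch of my own
(not drawn from any of the applications above):

    % Flattening a binary tree into a list: just term construction
    % and recursion, no side effects anywhere.
    fringe(leaf(X), [X]).
    fringe(node(L, R), Xs) :-
        fringe(L, LXs),
        fringe(R, RXs),
        append(LXs, RXs, Xs).

    ?- fringe(node(leaf(a), node(leaf(b), leaf(c))), Xs).
    Xs = [a,b,c]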

-- Sanjai Narain

sts@ssc-vax.UUCP (Stanley T Shebs) (08/18/83)

Actually trees can be expressed as attribute-value pairs.  I've had
to do that to get around certain %(&^%$* OPS5 limitations, so it's
possible, but not pretty.  However, your algebraic/tree
expressions/structures often have duplicated components, in which
case you would like to join two nodes at lower levels.  You then
end up with a directed graph rather than a tree.  (This is also a
solution for multiple inheritance problems.)
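
For concreteness, here's roughly what that encoding looks like,
sketched as Prolog facts rather than actual OPS5 working-memory
elements (the node ids are made up):

    % The expression (a+b)*(a+b), first as a nested term, then flattened
    % into attribute-value triples over node ids.  Sharing the common
    % subexpression (n2 is both the left and right child of n1) is what
    % turns the tree into a directed graph.
    expr( times(plus(a,b), plus(a,b)) ).    % nested-term version

    val(n1, op,    times).                  % attribute-value version
    val(n1, left,  n2).
    val(n1, right, n2).
    val(n2, op,    plus).
    val(n2, left,  a).
    val(n2, right, b).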

I'll refrain from flaming about traditional (including logic) grammars.
I'm tired of people insisting on a restricted view of language that
claims that grammar rules are the ultimate description of syntax
(semantics being irrelevant) and that idioms are irritating
special cases.  I might note that we have basically solved the language
analysis problem (using a version of Berkeley's Phrase Analysis
that handles ambiguity) and are now working on building a language
learner to speed up the knowledge acquisition process, as well as
other interesting projects.

I don't recall a von Neumann bottleneck in AI programs, at least not
of the kind Backus was talking about.  The main bottleneck seems to
be of a conceptual rather than a hardware nature.  After all, production
systems are not inherently bottlenecked, but nobody really knows how
to make them run concurrently, or exactly what to do with the results
(I have some ideas though).
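
To make that point concrete, here is a bare-bones recognize-act loop,
again sketched in Prolog rather than OPS5 (the toy rule and names are
mine).  Nothing in the match step forces serial execution; the only
serialization is the loop that adds one new fact at a time:

    % rule(Conditions, Conclusion): a single toy production.
    rule([parent(X,Y), parent(Y,Z)], grandparent(X,Z)).

    % One cycle: find a rule instance whose conditions all hold and
    % whose conclusion is not yet known, then add the conclusion.
    step(Facts, [New|Facts]) :-
        rule(Conds, New),
        all_hold(Conds, Facts),
        \+ member(New, Facts).

    all_hold([], _).
    all_hold([C|Cs], Facts) :- member(C, Facts), all_hold(Cs, Facts).

    % Iterate until no rule can fire.
    run(Facts, Final) :- step(Facts, Next), !, run(Next, Final).
    run(Facts, Facts).

    ?- run([parent(a,b), parent(b,c)], Fs).
    Fs = [grandparent(a,c), parent(a,b), parent(b,c)]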

					stan the lep hack
					ssc-vax!sts (soon utah-cs)