[comp.realtime] "Easy" way to put "AI" in realtime embedded systems?

sgc@msel.unh.edu (Steven G. Chappell) (03/27/91)

Hi
	Over the years we have been working on an experimental autonomous
vehicle (EAVE) of the submersible persuasion. In the early stages, effort was
centered on the "traditional" robotics issues of positioning and motion
control, with a very heavy emphasis directed at the realtime aspects of same.
Subsequent work centered on an architecture for adding new functionality to
the existing vehicle system. Recent work dealt with incorporating concepts
from the AI community into the vehicle runtime system so as to render it more
"intelligent" (less stupid).
This is a nontrivial business since traditional embedded systems don't support
symbolic programming environments and such environments don't mix well with
"realtime" operation. Thus, my general information request:

By what methods can "AI" algorithms be installed in an embedded system?

Methods we have examined or heard about:
    software:
      augment embedded system with a library supporting symbolic functionality (see sketch below)
        develop in some extension of C, compile, download, run
      automated translation of symbolic code to supported code (C)
        develop in LISP, translate, compile, download, run
      utilize C++ (is C++ an adequate "AI" environment?)
        develop in C++, compile, download, run
      rehost the symbolic environment to the embedded system
        develop in LISP, download, run
    hardware: (is this really possible?)
      install LISP capable subsystem in target bus
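
To make the first software option concrete: the library can be as small as a
statically allocated cons-cell pool that gives plain C code symbols and lists.
The sketch below is purely illustrative -- the names and sizes are invented
and it is not our EAVE code -- but it shows the flavor of the approach.

/* Minimal sketch of the "augment with a library" option: a statically
 * allocated cons-cell pool, so symbolic structures can be built from C
 * without a Lisp runtime or a malloc at run time. */

#include <stdio.h>
#include <stddef.h>

#define NCELLS 1024                /* fixed pool size, chosen arbitrarily */

typedef enum { T_NIL, T_SYM, T_CONS } tag_t;

typedef struct cell {
    tag_t tag;
    union {
        const char *sym;                            /* T_SYM  */
        struct { struct cell *car, *cdr; } pair;    /* T_CONS */
    } u;
} cell;

static cell pool[NCELLS];          /* pool[0] is NIL (zero-initialized)   */
static size_t next_free = 1;
#define NIL (&pool[0])

static cell *alloc_cell(void)
{
    if (next_free >= NCELLS)
        return NULL;               /* pool exhausted; caller must cope    */
    return &pool[next_free++];
}

static cell *mksym(const char *name)
{
    cell *c = alloc_cell();
    if (c) { c->tag = T_SYM; c->u.sym = name; }
    return c;
}

static cell *cons(cell *car, cell *cdr)
{
    cell *c = alloc_cell();
    if (c) { c->tag = T_CONS; c->u.pair.car = car; c->u.pair.cdr = cdr; }
    return c;
}

#define CAR(c) ((c)->u.pair.car)
#define CDR(c) ((c)->u.pair.cdr)

int main(void)
{
    /* build (goal (reach waypoint-3)) out of the static pool */
    cell *goal = cons(mksym("goal"),
                      cons(cons(mksym("reach"),
                                cons(mksym("waypoint-3"), NIL)),
                           NIL));

    printf("head symbol: %s\n", CAR(goal)->u.sym);
    return 0;
}

The appeal is that memory use is fixed at link time; the cost is that you
write the symbolic layer yourself.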

In particular, we went the "rehost" route: porting the University of Utah's
Portable Common LISP Subset from an HP Bobcat to our development system and
subsequently to our vehicle system. This has worked to a degree, but it is
not without its warts.

What experiences have you all had or heard about in this regard?

Please email responses.
-- 
					Marine Systems Engineering Laboratory
					Marine Program Building
					University of New Hampshire
sgc@msel.unh.edu			Durham, NH  03824-3525

sfp@mars.ornl.gov (Phil Spelt) (03/27/91)

In article 777, Steven Chappell inquires about embedded AI, suggesting
several possible alternatives, and reports choosing "rehost" as their
solution.  My question:  Why stay with LISP?????

Concerning "embedded architecture" robotic control, we here at the Center
for Engineering Systems Advanced Research (CESAR) have been using just
such an architecture for our autonomous mobile robots for the past 4 or 5
years.  We have also chosen the "rehost" solution, but in an entirely different direction:

Our hardware configuration is an industrialized IBM PC/AT: an 80286 host
running 16 nodes of an NCube hypercube parallel computer.  All code is written
in 'C' (er, well, ONE program on the host is in FORTRAN).  We use an expert
system to control both navigation and machine learning.  This is created in
CLIPS, which runs in two versions on two different nodes (one for navigation
and one for learning).  CLIPS is distributed as source code, which we ported
to the node environment.  It also provides "LISP-like" rule construction, but
with (IMHO  8=) ) much better mathematical computation ability on the RHS.
Our robot runs around in an "unstructured", dynamic environment, locates a
control panel, and learns to manipulate the panel to shut off a "DANGER"
light.  All this is in "real time" -- the limiting factors are the speed of
the arms and of the vision processing.  The expert systems run MUCH faster
than the mechanical parts of the robot.
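
For anyone curious what embedding CLIPS looks like from the C side, the host
program is roughly the sketch below.  The entry-point names follow the
Load/Reset/Run pattern in the CLIPS manuals and differ somewhat between
releases, and the rule file name is made up, so treat this as an
assumption-laden sketch rather than our actual CESAR code.

/* Hedged sketch of a C host embedding CLIPS.  The initialization call and
 * exact signatures vary by CLIPS release; "navigate.clp" is a hypothetical
 * rule file, not one of ours. */

#include "clips.h"

int main(void)
{
    InitializeEnvironment();   /* bring up the inference engine          */
    Load("navigate.clp");      /* load defrules/deffacts from source     */
    Reset();                   /* assert the initial facts               */
    Run(-1L);                  /* fire rules until the agenda is empty   */
    return 0;
}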

I repeat my question:  WHY insist on LISP?????

Phil Spelt, Cognitive Systems & Human Factors Group  sfp@epm.ornl.gov
============================================================================
Any opinions expressed or implied are my own, IF I choose to own up to them.
============================================================================

smith@sctc.com (Rick Smith) (03/28/91)

sgc@msel.unh.edu (Steven G. Chappell) writes:

>By what methods can "AI" algorithms be installed in an embedded system?

The same way as any other algorithm...

Once you know how you want the system to operate there's nothing to
prevent you from porting your ideas from LISP to a conventional
embedded programming language. It's the approach I used when I did
thesis research in robotics.

On the other hand, if you don't know what '"AI" algorithms' you want to use,
pouring effort into a platform with an "embedded LISP" environment will
produce at best illusory progress.

When I think of '"AI" algorithms' I think of things like rule based
systems, constraint systems, heuristic search strategies, frame
systems, etc. Any or all of these can be implemented just fine
in non-LISP environments. Personally, I believe that it's better
not to use LISP, since LISP can mask some resource issues (e.g.
memory usage) that you should be sure to solve if you're applying
such algorithms to an embedded system problem.
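
To make that concrete, here is the skeleton of a rule-based system in plain C
with static storage.  The facts and rules are made up, but notice that the
memory budget (MAX_FACTS) sits in plain view instead of being hidden in a
Lisp heap.

/* Toy forward-chaining rule system in plain C with static storage.
 * Facts and rules (e.g. "obstacle-ahead") are illustrative only. */

#include <stdio.h>
#include <string.h>

#define MAX_FACTS 32

static const char *facts[MAX_FACTS];
static int nfacts;

static int known(const char *f)
{
    int i;
    for (i = 0; i < nfacts; i++)
        if (strcmp(facts[i], f) == 0) return 1;
    return 0;
}

static int assert_fact(const char *f)    /* returns 1 if newly added */
{
    if (known(f) || nfacts >= MAX_FACTS) return 0;
    facts[nfacts++] = f;
    return 1;
}

struct rule { const char *if_fact; const char *then_fact; };

static const struct rule rules[] = {
    { "obstacle-ahead", "must-turn"    },
    { "must-turn",      "reduce-speed" },
};

int main(void)
{
    int changed, i;

    assert_fact("obstacle-ahead");

    do {                               /* forward chain to a fixed point */
        changed = 0;
        for (i = 0; i < (int)(sizeof rules / sizeof rules[0]); i++)
            if (known(rules[i].if_fact))
                changed |= assert_fact(rules[i].then_fact);
    } while (changed);

    for (i = 0; i < nfacts; i++)
        printf("fact: %s\n", facts[i]);
    return 0;
}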

A potential weakness in my argument is that '"AI" algorithms' are
most often described using LISP, so you need to know how to see
"under" the LISP in order to implement them in a non-LISP environment.
Still, I encourage this since it forces you to understand what
you're doing... and believe me, you _can_ do AI work even if you do
understand what you're doing!

Rick.
smith@sctc.com    Arden Hills, Minnesota

jax@well.sf.ca.us (Jack J. Woehr) (03/28/91)

sgc@msel.unh.edu (Steven G. Chappell) writes:

>symbolic programming environments and such environments don't mix well with
>"realtime" operation. Thus, my general information request:

>By what methods can "AI" algorithms be installed in an embedded system?

	There have been several successful LISP >> Forth ports: GE's DELTA
and somebody's (NASA?) autorouter (Symbolics >> RTX2000), to name two.
Performance increased vastly in both cases.

-- 
 <jax@well.{UUCP,sf.ca.us} ><  Member, >        /// ///\\\    \\\  ///
 <well!jax@lll-winken.arpa >< X3J14 TC >       /// ///  \\\    \\\/// 
 <JAX on GEnie             >< for ANS  > \\\  /// ///====\\\   ///\\\ 
 <SYSOP RCFB (303) 278-0364><  Forth   >  \\\/// ///      \\\ ///  \\\

baker@csl.dl.nec.com (Larry Baker) (03/30/91)

There's a recent article in CACM that may be of interest here.

"Real-time Data Acquisition at Mission Control," Muratore, Heindel, Murphy,
Rasmussen and McFarland; CACM December 1990, v33 no 12.

One of the issues they faced was integrating "expert systems" (using CLIPS)
and a fairly hardcore realtime data acquisition problem.  I don't know the
details (e.g. how CLIPS compares to the other so-called "AI" tools), though.

--
Larry E. Baker, Jr.
NEC America, Inc.  C&C Software Laboratories
1525 Walnut Hill Ln., Irving, TX
(214) 518-3489
baker@texas.csl.dl.nec.com 		-or-	cs.utexas.edu!necssd!baker

jeff@aiai.ed.ac.uk (Jeff Dalton) (04/11/91)

In article <1991Mar27.151129.8754@cs.utk.edu> sfp@mars.epm.ornl.gov (Phil Spelt) writes:
>In article 777, Steven Chappell inquires about embedded AI, suggesting
>several possible alternatives, and reports choosing "rehost" as their
>solution.  My question:  Why stay with LISP?????

Why not?  Some people like Lisp.  Since we don't insist that you use
the language we like, why give us a hard time for not using the
language you like?

>It also provides "LISP-like" rule construction, but with (IMHO  8=) ) 
>much better mathematical computation ability on the RHS.

It's not clear what this means, since Lisp doesn't have rules.

rlk@telesoft.com (Bob Kitzberger @sation) (04/13/91)

In article <4471@skye.ed.ac.uk>, jeff@aiai.ed.ac.uk (Jeff Dalton) writes:
> 
> Why not?  Some people like Lisp.  Since we don't insist that you use
> the language we like, why give us a hard time for not using the
> language you like?

It doesn't seem to me like it's a question of 'liking' a language or not.
Jeez, we're supposed to be _engineers_ with the wisdom to select the right
tool on its merits, not on our subjective preferences.

I missed the original post, but if the poster wants to embed LISP in a hard
real-time application, there are valid engineering questions that need
to be asked.

I don't profess to be anything near a LISP expert, but I'll toss out
a few questions you need to ask yourself.  Can you calculate guaranteed 
worst-case times for LISP routines in the presence of interrupts?  How 
about the dynamic memory allocation algorithms used in LISP -- are they 
non-deterministic?   How can you protect LISP data structures in the 
presence of concurrency?  Is reclamation of stale heap space performed?
I don't mean to imply that LISP fails in these areas.

Worst-case analysis is necessary for hard real-time applications.  You're
charting new territory if you use LISP in hard real-time, IMHO.
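
For what it's worth, the usual embedded answer to my own allocation question
is a fixed-size block pool with constant-time alloc and free.  A minimal
sketch (the names are illustrative and nothing here is specific to any LISP
runtime):

/* Fixed-size block pool: allocation and release each touch one free-list
 * link, so the worst case is a handful of instructions -- no searching,
 * no compaction, no collection pause. */

#include <stdio.h>

#define NBLOCKS    64
#define BLOCK_SIZE 128

typedef union block {
    union block *next;          /* free-list link while unallocated */
    char payload[BLOCK_SIZE];
} block_t;

static block_t pool[NBLOCKS];
static block_t *free_list;

void pool_init(void)
{
    int i;
    for (i = 0; i < NBLOCKS - 1; i++)
        pool[i].next = &pool[i + 1];
    pool[NBLOCKS - 1].next = NULL;
    free_list = &pool[0];
}

void *pool_alloc(void)          /* constant time; NULL when exhausted */
{
    block_t *b = free_list;
    if (b) free_list = b->next;
    return b;
}

void pool_free(void *p)         /* constant time */
{
    block_t *b = (block_t *)p;
    b->next = free_list;
    free_list = b;
}

int main(void)
{
    void *a, *b;
    pool_init();
    a = pool_alloc();
    b = pool_alloc();
    pool_free(a);
    pool_free(b);
    printf("two blocks allocated and returned in constant time\n");
    return 0;
}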

	.Bob.
-- 
Bob Kitzberger               Internet : rlk@telesoft.com
TeleSoft                     uucp     : ...!ucsd.ucsd.edu!telesoft!rlk
5959 Cornerstone Court West, San Diego, CA  92121-9891  (619) 457-2700 x163
------------------------------------------------------------------------------
"Wretches, utter wretches, keep your hands from beans!"	-- Empedocles

jeff@aiai.ed.ac.uk (Jeff Dalton) (04/16/91)

In article <1234@telesoft.com> rlk@telesoft.com (Bob Kitzberger @sation) writes:
>In article <4471@skye.ed.ac.uk>, jeff@aiai.ed.ac.uk (Jeff Dalton) writes:
>> 
>> Why not?  Some people like Lisp.  Since we don't insist that you use
>> the language we like, why give us a hard time for not using the
>> language you like?
>
>It doesn't seem to me like it's a question of 'liking' a language or not.
>Jeez, we're supposed to be _engineers_ with the wisdom to select the right
>tool on its merits, not on our subjective preferences.

Unfortunately, engineering reasons seldom tell you the single right
tool to use.  This doesn't stop some people from claiming that they do
or from being convinced themselves.  In the end, it often comes down
to whether you value, say, a certain degree of flexibility over a
certain degree of efficiency, or vice versa, and to what you're used
to.  In many cases, some people will find Lisp better than C and
others will find C better than Lisp.  This doesn't mean one of them
must be wrong.

This is not entirely subjective, but neither is it entirely objective.

>I missed the original post, but if the poster wants to embed LISP in a hard
>real-time application, there are valid engineering questions that need
>to be asked.

Just so, and for embedded real time it may be that Lisp is unsuitable.

>I don't profess to be anything near a LISP expert, but I'll toss out
>a few questions you need to ask yourself.  Can you calculate guaranteed 
>worst-case times for LISP routines in the presence of interrupts?  How 
>about the dynamic memory allocation algorithms used in LISP -- are they 
>non-deterministic?   How can you protect LISP data structures in the 
>presence of concurrency?  Is reclamation of stale heap space performed?
>I don't mean to imply that LISP fails in these areas.

I will leave this to the experts on various Lisp systems.  Lisp can
certainly come close, but few if any implementations aim to be
suitable for embedded real-time use.

-- Jeff

varvel@cs.utexas.edu (Donald A. Varvel) (04/17/91)

In article <1234@telesoft.com> rlk@telesoft.com (Bob Kitzberger @sation) writes:

>I missed the original post, but if the poster wants to embed LISP in a hard
>real-time application, there are valid engineering questions that need
>to be asked.

>I don't profess to be anything near a LISP expert, but I'll toss out
>a few questions you need to ask yourself.  Can you calculate guaranteed 
>worst-case times for LISP routines in the presence of interrupts?  How 
>about the dynamic memory allocation algorithms used in LISP -- are they 
>non-deterministic?   How can you protect LISP data structures in the 
>presence of concurrency?  Is reclamation of stale heap space performed?
>I don't mean to imply that LISP fails in these areas.

>Worst-case analysis is necessary for hard real-time applications.  You're
>charting new territory if you use LISP in hard real-time, IMHO.

Funny you should mention it.  Yesterday I happened to be
at TI Dallas.  Folks there mentioned with some pride a
hard real-time system being developed in LISP.  They didn't
seem to want to talk much about details, but two points
that were mentioned were incremental garbage collection
and worst-case time guarantees for functions.

Very few off-the-shelf systems of any sort are suitable
for hard real-time applications.  Hardware designers seem
particularly prone to confuse *fast* with *guaranteed worst-
case performance*.  That tends to land us with "real-time"
processors with 17-level caches.

"Robotics" is often done by AI sorts, in LISP.  "Equipment
control" is often done by real-time sorts, in assembly.
It is clear to *me*, at least, that the two must eventually
evolve together.  If the worst problem we have in finding 
a reasonable meeting ground is producing a real-time LISP, 
we should count ourselves lucky.

-- Don Varvel (varvel@cs.utexas.edu)

markh@csd4.csd.uwm.edu (Mark William Hopkins) (04/17/91)

In article <1334@muleshoe.cs.utexas.edu> varvel@cs.utexas.edu (Donald A. Varvel) writes:
>"Robotics" is often done by AI sorts, in LISP.  "Equipment
>control" is often done by real-time sorts, in assembly.
>It is clear to *me*, at least, that the two must eventually
>evolve together.  If the worst problem we have in finding 
>a reasonable meeting ground is producing a real-time LISP, 
>we should count ourselves lucky.

The meeting ground is doing robotics in assembly too.  Different languages are
good for different things, and assembly is well suited for bare-to-the-metal
hardware tasks.  LISP (the first language I learned after BASIC) is, I think,
just too roundabout.

Generally, though, the meeting ground is right here at my place. :)

bill@ibmpcug.co.uk (Bill Birch) (04/19/91)

In article <11080@uwm.edu> markh@csd4.csd.uwm.edu (Mark William Hopkins) writes:
> In article <1334@muleshoe.cs.utexas.edu> varvel@cs.utexas.edu (Donald A. Varvel) writes:
> >"Robotics" is often done by AI sorts, in LISP.  "Equipment
> >control" is often done by real-time sorts, in assembly.
> >It is clear to *me*, at least, that the two must eventually
> >evolve together.  If the worst problem we have in finding 
> >a reasonable meeting ground is producing a real-time LISP, 
> >we should count ourselves lucky.
> 
> The meeting ground is doing robotics in assembly too.  Different languages
> are good for different things, and assembly is well-suited for bare to the
> metal hardware tasks.  LISP (the first language I learned after BASIC) I think
> is just too roundabout.
> 
> Generally, though, the meeting ground is right here at my place. :)

Some work has been done on "real-time" LISP systems.  Most of the problems
have to do with garbage collection.  Also, some production LISP systems are
rather large and hence difficult to embed.

I've developed a LISP interpreter which has deterministic response times
because it uses reference-counting garbage collection.  It is in 'C' and
runs on an ATARI-ST.  I expect you could embed it in a ROM-based system;
yes, that should work.  Email me for the source.
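
The heart of the reference-counting scheme is roughly the sketch below -- a
generic illustration rather than the interpreter's actual source.  In a
ROM-based build the malloc/free calls would come from a fixed pool instead.

#include <stdlib.h>

typedef struct cell {
    int refs;                       /* reference count                    */
    struct cell *car, *cdr;         /* NULL stands in for NIL / atoms     */
} cell;

static cell *retain(cell *c) { if (c) c->refs++; return c; }

/* Dropping a reference frees cells as soon as their count hits zero; the
 * cost tracks the structure being released, not the whole heap. */
static void release(cell *c)
{
    while (c && --c->refs == 0) {
        cell *rest = c->cdr;
        release(c->car);
        free(c);
        c = rest;                   /* walk the cdr chain iteratively     */
    }
}

/* cons consumes one reference to each argument (ownership transfer). */
static cell *cons(cell *car, cell *cdr)
{
    cell *c = malloc(sizeof *c);
    if (c == NULL) { release(car); release(cdr); return NULL; }
    c->refs = 1;
    c->car  = car;
    c->cdr  = cdr;
    return c;
}

int main(void)
{
    cell *shared = cons(NULL, NULL);            /* one cell, one reference */
    cell *lst    = cons(retain(shared), NULL);  /* now referenced twice    */

    release(lst);                   /* drops one reference to 'shared'     */
    release(shared);                /* count hits zero, cell freed here    */
    return 0;
}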

Otherwise you could translate LISP/scheme to C. 

I hear that the highly successful real-time expert system "G2" from GENSYM
is based on LISP.  They will be releasing a C version soon.

So real-time AI is not so infeasible!

-- 
Automatic Disclaimer:
The views expressed above are those of the author alone and may not
represent the views of the IBM PC User Group.
--