michaelm@bcsaic.UUCP (michael b maxwell) (10/02/85)
The common wisdom is that AI-type applications should be developed in an AI language such as Lisp or Prolog, for obvious reasons. At the same time, it is frequently asserted that a mature Lisp program can be recoded in C for increased speed and decreased size. My question is: how much faster, and how much smaller?

Obviously this depends on many things. Let's assume for the purposes of discussion reasonably good Lisp and C compilers (e.g. Liszt for Franz Lisp; I don't "speak" C, so insert your favorite C compiler here!). Let's also assume you've made whatever modifications you can to speed up your Lisp program, e.g. doing (sstatus translink on), using localf (assuming that works, which it doesn't on my Sun...), and setting up the allocation of space for different forms of data so as to optimize garbage collection; likewise whatever optimizations you can make to the C version.

The improvement gained by recoding in C also obviously depends on what kind of program you're recoding, so which programs are helped most? Pointer-intensive ones? Arithmetic-intensive ones? On a large program, recoding the entire thing in C is probably not worthwhile (I would guess). How can you tell which parts would benefit most from translation? (I assume here a Lisp like Franz, which allows calling C functions from Lisp code.) Or does this really gain you anything?

Have many people actually done the translation, or is it one of those things that everyone talks about but no one does anything about? :-) What gains did you realize? Are there any published studies?
-- 
Mike Maxwell
Boeing Artificial Intelligence Center
..uw-beaver!{uw-june,ssc-vax}!bcsaic!michaelm
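[As a concrete illustration of the "pointer-intensive" case: below is a minimal sketch in C of the kind of list-walking inner loop one might translate from Lisp. The cons-cell layout is hypothetical -- a real Lisp's internal representation (tagged pointers, typed heap pages) will differ.]

    /* Hypothetical cons cell; a real Lisp implementation's layout
       will differ. */
    struct cons {
        void        *car;
        struct cons *cdr;
    };

    /* The C analogue of (length lst) on a proper list.  No type
       checking and no garbage-collector bookkeeping -- which is
       where the speed comes from, and also where the safety goes. */
    long list_length(lst)
    struct cons *lst;
    {
        long n = 0;
        while (lst != 0) {
            n++;
            lst = lst->cdr;
        }
        return n;
    }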
freeman@spar.UUCP (Jay Freeman) (10/04/85)
There's another issue besides speed and size, having to do with real-time systems. A current net discussion bewails the plight of the poor F-16 pilot bounced by a Foxbat, whose Ada-coded countermeasures system raises an error message due to run-time range checking. But imagine the plight of the same pilot when the revised fire-control system -- rewritten in Lisp -- says "Garbage Collection" and goes to sleep for a while. (There are Lisp implementations that do incremental garbage collection, of course.)
-- 
Jay Reynolds Freeman (Schlumberger Palo Alto Research)
(canonical disclaimer)
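[The idea behind incremental collection, sketched below in C under simplifying assumptions (a fixed-fan-out object graph and a fixed-size gray stack, both invented for the sketch): instead of marking the whole heap in one pause, do at most a bounded amount of marking work per step, so no single pause exceeds a known bound.]

    #define STEP_BUDGET 64            /* max objects scanned per increment */

    struct obj {
        int         marked;
        struct obj *children[2];      /* fixed fan-out, for the sketch only */
    };

    static struct obj *gray[1024];    /* gray set: marked, not yet scanned */
    static int         gray_top;      /* roots are pushed here before marking */

    /* Perform one bounded increment of marking; returns 1 when the
       mark phase is complete.  A real collector also needs a write
       barrier so the mutator can't hide objects between increments. */
    int gc_mark_step()
    {
        int budget = STEP_BUDGET;
        int i;

        while (budget-- > 0 && gray_top > 0) {
            struct obj *o = gray[--gray_top];
            for (i = 0; i < 2; i++) {
                struct obj *c = o->children[i];
                if (c != 0 && !c->marked) {
                    c->marked = 1;
                    gray[gray_top++] = c;
                }
            }
        }
        return gray_top == 0;
    }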
jbn@wdl1.UUCP (10/06/85)
Well, the Ventilator Manager, an "expert system" from Stanford intended to decide when to move patients off or back onto a respirator, was converted from EMYCIN to a rather small BASIC program, in which form it is actually useful; the EMYCIN version required a DECsystem 2060 and ran too slowly to keep up with real time.

Interestingly, if you read the thesis, and have any background in process control, you realize that it's a simple control problem: you have a process with a few states, a few incoming sensor values, and limits on each sensor value which can trigger an alarm, cause a state change, or both; in each state the limits are different. That's all that seems to be needed, and a few pages in the back of the thesis give all the values and states. But apparently the underlying simplicity of the problem wasn't realized, or at least admitted, until they built an expert system to solve it.

I'm beginning to suspect that knowledge engineering is more interviewing technique than computer science.

John Nagle
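[The control structure Nagle describes is a table-driven state machine, sketched below in C. The state names, sensor names, and threshold numbers are placeholders invented for the sketch -- the real values are in the thesis appendix.]

    #include <stdio.h>

    #define NSTATES  3    /* hypothetical states, e.g. assist/wean/off */
    #define NSENSORS 2    /* hypothetical sensors, e.g. rate and CO2   */

    struct limit {
        double lo, hi;     /* acceptable range in this state       */
        int    alarm;      /* raise an alarm on violation?         */
        int    next_state; /* state to switch to, or -1 for none   */
    };

    /* One row of limits per state, one column per sensor.
       Placeholder numbers only. */
    static struct limit limits[NSTATES][NSENSORS] = {
        { { 60.0, 120.0, 1, -1 }, { 30.0, 45.0, 1,  1 } },
        { { 60.0, 110.0, 1,  0 }, { 30.0, 40.0, 1, -1 } },
        { { 60.0, 100.0, 1, -1 }, { 30.0, 40.0, 0,  1 } },
    };

    /* One control cycle: check each sensor against the current
       state's limits; a violation may alarm, change state, or both. */
    int step(state, sensors)
    int state;
    double sensors[];
    {
        int s;
        for (s = 0; s < NSENSORS; s++) {
            struct limit *l = &limits[state][s];
            if (sensors[s] < l->lo || sensors[s] > l->hi) {
                if (l->alarm)
                    printf("alarm: sensor %d out of range\n", s);
                if (l->next_state >= 0)
                    state = l->next_state;
            }
        }
        return state;
    }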
jbn@wdl1.UUCP (10/07/85)
So, you get a fatal exception, the program reinitializes, rereads its sensors, gets minimal long-term information from non-volatile storage, and continues. Real-time programs have to be able to do this anyway, since you may have to recover from a transient fault. I've seen a Boeing 767 go through a full electronics restart when switching from APU to ground power; it takes about a second before all the displays are updated again. That's how it has to work, and that's the way real-time programs are usually written.

John Nagle
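[A minimal sketch of that restart discipline in C, using setjmp/longjmp as a stand-in for whatever trap mechanism the hardware actually provides. restore_state(), read_sensors(), and control_cycle() are hypothetical application hooks, stubbed out here.]

    #include <setjmp.h>

    static jmp_buf restart_point;

    /* Hypothetical application hooks -- stubs for the sketch. */
    static void restore_state() { /* reload long-term info from NV store */ }
    static void read_sensors()  { /* poll the sensor bus                 */ }
    static void control_cycle() { /* one pass of the control law         */ }

    /* Called from a trap or signal handler on a fatal exception. */
    void fatal_fault()
    {
        longjmp(restart_point, 1);
    }

    int main()
    {
        if (setjmp(restart_point)) {
            /* We got here via fatal_fault(): reinitialize from
               non-volatile storage and rejoin the loop. */
            restore_state();
        }
        for (;;) {              /* the control loop never exits */
            read_sensors();
            control_cycle();
        }
    }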