[comp.lang.lisp] Unix Lisp Environments

jdye@zodiac.ADS.COM (John W. Dye Jr.) (05/05/89)

On the subject of Lisp Environments for Unix (SUN) workstations.

As a former Lispm user who presently is a Sun common-lisp hacker
I would like to make a few observations about the state and
evolution of lisp environments for Unix workstations.  Hopefully
these comments will spark constructive discussion about what
we can do to speed the arrival of good lisp based programming environments.

1)  There is not much of a market for Lisp based products (compared to
    C or (UGH) Fortran).  This is the reason that companies cannot afford
    to devote amazing amounts of resources to bring lisp based programming
    environments to market.

2)  The window systems keep changing.  First there was sunview. Then there
    were X and NeWS.  Now there is (soon) a NeWS/X merge (with suntools
    support also).  Developers of lisp programming environments have had
    to program on a moving target (the window systems).  Unfortunately,
    most of the neat stuff that lisp programming environments do are
    window based (window-debuggers, fancy editors). 

3)  Finally, Emacs isn't that bad.  It's programmable, and with the
    right set of hacks and tags tables it begins to approximate the
    typical Symbolics compile-test-debug cycle that we are all so
    familiar with.

   
The biggest problem we still face is debugging a multiprocessing lisp
using emacs only (we don't like sunview--OK!).  It would be nice to have
a facility like the Lucid editor provides in good old GNU Emacs.

What can we do about it???

We could all get together and agree on a window-system platform and
then build the GNU Lisp environment on top of it!!

We can wait for Sun, Lucid, Franz, etc. to supply us with one
  (see point 1).



Any other Ideas???

JD
jdye@ads.com
"ADS didn't make me write this--STD Disclaimer."

P.S.  I am encouraged by the state of Lisp and its direction.  I foresee
people actually delivering software that runs in lisp in the next 3-5 
years.

roberts@studguppy.lanl.gov (Doug Roberts) (05/05/89)

In article <7802@zodiac.UUCP> jdye@zodiac.ADS.COM (John W. Dye Jr.) writes:

> 
> On the subject of Lisp Environments for Unix (SUN) workstations.
> 
> As a former Lispm user who presently is a Sun common-lisp hacker
> I would like to make a few observations about the state and
> evolution of lisp environments for Unix workstations.  Hopefully
> these comments will spark constructive discussion about what
> we can do to speed the arrival of good lisp based programming environments.
> 
> 1)  There is not much of a market for Lisp based products (compared to
>     C or (UGH) Fortran).  This is the reason that companies cannot afford
>     to devote amazing amounts of resources to bring lisp based programming
>     environments to market.
> 
> 2)  The window systems keep changing.  First there was sunview. Then there
>     were X and NeWS.  Now there is (soon) a NeWS/X merge (with suntools
>     support also).  Developers of lisp programming environments have had
>     to program on a moving target (the window systems).  Unfortunately,
>     most of the neat stuff that lisp programming environments do are
>     window based (window-debuggers, fancy editors). 
> 
> 3)  Finally, Emacs isnt that bad.  It's programmable, and with the
>     right set of hacks and tags tables it begins to approximate the
>     typical symbolics compile-test-debug cycle that we are all so
>     familiar with.
> 
>    
> The biggest problem we still face is debugging a multiprocessing lisp
> using emacs only (we dont like sunview--OK!).  It would be nice to have
> a facility like the lucid editor provides in good old gnu emacs.
> 
> What can we do about it???
> 
> We could all get together and agree on a window-system platform and
> then build the gnu-lisp environment out of it!!
> 
> We can wait for Sun,Lucid,Franz, etc to supply us with one
>   (see point 1).
> 
> P.S.  I am encouraged by the state of Lisp and its direction.  I forsee
> people actually delivering software that runs in lisp in the next 3-5 
> years.
> 

You make some good points, especially the one regarding the small
market share of LISP. I think, however, that you missed one of the
major reasons that the Unix LISP environment is still decidedly
inferior to a LISPm: the majority of the market that is considering
LISP as a language in which to deliver applications is currently a
member of either the Unix or the VMS community: _they are not aware of
the productivity that exists on a LISPm_. Nor _will_ most of them
become aware, given the cost disparity between LISPm hardware and the
other workstations.

What can we do about it? Complain, for one thing. I know that the
major vendors listen to this forum (over the past two years I've
received feedback from a number of the major software and hardware
vendors to comments made here). I believe that some fruit has already
been borne, in part because of comments made to this list. Assisting to
the extent possible in establishing a standard window system will also
help, of course.

Regarding your P.S., we have been delivering LISP applications to our
customers for two years now.

--Doug

--

===============================================================
Douglas Roberts
Los Alamos National Laboratory
Box 1663, MS F-602
Los Alamos, New Mexico 87545
(505)667-4569
dzzr@lanl.gov
===============================================================

malcolm@Apple.COM (Malcolm Slaney) (05/05/89)

In article <ROBERTS.89May4173615@studguppy.lanl.gov> roberts@studguppy.lanl.gov (Doug Roberts) writes:
>I think, however, that you missed one of the
>major reasons that the Unix LISP environment is still decidedly
>inferior to a LISPm: the majority of the market that is considering
>LISP as a language in which to deliver applications is currently a
>member of either the Unix or the VMS community: _they are not aware of
>the productivity that exists on a LISPm_. Nor _will_ most of them
>become aware, given the cost disparity between LISPm hardware and the
>other workstations.
I think the real problem with LispMs is the very long learning curve.  I
was using Lisp for several years and then ran into a particularly nasty file
system bug.  I was amazed to see what a true wizard could do (thanks Kanef).

As far as the original question goes...it is much harder to build a good
system building tool for Lisp than it is for a conventional language like
C/Fortran/whatever.  In most conventional languages each compile can take
place independently since the compiler doesn't depend on the environment.
This is not true of Lisp.  If you strip out the environment stuff and ignore
the syntax, then Symbolics' System Construction Tool (SCT) is much like make.
I wonder if Symbolics will ever get rid of all the nasty bugs in SCT.

What I really missed on the Symbolics machine was a good source code control
system (no, file versions are a bad way of doing that).  I ended up keeping
my Lisp sources on a Sun, accessed them on the LispM with NFS and used
RCS on the Sun.  Best of both worlds.  (And using TAR to do distributions
was MUCH simpler than distribute-system....argghh!!)

Yours for bastardized systems.... :-).

								Malcolm

raymond@ptolemy.arc.nasa.gov (Eric A. Raymond) (05/06/89)

In article <7802@zodiac.UUCP> jdye@ads.com (John W. Dye Jr.) writes:
>....  It would be nice to have
>a facility like the lucid editor provides in good old gnu emacs.

You mean like Franz Allegro Lisp has?  I also think you can hack
GNU/Lucid with some of the same functionality (e.g. eval buffer, sexp).
-- 
Eric A. Raymond  (raymond@ptolemy.arc.nasa.gov)
G7 C7 G7 G#7 G7 G+13 C7 GM7 Am7 Bm7 Bd7 Am7 C7 Do13 G7 C7 G7 D+13: Elmore James

barmar@think.COM (Barry Margolin) (05/06/89)

In article <7802@zodiac.UUCP> jdye@ads.com (John W. Dye Jr.) writes:
>2)  The window systems keep changing.  First there was sunview. Then there
>    were X and NeWS.  Now there is (soon) a NeWS/X merge (with suntools
>    support also).  Developers of lisp programming environments have had
>    to program on a moving target (the window systems).  Unfortunately,
>    most of the neat stuff that lisp programming environments do are
>    window based (window-debuggers, fancy editors). 

The solution to this is coming soon.  International Lisp Associates
and Symbolics are developing a package called Y-Windows, which
provides very high-level window-based user interface facilities (it's
basically a clone of Symbolics's Dynamic Windows and Command Processor
facilities) in a number of Common Lisp implementations.  The
interfaces to this are mostly independent of the underlying window
system.  I think the prototype currently runs in Lucid Lisp using CLX,
Allegro CL using Quickdraw, Symbolics Common Lisp using the Genera
window system, and Symbolics CLOE using MS-Windows.  If a new window
system comes along, I imagine it is relatively straightforward to
write another driver.  Everything but drivers for
implementation-dependent window systems (e.g. Macintosh Quickdraw) is
written in portable Common Lisp.

Disclaimer: I have no connection with ILA and Symbolics, except as a
satisfied customer.

Barry Margolin
Thinking Machines Corp.

barmar@think.com
{uunet,harvard}!think!barmar

mujica@ra.cs.ucla.edu (S. Mujica) (05/06/89)

In article <1137@ptolemy.arc.nasa.gov> raymond@ptolemy.arc.nasa.gov (Eric A. Raymond) writes:

> In article <7802@zodiac.UUCP> jdye@ads.com (John W. Dye Jr.) writes:
>....  It would be nice to have
>a facility like the lucid editor provides in good old gnu emacs.

> You mean like Franz Allegro Lisp has?  I also think you can hack
> GNU/Lucid with some of the same functionality (i.e. eval buffer, sexp).


segre@cs.cornell.edu (Alberto M. Segre) has developed a GNU Emacs
package for interfacing to Common Lisp.  You can obtain it from him.

The help file follows:


________________________________________________________________

The file clisp.el establishes a set of key bindings and functions to
support one or more Common Lisp processes running in inferior shells.

There are two sets of key bindings established, one for editing
lisp code and the other for interacting with a lisp listener.
Both sets of bindings are available via the C-c prefix.

Editing any file in lisp mode will cause an inferior lisp to be
started automatically. Normally this is accomplished by setting the
auto-mode-alist variable in your ".emacs" file to key off of a
filename extension.

While editing a file in Lisp mode:
  C-c l    switches to the last inferior lisp process visited (see C-c e)
  M-C-l    spawns a new lisp buffer, prompting for a host.

You can start as many Lisp listeners as you like, each with a distinct
value space. We use this feature to start a Lisp on a remote machine
that is presumably faster or has more memory.

The notion of "last Lisp process" corresponds to the last Lisp
listener whose GNU window appeared on the screen. You can switch to
any Lisp process by giving a prefix argument to C-c l specifying which
*lispN* buffer to select; the "last Lisp process" notion only controls
the behavior of C-c l (and other keybindings) when no prefix is given.

To pass code from GNU to lisp:
  C-c d    evals current defun in last inferior lisp process
  C-c C-d  = (C-c d) + (C-c l)
  C-c c    compiles current defun in last inferior lisp process
  C-c C-c  = (C-c c) + (C-c l)
  C-c s    evals last sexpr in last inferior lisp process
  C-c C-s  = (C-c s) + (C-c l)
  C-c r    evals current region in last inferior lisp process
  C-c C-r  = (C-c r) + (C-c l)
  C-c b    evals current buffer in last inferior lisp process
  C-c C-b  = (C-c b) + (C-c l)
  C-c t    traces current defun in last inferior lisp process
  C-c C-t  = (C-c t) + (C-c l)
  C-c C-a  beginning of current defun
  C-c C-e  end of current defun
The GNU emacs tags facility is used to cross index your source code.
Special bindings to support this feature include:
  C-c .    finds defun for current function in other window
  C-c ,    looks for next matching defun (C-c .)
  M-.      finds defun for current function (std GNU)
  M-,      looks for next matching defun (std GNU)
  M-t      lists files indexed by (C-c .)
  M-C-t    recomputes lookup table for (C-c .) and (C-c t)

In addition, there are a few bindings that are specific to Common Lisp
support.
  C-c m    shows Common Lisp macro expansion of current form
  C-c f    shows Common Lisp documentation for current function
  C-c v    shows Common Lisp documentation for current variable
  M-q      reindents current comment or defun

Indentation has been adapted to properly indent the Interlisp-style
FOR macro distributed by segre@gvax.cs.cornell.edu

Note that the "[" and "]" characters can be used as "super-parens" in
either mode. A "]" closes as many open "(" as exist up to and including
an open "[". If no open "[" exists, "]" closes up to the top level.
The square brackets are replaced by the appropriate number of "(" and
")" in the buffer, since Common Lisp doesn't understand super-parens.
N.B.: To insert explicit square brackets, they must be prefaced by
C-q.
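For instance, given the rule just described, typing

     (setq a '(1 2 (3 (4 5]

with no open "[" pending is rewritten as

     (setq a '(1 2 (3 (4 5))))

while typing

     [foo (bar (baz x]

is rewritten as

     (foo (bar (baz x)))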

While typing to an inferior Lisp process buffer:
  C-c e    returns to last edited file of lisp code (see C-c l)
  M-C-l    spawns a new lisp buffer, prompting for a host.
  C-c l    with a prefix argument switches to that inferior lisp.

The notion of "last edit buffer" is the analogue to "last Lisp
buffer". The last GNU buffer visible that was not a Lisp process
buffer is the "last edit buffer". To go to a different buffer, use the
appropriate GNU command (C-x b).

Finally, there are some "ksh"-like extensions to shell.el to help in
debugging Lisp code:
  C-c h    show history
  C-c C-p  previous form in history list
  C-c C-n  next form in history list
  C-c C-a  position at previous prompt
  C-c C-r  search backwards in history
  C-c C-s  search forward in history


________________________________________________________________


Sergio Mujica		mujica@cs.ucla.edu
Computer Science Department, UCLA

jdye@zodiac.ADS.COM (John W. Dye Jr.) (05/07/89)

In article <40215@think.UUCP> barmar@kulla.think.com.UUCP (Barry Margolin) writes:
>In article <7802@zodiac.UUCP> jdye@ads.com (John W. Dye Jr.) writes:
>>2)  The window systems keep changing.  First there was sunview. Then there
>>    were X and NeWS.  Now there is (soon) a NeWS/X merge (with suntools
>>    support also).  Developers of lisp programming environments have had
>>    to program on a moving target (the window systems).  Unfortunately,
>>    most of the neat stuff that lisp programming environments do are
>>    window based (window-debuggers, fancy editors). 
>
>The solution to this is coming soon.  International Lisp Associates
 ************

A solution to this is coming from ILA.  Another from Sun, another from
Franz Inc.  Lucid, of course, has their own solution (independent of Sun's
solution).  Then there is Coral Common Lisp, etc.

I don't know which vendors belong to ILA.  Obviously Symbolics does.
Does Sun? Lucid? Franz?

I'm pretty sure Sun and Franz wouldn't be happy with a re-implementation
of Slimebolix windows.  Will that window system be CLOS-based?

>Barry Margolin
>Thinking Machines Corp.

The problem here is a recursive version of the window-systems problem.
Now, instead of having to write environments that target a specific
window system, we will be writing environments that target specific
window-system-independent LISP window systems.   Deja Vu.

JD
jdye@ads.com

barmar@think.COM (Barry Margolin) (05/08/89)

In article <7820@zodiac.UUCP> jdye@ads.com (John W. Dye Jr.) writes:
>In article <40215@think.UUCP> barmar@kulla.think.com.UUCP (Barry Margolin) writes:
>>The solution to this is coming soon.  International Lisp Associates
> ************

Correction gratefully accepted.  That was an arrogant remark.

>A solution to this is coming from ILA.  Another from Sun, Another from
>Franz inc.  Lucid, of course, has their own solution (independent of Sun's
>solution).  Then there is Coral Common Lisp etc...  
>
>I dont know which vendors belong to ILA.  Obviously symbolics does.
>Does Sun? Lucid? Franz?

ILA is not a consortium, it is a third-party Lisp software vendor and
consulting firm (made up mostly of ex-Symbolics employees).  Up to
now, their products have been targeted primarily to Symbolics
machines.  In the case of Y-Windows, however, they've been working
closely with Symbolics to bring the power of Dynamic Windows to other
Lisp environments.

>Im pretty sure Sun and Franz wouldn't be happy with a re-implementation
>of Slimebolix windows.  Will that window system be CLOS based? 

Y-Windows (and Symbolics's Dynamic Windows) is NOT a window system.
It is a window-based user interface management library.  It provides
object-oriented typed I/O, output recording, dialogs, structured and
device-independent graphics, and other very high-level operations.

Yes, it is implemented using CLOS.

>The problem here is a recursive version of the window-systems problem.
>Now, instead of having to write environments that target a specific
>window system, we will be writing environments that target specific
>window-system-independent LISP window systems.   Deja Vu.

So what's the problem with this?  Without standards at the high level,
everyone has to reinvent things.  Currently, the Xlib world is way
ahead of the CLX world because they've got lots of standard widget
libraries; we don't even have a portable way to put up a pop-up menu!
Y-Windows is a very good Lisp widget library, and it has the advantage
that it isn't X-specific the way the popular X widgets are (Sun is
apparently doing a similar thing with their XView toolkit, which works
with X, NeWS, and SunView).

Barry Margolin
Thinking Machines Corp.

barmar@think.com
{uunet,harvard}!think!barmar

jeff@aiai.ed.ac.uk (Jeff Dalton) (05/09/89)

In article <30132@apple.Apple.COM> malcolm@Apple.COM (Malcolm Slaney) writes:
   As far as the original question goes...it is much harder to build a
   good system building tool for Lisp then it is for a conventional
   language like C/Fortran/whatever.  In most conventional languages each
   compile can take place independently since the compiler doesn't depend
   on the environment.  This is not true of Lisp.

   What I really missed on the Symbolics machine was a good source code
   control system (no file versions are a bad way of doing that).  I
   ended up keeping my Lisp sources on a Sun, accessed them on the LispM
   with NFS and used RCS on the Sun.

I think you are right that it is hard to build good tools for Lisp.
Another problem is that Lisp gets certain parts of the programming
environment, such as TRACE, more or less for free.  Since the free
tools are pretty useful, there's less pressure to develop better tools
than there might be otherwise.
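For instance, tracing needs nothing beyond the language itself (a minimal
sketch):

     (defun fact (n)
       (if (zerop n) 1 (* n (fact (1- n)))))

     (trace fact)        ; entry and exit of every call are now printed
     (fact 3)
     (untrace fact)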

I also agree that Unix tools can be used effectively with Lisp.
Indeed, it is possible to use Lisp in a way that allows compilations
to be more or less independent.  Such techniques are often used with
Lisps, such as KCL and Franz, that are packaged so that file
compilation can be invoked by typing a command to the shell.

In C, files are not completely independent, but it is (fortunately)
possible to put the necessary information about files not currently
being compiled into ".h" files for #include.  Some Lisps (such as
Franz) have an include, but usually a file has to be loaded.  So,
where in C you'd write

     #include "macros.h"

in Common Lisp you have to write something like this:

     (eval-when (eval compile)
       (load "macros"))

If you start a fresh Lisp for each file compilation, and if each file
loads in the extra information it needs to be compiled, the situation
for Lisp becomes very similar to that for C.  Moreover, you can use
makefiles, just as you would for C, and other Unix tools such as RCS.
And if A.lisp needs to be compiled before you can compile B.lisp, you
just put something like

     B.o: A.o

in the makefile.
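Spelled out a little more fully, the makefile entry might look something
like this (only a sketch: the command line assumes a Lisp, such as KCL,
that can be driven from the shell and reads forms from standard input,
here invoked simply as "lisp"):

     B.o: B.lisp A.o
     	echo '(compile-file "B.lisp")' | lisp

(Remember that make requires the command line to be indented with a tab.)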

jeff@aiai.ed.ac.uk (Jeff Dalton) (05/09/89)

In article <ROBERTS.89May4173615@studguppy.lanl.gov> roberts@studguppy.lanl.gov (Doug Roberts) writes:
   I think, however, that you missed one of the major reasons that the
   Unix LISP environment is still decidedly inferior to a LISPm: the
   majority of the market that is considering LISP as a language in which
   to deliver applications is currently a member of either the Unix or
   the VMS community: _they are not aware of the productivity that exists
   on a LISPm_.

I think you have identified an important point.  However, I would
guess that most of the people who implement Lisps for Unix (Lucid,
Franz Inc, et al.) do have a fairly good idea of what the Lisp Machines
accomplish.  So why don't they provide the same thing on conventional
machines?

I think it's possible to provide environments that are very similar.
People here who use Inference ART (Automated Reasoning Tool, or
something like that), which is built on top of Lisp, report that the
ART environment on a Sun is very close to that on a Symbolics,
although at the Lisp level the debugger probably isn't as good.

However, possible is not the same as easy, and I suspect the Lisp
implementors have not had sufficient resources to let them prepare
environmental tools as soon as they'd have liked.

Indeed, I think it is often easier to develop tools for languages,
such as C, that are more straightforwardly based on files, statements,
and lines of code.  On Suns, for example, dbxtool can get some very
useful effects simply by displaying the relevant lines of a source
file.  Most Lisp debuggers find it very difficult to relate evaluation
to source code, and this is perhaps a greater problem in Common Lisp
because so much tends to be done by macros.  Macro expansions often
look very little like what the user wrote.  This can be true in C as
well, but most of the time it doesn't matter, because the debugger
works with lines of source rather than with individual expressions.
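As a small illustration (the exact expansion is implementation-dependent;
this only shows the general shape), a loop as simple as

     (dolist (x items) (process x))

might macroexpand into something like

     (do ((#:g42 items (cdr #:g42))
          (x nil))
         ((endp #:g42) nil)
       (setq x (car #:g42))
       (process x))

which may expand further still, and it is the expansion, not the original
DOLIST, that the debugger usually gets to see.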

raymond@ptolemy.arc.nasa.gov (Eric A. Raymond) (05/09/89)

How does this compare to Common-Windows?  I believe it provides
Xerox/KEE style routines.  I know that Franz sells it as a product for
any machine with X.

-- 
Eric A. Raymond  (raymond@ptolemy.arc.nasa.gov)
G7 C7 G7 G#7 G7 G+13 C7 GM7 Am7 Bm7 Bd7 Am7 C7 Do13 G7 C7 G7 D+13: Elmore James

uda@majestix.ida.liu.se (Ulf Dahlen) (05/09/89)

In article <7802@zodiac.UUCP> jdye@ads.com (John W. Dye Jr.) writes:
>On the subject of Lisp Environments for Unix (SUN) workstations.
>
>As a former Lispm user who presently is a Sun common-lisp hacker
>I would like to make a few observations about the state and
>evolution of lisp environments for Unix workstations.  Hopefully
>these comments will spark constructive discussion about what
>we can do to speed the arrival of good lisp based programming environments.

There is a great deal of "religion" when it comes to programming languages.
Many people back off when they hear the word "Lisp", probably thinking
things like "slow", "parentheses", "unreadable" and "toy-language".

At the same time, in the C, Ada, Pascal etc community, there is an
evolution towards incremental programming, interactive debugging,
incremental (re)compiling and so on. All these things that we Lisp machine
users have been working with for years. It often seems to me that there
are hundreds of developers reinventing the wheel here.

They say that the Lisp machine concept is dead. The phrase now is
"general all-purpose OS and workstations". Perhaps that is true, but
as a friend said the other day: "I'd rather develop my C-program in
Lisp first on a Lisp machine and then port it to a SUN when it's
fully debugged(!), than work directly with C and UNIX." Spare your
flames: I am not saying Lisp is better than C, but that the programming
environment on a Lisp machine is superior to the "environment" on, say,
a SUN.

Just some idle thoughts...
__________
Ulf Dahlen
Dept of Computer & Info Science, University of Linkoping, Sweden
Troskaregatan 51:23       |     uda@ida.liu.se
S-583 30  LINKOPING       |     uda@majestix.liu.se, uda@liuida.UUCP
SWEDEN                    |     {mcvax,munnari,seismo}!enea!liuida!uda
"The beginning is a very delicate time."

weissman@apollo.COM (Mark Weissman) (05/09/89)

I'm reposting this message about an interface to LUCID and GNU because
it seems relevant to this discussion.  Also, not to plug our system
but just to disagree with the statement that it's difficult to compile
lisp files standalone:

*PLUG ON* (sorry...)
  Apollo DOMAIN/CommonLISP comes with a package to interface to our
  source code control system (DSEE) which allows distributed building of
  large lisp applications across distributed networks on a per file
  basis and real version control with branching, baselevels etc.
*PLUG OFF*

Subject: Package for ZMACS functionality with GNU Emacs and Common Lisp

The newest version of Zmacs functionality package may be found on:
   labrea.stanford.edu (36.8.0.47)
in:
   pub/gnu/apollo_emacs_support.tar.Z

This Interface includes functions to provide many features found on
a lisp machine and many other useful features.  It was designed for
working with DOMAIN/CommonLISP on APOLLO workstations although it 
should not be a major job to port it to other systems.  This should
also work with other LUCID Common Lisp systems under X windows.

Mark Weissman
weissman@apollo.com

Many features in this package are useful even if you are not working
with Common Lisp.

Completion menus are now mouse-sensitive.  Command history and yanking
have been defined for minibuffer eval-expression, shell modes, and
inferior-lisp mode, and yanked symbols can now be restricted to only
those matching some pattern.

Incremental Compilation for inferior lisp and emacs lisp is available.
For common lisp, forms will be compiled in proper packages.

Inferior shell history & command completion exists for inferior-lisp,
unix shells and minibuffer command input (M-<Esc>). 

Many of the functions that work with emacs lisp have also been made
to work with inferior-lisp as well as adding other features for non-lisp
users.  Where possible, all new functions for working with inferior-lisp
have been extended to work with emacs lisp and vice versa.

List Notifications can be used to look at past messages.

Real Attribute Lists for Mode, Package, Base, IBase, Readtable, Syntax etc.
can be used with inferior lisp.

Sectionized buffers and querying of inferior lisp process can be used to 
find source code for definitions.

Inferior lisp maintains command history so previous forms can be yanked
to current input line.

Many commands exist for querying about inferior lisp forms.

Commands exist for operating on only changed lisp definitions.

The following is a list of most commands available from this environment:

MOUSE COMMANDS:
  apollo:mouse-move-mark
  apollo:mouse-find-source-code ;; Click on lisp symbol to find source code
  apollo:mouse-click-right      ;; Context sensitive mouse key
  apollo:mouse-find-file        ;; Click on path to edit a file
  apollo:grab-thing-click-right 
  apollo:list-modifications-click-right
ATTRIBUTE LIST COMMANDS: 
  parse-attribute-list
  update-attribute-list
  common-lisp-mode
  set-syntax
  set-mode
  set-package
PARENTHESIS AND COMMENT COMMANDS:
  close-definition
  find-unbalanced-parentheses
  uncomment-out-region
  comment-out-region
SECTION AND FIND-SOURCE-CODE COMMANDS:
  sectionize-buffer
  find-file-no-sectionize
  list-sections
  apollo:key-find-source-code ;; Locate source code by querying lisp
FILE EVALUATION AND COMPILATION COMMANDS:
  compile-file
  load-file
  load-compile-file
EVALUATION AND COMPILATION COMMANDS:
  evaluate-buffer
  compile-buffer
  apollo:evaluate-last-sexp
  apollo:lisp-send-defun
  apollo:lisp-compile-defun
  evaluate-region
  evaluate-region-hack
  compile-region
COMMON LISP EVALUATION:
  evaluate-common-lisp
  evaluate-and-replace-into-buffer
  evaluate-into-buffer
CHANGED DEFINITION COMMANDS:
  list-modifications
  edit-next-definition
CHANGED DEFINITION COMMANDS:
  list-buffer-changed-definitions
  evaluate-buffer-changed-definitions
  compile-buffer-changed-definitions
  edit-buffer-changed-definitions
  list-changed-definitions
  evaluate-changed-definitions
  compile-changed-definitions
  edit-changed-definitions
CALLERS COMMANDS:
  edit-callers
  next-caller
  list-callers
  who-calls
DESCRIPTION COMMANDS:
  disassemble-lisp-code
  quick-arglist
  macro-expand-expression
  where-is-symbol
  describe-variable-at-point
  describe-function-at-point
  show-lisp-documentation
  apropos-symbol-at-point
  what-package
INFERIOR SHELL COMMANDS:
  csh
  sh
INFERIOR LISP COMMANDS:
  lisp
YANK HISTORY COMMANDS:
  apollo:yank
  apollo:yank-command
  yank-prev-command
  yank-prefix-command
  apollo:yank-pop
COMPLETION COMMANDS:
  apollo:shell-complete-symbol
  apollo:lisp-complete-symbol
MISCELLANEOUS COMMANDS:
  list-notifications
  delete-all-font-info
  apollo:switch-to-buffer
  fix-man-output
  grep-to-temp-buffer
  insert-date
  pagify-lisp-buffer
  print-buffer-apollo
  print-region-apollo
  apollo:beginning-of-line

jeff@aiai.ed.ac.uk (Jeff Dalton) (05/10/89)

In article <1258@majestix.ida.liu.se> uda@majestix.ida.liu.se (Ulf Dahlen) writes:
>I am not saying Lisp is better than C but that the programming
>environment on a Lisp machine is superior to the "environment" on,
>say, a SUN.

There are certainly advantages to using a Lisp machine (and Lisp
generally).  For one thing, it's nice to be able to change one
function without having to relink the whole program.  But the
Sun C "environment" is not without its virtues.  Dbxtool is
an effective debugger, for example, and source-code control
systems (sccs, rcs) and make have to count for something too.

raymond@ptolemy.arc.nasa.gov (Eric A. Raymond) (05/10/89)

In article <431e7355.12972@apollo.COM> weissman@apollo.COM (Mark Weissman) writes:
>
>but just to disagree with the statement that its difficult to compile
>lisp files standalone:

I think the original poster was pointing out the standard macro/inline
function problem.  That is, expressions are compiled in the current
environment.  Macros/inline functions referenced in an expression get
whatever version is currently in your world.  If you didn't
load the macro def, the compiler assumes it is a function (and may or
may not complain about it later).

A similar situation occurs in the C world, where you have macros
(#defines), function prototypes, and variable definitions to consider
(the latter two cases are usually caught by a good compiler).  This is
a bit different in that definitions are valid for only a single
source file.  Thus each file is responsible for including these
definitions.

The REQUIRE construct, if used religiously, can provide similar
functionality in many cases.
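A minimal sketch of that pattern (the module and file names are made up
for illustration):

     ;;; macros.lisp -- compile-time definitions, packaged as a module
     (provide "MACROS")

     (defvar *foo* nil)

     (defmacro with-foo (&body body)
       `(let ((*foo* t)) ,@body))

     ;;; at the top of every file that uses WITH-FOO
     (eval-when (compile load eval)
       (require "MACROS" "macros"))

That way a fresh Lisp compiling any single file still sees the macro
definitions it needs.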

Of course there are extra things that a DEFSYSTEM supplies ....

Summary: Lisp and C share similar problems.


>  distributed building of
>  large lisp applications across distributed networks on a per file
>  basis

This really doesn't make much sense. 

 - Lisp development typically does not require the same
   edit-compile-link cycle that other languages do.  A make-like facility
   is not as important.

 - Since there is rarely the need to compile lots of code at one time,
   distribution of compilation (which I assume yields speedup) is not
   as great a win.  Exceptions to this may include building old
   versions of software, distributing customized code; these are
   typically infrequent operations, so speedup is not such a great
   win, again.

 - I assume that you must have n distinct lisp worlds on n machines.
   So how do you get around the problems I discussed above (the need
   to have proper environment loaded)?  In C you are forced to read in
   your #include files for each source file compiled.  In Lisp, you
   just load it into your world once.  If you have n Lisps, each must
   load the environment in.

-- 
Eric A. Raymond  (raymond@ptolemy.arc.nasa.gov)
G7 C7 G7 G#7 G7 G+13 C7 GM7 Am7 Bm7 Bd7 Am7 C7 Do13 G7 C7 G7 D+13: Elmore James

weissman@apollo.COM (Mark Weissman) (05/11/89)

Distributed Builds of large lisp systems:

> Macros/inline functions referenced in an expression get
> whatever version which is currently in your world.  
> If you didn't load the macro def, the compiler assumes it is a
> function (and may or may not complain about it later).

This system loads the minimal subset (user specified) necessary to
compile a file into an initially virgin lisp world.  If the required
files are compiled the binaries are loaded, otherwise the sources.  I
beleive that it is the case that a given file will usually only
require a handful of others for compilation to proceed.  With Lucid,
it is much faster to load an interpretted file than to compile and
load a file.  

These mini-builds can then be distributed over a network.  The
overhead cost of starting up a new lisp world is minimized by having
each world remain resident and reusable throughout the course of each
build.  Required files present in each world do not need to be
reloaded when compiling subsequent files.

> - Lisp development typically does not require the same
>   edit-compile-link cycle that other languages do.  A make-like facility
>   is not as important.

I disagree.  In a large lisp system, when a widely used macro is
redefined many files may need to be recompiled.  I have worked on
projects where many entire days were spent recompiling a large system.  It
is true that for a small system with only one developer there is less
need for source control and make facilities.  Due to the fairly high
overhead involved with lisp applications, they tend to be large systems.

Mark Weissman
weissman@apollo.com

knighten@pinocchio (Bob Knighten) (05/11/89)

It is interesting to note that MIT's AI Lab seems to be rapidly moving to Unix
workstations and away from Lisp machines.

layer@snooze.UUCP (Kevin Layer) (05/11/89)

In answer to jdye@ads.com (John W. Dye Jr.):

    3)  Finally, Emacs isnt that bad.  It's programmable, and with the
	right set of hacks and tags tables it begins to approximate the
	typical symbolics compile-test-debug cycle that we are all so
	familiar with.

Not true.  We have coupled GNU Emacs and Allegro CL very tightly via
TCP/IP communication channels using multiprocessing (a scheduler built
on stack groups) in Allegro CL.  Things like completion of Lisp
symbols and M-. from Emacs dynamically query the Lisp environment for
information, in real time (completion of symbols is instantaneous on a
Sun3/160).

    The biggest problem we still face is debugging a multiprocessing lisp
    using emacs only (we dont like sunview--OK!).  It would be nice to have
    a facility like the lucid editor provides in good old gnu emacs.

Again, we have almost reproduced a Lisp machine environment using just
GNU Emacs.  The `almost' comes from the fact that the interpreters for
Emacs and Lisp are different (and always will be), so you must write
editor programs in Emacs Lisp and everything else in Common Lisp.  I'm
not sure what problems you are talking about here, but our debugger
allows the interactive debugging of a process in one window from
another.

	Kevin Layer
	Franz Inc.

miller@CS.ROCHESTER.EDU (Brad Miller) (05/11/89)

I really wasn't going to post anything more on this stuff, but oh hell.
There are no flames in this article. I'm only, AS ALWAYS, Seeking knowledge.

    Date: 11 May 89 01:03:36 GMT
    From: layer@snooze.UUCP (Kevin Layer)

	The biggest problem we still face is debugging a multiprocessing lisp
	using emacs only (we dont like sunview--OK!).  It would be nice to have
	a facility like the lucid editor provides in good old gnu emacs.

    Again, we have almost reproduced a Lisp machine environment using just
    GNU Emacs.  The `almost' comes from the fact that the interpreter for
    Emacs and Lisp are different (and always will be), so you must write
    editor programs in Emacs Lisp and everything else in Common Lisp.  I'm
    not sure what problems you are talking about here, but our debugger
    allows the interactive debugging of a process in one window from
    another.

OK, I'll bite and would appreciate your response (I'm used to lispms; the only
non-lispm CL I can currently compare to is KCL - an environment-free lisp
:-)

Can you handle >1 lisp process running in the same (shared) address space?
That is, are there stack groups and coroutining? Dynamic Closures (basically
a stack group not run in a separate process)? Can I indeed debug a running
process from another process in a different window (examine vars, etc. while
the thing runs)? Can I invoke the editor based on the current stack frame?
Examine lexical closures not currently on the stack? Examine open catch
tags? error tags? applicable CLOS before/after/primary methods? Upon
signalling an error in a bad argument to some instruction, can the debugger
move the stack immediately to the frame that originally passed the bad value
(presumably up the stack via parameters, but possibly via a dynamic
binding)? How about invoking the debugger when a var is rebound? written to?
read? Can I mouse-click on anything printed in the window and examine its
underlying representation? (A list may print as (A B #) but really be
something else; likewise for, e.g., structured objects with print methods.)

And, since this is almost a lispm environment modulo the emacs lisp
language, can I incrementally compile a new scheduler and have it work on my
workstation? What about other parts of the system? Full sources provided? In
Lisp? Is my primary input editor to any window Emacs (I do so hate to have to
reimplement the world to provide decent input editing to the user when I
want to accept data)? The point here is that real hackers want the universe
essentially part of the language, not a separate OS. The lispms give you
that - and that's what I want from anyone claiming to be giving me
"essentially a lispm environment".

Last time I looked seriously at the non-lispm lisp market was 18 months ago,
and it was clear then that the lispm was worth the money, mainly because of
the debugger, but also because of these other issues I've touched on. There
also seemed to be serious problems if the user defined "interesting" reader
macros when the system handled rubouts. (Like, it didn't correctly undo/redo
the functions the macros called). If you can give me that, with the
"approved" CL extensions (error system, CLX, CLOS) and at least 3620 level
performance without 3620 cost (2mw, 380mb = $20k) - your system starts
becoming quite a viable alternative.

"A lispm chauvinist with an open mind" (oxymoron of the month club selection
:-),

layer@snooze.UUCP (Kevin Layer) (05/12/89)

Brad,
  Yes we have *true* stack groups which share everything (but stack,
obviously)--there is one UNIX process.  All the features of our
debugger work generally on any `lightweight lisp' process (LLP), which
means you can return, restart, examine locals, etc.  In fact, Allegro
Common Windows uses one LLP for each window.  

>> Can I invoke the editor based on the current stack frame?

I'm not familiar with this feature (maybe if I were we would do it).

>> Examine lexical closures not currently on the stack?

Closures can be inspected just as functions can be.

>> Examine open catch tags? error tags? 

Yes.  What are "error tags"?  If this has to do with the (kmp's)
condition system, then the answer is that we fully support version 18.

>> applicable CLOS before/after/primary methods? 

We don't have a CLOS browser, but it is on our list for version 2 of
Allegro Composer.

>> Upon
>> signalling an error in a bad argument to some instruction, can the debugger
>> move the stack immediately to the frame that originally passed the bad value
>> (presumably up the stack via parameters, but possibly via a dynamic
>> binding)? How about invoking the debugger when a var is rebound? written to?
>> read? 

If you mean can we search back up the stack for an object and set the
frame point to that frame, well, now that you mention it that sounds
pretty handy and easy to do.  I've just put it on our "request for
enhancement" list.

We don't currently have "watch points", but I admit it would be nice.

>> Can I mouse click on anything printed in the window and examine it's
>> underlying representation? (a list may print as (A B #) but is really
>> something else, also for, e.g. structured objects with print methods).

For objects that have print/read consistency, we have an inspect which
will do what you want.

Regarding source code, yes, we do license it.  And yes, if you have
source code you may redefine the scheduler.  Even without source, you
can advise any function to change its behaviour.
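For readers who haven't seen advice before, the general idea can be
sketched in plain Common Lisp (this is only an illustration of the
concept, not our actual advise interface):

     (defun add-before-advice (name advice)
       "Run ADVICE on NAME's arguments before the original definition."
       (let ((original (symbol-function name)))
         (setf (symbol-function name)
               #'(lambda (&rest args)
                   (apply advice args)
                   (apply original args)))))

     ;; e.g., log every call to some internal function you have no
     ;; source for:
     ;; (add-before-advice 'scheduler-step
     ;;   #'(lambda (&rest args) (format t "~&step: ~S~%" args)))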

>> My primary input editor to any window Emacs (I do so hate to have to
>> reimplement the world to provide decent input editing to the user when I
>> want to accept data)?

When version 19 of GNU Emacs is available, then all your interaction
with Lisp could be through type-in windows managed by X11/GNU Emacs.

The integration between GNU Emacs and Lisp is tight, as I said.  Most
everything one does in the interface is transparent, *except*
customizing the editor.  If you spend a lot of time doing this, then
this might be a drawback.

>> There
>> also seemed to be serious problems if the user defined "interesting" reader
>> macros when the system handled rubouts. (Like, it didn't correctly undo/redo
>> the functions the macros called).

Our reader does back up and re-read expressions so that reader macros
are handled correctly--this is done by Allegro Common Windows.

>> If you can give me that, with the
>> "approved" CL extensions (error system, CLX, CLOS) and at least 3620 level
>> performance without 3620 cost (2mw, 380mb = $20k) - your system starts
>> becoming quite a viable alternative.

We have v18 of the condition system, CLX (highly optimized), and CLOS.
I think you'll find the performance of Allegro CL on RISC processors is
quite amazing.  If that isn't enough, we're on the Cray.

	Kevin Layer
	Franz Inc.

aihaug@AUSTIN.LOCKHEED.COM (Daniel A Haug) (05/12/89)

> Brad,
>   Yes we have *true* stack groups which share everything (but stack,
> obviously)--there is one UNIX process.  All the features of our
> debugger work generally on any `lightweight lisp' process (LLP), which
> means you can return, restart, examine locals, etc.  In fact, Allegro
> Common Windows uses one LLP for each window.  

I must admit that this is much further advanced than I had previously
expected.  I, too, am seeking more information.  To that end, what
about a maintenance utility equivalent to the System Construction Tool
on the Symbolics?  This includes tracking system-level and file-level
versions, and support for an extensive Patch facility?  I've already
heard Apollo talking about their `make'-equivalent system.  This is
no good if it can't retain and track specific file versions, and
system versions.  Along this line would be distribution/restoration
of systems (I'm talking about something more than `tar' -- the ability
to generate a system distribution for a specific version, with all the
patches, sources, etc., in a single command).  I've also seen a few
public domain defsystem packages.  But, what I've seen so far doesn't
even come close to what we need.

My motivation?  Our project is now several hundred thousand lines of
code in size.  We have several subcontractors, and we exchange systems
back and forth on a daily basis.  We would DIE without the configuration
management capabilities provided by SCT.  We also support several
additional sites with full software suites, and some sites with restricted
suites (e.g. no sources).  We do this all currently with half a person.

What about ``advising'' functions?  I could live without it... but it
has proved itself as an excellent debugging tool.

Dan Haug

Internet: haug@austin.lockheed.com
uucp:     ut-emx!lad-shrike!aihaug

lou@bearcat.rutgers.edu (Lou Steinberg) (05/12/89)

In all this discussion, I still haven't seen anyone mention the main
reason (IMHO) that Unix workstations are usually preferable to lisp
machines.  It has nothing to do with the intrinsic merit of the
machine architecture or the software system.  Rather, Unix boxes are
better because "everyone has one", both in the sense that large numbers
of people own them and in the sense that large numbers of companies
sell them.

Here at Rutgers, for instance, we have two Symbolics machines and
probably over a hundred Suns.  The Symbolics machines may
intrinsically be easier to support, but I simply can't tell.  Any
possible intrinsic advantage of Symbolics is swamped by the fact that
there are so many people around here who can answer questions about
the Suns, and write or collect software for the Suns, and no one who
is a real Symbolics wizard.

A similar argument applies to the hardware architecture.  Motorola
sells so many 68xxx chips it can afford a Whole Lot of engineers to
tweak every last nanosecond out of every last transistor.  Symbolics
simply can't afford that degree of low-level optimization, and it has
turned out that the advantage of a specialized architecture just
doesn't make up for the disadvantage of low volume and hence less
optimization.  With the advent of SPARC (Sun-4) and other RISC
architectures, the advantages of Symbolics' specialized micro-code
have gotten even less, since with RISC the machine language is
essentially micro-code, and the instruction cache fills the role of
the micro-code memory.  (And SPARC was in fact designed partly with
Lisp in mind.)

Finally, the fact that lots of companies sell Unix boxes means that
Sun has to be competitive in price, performance, and service.  If they
try to charge too much I can always go elsewhere.  With lisp machines
you have much less choice.  (Otherwise symbolics could not charge what
it does.)

Let me end by emphasizing that these considerations are not always
decisive for everyone - I've seen the Allegro Composer environment
from Franz that was mentioned here, and I agree that the Unix-based
programming environments are rapidly closing the gap, but they aren't
fully there yet.  However, for most situations, I think the issues
I've mentioned here do mean Unix-based Lisps are preferable.
-- 
					Lou Steinberg

uucp:   {pretty much any major site}!rutgers!aramis.rutgers.edu!lou 
arpa:   lou@aramis.rutgers.edu

layer@franz.UUCP (Kevin Layer) (05/14/89)

In answer to "aihaug@AUSTIN.LOCKHEED.COM (Daniel A Haug)":

>> what
>> about a maintenance utility equivalent to the System Construction Tool
>> on the Symbolics?  This includes tracking system-level and file-level
>> versions, and support for an extensive Patch facility?  

I am not familiar with this system, but I can offer my thoughts on
what I use to manage a large software system on UNIX, the source to
Allegro CL.  The main tool is RCS (Revision Control System), which is
publicly available--it is one solid piece of freeware, as I've only
had to fix one bug in it in the many years I've used it.  Second, I
use the defsystem in Allegro CL, which is derived from a public one I
snarfed a couple of years ago off the net (sorry, I forgot the name of
the author).  I modified it heavily for the intended use of building
and maintaining Allegro CL, but it does not have an interface to RCS
(nor does it know about file versions), for the simple reason that I
never needed it.  As for distribution of Allegro CL, I use a
combination of UNIX tools (in the form of shell scripts) which make
the creation and maintenance of distributions/releases quite easy.

>> What about ``advising'' functions?  I could live without it... but it
>> has proved itself as an excellent debugging tool.

Yes, we have this, and I agree that it is a good debugging tool.

	Kevin Layer
	Franz Inc.

conliffe@caen.engin.umich.edu (Darryl C. Conliffe) (05/15/89)

In article <260@shrike.AUSTIN.LOCKHEED.COM>, aihaug@AUSTIN.LOCKHEED.COM (Daniel A Haug) writes:
> I must admit that this is much further advanced than I had previously
> expected.  I, too, am seeking more information.  To that end, what
> about a maintenance utility equivalent to the System Construction Tool
> on the Symbolics?  This includes tracking system-level and file-level
> versions, and support for an extensive Patch facility?  I've already
> heard Apollo talking about their `make'-equivalent system.  This is
> no good if it can't retain and track specific file versions, and
> system versions.  Along this line would be distribution/restoration
> of systems (I'm talking about something more than `tar' -- the ability
> to generate a system distribution for a specific version, with all the
> patches, sources, etc., in a single command).  I've also seen a few
> public domain defsystem packages.  But, what I've seen so far doesn't
> even come close to what we need.

I currently use DSEE to manage the separate source files
used in building a system.  Each source file can be
tagged with a user-defined name to specify the
"version" of the build.  Releases can be reconstructed --
even those requiring the use of previous compilers to
function correctly.  Objects (files) can also be
monitored, so that changes alert others who need to be
advised of potential impacts (e.g. upon documentation)
wrought by the changes.  Most importantly, a full history
of who changed what is maintained.

dlw@odi.com (Dan Weinreb) (05/18/89)

In article <421@skye.ed.ac.uk> jeff@aiai.ed.ac.uk (Jeff Dalton) writes:

   Indeed, I think it is often easier to develop tools for languages,
   such as C, that are more straightforwardly based on files, statements,
   and lines of code.  On Suns, for example, dbxtool can get some very
   useful effects simple by displaying the relevant lines of a source
   file.  Most Lisp debuggers find it very difficult to relate evaluation
   to source code, and this is perhaps a greater problem in Common Lisp
   because so much tends to be done by macros.  Macro expansions often
   look very little like what the user wrote.  This can be true in C as
   well, but most of the time it doesn't matter, because the debugger
   works with lines of source rather than with individual expressions.

Yes, the reason it's hard in Lisp is entirely due to macros.  Look at
the Symbolics debugger: when it is operating on C, Fortran, or Pascal
code, it works completely correctly with the original source code (and
works a whole lot better than dbxtool, by the way).  But doing the
same for Lisp is significantly harder, because Lisp macros are so
powerful.  It's a tradeoff: it is harder to write a source-level
debugger for an extensible language like Lisp than for a fixed
language like C.  Of course, Lisp's extensibility (arising from
macros) is one of the biggest advantages of the Lisp language.  C
macros are much wimpier, and therefore easier to deal with.  Still,
the fact that the C debuggers cannot handle macros can be a true pain
in the neck, as I've discovered in practice.

roberts@studguppy.lanl.gov (Doug Roberts) (05/18/89)

In article <421@skye.ed.ac.uk> jeff@aiai.ed.ac.uk (Jeff Dalton) writes:

> In article <ROBERTS.89May4173615@studguppy.lanl.gov> roberts@studguppy.lanl.gov (Doug Roberts) writes:
>    I think, however, that you missed one of the major reasons that the
>    Unix LISP environment is still decidedly inferior to a LISPm: the
>    majority of the market that is considering LISP as a language in which
>    to deliver applications is currently a member of either the Unix or
>    the VMS community: _they are not aware of the productivity that exists
>    on a LISPm_.
> 
> I think you have identified an important point.  However, I would
> guess that most of the people who implement Lisps for Unix (Lucid,
> Franz Inc, at al) do have a fairly good idea of what the Lisp Machines
> accomplish.  So why don't they provide the same thing on conventional
> machines?
> 
> I think it's possible to provide environments that are very similar.
> People here who use Inference ART (Automated Reasoning Tool, or
> something like that), which is built on top of Lisp, report that the
> ART environment on a Sun is very close to that on a Symbolics,
> although at the Lisp level the debugger probably isn't as good.
> 
> However, possible is not the same as easy, and I suspect the Lisp
> implementors have not had sufficient resources to let them prepare
> environmental tools as soon as they'd have liked.
> 
During my visit to Lucid a few months ago, I got the impression that
many (but not all) of the internals people there had a previous
history with LispMs. I did receive an interesting comment to my
suggestion that they should try to emulate the functionality of
Symbolics' window debugger in their lisp environment. The comment was
something like" "Groan... Just what we need... To make our product
more Symbolics-like." 

I suspect that the Unix lisp developers' priorities regarding
development environments are changing. After one of my previous
postings in which I complained about the lack of good debuggers &
inspectors, I received mail from Lucid and Ibuki, and phone calls from
Franz and Envos. Ibuki, Franz, and Envos all offered their products as
examples of "new, improved" more functional development lisp
environments. I haven't heard directly from Lucid regarding any
efforts they might have on-going, but I did read in a previous posting
from this group about a "Cadillac" environment that they are working
on?? I also don't know what Sun might be working on, if anything (Sun
of SPE? :-})..

On that note...

--Doug
	


--

===============================================================
Douglas Roberts
Los Alamos National Laboratory
Box 1663, MS F-602
Los Alamos, New Mexico 87545
(505)667-4569
dzzr@lanl.gov
===============================================================

jeff@aiai.ed.ac.uk (Jeff Dalton) (05/24/89)

In article <31670@sri-unix.SRI.COM> roberts@studguppy.lanl.gov (Doug Roberts) writes:
>I did receive an interesting comment to my suggestion that they should try
>to emulate the functionality of Symbolics' window debugger in their lisp
>environment. The comment was something like" "Groan... Just what we 
>need... To make our product more Symbolics-like." 

I guess some of the Unix Lisp vendors are under commercial pressure to
make Lisp fit better with conventional languages and conventional ways
of doing things rather than make systems for Lisp hackers (such hackers
being in limited supply).  There is often also a strong negative reaction
to Lisp systems that are many megabytes in size.  To some extent this is
unfair, because calculations of the size of C systems tend to omit all
the things C gets "free" from Unix, but it is not completely unfair.

Lisp often ends up in competition with C.  People disagree about the
best way to compete (become more C-like?  less C-like?), but it has
proved difficult to convert the world to Lisp-like ways of thinking.
And so the "don't be like Symbolics" approach has its attractions.

jdu@ihlpf.ATT.COM (John Unruh, NY9R) (05/25/89)

In article <469@skye.ed.ac.uk> jeff@aiai.UUCP (Jeff Dalton) writes:
>
>I guess some of the Unix Lisp vendors are under commercial pressure to
>make Lisp fit better with conventional languages and conventional ways
>of doing things rather than make systems for Lisp hackers (such hackers
>being in limited supply).  There is often also a strong negative reaction
>to Lisp systems that are many megabytes in size.  To some extent this is
>unfair, because calculations of the size of C systems tend to omit all
>the things C gets "free" from Unix, but it is not completely unfair.
>

There is one big distinction between the things C programmers get from
the UNIX (R) Operating System and what a LISP programmer gets from a
good LISP system.  Ususally the tools used for C programming reside on
the disk except for the moments when they are in use (modulo the sticky
bit).  If a tool (especially a large one) is used in a LISP image, it
increases the size of the LISP image's virtual memory.  If the LISP
implementation is good at saving memory, programmers' tools will be
autoloading files of some sort, or at least not be bound into the image
in such a way that every delivery system must have them.

Many machines have limits on the maximum process size, and have problems
with really big processes.  This may be an artifact of how conventional
programming languages work.  Most C programs are fairly small, and the
environment is not integrated in the same way as a Lisp machine, so the
whole thing tends to be less memory intensive.
-- 
                               John Unruh
                               AT&T-Bell Laboratories
                               att!ihlpf!jdu
                               (312)979-6765

aarons@syma.sussex.ac.uk (Aaron Sloman) (05/27/89)

I can't resist reacting to this (sorry Jeff)

jeff@aiai.ed.ac.uk (Jeff Dalton) writes:

> Organization: AIAI, University of Edinburgh, Scotland
>
> Lisp often ends up in competition with C.  People disagree about the
> best way to compete (become more C-like?  less C-like?), but it has
> proved difficult to convert the world to Lisp-like ways of thinking.

There is some evidence, albeit based only on a number of cases
rather than systematic research, that it is much easier to convert
people to Lisp-like ways of thinking if you give them a Lisp-like
language (with all the benefits of incremental compilation,
integrated editor, garbage collector, pattern matcher, lists,
records, arrays, tuples, etc etc) but not a lisp-like syntax.

Instead provide a syntax that is richer, more readable, and above
all more familiar to them.

I am, of course, referring to Pop-11, which is far from perfect, but
has converted many C, Fortran and Pascal users to enthusiastic AI
language users on Suns, Vaxen, HP machines, Apollos, Macs (see
review of Alphapop in Byte May 1988).

I am not criticising Lisp for any intrinsic faults - just commenting
on how its syntax makes a significant proportion of experienced
programmers react.

Try putting in front of them any lisp procedure definition involving
a few nested loops and moderately long multi branch conditionals
with 'else' clauses. Ask them to try to understand it. Then compare
their ability to understand the Pop-11 equivalent using the sort of
syntax indicated here (in a simpler example).

    define test(item, P, Q, R, S, T);
        lvars item, procedure(P, Q, R, S, T);

        if null(item) then return(false)
        elseif P(item) then return(Q(item))
        elseif R(item) then
            if S(item) then return(item)
            elseif T(item) then return(Q(item))
            else return(item)
            endif
        else return(false)
        endif
    enddefine;
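
(For comparison, here is a rough Common Lisp rendering of the same
procedure; this is a sketch added purely for illustration, not part of
the original posting.  The Pop-11 parameter T is renamed TEE because T
names a constant in Common Lisp, and "false" becomes NIL.)

    ;; Illustrative sketch only: a Common Lisp version of the Pop-11
    ;; TEST procedure above.
    (defun test (item p q r s tee)
      (cond ((null item) nil)
            ((funcall p item) (funcall q item))
            ((funcall r item)
             (cond ((funcall s item) item)
                   ((funcall tee item) (funcall q item))
                   (t item)))
            (t nil)))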

Here's how you let the variables x and y iterate over the elements
of lists L1 and L2 in Pop-11, so that you can apply function f to
one element from each list:

    for x, y in L1, L2 do
        f(x,y)
    endfor;

This kind of syntax, with all the extra syntax words, is often more
verbose than Lisp, and parsing by machine is more complex. But the
explicit use of distinctive opening and closing brackets "endif",
"endfor" etc, and sub-syntax words like "do", "elseif" and "then"
seems to add redundancy that is better suited to HUMAN readers
(apart from those already familiar with lisp, of course), and it
also allows the compiler to help you more if you put a closing
bracket in the wrong place. You get an error message saying
something like

    MSW: MISPLACED SYNTAX WORD : FOUND endwhile READING TO endif

instead of a program that compiles and then behaves oddly.

I think the key factor is reducing the load on short term memory.
In Pop-11, like many non-AI languages, the form:

    elseif ..... then .....

makes the context clear without your having to remember or scan back
to the beginning of the conditional expression, whereas in standard
Lisp syntax

    ((.....) (.....))

could mean almost anything, depending on the context: to work out that
it is part of a conditional one has to look back to the beginning of the
expression for the keyword "cond".

More generally "(x y z)" in LISP could, depending on context, be a
procedure call, a set of formal parameters, a set of local
variables, a list of three atoms, a condition in a conditional, the
consequent of a conditional, and so on.

By contrast, POP-11 uses a large variety of keywords to distinguish
syntactic contexts, instead of relying so much on POSITION to
determine significance. Parentheses are used for procedure calls in
the normal way, but distinct brackets are used for list expressions
"[...]" and vectors "{...}" and this also helps to reduce cognitive
load during reading. (This sort of thing is particularly important
when large programs have to be maintained by programmers who were
not the original authors.)

Please note that I am not saying that Pop-11 has the perfect syntax,
only (a) I've met quite a lot of programmers who had been put off
Lisp but have found Pop-11 very attractive and (b) this is to be
expected and explained because lisp is syntactically elegant and
economical whereas Pop-11 is syntactically rich and redundant.

I conjecture that far far more programmers around the world would be
using AI tools for all kinds of software development if the AI
community had been pushing Pop-11 at least as a way of getting into
AI. Turning to Lisp after having learnt AI techniques through Pop-11
is often quite successful.

As for C - well it is popular despite having a comparatively poor
syntax (and usually dreadfully unhelpful compiler error messages),
but that is partly because it is a much simpler language than either
Lisp or Pop-11, and so there is much less to learn, and much less
for the syntax to distinguish.

Aaron Sloman,
School of Cognitive and Computing Sciences,
Univ of Sussex, Brighton, BN1 9QN, England
    INTERNET: aarons%uk.ac.sussex.cogs@nsfnet-relay.ac.uk
              aarons%uk.ac.sussex.cogs%nsfnet-relay.ac.uk@relay.cs.net
    JANET     aarons@cogs.sussex.ac.uk
    BITNET:   aarons%uk.ac.sussex.cogs@uk.ac
        or    aarons%uk.ac.sussex.cogs%ukacrl.bitnet@cunyvm.cuny.edu
    UUCP:     ...mcvax!ukc!cogs!aarons
            or aarons@cogs.uucp

aarons@syma.sussex.ac.uk (Aaron Sloman) (05/27/89)

jdu@ihlpf.ATT.COM (John Unruh, NY9R) writes:

>
> In article <469@skye.ed.ac.uk> jeff@aiai.UUCP (Jeff Dalton) writes:
    ........
> > ......There is often also a strong negative reaction
> >to Lisp systems that are many megabytes in size ....
> >
    .......
>
> Many machines have limits on the maximum process size, and have problems
> with really big processes.  This may be an artifact of how conventional
> programming languages work.  Most C programs are fairly small, and the
> environment is not integrated in the same way as a Lisp machine, so the
> whole thing tends to be less memory intensive.

Following on from my previous message on the syntactic differences
between Lisp and Pop-11, I guess I should point out that a full
Pop-11 development environment on a workstation tends to be very
much smaller than some of the well known Lisp environments.

This is partly because of the heavy use of auto-loading, which means
that Pop-11 facilities you don't use don't get linked into your
process. You can get useful work done in Poplog Pop-11 even on a 2
Mbyte machine (and that includes running a Pop-11 process with an
integrated Emacs-like, but smaller, editor, VED), whereas
Lisp development environments tend to require far more memory.
Poplog Common Lisp doesn't require nearly so much (you can get
useful work done in 2.5 to 3 Mbytes) but it is not as heavily
optimised as Lucid. (It compensates by compiling faster.)

Like other AI language vendors we are adding tools to Poplog to allow
you to link an image containing only what your final system needs.
However, the minimal size will still be considerably more than
the minimal size of a C program.

Aaron Sloman,
School of Cognitive and Computing Sciences,
Univ of Sussex, Brighton, BN1 9QN, England
    INTERNET: aarons%uk.ac.sussex.cogs@nsfnet-relay.ac.uk
              aarons%uk.ac.sussex.cogs%nsfnet-relay.ac.uk@relay.cs.net
    JANET     aarons@cogs.sussex.ac.uk
    BITNET:   aarons%uk.ac.sussex.cogs@uk.ac
        or    aarons%uk.ac.sussex.cogs%ukacrl.bitnet@cunyvm.cuny.edu

    UUCP:     ...mcvax!ukc!cogs!aarons
            or aarons@cogs.uucp

lgm@ihlpf.ATT.COM (Mayka) (05/29/89)

In article <1028@syma.sussex.ac.uk>, aarons@syma.sussex.ac.uk (Aaron Sloman) writes:
> By contrast, POP-11 uses a large variety of keywords to distinguish
> syntactic contexts, instead of relying so much on POSITION to
> determine significance. Parentheses are used for procedure calls in
> the normal way, but distinct brackets are used for list expressions
> "[...]" and vectors "{...}" and this also helps to reduce cognitive
> load during reading. (This sort of thing is particularly important
> when large programs have to be maintained by programmers who were
> not the original authors.)

Common Lisp has the ability to handle

	1) Keyword-named functions, macros, and special forms.
		These make the function, macro, or special form look
		more like "constant" syntax that the ordinary
		programmer should not redefine.  Helps it to look
		more like syntax and less like dumb data.

	(defmacro :if (test &key then else)
		`(if ,test ,then ,else)
		)

	(:if x
	   :then y
	   :else z
	   )

	2) Keyword arguments to functions, macros, and special forms.
		Makes syntax less dependent on position, more mnemonic.
		The cost for macros and special forms is only at
		compile time.  Even for functions, keyword arguments
		need not be unduly expensive.

	3) Alternate grouping characters.  Characters such as  [ ]
		and  { }  can be redefined as synonyms for  ( ) , with
		the restriction that opening and closing character
		must match.  A convention could even be agreed upon that,
		for example,  { }  be used for lists that are clearly
		dumb data, leaving  ( )  more clearly as the function-
		calling syntax.

All this can be done within Common Lisp itself.  There need be no
confusion if such constructs are used uniformly within a given
software project.
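
As a concrete sketch of point 3 (added purely for illustration, not
part of the original posting), the standard reader-macro machinery is
enough to make { } read as ordinary list brackets; the brace convention
itself is only an example:

        ;; Illustrative sketch: make { } read as plain list syntax.
        ;; SET-MACRO-CHARACTER and READ-DELIMITED-LIST are standard
        ;; Common Lisp; the brace convention is hypothetical.
        (set-macro-character #\{
          #'(lambda (stream char)
              (declare (ignore char))
              (read-delimited-list #\} stream t)))

        ;; Make } behave like ) so unbalanced braces are reported.
        (set-macro-character #\} (get-macro-character #\)))

        ;; Now '{a b c} reads as the constant list (A B C).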

Has anyone ever considered a plan like this - for the sake of
timid souls who are intimidated by Lisp's "native" syntax?


	Lawrence G. Mayka
	AT&T Bell Laboratories
	lgm@ihlpf.att.com

dlw@odi.com (Dan Weinreb) (05/29/89)

It's true that programmers familiar with C or Fortran are sometimes
scared off by Lisp's syntax, because it's so different.  But I dispute
your claims that Lisp's syntax is *actually* inferior.

Lisp programs with nested loops and complicated nested conditionals
are quite easy to read, because all serious Lisp programmers indent
their code properly and uniformly (because their editors make it easy
to do so).  Indentation is how people really understand the groupings
involved in these loops and conditionals.  Try taking your Pop-11
example, and intentionally making *one* indentation error; for example,
take the "if S(item)" line, the fourth line in the body, and indent
it right under the third.  Nobody will understand the code any more.

I agree that the Pop-11 indentation syntax you show is better than
what's currently in Common Lisp.  This situation is due to an
unfortunate holy war in the Lisp community, whose members cannot
agree on an iteration construct.  In the Symbolics superset of
Common Lisp, here's what your example would look like:

  (loop for x in l1
        for y in l2 do
    (f x y))

It has this advantage over the Pop-11 syntax you showed: each variable
is grouped together with the expression that produces the list, so
it's clear who goes with whom.  The above example is too trivial to
demonstrate the point; it's clearer when the expressions are larger
than "l1".  (By the way, the keyword "do" will probably be removed or
made optional in the version of "loop" proposed for Common Lisp; years
of experience show that it would work better that way.)

You talk about "a program that compiles and then behaves oddly"
because of mismatches of opening and closing syntax.  This never
happens in Lisp, when used with serious editors.  It may not be
apparent to you that editors would solve this issue so well, and I
don't know how to demonstrate it; I have spent over ten years doing
this kind of programming, and I'll have to ask you to take my word for
it.  Or ask some other experienced Lisp programmers who use equivalent
editors.


      ((.....) (.....))

  could mean almost anything, depending on the context: to work out that
  it is part of a conditional one has to look back to the beginning of the
  expression for the keyword "cond".

In a very long "cond" on a very small screen or window, yes, this
is true.  On the other hand, if you are looking at a big procedure
through a tiny window, it's hard to understand what's going on even
if you do know which kind of statement is which.  I agree that this
can be a problem, but in practice, "cond"s usually don't get that
big, and modern serious Lisp programmers tend to have large screens,
so it's not much of a problem.  Comments in the code also work wonders
for readability, not only making it clear that you're looking at a "cond",
but (far more important) explaining what the code really means.

    More generally "(x y z)" in LISP could, depending on context, be a
    procedure call, a set of formal parameters, a set of local
    variables, a list of three atoms, a condition in a conditional, the
    consequent of a conditional, and so on.

    By contrast, POP-11 uses a large variety of keywords to distinguish
    syntactic contexts, instead of relying so much on POSITION to

I could equally well say that "xxx yyy zzz" could mean any number of
things in Fortran or C.  In all of the languages under discussion, you
tell what's going on via keywords.  In Lisp, you know that "(cond
...)" means one thing and "(defun ...)" means another.  It's perfectly
clear, without all those different punctuation marks.

    determine significance. Parentheses are used for procedure calls in
    the normal way, but distinct brackets are used for list expressions
    "[...]" and vectors "{...}" and this also helps to reduce cognitive
    load during reading. (This sort of thing is particularly important
    when large programs have to be maintained by programmers who were
    not the original authors.)

Ah, now you're talking about distinguishing between expressions and
textual constants.  One of the few punctuation marks that Lisp *does*
use is the single-quote, which makes it clear whether something is
a constant or not.  Even in a very big constant, where the single-quote
is far away, it is *immediately* apparent whether you are looking
at code or not, since code can easily be recognized by the keywords
and indentation.  The only exception, of course, is when the
constant *really is* a piece of list structure that works as Lisp
code, but of course such constants look like code.  Again, I have
to ask you to believe me that in years of maintaining other people's
programs, I hardly ever had any trouble distinguishing constants from code.
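
(A trivial made-up example, added for illustration, of how the quote
makes the distinction visible at a glance:

    (reverse (f x y))     ; code: call F, then reverse its result
    (reverse '(f x y))    ; data: reverse a constant list of three symbols

The first form treats F as a function to call; the second never does.)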

I don't know enough about Pop-11 to comment on its syntax, but a good
way to see the problems with one kind of conventional syntax is to
look at C.  (Yes, I saw that you don't think highly of C's syntax
either.)  There's a great book called "C Traps and Pitfalls" (or
something very close to that) that explains a lot of ways that you can
get fouled up by C's syntax.  My favorite: how do you write a C macro
that expands into a statement?  It's very hard to get this right.  The
tricky issue is whether the macro should include the semicolon or not,
and how this interacts with the "if" and compound statement syntax in C.
The book shows a way that works, but it's really an obscure kludge.
This is the kind of problem Lisp would never have, because its lexical
syntax is so simple and uniform.

It's true that many people who are familiar with other languages
are scared off by Lisp syntax, but it's too bad, because their
fears are not justified by the facts.

Daniel Weinreb     Object Design, Inc.     dlw@odi.com

jeff@aiai.ed.ac.uk (Jeff Dalton) (05/30/89)

In article <1029@syma.sussex.ac.uk> aarons@syma.sussex.ac.uk (Aaron Sloman) writes:

>a full Pop-11 development environment on a workstation tends to be very
>much smaller than some of the well known Lisp environments.

Since the full PopLog includes a Common Lisp, this seems to show
that Lisp environments could be smaller, not that Pop-11 requires
less than Lisp.

jeff@aiai.ed.ac.uk (Jeff Dalton) (05/30/89)

In article <1028@syma.sussex.ac.uk> aarons@syma.sussex.ac.uk (Aaron Sloman) writes:
>There is some evidence, albeit based only on a number of cases
>rather than systematic research, that it is much easier to convert
>people to Lisp-like ways of thinking if you give them a Lisp-like
>language (with all the benefits of incremental compilation,
>integrated editor, garbage collector, pattern matcher, lists,
>records, arrays, tuples, etc etc) but not a lisp-like syntax.
>
>Instead provide a syntax that is richer, more readable, and above
>all more familiar to them.
>
>I am of course, referring to Pop-11

Over the years, quite a few preprocessors have been written.
These preprocessors convert some more conventional notation
to Lisp.  None of them has caught on in the Lisp community.
So I guess that shows there are some people who actually
prefer the Lisp syntax.

Of course, Pop-11 isn't a preprocessor.  But it's still true,
I think, that syntax is a matter of taste.

>Here's how you let the variables x, and y, iterate over the elements
>of lists L1 and L2 in Pop-11, so that you can apply function f to
>one element from each list:
>
>    for x, y in L1, L2 do
>        f(x,y)
>    endfor;

Lisp macros are an alternative to preprocessors that can work well
in some cases.  For example, the loop above could be written

     (loop for x in L1 and y in L2
           do (f x y))
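
A further illustration (a hypothetical sketch, not from the original
post): a few lines of defmacro are enough to give the two-list loop a
Pop-11-flavoured spelling.  FOR-IN is a made-up name.

     ;; Hypothetical sketch: a user-defined macro, not a standard form.
     (defmacro for-in ((var1 list1) (var2 list2) &body body)
       `(mapc #'(lambda (,var1 ,var2) ,@body) ,list1 ,list2))

     ;; Usage: (for-in ((x L1) (y L2)) (f x y))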

-- Jeff

nagle@well.UUCP (John Nagle) (05/30/89)

In article <1028@syma.sussex.ac.uk> aarons@syma.sussex.ac.uk (Aaron Sloman) writes:
>There is some evidence, albeit based only on a number of cases
>rather than systematic research, that it is much easier to convert
>people to Lisp-like ways of thinking if you give them a Lisp-like
>language (with all the benefits of incremental compilation,
>integrated editor, garbage collector, pattern matcher, lists,
>records, arrays, tuples, etc etc) but not a lisp-like syntax.

      I've found LISP as a language quite useful over the years, but
the "integrated environments" are something of a pain.  It's not at
all clear that the development environment and the object being developed
should be as intertwined as they are on, say, a Symbolics or the defunct
LMI machines.  What seems to happen in these systems is that the developed
object is a world, that is, a copy of the entire environment, rather than
a source text.  The state of the system becomes part of the thing being
developed.  This makes maintenance difficult.  In particular, merging the
work of several programmers is rather painful.  

      The Symbolics approach seems to be oriented toward the classic
hacker-type approach to programming.  The AI community sometimes calls
this "exploratory programming", although that term is somewhat in 
disrepute today.  It does not seem to be oriented toward the development
of programs that will be used by others.

					John Nagle

jeff@aiai.ed.ac.uk (Jeff Dalton) (05/31/89)

In article <11898@well.UUCP> nagle@well.UUCP (John Nagle) writes:
>       I've found LISP as a language quite useful over the years, but
> the "integrated environments" are something of a pain.  It's not at
> all clear that the development environment and the object being developed
> should be as intertwined as they are on, say, a Symbolics ...

Interestingly enough, some people have exactly the opposite view.
They think that programmers don't really like Lisp as a language --
what they really like is the environment.  I have even heard this
said to students by the person teaching them Lisp.

There are also some people who would like to give Prolog an
"environment like Lisp".  This is a less extreme view, but it
also presupposes that Lisp and its environment can easily be
separated.

> What seems to happen in these systems is that the developed
> object is a world, that is, a copy of the entire environment, rather than
> a source text.  The state of the system becomes part of the thing being
> developed.

The Symbolics and LMI environments do let you write a set of files.
And, indeed, that's where function definitions are kept.  The Xerox
approach is more intertwined, because you don't operate directly on
the files.  You build a world, and then the system tries to write
all of the files for you.

However, whenever you do interactive development it seems to be
possible to have some things work only because files happened to be
loaded in a certain order, and then some computations were done, then
some functions redefined, etc. so that just reloading the files won't
give you a working system.

But this hardly shows that noninteractive development is better.
And it doesn't mean that your intent can't be to develop a source
text.

> This makes maintenance difficult.  In particular, merging the
> work of several programmers is rather painful.  

I would say that the Lisp environment can be considered something
like dbxtool for C.  Why is it a bad thing if I can change one function
and continue rather than having to recompile, relink, and start over?
That is, I don't think I have to use a Lisp machine to build a world.
It can be used just to write a program.

And, since I don't see how dbxtool makes it harder for programmers
to work together, I don't see why the LispM environment has to do
that either.

>       The Symbolics approach seems to be oriented toward the classic
> hacker-type approach to programming.  The AI community sometimes calls
> this "exploratory programming", although that term is somewhat in 
> disrepute today.  It does not seem to be oriented toward the development
> of programs that will be used by others.

The idea was always in dispute (and so, I guess, in disrepute to
those who didn't like it).  There have long been people (Dijkstra
comes to mind) who think it best to think, write things out on
paper, and even prove them correct, before typing them in.  But
is this always the best way to proceed?

Moreover, if I'm going to write something down, I'd just as soon type
it in.  And if it's typed in, I'd like to test it out.  But, in any
case, just because something is good for exploratory programming
doesn't mean it's no good for anything else.

-- Jeff

nagle@well.UUCP (John Nagle) (06/01/89)

     Syntax is a big issue for beginning programmers, but assumes much
less importance once you know a few different languages.  Semantics and
paradigm are much bigger issues when writing sizable programs.  As
programs become larger, syntactical issues retreat in importance and
issues such as namespace control and other modularity issues dominate.

     Most of the attempts to make LISP look like Pascal or one of its
descendants result in a syntax that is more, rather than less, painful.
On the other hand, the fact that data and programs have the same 
representation in LISP really doesn't seem to be used all that much
any more.  It was felt to be terribly important at one time, but today,
it just doesn't seem to be a big issue.

					John Nagle

mthome@bbn.com (Mike Thome) (06/01/89)

In article <11917@well.UUCP> nagle@well.UUCP (John Nagle) writes:
>     Most of the attempts to make LISP look like Pascal or one of its
>descendants result in a syntax that is more, rather than less, painful.
For example, take a look at Logo...  When I was learning lisp, its
syntactic consistency was at first probably its most attractive
attribute.

>On the other hand, the fact that data and programs have the same 
>representation in LISP really doesn't seem to be used all that much
>any more.  It was felt to be terribly important at one time, but today,
>it just doesn't seem to be a big issue.
>					John Nagle
I must disagree here - any macro (including most setf expanders) will
depend on the similarity between data and program. Personally, I've been
working on a number of different projects which build (sometimes compile)
routines at runtime.  While this might be unusual, lisp is, after
all, one of the very few (only mainstream?) languages which can do this -
I have a hard time imagining what it would be like to try to implement
one of these systems in a "conventional" language.
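
A minimal made-up sketch of the sort of thing meant by building and
compiling a routine at runtime (illustrative only; MAKE-ADDER is a
hypothetical name):

    ;; Build a lambda expression as ordinary list data, then hand it
    ;; to the compiler at run time.
    (defun make-adder (n)
      (compile nil `(lambda (x) (+ x ,n))))

    ;; (funcall (make-adder 5) 10)  => 15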

Mike Thome (mthome@bbn.com)

shaff@Sesame.Stanford.EDU (Mike Shaff) (06/01/89)

ciao,

While the number of people that do *NOT* use Lisp's ability to operate on
programs as data may have increased, I think this has more to do with an
increase in the number of people that use Lisp than a lack of interest in the
capability.  For people who need to write code that analyzes, simulates, or
otherwise operates on a program, the "code as data" quality of Lisp is
critical to their effort.  As a side comment, I think that in the future we
will see *MORE* of this type of usage as people need & create more embedded
language systems (e.g., AutoLisp).

(peace chance)

	mas
-- 

raymond@ptolemy.arc.nasa.gov (Eric A. Raymond) (06/02/89)

In article <11917@well.UUCP> nagle@well.UUCP (John Nagle) writes:
>
>On the other hand, the fact that data and programs have the same 
>representation in LISP really doesn't seem to be used all that much
>any more.  It was felt to be terribly important at one time, but today,
>it just doesn't seem to be a big issue.

Or perhaps it's just taken for granted nowadays because it's been
encapsulated (sort of the way GOTO's have disappeared).  

You use macros, don't you? 

-- 
Eric A. Raymond  (raymond@ptolemy.arc.nasa.gov)
G7 C7 G7 G#7 G7 G+13 C7 GM7 Am7 Bm7 Bd7 Am7 C7 Do13 G7 C7 G7 D+13: Elmore James