[comp.lang.lisp] Is this the end of the lisp wave?

gjc@paradigm.com (01/09/91)

What is happening in the lisp world?
The two biggest lisp vendors seem to be taking a dive.

(1) a major hardware vendor is shutting down its in-house lisp group.

(2) Symbolics stock closed Tuesday at 13/16, which gives a total market
    value (2.5*10^6 shares) of $2,030,000
    Note: the assets of LMI were sold out of bankruptcy in June of 1987
    for approximately $2,300,000 (inflation adjusted to 1991 dollars).

Of course Xerox got out of the lispmachine gig a while ago.
What is happening at Texas Instruments?

-gjc

welch@sacral.cis.ohio-state.edu (Arun Welch) (01/10/91)

>Of course Xerox got out of the lispmachine gig a while ago.

Xerox might be out of the machine gig, but their software platform
(Medley) is still on the market, sold/supported by Venue, and in fact
a lot faster on stock hardware. And some of us are glad for that...:-).

But, to go back to the original question, I think the waters are
muddied a tad by the fact that Lisp is still pretty strong outside the
US, and the external markets haven't crested the wave yet, it seems.
I've also seen a couple job offers for Lispers on misc.jobs.offered,
mostly looking for CLOS hackers these days, and the frequency seems to
be the same as it was a couple of years ago. 

...arun
Info-1100-request@cis.ohio-state.edu :-)
----------------------------------------------------------------------------
Arun Welch
Lisp Systems Programmer, Lab for AI Research, Ohio State University
welch@cis.ohio-state.edu

jjacobs@well.sf.ca.us (Jeffrey Jacobs) (01/12/91)

> What is happening in the LISP world?

Lack of demand due to Common LISP's enormous size, complexity, resource
requirements, training, etc.

I attended a presentation by Intellicorp on their new PRO-KAPPA product
yesterday.  "Conventional DP and MIS world wasn't willing to buy into
LISP".

For a more detailed "prophecy", see my Brainwaves column in AI Expert,
March 1988.

Common LISP effectively died from obesity.

Jeffrey M. Jacobs
ConsArt Systems Inc, Technology & Management Consulting
P.O. Box 3016, Manhattan Beach, CA 90266
voice: (213)376-3802, E-Mail: 76702.456@COMPUSERVE.COM

srt@aerospace.aero.org (Scott "TCB" Turner) (01/12/91)

>Common LISP effectively died from obesity.

*And* still managed to leave out "neq".

						-- Scott Turner

eliot@phoenix.Princeton.EDU (Eliot Handelman) (01/12/91)

;>Common LISP effectively died from obesity.
;
;*And* still managed to leave out "neq".

It died because it was linked to AI and that died. The whole beauty of
list processing was that at one time it was seriously believed that 
thought was essentially list processing, and if you toss in recursion
you can accommodate self-consciousness too. A lisp machine was a machine
whose resident language was "the logic of thought." No one believes this
anymore, so lisp semantics have been relativized, and now it's in competition
with much faster languages which have no such pretensions, but which
are computationally equivalent. Still, Lisp is a nice language for
big, open-ended explorations into cognitive architectures written by
people who aren't macho about how fast they can feed numbers into
arrays; and interestingly, lisp has acquired a strong foothold in the
technoarts community, of which I'm a member. Lisp still has a distinctive
aesthetic feel to it that makes it seem to come closer to a screwball
"language of thought" than C for instance, even if you're only thinking 
about this as history, as a way of life that was more naive and more 
optimistic than we can be about what a brain machine was going to look 
like.

aarons@syma.sussex.ac.uk (Aaron Sloman) (01/13/91)

eliot@phoenix.Princeton.EDU (Eliot Handelman) writes:

> ;>Common LISP effectively died from obesity.
> ;
> ;*And* still managed to leave out "neq".
>
> It died because it was linked to AI and that died. The whole beauty of
> list processing was that at one time it was seriously believed that
                                             ^^^^^^^^^^^^^^^^^^^^
> thought was essentially list processing, and if you toss in recursion
> you can accommodate self-consciousness too. A lisp machine was a machine
> whose resident language was "the logic of thought." No one believes this
> anymore,....

Actually, having been involved with AI since about 1969 I don't
think I've ever met any serious thinker who believed this.

I keep finding people who say AI is dead because the AI people
believed X and X has been proved wrong, when in fact X is so
obviously false that nobody ever would have believed it, except
perhaps a few badly taught students, and people on the fringes
trying to understand a difficult discipline and latching onto
simplified slogans because they couldn't see what was really going
on. (I once read a draft report by a social scientist commenting on
the UK "Alvey" advanced IT program, in which it was claimed that AI
was more or less defined by the use of Lisp. Fortunately, the final
report benefited from some informed criticism.)

All the people I talked to in the AI field in the early days were
very clear that there was a difference between what they were trying
to implement and how they were implementing it, although it was
agreed that sometimes making the distinction was not easy (hence the
occasional confused person who called a program a theory).

Choice of language and data-types was clearly an implementation
detail. For example, things that used to be done with property lists
in the early days are now often done using hash-tables because the
latter are much more efficient than long property lists. Also, some
of the things that lisp-ers do with lists are done in Prolog using
terms.

What many people in AI have believed is that intelligent systems
don't just manipulate vectors or lists but hierarchically structured
representations, i.e. that understanding, perceiving, thinking,
planning, and the like require the ability to cope with things
composed of parts with relationships between them, where the parts
are also composed of parts with relationships, etc. (This belief was
also shared by many linguists).

I don't think anything that has turned up in recent years has
shown that this belief is false.

Of course, the management of hierarchical complexity isn't all there
is to intelligent systems.

Aaron Sloman,
School of Cognitive and Computing Sciences,
Univ of Sussex, Brighton, BN1 9QH, England
    EMAIL   aarons@cogs.sussex.ac.uk
or:
            aarons%uk.ac.sussex.cogs@nsfnet-relay.ac.uk
            aarons%uk.ac.sussex.cogs%nsfnet-relay.ac.uk@relay.cs.net

jjacobs@well.sf.ca.us (Jeffrey Jacobs) (01/13/91)

> Symbolics and who?

Texas Instruments is who!

Also, it would appear that Lucid is getting into C++ based on a recent
ad in misc.jobs.offered!

Jeffrey M. Jacobs
ConsArt Systems Inc, Technology & Management Consulting
P.O. Box 3016, Manhattan Beach, CA 90266
voice: (213)376-3802, E-Mail: 76702.456@COMPUSERVE.COM

valdes+@cs.cmu.edu (Raul Valdes-Perez) (01/14/91)

In article <4178@syma.sussex.ac.uk> aarons@syma.sussex.ac.uk (Aaron Sloman) writes:
>All the people I talked to in the AI field in the early days were
>very clear that there was a difference between what they were trying
>to implement and how they were implementing it, although it was
>agreed that sometimes making the distinction was not easy (hence the
>occasional confused person who called a program a theory).

Could Prof. Sloman make clear why computer programs do not merit the status
of theory?  Would he accept a system of differential or difference equations
as a theory?
--
Raul E. Valdes-Perez			valdes@cs.cmu.edu
School of Computer Science		(412) 268-7698
Carnegie Mellon University
Pittsburgh, PA 15213

eliot@phoenix.Princeton.EDU (Eliot Handelman) (01/14/91)

In article <4178@syma.sussex.ac.uk> aarons@syma.sussex.ac.uk (Aaron Sloman) writes:
;eliot@phoenix.Princeton.EDU (Eliot Handelman) writes:

;> It died because it was linked to AI and that died. The whole beauty of
;> list processing was that at one time it was seriously believed that
;                                             ^^^^^^^^^^^^^^^^^^^^
;> thought was essentially list processing, and if you toss in recursion
;> you can accommodate self-consciousness too. A lisp machine was a machine
;> whose resident language was "the logic of thought." No one believes this
;> anymore,....
;
;Actually, having been involved with AI since about 1969 I don't
;think I've ever met any serious thinker who believed this.

I can back that up. 

;I keep finding people who say AI is dead because the AI people
;believed X and X has been proved wrong, when in fact X is so
;obviously false that nobody ever would have believed it, except
;perhaps a few badly taught students, and people on the fringes
;trying to understand a difficult discipline and latching onto
;simplified slogans because they couldn't see what was really going
;on. 

This isn't a paraphrase of my point, I hope. I claimed that Lisp
is now COMPUTATIONALLY relative, that it does not need to be
identified as THE language of intelligence, artificial or 
otherwise. I did not suggest that AI is dead "because no one believes
that the mind is a list processor any more." The proposition that
list manipulation is intrinsic to thought is not, anyway, "so
obviously false that nobody would ever have believed it." 
It's not that obviously bad a hypothesis to propose that you represent 
lists AS lists, rather than in some other more machine-like idiom, say 
arrays. And this is by no means a banal assertion.


;Choice of language and data-types was clearly an implementation
;detail. 

Today, yes. The question is whether it was in 1969.

rshapiro@arris.com (Richard Shapiro) (01/14/91)

In article <5256@idunno.Princeton.EDU> eliot@phoenix.Princeton.EDU (Eliot Handelman) writes:
>It died because it was linked to AI and that died. The whole beauty of
>list processing was that at one time it was seriously believed that 
>thought was essentially list processing, and if you toss in recursion
>you can accommodate self-consciousness too. 


I've been a Lisp/AI programmer for the last 10 years or so, and this
is the first time I've ever heard this *very* peculiar argument. Many
AI programmers use Lisp for much simpler reasons:


1) It's the best general purpose programming language for symbolic
computing, and symbolic computing has turned out to be very useful in
AI applications. There are special purpose languages that are better
at symbol manipulation, but of the widely available general languages,
no other one really comes close.

2) The equivalence between program structures and data structures
makes dynamic computation quite straightforward. No other high-level
language offers this.

3) As a side-effect of (1) and (2), it's very easy to write "higher
level" languages based on Lisp (eg rule languages, knowledge rep.
languages, etc).

4) In particular, the higher-level language construct known as
object-oriented programming folds very well into Lisp -- it's a much
better fit than, say, object-oriented C. I assume we all know the
advantages of oop by now...

5) For various reasons, Lisp systems tend to have quite sophisticated
programming environments, and these are essential in the generation of
complex programs (eg AI programs). Symbolics LISPMs are a particularly
good example of this. The increase in productivity is impossible to
overstate.
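
[Point (2) above is easy to see in a few lines: because a Lisp program
is itself a list, a program can build another program as ordinary data
and then run it.  A minimal sketch in Common Lisp -- MAKE-RULE and its
tiny "rule language" are invented here for illustration, not taken
from any poster's code:

(;; Build a function from list fragments at run time.
 defun make-rule (condition action)
  ;; The lambda expression is assembled as plain list data...
  (let ((form `(lambda (fact) (if ,condition ,action nil))))
    ;; ...and then coerced back into a callable function.
    (coerce form 'function)))

;; A rule written as data, executed as code.  The condition refers to
;; the parameter FACT by convention.
(funcall (make-rule '(> fact 10) ''big) 42)   ; => BIG

The same code-is-data property is what makes points (3) and (4) cheap:
a macro is just a Lisp function from lists to lists, so rule languages
and object systems can be grown inside the language rather than bolted
on. -ed.]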


>A lisp machine was a machine
>whose resident language was "the logic of thought." No one believes this
>anymore, 

I don't think any Lisp programmer ever believed this. Perhaps some
theorist wrote something like this in a journal somewhere. As I say,
I've been working in the field for quite awhile, and I haven't ever
heard any Lisp programmer make this claim, or anything even remotely
like it.

>so lisp semantics have been relativized, and now it's in competition
>with much faster languages which have no such pretensions, but which
>are computationally equivalent. 

"Computational equivalence" is a useless term. FORTRAN is
computationally equivalent. The question is: how easy is it to write
and maintain sophisticated programs? Lisp is still the clear winner
(among readily available languages, at least) in this regard.

As for speed, there are of course certain applications which require
greater speed than any Lisp can offer. I would claim that these
constitute a very small percentage of the kinds of programs Lisp is
generally used to write.

The reason Lisp has not gained even wider acceptance has more to do
with institutional and managerial inertia. The people who are in a position
to make the decisions are simply afraid to make the switch from
whatever it is they're used to.

djb@babypuss.mitre.org (David J. Braunegg) (01/14/91)

>Lack of demand due to Common LISP's enormous size, complexity, resource
>requirements, training, etc.
>
>Common LISP effectively died from obesity.



OK.  What are the problems preventing a smaller, more efficient Lisp
so that we aren't forced to use the almost-a-programming-language C?

Dave

meltsner@crd.ge.com (Kenneth J Meltsner) (01/14/91)

LISP is not dead, despite efforts to kill it by overfeeding it.  The
problem may be that it will never have the wide appeal of an
efficient, system implementation-oriented language like C, or the
history of a language like FORTRAN.

In fact, there may be more LISP users today than ever before.  Look at
systems like GNUEMACS, with its LISP-dialect extension language.  Look
at WINTERP and Scheme, etc.  I recently saw a complete statistics
package written in XLISP for the Mac.  Common LISP may be dropping in
popularity due to its size, but other LISP dialects are doing quite
well.


What does restrict the use of LISP?

(1) Nasty delivery pricing.  I remember the bad old days when you had
to pay Microsoft a chunk of money to distribute programs written using
their compilers.  That changed quickly when microcomputer compiler
vendors managed to make a living without royalty fees.

But the situation is worse than simply the extra cash.  There are
bookkeeping and copy-tracking costs to add in as well.  Even if the
price is not bad, the paperwork is one additional burden.

How can LISP vendors make a living?  I don't know.  DEC and Apple
allow you to generate applications without a royalty fee, but they may
be doing this to help sell hardware.  I'd actually suggest *raising*
compiler prices, but eliminating royalty fees for non-commercial use
of the runtime systems.  Or giving away application runtimes, but
charging extra for the tree-shakers and other runtime-development
tools.  LISP vendors attempt to make a living selling razor blades
instead of razors, but this doesn't work if it forces you to spend
all of your time keeping track of where the blades went.

(2) Creeping feature-itis.  LISP gets bigger and bigger, and every new
extension must subsume all previous extension efforts.  I don't want
to start the religious disputes again, but there are always four ways to
do anything in LISP, and perhaps we only need three.

(3) Bad press.  Management types aren't always the brightest folks,
but they have wonderful memories.  And they all remember LISP
machines, LISP-based expert system shells, LISP gurus, etc.  And the
word LISP got associated with "Not suitable for real world"
applications and hardware.  Even if the latest generation of
workstations can run LISP pretty well (and even my Mac runs a decent
version of LISP), they can only remember prima-donna hardware and
software maintenance types, and the incredible prices they charged.
The management types who switched and made a real investment in LISP
got burned badly, and no one appears to be adventurous enough to make
the same mistakes again.

(4) High-priced programmers.  When the LISP craze hit, the market
overheated and anyone with a CS 101 background in LISP got a $5000
raise.  Most of these folks tried to get super-programmer salaries
(based on their mastery of LISP's higher productivity environments),
but barely deserved COBOL wages.


LISP isn't dead.  It's just recuperating from all the hype of the
early '80s. 



--
===============================================================================
Ken Meltsner                        | meltsner@crd.ge.com (518) 387-6391
GE Research and Development Center  | Fax:  (518) 387-7495
P.O. Box 8, Room K1/MB207	    | Nothing I say should be attributed
Schenectady, NY 12301               | to my employer, and probably vice-versa
=================Dep't of Materials Science, ACME Looniversity=================

hall@aplcen.apl.jhu.edu (Marty Hall) (01/15/91)

In article <15657@crdgw1.crd.ge.com> meltsner@crd.ge.com writes:
[...]
>LISP isn't dead.  It's just recuperating from all the hype of the
>early '80s. 

My sentiments exactly. My opinion (wishful thinking?) is that it is
already on the mend.
				- Marty

dean@cs.mcgill.ca (Dean NEWTON) (01/15/91)

In article <127724@linus.mitre.org> djb@babypuss.mitre.org (David J. Braunegg) writes:
>>Lack of demand due to Common LISP's enormous size, complexity, resource
>>requirements, training, etc.
>>
>>Common LISP effectively died from obesity.
>
>
>
>OK.  What are the problems preventing a smaller, more efficient Lisp
>so that we aren't forced to use the almost-a-programming-language C?
>
>Dave

Nothing.  It's called Scheme.

Kaveh Kardan
Taarna Systems
Montreal, Quebec, Canada
(posting from a friend's account)

charest@ai-cyclops.jpl.nasa.gov (01/15/91)

In article <5256@idunno.Princeton.EDU> eliot@phoenix.Princeton.EDU (Eliot
Handelman) writes:
>;>Common LISP effectively died from obesity.
>;
>;*And* still managed to leave out "neq".
>
>It died because it was linked to AI and that died. 
                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^

My sponsors will be very surprised when they find out.

*
Len Charest
Jet Propulsion Lab/Artificial Intelligence Lab
*

nagle@well.sf.ca.us (John Nagle) (01/15/91)

     LISP, in its grander forms, was intended to support an environment
in which everything was fluid, in which programs could modify themselves,
examine their own inner workings in a reasonable representation,
and in which a program could create new sections of program in an 
integrated way.  The dream was of programs that used these facilities
to improve themselves.  Lenat's AM and Eurisko actually did so; few
other programs ever did.

     Unfortunately, that didn't seem to lead much of anywhere.

					John Nagle

ahlenius@motcid.UUCP (Mark Ahlenius) (01/15/91)

djb@babypuss.mitre.org (David J. Braunegg) writes:

>>Lack of demand due to Common LISP's enormous size, complexity, resource
>>requirements, training, etc.
>>
>>Common LISP effectively died from obesity.

>OK.  What are the problems preventing a smaller, more efficient Lisp
>so that we aren't forced to use the almost-a-programming-language C?

There is a smaller, faster dialect of Lisp out there, and as far as I
know it is being taught in some of the major universities
 - it's Scheme.

 Quite a while back I read that Scheme was taught as the "first programming"
 language at MIT (via Structure and Interpretation of Computer Programs).
 Is this still the case?

 There appears to be some renewed interest in Scheme lately.

 Although it lacks many features that CL has, it is small, compact,
 and fairly quick.

	'mark
-- 
===============	regards   'mark  =============================================
Mark Ahlenius 		  voice:(708)-632-5346  email: uunet!motcid!ahleniusm
Motorola Inc.		  fax:  (708)-632-2413
Arlington, Hts. IL, USA	 60004

jeff@aiai.ed.ac.uk (Jeff Dalton) (01/16/91)

In article <WELCH.91Jan10095911@sacral.cis.ohio-state.edu> welch@sacral.cis.ohio-state.edu (Arun Welch) writes:

>But, to go back to the original question, I think the waters are
>muddied a tad by the fact that Lisp is still pretty strong outside the
>US, 

It is?  Where?

>I've also seen a couple job offers for Lispers on misc.jobs.offered,
>mostly looking for CLOS hackers these days, and the frequency seems to
>be the same as it was a couple of years ago. 

That's the "oop" wave (which _hasn't_ crested yet), not the Lisp wave.

jeff@aiai.ed.ac.uk (Jeff Dalton) (01/16/91)

In article <1991Jan14.141651.12321@arris.com> rshapiro@arris.com (Richard Shapiro) writes:
>AI programmers use Lisp for much simpler reasons:

I don't think Lisp will do all that well if only AI programmers use
Lisp.  For one thing, it's not clear that AI programmers will continue
to use Lisp.  There's already a tendency to move to C, especially for
applications that are fairly well understood.  (If you know how to
write it before you start, you can write it in C.)  There's also a
tendency to use tools, such as expert system shells, and for those
to be implemented in C rather than Lisp.  And, of course, if AI
doesn't do all that well itself, anything tied to it will suffer
too.

Fortunately, it isn't just AI programmers who use Lisp.  For example,
Lisp is doing fairly well as a language for teaching and as an embedded
extension language for editors, CAD systems, etc.  However, it is
unlikely that this is enough to sustain the Lisp industry at its
current level.

>5) For various reasons, Lisp systems tend to have quite sophisticated
>programming environments, and these are essential in the generation of
>complex programs (eg AI programs). Symbolics LISPMs are a particularly
>good example of this. The increase in productivity is impossible to
>overstate.

But other languages are catching up, and Lisp environments on
"conventional" machines tend not to match the Symbolics.

-- jeff

nagle@well.sf.ca.us (John Nagle) (01/16/91)

     Symbolics closed at 7/16 yesterday, down 3/16.  The year's high was 10,
so the stock has lost 96% of its value in the last year.  The end must be
near.

					John Nagle

meltsner@crd.ge.com (Kenneth J Meltsner) (01/16/91)

In article <22650@well.sf.ca.us>, nagle@well.sf.ca.us (John Nagle) writes:
|>     Symbolics closed at 7/16 yesterday, down 3/16.  The year's high was 10.,
|>so the stock has lost 96% of its value in the last year.  The end must be
|>near.


Why does everyone assume the problems of the LISP hardware
manufacturers means that LISP is defunct?  I've never used a LISP
machine, but I've had great time using LISP on my Mac, VAX, and
DECStation.  Am I missing some basic rule that says LISP on UNIX
workstations isn't really LISP at all?  Given what I've seen of
Symbolics' equipment, support, and management style, I'm not surprised
they're doing badly, but is everyone else in the same boat?  Are the
software vendors doing as badly?  How stable are Lucid, Franz, etc.?

In a similar vein, I unearthed the 2/87 issue of AI EXPERT describing
LISP and its future.  Too much to type in, but I'll try to find some
choice bits to quote.

===============================================================================
Ken Meltsner                        | meltsner@crd.ge.com (518) 387-6391
GE Research and Development Center  | Fax:  (518) 387-7495
P.O. Box 8, Room K1/MB207	    | Nothing I say should be attributed
Schenectady, NY 12301               | to my employer, and probably vice-versa
=================Dep't of Materials Science, ACME Looniversity=================

miller@cs.rochester.edu (Brad Miller) (01/16/91)

In article <3946@skye.ed.ac.uk> jeff@aiai.UUCP (Jeff Dalton) writes:
|I don't think Lisp will do all that well if only AI programmers use
|Lisp.  For one thing, it's not clear that AI programmers will continue
|to use Lisp.  There's already a tendency to move to C, especially for
|applications that are fairly well understood.  (If you know how to
|write it before you start, you can write it in C.)  

If you know how to write it, it isn't AI.

-- 
----
Brad Miller, U. Rochester Comp Sci Dept.
miller@cs.rochester.edu {...allegra!rochester!miller}

dcorbett@phobos.socs.uts.edu.au (Dan Corbett) (01/16/91)

djb@babypuss.mitre.org (David J. Braunegg) writes:

>>Lack of demand due to Common LISP's enormous size, complexity, resource
>>requirements, training, etc.
>>
>>Common LISP effectively died from obesity.


>OK.  What are the problems preventing a smaller, more efficient Lisp
>so that we aren't forced to use the almost-a-programming-language C?

We had those Lisps and we threw them away.  People got obsessed with big and
powerful languages.  Here's how you too can help bring back useful Lisps.

1) Read McCarthy's paper, in which he describes the whole purpose of
	inventing Lisp. ("Recursive Functions of Symbolic Expressions,"
	CACM 3,4, April 1960)  

2) Compare McCarthy's description to Common Lisp, and see where the authors
	of CL have completely deviated from the original intent of Lisp.

3) Look at the older implementations of Lisp and see the beauty of a simple,
	well-defined language.  You don't have to go back to 1.5, take a
	look at Franz or UCI Lisp.

-----------------------------------------------------------------------------
Dan Corbett
Department of Computer Science
University of Technology, Sydney
Australia
dcorbett@ultima.socs.uts.edu.au
-----------------------------------------------------------------------------

srt@aerospace.aero.org (Scott "TCB" Turner) (01/16/91)

It's been my perception that LISP gets used to build vaguely-defined,
evolving systems.  Much AI fits in that category.  LISP and the
programming environment that comes with it is amenable to a tinkering,
incremental approach to problem solving.

On the other hand, when the problem and its solution are well-defined,
a language like C is a more likely choice.  The code is written, the 
executable delivered, and then set aside until a round of bug fixes.

Obviously I'm over-simplifying, but I think the difference is valid.

						-- Scott Turner

eliot@phoenix.Princeton.EDU (Eliot Handelman) (01/16/91)

In article <18944@ultima.socs.uts.edu.au> dcorbett@phobos.socs.uts.edu.au (Dan Corbett) writes:

;3) Look at the older implementations of Lisp and see the beauty of a simple,
;	well-defined language.  You don't have to go back to 1.5, take a
;	look at Franz or UCI Lisp.


Yes, I miss this style of programming:

(defun grev (lis)
  (cond ((null (eval (caaadddadr lis))) (eval (caddadddaaadr lis)))
	((gremfrnk1 (caddadadr lis) 'glork 'fzt nil nil nil 'ftzwk)
	  (append (eval (cons (concat 
			  (cons 'flm (cadr (explode (cadddadar lis)))) )
				 (eval (cons 'list (caddadr lis)))))
		  (list (cadadadddar lis) 'brkvt)))
        (t (grev1 (list 'grk 'frmp nil nil nil nil nil nil nil)))))

Simple, yet elegant.

nagle@well.sf.ca.us (John Nagle) (01/16/91)

     A previous posting indicated that the total market value of Symbolics
was down around $2 million.  This is incorrect; there are about 25 million
shares outstanding, so even at their current penny-stock price of $0.44
per share, it would take about $11 million to buy out the entire company.

     Their numbers for the last few years show a steady decline, losses,
declining book value, declining income, and incipient oblivion.

					John Nagle

mbr@flash.bellcore.com (Mark Rosenstein) (01/16/91)

In article <5420@idunno.Princeton.EDU> eliot@phoenix.Princeton.EDU (Eliot Handelman) writes:

   Yes, I miss this style of programming:

   (defun grev (lis)
     (cond ((null (eval (caaadddadr lis))) (eval (caddadddaaadr lis)))
	   ((gremfrnk1 (caddadadr lis) 'glork 'fzt nil nil nil 'ftzwk)
	     (append (eval (cons (concat 
			     (cons 'flm (cadr (explode (cadddadar lis)))) )
				    (eval (cons 'list (caddadr lis)))))
		     (list (cadadadddar lis) 'brkvt)))
	   (t (grev1 (list 'grk 'frmp nil nil nil nil nil nil nil)))))

   Simple, yet elegant.

Nah.
"Our goal is that students who complete this subject should have a
good feel for the elements of style and the aesthetics of programming.
They should have command of the major techniques for controlling
complexity in a large system. They should be capable of reading a
50-page-long program, if it is written in an exemplary style. They
should know what not to read, and what they need not understand at any
moment. They should feel secure about modifying a program, retaining
the spirit and style of the original author."
      from: "Structure and Interpretation of Computer Programs"
             Abelson, Sussman

You can write bad code in any language. Maybe even Snobol? I believe
with the macro facility of lisp, and keyword args, and the string stuff,
it is possible to generate an understandable and, more importantly,
maintainable version of the above. I would argue that to get a fundamental
feel for the difference between C and Lisp, you should look and think hard
about the differences between CLX and the C version of Xlib. The use of
keyword args, the use of objects, a major part of the decomposition
of the problem is different, yet with the same functionality. Look
and decide which fits your style. Also consider which goes into a break
loop on a wrong argument, and which goes into "segmentation fault (core
dumped)" (oops, sorry, an environment issue accidentally cropped up :^))

More fundamentally, I don't understand the: "If I understand it, I'd
build it in C" and the obesity argument. When I program in C, I'd be
using C++ and X, and maybe some other stuff. Look at how big Xclock
is. Real programmers don't need window systems? (or hash tables? or
string manipulation? or object systems? or io?)

If you want the functionality you have to pay a price. Not that
I don't want my lisp vendor to try and slim things down, but to get
to a full functionality C program, it isn't going to be small.

Maybe it's if I understand it, and it doesn't do much, I build it 
in C? This isn't fair, and I know it. But I think the overhead
of Lisp is smaller when you have bigger things.

I dunno. I honestly have no idea why lisp isn't more widely used, or
whether, if we played back history and Perq had won the workstation
contest instead of Sun, lisp would be more widely used. Maybe there'd
be hundreds of people declaring how wonderfully clear variant records
are? I dunno.

Mark.
-----
"C is a flexible programming language that gives a great deal of
freedom to the programmer. This freedom is the source of much of its
expressive power and one of the main strengths of C, making it
powerful, versatile and easy to use in a variety of application areas.
However, undisciplined use of this freedom can lead to errors."
      from: "C An Advanced Introduction"
             Gehani

jeff@aiai.ed.ac.uk (Jeff Dalton) (01/17/91)

In article <1991Jan15.230926.25923@cs.rochester.edu> miller@cs.rochester.edu (Brad Miller) writes:
>In article <3946@skye.ed.ac.uk> jeff@aiai.UUCP (Jeff Dalton) writes:
>|I don't think Lisp will do all that well if only AI programmers use
>|Lisp.  For one thing, it's not clear that AI programmers will continue
>|to use Lisp.  There's already a tendency to move to C, especially for
>|applications that are fairly well understood.  (If you know how to
>|write it before you start, you can write it in C.)  
>
>If you know how to write it, it isn't AI.

Well, then, much of commercial AI isn't AI.

jeff@aiai.ed.ac.uk (Jeff Dalton) (01/17/91)

In article <97216@aerospace.AERO.ORG> srt@aerospace.aero.org (Scott "TCB" Turner) writes:
>It's been my perception that LISP gets used to build vaguely-defined,
>evolving systems.  Much AI fits in that category.  LISP and the
>programming environment that comes with it is amenable to a tinkering,
>incremental approach to problem solving.

But that's not all Lisp is good for.  I don't think there's anything
in the nature of Lisp that means it must be worse than C at the tasks
for which C is used.

>On the other hand, when the problem and its solution are well-defined,
>a language like C is a more likely choice.  The code is written, the 
>executable delivered, and then set aside until a round of bug fixes.

But better programming environments are needed for C and are being
built.  That is, C programming will get more Lisp-like, so far as the
environment is concerned.  And this will make other differences
between the languages more important.

I think you are right to suggest that it's much more straightforward
to deliver the executable when using C.  But C also has advantages
when delivering the source.  Because more machines come with C
compilers than with Lisps, C is in practice more portable (even
though, as a language, it seems to provide more opportunities for
machine-dependence).  C also tends to be much more efficient at
certain tasks, such as processing text files, and tends to produce
smaller executables.  C technology is often fairly primitive, or at
least simple.  But it works well enough.  For example, Lisp systems
seem to have to go to a lot of effort to get rid of procedures that
will not be used, while in C they tend not to be included in the first
place.  Lisp's ability to load new procedures at run time, etc, is
more sophisticated, but is often more than is needed.

-- jeff

oz@yunexus.yorku.ca (Ozan Yigit) (01/17/91)

In article <5420@idunno.Princeton.EDU> eliot@phoenix.Princeton.EDU
(Eliot Handelman) writes:

>Simple, yet elegant.

Still is. ;-)

;;; fib with peano arithmetic and call/cc (by Kent Dybvig)

(define addc
   (rec addc
      (lambda (x y k)
         (if (zero? y)
             (k x)
             (addc (1+ x) (1- y) k)))))

(define fibc
   (rec fibc
      (lambda (x c)
         (if (zero? x)
             (c 0)
             (if (zero? (1- x))
                 (c 1)
                 (addc (call/cc (lambda (c) (fibc (1- x) c)))
                       (call/cc (lambda (c) (fibc (1- (1- x)) c)))
                       c))))))


And just in case you are not convinced, here is another expert
opinion. 

The wonderful thing about Scheme is:
Scheme is a wonderful thing.
Complex procedural ideas
Are expressed via simple strings.
Its clear semantics, and lack of pedantics,
Help make programs run, run, RUN!
But the most wonderful thing about Scheme is:
Programming in it is fun,
Programming in it is FUN!

                        tigger@hundred-acre-wood.milne.disney
                        forwarded by ramsdell@linus.uucp


cheers...	oz
---
Where the stream runneth smoothest,   | Internet: oz@nexus.yorku.ca 
the water is deepest.  - John Lyly    | UUCP: utzoo/utai!yunexus!oz  

jeff@aiai.ed.ac.uk (Jeff Dalton) (01/17/91)

In article <5569@turquoise.UUCP> ahlenius@motcid.UUCP (Mark Ahlenius) writes:
>djb@babypuss.mitre.org (David J. Braunegg) writes:

>>OK.  What are the problems preventing a smaller, more efficient Lisp
>>so that we aren't forced to use the almost-a-programming-language C?
>
>There is a smaller, faster dialect of CL out there and as far as I
>know it is being taught in some of the major universities - it's Scheme.

I'm not sure the Scheme folk think of their language as a "dialect
of CL".  A dialect of Lisp, yes, but not a Common one.  Nor am I
convinced that Scheme will be faster.  Standard Scheme, at least,
lacks many of the efficiency tricks (e.g., declarations) available
in CL.  However, Scheme is smaller and, consequently, easier to
implement and (for the most part) easier to understand fully.
It is still difficult to implement an efficient Scheme, despite
its size, but at least the effort will be concentrated on fewer
constructs.

Much of the reason Common Lisp appears to be so big is, in my opinion,
a matter of organization and presentation.  If we start with C and
then add in various libraries, it starts to look fairly large too.  On
the other hand, it's easier with C to distinguish the essential core
from the rest.  There isn't any reason, other than historical, why
Common Lisp couldn't be presented, and even implemented, in a more
C-like way, as a language plus libraries of procedures and data types.
The core language would still be larger than C, but it would also 
have greater capabilities.

A different Lisp design that tries to get some of the advantages of
both Common Lisp and Scheme is EuLisp, a Lisp being developed mostly
in Europe (hence the name).  The conceptual and implementational
complexity of EuLisp is controlled by the use of two mechanisms:
levels and modules.  

There are 3 levels in EuLisp, each an extension of the one below.
Level 0 is a "kernel" Lisp, not too far from Scheme.  Level 1 is about
the size and scope of Le Lisp or Franz Lisp.  Level 2 is close to
Common Lisp.  The advantage of levels over three separate languages
is, of course, that they fit together in a coherent way.

In addition, constructs with related functionality (a data type and
procedures for operating on its instances, for example) are often
packaged together in a module.  If your program makes use of such
facilities, you must request the appropriate module, just as you must
include the appropriate ".h" file in C.

Modules are a finer division than levels.  An implementation aims
at a particular level, and each level has certain modules as standard.
An example of a difference between levels is that level 0 has only
the most basic mechanisms for defining new classes, while higher
levels have capabilities similar to those of CLOS.

-- Jeff

jeff@aiai.ed.ac.uk (Jeff Dalton) (01/17/91)

In article <18944@ultima.socs.uts.edu.au> dcorbett@phobos.socs.uts.edu.au (Dan Corbett) writes:

>2) Compare McCarthy's description to Common Lisp, and see where the authors
>	of CL have completely deviated from the original intent of Lisp.

This is an interesting claim.  Can you flesh it out a bit so I don't
have to do all that textual analysis?

oz@yunexus.yorku.ca (Ozan Yigit) (01/17/91)

In article <3954@skye.ed.ac.uk> jeff@aiai.UUCP (Jeff Dalton) writes:
> ...... Nor am I
>convinced that Scheme will be faster.

I know life is too short to even attempt to convince you, :-) so let me
just say that if you ever want to find out for sure, at least try out an 
industrial-strength implementation (such as Chez etc.) for your tests.
Who knows, you may be surprised.

>	...  Standard Scheme, at least,
>lacks many of the efficiency tricks (e.g., declarations) available
>in CL.

Scheme literature available thus far (Steele [1], Dybvig [2], Kranz et al.
[3], to mention a few) seems to suggest that Scheme may not need much in
the way of efficiency tricks (except perhaps to indicate to the compiler
that built-in functions will not be redefined) to be compiled and
optimized properly.  [On the other hand, arguably a case may be made for
additional constructs for even *better* results.]

>It is still difficult to implement an efficient Scheme, despite
>its size ...

Give me a unit of measure for your understanding of *efficient*, so that
we'll know what this new claim is all about.

oz
---
[1]  Guy Lewis Steele Jr., Rabbit: a Compiler for Scheme, MIT AI Lab
     Technical Report 474, Massachusetts Institute of Technology,
     Cambridge, Mass., May 1978.

[2]  R.  Kent  Dybvig,  Three  Implementation  Models   for   Scheme,
     Department  of  Computer Science Technical Report #87-011 (Ph.D.
     Dissertation), University of  North  Carolina  at  Chapel  Hill,
     Chapel Hill, North Carolina, April 1987.

[3]  David Kranz, Richard Kelsey, Jonathan A. Rees, Paul Hudak, James
     Philbin and Norman I. Adams, Orbit: An Optimizing Compiler for
     Scheme, Proceedings of the SIGPLAN '86 Symposium on Compiler
     Construction, June 1986, 219-233.

---
Where the stream runneth smoothest,   | Internet: oz@nexus.yorku.ca 
the water is deepest.  - John Lyly    | UUCP: utzoo/utai!yunexus!oz

  

norvig@sequoia.berkeley.edu (01/17/91)

No.  Lisp is not dead.  In fact, it is thriving: Scheme is being taught
to freshmen at a number of the best universities, and they will have
increasing influence as time goes on.

Lisp is a survivor.  FORTRAN is a few years older; COBOL a few years
younger.  All three have flaws, but persist because they have significant
user communities.  

Like Algol, Lisp is more than a language; it is a style of languages.
Algol itself is dead, but the ideas live on in Pascal and Modula.
Similarly, any language with dynamic types, garbage collection, and
functional closures is in some sense a dialect of Lisp.  So if the world
adopts ML, or Haskell, or some yet-to-be-defined variant, then the influence
of Lisp lives on.

jdudeck@polyslo.CalPoly.EDU (John R. Dudeck) (01/17/91)

>>2) Compare McCarthy's description to Common Lisp, and see where the authors
>>	of CL have completely deviated from the original intent of Lisp.
>
>This is an interesting claim.  Can you flesh it out a bit so I don't
>have to do all that textual analysis?

When I first tried to learn Lisp, somebody loaned me a copy of McCarthy's
book on LISP 1.5.  It was about 1/4" thick, and I think most of that was
background.  I think you can describe the whole language in about 2 pages.

Sorry I can't be more helpful.  I wish I had that book!

-- 
John Dudeck                                        "Communication systems are
jdudeck@Polyslo.CalPoly.Edu                              inherently complex".
ESL: 62013975 Tel: 805-545-9549                                 -- Ron Oliver

SEB1525@mvs.draper.com (01/17/91)

There's one thing about LISP that makes it superior to C which
everyone else seems to have missed.  For me, this is the major factor
in why Lisp is a much easier language to develop software in.  It's
got nothing to do with "AI" (whatever you think it is) or
fancy-schmancy programming environments.

The Big Win is:  Storage management in Lisp is a non-issue.

Just about every nontrivial programming task or computer algorithm
involves scarfing up bunches of information the size of which is unknown
until run time.  Call them lists, arrays, structures, objects, or what
have you, in Lisp all you have to do is CONS (or MAKE-FOO) them up as
you need them.  In C you have to estimate how much memory they'll take
up, figure out how to handle things when they grow bigger than the
memory you've allocated, put in code to barf when the memory isn't
available, etc., etc.

I'd say that a very significant chunk of C programming is devoted to
this pain-in-the-ass storage management stuff.  And that's not even
considering the problem of string-building, where you can shoot
yourself in the foot so easily without feeling a thing.  When you
write code in Lisp, you can concentrate on the problem to be solved,
not on why the system crashed while you were trying to solve it.

The down side of this, of course, is that storage requirements can
balloon far beyond what is optimally needed, and garbage collection
is necessary.  But modern GC technology and clever compilers go a
long way toward mitigating these effects.

Someone on this list said:

>On the other hand, when the problem and its solution are well-defined,
>a language like C is a more likely choice.  The code is written, the
>executable delivered, and then set aside until a round of bug fixes.

Point is, that round of bug fixes lasts virtually forever in C, whereas
in Lisp the code is much more likely to be nearly bug-free at delivery.

                                                  - SEB

lgm@cbnewsc.ATT.COM (lawrence.g.mayka) (01/17/91)

In article <22650@well.sf.ca.us> nagle@well.sf.ca.us (John Nagle) writes:
	Symbolics closed at 7/16 yesterday, down 3/16.  The year's high was 10.,
   so the stock has lost 96% of its value in the last year.  The end must be
   near.

Don't speak too soon.  Symbolics has no long-term debt, and has been
profitable for the last two reported quarters (April-June and
July-September of 1990).  It closed Wednesday at 3/4, according to the
Chicago Tribune.


	Lawrence G. Mayka
	AT&T Bell Laboratories
	lgm@iexist.att.com

Standard disclaimer.

alms@cambridge.apple.com (Andrew L. M. Shalit) (01/18/91)

In article <3954@skye.ed.ac.uk> jeff@aiai.ed.ac.uk (Jeff Dalton) writes:

   There isn't any reason, other than historical, why
   Common Lisp couldn't be presented, and even implemented, in a more
   C-like way, as a language plus libraries of procedures and data types.

Offhand, I disagree with this.

It's true, Common Lisp has many features.  But these features are
often used to implement other features.  In other words, a CL
implementation has a very tangled call tree.  It's hard to find
portions of the language which could be removed.  If you put APPEND,
ASSOC, MEMBER, REVERSE, and MAPCAR into a separate module (as EuLisp
does, I believe) chances are that every implementation is going to
have them in the kernel anyway.  Hash-tables are used to implement
packages.  Format is used for error messages and other system io.
Sequence functions are used all over the place, etc.  The only real
candidate for separability I can think of is non-integer numerics.

     -andrew
--

jeff@aiai.ed.ac.uk (Jeff Dalton) (01/18/91)

In article <20544@yunexus.YorkU.CA> oz@yunexus.yorku.ca (Ozan Yigit) writes:
>In article <3954@skye.ed.ac.uk> jeff@aiai.UUCP (Jeff Dalton) writes:
>> ...... Nor am I
>>convinced that Scheme will be faster.
>
>I know life is too short to even attempt to convince you, :-) so let me
>just say that if you ever want to find out for sure, at least try out an 
>industrial-strength implementation (such as Chez etc.) for your tests.
>Who knows, you may be surprised.

Suppose I use T.  Would that count?  (I am willing to try your test
some time, if I can do it for free.  There is no money for Lisp here
these days.)

>>	...  Standard Scheme, at least,
>>lacks many of the efficiency tricks (e.g., declarations) available
>>in CL.
>
>Scheme literature available thus far (Steele [1], Dybvig [2], Kranz et al.
>[3], to mention a few) seems to suggest that Scheme may not need much in
>the way of efficiency tricks (except perhaps to indicate to the compiler
>that built-in functions will not be redefined) to be compiled and
>optimized properly.  [On the other hand, arguably a case may be made for
>additional constructs for even *better* results.]

I have no problem with the idea that Scheme's control structures,
including call/cc, can be implemented efficiently.  Ditto lists,
function calls, ...  The things I was thinking of were more like:

  *  Fixnum arithmetic.

  *  Dynamic extent declarations.

>>It is still difficult to implement an efficient Scheme, despite
>>its size ...
>
>Give me a unit of measure for your understanding of *efficient*, so that
>we'll know what this new claim is all about.

By efficient, I mean something like: as fast as C.

But I think the "difficult" is more important.  An efficient Scheme
requires a lot of attention to garbage collection and compiler
technology, and this tends to make the small size of the language
less significant.

gateley@rice.edu (John Gateley) (01/18/91)

In article <ALMS.91Jan17145321@ministry.cambridge.apple.com> alms@cambridge.apple.com (Andrew L. M. Shalit) writes:
   In article <3954@skye.ed.ac.uk> jeff@aiai.ed.ac.uk (Jeff Dalton) writes:
      There isn't any reason, other than historical, why
      Common Lisp couldn't be presented, and even implemented, in a more
      C-like way, as a language plus libraries of procedures and data types.
   Offhand, I disagree with this.
   It's true, Common Lisp has many features.  But these features are
   often used to implement other features.  In other words, a CL
   implementation has a very tangled call tree.

By choosing an appropriate set of primitives, you can get a small core
library with the property that the majority of functions in the CL
library will call only members of the core library (or the core
library plus a small set of others). This gives you the needed
untanglement.

John
gateley@rice.edu

--
"...Yes, I've got some questions that are guaranteed to shake you up. How
much marriage urges a windmill to paint infinity? Is a magic hide-a-bed
the vile home of spanish fire? Is firm corn merrier under gifts of less
important love? We wonder ..." The Residents

kpc00@JUTS.ccc.amdahl.com (kpc) (01/18/91)

In article <LGM.91Jan17101158@cbnewsc.ATT.COM> lgm@cbnewsc.ATT.COM
(lawrence.g.mayka) writes:

   In article <22650@well.sf.ca.us> nagle@well.sf.ca.us (John Nagle)
   writes: Symbolics closed at 7/16 yesterday, down 3/16...

   Don't speak too soon.  Symbolics has no long-term debt...

It is odd to see a company seemingly on the brink of obsolescence and
obscurity whose products are coveted by so many.

Several research laboratories use Symbolics' machines, and the loyalty
seems high.  Former users say they'd do anything to be using Symbolics
machines.  Yet we debate how close to bankruptcy the company might be!

I'm about to start using Symbolics' machines, and I look forward to
experiencing what many call an excellent environment.

But I wonder whether the machines will go the way of the horse and
carriage soon.  What happens to the supplies of oats and stirrups and
so on that I will have collected, I wonder?
--
If you do not receive a reply from me, please resend your mail;
occasionally this site's mail gets delayed.

Neither representing any company nor, necessarily, myself.

rh@smds.UUCP (Richard Harter) (01/18/91)

In article <2795309d.5117@petunia.CalPoly.EDU>, jdudeck@polyslo.CalPoly.EDU (John R. Dudeck) writes:

: In an article nagle@well.sf.ca.us (John Nagle) wrote:
: >     LISP, in its grander forms, was intended to support an environment
: >in which everything was fluid, in which programs could modify themselves,
: >examine their own inner workings in a reasonable representation,
: >and in which a program could create new sections of program in an 
: >integrated way.  The dream was of programs that used these facilities
: >to improve themselves.  Lenat's AM and Eurisko actually did so; few
: >other programs ever did.

: >     Unfortunately, that didn't seem to lead much of anywhere.

: I would say "fortunately".  It's hard enough to control complexity when
: we are able to analyze and decompose a problem and nail things down 
: in the solution as much as possible.  I hope I never have to maintain
: a program that is fluid.

You miss the point -- you wouldn't have to maintain such a program; it
would maintain itself.  :-)

But seriously, the objective was to bypass the complexity of program
building by having self-building programs.  It may be that the ultimate
failure of the early attempts at auto-programming was really due to
their not having a built-in understanding of software complexity.
-- 
Richard Harter, Software Maintenance and Development Systems, Inc.
Net address: jjmhome!smds!rh Phone: 508-369-7398 
US Mail: SMDS Inc., PO Box 555, Concord MA 01742
This sentence no verb.  This sentence short.  This signature done.

oz@yunexus.yorku.ca (Ozan Yigit) (01/18/91)

In article <3965@skye.ed.ac.uk> jeff@aiai.UUCP (Jeff Dalton) writes:
> ...  The things I was thinking of were more like:
>
>  *  Fixnum arithmetic.
>  *  Dynamic extent declarations.

Yes, I realized that later.  The first would be very useful; the second,
for the purposes of (fluid-let ...), can possibly be implemented with
minimal cost *and* without interfering with the correct optimization of
tail recursion, as in [1].

>By efficient, I mean something like: as fast as C.

Ah, I thought you probably meant only as fast as an average CommonLipth
with all tricks turned on. ;-)

>... An efficient Scheme
>requires a lot of attention to garbage collection ...

No more or no less than anything else that has a need for GC ...

>... and compiler technology, 

Aw, come on.  You well know that there is a lot of mileage on this one.
Writing an average compiler for R3.99RS/IEEE Scheme is essentially
trivial [in comparison to something like C, for example].  Further, it is
my experience that some low-cost, common-sense optimizations (that don't
require a detailed side-effect analysis as in Rabbit and others) still pay
off handsomely.  It just depends on how far you want to go. (!!)

>and this tends to make the small size of the language less significant.

See (!!).  The ever-growing number of Scheme implementations seems to
suggest that the small size and reduced complexity of the language *is*
significant.  It *is* possible to have a small, standard-compliant,
useful and fast Scheme.

oz
---
[1] B.F. Duba, M. Felleisen, D. P. Friedman, Dynamic Identifiers can
    be Neat, Computer Science Technical Report 220, Indiana University,
    Bloomington, Indiana, April 1987.

---
We only know ... what we know, and    | Internet: oz@nexus.yorku.ca 
that is very little. -- Dan Rather    | UUCP: utzoo/utai!yunexus!oz

mike@apl (Mike Clarkson) (01/18/91)

In article <5569@turquoise.UUCP> ahlenius@motcid.UUCP (Mark Ahlenius) writes:
>djb@babypuss.mitre.org (David J. Braunegg) writes:
>
>>>Lack of demand due to Common LISP's enormous size, complexity, resource
>>>requirements, training, etc.
>>>
>>>Common LISP effectively died from obesity.
>
>>OK.  What are the problems preventing a smaller, more efficient Lisp
>>so that we aren't forced to use the almost-a-programming-language C?
>
>There is a smaller, faster dialect of CL out there and as far as I
>know it is being taught in some of the major universities
>- it's Scheme.

I'm not sure that Scheme is really smaller.  I am sure that, in my
benchmarking of almost every dialect of Scheme available, Franz and Lucid
CLs are faster.  And this is without declarations, which are used
regularly in CL.

As for being smaller, it depends on what you mean by smaller.  Yes, the
languages as defined by their respective standards make Scheme a smaller
dialect of Lisp.  But bear in mind that the Scheme standard does not define
many things Lisp programmers consider essential, such as macros.  MIT makes
probably the most complete Scheme working environment, which includes
important elements of a Lisp programming environment such as macros,
packages, an inspector, etc.; compiled, it is over 35 Mbytes.
Not small by my measure.

> There appears to be some renewed interest in Scheme lately.

There is a lot of interest in Scheme, for very good reasons.  But speed and
size are not two of them.

Mike.


--
Mike Clarkson					mike@ists.ists.ca
Institute for Space and Terrestrial Science	uunet!attcan!ists!mike
York University, North York, Ontario,		FORTRAN - just say no. 
CANADA M3J 1P3					+1 (416) 736-5611

jeff@aiai.ed.ac.uk (Jeff Dalton) (01/18/91)

In article <NETMAILR11011708110SEB1525@MVS.DRAPER.COM> SEB1525@mvs.draper.com writes:

>The Big Win is:  Storage management in Lisp is a non-issue.

It's certainly a big win, but it's not quite a non-issue.  For
example, Lisp programmers are often advised to resort to explicit
storage management in order to avoid generating garbage, and some
significant applications have been carefully written so that no
garbage is generated.  This is, presumably, less important with
modern "non-intrusive" gc techniques, but I don't think the problem
has yet been reduced to zero.

BTW, I complain about Lisp only because I want it to get better.
I would much rather use Lisp than C, but there are times when I
have to use C because the available Lisps are too big or too slow
for whatever it is I want to do.

>I'd say that a very significant chunk of C programming is devoted to
>this pain-in-the-ass storage management stuff.  And that's not even
>considering the problem of string-building, where you can shoot
>yourself in the foot so easily without feeling a thing.

Yes, but C is _much_ faster for such things as reading through
files (or at least it seems to be), in part because one uses
pre-allocated arrays rather than constructing strings.  (There's
nothing that prevents similar techniques from being used in Lisp
-- in principle -- but current implementations tend not to provide
them.)

>>On the other hand, when the problem and its solution are well-defined,
>>a language like C is a more likely choice.  The code is written, the
>>executable delivered, and then set aside until a round of bug fixes.
>
>Point is, that round of bug fixes lasts virtually forever in C, whereas
>in Lisp the code is much more likely to be nearly bug-free at delivery.

The problem is convincing the right people that this is so.  Many
people are used to seeing strong typing as a necessity, or at least
as an important factor in producing reliable code.  And Lisp looks
very unsafe to them.  (As does C, but for less "pervasive" reasons.)

-- JD

tim@cstr.ed.ac.uk (Tim Bradshaw) (01/18/91)

>>>>> On 17 Jan 91 22:33:02 GMT, gateley@rice.edu (John Gateley) said:

> By choosing an appropriate set of primitives, you can get a small core
> library with the property that the majority of functions in the CL
> library will call only members of the core library (or the core
> library plus a small set of others). This gives you the needed
> untanglement.

And as well as this it is relatively easy to disentangle the language
at a coarser level: leave CLOS, the new loop macro, and various other
big chunks of CL mentioned in CLtL2 out of the core of CL.  In fact I
would be fairly surprised & disappointed if CL implementations did *not*
do this!

--tim
Tim Bradshaw.  Internet: tim%ed.cstr@nsfnet-relay.ac.uk
UUCP: ...!uunet!mcvax!ukc!cstr!tim  JANET: tim@uk.ac.ed.cstr
"...wizzards & inchanters..."

jeff@aiai.ed.ac.uk (Jeff Dalton) (01/18/91)

In article <ALMS.91Jan17145321@ministry.cambridge.apple.com> alms@cambridge.apple.com (Andrew L. M. Shalit) writes:

>   There isn't any reason, other than historical, why
>   Common Lisp couldn't be presented, and even implemented, in a more
>   C-like way, as a language plus libraries of procedures and data types.

>Offhand, I disagree with this.

Well, for one thing, I don't think you're taking history enough
into account.  The large, monolithic Common Lisp systems we all
know and love were constructed that way for reasons that I would
say are essentially historical.  Implementors from a different
background and with different experience of Lisp and different
expectations of how it would be used might well have implemented
Common Lisp (the very same language) in a significantly different
way.

You later say that Common Lisp implementations have a "very tangled
call tree".  Now, maybe that is true.  But it's not necessary to
implement CL that way, and it looks like you haven't looked at any
implementations to see if it's even _been_ implemented that way.  (Not
that that would settle the matter, since I was talking about how the
language could be implemented, not how it was implemented.)

Of course, there's no doubt that the kernel of CL is bigger than that
of C.  But that doesn't mean that a separation between kernel and the
rest can't be made.

>It's true, Common Lisp has many features.  But these features are
>often used to implement other features.  In other words, a CL
>implementation has a very tangled call tree.  It's hard to find
>portions of the language which could be removed.

There seem to be a fair number of people (especially in the UK?) who
think this is so and, moreover, must be so.  However, after citing
format, packages, and maybe multiple values they tend to run out of
examples.  Perhaps there are many more examples, but it requires some
careful investigation to determine just how far the problem extends.

My feeling is that the impression of a tangled language is due at
least in part to how CL has been presented, and -- I claim -- it could
be presented differently.  It is possible to extract a coherent subset
of CL, at least conceptually.  (Since I've done this a couple of
times, I think I have fairly good reason to think it _can_ be done.)
Of course, the language would be better in this respect if it had been
one of the goals of the design, but it is nonetheless possible to go
further in this direction than one might suppose.

>                                                 If you put APPEND,
>ASSOC, MEMBER, REVERSE, and MAPCAR into a separate module (as EuLisp
>does, I believe) chances are that every implementation is going to
>have them in the kernel anyway.  

That may well be so (except for ASSOC), but you need it to be so
for more than five functions for this to be significant evidence
for your claim that it is difficult to untangle the language.
Moreover, the kernel needn't use functions in their full generality.
Most likely, MEMQ is used rather than MEMBER, for example (and
the MEMQ calls perhaps compiled away).

BTW, Prolog implementations have append/2 in them, and yet users have
to write it themselves or load it from a library if they want to use
it.  So it's certainly possible to think of a language as not having
append even though it's "really" built-in.

>Hash-tables are used to implement packages.

Not necessarily.

>Format is used for error messages and other system io.

This is the most cited example, and it's not a very good one.  Many
format calls can be compiled away into calls on much simpler functions.
Moreover, system messages can be (and almost certainly are) written to
use only a subset of format's capabilities.

>Sequence functions are used all over the place, etc. 

Really?  Remove-if, perhaps?  There are a number of functions (and not
just sequence functions) that aren't used all over the place.  Look at
KCL, for example, where the more central functions are written in C
while others are written in Lisp.  And it would be possible to reduce
the size of the C kernel if this were felt to be sufficiently
desirable.

Remember too that libraries can be implemented, or connected to the
Lisp system, in various ways that reduce the impact of procedures not
actually in use: e.g., autoloading, shared libraries.

-- JD

oz@yunexus.yorku.ca (Ozan Yigit) (01/19/91)

In article <17550@ists.ists.ca> mike@apl.ists.ca (Mike Clarkson) writes:

>As for being smaller, it depends on what you mean by smaller. 

Sure. It all depends. I, for example, like this definition: "smaller" as
it relates to the time it takes an average 16-year-old high-school student
with a Mac Classic and a C compiler to implement the entire language. ;-)

>But bear in mind that the Scheme standard does not define
>many things lisp programmers consider essential, such as macros.

That depends on what you mean by essential. If I were to take some of the
discussion in this newsgroup as a crude measure, I would probably conclude
that anything in the silver book is "essential", with all that implies.
As for macros, the issue has never been "doing it" but rather "doing it
right", just to reiterate in case this point hasn't been made often
enough. A macro facility is typically included in just about every
implementation of the language, and most of those also support a more
unified extend-syntax (Dybvig) facility. A proposal for a very powerful
and hygienic macro facility is being completed for some edition of the
revised report.

> MIT makes probably the most complete Scheme working environment, ...

which depends on what you mean by a most complete working environment,
which turns out to change in some circles just about every week. ;-)

... oz
---
We only know ... what we know, and    | Internet: oz@nexus.yorku.ca 
that is very little. -- Dan Rather    | UUCP: utzoo/utai!yunexus!oz

gregor@parc.xerox.com (Gregor Kiczales) (01/19/91)

In article <TIM.91Jan18130158@kahlo.cstr.ed.ac.uk> tim@cstr.ed.ac.uk (Tim Bradshaw) writes:

   And as well as this it is relatively easy to disentangle the language
   at a coarser level: leave CLOS, the new loop macro, and various other
   big chunks of CL mentioned in CLtL2 out of the core of CL.  In fact I
   would be fairly surprised & disappointed if CL implementations did *not*
   do this!

Actually, I would think there were better things to leave out of the
core implementation.  In fact, I would think that using CLOS in the core
is a good idea.  Its runtime can be quite small, and using it there can
provide a foundation for extensibility that many users want.

What I would leave out of the kernel implementation is stuff like the
hairy sequence functions, format and the like.  I think of these as
libraries, which can easily be separated.  It would seem that, in most
implementation strategies, these things would have a larger runtime and
be more intertwined with one another.

jwz@lucid.com (Jamie Zawinski) (01/19/91)

In article <3953@skye.ed.ac.uk> jeff@aiai.ed.ac.uk (Jeff Dalton) wrote:
>
> I think you are right to suggest that it's much more straightforward
> to deliver the executable when using C.  But C also has advantages
> when delivering the source.  Because more machines come with C
> compilers than with Lisps, C is in practice more portable (even
> though, as a language, it seems to provide more opportunities for
> machine-dependence).

Not to restart the Language War to End All Language Wars again, but...  I
really disagree with you that C source is more portable than Lisp!  How many
bits is an "int"?  A "short"?  What kind of padding and alignment nonsense is
inserted into structures?  In Lisp these sorts of issues almost never matter,
but in C they almost always do.  If something is written in Common Lisp,
you're pretty much guaranteed it will work in any CL.  If something is 
written in K&R C, all bets are off.

		-- Jamie

jjacobs@well.sf.ca.us (Jeffrey Jacobs) (01/22/91)

While you're digging thru old AI Experts, grab the March '88 Brainwaves
column...

Jeffrey M. Jacobs
ConsArt Systems Inc, Technology & Management Consulting
P.O. Box 3016, Manhattan Beach, CA 90266
voice: (213)376-3802, E-Mail: 76702.456@COMPUSERVE.COM

ceb@csli.Stanford.EDU (Charles Buckley) (01/23/91)

In article <NETMAILR11011708110SEB1525@MVS.DRAPER.COM> SEB1525@mvs.draper.com writes:

   There's one thing about LISP that makes it superior to C which
   everyone else seems to have missed.  For me, this is the major factor
   in why Lisp is a much easier language to develop software in.  It's
   got nothing to do with "AI" (whatever you think it is) or
   fancy-schmancy programming environments.

Finally, here's someone else saying: LISP !-> AI

   The Big Win is:  Storage management in Lisp is a non-issue.

Exact.  Makes prototyping a breeze.  However, I'd still like to do my
own storage management in Lisp when my algorithm makes that possible
and expedient.  It's simply not provided for in CL.  A major vendor
who shall remain nameless claims to have no interest in providing for
it as an extension, either.  Harumpf.  At least Symbolics gave you
with-stack-array (or equiv.).

Solve this and the run-time library logistics, and you'd go a long way
to silencing the Lisp-bashers (at least the ones who speak from
knowledge, as opposed to a simple desire to raise hell).

barmar@think.com (Barry Margolin) (01/23/91)

In article <17374@csli.Stanford.EDU> ceb@csli.Stanford.EDU (Charles Buckley) writes:
>Exact.  Makes prototyping a breeze.  However, I'd still like to do my
>own storage management in Lisp when my algorithm makes that possible
>and expedient.  It's simply not provided for in CL.  A major vendor
>who shall remain nameless claims to have no interest in providing for
>it as an extension, either.  Harumpf.  At least Symbolics gave you
>with-stack-array (or equiv.).

I'm not sure precisely what type of extension you're referring to.  Both
Symbolics and Lucid provide "resources", which are pools of objects that
can be reused rather than consing new objects each time (I expect Allegro
CL also has resources, but I'm not familiar with its extensions).

The ANSI CL proposal will include a DYNAMIC-EXTENT declaration, which
specifies that the value of the specified variable will only be accessed
within the dynamic extent of the binding form.  Compilers can optimize this
into stack allocation of data structures.  This is similar to Symbolics's
STACK-LET, and I think Lucid already supports such a declaration.
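
A rough sketch of what such a declaration might look like (assuming the
proposed syntax; whether the compiler actually stack-allocates is up to
the implementation, and the list must not be used after the function
returns):

(defun sum-of-squares (n)
  ;; TEMPS never escapes this function, so we declare it to have
  ;; dynamic extent; a supporting compiler may stack-allocate it
  ;; instead of consing it in the heap.
  (let ((temps (make-list n :initial-element 0)))
    (declare (dynamic-extent temps))
    (loop for i from 1 to n
          for cell on temps
          do (setf (car cell) (* i i)))
    (reduce #'+ temps)))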

--
Barry Margolin, Thinking Machines Corp.

barmar@think.com
{uunet,harvard}!think!barmar

espen@math.uio.no (Espen J. Vestre) (01/23/91)

First, this talk about lisp being dead puzzles me.  At least in european 
academic institutions, it doesn't seem to be dead at all.  Quite the 
contrary.  And among the new lisp installations I have seen on recent 
visits to certain universities are Symbolics lispmachines-on-a-card.
(Alleged problems by Symbolics initiated this discussion, I understand)

Second, why is it claimed that CL is "too big"?  What does that mean?  
That applications tend to take up much space? That lisp systems take up 
much space?  Anyway, it is certainly possible to make compact CL systems.  
My own favourite system takes up only 790K on my harddisk.  And that 
includes all of CL (but not yet CLOS) with compiler, editor and 
window-interface.  In addition there is a 200K documentation system (all 
of CL) and 200K of other files.  Other programming languages aren't much 
better than that.  Certain well-known unix CL implementations are much 
bigger than this (esp. on RISC machines), which probably is caused by 
a "speed first, and leave the space to unix VM"-attitude.  Clumsy 
solutions with respect to space are not limited to lisp, however.  On one well-
known brand of unix work-stations, the whole X library has to be linked 
with any application that uses it, which makes even a small clock a 
several-hundred-K application.

However, harddisk (or even RAM) space is not a real problem any more, is 
it?

-----------------------------------------
Espen J. Vestre                 
Department of Mathematics
University of Oslo
P.o. Box 1053 Blindern
N-0316 OSLO 3
NORWAY                            espen@math.uio.no
-----------------------------------------

davis@barbes.ilog.fr (Harley Davis) (01/23/91)

In article <3944@skye.ed.ac.uk> jeff@aiai.ed.ac.uk (Jeff Dalton) writes:

   In article <WELCH.91Jan10095911@sacral.cis.ohio-state.edu> welch@sacral.cis.ohio-state.edu (Arun Welch) writes:

   >But, to go back to the original question, I think the waters are
   >muddied a tad by the fact that Lisp is still pretty strong outside the
   >US, 

   It is?  Where?

France, for example, where Le-Lisp is still becoming more and more
popular.

-- Harley
--
------------------------------------------------------------------------------
nom: Harley Davis			ILOG S.A.
net: davis@ilog.fr			2 Avenue Gallie'ni, BP 85
tel: (33 1) 46 63 66 66			94253 Gentilly Cedex, France

jeff@aiai.ed.ac.uk (Jeff Dalton) (01/23/91)

Re: It's true, Common Lisp has many features.  But these features are
    often used to implement other features.  In other words, a CL
    implementation has a very tangled call tree.  It's hard to find
    portions of the language which could be removed.

One reason why it may be hard to see some underlying simplicity in
Common Lisp is that it is sometimes hidden by being "below" the level 
described in CLtL.

Streams are perhaps the easiest example to dissect.  The various
stream types (broadcast-stream, echo-stream, etc) appear as primitives,
but only a few of them _have_ to be primitives.  The others could be
built up by defining new structures and extending the existing
operations (read, print, etc) to recognize the new stream types.  That
would be a fairly straightforward thing to do if the stream operations
were generic functions (as in CLOS).

In an implementation with a well-integrated CLOS, streams might even
be implemented that way.  However, when CL was defined there wasn't
a standard object system and so most implementations have used
internal object-like mechanisms of their own.

Here we can see a break in the history leading to CL that has
been partially repaired by the addition of CLOS.  In (ITS) MacLisp,
a number of stream operations were implemented using a simple object-
like facility called "software file arrays" (SFAs).  In Lisp Machine
Lisp, it was done with Flavors.  The reason the repair is partial
is that CLOS was a sufficiently late addition that the stream
operations were not specified as generic functions.  They may be
in particular implementations, but portable code can't rely on it.
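
To make the point concrete, here is a sketch (not portable CL - the
names are invented, since the standard does not specify stream
operations as generic functions) of how an echo stream could be
layered on a generic read operation:

(defclass my-echo-stream ()
  ((input  :initarg :input)
   (output :initarg :output)))

(defgeneric my-read-char (stream)
  (:documentation "Like READ-CHAR, but a generic function."))

;; A new stream type is just a new class plus a method; nothing in
;; the core reader has to know about it.
(defmethod my-read-char ((s my-echo-stream))
  (let ((ch (read-char (slot-value s 'input))))
    (write-char ch (slot-value s 'output))
    ch))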

-- jd

otto@canon.co.uk (G. Paul Otto) (01/24/91)

In article <1991Jan23.094925.12728@ulrik.uio.no> espen@math.uio.no (Espen J. Vestre) writes:
>
>However, harddisk (or even RAM) space is not a real problem any more, is 
>it?
>
You jest, I presume?  Many real applications consume more memory (or disc)
space than current machines tend to have.  For example, I used to work on
stereo matching of satellite images - each image was 36 Mb; this would grow
to 144Mb if you used floating point, rather than bytes, for each pixel; and
you wanted to work on a pair at once.  Of course, you could chop the images
into pieces - but the ensuing housekeeping & assorted complications considerably
complicated the job.  Just while prototyping one (comparatively simple) part of
this job, I had to completely rewrite a C program 3 times to get its size down.
[This wasn't bad planning - I tested the ideas on fragments with simple programs
- then keep refining on more & more realistic data.]

As processors (and cameras) become cheaper and more powerful, such tasks are
likely to become more common ...

Paul

cutting@parc.xerox.com (Doug Cutting) (01/24/91)

In article <NETMAILR11011708110SEB1525@MVS.DRAPER.COM> SEB1525@mvs.draper.com writes:

   The Big Win is:  Storage management in Lisp is a non-issue.

No.  Garbage collectors don't make storage management a non-issue, they
just make it less of an issue (unless you don't care about performance).

In article <17374@csli.Stanford.EDU> ceb@csli.Stanford.EDU (Charles Buckley) writes:

   However, I'd still like to do my own storage management in Lisp
   when my algorithm makes that possible and expedient.  It's simply
   not provided for in CL. 

I'm not sure what you're talking about.  One can easily do one's own
storage management in CL.  It's not considered exemplary coding style,
but when used judiciously can provide substantial acceleration (e.g.
for long-lived objects under a generation scavenging GC).

(defvar *cons-cells* () "Free list of CONS cells linked by CDR")

(defun make-cell (car cdr)
  "Return a cons, reusing one from *CONS-CELLS* when available."
  (let ((cell *cons-cells*))
    (cond
      (cell
       ;; Pop a cell off the free list and reinitialize it.
       (setq *cons-cells* (cdr *cons-cells*))
       (setf (car cell) car)
       (setf (cdr cell) cdr)
       cell)
      ;; Free list empty: fall back to a fresh cons.
      (t (cons car cdr)))))

(defun reclaim-cell (cell)
  "Return CELL to the free list; the caller must not use it afterward."
  (setf (car cell) nil)   ; drop the reference so the gc can collect it
  (setf (cdr cell) *cons-cells*)
  (setq *cons-cells* cell)
  nil)

This could be done with generic functions ALLOCATE and FREE, but this
is not usually needed and adversely affects performance.
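
A brief usage sketch, for concreteness (the names are from the code
above; note that cells must be reclaimed innermost-first, since
RECLAIM-CELL overwrites the CDR):

;; Build a two-cell list out of reused cells, then hand them back
;; when done instead of waiting for the gc.
(let ((l (make-cell 'a (make-cell 'b nil))))
  ;; ... use L ...
  (reclaim-cell (cdr l))
  (reclaim-cell l))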

	Doug

mayer@hplabsz.HP.COM (Niels Mayer) (01/24/91)

In article <1991Jan23.094925.12728@ulrik.uio.no> espen@math.uio.no (Espen J. Vestre) writes:
>Second, why is it claimed that CL is "too big"?  What does that mean?  
>That applications tend to take up much space? That lisp systems take up 
>much space?  Anyway, it is certainly possible to make compact CL systems.  
>My own favourite system takes up only 790K on my harddisk.

How much space a program takes up on a hard disk is not very important
-- disk is relatively cheap compared to RAM. More important, what is
the runtime size of your application? Most of the CL impls I've seen
take up megabytes of valuable RAM and swap space; add a substantial
application on top of that and 10-50 megabyte runtime images become
commonplace.  Generally, people won't notice this if you've got a
reasonable amount of RAM and the CL process is the one getting most of
the use.  A typical case for such usage is application prototyping. 

On the other hand, try delivering a nontrivial CL-based application
in which the application is but one of the processes competing for
resources on the workstation. Even worse, try delivering a CL-based
application into a multiuser, multiprocessing environment. You'll be
thrashing like crazy in no time. Users will hate you. Customers will
hate you. RAM manufacturers will be laughing all the way to the bank. 
      
>Clumsy 
>solutions with respect to space are not limited to lisp, however.  On one well-
>known brand of unix work-stations, the whole X library has to be linked 
>with any application that uses it, which makes even a small clock a 
>several-hundred-K application.

The whole X, Xt and widget set library does not need to be linked
into an X application such as your clock example.  Only the parts that
your linker determines are necessary. 

However, much more code gets linked in than is ever used.  That's
because much of the code needs to be there just in case an
infrequently used resource is set, customized, altered-at-runtime,
etc.  It's the moral equivalent of having eval linked into a
deliverable CL program (e.g. there's no way of proving that some
function or functionality can be removed given that data can be
interpreted and executed). Xtoolkit based applications have a
different form of data that can be interpreted -- resources. 

I find your comparison of CL against Xtoolkit based applications
interesting due to a particularly warped view of the Xtoolkit that I
hold.  Remember that anytime you're using the X toolkit, you are using
the nearest thing to Lisp's inefficiency, but packaged in an all-C
implementation. Everything done in the internals of Xt is dynamic, and
interpreted and very little can be optimized or compiled out at the
level of Xt's C implementation (e.g. Xt's dynamic method lookup). 

However, the customizability provided by the Xt implementation makes
applications far more useful, and far easier to integrate into a
variety of work styles and environments. Given shared libraries, and
multiple applications that share the same Xt & widget-set code, the
size of such applications becomes less of a memory burden. 

				----------

I personally think that Lisp is a wonderful language for prototyping
and extending systems at a high level of abstraction. I also think
that C is a very useful portable assembly language which should be
used when speed and size are important considerations. I use both
languages with equal ease inside WINTERP and I believe that hybrid
programming in both C and Lisp can give you the best of both worlds. 

-------------------------------------------------------------------------------
	    Niels Mayer -- hplabs!mayer -- mayer@hplabs.hp.com
		  Human-Computer Interaction Department
		       Hewlett-Packard Laboratories
			      Palo Alto, CA.
				   *

	Disclaimer: The above opinions are my own, and do not reflect
	the official corporate outlook of Hewlett-Packard.

djohnson@mbongo.ucsd.edu (Darin Johnson) (01/24/91)

In article <17374@csli.Stanford.EDU> ceb@csli.Stanford.EDU (Charles Buckley) writes:

>Finally, here's someone else saying: LISP !-> AI
>
>   The Big Win is:  Storage management in Lisp is a non-issue.
>
>Exact.  Makes prototyping a breeze.

OK, I'm another person who likes LISP but doesn't do "AI".  Lack of
worry about storage management is nice, but only part of the picture.
What I like is that I don't even have to specify I/O.  If I want to
test out an algorithm, I needn't worry about input routines to read
and parse everything or to get things printed back out.  This is
especially useful when I want to test a function that isn't the "main"
function.  In C I spend lots of time writing wrapper code just to test
functions.

Also, type declaration is optional (until you want to improve
compilation), compilation is optional, etc.  So in essence, you can
quickly whip up a program (I don't like the word prototype - sounds
too much like software eng), test it out, and then convert it to C or
something if you need to.  (and if you wrote it with the correct
mindset, converting to C is straightforward)
-- 
Darin Johnson
djohnson@ucsd.edu
  - Political correctness is Turing undecidable.

dean@cs.mcgill.ca (Dean NEWTON) (01/24/91)

In article <CUTTING.91Jan23141807@skye.parc.xerox.com> cutting@parc.xerox.com (Doug Cutting) writes:
>In article <NETMAILR11011708110SEB1525@MVS.DRAPER.COM> SEB1525@mvs.draper.com writes:
>
>   The Big Win is:  Storage management in Lisp is a non-issue.
>
>No, Garbage collectors don't make storage management a non-issue, they
>just make it less of an issue (unless you don't care about performance).


I suppose the same thing used to be said about virtual memory management.
And about compilation.  "It's okay for prototyping, or if you don't care
about performance.  But the real thing should be written in assembler."

Perhaps you could do it faster yourself, but hey, life's too short for me
to have to waste my time with bookkeeping details the computer should be
able to handle.

Compilation, virtual memory management, and garbage collection are entirely
mechanical processes, and, as we have already seen in the case of compilers,
computers can become better at them than most programmers.

Kaveh Kardan
Taarna Systems
Montreal, Quebec, Canada
(posting from a friend's account)

ceb@csli.Stanford.EDU (Charles Buckley) (01/25/91)

In article <1991Jan23.080259.19816@Think.COM> Barry Margolin writes:
   In article <17374@csli.Stanford.EDU> Charles Buckley writes:
   >Exact.  Makes prototyping a breeze.  However, I'd still like to do my
   >own storage management in Lisp when my algorithm makes that possible
   >and expedient.  It's simply not provided for in CL.

   I'm not sure precisely what type of extension you're referring to.  Both
   Symbolics and Lucid provide "resources", which are pools of objects that
   can be reused rather than consing new objects each time

But resources are nothing more than a sort of software cache of
currently unused objects belonging to an isomorphic class.  If you
will, it's a sort of pointer-in-reserve system, to assure that that
which you allocated and might need again doesn't get eaten up
unnecessarily by the gc, just because you don't happen to have
explicitly programmed full pointer coverage for the life of your
algorithm.  It's nothing more than a slickered-up version of the hack
that Doug Cutting replied with.

Calling clear-resource only removes all these pointers-in-reserve,
*allowing* the gc to find the freed storage in the course of its own
plodding, methodical ruminations --- this according to language in the
vendor doc, and from my conversation with one of its employees who
should know.  If I turn the gc off, calling clear-resource does me no
good - memory still gets exhausted.  In those circumstances in which
gc performance is pathologically poor, the resource facility does
nothing to offload it.

I want to be able to tell the gc "I'll let you know when I'm done with
memory, and you should manage the free chunks, consolidating and all
that, but don't bother hunting through memory for non-referenced
blocks."  Says the above-referenced employee: "That would
make gc design too hard."

Well, C runtime supports it . . .

jeff@aiai.ed.ac.uk (Jeff Dalton) (01/25/91)

In article <1991Jan23.094925.12728@ulrik.uio.no> espen@math.uio.no (Espen J. Vestre) writes:
>First, this talk about lisp being dead puzzles me.  At least in european 
>academic institutions, it doesn't seem to be dead at all. 

I don't think Lisp is dead, but I wish it were healthier.

>Second, why is it claimed that CL is "too big"?  What does that mean?  
>That applications tend to take up much space? That lisp systems take up 
>much space? 

Both complaints have been made.

>Anyway, it is certainly possible to make compact CL systems.  

I agree.

>However, harddisk (or even RAM) space is not a real problem any more, is 
>it?

Well, I could certainly use more of both, and my employers don't seem
to be about to buy them for me.  The practical consequence in Common
Lisp terms is that I can run KCL on my desk machine but have to put up
with annoying paging delays if I try one of the bigger ones.

-- jeff

john@amc-gw.amc.com (John Sambrook) (01/25/91)

[ Warning -- Longish, kind of whiney article ahead :-]

I've been following the "Lisp is dying! / No it's not!" discussion
with interest.  I've had very good experiences using Common Lisp
to develop prototypes of programs that no one would consider "AI"
programs.  That said, I have also found Common Lisp useful for
prototyping programs that could be considered "AI" programs.

For example, I recently completed a Common Lisp prototype of a memory
disassembler for a new high-performance RISC chip.  Of course, for
delivery I had to recode it in C, but I believe that much time was
saved by doing a prototype first.  Certainly the delivered version of
the disassembler has been virtually bug-free and is well-structured
and maintainable to boot.

I am also using Common Lisp to experiment with new approaches to trace
disassembly.  While I can't explain trace disassembly in detail here,
suffice it to say that it involves correlating a lot of observed data
with a model of how a particular microprocessor behaves in order to
generate an annotated list of the machine instructions that were
executed to produce the observed data.  It's a hard problem that
ultimately requires the ability to represent knowledge in a
compact form, and to do a fair amount of reasoning based on
the data and the model.  Common Lisp is a great environment in which
to do this.  I only wish I had a commercial Common Lisp and CLOS in
which to do this.  Right now I am using AKCL on a Solbourne 4-501 with
GNU Emacs.  Not bad, but not ideal either.  I haven't been able to get 
a PCL working though.  I have some literature from a couple of the major
Common Lisp vendors, but I doubt that I will be able to secure the 
funding to purchase one of their systems.

Unfortunately, there does not seem to be much interest within our
company in programming in any language other than C and/or
assembler.  In general, most of our engineers seem to be quite
satisfied with the status quo.  While there are a few engineers that
are interested in doing more with Lisp, many of our senior engineers
routinely ignore suggestions to consider using Lisp for certain
problems where Lisp would appear to be a good fit.

I'd be interested in hearing from other individuals who have been able
to introduce Lisp into an engineering environment.  While it's not my
role to be a Lisp evangelist, I feel some responsibility to improve
productivity within our company, and I think Lisp would be useful in
this respect.

-- 
John Sambrook                             DNS: john@amc.com
Applied Microsystems Corporation	 UUCP: amc-gw!john
Redmond, Washington  98073               Dial: (206) 882-2000 ext. 630

eb@lucid.com (Eric Benson) (01/25/91)

In article <17418@csli.Stanford.EDU> ceb@csli.Stanford.EDU (Charles Buckley) wrote:
> In article <1991Jan23.080259.19816@Think.COM> Barry Margolin writes:
>    In article <17374@csli.Stanford.EDU> Charles Buckley writes:
>    >Exact.  Makes prototyping a breeze.  However, I'd still like to do my
>    >own storage management in Lisp when my algorithm makes that possible
>    >and expedient.  It's simply not provided for in CL.
> 
>    I'm not sure precisely what type of extension you're referring to.  Both
>    Symbolics and Lucid provide "resources", which are pools of objects that
>    can be reused rather than consing new objects each time
> 
> But resources are nothing more than a sort of software cache of
> currently unused objects belonging to an isomorphic class.  If you
> will, it's a sort of pointer-in-reserve system, to assure that that
> which you allocated and might need again doesn't get eaten up
> unnecessarily by the gc, just because you don't happen to have
> explicitly programmed full pointer coverage for the life of your
> algorithm.  It's nothing more than a slickered-up version of the hack
> that Doug Cutting replied with.
> 
> Calling clear-resource only removes all these pointers-in-reserve,
> *allowing* the gc to find the freed storage in the course of its own
> plodding, methodical ruminations --- this according to language in the
> vendor doc, and from my conversation with one of its employees who
> should know.  If I turn the gc off, calling clear-resource does me no
> good - memory still gets exhausted.  In those circumstances in which
> gc performance is pathologically poor, the resource facility does
> nothing to offload it.
> 
> I want to be able to tell the gc "I'll let you know when I'm done with
> memory, and you should manage the free chunks, consolidating and all
> that, but don't bother hunting through memory for non-referenced
> blocks."  So the above-referenced employee: "That would
> make gc design too hard."
> 
> Well, C runtime supports it . . .

I don't know whether the vendor you are referring to is Lucid, but the
description matches our system.  It is true that there is no way for
the user to explicitly return storage to Lisp's free space so that it
can be used without garbage collecting.  This is a consequence of the
design of our storage allocator.  We could have designed it so that
this operation would be possible, but we felt other factors outweighed
the need for this capability.  In fact, many Lisp systems have this
capability.  Most such systems manage storage allocation so that
memory returned to the system can only be used for objects of similar
type, or at least of similar size, but that isn't a requirement of
Lisp either.

Broadly speaking, there are two main classes of storage allocation
used by Lisp systems.  These are 1) systems that copy all active
objects to new storage and abandon the old storage to be recycled and
2) systems that retain active objects in place and link together
storage used by discarded objects.  Systems of the first type are used
by most recent Lisp systems, especially those running on workstations
with virtual memory.  C runtime systems (with some exotic exceptions,
such as those running on Lisp machines!) are restricted to the second
type.

The advantages of copying garbage collection include:

- Simpler storage allocation.  Allocating a new object is as simple as
  incrementing a pointer.

- Avoids fragmentation.  Non-copying collectors can cause active
  objects to be intermixed with free storage, resulting in difficulty
  allocating large objects and increased paging overhead.

- Lifetime-based collection (Generation scavenging or ephemeral GC).
  Although it is possible to implement these with non-copying
  allocation, they are much simpler and more effective with a copying
  GC.

Disadvantages include:

- Copying GC consumes more of the address space, since there must be
  space available to copy all active objects.  It is for this reason
  that copying GC is rarely found on systems without virtual memory,
  or those with small address spaces.

- Copying an object requires that you find all pointers to the
  object and update them to the new address.  In a non-copying system,
  it is sufficient to have one pointer to an object, since the pointer
  will not change.  So-called "conservative" garbage collectors are
  generally of this type.  Joel Bartlett's "mostly-copying
  conservative GC" is a hybrid copying/non-copying system.

- It is difficult to return storage to freespace explicitly in a
  copying system.

At the time we designed our system, we felt the advantages outweighed
the disadvantages.  The existence of the ephemeral GC means that the
"plodding, methodical ruminations" of the collector usually don't get
in your way.

eb@lucid.com 	           	 	Eric Benson
415/329-8400 x5523                      Lucid, Inc.
Telex 3791739 LUCID                     707 Laurel Street
Fax 415/329-8480                        Menlo Park, CA 94025

cutting@parc.xerox.com (Doug Cutting) (01/25/91)

In article <1991Jan24.153322.1307@cs.mcgill.ca> dean@cs.mcgill.ca (Dean NEWTON) writes:
   In article <CUTTING.91Jan23141807@skye.parc.xerox.com> cutting@parc.xerox.com (Doug Cutting) writes:

   >No, Garbage collectors don't make storage management a non-issue, they
   >just make it less of an issue (unless you don't care about performance).

   Compilation, virtual memory management, and garbage collection are entirely
   mechanical processes, and, as we have already seen in the case of compilers,
   computers can become better at them than most programmers.

This is a naive point of view.  Virtual memory does not really provide
infinite, fast memory.  Data structures and algorithms which
acknowledge this can achieve vastly superior performance, e.g. B-trees,
which underlie most database systems.  Similarly, garbage collectors
cannot always do a satisfactory job of storage management.  Usually
they can, but there are programs whose memory usage patterns cause
their performance to become dominated by GC.  In these cases one can
often make an otherwise unusable system quite usable by manual storage
management.  A (rather weak) example is the use of RECONS in the PCL
code walker.  Sure, PCL will run without it, but your compilations
would run much slower.

To claim otherwise is akin to claims one sometimes hears that memory
usage or CPU speed is no longer an issue.  While these are indeed less
of an issue, they're still core concerns of computer science.

	Doug

alms@cambridge.apple.com (Andrew L. M. Shalit) (01/25/91)

In article <6493@hplabsz.HP.COM> mayer@hplabsz.HP.COM (Niels Mayer) writes:

   In article <1991Jan23.094925.12728@ulrik.uio.no> espen@math.uio.no (Espen J. Vestre) writes:
   >Second, why is it claimed that CL is "too big"?  What does that mean?  
   >That applications tend to take up much space? That lisp systems take up 
   >much space?  Anyway, it is certainly possible to make compact CL systems.  
   >My own favourite system takes up only 790K on my harddisk.

   How much space a program takes up on a hard disk is not very important
   -- disk is relatively cheap compared to RAM. More important, what is
   the runtime size of your application? Most of the CL impls I've seen
   take up megabytes of valuable RAM and swap space.

I believe Espen was referring to Macintosh Common Lisp.  The system
(including editor, window system, etc) runs in a memory partition as
small as 1 megabyte.  Actually, it runs in 600k if you really want to
push things.

    -andrew

disclaimer:  I worked/work on this product.
--

person@plains.NoDak.edu (Brett G. Person) (01/28/91)

In article <127724@linus.mitre.org> djb@babypuss.mitre.org (David J. Braunegg) writes:
>>Lack of demand due to Common LISP's enormous size, complexity, resource
>>requirements, training, etc.
>>
>>Common LISP effectively died from obesity.
>
This is very right.  How many different functions do the same thing in 
LISP?  LISP is beginning to look like something written by the government.
Which - I suppose - with the ANSI standards, it is :-)
>
>
>OK.  What are the problems preventing a smaller, more efficient Lisp
>so that we aren't forced to use the almost-a-programming-language C?
>

Simple: get rid of the redundancies, force people to re-write old code in
new styles, and make the new lisp powerful enough - i.e. give me a cheap
compiler that won't bring the machine I'm on to its knees every time I use
it. 

-- 
Brett G. Person
North Dakota State University
uunet!plains!person | person@plains.bitnet | person@plains.nodak.edu

jeff@aiai.ed.ac.uk (Jeff Dalton) (01/29/91)

In article <7796@plains.NoDak.edu> person@plains.NoDak.edu (Brett G. Person) writes:

>>>Common LISP effectively died from obesity.

I don't think CL has died, though it may not be as successful as one
might like.

>This is very right.  How many different functions do the same thing in 
>LISP? 

I don't think this is a very big problem.  There are very few
functions that do exactly the same thing (car and first come
to mind).  It's true that there is often more than one way to
do things (e.g. to get the third element of a list on might
use caddr, third, nth, or elt).  On the other hand, a fair 
amount of potential duplication has been avoided by having
sequence functions (e.g., length) that can be applied to both
vectors and lists.
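
For instance, all of the following fetch the third element, and the
sequence functions accept lists and vectors alike:

(caddr '(a b c d))    ; => C
(third '(a b c d))    ; => C
(nth 2 '(a b c d))    ; => C   (NTH is zero-indexed)
(elt '(a b c d) 2)    ; => C   (ELT works on any sequence)

(length '(a b c))     ; => 3
(length #(1 2 3))     ; => 3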

oz@yunexus.yorku.ca (Ozan Yigit) (02/01/91)

In article <2456@paradigm.com> gjc@paradigm.com writes:

>What is happening at Texas Instruments?

Scheme ? ;-)

dak@sq.sq.com (David A Keldsen) (02/11/91)

djb@babypuss.mitre.org (David J. Braunegg) writes:

>>Lack of demand due to Common LISP's enormous size, complexity, resource
>>requirements, training, etc.
>>
>>Common LISP effectively died from obesity.

>OK.  What are the problems preventing a smaller, more efficient Lisp
>so that we aren't forced to use the almost-a-programming-language C?

Ah, well, there is such an animal.  It's called Scheme, and it has just
been standardized by the IEEE.  It's a small, lexically-scoped member
of the LISP family.  (Small in that the standard is only about 50
pages; compare this to _Common Lisp, The Language_ (Second Edition) at
just over 1000 pages.  (Yep, apples vs. oranges, but you get the point.))

Scheme even has a newsgroup, which is available as a mailing list digest
as well.  Have a look at comp.lang.scheme.

I'm not a spokesperson for SoftQuad or the IEEE, but I sure do like
their tastes in programming languages :-)

Dak
-- 
David A. 'Dak' Keldsen of SoftQuad, Inc. email: dak@sq.com  phone: 416-963-8337
"Sort of?  _Sort of_ the end of the world?  You mean we won't be certain?
We'll all look around and say 'Pardon me, did you hear something?'?"
	-- _Sourcery_ by Terry Pratchett