[comp.lang.lisp] long and deep?

kwang@uicsrd.csrd.uiuc.edu (06/10/89)

To those who have ever been in large Lisp programming projects:
I have two questions on lists used in large Lisp applications.

	1.  How long, in your experience, do the lists become? (1 2 3 4 ....)
	    How deep do they become? (((((....)))))
	2.  Are there frequent cases where lists become both long
	    and deep? or ...
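
To pin down the two measures: by "long" I mean top-level element count, and by "deep" I mean nesting levels.  A sketch in Common Lisp (`list-depth` is just an illustrative helper, not a standard function):

```lisp
;; LENGTH measures the first question; LIST-DEPTH, a hypothetical
;; helper, measures the second by counting nesting levels.
(defun list-depth (x)
  (if (atom x)
      0
      (1+ (reduce #'max x :key #'list-depth :initial-value 0))))

;; (length '(1 2 3 4))     => 4   ; long, not deep
;; (list-depth '((((1))))) => 4   ; deep, not long
```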

Thank you for any guesses, feelings-from-experience, or whatever. -Kwang

kwang@uicsrd.uiuc.edu
Center for Supercomputing R & D
Univ. of Illinois at Urbana-Champaign

leverich@randvax.UUCP (Brian Leverich) (06/11/89)

In article <47400023@uicsrd.csrd.uiuc.edu> kwang@uicsrd.csrd.uiuc.edu writes:
>
>	1.  How long, in your experience, do the lists become? (1 2 3 4 ....)
>	    How deep do they become? (((((....)))))
>	2.  Are there frequent cases where lists become both long
>	    and deep? or ...
>
For the knowledge-based simulation work we're doing, we typically have
thousands to tens of thousands of object attributes that are short shallow
lists.  Then we have hundreds to thousands of attributes specifically
containing fact collections or event queues, with these guys being a few
hundred or more top-level elements and nesting typically no more than
three layers deep.

Incidentally, very few of our structures have pointers into their
interiors.  KBSim would be a ripe field for the design of tuned garbage
collectors...  Cheers.  -B
-- 
  "Simulate it in ROSS"
  Brian Leverich                       | U.S. Snail: 1700 Main St.
  ARPAnet:     leverich@rand-unix      |             Santa Monica, CA 90406
  UUCP/usenet: decvax!randvax!leverich | Ma Bell:    (213) 393-0411 X7769

forbus@p.cs.uiuc.edu (06/12/89)

I know I've built lists with over 100,000 elements easily, but probably nested
only 20 or 30 levels deep.  However, that is only the stuff I produce by hand.  A lot
of stuff is produced by programs, and I don't look at every piece of internal
structure in huge examples.

Given your return address, I assume you care about this because of trying for
parallel speedups.  Lists aren't the only concern.
Most of one's non-locality nightmares come about through lots of
backpointers in structs.  The size of those datastructures, taken together,
can be in the 10's of megabytes.
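
A minimal sketch of the kind of backpointer-laden structure meant here (the slot and function names are illustrative, not from any particular system):

```lisp
;; Nodes whose PARENT slots point back up the tree, so traversals
;; chase pointers in both directions -- the source of the
;; non-locality described above.
(defstruct node
  parent     ; backpointer to the enclosing structure
  children)  ; list of sub-nodes

(defun attach (parent child)
  (setf (node-parent child) parent)
  (push child (node-children parent))
  child)
```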

I've heard random claims about making Lisp run on vector-oriented machines that
are based on "pure" Lisp, or worse yet, pure Scheme.  Anyone who thinks they
have made significant progress because they've sped up a side-effect-free Lisp
is fooling themselves.  Real AI systems almost always have massive
datastructures that evolve over the course of the computation.

vaughan@mcc.com (Paul Vaughan) (06/15/89)

One big source of lists is source code.  Just run an analyzer over the
source for your LISP environment (assuming you have it).  Although
these lists aren't what people normally think of as application data,
a LISPM spends a lot of its time dealing with them.
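
The point is easy to see at any REPL: `read` turns source text into ordinary nested lists, which is exactly what the environment manipulates.

```lisp
;; READ-FROM-STRING parses source text into an ordinary nested list.
(read-from-string "(defun square (x) (* x x))")
;; first value: (DEFUN SQUARE (X) (* X X)) -- a 4-element list whose
;; third and fourth elements are themselves lists
```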

 Paul Vaughan, MCC CAD Program | ARPA: vaughan@mcc.com | Phone: [512] 338-3639
 Box 200195, Austin, TX 78720  | UUCP: ...!cs.utexas.edu!milano!cadillac!vaughan