wilson@uicbert.eecs.uic.edu (Paul Wilson) (01/25/91)
I've got a new paper (submitted) that talks about some related issues.

It turns out that the heap reorganization done by a copying garbage collector can do massive damage to the locality of references to code that lives in the heap. The worst damage is done by the copying algorithm traversing hash tables in memory order, which reaches the objects stored in them in pseudo-random order. Procedures tend to be stored in hash tables that implement global or package namespaces, so copying naturally scrambles them up in memory and degrades locality. There's an easy hack around this problem, which greatly increases locality of reference in a Lisp system image.

I also have a transitive traversal algorithm that clusters data objects for better locality.

My paper also cites snazzy work others have done in dynamically reordering things according to actual access patterns. Where others have done this on an objectwise basis (which requires hardware support to be practical), I propose to do it on a pagewise basis, grouping pages within larger units of disk transfer. This was actually tried in the mid-70's by Baer and Sager, with disappointing results, but I think their results are due to problems that don't exist with current computers (in particular, having pages that are large *relative to the total memory size*, i.e., fewer than a few hundred pages per level of the memory hierarchy).

I can send a preprint to anybody who's interested.

  -- Paul

Paul R. Wilson                       Software Systems Laboratory
lab ph.: (312) 996-9216              U. of Illin. at C. EECS Dept. (M/C 154)
wilson@bert.eecs.uic.edu             Box 4348  Chicago, IL 60680
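[Editor's illustration, not from the paper: a toy Python sketch of the failure mode described above. A copier that reaches a namespace hash table's entries bucket-by-bucket places the procedures in pseudo-random (hash) order in tospace, while a traversal that picks a meaningful order -- sorted names stand in here for package/definition order -- keeps related procedures adjacent. The bucket count and symbol names are made up for the example.]

```python
NBUCKETS = 8  # assumed size of the toy namespace hash table

def bucket_order(symbols):
    """Reach objects the way a naive copier does: bucket by bucket.

    Placement within tospace follows hash-bucket order, which bears
    no relation to how the procedures are used together.
    """
    buckets = [[] for _ in range(NBUCKETS)]
    for s in symbols:
        buckets[hash(s) % NBUCKETS].append(s)  # pseudo-random bucket
    return [s for bucket in buckets for s in bucket]

def clustered_order(symbols):
    """The 'easy hack' flavor of fix: traverse the table's entries in
    a meaningful order instead of memory (bucket) order."""
    return sorted(symbols)

symbols = ["car", "cdr", "cons", "mapcar", "mapcan", "maplist"]
print(bucket_order(symbols))     # related names typically scattered
print(clustered_order(symbols))  # related names adjacent in tospace
```

Running it shows the point: the same six symbols come out in an arbitrary interleaving from `bucket_order` (the interleaving varies with the hash function), but grouped by family from `clustered_order`.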