E1AR0002@SMUVM1.BITNET (03/01/86)
From BLYTHE%OZ.AI.MIT.EDU%mit-xx.arpa@CSNET-RELAY Mon Jan 13 22:58:34 1986 :tr 233 :unavailable :author Edwin Banks :asort Banks, E. :title Information Processing and Transmission in Cellular Automata :date January 1971 :reference (MAC-TR-81) :tr 234 :unavailable :author Lawrence J. Krakauer :asort Krakauer, L.J. :title Computer Analysis of Visual Properties of Curved Objects :date May 1971 :reference (MAC-TR-82) :adnum AD-723-647 :tr 235 :unavailable :author Terry Winograd :asort Winograd, T. :title Procedures as a Representation for Data in a Computer Program for Understanding Natural Language :date February 1971 :reference (MAC-TR-84), Available in book form under the title {\it Understanding Natural Language}, Terry Winograd, Academic Press (New York) 1972. In Britain and Europe, Edinburgh University Press, 1972 :adnum AD-721-399 :tr 236 :unavailable :author Thomas L. Jones :asort Jones, T.L. :title A Computer Model of Simple Forms of Learning :date January 1971 :reference (MAC-TR-20) :adnum AD-720-337 :tr 242 :unavailable :author Stephen W. Smoliar :asort Smoliar, S.W. :title A Parallel Processing Model of Musical Structures :date September 1971 :pages 276 :reference (MAC-TR-91) :adnum AD-731-690 :tr 258 :unavailable :author Carl Hewitt :asort Hewitt, C. :title Description and Theoretical Analysis (Using Schemata) of PLANNER: A Language for Proving Theorems and Manipulating Models In A Robot :date April 1972 :pages 408 :adnum AD-744-620 :tr 266 :unavailable :author Eugene Charniak :asort Charniak, E. :title Toward A Model Of Children's Story Comprehension :date December 1972 :pages 304 :adnum AD-755-232 :tr 271 :unavailable :author David L. Waltz :asort Waltz, D.L. 
:title Generating Semantic Descriptions From Drawings of Scenes With Shadows :date November 1972 :pages 349 :adnum AD-754-080 :reference (In {\it The Psychology of Computer Vision}) :tr 281 :unavailable :author Patrick H. Winston, Editor :asort Winston, P.H., ed. :title Progress In Vision And Robotics :date May 1973 :pages 327 :adnum AD-775-439 :tr 283 :unavailable :author Scott E. Fahlman :asort Fahlman, S.E. :title A Planning System For Robot Construction Tasks :date May 1973 :pages 143 :adnum AD-773-471 :tr 291 :unavailable :author Drew V. McDermott :asort McDermott, D. :title Assimilation of New Information by a Natural Language Under\-stand\-ing System :date February 1974 :pages 160 :adnum AD-780-194 :tr 294 :unavailable :author Ira P. Goldstein :asort Goldstein, I. :title Understanding Simple Picture Programs :date April 1974 :pages 228 :adnum AD-A005-907 :tr 297 :unavailable :author Gerald J. Sussman :asort Sussman, G.J. :title A Computational Model of Skill Acquisition :date August 1973 :pages 200 :reference (In {\it A Computer Model of Skill Acquisition}) :tr 310 :unavailable :author Patrick H. Winston :asort Winston, P.H. :title New Progress in Artificial Intelligence :date September 1974 :pages 350 :adnum AD-A002-272 :tr 316 :unavailable :author Ann D. Rubin :asort Rubin, A.D. :title Hypothesis Formation and Evaluation in Medical Diagnosis :date January 1975 :pages 246 :tr 345 :unavailable :author Eugene C. Freuder :asort Freuder, E.C. :title Computer System for Visual Recognition Using Active Knowledge :date June 1976 :pages 278 :tr 346 :unavailable :author John M. Hollerbach :asort Hollerbach, J.M. :title Hierarchical Shape Description of Objects by Selection and Modification of Prototypes :date November 1975 :adnum AD-A024-970 :pages 239 :tr 347 :unavailable :author Robert Carter Moore :asort Moore, R.C. 
:title Reasoning from Incomplete Knowledge in a Procedural Deduction System :date December 1975 :tr 352 :unavailable :author Johan de Kleer :asort de Kleer, J. :title Qualitative and Quantitative Knowledge in Classical Mechanics :date December 1975 :adnum AD-A021-515 :pages 120 :tr 354 :unavailable :author Charles Rich and Howard E. Shrobe :asort Rich, C.; Shrobe, H.E. :title Initial Report on A LISP Programmer's Apprentice :date December 1976 :reference (See {IEEE Transactions on Software Engineering}, Vol. 4, No. 5, Nov. 1978) :pages 217 :tr 362 :unavailable :author Allen Brown :asort Brown, A. :title Qualitative Knowledge, Causal Reasoning, and the Localization of Failures :date December 1976 :adnum AD-A052-952 :pages 198 :tr 397 :unavailable :author Tomas Lozano-Perez :asort Lozano-Perez, T. :title The Design of a Mechanical Assembly System :date December 1976 :adnum AD-A036-734 :pages 192 :tr 402 :unavailable :author Drew Vincent McDermott :asort McDermott, D. :title Flexibility and Efficiency in a Computer Program for Designing Circuits :date December 1976 :adnum AD-A043-964 :pages 263 :tr 403 :unavailable :author Richard Brown :asort Brown, R. :title Use of Analogy to Achieve New Expertise :date April 1977 :adnum AD-A043-809 :pages 148 :tr 418 :author Benjamin J. Kuipers :asort Kuipers, B.J. :title Representing Knowledge of Large-scale Space :date July 1977 :pages 147 :tr 419 :unavailable :author Jon Doyle :asort Doyle, J. :title Truth Maintenance Systems for Problem Solving :date January 1978 :adnum AD-A054-826 :pages 134 :tr 439 :unavailable :author Marc H. Raibert :asort Raibert, M. :title Motor Control and Learning by the State Space Model :date September 1977 :pages 181 :tr 450 :unavailable :author Scott E. Fahlman :asort Fahlman, S.E. :title A System for Representing and Using Real-World Knowledge :date December 1977 :adnum AD-A052748 :pages 195 :tr 457 :author Robert J. Woodham :asort Woodham, R.J. 
:title Reflectance Map Techniques for Analyzing Surface Defects in Metal Castings :date June 1978 :cost $7.00 :pages 216 :adnum AD-A062-177 :abstract This report explores the relation between image intensity and object shape. It is shown that image intensity is related to surface orientation and that a variation in image intensity is related to surface curvature. Computational methods are developed which use the measured intensity variation across surfaces of smooth objects to determine surface orientation. :tr 472 :unavailable :author Edwina Rissland Michener :asort Michener, E.R. :title The Structure of Mathematical Knowledge :date August 1978 :pages 139 :tr 474 :author Guy L. Steele :asort Steele, G.L., Jr. :title Rabbit: A Compiler for Scheme :date May 1978 :cost $7.00 :pages 272 :adnum AD-A061-996 :abstract We have developed a compiler for the lexically-scoped dialect of LISP known as SCHEME. The compiler knows relatively little about specific data manipulation primitives such as arithmetic operators, but concentrates on general issues of environment and control. Rather than having specialized knowledge about a large variety of control and environment constructs, the compiler handles only a small basis set which reflects the semantics of lambda-calculus. All of the traditional imperative constructs, such as sequencing, assignment, looping, GOTO, as well as many standard LISP constructs such as AND, OR and COND, are expressed as macros in terms of the applicative basis set. A small number of optimization techniques, coupled with the treatment of function calls as GOTO statements, serve to produce code as good as that produced by more traditional compilers. :tr 483 :unavailable :author Kurt A. VanLehn :asort VanLehn, K.A. :title Determining the Scope of English Quantifiers :date June 1978 :adnum AD-A084-816 :pages 127 :tr 492 :unavailable :author Richard C. Waters :asort Waters, R.C. 
:title Automatic Analysis of the Logical Structure of Programs :date December 1978 :pages 219 :reference (See "A Method for Analyzing Loop Programs" in {IEEE Transactions on Software Engineering}, Vol. 5, No. 3, May 1979) :adnum AD-A084-818 :tr 503 :unavailable :author Howard Elliot Shrobe :asort Shrobe, H.E. :title Dependency Directed Reasoning for Complex Program Under\-stand\-ing :date April 1979 :pages 293 :adnum AD-A078-055 :tr 512 :unavailable :author Kent A. Stevens :asort Stevens, K.A. :title Surface Perception from Local Analysis of Texture and Contour :date February 1980 :pages 120 :adnum AD-A084-803 :tr 515 :unavailable :author Matthew Thomas Mason :asort Mason, M.T. :title Compliance and Force Control for Computer Controlled Manipulators :date April 1979 :pages 71 :reference (See {IEEE Transactions on Systems, Man and Cybernetics}, Vol. SMC-11, No. 6, June 1981) :adnum AD-A077-708 :tr 529 :unavailable :author Johan de Kleer :asort de Kleer, J. :title Causal and Teleological Reasoning In Circuit Recognition :date September 1979 :adnum AD-A084-802 :pages 216 :tr 534 :author John M. Hollerbach :asort Hollerbach, J.M. :title An Oscillation Theory of Handwriting :date March 1980 :reference (See {\it Biological Cybernetics}, Vol. 39, pp. 139-156. 1981) :cost $6.00 :pages 86 :abstract Handwriting production is viewed as a constrained modulation of an underlying oscillatory process. Coupled oscillations in horizontal and vertical directions produce letter forms, and when superimposed on a rightward constant velocity horizontal sweep result in spatially separated letters. Modulation of the vertical oscillation is responsible for control of letter height, either through altering the frequency or altering the acceleration amplitude. Modulation of the horizontal oscillation is responsible for control of corner shape through altering phase or amplitude. :tr 537 :unavailable :author Candace Lee Sidner :asort Sidner, C.L. 
:title Towards a Computational Theory of Definite Anaphora Comprehen\-sion in English Discourse :date June 1979 :pages 265 :adnum AD-A084-785 :tr 540 :unavailable :author Kenneth Michael Kahn :asort Kahn, K.M. :title Creation of Computer Animation from Story Descriptions :date August 1979 :pages 323 :tr 542 :unavailable :author Luc Steels :asort Steels, L. :title Reasoning Modeled As A Society Of Communicating Experts :date June 1979 :pages 154 :tr 550 :unavailable :author David A. McAllester :asort McAllester, D.A. :title The Use of Equality in Deduction and Knowledge Representation :date January 1980 :cost $6.00 :pages 115 :adnum AD-A084-890 :tr 579 :author Ellen Hildreth :asort Hildreth, E. :title Implementation Of A Theory Of Edge Detection :date April 1980 :cost $6.00 :pages 124 :abstract This report describes the implementation of a theory of edge detection, proposed by Marr and Hildreth (1979). According to this theory, the image is first processed independently through a set of different size filters, whose shape is the Laplacian of a Gaussian, G(x,y). Zero-crossings in the output of these filters mark the positions of intensity changes at different resolutions. Information about these zero-crossings is then used for deriving a full symbolic description of changes in intensity in the image, called the raw primal sketch. :tr 581 :unavailable :author Jon Doyle :asort Doyle, J. :title A Model for Deliberation, Action, And Introspection :date May 1980 :cost $7.00 :pages 249 :adnum AD-A105-666 :tr 589 :unavailable :author Andrew P. Witkin :asort Witkin, A.P. :title Shape from Contour :date November 1980 :pages 100 :tr 595 :author Guy Lewis Steele, Jr. :asort Steele, G.L., Jr. 
:title The Definition and Implementation Of A Computer Programming Language Based On Constraints :date August 1980 :cost $8.00 :pages 372 :reference (VLSI Memo \#80-32) :adnum AD-A096-556 :abstract The constraint paradigm is a model of computation in which values are deduced whenever possible, under the limitation that deductions be {\it local} in a certain sense. One may visualize a constraint "program" as a network of devices connected by wires. Data values may flow along the wires, and computation is performed by the devices. A device computes using only locally available information (with a few exceptions), and places newly derived values on other, locally attached wires. In this way computed values are {\it propagated}. Advantages and disadvantages of the constraint paradigm are discussed, and a number of implementations of constraint-based programming languages are presented. A progression of ever more powerful languages is described, complete implementations are presented, and design difficulties and alternatives are discussed. The goal approached, though not quite reached, is a complete programming system which will implicitly support the constraint paradigm to the same extent that LISP, say, supports automatic storage management. :tr 604 :author Charles Rich :asort Rich, C. :title Inspection Methods In Programming :date June 1981 :cost $8.00 :pages 287 :adnum AD-A110-030 :abstract The work reported here lies in the area of overlap between artificial intelligence and software engineering. As research in artificial intelligence, it is a step towards a model of problem solving in the domain of programming. In particular, this work focuses on the routine aspects of programming which involve the application of previous experience with similar programs. I call this programming by inspection. Programming is viewed here as a kind of engineering activity. 
Analysis and synthesis by inspection are a prominent part of expert problem solving in many other engineering disciplines, such as electrical and mechanical engineering. The notion of inspection methods in programming developed in this work is motivated by similar notions in other areas of engineering. This work is also motivated by current practical concerns in the area of software engineering. The inadequacy of current programming technology is universally recognized. Part of the solution to this problem will be to increase the level of automation in programming. I believe that the next major step in the evolution of more automated programming will be interactive systems which provide a mixture of partially automated program analysis, synthesis and verification. One such system being developed at MIT, called the programmer's apprentice, is the immediate intended application of this work. :tr 610 :unavailable :author Richard Brown :asort Brown, R. :title Coherent Behavior From Incoherent Knowledge Sources In The Automatic Synthesis of Numerical Computer Programs :date January 1981 :pages 211 :adnum AD-A096-559 :tr 615 :unavailable :author Kenneth D. Forbus :asort Forbus, K.D. :title A Study of Qualitative and Geometric Knowledge in Reasoning about Motion :date February 1981 :cost $6.00 :pages 123 :adnum AD-A096-455 :abstract Reasoning about motion is an important part of our commonsense knowledge, involving fluent spatial reasoning. This work studies the qualitative and geometric knowledge required to reason in a world that consists of balls moving through space constrained by collisions with surfaces, including dissipative forces and multiple moving objects. An analog geometry representation serves the program as a diagram, allowing many spatial questions to be answered by numeric calculation. 
It also provides the foundation for the construction and use of a place vocabulary, the symbolic descriptions of space required to do qualitative reasoning about motion in the domain. :tr 619 :unavailable :author Barbara Y. White :asort White, B. :title Designing Computer Games to Facilitate Learning :date February 1981 :cost $7.00 :pages 204 :abstract The aim of this thesis was to explore the design of interactive computer learning environments. The particular learning domain selected was Newtonian dynamics. Newtonian dynamics was chosen because it is an important area of physics with which many students have difficulty and because controlling Newtonian motion takes advantage of the computer's graphics and interactive capabilities. The learning environment involved games which simulated the motion of a spaceship on a display screen. The purpose of the games was to focus the students' attention on various aspects of the implications of Newton's laws. Playing the games did improve the students' ability to solve Newtonian dynamics problems. It is hypothesized that the games facilitated understanding because the microworld embodies Newton's laws in a way that links everyday beliefs about force and motion to formal physics knowledge, because it provides feedback as to how everyday beliefs fail, and because the games focus students' attention on areas where their knowledge needs revising. The design of the games and microworld was based on an analysis of why students have so much difficulty with Newtonian dynamics. This was done by taking protocols of students solving basic force and motion problems. The results revealed that the students possessed many kinds of knowledge, such as beliefs derived from living in a world with friction and prior experiences with addition, which interfered with their ability to understand Newtonian dynamics. 
The games and microworld were then redesigned to more effectively help the students revise their misconceptions and to draw upon aspects of their knowledge which would help them to better understand Newtonian dynamics. Finally, general principles of designing interactive computer learning environments were induced from this design process. :tr 623 :unavailable :author Anna R. Bruss :asort Bruss, A.R. :title The Image Irradiance Equation: Its Solution and Application :date June 1981 :adnum AD-A104043 :pages 109 :tr 633 :author William Douglas Clinger :asort Clinger, W.D. :title Foundations of Actor Semantics :date May 1981 :cost $7.00 :pages 177 :abstract The actor message-passing model of concurrent computation has inspired new ideas in the areas of knowledge-based systems, programming languages and their semantics, and computer systems architecture. The model itself grew out of computer languages such as Planner, Smalltalk, and Simula, and out of the use of continuations to interpret imperative constructs within lambda-calculus. The mathematical content of the model has been developed by Carl Hewitt, Irene Greif, Henry Baker, and Giuseppe Attardi. This thesis extends and unifies their work through the following observations. The ordering laws postulated by Hewitt and Baker can be proved using a notion of global time. The most general ordering laws are in fact equivalent to an axiom of realizability in global time. Independence results suggest that some notion of global time is essential to any model of concurrent computation. Since nondeterministic concurrency is more fundamental than deterministic sequential computation, there may be no need to take fixed points in the underlying domain of a power domain. Power domains built from incomplete domains can solve the problem of providing a fixed point semantics for a class of nondeterministic programming languages in which a fair merge can be written. 
The event diagrams of Greif's behavioral semantics, augmented by Baker's pending events, form an incomplete domain. Its power domain is the semantic domain in which programs written in actor-based languages are assigned meanings. This denotational semantics is compatible with behavioral semantics. The locality laws postulated by Hewitt and Baker may be proved for the semantics of an actor-based language. Altering the semantics slightly can falsify the locality laws. The locality laws thus constrain what counts as an actor semantics. :tr 636 :author Barbara Sue Kerns Steele :asort Steele, B.S.K. :title An Accountable Source-to-Source Transformation System :date June 1981 :cost $6.00 :pages 99 :adnum AD-A110-115 :abstract Though one is led to believe that program transformation systems which perform source-to-source transformations enable the user to understand and appreciate the resulting source program, this is not always the case. Transformations are capable of behaving and/or interacting in unexpected ways. The user who is interested in understanding the whats, whys, wheres, and hows of the transformation process is left without tools for discovering them. I provide an initial step towards the solution of this problem in the form of an accountable source-to-source transformation system. It carefully records the information necessary to answer such questions, and provides mechanisms for the retrieval of this information. It is observed that though this accountable system allows the user access to relevant facts from which he may draw conclusions, further study is necessary to make the system capable of analysing these facts itself. :tr 649 :author Michael Dennis Riley :asort Riley, M.D. 
:title The Representation of Image Texture :date September 1981 :cost $5.00 :pages 69 :adnum AD-A107636 :abstract This thesis explores how to represent image texture in order to obtain information about the geometry and structure of surfaces, with particular emphasis on locating surface discontinuities. Theoretical and psychophysical results lead to the following conclusions for the representation of image texture: (1) A texture edge primitive is needed to identify texture change contours, which are formed by an abrupt change in the 2-D organization of similar items in an image. The texture edge can be used for locating discontinuities in surface structure and surface geometry and for establishing motion correspondence. (2) Abrupt changes in attributes that vary with changing surface geometry---orientation, density, length, and width---should be used to identify discontinuities in surface geometry and surface structure. (3) Texture tokens are needed to separate the effects of different physical processes operating on a surface. They represent the local structure of the image texture. Their spatial variation can be used in the detection of texture discontinuities and texture gradients and their temporal variation may be used for establishing motion correspondence. What precisely constitutes the texture tokens is unknown; it appears, however, that the intensity changes alone will not suffice, but local groupings of them may. (4) The above primitives need to be assigned rapidly over a large range in an image. :tr 688 :author Robert W. Sjoberg :asort Sjoberg, R.W. 
:title Atmospheric Effects In Satellite Imaging of Mountainous Terrain :date September 1982 :cost $6.00 :pages 86 :adnum AD-A128431 :abstract It is possible to obtain useful maps of surface albedo from remotely-sensed images by eliminating effects due to topography and the atmosphere, even when the atmospheric state is not known. A simple phenomenological model of earth radiance that depends on six empirically-determined parameters is developed under certain simplifying assumptions. The model incorporates path radiance and illumination from sun and sky and their dependencies on surface altitude and orientation. It takes explicit account of surface shape, represented by a digital terrain model, and is therefore especially suited for use in mountainous terrain. A number of ways of determining the model parameters are discussed, including the use of shadows to obtain path radiance and to estimate local albedo and sky irradiance. The emphasis is on extracting as much information from the image as possible, given a digital terrain model of the imaged area and a minimum of site-specific atmospheric data. The albedo image, introduced as a representation of surface reflectance, provides a useful tool to evaluate the simple imaging model. Criteria for the subjective evaluation of albedo images are established and illustrated for Landsat multispectral data of a mountainous region of Switzerland. :tr 690 :author Matthew Thomas Mason :asort Mason, M.T. :title Manipulator Grasping and Pushing Operations :date June 1982 :cost $6.00 :pages 137 :adnum AD-A128438 :abstract The primary goal of this research is to develop theoretical tools for analysis, synthesis, and application of primitive manipulator operations. The primary method is to extend and apply traditional tools of classical mechanics. The results are of such a general nature that they address many different problems, including the design of programming tools and the design of auxiliary equipment. 
Some of the manipulator operations studied are: (1) Grasping an object. The object will usually slide and rotate during the period between first contact and prehension. (2) Placing an object. The object may slip slightly in the fingers upon contact with the table as the base aligns with the table. (3) Pushing. Often the final stage of mating two parts involves pushing one object into the other. In each of these operations the motion of the object is determined partly by the manipulator and partly by frictional forces. Hence the theoretical analysis focuses on the problem of partially constrained motion with friction. When inertial forces are dominated by frictional forces, we find that the fundamental motion of the object-whether it will rotate, and if so in what direction-may be determined by inspection. In many cases the motion may be predicted in detail, and in any case it is possible to find bounds on the motion. With these analytical tools it is sometimes possible to predict the outcome of a given manipulator operation, or, on the other hand, to plan an operation producing a given desired outcome. :tr 703 :unavailable :author Gerald Roylance :asort Roylance, G. :title A Simple Model of Circuit Design :date May 1980 :cost $5.00 :pages 65 :abstract A simple analog circuit designer has been implemented as a rule based system. The system can design voltage followers, Miller integrators, and bootstrap ramp generators from functional descriptions of what these circuits do. While the designer works in a simple domain where all components are ideal, it demonstrates the abilities of skilled designers. While the domain is electronics, the design ideas are useful in many other engineering domains, such as mechanical engineering, chemical engineering, and numerical programming. Most circuit design systems are given the circuit schematic and use arithmetic constraints to select component values. This circuit designer is different because it designs the schematic. 
The designer uses a unidirectional CONTROL relation to find the schematic. The circuit designs are built around this relation; it restricts the search space, assigns purposes to components, and finds design bugs. :tr 704 :author Daniel Carl Brotsky :asort Brotsky, D.C. :title An Algorithm for Parsing Flow Graphs :date March 1984 :cost $6.00 :pages 152 :abstract This report describes research about {\it flow graphs} -- labeled, directed, acyclic graphs which abstract the representations used in a variety of Artificial Intelligence applications. Flow graphs may be derived from {\it flow grammars} just as strings may be derived from string grammars; this derivation process forms a useful model for the stepwise refinement processes used in programming and other engineering domains. The central result of this report is a parsing algorithm for flow graphs. Given a flow grammar and a flow graph, the algorithm determines whether the grammar generated the graph and, if so, finds all possible derivations for it. The author has implemented the algorithm in LISP. The intent of this report is to make flow-graph parsing available as an analytic tool for researchers in Artificial Intelligence. The report explores the intuitions behind the parsing algorithm, contains numerous, extensive examples of its behavior, and provides some guidance for those who wish to customize the algorithm to their own uses. :tr 707 :author Walter Hamscher :asort Hamscher, W. :title Using Structural and Functional Information in Diagnostic Design :date June 1983 :cost $5.00 :pages 68 :adnum AD-A131859 :abstract We wish to design a diagnostic for a device from knowledge of its structure and function. The diagnostic should achieve both {\it coverage} of the faults that can occur in the device, and should strive to achieve {\it specificity} in its diagnosis when it detects a fault. 
A system is described that uses a simple model of hardware structure and function, representing the device in terms of its internal primitive functions and connections. The system designs a diagnostic in three steps. First, an extension of path sensitization is used to design a test for each of the connections in the device. Next, the resulting tests are improved by increasing their specificity. Finally, the tests are ordered so that each relies on the fewest possible connections. We describe an implementation for this system and show examples of the results for some simple devices. :tr 715 :author George Edward Barton, Jr. :asort Barton, G.E., Jr. :title A Multiple-Context Equality-Based Reasoning System :date April 1983 :cost $6.00 :pages 145 :adnum AD-A132369 :abstract Expert Systems are too slow. This work attacks that problem by speeding up a useful system component that remembers facts and tracks down simple consequences. The redesigned component can assimilate new facts more quickly because it uses a compact, grammar-based internal representation to deal with whole classes of equivalent expressions at once. It can support faster hypothetical reasoning because it remembers the consequences of several assumption sets at once. The new design is targeted for situations in which many of the stored facts are equalities. The deductive machinery considered here supplements stored premises with simple new conclusions. The stored premises include permanently asserted facts and temporarily adopted assumptions. The new conclusions are derived by substituting equals for equals and using the properties of the logical connectives AND, OR, and NOT. The deductive system provides supporting premises for its derived conclusions. Reasoning that involves quantifiers is beyond the scope of its limited and automatic operation. The expert system of which the reasoning system is a component is expected to be responsible for overall control of reasoning. 
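The "substituting equals for equals" machinery in the TR 715 abstract above rests on maintaining equivalence classes of equal terms. As a generic illustration only -- a minimal union-find sketch of the equivalence-class idea, not the compact grammar-based representation or multiple-context mechanism the report actually describes -- the core bookkeeping might look like:

```python
# Minimal union-find over terms: assimilate asserted equalities and
# answer "are these equal?" queries. Hypothetical sketch; the class
# and method names are illustrative, not from TR 715.

class EqualityStore:
    def __init__(self):
        self.parent = {}

    def _find(self, x):
        # Path-compressing find: returns the canonical representative of x.
        self.parent.setdefault(x, x)
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]
            x = self.parent[x]
        return x

    def assert_equal(self, a, b):
        # Merge the equivalence classes of a and b.
        self.parent[self._find(a)] = self._find(b)

    def equal(self, a, b):
        return self._find(a) == self._find(b)

store = EqualityStore()
store.assert_equal("x", "y")
store.assert_equal("y", "z")
print(store.equal("x", "z"))  # True: equality propagates transitively
print(store.equal("x", "w"))  # False: no premise relates w to x
```

Path compression keeps repeated queries cheap, which is one reason equality-heavy fact stores can assimilate new facts quickly.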
:tr 720 :author John Canny :asort Canny, J. :title Finding Edges and Lines in Images :date June 1983 :cost $6.00 :pages 146 :adnum AD-A130824 :abstract The problem of detecting intensity changes in images is canonical in vision. Edge detection operators are typically designed to optimally estimate first or second derivative over some (usually small) support. Other criteria such as output signal to noise ratio or bandwidth have also been argued for. This thesis is an attempt to formulate a set of edge detection criteria that capture as directly as possible the desirable properties of an edge operator. Variational techniques are used to find a solution over the space of all linear shift invariant operators. The first criterion is that the detector have low probability of error i.e. failing to mark edges or falsely marking non-edges. The second is that the marked points should be as close as possible to the centre of the true edge. The third criterion is that there should be low probability of more than one response to a single edge. The technique is used to find optimal operators for step edges and for extended impulse profiles (ridges or valleys in two dimensions). The extension of the one dimensional operators to two dimensions is then discussed. The result is a set of operators of varying width, length and orientation. The problem of combining these outputs into a single description is discussed, and a set of heuristics for the integration are given. :tr 728 :author Daniel G. Theriault :asort Theriault, D.G. :title Issues in the Design and Implementation of Act2 :date June 1983 :cost $7.00 :pages 213 :adnum AD-A132326 :abstract Act2 is a highly concurrent programming language designed to exploit the processing power available from parallel computer architectures. The language supports advanced concepts in software engineering, providing high-level constructs suitable for implementing artificially-intelligent applications. 
Act2 is based on the Actor model of computation, consisting of virtual computational agents which communicate by message-passing. Act2 serves as a framework in which to integrate an actor language, a description and reasoning system, and a problem-solving and resource management system. This document describes issues in Act2's design and the implementation of an interpreter for the language. :tr 749 :author Reid Gordon Simmons :asort Simmons, R.G. :title Representing and Reasoning About Change in Geologic Interpretation :date December 1983 :cost $6.00 :pages 131 :abstract Geologic interpretation is the task of inferring a sequence of events to explain how a given geologic region could have been formed. This report describes the design and implementation of one part of a geologic interpretation problem solver -- a system which uses a simulation technique called {\it imagining} to check the validity of a candidate sequence of events. Imagining uses a combination of qualitative and quantitative simulations to reason about the changes which occurred to the geologic region. The spatial changes which occur are simulated by constructing a sequence of diagrams. This quantitative simulation needs numeric parameters which are determined by using the qualitative simulation to establish the cumulative changes to an object and by using a description of the current geologic region to make quantitative measurements. The diversity of reasoning skills used in imagining has necessitated the development of multiple representations, each specialized for a different task. Representations to facilitate doing temporal, spatial and numeric reasoning are described in detail. We have also found it useful to explicitly represent {\it processes}. Both the qualitative and quantitative simulations use a discrete "layer cake" model of geologic processes, but each uses a separate representation, specialized to support the type of simulation. 
These multiple representations have enabled us to develop a powerful, yet modular, system for reasoning about change. :tr 753 :author Richard C. Waters :asort Waters, R.C. :title KBEmacs: A Step Toward the Programmer's Apprentice :date May 1985 :pages 236 :adnum AD-A157814 :cost $7.00 :keywords computer aided design, program editing, programming environment, reusable software components, Programmer's Apprentice :abstract The Knowledge-Based Editor in Emacs (KBEmacs) is the current demonstration system of the Programmer's Apprentice project. KBEmacs is capable of acting as a semi-expert assistant to a person who is writing a program -- taking over some parts of the programming task. Using KBEmacs, it is possible to construct a program by issuing a series of high level commands. This series of commands can be as much as an order of magnitude shorter than the program it describes. KBEmacs is capable of operating on ADA and LISP programs of realistic size and complexity. Although KBEmacs is neither fast enough nor robust enough to be considered a true prototype, both of these problems could be overcome if the system were to be reimplemented. :tr 754 :author Richard L. Lathrop :asort Lathrop, R.L. :title Parallelism in Manipulator Dynamics :date December 1983 :cost $6.00 :pages 109 :abstract This paper addresses the problem of efficiently computing the motor torques required to drive a lower-pair kinematic chain (e.g., a typical manipulator arm in free motion, or a mechanical leg in the swing phase) given the desired trajectory; i.e., the Inverse Dynamics problem. It investigates the high degree of parallelism inherent in the computations, and presents two "mathematically exact" formulations especially suited to high-speed, highly parallel implementations using special-purpose hardware or VLSI devices. In principle, the formulations should permit the calculations to run at a speed bounded only by I/O. 
The first formulation presented is a parallel version of the recent linear Newton-Euler recursive algorithm. Its time cost is also linear in the number of joints, but the real-time coefficients are reduced by almost two orders of magnitude. The second formulation is a new parallel algorithm which shows that it is possible to improve upon the linear time dependency. The real time required to perform the calculations increases only as the $\log_2$ of the number of joints. Either formulation is susceptible to a systolic pipelined architecture in which complete sets of joint torques emerge at successive intervals of four floating-point operations. Hardware requirements necessary to support the algorithm are considered and found not to be excessive, and a VLSI implementation architecture is suggested. We indicate possible applications to incorporating dynamical considerations into trajectory planning, e.g., it may be possible to build an on-line trajectory optimizer. :end :tr 767 :author Brian C. Williams :asort Williams, B.C. :title Qualitative Analysis of MOS Circuits :date July 1984 :cost $6.00 :pages 90 :keywords causal reasoning, VLSI, qualitative physics, design automation, qualitative circuit simulation, representation of knowledge, circuit theory, problem solving, expert systems :abstract With the push towards sub-micron technology, transistor models have become increasingly complex. The number of components in integrated circuits has forced designers' efforts and skills towards higher levels of design. This has created a gap between design expertise and the performance demands increasingly imposed by the technology. To alleviate this problem, software tools must be developed that provide the designer with expert advice on circuit performance and design. 
This requires a theory that links the intuitions of an expert circuit analyst with the corresponding principles of formal theory (i.e., algebra, calculus, feedback analysis, network theory, and electrodynamics), and that makes each underlying assumption explicit. Temporal Qualitative Analysis is a technique for analyzing the qualitative large signal behavior of MOS circuits that straddle the line between the digital and analog domains. Temporal Qualitative Analysis is based on the following four components: First, a qualitative representation is composed of a set of open regions separated by boundaries. These boundaries are chosen at the appropriate level of detail for the analysis. This concept is used in modeling time, space, circuit state variables, and device operating regions. Second, constraints between circuit state variables are established by circuit theory. At a finer time scale, the designer's intuition of electrodynamics is used to impose a causal relationship among these constraints. Third, large signal behavior is modeled by Transition Analysis, using continuity and theorems of calculus to determine how quantities pass between regions over time. Finally, Feedback Analysis uses knowledge about the structure of equations and the properties of structure classes to resolve ambiguities. :tr 789 :author Kenneth D. Forbus :asort Forbus, K.D. :title Qualitative Process Theory :date July 1984 :cost $7.00 :pages 179 :keywords qualitative reasoning, common sense reasoning, naive physics, artificial intelligence, problem solving, mathematical reasoning :abstract Objects move, collide, flow, bend, heat up, cool down, stretch, compress, and boil. These and other things that cause changes in objects over time are intuitively characterized as {\it processes}. 
To understand common sense physical reasoning and make programs that interact with the physical world as well as people do, we must understand qualitative reasoning about processes: when they will occur, their effects, and when they will stop. Qualitative Process theory defines a simple notion of physical process that appears useful as a language in which to write dynamical theories. Reasoning about processes also motivates a new qualitative representation for quantity in terms of inequalities, called the {\it quantity space}. This report describes the basic concepts of Qualitative Process theory and several different kinds of reasoning that can be performed with them, and discusses its impact on other issues in common sense reasoning about the physical world, such as causal reasoning and measurement interpretation. Several extended examples illustrate the utility of the theory, including figuring out that a boiler can blow up, that an oscillator with friction will eventually stop, and how to say that you can pull with a string, but not push with it. This report also describes GIZMO, an implemented computer program which uses Qualitative Process theory to make predictions and interpret simple measurements. The representations and algorithms used in GIZMO are described in detail, and illustrated using several examples. :tr 791 :author Bruce R. Donald :asort Donald, B.R. :title Motion Planning with Six Degrees of Freedom :date May 1984 :cost $7.00 :pages 261 :adnum AD-A150312g :keywords motion planning, robotics, path planning, configuration space, obstacle avoidance, spatial reasoning, geometric modelling, piano mover's problem, computational geometry, applied differential topology, Voronoi diagram :abstract The motion planning problem is of central importance to the fields of robotics, spatial planning, and automated design. 
In robotics we are interested in the automatic synthesis of robot motions, given high-level specifications of tasks and geometric models of the robot and obstacles. The "Mover's" problem is to find a continuous, collision-free path for a moving object through an environment containing obstacles. We present an implemented algorithm for the "classical" formulation of the three-dimensional Mover's problem: Given an arbitrary rigid polyhedral moving object "P" with three translational and three rotational degrees of freedom, find a continuous, collision-free path taking "P" from some initial configuration to a desired goal configuration. This thesis describes the first known implementation of a complete algorithm (at a given resolution) for the full six degree of freedom Mover's problem. The algorithm transforms the six degree of freedom planning problem into a point navigation problem in a six-dimensional configuration space (called C-space). The C-space obstacles, which characterize the physically unachievable configurations, are directly represented by six-dimensional manifolds whose boundaries are five-dimensional C-surfaces. :tr 793 :author Daniel Sabey Weld :asort Weld, D.S. :title Switching Between Discrete and Continuous Process Models to Predict Genetic Activity :date May 1984 :cost $5.00 :pages 83 :keywords QP theory, simulation, aggregation, multiple representations :abstract Two kinds of process models have been used in programs that reason about change: discrete and continuous models. We describe the design and implementation of a qualitative simulator, PEPTIDE, which uses both kinds of process models to predict the behavior of molecular genetic systems. The program uses a discrete process model to simulate both situations involving abrupt changes in quantities and the actions of small numbers of molecules. It uses a continuous process model to predict gradual changes in quantities. 
A novel technique, called aggregation, allows the simulator to switch between these models through the recognition and summary of cycles. The flexibility of PEPTIDE's aggregator allows the program to detect cycles within cycles and predict the behavior of complex situations. :tr 794 :author Eugene C. Ciccarelli IV :asort Ciccarelli, E. :title Presentation Based User Interfaces :date August 1984 :cost $7.00 :pages 196 :keywords user interfaces, presentation systems, programming tools, display, editor :abstract A prototype {\it presentation system base} is described. It offers mechanisms, tools, and ready-made parts for building user interfaces. A general user interface model underlies the base, organized around the concept of a {\it presentation}: a visible text or graphic form conveying information. The base and model emphasize domain independence and style independence, to apply to the widest possible range of interfaces. The {\it primitive presentation system model} treats the interface as a system of processes maintaining a semantic relation between an {\it application data base} and a {\it presentation data base}, the symbolic screen description containing presentations. A {\it presenter} continually updates the presentation data base from the application data base. The user manipulates presentations with a {\it presentation editor}. A {\it recognizer} translates the user's presentation manipulation into application data base commands. The primitive presentation system can be extended to model more complex systems by attaching additional presentation systems. In order to illustrate the model's generality and descriptive capabilities, extended model structures for several existing user interfaces are discussed. The base provides support for building the application and presentation data bases, linked together into a single, uniform network, including descriptions of classes of objects as well as the objects themselves. 
The base provides an initial presentation data base network, graphics to continuously display it, and editing functions. A variety of tools and mechanisms help create and control presenters and recognizers. To demonstrate the base's utility, three interfaces to an operating system were constructed, embodying different styles: icon, menu, and graphical annotation. :tr 807 :author Andrew Lewis Ressler :asort Ressler, A.L. :title A Circuit Grammar for Operational Amplifier Design :date January 1984 :cost $6.00 :pages 92 :keywords artificial intelligence, computer aided design, grammar, operational amplifier, circuit, design, language :abstract Electrical circuit designers seldom create really new topologies or use old ones in a novel way. Most designs are known combinations of common configurations tailored for the particular problem at hand. In this thesis I show that much of the behavior of a designer engaged in such ordinary design can be modelled by a clearly defined computational mechanism executing a set of stylized rules. Each of my rules embodies a particular piece of the designer's knowledge. A circuit is represented as a hierarchy of abstract objects, each of which is composed of other objects. The leaves of this tree represent the physical devices from which physical circuits are fabricated. By analogy with context-free languages, a class of circuits is generated by a phrase-structure grammar of which each rule describes how one type of abstract object can be expanded into a combination of more concrete parts. Circuits are designed by first postulating an abstract object which meets the particular design requirements. This object is then expanded into a concrete circuit by successive refinement using rules of my grammar. There are in general many rules which can be used to expand a given abstract component. Analysis must be done at each level of the expansion to constrain the search to a reasonable set. 
Thus the rules of my circuit grammar provide constraints which allow the approximate qualitative analysis of partially instantiated circuits. Later, more careful analysis in terms of more concrete components may lead to the rejection of a line of expansion which at first looked promising. I provide special failure rules to direct the repair in this case. As part of this research I have developed a computer program, CIROP, which implements my theory in the domain of operational amplifier design. :tr 810 :author Michael Andreas Erdmann :asort Erdmann, M.A. :title On Motion Planning with Uncertainty :date August 1984 :cost $7.00 :pages 261 :keywords motion planning, mechanical assembly, parts mating, robotics, configuration space, friction, compliance, uncertainty :abstract Robots must successfully plan and execute tasks in the presence of uncertainty. Uncertainty arises from errors in modelling, sensing, and control. Planning in the presence of uncertainty constitutes one facet of the general motion planning problem in robotics. This problem is concerned with the automatic synthesis of motion strategies from high level task specifications and geometric models of environments. In order to develop successful motion strategies, it is necessary to understand the effect of uncertainty on the geometry of object interactions. Object interactions, both static and dynamic, may be represented in geometrical terms. This thesis investigates geometrical tools for modelling and overcoming uncertainty. The thesis describes an algorithm for computing backprojections of desired task configurations. Task goals and motion states are specified in terms of a moving object's configuration space. Backprojections specify regions in configuration space from which particular motions are guaranteed to accomplish a desired task. 
The backprojection algorithm considers surfaces in configuration space that facilitate sliding towards the goal, while avoiding surfaces on which motions may prematurely halt. In executing a motion from a backprojection region, a plan executor must be able to recognize that a desired task has been accomplished. Since sensors are subject to uncertainty, recognition of task success is not always possible. The thesis considers the structure of backprojection regions and of task goals that ensures goal recognizability. The thesis also develops a representation of friction in configuration space, in terms of a friction cone analogous to the real space friction cone. The friction cone provides the backprojection algorithm with a geometrical tool for determining points at which motions may halt. :tr 834 :author Peter Merrett Andreae :asort Andreae, P.M. :title Justified Generalization: Acquiring Procedures From Examples :date January 1985 :cost $6.00 :pages 161 :adnum AD-A156408 :keywords machine learning, constraining generalization, justification of generalization :abstract This thesis describes an implemented system called NODDY for acquiring procedures from examples presented by a teacher. Acquiring procedures from examples involves several different generalization tasks. Generalization is an underconstrained task, and the main issue of machine learning is how to deal with this underconstraint. The thesis presents two principles for constraining generalization on which NODDY is based. The first principle is to exploit domain based constraints. NODDY demonstrates how such constraints can be used both to reduce the space of possible generalizations to a manageable size and to generate negative examples out of positive examples to further constrain the generalization. The second principle is to avoid spurious generalizations by requiring justification before adopting a generalization. 
NODDY demonstrates several different ways of justifying a generalization and proposes a way of ordering and searching a space of candidate generalizations based on how much evidence would be required to justify each generalization. Acquiring procedures also involves three types of constructive generalization: inferring loops (a kind of group), inferring complex relations and state variables, and inferring predicates. NODDY demonstrates three constructive generalization methods for these kinds of generalization. :tr 844 :unavailable :author Gul Agha :asort Agha, G. :title Actors: A Model of Concurrent Computation In Distributed Systems :date June 1985 :pages 198 :keywords distributed systems, concurrency, programming languages, object-oriented programming, deadlock, semantics of programs, process architectures, functional programming :tr 852 :title Local Rotational Symmetries :author Margaret Morrison Fleck :asort Fleck, M.M. :date August 1985 :pages 156 :cost $6.00 :keywords shape representation, computer vision, artificial intelligence, smoothed local symmetries, local symmetries, multiple-scale representations, hierarchical representations, rotational symmetries, round regions :abstract This thesis describes a representation for two-dimensional round regions called Local Rotational Symmetries. Local Rotational Symmetries are intended as a companion to Brady's Smoothed Local Symmetry Representation for elongated shapes. An algorithm for computing Local Rotational Symmetry representations at multiple scales of resolution has been implemented and results of this implementation are presented. These results suggest that Local Rotational Symmetries provide a more robustly computable and perceptually accurate description of round regions than previously proposed representations. In the course of developing this representation, it has been necessary to modify the way both Smoothed Local Symmetries and Local Rotational Symmetries are computed. 
First, grey scale image smoothing proves to be better than boundary smoothing for creating representations at multiple scales of resolution, because it is more robust and it allows qualitative changes in representation between scales. Second, it is proposed that shape representations at different scales be explicitly related, so that information can be passed between scales and computation at each scale can be kept local. Such a model for multi-scale computation is desirable both to allow efficient computation and to accurately model human perception. :tr 859 :author Anita M. Flynn :asort Flynn, A.M. :title Redundant Sensors for Mobile Robot Navigation :date September 1985 :pages 70 :cost $5.00 :adnum AD-A161087 :keywords mobile robot, sensors, path planning, navigation, map making :abstract Redundant sensors are needed on a mobile robot so that the accuracy with which it perceives its surroundings can be increased. Sonar and infrared sensors are used here in tandem, each compensating for deficiencies in the other. The robot combines the data from both sensors to build a representation which is more accurate than if either sensor were used alone. Another representation, the curvature primal sketch, is extracted from this perceived workspace and is used as the input to two path planning programs: one based on configuration space and one based on a generalized cone formulation of free space. :tr 860 :author Jose Luis Marroquin :asort Marroquin, J.L. :title Probabilistic Solution of Inverse Problems :date September 1985 :pages 206 :cost $7.00 :adnum AD-A161130 :keywords inverse problems, computer vision, surface interpolation, image restoration, Markov random fields, optimal estimation, simulated annealing :abstract In this thesis we study the general problem of reconstructing a function, defined on a finite lattice, from a set of incomplete, noisy and/or ambiguous observations. 
The goal of this work is to demonstrate the generality and practical value of a probabilistic (in particular, Bayesian) approach to this problem, particularly in the context of Computer Vision. In this approach, the prior knowledge about the solution is expressed in the form of a Gibbsian probability distribution on the space of all possible functions, so that the reconstruction task is formulated as an estimation problem. ------------------------ Books and Manuals ------------------------ = These items are available from the publishers = Abelson, Harold, and Gerald Jay Sussman. {Structure and Interpretation of Computer Programs}. Cambridge, MA.: MIT Press, 1984. {Barriers to Equality in Academia: Women in Computer Science at MIT}. Prepared by female graduate students and research staff in the Laboratory for Computer Science and the Artificial Intelligence Laboratory at MIT. February 1983. Available from MIT Lab for Computer Science. Berwick, Robert C. {The Acquisition of Syntactic Knowledge}. Cambridge, MA.: MIT Press, 1985. Berwick, Robert C., and Amy Weinberg. {The Grammatical Basis of Linguistic Performance: Language Use and Acquisition}. Cambridge, MA.: MIT Press, 1984. Brady, J. Michael, and Robert C. Berwick. {Computational Models of Discourse}. Cambridge, MA.: MIT Press, 1983. Brady, J. Michael, ed. {Computer Vision}. Amsterdam: North-Holland, 1982. Brady, J. Michael, John Hollerbach, Timothy Johnson, Tomas Lozano-Perez, and Matthew T. Mason, eds. {Robot Motion: Planning and Control}. Cambridge, MA.: MIT Press, 1982. Brady, J. Michael. {The Theory of Computer Science: A Programming Approach}. London: Chapman and Hall, 1977. Brooks, Rodney A. {Programming in Common Lisp}. New York: John Wiley and Sons, 1985. Davis, Randall, and Douglas B. Lenat. {Knowledge-Based Systems in Artificial Intelligence}. New York: McGraw-Hill, 1982. DiSessa, Andrea, and Harold Abelson. {Turtle Geometry: The Computer as a Medium for Exploring Mathematics}. 
Cambridge, MA.: MIT Press, 1981. Fahlman, Scott E. {NETL: A System for Representing and Using Real-World Knowledge}. Cambridge, MA.: MIT Press, 1979. Grimson, W.E.L. {From Images to Surfaces: A Computational Study of the Human Early Visual System}. Cambridge, MA.: MIT Press, 1981. Hildreth, Ellen C. {The Measurement of Visual Motion}. (ACM Distinguished Dissertation Series.) Cambridge, MA.: MIT Press, 1984. Hillis, W. Daniel. {The Connection Machine}. Cambridge, MA.: MIT Press, 1985. Marcus, Mitchell P. {A Theory of Syntactic Recognition for Natural Language}. Cambridge, MA.: MIT Press, 1980. Mason, Matthew T., and J. Kenneth Salisbury, Jr. {Robot Hands and the Mechanics of Manipulation}. Cambridge, MA.: MIT Press, 1985. Mason, Matthew T., and J. Kenneth Salisbury, Jr. {Dextrous Robot Hand Videotape}. Cambridge, MA.: MIT Press, 1985. Minsky, Marvin. {Computation}. Englewood Cliffs, NJ.: Prentice Hall, 1967. Minsky, Marvin. {Robotics}. New York: Anchor Press Doubleday, 1985. Minsky, Marvin, and Seymour Papert. {Perceptrons}. Cambridge, MA.: MIT Press, 1968. Minsky, Marvin, ed. {Semantic Information Processing}. Cambridge, MA.: MIT Press, 1968. Moore, Robert C. {Reasoning from Incomplete Knowledge in a Procedural Deduction System}. New York: Garland Publishing, 1979. Papert, Seymour, and Robert McNaughton. {Counter-Free Automata}. Cambridge, MA.: MIT Press, 1971. Stallman, Richard M. {EMACS Manual}, (AIM 555). MIT Artificial Intelligence Laboratory, March 1983. Available from the AI Laboratory Publications Office at a cost of $3.50, prepaid. Stallman, Richard M., David Moon, and Daniel Weinreb. {Window System Manual}. MIT Artificial Intelligence Laboratory, August 1983, Edition 1.1, System Version 95. Available from the AI Laboratory Publications Office at a cost of $7.00, prepaid. Stallman, Richard M. {ZMail Manual}. MIT Artificial Intelligence Laboratory, Zmail Version, April 1983, First Edition. Available from the AI Laboratory at a cost of $6.00, prepaid. 
Sussman, Gerald. {A Computer Model of Skill Acquisition}. New York: Elsevier Science, February 1975. (Out of print). Ullman, Shimon. {The Interpretation of Visual Motion}. Cambridge, MA.: MIT Press, 1979. Weinreb, Daniel, David Moon, and Richard Stallman. {LISP Machine Manual}. MIT Artificial Intelligence Laboratory, revised June 1984, Sixth Edition. Available from the AI Laboratory at a cost of $20.00, prepaid. Winograd, Terry. {Understanding Natural Language}. New York: Academic Press, 1972. Winston, Patrick H., and Karen Prendergast. {The A.I. Business: The Commercial Uses of Artificial Intelligence}. Cambridge, MA.: MIT Press, 1984. Winston, Patrick H. {Artificial Intelligence}, 2nd ed. Reading, MA.: Addison-Wesley, 1984. Winston, Patrick H., and Richard H. Brown, eds. {Artificial Intelligence: An MIT Perspective}. 2 volumes. Cambridge, MA.: MIT Press, 1979. Winston, Patrick H., and Berthold K.P. Horn. {LISP}, 2nd ed. Reading, MA.: Addison-Wesley, 1984. Winston, Patrick H., ed. {The Psychology of Computer Vision}. New York: McGraw-Hill, 1975. (Out of print). -------