[comp.doc.techreports] tr-input/mit.pt7

leff@smu.UUCP (Laurence Leff) (09/04/89)

:abstract Robots must plan and execute tasks in the presence of
uncertainty.  Uncertainty arises from sensing errors, control errors,
and uncertainty in the geometry of the environment.  We present a
framework for computing motion strategies that are guaranteed to
succeed in the presence of all three kinds of uncertainty.  When
guaranteed plans cannot be found or do not exist, {\it Error Detection
and Recovery} (EDR) strategies construct plans that might work, but
fail in a recognizable way when they cannot.  We give a constructive,
geometric definition for EDR strategies and show how they can be
computed.  We also describe an implemented planner, called LIMITED, for
a restricted domain.

:tr 988
:author Kenneth W. {Haase, Jr.}
:asort Haase, K.W.
:title TYPICAL: A Knowledge Representation System for Automated
Discovery and Inference
:date August 1987
:pages 110
:cost $8.00
:adnum AD-A187483
:keywords knowledge representation, discovery, type inference,
subsumption
:abstract
TYPICAL is a package for describing and making automatic inferences
about a broad class of SCHEME predicate functions. These functions,
called {\it types} following popular usage, delineate classes of
primitive SCHEME objects, composite data structures, and abstract
descriptions. TYPICAL types are generated by an extensible combinator
language from either existing types or primitive terminals. TYPICAL's
inferences place these types in a {\it subsumption lattice}.  TYPICAL
was developed as a representation language for the discovery program
Cyrano; particular examples are given of TYPICAL's application in
the Cyrano program.
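
As a toy illustration of the combinator idea (sketched here in Python
rather than SCHEME, with invented names), types-as-predicates built by
a conjunction combinator are automatically ordered by subsumption:

    number = lambda x: isinstance(x, (int, float))
    positive = lambda x: number(x) and x > 0

    def both(t1, t2):
        # Combinator building a new type from existing ones; the
        # conjunction is subsumed by each conjunct, which is how
        # combinators induce a subsumption lattice over types.
        return lambda x: t1(x) and t2(x)

    positive_number = both(number, positive)
    assert positive_number(3) and not positive_number(-1)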

:tr 992
:author David L. Brock
:asort Brock, D.L.
:title Enhancing the Dexterity of a Robot Hand Using Controlled Slip
:date May 1987
:pages 208
:cost $9.00
:adnum AD-A209363
:keywords robot hand, dexterity, slip, controlled slip
:abstract
Humans can effortlessly manipulate objects in their hands, dexterously
sliding and twisting them within their grasp. Robots, however, have
none of these capabilities; they simply grasp objects rigidly in their
end effectors. To investigate this common form of human manipulation,
an analysis of controlled slipping of a grasped object within a robot
hand was performed. The Salisbury robot hand demonstrated many of
these controlled slipping techniques, illustrating the results of
this analysis.

:tr 993
:author Michael B. Kashket
:asort Kashket, M.B.
:title A Government-Binding Based Parser for Warlpiri, a Free-Word
Order Language
:date September 1987
:pages 167
:cost $8.00
:adnum AD-A189381
:keywords principle-based parser, Government-Binding Theory, free word
order 
:abstract
This report presents an implemented, Government-Binding based parser
for Warlpiri, an Australian Aboriginal language that exhibits free
word order.  The similarity in meaning among the freely permuted
sentences is assumed to be directly reflected in syntax.  This parser
properly processes free word order because it assigns the same
syntactic structure to the permutations of a sentence.  The parser
also handles fixed word order, as well as other phenomena.  This
parser differs from traditional rule-based parsing systems in that two
structures are computed, one encoding precedence and one hierarchy.
This bipartite representation is the key to handling free- and
fixed-order phenomena.

:tr 995
:author Feng Zhao
:asort Zhao, F.
:title An {\it O(N)} Algorithm for Three-dimensional N-body
Simulations 
:date October 1987
:pages 64
:cost $7.00
:adnum AD-A196903
:keywords N-body algorithm, particle simulation, Multipole expansion,
3-D tree, spatial refinement, parallel computing
:abstract
We develop an $O(N)$ algorithm that computes the gravitational
potentials and forces on $N$ point-masses interacting in
three-dimensional space.  In contrast to other fast $N$-body methods
such as tree codes, which only approximate the interaction potentials
and forces, this method is exact---it computes the potentials and
forces to within any prespecified tolerance up to machine precision.
We present an implementation of the algorithm for a sequential machine
and numerically verify the algorithm.  We also describe a parallel
version of the algorithm that runs on the Connection Machine in
$O(\log N)$ time.
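
For contrast with the $O(N)$ claim, here is a minimal Python sketch
(illustrative names only) of the direct $O(N^2)$ pairwise summation
that exact fast methods accelerate:

    import numpy as np

    G = 6.674e-11  # gravitational constant

    def direct_sum(pos, mass):
        # Direct O(N^2) summation over all pairs; fast methods reduce
        # this cost to O(N) while meeting a prespecified tolerance.
        n = len(mass)
        phi = np.zeros(n)
        force = np.zeros((n, 3))
        for i in range(n):
            for j in range(n):
                if i == j:
                    continue
                r = pos[j] - pos[i]
                d = np.linalg.norm(r)
                phi[i] -= G * mass[j] / d
                force[i] += G * mass[i] * mass[j] * r / d**3
        return phi, force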

:tr 1000
:author Bonnie Jean Dorr
:asort Dorr, B.J.
:title UNITRAN: A Principle-Based Approach to Machine Translation
:date December 1987
:pages 311
:cost $10.00
:adnum AD-A195281
:keywords natural language processing, interlingual machine
translation, co-routine design, principles and parameters, parsing,
thematic substitution, generation
:abstract
This report presents an approach to natural language translation that
relies on principle-based descriptions of grammar rather than
rule-oriented descriptions.  The model that has been constructed is
based on abstract principles as developed by Chomsky (1981) and
several other researchers working within the ``Government and
Binding'' (GB) framework.  The approach taken is ``interlingual'',
{\it i.e.\/}, the model is based on universal {\it principles\/} that
hold across all languages; the distinctions among languages are then
handled by settings of {\it parameters\/} associated with the
universal principles.  Because of the modular nature of the model, the
interaction effects of universal principles are easily handled by the
system; thus, the programmer does not need to specifically spell out
the details of rule applications.

:tr 1001
:author Aaron F. Bobick
:asort Bobick, A.F.
:title Natural Object Categorization
:date November 1987
:pages 214
:cost $9.00
:adnum AD-A194103
:keywords object recognition, machine learning, cluster analysis,
computer vision, classification, natural kinds
:abstract 
This thesis addresses the problem of categorizing natural objects.  We
propose that the purpose of a categorization is to support the
inference of unobserved properties of objects from the observed
properties.  Because no such set of categories can be constructed in
an arbitrary world, we present the Principle of Natural Modes as a
claim about the structure of the world.  An evaluation function is
defined in which entropy measures for property uncertainty and
category uncertainty are combined through a free parameter that
reflects the goals of the observer.  Natural categorizations are shown
to be those that are stable with respect to this free parameter.  We
develop a categorization paradigm and algorithm to recover natural
categories, and demonstrate them successfully on examples drawn from
several natural domains.  Finally, a method is presented for
evaluating the utility of features in recovering natural categories.

:tr 1022
:author Pyung H. Chang
:asort Chang, P.H.
:title Analysis and Control of Robot Manipulators with Kinematic
Redundancy 
:date February 1988
:pages 132
:cost $8.00
:adnum AD-A198232
:keywords kinematic control, redundant manipulators
:abstract
A closed-form solution formula for the kinematic control of
manipulators with redundancy is derived using the Lagrangian
multiplier method. A differential relationship equivalent to the
Resolved Motion Method has also been derived. The proposed method is
proved to provide an exact equilibrium state for the Resolved Motion
Method. This exactness fixes the repeatability problem in the
Resolved Motion Method and establishes a fixed transformation from
the workspace to the joint space.  Owing to this exactness, the
method is also demonstrated to give more accurate trajectories than
the Resolved Motion Method. In addition, a new
performance measure for redundancy control has been developed.
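
For context, the classic Lagrange-multiplier derivation of this kind
minimizes $\frac{1}{2}\dot q^T \dot q$ subject to $J\dot q = \dot x$,
giving the minimum-norm solution $\dot q = J^T(JJ^T)^{-1}\dot x$.  A
numpy sketch of that standard result (an illustration, not the
report's exact formulation):

    import numpy as np

    def redundant_joint_rates(J, x_dot):
        # Minimum-norm joint rates for a redundant manipulator: solve
        # min 1/2 qdot.qdot subject to J qdot = x_dot via Lagrange
        # multipliers, giving qdot = J^T (J J^T)^{-1} x_dot.
        JJt = J @ J.T
        lam = np.linalg.solve(JJt, x_dot)   # Lagrange multipliers
        return J.T @ lam

    # example: 3-joint planar arm tracking a 2-D end-effector velocity
    J = np.array([[1.0, 0.5, 0.2],
                  [0.0, 1.0, 0.7]])
    qdot = redundant_joint_rates(J, np.array([0.1, -0.05]))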

:tr 1023
:author David W. Jacobs
:asort Jacobs, D.W.
:title The Use of Grouping in Visual Object Recognition
:date October 1988
:pages 166
:cost $8.00
:adnum AD-A201691
:keywords grouping, object recognition, indexing, model-based vision,
libraries, segmentation
:abstract
The report describes a recognition system called GROPER.  GROPER
performs grouping by using distance and relative orientation
constraints that estimate the likelihood of different edges in an
image coming from the same object.  The thesis presents both a
theoretical analysis of the grouping problem and a practical
implementation of a grouping system.  GROPER also uses an indexing
module to allow it to make use of knowledge of different objects, any
of which might appear in an image.  We test GROPER by comparing it to
a similar recognition system that does not use grouping.

:tr 1030
:author Neil Singer
:asort Singer, N.C.
:title Residual Vibration Reduction in Computer Controlled Machines
:date February 1989
:pages 241
:cost $9.00
:adnum AD-A208074
:keywords vibrations, oscillations, teleoperators, flexible manipulators
:abstract
Concerning control of machines that exhibit flexibility, there is a
great deal of literature about the modeling of a flexible plant and
the design of a controller, but little has been written about the
shaping of system inputs.  This report examines two input shaping
techniques based on frequency domain analysis, and presents a linear
programming technique for generating vibration-reducing driving
functions for systems. The new technique is shown to result in less
residual vibration, have better robustness to system parameter
uncertainty, and require less computation than other currently used
input shaping techniques.

:tr 1031
:author Elisha Sacks
:asort Sacks, E.
:title Automatic Qualitative Analysis of Ordinary Differential
Equations Using Piecewise Linear Approximations
:date March 1988
:pages 96
:cost $8.00
:adnum AD-A196310
:keywords qualitative reasoning, dynamic systems, qualitative physics,
symbolic mathematics
:abstract
This paper explores automating the qualitative analysis of physical
systems.  It describes a program, called {\bf PLR}, that takes parameterized
ordinary differential equations as input and produces a qualitative
description of the solutions for all initial values.  {\bf PLR} approximates
intractable nonlinear systems with piecewise linear ones, analyzes the
approximations, and draws conclusions about the original systems.  It
chooses approximations that are accurate enough to reproduce the
essential properties of their nonlinear prototypes, yet simple enough to
be analyzed completely and efficiently.   It derives additional
properties, such as boundedness or periodicity, by theoretical methods.
I demonstrate {\bf PLR} on several common nonlinear systems and on published
examples from mechanical engineering.
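
A minimal sketch of the piecewise linear substitution involved
(hypothetical helper, not {\bf PLR} itself): replace a nonlinearity by
linear interpolation between breakpoints, so that each region yields a
linear ODE with a closed-form solution a program can analyze:

    import numpy as np

    def pl_approximate(f, breakpoints):
        # Replace f by linear pieces interpolating it at the
        # breakpoints: on [x_k, x_{k+1}] the ODE dx/dt = f(x) becomes
        # dx/dt = a_k x + b_k, which is solvable in closed form.
        xs = np.asarray(breakpoints)
        ys = f(xs)
        a = np.diff(ys) / np.diff(xs)   # slope of each piece
        b = ys[:-1] - a * xs[:-1]       # intercept of each piece
        return a, b

    # e.g., a piecewise-linear stand-in for the pendulum term sin(x)
    a, b = pl_approximate(np.sin, np.linspace(-np.pi, np.pi, 9))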

:tr 1035
:author Daniel S. Weld
:asort Weld, D.S.
:title Theories of Comparative Analysis
:date May 1988
:pages 181
:cost $9.00
:adnum AD-A196328
:keywords qualitative analysis, causal reasoning, comparative
analysis, DQ analysis, exaggeration
:abstract
Comparative analysis is the problem of predicting how a system will react to
perturbations in its parameters, and why. For example, comparative analysis
could be asked to explain why the period of an oscillating spring/block system
would increase if the mass of the block were larger.  This thesis formalizes
the task of comparative analysis and presents two solution techniques:
differential qualitative (DQ) analysis and exaggeration.  Both techniques
solve many comparative analysis problems, providing explanations suitable for
use by design systems, automated diagnosis, intelligent tutoring systems, and
explanation-based generalization.
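
For the spring/block example, the underlying quantitative fact is the
period formula $T = 2\pi\sqrt{m/k}$: since $\partial T/\partial m =
\pi/\sqrt{mk} > 0$, a heavier block oscillates more slowly.
Comparative analysis aims to deliver that conclusion, and its
explanation, qualitatively, without access to the closed-form solution.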

:tr 1036
:author Kenneth Alan Pasch
:asort Pasch, K.A.
:title Heuristics for Job-Shop Scheduling
:date January 1988
:pages 163
:cost $8.00
:adnum AD-A198192
:keywords scheduling, job-shop, heuristic, geometric
:abstract
Two methods of obtaining approximate solutions to the classic {\it
general Job-Shop Scheduling Problem} are investigated. The first
method is iterative. A sampling of the solution space is used to
decide which of a collection of space pruning constraints are
consistent with ``good'' schedules.  These reduce the search space and
the sampling is repeated.  This approach can be used either to verify
whether some set of pruning constraints can prune with discrimination
or to generate solutions directly.  The second method of generating
solutions prunes a Cartesian space in which schedules are represented
as trajectories.  On average this method significantly outperforms
an array of the {\it Priority Dispatch Rules} when the objective
criterion is that of {\it Minimum Maximum Lateness}. It also compares
favorably with a recent iterative relaxation procedure.
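
For reference, the simplest of the {\it Priority Dispatch Rules} for
this criterion is earliest-due-date sequencing, which provably
minimizes maximum lateness on a single machine; a Python sketch (a
one-machine illustration, not the full job-shop setting):

    def edd_schedule(jobs):
        # Earliest-Due-Date dispatch rule: sequencing one machine's
        # jobs by due date minimizes maximum lateness.
        # jobs: list of (name, processing_time, due_date).
        t, worst = 0, float("-inf")
        order = []
        for name, p, d in sorted(jobs, key=lambda j: j[2]):
            t += p
            worst = max(worst, t - d)   # lateness of this job
            order.append(name)
        return order, worst

    order, max_lateness = edd_schedule(
        [("a", 3, 5), ("b", 1, 2), ("c", 4, 11)])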

:tr 1043
:author Karl T. Ulrich
:asort Ulrich, K.T.
:title Computation and Pre-Parametric Design
:date September 1988
:pages 206
:cost $9.00
:adnum AD-A202382
:keywords design, engineering problem solving, physical reasoning
:abstract 
My work is broadly concerned with the question {\it How can designs be
synthesized computationally?} The project deals primarily with
mechanical devices, and focuses on {\it pre-parametric design}: design
at the level of detail of a blackboard sketch rather than at the level
of detail of an engineering drawing.  I explore the project ideas in
the domain of single-input single-output dynamic systems, like
pressure gages, accelerometers, and pneumatic cylinders.  The problem
solution consists of two steps: 1) generate a schematic description of
the device in terms of idealized functional elements, and then 2) from
the schematic description generate a physical description.

:tr 1045
:author Daniel Peter Huttenlocher
:asort Huttenlocher, D.P.
:title Three-Dimensional Recognition of Solid Objects from a
Two-Dimensional Image
:date October 1988
:pages 161
:cost $8.00
:adnum AD-A205667
:keywords computer vision, model-based recognition, alignment, affine
transformation 
:abstract
This thesis addresses the problem of recognizing solid objects in the
three-dimensional world, using two-dimensional shape information
extracted from a single image. Objects can be partly occluded and can
occur in cluttered scenes. A model-based approach is taken, where
stored models are matched to an image. The matching problem is
separated into two stages, which employ different representations of
objects. The first stage uses the smallest possible number of local
features to find transformations from a model to an image. This
minimizes the amount of search required in recognition. The second
stage uses the entire edge contour of an object to verify each
transformation. This reduces the chance of finding false matches.
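
The flavor of the first stage can be sketched in two dimensions: three
non-collinear point correspondences determine an affine transformation
exactly, which is the sense in which a smallest set of local features
fixes the model-to-image mapping.  A numpy sketch under that
simplifying 2-D assumption (the thesis treats the full 3-D-from-2-D
case):

    import numpy as np

    def fit_affine(model_pts, image_pts):
        # Least-squares 2-D affine transform taking model points to
        # image points; three non-collinear correspondences determine
        # it exactly.
        A = np.hstack([model_pts, np.ones((len(model_pts), 1))])
        T, *_ = np.linalg.lstsq(A, image_pts, rcond=None)
        return T   # 3x2: linear part in first two rows, offset last

    model = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
    image = np.array([[2.0, 1.0], [2.9, 1.1], [1.9, 2.0]])
    T = fit_affine(model, image)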

:tr 1047
:author Richard James Doyle
:asort Doyle, R.J.
:title Hypothesizing Device Mechanisms: Opening Up the Black Box
:date June 1988
:pages 213
:cost $9.00
:keywords causal reasoning, theory formation, qualitative reasoning,
modeling  
:abstract
I describe an approach to forming hypotheses about hidden mechanism
configurations within devices given external observations and a
vocabulary of primitive mechanisms.  An implemented causal modelling
system called {\bf JACK} constructs explanations for why a second piece of
toast comes out lighter, why the slide in a tire gauge does not slip
back inside when the gauge is removed from the tire, and how in a
refrigerator a single substance can serve as a heat sink for the
interior and a heat source for the exterior.  I report the number of
hypotheses admitted for each device example, and provide empirical
results which isolate the pruning power due to different constraint
sources.

:tr 1048
:author Reid G. Simmons
:asort Simmons, R.G.
:title Combining Associational and Causal Reasoning to Solve
Interpretation and Planning Problems
:date August 1988
:pages 215
:cost $9.00
:adnum AD-A202184
:keywords associational reasoning, causal reasoning, planning,
geologic interpretation, debugging
:abstract
This report describes a paradigm for combining associational and
causal reasoning to achieve efficient and robust problem-solving
behavior.  The Generate, Test and Debug (GTD) paradigm generates
initial hypotheses using associational (heuristic) rules.  The tester
verifies hypotheses, supplying the debugger with causal explanations
for bugs found if the test fails.  The debugger uses
domain-independent causal reasoning techniques to repair hypotheses,
analyzing domain models and the causal explanations produced by the
tester to determine how to replace faulty assumptions made by the
generator.  We analyze the strengths and weaknesses of associational
and causal reasoning techniques, and present a theory of debugging
plans and interpretations.  The GTD paradigm has been implemented and
tested in the domains of geologic interpretation, the blocks world,
and Tower of Hanoi problems.

:tr 1051
:author Peng Wu
:asort Wu, Peng
:title Test Generation Guided Design for Testability
:date July 1988
:pages 129
:cost $8.00
:keywords artificial intelligence, knowledge representation, test
generation, knowledge-based systems, VLSI design for testability
:abstract
This thesis presents a new approach to building a design for
testability (DFT) system.  The system takes a digital circuit
description, finds out the problems in testing it, and suggests
circuit modifications to correct those problems.  The key
contributions of the thesis research are: (1) setting design for
testability in the context of test generation (TG), (2) using failures
during TG to focus on testability problems, and (3) relating circuit
modifications directly to the failures. A natural functionality set is
used to represent the maximum functionalities that a component can
have. The current implementation has only primitive domain knowledge
and requires further work. However, armed with the knowledge of TG,
it has already demonstrated its ability and produced some interesting
results on a simple microprocessor.

:tr 1052
:author Paul Resnick
:asort Resnick, P.
:title Generalizing on Multiple Grounds: Performance Learning in
Model-Based Troubleshooting
:date February 1989
:pages 101
:cost $8.00
:adnum AD-A207960
:keywords learning, explanation-based learning, model-based
troubleshooting
:abstract
This thesis explores ways to augment a model-based diagnostic program
with a learning component, so that it speeds up as it solves problems.
Several learning components are proposed, each exploiting a different
kind of similarity between diagnostic examples.  Through analysis and
experiments, we explore the effect each learning component has on the
performance of a model-based diagnostic program.  We also analyze more
abstractly the performance effects of Explanation-Based
Generalization, a technology that is used in several of the proposed
learning components.

:tr 1053
:author Ron I. Kuper
:asort Kuper, R.I.
:title Dependency-Directed Localization of Software Bugs
:date May 1989
:pages 74
:cost $7.00
:adnum AD-A210837
:keywords debugging, programmer's apprentice
:abstract 
Software bugs are violated specifications.  Debugging is the process
that culminates in repairing a program so that it satisfies its
specification.  An important part of debugging is localization, whereby
the smallest region of the program that manifests the bug is found.  The
Debugging Assistant (DEBUSSI) localizes bugs by reasoning about logical
dependencies.  DEBUSSI manipulates the assumptions that underlie a bug
manifestation, eventually localizing the bug to one particular
assumption.  At the same time, DEBUSSI acquires specification
information, thereby extending its understanding of the buggy program.
The techniques used for debugging fully implemented code are also
appropriate for validating partial designs.

:tr 1054
:author William T. Townsend
:asort Townsend, W.T.
:title The Effect of Transmission Design on Force-Controlled
Manipulator Performance
:date April 1988
:pages 111
:cost $8.00
:adnum AD-A198132
:keywords force control, manipulator, transmission, whole-arm
manipulation (WAM), robotic arm, design
:abstract
Previous research in force control has focused on the choice of
appropriate servo implementation without corresponding regard to the
choice of mechanical hardware.  This report analyzes the effect of
mechanical properties such as contact compliance, actuator-to-joint
compliance, torque ripple, and highly nonlinear dry friction in the
transmission mechanisms of a manipulator.  A set of requisites for
high performance then guides the development of mechanical-design and
servo strategies for improved performance.  A single-degree-of-freedom
transmission testbed was constructed which confirms the predicted
effect of Coulomb friction on robustness; design and construction of a
cable-driven, four-degree-of-freedom, ``whole-arm'' manipulator
illustrates the recommended design strategies.

:tr 1055
:author Panayotis S. Skordos
:asort Skordos, P.S.
:title Multistep Methods for Integrating the Solar System
:date July 1988
:pages 101
:cost $8.00
:adnum AD-A201692
:keywords numerical integration, error analysis, solar system,
two-body problem, multistep integrators, roundoff error
:abstract 
High-order multistep methods, run at constant stepsize, are very
effective for integrating the Newtonian solar system for
extended periods of time. I have studied the stability and error
growth of these methods when applied to harmonic oscillators and
two-body systems like the Sun-Jupiter pair.  I have also tried to
design better multistep integrators than the traditional Störmer and
Cowell methods, and I have found a few interesting ones.
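
As a point of reference, the lowest-order Störmer-type rule for
$\ddot x = a(x)$ is $x_{n+1} = 2x_n - x_{n-1} + h^2 a(x_n)$; the
high-order methods studied use more past values.  A Python sketch on
the Kepler two-body problem (illustrative units with $GM = 1$):

    import numpy as np

    def stormer_orbit(accel, x0, v0, h, steps):
        # Constant-stepsize Stormer-type integration of x'' = accel(x)
        # using the classical second-order two-step rule.
        xs = [x0, x0 + h * v0 + 0.5 * h * h * accel(x0)]
        for _ in range(steps - 1):
            xs.append(2 * xs[-1] - xs[-2] + h * h * accel(xs[-1]))
        return np.array(xs)

    # two-body problem, Sun at the origin
    kepler = lambda x: -x / np.linalg.norm(x) ** 3
    orbit = stormer_orbit(kepler, np.array([1.0, 0.0]),
                          np.array([0.0, 1.0]), h=0.01, steps=5000)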

:tr 1056
:author Sundar Narasimhan
:asort Narasimhan, S.
:title Dexterous Robotic Hands: Kinematics and Control
:date November 1988
:pages 124
:cost $8.00
:adnum AD-A202183
:keywords hands, kinematics, computational architecture
:abstract
This report presents issues relating to the kinematics and control of
dexterous robotic hands using the Utah-MIT hand as an illustrative
example. The emphasis throughout is on the actual implementation and
testing of the theoretical concepts presented. The kinematics of such
hands is interesting and complicated owing to the large number of
degrees of freedom involved.  The implementation of position and force
control algorithms on such tendon-driven hands has previously suffered
from inefficient formulations and a lack of sophisticated computer
hardware. Both these problems are addressed in this report. A
multiprocessor architecture has been built with high performance
microcomputers on which real-time algorithms can be efficiently
implemented. A large software library has also been built to
facilitate flexible software development on this architecture. The
position and force control algorithms described herein have been
implemented and tested on this hardware.

:tr 1065
:author Margaret M. Fleck
:asort Fleck, M.M.
:title Boundaries and Topological Algorithms
:date August 1988
:pages 450
:cost $12.00
:adnum AD-A201040
:keywords stereo matching, edge finding, topology, time
representation, modeling boundaries
:abstract
This report develops a model for the topological structure of
situations.  Topological concepts are shown to be important in a wide
range of artificial intelligence problems.  Formal models of space and
boundaries are developed in which the topological structure of space is
altered by the presence or absence of boundaries, such as those at the
edges of objects.  An implemented edge finder and a stereo matcher are
described.  Both algorithms achieve better performance because they take
advantage of topological structure.  The topological matcher is also
used to develop a new method for testing edge finder performance.

:tr 1072
:author Steven D. Eppinger
:asort Eppinger, S.D.
:title Modeling Robot Dynamic Performance for Endpoint Force Control
:date August 1988
:pages 138
:cost $8.00
:adnum AD-A202881
:keywords robot control, robot modeling, robot force control, robot
dynamics 
:abstract
This research aims to understand the fundamental dynamic behavior of
servo-controlled machinery in response to various types of sensory
feedback.  As an example of such a system, we study robot force
control, a scheme which promises to greatly expand the capabilities of
industrial robots by allowing manipulators to interact with uncertain
and dynamic tasks.  Dynamic models are developed which allow the
effects of actuator dynamics, structural flexibility, and workpiece
interaction to be explored in the frequency and time domains.  The
models are used first to explain the causes of robot force control
instability, and then to find methods of improving this performance.

:tr 1074
:author Walter Charles Hamscher
:asort Hamscher, W.C.
:title Model-based Troubleshooting of Digital Systems
:date August 1988
:pages 316
:cost $10.00
:adnum AD-A201041
:keywords artificial intelligence, automated diagnosis, hardware
troubleshooting, model-based reasoning, diagnosis from first
principles, temporal reasoning
:abstract
This thesis describes a methodology, a representation, and an
implemented program for troubleshooting digital circuit boards
at roughly the level of expertise one might expect in a human
novice.  Existing methods for model-based troubleshooting have
not scaled up to deal with complex circuits, in part because
traditional circuit models do not explicitly represent aspects
of the device that troubleshooters would consider important.
For complex devices the model of the target device should be
constructed with the goal of troubleshooting explicitly in mind.
Given that methodology, the principal contributions of the
thesis are ways of representing complex circuits to help make
troubleshooting feasible.  Temporally coarse behavior
descriptions are a particularly powerful simplification.

:tr 1079
:author Eric W. Aboaf
:asort Aboaf, E.W.
:title Task-Level Robot Learning
:date August 1988
:pages 76
:cost $8.00
:adnum AD-A209940
:keywords tasks, robotics, learning
:abstract
We are investigating how to program robots so that they learn from
experience.  Our goal is to develop principled methods of learning
that can improve a robot's performance of a wide range of dynamic
tasks. We have developed {\it task-level learning} that successfully
improves a robot's performance of two complex tasks, ball-throwing and
juggling. With task-level learning, a robot practices a task, monitors
its own performance, and uses that experience to adjust its task-level
commands. This learning method serves to complement other approaches,
such as model calibration, for improving robot performance.
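
A minimal sketch of the practice loop this describes (all names
illustrative; the thesis's tasks are dynamic throwing and juggling):

    def task_level_practice(execute, target, command, gain=0.5,
                            trials=20):
        # Task-level learning: practice the task, measure task-level
        # error, and adjust the commanded parameter directly, with no
        # model of the arm's internal dynamics.  `execute` maps a
        # command (e.g., release velocity) to an observed outcome
        # (e.g., where the ball lands).
        for _ in range(trials):
            outcome = execute(command)
            command += gain * (target - outcome)  # correct the command
        return command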

:tr 1080
:author Waldemar Horwat
:asort Horwat, W.
:title A Concurrent Smalltalk Compiler for the Message-Driven
Processor 
:date May 1988
:pages 152
:cost $8.00
:adnum AD-A202182
:keywords compilation, Concurrent Smalltalk, fine grain, message
passing, message driven processor, parallel processing,
object-oriented programming, CST, programming languages
:abstract
The Optimist compiler compiles Concurrent Smalltalk to the assembly language of
the Message-Driven Processor (MDP), which is a node of the MIMD message-passing
J-Machine.  The compiler includes optimization techniques such as dead code
elimination, dataflow analysis, constant folding, move elimination, concurrency
analysis, duplicate code merging, tail forwarding, and register
variables.
The MDP presents unusual challenges and opportunities for compilation.  Due to
the MDP's small memory size, space optimization is more important than speed.
The MDP is an inherently concurrent processor with efficient mechanisms for
sending and receiving messages; the compiler takes advantage of these
mechanisms.  The MDP's tagged architecture allows very efficient support of
object-oriented languages such as Concurrent Smalltalk.

:tr 1081
:author Benjamin J. Paul
:asort Paul, B.J.
:title A Systems Approach to the Torque Control of a Permanent Magnet
Brushless Motor
:date August 1987
:pages 105
:cost $8.00
:adnum AD-A209399
:keywords torque, ripple, brushless
:abstract
Many approaches to force control have assumed the ability to command
torques accurately.  Concurrently, much research has been devoted to
developing accurate torque actuation schemes.  Often, torque sensors
have been utilized to close a feedback loop around output torque.  In
this paper, the torque control of a brushless motor is investigated
through: the design, construction, and utilization of a joint torque
sensor for feedback control; and the development and implementation of
techniques for phase-current-based feedforward torque control.  It is
concluded that simply closing a torque loop is no longer necessarily
the best alternative, since reasonably accurate current-based torque
control is achievable.
 
:tr 1085
:author Philip E. Agre
:asort Agre, P.E.
:title The Dynamic Structure of Everyday Life
:date October 1988
:pages 282
:cost $10.00
:adnum A2055677
:keywords situated activity, planning, action, deictic representation,
visual routines
:abstract
Computational theories of action have generally understood the organized
nature of human activity through the construction and execution of
plans.  By consigning the phenomena of contingency and improvisation to
peripheral roles, this view has led to impractical technical proposals.
As an alternative, I suggest that contingency is a central feature of
everyday activity and that improvisation is the central kind of human
activity.  I also offer a computational model of certain aspects of
everyday routine activity based on an account of improvised activity
called {\it running arguments} and an account of representation for
situated agents called {\it deictic representation}.

:tr 1086
:author Terence D. Sanger
:asort Sanger, T.D.
:title Optimal Unsupervised Learning in Feedforward Neural Networks 
:date January 1989
:pages 130
:cost $8.00
:adnum AD-A207961
:keywords neural networks, learning, connectionism, vision,
eigenvector analysis
:abstract
We investigate the properties of feedforward neural networks trained
with Hebbian learning algorithms.  A new unsupervised algorithm is
proposed which produces statistically uncorrelated outputs.  The
algorithm causes the weights of the network to converge to the
eigenvectors of the input correlation matrix with the largest
eigenvalues.  The
algorithm is closely related to the technique of Self-supervised
Backpropagation, as well as other algorithms for unsupervised
learning.  Applications of the algorithm to texture processing, image
coding, and stereo depth edge detection are given.  We show that the
algorithm can lead to the development of filters qualitatively similar
to those found in primate visual cortex.
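
A numpy sketch of a Hebbian update of this kind, in the generalized
Hebbian form the rule is usually written (illustrative; zero-mean
inputs and a small learning rate are assumed):

    import numpy as np

    def hebbian_pca_step(W, x, eta=0.01):
        # One update of the generalized Hebbian rule: weights converge
        # to the leading eigenvectors of the input correlation matrix;
        # the lower-triangular term keeps the outputs uncorrelated.
        # W is (n_outputs, n_inputs), x is a single input vector.
        y = W @ x
        W += eta * (np.outer(y, x) - np.tril(np.outer(y, y)) @ W)
        return W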

:tr 1089
:author Allen C. Ward
:asort Ward, A.C.
:title A Theory of Quantitative Inference, Applied to a Mechanical
Design Compiler
:date July 1989
:pages 142
:cost $8.00
:keywords mechanical design, constraint propagation, quantitative
reasoning, labeled interval calculus
:abstract
This thesis presents the ideas underlying a computer program that
takes as input a schematic of a mechanical or hydraulic power
transmission system, plus specifications and a utility function, and
returns catalog numbers from predefined catalogs for the optimal
selection of components implementing the design.  Unlike programs for
designing single components or systems, the program provides the
designer with a high-level ``language'' in which to compose new
designs.  It then performs some of the detailed design process.  The
process of ``compilation'' is based on a formalization of quantitative
inferences about hierarchically organized sets of artifacts and
operating conditions.  This allows design compilation without
exhaustive enumeration of alternatives.

:tr 1092
:author Eric Saund
:asort Saund, E.
:title The Role of Knowledge in Visual Shape Representation
:date October 1988
:pages 300
:cost $10.00
:adnum AD-A206173
:keywords shape representation, dimensionality-reduction, knowledge,
scale-space, later vision, dorsal fin
:abstract
This report shows how knowledge about the visual world can be built
into a shape representation in the form of a descriptive vocabulary
making explicit the important geometrical relationships comprising
objects' shapes. Two computational tools are offered: (1) {\it Shape
tokens} are placed on a {\it Scale-Space Blackboard}, and (2) {\it
Dimensionality-reduction} captures {\it deformation classes} in
configurations of tokens. Knowledge lies in the token types and
deformation classes tailored to the constraints and regularities of
particular shape worlds. A hierarchical shape vocabulary has been
implemented supporting several later visual tasks in the
two-dimensional shape domain of the dorsal fins of fishes.

:tr 1099
:author Mark Harper Shirley
:asort Shirley, M.H.
:title Generating Circuit Tests by Exploiting Designed Behavior
:date December 1988
:pages 307
:cost $10.00
:adnum AD-A206175
:keywords knowledge-based systems, VLSI, circuit testing, test
generation  
:abstract
This thesis describes two programs for generating tests for digital
circuits that exploit several kinds of expert knowledge not used by
previous approaches.  First, many test generation problems can be
solved efficiently using operation relations, a novel representation
of circuit behavior that connects internal component operations with
directly executable circuit operations.  Operation relations can be
computed efficiently by searching traces of simulated circuit
behavior.  Second, experts write test programs rather than test
vectors because programs are more readable and compact.  Test programs
can be constructed automatically by merging program fragments using
expert-supplied goal-refinement rules and domain-independent planning
techniques.

:tr 1103
:author Henry M. Wu
:asort Wu, H.M.
:title Performance Evaluation of the Scheme86 and HP Precision
Architectures 
:date April 1989
:pages 38
:cost $7.00
:adnum AD-A209418
:keywords computer architecture, pipelining, performance evaluation,
parallelism 
:abstract
The Scheme86 and the HP Precision Architectures represent different
trends in computer processor design.  The former uses wide
micro-instructions, parallel hardware, and a low latency memory
interface.  The latter encourages pipelined implementation and visible
interlocks.  To compare the merits of these approaches, algorithms
frequently encountered in numerical and symbolic computation were
hand-coded for each architecture.  Timings were done in simulators and
the results were evaluated to determine the speed of each design.
Based on these measurements, conclusions were drawn as to which aspects
of each architecture are suitable for a high-performance computer.

:tr 1112
:author Richard P. Wildes
:asort Wildes, R.P.
:title On Interpreting Stereo Disparity
:date February 1989
:pages 159
:cost $8.00
:adnum AD-A209398
:keywords image understanding, stereo, surface representation, 3D vision
:abstract
The problems under consideration center around the interpretation of
binocular stereo disparity. In particular, the goal is to establish a
set of mappings from stereo disparity to corresponding
three-dimensional scene geometry. An analysis has been developed that
shows how disparity information can be interpreted in terms of
three-dimensional scene properties, such as surface depth,
discontinuities and orientation. These theoretical
developments have been embodied in a set of computer algorithms for
the recovery of scene geometry from input stereo disparity. The
results of applying these algorithms to several disparity maps are
presented. Comparisons are made to the interpretation of
stereo disparity by biological systems.

:tr 1125
:author Michelle Kwok Lee
:asort Lee, M.K.
:title Summarizing Qualitative Behavior from Measurements of Nonlinear
Circuits 
:date May 1989
:pages 86
:cost $8.00
:adnum AD-A210919
:keywords qualitative analysis, parameter-space graph, dynamical
systems, symbolic description
:abstract
This report describes a program which automatically characterizes the
behavior of any driven, nonlinear, electrical circuit. To do this, the
program autonomously selects interesting input parameters, drives the
circuit, measures its response, performs a set of numeric computations
on the measured data, interprets the results and decomposes the
circuit's parameter space into regions of qualitatively distinct
behavior.  The output is a two-dimensional portrait summarizing the
high-level, {\em qualitative} behavior of the circuit for every point
in the graph, an accompanying textual explanation describing any
interesting patterns observed in the diagram, and a symbolic
description of the circuit's behavior that can be passed on to other
programs for further analysis.

:tr 1128
:author Jeffrey Van Baalen
:asort Van Baalen, J.
:title Toward a Theory of Representation Design
:date May 1989
:pages 219
:cost $9.00
:adnum AD-A210885
:keywords knowledge representation, knowledge based systems
:abstract
This research is concerned with designing representations for
analytical reasoning problems (of the sort found on the GRE and LSAT).
These problems test the ability to draw logical conclusions.  A
computer program was developed that takes as input a straightforward
predicate calculus translation of a problem, requests additional
information if necessary, decides what to represent and how, designs
representations capturing the constraints of the problem, and creates
and executes a LISP program that uses those representations to produce
a solution.  Even though these problems are typically difficult for
theorem provers to solve, the LISP program that uses the designed
representations is very efficient.

:tr 1143
:author W. Kenneth Stewart, Jr.
:asort Stewart, W.K.
:title Multisensor Modeling Underwater with Uncertain Information
:date 
:pages 172
:cost $8.00
:keywords autonomous robot, object recognition, multisensor fusion,
navigation, visual modeling
:abstract
This thesis develops an approach to the construction of
multidimensional stochastic models for intelligent systems exploring
an underwater environment.  It describes methods for building models
by a three-dimensional spatial decomposition of stochastic,
multisensor feature vectors.  New sensor information is incrementally
incorporated into the model by stochastic backprojection.  Error and
ambiguity are explicitly accounted for by blurring a spatial
projection of remote sensor data before incorporation.  The stochastic
models can be used to derive surface maps or other representations of
the environment.  The methods are demonstrated on data sets from
multibeam bathymetric surveying, towed sidescan bathymetry, towed
sidescan acoustic imagery, and high-resolution scanning sonar aboard a
remotely operated vehicle.
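
A toy numpy sketch of the blur-then-incorporate idea (hypothetical
names; a 2-D grid standing in for the 3-D spatial decomposition):

    import numpy as np
    from scipy.ndimage import gaussian_filter

    def incorporate(model, projected_reading, sigma):
        # Blur the spatial projection of a remote sensor reading by
        # its estimated error before accumulating it, so uncertainty
        # is represented explicitly in the stochastic model.
        model += gaussian_filter(projected_reading, sigma)
        return model

    model = np.zeros((64, 64))
    reading = np.zeros((64, 64))
    reading[32, 40] = 1.0          # one backprojected sonar return
    model = incorporate(model, reading, sigma=2.0)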

:tr 1144
:author Andrew Berlin
:asort Berlin, A.
:title A Compilation Strategy for Numerical Programs Based on Partial
Evaluation 
:date July 1989
:pages 78
:cost $8.00
:keywords partial evaluation, compilation, parallel programming,
symbolic interpretation, parallel scheduling, scientific computation
:abstract
This work demonstrates how partial evaluation can be put to practical
use in the domain of high-performance numerical computation.  I have
developed a technique for performing partial evaluation by using
placeholders to propagate intermediate results.  For an important
class of numerical programs, a compiler based on this technique
improves performance by an order of magnitude over conventional
compilation techniques.  I show that by eliminating inherently
sequential data-structure references, partial evaluation exposes the
low-level parallelism inherent in a computation.  I have implemented
several parallel scheduling and analysis programs that study the
tradeoffs involved in the design of an architecture that can
effectively utilize this parallelism.  I present these results using
the 9-body gravitational attraction problem as an example.
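
A toy sketch of the placeholder idea (not the compiler itself):
arithmetic on placeholder objects records residual operations, so
specializing a loop of known length unfolds it into straight-line code
whose independent operations expose the parallelism:

    class Placeholder:
        # Stands in for a runtime value during partial evaluation;
        # arithmetic on placeholders emits residual operations
        # instead of computing.
        counter = 0
        trace = []
        def __init__(self, name=None):
            if name is None:
                Placeholder.counter += 1
                name = f"t{Placeholder.counter}"
            self.name = name
        def _emit(self, op, other):
            out = Placeholder()
            Placeholder.trace.append(
                f"{out.name} = {self.name} {op} {other.name}")
            return out
        def __add__(self, other): return self._emit("+", other)
        def __mul__(self, other): return self._emit("*", other)

    def dot(xs, ys):
        acc = xs[0] * ys[0]
        for i in range(1, len(xs)):
            acc = acc + xs[i] * ys[i]
        return acc

    xs = [Placeholder(f"x{i}") for i in range(3)]
    ys = [Placeholder(f"y{i}") for i in range(3)]
    dot(xs, ys)   # Placeholder.trace now holds straight-line code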

:tr 1151
:author Jonathan Connell
:asort Connell, J.
:title A Colony Architecture for an Artificial Creature
:date August 1989
:pages 131
:cost $8.00
:keywords subsumption, robotics, mobile robot, multi-agent,
autonomous, collection
:abstract
This report describes a working autonomous mobile robot whose only
goal is to collect and return empty soda cans. It operates in an
unmodified office environment occupied by moving people. The robot is
controlled by a collection of over 40 independent ``behaviors''
distributed over a loosely coupled network of 24 processors. Together
this ensemble helps the robot locate cans with its laser rangefinder,
collect them with its on-board manipulator, and bring them home using
a compass and an array of proximity sensors. We discuss the advantages
of using such a multi-agent control system and show how to decompose
the required tasks into component activities. We also examine the
benefits and limitations of spatially local, stateless, and
independent computation by the agents.
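
A minimal sketch of fixed-priority arbitration in the spirit of such a
behavior collection (the behaviors and sensor fields here are invented
stand-ins):

    def arbitrate(behaviors, sensors):
        # Each behavior either proposes an action or abstains; the
        # highest-priority proposal drives the robot.
        for behavior in behaviors:      # ordered highest priority first
            action = behavior(sensors)
            if action is not None:
                return action
        return "wander"                 # default exploratory behavior

    avoid  = lambda s: "turn_away" if s["proximity"] < 0.3 else None
    pickup = lambda s: "grab_can" if s["can_in_gripper_range"] else None
    action = arbitrate([avoid, pickup],
                       {"proximity": 0.8, "can_in_gripper_range": True})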