[mod.techreports] st11.x tech reports

E1AR0002@SMUVM1.BITNET (11/05/86)

TECHNICAL NOTE:  203\hfill PRICE: \$14.00\\[0.01in]

\noindent TITLE:  CONVERSATION AS PLANNED BEHAVIOR\\
AUTHORS:  JERRY R. HOBBS and DAVID A. EVANS\\
DATE:  DECEMBER 1979\\[0.01in]

ABSTRACT: Perhaps the most  promising  working hypothesis for the
study of conversation is that the participants can be viewed  as using
planning   mechanisms  much     like those   developed  in  artificial
intelligence.  In    this  paper,   a  framework    for  investigating
conversation,  which  for convenience   will  be  called  the Planning
Approach, is developed from this  hypothesis.  It suggests  a style of
analysis  to   apply  to conversation,    analysis   in terms  of  the
participants' goals, plans, and beliefs, and it indicates a consequent
program of research to  be pursued.     These are developed  in detail
in Part 2.

     Parts 3 and  4 are  devoted to  the microanalysis  of  an  actual
free-flowing  conversation,  as   an   illustration  of   the style of
analysis.  In the process, order is discovered in a conversation  that
on the surface seems   quite  incoherent.  The microanalysis  suggests
some   ways in which  the   planning mechanisms  common  in artificial
intelligence will have to be extended to  deal with  conversation, and
these are  discussed in Part  5.   In  Part 6, certain   methodological
difficulties are examined.   Part 7 addresses the  problem that arises
in this approach of what constitutes successful communication.\\
--------------------------------------------------------------------------------
-------------------------------------------------\\
TECHNICAL NOTE:  204\hfill PRICE:  \$12.00\\[0.01in]

\noindent TITLE:  METAPHOR, METAPHOR SCHEMATA, AND SELECTIVE INFERENCING\\
AUTHOR:  JERRY R. HOBBS\\
DATE:  DECEMBER 1979\\[0.01in]

     ABSTRACT: The  importance  of  spatial  and  other metaphors   is
demonstrated.  An approach  to  handling metaphor  in a  computational
framework is described,  based on the  idea  of selective inferencing.
Three examples of  metaphors are examined in detail  in this  light--a
simple metaphor,  a spatial metaphor  schema,  and  a novel  metaphor.
Finally, there is a discussion, from this perspective, of the analogical
processes that underlie metaphor and of what the approach says about
several classical questions concerning metaphor.\\
--------------------------------------------------------------------------------
-------------------------------------------------\\
TECHNICAL NOTE:  205\hfill PRICE:  \$16.00\\[0.01in]

\noindent TITLE:  DIAGRAM:  A GRAMMAR FOR DIALOGUES\\
AUTHOR:  JANE J. ROBINSON\\
DATE:  FEBRUARY 1980\\[0.01in]

     ABSTRACT: This paper presents an explanatory overview  of a large
and complex grammar, DIAGRAM, that is  used in a  computer  system for
interpreting English dialogue.  DIAGRAM  analyzes  all of  the   basic
kinds of phrases  and sentences and many  quite complex  ones as well.
It is not tied to a particular domain  of application, and  it  can be
extended to analyze additional constructions,  using  the formalism in
which it is currently   written.  For every   expression it  analyzes,
DIAGRAM provides an annotated description of  the structural relations
holding  among its  constituents.   The  annotations provide important
information  for  other parts  of   the  system  that  interpret   the
expression in the context of a dialogue.

     DIAGRAM  is  an augmented  phrase  structure grammar.  Its   rule
procedures allow phrases to inherit attributes from their constituents
and  to acquire attributes from   the  larger  phrases  in which  they
themselves are constituents.   Consequently, when these attributes are
used to  set context-sensitive  constraints  on  the acceptance  of an
analysis,  the contextual constraints can  be imposed by conditions on
dominance as well as conditions on constituency.  Rule  procedures can
also assign scores to an analysis, rating some applications of a  rule
as probable or  as unlikely.  Less  likely analyses can be ignored  by
the procedures that interpret the utterance.

     In   assigning categories and  writing  the   rule statements and
procedures for DIAGRAM, decisions were guided by  consideration of the
functions  that  phrases  serve in   communication   as   well  as  by
considerations   of  efficiency  in  relating    syntactic analyses to
propositional   content.   The major  decisions  are   explained   and
illustrated with examples of the rules and the analyses they  provide.
Some contrasts with  transformational grammars  are pointed  out and
problems that motivate a plan to use  redundancy  rules  in the future
are discussed.  (Redundancy   rules  are meta-rules   that  derive new
constituent-structure rules    from a  set    of  base rules,  thereby
achieving generality of syntactic  statement without having to perform
transformations on  syntactic analyses.)    Other extensions of   both
grammar  and formalism   are  projected in  the  concluding   section.
Appendices  provide  details  and  samples of the  lexicon,  the  rule
statements, and the   procedures,  as well  as   analyses  for several
sentences that differ in type and structure.\\
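The attribute-passing and scoring machinery described above can be sketched in miniature (a hypothetical toy rule, not DIAGRAM's actual formalism, categories, or attribute names):

```python
# A toy augmented phrase-structure rule: its procedure lets the S phrase
# inherit an attribute from a constituent, imposes a context-sensitive
# constraint (number agreement), and scores the analysis so that less
# likely readings can be ignored downstream.

def rule_s(np, vp):
    """S -> NP VP, with an agreement constraint and a likelihood score."""
    if np["number"] != vp["number"]:
        return None                          # constraint rejects the analysis
    score = 1.0 if np.get("case", "nominative") == "nominative" else 0.5
    return {"cat": "S",
            "number": np["number"],          # attribute inherited upward
            "score": score}

ok = rule_s({"cat": "NP", "number": "sg", "case": "nominative"},
            {"cat": "VP", "number": "sg"})
bad = rule_s({"cat": "NP", "number": "pl"},
             {"cat": "VP", "number": "sg"})  # None: agreement fails
```

In the same spirit, a dominating phrase could pass attributes down into `np` and `vp` before the procedure runs, giving constraints on dominance as well as on constituency.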
--------------------------------------------------------------------------------
-------------------------------------------------\\
TECHNICAL NOTE:  206\hfill PRICE:  \$14.00\\[0.01in]

\noindent TITLE:  THE INTERPRETATION OF VERB PHRASES IN DIALOGS\\
AUTHOR:  ANN E. ROBINSON\\
DATE:  JANUARY 1980\\[0.01in]

     ABSTRACT:   This  paper discusses two   problems  central to  the
interpretation of  utterances: determining   the relationship  between
actions described in  an utterance  and  events   in   the world,  and
inferring ``the state of the world'' from utterances.   Knowledge of the
language, knowledge about   the general subject  being  discussed, and
knowledge about the current situation are all necessary for this.  The
problem of  determining an action  referred to by  a  verb  phrase   is
analogous  to the problem of determining  the object referred  to by a
noun phrase.

     This paper presents  an approach to the  problem of verb phrase
resolution in which knowledge about language, the  problem domain, and
the dialog itself is combined to interpret such references.  Presented
and  discussed are the  kinds of knowledge  necessary for interpreting
references to actions, as well as algorithms  for using that knowledge
in interpreting dialog utterances about  ongoing tasks and for drawing
inferences  about  the task situation    that are   based  on a  given
interpretation.\\
--------------------------------------------------------------------------------
-------------------------------------------------\\
TECHNICAL NOTE:  210\hfill PRICE:  \$14.00\\[-0.15in]
\begin{tabbing}
\noindent TITLE: \= INTERPRETING NATURAL-LANGUAGE UTTERANCES IN\\
             \> DIALOGS ABOUT TASKS\\
AUTHORS: \= ANN E. ROBINSON, DOUGLAS E. APPELT, BARBARA J. GROSZ,\\
     \> GARY G. HENDRIX, and JANE J. ROBINSON\\
DATE:  MARCH 1980\\[-0.15in]
\end{tabbing}

     ABSTRACT:  This  paper  describes  the  results of  a  three-year
research effort  investigating the knowledge and  processes needed for
participation in natural-language    dialogs   about ongoing
mechanical-assembly  tasks.   Major  concerns   were the    ability to
interpret and respond  to  utterances within the  dynamic  environment
effected by  progress in the   task, as well  as  by  the concomitant
shifting dialog context.

     The research strategy  followed was to  determine  the  kinds  of
knowledge  needed,  to define  formalisms     for  encoding  them  and
procedures for reasoning with them,  to implement those formalisms and
procedures in a computer system called TDUS, and then to test  them by
exercising the system.

     Principal accomplishments include: development of a framework for
encoding knowledge about  linguistic  processes; encoding of a grammar
for  recognizing many  of    the  syntactic structures    of  English;
development of the concept of ``focusing,'' which clarifies a major role
of  context; development of  a formalism   for representing  knowledge
about processes, and procedures  for reasoning about them; development
of   an  overall  framework  for describing  how    different types of
knowledge interact in  the  communication  process; development of   a
computer system that  not only   demonstrates the  feasibility of  the
various formalisms and procedures, but  also provides a  research tool
for testing new hypotheses about the communication process.

CONTENT INDICATORS:  3.60, 3.69, 3.42

KEY WORDS:  Natural-language understanding, Task-oriented dialogs\\
--------------------------------------------------------------------------------
-------------------------------------------------\\
TECHNICAL NOTE:  213\hfill PRICE:  \$14.00\\[-0.15in]
\begin{tabbing}
\noindent TITLE: \= RANDOM SAMPLE CONSENSUS: A PARADIGM FOR MODEL FITTING WITH\\
         \> APPLICATIONS TO IMAGE ANALYSIS AND AUTOMATED CARTOGRAPHY\\
AUTHORS:  MARTIN A. FISCHLER and ROBERT C. BOLLES \\
DATE:  MARCH 1980\\[-0.15in]
\end{tabbing}

     ABSTRACT: In  this  paper  we introduce a    new paradigm, Random
Sample Consensus (RANSAC), for fitting a  model  to experimental data.
RANSAC  is   capable  of  interpreting/smoothing  data   containing  a
significant percentage of gross errors, and thus is ideally suited for
applications in automated image analysis where interpretation is based
on the data  provided  by  error-prone  feature detectors.   A   major
portion  of  this  paper describes  the application  of RANSAC  to the
Location Determination Problem (LDP): given an  image depicting  a set
of landmarks with known locations, determine  that point in space from
which the image was obtained.  In response to a RANSAC requirement, we
derive new results on the minimum number of landmarks needed to obtain
a   solution,   and   present    algorithms    for   computing   these
minimum-landmark solutions in closed form.  These results  provide the
basis for an automatic system that can  solve the  LDP under difficult
viewing   and  analysis   conditions.    Implementation  details   and
computational examples are also presented.\\
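The RANSAC loop the abstract describes can be sketched as follows (an illustrative line-fitting instance, not the paper's location-determination application; the tolerance, iteration count, and data are arbitrary assumptions):

```python
import random

def ransac_line(points, n_iter=200, tol=0.1, seed=0):
    """Minimal RANSAC sketch: fit y = a*x + b to data with gross errors.
    Repeatedly fit the model to a minimal random sample (two points) and
    keep the model with the largest consensus set of inliers."""
    rng = random.Random(seed)
    best_model, best_inliers = None, []
    for _ in range(n_iter):
        (x1, y1), (x2, y2) = rng.sample(points, 2)   # minimal sample
        if x1 == x2:
            continue                                 # degenerate sample
        a = (y2 - y1) / (x2 - x1)
        b = y1 - a * x1
        inliers = [(x, y) for x, y in points
                   if abs(y - (a * x + b)) <= tol]   # consensus set
        if len(inliers) > len(best_inliers):
            best_model, best_inliers = (a, b), inliers
    return best_model, best_inliers

# Ten points on y = 2x + 1 plus two gross outliers:
points = [(x / 10, 2 * x / 10 + 1) for x in range(10)] + [(0.5, 9.0), (0.2, -7.0)]
model, inliers = ransac_line(points)
```

A least-squares fit to the same data would be dragged far off the true line by the two outliers; RANSAC simply never admits them to the consensus set.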
--------------------------------------------------------------------------------
------------------------------------------------\\
TECHNICAL NOTE:  220\hfill PRICE:  \$10.00\\[-0.15in]
\begin{tabbing}
\noindent TITLE: \= A STORAGE REPRESENTATION FOR EFFICIENT ACCESS TO\\
             \> LARGE MULTIDIMENSIONAL ARRAYS\\
AUTHOR:  LYNN H. QUAM\\
DATE:  APRIL 1980\\[-0.15in]
\end{tabbing}

     ABSTRACT: This paper addresses problems associated with accessing
elements of large multidimensional arrays when the order  of access is
either unpredictable or  is orthogonal to the conventional   order  of
array  storage.  Large  arrays are defined as  arrays that are  larger
than the physical  memory immediately available  to store them.   Such
arrays must be accessed either  by the virtual memory  system  of  the
computer and operating system, or by direct input and output of blocks
of the array to a file system.  In  either case, the  direct result of
an inappropriate  order of reference  to the elements  of the array is
the very time-consuming movement of data between levels  in the memory
hierarchy,   often costing factors  of  three orders  of  magnitude in
algorithm performance.

     The access to  elements of large arrays  is decomposed into three
steps: transforming the subscript   values of an  n-dimensional  array
into  the element number in a  one-dimensional virtual array,  mapping
the virtual array position to  physical memory position, and accessing
the array element in physical memory.  The virtual-to-physical mapping
step    is unnecessary on  computer  systems  with  sufficiently large
virtual address  spaces.  This paper is  primarily  concerned with the
first step.

     A subscript  transformation  is proposed that solves  many of the
order-of-access problems associated with  conventional array  storage.
This  transformation is   based  on an additive  decomposition of  the
calculation of element number in the  array into the  sum of a  set of
integer functions applied to the set of subscripts as follows:

\begin{center}
element-number$(i,j,\ldots) = f_i(i) + f_j(j) + \cdots$
\end{center}

     Choices for  the transformation functions  that  minimize  access
time to the   array  elements  depend on  the   characteristics of the
computer system's memory hierarchy  and the order  of accesses  to the
array elements.  It is  conjectured that given appropriate models  for
system and algorithm access  characteristics,  a pragmatically optimum
choice  can  be  made for the subscript transformation  functions.  In
general these   models  must be stochastic,   but   in   certain cases
deterministic models are possible.

     Using tables  to    evaluate the  functions  $f_i$  and  $f_j$  makes
implementation very efficient with  conventional computers.  When  the
array  accesses are made in  an  order  inappropriate to  conventional
array storage order,   this scheme  requires  far less  time  than for
conventional array-accessing schemes; otherwise,  accessing times  are
comparable.
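Concretely, the table-driven scheme might look like this (the array shape, block size, and blocked layout below are illustrative assumptions, not the paper's implementation):

```python
# Additive subscript decomposition: element_number(i, j) = f_i[i] + f_j[j].
# Changing the storage layout changes only the lookup tables, never the
# access code itself.

M, N = 4, 6                        # illustrative array dimensions
storage = list(range(M * N))       # one-dimensional "virtual array"

# Conventional row-major layout as a pair of tables:
f_i = [i * N for i in range(M)]
f_j = list(range(N))

def element(i, j):
    """Access element (i, j) by summing the per-subscript table entries."""
    return storage[f_i[i] + f_j[j]]

# A blocked (tiled) layout keeping B x B blocks contiguous, so accesses
# that are local in both i and j touch the same region of storage -- the
# case where the additive scheme beats conventional ordering:
B = 2
g_i = [(i // B) * (N * B) + (i % B) * B for i in range(M)]
g_j = [(j // B) * (B * B) + (j % B) for j in range(N)]
```

Because both layouts are expressed as sums of per-subscript tables, an algorithm whose access order is orthogonal to row-major order can be switched to the blocked tables without touching its inner loop.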

     The semantics  of   a set of procedures  for  array access, array
creation, and the  association of arrays with file   names is defined.
For computer  systems  with insufficient  virtual memory, such  as the
PDP-10, a  software   virtual-to-physical mapping scheme   is given in
Appendix C.  Implementations are also  given in the appendix  for  the
VAX and PDP-10  series   computers to access   pixels of large  images
stored as two-dimensional arrays of n bits per element.\\
--------------------------------------------------------------------------------
------------------------------------------------\\
TECHNICAL NOTE:  221\hfill PRICE:  \$14.00\\[0.01in]

\noindent TITLE:  SCENE MODELING:  A STRUCTURAL BASIS FOR IMAGE DESCRIPTION\\
AUTHORS:  JAY M. TENENBAUM, MARTIN A. FISCHLER, and HARRY G. BARROW\\
DATE:  JULY 1980\\[0.01in]

     ABSTRACT: Conventional statistical  approaches to  image  modeling are
fundamentally limited because they take  no account of  the underlying
physical structure of the  scene  nor  of the image formation process.
The image features being modeled are frequently artifacts of viewpoint
and illumination that have no  intrinsic significance for higher-level
interpretation.  In this paper a  structural approach  to modeling  is
argued for  that  explicitly relates image   appearance to  the  scene
characteristics from which it arose.  After establishing the necessity
for structural  modeling in image analysis, a  specific representation
for scene  structure  is proposed   and then a  possible computational
paradigm for recovering this description from an image is described.\\
--------------------------------------------------------------------------------
-------------------------------------------------\\
TECHNICAL NOTE:  222\hfill PRICE:  \$10.00\\[-0.15in]
\begin{tabbing}
\noindent TITLE: \= RECONSTRUCTING SMOOTH SURFACES FROM PARTIAL,\\
         \> NOISY INFORMATION\\
AUTHORS:  HARRY G. BARROW and J. MARTIN TENENBAUM\\
DATE:  JULY 1980\\[-0.15in]
\end{tabbing}

     ABSTRACT: Interpolating smooth surfaces from  boundary conditions
is a  ubiquitous problem in  early visual processing.    We describe a
solution for an important special case:  the interpolation of surfaces
that are  locally spherical  or  cylindrical  from initial orientation
values and  constraints   on orientation.  The   approach  exploits an
observation that components   of   the unit normal  vary   linearly on
surfaces  of uniform   curvature, which permits   implementation using
local parallel processes.    Experiments on spherical and  cylindrical
test cases have produced essentially  exact reconstructions, even when
boundary values were extremely sparse  or only partially  constrained.
Results on other test cases  seem in  reasonable agreement with  human
perception.\\
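The key observation above, that components of the unit normal vary linearly on uniformly curved surfaces, is what makes a local parallel scheme work: neighbor averaging reproduces linear functions exactly. A one-dimensional sketch (an illustrative relaxation, not the authors' implementation) shows the idea:

```python
# Because a normal component varies linearly across a uniformly curved
# surface, repeatedly replacing each interior value by the average of its
# neighbors (a local, parallelizable relaxation) converges to the linear
# interpolant of the boundary values.  Grid size and sweep count are
# arbitrary choices for illustration.

def relax(boundary_left, boundary_right, n, sweeps=2000):
    vals = [0.0] * n
    vals[0], vals[-1] = boundary_left, boundary_right   # fixed boundary values
    for _ in range(sweeps):
        new = vals[:]
        for k in range(1, n - 1):
            new[k] = 0.5 * (vals[k - 1] + vals[k + 1])  # local averaging
        vals = new
    return vals

vals = relax(0.0, 1.0, 11)   # converges to the linear profile k/10
```

On a 2-D grid the same update runs at every interior pixel in parallel, with sparse orientation values and constraints supplying the boundary conditions.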
--------------------------------------------------------------------------------
-------------------------------------------------\\
TECHNICAL NOTE:  224\hfill PRICE:  \$12.00\\[0.01in]

\noindent TITLE:  A D-LADDER USER'S GUIDE\\
AUTHOR:  DANIEL SAGALOWICZ\\
DATE:  SEPTEMBER 1980\\[0.01in]

     ABSTRACT: D-LADDER  (DIAMOND-based Language Access to Distributed
Data with Error  Recovery) is  a computer  system designed  to provide
answers to questions posed at  the  terminal in   a subset of  natural
language regarding   a distributed data  base of  naval    command and
control information.  The system accepts  natural-language questions
about   the data.   For each question    D-LADDER plans a  sequence of
appropriate queries to the data base  management system, determines on
which machines  the queries are to be  processed, establishes links to
those machines over the ARPANET, monitors the processing of the
queries, and assembles the results into an answer to the original question.

     This user's guide is intended for the person who knows how to log
in to the host operating system, as well  as  how to enter  and edit a
line of text.  It does not explain how D-LADDER works, but rather  how
to use it on a demonstration basis.\\
--------------------------------------------------------------------------------
-------------------------------------------------\\
TECHNICAL NOTE:  225\hfill PRICE:  \$15.00\\[-0.15in]
\begin{tabbing}
\noindent TITLE: \= INTERPRETING DISCOURSE:  COHERENCE AND THE ANALYSIS\\
         \> OF ETHNOGRAPHIC INTERVIEWS\\
AUTHORS:  MICHAEL AGAR and JERRY R. HOBBS\\
DATE:  AUGUST 1980\\[-0.15in]
\end{tabbing}

     ABSTRACT: The data we  analyze is from a  series  of life history
interviews with a career heroin addict in New York, collected by  Agar
(1981).  We analyze this data  in terms  of  a combination  of two  AI
approaches to discourse.  The  first is work  on the  inferencing that
must take place in  people's  comprehension and  production of natural
language discourse.  The second approach to discourse applies  work on
planning to the planning  of individual speech  acts and  to the plans
speakers  develop for effecting  their  goals  in  larger stretches of
conversation.

     In this paper we  first outline how  we apply these approaches to
the ethnographic data.   We discuss three kinds of  coherence in terms
of which  we  analyze a  text,    and then describe   our  method more
generally.  We next give an example of the method of microanalysis  on
a short fragment of an interview, and then show how the beliefs, goals
and concerns that the microanalysis has revealed are tied  in with the
rest of the corpus.  Finally, we discuss the significance of this work
for ethnography.\\
--------------------------------------------------------------------------------
------------------------------------------------\\