[net.ai] AIList Digest V3 #58

LAWS@SRI-AI.ARPA (05/06/85)

From: AIList Moderator Kenneth Laws <AIList-REQUEST@SRI-AI>


AIList Digest             Monday, 6 May 1985       Volume 3 : Issue 58

Today's Topics:
  Administrivia - Request for Help,
  Seminars - Database Theory and Equations (MIT) &
    Processing Uncertain Knowledge (UToronto) &
    Compact LISP Machine (SU) &
    How Processes Learn (CMU) &
    Space Modeling for Robot Navigation (SU) &
    Pictorial Explanations (UPenn) &
    Distributed Knowledge-Based Learning (USCarolina) &
    DART: An Automated Diagnostician (SU) &
    Abstraction and Classification in NIKL (MIT) &
    Evidential Reasoning in Semantic Networks (BBN)

----------------------------------------------------------------------

Date: Sun 5 May 85 16:09:01-PDT
From: Ken Laws <Laws@SRI-AI.ARPA>
Reply-to: AIList-Request@SRI-AI.ARPA
Subject: Seminar Abstracts

I fell behind on the seminar notices when I took off for DC early this
month.  Rather than just ignore all the talks, I've decided to "read
them into the record" now.  It will take three issues to hold all the
backlogged notices -- just ignore them if abstracts aren't your thing.

I find that I'm spending too much time gathering and editing seminar
notices, so I'm going to cut back.  I will continue to forward the
Stanford notices, but I'd like to get some volunteers to edit and
forward notices from CMU, CSLI, MIT-MC, Rutgers, and UTexas-20.

Other contributors can help me by providing meaningful Subject lines
for their messages.  I have been cleaning up many of the subject lines
to aid in sorting the messages and to simplify construction of the
Today's Topics header.  Finding a concise summary of an author's
message takes a fair amount of effort, and I would appreciate it if
contributors would provide one themselves.

If anyone would like to split off a linguistics/psychology list, I'll
provide the necessary assistance.  I like reading the messages, but
I'm having trouble moderating such discussions and deciding which
seminar notices to include.  I think that a separate discussion list
is the best solution, but I'm willing to consider some kind of
joint moderation of the AIList message stream.  AIList is a fun hobby,
but I'd like to have a little more free time for other activities.

Thanks to everyone for helping to make AIList such a success.

                                        -- Ken Laws

------------------------------

Date: 5 Apr 1985 1229-EST
From: ALR at MIT-XX.ARPA
Subject: Seminar - Database Theory and Equations (MIT)

           [Forwarded from the MIT bboard by SASW@MIT-MC.]


DATE:    TUESDAY, APRIL 9, 1985
PLACE:   NE43-512A


Database Theory and Computing with Equations

Paris C. Kanellakis
MIT


Databases and equational theorem proving are well developed and
seemingly unrelated areas of Computer Science research.
We provide two natural links between these fields and demonstrate
how equational theorem proving can provide useful techniques and tools
for a variety of database tasks.

Our first application is a novel way of formulating database constraints
(dependencies) using equations. Dependency implication, a central computational
problem in database theory, is transformed into reasoning about equations.
Mathematical techniques from universal algebra provide new proof procedures and
better lower bounds for database dependency implication.

Our second application demonstrates that the uniform
word problem for lattices is equivalent to
implication of dependencies expressing transitive closure together with
functional dependencies
(functional dependencies were the first
and most widely studied database constraints).
This natural generalization of functional dependencies, which is
not expressible using conventional database theory formulations, has
an efficient decision procedure and a natural inference system.
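
For readers unfamiliar with the terminology, the following minimal
Python sketch shows what implication of functional dependencies
amounts to, using the standard textbook attribute-closure test.  It is
only background, not the equational proof procedure of the talk, and
the dependencies in it are invented.

def closure(attrs, fds):
    """Closure of the attribute set `attrs` under the functional
    dependencies `fds` (a list of (lhs, rhs) attribute strings)."""
    result = set(attrs)
    changed = True
    while changed:
        changed = False
        for lhs, rhs in fds:
            if set(lhs) <= result and not set(rhs) <= result:
                result |= set(rhs)
                changed = True
    return result

def implies(fds, fd):
    """Do the dependencies `fds` imply the single dependency `fd`?"""
    lhs, rhs = fd
    return set(rhs) <= closure(lhs, fds)

# {A -> B, B -> C} implies A -> C but not C -> A.
fds = [("A", "B"), ("B", "C")]
print(implies(fds, ("A", "C")))   # True
print(implies(fds, ("C", "A")))   # False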

This is joint work with Stavros S. Cosmadakis.


HOST:  Prof. Guttag

------------------------------

Date: Wed, 10 Apr 85 13:04:45 est
From: Voula Vanneli <voula%toronto.csnet@csnet-relay.arpa>
Subject: Seminar - Processing Uncertain Knowledge (UToronto)


                           UNIVERSITY OF TORONTO
                     DEPARTMENT OF COMPUTER SCIENCE
         (SF = Sandford Fleming Building, 10 King's College Rd.)

ARTIFICIAL INTELLIGENCE SEMINAR -  Monday, April 15, 11 a.m.,
SF 1105
                      Harry Stephanou
                           Texas



               Processing Uncertain Knowledge

     The methodology described in this talk is motivated  by
the  need to design knowledge based systems for applications
that involve: (i)  subjective  and/or  incomplete  knowledge
contributed  by multiple domain experts, and (ii) inaccurate
and/or incomplete data  collected  from  different  measure-
ments. The talk consists of two parts.

     In the first part, we present a quantitative  criterion
for measuring the effectiveness of the consensus obtained by
pooling evidence from two knowledge sources.  After a  brief
review  of the Dempster-Shafer theory of evidence, we intro-
duce a set-theoretic generalization  of  entropy.   We  then
prove  that  the  pooling  of  evidence  by  Dempster's rule
decreases the total entropy of the  sources,  and  therefore
focuses their knowledge.
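
As background for the first part, here is a small Python sketch of
Dempster's rule of combination for two basic probability assignments
over a common frame of discernment.  The frame and the numbers are
invented for illustration, and the entropy measure developed in the
talk is not reproduced.

from itertools import product

def combine(m1, m2):
    """Pool two basic probability assignments (focal set -> mass) with
    Dempster's rule, renormalizing away the conflicting mass."""
    pooled = {}
    conflict = 0.0
    for (b, p), (c, q) in product(m1.items(), m2.items()):
        inter = b & c
        if inter:
            pooled[inter] = pooled.get(inter, 0.0) + p * q
        else:
            conflict += p * q          # mass falling on the empty set
    return {a: v / (1.0 - conflict) for a, v in pooled.items()}

# Two sources of evidence over the frame {flu, cold, allergy}:
frame = frozenset({"flu", "cold", "allergy"})
m1 = {frozenset({"flu", "cold"}): 0.8, frame: 0.2}
m2 = {frozenset({"flu"}): 0.6, frame: 0.4}
for focal, mass in combine(m1, m2).items():
    print(sorted(focal), round(mass, 2))

Note that the pooled assignment concentrates its mass on smaller focal
sets than either source alone, which is the intuition behind the
entropy-decrease result claimed above.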

     In the second part of the  talk,  we  present  a  fuzzy
classification  algorithm  that can utilize a limited number
of unreliable training samples, or prototypes.  We then pro-
pose the extension of this algorithm to a reasoning by anal-
ogy scheme in which decisions are based on the  "similarity"
of  the  observed evidence to prototypical situations stored
in the knowledge base.  The measure of similarity relies  on
a set-theoretic generalization of cross-entropy.

------------------------------

Date: Mon 8 Apr 85 14:21:11-PST
From: Susan Gere <M.SUSAN@SU-SIERRA.ARPA>
Subject: Seminar - Compact LISP Machine (SU)

                        EE380/CS310
                Seminar on Computer Systems


Title:     Compact LISP Machine

Speaker:   Steven D. Krueger
           Texas Instruments, Dallas

Time:   Wednesday, April 10  at 4:15 p.m.

Place:  Terman Auditorium

The Compact LISP Machine (CLM) development program is the first of
several Defense Advanced Research Projects Agency (DARPA) programs
intended to provide embedded symbolic computing capabilities for
government applications.  Developed under one of many contracts funded
by the Strategic Computing Program, the CLM will provide a symbolic
computing capability for inserting artificial intelligence (AI) and
robotics technology into a wide range of applications.

The heart of the CLM system is a high speed 32-bit VLSI LISP processor
chip, built using high speed CMOS technology.  It is based on the
architecture of the Explorer LISP machine from Texas Instruments, which
is based on the CADR LISP Machine from MIT and LISP Machines
Incorporated (LMI).

The CLM system consists of four types of modules: Processor,
Cache/Mapper, 4MB Memory, and Bus Interface. The Processor, Memory,
and Bus Interface modules communicate over a high-performance 32-bit
multi-master NuBus(TM) system bus.

Some motivation will be given for adopting a special architecture for
symbolic processing.  The basic architectures of the CLM processor and
the Explorer processor will then be reviewed, followed by descriptions
of the NuBus system bus and the CLM system modules.

------------------------------

Date: 4 April 1985 1155-EST
From: Theona Stefanis@CMU-CS-A
Subject: Seminar - How Processes Learn (CMU)

        Name:   PS Seminar - J. Misra, The University of Texas/Austin
        Date:   10 April (Wed.)
        Time:   3:00-4:30
        Place:  4605 WeH
        Topic:  How Processes Learn


A key feature of distributed systems is that each component process
has access to its own state but not to the states of other processes.
Any nontrivial distributed algorithm requires that some processes
learn about the states of others.  To study such issues, we introduce
the notion of isomorphism among computations:  two computations are
isomorphic with respect to a process if the process can't tell them
apart.  We show that isomorphisms can be used to define and study
learning by processes.  We give a precise characterization of minimum
information flow for achieving certain desired goals.  As an example
we show that there is no algorithm to detect termination of an
underlying computation using a bounded number of overhead messages.

This talk assumes no previous background in distributed systems.

------------------------------

Date: 09 Apr 85  1321 PST
From: Marianne Siroker <MAS@SU-AI.ARPA>
Subject: Seminar - Space Modeling for Robot Navigation (SU)

                         SPECIAL ROBOTICS SEMINAR


      On Wednesday, April 10, Raja Chatila from France will speak on


          Consistent Space Modeling for Mobile Robot Navigation

Time: 4:15 PM                                   Place:  Rm 352 MJH

In order to understand its environment, a mobile robot should be able
to model it consistently and to locate itself within it correctly.
One major difficulty is coping with the inaccuracies introduced by the
sensors.

The approach presented here copes with this problem by relying on
general principles for dealing with uncertainty: the use of a
multisensory system; favoring the data collected by the more accurate
sensor in a given situation; averaging different but consistent
measurements of the same entity, weighted by their associated
uncertainties; and a methodology that enables the robot to define its
own reference landmarks while exploring and modeling its environment.

These ideas are presented together with an example of their application on
the mobile robot HILARE.
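
One of the principles above, averaging consistent measurements
weighted by their uncertainties, can be illustrated with the usual
inverse-variance fusion of two scalar estimates.  The Python sketch
below is generic and its numbers are invented; it is not code from
HILARE.

def fuse(x1, var1, x2, var2):
    """Combine two consistent measurements of the same quantity,
    weighting each by the inverse of its variance."""
    w1, w2 = 1.0 / var1, 1.0 / var2
    x = (w1 * x1 + w2 * x2) / (w1 + w2)
    var = 1.0 / (w1 + w2)      # the fused estimate is never less certain
    return x, var

# Range to a wall: laser says 2.00 m (sigma 2 cm), sonar says 2.10 m
# (sigma 10 cm).  The fused value stays close to the more accurate sensor.
x, var = fuse(2.00, 0.02 ** 2, 2.10, 0.10 ** 2)
print(round(x, 3), round(var ** 0.5, 3))    # approx. 2.004 m, sigma 0.02 m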

------------------------------

Date: Mon, 15 Apr 85 14:15 EST
From: Tim Finin <Tim%upenn.csnet@csnet-relay.arpa>
Subject: Seminar - Pictorial Explanations (UPenn)


AUTOMATING THE CREATION OF PICTORIAL EXPLANATIONS

 Steve Feiner (Brown Univ)

 3pm Monday April 15th, 337 Towne Building
 Computer and Information Science, University of Pennsylvania

     The APEX [Automated Pictorial EXplanations] project has as its
     long-term goal the realtime computer generation of effective pictorial
     and textual explanations. Our current research has concentrated on the
     automated creation of pictures that depict the performance of physical
     actions, such as turning or pushing, on objects.

     We are constructing a test-bed system that generates pictures of
     actions performed by a problem solver. Our system supports rules for
     determining automatically the objects to be shown in a picture, the
     style and level of detail with which they should be rendered, the
     method by which the action itself should be indicated, and the
     picture's viewing specifications. A picture crystallizes about a small
     set of objects inferred from the nature of the action being depicted.
     Additional objects and detail are added when it is determined that they
     help disambiguate an object from others with which it may be confused.

------------------------------

Date: Wed, 17 Apr 85 12:48 EST
From: Huhns <huhns%scarolina.csnet@csnet-relay.arpa>
Subject: Seminar - Distributed Knowledge-Based Learning (USCarolina)


                           CENTER FOR MACHINE INTELLIGENCE
                            University of South Carolina

                    A DISTRIBUTED KNOWLEDGE-BASED LEARNING SYSTEM
                              FOR INFORMATION RETRIEVAL

          Speaker:  Uttam Mukhopadhyay

          Date:  Wednesday, April 17, 1985,  3:00 p.m.
          Location:  Room 230, Engineering Building

               MINDS (Multiple Intelligent  Node  Document  Servers)  is  a
          distributed   system   of   knowledge-based   query  engines  for
          efficiently  retrieving  multimedia  documents   in   an   office
          environment  of  distributed  workstations.  By learning document
          distribution patterns, as well as user interests and  preferences
          during  system  usage, it customizes document retrievals for each
          user.

               In this talk we discuss the implementation  of  a  two-layer
          learning  testbed  for  studying  plausible  heuristics.   In the
          simulated    environment,    document    distribution    patterns
          (object-level  concepts)  used by the query engine are learned at
          the lower level with the help of heuristics for assigning  credit
          and recommending adjustments;  these heuristics are incrementally
          refined at the upper level with the help of meta-heuristics.

------------------------------

Date: Thu 25 Apr 85 16:35:45-PST
From: Elliott Levinthal <LEVINTHAL@SU-SIERRA.ARPA>
Subject: Seminar - DART: An Automated Diagnostician (SU)


Professor Michael Genesereth will be the featured speaker at the
Seminar on May 1st.  Time is 2:15, in Terman Room 217.

"DART: An Automated Diagnostician for Equipment Failures"
    Michael R. Genesereth
     Logic Group
     Knowledge Systems Laboratory, Stanford


This talk describes a device-independent diagnostic program called
DART.  DART differs from previous approaches to diagnosis taken in
the Artificial Intelligence community in that it works directly
from design descriptions rather than MYCIN-like symptom-fault
rules.  DART differs from previous approaches to diagnosis taken in
the design-automation community in that it is more general and in
many cases more efficient.  DART uses a device-independent language
for describing devices and a device-independent inference procedure
for diagnosis.  The resulting generality allows it to be applied to
a wide class of devices ranging from digital logic to nuclear reactors.
Although this generality engenders some computational overhead on
small problems, it facilitates the use of multiple design descriptions
and thereby makes possible combinatoric savings that more than offset
this overhead on problems of realistic size.
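
To make the contrast with symptom-fault rules concrete, the following
toy Python sketch diagnoses a two-gate circuit directly from its
design description under a single-fault assumption.  It is a generic
illustration of model-based diagnosis, not DART's device language or
inference procedure, and the circuit is invented.

GATES = {                  # design description: gate -> (behavior, inputs)
    "g1": (lambda a, b: a & b, ("a", "b")),     # g1 = AND(a, b)
    "g2": (lambda x, c: x | c, ("g1", "c")),    # g2 = OR(g1, c), the output
}

def simulate(inputs, faulty=None, forced=None):
    """Predict every signal; a faulty gate's output is set to `forced`."""
    values = dict(inputs)
    for name, (behavior, args) in GATES.items():  # listed in topological order
        if name == faulty:
            values[name] = forced
        else:
            values[name] = behavior(*(values[a] for a in args))
    return values

def diagnose(inputs, observed):
    """Gates whose misbehavior alone could explain the observed output."""
    return [g for g in GATES
            if any(simulate(inputs, g, v)["g2"] == observed for v in (0, 1))]

inputs = {"a": 1, "b": 1, "c": 0}
print(simulate(inputs)["g2"])    # design predicts 1
print(diagnose(inputs, 0))       # observing 0 implicates ['g1', 'g2']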

------------------------------

Date: 26 Apr 1985  16:05 EST (Fri)
From: "Daniel S. Weld" <WELD%MIT-OZ@MIT-MC.ARPA>
Subject: Seminar - Abstraction and Classification in NIKL (MIT)


ABSTRACTION AND CLASSIFICATION IN A HYBRID REPRESENTATION SYSTEM

                           Marc Vilain
                        BBN Laboratories
                          Cambridge, MA

   Hybrid architectures have been used in several recent knowledge
representation systems.  In this talk, I will explore several hybrid
representation architectures, focusing particularly on the architecture
of the KL-TWO system.  KL-TWO is built around a propositional reasoner
called PENNI (a descendant of RUP) and a terminological reasoner called
NIKL (a descendant of KL-ONE).

   I will show how NIKL can be interfaced to PENNI so as to augment
PENNI's propositional language with a limited form of quantification.
This interface relies crucially on two operations that follow naturally
from the KL-TWO architecture: abstraction and classification.  I will
describe these operations, and discuss how their generality might extend
beyond the scope of KL-TWO.
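
As a rough idea of what "classification" means in a terminological
reasoner, here is a toy Python sketch in which concepts are modeled as
sets of primitive features and subsumption is simply feature
inclusion.  NIKL's concept language and classifier are far richer than
this; the taxonomy below is invented purely for illustration.

TAXONOMY = {                       # concept -> primitive features required
    "thing":    set(),
    "person":   {"animate", "human"},
    "employee": {"animate", "human", "employed"},
    "manager":  {"animate", "human", "employed", "manages-others"},
}

def subsumes(general, specific):
    """A concept subsumes another when every feature it requires is
    also required by the more specific concept."""
    return general <= specific

def classify(features):
    """Most specific concepts in the taxonomy subsuming a new concept
    defined by `features`."""
    subsumers = [c for c, f in TAXONOMY.items() if subsumes(f, features)]
    return [c for c in subsumers
            if not any(o != c and TAXONOMY[c] < TAXONOMY[o]
                       for o in subsumers)]

# A new concept defined by these features classifies under "manager".
print(classify({"animate", "human", "employed", "manages-others",
                "works-nights"}))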

        Tuesday, April 30; 4:00pm; 8th Floor Playroom

------------------------------

Date: 24 Apr 1985 12:24-EST
From: AHAAS at BBNG.ARPA
Subject: Seminar - Evidential Reasoning in Semantic Networks (BBN)

           [Forwarded from the MIT bboard by SASW@MIT-MC.]

  Another BBN AI Seminar: Lokendra Shastri of U. of Rochester
will talk on Friday April 26 in room 5/143 (near the travel
office).

              Evidential Reasoning in Semantic Networks
           A Formal Theory and its Parallel Implementation

   The talk presents an evidential approach to knowledge
representation and inference wherein the principle of maximum
entropy is applied to deal with uncertainty and incompleteness.
It focuses on a representation language that is an evidential
extension of semantic networks, and develops a formal theory of
inheritance and recognition within this language.  The theory
applies to a limited, but interesting, class of inheritance and
recognition problems, including those that involve exceptions,
multiple hierarchies, and conflicting information.  The resulting
theory may be implemented as an interpreter-free, massively
parallel network made up of highly interconnected but extremely
simple computing elements.  The network can solve inheritance and
recognition problems in time proportional to the depth of the
conceptual hierarchy.

------------------------------

End of AIList Digest
********************