[comp.doc.techreports] tr-input/seki

leff@smu.UUCP (Laurence Leff) (02/22/89)

University of Kaiserslautern
SEKI:  List of Abstracts, Version of 17.2.1989

These reports are available from

Rolf Socher-Ambrosius
Fachbereich Informatik
Universitaet Kaiserslautern
Postfach 3049
D-6750 Kaiserslautern
West Germany

(Please send no advance payment; send a cheque
in DM only after having received an invoice)

SEKI REPORTS 1988
-----------------

SEKI REPORT SR-88-01   (SFB*)   20 pages, 2.- DM
Knut Hinkelmann, Klaus  Noekel, Robert Rehbold:
	SASLOG: Lazy Evaluation Meets Backtracking

ABSTRACT: We describe  a  combined  functional/logic  programming
language  SASLOG  which  contains  Turner's  SASL,  a fully lazy,
higher-order functional language, and pure Prolog as subsets. Our
integration is symmetric, i.e. functional terms can appear in the
logic part of the program and vice versa. Exploiting the natural
correspondence  between  backtracking  and lazy streams yields an
elegant solution to the problem of transferring alternative vari-
able  bindings  to the calling functional part of the program. We
replace the rewriting approach to function evaluation by combina-
tor  graph  reduction, thereby regaining computational efficiency
and the structure sharing properties.  Our  solution  is  equally
well  suited  to a fixed combinator set and to a super combinator
implementation. In the paper we  use  Turner's  fixed  combinator
set.
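The correspondence between backtracking and lazy streams that the abstract exploits can be sketched outside SASLOG. The following Python fragment is an illustrative analogy only (not SASLOG code; the `member` relation and the query are invented for the example): each alternative variable binding is yielded lazily, so the consuming "functional" code forces only as many alternatives as it actually needs.

```python
# Illustrative sketch: alternative solutions of a logic-style query
# are produced as a lazy stream via a generator, so the functional
# caller consumes bindings on demand and "backtracking" happens
# implicitly when the next element of the stream is forced.

def member(x_var, items):
    """Nondeterministically bind x_var to each element of items."""
    for item in items:
        yield {x_var: item}        # each yielded dict is one binding

def solutions():
    # query: member(X, [1,2,3,4]) AND X is even
    for binding in member("X", [1, 2, 3, 4]):
        if binding["X"] % 2 == 0:
            yield binding

# Taking only the first answer never forces later alternatives.
first = next(solutions())
all_even = [b["X"] for b in solutions()]
```

Because the stream is lazy, a caller that needs one solution pays only for one; enumerating the rest re-drives the search exactly as backtracking would.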

SEKI REPORT  SR-88-02  (SFB)      33  pages,  2.- DM
Joachim Steinbach:      Comparison of Simplification Orderings

ABSTRACT: Termination is an important property of term  rewriting
systems.  Simplification orderings are often used which guarantee
termination. We describe the basic ideas of comparing  terms  and
present  the formal definitions and some examples of the most po-
pular simplification orderings. A new definition of one  of  them
is  given which is simpler than the original and therefore better
to handle.  Furthermore, we complete the comparison found in  the
literature,  i.e.  we  mark off the power (the sets of comparable
terms) of the orderings.

SEKI   REPORT   SR-88-03      10   pages,    1.- DM
Stanislaw Krajewski:    Did Goedel Prove that We Are Not Machines?
               (On Philosophical Consequences of Goedel's Theorem)

ABSTRACT: Goedel's  incompleteness  theorem  has  been  the  most
famous  example  of a mathematical theorem from which deep philo-
sophical consequences follow. They are said to give  an  insight,
first,  into the nature of mathematics, and more generally of hu-
man knowledge, and second, into the nature of the mind. The limi-
tations  of  logistic or formalist programmes of mathematics have
had a clear significance against the background  of  the  founda-
tional  schools of the early decades of this century. The limita-
tions of mechanism, or of the vision underlying research  in  the
field  of  Artificial  Intelligence,  gain significance only now.
Yet, while the limitations imposed by Goedel's theorem  upon  the
extent  of  formal  methods seem unquestionable they seem to have
very little to say about the restrictions concerning mathematical
or computer practice. And the alleged consequences concerning the
non-mechanical character of the human mind are questionable.  The
standard  reasoning, known as Lucas' argument, begs the question,
and actually implies that Lucas is inconsistent!

SEKI  REPORT  SR-88-04    (SFB)    12  pages,   1.- DM
Harold Boley: Iconic-Declarative Programming and Adaptation Rules

ABSTRACT: Functional and  logical  languages  permit  declarative
programming,  i.e.  executable high-level problem specifications.
However, to obtain optimum intelligibility, programs should be
iconic, i.e. directly model their domain, as illustrated imperatively
by object-oriented languages. After a discussion  of  iconic  as-
pects  in  pure  LISP (call nestings) and pure PROLOG (invocation
patterns), an advanced iconic-declarative technique in the  func-
tional/  logical  AI  language FIT is presented: adaptation rules
join left-hand side invocation patterns and right-hand side  call
nestings  of  transformation rules into single pictorial contexts
that process data by global tests and  direct  local  transforma-
tions.  Their  utility is exemplified in three domains (list pro-
cessing, set normalization, and graph searching). The conclusions
criticize  the  predominance of transformation rules and contrast
iconic-imperative with iconic-declarative developments.

SEKI REPORT  SR-88-05     29  pages,  2.-  DM
Horst Gerlach: A Representation for the Non-Instances of Linear Terms

ABSTRACT: not provided, available on request

SEKI REPORT SR-88-06   (SFB)    54  pages,  3.- DM
Hans-Juergen Ohlbach, Joerg Siekmann:
Using Automated Reasoning Techniques for Deductive Databases

ABSTRACT: This report presents a proposal for  a  deduction  com-
ponent that supports the query mechanism of relational databases.
The query-subquery (QSQ) paradigm is currently  very  popular  in
the  database community since it focuses the deduction process on
the relevant data. We show how to extend the  QSQ  paradigm  from
Horn clauses to arbitrary predicate logic formulae such that dis-
junctions in the consequence of an implication, negation  in  its
logical  meaning,  and arbitrary recursive predicates can be han-
dled without restrictions.  Various  techniques  to  improve  the
search  behaviour, such as lemma generation, query generalization
etc.  can be incorporated. Furthermore we show how to use  clause
graphs  for  compile time optimizations in the presence of recur-
sive clauses and to support the run time processing.

SEKI REPORT SR-88-07   (PH.D. THESIS, SFB)   183  pages,  15.- DM
Norbert Eisinger:  Completeness, Confluence, and Related
                   Properties of Clause Graph Resolution

ABSTRACT: Clause graph (or connection graph) resolution  was  in-
vented by Robert Kowalski in 1975. Its behaviour differs signifi-
cantly from that of traditional resolution in clause sets.  Stan-
dard  notions  like  completeness do not adequately cover the new
phenomena and problems introduced by clause graph resolution, and
standard  proof  techniques  do not work for clause graphs, which
are the major reasons why important questions have been open  for
years.  This  thesis  defines  a series of relevant properties in
precise terms and answers several  of  the  open  questions.  The
clause  graph inference system is refutation complete and refuta-
tion confluent if the insertion of a single additional variant of
a  clause into the graph is allowed. Compared to clause set reso-
lution, clause graph resolution does not increase the  complexity
of simplest refutations, again provided that one additional vari-
ant may be inserted.  A number of  well-known  restriction  stra-
tegies  are refutation complete, but most are not refutation con-
fluent for clause graph resolution, which renders  them  useless.
Exhaustive  ordering  strategies  do  not exist and contrary to a
wide-spread conjecture the plausible  approximations  of  exhaus-
tiveness do not ensure the detection of a refutation.

SEKI REPORT SR-88-08   (PH.D. THESIS)   115 pages, 15.- DM
Hans-Juergen Ohlbach:   A Resolution  Calculus for Modal  Logics

ABSTRACT: A resolution calculus for the quantified versions of the
modal logics K, T, K4, KB, S4, S5, B, D, D4, and DB is presented.
It presupposes a syntax transformation, similar to the skolemiza-
tion  in predicate logic that eliminates the modal operators from
modal logic formulae and shifts the modal context information  to
the  term  level.  The  formulae in the transformed syntax can be
translated into conjunctive normal form such that a clause  based
modal  resolution  calculus  is  definable without any additional
inference rule, but with special  modal  unification  algorithms.
The  method  can  be applied to first-order modal logics with the
two operators [] (necessary) and <> (possible) and with  standard
constant-domain possible worlds semantics, where the accessibili-
ty relation may have any combination of the following properties:
reflexivity,  symmetry, transitivity, seriality or non-seriality.
While extensions to other systems seem possible,  they  have  not
yet been investigated.

SEKI-REPORT SR-88-10    (PH.D. THESIS)  15.- DM
Manfred Schmidt-Schauss: Computational Aspects of an Order-Sorted
			 Logic With Term Declarations

ABSTRACT: In this thesis I investigate the logical foundations of
a very general order-sorted logic. This sorted logic extends usu-
al first order logic by a partially ordered set  of  sorts,  such
that every term is of a particular sort or type; in addition,
there is a mechanism to define the sort of terms using term de-
clarations.  Syntax and semantics of this order-sorted logic with
declarations are  defined  in  a  natural  way.   Unification  in
order-sorted logics with term declarations is undecidable and in-
finitary, i.e., minimal complete sets of unifiers may  be  infin-
ite.   However,  under the restriction that declarations are only
of the form f:S1x...xSn -> S and that the signature  is  regular,
unification  is  decidable  and minimal complete sets of unifiers
exist and are always finite. Furthermore there exists a signature
of this form such that unification is NP-complete. If there is no
equality predicate in the logic we use resolution  and  factoring
as inference rules, where the unification algorithm is adapted to
the sort-structure.  The  corresponding  calculus  is  refutation
complete. If there is an equality predicate and all equational
literals are in unit clauses, we use a special E-unification  al-
gorithm  and  show that under some restrictions such an algorithm
can be constructed from  an  unsorted  unification  algorithm  by
postprocessing  the  set  of unifiers. If arbitrary equations are
admissible, we use paramodulation as additional inference rule or
replace  resolution  by  the  E-resolution rule. An algorithm for
transforming unary predicates into  sorts  is  presented.  It  is
shown  that this algorithm is correct and complete under sensible
restrictions.  Usually, the  algorithm  may  require  exponential
time,  however, in the special case of Horn clauses the algorithm
can be performed in polynomial time.  We  also  investigate  term
rewriting systems in an order-sorted logic and extend the conflu-
ence criterion that is based on critical pairs by  critical  sort
relations.
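The decidable restricted case mentioned above (declarations only of the form f:S1x...xSn -> S over a regular signature) rests on a simple admissibility check: a variable of sort S may only be bound to a term whose sort is a subsort of S. The following Python fragment is a minimal sketch under an invented two-sort signature (nat as a subsort of int); the term representation and signature are assumptions for illustration, not the thesis's definitions.

```python
# Sketch of the order-sorted binding check. Variables are triples
# ("var", name, sort); function terms are (symbol, args) with the
# result sort given by a declaration f: S1 x ... x Sn -> S.

SUBSORT = {("nat", "int"), ("nat", "nat"), ("int", "int")}  # reflexive closure
RESULT_SORT = {"succ": "nat", "neg": "int"}  # result sorts of the declarations

def sort_of(term):
    if term[0] == "var":
        return term[2]               # a variable carries its declared sort
    return RESULT_SORT[term[0]]      # a function term has its declared result sort

def may_bind(var, term):
    """Binding X <- t is admissible iff sort(t) is a subsort of sort(X)."""
    return (sort_of(term), sort_of(var)) in SUBSORT

x_int = ("var", "X", "int")
y_nat = ("var", "Y", "nat")
t_nat = ("succ", [("var", "Z", "nat")])   # result sort nat
t_int = ("neg", [x_int])                  # result sort int

assert may_bind(x_int, t_nat)       # nat <= int: allowed
assert not may_bind(y_nat, t_int)   # int is not a subsort of nat: rejected
```

In a full order-sorted unification algorithm this check replaces the unconditional variable binding of the unsorted case, which is why regularity of the signature keeps minimal complete sets of unifiers finite.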

SEKI-REPORT SR-88-11  (SFB)     8 pages, free
H.-J. Buerckert et al.: Opening the AC Unification Race

ABSTRACT: This note reports on the implementation of AC-
unification  algorithms, based on the variable-abstraction method
of Stickel and on the  constant-abstraction  method  of  Livesey,
Siekmann, and Herold. We give a set of 105 benchmark examples and
compare execution  times  for  implementations  of  the  two  ap-
proaches.  This  documents for other researchers what we consider
to  be  the  state-of-the-art  performance  for  elementary   AC-
unification problems.

SEKI  REPORT  SR-88-12   (SFB)     57  pages,  3.- DM
Joachim Steinbach:      Term Orderings With Status

ABSTRACT: The effective calculation with term  rewriting  systems
presumes  termination.  Orderings  on terms are able to guarantee
termination. This report deals with some of those term orderings:
Several  path  and  decomposition  orderings and the Knuth-Bendix
ordering. We pursue three aims: Firstly, known orderings will get
new  definitions.  In the second place, new ordering methods will
be introduced: We will extend existing orderings  by  adding  the
principle of status. Thirdly, the comparison of the power as well
as the time behaviour of all orderings will  be  presented.  More
precisely, after some preliminary remarks on the termination of
rewrite systems we present the ordering  methods.  All  orderings
are connected by an essential characteristic: Each operator has a
status that determines the order according to which the  subterms
are  compared.   We  will present the following well-known order-
ings: The recursive path ordering with status, the path  of  sub-
terms  ordering  and another path ordering with status. A new re-
cursive decomposition ordering with status will lead the  catalo-
gue of orderings introduced here. Moreover, we give a new defini-
tion based on decompositions of the path of subterms ordering. An
extension  by incorporating status to this ordering as well as to
the improved recursive decomposition ordering will be a  part  of
the paper. All orderings based on decompositions will be present-
ed in a new and simple style: The decomposition of  a  term  con-
sists  of  terms  only. The original definitions take tuples com-
posed of three (or even four) components. In addition to path
and decomposition orderings, we deal with the weight oriented
ordering and incorporate status.  Finally,  important  properties
(simplification  ordering, stability w.r.t.  substitutions, etc.)
of the newly introduced orderings will be  listed.   Besides  the
introduction  of new orderings, another main point of this report
is the comparison of the power of these orderings, i.e.  we  will
compare  the sets of comparable terms for each combination of two
orderings. It turns out that the new version with status of the
improved recursive decomposition ordering (equivalent to the path
ordering with status) is the most powerful ordering of the  class
of  path and decomposition orderings presented. This ordering and
the Knuth-Bendix ordering with status overlap. The orderings  are
implemented  in our algebraic specification laboratory TRSPEC and
the completion system COMTES. A series of  experiments  has  been
conducted  to  study  the  time  behaviour  of  the orderings. An
evaluation of these chronometries concludes the paper.
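As a concrete illustration of a path ordering with status, here is a minimal recursive path ordering in which every operator is given lexicographic status. This is a textbook-style sketch under an invented precedence (i > * > e), not the TRSPEC/COMTES implementation the report describes.

```python
# Recursive path ordering (RPO), lexicographic status for all operators.
# Terms are (symbol, [args]) tuples; variables are plain strings.

PREC = {"i": 3, "*": 2, "e": 1}      # assumed precedence on operators

def is_var(t):
    return isinstance(t, str)

def occurs(x, t):
    if is_var(t):
        return x == t
    return any(occurs(x, a) for a in t[1])

def gt(s, t):
    """s >_rpo t."""
    if is_var(s):
        return False                       # a variable dominates nothing
    if is_var(t):
        return occurs(t, s)                # s > x iff x occurs in s
    f, g = s[0], t[0]
    # (1) some immediate subterm of s already dominates or equals t
    if any(si == t or gt(si, t) for si in s[1]):
        return True
    # (2) greater precedence: s must dominate every subterm of t
    if PREC[f] > PREC[g]:
        return all(gt(s, tj) for tj in t[1])
    # (3) equal precedence: lexicographic comparison of the arguments,
    #     still requiring s to dominate every subterm of t
    if PREC[f] == PREC[g]:
        return all(gt(s, tj) for tj in t[1]) and lex_gt(s[1], t[1])
    return False

def lex_gt(ss, ts):
    for si, ti in zip(ss, ts):
        if si != ti:
            return gt(si, ti)
    return len(ss) > len(ts)

# Example: with i > *, RPO orients i(x*y) -> i(y)*i(x) left-to-right.
lhs = ("i", [("*", ["x", "y"])])
rhs = ("*", [("i", ["y"]), ("i", ["x"])])
assert gt(lhs, rhs) and not gt(rhs, lhs)
```

Changing the status of an operator to multiset (comparing argument multisets instead of the lexicographic pass) is exactly the kind of variation the report's "orderings with status" framework covers.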

SEKI REPORT SR-88-13  (SFB)     23 pages, 2.- DM
J.Mueller, R.Socher-Ambrosius: Topics in Completion Theorem Proving

ABSTRACT: Completion Theorem Proving is based on the  observation
that proving a formula is equivalent to solving a system of equa-
tions over a boolean polynomial ring. The latter  can  be  accom-
plished  by means of the completion of a set of rewrite rules ob-
tained from the equational system. This report deals  with  three
problems    concerning   Completion   Theorem   Provers   (CTPs):
Are  multiple overlaps necessary  for the  completeness  of CTPs?
How  can  simplification be  interpreted in terms  of  resolution
inference rules? How can polynomials efficiently  be represented?
The answer to the first question is "no".
Moreover, it is shown in this report that the removal of
multiple overlap steps does not increase the length of completion
refutations. This amounts to an extension  of  Dietrich's  (1986)
result concerning the correspondence of completion proofs without
simplification for Horn Logic to resolution  proofs.   Concerning
the   second   question   we   show  that  simplification  either
corresponds to resolution steps using matching instead of  unifi-
cation or to subsumption deletion. Changing the point of view, we
can also identify reduction rules  in  resolution  based  theorem
provers  to  be part of CTPs. We show that tautology elimination,
merging and the subsumption rule are performed  in  CTPs  without
being  explicitly defined as inference rules. In the last section
we deal with the technical question of choosing a data structure
well suited for an efficient implementation of CTPs.

SEKI  REPORT   SR-88-14    8   pages,   free
Manfred Schmidt-Schauss:     Subsumption in KL-ONE is Undecidable

ABSTRACT: It is shown that in the frame-based language KL-ONE  it
is  undecidable,  whether one concept is subsumed by another con-
cept. The concept forming operators which are sufficient for this
result are: conjunction, value restriction, role restriction, the
operator `SOME', and role value maps using only `='. In  particu-
lar, number restrictions are not used. This shows that there is a
basic difference between feature terms and KL-ONE, since the com-
plexity of subsumption switches from co-NP-complete to undecid-
able, if the restriction is dropped that roles are functional.

SEKI REPORT SR-88-15  (SFB)     20 pages, 2.- DM
B.Gramlich, J.Denzinger: AC-Matching Using Constraint Propagation

ABSTRACT: We present a new  approach  to  associative-commutative
(AC-)  pattern  matching  using  a technique of global constraint
propagation which in many cases enables  us  to  drastically  cut
down  the search space for solutions. Given a conjunction of sim-
plified non-trivial AC-matching problems the  main  idea  of  our
method  consists in deducing and propagating constraints for pos-
sible variable assignments from all problems instead of  choosing
only  one problem for processing next. Thus many failure branches
of the search tree for solutions can  be  cut  without  expanding
them.  The  control  strategy for branching is designed such that
nodes with a small branching degree are preferred.  Incorporating
some  additional optimizations we get a new AC-matching algorithm
which is especially well-suited for certain  non-linear  patterns
and  for a systematic early detection of unsolvability.  Moreover
we point out the advantages of using unique ordered normal  forms
for  AC-equivalent terms based on a special class of orderings on
the function symbols and the variables. Finally we sketch how our
constraint  propagation  approach for AC-matching can be general-
ized to other kinds of E-matching and E-unification problems.

SEKI  REPORT  SR-88-16   (SFB)     8  pages,  free
Rolf Socher-Ambrosius:
Using Theory Resolution to Simplify Interpreted Formulae

ABSTRACT: Loveland & Shostak (1980) gave an  algorithm  for  con-
verting a decision procedure for ground formulae in a first-order
theory to a simplifier for such formulae. The algorithm  is  sup-
posed  to  produce a simplest clause set among all formulae built
from atoms of the original formula. In this  paper  it  is  shown
that  this  method  does  not meet the requirement of producing a
simplest semantic equivalent. Furthermore  we  show  that  theory
resolution  can  be used to extend Quine's method to an algorithm
that really accomplishes the task of simplifying interpreted for-
mulae.

SEKI   REPORT   SR-88-17    (SFB)     5   pages,    free
Klaus Noekel:   Convex Relations Between Time Intervals

ABSTRACT: We describe a fragment of Allen's full algebra of  time
interval relations (the algebra of convex relations) that is use-
ful for describing the dynamic  behavior  of  technical  systems.
After an intuitive description of the fragment we give two formal
definitions and prove that they are equivalent. This provides the
basis  for  the major result of the paper: in a time net in which
all interval relations are convex the test for  the  global  con-
sistency  of  the edge labelling can be carried out in polynomial
time (in the general case it is NP-complete). This  result  makes
convex interval relations an attractive candidate wherever qual-
itative reasoning about technical systems  requires  testing  for
global instead of local inconsistency.

SEKI  REPORT  SR-88-18   (SFB)     20  pages,  2.- DM
Bernhard Gramlich: Unification of Term Schemes: Theory and Application

ABSTRACT: We investigate the solvability of certain infinite sets
of  first order unification problems represented by term schemes.
Within the  framework  of  second  order  equational  logic  this
amounts  exactly  to  solving  (variable-) restricted unification
problems. A method known for solving first order restricted unif-
ication  problems is generalized to the second order case. Essen-
tially this is achieved by transforming a restricted  unification
problem  into  an  unrestricted  one,  solving the latter and re-
transforming the solutions obtained. Finally the results are  ap-
plied to provide sufficient conditions for a property of "repeat-
ed unifiability" which in turn is crucial  for  the  analysis  of
divergence  of  completion procedures for term rewriting systems.
Although the study of  divergent  completion  behaviour  was  the
starting  point  for  the work presented the results obtained are
not only applicable to divergence analysis but may be useful  for
other applications, too.

SEKI  REPORT  SR-88-19   (SFB)     29  pages,  2.- DM
Christoph Lingenfelder: Structuring Computer Generated Proofs

ABSTRACT: One of the main  disadvantages  of  computer  generated
proofs  of  mathematical  theorems  is  their  complexity and in-
comprehensibility.  Proof  transformation  procedures  have  been
designed  in  order  to state these proofs in a formalism that is
more familiar to a human mathematician.  But usually  the  essen-
tial  idea of a proof is still not easily visible.  We describe a
procedure to transform proofs represented as abstract  refutation
graphs  into  natural deduction proofs. During this process topo-
logical properties of the refutation graphs can  be  successfully
exploited in order to obtain structured proofs. It is also possi-
ble to remove trivial steps from the proof formulation.

SEKI REPORT SR-88-20  (SFB)     10 pages, 1.-  DM
Klaus Noekel: Temporal  Matching: Recognizing Dynamic
		Situations from Discrete Measurements

ABSTRACT: The converse problem of measurement  interpretation  is
event  recognition.  In  situations  which are characterized by a
specific order of events, a single snapshot is not sufficient  to
recognize  an  event.   Instead one has to plan a measurement se-
quence that consists of observations at more than one time point.
In this paper we present an algorithm for planning such an obser-
vation sequence based on the specification of the event and  dis-
cuss  the problem of giving a meaningful definition of a success-
ful match of a measurement sequence against a situation  descrip-
tion.

SEKI REPORT SR-88-21    29 pages, 2.- DM
Manfred Schmidt-Schauss &  Gerd Smolka:
Attributive Concept Descriptions with Unions and Complements

ABSTRACT: This paper investigates the consequences of adding  un-
ions  and complements to the attributive concept descriptions em-
ployed in KL-ONE-like knowledge representation languages.  It  is
shown  that deciding consistency and subsumption of such descrip-
tions are PSPACE-complete  problems  that  can  be  decided  with
linear space.

SEKI REPORT SR-88-22    18 pages, 1.- DM
Harold  Boley:  Expert System  Shells: Very-high-level Languages
		for Artificial Intelligence

ABSTRACT: Expert-system shells are discussed  as  very-high-level
languages for knowledge engineering. Based on a category/domain
distinction for expert systems, the concept of expert system
shells is explained using seven classifications. A proposal for a
shell-development policy is  sketched.  The  conclusions  express
concern about overemphasis on shell surface.

SEKI REPORT SR-88-23    43 pages, 2.- DM
Manfred Schmidt-Schauss, Joerg  H. Siekmann:
	Unification Algebras: An Axiomatic Approach to Unification,
	Equation Solving and Constraint Solving

ABSTRACT: Traditionally unification is viewed as solving an equa-
tion  in  an  algebra  given  an explicit construction method for
terms and substitutions. We abstract from this explicit term con-
struction methods and give a set of axioms describing unification
algebras that consist of  objects  and  mappings,  where  objects
abstract terms and mappings abstract substitutions. A unification
problem in a given unification algebra is  the  problem  to  find
mappings  for  a system of equations <si=ti>, where si and ti are
objects, such that si and ti are mapped onto the same term.  Typ-
ical  instances  of unification algebras and unification problems
are: Term unification with respect  to  equational  theories  and
sorts,  standard  equation solving in mathematics, unification in
the lambda-calculus, constraint solving, disunification, and  un-
ification  of rational terms.  Within this framework we give gen-
eral purpose unification rules that can be used in every unifica-
tion  algorithm  in  unification algebras.  Furthermore we demon-
strate the use of this framework by investigating the analogue of
syntactic unification and unification of rational terms.



SEKI WORKING PAPERS 1988
------------------------

SEKI WORKING PAPER  SWP-88-01   (SFB)     77  pages,  3.- DM
M. Beetz, H. Freitag,  J. Klug, Christoph Lingenfelder (ed.):
        The MKRP User Manual

ABSTRACT: The current state of development of the  Markgraf  Karl
Refutation  Procedure  (MKRP),  a  theorem  proving  system under
development since 1977  at  the  Universities  of  Karlsruhe  and
Kaiserslautern,  West  Germany, is presented and evaluated in the
sequel. The goal of this project can be summarized by the follow-
ing  three  claims: it is possible to build a theorem prover (TP)
and augment it  by  appropriate  heuristics  and  domain-specific
knowledge such that
    (i) it will display an active and directed  behavior  in  its
        striving  for  a  proof, rather than the passive combina-
	torial search through very large search spaces, which was
	the characteristic behaviour of the TPs of the past. Con-
	sequently
   (ii) it will not generate a search space of many thousands  of
        irrelevant  clauses,  but will find a proof with compara-
	tively few redundant derivation steps.
  (iii) Such  a  TP  will  establish  an  unprecedented  leap  in
        performance  over  previous TPs expressed in terms of the
        difficulty of the theorems it can prove.

SEKI WORKING PAPER SWP-88-03    30 pages, 2.- DM
R. Scheidhauer, G. Seul:
	A Test Environment for Unification Algorithms

ABSTRACT: In this paper we describe TENUA, a Test Environment
for  Unification Algorithms for first order terms. In its essence
the unification problem in first order logic can be expressed  as
follows:  Given  two terms containing some variables, find, if it
exists, the simplest substitution (assignment  of  some  term  to
every variable) which makes the two terms equal. Since Herbrand's
original work in  1930,  unification  has  been  the  subject  of
several research works, mainly settled in the field of artificial
intelligence. The first unification algorithm, introduced by  Ro-
binson 1965, constituted the central step of the resolution prin-
ciple, which is frequently used in theorem proving and in logic
programming languages such as PROLOG. Resolution, however, is not
the only appli-
cation of the unification algorithm. In fact its pattern matching
nature often can be exploited in cases where symbolic expressions
are dealt  with,  for  instance  type  checkers  for  programming
languages  with  a  complex type structure, rewriting systems and
some knowledge representation formalisms in AI.  Because  in  all
these  applications unification constitutes the elementary opera-
tion, its performance crucially affects their global efficiency.
Since the Robinson algorithm has an exponential worst case
complexity, linear (Paterson/Wegman 1978, Escalada/Ghallab 1987)
and almost linear (Martelli/Montanari 1982) algorithms were soon
developed. The choice of an appropriate unification algorithm for
some application is facilitated
by TENUA, a tool  which  allows  comfortable  implementation  and
analysis of unification algorithms. It provides the user with:

Facilities for the implementation of unification algorithms  such
as  an  interface for input and output of terms and substitutions
(including an arity check), and functions  for  term  conversion.
Implemented     unification     algorithms     (Robinson    1965,
Martelli/Montanari 1982, Escalada/Ghallab 1987) giving a  practi-
cal measure of efficiency.

Facilities for the comparison of unification algorithms  such  as
statistical functions and parameterized generators for "standard"
and "random" terms. This allows the user to  produce  term  pairs
appropriate  to  his application and so to test the efficiency of
unification algorithms under "real" conditions. TENUA is implemented
in COMMONLISP on Apollo Domain Workstations. Except for the online
documentation (HELP facility) it is machine independent and
can be loaded in any COMMONLISP environment.
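The unification problem as stated in this abstract (given two terms, find the simplest substitution making them equal) has a compact Robinson-style solution. The following Python sketch is illustrative only, not TENUA code; the term representation (variables as strings, function terms as (symbol, args) tuples) is an assumption made for the example.

```python
# Robinson-style first-order unification with occurs check.
# Variables are plain strings; function terms are (symbol, [args]);
# constants are 0-ary function terms like ("a", []).

def is_var(t):
    return isinstance(t, str)

def walk(t, subst):
    """Follow variable bindings to the current representative."""
    while is_var(t) and t in subst:
        t = subst[t]
    return t

def occurs(var, term, subst):
    term = walk(term, subst)
    if is_var(term):
        return term == var
    return any(occurs(var, a, subst) for a in term[1])

def bind(var, term, subst):
    if occurs(var, term, subst):
        return None                  # occurs check: no cyclic terms
    subst[var] = term
    return subst

def unify(s, t, subst=None):
    """Return a most general unifier as a dict, or None on failure."""
    subst = {} if subst is None else subst
    s, t = walk(s, subst), walk(t, subst)
    if s == t:
        return subst
    if is_var(s):
        return bind(s, t, subst)
    if is_var(t):
        return bind(t, s, subst)
    if s[0] != t[0] or len(s[1]) != len(t[1]):
        return None                  # function symbol or arity clash
    for a, b in zip(s[1], t[1]):
        subst = unify(a, b, subst)
        if subst is None:
            return None
    return subst

# f(X, g(Y)) unifies with f(a, g(X)); both X and Y end up bound to a.
sub = unify(("f", ["X", ("g", ["Y"])]), ("f", [("a", []), ("g", ["X"])]))
assert walk("X", sub) == ("a", []) and walk("Y", sub) == ("a", [])
assert unify("X", ("f", ["X"])) is None   # occurs check rejects X = f(X)
```

This naive algorithm exhibits exactly the exponential worst case mentioned above; the linear and almost-linear algorithms the abstract cites avoid it by sharing subterms instead of copying them.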

SEKI WORKING PAPER SWP-88-04    180 pages, 5.- DM
Simone Gladel-Speicher:	A Comparative Study of the Problems of
			Integrating Diagnostic Expert Systems and
			Deep-Modelling Languages, Using Several
			Existing Systems as Examples
(Diploma Thesis) No abstract provided.

SEKI WORKING PAPER SWP-88-05    97 pages, 3.- DM
Winfried Barth:  ARI: Development of Inference Components for
		 Knowledge-Based Systems Based on Structured
		 Production Rule Systems (Diploma Thesis)

ABSTRACT: The paper describes basic concepts and the  implementa-
tion  of  ARI, a practical tool for the construction of inference
engines for knowledge-based systems. ARI provides  a  representa-
tion  mechanism  based  on  production rules and object-centered,
structured  inheritance  networks.   Additionally,   ARI   offers
knowledge  structures  for  organizing rulebases modularly on the
basis of functional units ("problem solvers") and for  specifying
various control regimes. ARI is presented at three different lev-
els of abstraction. At  the  epistemological  level,  appropriate
inference and control mechanisms can be specified for each "prob-
lem solver". Viewed as functional  units  that  solve  a  generic
class of problems according to their specification, "problem
solvers" are task specific problem solving agents at a conceptual
level. At the implementational level, "problem solvers", inferen-
tial knowledge as well as control knowledge are interpreted effi-
ciently with an extended RETE pattern matching algorithm.

SEKI  WORKING  PAPER  SWP-88-06     91   pages,   3.-   DM
Knut Hinkelmann:   A Functional-Logic Language Integration with
                   Lazy Evaluation and Semantic Unification
(Diploma Thesis)

ABSTRACT:  SASLOG  is  a  combined  functional/logic  programming
language  which  contains  SASL, a fully lazy, higher order func-
tional language and the logic language Prolog. The integration is
symmetric  allowing  functional terms to appear in the logic part
and Prolog goals in the functional part. Exploiting  the  natural
correspondence  between  backtracking  and lazy streams yields an
elegant solution to the problem of transferring alternative vari-
able  bindings to the calling functional part of the program. The
evaluation of functional expressions in the logic part is  driven
by  the  extended  unification algorithm which takes into account
the semantics of function  symbols.  The  rewriting  approach  to
function  evaluation  is  replaced by combinator graph reduction,
thereby regaining  computational  efficiency  and  the  structure
sharing properties. The integration fits well to combinator graph
reduction. So  the  instantiation  of  logic  variables  supports
structure sharing. On the other hand we provide a solution to the
reduction of functional expressions containing logic variables in
different binding environments.

SEKI WORKING PAPER SWP-88-07    114 pages, 5.- DM
Markus Hoehfeld: A Scheme for Constraint-Based Relational
		 Knowledge Bases (Diploma Thesis)

ABSTRACT: Logic programming languages can be seen  as  consisting
of  a  constraint  language, on top of which relations can be de-
fined by means of definite clauses. For example,  Prolog  II  em-
ploys  as constraint language equations and disequations that are
interpreted in the algebra of rational trees. An abstract formal-
ization of constraint languages is presented that generalizes the
constraint logic programming scheme of Jaffar and Lassez in order
to provide a framework for the combination of knowledge represen-
tation formalisms  and  the  logic  programming  paradigm.   This
framework  comes with a type discipline and an operational seman-
tics, which make sense for every  employed  constraint  language.
Using  a  feature  logic with inheritance hierarchies and complex
constants as constraint language we show that the framework gives
a  semantic  foundation  for  LOGIN, a logic programming language
with built-in inheritance.


* Publications denoted with (SFB) were accomplished within
  the SFB 314 project funded by the "Deutsche
  Forschungsgemeinschaft" (DFG).