Retargetable Self-Test Program Generation Using Constraint Logic Programming
- Ulrich Bieker, Peter Marwedel
This paper presents new techniques in two different
areas. Firstly, it proposes a solution to the problem of testing
embedded processors. Towards this end, it discusses the automatic
generation of executable test programs from a specification
of test patterns for processor components. Secondly, the paper
shows how constraint logic programming (CLP) improves the
software production process for design automation tools. The
advantages of CLP languages include: built-in symbolic variables
and the built-in support for constraints over finite domains such
as integers and Booleans.
In recent years, there has been a significant shift in the
way complex electronic systems are implemented: various
types of embedded processors are being used in many...
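The CLP advantages named above (symbolic variables, constraints over finite domains of integers and Booleans) can be illustrated with a minimal sketch. This is not the paper's retargetable test-program generator; it is a toy finite-domain solver in Python, with all names (`solve_fd` and the example constraints) invented for illustration.

```python
from itertools import product

def solve_fd(domains, constraints):
    """Enumerate assignments over finite domains that satisfy all constraints.

    domains: dict mapping variable name -> iterable of values
    constraints: list of (variable-name tuple, predicate) pairs
    """
    names = list(domains)
    for values in product(*(domains[n] for n in names)):
        env = dict(zip(names, values))
        if all(pred(*(env[v] for v in vs)) for vs, pred in constraints):
            yield env

# X, Y over 0..3, Boolean B; require X + Y == 3 and (B implies X < Y)
sols = list(solve_fd(
    {"X": range(4), "Y": range(4), "B": [False, True]},
    [(("X", "Y"), lambda x, y: x + y == 3),
     (("B", "X", "Y"), lambda b, x, y: (not b) or x < y)],
))
```

A real CLP(FD) system propagates constraints to prune domains before enumerating; this brute-force sketch only shows the declarative style of mixing integer and Boolean constraints.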
System Description: Twelf --- A Meta-Logical Framework for Deductive Systems
- Frank Pfenning, Carsten Schürmann
Twelf is a meta-logical framework for the specification, implementation,
and meta-theory of deductive systems from the theory of
programming languages and logics. It relies on the LF type theory and
the judgments-as-types methodology for specification [HHP93], a constraint
logic programming interpreter for implementation [Pfe91], and
the meta-logic M2 for reasoning about object languages encoded in
LF [SP98]. It is a significant extension and complete reimplementation
of the Elf system [Pfe94].
Twelf is written in Standard ML and runs under SML of New Jersey
and MLWorks on Unix and Windows platforms. The current version
(1.2) is distributed with a complete manual, example suites, a tutorial
in the form of on-line lecture...
CLAIRE: Combining Objects and Rules for Problem Solving
This paper describes a new approach towards code reuse through a high-level
programming language. This work has been developed in the context of
combinatorial optimization for industrial problems, where performance is critical
and where a large number of fairly complex algorithms are used that often share
similar structures (such as branch & bound). The result is the CLAIRE programming
language, which supports logic programming and provides high levels of abstraction
and parametrization. Consequently, it may be used as an executable pseudo-code to
describe concise and reusable problem solving algorithms.
For the last 10 years, we have been experimenting with the use of a hybrid
object-oriented language, LAURE™...
Constraint Inductive Logic Programming
This paper is concerned with learning from positive
and negative examples expressed in first-order logic with numerical constants.
The presented approach is based on the cooperation of Inductive
Logic Programming (ILP) and Constraint Logic Programming (CLP),
and proceeds as follows:
• A discriminant induction problem is shown to be equivalent to a Constraint
Satisfaction Problem (CSP): all constrained clauses covering positive
examples and rejecting negative examples can be trivially derived
from the solutions of this CSP.
• Solving this CSP then makes it possible to build the G set of solutions in terms of
Version Spaces; this resolution can be delegated to a constraint solver.
• This CSP provides a tractable computational...
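The reduction described in the bullets above can be sketched in miniature: finding a constrained clause that covers all positives and rejects all negatives becomes a search for constraint parameters. The following Python toy (names `discriminant_thresholds`, and the one-parameter clause body `X >= c`, are invented for illustration and are far simpler than the paper's setting) shows the flavor.

```python
def discriminant_thresholds(pos, neg, candidates):
    """Return every c for which the clause body 'X >= c' covers all
    positive examples and rejects all negative ones -- a one-variable
    toy of the discriminant-induction-as-CSP reduction."""
    return [c for c in candidates
            if all(x >= c for x in pos) and not any(x >= c for x in neg)]

# positives 5, 7, 9; negatives 1, 2: any threshold in 3..5 discriminates
cs = discriminant_thresholds([5, 7, 9], [1, 2], range(0, 10))
```

In the paper the solutions of the CSP yield the whole G set of the Version Space; here the solution set `cs` plays that role for a single numeric parameter.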
META-SYNTHESIS
- Christoph Kreitz
Starting from individual formalisms, a transformational synthesis of a program satisfying a
specification ⟨D, R, I, O⟩ can be described as follows:
As a starting point, a new predicate F over D × R, representing a declarative description of the
program to be created, is introduced by defining
∀x:D. ∀y:R. I(x) ⇒ (F(x,y) ⇔ O(x,y)).
In the context of the above formula, O(x,y) will be transformed until a formula
∀x:D. ∀y:R. I(x) ⇒ (F(x,y) ⇔ O_f(x,y))
has been inferred such that O_f(x,y) is composed entirely of satisfiable predicates except for
possible occurrences of O. Then a program is generated by comparing the initial output condition
with the final one...
Characterizing Logic Grammars: A Substructural Logic Approach
- James Andrews, Veronica Dahl
this paper. A substructural logic
sequent calculus proof system is given which is shown to be equivalent to
SDGs for parsing problems, in the sense that a string of terminal symbols is
accepted by a grammar if and only if the corresponding sequent is derivable
in the calculus. One calculus is given for each of the two major interpretations
of SDGs; the two calculi differ by only a small restriction in one
rule. Since SDGs encompass other major grammar formalisms, including
DCGs, the calculi serve to characterize those formalisms as well.
It is the authors' wish that no agency should ever derive military benefit from the publication...
Hypothetical Knowledge and Counterfactual Reasoning
- Joseph Y. Halpern
Samet introduced a notion of hypothetical knowledge and showed
how it could be used to capture the type of counterfactual reasoning
necessary to force the backwards induction solution in a game of perfect
information. He argued that while hypothetical knowledge and
the extended information structures used to model it bear some resemblance
to the way philosophers have used conditional logic to model
counterfactuals, hypothetical knowledge cannot be reduced to conditional
logic together with epistemic logic. Here it is shown that in fact
hypothetical knowledge can be captured using the standard counterfactual
operator ">" and the knowledge operator "K", provided that
some assumptions are made regarding the interaction between the two.
The Relative Complement Problem for Higher-Order Patterns
- Alberto Momigliano, Frank Pfenning
We address the problem of complementing higher-order patterns without
repetitions of free variables. Differently from the first-order case, the complement
of a pattern cannot, in general, be described by a pattern, or even by
a finite set of patterns. We therefore generalize the simply-typed λ-calculus
to include an internal notion of strict function so that we can directly express
that a term must depend on a given variable. We show that, in this
more expressive calculus, finite sets of patterns without repeated variables
are closed under complement and unification. Our principal application is
the transformational approach to negation in higher-order logic programs.
In most functional and logic programming...
Creating Ontological Metadata for Digital Library Content and Services
- Peter Weinstein
We use formal ontologies to represent knowledge about digital-library content and services. Formal
ontologies define concepts with logic in a frame-inheritance structure. The expressiveness and precision of
these structures supports computational reasoning that can be used in important ways. This paper focuses
on the creation of ontological metadata.
We create ontological content metadata by generating it from MARC (MAchine Readable Cataloging) data.
MARC contains much information that is hard to exploit computationally. In particular, relationships
between works are implicit in shared values and natural-language notes. The conversion process involves
specifying an ontological model, mapping MARC to the ontology, and reasoning about the data to create
explicit links between...
Making Complex Timing Relationships Readable: Presburger Formula Simplification Using Don't Cares
- Tod Amon, Gaetano Borriello, Jiwen Liu
Solutions to timing relationship analysis problems are often reported
using symbolic variables and inequalities which specify linear relationships
between the variables. Complex relationships can be expressed
using Presburger formulas which allow Boolean relations to
be specified between the inequalities. This paper develops and applies
a highly effective simplification approach for Presburger formulas
based on logic minimization techniques.
Many problems in computer-aided design (ranging from physical
placement to scheduling in behavioral synthesis) can be formulated
using systems of linear inequalities. This is especially the case for
problems that deal with temporal information. In the simplest cases,
the inequalities must all hold and are derived from the delay ranges
in the specification and...
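The core idea behind the simplification approach named above, using don't cares from logic minimization, is that a formula may be replaced by a simpler one as long as the two agree on every assignment that can actually occur. The following Python check (the function `agrees_on_care_set` and the example formulas are invented for illustration; the paper works over Presburger formulas, not plain Booleans) states that correctness condition.

```python
from itertools import product

def agrees_on_care_set(f, g, dont_care, nvars):
    """Check that f and g agree on every assignment outside the
    don't-care set: the correctness condition behind don't-care-based
    logic minimization."""
    return all(f(v) == g(v)
               for v in product([False, True], repeat=nvars)
               if not dont_care(v))

# f = a AND b may be replaced by the simpler g = b when assignments
# with a = False can never occur (they are don't cares).
ok = agrees_on_care_set(lambda v: v[0] and v[1],   # f
                        lambda v: v[1],            # simpler candidate g
                        lambda v: not v[0],        # don't care: a is False
                        2)
```

Without the don't cares the replacement would be unsound, since f and g differ at a = False, b = True; the don't-care set is what licenses the simplification.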
SAT-based decision procedures for normal modal logics: a theoretical framework
- Roberto Sebastiani, Adolfo Villafiorita
Tableau systems are very popular in AI for their simplicity
and versatility. In recent papers we showed that tableau-based procedures
are intrinsically inefficient, and proposed the alternative approach
of building decision procedures on top of a SAT decision procedure. We
called this approach "SAT-based". In extensive empirical tests on the
case study of modal K, a SAT-based procedure drastically outperformed
state-of-the-art tableau-based systems. In this paper we provide the theoretical
foundations for developing SAT-based decision procedures for
many different modal logics.
By a tableau framework for a logic L we generically denote a refutation system
for L extending Smullyan's propositional framework. Tableau frameworks
have been presented for many logics,...
The Complexity of Kleene Algebra with Tests
- Ernie Cohen, Dexter Kozen, Frederick Smith
Kleene algebras with tests provide a natural framework for equational
specification and verification. Kleene algebras with tests and
related systems have been used successfully in basic safety analysis,
source-to-source program transformation, and concurrency control.
The equational theory of Kleene algebras with tests has been shown to
be decidable in at most exponential time by an efficient reduction to
Propositional Dynamic Logic. In this paper we prove that the theory...
Kleene algebra with tests (KAT) is an algebraic system intermediate to
Kleene algebra (KA) and Propositional Dynamic Logic (PDL) in expressive
power. One can use KAT for a range of common verification tasks without
resorting to the full...
The Differential Fixpoint of General Logic Programs
- Ulrich Zukowski, Burkhard Freitag
We present a version of the alternating fixpoint procedure that is fully incremental.
Using ideas of partial evaluation techniques we can compute the well-founded model of
logic programs with negation bottom-up without any recomputations. Further extensions
of the semantics, e.g. to stable models or disjunctive programs are possible this way. We
show how to implement the algorithm efficiently using index-based data structures and describe
an extension to handle magic-set transformed programs. This set-oriented bottom-up
algorithm is compatible with the well-known optimizations of deductive databases,
e.g. semi-naive fixpoint iteration, and of relational databases, e.g. index techniques that
enable the processing of large amounts of data. Thus, a bottom-up...
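The semi-naive fixpoint iteration mentioned above, the standard deductive-database optimization this work builds on, can be sketched on the classic transitive-closure example. This Python toy (the name `seminaive_tc` is invented; the paper's algorithm additionally handles negation via the alternating fixpoint) shows the key trick: each round joins only the newly derived facts with the base relation.

```python
def seminaive_tc(edges):
    """Semi-naive bottom-up evaluation of transitive closure: each
    iteration joins only the delta (facts new in the previous round)
    with the base relation, so no derivation is recomputed."""
    total = set(edges)
    delta = set(edges)
    while delta:
        new = {(a, c) for (a, b) in delta for (b2, c) in edges if b == b2}
        delta = new - total   # keep only genuinely new facts
        total |= delta
    return total

closure = seminaive_tc({(1, 2), (2, 3), (3, 4)})
```

A naive fixpoint would re-join the entire `total` relation every round; restricting the join to `delta` is exactly what makes the iteration incremental.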
Parametric Temporal Logic for "Model Measuring"
- Rajeev Alur, Kousha Etessami, Salvatore La Torre, Doron Peled
. We extend the standard model checking paradigm of linear
temporal logic, LTL, to a "model measuring" paradigm where one can
obtain more quantitative information beyond a "Yes/No" answer. For
this purpose, we define a parametric temporal logic, PLTL, which allows
statements such as "a request p is followed in at most x steps by a
response q", where x is a free variable. We show how one can, given a formula
φ(x1, ..., xk) of PLTL and a system model K, not only determine
whether there exists a valuation of x1, ..., xk under which the system...
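The "model measuring" idea above, extracting a quantitative bound rather than a Yes/No answer, can be illustrated on a single finite trace. This Python sketch (the name `min_response_bound` is invented, and PLTL is checked against a model, not one finite trace, so this is only an analogue) computes the smallest x for which "every p is followed within x steps by q" holds.

```python
def min_response_bound(trace, p, q):
    """Smallest x such that every state containing p is followed within
    x steps by a state containing q, on a finite trace of sets of
    atomic propositions; returns None if some p is never answered."""
    worst = 0
    for i, state in enumerate(trace):
        if p in state:
            delay = next((j - i for j in range(i, len(trace))
                          if q in trace[j]), None)
            if delay is None:
                return None            # this request is never answered
            worst = max(worst, delay)  # track the slowest response
    return worst

trace = [{"p"}, set(), {"q"}, {"p", "q"}, set()]
bound = min_response_bound(trace, "p", "q")
```

Here the first request is answered after 2 steps and the second immediately, so the measured bound is 2, i.e. the parametric formula holds exactly for valuations x ≥ 2 on this trace.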
Unified Semantics for Modality and lambda-terms via Proof Polynomials
- Sergei N. Artemov
It is shown that the modal logic S4, simple λ-calculus and modal λ-calculus admit a
realization in a very simple propositional logical system LP, which has an exact provability
semantics. In LP both modality and λ-terms become objects of the same nature, namely,
proof polynomials. The provability interpretation of modal λ-terms presented here may
be regarded as a system-independent generalization of the Curry-Howard isomorphism of
proofs and λ-terms.
The Logic of Proofs (LP, see Section 2) is a system in the propositional language with an
extra basic proposition t : F for "t is a proof of F". LP is supplied with a...
A Modal Analysis of Staged Computation
- Rowan Davies, Frank Pfenning
We show that a type system based on the intuitionistic modal logic
S4 provides an expressive framework for specifying and analyzing computation
stages in the context of functional languages. Our main technical
result is a conservative embedding of Nielson & Nielson's two-level
functional language in our language Mini-ML, which in addition to partial
evaluation also supports multiple computation stages, sharing of code
across multiple stages, and run-time code generation.
Dividing a computation into separate stages is a common informal technique in
the derivation of algorithms. For example, instead of matching a string against
a regular expression we may first compile the regular expression into a finite automaton...
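The regular-expression example above, splitting the computation into a compile stage and a match stage, has a familiar everyday analogue in Python's `re` module (this sketch is only an illustration of the staging idea, not the type-theoretic treatment in the paper; the name `make_matcher` is invented).

```python
import re

def make_matcher(pattern):
    """Stage 1: compile the pattern into an automaton-like object once.
    Stage 2: the returned closure only runs the precompiled matcher.
    Staging pays off when one pattern is matched against many strings."""
    compiled = re.compile(pattern)                       # done once
    return lambda s: compiled.fullmatch(s) is not None   # done per string

is_binary = make_matcher(r"[01]+")
```

In staged-computation terms, `make_matcher` does the first-stage work eagerly, and the closure it returns is the residual second-stage program.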
IBM Research Report
- Benjamin N. Grosof
Prioritized default rules offer a conveniently higher level of specification that facilitates
revision and modularity. They handle conflicts, including those arising during updating of
rule sets, using partially-ordered prioritization information that is naturally available
based on relative specificity, recency, and authority. Despite having received much
study, however, they have had as yet little impact on practical rule-based systems
and software engineering generally, and have had very few serious deployed applications.
We give a new approach to the semantics and implementation of prioritized default
rules: to compile them into ordinary logic programs (LP's). (By "logic program" and
"prioritized default rules", we mean in the sense of declarative knowledge representation
Development of Optical Information Technology at ETL under RWC Program
- T. Hidaka
The research project on Optical Information Technology at ETL under the RWC program is described
in detail. In this program, a CAD (computer-aided design) system for optical
devices will first be established in order to accelerate the development of semiconductor
optical devices. Devices developed under the CAD system will be incorporated into new types of
optical information systems such as optical digital systems, multi-level optical logic systems, neural
network systems, optical fuzzy systems, and optical frequency-multiplex logic systems. New
architectures for optical information systems will also be developed, requiring a design philosophy
fundamentally different from the conventional architectures developed so far for electronic circuits.
An Extension of Explanation-Based Generalization to Negation as Failure
- Stefan Schrodl
Implementations of Explanation-Based Generalization (EBG) within
a logic-programming environment, such as the well-known PROLOG-EBG
algorithm [KCMcC87], are able to generalize the proof of a goal
from a definite (i.e. Horn clause) domain theory. However, practical
applications frequently require the enhanced expressiveness
of negations in rule bodies. Specifically, this is the case
for the domain of game playing, where traditional EBG has turned
out to be inadequate [Ta89]. In this paper we present an approach
which extends EBG to this more general setting; it is described in the
form of a transformation system, and comprises Siqueira and Puget's
method of Explanation-Based Generalization of Failures [SiPu88] for...
Seeking Explanations: Abduction in Logic, Philosophy of Science and Artificial Intelligence
- Atocha Aliseda-llera
It is a privilege to have five professors on my reading committee representing the
different areas of my Ph.D. program in Philosophy and Symbolic Systems: Computer
Science, Linguistics, Logic, Philosophy, and Psychology.
Tom Wasow was the first person to point me in the direction of abduction. He
gave me useful comments on earlier versions of this dissertation, always insisting that
it be readable to non experts. It was also a pleasure to work with him this last year
coordinating the undergraduate Symbolic Systems program.
Dagfinn Føllesdal encouraged me to continue exploring connections between abduction
and philosophy of science. Yoav Shoham and Pat Suppes gave me very good