Cumulative Higher-Order Logic as a Foundation for Set Theory
- Wolfgang Degen; Jan Johannsen
The systems K of transfinite cumulative types up to are extended to systems K 1 that include a natural infinitary inference rule, the so-called limit rule. For countable a semantic completeness theorem for K 1 is proved by the method of reduction trees, and it is shown that every model of K 1 is equivalent to a cumulative hierarchy of sets. This is used to show that several axiomatic first-order set theories can be interpreted in K 1, for suitable . Keywords: cumulative types, infinitary inference rule, logical foundations of set theory. MSC: 03B15 03B30 03E30 03F25 1 Introduction...
A Logic of Situated Know-how
- Munindar P. Singh
Know-how is an important concept in Artificial Intelligence. It has been argued previously that it cannot be successfully reduced to the knowledge of facts. In this paper, I present sound and complete axiomatizations for two non-reductive and intuitively natural formal definitions of the know-how of an agent situated in a complex environment. I also present some theorems giving useful properties of know-how, and discuss and resolve an interesting paradox (described within). This is done using a new operator, introduced herein in the spirit of Dynamic Logic, whose semantics and proof theory are given. 1 Introduction Knowledge...
Extraction of Meta-Knowledge to Restrict the Hypothesis Space for ILP Systems
- Eric Mccreath; Arun Sharma
Many ILP systems, such as GOLEM, FOIL, and MIS, take advantage of user-supplied meta-knowledge to restrict the hypothesis space. This meta-knowledge can be in the form of type information about arguments in the predicate being learned, or it can be information about whether a certain argument in the predicate is functionally dependent on the other arguments (supplied as mode information). This meta-knowledge is explicitly supplied to an ILP system in addition to the data. The present paper argues that in many cases the meta-knowledge can be extracted directly from the raw data. Three algorithms are presented that learn...
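As a sketch of how type meta-knowledge might be inferred from raw data alone (a hypothetical illustration, not one of the paper's three algorithms): scan the ground facts of the target predicate, collect the constants observed at each argument position, and merge positions whose constant sets overlap into a single shared type.

```python
# Hypothetical sketch: inferring argument types for an ILP target predicate
# from its ground facts alone (illustration only, not the paper's algorithms).

def infer_types(facts):
    """facts: list of tuples of constants for one predicate.
    Collects the constants seen at each argument position, then merges
    positions whose constant sets overlap into one shared type."""
    arity = len(facts[0])
    domains = [set() for _ in range(arity)]
    for fact in facts:
        for i, const in enumerate(fact):
            domains[i].add(const)
    types = []  # list of (positions, constants); overlapping domains share a type
    for i, dom in enumerate(domains):
        for positions, consts in types:
            if consts & dom:
                positions.append(i)
                consts |= dom
                break
        else:
            types.append(([i], set(dom)))
    return types

# Example: facts for a parent/2 predicate; both argument positions mention
# "bob", so they are merged into one "person" type.
facts = [("ann", "bob"), ("bob", "carl"), ("ann", "dora")]
merged = infer_types(facts)
print(len(merged), merged[0][0])  # → 1 [0, 1]
```

Real systems would refine this with disjointness heuristics and would infer mode (functional-dependency) information separately; this only shows the basic domain-overlap idea.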
ILFA - A Project in Experimental Logic Computation
- Ulf Dunker; Andreas Flögel; Hans Kleine Büning; Jürgen Lehmann; Theodor Lettmann
ILFA provides programs and libraries of reusable software components for logic processing. "ILFA" stands for Integrated Logic Functions for Advanced Applications; the project has its roots in a research collaboration between IBM Germany in Heidelberg and the Universities of Duisburg and Paderborn. The implementation in the C programming language supports efficient work with logic methods in different computing environments. ILFA contains a large number of logic algorithms, divided into implementations for important subclasses of logic. Special attention has been paid to visual user interfaces for the ILFA library components, i.e. for using them, programming with them and providing...
A Formalization of the B-Method in Coq and PVS
- Jean Paul Bodeveix; Mamoun Filali; César A. Muñoz
We formalize the generalized substitution mechanism of the B-method in the higher-order logic of Coq and PVS. Thanks to the dependent type feature of Coq and PVS, our encoding is compact and highly integrated with the logic supported by the theorem provers. In addition, we describe a tool that mechanizes, at the user level, most of the effort of the encoding. 1 Introduction In recent years, important work has been done in the design and implementation of general specification languages and theorem provers. The concretization of these efforts is illustrated in systems like HOL [Gor93], Coq [BBC+97],...
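The generalized substitutions being formalized have a standard predicate-transformer reading; the laws below are the textbook rules of B's generalized substitution language (stated as a sketch, not this paper's Coq/PVS encoding):

```latex
% Standard predicate-transformer laws of generalized substitutions:
[x := E]\,R \;=\; R[E/x]                                      % simple substitution
[\textsc{pre}\ P\ \textsc{then}\ S\ \textsc{end}]\,R \;=\; P \land [S]\,R   % precondition
[S_1 \;[]\; S_2]\,R \;=\; [S_1]\,R \land [S_2]\,R             % bounded choice
[P \Longrightarrow S]\,R \;=\; P \Rightarrow [S]\,R           % guarding
```

Here $[S]\,R$ is the weakest precondition under which substitution $S$ establishes postcondition $R$; encoding $[\cdot]$ as a function on predicates is what the dependent types of Coq and PVS make compact.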
A semantic view of classical proofs -- type-theoretic, categorical, and denotational characterizations (Extended Abstract)
- C.-H. L. Ong
Classical logic is one of the best examples of a mathematical theory that is truly useful to computer science. Hardware and software engineers apply the theory routinely. Yet from a foundational standpoint, there are aspects of classical logic that are problematic. Unlike intuitionistic logic, classical logic is often held to be non-constructive, and so is said to admit no proof semantics. To draw an analogy in the proofs-as-programs paradigm, it is as if we understand well the theory of manipulation between equivalent specifications (which we do), but have comparatively little foundational insight into the process of transforming one program...
Machine Learning Techniques for Adaptive Logic-Based Multi-Agent Systems: A Preliminary Report
- Eduardo Alonso; Daniel Kudenko
It is widely recognised in the agent community that one of the more important features of high-level agents is their capability to adapt and learn in dynamic, uncertain domains [12, 30]. A lot of work has recently been produced on this topic, particularly in the field of learning in multi-agent systems [1, 32, 33, 34, 42, 43]. It is, however, worth noting that whereas some kind of logic is used to specify the (multi-)agents' architecture, mainly non-relational learning techniques such as reinforcement learning are applied. We think that these approaches are not well suited to deal with the large...
That SnS Can Be Modally Characterized
- Hans Hüttel
We show that a modal mu-calculus with label set {1, …, n} can define the Rabin-recognizable tree languages up to an equivalence similar to the observational equivalence of Milner. 1 Introduction In  it was shown that the temporal logic ETL can define exactly the class of ω-regular languages. In  it was shown that a fixed-point calculus whose signature, apart from maximal and minimal fixed points and disjunction, includes the usual operators on trees can define exactly the sets of infinite trees recognized by Rabin tree automata; this class of sets corresponds to...
- Norman Foo; Abhaya Nayak; Maurice Pagnucco
Certain constraints that should be interpreted as definitions or quasi-definitions can cause prediction problems for logics of action. Standard ways to avoid them rely on notions of priority in the minimization of fluent change. We offer an alternative approach based on belief revision. Revision is performed on the reduct of the system theory after isolating definitions, and the revised theory is then expanded by restoring the definitions. The basis for this is a weak monotonicity result for AGM contraction. Examples from the literature are used to illustrate the approach. 1 Introduction A substantial part of knowledge representation research is...
Non-repudiation Evidence Generation for CORBA using XML
- Michael Wichert; David Ingham; Steve Caughey
This paper focuses on the provision of a non-repudiation service for CORBA. The current OMG specification of a CORBA non-repudiation service forces the programmer to augment the application with calls to functions for generating or validating evidence. Furthermore, the application itself has to manage the exchange of this evidence between parties and its storage. The paper describes our design for a generic CORBA non-repudiation service implementation. Our approach provides a separation between the application business logic and the generation of evidence, allowing non-repudiation support to be incorporated into applications with minimal programmer effort. Our design is described in...
Frame-buffer on Demand: Applications of Stateless Client Systems in Web-based Learning
- Sheng Feng Li; Quentin Stafford-Fraser; Andy Hopper
The growth of the Internet and the World Wide Web has changed the way people are educated, and distance learning is amongst the most promising fields for new Web applications built from existing services and supporting infrastructures. In this paper, we introduce a number of new applications built from Stateless Client Systems to assist the learning of computer-based activities. Stateless Client Systems separate the display interface from the application logic in windowing systems. They employ a client/server architecture, in which the server executes all the applications and the client simply presents the frame buffers, or screen images, to the user and...
Deliverables: A Categorical Approach to Program Development in Type Theory
- James Hugh McKinna
This thesis considers the problem of program correctness within a rich theory of dependent types, the Extended Calculus of Constructions (ECC). This system contains a powerful programming language of higher-order primitive recursion and higher-order intuitionistic logic. It is supported by Pollack's versatile LEGO implementation, which I use extensively to develop the mathematical constructions studied here. I systematically investigate Burstall's notion of deliverable, that is, a program paired with a proof of correctness. This approach separates the concerns of programming and logic, since I want a simple program extraction mechanism. The Σ-types of the calculus enable us to achieve this. There...
Theory-driven Logical Scaling
- Susanne Prediger; Gerd Stumme
In this paper, the basics of Conceptual Information Systems and conceptual scales are provided in Section 2. For a conceptual scale, there is always a trade-off between its size and its soundness with respect to future updates of the database. There are two approaches to designing conceptual scales: data-driven design and theory-driven design. In theory-driven design, knowledge about the application domain is used to exclude impossible combinations of attributes. This keeps the conceptual scales small, and their concept lattices easier to interpret. Theory-driven design is only applicable if there is enough knowledge about which types of objects may occur in...
Fair Games and Full Completeness for Multiplicative Linear Logic without the MIX-Rule
- J.M.E. Hyland; C. -h. L. Ong
We introduce a new category of finite, fair games and winning strategies, and use it to provide a semantics for the multiplicative fragment of Linear Logic (MLL) in which formulae are interpreted as games, and proofs as winning strategies. This interpretation provides a categorical model of MLL which satisfies the property that every (history-free, uniformly) winning strategy is the denotation of a unique cut-free proof net. Abramsky and Jagadeesan first proved a result of this kind, and they refer to this property as full completeness. Our result differs from theirs in one important respect: the MIX-rule, which is not part...
Loading a Cache with Query Results
- Laura M. Haas; Donald Kossmann; Ioana Ursu
Data-intensive applications today usually run in either a client-server or a middleware environment. In either case, they must efficiently handle both database queries, which process large numbers of data objects, and application logic, with its fine-grained object accesses (e.g., method calls). Sophisticated optimization techniques speed up query processing, while caching is used to reduce the cost of the application logic. Query processing and caching decisions are typically made in isolation, though in applications queries are often used to identify those objects the application will further process. We propose a holistic approach to speeding up such applications: we load the...
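The core idea, piggy-backing object caching on query execution so that the fine-grained accesses following a query become cache hits, can be sketched as follows (names such as `ObjectCache` and `fetch_from_server` are illustrative, not from the paper):

```python
# Hypothetical sketch: a query identifies the objects the application will
# process next, so its result set is loaded straight into the object cache.
# All names are illustrative, not the paper's API.

class ObjectCache:
    def __init__(self):
        self._objects = {}
        self.hits = 0
        self.misses = 0

    def load(self, rows):
        # Load every object returned by a query into the cache.
        for oid, obj in rows:
            self._objects[oid] = obj

    def get(self, oid, fetch):
        # Fine-grained access (e.g., a method call touching one object).
        if oid in self._objects:
            self.hits += 1
            return self._objects[oid]
        self.misses += 1              # would cost a server round-trip
        obj = fetch(oid)
        self._objects[oid] = obj
        return obj

def fetch_from_server(oid):
    return {"id": oid}                # stand-in for an expensive remote fetch

# A query returns the objects the application logic will iterate over ...
query_result = [(1, {"id": 1, "name": "a"}), (2, {"id": 2, "name": "b"})]
cache = ObjectCache()
cache.load(query_result)              # ... so loading them up front makes
for oid in (1, 2):                    # the subsequent object accesses hits.
    cache.get(oid, fetch_from_server)
print(cache.hits, cache.misses)       # → 2 0
```

Made in isolation, the caching decision would have fetched objects 1 and 2 lazily, one round-trip each; loading the query result up front eliminates both misses.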
A Declarative Approach to Business Rules in Contracts: Courteous Logic Programs in XML
- Benjamin Grosof; Yannis Labrou; Hoi Y. Chan
We address why, and especially how, to represent business rules in e-commerce contracts. By contracts, we mean descriptions of goods and services offered or sought, including ancillary agreements detailing terms of a deal. We observe that rules are useful in contracts to represent conditional relationships, e.g., in terms & conditions, service provisions, and surrounding business processes, and we illustrate this point with several examples. We analyze requirements (desiderata) for representing such rules in contracts. The requirements include: declarative semantics so as to enable shared understanding and interoperability; prioritized conflict handling so as to enable modular updating/revision; ease of parsing; integration...
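The prioritized conflict handling mentioned above can be illustrated with a small sketch in the spirit of courteous logic programs: labeled rules may derive contradictory literals, and an "overrides" relation between labels resolves the conflict. Structure and names are my own illustration, not the paper's XML representation.

```python
# Hypothetical sketch of prioritized conflict handling in the spirit of
# courteous logic programs (illustration only, not the paper's encoding).

def evaluate(rules, overrides, facts):
    """rules: list of (label, body_literals, head_literal); a literal is
    (sign, atom), with sign True for positive.  overrides: set of
    (higher_label, lower_label) pairs.  Returns the derived literals."""
    derived = {(True, f) for f in facts}
    decided = {}                     # atom -> (label, sign) that decided it
    changed = True
    while changed:
        changed = False
        for label, body, (sign, atom) in rules:
            if not all(lit in derived for lit in body):
                continue
            if atom in decided:
                prev_label, prev_sign = decided[atom]
                if prev_sign == sign:
                    continue
                if (label, prev_label) in overrides:
                    derived.discard((prev_sign, atom))   # higher rule wins
                else:
                    continue                             # conflict blocked
            derived.add((sign, atom))
            decided[atom] = (label, sign)
            changed = True
    return derived

# Example contract rule: a promotion overrides the default "full price" term.
rules = [
    ("default", [(True, "order")], (True, "full_price")),
    ("promo",   [(True, "order"), (True, "coupon")], (False, "full_price")),
]
overrides = {("promo", "default")}
result = evaluate(rules, overrides, ["order", "coupon"])
print((False, "full_price") in result)  # → True
```

This is the behavior the paper's "prioritized conflict handling" requirement calls for: adding the higher-priority promo rule modularly revises the contract without editing the default rule.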
Problem of Information Passing in Magic Sets Method
- Karel Jezek; Martin Zíma
The magic sets method is used for efficient bottom-up evaluation of logic programs. Since its origin, it has suffered from the fact that the influence of information passing among predicates has not been sufficiently studied. We have identified the manner of information passing as crucial to the efficiency of the method. In this article, the ways of information passing are analyzed and a method for choosing a particular passing is described.
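The basic mechanism at issue can be sketched on the classic ancestor program (an illustrative reconstruction of the general magic-sets idea, not this article's method): the binding in a query such as anc(ann, X) yields a "magic" set of relevant constants, and bottom-up evaluation only derives tuples whose bound argument lies in that set.

```python
# Illustrative sketch of the magic-sets idea (not the article's method):
# the bound argument of the query restricts bottom-up evaluation.

def magic_ancestor(par, seed):
    """par: set of (parent, child) facts; seed: the bound constant in the
    query anc(seed, X).  Rules: anc(X,Y) <- par(X,Y).
                                anc(X,Y) <- par(X,Z), anc(Z,Y)."""
    # Magic phase: propagate the query binding through the recursive call.
    magic = {seed}
    changed = True
    while changed:
        changed = False
        for x, z in par:
            if x in magic and z not in magic:
                magic.add(z)          # the recursive anc(Z, Y) has Z bound
                changed = True
    # Restricted bottom-up phase: only derive facts with a "magic" head.
    anc = {(x, y) for (x, y) in par if x in magic}
    while True:
        new = {(x, y) for (x, z) in par if x in magic
                      for (z2, y) in anc if z2 == z} - anc
        if not new:
            break
        anc |= new
    return {(x, y) for (x, y) in anc if x == seed}

par = {("ann", "bob"), ("bob", "carl"), ("eve", "dan")}
print(sorted(magic_ancestor(par, "ann")))  # → [('ann', 'bob'), ('ann', 'carl')]
```

Note that the irrelevant fact ("eve", "dan") never enters the fixpoint: that pruning, and how sensitive it is to which bindings are passed, is exactly the efficiency question the article studies.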
Symbolic Model Checking for Sequential Circuit Verification
- J. R. Burch; E. M. Clarke; D. E. Long; K. L. Mcmillan; D.L. Dill
The temporal logic model checking algorithm of Clarke, Emerson, and Sistla is modified to represent state graphs using binary decision diagrams (BDDs) and partitioned transition relations [10, 11]. Because this representation captures some of the regularity in the state space of circuits with data path logic, we are able to verify circuits with an extremely large number of states. We demonstrate this new technique on a synchronous pipelined design with approximately 5 × 10^120 states. Our model checking algorithm handles full CTL with fairness constraints. Consequently, we are able to express a number of important liveness and...
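The fixpoint computation underlying such a model checker can be shown in miniature; the sketch below uses explicit sets of states purely for illustration, where a symbolic checker would represent the same sets implicitly as BDDs. It computes EF p, the states that can reach a p-state, as a least fixpoint.

```python
# Explicit-state stand-in for one symbolic fixpoint of CTL model checking:
# sets of states here play the role BDDs play in the real algorithm.

def ef(trans, p_states):
    """Computes EF p as the least fixpoint Z = p_states ∪ pre(Z), where
    pre(Z) = {s | ∃t. (s, t) ∈ trans and t ∈ Z} (backward image)."""
    z = set(p_states)
    while True:
        pre = {s for (s, t) in trans if t in z}
        if pre <= z:          # fixpoint reached: no new predecessors
            return z
        z |= pre

# Tiny example: a 4-state chain where only state 3 satisfies p;
# every state can reach it, so EF p is the whole state space.
trans = {(0, 1), (1, 2), (2, 3), (3, 3)}
print(sorted(ef(trans, {3})))  # → [0, 1, 2, 3]
```

In the BDD-based algorithm, the backward image `pre` is computed by relational product with the (partitioned) transition relation rather than by enumerating edges, which is what makes 10^120-state spaces tractable.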
Semantic Schema Integration in a Heterogeneous Database Environment
- Tiziana Catarci; Giuseppe Santucci
One of the most common approaches for accessing heterogeneous information systems is based on constructing a global schema describing the information in the individual databases. In order to automatically build such a schema, we need to formally represent the database schemata populating the component information systems and their interrelationships. Towards this aim, we use a logic language to express interdependencies between classes belonging to different schemata. Such interdependencies allow a designer of a cooperative information system to establish several relationships between both the intensional definitions and the instances of classes represented in different schemata. The logic language and its...
The Automated Refinement of a Requirements Domain Theory
- T. L. Mccluskey; M. M. West
The specification and management of requirements is widely considered to be one of the most important yet most problematic phases in software engineering. In some applications, such as safety-critical areas or knowledge-based systems, the construction of a requirements domain theory is regarded as an important part of this phase. Building and maintaining such a domain theory, however, requires a large investment and a range of powerful validation and maintenance tools. The area of `theory refinement' is concerned with the use of training data to automatically change an existing theory so that it better fits the data. Theory...