Lecture Series - Collegium Logicum

The Society organizes a continuing lecture series, the Collegium Logicum, in Vienna. Distinguished speakers have included H. Barendregt, E. Börger, D. van Dalen, G. Boolos, H. Friedman, Y. Gurevich, J. Hintikka, W. Hodges, P. Hajek, T. Jech, M. Magidor, V. Marek, G. Mints, D. Mundici, M. Vardi, and R. Reiter.

The lecture series is now organized by Agata Ciabattoni.

The lectures now take place in the seminar room Gödel (access from the courtyard).


Prof. Johann A. Makowsky (Technion, Haifa): Sets and Logic for CS: How to teach what is useful

Many courses on sets and/or logic teach the old narrative of paradoxes, compactness, (in)completeness of logical systems, etc. In this talk we examine what we think the topics of sets and logic should be, so as to form a basis and toolbox for other courses such as automata and formal languages, databases, verification, formal methods, AI, etc.


George Metcalfe (Universität Bern): Proof Theory for Lattice-Ordered Groups

Proof theory can provide useful tools for tackling problems in algebra. In particular, Gentzen systems admitting cut-elimination have been used to establish decidability, complexity, amalgamation, admissibility, and generation results for varieties of residuated lattices corresponding to substructural logics. However, for classes of algebras bearing some family resemblance to groups, such as lattice-ordered groups, MV-algebras, BL-algebras, and cancellative residuated lattices, the proof-theoretic approach has so far met with only limited success.

The main aim of this talk will be to introduce proof-theoretic methods for the class of lattice-ordered groups and to explain how these methods can be used to obtain new syntactic proofs of two core theorems: namely, Holland's result that this class is generated as a variety by the lattice-ordered group of order-preserving automorphisms of the real numbers, and the decidability of the word problem for free lattice-ordered groups.


Prof. Johann A. Makowsky (Technion, Haifa): Characterizing word functions recognized by weighted automata

Weighted automata have been studied under different names (probabilistic automata, multiplicity automata) since the very beginning of automata theory. They recognize word functions with values in a commutative semiring. We give a new characterization of the functions recognized by weighted automata, and compare our results to various other such characterizations.

Joint work with N. Labai. To appear in GandALF 2013.
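As a hedged illustration of the objects in this abstract: a weighted automaton over a commutative semiring can be presented as an initial weight vector, one transition matrix per letter, and a final weight vector; the recognized word function is then a matrix product. The automaton below is an invented example over the reals, not one from the talk.

```python
# Hypothetical example (not from the talk): a weighted automaton over the
# semiring (R, +, *).  A word w = w1...wn is mapped to the scalar
#   f(w) = alpha . M[w1] . ... . M[wn] . eta.

alpha = [1.0, 0.0]                      # initial weight vector
eta = [0.0, 1.0]                        # final weight vector
M = {                                   # one transition matrix per letter
    'a': [[0.5, 0.5], [0.0, 1.0]],
    'b': [[1.0, 0.0], [0.5, 0.5]],
}

def vec_mat(v, m):
    """Row vector times matrix."""
    return [sum(v[i] * m[i][j] for i in range(len(v)))
            for j in range(len(m[0]))]

def weight(word):
    """Word function recognized by the automaton: a matrix product."""
    v = alpha
    for letter in word:
        v = vec_mat(v, M[letter])
    return sum(vi * ei for vi, ei in zip(v, eta))

print(weight("ab"))   # 0.25 for these made-up matrices
```

Replacing the reals by another commutative semiring (e.g. the Boolean or tropical semiring) changes which word functions are expressible, which is what the characterizations in the talk are about.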


Robert Sedgewick (Princeton University, USA): If You Can Specify It, You Can Analyze It. The Lasting Legacy of Philippe Flajolet

The Flajolet School of the analysis of algorithms and combinatorial structures is centered on an effective calculus, known as analytic combinatorics, for the development of mathematical models that are sufficiently accurate and precise that they can be validated through scientific experimentation. It is based on the generating function as the central object of study, first as a formal object that can translate a specification into mathematical equations, then as an analytic object whose properties as a function in the complex plane yield the desired quantitative results. Universal laws of sweeping generality can be proven within the framework, and easily applied. Standing on the shoulders of Cauchy, Polya, de Bruijn, Knuth, and many others, Philippe Flajolet and scores of collaborators developed this theory and demonstrated its effectiveness in a broad range of scientific applications.
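A minimal sketch of the "specification to equations" step described above (my toy example, not one of the talk's applications): the specification "a binary tree is a leaf, or a root with two subtrees" translates into the generating-function equation B(z) = 1 + z*B(z)^2, and its coefficients, the Catalan numbers, can be extracted by iterating the equation on truncated power series.

```python
# Toy instance of the symbolic method: the specification
# "a binary tree is a leaf, or a root with two subtrees" becomes the
# generating-function equation B(z) = 1 + z*B(z)^2.  Iterating it on
# truncated power series extracts the coefficients (Catalan numbers).
N = 8
B = [1] + [0] * (N - 1)                 # start from the series 1
for _ in range(N):                      # fixed-point iteration, exact to z^(N-1)
    sq = [sum(B[i] * B[k - i] for i in range(k + 1)) for k in range(N)]
    B = [1] + sq[:N - 1]                # coefficients of 1 + z*B(z)^2
print(B)                                # [1, 1, 2, 5, 14, 42, 132, 429]
```

The analytic half of the calculus then studies B(z) as a function in the complex plane to obtain asymptotics; that step is not shown here.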

Flajolet's legacy is a vibrant field of research that holds the key not just to understanding the properties of algorithms and data structures, but also to understanding the properties of discrete structures that arise as models in all fields of science. This talk will survey Flajolet's story and its implications for future research.


Andre Platzer (Carnegie Mellon University): How to Prove Your Robot Safe

Zemanek Room, Favoritenstrasse 11, 1040 Wien

Robots are up and coming and will be everywhere. Nowadays, robots interact more frequently with a dynamic environment outside limited manufacturing sites and they do so in close proximity with humans. That's why we don't want them to misbehave. Safety is key. But it's a difficult problem, because robots move around and how they move around is determined by complicated computer control programs. The combination of both aspects makes them cyber-physical systems.

This talk addresses the question of how we can prove robots to be safe. How can we phrase what safety means? How can we model robots? And, once we have done that, how do we prove that the robot is safe?

We use hybrid system models and formal verification techniques based on differential dynamic logic to prove that the robot's control prevents collisions. We also verify safety despite location and actuator uncertainty. The approach we present is more general and has been used to verify other cyber-physical systems, including cars, trains, aircraft, and surgical robotics systems. The approach features a rich theory, which will, however, not play a major role in this talk.


Sebastian Eberhard (Universität Bern): An implicit characterisation of polynomial time using a truth predicate

Seminar Room Gödel, Favoritenstrasse 9, ground floor, courtyard

Theories of truth over a combinatory algebra have high expressive strength in spite of the simplicity of their axioms. They allow straightforward interpretations of theories of arithmetic and fixed point theories. In this talk, the truth predicate will be used to obtain a natural implicit characterisation of polynomial time.

A theory T_PT of truth over combinatory logic of polynomial strength will be introduced in detail. T_PT has a flexible truth-induction axiom which yields polynomial strength without assuming polynomially growing functions. We describe the proof-theoretic techniques necessary to treat T_PT, and show that T_PT suggests the design of new safe/normal function algebras for polynomial time.


Christoph Benzmüller (Freie Universität Berlin): Utilizing Church's type theory as a universal logic

Seminarraum Gödel, Institutsgebäude (Favoritenstr. 9-11) - EG, Zugang vom Innenhof

Expressive non-classical logics such as quantified modal logics or quantified conditional logics have many potential applications.

In this talk I address the question whether expressive non-classical logics, either individually or flexibly combined, can be mechanized and automated in a generic reasoning framework. This question is relevant beyond the boundaries of computer science and artificial intelligence.

I will outline a solution based on classical higher order logic (Church's type theory). I will show that many expressive non-classical logics are just natural fragments of classical higher order logic and that also flexible combinations of these logic fragments can easily be modeled in it. A particular focus of the talk will be on quantified modal logics and their combinations; quantified conditional logics will also be addressed.

Employing the above approach, off-the-shelf higher order automated theorem provers, like my own LEO-II system, and higher order model finders can be directly applied for reasoning within non-classical logics and their combinations. Interestingly, even automated verification and exploration of meta-level properties of logics and logic combinations is now feasible.


Leonid Libkin (Univ. of Edinburgh): Graph Logics and Relations on Words

Zemanek Hörsaal, Institutsgebäude (Favoritenstr. 9-11) - EG

We investigate some basic questions about the interaction of regular and rational relations on words. The primary motivation comes from the study of logics for querying graph topology, which have recently found numerous applications. Such logics use conditions on paths expressed by regular languages and relations, but they often need to be extended by rational relations such as subword (factor) or subsequence. Evaluating formulae in such extended graph logics boils down to checking nonemptiness of the intersection of rational relations with regular or recognizable relations.

We prove that for several basic and commonly used rational relations, the intersection problem with regular relations is either undecidable (e.g., for subword or suffix, and some generalizations), or decidable with non-multiply-recursive complexity (e.g., for subsequence and its generalizations). These results are used to rule out many classes of graph logics that freely combine regular and rational relations, as well as to provide the simplest problem related to verifying lossy channel systems that has non-multiply-recursive complexity. We then prove a dichotomy result for logics combining regular conditions on individual paths and rational relations on paths, by showing that the syntactic form of formulae classifies them into either efficiently checkable or undecidable cases. We also give examples of rational relations for which such logics are decidable even without syntactic restrictions.


Tanel Tammet (Tallinn University of Technology): Recommender systems create new motivations for probabilistic logic

Seminarraum Gödel, Institutsgebäude (Favoritenstr. 9-11) - EG, Zugang vom Innenhof

We will first present a short overview of our recommender systems www.sightsplanner.com and www.sightsmap.com along with a description of the core technologies used in both. Next we will look into our experiences with automated reasoning tasks in the system along with possible solutions. The last part of the talk brings up motivation and problems with algorithms and systems for reasoning with probabilistic logic.


Egon Boerger (University of Pisa, Italy): The Abstract State Machines Method for Modeling and Analysis of Software-Based Systems

Seminarraum Gödel, Institutsgebäude (Favoritenstr. 9-11) - EG, Zugang vom Innenhof

We survey the mathematical foundation of the method and some of its recent characteristic applications in software engineering.


Byron Cook (Microsoft Research): Proving that programs eventually do something good

Technische Universität Wien, Karlsplatz 13, 1040 Wien, 1st floor, large hall "Festsaal"

Abstract: Software failures can be sorted into two groups: those that cause the software to do something wrong (e.g. crashing), and those that result in the software not doing something useful (e.g. hanging). In recent years automatic tools have been developed which use mathematical proof techniques to certify that software cannot crash. But, based on Alan Turing's proof of the halting problem's undecidability, many have considered the dream of automatically proving the absence of hangs to be impossible. While not refuting Turing's original result, recent research now makes this dream a reality. This lecture will describe this recent work and its application to industrial software.


Sergei Artemov (City University of New York): Justification Logic

ABSTRACT: Justification Logic extends the usual modal logic of knowledge and belief with a component representing justification. The celebrated account of knowledge as "justified true belief," which is attributed to Plato, has long been a focus of epistemic studies. About a half-century ago, the notions of knowledge and belief acquired formalization by means of modal logic. However, the notion of justification, an essential element of epistemic studies, was conspicuously absent, and this led to well-known deficiencies inherent in modal logics of knowledge.

Justification Logic extends the logic of knowledge in three major ways. First, it adds a long-anticipated mathematical notion of justification, making the logic more expressive. We now have the capacity to reason about justifications, simple and compound. We can compare different pieces of evidence pertaining to the same fact. We can measure the complexity of justifications, thus connecting the logic of knowledge to a rich complexity theory, etc. Second, justification logic furnishes a new, evidence-based foundation for the logic of knowledge, according to which "F is known" is interpreted as "F has an adequate justification." Third, justification logic provides a novel, evidence-based mechanism of truth tracking which can be a valuable tool for extracting robust justifications from a larger body of justifications which are not necessarily reliable.


Harvey Friedman (Ohio State University): Concrete Mathematical Incompleteness

Technische Universität Wien, Karlsplatz 13, 1040 Wien, 1st floor, large hall "Festsaal"

Abstract: An unprovable theorem is a theorem about basic mathematical objects that can only be proved using more than the usual axioms for mathematics (ZFC = Zermelo-Fraenkel set theory with the Axiom of Choice), and that has been proved using standard extensions of ZFC generally adopted by the mathematical logic community.

The highlight of the talk is the presentation of unprovable theorems stated in terms of fixed points; cliques in graphs; kernels in directed graphs.

We first review some previous examples of unprovable theorems.

Examples 1-5 are unprovable in the weaker sense that any proof demonstrably requires some use of logical principles transcendental to the problem statement. Example 6 is from BRT (Boolean Relation Theory).

  1. Patterns in finite sequences from a finite alphabet.

  2. Pointwise continuous embeddings between countable sets of reals (or more concretely, rationals).

  3. Relations between f(n_1,...,n_k) and f(n_2,...,n_k+1).

  4. Homeomorphic embeddings between finite trees.

  5. Borel sets in the plane and graphs of one dimensional Borel functions.

  6. Boolean relations between sets of integers and their images under integer functions.


Mehrnoosh Sadrzadeh (Oxford): Modal Logic and Reasoning about Information Flow

1040 Wien, Seminarraum "Gödel", Favoritenstr. 9

Joint Lecture of the Kurt Goedel Society and the Wolfgang Pauli Institute (WPI) Vienna

Abstract: By adding one unary connective to propositional logic, a modality, mysterious connections are formed. The logic becomes a decidable fragment of first order logic, whose semantics correspond to topological frames. Modalities have formalized a variety of concepts such as belief, time, and provability. The epistemic applications, especially recent trends of reasoning about information update, are mostly based on classical settings and their computations use the full power of negation.

Questions arise. Theoretically: can modal epistemic computations become constructive? On the foundational side: can dual modalities be defined in a positive setting? In practice: can positive modalities still model useful concepts?

I will show that by using methods from algebraic and categorical logic, pairs of positive adjoint modalities can be defined. These come equipped with simple unfolding rules and can interpret dynamic and epistemic concepts. I will show how they are applied to reason about information flow in a wide range of scenarios, from our daily lives when we learn from communication, to artificial intelligence for location detection in robot navigation.


Moshe Y. Vardi (Rice University): And Logic Begat Computer Science: When Giants Roamed the Earth

Technische Universität Wien, Karlsplatz 13, 1040 Wien, 1st floor, large hall "Festsaal"

During the past fifty years there has been extensive, continuous, and growing interaction between logic and computer science. In fact, logic has been called "the calculus of computer science". The argument is that logic plays a fundamental role in computer science, similar to that played by calculus in the physical sciences and traditional engineering disciplines. Indeed, logic plays an important role in areas of computer science as disparate as architecture (logic gates), software engineering (specification and verification), programming languages (semantics, logic programming), databases (relational algebra and SQL), artificial intelligence (automated theorem proving), algorithms (complexity and expressiveness), and theory of computation (general notions of computability). This non-technical talk will provide an overview of the unusual effectiveness of logic in computer science by surveying the history of logic in computer science, going back all the way to Aristotle and Euclid, and showing how logic actually gave rise to computer science.


Dale Miller (INRIA Saclay, LIX/Ecole Polytechnique): Towards a broad spectrum proof certificate

Abstract: Proof assistants and automated theorem provers generally produce evidence of a successful proof in an assortment of (often ad hoc) formats. Unfortunately, the evidence generated by one proof is seldom useful for another prover or even for a future version of itself. I will outline some recent work on providing a single language for proof certificates for which a simple proof checker can be built and for which a wide range of common proof formats (sequent calculus, natural deduction, tableau, and tabled deduction) can be hosted. A proof checker for proof certificates can then immediately be a proof checker for all these hosted proof systems. Such proof certificates should make it possible for any theorem prover to trust proofs from any other theorem prover and for markets and libraries of proofs to become possible.


Jeremy Avigad (Carnegie Mellon University): Understanding, formal verification, and the philosophy of mathematics

Abstract: The philosophy of mathematics has long been focused on determining the methods that are appropriate for justifying claims of mathematical knowledge, and the metaphysical considerations that make them so. But, as of late, a number of philosophers have noted that a much broader range of normative judgments arise in ordinary mathematical practice; for example, questions can be natural, theorems important, proofs explanatory, concepts powerful, and so on. Such judgments are often viewed as providing assessments of mathematical understanding, something more complicated and mysterious than mathematical knowledge.

Meanwhile, in a branch of computer science known as "formal verification," interactive proof systems have been developed to support the construction of complex formal axiomatic proofs. Such efforts require one to develop models of mathematical language, inference, and proof that are more elaborate than the simple foundational models of the last century. In this talk, I will explain how these models illuminate various aspects of mathematical understanding, and discuss ways that such work can inform, and be informed by, a more robust philosophy of mathematics.


Hans de Nivelle (Institute of Computer Science, University of Wroclaw, Poland)

Abstract: We give a natural semantics for classical logic with partial functions. The semantics is based on multi-valued logic, so that formulas involving undefined values can have undefined truth values. An unusual aspect of our semantics is that it relies on the order of formulas. In this way, the semantics is able to capture the fact that functions must be declared before they are used.

We will introduce the notion of sequent. We argue that infinite sequents are more natural than finite sequents for our semantics. We introduce a sequent calculus for the infinite sequents, and prove the calculus sound and complete with respect to the semantics given before.

We think that our approach to partial functions is more natural than existing approaches, because in our approach, formulas involving undefined values are guaranteed to be undefined. This captures the natural behaviour of programs containing undefined values. The outcome of such programs should be undefined, even if an outcome could be deduced in principle.


Kazushige Terui (Kyoto University): Algebraic proof theory for substructural logics

Abstract of the author:

I will outline an algebraic and systematic approach to proof theory for substructural logics, which is being developed together with A. Ciabattoni, N. Galatos and L. Strassburger. In proof theory, one transforms Hilbert axioms into Gentzen rules to obtain cut-elimination theorems. In algebra, one embeds a given algebra into a complete one for various purposes. It is not surprising that these two are deeply related, since both are concerned with conservative extension. The connection is, however, even tighter than expected. For one thing, the passage from Hilbert axioms to Gentzen rules plays a crucial role for algebraic completion too. For another, Maehara-Okada cut-elimination can be construed as generalization of Dedekind-MacNeille completion. To illustrate the situation, we introduce a hierarchy on nonclassical axioms. One of our goals is then to identify a boundary in the hierarchy, below which proof theory (cut-elimination) works well and above which one finds negative algebraic results (failure of completions).

In the first talk, I will give a gentle introduction to the theory of Dedekind-MacNeille completions and algebraic cut-elimination, presumably for both sequent and hypersequent calculi. In the second talk, I will introduce the substructural hierarchy and show how it naturally classifies axioms in nonclassical logics, and apply the algebraic method to obtain uniform cut-elimination theorems for logics whose axioms are located at a low level of the substructural hierarchy.


Natasha Sharygina (Lugano, Carnegie Mellon): The Synergy of Precise and Fast Abstractions for Program Verification

Abstract of the author:

Predicate abstraction is a powerful technique to reduce the state space of a program to a finite and affordable number of states. It produces a conservative over-approximation where concrete states are grouped together according to the predicates. Given a set of predicates, a precise abstraction contains the minimal set of transitions, but as a result it is computationally expensive. Most model checkers therefore approximate the abstraction to alleviate the computation of the abstract system by trading off precision with cost. However, approximation results in a higher number of refinement iterations, since it can produce more false counterexamples than its precise counterpart. The refinement loop can become prohibitively expensive for large programs.
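The grouping of concrete states described above can be illustrated in a few lines (a toy with invented predicates, not Sharygina's algorithm): each concrete state maps to the truth values of a fixed predicate set, and concrete transitions lift to an over-approximating abstract relation.

```python
# Toy illustration of predicate abstraction (invented predicates, not the
# talk's technique): a concrete integer state is abstracted to the tuple of
# truth values of a fixed set of predicates.
preds = [lambda x: x > 0, lambda x: x % 2 == 0]

def abstract(state):
    """Abstract state: the bit-vector of predicate values."""
    return tuple(p(state) for p in preds)

# The concrete transition x -> x + 1, sampled on a finite window, induces an
# abstract transition relation that over-approximates the concrete behaviour
# (it may admit spurious abstract runs, hence the need for refinement).
abstract_trans = {(abstract(x), abstract(x + 1)) for x in range(-10, 10)}
assert (abstract(-1), abstract(0)) in abstract_trans
```

Computing the *minimal* such relation precisely is what makes exact abstraction expensive; the imprecise variants mentioned in the abstract settle for a larger relation that is cheaper to build.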

In this talk I will present a new abstraction refinement technique that combines slow and precise predicate abstraction techniques with fast and imprecise ones. It allows computing the abstraction quickly, but keeps it precise enough to avoid too many refinement iterations. We implemented the new algorithm in a software model checker. Our tests with various real-life benchmarks show that the new approach systematically outperforms the classical approaches based on precise and imprecise techniques.

This is joint work with my students Aliaksei Tsitovich and Stefano Tonetta.


Marysia Spychalska (Warsaw University): Reasoning with quantifiers "most" and "some" - empirical evidence

We report on our experimental results concerning reasoning with the quantifiers "some" and "most" in natural language. We focus on differences between logically correct inferences and the inferences that are accepted or produced by people in everyday reasoning. According to the classical approach, both "some" and "most" have existential import, in the sense that they are false in empty domains; a sentence with "all", by contrast, can be vacuously satisfied. Thus, whereas "Some A's are B" can be inferred from "Most A's are B", neither of these sentences can be inferred from a universal sentence unless we assume that A and B are non-empty. On the other hand, it seems that in natural language the use of the informationally weaker "some" or "most" by a speaker strongly suggests that a sentence with the informationally stronger quantifier "all" is false.

What is more, since "some" is informationally weaker than "most", the use of "some" suggests that a sentence with "most" is false. However, the stronger items logically imply the weaker ones, so "all" implies both "most" and "some" (under the additional assumption that the domains are non-empty). In our experiments we checked whether people's inferring "some" (resp. "most") from "all" depends on the character of the domains (empty vs. non-empty terms). We also analyzed people's readiness to accept "not all" as a conclusion from sentences with "some" or "most", and "not most" as a conclusion from sentences with "some". In this talk we report on our observations and propose an explanation of our results.
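The logical relationships discussed in this abstract can be checked on finite models, assuming the standard generalized-quantifier readings (an assumption of this sketch, not part of the talk):

```python
# Standard generalized-quantifier readings (assumed for this illustration):
#   all(A,B):  A is a subset of B;   some(A,B): A and B intersect;
#   most(A,B): more than half of A is in B, i.e. |A & B| > |A - B|.
def all_q(A, B):
    return A <= B

def some_q(A, B):
    return bool(A & B)

def most_q(A, B):
    return len(A & B) > len(A - B)

A, B = {1, 2, 3}, {1, 2, 3, 4}
assert all_q(A, B) and most_q(A, B) and some_q(A, B)  # nonempty A: all => most => some

empty = set()
assert all_q(empty, B)        # "all" is vacuously true on an empty domain...
assert not some_q(empty, B)   # ...while "some" and "most" fail there,
assert not most_q(empty, B)   # blocking the inference without non-emptiness
```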


Makoto Tatsuta (NII, Tokyo), Non-Commutative First-Order Sequent Calculus

Abstract: This talk investigates a non-commutative first-order sequent calculus NCLK. First, the talk extends a non-commutative positive fragment to a full first-order sequent calculus LK- having antecedent grouping and no right exchange rule. The talk then shows that (1) NCLK is equivalent to LJ, (2) NCLK with the exchange rule is equivalent to LK, and (3) LK- is equivalent to LJ, and (4) gives translations between LK- and NCLK.


Walter Carnielli (State University of Campinas, University of Luxembourg): Polynomials as a proof method: a "rendez-vous" between logic and algebra

The notion of truth-functionality seems to be essential for the analysis of logic systems, and the so-called non-truth-functional logics pose a challenge to the usual algebraically-based semantic tools.

I show that, by generalizing the Boolean setting towards polynomials over finite rings (in particular, Galois fields), we can in some sense bypass some difficulties of non-truth-functionality and offer a new view of abstract truth-values, which brings out a unity between logic and algebra, in line with some intuitions of Boole and Leibniz.

As motivating examples, I will show how semantics for several logics (such as the ones by N. Belnap and J. M. Dunn, K. Gödel, J. Lukasiewicz and M. Sette) can be expressed by multivariable polynomials over Boolean rings, with happy consequences.
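As a hedged example in the spirit of this abstract (the encoding below is the standard one for classical two-valued logic, not one of the talk's multi-valued cases): over the Boolean ring GF(2), conjunction is multiplication and exclusive-or is addition, and a formula is a tautology exactly when its polynomial is constantly 1.

```python
# Classical two-valued logic as polynomials over the Boolean ring GF(2).
# AND is ring multiplication, XOR is ring addition, NOT x is 1 + x.
def NOT(x):
    return (1 + x) % 2

def AND(x, y):
    return (x * y) % 2

def OR(x, y):
    return (x + y + x * y) % 2       # x OR y = x XOR y XOR (x AND y)

def IMPLIES(x, y):
    return OR(NOT(x), y)

# A formula is a tautology iff its polynomial evaluates to 1 everywhere;
# here we check Peirce's law ((p -> q) -> p) -> p on all points of GF(2)^2.
assert all(IMPLIES(IMPLIES(IMPLIES(p, q), p), p) == 1
           for p in (0, 1) for q in (0, 1))
```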

These ideas naturally apply to non-truth-functional paraconsistent logics (in particular to the LFI's, or `logics of formal inconsistency') which are uncharacterizable by finite-valued semantics.

Some challenging tasks, yet to be done, are to extend this method to full quantified logic, to higher-order logic and to modal logic, and to evaluate its connections to abstract algebraic logic.


Sergei Artemov (City University of New York): Logic of Justifications

We describe a general logical framework, Justification Logic, for reasoning about epistemic justification. Justification Logic extends epistemic modal logic by justification assertions t:F that read "t is a justification for F" and absorbs basic principles originating from both the mathematical theory of proofs and mainstream epistemology. We argue that Justification Logic is a natural formal framework for studying the well-known Justified True Belief vs. Knowledge problem which can be traced back to Plato. We state a general correspondence Theorem showing that behind each epistemic modal logic, there is a robust system of justifications. This renders a new, evidence-based foundation for epistemic logic. As a case study, we formalize the paradigmatic Gettier examples in Justification Logic and reveal their hidden assumptions and redundancies.
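A toy sketch (my own illustration, not Artemov's formal system) of how compound justifications arise: the application principle combines s:(F -> G) and t:F into (s*t):G.

```python
# Toy sketch of the application principle of Justification Logic:
# from s:(F -> G) and t:F we may form (s*t):G.  Justified assertions are
# a map {term: formula}; formulas are atoms (strings) or ('->', F, G).
def apply_just(just):
    """One step of closure under the application principle."""
    new = dict(just)
    for s, f in just.items():
        if isinstance(f, tuple) and f[0] == '->':
            for t, g in just.items():
                if g == f[1]:                    # t justifies the antecedent
                    new[f"({s}*{t})"] = f[2]     # so s*t justifies the consequent
    return new

just = {'s': ('->', 'F', 'G'), 't': 'F'}
closed = apply_just(just)
assert closed['(s*t)'] == 'G'
```

Note how, unlike the modal assertion "G is known", the term (s*t) records *which* evidence the knowledge of G rests on.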


Ana Sokolova (University of Salzburg): Bisimilarity and trace as instances of the same thing (generic coalgebraic trace theory)

This is joint work with Ichiro Hasuo (University of Nijmegen, The Netherlands, and Kyoto University, Japan) and Bart Jacobs (University of Nijmegen, The Netherlands).

Labelled transition systems (LTS) are a common model of concurrent systems. Semantic equivalences on LTS provide notions of behaviour. A way to verify correctness of a concurrent system with respect to a given semantics ~ is to model both the system (Sys) and the correctness property (Spec) as LTS and show that Sys ~ Spec. The spectrum of behaviour equivalences is rich, from the strongest---bisimilarity, to the weakest---trace semantics.

We work in the abstract setting of coalgebras which are generic (category theoretic) transition systems. They come equipped with a generic notion of bisimilarity obtained by coinduction. However, defining other semantics for coalgebras is not straightforward. We developed a theory of generic traces (applicable to certain coalgebras in an order enriched setting) which shows that trace equivalence is also obtained by coinduction while shifting the base category to a Kleisli category. The theory instantiates to both (possibilistic, standard) LTS and to probabilistic transition systems.

I will start the talk with a mild introduction to coalgebras and their behaviour semantics. Further, I will present the results leading to the generic trace theory.


Peter Schuster (LMU München) -- Finite methods in commutative algebra

A partial realisation of Hilbert's programme in (commutative) algebra has proved practicable in the preceding years. It means working exclusively with finite methods, and without ideal objects such as prime ideals, whenever the data under consideration are sufficiently concrete. In other words, constructive proofs are sought, and found, during which one need not go beyond the type level on which the theorems are formulated. This approach, of a point-free and syntactical flavour, gained power when ideas from formal topology and from dynamical algebra were combined. As stressed by Wraith, its success is guaranteed by Barr's completeness theorem for geometric logic, in which large parts of algebra can be formulated.

One of the milestones was the elementary and inductive characterisation of Krull dimension given by Coquand, Lombardi, and Roy. This made possible, among other things, a constructive proof on low type levels of the theorem of Eisenbud-Evans-Storch, which says that d+1 polynomials suffice to generate, up to radicals, any finitely generated ideal of univariate polynomials with coefficients in a ring of Krull dimension d. (This generalises the fact that every variety in dimension d is an intersection of d hypersurfaces, for which Kronecker still needed d+1 hypersurfaces.)

As a typical theorem with concrete input and concrete output, the Eisenbud-Evans-Storch theorem merits a proof exclusively done by finite methods. The proof we have produced with Coquand and Lombardi further provided deeper insight into the topological character of the problem, and prompted a variant of Kaplansky's regular element property which is of some interest on its own. We take this theorem and its proof as a guiding example for our survey, and will indicate some possible future work.


Prof. Makoto Tatsuta (NII, Tokyo) -- Substitution Theorem

This talk presents proofs of two theorems: the Substitution Theorem and a characterization of persistently strongly normalizing terms. The Substitution Theorem concerns the usual lambda calculus (untyped lambda calculus with beta-reduction). We say a term M is strongly normalizing if it terminates under any reduction strategy. A simple instance of the Substitution Theorem is that, for any term M, MXYZ is strongly normalizing for all strongly normalizing terms X, Y, Z if MXXX is strongly normalizing for every strongly normalizing term X. This means that in order to prove termination of a 3-place function f(x,y,z) in a functional programming language, it is sufficient to show termination of the function call f(x,x,x). A term M is called persistently strongly normalizing if MX_1 ... X_n is strongly normalizing for any n and any strongly normalizing terms X_1, ..., X_n. A type-theoretic characterization of these terms was a long-standing open question; this talk solves the problem using the Substitution Theorem. This is joint work with Mariangiola Dezani-Ciancaglini (Torino University).


Hans de Nivelle -- Translation-Based Decision Procedures for Modal Logics

The talk is concerned with methods for translating modal logics into the guarded fragment. First, I explain the standard translation of modal logic K into the guarded fragment. (The guarded fragment is a decidable subset of first-order logic.) I then give an extended translation method with which the modal logic S4 can also be translated.
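The standard translation of K can be sketched recursively (an illustrative toy with naive fresh-variable handling, not the extended method from the talk); note how every quantifier it produces is guarded by the accessibility atom R, which is why the image lies in the guarded fragment:

```python
# Illustrative sketch of the standard translation ST_x of modal logic K
# into first-order logic.  Formulas are nested tuples, e.g. ('box', ('p',))
# for []p; atoms are 1-tuples like ('p',).
def st(formula, x):
    op = formula[0]
    if op == 'box':                    # ST_x([]A) = forall y (R(x,y) -> ST_y(A))
        y = x + "'"
        return f"forall {y} (R({x},{y}) -> {st(formula[1], y)})"
    if op == 'dia':                    # ST_x(<>A) = exists y (R(x,y) & ST_y(A))
        y = x + "'"
        return f"exists {y} (R({x},{y}) & {st(formula[1], y)})"
    if op == 'not':
        return f"~{st(formula[1], x)}"
    if op == 'and':
        return f"({st(formula[1], x)} & {st(formula[2], x)})"
    return f"{op}({x})"                # propositional atom

print(st(('box', ('dia', ('p',))), 'x'))
# forall x' (R(x,x') -> exists x'' (R(x',x'') & p(x'')))
```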

During the rest of the talk I address the question of which modal logics this extended translation method can be applied to.

Every modal logic that has a regular frame condition is translatable. However, if a modal logic has a non-regular frame condition, this does not necessarily mean that the logic has no translation. Most logics can be defined by several frame conditions; for the translation it suffices that one of the frame conditions is regular.

I give a language-theoretic characterization that determines exactly when two frame conditions define the same modal logic. I then show that if a modal logic has a regular frame condition, its maximal frame condition is also regular. This means that, to find out whether a modal logic is translatable, one only needs to test the maximal frame condition for regularity.

I have written a program that proposes regular automata for maximal frame conditions. These automata are correct with high probability, but the algorithm is only probabilistic. The program proposes regular automata for a great many modal logics, but at present I have no correctness proof for most of them. In the future I intend to improve the program (its running time is currently rather long) and to supply the final proofs that the automata are correct. I will also not give up the search for a semi-decision procedure that is able to find automata guaranteed to be correct.


Dr.Nikolaos Galatos (JAIST, Japan) --- Cut elimination for substructural logics from a semantical perspective

Substructural logics include classical, intuitionistic, linear, relevant, many-valued and fuzzy logic. Most of these logics are axiomatic extensions of FL, the logic associated with the full Lambek calculus. The algebraic semantics for FL is the variety of (pointed) residuated lattices. The latter include purely algebraic structures like lattice-ordered groups and ideal lattices of rings, in addition to those that come from logic: Boolean, Heyting and MV-algebras.

The algebraic study of substructural logics over FL has been successful and this may be one of the reasons why relational semantics have not attracted a lot of attention. Another reason is that the various relational semantics that have been introduced by different authors have not had many applications.

In this talk we will introduce Kripke-style semantics, called residuated frames, for substructural logics and show how they are related to residuated lattices. Moreover, we will show how we can obtain a cut elimination (or rather cut admissibility) result at the level of frames that implies the cut elimination of FL and of two more algebraic variants. The language of frames brings to the surface a fundamental property of the underlying relation that is related to cut elimination. We use this to construct a Gentzen system for 'classical FL'. This is a Gentzen system with classical sequents and two distinct (due to the non-commutative nature of FL) negation operations. The cut elimination of the system is stated and proved at the frame level. The decidability of the Gentzen system and of the equational theory of involutive residuated lattices follow from this result.


Prof. Lenore Blum (School of Computer Science, Carnegie Mellon) ---Computing over the Reals: Where Turing Meets Newton

The classical (Turing) theory of computation has been extraordinarily successful in providing the foundations and framework for theoretical computer science. Yet its dependence on 0’s and 1’s is fundamentally inadequate for providing such a foundation for modern scientific computation where most algorithms --with origins in Newton, Euler, Gauss, et al.-- are real number algorithms.

In 1989, Mike Shub, Steve Smale and I introduced a theory of computation and complexity over an arbitrary ring or field R. If R is Z_2 = {0,1}, the classical computer science theory is recovered. If R is the field of real numbers, Newton’s algorithm, the paradigm algorithm of numerical analysis, fits naturally into our model of computation.
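As a reminder of the kind of real-number algorithm that this model accommodates natively, here is a standard Newton iteration (a generic sketch, not code from the talk). In the BSS model over the reals, each iteration step counts as a handful of exact arithmetic operations rather than a bit-level computation.

```python
# Newton's method: iterate x <- x - f(x)/f'(x) until f(x) is small.
def newton(f, df, x0, tol=1e-12, max_iter=50):
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            break
        x -= fx / df(x)
    return x

# Approximate sqrt(2) as the positive root of f(x) = x^2 - 2.
root = newton(lambda x: x * x - 2, lambda x: 2 * x, 1.0)
```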

Complexity classes P, NP and the fundamental question "Does P= NP?" can be formulated naturally over an arbitrary ring or field R. The answer to the fundamental question depends on the complexity of deciding feasibility of polynomial systems over R. When R is Z_2, this becomes the classical satisfiability problem of Cook-Karp-Levin. When R is the field of complex numbers, the answer depends on the complexity of Hilbert’s Nullstellensatz.

The notion of reduction between discrete problems (e.g. between traveling salesman and satisfiability) has been a powerful tool in classical complexity theory. But now, in addition, the transfer of complexity results between domains (discrete and continuous) becomes a real possibility.

In this talk, I will discuss these results and indicate how basic notions from numerical analysis such as condition, round off and approximation are being introduced into complexity theory, bringing together ideas germinating from the real calculus of Newton and the discrete computation of computer science. This nexus provides a veritable meeting between Turing and Newton.


Dr. Pavel Hrubes (Mathematical Institute of the Academy of Sciences of the Czech Republic) -- Lower bounds for modal logics

We show that the basic system of modal logic K has a form of monotone interpolation, and hence obtain an exponential lower bound on the lengths of proofs in K. The result immediately extends to K4, Gödel-Löb logic, S and S4. However, it is not immediately applicable to S5. Indeed, we show that S5 has an unbounded speed-up over K.


Dr. Carsten Sinz (University of Tübingen) - Proof Compression by Extended Resolution

Propositional logic decision procedures lie at the heart of many applications in hard- and software verification, artificial intelligence and automatic theorem proving. In many practical applications, however, it is not sufficient to obtain a yes/no answer from the decision procedure. Either a model, representing a sample solution, or a justification (i.e. a proof) of why the formula possesses none is required.

In this talk I will demonstrate the use of powerful proof systems (e.g. Extended Resolution) as a unifying means to compactly and concisely represent propositional logic proofs obtained from the application of different Boolean logic algorithms (e.g. Davis-Putnam, Binary Decision Diagrams).
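The characteristic step of Extended Resolution is introducing a fresh variable as a name for a subformula. The following sketch (our illustration, not the speaker's code) shows the clausal definition of x <-> (a AND b) and checks by brute force that the three clauses are satisfied whenever x takes the value of a AND b. Since each such definition costs only three clauses, proofs in Extended Resolution can be exponentially shorter than plain resolution proofs.

```python
from itertools import product

# Extension step: a fresh variable x (index 3) abbreviates a & b
# (indices 1, 2). The three clauses encode x <-> (a & b) in CNF.
definition = [(-3, 1),        # x -> a
              (-3, 2),        # x -> b
              (3, -1, -2)]    # (a & b) -> x

def satisfied(clause, assign):
    """A clause holds if some literal is true under the assignment."""
    return any((lit > 0) == assign[abs(lit)] for lit in clause)

def defines_and():
    """Check the definition clauses under all values of a and b."""
    for a, b in product([False, True], repeat=2):
        assign = {1: a, 2: b, 3: a and b}
        if not all(satisfied(c, assign) for c in definition):
            return False
    return True
```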


Set Theory Workshop at the Kurt Gödel Research Center for Mathematical Logic

The Kurt Gödel Society and the Kurt Gödel Research Center for Mathematical Logic (Institute for Logic of the University of Vienna) invite you to the Set Theory Workshop.

For further information please contact Sy Friedman (sdf@logic.univie.ac.at)


Thomas Forster (Cambridge) -- The iterative concept of set and its importance for computer science, 15:00-16:00, Bernd Buldt (Konstanz) -- Intensionality revisited and its importance for computer science

The Kurt Gödel Society and the Department of DBAI (TU Wien) invite you to lectures by:

Thomas Forster - Cambridge, "The iterative concept of set and its importance for computer science"

Bernd Buldt - Konstanz, "Intensionality revisited and its importance for computer science"

The lectures will be given in Seminarraum 184/2 of the Institut für Informationssysteme, Favoritenstr. 9-11, Staircase 3, 3rd floor, access via the ramp, Room HF0309.


Prof. Hiroakira Ono (School of Information Science, JAIST, Japan) -- Modalities in substructural logics: a preliminary report

There have been some attempts to introduce modalities in substructural logics. But the selection of axioms for modalities sometimes looks ad hoc, since the non-modal logical connectives in substructural logics are not related to each other as they are in classical logic.

We propose a way --- a proper way, we hope --- of introducing modalities over full Lambek logic FL. In algebraic terms, our modal operators are "conuclei" on residuated lattices.

Using these modalities, the Gödel translation can be naturally extended to one from substructural logics over FL to modal substructural logics over our FLS4. Moreover, Girard's translation of intuitionistic logic into intuitionistic linear logic with the exponential can be obtained as a particular case of our result.


Richard Zach -- Algebraic Semantics for Logics of Vagueness, 16:45-17:30, Libor Behounek -- On the motivation of fuzzy logic and 17:30-18:15, Petr Cintula -- Weakly implicative fuzzy logics - new results

Richard Zach (University of Calgary, Canada)

"Algebraic Semantics for Logics of Vagueness"

In his recent paper "True, Truer, Truest", Brian Weatherson has argued for classical logic as the logic of vagueness by obtaining it from an algebraic (order-theoretic) approach to vagueness based on a primitive relation of "truer than", which yields a semantics in terms of Boolean lattices. The approach is similar--both in spirit and in the formal details--to approaches to vagueness based on other ordered structures, such as MV or Gödel algebras. The basic philosophical methodology of such an approach is, however, problematic in principle. Moreover, the details of the algebraic structures used are in conflict with some of the underlying pre-theoretic intuitions about "truer than".

Libor Behounek (Institute of Computer Science of the Academy of Sciences of the Czech Republic)

"On the motivation of fuzzy logic"

The traditional [0,1] motivation falls short of explaining several features of fuzzy logic. An independent motivation based on general pre-theoretical assumptions will be given in the talk, and some features of fuzzy logic addressed (truth-functionality, non-idempotency of strong conjunction, etc.).
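As a concrete instance of the non-idempotency of strong conjunction, the Łukasiewicz strong conjunction (a standard example, chosen by us to illustrate the point) is truth-functional but not idempotent:

```python
# Lukasiewicz strong conjunction on [0,1]: x & y = max(0, x + y - 1).
def strong_conj(x, y):
    return max(0.0, x + y - 1.0)
```

Here `strong_conj(0.5, 0.5)` is 0.0, so conjoining a half-true statement with itself yields less truth than the statement alone, unlike the idempotent min-conjunction.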


Mark van Atten --The hypothetical judgement in the history of intuitionistic logic

Three points will be argued for:

  1. Although Heyting and Kolmogorov both accept Ex Falso as a formal rule, Heyting's justification of it would not be acceptable to Kolmogorov; Heyting's claim that his interpretation and Kolmogorov's are essentially the same is not correct; and Kolmogorov's own justification is the more reasonable one of the two.

  2. It has been claimed that a discussion in Brouwer's dissertation shows that he at the time rejected the hypothetical judgement. This is not the case, and in fact that discussion contains an account of such judgements that served him all his life. On this account, hypothetical judgements may in certain cases have false antecedents, but there is no justification of the general principle Ex Falso.

  3. Neither Heyting's nor Kolmogorov's justification of Ex Falso fits into Brouwer's conception of logic. Contrary to what the name BHK-interpretation suggests, there never was a single understanding of logic that Brouwer, Heyting and Kolmogorov all agreed on.


Arnold Beckmann (Department of Computer Science University of Wales) --Uniform proof-complexity

Inspired by Cook's NP vs. coNP program, we will introduce uniform proof systems, which are subsets of first-order formulas defined via translations to propositional proof systems. We will describe their basic properties, discuss their relationship to Cook's program, and explain some of their more advanced features.


Kazushige Terui (National Institute of Informatics, Tokyo) -- Which Structural Rules Admit Cut Elimination? An Algebraic Criterion

Consider a general class of structural inference rules such as exchange, weakening, contraction and their generalizations. Among them, some are harmless but others do harm to cut elimination. Hence it is natural to ask under which condition cut elimination is preserved when a set of structural rules is added to a structure-free logic.

The aim of this talk is to give such a condition by using algebraic semantics.

We consider full Lambek calculus (FL), i.e., intuitionistic logic without any structural rules, as our basic framework. Residuated lattices are the algebraic structures corresponding to FL. In this setting, we introduce a criterion, called the propagation property, that can be stated in both syntactic and algebraic terminology. We then show that, for any set R of structural rules, the cut elimination theorem holds for FL enriched with R if and only if R satisfies the propagation property.

As an application, we show that any set R of structural rules can be "completed" into another set R', so that the cut elimination theorem holds for FL enriched with R', while provability remains the same.

The paper is available at

research.nii.ac.jp/~terui/cut.pdf


Roy Dyckhoff (University of St Andrews) -- Focused Proof Systems in Intuitionistic Logics

We survey results about several proof systems for intuitionistic logic, beginning with a paper by Vorob'ev and including work of Hudelmaier and Herbelin. All have the property that, in our notation, there are two kinds of sequent, in one of which the formula in a special place, here called the focus, is the only possibility for use as principal formula. Inter alia we present some strengthenings of classical (but not well-known) results and an application to Jankov's logic of weak excluded middle, complementing by proof-theoretic methods results of Fiorino et al. in Milan.


Nicola Olivetti (University of Torino, Italy) - Belief revision and Conditional Logic

Abstract: Belief revision and conditional logics are two active areas of research in Artificial Intelligence, Philosophy, and Computer Science. Gärdenfors formulated a very natural relation between belief revision systems (governed by the AGM postulates) and conditional logics via the so-called Ramsey Test. However, Gärdenfors himself proved a Triviality Result according to which such a relation cannot hold for any significant revision system. After Gärdenfors's negative result, the problem of the relations between the two areas was studied for more than fifteen years and has stimulated a wide literature.

In this talk we show that the correspondence between conditional logics and belief revision established by Gärdenfors's Ramsey Test can be safely preserved, and the Triviality Result avoided, once we weaken the revision postulates in a very intuitive way. Moreover, we can derive a conditional logic, called BCR, from the revision postulates via the Ramsey Test. This logic has an axiomatization, a standard selection-function model semantics, and it is decidable. It turns out that there is an isomorphism between belief revision systems and selection-function models of BCR. We can therefore claim that the logic BCR provides a logical formalization of belief revision in the (object-) language of conditional logic. This formalization could be a starting point for automating reasoning about revision.


Yuri Matijasevich (St. Petersburg): Hilbert's tenth problem today: main results and open questions

The Kurt Gödel Society and the Kurt Gödel Research Center for Mathematical Logic (Institute for Logic of the University of Vienna) invite you to a lecture by

Yuri Matijasevich

from the Steklov Institute, St. Petersburg, on

Hilbert's tenth problem today: main results and open questions

The lecture will be given in the Seminarraum of the Institute on

Monday, 25. October 2004 at 16:00

preceded by a tea at 15:30 in the Besprechungsraum of the institute, Währinger Straße 25, A-1090 Wien.


Piotr Wojtylak (Katowice), ``An operation on inferential rules'', place: Seminar room Favoritenstrasse

Our aim is to give a frugal formalization of classical logic based on certain ideas contained in the proof of Hilbert's ``Praemissentheorem'', see [1]. An operation on inferential rules, called the H-operation, is used to minimize the axiom basis for classical logic. Given two inferential rules, say r and r', we can use it to derive another rule r(r'). The definition of the operation is due to Pogorzelski [2]. Our attempt is to incorporate the H-operation into the set of usual consequence conditions as given by Tarski. The resulting proof system might be regarded as a kind of ``reverse logic''. The H-operation applies to classical propositional as well as to quantifier logic. But it is not easy to give a natural (that is, considered in the literature) non-classical system which is closed under the operation.

[1] Hilbert D., Bernays P., Grundlagen der Mathematik, Bd II, Supplement III, Springer Verlag, Berlin 1939.

[2] Pogorzelski W.A., On Hilbert's operation on logical rules I, Reports on Mathematical Logic 12 (1981), 35-50.


George Metcalfe (King's College, London, UK): Uninorm Based Fuzzy Logics

Uninorms are a generalization of the t-norm and t-conorm operators commonly used to interpret ``ands'' and ``ors'' in fuzzy logics, i.e. commutative, associative, increasing binary functions on the real unit interval [0,1], with an identity element e that can appear anywhere in the interval (e=1 giving a t-norm, e=0 giving a t-conorm). The idea of this talk is to generalize the well-known construction of fuzzy logics based on t-norms to fuzzy logics based on uninorms. To this end we give an axiomatization for a new basic fuzzy logic U, which can be viewed both as monoidal t-norm based fuzzy logic MTL without weakening, and as a kind of ``linear linear logic''. We then investigate axiomatic extensions of U which correspond to logics based on particular classes of uninorms, including a logic of idempotent uninorms which turns out to be the relevant logic RM, and a logic of continuous uninorms related to the combining function of the expert system MYCIN.
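To illustrate what a uninorm looks like, here is a standard construction (our sketch, not from the talk): a rescaled product t-norm below the identity e, a rescaled probabilistic-sum t-conorm above it, and min on the mixed region.

```python
def uninorm(x, y, e=0.5):
    """A conjunctive uninorm on [0,1] with identity element e."""
    if x <= e and y <= e:                       # rescaled t-norm on [0, e]
        return e * (x / e) * (y / e)
    if x >= e and y >= e:                       # rescaled t-conorm on [e, 1]
        u, v = (x - e) / (1 - e), (y - e) / (1 - e)
        return e + (1 - e) * (u + v - u * v)
    return min(x, y)                            # mixed region
```

Note that e really is the identity (uninorm(x, e) == x for all x): values below e combine conjunctively, values above e disjunctively, and mixed arguments are resolved by min, making this a conjunctive uninorm.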