Nordic Online Logic Seminars

The Nordic Online Logic Seminar (NOL Seminar) is organised monthly over Zoom, with expository talks on topics of interest for the broader logic community. The seminar is open to professional and aspiring logicians and logic aficionados worldwide.

The tentative time slot is Mondays, 16.00-17.30 (Swedish time). If you wish to receive the Zoom ID and password for the seminar, as well as further announcements, please subscribe to the NOL Seminar mailing list.

NOL seminar organisers
Valentin Goranko and Graham Leigh

Talks in the Nordic Online Logic Seminar

Albert Visser (Utrecht University) Restricted Sequential Theories

Sequential theories form a fundamental class of theories in logic. They have full coding possibilities and allow us to build partial satisfaction predicates for formulas of bounded depth-of-quantifier-alternations. In many respects, they are the proper domain of Gödelian metamathematics. We explain the notion of sequential theory.

A theory is restricted if it can be axiomatised by axioms of bounded depth-of-quantifier-alternations. All finitely axiomatised theories are restricted, but so, for example, is Primitive Recursive Arithmetic. We explain the small-is-very-small principle for restricted sequential theories, which says that, whenever the given theory proves that a definable property has a small witness, i.e., a witness in a sufficiently small definable cut, it also proves that the property has a very small witness, i.e., a witness below a given standard number.
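Schematically (one way to render the principle; not necessarily Visser’s official formulation), with \(I\) a sufficiently small definable cut and \(\underline{n}\) the numeral of a standard number \(n\):

\[
T \vdash \exists x \in I\; \varphi(x) \quad\Longrightarrow\quad T \vdash \exists x < \underline{n}\; \varphi(x) \text{ for some standard } n.
\]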

We sketch two proofs that restricted theories are incomplete (however complex the axiom set). One uses the small-is-very-small principle and the other a direct Rosser argument. (The second argument was developed in collaboration with Ali Enayat.)

Lauri Hella (Tampere University) Game characterizations for the number of quantifiers

A game that characterizes definability of classes of structures by first-order sentences containing a given number of quantifiers was introduced by Immerman in 1981. In this talk I describe two other games that are equivalent to the Immerman game in the sense that they characterize definability by a given number of quantifiers.

In the Immerman game, Duplicator has a canonical optimal strategy, and hence Duplicator can be completely removed from the game by replacing her moves with default moves given by this optimal strategy. On the other hand, in the other two games there is no such optimal strategy for Duplicator. Thus, the Immerman game can be regarded as a one-player game, but the other two games are genuine two-player games.

The talk is based on joint work with Kerkko Luosto.

Peter Pagin (Stockholm University) Switcher Semantics and quantification

Switcher Semantics is a semantic framework that is basically characterised by allowing switching: when recursively applying a semantic function \(\mu\) to a complex term \(t\), the semantic function applying to an immediate subterm \(t'\) of \(t\) may be a function \(\mu'\), distinct from \(\mu\). An operator-argument-position pair is called a switcher if it induces such a switch. Switcher semantic systems do not satisfy the standard form of compositionality, but a generalized form, which allows greater flexibility. In earlier work (mostly published), some together with Kathrin Glüer, some with Dag Westerståhl, it has been applied to natural language constructions like proper names in modal contexts, general terms in modal contexts, indexicals in temporal contexts, quotation, and belief contexts. This talk will focus on quantifiers and quantification. First-order quantifiers can be regarded as switchers, switching from truth conditions to satisfaction conditions. The larger topic is quantification into switched contexts. I shall begin by giving an introduction to the framework.
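As a schematic illustration of the quantifier case (a sketch of the general idea, not necessarily the notation used in the talk): evaluating a quantified sentence for truth may switch to evaluating its immediate subformula for satisfaction,

\[
\mu(\exists x\,\varphi) = \mathrm{True} \quad\text{iff}\quad \mu'(\varphi)(g) = \mathrm{True} \text{ for some assignment } g,
\]

so the quantifier together with its argument position forms a switcher in the sense defined above.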

Göran Sundholm (Leiden University) Curry-Howard: a meaning explanation or just another realizability interpretation?

Around 1930 a major paradigm shift occurred in the foundations of mathematics; we may call it the METAMATHEMATICAL TURN. Until then the task of a logician had been to design and explain a full-scale formal language that was adequate for the practice of mathematical analysis in such a way that the axioms and rules of inference of the theory were rendered evident by the explanations.

The metamathematical turn changed the status of the formal languages: now they became (meta)mathematical objects of study. We no longer communicate with the aid of the formal systems – we communicate about them. Kleene’s realizability (JSL 1945) gave a metamathematical (re-)interpretation of arithmetic inside arithmetic. Heyting and Kolmogorov (1931-2), on the other hand, had used “proofs” of propositions and “solutions” to problems, respectively, in order to explain the meaning of the mathematical language, rather than reinterpret it internally.

We now have the choice to view the Curry-Howard isomorphism, say, as a variant of realizability, in which case it will be an internal mathematical re-interpretation, or to adopt an atavistic, Frege-like viewpoint and look at the language as being rendered meaningful. This perspective will be used to discuss another paradigm shift, namely that of distinguishing constructivism from intuitionism. The hesitant attitudes of Gödel, Kreisel, and Michael Dummett will be spelled out and, on the basis of unpublished source material, a likely reason given.

Sonja Smets (Institute for Logic, Language and Computation, University of Amsterdam) Reasoning about Epistemic Superiority and Data Exchange

In this presentation I focus on a framework that generalizes dynamic epistemic logic in order to model a wider range of scenarios, including those in which agents read or communicate (or somehow gain access to) all the information stored at specific sources, or possessed by some other agents (including information of a non-propositional nature, such as data, passwords, secrets, etc.). The resulting framework allows one to reason about the state of affairs in which one agent (or group of agents) has ‘epistemic superiority’ over another agent (or group). I will present different examples of epistemic superiority and I will draw a connection to the logic of functional dependence by A. Baltag and J. van Benthem. At the level of group attitudes, I will further introduce the new concept of ‘common distributed knowledge’, which combines features of both common knowledge and distributed knowledge. This presentation is based on joint work with A. Baltag in [1].
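For orientation, the two standard group modalities whose features the new notion combines are (in their textbook relational semantics; this is background, not the paper’s definition of common distributed knowledge): distributed knowledge \(D_G\), which quantifies over the intersection of the agents’ accessibility relations, and common knowledge \(C_G\), which quantifies over the transitive closure of their union:

\[
M,w \models D_G\varphi \iff M,v \models \varphi \text{ for all } v \text{ with } (w,v) \in \bigcap_{a\in G} R_a,
\]
\[
M,w \models C_G\varphi \iff M,v \models \varphi \text{ for all } v \text{ with } (w,v) \in \Big(\bigcup_{a\in G} R_a\Big)^{+}.
\]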

[1] A. Baltag and S. Smets, Learning what others know, in L. Kovacs and E. Albert (eds.), LPAR23: Proceedings of the International Conference on Logic for Programming, AI and Reasoning, EPiC Series in Computing, vol. 73, pp. 90-110, 2020. https://doi.org/10.29007/plm4

Dag Westerståhl (Stockholm University, Tsinghua University) From consequence to meaning: the case of intuitionistic propositional logic (IPL)

One quarter of the talk presents background on how facts about entailments and non-entailments can single out the constants in a language, and in particular on an idea originating with Carnap that the standard relation of logical consequence in a formal language should fix the (model-theoretic) meaning of its logical constants. Carnap’s focus was classical propositional logic (CPL), but his question can be asked for any logical language. The rest of the talk gives a very general positive answer to this question for IPL: the usual IPL consequence relation does indeed determine the standard intuitionistic meaning of the propositional connectives, according to most well-known semantics for IPL, such as Kripke semantics, Beth semantics, Dragalin semantics, topological semantics, and algebraic semantics.

Thomas Bolander (Technical University of Denmark) Epistemic Planning: Logical formalism, computational complexity, and robotic implementations

Dynamic Epistemic Logic (DEL) can be used as a formalism for agents to represent the mental states of other agents: their beliefs and knowledge, and potentially even their plans and goals. Hence, the logic can be used as a formalism to give agents, e.g. robots, a Theory of Mind, allowing them to take the perspective of other agents. In my research, I have combined DEL with techniques from automated planning in order to describe a theory of what I call Epistemic Planning: planning where agents explicitly reason about the mental states of others. The talk will introduce epistemic planning based on DEL, address issues of computational complexity, and demonstrate applications in cognitive robotics and human-robot collaboration.

Thomas Ågotnes (University of Bergen and Shanxi University) Somebody Knows and Weak Conjunctive Closure in Modal Logic

Normal modal logics are closed under conjunction. There are, however, interesting non-normal logics that are not, but which nevertheless satisfy a weak form of conjunctive closure. One example is a notion of group knowledge in epistemic logic: somebody-knows. While something is general knowledge if it is known by everyone, this notion holds if it is known by someone. Somebody-knows is thus weaker than general knowledge but stronger than distributed knowledge. We introduce a modality for somebody-knows in the style of standard group knowledge modalities and study its properties. Unlike most other group knowledge modalities, somebody-knows is not a normal modality; in particular, it lacks the conjunctive closure property. We provide an equivalent neighbourhood semantics for the language with a single somebody-knows modality, together with a completeness result: the somebody-knows modalities are completely characterised by the modal logic EMN extended with a particular weak conjunctive closure axiom. The neighbourhood semantics and the completeness and complexity results also carry over to other logics with weak conjunctive closure, including the logic of so-called local reasoning (Fagin et al., 1995) with bounded “frames of mind”, correcting an existing completeness result in the literature (Allen 2005). The talk is based on joint work with Yi N. Wang.
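In symbols (a minimal sketch for a fixed finite set of agents \(\{a_1,\dots,a_n\}\); the notation is illustrative), the somebody-knows modality can be read as

\[
S\varphi \;:\equiv\; K_{a_1}\varphi \vee \dots \vee K_{a_n}\varphi,
\]

and the failure of conjunctive closure is then immediate: \(S p \wedge S q\) may hold because one agent knows \(p\) while a different agent knows \(q\), with no single agent knowing \(p \wedge q\), so \(S(p \wedge q)\) need not follow.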

Magdalena Ortiz (Umeå University) A Short Introduction to SHACL for Logicians

The SHACL Shapes Constraint Language was recommended in 2017 by the W3C for describing constraints on web data (specifically, on RDF graphs) and for validating data against them. At first glance, it may not seem to be a topic for logicians, but as it turns out, SHACL can be approached as a formal logic, and actually quite an interesting one. In this talk, we give a brief introduction to SHACL tailored towards logicians and frame key uses of SHACL as familiar logical reasoning tasks. We discuss how SHACL relates to description logics, which are the basis of the OWL Web Ontology Languages, a related yet orthogonal standard for web data. Finally, we summarize some of our recent work in the SHACL world, hoping that this may shed light on how ideas, results, and techniques from well-established areas of logic can advance the state of the art in this emerging field.

Neil Tennant (Ohio State University) It's All or Nothing: Explosion v. Implosion

We set out five basic requirements for a logical system to be adequate for the regimentation of deductive reasoning in mathematics and science. We raise the question whether there are any further requirements, not entailed by these five, that ought to be imposed. One possible reply is dismissed: that the logical system should allow one to infer any proposition at all from an inconsistent set—i.e., it should have as primitive, or allow one to derive, the rule Ex Falso Quodlibet. We then propose that the logic should be implosive: it should not allow an inconsistent set to have any consequences other than absurdity. This proposal may appear to be very radical; but we hope to show that it is robust against objections.

Ali Enayat (University of Gothenburg) Arithmetic and set theory through the lens of interpretability

The notion of (relative) interpretation for first order theories was introduced in a landmark 1953 monograph by Alfred Tarski, Andrzej Mostowski and Raphael Robinson, where it was developed as a powerful tool for establishing undecidability results. By now the domain of interest and applicability of interpretability theory far exceeds undecidability theory, owing to its multifaceted interactions with both proof theory and model theory. This talk will present a personal overview of the interpretability analysis of arithmetical and set-theoretical theories, with special attention to recent advances in the subject that indicate the distinctive character of Peano Arithmetic, Zermelo-Fraenkel set theory, and their higher order analogues in the realm of interpretability theory.

Sven Ove Hansson (KTH) How to combine probabilities and full beliefs in a formal system

One of the major problems in formal epistemology is the difficulty of combining probabilistic and full (dichotomous, all-or-nothing) beliefs in one and the same formal framework. Major properties of actual human belief systems, including how they are impacted by our cognitive limitations, are used to introduce a series of desiderata for realistic models of such belief systems. This leads to a criticism of previous attempts to combine representations of both probabilistic and full beliefs in one and the same formal model. Furthermore, a formal model is presented in which this is done differently. One of its major features is that contingent propositions can be transferred in both directions between full beliefs and lower degrees of belief, in a way that mirrors real-life acquisitions and losses of full beliefs. The subsystem consisting of full beliefs has a pattern of change that constitutes a credible system of dichotomous belief change.

Vann McGee (MIT) Boolean Degrees of Truth and Classical Rules of Inference

Compositional semantics that acknowledge vagueness by positing degrees of truth intermediate between truth and falsity can retain classical sentential calculus, provided the degrees form a Boolean algebra. A valid deduction guarantees that the degree of truth of the conclusion is at least as great as every lower bound on the degrees of the premises. If we extend the language to allow infinite disjunctions and conjunctions, the Boolean algebra will need to be complete and atomic. If we extend further by adding quantifiers ranging over a fixed domain, we get the supervaluations proposed by Bas van Fraassen.
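In symbols (a rendering of the clause just stated, with \(v\) assigning degrees of truth in a Boolean algebra \(B\)): a valid deduction from premises \(\varphi_1,\dots,\varphi_n\) to conclusion \(\psi\) guarantees that

\[
\text{for every } b \in B:\quad \big(b \le v(\varphi_i) \text{ for all } i\big) \;\Longrightarrow\; b \le v(\psi).
\]

When \(B\) is complete, this amounts to \(v(\psi) \ge \bigwedge_i v(\varphi_i)\).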

Alexandru Baltag (University of Amsterdam) From Surprise Exams to Topological Mu-Calculus

I present a topological epistemic logic, motivated by a famous epistemic puzzle: the Surprise Exam Paradox. It is a fixed-point modal logic, with modalities for knowledge (modelled as the universal modality), knowability of a proposition (represented by the topological interior operator), and (un)knowability of the actual world. The last notion has a non-self-referential reading (modelled by Cantor derivative: the set of limit points of a given set) and a self-referential one (modelled by the so-called perfect core of a given set: its largest subset which is a fixed point of relativized derivative). I completely axiomatize this logic, showing that it is decidable and PSPACE-complete, as well as briefly explain how the same model-theoretic method can be elaborated to prove the completeness and decidability of the full topological mu-calculus. Finally, I apply it to the analysis of the Surprise Exam Paradox and other puzzles.

References:

  • A. Baltag, N. Bezhanishvili, D. Fernández-Duque. The Topology of Surprise. Proceedings of the International Conference on Principles of Knowledge Representation and Reasoning. Vol. 19 (1), 33-42, 2022. Available online in ILLC Prepublication (PP) series PP-2022-06.
  • A. Baltag, N. Bezhanishvili, D. Fernández-Duque. The Topological Mu-Calculus: Completeness and Decidability. LICS '21: Proceedings of the 36th Annual ACM/IEEE Symposium on Logic in Computer Science, vol 89: 1-13, 2021. doi:10.1109/lics52264.2021.9470560. Available online in ILLC Prepublication (PP) series PP-2021-07.

Laura Crosilla (University of Oslo) On Weyl's predicative concept of set

In the book Das Kontinuum (1918), Hermann Weyl presents a coherent and sophisticated approach to analysis from a predicativist perspective. In the first chapter of (Weyl 1918), Weyl introduces a predicative concept of set, according to which sets are built ‘from the bottom up’ starting from the natural numbers. Weyl clearly contrasts this predicative concept of set with the concept of arbitrary set, which he finds wanting, especially when working with infinite sets. In the second chapter of Das Kontinuum, he goes on to show that large portions of 19th century analysis can be developed on the basis of his predicative concept of set.

Das Kontinuum inspired fundamental ideas in mathematical logic and beyond, such as the logical analysis of predicativity of the 1950s-60s, Solomon Feferman’s work on predicativity, and Errett Bishop’s constructive mathematics. The seeds of Das Kontinuum are already visible in the early (Weyl 1910), where Weyl, among other things, offers a clarification of Zermelo’s axiom schema of Separation.

In this talk, I examine Weyl’s predicative concept of set in (Weyl 1918) and discuss its origins in (Weyl 1910).

Bibliography

  • Weyl, H., 1910, Über die Definitionen der mathematischen Grundbegriffe, Mathematisch-naturwissenschaftliche Blätter, 7, pp. 93-95 and pp. 109-113.
  • Weyl, H., 1918, Das Kontinuum. Kritische Untersuchungen über die Grundlagen der Analysis, Veit, Leipzig. English translation: Dover Books on Mathematics, 2003.

Melvin Fitting (City University of New York (Graduate Center)) Strict/Tolerant Logic and Strict/Tolerant Logics

Strict/tolerant logic, ST, has been of considerable interest in the last few years, in part because it forces consideration of what it means for two logics to be different, or the same, and thus of what it means to be a logic at all. The basic idea behind ST is that it evaluates the premises and the conclusions of its consequence relation differently, with the premises held to stricter standards while conclusions are treated more tolerantly. More specifically, ST is a three-valued logic with left sides of sequents understood as if in Kleene’s Strong Three-Valued Logic, and right sides as if in Priest’s Logic of Paradox. Surprisingly, this hybrid validates the same sequents that classical logic does, though it differs from classical logic at the metaconsequence level. A version of this result has been extended to meta, metameta, etc., consequence levels, creating a very interesting hierarchy of logics. All this is work of others, and I will summarize it.
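Concretely (a standard presentation of ST over the truth values \(\{0, \tfrac{1}{2}, 1\}\); a sketch, not necessarily the speaker’s exact formulation), premises are held to the strict standard of value \(1\) while conclusions only need the tolerant standard of value at least \(\tfrac{1}{2}\):

\[
\Gamma \models_{ST} \Delta \quad\text{iff}\quad \text{every valuation } v \text{ with } v(\gamma) = 1 \text{ for all } \gamma \in \Gamma \text{ satisfies } v(\delta) \ge \tfrac{1}{2} \text{ for some } \delta \in \Delta.
\]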

My contribution to the subject is to show that the original ideas behind ST are, in fact, much more general than they first seemed, and that an infinite family of many-valued logics have Strict/Tolerant counterparts. Besides classical logic, this family includes both Kleene’s and Priest’s logics themselves, as well as first-degree entailment. For instance, for both the Kleene and the Priest logic, the corresponding strict/tolerant logic is six-valued, but with differing sets of strictly and tolerantly designated truth values. There is a reverse notion of Tolerant/Strict logics, which exist for the same structures. And the hierarchy going through meta, metameta, etc., consequence levels exists for the same infinite family of many-valued logics. In a similar way all this work extends to modal and quantified many-valued logics. In brief, we have here a very general phenomenon.

I will present a sketch of the basic generalizations, of Strict/Tolerant and Tolerant/Strict, but I will not have time to discuss the hierarchies of such logics, nor will I have time to give proofs, beyond a basic sketch of the ideas involved.

Jan von Plato (University of Helsinki) Gödel's work in logic and foundations in the light of his shorthand notebooks

The study of Gödel’s shorthand notebooks in the ERC project GODELIANA has revealed two main aspects of his work. First, there is behind each of his relatively few published papers an enormous amount of notes. They typically reveal how he arrived at his results, as with the incompleteness theorem. An extreme case is that of Gödel’s 1938 results on set theory, preceded by some 700 pages of notes never studied before. Second, his threshold for publishing was by 1940 so high that some two thousand pages of developments and results remained unknown. One highlight here is a series of notebooks titled “Resultate Grundlagen” in which numerous anticipations of later results in set theory are found. A second main topic is the relations between modal and intuitionistic logic. In particular, Gödel gives a syntactic translation from S4 to intuitionistic logic by methods that are readily generalizable to any decent intermediate logic. These newly discovered methods are, even by today’s standards, a clear step ahead in the study of interrelations between systems of non-classical logics.

Øystein Linnebo (University of Oslo) Potentialism in the philosophy and foundations of mathematics

Aristotle famously claimed that the only coherent form of infinity is potential, not actual. However many objects there are, it is possible for there to be yet more; but it is impossible for there in fact to be infinitely many objects. Although this view was superseded by Cantor’s transfinite set theory, even Cantor regarded the collection of all sets as “unfinished” or incapable of “being together”. In recent years, there has been a revival of interest in potentialist approaches to the philosophy and foundations of mathematics. The lecture provides a survey of such approaches, covering both technical results and associated philosophical views, as these emerge both in published work and in work in progress.

Michael Rathjen (University of Leeds) Completeness: Turing, Schütte, Feferman (and Löb)

Progressions of theories along paths through Kleene’s \(\mathcal{O}\), adding the consistency of the previous theory at every successor step, can deduce every true \( \Pi^0_1 \)-statement. This was shown by Turing in his 1938 thesis, where he called these progressions “ordinal logics”. In 1962 Feferman proved the amazing theorem that progressions based on the “uniform reflection principle” can deduce every true arithmetic statement. In contrast to Turing’s, Feferman’s proof is very complicated, involving several cunning applications of self-reference via the recursion theorem. Using Schütte’s method of search trees (or decomposition trees) for omega-logic and reflexive induction, however, one can give a rather transparent proof.
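Schematically (with Peano Arithmetic as base theory for concreteness; the construction applies to other sufficiently strong bases):

\[
T_0 = \mathrm{PA}, \qquad T_{\alpha+1} = T_\alpha + \mathrm{Con}(T_\alpha), \qquad T_\lambda = \bigcup_{\alpha<\lambda} T_\alpha \ \text{ for limit } \lambda,
\]

with the ordinals ranging along a path through Kleene’s \(\mathcal{O}\); Feferman’s progressions replace \(\mathrm{Con}(T_\alpha)\) by the uniform reflection principle for \(T_\alpha\).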

Juliette Kennedy (University of Helsinki) Reading syntax off semantics

The practice of foundations of mathematics is built around a firm distinction between syntax and semantics. But how stable is this distinction, and might semantically presented mathematical objects, e.g. in the form of a model class, give rise to a “natural logic”? In this talk I will investigate different scenarios from set and model theory in which an investigation of the notion of an implicit or internal logic or syntax becomes possible. Time permitting, we will also discuss the question whether logics without a syntax can be considered logics at all.

Thierry Coquand (Gothenburg University) Formalization of Mathematics and Dependent Type Theory

The first part will be about the representation of mathematics on a computer. Questions that arise there are naturally reminiscent of issues encountered when teaching formal proofs in a basic logic course, e.g. how to deal with free and bound variables, and instantiation rules. As discussed in a 1962 paper of Tarski, these issues are “clearly experienced both in teaching an elementary course in mathematical logic and in formalizing the syntax of predicate logic for some theoretical purposes.” I will present two quite different approaches to this problem: one inspired by Tarski’s paper (N. Megill, system Metamath) and one using dependent type theory (N.G. de Bruijn).
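As a small illustration of the second approach (an example for orientation, not taken from the talk): in de Bruijn notation, a bound variable is replaced by the number of binders lying between its occurrence and its own binder, so

\[
\lambda x.\,\lambda y.\; x\,y \quad\text{becomes}\quad \lambda\,\lambda\,(1\;0),
\]

which eliminates the bookkeeping of renaming bound variables (\(\alpha\)-conversion) that besets named syntax.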

The second part will then try to explain how notations introduced by dependent type theory suggest new insights for old questions coming from Principia Mathematica (extensionality, reducibility axiom) through the notion of universe, introduced by Grothendieck for representing category theory in set theory, and introduced in dependent type theory by P. Martin-Löf.

Johan van Benthem (University of Amsterdam, Stanford University, and Tsinghua University) Interleaving Logic and Counting

Reasoning with generalized quantifiers in natural language combines logical and arithmetical features, transcending divides between qualitative and quantitative. This practice blends with inference patterns in ‘grassroots mathematics’ such as pigeon-hole principles. Our topic is this cooperation of logic and counting on a par, studied with small systems and gradually moving upward. We start with monadic first-order logic with counting. We provide normal forms that allow for axiomatization, determine which arithmetical notions are definable, and conversely, discuss which logical notions and reasoning principles can be defined out of arithmetical ones. Next we study a series of strengthenings in the same style, including second-order versions, systems with multiple counting, and a new modal logic with counting. As a complement to our fragment approach, we also discuss another way of controlling complexity: changing the semantics of counting to reason about ‘mass’ or other aggregating notions than cardinalities. Finally, we return to the basic reasoning practices that lie embedded in natural language, confronting our formal systems with linguistic quantifier vocabulary, monotonicity reasoning, and procedural semantics via semantic automata. We conclude with some pointers to further entanglements of logic and counting in the metamathematics of formal systems, the philosophy of logic, and cognitive psychology. (Joint work with Thomas Icard)
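As a small example of the interplay under discussion (a standard observation, not specific to the paper): the counting quantifier “there are at least \(k\)” is definable in plain first-order logic, but only at the cost of \(k\) quantifiers:

\[
\exists^{\ge k} x\, \varphi(x) \;\equiv\; \exists x_1 \cdots \exists x_k \Big( \bigwedge_{1 \le i < j \le k} x_i \ne x_j \;\wedge\; \bigwedge_{1 \le i \le k} \varphi(x_i) \Big).
\]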

Paper available at: https://eprints.illc.uva.nl/id/eprint/1813/1/Logic.Counting.pdf

Lars Birkedal (Aarhus University) Iris: A Higher-Order Concurrent Separation Logic Framework

I will introduce some of our research on Iris, a higher-order concurrent separation logic framework, implemented and verified in the Coq proof assistant, which can be used for mathematical reasoning about safety and correctness of concurrent higher-order imperative programs. Iris has been used for many different applications; see iris-project.org for a list of research papers. However, in this talk I will focus on the Iris base logic (and its semantics) and sketch how one may define useful program logics on top of the base logic. The base logic is a higher-order intuitionistic modal logic, which, in particular, supports the definition of recursive predicates and whose type of propositions is itself recursively defined.

Sara L. Uckelman (Durham University) John Eliot’s Logick Primer: A bilingual English-Wôpanâak logic textbook

In 1672 John Eliot, English Puritan educator and missionary, published The Logick Primer: Some Logical Notions to initiate the INDIANS in the knowledge of the Rule of Reason; and to know how to make use thereof [1]. This roughly 80-page pamphlet focuses on introducing basic syllogistic vocabulary and reasoning so that syllogisms can be created from texts in the Psalms, the gospels, and other New Testament books. The use of logic for proselytizing purposes is not distinctive: what is distinctive about Eliot’s book is that it is bilingual, written in both English and Wôpanâak (Massachusett), an Algonquian language spoken in eastern coastal and southeastern Massachusetts. It is one of the earliest bilingual logic textbooks, it is the only logic textbook that I know of in an indigenous American language, and it is one of the earliest printed attestations of the Massachusett language.

In this talk, I will:

  • Introduce John Eliot and the linguistic context he was working in.
  • Introduce the contents of the Logick Primer—vocabulary, inference patterns, and applications.
  • Discuss notions of “Puritan” logic that inform this primer.
  • Talk about the importance of his work in documenting and expanding the Massachusett language and the problems that accompany his colonial approach to this work.

References

[1] J.[ohn] E.[liot]. The Logick Primer: Some Logical Notions to initiate the INDIANS in the knowledge of the Rule of Reason; and to know how to make use thereof. Cambridge, MA: Printed by M.[armaduke] J.[ohnson], 1672.

Erich Grädel (RWTH Aachen University) Semiring semantics for logical statements with applications to the strategy analysis of games

Semiring semantics of logical formulae generalises the classical Boolean semantics by permitting multiple truth values from certain semirings. In the classical Boolean semantics, a model of a formula assigns to each (instantiated) literal a Boolean value. K-interpretations, for a semiring K, generalize this by assigning to each such literal a value from K. We then interpret 0 as false and all other semiring values as nuances of true, which provide additional information, depending on the semiring: For example, the Boolean semiring over {0,1} corresponds to classical semantics, the Viterbi semiring can model confidence scores, the tropical semiring is used for cost analysis, and min-max semirings (A, max, min, a, b) for a totally ordered set (A,<) can model different access levels. Most importantly, semirings of polynomials, such as N[X], allow us to track certain literals by mapping them to different indeterminates. The overall value of the formula is then a polynomial that describes precisely what combinations of literals prove the truth of the formula.
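A small worked example (constructed here to match the description above): in the polynomial semiring \(\mathbb{N}[x,y]\), track two atomic facts by setting \(\pi(P) = x\) and \(\pi(Q) = y\), and interpret \(\vee\) as \(+\) and \(\wedge\) as \(\cdot\). Then

\[
\pi\big((P \vee Q) \wedge Q\big) \;=\; (x + y)\cdot y \;=\; xy + y^2,
\]

and the two monomials record precisely the two ways of establishing the formula: use \(P\) and \(Q\) once each, or use \(Q\) twice.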

This can also be used for strategy analysis in games. Evaluating formulae that define winning regions in a given game in an appropriate semiring of polynomials provides not only the Boolean information on who wins, but also tells us how they win and which strategies they might use. For this approach, the case of Büchi games is of special interest, not only due to their practical importance, but also because it is the simplest case where the logical definition of the winning region involves a genuine alternation of a greatest and a least fixed point. We show that, in a precise sense, semiring semantics provides information about all absorption-dominant strategies – strategies that win with minimal effort – and we discuss how these relate to positional and the more general persistent strategies. This information enables further applications such as game synthesis or determining minimal modifications to the game needed to change its outcome.

Anupam Das (University of Birmingham) On the proof theoretic strength of cyclic reasoning

Cyclic (or circular) proofs are now a common technique for demonstrating metalogical properties of systems incorporating (co)induction, including modal logics, predicate logics, type systems and algebras. Inspired by automata theory, cyclic proofs encode a form of self-dependency of which induction/recursion comprise special cases. An overarching question of the area, the so-called ‘Brotherston-Simpson conjecture’, asks to what extent the converse holds.

In this talk I will discuss a line of work that attempts to understand the expressivity of circular reasoning via forms of proof theoretic strength. Namely, I address predicate logic in the guise of first-order arithmetic, and type systems in the guise of higher-order primitive recursion, and establish a recurring theme: circular reasoning buys precisely one level of ‘abstraction’ over inductive reasoning.

This talk will be based on the following works:

Dag Normann (Oslo) An alternative perspective on Reverse Mathematics

In his address to the International Congress of Mathematicians in Vancouver, 1974, Harvey Friedman launched a program whose aim was to find the minimal set of axioms needed to prove theorems of ordinary mathematics. More often than not, it turned out that the axioms were in turn provable from the theorems, and the subject was named Reverse Mathematics. In this talk we will survey some of the philosophy behind, and results of, early reverse mathematics, based on the formalisation of mathematics within second order number theory.
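A classic example of such a reversal (a standard result, included here for orientation): over the weak base theory \(\mathrm{RCA}_0\), the Bolzano-Weierstrass theorem is equivalent to arithmetical comprehension,

\[
\mathrm{RCA}_0 \;\vdash\; \mathrm{BW} \;\leftrightarrow\; \mathrm{ACA},
\]

so the theorem proves the very comprehension axioms needed to prove it.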

In 2005, Ulrich Kohlenbach introduced higher order reverse mathematics, and we give a brief explanation of the what and why of Kohlenbach’s approach. In an ongoing project with Sam Sanders we have studied the strength of classical theorems of late 19th/early 20th century mathematics, partly within Kohlenbach’s formal typed theory and partly through their constructive content in a generalised sense. In the final part of the talk I will give some examples of results from this project, mainly from the perspective of higher order computability theory. No prior knowledge of higher order computability theory is needed.

Wilfrid Hodges (Fellow of the British Academy) How the teenage Avicenna planned out several new logics

Almost exactly a thousand years ago a teenager known today as Avicenna lived in what is now Uzbekistan. He made a resolution to teach himself Aristotelian logic, armed with an Arabic translation of Aristotle and a century-old Arabic textbook of logic. A couple of years later, around his eighteenth birthday, he wrote a brief report of what he had learned. Six months ago I started to examine this report – I suspect I am the first logician to do that. It contains many surprising things. Besides introducing some new ideas that readers of Avicenna know from his later works, it also identifies some specific points of modal logic where Avicenna was sure that Aristotle had made a mistake. People had criticised Aristotle’s logic before, but not at these points. At first Avicenna had no clear understanding of how to do modal logic, and it took him another thirty years to justify all the criticisms of Aristotle in his report. But meanwhile he discovered for himself how to defend a new logic by building new foundations. I think the logic itself is interesting, but the talk will concentrate on another aspect. These recent discoveries mean that Avicenna is the earliest known logician who creates new logics and tells us what he is doing, and why, at each stage along the way.

Jouko Väänänen (Helsinki) Dependence logic: Some recent developments

In the traditional, so-called Tarskian truth definition, the semantics of first order logic is defined with respect to an assignment of values to the free variables. A richer family of semantic concepts can be modelled if semantics is defined with respect to a set (a “team”) of such assignments. This is called team semantics. Examples of semantic concepts available in team semantics but not in traditional Tarskian semantics are the concepts of dependence and independence. Dependence logic is an extension of first-order logic based on team semantics. It has emerged that teams appear naturally in several areas of the sciences and humanities, which has made it possible to apply dependence logic and its variants to these areas. In my talk I will give a quick introduction to the basic ideas of team semantics and dependence logic as well as an overview of some new developments, such as quantitative analysis of team properties, a framework for a multiverse approach to set theory, and probabilistic independence logic inspired by the foundations of quantum mechanics.
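For instance (the standard clause from dependence logic), the dependence atom \(=\!(x,y)\), read “\(y\) depends only on \(x\)”, is satisfied by a team \(T\) exactly when

\[
\forall s, s' \in T\; \big( s(x) = s'(x) \;\Longrightarrow\; s(y) = s'(y) \big),
\]

a condition on the set of assignments as a whole, which no single Tarskian assignment can express.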

Dag Prawitz (Stockholm) Validity of inference and argument

An account of inferences should take into account not only inferences from established premisses but also inferences made under assumptions. This makes it necessary to consider arguments, chains of inferences in which assumptions and variables may become bound. An argument is valid when all its inferences are valid, and it then amounts to a proof in case it has no unbound assumptions or variables. The validity of an inference – not to be confused with the conclusion being a logical consequence of the premisses – seems in turn best explained in terms of proofs. This means that the concepts of valid inference and valid argument depend on each other and cannot be defined independently but have to be described by principles that state how they are related. A number of such principles will be proposed. It is conjectured that inferences that can be expressed in the language of first order intuitionistic predicate logic and are implied to be valid by these principles are all provable in that logic.