Masses of Formal Philosophy: Question 2

Here’s my (much delayed) answer to the second of Vincent Hendricks and John Symons’ five questions about Formal Philosophy.

What example from your work illustrates the role formal methods can play in philosophy?

I’ll focus on one example from some of my recent work.

In the last few years I have been working on topics in proof theory and on connections between the way we can conceive of the structure of proofs and concerns in the theory of meaning. The idea that the meaning of a word or a concept might be usefully explicated by giving an account of its inferential role is a common one – the work of Ned Block, Bob Brandom and Michael Dummett provides three very different examples of ways to take this idea seriously. It is a truism that meaning has some sort of connection with use, and use in reasoning and inference is a very important part of any account of use.

It has seemed to me that if we are going to take inferential role as playing its part in a theory of meaning, then we had better use the best available tools for giving an account of proof. The theory of proofs should have something to teach philosophers with interests in semantics. This is not a mainstream position – our vocabulary itself speaks against it, with the ready identification of model theory with ‘semantics’ and proof theory with ‘syntax’. The work of intuitionists such as Dummett, Prawitz, Martin-Löf and Tennant is conspicuous in its isolation in providing a contrary opinion. This has encouraged the view that semantically anti-realist positions – those that take proof or inference as the starting point of semantic theory, rather than truth-conditions or representation – are naturally revisionary and intuitionist. For intuitionistic logic has a clear motivation in terms of proof and verification, and it has seemed to many that orthodox classical logic does not.

I think that this is a mistake. It seems to me that natural proof-theoretic accounts of classical logic (starting with Gentzen’s sequent calculus, but also newer pieces of technology such as proof-nets) can have a central place in a theory of meaning that starts with inferential role and not with truth. We can think of the valid sequents (of the form X ⊢ Y, where X and Y are sets of statements) as helping us ‘keep score’ in dialectical positions. The validity of the sequent X ⊢ Y tells us that a position in dialogue in which each statement in X is asserted and each statement in Y is denied is out of bounds according to the rules of ‘the game’. In fact, the structural rules of the sequent calculus can be motivated in this way. Identity sequents X, A ⊢ A, Y tell us that asserting and denying A (in the same context) is out of bounds. The rule of weakening tells us that if asserting X and denying Y is out of bounds, then adding an extra assertion or an extra denial does not mend matters. The cut rule tells us that if a position in which X is asserted and Y is denied is not out of bounds, then for any statement A, either its addition as an assertion or its addition as a denial will also not be out of bounds. If asserting A is out of bounds in a context, A is implicitly denied in that context, and explicitly denying it is no worse than implicitly denying it.

Thinking of Gentzen’s sequent calculus in this way gives an alternative understanding of classical logic. We think of the rules for connectives as ‘definitions’ governing assertions featuring the logical vocabulary. Proof-theoretical techniques such as the eliminability of the ‘cut’ rule tell us that these definitions are conservative. No matter what the rules of the game concerning our primitive vocabulary might be, we can add the classical logical connectives without disturbing the rules of assertion in that primitive vocabulary (the importance of this point was made clear in Nuel Belnap’s paper “Tonk, Plonk and Plink”). The logical vocabulary allows us to ‘make explicit’ what was merely ‘implicit’ before. The interpretation of the rules of the quantifiers is particularly enlightening. It allows us to sidestep the debate between ‘substitutional’ and ‘objectual’ accounts of quantification.
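For example, the standard classical sequent rules for negation and for the universal quantifier can be read directly as conditions on assertion and denial, with no mention of truth or of a domain of objects. (A sketch, using the usual rules; nothing here is specific to my own presentation.)

```latex
% Negation: asserting ~A has exactly the force of denying A.
\frac{X \vdash A, Y}{X, \neg A \vdash Y}
\qquad
\frac{X, A \vdash Y}{X \vdash \neg A, Y}

% Universal quantifier, right rule: to assert (forall x)A(x) is to be
% prepared to assert A(n) for an arbitrary name n, one not already in
% use in the position. The rule constrains assertions, not a domain of
% objects, so no choice between 'substitutional' and 'objectual'
% readings is forced.
\frac{X \vdash A(n), Y}{X \vdash \forall x\, A(x), Y}
\quad \text{($n$ not occurring in $X$, $Y$ or $\forall x\, A(x)$)}
```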

In my recent work I have tried to flesh out this picture, and to show how the story can be expanded to give appropriate conditions of use for modal connectives such as possibility and necessity. The key idea is that in modal discourse we not only assert and deny, but we make assertions and denials in different dialectical contexts, and an assertion of a necessity claim in one context can impinge on claims in other dialectical contexts. This means that we can give a semantics for modal vocabulary that motivates a well-known modal logic (in the first instance, the simple modal logic S5, but the extension to other logics is not difficult) in which possible worlds are not the starting point of semantic explanation. Modal vocabulary need not be conceived of as a way of describing possible worlds. It can be understood as governing discourse in which we assert and deny not only to express our own commitments, but also to articulate the connections between our concepts. Dialectical positions need not merely contain assertions and denials: these may be partitioned into different ‘zones’ according to the structure of the different suppositions and shifts of context in that discourse.
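One way to make the ‘zones’ idea precise is with hypersequents: sequences of sequents, one per dialectical context. What follows is only a sketch, in the style of hypersequent calculi for S5 (details vary across presentations), of how rules for necessity might look on this reading.

```latex
% A hypersequent records a structured position: assertions and denials
% partitioned into zones (dialectical contexts), separated by '|':
%     X_1 |- Y_1  |  X_2 |- Y_2  |  ...  |  X_n |- Y_n

% Necessity, right rule: if denying A in a fresh, otherwise unconstrained
% zone is out of bounds, then denying Box A in the original zone is too.
\frac{\mathcal{H} \mid X \vdash Y \mid {}\vdash A}
     {\mathcal{H} \mid X \vdash \Box A, Y}

% Necessity, left rule: asserting Box A in one zone commits you to A in
% any zone -- a necessity claim impinges on every dialectical context.
\frac{\mathcal{H} \mid X \vdash Y \mid X', A \vdash Y'}
     {\mathcal{H} \mid X, \Box A \vdash Y \mid X' \vdash Y'}
```

On this reading, possible worlds reappear, if at all, only as the bookkeeping of zones in a position, not as the objects that modal vocabulary describes.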

