Great Moments in Logic

January 2004

Bernhard Bolzano (1781-1848)

[Image of Bolzano] Bolzano was a philosopher, mathematician and logician who had important things to contribute in each of these fields. He’s probably best known for what he has done in mathematics: the precise definition of continuity when it comes to real-valued functions. But for my money (and in my discipline) Bolzano was worth much more than this. He was the first philosopher to give a precise analysis of logical consequence in terms we would recognise today.

(This was important for his project of understanding mathematics, and making it clearer, because mathematicians were getting into knots considering infinite numbers, infinitesimal numbers, and many other seemingly paradoxical things. The 1800s involved mathematicians painstakingly going over old reasoning step-by-step to see if (and how) the reasoning worked, and to see where it might go wrong. You can really only do this if you have a good understanding of how reasoning works. This is what Bolzano was trying to supply.)

Anyway, he argued that you can tell that an argument from premises to a conclusion is logically valid if and only if it never proceeds from truth to falsity no matter how you change the non-logical vocabulary in the argument. So

Sally is coming to the party.
If Sally is coming to the party, Jim will be happy.
Therefore Jim will be happy.

is valid because we will never step from truth to falsity no matter how we change the subject matter. For example

Sally is joining the army.
If Sally is joining the army, she is putting her life at risk.
Therefore Sally is putting her life at risk.

works too. In fact any argument with the shape

If p then q
p
Therefore q

works just as well. Note that the closely related shape

If p then q
q
Therefore p

doesn’t work, and we can see that it doesn’t work by looking at an example:

If it’s Thursday then it’s a weekday
It’s a weekday
Therefore it’s Thursday

This works sometimes, but not all the time: on a Friday, for example, the premises are true but the conclusion is false.
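
Bolzano’s test is easy to play with directly. Here’s a minimal sketch in Haskell (the formulation and all the names are mine, not Bolzano’s): represent an argument form by its premises and conclusion as functions of truth values, and check every assignment.

    import Control.Monad (replicateM)

    -- An assignment gives a truth value to each of the letters p, q, ...
    type Assignment = [Bool]

    -- An argument form is valid iff no assignment makes every premise
    -- true and the conclusion false.
    valid :: [Assignment -> Bool] -> (Assignment -> Bool) -> Int -> Bool
    valid premises conclusion letters =
      and [ conclusion v | v <- replicateM letters [False, True]
                         , all ($ v) premises ]

    -- From "if p then q" and "p", infer "q": valid.
    modusPonens :: Bool
    modusPonens = valid [\[p, q] -> not p || q, \[p, _] -> p]
                        (\[_, q] -> q) 2

    -- From "if p then q" and "q", infer "p": invalid.
    affirmingTheConsequent :: Bool
    affirmingTheConsequent = valid [\[p, q] -> not p || q, \[_, q] -> q]
                                   (\[p, _] -> p) 2

    main :: IO ()
    main = print (modusPonens, affirmingTheConsequent)  -- (True, False)

The failing assignment for the second shape is exactly the “Friday” case above: q true, p false.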

Now, it’s still very much an open question as to what you are allowed to change, and what you should keep fixed when you’re testing an argument. Bolzano is really interesting here, because he says that it’s entirely a matter of convention. Different rules will give you a different story when it comes to what’s valid and what’s not. And that is pretty appealing to me.

Anyway, Bolzano’s analysis of logical consequence is a great step in the direction of constructing modern formal logic. We’ll see more of where it leads soon.

Read some more about Bolzano at the Interactive Real Analysis website.

George Boole (1815-1864)

[Image of Boole] Boole’s book The Laws of Thought (1854) is a landmark in the algebraic treatment of logic. That is, he showed how treating propositions or statements like quantities results in a pleasing formal structure.

The ideas are simple. For every proposition x there is its negation (or denial) ~x. The negation of a negation is the original statement (to deny ~x is to say no more or no less than x), so ~~x is identical to x. Negation acts sort-of like minus does with numbers. (Sort-of, anyway.) Similarly, for statements x and y we can form their conjunction x and y and their disjunction x or y. (To make the algebraic connection clear, Boole used xy for the conjunction of x and y, and x+y for the disjunction of x and y.) Then x or y is identical to y or x, and x or (y and z) is identical to (x or y) and (x or z). This makes conjunction and disjunction look a lot like multiplication and addition. But the analogy is at most an analogy, because unlike the case with multiplication and addition, x and x and x or x are both just x, and ~(x and y) is ~x or ~y.
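
With only two truth values, laws like these can be checked by brute force. A small Haskell sketch (my own rendering; the law names are the usual modern ones, not Boole’s):

    bools :: [Bool]
    bools = [False, True]

    -- Check a law at every combination of one, two or three truth values.
    forAll1 :: (Bool -> Bool) -> Bool
    forAll1 f = all f bools

    forAll2 :: (Bool -> Bool -> Bool) -> Bool
    forAll2 f = and [ f x y | x <- bools, y <- bools ]

    forAll3 :: (Bool -> Bool -> Bool -> Bool) -> Bool
    forAll3 f = and [ f x y z | x <- bools, y <- bools, z <- bools ]

    main :: IO ()
    main = mapM_ print
      [ forAll1 (\x -> not (not x) == x)                              -- ~~x = x
      , forAll2 (\x y -> (x || y) == (y || x))                        -- commutativity
      , forAll3 (\x y z -> (x || (y && z)) == ((x || y) && (x || z))) -- distribution
      , forAll1 (\x -> (x && x) == x && (x || x) == x)                -- idempotence
      , forAll2 (\x y -> not (x && y) == (not x || not y))            -- De Morgan
      ]  -- every line prints True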

Boole characterised laws that he thought all systems of propositions must satisfy, and the resulting structures are called Boolean Algebras in his honour. They’re important concepts to this day.

Here’s more on Boole: Entries in the Oxford Companion to Philosophy, and the Oxford Dictionary of Scientists. Then there’s Boolean Algebra and its Applications showing some hints of where his research has gone today, and Boole’s original book Laws of Thought is still available today!

Georg Cantor (1845-1918)

[Image of Cantor] Georg Cantor conquered the infinite. Well, maybe he didn’t conquer it, but he at least gave us a new way of conceptualising infinite quantities, and this is an amazing feat however you look at it.

Everyone agrees that some collections are infinite and some are not. The collection {1,2,3} of the first three counting numbers is finite: it’s bounded. But the collection {1,2,3,4,5,…} of all the counting numbers is not finite. It’s infinite, unbounded: it never comes to an end. Cantor showed that this is not the end of the story at all. He showed that there are bigger and bigger infinite numbers.

Hold on, you might say, how can things be bigger than infinite? Isn’t an infinite collection as big as you can get? Cantor’s answer is a surprising (qualified) no. It’s qualified because if by “infinite” you simply mean “not finite” then there’s nothing bigger than infinite. However, if you mean to say that there are no collections of things bigger than the collection {1,2,3,4,5,…} of all counting numbers, then Cantor can show that you’re wrong. But to do this, he needs to explain a little more about what we mean by “bigger than” and “smaller than” and “same size as” when it comes to collections.

This is pretty straightforward when it comes down to it. Two collections are the same size when you can correlate things from one collection with things from the other: pair each thing in the first with exactly one thing in the second, with nothing left over on either side. So, {1, 2, 3} is the same size as {Christine, Greg, Zachary}. These collections have the same size, as we can pair them up: Christine (1), Greg (2), Zachary (3).

One collection is at least as big as another collection if you can pair the things in the second up with things in the first, and maybe have things in the first collection left over. So, for example, {Christine, Greg, Zachary} is at least as big as {1,2}, because we can pair up Christine (1), Zachary (2) and have Greg (that’s me) left over, uncounted.

One collection A is smaller than another collection B just when A is not at least as big as B. So, A is smaller than B if there’s no way of pairing the things in B with things in A, even allowing some things in A to be (possibly) left over. We try to match B things up with A things, but we haven’t got enough A things to go around.

This roundabout way of defining things is important, because Cantor wants to say that the collection {1,2,3,4,5,…} of all counting numbers is the same size as the collection {2,4,6,8,…} of all even numbers, even though the even numbers sit inside the counting numbers with some left over. You’d think that this means that the even numbers are fewer in number than the counting numbers, but by our first definition the two collections are the same size, since they can be paired up. We can pair up all the numbers with the even numbers like this: 1 with 2, 2 with 4, 3 with 6, … n with 2 times n.

This is all good fun stuff (and it’s mind-bending when you think about what you can do with these infinite collections) but it’s not even the start of Cantor’s real discovery. Cantor’s best work was in showing that there had to be collections bigger than the collection {1,2,3,…}. This is a truly tricky piece of reasoning, but I think I can explain it. Remember how you can represent real numbers between 0 and 1 as decimals, numbers like 0.5, or 0.32, or infinite repeating ones like 0.333333… (that’s 1/3) or even infinite non-repeating ones like 0.14159… (that’s pi minus 3). Now Cantor showed that the collection of all real numbers between 0 and 1 is truly bigger than the collection of counting numbers. Here’s how he did it. He said, suppose that you did have a list of all of these numbers: then this list would contain all the numbers between 0 and 1. It would look something like this:

0.12345145…

0.11111111…

0.34989901…

0.31415926…

0.88888888…

.

.

.

but now, I’ll show you a number that can’t be on the list. Take the first digit (after the decimal point) in the first number, the second in the second number, the third in the third number and so on. You should be focussing on these digits, marked here in brackets:

0.[1]2345145…

0.1[1]111111…

0.34[9]89901…

0.314[1]5926…

0.8888[8]888…

.

.

.

Now, make a new number out of these digits, changing each one by adding one (and if a digit is 9, go back to zero). So the number we get is

0.22029…

and this number could not be in our original list of numbers, because it is not the same as any number on the list. (The nth number on the list differs from this number in the nth place.) So, the list you said was exhaustive, numbering all of the numbers, isn’t.

That is Cantor’s diagonalisation argument, and it’s a truly creative piece of work, showing that there are different infinite sizes of collections. The argument can be made more general, to show that for any collection there is one bigger, and that’s a mind-boggling fact which we still haven’t truly come to understand. (If you think you understand it, ask yourself this: what happens when you apply it to the collection of everything that there is?)
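
Lazy lists make the construction easy to play with. Here’s a toy Haskell version (mine, obviously, not Cantor’s): represent a real between 0 and 1 by its infinite stream of decimal digits, and diagonalise.

    -- A real between 0 and 1, as an infinite stream of decimal digits.
    type Real01 = [Int]

    -- Differ from the nth real on the list in its nth digit:
    -- add one, sending 9 back to 0.
    diagonal :: [Real01] -> Real01
    diagonal reals = [ (r !! n + 1) `mod` 10 | (r, n) <- zip reals [0 ..] ]

    -- The start of the list in the example above, padded with tails.
    example :: [Real01]
    example =
      [ [1,2,3,4,5,1,4,5] ++ cycle [0]
      , cycle [1]
      , [3,4,9,8,9,9,0,1] ++ cycle [0]
      , [3,1,4,1,5,9,2,6] ++ cycle [0]
      , cycle [8]
      ] ++ repeat (cycle [0])  -- fill out the rest of the list somehow

    main :: IO ()
    main = print (take 5 (diagonal example))  -- [2,2,0,2,9], as in the text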

Here’s more about Cantor at The St. Andrews History of Mathematics Site and the Interactive Real Analysis site.

Gottlob Frege (1848-1925)

[Image of Frege] Gottlob Frege is rightly called the father of modern logic. He helped set the tone and direction for the way we study logic to this day. Like many people who made their mark in logic, by training he was a mathematician, but his interests were also philosophical. He realised that the formal techniques of mathematics were good for representing things precisely, but that there were still many more gains to be made. He thought that languages like German (his native language) and English and any other language we actually speak are filled with ambiguities, inconsistencies and features which lead us astray. In particular, the similar structures of claims like

  • Socrates is mortal.
  • A human is mortal. (Or “Humans are mortal”, if you prefer.)
  • No-one is mortal.

lead us to think that the same kinds of things are being said in each case: that we predicate mortality of Socrates, of humans, and of no-one in the three statements. Frege thought this was simply crazy, and that the real structure of what you’re saying in each case is rather different. In the first case, we indeed predicate mortality of the person Socrates. In the second case we’re not predicating mortality of anything in particular (or anything in general). According to Frege what we’re doing is very different: we’re saying that another statement is universally satisfied. What statement is this? It’s the statement if x is human, x is mortal. In saying that humans are mortal, I’m saying that if x is human, x is mortal is true for any x I choose. Or more swiftly: for all x: if x is human, x is mortal. And this is nothing like the claim that Socrates is mortal. It has a very different structure.

The claim that no-one is mortal has a similar feature. It says not that a particular thing (namely, no-one) is mortal, but rather that the claim x is someone who is mortal is never satisfied. If you like: for some x: x is someone who is mortal is not true.
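
Over a finite domain you can see the difference in structure directly. Here’s a small Haskell sketch (the toy domain and predicates are made up for illustration):

    -- A toy domain of individuals.
    data Thing = Socrates | Plato | Rock deriving (Eq, Show, Enum, Bounded)

    domain :: [Thing]
    domain = [minBound .. maxBound]

    human, mortal :: Thing -> Bool
    human x  = x /= Rock
    mortal x = x /= Rock  -- in this toy world, just the humans are mortal

    -- "Socrates is mortal": predicating mortality of a thing.
    claim1 :: Bool
    claim1 = mortal Socrates

    -- "Humans are mortal": for all x, if x is human then x is mortal.
    claim2 :: Bool
    claim2 = all (\x -> not (human x) || mortal x) domain

    -- "No-one is mortal": there is no x such that x is mortal.
    claim3 :: Bool
    claim3 = not (any mortal domain)

    main :: IO ()
    main = print (claim1, claim2, claim3)  -- (True, True, False)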

Frege thought that this new way of analysing expressions gets closer to the real structure of the concepts involved. The new way of writing expressions, making the form you write closer to the forms the concepts actually have, is Frege’s Begriffsschrift, his “Concept Script”. This notation looks very weird (there are lots of lines going all over the place, and funny letters in different places) but it’s almost identical in content to the language we now call First-Order Logic, or Predicate Logic, and which we teach to our introductory logic students. Frege’s insights have stayed with us to this day.

There’s an entry on Frege in the Oxford Companion to Philosophy. You can get Anthony Kenny’s clear introduction Frege at Amazon. Both are worth reading.

Alexius Meinong (1853-1920)

[Image of Meinong] Frege’s analysis of expressions helps deal with some of the weird confusions you can get into with language: when you know that expressions like “Socrates”, “somebody”, “everybody” and “nobody” work very differently, you won’t get so confused wondering whether the word “somebody” refers to a person, and if so, what kind of person he or she is (male, female, tall, short, logician, whatever). No, you’ll realise that terms like “somebody”, “everybody” and “nobody” don’t contribute to expressions by referring to things, but by doing something else: they quantify over things.

Meinong agreed with all of that, but he thought that sometimes there are expressions which genuinely do refer to things, and what they refer to tells us something important about what there is. He was one of those philosophers who thought that language, when properly understood, is a guide to ontology. One place where Meinong disagreed with many other philosophers was in the kind of ontology he was prepared to admit on the basis of language. For Meinong, a claim like “I’m thinking of a golden mountain” might be true, and it’s true because there’s something I’m thinking of. What is it I’m thinking of? It’s a golden mountain. So, there’s a golden mountain that I’m thinking about. Now, are there any golden mountains? Do any of them exist? I don’t think so. It follows that (for Meinong) there are two grades of being. There’s the kind of being that absolutely everything has: golden mountains, square circles, fictional characters, tables, chairs, you and me. And then there’s the kind of being that fewer things have: genuine existence. You and I exist (that is, if you’re not a fictional character reading this) but fictional characters, square circles and golden mountains don’t exist. We can think about them, and they have some properties, says Meinong, but existence is not one of them.

This is a bold claim, and it’s easy to see why some people don’t follow Meinong in this view. Why should language guide ontology in this way? (Of course, this is a general technique used by many analytic philosophers to this day. “For our speech about x to make any sense, we need to posit the existence of things such as y…”) But I think there’s much more of a problem with the view as stated. For Meinong, the golden mountain is golden, and it’s a mountain, because that’s how I’ve characterised it. But it doesn’t exist (there aren’t any golden mountains). Now, what if I try to characterise the existing golden mountain? It’s golden, it’s a mountain, but does it exist? No. So some ways of characterising things work, and others don’t. What’s the difference? It’s a tricky question, and it’s one that Meinong never really answered.

There’s an entry on Meinong in the Oxford Companion to Philosophy, and it’s a real shame that both Karel Lambert’s Meinong and the Principle of Independence and Richard Routley’s Exploring Meinong’s Jungle, the two best books on Meinong’s ontology, are pretty hard to get. You’ll have to steal them from me, I suppose.

Giuseppe Peano (1858-1932)

[Image of Peano] Giuseppe Peano was an Italian mathematician, remembered for what are called ‘Peano’s Axioms’, giving the properties of the numbers 0, 1, 2, 3,… and the operations of successor (going to the next number), addition and multiplication. The axioms group into sections. The first few govern zero and successor:

  • 0 is a number which is not the successor of any number
  • every number has just one successor which is a number
  • no two numbers have the same successor.

Then for addition we have two axioms:

  • x added to 0 is x.
  • x added to the successor of y is the successor of (x added to y)

And for multiplication we have two more axioms:

  • x multiplied by 0 is 0
  • x multiplied by the successor of y is (x multiplied by y) added to x

Both batches of axioms tell you how the operation works by showing how it works at the bottom of the scale (how to add zero or multiply by zero), and then they show you how to do it higher up the scale (adding the successor of a number, or multiplying by the successor of a number) in terms of how it works lower down the scale.

Let’s see one example. Here’s how the axioms tell you what you get when you add 2 and 3. 2 is the successor of the successor of zero, so let’s write that “ss0”, and 3 is the successor of the successor of the successor of zero, so let’s write that “sss0”. In general, if x is some number, let’s write “sx” for the successor of x.

So, to get ss0 + sss0, the rules say that this is the same number as s(ss0 + ss0), because we’re adding sss0 (which is the successor of ss0) to ss0, so the result is the successor of the smaller addition (ss0+ss0). Let’s display this so that we can see it:

ss0+sss0 = s(ss0+ss0)

Now, what’s ss0 + ss0? The process is the same: ss0+ss0 = s(ss0+s0). The rules say we can trade an “s” on the right of an addition sign in for one outside the whole addition. So, we get

ss0+sss0 = s(ss0+ss0) = ss(ss0+s0)

But we can do this one more time. ss0+s0 = s(ss0+0), so

ss0+sss0 = s(ss0+ss0) = ss(ss0+s0) = sss(ss0+0)

But now, we know that ss0 + 0 = ss0, since adding zero gets you nowhere new. So,

ss0+sss0 = s(ss0+ss0) = ss(ss0+s0) = sss(ss0+0)= sssss0

But sssss0 is how we write 5, and indeed we’ve shown that 2+3=5.
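
The successor notation transcribes almost verbatim into Haskell, and each pair of axioms becomes the two equations of a recursive definition. A sketch (my transcription, not Peano’s notation):

    -- A number is zero, or the successor of a number.
    data Nat = Z | S Nat deriving Show

    -- x added to 0 is x; x added to the successor of y is the
    -- successor of (x added to y).
    add :: Nat -> Nat -> Nat
    add x Z     = x
    add x (S y) = S (add x y)

    -- x multiplied by 0 is 0; x multiplied by the successor of y
    -- is (x multiplied by y) added to x.
    mul :: Nat -> Nat -> Nat
    mul _ Z     = Z
    mul x (S y) = add (mul x y) x

    two, three :: Nat
    two   = S (S Z)
    three = S (S (S Z))

    main :: IO ()
    main = print (add two three)  -- S (S (S (S (S Z)))), that is, sssss0

Evaluating add two three walks through exactly the chain of equations displayed above.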

You can show lots more things with these axioms, but they don’t do everything Peano wanted to think about. In particular, for any numbers x and y at all, you can show that x + y = y + x. However, with these axioms as they stand, you can’t show that for every number x and y, x + y = y + x. Do you get the difference here? The first says I’ve got a proof for every instance. The second says that I’ve got a proof for the general rule. That’s a different thing completely. To show that you’ve got a proof for the general rule we need something to say that whatever we can prove for the instances of numbers we can pick (0, s0, ss0, etc) goes for all of the numbers. A rule which does this is the rule of induction.

  • If something holds of 0, and if whenever it holds of a number it also holds of the successor of that number, then it holds of all the numbers.

This rule assures us that we can prove statements about all the numbers, like the statement that the order of addition doesn’t matter.

Now, proofs of facts using Peano’s rules are extremely longwinded, and you might be wondering what this is good for. After all, I already knew that 2+3=5, and I think you did too. Peano’s rules don’t explain what we do when we count or add or multiply. They’re intended to distill to their essence the postulates which govern the counting, adding and multiplying process, even if they’re not the rules we actually follow when we do things. (There are many different ways of learning to multiply. But if you don’t agree with Peano’s rules and don’t get the same results as Peano does for his answers, then you’re either making a mistake, or you’re doing something other than counting.) Peano’s work was essential in clarifying and codifying what counting, adding and multiplying actually involve, and it is another great moment in the work of logicians in the 19th and 20th Centuries.

Peano’s axioms for arithmetic weren’t actually first invented by Peano: the great mathematician Dedekind was responsible for the axioms which Peano took up, clarified, detailed and explained, and it was through Peano that the work was disseminated to Russell and Whitehead, whose magnificent Principia Mathematica shaped a lot of work in the 20th Century.

I find it charming that Peano didn’t just do lots of mathematics and logic. The Oxford Dictionary of Scientists entry on Peano tells us that “among his extramathematical interests he was a keen propagandist for a proposed international language, Interlingua, which he had developed from Volapuk.” Quaint.

Here’s more on Peano from Oxford Companion to Philosophy, and the Oxford Dictionary of Scientists. The hard core among my readers might like to read Richard Kaye’s wonderful Models of Peano Arithmetic.

Edmund Husserl (1859-1938)

[Image of Husserl] What? Husserl wasn’t a logician, he was a phenomenologist! That’s the cry of my dedicated analytic philosopher readers. But a response like this presumes that things at the turn of the 19th/20th Century looked to their inhabitants rather like we think they should, looking back from beyond the great divide between so-called-analytic and so-called-continental philosophy. Husserl couldn’t have conceived of such a division, and as far as I can tell, he’s pretty much an analytic philosopher along with Russell, Wittgenstein, and the rest of the gang. (And hey, didn’t Frege live on the continent? And didn’t half of the rest of them? It’s a crazy distinction, the analytic/continental one, when you think of it.)

But why does Husserl warrant a place on my list of great moments in logic? For two reasons. The first is to mourn what might have been. The so-called split which means that people interested in phenomenology (a detailed analysis of phenomena, the structure of appearance, bracketing out questions of existence) can’t talk to people interested in logic is a very sad one. Looking back to Husserl, who was active before such a split occurred, makes me wistful for what might have been and hopeful for what could yet be. Husserl, the great phenomenologist and the one who inspired Heidegger, Sartre and many others, was a student and assistant of the great mathematician Karl Weierstrass, and for fifteen years he was a colleague and close friend of Georg Cantor.

But Husserl is important too for what he achieved and what we can learn from him. For Husserl, along with Frege and others, helped get logic out of the quagmire of psychologism (I’m showing my cards here by the way I describe this, aren’t I?). You can see the confusion even in someone as great as Boole. He calls the laws of logic the Laws of Thought. And that is confusing, because it could mean at least two very different things: either the laws governing how people do reason (like laws of physics, which are intended to describe how objects move and interact) or the laws governing how people ought to reason. Only the latter kind of analysis gets the connection between logic and thought right, only it gives logic its normative edge, and only this gets the phenomenon of teaching first-year logic correct. (If logic is how people actually think, then why do so many of my students find logic hard to learn?) Husserl’s Logical Investigations explores this and related themes.

More about Husserl can be found in the Oxford Companion to Philosophy.

David Hilbert (1862-1943)

[Image of Hilbert] David Hilbert not only had great style in headwear, he was pretty influential when it came to setting the agenda in mathematics and logic too. He’s famous for many different things, including his work in real analysis (Hilbert spaces) and on the Foundations of Geometry, but he’s most remembered for the 23 problems he set the mathematical community at an international congress in 1900. The great moment for me in this work is the spirit in which the enterprise was offered to the community. These new techniques of proof and rigour and clarity have helped us see these problems and issues anew. These are clearly stated problems with answers to be found. We have the techniques to solve these problems in the affirmative or the negative. For Hilbert, proof and consistency were at the heart of mathematical technique. Proof, in clear logical steps, was the guarantee of truth. Consistency (the absence of any contradictions or other clashes in your assumptions or axioms) is the guarantee of existence. In both cases, logic is placed at the heart of the discipline of mathematics.

Hilbert’s scene setting was effective: the techniques Gödel and Cohen used to give an answer to Hilbert’s first problem on the cardinality of the collection of real numbers, and Gentzen’s proof of the consistency of arithmetic (dealing with Hilbert’s second problem), were new and fruitful logical techniques. Not only was logic at the heart of mathematics in a way it never really had been before, but new logical techniques and results were being developed so that it could fill that role. The place of logic in the heart of mainstream mathematical practice, and not merely in foundational studies, is, in part, due to Hilbert’s conception of the subject.

Here’s information about Hilbert in the Oxford Dictionary of Scientists.

Ernst Zermelo (1871-1953)

[Image of Zermelo] Zermelo did for set theory, the mathematical theory of collections or sets, what Dedekind and Peano did for arithmetic. He was the first to have a bash at working out principles which governed the mathematical practice of set-formation. The theory of sets mathematicians most often appeal to is a direct descendant of Zermelo’s original work. ZF set theory is named after Zermelo and Fraenkel, who revised and extended his work.

A nice historical account of Zermelo’s contribution to set theory can be found at the St. Andrews History of Mathematics site. More information about Zermelo himself can be found there too.

Jan Lukasiewicz (1878-1956)

[Image of Lukasiewicz] Lukasiewicz is the first beardless logician on our journey since Boole, and similarly, he’s the first algebraist since Boole too. Lukasiewicz is important for many reasons. For one, he was one of the leading lights of logic in Poland for many years (until he left Poland during the Second World War, eventually settling in Ireland), responsible, along with Lesniewski, for the logic group in Warsaw; Tarski (who we’ll learn about later) was educated in this school.

For another, though, Lukasiewicz was the first truly non-classical logician to gain a foothold for alternative views of logic. The guiding principle behind both Boole’s and Frege’s approaches to logic was that statements are either true or false. Lukasiewicz investigated logical structures where this condition is violated. In particular, he looked at three-valued logical systems which kept a way open for things which are undetermined. After all, if our statements are about the future, then some might be made true by present facts, some might be made false by these circumstances, and some might neither be made true nor ruled out — at least if the future is genuinely open. Lukasiewicz’s “three-valued logic” is one way of modelling such phenomena. He also considered a more radical algebraic system where statements can have one of infinitely many values, one for each number between 0 and 1, representing degrees of truth between complete falsity and complete truth. This has now reached some popularity under the name “fuzzy logic”. This just goes to show that if you want to be popular, you ought to get a marketer to run his or her eyes over the papers you write before you submit them for publication. “Fuzzy logic” will sell much better than “Lukasiewicz’s Infinitely Valued Logic”.
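
The three-valued tables are easy to write down and experiment with. Here’s a sketch in Haskell (the tables are Lukasiewicz’s standard ones; the rendering and names are mine):

    -- Lukasiewicz's three values: false, undetermined, true.
    data Three = F | U | T deriving (Eq, Ord, Show, Enum, Bounded)

    neg :: Three -> Three
    neg T = F
    neg U = U
    neg F = T

    -- Conjunction and disjunction: minimum and maximum in the order F < U < T.
    conj, disj :: Three -> Three -> Three
    conj = min
    disj = max

    -- Implication is true when the antecedent doesn't outrun the consequent;
    -- an undetermined gap drops the value one step.
    imp :: Three -> Three -> Three
    imp a b | a <= b = T
    imp T F          = F
    imp _ _          = U  -- the remaining cases: T to U, and U to F

    main :: IO ()
    main = do
      mapM_ print [ (a, b, imp a b) | a <- [F ..], b <- [F ..] ]
      print (disj U (neg U))  -- U: excluded middle is no longer a law

Notice the last line: when p is undetermined, so is p or not-p, which is exactly the gap the open future needs.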

A little on Lukasiewicz can be found in the Oxford Companion to Philosophy.

Bertrand Russell (1872-1970)

[Image of Russell] Bertrand Russell is the first philosopher on this list from an English-speaking background, and he is probably best considered as the one person who played the crucial role in founding the Anglo/American analytic tradition in philosophy.

Russell’s magnum opus (written with Alfred North Whitehead) is Principia Mathematica (don’t buy the full thing unless you have lots of disposable wealth; the lite version is enough of the good stuff to give you the flavour of the work). What’s PM all about? It’s the first and greatest attempt in the research programme of logicism, the reduction of mathematics to logic. Now, this is an immensely creative work, because it played a very important part in helping define the notion of logic for the 20th Century. As Alberto Coffa has taught me, logicism wasn’t a matter of taking a predefined notion of “logic” and showing how mathematics could be reduced to it. Rather, it was a matter of expanding the notion of logic so as to make a reduction of mathematics to logic more feasible. In Kant’s time, logic was a matter of the analysis of concepts into their constituents. By the time Russell and Whitehead got their hands on it (after it had passed through the work of Frege) it became so much more.

One thing Russell can be thanked for, above and beyond PM, is a miniature example of logicism and analytic philosophy at work. It’s his little vignette on definite descriptions, which stands to this day as a gem in analytic philosophy. The analysis concerns the difference between claims like

George Bush is articulate.

The present U.S. President is articulate.

They look like they have the same form. In the old-style Aristotelian analysis, they’re both of subject-predicate form, with the subject (George Bush, or the present U.S. President) on the one hand and the predicate being articulate being applied to him. But there is an important difference between these claims: the first uses a name and the second uses a description. The description in the second claim is a definite description because it refers to the present U.S. President, not simply a present U.S. President. (That would be an indefinite description.) Definite descriptions seem to have a number of important properties. For them to work, for the claim the present U.S. President is articulate to be true, you need three criteria to be satisfied. First, there needs to be a U.S. President; if not, your claim fails because there’s no U.S. President to be articulate. Second, there has to be at most one U.S. President. If there is more than one, then it’s unclear which one I was talking about, and (according to Russell at least) my claim fails too. Third, if there is exactly one U.S. President then my claim is true if and only if that person is, indeed, articulate.

So, to sum up the analysis, Russell says that a claim of the form “the F is a G” is not really of subject-predicate form at its base. Rather, it has the following form

There is a thing which is F, it is the only F, and it is a G as well

This is a typical example of an “analysis” in analytic philosophy. The surface form of an expression may be deceptive. Different logical work may be involved in the claim, and you might need to expend effort to see what is actually being said.

Russell himself appears to have been rather confused about the significance of what he was doing. Sometimes he described his work as replacing objects with logical constructions. But this is not what’s going on here. The logical construction does not replace any object at all; the present President of the U.S. is still a flesh and blood human being and not a “construction”. The analysis and reduction occurs at the level of semantics. The meaning of a definite description claim is reduced to being a construction out of more primitive kinds of claims, involving predicates and quantification. Analytic philosophy of this kind opened the way for a new kind of reductionism, which didn’t explain away objects or construct them out of simpler entities, but rather explained different kinds of statements by giving their significance in terms of other, simpler statements. You don’t need to refer to special kinds of objects as the referents of definite descriptions, for example, if you have an analysis of definite descriptions which doesn’t appeal to reference at all.
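
Russell’s three criteria translate directly into a checkable condition over a finite domain. A little Haskell sketch (the domain, people and predicates are invented for illustration):

    -- "The F is a G": there is an F, at most one F, and it is a G.
    theFisG :: [a] -> (a -> Bool) -> (a -> Bool) -> Bool
    theFisG domain f g =
      case filter f domain of
        [x] -> g x    -- exactly one F: the claim stands or falls with G
        _   -> False  -- no F at all, or more than one: the claim fails

    data Person = Alice | Bob | Carol deriving (Eq, Show, Enum, Bounded)

    president, articulate :: Person -> Bool
    president  = (== Alice)
    articulate = (/= Bob)

    main :: IO ()
    main = print (theFisG [minBound .. maxBound] president articulate)  -- True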

Bertrand Russell is all over the net. The Stanford Encyclopedia of Philosophy entry on him is especially good, and the Russell Archives at McMaster University in Canada are also worth a visit.

L. E. J. Brouwer (1881-1966)

[Image of Brouwer] Brouwer was a Dutch mathematician who founded the school of intuitionism, a radical tradition in mathematics. Intuitionism is radical because it takes much of contemporary mathematics to be simply mistaken: mathematicians took a wrong turn in following Cantor’s direction in set theory, and the classical tradition in analysis (the theory of real-valued functions). Mathematics has gone bad when dealing with the infinite, because traditional mathematical practice does not do justice to what mathematics is actually about, according to Brouwer.

For Brouwer, mathematics is primarily governed by the intuition of the knowing and proving “mathematical subject”. Now, intuition doesn’t mean “gut instinct” like it does in contemporary language. The word is a term of art looking back at least to Kant and his Critique of Pure Reason. Intuition, for Brouwer, just means a mental faculty. The important feature of the intuitions relevant to mathematics is that they are pure intuitions of space and time. These intuitions aren’t given by experience; rather, they structure experience. We put our experiences together in a “manifold” of moments and locations: we don’t extrapolate the idea of time or space from our experiences (for Kant, and for Brouwer). Anyway, the details are irrelevant for us. What matters is that the mental ability to count, to go on to the next thing, an intuition of duality or two-ness, is what is important in mathematics. The content of mathematical judgement is not some correspondence with an external world of mathematical objects. Rather, the contents of mathematical judgements (like the claim that 2+2=4, or that the area of a circle is proportional to the square of the radius, or that continuous real-valued functions which are somewhere positive and somewhere negative are also zero somewhere) are to be found in the possible verifications or constructions involved.

Now, this approach is both conservative (it’s Kantian, not the radical approach of logicism stemming from Frege and Russell) and radical. It’s radical because it motivates a rejection of traditional mathematical claims, like the law of the excluded middle: the thesis that every proposition is either true or not true. Brouwer takes this to be mistaken because of the nature of mathematical judgement and proof. Consider this simple example: suppose I give you a sequence of numbers, one by one. 1, 1/2, 1/4, 1/8, … I ask you whether or not this sequence converges to zero. If this sequence is truly infinite (I’ll just keep giving you numbers as long as you keep asking for them) you may never know that the sequence is converging to zero, or that it’s not going to zero. (If I just give you the numbers one-by-one, and I do not give you a rule which generates the list, there’s always a chance the sequence will stop shrinking, and I go 1/8, 1/8, 1/8, …, or that I start going up instead of going down.) Brouwer takes this to be a reason to reject the law of the excluded middle. We oughtn’t say that the sequence is either converging or not. Once you make this change to mathematical practice, a lot else looks very different. Intuitionistic mathematics is alive and well to this day as a minority tradition.

More information about Brouwer can be found at the Oxford Companion to Philosophy.

Thoralf Skolem (1887-1963)

[Image of Skolem] Thoralf Skolem was a Norwegian logician and mathematician who continued the development of set theory along the lines started by Cantor and Zermelo. Skolem’s name is associated with one of the most interesting results in metalogic (that is, it’s a result about logic, not a result in logic), which we now call the Löwenheim-Skolem Theorem. This states that if a statement (or set of statements) in Frege’s predicate logic can be satisfied by a model with infinitely many things in the domain, it can also be satisfied by a model with only countably many things. So predicate logic, in an important sense, cannot tell the difference between the countably infinite and the uncountably infinite.

The most stunning and problematic consequence of this is Skolem’s Paradox. Take the theory of sets. In this theory you can distinguish between countably infinite sets and uncountably infinite sets (a countably infinite set is one which can be put into a one-to-one correspondence with the numbers {1,2,3,…}), and furthermore, you can prove that there are uncountably infinite sets (Cantor’s construction can be carried out). This theory, if it is consistent, has a model. This model must be infinite (it contains elements for each of the numbers 1, 2, 3, … at least!) so by the downward Löwenheim-Skolem theorem, it also has a countable model. But what of the sets that the theory takes to be uncountably infinite? From inside the model, they are uncountable. But from outside, we see that they are not. What is the story?

Well, that’s a real problem. Skolem’s enticing response was to say that the difference between the countable and the uncountable was relative and not absolute. Another response would be to say that the predicate logic theory of sets is incomplete, and it must be extended somehow.

Interesting information about Skolem and the application of his ideas can be found in the Nordic Journal of Philosophical Logic’s special issue on Skolem.

Ludwig Wittgenstein (1889-1951)

[Image of Wittgenstein] The next entry on our tour simply has to be the inimitable LW. Wittgenstein is an incredibly fertile philosopher: his work has inspired philosophers from all traditions. His Tractatus Logico Philosophicus (written in 1918, after Wittgenstein served in the Austro-Hungarian army) is an enigmatic tract, which starts (in one translation) like this.

1 The world is everything that is the case.

1.1 The world is the totality of facts, not of things.

1.11 The world is determined by the facts, and by these being all the facts.

1.12 For the totality of facts determines both what is the case, and also all that is not the case.

1.13 The facts in logical space are the world.

And it continues in this vein. It is a work in logical atomism, the doctrine that the world is made up of atomic facts, the primary bearers of information, and it motivates the logical techniques of truth tables and other interesting formal things as consequences of this picture.

Much more important for Wittgenstein was the boundary between sense and nonsense, or between what could be said and what could not be said but only shown. The closing words of the Tractatus are

7 Whereof one cannot speak, thereof one must be silent.

What is most interesting to me is that Wittgenstein explicitly acknowledges that his own book is, by his own lights, expressing what cannot be said. The entire book is an act of showing and not of saying. It’s a peculiar kind of showing, using German (or English) words, which the audience reads in the traditional manner. (Wittgenstein himself was later unsatisfied with his own approach and he attempted to paint a very different picture about how language works.)

More information about Wittgenstein can be found at the Oxford Companion to Philosophy. There are online editions of both the Ogden-Ramsey and the Pears-McGuinness translations of the Tractatus. T. P. Uschanov has a comprehensive collection of links of online resources on Wittgenstein.

It is the mark of LW’s distinction that he is the only person on this page to have a musical written about him. Unfortunately, I have not seen it. The Tractatus has also been set to music, more than once; unfortunately, I have not heard any of it.

Rudolf Carnap (1891-1970)

[Image of Carnap] Rudolf Carnap was a German philosopher, educated in the dominant Kantian tradition, interested in science and epistemology (the theory of knowledge). Carnap had one of the “conversion experiences” common to great evangelists. While sick one day he read some of the new “formal” philosophy of Bertrand Russell, and the scales fell from his eyes. The combination of rigour and philosophical applicability convinced Carnap that from that day henceforth, he wanted to do philosophy like that.

And so, Rudolf Carnap worked in the tradition started by Russell, of the application of formal techniques to philosophical problems. But in Carnap’s hands, these techniques had very different outcomes. Carnap’s masterworks, the Aufbau and the Logical Syntax of Language, indeed use techniques rather like Russell’s in Principia Mathematica and The Philosophy of Logical Atomism, but the resulting philosophical picture could not be more different. While Russell was an atomist and foundationalist for many of his years (though what these atoms might be was ever elusive for Russell), Carnap was temperamentally very different. He was, by nature, a kind of accommodating pluralist. He thought that the new logical techniques would help us see what was genuinely being said by different philosophical positions, and that we might see the things on which different philosophers agree, those on which they have genuine disagreements, and those upon which the disagreement is a purely verbal matter. An example from the philosophy of mathematics might give you an idea of how this works. In mathematics, we have rules for adjudicating disagreement in many cases: if I say that 37+56=83, then you can show me where I went wrong by explaining how the addition actually works: 37+56 is 93 (perhaps I forgot to carry the 10 from the addition of 7 and 6). Similarly, if I deny that there is a prime number between 11 and 17, you can point out that 13 is a prime number between them. The meanings of the mathematical terms dictate how they should be used, and they give us answers in these cases. They don’t help in the argument between the platonist (who thinks that numbers really exist) and the formalist (who thinks that numbers really don’t exist). If either attempts to get into a discussion with Carnap about whether or not there really is a prime number between 11 and 17, he’ll simply say “yes, there is a prime number between 11 and 17 — it’s 13 — but I don’t know what you mean by saying ‘really’ in the question.” No sense has been given to the question of the existence of numbers, over and above the sense that has been given by virtue of the meanings of mathematical terminology. It’s the job of techniques from logic to clarify what can be said, and what cannot. In this sense, Carnap is quite close to the Wittgenstein of the Tractatus.

More information about Carnap can be found at the Oxford Companion to Philosophy.

Arend Heyting (1898-1980)

[Image of Heyting] Arend Heyting was a brilliant Dutch logician: beardless, as you can see; the Return of the Beards is still some decades away. Heyting’s claim to fame was to do something quite against the spirit of the intuitionist enterprise, but which made intuitionism a respectable and living (if minority) tradition in logic.

Heyting formalised intuitionistic logic. That is, he codified the kinds of inferences which are warranted by the lights of an intuitionist. This made “intuitionistic logic” an object of study, amenable to many of the same techniques that Frege and others had developed for the dominant tradition in logic, which we now call “classical” logic.

More information about Heyting can be found at the St. Andrews’ History of Mathematics Entry on him.

Haskell Curry (1900-1982)

[Image of Curry] Haskell Curry was an American student of Hilbert. His greatest innovative contributions to logic are in the odd area of combinatory logic. Combinators are strange beasts: they’re like functions which operate on other functions. Combinatory logic is a formal system like other kinds of logic: you have a collection of objects with different rules indicating allowable transformations from objects to other objects. But these combinators do not necessarily express statements, as the formulas in a traditional logic do. They’re functions.

The frame of mind required to think in this way (to see everything as a function) is important in many fields today, and the debt to Curry is sometimes acknowledged and sometimes not. For example, in computer science, the transformation required to move from thinking of a function as giving you an output given a pair of inputs (for example, addition as a function sending x and y to x + y) to thinking of it as giving a function as output when given a single input (so given x the output is the function which sends y to x + y: the adding x function) is called currying in Curry’s honour. This kind of transformation, which involves thinking of functions as first-class entities, is important in many disciplines, such as category theory and functional programming. In fact, my favourite functional programming language Haskell is named after Curry.
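
Haskell (the language) wears this transformation on its sleeve: the standard Prelude functions curry and uncurry convert between the two views of a two-place function.

    -- Addition as a function on a pair of inputs.
    addPair :: (Int, Int) -> Int
    addPair (x, y) = x + y

    -- The curried form: given x, return the "adding x" function.
    addCurried :: Int -> (Int -> Int)
    addCurried = curry addPair

    -- Partial application gives the adding-2 function directly.
    addTwo :: Int -> Int
    addTwo = addCurried 2

    main :: IO ()
    main = print (addPair (2, 3), addCurried 2 3, addTwo 3, uncurry (+) (2, 3))
    -- prints (5,5,5,5)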

Curry made another very important contribution to the tradition: his textbook Foundations of Mathematical Logic is still rich with insight, and it rewards reading and rereading today. It is the first textbook in mathematical logic which takes a synoptic view of different logical systems and which encourages the reader to think philosophically and creatively about the different kinds of formal systems which might be used to model different logical phenomena.

More information about Curry can be found at the St Andrews’ History of Mathematics Entry on him.

Alfred Tarski (1902-1983)

[Image of Tarski] Alfred Tarski’s shadow looms large over 20th Century Logic, and it extends into the 21st Century too. An émigré from Poland, educated in the great Warsaw school of logicians, Tarski brought a new degree of conceptual clarification to a great number of issues in logic. I will sketch two here.

First, Tarski gave the first rigorous definition of what it is for a structure to be a model or an interpretation of the language of predicate logic. He gave precise clauses indicating for each kind of sentence (atomic sentence, conjunction, disjunction, negation, universally quantified, existentially quantified, etc.) what it is for that sentence to be true in a model. To do this, he noticed that you need to do something tricky when it comes to quantified sentences like for all x(if Fx then Gx). For this to be given a sensible truth condition, we want its truth to depend on the truth of the part inside the quantifier. But if Fx then Gx is not a sentence: it’s got an unbound variable in it. (It’s like the term x + 2: you only know what number it is when I tell you the value of the variable x.) So, Tarski solved the problem this way. You don’t define what it is for a sentence to be true in a model. You define what it is for a formula (possibly including free variables) together with an assignment of values to each of the variables to be satisfied by a model. So, for all x(if Fx then Gx) together with the assignment A is satisfied by my model just when the inside bit if Fx then Gx is satisfied together with every variant assignment A’, where a variant assignment is one which assigns exactly the same values as A does to every variable other than x, but may assign x any value whatsoever. This gets the result exactly right. And it gives a clear (well, after you get used to it, anyway!), rigorous and recursive definition of satisfaction, and truth in a model.
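
The clauses are short enough to write out in full for a toy language. Here is a sketch of satisfaction in a finite model, with the variant-assignment trick in the quantifier clause (the datatypes and the example model are my own inventions):

    import Data.Maybe (fromJust)

    type Var = String

    -- Formulas of a tiny monadic predicate language.
    data Formula = Pred String Var          -- Fx
                 | Not Formula
                 | And Formula Formula
                 | ForAll Var Formula

    -- A model: a domain, plus an extension for each predicate letter.
    data Model a = Model { dom :: [a], ext :: String -> a -> Bool }

    type Assign a = [(Var, a)]

    -- Tarski's clauses: satisfaction of a formula in a model,
    -- relative to an assignment of values to the variables.
    satisfies :: Model a -> Assign a -> Formula -> Bool
    satisfies m v (Pred p x)   = ext m p (fromJust (lookup x v))
    satisfies m v (Not f)      = not (satisfies m v f)
    satisfies m v (And f g)    = satisfies m v f && satisfies m v g
    satisfies m v (ForAll x f) =
      -- every x-variant of v must satisfy the inside bit
      all (\d -> satisfies m ((x, d) : v) f) (dom m)

    -- for all x (if Fx then Gx), with "if then" spelled via Not and And.
    everyFisG :: Formula
    everyFisG = ForAll "x" (Not (And (Pred "F" "x") (Not (Pred "G" "x"))))

    -- A model on {1,2,3,4} where the Fs are 1,2 and the Gs are 1,2,3.
    model :: Model Int
    model = Model [1 .. 4] interp
      where interp "F" d = d <= 2
            interp "G" d = d <= 3
            interp _   _ = False

    main :: IO ()
    main = print (satisfies model [] everyFisG)  -- True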

The second insight is related. It’s also on the topic of truth. Here, the idea is not truth in a model, but genuine truth. The kind of truth we attempt to express when we make assertions. Many philosophers have said deep and mystical things about the Nature of Truth. Some have said that truth is a matter of Correspondence to Reality (after all, propositions are designed to reflect the Way the World Is). Others have said that truth is a matter of the Coherence of an overall body of propositions (after all, we test propositions only against other propositions). Still others have said that truth is a matter of What Gets Things Done (after all, in describing things we always aim to Do something, and this governs the kinds of concepts we have). Tarski thought that all accounts like this are misguided. For, given a language which doesn’t say anything about truth, it is easy to define the notion of truth in that language, without appealing to Correspondence, Coherence, or Pragmatic concerns. Suppose we’ve got a sentence in our language, like “snow is white”. (This is a favourite in discussions of truth, for some reason.) We know very well what the sentence means. We want to know whether or not the sentence is true. Well, given that we know what it means, we can say exactly the conditions under which this sentence is true:

“Snow is white” is true if and only if snow is white

Now this tells us exactly the circumstances under which “snow is white” is true. And it doesn’t say anything about Correspondence to Reality, Coherence, or Pragmatic concerns. It is Tarski’s definition of the concept of truth, and it’s become an important benchmark in any discussion of the topic to this day.

More information about Tarski can be found at the Oxford Companion to Philosophy.

Frank Ramsey (1903-1930)

[Image of Ramsey] If I’d achieved as much as Frank Ramsey had by the time I was 27, I’d be glad. Of course, I’m very glad to have made it past 27.

Ramsey not only did amazing work in the philosophy of logic and probability; his insight into how theoretical terms work has also earned his surname a place in the lexicon. Yes, Frank Ramsey is responsible for the technique we now delightedly call Ramsification.

Ramsification is a straightforward notion when you get your head around it. It’s an application of Russell’s account of definite descriptions to theoretical concepts. It’s always been a bit of a mystery how theoretical terms like neutrino actually work. People started talking about neutrinos (I love the name, by the way: little neutral one is a perfect name for a pet, at least if he/she is placid) before they’d ever found conclusive evidence of their existence. It’s not like they’d laid eyes on them and said “I will call this a neutrino”. No, what happened instead is that the scientists endorsed a body of theory which included the predicate “is a neutrino”. The theory will contain things like “Neutrinos have no charge”, “Neutrinos have negligible mass” and other things like this. Now this theory can have sense, and can assign a kind of meaning to the term “neutrino”, in just the same way as Russell’s analysis of definite descriptions shows how sense is assigned to definite descriptions. The theoretical term “neutrino” doesn’t really refer, but you can treat the term as a predicate variable, and the theory really says something like

(for some X)(X has no charge, and X has negligible mass, and …)

And in this way, theoretical terms are introduced into the language.
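
Over a finite domain the Ramsified claim can even be checked mechanically, since “for some X” ranges over the finitely many candidate extensions of the predicate. A toy Haskell sketch (the “theory” and the particles are entirely made up):

    import Data.List (subsequences)

    -- A tiny domain of hypothetical particles, known by number.
    type Particle = Int

    domain :: [Particle]
    domain = [1 .. 4]

    charge, mass :: Particle -> Double
    charge 3 = -1
    charge 4 = 1
    charge _ = 0
    mass 1 = 0.01
    mass 2 = 0.02
    mass _ = 1

    -- The theory, with "is a neutrino" traded in for a predicate variable X:
    -- the Xs have no charge and negligible mass, and there are some of them.
    theory :: [Particle] -> Bool
    theory xs = not (null xs) && all (\p -> charge p == 0 && mass p < 0.1) xs

    -- The Ramsey sentence: for some X, the theory holds of the Xs.
    ramseySentence :: Bool
    ramseySentence = any theory (subsequences domain)

    main :: IO ()
    main = print ramseySentence  -- True: particles 1 and 2 can play the role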

This technique has seen a lot of use in contemporary work coming out of Canberra.

More information about Ramsey can be found at the Oxford Companion to Philosophy.

Alonzo Church (1903-1994)

[Image of Church] Alonzo Church’s influence over logic is long-lived, as he was. I’ll focus on just two ways he has left his mark on the field. First, and most obvious, was his invention of the lambda calculus. This is a notation for representing functions, immensely suited to the same body of theory as Curry’s combinatory logic. In fact, Church’s notation for functional abstraction (using a lambda, which I’ll write here with a backslash “\”) is a uniform technique which, when combined with functional application, gives you exactly the same theory as Curry’s combinators. The notation is simple. The addition function can be represented like this:

\x\y(x+y)

It’s a function which, when applied to a number (say 2), gives you the result \y(2+y). You figure out how it works by taking the first lambda abstraction out the front, removing it, and replacing the variables inside the rest which were bound by that abstraction by the value you chose. The result is still a function. If you apply this to another number (say 3) you get the result 2+3. So addition here is a function which takes its two values one at a time, and finally returns a number.
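
The reduction rule is mechanical enough to implement. Here’s a toy Haskell evaluator for lambda terms with numbers and addition, just enough to play out the example (all my own; a serious version would also rename bound variables to avoid capture):

    -- Lambda terms, with numbers and addition thrown in.
    data Term = Var String | Num Int | Add Term Term
              | Lam String Term | App Term Term

    instance Show Term where
      show (Var x)   = x
      show (Num n)   = show n
      show (Add s t) = "(" ++ show s ++ "+" ++ show t ++ ")"
      show (Lam x t) = "\\" ++ x ++ "." ++ show t
      show (App s t) = "(" ++ show s ++ " " ++ show t ++ ")"

    -- Substitute u for the variable x.
    subst :: String -> Term -> Term -> Term
    subst x u (Var y)   = if x == y then u else Var y
    subst _ _ (Num n)   = Num n
    subst x u (Add s t) = Add (subst x u s) (subst x u t)
    subst x u (Lam y t) = if x == y then Lam y t else Lam y (subst x u t)
    subst x u (App s t) = App (subst x u s) (subst x u t)

    -- Evaluate: beta-reduce applications, and do the arithmetic.
    eval :: Term -> Term
    eval (App f a) = case eval f of
                       Lam x body -> eval (subst x a body)
                       f'         -> App f' a
    eval (Add s t) = case (eval s, eval t) of
                       (Num m, Num n) -> Num (m + n)
                       (s', t')       -> Add s' t'
    eval t         = t

    -- \x\y(x+y)
    addTerm :: Term
    addTerm = Lam "x" (Lam "y" (Add (Var "x") (Var "y")))

    main :: IO ()
    main = do
      print (eval (App addTerm (Num 2)))                -- \y.(2+y)
      print (eval (App (App addTerm (Num 2)) (Num 3)))  -- 5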

More can be said about the lambda calculus, and its use as a language for representing functional abstraction. But I want to go on to consider what might have been Church’s greatest achievement. He founded, in 1936, the Journal of Symbolic Logic, along with its parent body, the Association for Symbolic Logic. The JSL is the most prestigious journal in formal logic, and it has played (especially in its early years) a pivotal role in helping define the community of logicians, by giving a focal point for research and a means for disseminating results broadly. Logicians did not have to resort to scanning over a wide body of mathematics and philosophy journals to find articles of importance to logic. Now they could concentrate in the one place.

More information about Church can be found at the St Andrews’ History of Mathematics Entry on him.

Kurt Gödel (1906-1978)

[Image of Gödel] Gödel’s incompleteness theorem is great in two ways. First, it’s the technical result in logic most cited outside its field. Second, it’s a stunning piece of reasoning which genuinely broke open a new field and closed off other areas of inquiry.

Gödel’s incompleteness theorem shows that Peano’s theory of arithmetic is essentially incomplete. For some sentences in the language of arithmetic, it cannot decide between the sentence and its negation. Gödel shows that this is not an accidental feature of Peano’s theory, to be fixed with a little patch here or there. No: for any theory of arithmetic such that you can tell (by means of a recursive function: look to Turing’s entry coming on up) whether or not something is an axiom of the theory, exactly the same incompleteness phenomenon will hold. Why is this? It’s because an arithmetic theory is rich enough to represent, in some sense, its own facts about provability. That is, it can simulate provability in its own claims about numbers. Gödel showed how claims about sentences and proofs can be “encoded” into claims about numbers, in such a way that if there is a proof of some claim, then there is a proof in arithmetic of the numerical statement which represents that proof. Once you’ve pulled this trick (it’s the trick of Gödel numbering), you then need to pull another trick (called diagonalisation; it’s close in spirit to Cantor’s diagonal argument) to construct a sentence which, in effect, says

I am not provable in Peano’s Arithmetic

If this sentence is provable in Peano’s arithmetic, then since provability in this arithmetic implies truth, it is not provable in Peano’s arithmetic, and as a result the arithmetic is inconsistent. On the other hand, if it’s not provable in Peano’s arithmetic, then it itself is an example of a sentence which is true but is not provable: and as a result, the arithmetic is incomplete.
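
The encoding trick can be seen in miniature. One standard scheme (a toy version of the idea, not Gödel’s exact one) codes a sequence of symbol numbers as a product of prime powers, which can always be decoded again:

    -- An infinite list of primes, inefficiently.
    primes :: [Integer]
    primes = sieve [2 ..]
      where sieve (p : ns) = p : sieve [ n | n <- ns, n `mod` p /= 0 ]

    -- Code the sequence [a,b,c,...] as 2^a * 3^b * 5^c * ...
    encode :: [Integer] -> Integer
    encode ns = product (zipWith (^) primes ns)

    -- Decode by counting how many times each prime divides the code.
    decode :: Int -> Integer -> [Integer]
    decode k code = [ multiplicity p code | p <- take k primes ]
      where multiplicity p n
              | n `mod` p == 0 = 1 + multiplicity p (n `div` p)
              | otherwise      = 0

    main :: IO ()
    main = do
      let g = encode [3, 1, 4, 1]  -- some sequence of symbol codes
      print g                      -- 2^3 * 3 * 5^4 * 7 = 105000
      print (decode 4 g)           -- [3,1,4,1]: the sequence, recovered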

Now, there’s a lot that can be said about this result. And there’s a lot of garbage that has been said about this result too (so-called “applications” range from theories of the mind, of culture, of value, of art, of this, that and the other). I think it’s stunning enough in its original form. And working out its implications for logical theories, and knowing when (and on what basis) we can claim consistency or completeness, is plenty interesting enough.

More information about Gödel can be found at the St. Andrews’ History of Mathematics Entry on him.

W. V. O. Quine (1908-2000)

[Image of Quine] Quine is not famous for his logic so much as for the mark he made on philosophy in the English-speaking world in the second half of the twentieth century. However, he was a formidable logician, with contributions in set theory (his New Foundations is an alternative view of the set-theoretic universe, radically at odds with the Zermelo-Fraenkel picture inspired by Cantor) and other areas of logic. Still, it was his imprint on philosophy as a whole for which Quine is rightly remembered.

The centre of Quine’s contribution can be seen in his work on meaning, and the holist picture radically at odds with Carnap’s view of how language works. For Carnap, the meaning postulates governing a discourse show us how the language works, and once you have the meaning postulates (the analytic truths) the “world” contributes the rest (the synthetic truths). The analytic truths don’t tell us anything about the world but the synthetic truths do. This is a compelling picture, and it’s one that had basically been the received doctrine in understanding language and concepts, from the British Empiricists, through Kant, and up to the middle of the 20th Century. Of course, the picture has not been a uniform one. Kant argued that some synthetic truths were knowable a priori, and the work in logic in the 19th and early 20th Centuries greatly expanded our understanding of what might count as “analytic”. However, everyone agreed that there was a sensible distinction to be drawn between the analytic and the synthetic.

Quine changed all of that. With novel arguments and thought experiments (in particular, the problem of radical translation: see Word and Object and the famous article “Two Dogmas of Empiricism”) Quine set about dismantling the analytic/synthetic distinction. For Quine, no body of theory can be unambiguously split into analytic and synthetic components, where the synthetic is prone to revision and the analytic immune from revision. The body of theory is to be considered as a whole, and judged in this way. Quine was the first explicit empiricist holist.

To this day, anyone who takes the analytic/synthetic distinction seriously owes us an account of how it is to be understood in the face of Quine’s arguments. For bringing this important issue to light, Quine is to be praised!

More information about Quine can be found at the Quine Home Page.

Gerhard Gentzen (1909-1945)

[Image of Gentzen]Gerhard Gentzen brought the study of logic to a completely new level with his work on the theory of proofs. Logicians, since Aristotle, have been interested not only in what you can prove, but also in proof itself. But somewhat surprisingly, up until Gentzen’s time, no-one had a coherent story to tell about what a proof actually is and what logical properties proofs might have. Gentzen’s work changed this, and he is now seen as the archetypal proof theorist.

Gentzen’s innovation was to make proofs themselves an object of study. He formalised two different kinds of proof, which we now call Natural Deduction and Gentzen Systems in his honour. A Gentzen system is an interesting way of proving things, because unlike other kinds of proof, a Gentzen proof is not a series of statements, each of which follows from earlier statements in the series. (A natural deduction proof is like this, except that statements may be assumed, and assumptions may be discharged at different points in the proof.) A Gentzen proof is very different: at each stage you prove not statements but sequents. A sequent is not an individual statement, but rather a claim to the effect that one statement follows from others. One way of thinking about it (and this was Gentzen’s own way of thinking about it) is that a Gentzen proof is a proof about other proofs. The steps don’t say things like “this follows from that” but “if there’s a proof of B from A then there’s a proof of C from A too”.
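
Here is a tiny example, in the standard textbook notation rather than anything peculiar to Gentzen himself. The sequent Γ ⊢ B says that B follows from the assumptions in Γ; axioms are sequents of the shape A ⊢ A; and a single application of the usual left rule for the conditional yields modus ponens as a sequent:

```latex
% The left rule for the conditional: from a proof of A, and a proof
% using B, build a proof using A -> B. One step gives modus ponens:
\[
  \frac{A \vdash A \qquad B \vdash B}
       {A \to B,\; A \;\vdash\; B}\;(\to\mathrm{L})
\]
% Read as a claim about proofs: given the (trivial) proofs of A from A
% and of B from B, there is a proof of B from A -> B together with A.
```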

Gentzen systems are beautiful because they give you a deeper understanding of what can be done with proofs. One of Gentzen’s own applications was to use a Gentzen system for Peano Arithmetic to show that arithmetic is indeed consistent. Gentzen’s argument is quite clever. He shows that there is no way you could get a proof of an inconsistency from the axioms, because any such proof could be transformed into a smaller proof of an inconsistency from the axioms, and there’s no one-step proof of an inconsistency. (Actually, the proof is a bit trickier than this, as it’s a corollary of Gentzen’s famous Cut Elimination Theorem, but this is the core of how the result is proved, if you think about it for a bit!)

This result might seem to be in conflict with Gödel’s incompleteness result, which has as a corollary that Peano’s arithmetic can’t prove its own consistency. However, there’s no conflict: Gentzen’s techniques, though straightforward, actually presume something more than arithmetic itself. They use a form of mathematical induction stronger than that supplied in ordinary arithmetic. This result has spawned many more like it, and the study of the relative proof-theoretical strength of different theories is now a well-understood discipline in logic.
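
For the record (this detail goes beyond the sketch above, but it is the standard way of stating Gentzen’s result): the stronger induction in question is transfinite induction up to the ordinal ε₀, the first fixed point of ordinal exponentiation:

```latex
\[
  \varepsilon_0 \;=\; \sup\,\{\,\omega,\ \omega^{\omega},\ \omega^{\omega^{\omega}},\ \ldots\,\}
  \qquad \text{the least ordinal } \alpha \text{ with } \omega^{\alpha} = \alpha.
\]
% Gentzen assigned each derivation an ordinal below epsilon_0 and showed
% that his reduction steps strictly decrease it, so the transformation
% of proofs into smaller proofs must terminate.
```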

More information about Gentzen can be found at the St. Andrews’ History of Mathematics Entry on him.

Alan Turing (1912-1954)

[Image of Turing]Alan Turing is famous for the Turing Test (a claim about what it would take for something to count as intelligent: get it to hold up its end of a conversation with you over a network connection), for Turing Machines, and for half of the name of the Church-Turing Thesis. I think the latter two are more interesting from the point of view of logic than the first.

Turing Machines are not really machines at all. A Turing machine is a characterisation of a kind of computing device, and it’s a computing device stripped down to its limit. All it involves is a long tape divided into squares (picture a roll of toilet paper unrolled). The tape is long in the sense that if you ever get to an end, you can go out to the shop and get some more to add on. So, you never run out of tape. Then, you have a marker with which you can write a sign on a square of the tape. (You can overwrite an existing sign with a new one too: the new sign replaces the old.) You have a finite collection of possible signs, including the blank sign for a square which has nothing else written on it. You also have a finite set of states, which you can think of as points on a flowchart. The instructions are the arrows on the flowchart, and they say things like this: if you’re in this state, looking at a square with this symbol on it, then follow this instruction, and go to that state. What’s an instruction? It can be one of three things: write a sign in the square you’re looking at, move left one square and look at the new square, or move right one square and do the same. That’s it. If you’re in a state, looking at a symbol on a square, and you have no instruction to follow, your computation simply stops.

That’s it. That’s all there is to a Turing machine. It’s not exactly the kind of computer you can use for word processing or for playing games. However, the kinds of computations you can do on this kind of machine (especially when you think of representing numbers on the machine using some code) form a really natural class of functions. They’re the recursive functions, independently characterised by Church. And the Church-Turing thesis is the claim that this class of functions is exactly the class of functions which could be computed using any computational device.
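
To make this concrete, here is a minimal simulator in Python. The encoding of programs, the state names, and the unary successor example are all my own inventions for illustration, not anything of Turing’s:

```python
# A machine is a dict mapping (state, symbol) to (action, next_state),
# where an action is ("write", sign), "left", or "right".
# The tape is a dict from positions to signs; missing squares are blank,
# which models the "go and buy more tape" unbounded roll.

BLANK = " "

def run(program, tape_input, state, steps=10_000):
    tape = dict(enumerate(tape_input))
    pos = 0
    for _ in range(steps):          # step bound keeps the toy from looping forever
        key = (state, tape.get(pos, BLANK))
        if key not in program:      # no instruction to follow: halt
            break
        action, state = program[key]
        if action == "left":
            pos -= 1
        elif action == "right":
            pos += 1
        else:                       # ("write", sign)
            tape[pos] = action[1]
    lo, hi = min(tape), max(tape)
    return "".join(tape.get(i, BLANK) for i in range(lo, hi + 1))

# Example: code the number n as n strokes "1" (unary notation).
# This program computes the successor function n |-> n + 1:
# scan right past the strokes, then write one more at the end.
succ = {
    ("scan", "1"):   ("right", "scan"),
    ("scan", BLANK): (("write", "1"), "done"),
}

print(run(succ, "111", "scan"))     # prints "1111": the successor of 3 is 4
```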

More information about Turing can be found at the St Andrews’ History of Mathematics Entry on him.

Arthur Prior (1914-1969)

[Image of Prior]Arthur Prior is the first (and only) antipodean logician on this list. Educated in New Zealand, he had academic positions both in New Zealand (at Otago and Canterbury) and in England (at Manchester, and Oxford). Prior invented the field of tense logic, the logic of “earlier” and “later”, “past” and “future”. He was the arch intensionalist, bringing to the philosophical fore these modalities which are not truth-functional, and not obviously amenable to Tarski’s analysis of truth-in-a-model. Remember that in Tarski’s world, the truth or otherwise of a complex expression depends on the truth or otherwise of its components. The temporal operators don’t work like this at all. “It’s raining” and “it’s Thursday” might both be true, while “yesterday it rained” and “yesterday it was Thursday” differ in truth value. The truth or falsity of a claim governed by a temporal operator is intensional, not extensional: it depends on more than just the truth values of the components.
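
The semantics that later became standard treats the tense operators P (“it was the case that”) and F (“it will be the case that”) by evaluating sentences at times, as in the clauses below. (This model-theoretic way of putting things is a later reconstruction, not necessarily Prior’s own favoured formulation.)

```latex
% Truth at a time t, where < orders the times:
\[
\begin{aligned}
  t \Vdash P\varphi &\iff \varphi \text{ holds at some } t' < t\\
  t \Vdash F\varphi &\iff \varphi \text{ holds at some } t' > t
\end{aligned}
\]
% "Yesterday it rained" is evaluated at times other than today, which is
% why its truth value isn't fixed by today's truth values alone.
```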

There is a tradition in logic of attempting to erase the intensional by making it genuinely extensional (replacing talk of “it rained” by “it rained on day x” and explicitly quantifying over days or times) — Quine’s work is a very good example of this — but Prior resisted this move strongly. For Prior, there was nothing wrong with irreducible intensionality, provided that you could be clear about the logical properties these intensional creatures satisfy. Prior’s work did a great deal to rehabilitate the intensional and to put it on an equal footing with the extensional tradition in logic. This work would explode onto the scene with the developments of Kripke and what is now known as “possible worlds semantics”, but it was Prior’s work which set the scene and showed the enormous range of philosophical applicability of these ideas.

Excellent information about Prior can be found in Jack Copeland’s entry on Prior in the Stanford Encyclopedia of Philosophy.

Helena Rasiowa (1917-1994)

[Image of Rasiowa]At last, a woman! The month is nearly at an end, and at last, I have found a female logician to join our list of stars. Helena Rasiowa was an incredible logician from Poland, who helped place the study of logics on a mathematical footing. Continuing the work of Boole and Lukasiewicz, she showed, in her groundbreaking book An Algebraic Approach to Non-Classical Logics, how contemporary techniques in algebra may be used to study the structures of propositions arising out of different conceptions of logic. The book is rich in ideas, with treatments of Boolean algebras, Heyting algebras (models of intuitionistic logic), the many-valued logics of Lukasiewicz, and others. It is one of the first books to give a unified treatment of logics, rather than a partisan development of this system or that.
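
To give a taste of what an algebraic model of a logic looks like, here is a small sketch in Python of the three-element Heyting algebra, a standard model of intuitionistic logic. The example is mine, not taken from Rasiowa’s book, though structures of just this kind appear there:

```python
# The three-element Heyting algebra on {0, 1/2, 1}: conjunction is min,
# disjunction is max, and implication a -> b is the largest x such that
# min(a, x) <= b (the "relative pseudo-complement").

vals = (0, 0.5, 1)

def conj(a, b): return min(a, b)
def disj(a, b): return max(a, b)
def impl(a, b): return max(x for x in vals if min(a, x) <= b)
def neg(a):     return impl(a, 0)

# In this algebra the law of excluded middle is not valid:
# "p or not-p" can take the middle value rather than the top one.
for p in vals:
    print(p, disj(p, neg(p)))   # at p = 0.5, neg(p) = 0, so p or not-p = 0.5
```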

Rasiowa’s work also encouraged me, as a budding mathematics student, to go into logic. Browsing through its pages in the old University of Queensland Mathematics Library, I saw that the things I was learning in algebra could be used to illuminate my growing interest in logic.

More information about Rasiowa can be found at the St. Andrews’ History of Mathematics Entry on her.

Ruth Barcan Marcus (1921-)

[Image of Marcus]Ruth Barcan Marcus is another intensionalist, like Prior. She was a proponent of modal logic who, in a series of papers beginning in 1946, defended the coherence of quantified modal logics against criticism, and helped ensure that modal logic not only gained a hearing but eventually became part of the lingua franca of many working philosophers. This was an amazing feat, since her opponents included none other than the eminent W. V. O. Quine. Quine was an ardent and trenchant critic of intensional logic, but Ruth Barcan (later Ruth Barcan Marcus) was central in ensuring that philosophical interest in modality did not wither under Quinean criticism.

Ruth Barcan Marcus has lent her name to what is now called the Barcan Formula. She showed that on a plausible conception of how possibility works, the following thesis turns out to be true:

If it is possible that something has property F, then there is something which possibly has property F.
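
In the usual notation of quantified modal logic, this is the formula:

```latex
\[
  \Diamond \exists x\, Fx \;\rightarrow\; \exists x\, \Diamond Fx
\]
% read: if possibly something is F, then something is possibly F.
```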

Now, this might be a problem. After all, if we think it is possible that there be an object with atomic number 150, does it follow that we should think that there is some stuff such that it could be matter with atomic number 150? It’s kind-of odd to think that this must be so, but on a straightforward understanding of how modality and quantification work (one on which every “world” has the same “domain”: more on this later) this is how things turn out.

The debates arising out of the combination of modality and quantification are by no means over. Ruth Barcan Marcus’ sharp analysis has focussed the discussion of these topics for decades.

More information about Marcus can be found at her web page. For a good collection of Prof. Marcus’ work on modal logic, her collection Modalities: Philosophical Essays is the place to start.

Saul Kripke (1940-)

[Image of Kripke]Saul Kripke was a precocious teenager when his work revolutionised intensional logic. For many years people had known that the logic of possibly and necessarily had a coherence all of its own, and many had striven to give it a plausible and satisfying formal semantics. This changed with the work of the teenaged Kripke, published in 1959 (“A Completeness Theorem in Modal Logic”, published in the Journal of Symbolic Logic, pages 1 to 15 of volume 24). In this paper, Kripke exploited the well-worn notion of a possible world, and introduced the notion of “relative possibility” as a relationship between possible worlds. Then we can say that “necessarily A” is true at a world if and only if A is true at all worlds which are possible relative to that world, and “possibly A” is true at a world if and only if A is true at some world which is possible relative to that world.
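
These two clauses are easy to put into code. Here is a small sketch in Python; the model, the world names, and the valuation are made up for illustration:

```python
# A Kripke model: a set of worlds, an accessibility ("relative
# possibility") relation, and a valuation saying which atoms hold where.

worlds = {"w1", "w2", "w3"}
access = {("w1", "w2"), ("w1", "w3"), ("w2", "w2")}
val = {"w1": set(), "w2": {"p"}, "w3": {"p"}}

def R(w):
    """The worlds possible relative to w."""
    return {v for (u, v) in access if u == w}

def necessarily(p, w):
    """'Necessarily p' holds at w iff p holds at every world accessible from w."""
    return all(p in val[v] for v in R(w))

def possibly(p, w):
    """'Possibly p' holds at w iff p holds at some world accessible from w."""
    return any(p in val[v] for v in R(w))

print(necessarily("p", "w1"))  # True: p holds at w2 and w3
print(possibly("p", "w1"))     # True
print(necessarily("p", "w3"))  # True, vacuously: no world is accessible from w3
```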

The notion of possible worlds is not original with Kripke. Neither, in fact, is the use of something like a relative possibility (or “accessibility”) relation. However, putting these things together in a neat bundle, and showing how a number of much-studied modal systems can be understood in this light, was new, and it generated a whole new field of study: possible worlds semantics for modal and other intensional logics.

Kripke went on to work in many different areas of philosophy, including influential work on Wittgenstein, and in other areas of metaphysics and the philosophy of language. His mark on logic, however, was already made in that paper written as a teenager.

More information about Kripke’s work in philosophy can be found at the Oxford Companion to Philosophy on him.

David Lewis (1941-2001)

[Image of Lewis]David Lewis is famous not only for his great impact on analytic philosophy (perhaps felt most keenly in Australia, a country he loved and visited regularly) but also for the reaction his philosophical views provoked. Lewis’ modal realism, the view that other possible worlds are just as real as this world we live in, and real in exactly the same way, famously and regularly provokes an incredulous stare when people hear that he believed it quite literally.

Lewis is a good example of a philosopher whose work draws out the consequences of the work of others. This is not to deny his creativity or originality in any way: it is creative work to spell out the consequences of a view, and no body of work in this area is richer or better developed than Lewis’. David Lewis’ view, I think, can be seen as what you get when you cross Quine’s ontological views (his extensionalism) with the intensionalism you find in Kripke’s and others’ work on modal logic. Lewis acknowledges that modal notions (possibility and necessity and the like) are important, meaningful, and to be found in any theory we take seriously. From Quine, he takes it that the ontological commitment of a theory is to be read off what the theory quantifies over (Quine’s dictum: “to be is to be the value of a variable”). And our way of understanding modality is to quantify over possible worlds, in the semantic tradition of Kripke. Voila: modal realism.

Lewis’ work on this topic, especially in the book On the Plurality of Worlds, is a compendium of what follows from taking this metaphysical picture seriously, and of the kinds of philosophical applications the notion of possible worlds might have.

More information about Lewis can be found at the Oxford Companion to Philosophy.

Jon Barwise (1942-2000)

[Image of Barwise]Jon Barwise was a renaissance logician. He didn’t know everything, but his contributions ranged so widely that he approximated omniscience quite well. His work ranges over infinitary logic (an extension of Frege-style predicate logic to deal with infinitely long sentences and infinitary quantifiers), the model theory of first-order logic (continuing on from Tarski’s work), generalised quantifiers (quantifiers other than “for all” and “for some”), admissible sets and generalised recursion theory (the connections between sets and computation), situation semantics and the philosophy of language (using situations, restricted parts of the world, as bearers of information, rather than entire possible worlds), information theory (an account of how information flows and is transmitted), and the logic of diagrams (examining visual representation and inference as well as linguistic representation).

Barwise’s work, over some 35 years, covered a huge range of disciplines, and it gives you some idea of the breadth of work available in contemporary logic. Barwise’s approach of regularly moving into new fields, to keep fresh and active, is a helpful antidote in the current age of increasing specialisation and narrowing. If work like this is possible at the end of the 20th Century, it will be our job to see what might be done in the 21st.

More information about Barwise can be found at the Barwise Memorial Pages at Indiana.

