## Tuesday, February 24, 2009

### Counting words in LaTeX

If you've ever wondered how many words a paper edited in LaTeX really has, here is a nice script that seems to get it right.
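
Just to give the flavour of how such tools usually work (a minimal sketch of my own, not the linked script): strip comments, math and command names first, then count what remains.

```python
import re
import sys

def count_words(tex_source: str) -> int:
    """Very rough LaTeX word count: strip comments, math and commands, then count."""
    text = re.sub(r"(?<!\\)%.*", " ", tex_source)   # drop % comments
    text = re.sub(r"\$[^$]*\$", " ", text)          # drop inline math
    text = re.sub(r"\\[a-zA-Z@]+\*?", " ", text)    # drop command names (keep argument text)
    text = re.sub(r"[{}\[\]]", " ", text)           # drop leftover braces and brackets
    return len(text.split())

if __name__ == "__main__":
    with open(sys.argv[1], encoding="utf-8") as f:
        print(count_words(f.read()))
```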

Labels:
bits and bobs,
LaTeX

## Sunday, February 22, 2009

### PhDs in Logic – news from the trenches

The Ghent workshop in logic is over. Since my life has been quite hectic (I was moving into a new apartment), I couldn’t make it to all the talks. And since (around) half of the people at the workshop were pure mathematicians, some of the talks that I went to were either above my head or beyond what I’m interested in. Here are some comments on those talks that I went to and that I feel I can say something about.

The workshop kicked off with a tutorial on inaccessible cardinals by Benedikt Löwe (Amsterdam). It was quite entertaining and really informative. The main gist was this. Take the hierarchy of ordinals. Play around with it to get a hierarchy of cardinals. It turns out that all successor cardinals are regular, and that there are also non-regular limit cardinals. Now, ZFC cannot prove the existence of regular limit cardinals (also called inaccessible cardinals). Interestingly, however, we can relate the existence of something as weird and prima facie useless as inaccessible cardinals to certain problems pertaining to the real number line. So a conditional can be proven whose antecedent states the existence of inaccessible cardinals and whose consequent says something about Lebesgue measurability. This is, in a sense, quite exciting, because it shows that certain abstract questions that few people seem to care about have implications for stuff that every real number theorist should be interested in.
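
If I had to guess, the conditional being alluded to is (one half of) the Solovay and Shelah connection between inaccessibles and Lebesgue measurability; this is my gloss, not something stated at the talk:

```latex
% Roughly (Solovay gives one direction, Shelah the other):
\mathrm{Con}\bigl(\mathsf{ZFC} + \text{``there is an inaccessible cardinal''}\bigr)
\;\Longleftrightarrow\;
\mathrm{Con}\bigl(\mathsf{ZF} + \mathsf{DC} + \text{``every set of reals is Lebesgue measurable''}\bigr)
```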

Reinhard Muskens (Tilburg), on the other hand, discussed type theory and introduced intensional models for it. They are meant to serve as a formal model of intensional contexts (so that, e.g., you can play around and introduce an explicit knowledge operator and yet avoid logical omniscience). The axiom of extensionality fails, but you still get classical reasoning as far as the usual operators go. Muskens also discussed a way one can construct possible worlds out of these intensional models (they’d be properties of propositions).

One thing we slightly disagreed about was the behavior of proper names. Muskens says (in his abstract):

The semantics to a large degree implements Frege’s idea that all signs that have a referent, also come with a sense.

Now, if I understood him correctly, Muskens would like to treat proper names as having senses: those senses would be just Quine-style predicatized proper names. So, for example, the sense of `Aristotle’ would be `the unique object that Aristotelizes’ or something like that. This, to my mind, has certain drawbacks. Most importantly, this doesn’t seem to be a fulfillment of the descriptivists’ program: for on the descriptivist view, senses of names are purely qualitative, whereas Quineazing names this way doesn’t give you a qualitative specification of the reference of a name. But, I guess, the discussion should go deeper: why would we prefer purely qualitative descriptions? Why does the procedure seem kinda fishy, especially in the light of Kripke-style modal arguments against descriptivism? Anyway…

Marek Czarnecki (Warsaw) gave a talk on Semantics coded by coprimality in finite models. Marek has recently won one of the Polish Society for Logic and Philosophy of Science awards in the competition for best MSc theses in logic, and this talk was about his results.

When you look at arithmetic, it turns out that you can model it not only in infinite domains but also in finite but potentially infinite models (roughly, instead of the standard model, you take a set of its finite initial segments, making sure that all needed segments are in the domain). Quite surprisingly, even though this doesn’t seem to make much of a difference, in finite models the theories of multiplication and of divisibility become undecidable, even though they are decidable in the standard model. The weakest known arithmetic for which undecidability in finite models has been proven is the arithmetic of coprimality. This is where Marek’s work kicks in. He shows that even though the finite arithmetic of coprimality is quite poor, you can still code semantics in it, and obtain the undefinability of truth theorem for that system.
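
To get a rough feel for the finite-model setup (a toy illustration of my own, not Marek’s construction): instead of asking whether a sentence is true in the standard model, you ask whether it is true in all sufficiently large initial segments {0, …, n}, with coprimality as the only primitive.

```python
from math import gcd

def coprime(a, b):
    # Coprimality: the only arithmetical primitive of the language in question.
    return gcd(a, b) == 1

def holds_in_segment(n):
    # Evaluate a sample sentence in the initial segment {0, ..., n}:
    # "every x > 1 has some y > 1 in the segment that is coprime to it".
    # (A toy sentence, chosen only to illustrate evaluation in finite segments.)
    domain = range(n + 1)
    return all(
        any(y > 1 and coprime(x, y) for y in domain)
        for x in domain if x > 1
    )

# "True in all sufficiently large finite models": the sentence fails in the
# smallest segments but holds from some point on.
for n in range(2, 10):
    print(n, holds_in_segment(n))
```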

Jean Paul van Bendegem delivered a tutorial on the many uses of paraconsistent logics. Basically, this was a very neat and quite accessible introduction to adaptive logics and their applications. It was pretty informal and fairly easy to follow (although this might be an overstatement: I’ve seen the systems before).

Karl-Georg Niebergall (Berlin) gave an entertaining talk about Gödel’s incompleteness theorems. Given the talk title, I was afraid this might head into the murky waters of philosophical interpretations of the theorems. I’m glad I was wrong. Karl-Georg actually discussed various versions of proofs of the first theorem and compared the types of soundness(/adequacy) needed for each proof. Given that, he briefly explained how we get the second theorem. This brought him to an explanation of the role of Rosser’s provability predicate (if you use Rosser’s notion, the soundness requirement is weaker). Now, the second theorem says that the consistency of arithmetic is not provable within it. What’s less known is that even though this indeed is the case if you use the classical provability predicate, if you use Rosser’s predicate things get funky, because you can prove within arithmetic that it is consistent in this sense. This brings us to a dilemma: either the modified formula expresses consistency and consistency is provable after all (in which case the original formula doesn’t express consistency), or the modified, provable formula fails to express consistency. The answer hinges on how you understand “expressing”. Karl-Georg discussed quite a few prima facie plausible definitions of what it is for a formula to express consistency, and showed why none of these definitions can be accepted.
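
For reference, and purely as my addition rather than something from the talk, the Rosser-style provability predicate is standardly defined along these lines, where Proof(y, x) is the usual arithmetized proof relation and neg(x) is the code of the negation of x:

```latex
% Rosser provability: x has a proof y such that no proof of the negation of x
% occurs at or below y. Using this predicate, the corresponding consistency
% statement becomes provable in arithmetic, which is the phenomenon discussed above.
\mathrm{Prov}_{R}(x) \;:=\; \exists y\,\bigl(\mathrm{Proof}(y, x) \wedge \forall z \le y\; \neg\mathrm{Proof}(z, \mathrm{neg}(x))\bigr)
```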

Martin Mose Bentzen (Amsterdam) talked about game-theoretic formalizations of situations in which the participants would be better off if they trusted each other. Martin introduced neat formal tools to represent some of the examples he had discussed intuitively, and discussed how those situations are to be assessed (interestingly, it doesn’t seem to be the case that we would always be better off if we trusted each other!).

Christian Straßer developed an adaptive logic for deontic dilemmas allowing for factual detachment. First, he explained why dyadic deontic operators are usually preferred to monadic ones (with the monadic approach it is hard to avoid unrestricted applications of Strengthening the Antecedent, and it is also hard to model the Chisholm paradox or the Gentle Murderer paradox). Second, he discussed one of the key difficulties for the dyadic approach. Roughly, A and the commitment under A to do B should imply the actual obligation to do B. But in standard dyadic deontic logics this fails (or: its acceptance either brings Strengthening the Antecedent back, or collides with certain intuitive natural-language examples). To deal with these issues, Christian “adaptivized” Lou Goble’s CDPM logics, so that the resulting (non-monotonic) theory handles restricted uses of factual detachment and Strengthening the Antecedent without allowing the problematic applications.
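
In symbols, the factual detachment principle at issue is roughly the following (my gloss of the paragraph above, reading O(B/A) as “B is obligatory given A” and O(B) as the actual, monadic obligation):

```latex
% Factual detachment, roughly: a fact A plus the conditional obligation O(B/A)
% yields the actual obligation O(B).
A,\; O(B/A) \;\vdash\; O(B)
```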

Labels:
academia,
adaptive logic,
bits and bobs,
conferences,
logic,
paraconsistency

## Tuesday, February 17, 2009

### A reference for Lindenbaum Theorem

If you've read FOM recently, you probably already know this; if you don't, here is a fun fact about the Lindenbaum Theorem.

Arnold Neumaier asked why exactly Lindenbaum is credited with the Lindenbaum Theorem. The answer (given by Panu Raatikainen) is that the theorem was never published by Lindenbaum himself, but it is credited to him by Tarski in On fundamental concepts of metamathematics (1930), theorem 12 and footnote, p. 34 of the English translation in Logic, Semantics, Metamathematics.
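
For the record (my paraphrase, not something from the FOM thread), the theorem in question is the familiar Lindenbaum lemma:

```latex
% Lindenbaum's lemma, as usually stated nowadays:
% every consistent set of sentences has a maximal consistent extension.
\text{If } \Gamma \text{ is a consistent set of sentences, then there is a maximal consistent } \Gamma^{+} \text{ with } \Gamma \subseteq \Gamma^{+}.
```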

Since I did some Polish logic and still was lame enough not to know this, I'm grateful to Panu for the reference.

Labels:
bits and bobs,
Lindenbaum,
Lvov-Warsaw,
metalogic,
Tarski

## Saturday, February 7, 2009

### What Hempel didn't say about Ayer and Church

I’ve started prepping the Philosophy of Language class for the next academic year (yeah, I don’t have to teach till October!). As for the text, The Philosophy of Language (third edition), edited by Martinich, is the classic.

Right now, I’m looking at Hempel’s Empiricist Criteria of Cognitive Significance: Problems and Changes. At some point, he mentions Ayer’s revised verifiability requirement and Church’s argument against it, without explaining either of them. Below, a minor addendum which covers these things.

On p. 29 (Martinich pagination) Hempel discusses Ayer’s formulation of the testability criterion. The first formulation from Language, Truth and Logic is pretty much this:

A sentence S has empirical import if from S in conjunction with suitable subsidiary hypotheses it is possible to derive observation sentences which are not derivable from the subsidiary hypotheses alone.

Hempel then explains why this criterion is too wide: it allows empirical import to any sentence whatsoever. Take S to be any sentence. Let O be an observation statement that doesn’t follow from S. Then clearly, O is derivable from S together with the subsidiary hypothesis “If S, then O”; and, for a suitable choice of O, O is not derivable from that subsidiary hypothesis alone, so S satisfies the criterion.
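
Just to double-check the two entailment facts this relies on, here is a brute-force truth-table check (my own illustration; S and O are treated as independent propositional atoms):

```python
from itertools import product

def entails(premises, conclusion, atoms=("S", "O")):
    """Check propositional entailment by brute force over all valuations."""
    for values in product([True, False], repeat=len(atoms)):
        v = dict(zip(atoms, values))
        if all(p(v) for p in premises) and not conclusion(v):
            return False
    return True

S = lambda v: v["S"]
O = lambda v: v["O"]
S_implies_O = lambda v: (not v["S"]) or v["O"]

print(entails([S, S_implies_O], O))  # True:  S plus "If S, then O" entails O
print(entails([S_implies_O], O))     # False: "If S, then O" alone does not entail O
print(entails([S], O))               # False: O does not follow from S alone
```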

This is all well known. What’s slightly less known is the formulation that Ayer gave in the appendix to the second edition of Language, Truth and Logic. Here’s how Hempel phrases it:

In effect, the modification restricts the subsidiary hypothesis mentioned in the previous version to sentences which either are analytic or can independently be shown to be testable in the sense of the modified criterion.

This restriction is expressed in recursive form and involves no vicious circle.

and refers the reader to the second edition of LTL. Next, he elaborates on one difficulty with this new version. He also says:

Another difficulty has been pointed out by Church, who has shown that if there are any three observation sentences none of which alone entails any of the others, then it follows for any sentence S whatsoever that either it or its denial has empirical import according to Ayer’s revised criterion.

Hempel then moves on to other issues, without explaining why Ayer’s revised version involved no vicious circle and how Church argued for his claim.

Now, if you’ve ever been curious as to how exactly Ayer formulated the criterion and why it doesn’t involve a vicious circle, and what exactly Church’s argument was, here are the answers.

Ayer’s formulation:

I propose to say that a statement is directly verifiable if it is either itself an observation-statement, or is such that in conjunction with one or more observation-statements it entails at least one observation-statement which is not deducible from these other premises alone; and I propose to say that a statement is indirectly verifiable if it satisfies the following conditions: first, that in conjunction with certain other premises it entails one or more directly verifiable statements which are not deducible from these other premises alone; and secondly, that these other premises do not include any statement that is not either analytic, or directly verifiable, or capable of being independently established as indirectly verifiable. And I can now reformulate the principle of verification as requiring of a literally meaningful statement, which is not analytic, that it should be either directly or indirectly verifiable, in the foregoing sense (p. 181 of the penguin books edition of LTL, 1971).

And here is (a reconstruction of) Church’s argument.

Step 1. Suppose O1, O2, O3 are three observation statements that are logically independent, i.e. none of them alone entails any of the others.

Step 2. ((~O1)^O2) v (O3^~S) is directly verifiable, for together with the observation statement O1 it entails O3, which (by independence) is not derivable from O1 alone. Indeed, O1 entails ~((~O1)^O2), and this, by disjunctive syllogism applied to the original sentence, gives us O3^~S. Conjunction elimination results in O3.

Step 3. S together with ((~O1)^O2) v (O3^~S) entails O2. This follows by a very similar argument: S forces us to reject the right disjunct, leaving us with a conjunction whose second conjunct is O2.

Step 4. So, if ((~O1)^O2) v (O3^~S) doesn’t entail O2 on its own, S is indirectly verifiable (its auxiliary premise being directly verifiable, by Step 2).

Step 5. If, on the other hand, ((~O1)^O2) v (O3^~S) does entail O2 on its own, then each of its disjuncts entails O2. So O3^~S entails O2; that is, ~S together with the observation statement O3 entails O2, which (by independence) is not derivable from O3 alone. In that case, ~S comes out directly verifiable.

Either way, then, either S or its negation has empirical import by Ayer’s revised criterion, which is exactly Church’s point.
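
And a brute-force check of the propositional entailments used in Steps 2 through 4 (again my own illustration, treating S, O1, O2, O3 as independent atoms):

```python
from itertools import product

ATOMS = ("S", "O1", "O2", "O3")

def entails(premises, conclusion):
    """Brute-force propositional entailment over all valuations of ATOMS."""
    for values in product([True, False], repeat=len(ATOMS)):
        v = dict(zip(ATOMS, values))
        if all(p(v) for p in premises) and not conclusion(v):
            return False
    return True

# The auxiliary sentence from Church's argument: ((~O1) ^ O2) v (O3 ^ ~S).
aux = lambda v: ((not v["O1"]) and v["O2"]) or (v["O3"] and not v["S"])
S  = lambda v: v["S"]
O1 = lambda v: v["O1"]
O2 = lambda v: v["O2"]
O3 = lambda v: v["O3"]

print(entails([aux, O1], O3))  # Step 2: True  (aux together with O1 entails O3)
print(entails([O1], O3))       # Step 2: False (O3 is not derivable from O1 alone)
print(entails([aux, S], O2))   # Step 3: True  (aux together with S entails O2)
print(entails([aux], O2))      # Step 4: False (aux alone does not entail O2, so S is verifiable)
```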

Labels:
Ayer,
Church,
Hempel,
logic,
meaning,
philosophy,
verifiability
