What is the smallest positive integer that cannot be defined in less than twenty-five syllables?
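(That question, by the way, is a version of the so-called Berry "paradox," a finite cousin of the Richard construction discussed below. Schematically--a standard modern rendering, nobody's original notation--it runs:)

```latex
% The Berry construction, schematically (a modern rendering, not a quotation).
\begin{align*}
&\text{Let } D = \{\, n \in \mathbb{Z}^{+} : n \text{ can be defined in less than 25 syllables} \,\}.\\
&\text{A finite vocabulary yields only finitely many such phrases, so } D \text{ is finite,}\\
&\text{and } b = \min\,(\mathbb{Z}^{+} \setminus D) \text{ exists. But the opening question defines } b\\
&\text{in 24 syllables (on the usual count), so } b \in D \text{: a contradiction--unless}\\
&\text{``definable'' here is simply meaningless.}
\end{align*}
```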
Before you say anything more about Gödel or paradox, you and site readers really must read A. Garciadiego's classic 1992 account, BERTRAND RUSSELL AND THE ORIGINS OF THE SET-THEORETIC 'PARADOXES.' Read the footnotes, too, and ignore the innumerable typos (who proofread that damned thing?).
Little, if anything, remains of twentieth-century mathematics. The gist of the book is that the "paradoxes" which led to Gödel's argument (and to the arguments of the Intuitionists, the Logicists, and the Formalists, as well as their successors) are not paradoxes at all--they are meaningless formulations. This undermines most, if not all, of twentieth-century mathematics, and it destroys Gödel's argument.
Central to Garciadiego's project is Richard's "paradox." He cites Richard's own formulation of this "contradiction" (Richard's term) in a letter to Poincaré. He also cites Richard reducing the argument to meaninglessness. Other so-called "paradoxes" (for example, the Burali-Forti "paradox") are simply inventions of Russell's; Russell's own "paradox" is merely an infirm version of Richard's argument.
What does this have to do with Gödel? It's simple. For Gödel, Richard's "paradox" means that truth in number theory cannot be defined in number theory. On this basis, he distinguishes truth from provability. He combines this reading of Richard's "paradox" with the observation that provability in number theory can be defined in number theory, and he arrives at the conclusion that if all the provable formulae are true, there must be some true but unprovable formulae. However, since Richard's "paradox" is without meaning, there is no basis in Gödel's argument for distinguishing truth from provability. You cannot find one, and although it is fun to think that there is a compelling math theorem out there, you're simply going to have to grow up and get over it: Gödel's theorem is over.
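Schematically, the reasoning just attributed to Gödel can be laid out in a few lines (a standard modern rendering, offered for orientation, not a quotation from Gödel):

```latex
% The truth-versus-provability reasoning, schematically.
\begin{align*}
&\text{(1) } \mathrm{Prov} = \{\text{provable formulae}\} \text{ is definable in number theory.}\\
&\text{(2) } \mathrm{True} = \{\text{true formulae}\} \text{ is not definable there}\\
&\qquad\text{(this is where Richard's ``paradox'' enters).}\\
&\text{(3) Hence } \mathrm{Prov} \neq \mathrm{True}.\\
&\text{(4) If } \mathrm{Prov} \subseteq \mathrm{True} \text{, then by (3) } \mathrm{True} \setminus \mathrm{Prov} \neq \varnothing :\\
&\qquad\text{some formula is true but unprovable.}
\end{align*}
```

On the reading argued here, step (2) is the one with no content, and the conclusion in step (4) collapses with it.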
It turns out that there is no logical content in the idea that if all the provable formulae are true, there must be some true but unprovable formulae. The problem is that Gödel was a sloppy investigator and did not apply himself sufficiently to the details of the development of set theory. For example, Richard's letter to Poincaré was published, but Gödel never bothered to read it. You can also read, in Feferman's introduction to Gödel's collected papers, some tart comments on Gödel's inability to express his ideas about relativity, or even about his own theorem(s).
Garciadiego's book has implications for all of twentieth-century mathematics. For example, Brouwer based the idea of an infinite ordinal number on the claim that Cantor had proved the well-ordering of the ordinal numbers. But Cantor never proved this; he never even said he had done so, and he never used the term "infinite ordinal number." It seems increasingly likely that all of twentieth-century mathematics is directed toward avoiding "paradoxes" which are not paradoxes. This ridiculous intuitionist thinking infected every twentieth-century discipline, from the economics of Sraffa to the biology of Kimura to the relativity of Einstein.
Garciadiego asks us to start over, to understand how these misconceptions affected later mathematics. I will give but one example: the work of Turing. Like Gödel, he never examines the "paradoxes" in order to determine whether they are simply meaningless formulations. Thus, in an attempt to "prove that there is no general method for determining about a formula whether it is an ordinal formula, we use an argument akin to that leading to the Burali-Forti paradox, but the emphasis and the conclusion are different." As Garciadiego reveals, there is no Burali-Forti paradox. In the context of an attempt to prove the Trichotomy Law, Burali-Forti tried "to prove by reductio ad absurdum that the hypothesis [involved in his own argument] was false and this method required supposing the hypothesis true and arriving at a contradiction. The employment of the hypothesis, as an initial premise, generated the inconsistency. But once the hypothesis is seen to imply a contradiction it is thereby proved to be false."
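For orientation, here is that argument in its usual modern form (a von Neumann-style textbook rendering, not Burali-Forti's own 1897 notation):

```latex
% Burali-Forti as a reductio, in the modern von Neumann rendering.
\begin{align*}
&\text{Hypothesis: there is a set } \Omega \text{ containing every ordinal.}\\
&\text{Then } \Omega \text{ is transitive and well-ordered by } \in \text{, so } \Omega \text{ is itself an ordinal,}\\
&\text{whence } \Omega \in \Omega \text{--contradicting the irreflexivity of } \in \text{ on ordinals.}\\
&\text{Conclusion: the hypothesis is false. A reductio, not a standing paradox.}
\end{align*}
```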
Turing purported to distinguish completeness from decidability, not realizing that the absence of a genuine contradiction made the distinction insupportable. Turing claimed justification for his definition of a computable number in a "direct appeal to intuition." This is not a cavalier reference to intuitionism; in fact, it provides the basis for Turing's use of binary numbers. This base-2 system is a metaphor which traces itself back, through Turing's own bifurcation of the mathematical process, to Brouwer's bifurcation of the operation of the human mind ("the connected and the separate, the continuous and the discrete")--all in an attempt to "avoid" the "paradoxes." Brouwer's complaint is that the "paradoxes" deprive us of distinctions. Turing's entire apparatus of calculability is designed to "restore" distinctions. The binary number, and Turing's later restoration of a modified form of completeness in the form of decidability, are assertions by way of distinction.
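To see what "finite means" amounts to before criticizing it, here is a minimal sketch--in modern code rather than Turing's machine tables, with a function name and an example (1/3) of my own choosing--of a computable number in Turing's sense: a finite rule generating an unending binary expansion.

```python
# A minimal sketch (my illustration, not Turing's own construction):
# a computable number is one whose binary digits are produced by a
# finite set of rules. Two states and one fixed rule suffice to print
# 1/3 in binary (0.010101...), one digit per step, without end.

def binary_digits_of_one_third():
    """Yield the binary digits of 1/3 = 0.010101... by finite means:
    the rule table is finite even though the output never terminates."""
    state = 0
    while True:
        yield state          # state 0 emits digit 0, state 1 emits digit 1
        state = 1 - state    # alternate between the two states

digits = binary_digits_of_one_third()
first_ten = [next(digits) for _ in range(10)]
print("0." + "".join(map(str, first_ten)))   # prints 0.0101010101
```

The objection developed next is aimed precisely at the phrase "finite set of rules."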
However, there is no problem against which to assert them. Operating on these numbers with "finite means" (Turing's definition of a computable number) merely takes us back to Richard's response to his own contradiction, and no recourse to intuition can rescue us from the consequences of that response: the computable number only has meaning if the "finite means" are defined in their totality, and that can only be done with infinitely many words.
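Richard's construction itself can be sketched in a few lines, granting for the moment the assumption he ultimately rejected: that "definable in finitely many words" yields a usable enumeration. (The stand-in list and helper names below are hypothetical, for illustration only.)

```python
# Schematic sketch of Richard's diagonal construction. We pretend that
# e_0, e_1, e_2, ... enumerates every real in (0, 1) definable by a
# finite phrase; a toy list of stand-ins plays that role here.

definable_reals = [0.123456, 0.5, 0.333333]   # hypothetical stand-ins

def nth_digit(x: float, n: int) -> int:
    """Return the n-th decimal digit of x after the point (0-indexed)."""
    return int(x * 10 ** (n + 1)) % 10

def diagonal(enumeration) -> str:
    """Build a decimal differing from the n-th number at its n-th digit.
    The result is 'defined in finitely many words' (this very code), yet
    it can appear nowhere in the enumeration: Richard's contradiction."""
    digits = []
    for n, x in enumerate(enumeration):
        d = nth_digit(x, n)
        digits.append(str(d + 1 if d < 8 else 1))   # choose a different digit
    return "0." + "".join(digits)

print(diagonal(definable_reals))   # 0.214 -- differs from every entry above
```

Richard's own response, as Garciadiego stresses, was that the enumeration assumed in the first line is never actually given, so the "contradiction" is a meaningless formulation.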
And face it: to the extent you ever gave a thought to the Burali-Forti "paradox," you assumed it had some logical content. You were wrong: it has none. You are going to have to accept that you carry a lot of bad tradition in your heads; get it out of there just as soon as you can. It turns out that Richard's discussion of his own "contradiction" serves as a useful template for evaluating, and then discarding, putative "paradoxes." There is much work to be done in that field.

But as for twentieth-century mathematics: to the extent it is based on already-discredited "paradoxes," it loses any logical content. This, unfortunately, is certainly true of Gödel's argument. It is even more glaringly true of now-trivial figures such as Carnap and Tarski (Quine and Feferman, too). After Garciadiego, these names go from the headlines to the footnotes. In general, Garciadiego's book is an indictment of twentieth-century math academics. It takes only careful reading to realize that these ideas are bogus. Where were our professors? Where are they now?
And Einstein? He, too, suffered from the Achilles' heel of Sraffa and Kimura: mathematics. He went looking for a mathematics in which to express his ideas, and he found one in the intuitionist notion that formulations do not express reality because they are prone to paradox, and that mathematics is really a natural extension of perception. This is the "natural" mathematics Einstein adopted. How does it undermine his theory? Take a look. Here is a passage from Einstein's book Relativity:
"Up to now our considerations have been referred to a particular body of reference, which we have styled a 'railway embankment.' We suppose a very long train travelling along the rails with the constant velocity v and in the direction indicated....People travelling in this train will with advantage use the train as a rigid reference-body (co-ordinate system); they regard all events in reference to the train. Then every event which takes place along the line also takes place at a particular point of the train. Also the definition of simultaneity can be given relative to the train in exactly the same way as with respect to the embankment. As a natural consequence, however, the following question arises: Are two events (e.g. the two strokes of lightning A and B) which are simultaneous with reference to the railway embankment also simultaneous relatively to the train? We shall show directly that the answer must be in the negative. When we say that the lightning strokes A and B are simultaneous with respect to the embankment, we mean: the rays of light emitted at the places A and B, where the lightning occurs, meet each other at the mid-point M of the length A -> B of the embankment. But the events A and B also correspond to positions A and B on the train. Let M' be the mid-point of the distance A -> B on the travelling train. Just when the flashes (as judged from the embankment) of lightning occur, this point M' naturally coincides with the point M, but it moves towards the right in the diagram with the velocity v of the train."
NOW, DON'T LOOK AT THE PASSAGE BELOW UNTIL YOU HAVE READ EINSTEIN'S WORDS AND TRIED TO FIND THE LOGICAL PROBLEM WITH THEM. THEN READ THIS:
This translation is accurate (the French and Italian are not). Einstein really does say "fällt zwar...zusammen." That is, he says that one point "naturally" coincides with another. The "naturally" reveals the intuitionist expression of the concept, for it reflects the belief that the formulations of geometry do not express facts.
Obviously, the logical problem is that, regardless of what Einstein may "feel" about mathematical expressions, nowhere in Einstein's writings--either in the 1905 papers or after--is any meaning assigned to 'naturally.' The failure to do so destroys the idea, and it is easy to see why. If we retain the concept without a meaning, there is no logical basis on which to proceed beyond it. If we eliminate it, we wind up with a contradiction: the two assumed coordinate systems collapse into one. What is more, when we place this train experiment next to the various other thought experiments, we see that they are simply translations of the same problem into other terms, just as the false 'paradoxes' turn out to be subject to the same problem Richard indicated (reference to an infinite domain, which destroys the meaning). In special relativity, natural coincidence can only be defined by infinitely many words. So the distinction collapses.
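For reference, here is the standard textbook bookkeeping behind the train example--the formulas at which the objection above is aimed, not an endorsement of them:

```latex
% Standard textbook account of the train example (reference only).
% Lorentz transformation from embankment coordinates (x, t) to train
% coordinates (x', t'):
\[
  t' = \gamma \left( t - \frac{v x}{c^{2}} \right),
  \qquad
  \gamma = \frac{1}{\sqrt{1 - v^{2}/c^{2}}}.
\]
% Lightning strikes simultaneous on the embankment (t_A = t_B) but at
% distinct places (x_A \neq x_B) receive different train times:
\[
  t'_{B} - t'_{A} = -\,\gamma \, \frac{v\,(x_{B} - x_{A})}{c^{2}} \neq 0.
\]
```

Note that this computation presupposes exactly the step under attack: that M' "naturally" coincides with M at the instant of the strikes.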
So please get Garciadiego's book, read it, and begin your "revaluation of all values," as our dear syphilitic put it.
Cheers,
John Ryskamp