From: Nam Nguyen on 16 Jan 2010 17:16

Nam Nguyen wrote:
> Marshall wrote:
>> On Jan 15, 11:43 pm, Archimedes Plutonium
>> <plutonium.archime...(a)gmail.com> wrote:
>>> I want to prove that the only way to well-define or
>>> precisely define Finite is to pick a large number and
>>> say that is the end of Finite.
>>
>> Anyone can define anything to be anything. The idea that
>> there is only one right definition of something is a failure
>> to understand what definitions are.
>
> Totally agreed with you on this. (Not that AP's "precise" definition
> of "Finite" would make a lot of mathematical sense anyway).
>
> So are you with me that the currently widely accepted definition of
> the "natural numbers" is *not* the only right definition?
>
> For instance, the following 2 definitions would be equally the right
> ones (as well as the current one):
>
> Let F be the formula "There are infinite counter examples of GC"
>
> Def 1: The natural numbers = the current definition + that F is true.
> Def 2: The natural numbers = the current definition + that F is false.
>
> Right?

It can be said that the key shortcoming of Godel's work is that it doesn't
recognize the truth of the meta statement:

(I) For any concept as strong as that of the natural numbers, there are
    concepts such as F = "There are infinite [number of] counter examples
    of GC" that would be independent from [the original concept].

It's truly an irony that this statement rhymes with (i.e. sounds like) a
statement in his Incompleteness work, in which he tried to point out the
weakness of relying on one "giant" formal system for proving _all useful_
mathematical/arithmetical properties and relations.

In other words, in pointing out the Incompleteness of mathematical
provability of any one "giant" formal system, Godel ignored the
Incompleteness of knowledge of any "giant" definition of "The Natural
Numbers".
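For definiteness, F can be written down in L(PA) in a standard way. The
following is just one such rendering (my own transcription, in LaTeX
notation, assuming the usual L(PA)-definable abbreviations Even(x) and
Prime(x)); it says that beyond every y there is a further counterexample
to GC:

  \mathrm{GC}:\ \forall x\,[\,(\mathrm{Even}(x) \land SS0 < x) \to
      \exists p\,\exists q\,(\mathrm{Prime}(p) \land \mathrm{Prime}(q) \land x = p + q)\,]

  F:\ \forall y\,\exists x\,[\,y < x \land \mathrm{Even}(x) \land SS0 < x \land
      \lnot\exists p\,\exists q\,(\mathrm{Prime}(p) \land \mathrm{Prime}(q) \land x = p + q)\,]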
From: Nam Nguyen on 16 Jan 2010 19:11

Jesse F. Hughes wrote:
> Nam Nguyen <namducnguyen(a)shaw.ca> writes:
>
>> Of course by "There are infinite counter examples of GC" I meant the set
>> of such counter example would be infinite, not each number is an infinite
>> number. (A natural number is actually neither finite nor infinite!)
>
> Ah. Perhaps you should write, "There are *infinitely many*
> counterexamples of GC," to clear up the confusion.
>

Agree.
From: Nam Nguyen on 23 Jan 2010 14:56

Nam Nguyen wrote:
> Nam Nguyen wrote:
>> Marshall wrote:
>>> On Jan 15, 11:43 pm, Archimedes Plutonium
>>> <plutonium.archime...(a)gmail.com> wrote:
>>>> I want to prove that the only way to well-define or
>>>> precisely define Finite is to pick a large number and
>>>> say that is the end of Finite.
>>>
>>> Anyone can define anything to be anything. The idea that
>>> there is only one right definition of something is a failure
>>> to understand what definitions are.
>>
>> Totally agreed with you on this. (Not that AP's "precise" definition
>> of "Finite" would make a lot of mathematical sense anyway).
>>
>> So are you with me that the currently widely accepted definition of
>> the "natural numbers" is *not* the only right definition?
>>
>> For instance, the following 2 definitions would be equally the right
>> ones (as well as the current one):
>>
>> Let F be the formula "There are infinite counter examples of GC"
>>
>> Def 1: The natural numbers = the current definition + that F is true.
>> Def 2: The natural numbers = the current definition + that F is false.
>>
>> Right?
>
> It can be said that the key shortcoming of Godel's work is that it doesn't
> recognize the truth of the meta statement:
>
> (I) For any concept as strong as that of the natural numbers, there are
>     concepts such as F = "There are infinite [number of] counter examples
>     of GC" that would be independent from [the original concept].
>
> It's truly an irony that this statement rhymes with (i.e. sounds like) a
> statement in his Incompleteness work, in which he tried to point out the
> weakness of relying on one "giant" formal system for proving _all useful_
> mathematical/arithmetical properties and relations.
>
> In other words, in pointing out the Incompleteness of mathematical
> provability of any one "giant" formal system, Godel ignored the
> Incompleteness of knowledge of any "giant" definition of "The Natural
> Numbers".

For lack of a better name, let's call such an F a G2 (Godel-Goldbach)
sentence, reflecting a concept in L(PA). Similarly, (I) would be called
here G2IT (Godel-Goldbach Incompleteness Theorem [of Knowledge]).

Certainly G2IT wouldn't rhyme well with GIT unless, for *any* current
concept A of arithmetic (or of "the natural numbers"), we can demonstrate
the existence of an independent concept, say F = G2(A), such that
A + G2(A) and A + ~G2(A) are both equally legitimate extended concepts of
the natural numbers, either of which one could choose as a new "the
standard model" of, say, Q (i.e. a new "the natural numbers"). Furthermore,
for any new arithmetic A' = A + G2(A), or A' = A + ~G2(A), there would
similarly have to be a new G2(A').

In subsequent posts we'll demonstrate that such a G2(A) would exist for
any concept A of "the natural numbers". But here, as an introduction to
G2IT, let's briefly touch on what we'd mean by a concept (e.g. G2)
_independent_ of an underlying arithmetic A.

***

Since Godel, we know that a formula F being undecidable in T (i.e. F and
~F both being unprovable, hence independent of T) means that,
provability-wise, we can't know the fact of the matter, if F is genuinely
so. The matter can be resolved only through (subjective) interpretation,
i.e. through models, where we'd find two models of T disagreeing on the
truth of F. In other words, a model is a kind of "the-buck-stops-here"
point in mathematical truth-knowledge, beyond which certain truths are
impossible to know.
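As a toy illustration of that last point (my own stock example, nothing to
do with GC): let T be the first-order theory of groups and let F be
\forall x\,\forall y\,(x \cdot y = y \cdot x). Any abelian group is a
model of T in which F is true, while the symmetric group S_3 is a model of
T in which F is false; so T proves neither F nor ~F, and the truth of F
gets settled only relative to a particular model.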
So then, how could we _reliably_ demonstrate a concept (e.g. G2)
independent from a foundational arithmetic A which is purported to be
"the standard" "model" of important arithmetic formal systems such as Q
or PA? The answer, imho, would be: *not* by using Induction.

L(PA) is L(0,S,+,*,<), and 'S' and '+' are well known for being
instrumental in expressing concepts that depend on Induction. Then, if we
have two formulas F and F', where F includes 'S' or '+' while F' doesn't,
and (concept-wise) F' is a semantically translated version of F, F could
be a candidate for being independent from A. For instance:

- instead of defining One as S0, we could have One = "the unique minimum
  number that 0 is less than";
- instead of defining Two as SS0, we could have Two = "the unique minimum
  number that One is less than";
- we could then define an even number e as Two*x for some x;
- ...

Ultimately we'd translate certain concepts related to GC into formulas
that are independent of Induction, in the sense that the translated
formulas are free of 'S' and '+'. (A rough sketch of such translated
formulas is in the P.S. below.)

[To be continued...]
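P.S. To make the "For instance" items above a bit more concrete, here is
one possible way to write One, Two and Even as formulas of L(0,S,+,*,<)
that avoid 'S' and '+'. This is only my own sketch (in LaTeX notation) of
one reading of "the unique minimum number that ... is less than", not the
definitive translation:

  \mathrm{One}(x)\ \equiv\ 0 < x \,\land\, \lnot\exists y\,(0 < y \land y < x)

  \mathrm{Two}(x)\ \equiv\ \exists u\,(\mathrm{One}(u) \land u < x \land \lnot\exists y\,(u < y \land y < x))

  \mathrm{Even}(e)\ \equiv\ \exists t\,\exists x\,(\mathrm{Two}(t) \land e = t * x)

Only '0', '<', '*' (and '=') occur here; uniqueness of the minimum is
automatic once '<' is a linear order.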