From: MoeBlee on 16 Jun 2010 12:16

On Jun 16, 11:07 am, Aatu Koskensilta <aatu.koskensi...(a)uta.fi> wrote:
> Herman Jurjus <hjm...(a)hetnet.nl> writes:
>
> > Also many classical mathematicians appreciate this as an example
> > showing that the extensional notion 'Turing computable' is a slight
> > distortion of the intuitive notion 'computable'.
>
> Possibly, but I don't think this is quite the right diagnosis. The issue
> is more subtle.
>
> It's a well known phenomenon that many classically minded mathematicians
> who have had little practice in constructive thinking are unwittingly
> inconsistent in their reading of intuitionistic quantifiers. It's an
> equally striking phenomenon that classically minded mathematicians in
> certain contexts naturally adopt an intuitionistic reading of classical
> quantifiers. In addition to the example provided by Virgil [...]

So is there a FORMAL constructivistic (or, more specifically,
intuitionistic) theory in which we can formalize 'computable' so that it
is faithful to the kind of "intensional" notion as held by Virgil?

(Classically, if f is recursive then it is computable, and, if we accept
Turing's thesis, we get the other direction, so that we may formalize the
notion of "computable" as recursive.)

MoeBlee

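A minimal sketch of the classical notion MoeBlee alludes to, assuming the
standard textbook definition (it is not spelled out in the thread): a real x
is computable if some total recursive f : N -> Q satisfies |f(n) - x| <= 2^-n
for all n. The Python function below produces such rational approximations
for sqrt(2) by binary search; the function name is mine, chosen for
illustration only.

    from fractions import Fraction

    def sqrt2_approx(n):
        """Return a rational q with |q - sqrt(2)| <= 2**-n."""
        lo, hi = Fraction(1), Fraction(2)      # sqrt(2) lies in [1, 2]
        while hi - lo > Fraction(1, 2**n):
            mid = (lo + hi) / 2
            if mid * mid <= 2:                 # keep the half containing sqrt(2)
                lo = mid
            else:
                hi = mid
        return lo

    print(float(sqrt2_approx(20)))             # ~1.4142135...

Any real for which such an approximating program exists is computable in the
classical sense, whatever one thinks of the intensional reading.
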
From: WM on 16 Jun 2010 12:34

On 16 Jun., 17:55, Aatu Koskensilta <aatu.koskensi...(a)uta.fi> wrote:
> WM <mueck...(a)rz.fh-augsburg.de> writes:
>
> > It is very probable that every line has many different meanings, but no
> > line has uncountably many meanings.
>
> Every line has an indefinite and indeterminate number of possible
> meanings. It makes no sense to speak of the cardinality of the totality
> of all possible meanings of a string of symbols.

It is a fact that all possible meanings must be defined by finite
definitions. Therefore the meanings are countable. Therefore it makes
sense to call the set of all meanings countable. In mathematics we can
calculate or estimate things even if not all of them can be named.

> > Therefore the list contains only countably many finite definitions.
>
> The list contains just random, meaningless strings. Whenever we instill,
> with mathematical precision, these strings with some definite meaning,
> so that they become definitions of reals, we find there are definitions
> not included in the list. There is an absolute notion of computability
> in logic.

If this notion yields only numbers that are in trichotomy with each
other, then the notion is acceptable. If this notion yields numbers like
those you gave examples of (IIRC), such as n = (1 if Obama gets a second
term and 0 otherwise), then this notion, together with your logic, should
be put into the trash can.

> There is no absolute notion of definability.

We do not need an absolute notion if we know that all possible
definitions of the notion of definability belong to a countable set. It
is enough to prove by estimation that set theory is wrong. Compare the
famous irrationality proofs and transcendence proofs of number theory. We
need not calculate its deviation from truth to the fifth digit. It is
enough to see that ZFC is wrong unless there is a natural number between
0 and 1.

Science demands doubt about the believable. Matheology demands belief in
the doubtful.

Regards, WM

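A brief sketch, under the assumption that "finite definitions" means finite
strings over some fixed finite alphabet: the generator below (my own
illustration, not from the thread) lists every such string at some finite
position, which is all that the claim "the finite definitions are countable"
amounts to.

    from itertools import count, product

    def all_strings(alphabet):
        """Yield every finite string over `alphabet`, shortest first."""
        for length in count(0):
            for chars in product(alphabet, repeat=length):
                yield ''.join(chars)

    gen = all_strings('01')
    print([next(gen) for _ in range(7)])   # ['', '0', '1', '00', '01', '10', '11']
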
From: WM on 16 Jun 2010 12:46

On 16 Jun., 18:07, Aatu Koskensilta <aatu.koskensi...(a)uta.fi> wrote:
> Herman Jurjus <hjm...(a)hetnet.nl> writes:
>
> > Also many classical mathematicians appreciate this as an example
> > showing that the extensional notion 'Turing computable' is a slight
> > distortion of the intuitive notion 'computable'.
>
> Possibly, but I don't think this is quite the right diagnosis. The issue
> is more subtle.

The issue is very simple. A real number is computable if its place on
the real axis can be established, i.e., trichotomy. Otherwise it does not
deserve the name "number" but at most "number form" or "interval" (like
0.1x, which means 0.10 to 0.19 in decimal).

> It's a well known phenomenon that many classically minded mathematicians
> who have had little practice in constructive thinking are unwittingly
> inconsistent in their reading of intuitionistic quantifiers. It's an
> equally striking phenomenon that classically minded mathematicians in
> certain contexts naturally adopt an intuitionistic reading of classical
> quantifiers. In addition to the example provided by Virgil it's not
> uncommon for people to mistakenly think, in a classical context, that
> countable sets come equipped with a designated bijection witnessing
> their countability.

The proof of countability of S does not require a bijection of S with N
but only an injection of some superset of S into N. This is established
by my list

0
1
00
01
...

where every line may be enumerated by an element of the countable set
omega^omega^omega (and, if required, finitely many more exponents for
alphabets, languages, dictionaries, thesauruses, and further properties).

Regards, WM

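A sketch of one possible explicit injection for a list of finite binary
strings like the one above (the encoding is mine, not taken from the thread):
prefix each string with '1' and read the result in base 2. The leading 1
preserves the length of the original string, so distinct strings map to
distinct natural numbers, and no bijection is needed to conclude
countability.

    def string_to_nat(s):
        """Injectively map a finite binary string to a natural number."""
        return int('1' + s, 2)

    for s in ['0', '1', '00', '01']:
        print(s, '->', string_to_nat(s))    # 0 -> 2, 1 -> 3, 00 -> 4, 01 -> 5
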
From: WM on 16 Jun 2010 13:31

On 16 Jun., 03:05, "Mike Terry" <news.dead.person.sto...(a)darjeeling.plus.com> wrote:
> "Virgil" <Vir...(a)home.esc> wrote in message
> news:Virgil-3B2F0B.16291815062010(a)bignews.usenetmonster.com...
>
> > In article <87sk4ohwbt....(a)dialatheia.truth.invalid>,
> > Aatu Koskensilta <aatu.koskensi...(a)uta.fi> wrote:
>
> > > Virgil <Vir...(a)home.esc> writes:
>
> > > > Note that it is possible to have an uncomputable number whose decimal
> > > > expansion has infinitely many known places, so long as it has at least
> > > > one unknown place.
>
> > > You need infinitely many unknown places.
>
> > If the value of some decimal digit of a number depends on the truth of
> > an undecidable proposition, can such a number be computable?
>
> Yes - e.g. imagine just the first digit of the following number depends on
> an undecidable proposition:
>
> 0.x000000000...

This is not a real number. It restricts the set of numbers to 10
different numbers (in decimal).

> There are only 10 possibilities for the number, and in each case it is
> obviously computable...

But it is not clear which case will be chosen. Two real numbers a and b
satisfy a < b or a = b or a > b. 0.y and 0.x do not. These symbols are
number forms, like x and y in 3x + 5y = 0, or like n in "let n be an even
number". Obviously n need not be an even number. It is no number at all.

Regards, WM

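A small sketch of Mike Terry's point, with program names that are mine and
purely illustrative: each of the ten candidate values of 0.x000... is printed
by a trivial constant program, so every candidate is computable even though
the undecidable proposition leaves open which of the ten programs is the
"right" one.

    def make_candidate(x):
        """Return a program that prints the decimal expansion 0.x000..."""
        def candidate(n_digits):
            # x is a fixed digit 0..9; all later digits are 0
            return '0.' + str(x) + '0' * (n_digits - 1)
        return candidate

    candidates = [make_candidate(x) for x in range(10)]
    print(candidates[3](5))    # '0.30000'
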
From: MoeBlee on 16 Jun 2010 13:45

On Jun 16, 11:46 am, WM <mueck...(a)rz.fh-augsburg.de> wrote:
> On 16 Jun., 18:07, Aatu Koskensilta <aatu.koskensi...(a)uta.fi> wrote:
>
> > it's not
> > uncommon for people to mistakenly think, in a classical context, that
> > countable sets come equipped with a designated bijection witnessing
> > their countability.
>
> The proof of countability of S does not require a bijection of S with
> N but only an injection of some superset of S into N.

He didn't say "bijection with N" (I take it by N you mean the set of
natural numbers or perhaps the set of positive natural numbers). But
every countable set does have a bijection with N or a bijection with
some member of w (where w is the set of natural numbers).

Aatu's point here is that in classical mathematics the definition of 'x
is countable' is 'there exists a bijection from x onto the set of natural
numbers or onto a natural number', and that definition does not include a
clause that we (me, you, whoever) KNOW of a particular bijection that we
can specifically mention or define.

MoeBlee

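For reference, the classical definition MoeBlee paraphrases, written out in
standard notation (the formalization is mine, stated with w as the set of
natural numbers):

    $x \text{ is countable} \;\leftrightarrow\; \exists f\,\exists y\,\bigl[\,(y = \omega \lor y \in \omega) \land f : x \to y \text{ is a bijection}\,\bigr]$

The existential quantifier asserts only that such an f exists; nothing in the
definition requires that any particular f be exhibited, named, or definable.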