From: David C. Ullrich on 3 Dec 2009 10:31

On Thu, 3 Dec 2009 06:09:45 -0800 (PST), Ludovicus <luiroto(a)yahoo.com>
wrote:

>On 2 Dec, 13:19, master1729 <tommy1...(a)gmail.com> wrote:
>> Aatu Koskensilta wrote :
>> kolmogorov complexity is important imho for testing randomness of strings.
>
>Most people accept what Kolmogorov and Chaitin say about randomness:
>"A random sequence has maximum complexity."
>"The program that codes a random sequence cannot have a length
>shorter than the sequence."
>
>But building an incremental measure of complexity based on the length
>of minimal programs is easily contradicted with counter-examples.

Counterexamples to a _definition_? Very curious.

>Suppose someone writes a program for the prime sequence, 100 bytes long.
>Suppose A is an arbitrary constant 100 bytes long.
>Then the arithmetic progression A.n + 1 needs a program of length
>100 + Code.
>According to Chaitin this arithmetic progression is more complex than
>the sequence of primes!

You didn't state that very well. But never mind the details - yes,
it's clear that there are arithmetic progressions that have Chaitin
complexity greater than the Chaitin complexity of the sequence
of primes. Now where's the supposed contradiction?

>Ludovicus

David C. Ullrich

"Understanding Godel isn't about following his formal proof. That
would make a mockery of everything Godel was up to."
(John Jones, "My talk about Godel to the post-grads." in sci.logic.)
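A rough way to see the size comparison being argued about, in
illustrative Python (source-text length is only a crude stand-in for
Chaitin's minimal program length, and the particular constant and the
50-term cutoff are arbitrary choices): the program text that prints
A*n + 1 has to spell out A, while the text that prints primes does
not, so only the former grows with the 100-byte constant.

# Illustrative sketch: compare the lengths of two program texts, one
# printing the first 50 primes, one printing A*n + 1 for a fixed
# 100-byte constant A. Source length only upper-bounds the minimal
# program length that Chaitin's measure actually refers to.

A_BYTES = b"\x9c" * 100   # stands in for the arbitrary 100-byte constant A

prime_prog = (
    "found = []\n"
    "n = 2\n"
    "while len(found) < 50:\n"
    "    if all(n % p for p in found):\n"
    "        found.append(n); print(n)\n"
    "    n += 1\n"
)

progression_prog = (
    "A = " + str(int.from_bytes(A_BYTES, "big")) + "\n"   # A appears verbatim
    "for n in range(1, 51):\n"
    "    print(A * n + 1)\n"
)

print(len(prime_prog))        # fixed, whatever constant is chosen
print(len(progression_prog))  # grows with the decimal expansion of A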
From: Ludovicus on 4 Dec 2009 09:12

On 3 Dec, 11:31, David C. Ullrich <dullr...(a)sprynet.com> wrote:
> On Thu, 3 Dec 2009 06:09:45 -0800 (PST), Ludovicus <luir...(a)yahoo.com>
> You didn't state that very well. But never mind the details - yes,
> it's clear that there are arithmetic progressions that have Chaitin
> complexity greater than the Chaitin complexity of the sequence
> of primes. Now where's the supposed contradiction?
> David C. Ullrich

Then Chaitin's measure of complexity is useless.
Here I present Ludovicus' measure of difficulty:
"The measure of difficulty of a theorem is equal to the quantity of
characters needed for its minimal presentation."
Sirs, its first application is the important result:
"The difficulty of Fermat's theorem is less than the difficulty of
Pythagoras' theorem."
Now I am waiting for universities to invite me to give lectures on
that marvelous finding, and for publishers to print my books on so
transcendent a discovery.
From: master1729 on 6 Dec 2009 06:09

Luis A Rodriguez wrote :

> On 2 Dec, 13:19, master1729 <tommy1...(a)gmail.com> wrote:
> > Aatu Koskensilta wrote :
> > kolmogorov complexity is important imho for testing randomness of strings.
>
> Most people accept what Kolmogorov and Chaitin say about randomness:
> "A random sequence has maximum complexity."
> "The program that codes a random sequence cannot have a length
> shorter than the sequence."
>
> But building an incremental measure of complexity based on the length
> of minimal programs is easily contradicted with counter-examples.
>
> Suppose someone writes a program for the prime sequence, 100 bytes long.
> Suppose A is an arbitrary constant 100 bytes long.
> Then the arithmetic progression A.n + 1 needs a program of length
> 100 + Code.
> According to Chaitin this arithmetic progression is more complex than
> the sequence of primes!
>
> Ludovicus

wrong.

"A random sequence HAS AND EQUALS maximum complexity" IF AND ONLY IF
"the !! SHORTEST !! program that codes THAT random sequence cannot
have a length shorter than the sequence."

is more like it ...

thus when you give multiple algorithms for computing things, that is
a flawed argument.

hope you finally understand.

regards

tommy1729
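As a practical aside on "testing randomness of strings": the true
shortest program is uncomputable, so in practice one settles for
upper bounds. The sketch below uses zlib's compressed size as such a
bound; the input sizes and the choice of compressor are arbitrary
choices for illustration, not anything from the thread.

# Illustrative sketch: compressed size as a crude upper bound on
# Kolmogorov complexity. A string a general-purpose compressor cannot
# shrink looks "random" in the shortest-description sense; a patterned
# string compresses to a small fraction of its length.

import os
import zlib

def compressed_fraction(data: bytes) -> float:
    """Compressed size divided by original size."""
    return len(zlib.compress(data, 9)) / len(data)

patterned = b"01" * 50_000        # describable by a tiny program
random_ish = os.urandom(100_000)  # no shorter description is known

print(compressed_fraction(patterned))   # well below 1
print(compressed_fraction(random_ish))  # about 1, plus slight zlib overhead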
From: Ludovicus on 12 Dec 2009 08:07

> "A random sequence HAS AND EQUALS maximum complexity" IF AND ONLY IF
> "the !! SHORTEST !! program that codes THAT random sequence cannot
> have a length shorter than the sequence."
> tommy1729

If that is the definition of a random sequence then it is useless,
because any irrational number contains all the imaginable sequences.
So a very short program for producing the infinite digits of an
irrational can code any random sequence. Chaitin affirms that the
time of computation is immaterial. Then a computer working for
centuries can find the given sequence.

Ludovicus
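For concreteness, here is a toy version of the mechanism appealed to
above: a short program that produces decimal digits of an irrational
(sqrt(2), via integer square roots) and searches them for a given
digit string. The pattern and the number of digits are arbitrary
choices made for the example.

# Toy sketch: generate decimal digits of sqrt(2) with exact integer
# arithmetic and look for a chosen digit string in the expansion.

from math import isqrt

def sqrt2_digits(n: int) -> str:
    """First n decimal digits of sqrt(2) after the leading 1."""
    return str(isqrt(2 * 10 ** (2 * n)))[1:n + 1]

digits = sqrt2_digits(100_000)
pattern = "2718"             # an arbitrary target string
print(digits.find(pattern))  # index of first occurrence, or -1 if absent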
From: zzbunker on 12 Dec 2009 09:05

On Nov 29, 7:25 am, Ludovicus <luir...(a)yahoo.com> wrote:
> ON CHAITIN MEASURE OF COMPLEXITY
>
> By Ludovicus 20/10/2009
>
> G. Chaitin's article 'Randomness and Mathematical Proof' in
> Scientific American (May 1975) asserts that:
> 'The complexity of a series of digits is the number of bits that
> must be put into a computing machine in order to obtain the
> original series as output. The complexity is therefore equal to
> the size in bits of the minimal program of the series.'
>
> This definition contradicts the accepted concept of complexity,
> for example Herbert Simon's:
> 'Given the properties of the parts and the laws of their
> interactions, it is not a trivial thing to infer the properties
> or the behavior of the system.'
>
> This is confirmed by Chaos Theory. It is not the length of an
> algorithm that leads to Chaos (great complexity), but the initial
> parameters and the interactions between operations.
>
> EXAMPLES:
>
> 1.- Chaos by iteration of a function.
>
> An illustration of complexity with a very short program is
> demonstrated by the iteration of: Y = kY^2 - 1
> (with initial 0 < Y < 1).
>
> If 1 < k < 2, eventually the values are repeated periodically,
> but with k = 2 Chaos is established. If Chaos does not occur, it
> is because the precision is too low.

But, that's also why the people who understand real computers,
rather than just arbitrary Quantum Mechanics momenta, invented
Laser-Guided Phasors, rather than double slit experiments.
And invented Holograms and Desktop Publishing rather than
Spectrum Analyzers.
And invented Flat Screen Software Debuggers, rather than Hollerith
Codes.
And invented Blu-ray rather than Hard Disks.
And invented XML, rather than GE.
And invented Atomic Clock Watches and All-In-One Printers rather
than keyboards.
And invented Home Broadband, rather than Unix.
And invented Holograms, USB, and GPS, rather than IBM.
And invented Self-Assembling Robots, External Mini Computer Hard
Disks, and Rapid Prototyping, rather than Intel.
And invented Weather Satellites, Data Fusion, UAVs, mp3, mpeg, and
Optical Computers, rather than Honeywell.

> 2.- Conway's Game of Life.
>
> That cellular automaton is generated by a minimal program
> of N bits plus M bits of data.
>
> Suppose that the data are the coordinates of a figure and
> the produced sequence is the number of 'critters' in each
> cycle. The following configuration of six coordinates will
> produce a sequence whose first eight numbers are:
> 6,5,8,8,12,12,20,12 and then repeats 12,12,.. indefinitely.
> (That is: low complexity)
>
> o   o
> o   o
> o   o
>
> But with the same program and number of coordinates:
>
> o   o
>   o
> o o o
>
> we have a sequence of 1108 numbers before it repeats.
>
> And the evidence that the process can be non-linear is
> that there can be synergy between two identical
> configurations:
>
> o   o         o   o
>   o     . . .   o
> o o o         o o o
>
> This produces a sequence of 3160 numbers before it repeats.
>
> But the definitive demonstration that a minimal finite program
> can produce infinite complexity is that the four simple laws of
> the Game of Life can generate a non-periodic sequence.
> And behold, it can simulate a Universal Turing Machine!
>
> 3.- Ludovicus Curve
>
> In what follows, two parametric functions with the sole
> difference of a letter's position produce very different
> complexities.
> Periodic Curve:
>
> X = T + SIN(5 * Y)
> Y = T - COS(2 * X)
>
> "PSEUDO-SINUSOID"
>
> Swap X and Y and you have the chaotic Curve:
>
> X = T + SIN(5 * X)
> Y = T - COS(2 * Y)
>
> I call it: "LUDOVICUS' FUNCTION"
>
> In his book META MATH (Vintage, 2006, p. 23) he says that
> "in general the best way to specify N via a computer program
> is just to give it explicitly as a constant in a program" -
> with no calculation at all!
>
> Why? If a given N, 10^9 bits long, is extracted from a list of
> digits of pi, or of sqrt(7), surely it can be reproduced with a
> program many times smaller!
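Example 1 above is easy to try. The sketch below iterates
y -> k*y^2 - 1 for a tame and for a chaotic parameter; the value
k = 1.2, the starting point 0.3, and the step counts are choices made
for the illustration, not values taken from the post.

# Illustrative sketch of example 1: iterate y -> k*y**2 - 1.
# k = 1.2 settles onto a 2-cycle; k = 2 is the classic chaotic case
# (conjugate to z -> z**2 - 2), where nearby starting values separate
# until a tiny initial difference has grown large.

def iterate(k, y0, steps):
    ys = [y0]
    for _ in range(steps):
        ys.append(k * ys[-1] ** 2 - 1)
    return ys

tame = iterate(1.2, 0.3, 60)
print([round(v, 4) for v in tame[-6:]])   # alternates between two values

a = iterate(2.0, 0.3, 60)
b = iterate(2.0, 0.3 + 1e-12, 60)         # perturb the 12th decimal place
print(round(abs(a[-1] - b[-1]), 4))       # typically of order 1 by now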
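Example 2 is also easy to reproduce, except that the flattened
figures no longer fix the exact coordinates, so the six-cell pattern
in the sketch below (two vertical columns of three) is only a guess
at the first figure, and the cell counts it prints need not match the
numbers quoted above. The counting itself is a direct implementation
of Conway's rules.

# Minimal Game of Life on a set of live (x, y) cells, printing the
# number of live cells in each generation. The starting pattern is a
# guessed reading of the first six-cell figure, not a verified copy.

from collections import Counter

def step(cells):
    """Apply one generation of Conway's rules."""
    neighbours = Counter(
        (x + dx, y + dy)
        for (x, y) in cells
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    return {c for c, n in neighbours.items()
            if n == 3 or (n == 2 and c in cells)}

pattern = {(0, 0), (0, 1), (0, 2), (3, 0), (3, 1), (3, 2)}  # two columns of three

history = [pattern]
for _ in range(12):
    history.append(step(history[-1]))
print([len(g) for g in history])   # live-cell count per generation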
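Example 3 reads like BASIC assignments inside a loop over T, with X
and Y carrying their values from one step to the next; that
evaluation order is an assumption, since the post does not spell it
out. The sketch below traces both variants under that assumption; the
step size and number of steps are likewise arbitrary.

# A guess at how the two curves might be traced: sweep T in small
# steps and update X and Y in the order written, reusing the
# previously computed values.

import math

def trace(swapped, steps=2000, dt=0.01):
    x = y = 0.0
    t = 0.0
    pts = []
    for _ in range(steps):
        t += dt
        if not swapped:   # "pseudo-sinusoid" version
            x = t + math.sin(5 * y)
            y = t - math.cos(2 * x)
        else:             # letters swapped: each equation feeds back on itself
            x = t + math.sin(5 * x)
            y = t - math.cos(2 * y)
        pts.append((x, y))
    return pts

print(trace(swapped=False)[-3:])   # last few points of the first curve
print(trace(swapped=True)[-3:])    # last few points of the swapped curve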