From: Jesse F. Hughes on 12 Dec 2009 13:15

Ludovicus <luiroto(a)yahoo.com> writes:

> If that is the definition of a random sequence then it is useless.
> Because any irrational number contains all the imaginable sequences.

The sentence above is simply false.  The number
0.1101001000100001000001... is irrational but does not contain the
subsequence 12345.

> So a very short program for producing the infinite digits of an
> irrational can code any random sequence.

That does not follow.  Suppose that pi contains every finite sequence.
We need not only an algorithm to compute pi, but also the position of
the digit at which the sequence in question begins.  If that position
is large and not very compressible, then the program that uses the
computation of pi to output our sequence may be longer than the
sequence itself.

--
"I am sure that my writings will survive far longer than Wikipedia,
and that deep into the future, about the only human that people will
want to know alot [sic] about is Archimedes Plutonium.  At some point
in the future history of humanity, AP will eclipse even Jesus."
                                                           -- AP
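(A minimal sketch, not part of the original post; Python is used here only
because the thread names no language.  The counterexample digits are
trivial to generate mechanically, and since only 0s and 1s ever occur, a
subsequence such as 12345 can never appear.)

    # Digits of 0.1101001000100001...: ones separated by 0, 1, 2, ... zeros.
    # The runs of zeros keep growing, so the expansion never becomes
    # periodic (hence the number is irrational), yet every digit is 0 or 1.
    def counterexample_prefix(blocks):
        return "".join("1" + "0" * n for n in range(blocks))

    digits = counterexample_prefix(10)
    print(digits)               # 1101001000100001000001...
    print("12345" in digits)    # False, no matter how many blocks are generated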
From: Ilmari Karonen on 12 Dec 2009 15:08

On 2009-12-12, Jesse F. Hughes <jesse(a)phiwumbda.org> wrote:
> Ludovicus <luiroto(a)yahoo.com> writes:
>
>> So a very short program for producing the infinite digits of an
>> irrational can code any random sequence.
>
> That does not follow.  Suppose that pi contains every finite
> sequence.  We need not only an algorithm to compute pi, but also the
> position of the digit at which the sequence in question begins.  If
> that position is large and not very compressible, then the program
> that uses the computation of pi to output our sequence may be longer
> than the sequence itself.

I suspect that what he intends to do is simply to have that short
pi-generating program output the digits of pi sequentially until the
desired finite subsequence just happens to come up.

Of course, the reason that won't really help is that the program must
be able to recognize the desired sequence when it sees it.  And for
most sequences, the shortest program that recognizes the sequence is
of approximately the same length as the shortest program that
generates it.

(Indeed, given a program to generate the digits of a normal number, I
believe it should be easy to show, using exactly the argument above,
that there exists a constant C such that, for all finite sequences S,
|length(Gen(S)) - length(Rec(S))| < C, where Gen(S) and Rec(S) are the
shortest programs to generate and recognize the sequence S.)

--
Ilmari Karonen

To reply by e-mail, please replace ".invalid" with ".net" in address.
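(A sketch of that "generate until it turns up" idea, not from the thread.
The digit source is sqrt(2) rather than pi, purely because its digits are
easy to obtain exactly with integer arithmetic; the function names are
made up for illustration.  The catch is the one stated above: the target
has to be spelled out inside the program, so the short generator saves
nothing.)

    from math import isqrt

    def sqrt2_digits(n):
        # First n decimal digits of sqrt(2) (including the leading 1),
        # obtained exactly as the integer square root of 2 * 10^(2(n-1)).
        return str(isqrt(2 * 10 ** (2 * (n - 1))))

    def find_in_stream(target, budget=100_000):
        # "Output digits until the target shows up" -- but the target
        # itself is wired into this program, which is exactly the
        # recognizer whose length is at issue.
        pos = sqrt2_digits(budget).find(target)
        return pos if pos >= 0 else None

    print(find_in_stream("999"))          # short targets tend to appear early
    print(find_in_stream("31415926535"))  # an 11-digit target will usually
                                          # exceed any fixed digit budget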
From: zzbunker on 13 Dec 2009 16:58

On Dec 12, 9:05 am, "zzbun...(a)netscape.net" <zzbun...(a)netscape.net> wrote:
> On Nov 29, 7:25 am, Ludovicus <luir...(a)yahoo.com> wrote:
>
> > ON CHAITIN'S MEASURE OF COMPLEXITY
> > By Ludovicus, 20/10/2009
> >
> > G. Chaitin's article 'Randomness and Mathematical Proof' in
> > Scientific American (May 1975) asserts that:
> >
> >   'The complexity of a series of digits is the number of bits that
> >   must be put into a computing machine in order to obtain the
> >   original series as output.  The complexity is therefore equal to
> >   the size in bits of the minimal program of the series.'
> >
> > This definition contradicts the accepted concept of complexity,
> > for example Herbert Simon's:
> >
> >   'Given the properties of the parts and the laws of their
> >   interactions, it is not a trivial thing to infer the properties
> >   or the behavior of the system.'
> >
> > This is confirmed by Chaos Theory.  It is not the length of an
> > algorithm that leads to chaos (great complexity), but the initial
> > parameters and the interactions between operations.
> >
> > EXAMPLES:
> >
> > 1.  Chaos by iteration of a function.
> >
> > An illustration of complexity with a very short program is given
> > by the iteration of Y = kY^2 - 1 (with initial 0 < Y < 1).
> >
> > If 1 < k < 2, the values eventually repeat periodically, but with
> > k = 2 chaos is established.  If chaos does not occur, it is
> > because the precision is too low.
>
> But that's also why the people who understand real computers,
> rather than just arbitrary Quantum Mechanics momenta,
> invented Laser-Guided Phasors, rather than double-slit experiments.
> And invented Holograms and Desktop Publishing rather than Spectrum
> Analyzers.
> And invented Flat Screen Software Debuggers, rather than Hollerith
> Codes.
> And invented Blu-ray rather than Hard Disks.
> And invented XML, rather than GE.
> And invented Atomic Clock Watches and All-In-One Printers rather
> than keyboards.
> And invented Home Broadband, rather than Unix.
> And invented Holograms, USB, and GPS, rather than IBM.
> And invented Self-Assembling Robots, External Mini Computer
> Harddisks, and Rapid Prototyping, rather than Intel.
> And invented Weather Satellites, Data Fusion, UAVs, mp3, mpeg,
> and Optical Computers, rather than Honeywell.
>
> Or you could even state it as: "It's not like we can't be bought
> and sold, we just can't be bought and sold by Q-bit imbeciles."
>
> > 2.  Conway's Game of Life.
> >
> > That cellular automaton is generated by a minimum program of
> > N bits plus M bits of data.
> >
> > Suppose that the data are the coordinates of a figure and the
> > produced sequence is the number of 'critters' in each cycle.
> > The following configuration of six coordinates will produce a
> > sequence whose first eight numbers are 6,5,8,8,12,12,20,12 and
> > which then repeats 12,12,... indefinitely.  (That is: low
> > complexity.)
> >
> >   o o
> >   o o
> >   o o
> >
> > But with the same program and number of coordinates:
> >
> >   o o
> >   o
> >   o o o
> >
> > we have a sequence of 1108 numbers before it repeats.
> >
> > And the evidence that the process can be non-linear is that
> > synergy can exist between two identical configurations:
> >
> >   o o o
> >   o o o . . . o o
> >   o o
> >   o o
> >
> > This produces a sequence of 3160 numbers before it repeats.
> > But the definitive demonstration that a minimal finite program
> > can produce infinite complexity is that the simple four laws of
> > the Game of Life can simulate a non-periodic sequence.  And
> > behold, it can simulate a Universal Turing Machine!
> >
> > 3.  Ludovicus Curve
> >
> > In what follows, two parametric functions that differ only in the
> > position of a single letter produce very different complexities.
> >
> > Periodic curve (the "pseudo-sinusoid"):
> >
> >   X = T + SIN(5 * Y)
> >   Y = T - COS(2 * X)
> >
> > Swap X and Y and you have the chaotic curve, which I call
> > "Ludovicus' function":
> >
> >   X = T + SIN(5 * X)
> >   Y = T - COS(2 * Y)
> >
> > In his book META MATH (Vintage, 2006, p. 23) Chaitin says:
> >
> >   because in general the best way to specify N via a computer
> >   program is just to give it explicitly as a constant in a program
> >
> > With no calculation at all!  Why?  If a given N of 10^9 bits is
> > extracted from a list of digits of pi, or of √7, surely it can be
> > reproduced with a program many times smaller!
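(Again just a sketch, not from the thread, to make item 1 of the quoted
post concrete.  The k values are my own choices: k = 1.1 lies in a
periodic window and settles onto a 2-cycle, while k = 2 is the classic
chaotic case, where nearby starting values separate exponentially.)

    def orbit_tail(k, y0, n, keep=8):
        # Iterate Y -> k*Y**2 - 1 from y0 for n steps; return the last
        # `keep` values of the orbit.
        y = y0
        tail = []
        for _ in range(n):
            y = k * y * y - 1
            tail.append(y)
        return tail[-keep:]

    # k = 1.1: the orbit settles onto a 2-cycle (about 0.083 and -0.992).
    print(orbit_tail(1.1, 0.3, 1000))

    # k = 2: chaotic; two starts differing by 1e-7 end up completely unrelated.
    print(orbit_tail(2.0, 0.3, 1000))
    print(orbit_tail(2.0, 0.3 + 1e-7, 1000))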
From: master1729 on 15 Dec 2009 22:31
> > > > "A random sequence HAS AND EQUALS maximum > complexity. IF AND ONLY IF > > "The !! SHORTEST !! program that codes THAT random > sequence cannot have a length shorter than the > sequence." > > > tommy1729 > > If that is the definition of a random sequence then > it is useless. > Because any irrational number contains all the > imaginable sequences. > So a very sort program for producing the infinite > digits of a > irrational can code any random sequence. > Chaitin affirms that the time of computation is > immaterial. Then a > computer working centuries can find the given > sequence. > Ludovicus > thats plain wrong. tommy1729 |