From: Ludovicus on 29 Nov 2009 07:25

ON CHAITIN'S MEASURE OF COMPLEXITY

By Ludovicus, 20/10/2009

G. Chaitin's article 'Randomness and Mathematical Proof' in
Scientific American (May 1975) asserts that:

'The complexity of a series of digits is the number of bits that
must be put into a computing machine in order to obtain the
original series as output. The complexity is therefore equal to
the size in bits of the minimal program of the series.'

This definition contradicts the accepted concept of complexity;
for example, Herbert Simon's:

'Given the properties of the parts and the laws of their
interactions, it is not a trivial thing to infer the properties
or the behavior of the system.'

This is confirmed by Chaos Theory. It is not the length of an
algorithm that leads to chaos (great complexity), but the initial
parameters and the interactions between operations.

EXAMPLES:

1.- Chaos by iteration of a function.

An illustration of complexity with a very short program is
demonstrated by the iteration of Y = kY^2 - 1 (with initial
0 < Y < 1). If 1 < k < 2, the values eventually repeat
periodically, but with k = 2 chaos is established. If chaos does
not occur, it is because the precision is too low.

2.- Conway's Game of Life.

That cellular automaton is generated by a minimal program of N
bits plus M bits of data. Suppose the data are the coordinates of
a figure and the produced sequence is the number of 'critters' in
each cycle. The following configuration of six coordinates will
produce a sequence whose first eight numbers are
6, 5, 8, 8, 12, 12, 20, 12, and which then repeats 12, 12, ...
indefinitely. (That is: low complexity.)

o o
o o
o o

But with the same program and the same number of coordinates:

o o
o
o o o

we have a sequence of 1108 numbers before it repeats.

And the evidence that the process can be non-linear is that there
can exist synergy between two identical configurations:

o o o
o o o . . . o o
o o
o o

This produces a sequence of 3160 numbers before it repeats.
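The quadratic-map example above (Y = kY^2 - 1) can be sketched in a few lines of Python. The particular starting values and iteration counts below are illustrative choices, not taken from the post:

```python
# Iterate y -> k*y^2 - 1, the map from example 1 above.
def orbit(k, y0, n):
    ys = [y0]
    for _ in range(n):
        ys.append(k * ys[-1] ** 2 - 1)
    return ys

# k = 1: the orbit quickly settles onto the attracting 2-cycle 0 <-> -1.
periodic = orbit(1.0, 0.5, 60)

# k = 2: sensitive dependence on initial conditions -- two orbits that
# start 1e-9 apart separate completely within a few dozen iterations.
a = orbit(2.0, 0.3, 60)
b = orbit(2.0, 0.3 + 1e-9, 60)
gap = max(abs(x - y) for x, y in zip(a, b))
print(gap)  # order 1: the tiny initial difference has been amplified
```

The same ten-line program produces either a periodic or a chaotic sequence depending only on the parameter k, which is the poster's point about short programs and complex behavior.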
But the definitive demonstration that a minimal finite program
can produce infinite complexity is that the simple four rules of
the Game of Life can produce a non-periodic sequence. And behold,
it can simulate a Universal Turing Machine!

3.- Ludovicus Curve

In what follows, two parametric functions that differ only in the
position of a single letter produce very different complexities.

Periodic curve:

X = T + SIN(5 * Y)
Y = T - COS(2 * X)

"PSEUDO-SINUSOID"

Swap X and Y and you have the chaotic curve:

X = T + SIN(5 * X)
Y = T - COS(2 * Y)

I call it: "LUDOVICUS' FUNCTION"

In his book META MATH (Vintage, 2006, pag. 23), Chaitin says:
'...because in general the best way to specify N via a computer
program is just to give it explicitly as a constant in a program,
with no calculation at all!'

Why? If a given N of 10^9 bits long is extracted from a list of
digits of pi, or of √7, surely it can be reproduced with a
program many times smaller!
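The two curve definitions above are implicit in X and Y; one plausible reading is to solve them by fixed-point iteration at each parameter value T. The sketch below adopts that reading; the iteration count, the step size in T, and the starting guess are all assumptions, since the post does not spell out how the curves are evaluated:

```python
import math

def curve_point(t, swapped, iters=50):
    """Iterate the coupled update for one parameter value t.

    swapped=False: X = T + SIN(5*Y), Y = T - COS(2*X)  (the "pseudo-sinusoid")
    swapped=True:  X = T + SIN(5*X), Y = T - COS(2*Y)  (the chaotic variant)
    """
    x = y = t  # assumed starting guess
    for _ in range(iters):
        if swapped:
            x, y = t + math.sin(5 * x), t - math.cos(2 * y)
        else:
            x, y = t + math.sin(5 * y), t - math.cos(2 * x)
    return x, y

# Sample both curves over the same parameter range and compare by eye
# (or by plotting): one traces a smooth wave, the other scatters.
periodic_pts = [curve_point(i * 0.01, swapped=False) for i in range(200)]
chaotic_pts = [curve_point(i * 0.01, swapped=True) for i in range(200)]
```

Note that whichever variant is iterated, each point stays within distance 1 of its parameter value, since the last update is always T plus a sine or minus a cosine; the difference between the two curves is in how the iterates behave, not in their range.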
From: Timothy Murphy on 29 Nov 2009 09:26

Ludovicus wrote:

> G. CHAITIN's article: 'Randomness and Mathematical Proof' in
> Scientific American (May 1975) asserts that:
> 'The complexity of a series of digits is the number of bits that
> must be put into a computing machine in order to obtain the
> original series as output. The complexity is therefore equal to
> the size in bits of the minimal program of the series.'
>
> This definition contradicts the accepted concept of complexity;
> for example, Herbert Simon's:
> 'Given the properties of the parts and the laws of their
> interactions, it is not a trivial thing to infer the properties
> or the behavior of the system.'

There are many different notions of "complexity".
Chaitin's definition makes perfect sense.
If you don't like the term "complexity",
call it algorithmic entropy, or informational content.

--
Timothy Murphy
e-mail: gayleard /at/ eircom.net
tel: +353-86-2336090, +353-1-2842366
s-mail: School of Mathematics, Trinity College, Dublin 2, Ireland
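Murphy's point that Chaitin's "complexity" is really informational content can be illustrated with a crude, computable stand-in: the length of a compressed encoding is an upper bound on a string's information content. This is only a sketch; a general-purpose compressor like zlib is of course a rough proxy for "the minimal program", not the real thing:

```python
import random
import zlib

def compressed_size(data: bytes) -> int:
    """Length of a zlib encoding: a crude upper bound on information content."""
    return len(zlib.compress(data, 9))

# A highly regular string: a very short program ("print '0' 1000 times")
# suffices, and the compressor finds almost all of the redundancy.
regular = b"0" * 1000

# Pseudorandom bytes (seeded, so the example is reproducible): no short
# description is apparent, and compression removes almost nothing.
rng = random.Random(0)
noisy = bytes(rng.randrange(256) for _ in range(1000))

print(compressed_size(regular), compressed_size(noisy))
```

Both inputs are 1000 bytes long, yet the regular one compresses to a few dozen bytes while the pseudorandom one stays near 1000: same length, very different informational content.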
From: Don Stockbauer on 29 Nov 2009 09:41

On Nov 29, 8:26 am, Timothy Murphy <gayle...(a)eircom.net> wrote:
> Ludovicus wrote:
> [quoted original post snipped]
>
> There are many different notions of "complexity".
> Chaitin's definition makes perfect sense.
> If you don't like the term "complexity",
> call it algorithmic entropy, or informational content.

You think you'll ever get a simple definition of complexity?
From: David C. Ullrich on 29 Nov 2009 11:37

On Sun, 29 Nov 2009 04:25:07 -0800 (PST), Ludovicus
<luiroto(a)yahoo.com> wrote:

> ON CHAITIN'S MEASURE OF COMPLEXITY
>
> By Ludovicus 20/10/2009
>
> G. CHAITIN's article: 'Randomness and Mathematical Proof' in
> Scientific American (May 1975) asserts that:
> 'The complexity of a series of digits is the number of bits that
> must be put into a computing machine in order to obtain the
> original series as output. The complexity is therefore equal to
> the size in bits of the minimal program of the series.'
>
> This definition contradicts the accepted concept of complexity;
> for example, Herbert Simon's:
> 'Given the properties of the parts and the laws of their
> interactions, it is not a trivial thing to infer the properties
> or the behavior of the system.'

Huh? Why in the world do you imagine that these two statements
contradict each other?

> This is confirmed by Chaos Theory. It's not the length of an
> algorithm that conveys to Chaos (great complexity),

And where do you get the impression that "chaos" is the same
as "great complexity"?

> but the initial parameters and the interactions between
> operations.
>
> [rest of the original post snipped]

David C. Ullrich

"Understanding Godel isn't about following his formal proof.
That would make a mockery of everything Godel was up to."
(John Jones, "My talk about Godel to the post-grads."
in sci.logic.)
From: master1729 on 29 Nov 2009 07:22
David C Ullrich wrote:

> [full quote of the original post and Ullrich's interleaved replies snipped]
i agree with Ullrich. and all replies to the OP till now.

Ludovicus is delusional to think that those 2 definitions
contradict each other.

the chaitin measure of complexity is

1) not wrong
2) just a kind of complexity, there are others
3) quite similar to kolmogorov complexity (some argue it's
equivalent and chaitin 'had nothing', but that's another subject
and not the objection of the OP)

BUT most importantly: chaitin complexity is defined for a series
of digits; thus a number or a single number sequence. WHEREAS
Herbert Simon talks about the chaos of functions.

so chaitin talks about the complexity of the digits of a real,
and h. simon talks about the chaos of functions. big difference.

yes, i said i agree with david c ullrich! but i posted to clarify
for ludovicus; a typical ullrich 'huh?!' is not always
sufficient... on the other hand maybe i'm naive and ludovicus
will be stubborn... and i will regret posting more than ullrich.
time will tell.

btw, kolmogorov complexity is one of the best.

regards

tommy1729