From: Ludovicus on
>
>
>
> >       EXAMPLES:
> >        1.- Chaos by iteration of a function.
>
> >       An illustration of complexity with a very short program is
> >     demonstrated by the iteration of  Y = k*Y^2 - 1
> >     (with initial 0 < Y < 1).
>
> >       If  1 < k < 2, the values eventually repeat periodically,
> >     but with k = 2 chaos is established. If chaos does not occur,
> >     it is because the precision is too low.
>
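
A minimal sketch of that iteration (in Python; the particular values
of k and the starting point below are my own choice, only to show the
contrast):

    # Iterate Y = k*Y^2 - 1 from the same starting value for two values of k.
    def tail_of_orbit(k, y0, burn_in=200, show=6):
        y = y0
        for _ in range(burn_in):
            y = k * y * y - 1
        tail = []
        for _ in range(show):
            y = k * y * y - 1
            tail.append(round(y, 6))
        return tail

    print("k = 1.2:", tail_of_orbit(1.2, 0.3))   # settles onto a short cycle
    print("k = 2.0:", tail_of_orbit(2.0, 0.3))   # keeps wandering without repeating
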
> >       2.- Conway's Game of Life.
>
> >       That cellular automaton is generated by a minimal program
> >     of N bits plus M bits of data.
>
> >       Suppose that the data are the coordinates of a figure and
> >     the produced sequence is the number of 'critters' in each
> >     cycle. The following configuration of six coordinates
> >     produces a sequence whose first eight numbers are
> >     6, 5, 8, 8, 12, 12, 20, 12 and which then repeats 12, 12, ... indefinitely.
> >     (That is: low complexity.)
>
> >                              o o
> >                             o   o
> >                            o     o
>
> >       But with the same program and number of coordinates:
>
> >                            o o
> >                            o
> >                            o o o
>
> >       We have a sequence of 1108 numbers before it repeats.
>
> >       And the evidence that the process can be nonlinear is
> >     that synergy can exist between two identical
> >     configurations:
>
> >                                       o o o
> >                           o o o . . . o   o
> >                           o               o
> >                           o o
>
> >        This produces a sequence of 3160 numbers before it repeats.
>
> >        But the definitive demonstration that a minimal finite
> >      program can produce infinite complexity is that the four
> >      simple laws of the Game of Life can produce a non-periodic sequence.
> >        And behold, they can simulate a Universal Turing Machine!
>
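
A small sketch of how such a population sequence can be computed
(Python, standard B3/S23 rules; the coordinates below are only my
reading of the first six-cell figure):

    from collections import Counter

    # One step of Conway's Game of Life (B3/S23) on an unbounded grid,
    # represented as a set of live (x, y) cells.
    def step(cells):
        neighbours = Counter((x + dx, y + dy)
                             for (x, y) in cells
                             for dx in (-1, 0, 1)
                             for dy in (-1, 0, 1)
                             if (dx, dy) != (0, 0))
        return {c for c, n in neighbours.items()
                if n == 3 or (n == 2 and c in cells)}

    # Number of live 'critters' in each cycle.
    def population_sequence(cells, generations):
        counts = [len(cells)]
        for _ in range(generations):
            cells = step(cells)
            counts.append(len(cells))
        return counts

    figure = {(2, 0), (4, 0), (1, 1), (5, 1), (0, 2), (6, 2)}
    print(population_sequence(figure, 10))
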
> >       3.- Ludovicus Curve
>
> >       In what follows, two parametric functions whose sole difference
> >     is the position of one letter produce very different complexities.
>
> >       Periodic Curve:
>
> >            X = T + SIN(5 * Y)
> >            Y = T - COS(2 * X)
>
> >       "PSEUDO-SINUSOID"
>
> >       Swap X and Y and you have the chaotic curve:
>
> >            X = T + SIN(5 * X)
> >            Y = T - COS(2 * Y)
>
> >       I call it: "LUDOVICUS' FUNCTION".
>
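A sketch of how both curves can be traced (Python; stepping T and
feeding the previously computed point back into the formulas is my own
reading of how they are meant to be evaluated):

    import math

    # Trace either curve by stepping T and reusing the previous point.
    def trace(swapped, steps=2000, dt=0.01):
        x = y = t = 0.0
        points = []
        for _ in range(steps):
            t += dt
            if not swapped:
                # Periodic "PSEUDO-SINUSOID": each coordinate uses the other one.
                x = t + math.sin(5 * y)
                y = t - math.cos(2 * x)
            else:
                # Chaotic "LUDOVICUS' FUNCTION": each coordinate feeds back on itself.
                x = t + math.sin(5 * x)
                y = t - math.cos(2 * y)
            points.append((x, y))
        return points

    print(trace(swapped=False)[-3:])
    print(trace(swapped=True)[-3:])
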
On 29 nov, 12:37, David C. Ullrich <dullr...(a)sprynet.com> wrote:
> On Sun, 29 Nov 2009 04:25:07 -0800 (PST), Ludovicus
> >
> Huh? Why in the world do you imagine that these two statements
> contradict each other?
>
> And where do you get the impression that "chaos" is the same
> as "great complexity"?
>

I did not say that chaos means absolute complexity, but that
a periodic sequence has a different complexity from a chaotic
sequence. And the decimal period of a rational has a different
complexity from the expansion of an irrational, notwithstanding
that they are produced by the same program with different data
of the same number of bits.

The kernel of the problem is that the same program with
new data of the same number of bits produces
very different sequences. With one input it produces a periodic
sequence; with the other, the non-periodic expansion of an irrational.
If they do not have different complexities, of what use is Chaitin's measure?
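
For instance (a minimal sketch of my point; the inputs 4 and 5 are
chosen only because they are written with the same number of bits):

    from math import isqrt

    # One program, two inputs of the same bit-length (4 = 100, 5 = 101):
    # the first k decimal digits of sqrt(n) after the decimal point.
    def sqrt_digits(n, k):
        # Integer arithmetic only; assumes n < 100, so sqrt(n) has one integer digit.
        return str(isqrt(n * 10 ** (2 * k)))[1:]

    print(sqrt_digits(4, 40))   # periodic: 000000...
    print(sqrt_digits(5, 40))   # non-periodic: 2360679...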

The 1200 pages of Wolfram's book "A New Kind of Science" contradict
Chaitin's thesis. In "Meta Math" Chaitin accepts that his concept
of complexity is utterly different from Wolfram's.

Why does no one discuss my examples, but only my wording?
Refute my examples.
Ludovicus
From: Timothy Murphy on
Ludovicus wrote:

> The 1200 pages of Wolfram's book "A New Kind of Science" contradict
> Chaitin's thesis. In "Meta Math" Chaitin accepts that his concept
> of complexity is utterly different from Wolfram's.

That is perfectly possible, but does not show Chaitin (or Wolfram) is wrong.
If you think the use of the word "complexity" causes confusion,
use a different term like "algorithmic entropy"
(which Chaitin often uses, I think).



--
Timothy Murphy
e-mail: gayleard /at/ eircom.net
tel: +353-86-2336090, +353-1-2842366
s-mail: School of Mathematics, Trinity College, Dublin 2, Ireland
From: master1729 on
Timothy Murphy and Ludovicus wrote :

> Ludovicus wrote:
>
> > The 1200 pages of Wolfram's book "A New Kind of Science" contradict
> > Chaitin's thesis. In "Meta Math" Chaitin accepts that his concept
> > of complexity is utterly different from Wolfram's.
>
> That is perfectly possible, but does not show Chaitin (or Wolfram) is wrong.
> If you think the use of the word "complexity" causes confusion,
> use a different term like "algorithmic entropy"
> (which Chaitin often uses, I think).

in fact wolfram and chaitin do not contradict, but co-exist.

one can transform wolfram complexity to chaitin complexity and vice versa.

tommy1729
From: Ludovicus on
On 1 dic, 08:18, master1729 <tommy1...(a)gmail.com> wrote:
> in fact wolfram and chaitin do not contradict, but co-exist.
>
> one can transform wolfram complexity to chaitin complexity and vice versa.
>
> tommy1729

As Chaitin said in "Meta Math": what Wolfram considers maximum
complexity is for me minimum complexity. He, as most of us,
acknowledges that randomness means maximum complexity. Wolfram
holds that one of his minimal programs produces pseudo-randomness
so complex that he utilizes it in "Mathematica".
Ludovicus.
From: Aatu Koskensilta on
Ludovicus <luiroto(a)yahoo.com> writes:

> G. CHAITIN's article 'Randomness and Mathematical Proof' in
> Scientific American (May 1975) asserts that:
> 'The complexity of a series of digits is the number of bits that
> must be put into a computing machine in order to obtain the
> original series as output. The complexity is therefore equal to the
> size in bits of the minimal program of the series.'
>
> This definition contradicts the accepted concept of complexity,
> for example Herbert Simon's:
> 'Given the properties of the parts and the laws of its
> interactions, it is not a trivial thing to infer the properties or
> the behavior of the system.'

It is obscure how this "accepted concept of complexity" applies to a
series of digits. It thus makes no immediate sense to claim Chaitin's
definition, or assertion, contradicts the "accepted concept of
complexity". Whether or not the notion of algorithmic complexity, or
randomness, of a string adequately captures this or that informal notion
of complexity also has no bearing on the purely mathematical side to
algorithmic information theory, where we find many interesting and
useful mathematical theorems, applications, notions. That a chaotic
sequence obtained by iterating some function may well have very low
algorithmic complexity in itself doesn't tell us anything about whether
various conceptions, questions, claims about complexity can be
understood and approached in terms of algorithmic information theory.

> Why? If a given N, 10^9 bits long, is extracted from the list
> of digits of pi or of √7, surely it can be reproduced by a program
> many times smaller!

That depends on N.
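
A short program does exist for the digits of pi as such; for
concreteness, a sketch along the lines of Gibbons' unbounded spigot
(not, of course, a minimal program):

    from itertools import islice

    # Gibbons' unbounded spigot: streams the decimal digits of pi one by one.
    def pi_digits():
        q, r, t, k, m, x = 1, 0, 1, 1, 3, 3
        while True:
            if 4 * q + r - t < m * t:
                yield m
                q, r, m = 10 * q, 10 * (r - m * t), (10 * (3 * q + r)) // t - 10 * m
            else:
                q, r, t, k, m, x = (q * k, (2 * q + r) * x, t * x, k + 1,
                                    (q * (7 * k + 2) + r * x) // (t * x), x + 2)

    print(''.join(str(d) for d in islice(pi_digits(), 30)))   # 314159265358979...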

--
Aatu Koskensilta (aatu.koskensilta(a)uta.fi)

"Wovon man nicht sprechan kann, dar�ber muss man schweigen"
- Ludwig Wittgenstein, Tractatus Logico-Philosophicus