From: Aatu Koskensilta on
master1729 <tommy1729(a)gmail.com> writes:

> 3) quite similar to Kolmogorov complexity (some argue it's equivalent
> and Chaitin 'had nothing', but that's another subject and not the
> objection of the OP)

I don't know of anyone who argues Chaitin "had nothing". Chaitin,
Solomonoff and Kolmogorov all independently arrived at essentially the
same notion of complexity. Beyond the basic definition, Chaitin's main
contribution to the field is his complexity-theoretic incompleteness
theorems.

> btw, kolmogorov complexity is one of the best.

One of the best what?

--
Aatu Koskensilta (aatu.koskensilta(a)uta.fi)

"Wovon man nicht sprechan kann, dar�ber muss man schweigen"
- Ludwig Wittgenstein, Tractatus Logico-Philosophicus
From: master1729 on
Luis A Rodriguez wrote :

> On 1 dic, 08:18, master1729 <tommy1...(a)gmail.com> wrote:
> > Timothy Murphy and Ludovicus wrote :
> >
> > > Ludovicus wrote:
> > >
> > > > The 1200 pages of Wolfram's book "A New Kind of Science" contradict
> > > > Chaitin's thesis. In "Meta Math" Chaitin accepts that his concept
> > > > of complexity is utterly different from Wolfram's.
> > >
> > > That is perfectly possible, but does not show Chaitin (or Wolfram)
> > > is wrong. If you think the use of the word "complexity" causes
> > > confusion, use a different term like "algorithmic entropy"
> > > (which Chaitin often uses, I think).
> > >
> > > --
> > > Timothy Murphy
> > > e-mail: gayleard /at/ eircom.net
> > > tel: +353-86-2336090, +353-1-2842366
> > > s-mail: School of Mathematics, Trinity College, Dublin 2, Ireland
> >
> > In fact Wolfram and Chaitin do not contradict, but co-exist.
> >
> > One can transform Wolfram complexity into Chaitin complexity and vice versa.
> >
> > tommy1729
>
> As Chaitin said in "Meta Math": what Wolfram considers maximum
> complexity is for me minimum complexity. He, like most of us,
> acknowledges that randomness means maximum complexity. Wolfram holds
> that one of his minimal programs produces pseudo-randomness so complex
> that he uses it in "Mathematica".
> Ludovicus.
>

Good. They are correct then, and understood each other('s work?).

That 'one of Wolfram's minimal programs which produces pseudo-randomness' must be rule 30 (the cellular automaton).
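
For concreteness, here is a rough Python sketch of rule 30 (my own illustration; the width, boundary handling and use of the centre column as the pseudo-random stream are assumptions, not Wolfram's actual code):

# Rule 30: each new cell is left XOR (centre OR right).
def rule30_step(cells):
    n = len(cells)
    return [cells[(i - 1) % n] ^ (cells[i] | cells[(i + 1) % n])
            for i in range(n)]

# Start from a single live cell and read off the centre column,
# which is the part reportedly used as a pseudo-random stream.
width = 101
row = [0] * width
row[width // 2] = 1
centre = []
for _ in range(50):
    centre.append(row[width // 2])
    row = rule30_step(row)
print(''.join(map(str, centre)))

Even after a few dozen steps the centre column looks quite irregular, which is the 'pseudo-randomness from a minimal program' being referred to.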

Since Luis A Rodriguez quotes Chaitin's reply, it seems he agrees, and then I wonder what he objects to in the OP, given his agreement or disagreement...

Btw, I read the book 'A New Kind of Science' by Wolfram.

regards

the master

tommy1729
From: master1729 on
Aatu Koskensilta wrote :

> master1729 <tommy1729(a)gmail.com> writes:
>
> > 3) quite similar to Kolmogorov complexity (some argue it's equivalent
> > and Chaitin 'had nothing', but that's another subject and not the
> > objection of the OP)
>
> I don't know of anyone who argues Chaitin "had nothing". Chaitin,
> Solomonoff and Kolmogorov all independently arrived at essentially the
> same notion of complexity. Beyond the basic definition, Chaitin's main
> contribution to the field is his complexity-theoretic incompleteness
> theorems.

True.

>
> > btw, kolmogorov complexity is one of the best.
>
> One of the best what?

Complexity definitions.

Because I find it relates best to randomness.

Kolmogorov complexity is important, imho, for testing the randomness of strings.
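
The complexity itself is uncomputable, so in practice one settles for upper bounds. A quick illustrative Python sketch (my own, using zlib compression as a crude stand-in for 'shortest program'):

import os
import zlib

def compressed_size(data):
    # Compressed size is an upper-bound proxy for Kolmogorov complexity.
    return len(zlib.compress(data, 9))

structured = b'01' * 5000           # very regular: compresses to almost nothing
random_like = os.urandom(10000)     # incompressible with overwhelming probability

print(compressed_size(structured))   # a tiny fraction of 10000
print(compressed_size(random_like))  # close to (or slightly above) 10000

A string that no program (here: no compressor) can shrink is exactly what counts as random in this sense.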


>
> --
> Aatu Koskensilta (aatu.koskensilta(a)uta.fi)
>
> "Wovon man nicht sprechan kann, darüber muss man
> schweigen"
> - Ludwig Wittgenstein, Tractatus
> s Logico-Philosophicus

regards

tommy1729
From: Ludovicus on
On 2 dic, 13:19, master1729 <tommy1...(a)gmail.com> wrote:
> Aatu Koskensilta wrote :
> Kolmogorov complexity is important, imho, for testing the randomness of strings.


Most people accept what Kolmogorov and Chaitin say about randomness:
"A random sequence has maximum complexity."
"The program that codes a random sequence cannot have a length
shorter than the sequence."

But building an incremental measure of complexity based on the length
of minimal programs is easily contradicted with counter-examples.

Suppose someone writes a program for the sequence of primes that is
100 bytes long.
Suppose A is an arbitrary constant 100 bytes long.
Then the arithmetic progression A·n + 1 needs a program of length
100 + code.
According to Chaitin, this arithmetic progression is more complex than
the sequence of primes!
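
For concreteness, a rough Python sketch of the comparison (my own illustration; the sizes are approximate and neither program is claimed to be minimal):

import os

# A short, naive primes program: well under 100 characters of source.
primes_src = (
    "n=2\n"
    "while True:\n"
    "    if all(n % d for d in range(2, n)):\n"
    "        print(n)\n"
    "    n += 1\n"
)

# An arbitrary 100-byte constant A; any program printing A*n + 1 must
# somehow contain those 100 bytes (about 240 decimal digits).
A = int.from_bytes(os.urandom(100), 'big')
progression_src = (
    "A=%d\n" % A +
    "n=0\n"
    "while True:\n"
    "    print(A*n + 1)\n"
    "    n += 1\n"
)

print(len(primes_src))        # well under 100 characters
print(len(progression_src))   # close to 300, dominated by the constant

Measured by program length, the progression A·n + 1 with an incompressible A does come out 'more complex' than the primes, which is the point of the counter-example above.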
Ludovicus


From: Ostap S. B. M. Bender Jr. on
On Nov 29, 6:41 am, Don Stockbauer <don.stockba...(a)gmail.com> wrote:
> On Nov 29, 8:26 am, Timothy Murphy <gayle...(a)eircom.net> wrote:
>
> > Ludovicus wrote:
> > > G. Chaitin's article 'Randomness and Mathematical Proof' in
> > > Scientific American (May 1975) asserts that:
> > > 'The complexity of a series of digits is the number of bits that
> > > must be put into a computing machine in order to obtain the
> > > original series as output. The complexity is therefore equal to
> > > the size in bits of the minimal program of the series.'
> > >
> > > This definition contradicts the accepted concept of complexity,
> > > for example Herbert Simon's:
> > > 'Given the properties of the parts and the laws of its interactions,
> > > it is not a trivial thing to infer the properties or the behavior
> > > of the system.'
> >
> > There are many different notions of "complexity".
> > Chaitin's definition makes perfect sense.
> > If you don't like the term "complexity",
> > call it algorithmic entropy, or informational content.
>
> You think you'll ever get a simple definition of complexity?
>

Why not? There are plenty of complex definitions of simplicity.