From: Ludovicus on
COMMENTARY ON CHAITIN'S BOOK "META MATH!"

This book is well written and makes for very interesting reading,
but I found many contradictions and inexactnesses.
I will give some examples:
Page 14:
"...it is easy to state clear, straightforward questions that nobody
knows how to answer, not even in a thousand years..." But on page 23 he
says: "If a mathematical statement is false there will be no proofs,
but if it is true, there will be an endless variety of proofs."
The latter contradicts Gödel's second theorem, and what is stated on
page 41: "This proves that uncomputability and incompleteness are
lurking right at the core."
Page 15:

"...Stephen Wolfram's "A New Kind of Science" gives many other
examples of simple rules that yield extremely complicated behavior."
But on page 141: "The main idea is to measure the complexity... by the
size in bits of the program for generating all..." And on page 147:
"To get more out, put more in!"

Each author can forge his own definition of complexity, but in
measuring it he must consider the cases where it is evident that one
situation is more complex than another.
His idea of measuring the complexity of a sequence of numbers by the
number of bits of the program that reproduces it is easily
contradicted by many examples.

1.- If a formula in a program produces a sequence of numbers, changing
the position or the value of a parameter can profoundly change the
character of the sequence without increasing the size in bits of the
program. Ex.: Beginning with X = 0.75, the iteration of X = 1.5X^2 - 1
produces a periodic sequence, but in X = 2X^2 - 1 the sequence
produced is utterly chaotic.
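Both maps can be compared numerically; the following is a minimal sketch (my own illustration, with the iteration count chosen arbitrarily):

```python
# Iterate two quadratic maps of identical structure and source length,
# starting from the same seed X0 = 0.75.
def orbit(c, x0, n):
    """Return the first n iterates of x -> c*x^2 - 1."""
    xs = [x0]
    for _ in range(n - 1):
        xs.append(c * xs[-1] ** 2 - 1)
    return xs

tame = orbit(1.5, 0.75, 50)   # claimed above to be periodic (disputed below)
wild = orbit(2.0, 0.75, 50)   # x -> 2x^2 - 1 is a well-known chaotic map

# Both orbits remain inside [-1, 1], yet their long-run behavior
# differs drastically for programs of the same length.
print(tame[-4:])
print(wild[-4:])
```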

2.- It's evident that the sequence of primes is more complex than the
sequence of values of an arithmetic progression. But in an AP an + b,
where 'a' has k1 bits and 'b' has k2 bits, it can occur that
k1 + k2 > m (m being the size of a program that produces primes
indefinitely).
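The contrast can be made concrete with a sketch (the 100-digit constants below are illustrative assumptions of mine, not the specific programs discussed in the thread):

```python
from itertools import islice

# A very short program that yields the primes indefinitely
# (naive trial division).
def primes():
    n = 2
    while True:
        if all(n % d for d in range(2, int(n ** 0.5) + 1)):
            yield n
        n += 1

# An arithmetic progression a*n + b whose constants are 100-digit
# numbers.  A generic 100-digit constant needs about 333 bits just to
# write down, so a program carrying two of them can easily exceed the
# size of the prime generator above.
a = 10 ** 99 + 7   # stands in for an arbitrary 100-digit coefficient
b = 10 ** 99 + 9   # stands in for an arbitrary 100-digit offset
def ap(n):
    return a * n + b

print(list(islice(primes(), 10)))
```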

3.- A language of computation is a program and has a definite length
in bits, but it can produce sequences of any complexity.

4.- In Conway's Game of Life, changing the position of a 'critter' can
produce an unexpectedly large number of figures.
But behold! A definite structure of 'critters' can simulate a
Universal Turing Machine; that is, it can simulate any language of
computation.
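The Life rule itself takes only a few lines; here is a minimal sketch (the sparse-set representation and the 'blinker' figure are my own choices):

```python
from collections import Counter

# One generation of Conway's Game of Life on a sparse set of live cells.
def step(live):
    # Count the live neighbors of every cell adjacent to a live cell.
    counts = Counter(
        (x + dx, y + dy)
        for (x, y) in live
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # A cell is alive next generation with 3 neighbors, or 2 if already alive.
    return {c for c, n in counts.items() if n == 3 or (n == 2 and c in live)}

# A "blinker": three cells in a row, oscillating with period 2.
blinker = {(0, 1), (1, 1), (2, 1)}
print(step(blinker))        # the row turned vertical
print(step(step(blinker)))  # back to the original horizontal row
```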

Ludovicus
From: Robert Israel on
Ludovicus <luiroto(a)yahoo.com> writes:

> Each author can forge his own definition of complexity, but in
> measuring it he must consider the cases where it is evident that one
> situation is more complex than another.
> His idea of measuring the complexity of a sequence of numbers by the
> number of bits of the program that reproduces it is easily
> contradicted by many examples.
>
> 1.- If a formula in a program produces a sequence of numbers,
> changing the position or the value of a parameter can profoundly
> change the character of the sequence without increasing the size in
> bits of the program. Ex.: Beginning with X = 0.75, the iteration of
> X = 1.5X^2 - 1 produces a periodic sequence,

No. If x_0 is rational with denominator 2^m, x_n has denominator
2^((m+1) 2^n - 1). Perhaps you mean the sequence approaches a 4-cycle.
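The denominator formula is easy to confirm with exact rational arithmetic; a quick sketch (my own check, assuming x_0 = 3/4, i.e. m = 2):

```python
from fractions import Fraction

# Iterate x -> (3/2)*x^2 - 1 exactly, starting from x0 = 3/4
# (denominator 2^m with m = 2), and check that x_n has denominator
# 2^((m+1)*2^n - 1), as stated above.
x, m = Fraction(3, 4), 2
for n in range(1, 8):
    x = Fraction(3, 2) * x * x - 1
    assert x.denominator == 2 ** ((m + 1) * 2 ** n - 1)
print("denominator formula holds for n = 1..7")
```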

> but in X = 2X^2 - 1 the
> sequence produced is utterly chaotic.

What makes you think that (from the algorithmic point of view) a
chaotic sequence is more complex than a converging one?

> 2.- It's evident that the sequence of primes is more complex than the
> sequence of values of an arithmetic progression. But in an AP an + b,
> where 'a' has k1 bits and 'b' has k2 bits, it can occur that
> k1 + k2 > m (m being the size of a program that produces primes
> indefinitely).

This tells me that your "evident" statement is false.

> 3.- A language of computation is a program and has a definite length
> in bits, but it can produce sequences of any complexity.

In the case of a program that has inputs, you need to include the size of the
input together with the size of the program itself.
--
Robert Israel israel(a)math.MyUniversitysInitials.ca
Department of Mathematics http://www.math.ubc.ca/~israel
University of British Columbia Vancouver, BC, Canada
From: Ludovicus on
On May 27, 2:56 pm, Robert Israel
<isr...(a)math.MyUniversitysInitials.ca> wrote:
> Ludovicus <luir...(a)yahoo.com> writes:

> > 1.- If a formula in a program produces a sequence of numbers,
> > changing the position or the value of a parameter can profoundly
> > change the character of the sequence without increasing the size in
> > bits of the program. Ex.: Beginning with X = 0.75, the iteration of
> > X = 1.5X^2 - 1 produces a periodic sequence,
>
> No.  If x_0 is rational with denominator 2^m, x_n has denominator
> 2^((m+1) 2^n - 1).  Perhaps you mean the sequence approaches a 4-cycle.

My writing was garbled by Google. Above is the corrected wording.
L.
> > but in X = 2X^2 - 1 with X0 = 0.75 the sequence produced is
> > utterly chaotic.

> What makes you think that (from the algorithmic point of view) a
> chaotic sequence is more complex than a converging one?

Precisely, that is my thesis. Two algorithms of the same structure and
length can produce very different outputs.
L.



> > 2.- It's evident that the sequence of primes is more complex than
> > the sequence of values of an arithmetic progression. But in an AP
> > an + b, where 'a' has k1 bits and 'b' has k2 bits, it can occur that
> > k1 + k2 > m (m being the size of a program that produces primes
> > indefinitely).
>
> This tells me that your "evident" statement is false.

Why? I wrote a program 167 bytes long that produces primes
indefinitely. But a program for an AP with an 'a' of 100 digits and a
'b' of 100 digits is longer.
L.
>
> > 3.- A language of computation is a program and has a definite
> > length in bits, but it can produce sequences of any complexity.
>
> In the case of a program that has inputs, you need to include the size of the
> input together with the size of the program itself.

Precisely. That was the case with the large inputs a and b in my AP,
and that is the case with the Game of Life.
Ludovicus
> --
> Robert Israel              isr...(a)math.MyUniversitysInitials.ca
> Department of Mathematics        http://www.math.ubc.ca/~israel
> University of British Columbia            Vancouver, BC, Canada

From: Timothy Murphy on
Ludovicus wrote:

> Each author can forge his own definition of complexity, but in
> measuring it he must consider the cases where it is evident that one
> situation is more complex than another.
> His idea of measuring the complexity of a sequence of numbers by the
> number of bits of the program that reproduces it is easily
> contradicted by many examples.

How can a definition like this be "contradicted"?
You may have a different definition of complexity,
or a different idea of what complexity should mean,
but that's life.
If you feel it is confusing, why not call it "Chaitin complexity"?


--
Timothy Murphy
e-mail: gayleard /at/ eircom.net
tel: +353-86-2336090, +353-1-2842366
s-mail: School of Mathematics, Trinity College, Dublin 2, Ireland
From: Ludovicus on
On 27 mayo, 20:44, Timothy Murphy <gayle...(a)eircom.net> wrote:
> How can a definition like this be "contradicted"?
> You may have a different definition of complexity,
> or a different idea of what complexity should mean,
> but that's life.
> If you feel it is confusing, why not call it "Chaitin complexity"?
>
Thanks, I agree. Henceforth I will make the distinction.
I find Chaitin complexity damaging because it corrupts the lexicon of
most languages. It conveys the notion that, in a system, the number of
parts is more important than the interactions between them. It
sustains the falsehood that feedback loops do not augment complexity
because they consume few bits. It makes ridiculous the labor of 2500
years of struggling to resolve the problems of the prime numbers: a
sequence of such low complexity. And it turns the 1100 pages of
Wolfram's book "A New Kind of Science" into a labor of lost time.
Ludovicus