From: spudnik on
darn; I thought, from the header,
you were using a multiplier of 7 ... and that
made me realize, the professors who do that,
are subverting the "big Oh" and "little oh" formalism.

that partition of the triplet is so important,
because of Brun's constant!

> for x,y > 7, twins(x+y) <= twins(x) + twins(y)
> where twins is the prime twins counting function,
> where 3,5,7 is considered as 2 twins.

thus:
just because it was British,
I'd assume that the folks at E.Anglia did this, on purpose.

"global" warming is almost & assiduously all computerized simulacra,
and extremely limited reporting, about glaciers e.g.

> >http://www.abc.net.au/unleashed/stories/s2868937.htm

thus:
to recap my reply to the TEDdies comments (as I am still
listening to B.Greene's pop-sci talk ... zzzz),
first of all,
Minkowski made a silly slogan about ordinary phase-space,
then he died. thank you!
> http://www.ted.com/talks/lang/eng/brian_greene_on_string_theory.html

thus:
they were just at the library auditorium,
selling the electromags to cure depression.... beats the heck
out of electroconvulsing, but I missed the refreshments!

thus:
I didn't get the gist of the CBS reportage, although it seemed
to be literate & wikipediaized (yeeha .-)
seemed like "more decimal points," although
there was a (wikip.) bibliographic note referring to Dicke --
I think it was his paper that Einstein saw on one
of his rare visits to his Caltech office, and pooh-poohed,
regarding the predominant redshifting of the heavens.

thus:
and, if at the center of the Sun there is an iron core,
the theory might have to be revised (don't laugh;
not only was this a mainstream theory at one time,
it may not have been laid to rest (in current research)).

thus:
Rob, you uneducated, global-warmed-over bog-creature --
did you create any oil, today?...
seriously, that was amusing about the cancellation-of-submission.
reminds me
of the time that Popular Science made an on-the-wayside attack
upon S. Fred Singer; at the time they were owned by Times-Mirror,
the then-owner of the LAtribcoTimes. the article was nominally and
visually an aggrandizement of three professors (and that could
have included one of my own, at UCLA) of a theory about climate,
which had been celebrated already (I think) with a Nobel.
they included a mug-shot of the good doctor,
along with no mention of his vitae; alas!

thus:
the Skeptics were a Greek cult in the Roman Pantheon,
along with the Peripatetics, the Gnostics, the Solipsists etc.
ad vomitorium; as long as the Emperor was the Top doG,
you were left to your beliefs ('til, of course,
Jesus -- after Christianity became the state church).

thus:
virtually all of "global" warming -- strictly a misnomer, along
with Arrhenius' 1896 "glasshouse gasses," except to first order --
is computerized simulacra & very selective reporting, although
a lot of the latter is just a generic lack of data (that is,
historical data for almost all glaciers -- not near civilization).
I say, from the few that I casually *am* familiar with,
that *no* database shows "overall" warming --
not that the climate is not changing, rapidly,
in the Anthropocene.

thus:
instead, we should blame Pascal for discovering,
experimentally, his "plenum," which he thought was perfect. I mean,
it's always good to have a French v. English dichotomy,
with a German thrown in for "triality."
> of Newton's "action at a distance" of gravity,
> via the re-adumbration of his dead-as-
> a-doornail-or-Schroedinger's-cat corpuscle,
> "the photon." well, and/or "the aether,"
> necessitated by "the vacuum."

--Light: A History!
http://21stcenturysciencetech.com

--NASCAR rules on rotary engines!
http://white-smoke.wetpaint.com
From: Transfer Principle on
On Apr 14, 9:17 am, master1729 <tommy1...(a)gmail.com> wrote:
> master1729 - Littlewood conjecture
> for x,y > 7
> twins(x+y) <= twins(x) + twins(y)
> where twins is the prime twins counting function where 3,5,7 is considered as 2 twins.

In another thread, some standard theorists already explained
why the "master-Littlewood conjecture" is likely false. In
fact, Bau wrote yet another heuristic explaining why most
standard theorists disbelieve this conjecture.

I think that I have yet another analogy explaining the
heuristic that argues against the conjecture.

Let us define a sequence a_n as follows:

a_0 = 2
a_1 = 3
a_2 = 5
a_3 = 7
a_4 = 11

So the sequence starts out looking like the primes. But
for all subsequent values, we shall flip a coin. Then:

a_(n+1) = a_n + 2, if the coin lands heads
= 2a_n, if the coin lands tails.

Let me simulate some coin flips right here:

THTTTTHTHTHTTHHTHHHTHTTHHT

Based on these coin flips, we have:

a_5 = 22
a_6 = 24
a_7 = 48
a_8 = 96
a_9 = 192
a_10 = 384
a_11 = 386
a_12 = 772
a_13 = 774
a_14 = 1548
a_15 = 1550
a_16 = 3100
a_17 = 6200
a_18 = 6202

This sequence obviously increases exponentially, far faster than
the primes do. Indeed, since the coin lands tails with
probability 1/2, and we double after every tail, we see that
a_n is approximately 2^(n/2).
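
Here is a quick Python sketch of this (my own toy code, nothing more);
the names and the seed 1729 are just arbitrary choices so the run is
repeatable, and of course the values won't match my hand flips above:

import random

def coin_sequence(num_terms, seed=None):
    # start like the primes: 2, 3, 5, 7, 11, then flip a coin:
    # heads -> add 2 to the last term, tails -> double it
    rng = random.Random(seed)
    a = [2, 3, 5, 7, 11]
    while len(a) < num_terms:
        if rng.random() < 0.5:          # heads
            a.append(a[-1] + 2)
        else:                           # tails
            a.append(2 * a[-1])
    return a

def f_a(a, m):
    # counting function: how many terms of the sequence are <= m
    return sum(1 for term in a if term <= m)

a = coin_sequence(60, seed=1729)
print(a[:19])        # doubles roughly every other step, i.e. about 2^(n/2)
print(f_a(a, 10))    # 4, since only 2, 3, 5, 7 are <= 10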

So it appears that we have a no-brainer:

The Transfer Principle-Littlewood Conjecture:
If given the sequence {a_n} above, we define a function:

f_a(m) =def card({ n in N | a_n <= m })

then the conjecture states that:

f_a(x+y) <= f_a(x) + f_a(y)

Since a_n is approximately 2^(n/2), we have that f_a(m) is
approximately 2log_2(m) (also written as 2lg(m)).

But despite this, the Transfer Principle-Littlewood
Conjecture is most likely false. For suppose after finding
a_n for some n, we flipped heads five times in a row. Then
we would have:

a_(n+1) = a_n + 2
a_(n+2) = a_n + 4
a_(n+3) = a_n + 6
a_(n+4) = a_n + 8
a_(n+5) = a_n + 10

But if we let x=a_n and y=10 in the conjecture, then:

f_a(a_n + 10) <= f_a(a_n) + f_a(10)
(n+6) <= (n+1) + 4

or 6 <= 5, a blatant contradiction. So we must conclude that
as soon as we flip five heads in a row, the conjecture
becomes false.
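
Continuing the sketch above (it reuses coin_sequence and f_a from it),
here is how one could hunt for exactly this failure -- five consecutive
+2 steps -- and test the inequality directly:

# assumes coin_sequence and f_a from the previous sketch

def find_five_heads_counterexample(a):
    # look for five consecutive +2 steps (five straight heads),
    # then test the conjecture with x = a_n, y = 10
    for n in range(len(a) - 5):
        if all(a[n + i + 1] == a[n + i] + 2 for i in range(5)):
            x, y = a[n], 10
            lhs = f_a(a, x + y)             # counts a_0, ..., a_(n+5): n+6 terms
            rhs = f_a(a, x) + f_a(a, y)     # (n+1) + 4 = n+5 terms
            return x, y, lhs, rhs
    return None

while True:
    a = coin_sequence(200)                  # keep flipping until five heads show up
    hit = find_five_heads_counterexample(a)
    if hit:
        x, y, lhs, rhs = hit
        print(x, y, lhs, rhs)               # lhs exceeds rhs by exactly 1
        break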

But will we ever flip five straight heads? Well, after the
sequence above, I eventually flipped five straight heads
to give me the values:

a_53 = 813681752
a_54 = 813681754
a_55 = 813681756
a_56 = 813681758
a_57 = 813681760
a_58 = 813681762

Thus x=813681752, y=10 is a counterexample to the conjecture.

Of course, if we were to flip tails forever, or at least
avoid flipping four straight heads, then our conjecture would
be true. But this is extremely unlikely -- indeed, we will
eventually flip five straight heads with probability _1_.
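
A throwaway simulation (again, just my own illustration) makes that
probability-1 claim plausible: even in a mere 500 flips, a run of five
heads shows up almost every time.

import random

def has_heads_run(flips, run=5):
    streak = 0
    for f in flips:
        streak = streak + 1 if f == 'H' else 0
        if streak >= run:
            return True
    return False

trials = 10_000
hits = sum(has_heads_run(random.choice('HT') for _ in range(500))
           for _ in range(trials))
print(hits / trials)    # typically prints 0.999 or so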

Thus, the Transfer Principle-Littlewood Conjecture -- as well
as the master1729-Littlewood Conjecture -- is akin to stating
that we'll never flip five (or n for some large n) straight
heads since each head has probability 1/2 (so n straight heads
would have probability 1/2^n, a small number), but Bau's
counterexample is akin to stating that if we flipped a coin
_infinitely_ many times, then we'll eventually flip five (or n)
straight heads with probability 1.

In particular, it may be unlikely that all 48 of the numbers
that Bau listed (n, n+2, n+6, n+8, etc.) are prime, perhaps as
unlikely as flipping 48 straight heads, but just as we'll flip
48 straight heads eventually with probability 1, all 48 of the
numbers will be prime for some n, with probability 1.

Once again, I don't believe that tommy1729 should be ridiculed
or called a "crank" for just stating his conjecture (unless one
is prepared to call Hardy and Littlewood "cranks" as well). I
had to reread Bau's post several times until I understood why
conjectures of this type are probably false.
From: Transfer Principle on
On Apr 14, 9:17 am, master1729 <tommy1...(a)gmail.com> wrote:
> 1729

Actually, since I'm here in a tommy1729 thread, I might as
well investigate the link he mentioned in a previous post:

> http://sites.google.com/site/tommy1729/

At first I expected tommy1729 to explain more about the
master 1729-Littlewood conjecture on that site, but instead
I found the page called "debunking nonmeasurable set."

Now we already know that tommy1729 is an opponent of AC,
and AC is used in the usual ZFC proof of the existence of
non-Lebesgue measurable sets (so that the proof would fail
in a theory such as ZF+~AC).

But on the page, I noticed what was written at the bottom:

> x = 1/n lim n -> oo
> Notice x IS NOT 0 , but an infinitesimal.

And of course, we immediately see what's going on here. In
standard analysis, there are no nonzero infinitesimals, and
lim n->oo (1/n) is exactly zero in standard analysis.

I actually noticed this myself years ago. (I don't recall
whether I ever mentioned this in any sci.math post.) The
usual proof of a nonmeasurable set involves taking the unit
interval and partitioning it into countably many sets, each
of which would have the same measure. But in the (extended)
standard reals, there is no real number r such that oo * r
equals one (the measure of the unit interval). So we must
say that these sets are nonmeasurable.
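
(Spelled out, the standard argument I'm referring to runs like this:
take a Vitali set V in [0,1), one representative from each coset of Q,
and for each rational q in [0,1) let V_q be V translated by q mod 1.
These countably many sets partition [0,1), and translation invariance
(mod 1) says they all get the same measure, call it r. Countable
additivity then demands

    1 = \mu([0,1)) = \sum_q \mu(V_q) = r + r + r + ...

and no standard real r works: r = 0 gives 0, and r > 0 gives oo. That
is exactly the "oo * r = 1" obstruction, and it's why ZFC just declares
V nonmeasurable.)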

But I often wonder, if we were allowed to assign nonstandard
_infinitesimals_ as measures, then perhaps it may be possible
to assign one to the measure of a Vitali set. This is related
to another common so-called "crank" idea -- why should
individual points have measure zero? Why can't we assign yet
another infinitesimal (smaller than the infinitesimal for
Vitali sets, of course) as the measure of each point?

This idea is intuitive to many "cranks." The only set with
measure zero would be the empty set. The measure would be
completely additive -- that is, _uncountably_ additive as
well as countably additive. Indeed, "cranks" often combine
this with a nonstandard cardinality, so that the measure of
a set would equal the product of its cardinality and some unit
infinitesimal, the measure of a single point. (The measure of
any set should be proportional to its cardinality.)

Both RF and TO -- the infinitesimal "cranks" -- have proposed
such a theory. MR, another infinitesimalist, has also looked
at something similar. But of course, the standard theorists
refuse to consider such ideas at all.

Of course, probability theory would also work differently in
such a theory, since probability is based on measure. For
example, the probability I mentioned in my last post, of
avoiding five straight heads when flipping a coin infinitely
many times, would no longer be 1, but 1-(31/32)^oo, which is
infinitesimally short of unity.

And so I'd love to discover a theory in which the intuitions
of at least four "cranks" (RF, TO, MR, and tommy1729) would
all be satisfied.

Now some standard theorists might be wondering why I'm
bending over backwards to accommodate tommy1729's idea of
infinitesimal measure, right after attacking him for his
insistence that his Littlewood conjecture is true. The
answer is that I don't find a theory which redefines
"prime," "twin prime," etc., to be as interesting as one in
which sets can have nonzero infinitesimal measure.

(The closest I can come to a theory in which tommy1729's
conjecture is true is an ultrafinitist theory. For example,
according to AP, 10^500 is the largest natural number. So
according to AP, if a conjecture is true for all naturals
at most 10^500, then it's true for all naturals. Since the
estimated size of the counterexample mentioned in Bau's
post is somewhat larger than 10^500, tommy1729's conjecture
is probably true for AP-naturals.)
From: Jesse F. Hughes on
Transfer Principle <lwalke3(a)lausd.net> writes:

> Both RF and TO -- the infinitesimal "cranks" -- have proposed
> such a theory. MR, another infinitesimalist, has also looked
> at something similar. But of course, the standard theorists
> refuse to consider such ideas at all.

Neither Ross nor Tony has proposed any theory at all --- aside from
Ross's nonsense about the wonders of the empty theory. Similarly,
Mitch has not "looked at" anything at all, but merely occasionally
makes pronouncements about how things "really" are near zero.

On the other hand, the so-called standard theorists do not "object" to
non-standard analysis. Perhaps the bulk of them choose not to work in
NSA, but this is surely not what you mean when you write that they
"refuse to consider" certain ideas.

Why not give up the play-by-play analysis, since you have no talent
for it? Just go ahead and suggest an interesting theory that you
think approaches the ideas of Ross, Tony or Mitch. I don't promise
that it'll get a whole lot of attention -- most mathematicians are not
looking for unusual theories -- but I do wager that, assuming it's a
formalizable first order theory, no one will object to it in the
manner you pretend.

--
Jesse F. Hughes

"You people are the diminishment of a world."
-- James S. Harris, to mathematicians.
From: master1729 on
10^500.

if we want to know the primes in an interval [10^500,10^500 + q]

we need to sieve out all primes up to sqrt(10^500 + q).

however the 'counterarguments' only sieve out the primes up to about sqrt(q).

that's their big mistake.
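
a toy segmented sieve in python of what i mean (my own sketch, with
small numbers of course -- nobody sieves anywhere near 10^500) just to
show that the base primes must run up to sqrt(N + q), not sqrt(q):

from math import isqrt

def primes_up_to(limit):
    # ordinary sieve of eratosthenes for the base primes
    sieve = [True] * (limit + 1)
    sieve[0:2] = [False, False]
    for p in range(2, isqrt(limit) + 1):
        if sieve[p]:
            sieve[p*p::p] = [False] * len(sieve[p*p::p])
    return [p for p, is_p in enumerate(sieve) if is_p]

def primes_in_interval(N, q):
    base = primes_up_to(isqrt(N + q))        # up to sqrt(N + q), not sqrt(q)
    segment = [True] * (q + 1)               # segment[i] stands for N + i
    for p in base:
        start = max(p * p, ((N + p - 1) // p) * p)   # first multiple of p >= N
        for m in range(start, N + q + 1, p):
            segment[m - N] = False
    return [N + i for i, ok in enumerate(segment) if ok and N + i > 1]

print(primes_in_interval(10**6, 100))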

surely, lwalke, you must have noticed that??

tommy1729

the master