From: Patricia Shanahan on
Joshua Cranmer wrote:
> On 01/01/2010 10:00 PM, Arne Vajhøj wrote:
>> There is no really good solution. They made a mistake 15 years
>> ago in the design and did not make array indexes long.
>
> 15 years ago, I don't think computers existed that could hold enough
> memory to make a 63-bit index anything more than redundant. IIRC, it
> wasn't until the beginning of this decade or so that arrays grew large
> enough to uncover a bug in their binary search implementation. So, 15
> years ago, it wasn't a mistake.
>
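(For the record, the bug uncovered in large arrays was integer overflow in
the midpoint computation of Arrays.binarySearch: `(low + high) / 2` goes
negative once `low + high` exceeds Integer.MAX_VALUE, which needs an array of
roughly 2^30 elements. A minimal sketch of the broken and fixed forms; the
method names here are illustrative, not from the JDK source:)

```java
public class MidpointOverflow {
    // Broken midpoint: low + high can exceed Integer.MAX_VALUE
    // for indexes around 2^30, wrapping to a negative value.
    static int brokenMid(int low, int high) {
        return (low + high) / 2;
    }

    // Fixed midpoint: the unsigned shift treats the wrapped sum
    // as an unsigned 32-bit value, preserving the carry bit.
    static int fixedMid(int low, int high) {
        return (low + high) >>> 1;
    }

    public static void main(String[] args) {
        int low = 1 << 30, high = (1 << 30) + 2;
        System.out.println(brokenMid(low, high)); // negative: overflowed
        System.out.println(fixedMid(low, high));  // 1073741825, the true midpoint
    }
}
```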

On the other hand, Sun was already building computers with more than 4
GB total memory. *Every* increase in hardware address width has been
rapidly followed by demand for data structures that need the new width.

Patricia
From: Arne Vajhøj on
On 01-01-2010 23:58, markspace wrote:
> Arne Vajhøj wrote:
>>> Map<BigInteger,something> ....
>>
>> You don't think that uses an array as backing storage?
>
> Well, I'm assuming he's not storing every single integer, just some of
> them. Maybe only primes, since non-primes could be determined by being
> absent from the list.

Usually sieve means all, but maybe you are right.

Arne

From: markspace on
Arne Vajhøj wrote:
> On 01-01-2010 23:58, markspace wrote:
>> Arne Vajhøj wrote:
>>>> Map<BigInteger,something> ....
>>>
>>> You don't think that uses an array as backing storage?
>>
>> Well, I'm assuming he's not storing every single integer, just some of
>> them. Maybe only primes, since non-primes could be determined by being
>> absent from the list.
>
> Usually sieve means all, but maybe you are right.


I don't think so, this time. I built a small test program, and the
performance was horrible. I do wonder whether it was technically a sieve at all.

For posterity, here's a lousy way to sieve primes.



public static void sieve1()
{
    BigInteger num = BigInteger.ZERO;
    final BigInteger LIMIT = BigInteger.ONE.shiftLeft( 16 );
    List<BigInteger> primes = new LinkedList<BigInteger>();
    int loopCount = 0;
    long startTime = System.nanoTime();
    mainloop:
    while( num.compareTo( LIMIT ) < 0 )
    {
        if( ++loopCount % 1000 == 0 )
        {
            System.out.println( primes.size() + " primes @ " +
                ((System.nanoTime() - startTime) / 1000000) / 1000.0f +
                "s" );
        }
        num = num.nextProbablePrime();
        for( BigInteger prime : primes )
        {
            if( num.remainder( prime ).equals( BigInteger.ZERO ) )
            {
                continue mainloop;
            }
        }
        primes.add( num );
    }
    System.out.println( primes.size() );
}
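For comparison, a conventional Sieve of Eratosthenes over a plain bit array
avoids BigInteger and trial division entirely below a fixed limit. A minimal,
untuned sketch (the class and method names are mine, not from the post above):

```java
import java.util.BitSet;

public class Sieve {
    // Classic Sieve of Eratosthenes: mark every composite below 'limit'
    // by striding through multiples of each prime found.
    static int countPrimes(int limit) {
        BitSet composite = new BitSet(limit);
        for (int p = 2; (long) p * p < limit; p++) {
            if (!composite.get(p)) {
                for (int m = p * p; m < limit; m += p) {
                    composite.set(m);
                }
            }
        }
        int count = 0;
        for (int n = 2; n < limit; n++) {
            if (!composite.get(n)) count++;
        }
        return count;
    }

    public static void main(String[] args) {
        // Same 2^16 limit as sieve1() above.
        System.out.println(countPrimes(1 << 16)); // 6542 primes below 65536
    }
}
```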
From: Thomas Pornin on
According to Joshua Cranmer <Pidgeot18(a)verizon.invalid>:
> 15 years ago, I don't think computers existed that could hold enough
> memory to make a 63-bit index anything more than redundant.

Cray was building machines with more than 4 GB of RAM in the 80's, I
think.

The design flaw here is not really to have used 'int' instead of 'long';
rather, it is to have defined Java with fixed-length modular integers as
primary integer types. A more forward-looking design would have used
"big integers" as plain int (there is an implied overhead, but it is
much lower than usually expected, when implemented properly -- i.e. not
as a plain BigInteger class). Some programming languages do that, and
some of them were already doing it in the early 90's.
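The distinction is easy to demonstrate in Java itself: the primitive int is
modular arithmetic over 2^32 and silently wraps, while BigInteger never
overflows. (A minimal illustration of the two semantics, not the tagged
overlay representation described above:)

```java
import java.math.BigInteger;

public class ModularWrap {
    public static void main(String[] args) {
        // Primitive int arithmetic is modular: this wraps around
        // to Integer.MIN_VALUE, a negative number.
        int a = Integer.MAX_VALUE;
        System.out.println(a + 1); // -2147483648

        // BigInteger arithmetic is unbounded: no wraparound.
        BigInteger b = BigInteger.valueOf(Integer.MAX_VALUE);
        System.out.println(b.add(BigInteger.ONE)); // 2147483648
    }
}
```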

In that respect, Java tried real hard to look non-revolutionary, so as
to gain acceptance among "traditional" C and C++ programmers (as a
whole, programmers are a very conservative bunch). The mere idea of
16-bit characters was already triggering much painful shrieking; with
unlimited integers, I guess that Java would have been rejected as
"research toy". Therefore, I cannot call that design decision a
"mistake". Strategically, Sun had little choice here. However, it was
already known at that time that it would bite them one day.


(I personally recall having discussed index size with a friend in 1994,
and our conclusion was that even 64 bits were not enough for the long
term. Mark my words.)


--Thomas Pornin
From: Maarten Bodewes on
Arne Vajhøj wrote:

> BTW, even long would be too small for indexes if Java
> is still in use after 2074, but somehow I doubt that will
> be the case. And besides, we do not have the verylong datatype
> yet.
>
You are expecting memory sizes of 9,223,372,036,854,775,807 bytes????

That's 9,223 petabytes. Hmm, weird, it may happen. But it is certainly
rather large.
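That figure is simply Long.MAX_VALUE, and the petabyte conversion checks out
(using decimal petabytes, 10^15 bytes):

```java
public class LongIndexLimit {
    public static void main(String[] args) {
        long max = Long.MAX_VALUE; // 9,223,372,036,854,775,807
        System.out.println(max);
        // In decimal petabytes (10^15 bytes): about 9,223 PB.
        System.out.println(max / 1_000_000_000_000_000L); // 9223
    }
}
```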

Maarten