From: Lew on
Lew wrote:
>> From the JLS, which I strongly urge you to study:

Arne Vajhøj wrote:
> Unless the poster has solid programming experience,
> the JLS may not be the best thing to study.
>
> Sure, it is by definition correct, but it is written
> to be detailed and correct, not to be easy to read.

Well, boo-hoo-hoo, programming is hard! Waaaahhh!

--
Lew
From: Wayne on
On 7/8/2010 5:35 PM, Boris Punk wrote:
> Integer.MAX_VALUE = 2147483647
>
> I might need more items than that. I probably won't, but it's nice to have
> extensibility.

To me, it is unlikely your system will run well if this one data structure
consumes 2G of memory. (You didn't really state the application or system;
certainly there are exceptions to the rule.) I would suggest you use a
more flexible system, where you keep the data on storage (disk) and use
memory as a cache. Perhaps an ArrayList of soft references would work well.
It might even be possible in your particular case to run a daemon thread
that pre-fetches items into the cache.
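
Something like this, very roughly (SoftCache and loadFromDisk are made-up
names standing in for whatever storage access you really have):

import java.lang.ref.SoftReference;
import java.util.ArrayList;

// Sketch of an ArrayList of soft references used as a cache in front of
// slower storage; the GC is free to reclaim entries under memory pressure.
public abstract class SoftCache<T> {
    private final ArrayList<SoftReference<T>> cache =
            new ArrayList<SoftReference<T>>();

    protected abstract T loadFromDisk(int index);   // your real storage access

    public T get(int index) {
        while (cache.size() <= index) {
            cache.add(null);                         // grow lazily
        }
        SoftReference<T> ref = cache.get(index);
        T item = (ref == null) ? null : ref.get();
        if (item == null) {                          // never loaded, or reclaimed
            item = loadFromDisk(index);
            cache.set(index, new SoftReference<T>(item));
        }
        return item;
    }
}

A pre-fetching daemon thread would simply call get() a little ahead of
the reader.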

Keep in mind a modern general-purpose computer will use virtual memory,
typically with 4kiB pages. Any data structure larger than that will
likely end up swapped to disk anyway. If you need the semantics of
a "BigList", try a custom class, a List of <pagesize> lists with
appropriate set and get methods to access the items.
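
A bare-bones sketch of that idea (the class name, page size, and method
set are made up here, just to show the shape):

import java.util.ArrayList;

// Sketch of a "BigList": an ArrayList of fixed-size pages, addressed by
// a long index.  Not a java.util.List, and not thread-safe.
public class BigList<T> {
    private static final int PAGE_SIZE = 4096;
    private final ArrayList<ArrayList<T>> pages = new ArrayList<ArrayList<T>>();

    public void add(T value) {
        ArrayList<T> last =
                pages.isEmpty() ? null : pages.get(pages.size() - 1);
        if (last == null || last.size() == PAGE_SIZE) {
            last = new ArrayList<T>(PAGE_SIZE);      // start a new page
            pages.add(last);
        }
        last.add(value);
    }

    public T get(long index) {
        return pages.get((int) (index / PAGE_SIZE))
                    .get((int) (index % PAGE_SIZE));
    }

    public void set(long index, T value) {
        pages.get((int) (index / PAGE_SIZE))
             .set((int) (index % PAGE_SIZE), value);
    }

    public long size() {
        return pages.isEmpty() ? 0L
                : (long) (pages.size() - 1) * PAGE_SIZE
                    + pages.get(pages.size() - 1).size();
    }
}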

Questions like yours are missing context. If you want a good answer,
you need to post the problem you are really trying to solve, rather
than posting a question about how to implement the solution you've
already decided on.

Hope this helps!

--
Wayne
From: Patricia Shanahan on
Wayne wrote:
> On 7/8/2010 5:35 PM, Boris Punk wrote:
>> Integer.MAX_VALUE = 2147483647
>>
>> I might need more items than that. I probably won't, but it's nice to have
>> extensibility.
>
> To me, it is unlikely your system will run well if this one data structure
> consumes 2G of memory. (You didn't really state the application or system;
> certainly there are exceptions to the rule.) I would suggest you use a
> more flexible system, where you keep the data on storage (disk) and use
> memory as a cache. Perhaps an ArrayList of soft references would work well.
> It might even be possible in your particular case to run a daemon thread
> that pre-fetches items into the cache.

What's the difference between one data structure occupying over 2 GB and
a set of data structures that use that much space?

Certainly, given enough memory, Java can support total data structure
sizes well over 2 GB without excessive paging.

Patricia
From: Kevin McMurtrie on
In article <4c368bee$0$4837$9a6e19ea(a)unlimited.newshosting.com>,
Wayne <nospan(a)all.invalid> wrote:

> On 7/8/2010 5:35 PM, Boris Punk wrote:
> > Integer.MAX_VALUE = 2147483647
> >
> > I might need more items than that. I probably won't, but it's nice to have
> > extensibility.
>
> To me, it is unlikely your system will run well if this one data structure
> consumes 2G of memory. (You didn't really state the application or system;
> certainly there are exceptions to the rule.) I would suggest you use a
> more flexible system, where you keep the data on storage (disk) and use
> memory as a cache. Perhaps an ArrayList of soft references would work well.
> It might even be possible in your particular case to run a daemon thread
> that pre-fetches items into the cache.
>
> Keep in mind a modern general-purpose computer will use virtual memory,
> typically with 4kiB pages. Any data structure larger than that will
> likely end up swapped to disk anyway. If you need the semantics of
> a "BigList", try a custom class, a List of <pagesize> lists with
> appropriate set and get methods to access the items.
>
> Questions like yours are missing context. If you want a good answer,
> you need to post the problem you are really trying to solve, rather
> than posting a question about how to implement the solution you've
> already decided on.
>
> Hope this helps!

24 GB of RAM is a standard server configuration this year. Even my
laptop has 8 GB and can only run 64-bit Java. A Java array indexing
limit of 2147483647 is a growing problem, not a future problem.

Multiplexing to smaller arrays through a class isn't a great solution.
First, it's unlikely that an application needing a 2+ GB array can
tolerate the performance hit of not using an array directly. Some
critical JIT optimizations for memory caching and range checking won't
work because of the multiplexing logic. Second, such a class could not
be compatible with anything else because it can't support the Collection
design. Oracle can't define "Collection64 extends Collection" and be
done with it, because the existing interfaces are specified in terms of
int: size() returns int and List.get() takes an int index, and an
override cannot simply widen those to long.
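
A compilable illustration of the clash (the name Collection64 and its
methods here are purely hypothetical, not anything Oracle has defined):

import java.util.Collection;

// Hypothetical sketch only.  A 64-bit collection interface cannot widen
// the inherited methods: redeclaring size() with a long return type is a
// compile error, so the int-limited API stays.
interface Collection64<E> extends Collection<E> {
    // long size();        // won't compile: clashes with int size() in Collection
    long sizeAsLong();     // a differently named method is the only way out
    E get(long index);     // 64-bit access has to live outside List's get(int)
}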
--
I won't see Google Groups replies because I must filter them as spam
From: Mike Schilling on


"Arne Vajh�j" <arne(a)vajhoej.dk> wrote in message
news:4c3655fd$0$283$14726298(a)news.sunsite.dk...
> On 08-07-2010 18:22, Boris Punk wrote:
>> Is there no BigList/BigHash in Java?
>
> No.
>
> But You can have a List<List<X>> which can then
> store 4*10^18 X'es.
>

Or you could pretty easily build a class implementing an interface like

public interface BigArray<T>
{
    T get(long index);
    void set(long index, T value);
}

backed by a two-dimensional array.[1] The reason I prefer to say "array"
rather than "List" is that random access into a sparse List is a bit dicey,
while arrays nicely let you set index 826727 even if you haven't touched any
of the earlier indices yet, and will tell you that the entry at 6120584 is
null, instead of throwing an exception.

1. Or two three-dimensional arrays, if you won't settle for 62 bits of
index.
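
For what it's worth, a sketch of one way to back that interface with a
two-dimensional array (BigArrayImpl and BLOCK_BITS are made-up names;
with 2^20 entries per block this gives roughly 2^51 slots, so push
BLOCK_BITS up toward 30 to get closer to the 62 bits mentioned above):

// Sketch only: a BigArray backed by an Object[][], with blocks allocated
// lazily so untouched regions cost nothing and read back as null.
public class BigArrayImpl<T> implements BigArray<T> {
    private static final int BLOCK_BITS = 20;            // 1M entries per block
    private static final int BLOCK_SIZE = 1 << BLOCK_BITS;
    private final Object[][] blocks;

    public BigArrayImpl(long length) {
        blocks = new Object[(int) ((length + BLOCK_SIZE - 1) >> BLOCK_BITS)][];
    }

    @SuppressWarnings("unchecked")
    public T get(long index) {
        Object[] block = blocks[(int) (index >> BLOCK_BITS)];
        return block == null ? null
                : (T) block[(int) (index & (BLOCK_SIZE - 1))];
    }

    public void set(long index, T value) {
        int b = (int) (index >> BLOCK_BITS);
        if (blocks[b] == null) {
            blocks[b] = new Object[BLOCK_SIZE];          // allocate on first write
        }
        blocks[b][(int) (index & (BLOCK_SIZE - 1))] = value;
    }
}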