David Vallner wrote:
> On Thu, 17 Aug 2006 19:49:03 +0200, Guillaume Marcais <guslist / free.fr>
> wrote:
> > Is there any tuning of the GC so it kicks in less frequently when the
> > memory consumption is large? Or could it be that the Hash algorithm
> > chokes when the number of keys gets large (like > 100_000)?
>
> If I remember a past thread correctly, the Ruby GC is set to keep memory
> usage below 8 MB by default. This is determined by a #define in the
> interpreter source code, and I don't think it's really an adaptive
> algorithm. If the GC is the performance problem, you could try changing
> that constant and rebuilding the interpreter.
>
> As for the Hash question, profiling could probably tell you whether
> that's the case. Of course, profiling with such a huge dataset, when you
> already know the code gets anomalously slow, might take really, really
> long.

Yeah, that's my problem. I did some profiling on a smaller subset, and
that's why I removed the binary search (see the posted code). A binary
search on a sorted array of only 4 elements doesn't buy you anything; it
was actually slower. But I never tried with a dataset large enough to push
the memory consumption, so I got no insight into that.

Guillaume.
