On Thu, 17 Aug 2006 19:49:03 +0200, Guillaume Marcais <guslist / free.fr>  
wrote:
> Is there any tuning of the GC so it kicks in less frequently when the
> memory consumption is large? Or could it be that the Hash algorithm
> chokes when the number of keys gets large (like > 100_000)?

If I remember a past thread correctly, the Ruby GC by default runs whenever
roughly 8 MB have been allocated since the previous collection. That
threshold is set by a #define in the interpreter source code
(GC_MALLOC_LIMIT in gc.c, if memory serves), and I don't think it's really
an adaptive algorithm. If GC is the performance problem, you could try
raising that constant and rebuilding the interpreter.
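
For what it's worth, you can check whether GC really is the culprit before
touching the source: build the same hash once normally and once with the
collector switched off, and compare the timings. A rough sketch (build_hash
and the 1_000_000 are just stand-ins for your real loading code):

  require 'benchmark'

  # Stand-in for the real loading code: fill a hash with n entries.
  def build_hash(n)
    h = {}
    n.times { |i| h["key_#{i}"] = i }
    h
  end

  n = 1_000_000   # substitute the size of the real dataset
  Benchmark.bm(12) do |bm|
    bm.report("GC on:")  { build_hash(n) }
    GC.disable
    bm.report("GC off:") { build_hash(n) }
    GC.enable
  end

If the two timings come out about the same, the GC isn't what is slowing
you down and tweaking the constant won't buy you much.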

As for the Hash problem, profiling could probably tell you whether the hash
operations themselves are to blame. Of course, profiling with such a huge
dataset, when you already know the code gets anomalously slow, might take a
really, really long time.
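
In case it helps, the standard library ships a simple profiler; something
along these lines prints a per-method breakdown when the script exits. The
100_000 and the key format below are made up -- shrink the dataset to the
smallest size that still shows the slowdown:

  require 'profile'

  # Toy workload standing in for the real one: build and re-read a hash.
  h = {}
  100_000.times { |i| h["key_#{i}"] = i }
  h.each_key { |k| h[k] }

Or just run the existing script with ruby -rprofile script.rb. Keep in mind
the profiler itself slows execution down, easily by an order of magnitude.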

David Vallner