I recently came across a little performance problem that I have
difficulty explaining.
I wrote a small code sample to demonstrate it:



tmax = 100
const = 1
e = []
tmax.times do
   env = [1000] * 1000
   e << env          # keep the arrays alive so the heap stays large
end

t1 = Time.now
c = 0
1000000.times do
   c += const
end
t2 = Time.now
puts "Loop took: #{t2 - t1} seconds"



With tmax = 100 and const = 1, the loop takes 1.52s to run (on my PC).
With tmax = 1000 and const = 1, the code takes 1.52s to run.

Quite logical.

With tmax = 100 and const = 1.0, the code takes 2.79s to run.
With tmax = 1000 and const = 1.0, the code takes 9.56s to run.

I do not understand why it should be so much slower with a float than
with an integer, and even slower when the (completely independent)
array created beforehand is bigger.
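A quick way to check whether the float loop really allocates is to
compare GC.count (MRI's cumulative count of GC cycles) around each
loop. This is only a sketch under the assumption that the slowdown
comes from heap-allocated Float objects; on MRI builds where small
floats are immediates ("flonums"), the float loop may show no extra
collections at all:

```ruby
# Count how many GC cycles a million additions of `const` trigger.
# GC.count is the total number of collections since the process started.
def gc_runs_for(const)
  before = GC.count
  c = 0
  1000000.times { c += const }
  GC.count - before
end

puts "GC runs, integer addition: #{gc_runs_for(1)}"
puts "GC runs, float addition:   #{gc_runs_for(1.0)}"
```

If the float version reports many more runs, the extra time is GC
work, and a bigger live heap (the `e` arrays) makes each mark phase
more expensive.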

So I tried disabling the garbage collector (GC.disable):

With GC.disable, tmax = 1000 and const = 1, the code takes 1.52s to run.
With GC.disable, tmax = 1000 and const = 1.0, the code takes 1.89s to run.
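For reference, this is how the disabling can be scoped to just the hot
loop, paying the collection cost once afterwards instead of many times
during it (GC.disable, GC.enable and GC.start are all standard MRI
methods; the numbers above suggest most of the float loop's time was
collection, not allocation):

```ruby
GC.disable          # no collections while the loop runs
c = 0
1000000.times { c += 1.0 }
GC.enable
GC.start            # one collection afterwards reclaims the garbage floats
puts "c = #{c}"
```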

Could someone explain to me why the floating-point addition seems to
trigger the GC?

Thanks,
js