I've created a small daemon that serves certain data very quickly to the
rest of our web application. It does so by pre-loading a lot of data from
our database into a special in-memory structure. Boiled down, the
pre-loading procedure looks like this:

require 'dbi'

@data_hash = {}
DBI.connect(dsn, user, pass) do |dbh|
  sth = dbh.execute("some-sql-returning >300,000 rows")
  while row = sth.fetch_hash
    # Group each row under a key computed from its data.
    hash_key = <calculated-from-row-data>

    @data_hash[hash_key] ||= []
    @data_hash[hash_key] << row
  end
  sth.finish
end

This yields a @data_hash with a structure like this:

@data_hash = {
  'key1' => [row1, row2, row3, ...],
  'key2' => [row4, row5, ...],
  ...
}
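
(Incidentally, the same structure can be built with a Hash default
block; I doubt it changes the allocation behavior, but it saves the ||=:)

@data_hash = Hash.new { |h, k| h[k] = [] }
# ...then, inside the fetch loop:
@data_hash[hash_key] << row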

I have a problem with the speed of the pre-loading, though. I am loading
~360,000 rows into memory (about 500 MB). The first 50% of the rows are
read and placed in the hash pretty quickly, but after that it slows down.
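
To check where the time goes, a simplified stand-in for the loop should
show whether garbage collection dominates the second half. This is just
a sketch, assuming MRI 1.9+ (for GC.count and GC::Profiler); the keys
and row contents are made up:

GC::Profiler.enable

@data_hash = {}
360_000.times do |i|                        # stand-in for the fetch loop
  key = "key#{i % 1_000}"                   # stand-in for the computed key
  (@data_hash[key] ||= []) << { 'id' => i } # stand-in for a fetched row

  # Report GC activity every 50,000 rows to see whether collections
  # get more frequent or more expensive in the second half.
  if (i + 1) % 50_000 == 0
    puts "#{i + 1} rows: #{GC.count} GC runs, " \
         "#{'%.2f' % GC::Profiler.total_time}s in GC"
  end
end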

I suspect that Ruby's dynamic memory allocation for the Hash is to
blame. Can I somehow pre-allocate 500 MB of memory for the hash, or
tweak the way Ruby allocates memory?
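
In case it helps to see what I mean, the kind of thing I'm imagining is
disabling the GC for the duration of the bulk load and collecting once
at the end. A sketch (GC.disable/GC.enable/GC.start are standard MRI;
load_data is a hypothetical method wrapping the DBI loop above, and I
don't know whether holding 500 MB without collections is even viable):

GC.disable  # no collections while the ~500 MB of rows are loaded
load_data   # hypothetical method wrapping the DBI loop above
GC.enable
GC.start    # one full collection once the hash is fully built

I've also read that newer MRIs (1.9.3+) honor GC tuning variables such
as RUBY_GC_MALLOC_LIMIT and RUBY_HEAP_MIN_SLOTS from the environment,
which sounds like the "tweak" I'm after, but I'm not sure which values
would make sense for a 500 MB working set.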

Thanks in advance!

- Carsten