I have a problem that is a bit hard to express.

I have an application that serializes a set of Ruby objects to Ruby 
source code.  The serialized object graph is now pushing 100k lines of 
Ruby code, roughly 6.5MB of characters.

The code itself looks structurally something like:

X.new do |x|
   x.add_y do |y|
     # ...do stuff here
   end
   # ...lots more adds, sometimes nested 5 blocks deep
end
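For context, here is a minimal self-contained sketch of the builder pattern the generated code assumes; X, Y, and add_y here are stand-ins for the real framework classes, not the actual API:

```ruby
# Hypothetical stand-ins for the real framework classes.
class Y
  attr_accessor :name
end

class X
  attr_reader :ys

  def initialize
    @ys = []
    yield self if block_given?
  end

  # Create a child Y, yield it for configuration, and record it.
  def add_y
    y = Y.new
    yield y if block_given?
    @ys << y
    y
  end
end

# The generated source is just deeply nested calls in this style.
X.new do |x|
  x.add_y do |y|
    y.name = "first"
  end
end
```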

and to load it I do:

f = File.new(@file)
@definition = eval f.read
f.close
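In case it matters, the load can also be written like this (load_definition is just a hypothetical helper name); passing the path as eval's filename argument makes backtraces point into the generated file, which helps when the crash is hard to localize:

```ruby
# Read the generated source and eval it; the third and fourth
# arguments to eval set the filename and line number used in
# any backtrace raised from inside the generated code.
def load_definition(path)
  src = File.open(path) { |f| f.read }
  eval(src, binding, path, 1)
end
```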

It seems that when I do this with this big (6.4MB) script file, I get 
the [BUG] gc_sweep problem.

As a fallback we serialize to XML and read the XML back in, and that 
seems to work.  The reason for serializing to Ruby source in the first 
place was speed: Ruby's parser is written in C and is fast, whereas the 
XML parser is not.  We get an average 20x increase in performance 
loading from Ruby source vs. REXML-based XML.
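The 20x figure came from rough timings; a toy version of that comparison, with small in-memory strings standing in for the real multi-megabyte files, might look like:

```ruby
require 'benchmark'
require 'rexml/document'

# Illustrative data only: a Ruby literal vs. an equivalent XML
# document, each encoding the same list of 1000 integers.
ruby_src = "[" + (1..1000).map { |i| i.to_s }.join(",") + "]"
xml_src  = "<list>" + (1..1000).map { |i| "<i>#{i}</i>" }.join + "</list>"

Benchmark.bm(6) do |b|
  b.report("eval")  { eval(ruby_src) }
  b.report("rexml") do
    doc = REXML::Document.new(xml_src)
    doc.root.elements.to_a.map { |e| e.text.to_i }
  end
end
```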

Anyway, is there some limit on the size of Ruby objects that we are 
running up against...or on the depth of nested blocks...or something 
else?

Unfortunately, this is part of a very complex framework and hard to 
isolate/debug.

Oh, and we are running this on a dual-Xeon 2.4 GHz server with 2GB RAM 
and Red Hat 8 (and Ruby 1.8 built on 8/8/03).

Thanks,

Rich Kilmer