Yuki and Roger,

I'm glad to hear these patches are working out well for you. 

I have just posted yet another update to the MBARI7 patch at:
http://sites.google.com/site/brentsrubypatches/

The latest spin uses a separate stack for garbage collection passes,
eliminating the need to clear the GC stack after each pass.  It also
disables use of assembly code to read the stack pointer on x86 machines by
default, because this asm code sometimes caused gcc to emit pushes to the
stack between reading the stack pointer and clearing the area above it.
The default STACK_WIPE_SITES value has also changed, from 0x2370 to 0x4770.
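
If it helps to picture the x86 issue, here is a minimal sketch (the file
name, the USE_SP_ASM switch, and the function names are mine for
illustration, not taken from the patch) of the two ways a conservative
collector can obtain the stack pointer before wiping the unused region
beyond it:

  /* sp_sketch.c -- a minimal sketch, not code from the patch itself. */
  #include <stdio.h>

  /* Fallback: gcc's __builtin_frame_address(0) (or the address of a local
     variable) always lies within the innermost live frame, so everything
     deeper than it belongs to no live frame. */
  static void *approx_stack_pointer(void)
  {
      return __builtin_frame_address(0);
  }

  #if defined(__i386__) && defined(USE_SP_ASM)  /* hypothetical opt-in switch */
  /* Exact read of %esp via inline asm.  The catch described above: gcc may
     still emit pushes between this read and the later wipe, so the region
     computed from it can contain live data -- hence it is now off by default. */
  static void *asm_stack_pointer(void)
  {
      void *sp;
      __asm__ volatile ("movl %%esp, %0" : "=r" (sp));
      return sp;
  }
  #endif

  int main(void)
  {
      printf("approximate stack pointer: %p\n", approx_stack_pointer());
      return 0;
  }

Only the first form is used in the default configuration now.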

This should all make the patches a little more portable and a bit faster in
their default configuration.
I don't plan to update MBARI7 again unless bugs are found. (We all know how
that goes :-)

- brent


Yuki Sonoda wrote:
> 
> Issue #744 has been updated by Roger Pack.
> 
> Here's my field report.
> I have a small rails app on a linode slice.  After running it awhile I
> noticed that the system stopped responding--it was running out of RAM.  
> 
> For some reason my rails app was growing by 8MB of RSS per request.  If
> anybody wants to look into this in more depth I'd be happy to give them
> access.
> 
> Updating to 1.8.7 trunk: same result.
> Updating to 1.8.7 + MBARI patches: problem gone.
> Also, the total RSS now starts at 59MB and [4 days later] appears to have
> stabilized at 62MB.  Without the patches it started at 78MB, so roughly a
> 25% reduction in RAM use, which is very nice for those on slices.
> 
> I'd encourage the inclusion of these patches into trunk for the next patch
> release.
> 
> ...
> 
> Thanks much for your work.  It spared me hours of debugging and has
> improved my opinion of Ruby.  Three cheers :) 
> Where to send donation?
> 
> -=r
> ----------------------------------------
> http://redmine.ruby-lang.org/issues/show/744
> 
> 
