I had the same problem writing a game in Eiffel, also using SDL.  Here are
the steps I took (a sketch of the combined loop follows the list):

1) Explicitly invoked the GC each frame.  That ensured that there was about
the same amount of delay for GC every frame, instead of sudden lurches every
few seconds.

2) Ran the animation loop as fast as possible (with a tiny delay to allow
the OS to collect events), rather than using a delay to force a fixed fps.
Fed the length of each frame into the simulation as the duration of the next
frame.

3) Smoothed the sampled frame duration by calculating the average frame
duration, starting with the first frame measured, up to about 1-2 seconds'
worth of frames.  This avoided jitter caused by GC and the OS process
scheduler.

4) Checked for frames with an unusual duration. When a frame was much too
long or too short the timing algorithm didn't use its duration, but instead
restarted the sampling of frame values for the average calculation.  This
avoided sudden jumps when one frame was too long because the user had put
the process to sleep or the game was loading large images between levels.
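
Putting the four steps together, here is a rough Ruby sketch of the loop.
The method names (poll_events, update_and_render) and the tuning constants
are placeholders for illustration, not the actual code from my game:

    MAX_SAMPLES  = 60       # about 1-2 seconds of frames at typical rates
    MIN_DURATION = 0.001    # durations outside these bounds are anomalies
    MAX_DURATION = 0.25

    def poll_events; end                         # stand-in for the real SDL call
    def update_and_render(dt); sleep(0.01); end  # stand-in for a frame's work

    samples   = []
    smoothed  = 1.0 / 60.0                  # initial guess at frame duration
    last_time = Time.now

    loop do
      GC.start                              # step 1: pay the GC cost every frame
      poll_events                           # step 2: run flat out, no fixed fps
      now       = Time.now
      duration  = now - last_time
      last_time = now

      if duration < MIN_DURATION || duration > MAX_DURATION
        # step 4: anomalous frame (process slept, level load, ...); ignore
        # its duration and restart the sampling window
        samples.clear
      else
        samples << duration
        samples.shift if samples.size > MAX_SAMPLES
        # step 3: smooth by averaging the recent samples
        smoothed = samples.inject(0.0) { |sum, d| sum + d } / samples.size
      end

      update_and_render(smoothed)           # step 2: feed the duration into
    end                                     # the simulation

The window length and anomaly bounds need tuning per game, but the
structure is the same.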

The final algorithm worked well:
* It stopped the GC making the animation jerky.
* It adapted gracefully to changing system load by dropping the frame rate.
* It smoothed out jittery frames.
* It could adapt to sudden events that caused drastic changes in frame rate.

The downside is that the animation code becomes a little more complex,
because you are not sure in advance how much time each frame is going to
simulate, and therefore how far each actor is going to move.  You also have
to pass the frame duration to the animation methods of all the actors in
the game.
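
For example, each actor scales its movement by the duration it is given
(Walker is an invented class, just to show the shape of it):

    class Walker
      def initialize
        @x     = 0.0
        @speed = 300.0        # pixels per second, not pixels per frame
      end

      def animate(dt)         # dt: smoothed frame duration in seconds
        @x += @speed * dt     # distance scales with how long the frame took
      end
    end

    walker = Walker.new
    walker.animate(1.0 / 60.0)   # each frame, pass in the smoothed duration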

Cheers,
            Nat.

________________________________
Dr. Nathaniel Pryce
B13media Ltd.
Studio 3a, Aberdeen Business Centre, 22/24 Highbury Grove, London, N5 2EA
http://www.b13media.com

----- Original Message -----
From: "Matthew Bloch" <mattbee / soup-kitchen.net>
Newsgroups: comp.lang.ruby
To: "ruby-talk ML" <ruby-talk / ruby-lang.org>
Sent: Wednesday, May 22, 2002 4:00 PM
Subject: Stymied by Ruby's garbage collector


> Hello;
>
> This one is driving me crazy: I've got an about-to-be-deployed
> entertainment product written in Ruby using SDL for the graphics (through
> RUDL), and I've hit a brick wall with what I believe is the garbage
> collector.
>
> Basically, the drawing loop for a particular screen takes the same time
> every frame but occasionally the garbage collector kicks in and makes the
> whole game lurch for that frame: graphics jump to compensate etc. and this
> is a regular occurrence (once every 2-3 secs).  I've tried turning it off
> during critical sections of animation, but on most systems I've tried, it
> gobbles all the memory before Ruby can start garbage collection again.  I
> know the problem *is* down to the garbage collector's timing because I can
> see smooth animation for a few seconds after GC.disable (where there wasn't
> before) before the inevitable seizure.
>
> Looking at Ruby's garbage collector (from 1.6.7), it seems an
> 'all-or-nothing' proposition.  That is, the whole algorithm is run at any
> point to free as much memory as possible, or it is not.  There's no partial
> collection to satisfy what may be a small allocation request.
>
> Now my deadline is pretty tight on this, so I'm after some tips to solve
> this in the short term for now :-)  Various solutions present themselves,
> in order of simplicity:
>
> *) Upgrade the game's runtime to Ruby 1.7 -- does this have a less lumpy gc
> algorithm?  Or is there an even more advanced version of Ruby around from
> which I could steal just a better gc?
>
> *) Redesign critical parts of the game to stop burning through so much
> damned memory -- but surely this would ruin the maintainability of the code,
> to have to scope everything as widely as possible?  Or are there other
> techniques I can use for the same end?
>
> *) Hack the gc algorithm used by ruby_xmalloc etc. to stop recovering
> memory once the number of bytes needed is available, rather than running
> the whole algorithm.  I haven't studied the algorithm in detail yet, so I
> have no idea whether this is viable;
>
> *) Find the correct places to GC.disable / GC.enable to smooth the more
> visible glitches over-- but I'd be very wary to deploy such a solution
> because it's unpredictable across different systems and in the worst case
> can seize the machine up totally.
>
> *) last resort: leave everything as it is, but deliberately slow animation
> loops to assume the worst case.
>
> Can anyone who's been in a similar situation comment?  I assume this kind
> of problem is endemic to games that rely on garbage collectors, so someone
> must have some opinions :-)
>
> thanks in advance,
>
> --
> Matthew
>