ara.t.howard / noaa.gov wrote:
> On Sat, 23 Sep 2006 ben / somethingmodern.com wrote:
> 
>> Does anyone have experience with using Ruby for analysis (*lots* of
>> maths), on a machine with a ridiculous amount of RAM? For example, a
>> hip 64-bit Linux kernel on a machine with 32 or 64 GB of physical RAM.
> 
> i've had issues using mmap with files larger than 32gb - i'm not sure if the
> latest release has fixed this or not...  in general you can run into issues
> with extensions since ruby fixnums keep a bit to mark them as objects...
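(The tag bit mentioned above is why, on a 64-bit MRI build, a Fixnum tops out
at 2**62 - 1 rather than 2**63 - 1; larger values get promoted to
heap-allocated Bignums behind your back. A quick sketch - pure arithmetic, no
extensions involved:

```ruby
# One bit of the machine word tags a value as an immediate Fixnum, so on
# a 64-bit MRI build the largest Fixnum is 2**62 - 1. Past that, Ruby
# silently promotes to Bignum - still correct, just no longer a cheap
# immediate value.
max_fixnum = 2**62 - 1
puts max_fixnum        # 4611686018427387903
puts max_fixnum + 1    # promoted transparently; arithmetic still works
```
)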
> 
>> Are there any "gotchas" I should be aware of? Would all the RAM be
>> addressable by a given Ruby process? Or would I still have to be forking
>> a number of processes, each allocated a bit of the address space (blech)?
> 
> assuming you have two or four cpus this might not be a bad idea - ipc is so
> dang easy with ruby it's trivial to coordinate processes.  i have a slave
> class i've used for this before:
> 
> http://codeforpeople.com/lib/ruby/slave/
> http://codeforpeople.com/lib/ruby/slave/slave-0.0.1/README
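
(For what it's worth, the fork-per-chunk pattern this enables can be sketched
with nothing but the stdlib - IO.pipe plus Marshal. The slice size and the toy
"maths" below are placeholders, and this is not ara's Slave class, just the
bare mechanism:

```ruby
# Fork one child per slice of the data; each child crunches its slice and
# Marshals the result back through a pipe. The parent only collects.
data   = (1..1_000).to_a
slices = data.each_slice(250).to_a

pipes = slices.map do |slice|
  reader, writer = IO.pipe
  fork do                     # child: compute, write result, exit
    reader.close
    writer.write Marshal.dump(slice.inject(0) { |s, n| s + n })
    writer.close
  end
  writer.close                # parent keeps only the read end
  reader
end

total = pipes.map { |r| Marshal.load(r.read) }.inject(0) { |s, n| s + n }
Process.waitall               # reap the children
puts total  # 500500
```

Each child exits after writing its Marshal'd result, so the parent's read
simply runs to EOF; no locks, no shared memory.)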
> 
> regards.
> 
> 
> -a

Ah, someone *has* done some of this! What compiler did you use to 
recompile Ruby for 64-bit addressing? Did it work out of the box?

What's the bottleneck in Ruby's built-in IPC? Network traffic to 
"localhost" and to the other hosts? System V IPC? Something else?
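
Partly answering my own question: DRb, the stdlib's distributed-object
library, speaks its own wire protocol over TCP by default, so even two
processes on the same box pay loopback-TCP cost per call. A minimal
in-process sketch (the Cruncher class and its method are illustrative):

```ruby
require 'drb/drb'

# Any plain object can be served; calls arrive via the DRb protocol.
class Cruncher
  def square(n)
    n * n
  end
end

DRb.start_service('druby://localhost:0', Cruncher.new)  # port 0 = any free port
remote = DRbObject.new_with_uri(DRb.uri)                # goes over TCP, even locally
result = remote.square(21)
puts result  # 441
DRb.stop_service
```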

I haven't really looked at the whole "lots of coordinated tiny 
processes" approach in Ruby, since Erlang seems to have nailed it and 
made it the core of how Erlang does everything. I'm not a big fan of 
reinventing wheels; I'd much rather just get my numbers crunched.