Brian Adkins wrote:
> When running a test that primarily involves loading up a few MySQL 
> tables with ActiveRecord objects, I was surprised to see the Ruby CPU 
> utilization at 93% and the MySQL CPU utilization at 7%. I would expect 
> this workload to be heavier on MySQL than that.
> [...]

I just moved my test code into a controller and ran it via:

mongrel_rails start -e production

The CPU characteristics were similar, except that Mongrel wasn't able to 
fully utilize my dual-core CPU (I suppose because Rails requests are 
serialized due to the lack of thread safety).

So the unit test (1093 records -> table1, 1093 records -> table2, 1 
record -> table3) took 5.5 seconds to complete, while the identical test 
run from a controller under Mongrel in production mode took 27.4 seconds!
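For anyone who wants to reproduce the comparison, the timing side of the test looks roughly like this. In the real test the block creates the ActiveRecord objects (one INSERT per object); here a plain hash stands in for each record so the snippet runs anywhere without a database, which is also a handy way to see how much of the cost is pure Ruby:

```ruby
require 'benchmark'

# Sketch of the timing harness. In the actual test the block body is
# 1093 Model.create(...) calls per table; the hash below is a stand-in
# so this runs without Rails or MySQL.
records = nil
elapsed = Benchmark.realtime do
  records = (1..1093).map { |i| { :id => i, :name => "row #{i}" } }
end
puts "built #{records.size} records in #{elapsed} seconds"
```

Subtracting a pure-Ruby baseline like this from the full ActiveRecord run is a crude but quick way to confirm that the 93% Ruby / 7% MySQL split comes from object construction rather than the database.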

Yeah, I know I can have a cluster of Mongrel processes, and that's how I 
run for real, but I'm still a little bummed with these results :(
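For reference, the cluster setup I run is roughly this (this assumes the mongrel_cluster gem; the paths, port, and process count are just example values):

```shell
# Generate a cluster config (mongrel_cluster gem assumed; values are examples)
mongrel_rails cluster::configure -e production -p 8000 -N 3 \
  -c /path/to/app -a 127.0.0.1

# Start all three Mongrels (ports 8000-8002), with Apache proxying in front
mongrel_rails cluster::start
```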

I've switched my company's development from 100% Java to 100% Ruby, and 
I still believe that was a good decision because of productivity gains 
and joy, but I do miss some of the runtime performance of Java and the 
ease with which I could spin up a thread to do some background work. 
I'm glad BackgrounDRb has been provided, but it's not quite the same.
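The kind of thing I mean is a plain thread like the one below. The sketch itself runs fine in any Ruby; the catch in Rails 1.x is that the thread must not touch ActiveRecord or controller state, since the framework isn't thread-safe (the sleep and return value here are just stand-ins for a real job):

```ruby
# Minimal sketch: fire off background work in a plain Ruby thread.
# In a Rails 1.x app the block must avoid ActiveRecord and shared
# controller state, since the framework is not thread-safe.
worker = Thread.new do
  sleep 0.1       # stand-in for a slow background job
  "finished"      # the thread's return value
end

# The request could respond immediately here; call #value only
# when (and if) you actually need the result.
status = worker.value
```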

Hopefully future versions of Ruby/Rails will improve runtime performance 
and concurrency - I'd be glad if I could just fork inside Rails without 
trouble, but I don't think that's the case.
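What I'd like to be able to do safely is something like the following, which works in plain Ruby on Unix. The pipe is just one way to get a result back; the Rails-specific trouble is that a forked child inherits the parent's ActiveRecord/MySQL socket, so the child would need to re-establish its own database connection:

```ruby
# Sketch: fork a child to do work outside the request cycle.
# NOTE: in a Rails app the child inherits the parent's DB socket and
# would need to reconnect ActiveRecord before touching the database.
reader, writer = IO.pipe

pid = fork do
  reader.close
  writer.write("done")   # stand-in for real background work
  writer.close
  exit!(0)               # exit! skips at_exit handlers in the child
end

writer.close             # parent keeps only the read end
Process.wait(pid)
result = reader.read     # "done" once the child finishes
reader.close
```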

For now, I don't have more customers than a Core 2 Duo can handle, so 
performance isn't exactly on the critical path for me yet :)  In fact, 
I'm glad MySQL isn't the bottleneck, because scaling past MySQL seems 
much more difficult than running a bunch of Apache/Mongrel processes.

Brian