Chad Perrin wrote:
> On Tue, Sep 25, 2007 at 10:40:04PM +0900, Ruby Maniac wrote:
>> Ruby scales just fine as long as you are willing to throw a ton of
>> compute hardware at it !
>>
>> I believe Twitter is successfully using Ruby for their site but then
>> they have also invested in a ton of servers dedicated to running
>> hundreds of Mongrels.
>>
>> So yeah, get out your checkbooks and write more checks for more
>> servers and sure Ruby scales just fine !
> 
> Assuming about an 80k salary and a 2,000 dollar server, a server is worth
> about 50 hours of programmer time.
> 
> I just figured I'd provide a simple starting place for comparing the cost
> of software development with that of hardware upgrades.

I find this perspective puzzling. In most large datacenters, the 
dominant operating cost is neither the servers themselves nor the 
development time to put code on them; it's the electricity, cooling, 
and administration required once the finished application is deployed 
to thousands of users.

An application that scales poorly will require more hardware. Hardware 
is cheap, but power and administrative resources are not. If you need 
10 servers to run a poorly-scaling language/platform where a 
"faster/more scalable" one would need only a few, you're paying a 
continuously higher cost to keep those extra servers running. Better 
scaling means fewer servers and lower ongoing costs.
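
To make the trade-off concrete, here is a rough sketch. The $300/month 
per-server figure for power, cooling, and an administration share is 
purely illustrative, as are the development-hour and server counts:

  hourly_rate          = 40.0   # from the 80k salary figure above
  monthly_cost_per_box = 300.0  # illustrative: power + cooling + admin

  def total_cost(dev_hours, servers, months, hourly_rate, monthly_cost)
    dev_hours * hourly_rate + servers * months * monthly_cost
  end

  # Quicker to develop, but needs 10 servers for three years...
  option_a = total_cost(500, 10, 36, hourly_rate, monthly_cost_per_box)
  # => 128000.0

  # ...versus more development time on a platform that needs only 3.
  option_b = total_cost(1000, 3, 36, hourly_rate, monthly_cost_per_box)
  # => 72400.0

Even with made-up numbers like these, the cheaper-to-develop option can 
end up costing far more over three years: the monthly column keeps 
growing while the development column does not.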

The savings from even the cheapest, most quickly developed application 
will be completely overshadowed if deploying it to a large datacenter 
results in unreasonably high month-on-month expenses.

- Charlie