> Interesting tidbit: research has shown that developers have roughly
> the same productivity in lines of code per day _independently of
> programming language_.  So it automatically follows that with more
> preproduced code at your hands (either in libraries, interpreters or
> compilers) your overall performance increases.

Thank you.
My English is not very good.  So from what I understand, you are
saying that productivity has no direct relationship to the
programming language?

Based on your comment, I did a little research.

http://www.codinghorror.com/blog/2005/08/are-all-programming-languages-the-same.html

Will you please point me to the source of your comment?

Anyhow, I was not talking about productivity.  I was trying to say
that measuring a web framework by "number of lines" in 2 different
languages is almost pointless...especially when it is used as a
reference for performance.

In Ruby, I know you can increase a script's performance by writing
an extension in C.  But then, is it necessary?  One can obtain a
similar increase in performance by running the program on different
hardware (e.g. 386 vs 486 vs Core 2 Duo).
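For illustration, here is a rough sketch of what such a C extension
might look like (the module name FastSum and the sum_to function are
just made-up examples for this post, not anything standard):

  #include "ruby.h"

  /* Sum the integers 1..n in a C loop instead of a Ruby loop. */
  static VALUE fast_sum(VALUE self, VALUE n)
  {
      long limit = NUM2LONG(n);
      long total = 0;
      long i;

      for (i = 1; i <= limit; i++) {
          total += i;
      }
      return LONG2NUM(total);
  }

  /* Ruby calls this when the extension is required. */
  void Init_fast_sum(void)
  {
      VALUE mod = rb_define_module("FastSum");
      rb_define_module_function(mod, "sum_to", fast_sum, 1);
  }

After building it with the usual extconf.rb/mkmf setup, you could
require it and call FastSum.sum_to(1_000_000) from Ruby.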

> I am not sure this analysis is correct: typically, if you have a
> library with a good implementation efforts have gone into it to
> provide a good interface and to optimize it.

You are right.  I love libraries/OOP...  I cannot imagine having to
write EVERY routine myself in order for my script to work.

But what if it is a mission-critical scenario?  First, I probably
shouldn't use Ruby to begin with.  Second, if I have to use Ruby,
then in order to gain back a few extra cycles, I might need to write
my own extension in C instead of using the "standard libraries".

I am already way off the original topic.  Sorry.