David Vallner wrote:
>> How about C#, well it runs in Windows and without serious and expensive
>> firewalls you just can't go anywhere near the Internet.
> 
> You need to tighten off Unix-based servers too. Heck, there are even
> serious and expensive firewalls for Linux around too, because not
> everyone has an in-house iptables guru.
But everybody *should* have a *certified* Cisco engineer if they use
Cisco routers, for example. It's one of the costs of doing business.

> Speaking purely theoretically, Ruby cannot be made as performant as
> Java or C# could be made if they had ideally performing implementations.
> Latent typing makes it almost impossible to do certain optimizations
> that static typing permits. That's pure fact.

I'm not sure I agree with you here. First of all, while latent typing
may prevent you from optimizing (I'm writing in Perl here, not Ruby)

$j=0;
for ($k=0; $k<100000; $k++) {
  $j++;
}

to

$j=$k=100000;

that kind of optimization is a trick used by compilers to get good
performance on trivial benchmarks, rather than something with a more
wide-ranging real-world payoff.
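For the record, here's the same loop and its folded form transliterated
into Ruby (a sketch only; the variable names are arbitrary, and the
"folded" assignment just mimics what such a compiler might emit):

```ruby
# The counting loop, as an interpreter actually executes it:
j = 0
k = 0
while k < 100_000
  j += 1
  k += 1
end

# The folded form a "sufficiently smart" compiler might produce,
# skipping the loop entirely:
j_folded = k_folded = 100_000

# Both routes end up with identical values.
raise "folded form disagrees" unless j == j_folded && k == k_folded
```

The transformation is valid here only because nothing inside the loop
depends on intermediate values, which is exactly why it pays off mostly
on trivial benchmarks.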

Second, "compiled languages" like Java, C#, C++, and even C have
extensive optimized run-time libraries to do all the lower-level things
that a "true optimizing compiler", if such a thing existed, would do
automatically. Over the years, compilers have improved to the point
where they generate optimal code for things like LINPACK and the
Livermore Kernels.
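The run-time-library point cuts both ways: Ruby's own Array#sort is
implemented in C inside the interpreter, so an interpreted program gets
compiled-code speed for that operation anyway. A rough sketch of the
comparison (the pure-Ruby sort below is a naive insertion sort I wrote
just for illustration):

```ruby
# Naive pure-Ruby insertion sort, to stand in for "doing it yourself"
# instead of calling the C-implemented Array#sort from the runtime.
def insertion_sort(a)
  out = a.dup
  (1...out.length).each do |i|
    x = out[i]
    j = i - 1
    # Shift larger elements right until x's slot is found.
    while j >= 0 && out[j] > x
      out[j + 1] = out[j]
      j -= 1
    end
    out[j + 1] = x
  end
  out
end

data = Array.new(2_000) { rand(1_000_000) }

builtin  = data.sort            # C code inside the interpreter
handmade = insertion_sort(data) # interpreted Ruby, much slower

raise "sorts disagree" unless builtin == handmade
```

Both produce the same answer; the built-in version is the one doing the
low-level work that no optimizing compiler ever sees.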

In short, I don't see why a Ruby interpreter *and* run time can't
compete with a Java, C# or C++ compiler *and* run time! As long as you
have to have the same number of bits around to keep track of the
program's data structures, objects, etc., "optimization" becomes a
matter of implementing the operations on the data structures efficiently.