I don't think I'm following you; can you explain what's supposedly
ironic about it? Hashie only "slows" things down depending on whether
you use Symbols, Strings or object attributes. Unless you use it *all*
over the place the performance impact is small.

I personally don't fully agree with what Hashie does because I believe
people should be competent enough to realize that when they take in
external data, the keys are going to be String instances.
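
For example, parsing a JSON payload with the standard library hands you
String keys, not Symbols (a minimal illustration):

     require 'json'

     payload = JSON.parse('{"name": "Ruby"}')

     payload['name'] # => "Ruby"
     payload[:name]  # => nil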

Having said that, I think fundamentally changing the way Ruby handles
Strings and Symbols because developers can't be bothered to fix the
root cause of the problem is flawed. If you're worried about a DDoS,
stop converting everything to Symbols. If you're worried about not
remembering what key type to use, use a custom object (sketched below)
or document it so that people can easily find out.
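
A rough sketch of the custom object approach (the Struct and its fields
here are made up for illustration):

     # Wrapping external data in a Struct means callers never have to
     # remember whether keys are Strings or Symbols.
     User = Struct.new(:name, :email)

     def build_user(params)
       User.new(params['name'], params['email'])
     end

     user = build_user('name' => 'Yorick', 'email' => 'yorick@example.com')
     user.name # => "Yorick"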

While Ruby is all about making life easier, I really don't want it to
become a language that spoon-feeds programmers because they're too lazy
to type one extra character *or* convert the output manually. Or
better: use a custom object, as mentioned above.

The benchmark you posted is flawed because it does much, much more than
benchmark the time required to create a new Symbol or String instance.
Let's take a look at the most basic benchmark of these two data types:

     require 'benchmark'

     amount = 50_000_000

     Benchmark.bmbm(40) do |run|
       run.report 'Symbols' do
         amount.times do
           :foobar # refers to the same interned Symbol every iteration
         end
       end

       run.report 'Strings' do
         amount.times do
           'foobar' # allocates a new String object every iteration
         end
       end
     end

On the laptop I'm currently using this results in the following output:


     Rehearsal ---------------------------------------------------------------
     Symbols       2.310000   0.000000   2.310000 (  2.311325)
     Strings       5.710000   0.000000   5.710000 (  5.725365)
     ------------------------------------------------------ total: 8.020000sec

                       user     system      total        real
     Symbols       2.670000   0.000000   2.670000 (  2.680489)
     Strings       6.560000   0.010000   6.570000 (  6.584651)

This shows that the use of Strings is roughly 2.5 times slower than
Symbols.
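
The underlying reason is that a Symbol literal refers to the same
interned object every time it's evaluated, whereas a String literal
allocates a new object on every evaluation. A quick way to see this:

     :foobar.object_id == :foobar.object_id   # => true, same object
     'foobar'.object_id == 'foobar'.object_id # => false, two allocations
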
Now execution time isn't the biggest concern in this case; memory usage
is. For this I used the following basic benchmark:

     # Resident set size (RSS) of the current process in KB, as
     # reported by ps.
     def get_memory
       return `ps -o rss= #{Process.pid}`.strip.to_f
     end

     # Returns the difference in resident memory (in KB) caused by the
     # given block.
     def benchmark_memory
       before = get_memory

       yield

       return get_memory - before
     end

     amount = 50000000

     puts "Start memory: #{get_memory} KB"

     symbols = benchmark_memory do
       amount.times do
         :foobar
       end
     end

     strings = benchmark_memory do
       amount.times do
         'foobar'
       end
     end

     puts "Symbols used #{symbols} KB"
     puts "Strings used #{strings} KB"

This results in the following:

     Start memory: 4876.0 KB
     Symbols used 0.0 KB
     Strings used 112.0 KB

Now I wouldn't be too surprised if there's some optimization going on,
since I'm re-creating the same values over and over again and the
garbage collector is free to reclaim the unreferenced Strings in the
meantime, but it already shows a big difference between the two.
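
A more direct way to look at this is to count live objects rather than
resident memory. A rough sketch using ObjectSpace (exact counts vary
between Ruby versions and GC settings):

     # Count live String slots on the heap, forcing a GC first so dead
     # objects don't skew the numbers.
     def live_strings
       GC.start
       ObjectSpace.count_objects[:T_STRING]
     end

     before = live_strings

     strings = Array.new(100_000) { 'foobar' } # keep references alive

     puts "New live Strings: #{live_strings - before}"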

To cut a long story short: I can understand what you're trying to get
at, both with the two data types being merged and with the DDoS issue.
However, I feel neither of these is an issue directly related to Ruby
itself. If Ruby were to automatically convert things to Symbols for you
then yes, but in this case frameworks such as Rails are the cause of
the problem. Merging the two data types would most likely make such a
huge difference usage- and code-wise that it would probably be
something for Ruby 5.0 (in other words, not in the near future).

Yorick