I'm parsing simple text lines which look like this:

  key_1=val_1,key_2=val_2,...key_n=val_n

I want to extract the keys and values and put them into a hash. Simple
enough. I've written the following code, which does just that:

h = Hash.new

while line = $stdin.gets
  # Note: chomping inside the loop, not in the condition --
  # $stdin.gets.chomp! would raise at EOF (gets returns nil), and
  # chomp! itself returns nil when there's nothing to chomp.
  line.chomp!
  line.split(",").each { |f|
    fid, val = f.split("=")
    h[fid] = val
  }

  # Normally, I'd use the data here to do something. For now, I'll
  # just throw it away.

  h.clear
end

Running on a small dataset of 13 MB (around 81000 lines), the program
takes about 27 seconds on my Linux machine. Equivalent Perl code doing
the same thing runs in about 10 seconds. This isn't to knock Ruby (I
love it and use it in place of Perl now). I'm just curious about ways
to write better Ruby that beat my "first try implementation" above.
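For what it's worth, one variation I've been wondering about (untested
for speed, so this is just a sketch) is doing the whole line in a single
regex scan instead of two splits per line, to cut down on intermediate
arrays:

```ruby
# Sketch: parse "k1=v1,k2=v2,..." with one scan per line instead of
# split(",") followed by split("=") on each field.
def parse_line(line, h)
  # Each match yields [key, value]; the block stores them directly.
  line.scan(/([^=,]+)=([^,]*)/) { |key, val| h[key] = val }
  h
end
```

No idea yet whether the regex engine actually wins here; I'd have to
benchmark it against the split version.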

Any ideas?

Thanks.

-- 
"I am easily satisfied by the very best." --Winston Churchill