On Sat, 4 Dec 2004 01:30:33 +0900
Hal Fulton <hal9000 / hypermetrics.com> wrote:

> nobu.nokada / softhome.net wrote:
> > Hi,
> > 
> > At Thu, 2 Dec 2004 15:51:46 +0900,
> > Yukihiro Matsumoto wrote in [ruby-talk:122141]:
> > 
> >>Nobu himself made a patch to preserve hash order.  I have not decided
> >>yet to merge it.  The only concern is performance.
> > 
> > 
> > Performance wouldn't be affected for insertion, iteration, and
> > lookup, but it would be for direct deletion (except for
> > st_delete_safe).  And memory usage increases by two pointers
> > per hash entry.
> > 
> 
> In my opinion, it would be worth it.
> 
> I am curious: How often do people use "large" hashes? Nearly all
> of mine are under 50 keys, I think.
> 
> It becomes more of an issue if the hash has 1000000 keys, of
> course.
> 
> 
> Hal

I do use hashes that large, for building my chordlist
(http://chordlist.brian-schroeder.de/).
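
As an aside, to make concrete what the extra bookkeeping amounts to, here is
a rough sketch of an insertion-order-preserving hash in plain Ruby. It is
only an illustration of the idea, not Nobu's actual st.c patch; the class
name and structure are made up for the example.

# A toy insertion-order-preserving hash, only to illustrate the trade-off
# described above: each entry carries two extra references (the "2 pointers"),
# insertion and lookup are unchanged, iteration follows the order list, and
# only deletion pays a little extra for the unlinking.
class OrderedHash
  Entry = Struct.new(:key, :value, :prv, :nxt)  # two extra pointers per entry

  def initialize
    @table = {}          # key -> Entry, ordinary hash lookup underneath
    @head = @tail = nil  # ends of the insertion-order list
  end

  def []=(key, value)
    if (e = @table[key])
      e.value = value            # overwriting keeps the original position
    else
      e = Entry.new(key, value, @tail, nil)
      @tail.nxt = e if @tail     # append to the order list
      @head ||= e
      @tail = e
      @table[key] = e
    end
    value
  end

  def [](key)
    e = @table[key]
    e && e.value
  end

  def delete(key)
    e = @table.delete(key) or return nil
    # the only extra work compared to a plain hash: unlink from the order list
    if e.prv then e.prv.nxt = e.nxt else @head = e.nxt end
    if e.nxt then e.nxt.prv = e.prv else @tail = e.prv end
    e.value
  end

  def each
    e = @head
    while e
      yield e.key, e.value
      e = e.nxt
    end
  end
end

In other words, insertion and lookup stay as cheap as before, iteration just
walks the list, and only delete does some extra pointer work, which matches
the description quoted above.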

Regards,

Brian


-- 
Brian Schröder
http://www.brian-schroeder.de/