>>>>> Joseph McDonald <joe / vpop.net> writes:

> I'm going to be dealing with a lot of files, some operating
> systems don't like to have too many files in a single directory.
> So I thought that if I created a directory with 256 entries,
> each of which was another directory with 256 directories each
> and each of those held 256 files, I could handle 256^3
> (16,777,216) files without the OS choking.
<snip>
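
The scheme quoted above (hash the filename into three levels of 256
subdirectories each) can be sketched in Ruby using only the standard
library; the directory and file names here are illustrative, not from
the original post:

```ruby
# Sketch of a 256^3 directory fan-out: take the first three bytes of an
# MD5 digest of the filename and use each as one level of subdirectory,
# giving 256 buckets per level.
require 'digest'
require 'fileutils'

def fanout_path(root, filename)
  d = Digest::MD5.hexdigest(filename)
  # "aa/bb/cc" -- two hex chars per level selects one of 256 buckets
  File.join(root, d[0, 2], d[2, 2], d[4, 2])
end

def store_file(root, filename, data)
  dir = fanout_path(root, filename)
  FileUtils.mkdir_p(dir)  # create the three-level bucket on demand
  File.write(File.join(dir, filename), data)
end
```

Any digest with a reasonably uniform output works here; MD5 is fine for
bucketing even though it is no longer suitable for security.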


  This isn't an answer to your question, but: how large are the files?

  Have you considered using PostgreSQL or dbm?

  I had a similar situation under Solaris, dealing with an application
  that generated 25,000 files a day, and we needed to keep three days' worth.

  Once we hit 30,000 things got seriously boggy.

  The simple solution: I added a gdbm interface and stored the files with
  the filename as the key.

  With 75,000 files the gdbm database was 200 MB and *fast*.
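
  The filename-as-key idea looks roughly like this. The post used gdbm,
  which may not be installed everywhere, so this sketch substitutes Ruby's
  stdlib PStore (a slower stand-in with the same key/value shape); the key
  and contents are made up for illustration:

```ruby
# Filename-as-key storage, illustrated with Ruby's stdlib PStore.
# (The original post used GDBM; PStore is a stand-in that needs no
# external library. Names and data here are hypothetical.)
require 'pstore'

store = PStore.new("files.db")

# Store a file's contents under its filename.
store.transaction do
  store["report.19990101"] = "contents of the file"
end

# Fetch it back by name -- one keyed lookup, no directory scan.
store.transaction(true) do  # true = read-only transaction
  data = store["report.19990101"]
end
```

  With GDBM installed the code is nearly identical: `GDBM.open("files.db")
  { |db| db["report.19990101"] = ... }`.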

  Not really Ruby related...


  -David Tillman


-- 
How absolute the knave is!
We must speak by the card, or equivocation will undo us.