On 5/30/06, Harish TM <harish.tmh / gmail.com> wrote:
> hi...
>        I need to store something like a couple of million rows in a MySQL
> table. Is that OK, or do I have to split them up? I intend to index each of
> the columns that I will need to access so as to speed up access. Insertion
> will be done only when there is very little or no load on the server, and
> time for this is not really a factor. I also do not have any constraints on
> disk space.
>
>      Please let me know if I can just use MySQL as it is or if I need to
> make some changes.

MySQL should hold up just fine. I've got a Ruby app backed by a MySQL
database with a table that's now close to 2 million rows and constantly
growing. Right now the application's performance seems to be bounded
more by the dinky machine it runs on (slow disk drives, not a lot of
memory) than by the software; Ruby-DBI and ActiveRecord both seem to
perform acceptably.
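If you want to sanity-check that against your own data, a rough timing
script along these lines would do it. The table, column, and connection
details here are made up, so treat it as a sketch and adjust it for
your schema:

require 'rubygems'
require 'benchmark'
require 'dbi'
require 'active_record'

# Made-up connection settings -- point these at your own database.
ActiveRecord::Base.establish_connection(
  :adapter  => 'mysql',
  :host     => 'localhost',
  :database => 'mydb',
  :username => 'user',
  :password => 'secret'
)

# Hypothetical model over a large table with an index on user_id.
class Event < ActiveRecord::Base; end

dbh = DBI.connect('DBI:Mysql:mydb:localhost', 'user', 'secret')

# Time the same indexed lookup through ActiveRecord and through Ruby-DBI.
Benchmark.bm(12) do |x|
  x.report('ActiveRecord') do
    1000.times { Event.find(:first, :conditions => ['user_id = ?', 42]) }
  end
  x.report('Ruby-DBI') do
    1000.times { dbh.select_all('SELECT * FROM events WHERE user_id = ? LIMIT 1', 42) }
  end
end

dbh.disconnect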
Profiling the database access code shows that the application's
slowdown is not in Ruby but in MySQL, and MySQL itself appears to be
limited by the hardware we're running it on. As long as you've got a
reasonable machine, you should be fine.
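One more thing, since you mentioned indexing the columns you'll be
hitting: once the data is loaded, it's worth running EXPLAIN on your
typical queries to confirm MySQL is actually using those indexes. A
quick check through Ruby-DBI might look like this (table and column
names are just placeholders):

require 'rubygems'
require 'dbi'

dbh = DBI.connect('DBI:Mysql:mydb:localhost', 'user', 'secret')

# EXPLAIN shows the query plan; if the 'key' column comes back NULL,
# MySQL isn't using an index for this query and will scan the table.
plan = dbh.select_all('EXPLAIN SELECT * FROM events WHERE user_id = ?', 42)
plan.each { |row| p row }

dbh.disconnect

With a couple of million rows, the difference between an indexed lookup
and a full table scan is what you'll actually feel, so that one check
is worth the trouble.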