
hey, thanks a lot.
Guess MySQL is fine then.

About the Ruby part: I am using mysql-ruby 1.4.4a. Is that ok, or are
there better ways of doing this?

I need an access time of about 0.05 sec (record retrieval time).
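A quick way to check that 0.05 s budget is to time a batch of lookups and average them. Below is a minimal timing sketch using Ruby's stdlib Benchmark; the `fake_lookup` method is a hypothetical stand-in for the real mysql-ruby call (something like `con.query("SELECT ... WHERE id = ...")` on a `Mysql.real_connect` handle), used here only so the harness runs without a live database:

```ruby
require 'benchmark'

# Hypothetical stand-in for the real indexed lookup. With mysql-ruby
# you would instead issue the actual SELECT against your connection.
def fake_lookup(id)
  (1..1000).reduce(:+) + id  # simulate a small amount of work
end

# Time n retrievals and return the average seconds per lookup.
def average_retrieval_time(n = 100)
  total = Benchmark.realtime do
    n.times { |i| fake_lookup(i) }
  end
  total / n
end

avg = average_retrieval_time
puts format('avg retrieval: %.6f s', avg)
puts(avg < 0.05 ? 'within 0.05 s budget' : 'over budget')
```

Averaging over many lookups matters because a single query time is dominated by cache effects; run it once with a cold cache and once warm to see both cases.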

System config:
Processor: Dual Processor Intel Pentium (P4) 2.53 GHz
Memory RAM: 4 GB
HardDrive Capacity : 250GB
Operating System: SUSE Linux Enterprise Server 9



harish


On 5/30/06, Dido Sevilla <dido.sevilla / gmail.com> wrote:
>
> On 5/30/06, Harish TM <harish.tmh / gmail.com> wrote:
> > hi...
> >        I need to store something like a couple of million rows in a
> > MySQL table. Is that ok, or do I have to split them up? I intend to
> > index each of the columns that I will need to access so as to speed
> > up access. Insertion will be done only when there is very little or
> > no load on the server, and time for this is not really a factor. I
> > also do not have any constraints on disk space.
> >
> >      Please let me know if I can just use MySQL as it is or if I
> > need to make some changes.
>
> MySQL should hold up just fine. I've got a Ruby app backed by a MySQL
> database containing a table now with close to 2 million rows, and
> constantly growing. The performance of the application right now seems
> to be more bounded by the fact that I'm running on a dinky machine
> with slow disk drives and not a lot of memory, but as of now Ruby-DBI
> and ActiveRecord seem to have reasonably acceptable performance.
> Profiling the database access code shows that the application's
> slowdown is not in Ruby but in MySQL, and MySQL itself appears to be
> limited by the hardware we're running it on. As long as you've got a
> reasonable machine, you should be fine.
>
>
