Hugh Sasse wrote:
> On Tue, 29 Nov 2005, Robert Klemme wrote:
>
>> Hugh Sasse wrote:
>>> On Tue, 29 Nov 2005, Robert Klemme wrote:
>>>
>>>> If Hugh is using ActiveRecord intensively with a database then it's
>>>> most likely that he'll see no positive performance effect from
>>>> compiling it with more aggressive optimization.
>>>>
>>>> In fact it's likely that careful optimization on the database side
>>>> will yield better results.  This can be as easy as creating some
>>>> indexes - but might be much more complicated - depending on the
>>>> bottleneck.  (Often it's IO and this might have several reasons,
>>>> from sub-optimal execution plans to slow disks / controllers.)
>>>>
>>> At the moment my script to populate the tables is taking about an
>>> hour.  Anyway it's mostly Ruby, I think, because it spends most of
>>> the time setting up the arrays before it populates the db with them.
>>
>> How did you measure that?
>
> By eye! :-)  The code doesn't access the database at all until the
> last part, and it doesn't get there till about 45 mins.  But to be
> honest, this is so slow it isn't worth benchmarking to get the
> milliseconds.

Wow!  In that case it certainly seems to make sense to optimize that.  Did
you keep an eye on memory consumption and disk IO?  Could well be that the
sheer amount of data (and thus memory) slows your script down.

>      555    1676   17179 /home/hgs/csestore_meta/populate_tables2.rb
> I could post the script if you like. I've not profiled it to find
> out where the slow bits are because it would take about 5 hours
> going by previous slowdowns when profiling.
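A cheap alternative to a full profile would be timing each major phase with the
standard Benchmark library -- much less overhead than the profiler.  A rough
sketch (the phase names and row counts here are invented, not from your script):

```ruby
require 'benchmark'

# Time each phase of the load separately instead of profiling every call.
timings = {}

timings[:build] = Benchmark.realtime do
  # stand-in for the array-building work that dominates the run time
  @rows = (1..10_000).map { |i| [i, "name-#{i}"] }
end

timings[:load] = Benchmark.realtime do
  # stand-in for the database writes at the end of the script
  @rows.each { |id, name| }
end

timings.each { |phase, secs| printf("%-6s %8.3fs\n", phase, secs) }
```

That would at least confirm whether the array setup or the ActiveRecord part
is eating the 45 minutes, without the 5-hour profiling run.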

Unfortunately we're close to release and I don't really have much time to
look into this deeper.  If anyone else volunteers...

> MySQL.  Part of the problem is that this script is also for
> updating, based on new data.  If the db is empty it just inserts,
> else it updates.  Easy enough in ActiveRecord.

Ok, that's bad for bulk loading, though.
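One option with MySQL would be to batch many rows into a single
INSERT ... ON DUPLICATE KEY UPDATE statement instead of one ActiveRecord save
per record.  Untested sketch -- the "parts" table and its columns are invented
for illustration, and the quoting is naive (use the driver's real escaping for
anything serious):

```ruby
# Build one multi-row upsert statement for MySQL.  The first column is
# assumed to be the unique key; all other columns get updated on conflict.
def bulk_upsert_sql(table, columns, rows)
  values = rows.map do |row|
    "(" + row.map { |v| v.is_a?(String) ? "'#{v.gsub("'", "''")}'" : v.to_s }.join(", ") + ")"
  end.join(",\n")
  updates = columns[1..-1].map { |c| "#{c} = VALUES(#{c})" }.join(", ")
  "INSERT INTO #{table} (#{columns.join(', ')})\n" \
  "VALUES\n#{values}\n" \
  "ON DUPLICATE KEY UPDATE #{updates}"
end

puts bulk_upsert_sql("parts", %w[part_no qty], [[1, 10], [2, 5]])
```

That keeps the insert-or-update semantics but cuts the per-row round trips.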

Kind regards

    robert