On Fri, 9 Feb 2007, Ezra Zygmuntowicz wrote:

>
>
> 	ActiveRecord will start to consume a huge amount of memory and cpu if 
> you fetch more than a thousand or so records at a time. This is because it 
> loads all results into memory at once and then wraps each row in an AR 
> object, which is expensive with the amount of records being talked about in 
> this thread. There is an AR plugin somewhere called paginating find that 
> works sort of like a cursor but doesn't use a cursor; it does limits and 
> offsets.
>
> 	But I do think there is an ActiveRecordExtensions[1] project that 
> fixes some of these issues as well.
>
> Cheers-
> -- Ezra Zygmuntowicz-- Lead Rails Evangelist
> -- ez / engineyard.com
> -- Engine Yard, Serious Rails Hosting
> -- (866) 518-YARD (9273)
>
> [1]  http://rubyforge.org/projects/arext/
>

hey ezra-

fyi, nearly all the database apis support

   db.execute(sql) do |tuple|
     p tuple
   end

which does the obvious - only one tuple is in memory at a time.  i sent in a
patch for rails to use any given block as meaning: construct the AR object and
yield it, one at a time - it largely consisted of a bunch of '&b's - but
there was no interest at the time.  i'm shocked that this is still an issue in
the rails core.  haven't people been hitting this as rails is used on bigger
projects with bigger databases?
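to make the idea concrete, here's a rough sketch of what i mean - if a block
is given, wrap and yield each record as its row is read, otherwise fall back
to collecting them all.  the names (FakeAdapter, Record, find_by_sql) are
stand-ins for illustration, not the actual rails internals:

```ruby
# Stands in for a DB driver whose execute yields one tuple at a time.
class FakeAdapter
  def execute(sql)
    [{ "id" => 1 }, { "id" => 2 }, { "id" => 3 }].each { |tuple| yield tuple }
  end
end

# Stands in for an ActiveRecord model instance.
Record = Struct.new(:attributes)

# With a block: yield one wrapped record at a time (constant memory).
# Without a block: build the whole array, as AR does today.
def find_by_sql(adapter, sql, &block)
  if block
    adapter.execute(sql) { |tuple| block.call(Record.new(tuple)) }
    nil
  else
    records = []
    adapter.execute(sql) { |tuple| records << Record.new(tuple) }
    records
  end
end

ids = []
find_by_sql(FakeAdapter.new, "select * from things") { |rec| ids << rec.attributes["id"] }
```

the point being that the block form never holds more than one record, while
the blockless form still behaves exactly as before.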

have you used any of the patches/extensions?
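fwiw, the limit/offset "paginating find" approach you mention would look
roughly like this - fetch_page here is a stand-in for a real AR query with
:limit/:offset, so only batch_size rows are in memory per round trip:

```ruby
ROWS = (1..10).map { |i| { "id" => i } }   # pretend table

# Stands in for SELECT ... LIMIT ? OFFSET ? against a real database.
fetch_page = lambda do |limit, offset|
  ROWS[offset, limit] || []
end

# Walk the whole table a page at a time, yielding each row.
def each_in_batches(batch_size, fetch_page)
  offset = 0
  loop do
    page = fetch_page.call(batch_size, offset)
    break if page.empty?
    page.each { |row| yield row }
    offset += batch_size
  end
end

seen = []
each_in_batches(3, fetch_page) { |row| seen << row["id"] }
```

not as clean as a real cursor - you pay one query per page, and you can miss
or double-see rows if the table changes between pages - but it keeps memory
flat without needing driver support.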

ps.  wore my engine yard shirt today!

regards.

-a
-- 
we can deny everything, except that we have the possibility of being better.
simply reflect on that.
- the dalai lama