gavin, you were quite correct!

while the actual database execution amounted to about the same, the
preparation time showed about a 4x difference in favor of prepare.

inserting 10000 records with the code shown below, prep finished in 103.35 secs,
while sending a slew of sql amounted to 448.01 secs. again, most of that time
was taken up building the giant string.

so now it's time to refactor my database routines....

thanks,
transami

p.s. if you look at the code and wonder what the hell a DBI::Factory is, don't 
sweat it. i added that to DBI to do simple connection pooling under mod_ruby. 
it just returns a DatabaseHandle.
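for the record, the gist of that factory is just a per-DSN handle cache. here
is a stripped-down sketch of the idea, with a generic connector block standing
in for DBI.connect so it runs without a database (the class and names here are
hypothetical, not the actual patch):

```ruby
# minimal sketch of per-DSN handle caching, the idea behind the
# DBI::Factory mentioned above. the connector block plays the role
# of DBI.connect(dsn, user, pass).
class HandleFactory
  def initialize(&connector)
    @connector = connector
    @pool = {}
  end

  # hand back the cached handle for this dsn/user, connecting only once
  def make(dsn, user, pass)
    @pool[[dsn, user]] ||= @connector.call(dsn, user, pass)
  end
end
```

under mod_ruby the win is that repeated requests reuse one live connection
instead of paying the connect cost every hit.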

CODE:

def stat_go(dbh)
  sql = ''
  10000.times do |i|
    # += builds a fresh copy of the whole batch each pass -- the slow part
    sql += %Q{INSERT INTO acct (header, number, name, type, gifi) VALUES (false, '1001', 'Sales Tax', 'Liability', '#{i}');\n}
  end
  dbh.do(sql)
  dbh.disconnect
end

def prep_go(dbh)
  recs = []
  10000.times do |i|
    recs << [false, '1001', 'Sales Tax', 'Liability', i.to_s]
  end
  prp = "INSERT INTO acct (header, number, name, type, gifi) VALUES (?, ?, ?, ?, ?)"
  dbh.prepare(prp) do |sth|
    recs.each do |r|
      sth.execute(*r)
    end
    dbh.commit
  end
  dbh.disconnect
end

dbh = DBI::Factory.make(DB_DSN, DB_USER, DB_PASS)

stat_go(dbh)
#prep_go(dbh)  # run one at a time; each routine disconnects the handle