Hi, KUBO

Thank you very much for the reply. I got the following error after I
applied the patch:
/usr/local/ruby-1.8/lib/ruby/site_ruby/1.8/oci8.rb:656:in `bindByName':
Not supported type (8)

I changed bind_param to 
c.bind_param(':data', data, OCI8::SQLT_LNG)

If I do not change it, I end up with truncated data.
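
For reference, the whole insert now looks roughly like this (same table and data as in my original message quoted below):

require 'oci8'

db = OCI8.new 'system', 'manager'
data = 'A' * 1024 * 100

c = db.parse('insert into sample values (2, :data)')
c.bind_param(':data', data, OCI8::SQLT_LNG)  # explicit type; without it the data is truncated
c.exec
db.commit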

As for changing "datasize = 65535", I set it to "datasize = 500000". However,
when I query a database that was populated by other means and whose long
columns hold about 200K of data, I always get back a string of 65535 bytes.
The limitation must be enforced somewhere else (inside OCI itself?).
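
Just to show what I mean, the query side is nothing fancy. Roughly (I am reusing the sample table names from below for illustration; the real table is loaded by another tool):

require 'oci8'

db = OCI8.new 'system', 'manager'
cursor = db.exec('select data from sample where id = 2')
row = cursor.fetch
puts row[0].length   # always reports 65535, although the column holds about 200K
db.logoff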

Thanks again for trying to help. I ended up using OTL (C++) for
populating and updating long columns, as the pure OCI interface simply
overwhelms me. It would be great, though, to have support for this in
Ruby OCI8 sometime later.

Sincerely,
Gennady Bystritsky.

-----Original Message-----
From: KUBO Takehiro [mailto:kubo / jiubao.org] 
Sent: Wednesday, January 18, 2006 10:15
To: ruby-talk ML
Subject: Re: Populating 'long' data column with Ruby OCI8 (trying again,
need a solution badly)

"Gennady Bystritsky" <Gennady.Bystritsky / quest.com> writes:

> Hello,
>
> Is there any way to insert a big chunk of data (say, 100K) into a
> column of type long with Ruby OCI8, or by any other Ruby means? I saw
> that to do it in C you must be prepared to handle an OCI error code
> inviting you to insert another piece. But how can this be done in Ruby,
> especially with OCI8? If I do
>
> require 'oci8'
>
> db = OCI8.new 'system', 'manager'
> db.exec("create table sample (id number, data long)")
>
> data = 'A' * 1024 * 100
>
> c = db.parse('insert into sample values (2, :data)') 
> c.bind_param(':data', data)
>
> c.exec
> db.commit
>
> What I end up with in column 'data' is host dependent (or db block size
> dependent?). I observed 14464 bytes on a 2K-block database on Solaris,
> and 34652 bytes on an 8K-block database on Linux.

I don't know why. It may need piecewise inserts to insert long values.

I've not tested the following patch. It may work, but it may not...
--- oci8.rb.bak	2005-11-13 16:21:33.000000000 +0900
+++ oci8.rb	2006-01-19 03:07:00.000000000 +0900
@@ -264,6 +264,14 @@
       end
     end
 
+    # get/set LONG
+    LONG = Object.new
+    class << LONG
+      def fix_type(env, val, length, precision, scale)
+        [OCI8::SQLT_LNG, val, length || (val.nil? ? nil : val.length)]
+      end
+    end
+
     # get/set RAW
     RAW = Object.new
     class << RAW
@@ -867,7 +875,7 @@
   # datatype        type     size prec scale
   # -------------------------------------------------
   # LONG          SQLT_LNG      0    0    0
-  BindType::Mapping[OCI8::SQLT_LNG] = BindType::String
+  BindType::Mapping[OCI8::SQLT_LNG] = BindType::LONG
 
   # datatype        type     size prec scale
   # -------------------------------------------------
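
If the patch does work, the explicit type argument to bind_param is what should pick up the new BindType::LONG entry, i.e. something like:

c.bind_param(':data', data, OCI8::SQLT_LNG)

But again, I've not tried it.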


> UPDATE: It is not possible to read (with OCI8) columns of type long if
> they contain large data chunks (100K). Reported error is:
> `fetch': ORA-01406: fetched column value was truncated

How about editing oci8.rb at line 693 as follows:

from:
      when SQLT_LNG, SQLT_LBI
        # TODO: use OCI_DYNAMIC_FETCH
        datasize = 65535
      end
to:
      when SQLT_LNG, SQLT_LBI
        # TODO: use OCI_DYNAMIC_FETCH
        datasize = 100 * 1024
      end

--
KUBO Takehiro
email: kubo / jiubao.org
web:   http://www.jiubao.org
GnuPG fingerprint = 5F7B C8EF CA16 57D0 FDE1  9F47 C001 1F93 AC08 2262