Martin Bosslet <Martin.Bosslet / googlemail.com> wrote:
> The problem is in lib/openssl/buffering.rb:
> 
>     def do_write(s)
>       @wbuffer = "" unless defined? @wbuffer
>       @wbuffer << s
>       @sync ||= false
>       if @sync or @wbuffer.size > BLOCK_SIZE or idx = @wbuffer.rindex($/)
>         remain = idx ? idx + $/.size : @wbuffer.length
>         nwritten = 0
>         while remain > 0
>           str = @wbuffer[nwritten,remain]
>           begin
>             nwrote = syswrite(str)
>           rescue Errno::EAGAIN
>             retry
>           end
>           remain -= nwrote
>           nwritten += nwrote
>         end
>         @wbuffer[0,nwritten] = ""
>       end
>     end
> 
> remain gets initialized with @wbuffer.length, the string length in
> characters, but nwrote receives the actual number of bytes written, so
> fewer bytes than are actually available get written.
> 
> A fix for this would be to treat @wbuffer strictly as binary data by
> forcing its encoding to BINARY. I'm not sure; does anyone see a more
> elegant way, or would this solution suffice?
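
Right: String#length counts characters while IO#syswrite returns the
number of bytes written, so the two counters drift apart as soon as
multibyte data lands in @wbuffer.  A quick illustration (hypothetical
strings, not from buffering.rb):

    # in a UTF-8 source file
    s = "r\u00e9sum\u00e9"    # "resume" with accents: 6 characters, 8 bytes
    s.length                  # => 6 (characters)
    s.bytesize                # => 8 (bytes)
    # syswrite reports bytes, so seeding `remain` with s.length
    # undercounts by 2 here, and the slice s[nwritten, remain]
    # indexes characters while nwrote counts bytes.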

I use an "-*- encoding: binary -*-" comment at the top of all Ruby
source files where I initialize string literals for storing binary data.
It's cleaner than setting Encoding::BINARY on every string I create
(and nearly all my code works exclusively on binary data).
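
Something like this (hypothetical file, just to show the effect):

    # -*- encoding: binary -*-
    # With the magic comment above, every string literal in this file
    # is ASCII-8BIT, so length == bytesize for all of them.
    buf = ""
    buf.encoding       # => #<Encoding:ASCII-8BIT>
    buf << "\xff\xfe"  # raw bytes append cleanly, no transcoding
    buf.length         # => 2, same as buf.bytesize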

Also, all of the Ruby (non-SSL) *Socket objects have Encoding::BINARY by
default anyway, so I think SSLSocket should be the same.
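
If forcing the encoding on the buffer is the way to go, a rough sketch
of what it might look like in do_write (untested, just to make the idea
concrete):

    def do_write(s)
      unless defined? @wbuffer
        @wbuffer = ""
        # keep the write buffer strictly binary, like the plain *Socket
        # classes, so character counts and byte counts always agree
        @wbuffer.force_encoding(Encoding::BINARY)
      end
      # force the incoming chunk to binary too, so << never raises
      # Encoding::CompatibilityError on mixed-encoding input
      @wbuffer << s.dup.force_encoding(Encoding::BINARY)
      # ... rest unchanged; `remain` now effectively counts bytes,
      # because the buffer's length equals its bytesize
    end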