Issue #13085 has been reported by Eric Wong.

----------------------------------------
Bug #13085: io.c io_fwrite creates garbage
https://bugs.ruby-lang.org/issues/13085

* Author: Eric Wong
* Status: Open
* Priority: Normal
* Assignee: 
* Target version: 
* ruby -v: 
* Backport: 2.2: UNKNOWN, 2.3: UNKNOWN, 2.4: UNKNOWN
----------------------------------------
Relying on rb_str_new_frozen for unconverted strings does not
save memory, because copy-on-write is always triggered in
read-write I/O loops where subsequent IO#read calls clobber
the given write buffer.

  buf = ''.b
  while input.read(16384, buf)  # each read reuses/clobbers buf in place
    output.write(buf)           # but write takes a hidden frozen copy (rb_str_new_frozen)
  end

This loop generates a lot of garbage starting with Ruby 2.2 (r44471).
For my use case, even IO.copy_stream generates garbage, since
I wrap "write" to do the Digest calculation in a single pass.

I tried using rb_str_replace and reusing the string as a hidden
(klass == 0) thread-local, but rb_str_replace attempts CoW
optimization by creating new frozen objects, too:

  https://80x24.org/spew/20161229004417.12304-1-e@80x24.org/raw


So I'm not sure what to do; temporary locking (rb_str_locktmp)
seems wrong for writing strings (I guess it's meant for reads?).
I get test_threaded_flush failures with the following patch:

  https://80x24.org/spew/20161229005701.9712-1-e@80x24.org/raw


IO#syswrite has the same garbage problem.  I guess I can use
IO#write_nonblock on fast filesystems while holding the GVL...
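
For reference, this is the sort of fallback I mean (a sketch only;
full_write is a made-up helper and it assumes writes normally
complete in one call):

  require 'io/wait'

  # made-up helper: loop until buf is fully written, waiting for
  # writability instead of blocking inside a single write call
  def full_write(io, buf)
    until buf.empty?
      case w = io.write_nonblock(buf, exception: false)
      when :wait_writable
        io.wait_writable
      else
        buf = buf.byteslice(w..-1)   # partial write; retry the rest
      end
    end
  end

  full_write(output, buf)

(byteslice on a partial write still allocates, so this only helps
when writes usually finish in one call.)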



