On Thursday 28 June 2001 06:49, you wrote:
> Well, I'm not certain what the exact reason is, but in C, a char *is* an
> integer, but with a smaller range than an int.  In Ruby, it is the same
> way.  If you ask for "foo"[0] then you get the character ?f.  Since ?f is
> really just an integer, though, printing it out gives you a number.

One of the problems with this is that it makes working with non-ASCII strings 
difficult.  You have to adopt an entirely different metaphor to deal with 
UTF-16 strings, or other non-ASCII encodings.  This is one thing that I 
believe Java did right the first time.

The reason that I dislike the way Ruby handles Strings (in general) is 
that Strings aren't really OO objects -- they don't hide their 
implementation details.  You know that a String is just an array of integers 
with the ASCII encoding, and you have to deal with this fact through the API.
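
You can see the leak directly (unpack is just one way to expose the byte 
values; the point is that the integers are right there under the surface):

```ruby
# A String is, underneath, a sequence of ASCII byte values.
bytes = "foo".unpack( 'C*' )
p bytes		# [102, 111, 111] -- ?f, ?o, ?o as plain integers
```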

I'd rather see something like an explicit conversion class for Strings to 
specific encodings, and have String hide the implementation details.  
some_string[x] should return a String.  This would make Unicode handling 
easier.

class ASCII < Encoding
	# Creates a String-ish object in ASCII, if possible
	def initialize( some_string )
		#...
	end
	def []( int )	# Gets you an int
		# ...
	end
	def each	# Iterates on the characters
		# ...
	end
end

Or something like this.
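
To make the sketch concrete, here is a runnable version.  Everything below 
is my own assumption -- the internals, the ArgumentError, and the base class 
name (renamed to StrEncoding so it can't clash with anything called 
Encoding elsewhere):

```ruby
# Hypothetical base class for all encodings; assumed, not a real API.
class StrEncoding; end

class ASCII < StrEncoding
	# Creates a String-ish object in ASCII, if possible
	def initialize( some_string )
		@bytes = some_string.unpack( 'C*' )
		# Assumed behavior: reject anything outside 7-bit ASCII.
		raise ArgumentError, "not ASCII" if @bytes.any? { |b| b > 127 }
	end

	def []( int )	# Gets you an int
		@bytes[ int ]
	end

	def each	# Iterates on the characters
		@bytes.each { |b| yield b.chr }
	end
end

s = ASCII.new( "foo" )
p s[0]			# 102 -- the integer lives in the encoding view
s.each { |c| print c }	# foo -- character-wise iteration
```

The String itself would then be free to return a one-character String from 
some_string[x], while anyone who really wants the integers asks an 
encoding object for them.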

While I'm complaining, I prefer Java's IO class hierarchy to Ruby's, too.  ;-)

--- SER