Seebs wrote:
> It makes more sense to me that
> "foo"[1] == "o" than that "foo"[1] = 111.

Languages that have a character data-type return foo[1] as a 
character. OTOH, languages that don't have one (all the scripting 
languages I know of) return foo[1] as a string.
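To make the contrast concrete, here is a small sketch of Ruby's two behaviors (run under a 1.9-or-later Ruby; under 1.8 the same indexing expression yields the integer code point instead):

```ruby
s = "foo"

# Ruby 1.9+: indexing returns a one-character String
puts s[1]        # prints "o" (under 1.8 this would have printed 111)
puts s[1].class  # String (under 1.8: Fixnum)

# The old 1.8 result is still reachable explicitly:
puts s.bytes[1]  # 111, the code point of "o"
```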

The reason Ruby behaved differently (up to 1.8) from other scripting 
languages is, I think, that it was influenced by LISP, which does have 
a character data-type. The "proof" is that Ruby even copies LISP's 
character literal (a question mark followed by a raw character).
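The ?-literal itself survives in 1.9, but its value changed along with indexing; a quick check (again assuming 1.9+):

```ruby
# Through 1.8, ?o evaluated to the integer 111;
# from 1.9 on it yields a one-character String instead.
c = ?o
puts c        # prints "o" in 1.9+ (111 in 1.8)
puts c.class  # String
```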

>
> I think the reason you need a single-character-string now is that
> things like UTF-8 may make it ambiguous what the next "character" is

And because, IIUC, each string in 1.9 carries an encoding (that's what 
makes it different from many other languages, which always represent 
strings in Unicode). An integer doesn't have a "place" to store an 
encoding, so you have to use a string object.
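This point can be shown directly in irb; a small sketch (the exact encoding printed depends on your source/default encoding, assumed UTF-8 here):

```ruby
s = "foo"

# A 1.9+ String carries its own encoding object...
puts s.encoding       # e.g. UTF-8

# ...and so does the one-character String returned by indexing.
puts s[1].encoding    # same encoding as the parent string

# A bare integer code point has nowhere to store that information;
# converting it back to a string requires naming an encoding explicitly.
puts 111.chr(Encoding::UTF_8)            # "o"
puts 111.chr(Encoding::UTF_8).encoding   # UTF-8
```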

-- 
Posted via http://www.ruby-forum.com/.