On 7 February 2013 20:46, rosenfeld (Rodrigo Rosenfeld Rosas) wrote:
> I agree that a string is what I want in all cases. That is exactly why I
> don't feel the need for symbols. If symbols are just really required as
> a fundamental implementation detail of the MRI implementation, then I
> don't think it is a good reason to justify keeping them in the language
> level. Just find other ways to optimize methods/etc lookup in the
> internal MRI code. This should be a separate discussion from the language
> design itself.
>
> I'd really prefer you to think if symbols are really a good thing to
> have in the design of the Ruby language if you forget about all
> performance impacts it might have on the MRI implementation details.

Ok, methods.  They have a bucket of queriable information (a Method
instance), and they have a symbolic representation (a Symbol).  I don't
want to have to instantiate an entire Method object (or a whole bunch of
them) every time I want to talk to an object abouts its methods; I just
want a single, simple, universal token that represents that (or those)
method(s).

Sorry, that's a performance optimisation detail.  Ok, I don't want to have
to instantiate a Method object that potentially doesn't have a
corresponding method.  That could be confusing.
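Both points can be seen directly in Ruby.  Here's a minimal sketch (the Greeter class and hello method are invented for illustration): the Symbol is a bare token you can use to talk about a method, the Method object is the full bucket of queriable information, and the token can exist even where no corresponding method does.

```ruby
# A Symbol is a cheap, universal token naming a method; a Method object
# is the full queryable representation.  (Greeter/hello are
# hypothetical names, used only for this sketch.)
class Greeter
  def hello(name)
    "Hello, #{name}!"
  end
end

g = Greeter.new

# Talking to an object about its methods via a bare token:
puts g.respond_to?(:hello)   # true

# Instantiating the whole Method object in order to query it:
m = g.method(:hello)
puts m.arity                 # 1
puts m.owner                 # Greeter

# The token can exist without a corresponding method; the Method
# object cannot:
puts :no_such_method.inspect   # fine -- it's just a name
# g.method(:no_such_method)    # would raise NameError
```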

You will now argue that I could as easily use a String as a Symbol, and
yes, ignoring performance and implementation details that is true.  But I
don't want to write code that performs poorly.  If, in this case, exposing
implementation details makes my code easier and better, then by the gods,
let me use it.  It is then up to me not to misuse it.  Similarly: why have
any numeric class that isn't Rational?

And for the record: "I don't ever want to use ClassX so let's remove it"
is, frankly, silly.

> Then, still forgetting about performance and internal implementation
> details, try to reason why :symbol != 'symbol' is useful in Ruby just
> like a[:a] != a['a']. I've been using Ruby for several years now and I
> can tell you for sure that people often want them to behave the same and
> they don't want to worry about performance impact either.

Ok, completely philosophically, without any reference to performance or
implementation details, why is a Java enum not equivalent to (or auto-cast
to and from) a Java string?  An enum is just a token, yes?  It looks like a
string; it often spells a word that people might like to read, even
capitalise.  But it's not a string.  It's something else.  A Symbol is
exactly like that enum.

I, too, have been using Ruby for several years now; and I, too, have seen a
lot of people wanting Symbol and String to behave the same.  Hells, at
times even I have wanted that.  But the simple fact is:  those people
(myself included) are wrong.  If they want a String, use a String.  If they
want to force a Symbol-shaped peg into a String-shaped hole, then they'll
have to do whatever hoop-jumping is required; exactly as if you want a Java
enum to support implicit coercion to and from a string.
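Concretely, the hoop-jumping amounts to nothing more than an explicit conversion (a minimal sketch):

```ruby
# Symbols and Strings are distinct classes and never compare equal;
# coercion between them is always explicit.
sym = :status
str = "status"

puts sym == str          # false -- different classes, never equal
puts sym.to_s == str     # true  -- explicit Symbol -> String
puts str.to_sym == sym   # true  -- explicit String -> Symbol
puts sym.class           # Symbol
puts str.class           # String
```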

> People just don't know when to use symbols and strings.

Bingo.  Your solution is: hide Symbols from those people.  My solution is:
don't change anything; maybe eventually enough people will learn that the
two classes are, in fact, different.

> Take the Sequel library for instance.

No thanks, apparently the authors don't know the difference between Symbols
and Strings.

> We all know how symbols are different from strings,

Well apparently not, otherwise this would be a non-issue.

> it doesn't help repeating it all the way.

Perhaps I believe that if I say it enough times, in enough places, people
might actually notice.  And maybe even listen.

> I'd prefer that you focus on explaining why you think keeping symbols a
> separate beast is of any usefulness

I'll choose to interpret that as "... why I think keeping symbols at all
...".  Simply: because they're already here.  Relegating them to an
implementation detail and hiding them from the language will only break
100% of existing code.  Some of that code is good code.  Is it worth
breaking everything so a bunch of people can't accidentally use ClassX when
they should be using ClassY?

-- I'll inject this later comment here, because it's topical:
> Also, so that you stop arguing that the differences between symbols and
> strings are just like the differences between strings and integers
> (non-sense), notice that HashWithIndifferentAccess makes this distinction:
> [...]
> Since you don't see any popular hash implementation that will consider
> h[1] == h['1'] (like JavaScript), you could take the conclusion that
> people only really care that string behave differently than symbols.

Yes, but those are people who don't know the difference between Symbols and
Strings.  Just because they don't know it, doesn't make it untrue.
Personally I've never used HashWithIndifferentAccess, or needed to.

Incidentally those people don't want a Hash at all.  They want an
associative array, one that uses something like <=> or === to compare keys
(instead of #hash and #eql?).  If RBTree was more mature,
HashWithIndifferentAccess wouldn't be needed.  Shall we repeat this
discussion, but this time about Hash and assoc.array instead of Symbol and
String?
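For what it's worth, the indifferent-access behaviour those people want is trivial to build explicitly on top of Hash.  Here is a toy sketch that conflates Symbol and String keys by normalising everything to String (the IndifferentMap name is invented; this is not HashWithIndifferentAccess itself, just one way to get similar behaviour):

```ruby
# Toy "indifferent access" map: deliberately conflates Symbol and
# String keys by normalising every key to String on the way in and
# on the way out.  (IndifferentMap is a hypothetical name.)
class IndifferentMap
  def initialize
    @store = {}
  end

  def []=(key, value)
    @store[key.to_s] = value
  end

  def [](key)
    @store[key.to_s]
  end
end

m = IndifferentMap.new
m[:a] = 1
puts m["a"]   # 1 -- Symbol and String keys conflated, by choice
puts m[:a]    # 1
```

The point being: the conflation lives in an explicit wrapper, not in the language.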
--

> This is what I do but I don't control other libraries.

This is true of _any_ issue in a library.  If you think the library's
benefits outweigh its costs, then you use the library.  If the fact that
the authors erroneously conflate Symbols and Strings is outweighed by the
fact that it's otherwise extremely useful, it's up to you to work around
the shortcomings.  Just like if some otherwise brilliant library uses 0
instead of nil, or something.

> Anyway this makes the new sexy hash syntax almost unuseful to me since
> strings is what I want most of the times.

So, like I said before, just don't use it.

> And I really do hate the general syntax for hashes. The new one is more
> compact and takes me much less type to type and it is also similar to
> what most languages do (JavaScript, Groovy, etc).

The general syntax served us well enough through 1.8 and 1.9.  Personally I
preferred being able to use `:` at the end of if/when/etc. statements.  If
I want to use JavaScript syntax, there's always node.js.

> The difference is that in the other languages a string is used since
> they don't have the symbols concept.

That's a good point.  I'd love to be able to change the new syntax so {a:1}
meant {'a'=>1}, but that's not going to happen.  As such, in your eyes and
mine, the new syntax is useless for most intents and purposes, so we might
as well keep on writing Ruby code the way we always have (with `=>` tokens).
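To spell out why the shorthand is no use when you want String keys (a minimal sketch, under current MRI semantics):

```ruby
# The 1.9 shorthand always produces Symbol keys; String keys have no
# shorthand and still require the traditional => form.
shorthand = { a: 1 }
explicit  = { :a => 1 }
stringy   = { 'a' => 1 }

puts shorthand == explicit   # true  -- {a: 1} is sugar for {:a => 1}
puts shorthand == stringy    # false -- Symbol key, not String key
puts shorthand.key?('a')     # false
```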


P.S. sorry to anyone else who is sick of this conversation, but I think it
needs to be had.  Let us know if we should take it offline somewhere.