On Sat, Dec 31, 2005 at 05:03:20AM +0900, Austin Ziegler wrote:
> 
> Please edit your guide, as it is *not* correct when it says:
> 
>    A Ruby symbol is a thing that has both a number (integer) and a string.
> 
> This is not correct in any way. If you leave it as "A Ruby Symbol is a
> thing", you're fine. But when you say that it "has" a number and a
> string, you are incorrect. Symbols are not composed of the integer or
> string values that you can convert Symbols to; they just are. The
> value of a Symbol is not its String value or its Fixnum value, but
> itself.

That depends on how one understands the word "has" in this context.  As
you say, a symbol does not contain, or consist of, an integer and a
string as component parts.  On the other hand, a symbol does "have" both
a string and an integer associated with it, if and when you place it in
the right context.  The string association springs into being when one
uses .to_s, and the integer association exists because of the hash table
in which symbols are stored "behind the scenes".  One might even make a
case for the string association existing from the word go, but I have no
idea how that could be explained succinctly.
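
To make that concrete, here is a minimal irb illustration of both
associations (assuming Ruby 1.8, where Symbol#to_i exists; the exact
integer is an example and varies from run to run):

    :foo.to_s   # => "foo"  -- the string association
    :foo.to_i   # => some Fixnum (e.g. 3979); the value comes from the
                #    internal symbol table and varies between runs
    :foo.to_i == :foo.to_i   # => true -- stable within a single run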


> 
> If you would like, I can see about possibly helping you edit this or
> coauthor it such that it might become a useful article to be published
> at Ruby Code & Style to explain where Symbols can be used and what
> they are. Ultimately, though, trying to dig into the internals of
> Symbols misses the point -- it's just something that represents
> itself.

I'm afraid I disagree with the implication that what symbols do from
Ruby's perspective is not useful information.  Simply saying that a
symbol is "something that represents itself" might explain it perfectly
to you, as it might to legions of (for instance) Python programmers used
to using names rather than variables, or whatever it is that Python
programmers do, but it clearly didn't help Steve or me very much.  This
indicates that another approach is needed, at least sometimes: one that
involves understanding *why* symbols act the way they do in context, so
that their behavior in code can be predicted while one is writing that
code.
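
For instance, knowing that every occurrence of a given symbol refers to
one and the same underlying entry lets you predict identity comparisons
that tend to surprise people coming from strings -- a minimal irb
illustration, using nothing beyond stock Ruby:

    :foo.equal?(:foo)       # => true   -- every :foo is the same object
    "foo".equal?("foo")     # => false  -- each "foo" literal is a new String
    :foo.object_id == :foo.object_id     # => true
    "foo".object_id == "foo".object_id   # => false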

. . . and please don't point out that symbols don't "do" anything, or
have any "behavior".  You (should) know what I mean from context.

-- 
Chad Perrin [ CCD CopyWrite | http://ccd.apotheon.org ]

"Real ugliness is not harsh-looking syntax, but having to
build programs out of the wrong concepts." - Paul Graham