On 07-02-2013 19:11, Matthew Kerwin wrote:
> On 7 February 2013 23:09, Rodrigo Rosenfeld Rosas wrote:
> > Enums have two goals in such languages. Improving performance and
> > reducing memory footprint is one of them. The other one is to help the
> > compiler to find errors at compile time by restricting the input type in
> > some functions/methods and variables. I don't really understand how this
> > is relevant to this discussion.
>
> No, no no no. Enums exist because they are identifiers.  They are 
> symbolic representations of a concept that either does not necessarily 
> otherwise exist in the code, or does not have to be fully instantiated 
> in order to discuss it.  That is exactly what Symbols are.

If you really believe symbols are similar to enums, I guess you haven't 
done much C, C++ or Java programming with enums. Here is the main 
reason why enums exist. First, note that C and Java implement 
enums in different ways.

C example:

typedef enum {HEAD, TAIL} coin_side;
coin_side my_coin = HEAD; // my_coin = 0 would also work here

If you try to create another typedef like this, the compiler will 
complain that HEAD is already declared:

typedef enum {HEAD, TAIL} alternate_coin_side;

But if you do something like this:

typedef enum {PAPER, ROCK, SCISSORS} game;
coin_side my_coin = PAPER;

The C compiler won't complain. But Java takes a different approach when 
it comes to enums:

class MyClass {
   enum Game {PAPER, ROCK, SCISSORS};
   enum CoinSide {HEAD, TAIL};
   void test(){
     Game a = Game.PAPER;
     Game b = CoinSide.HEAD; // won't compile!
   }
}

In that sense, if you write a method accepting a coin side, you can only 
pass in a CoinSide enum: not a number, not another enum type. This is 
what I understand by enums. They are not related to symbols at all, in 
my opinion.
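To make the contrast concrete in Ruby terms (my own illustration, with made-up names): a symbol is just an interned name with no type attached to it, so the mix-up that the Java compiler rejects at compile time can only be caught at runtime, and only if you add the check yourself:

```ruby
# Symbols carry no type information: nothing stops a caller from
# passing a "game move" symbol where a "coin side" is expected.
COIN_SIDES = [:head, :tail]

def flip_result(side)
  # Only this hand-written runtime check catches the mistake; the
  # equivalent enum mix-up in Java is a compile-time error.
  raise ArgumentError, "not a coin side: #{side.inspect}" unless COIN_SIDES.include?(side)
  side == :head ? "Heads!" : "Tails!"
end

flip_result(:head)    # => "Heads!"
# flip_result(:paper) # would raise ArgumentError, but only at runtime
```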

> > They don't spend their time thinking about whether they should be using
> > symbols or strings. They don't WANT to worry about it!
>
> Your overarching goal is to coddle developers who write code without 
> understanding either the language, or the concepts behind their own 
> code.  There is a massive philosophical disjunct here between you and 
> I, and I think it will never be overcome.

It is a matter of choosing the right tool. If you're really concerned 
about the small performance improvements you might get by using 
symbols instead of strings, I would question whether Ruby is really the 
right language for you.

I'd never consider Ruby or Java for writing hard-real-time applications, 
the same way I wouldn't consider Windows or plain Linux for such a task. 
I'd most likely use C and Linux + the Xenomai patch (if building for 
desktops) instead.

When I'm writing Ruby code I don't want to worry about micro performance 
improvements. The smallest interval I would probably care about 
optimizing in Ruby is around 100ms. It's only when writing C programs 
for real-time tasks, which must complete complex work in very short 
times, that I care about microseconds.

> > I agree it is unlikely to happen. What about another syntax: {{a: 1}} =>
> > {'a' => 1}? Maybe it would be worth asking for some syntax change
> > like this one. We could even add interpolation to it:
> > {{"value #{computed}": 1}}.
>
> You'd probably be more likely to succeed with a new %string-style 
> notation, like %h{a:1, b:2}.  Although then again, possibly not.

That's an idea, yes, but I guess I prefer Thomas' suggestion of using 
Map(a: 1, b: 2).
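For what it's worth, Map(...) wouldn't even need new syntax; it could be an ordinary method, in the spirit of Kernel#Integer. Here is a minimal sketch of what I imagine it could look like (the method name comes from Thomas' suggestion, but the conversion behavior shown is my assumption, not his actual proposal):

```ruby
# Hypothetical Map() helper: accepts the convenient symbol-key
# literal syntax and returns a Hash with string keys.
def Map(hash)
  hash.each_with_object({}) { |(key, value), result| result[key.to_s] = value }
end

Map(a: 1, b: 2)  # => {"a" => 1, "b" => 2}
```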