Peña, Botp wrote:

> I think the ".." notation came from "...". But ".." was easier to type
> (you know *nix/pl guys better), so that's the history. But now we use
> "...", hence the confusion the OP raised.

Actually, ".." for a range predates nix/pl--it goes all the way back to 
Pascal, at least... Or it may come from the fact that Algol60 had a 
reference language that was distinct from the so-called implementation 
language--this had to map more-or-less one-to-one with the reference 
language, for any given implementation. The reference language used ":" 
for ranges, but, if I remember correctly, the implementation I used 
allowed ".." as a synonym, just in case you couldn't type ":" on your 
favourite key punch equipment...

So, I disagree: I think ".." came from trying to represent ":" in 
ancient software/hardware environments. Why ":" was used by languages 
like Algol in the first place, I have no idea.

> In the end, it is not advisable to think in English when using a
> programming language. If you program in ruby, just think in ruby.

I'm inclined to agree. I had, however, been reading why's (poignant) 
guide just the other day, and so I was thinking about ruby in 
natural-language terms... His reasoning around ".." vs. "..." is not the 
most compelling, and I prefer the idea that, as a mnemonic at least, I 
can remember that "..." means *something* is elided, whether or not that 
works in English sentence analogues.
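
For what it's worth (and assuming I've read the docs correctly), a quick 
irb check makes the mnemonic concrete: the two-dot range includes its end 
point, and the three-dot range leaves it out, i.e. the end point is the 
*something* that gets elided:

  (1..5).to_a    # => [1, 2, 3, 4, 5]   two dots: end point included
  (1...5).to_a   # => [1, 2, 3, 4]      three dots: end point elided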

Of course, I've never actually had occasion to *use* the "..." operator, 
but then I'm still a ruby newbie.

Bob