2009/8/19 David Masover <ninja / slaphack.com>

> On Monday 17 August 2009 08:46:25 am Scott Briggs wrote:
> > Piyush Ranjan wrote:
> > > +1 for that. and -1 for the change.
> > >
> > > It just makes the developer look too stupid. Can't we let the
> > > developers understand the difference between a string and an integer?
> >
> > If it was 20 years ago, I'd understand this sentiment.  What I don't
> > understand is why programming languages seem to insist on using
> > semantics that don't adapt to the natural ways that humans interact or
> > think.
>
> Because the semantics with which humans interact and think are ambiguous,
> often illogical, and often rely on intuition.
>
> We can't give our languages intuition, but the more we try to do so, and
> the more magic we introduce, the less predictable things get.
>
> > There are a lot of constructs in ruby that make it much easier to use
> > and understand from a natural language point of view, one of the big
> > strengths of ruby, and this in turn makes it more accessible to people
> > who are interested in programming and not getting bogged down in the
> > minutiae of why "2" is greater than "11".
>
> Programming inevitably leads to at least understanding these minutiae. I
> use Ruby, and I love it for that natural-language expressiveness, and also
> just for the conciseness, even where I know it's less efficient:


I second this. "Magic" (for want of a better word) is only useful when it
gives you a faster way to achieve the same result. To anyone with moderate
or greater programming experience, the difference between strings and
numbers is important, and I for one would be annoyed to find strings
magically handled as numbers when that isn't what I wanted -- especially if
it were happening to user-supplied data.
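To illustrate the user-input point, here's a small sketch of how Ruby already lets you choose between loud and quiet conversion (the specific inputs are my own examples, not from the thread): Kernel#Integer rejects malformed input with an exception, while String#to_i silently guesses, which is exactly the behaviour you wouldn't want applied automatically.

```ruby
# Explicit conversion of user-supplied data: loud vs quiet.
Integer("42")    # => 42   -- strict: the whole string must be a number
"42abc".to_i     # => 42   -- quiet: trailing junk silently discarded
"abc".to_i       # => 0    -- quiet: non-numeric input silently becomes zero

begin
  Integer("42abc")  # strict conversion raises ArgumentError instead of guessing
rescue ArgumentError => e
  puts e.message
end
```

If strings were magically treated as numbers everywhere, every conversion would behave like to_i, and malformed input would slip through unnoticed.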

This isn't an implementation detail that ought to be hidden from the user to
make things easier (like dynamic typing, or automatic garbage collection):
strings and numbers are conceptually different types of data that support
different operations and different semantics. I think trying to do too much
automatic type conversion is likely to end up producing a lot of the
problems that exist with number/string/boolean comparison in PHP and (to a
lesser extent) JavaScript.
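A quick sketch of the contrast (the comments about PHP/JavaScript reflect their loose == operators): Ruby compares across types as false or raises, rather than silently coercing, which is also why "2" versus "11" behaves differently as strings and as numbers.

```ruby
# Ruby keeps strings and numbers apart: cross-type comparison is
# either simply false or an error, never a silent coercion.
"2" == 2     # => false (PHP's and JavaScript's == would say true)
"2" > "11"   # => true  -- lexicographic: the glyph "2" sorts after "1"
2 > 11       # => false -- numeric comparison

begin
  1 + "2"    # raises TypeError rather than guessing 3 or "12"
rescue TypeError => e
  puts e.message
end
```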

David mentions concatenation vs addition -- what about splitting? I can
split "1234" into "12" and "34" and I have two perfectly valid strings; if I
split the number 1234 into 12 and 34, I haven't done anything meaningful. In
a number the digits have meaning based on their position within the number,
which itself depends on the base used to represent the number. A string is
just a sequence of glyphs, which have no intrinsic meaning at a technical
level.
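The splitting example above can be sketched directly (my own illustration): slicing a string is a natural, position-based operation, whereas "splitting" a number requires arithmetic that only makes sense relative to a chosen base.

```ruby
# Splitting the string "1234" yields two perfectly valid strings:
s = "1234"
s[0, 2]           # => "12"
s[2, 2]           # => "34"

# "Splitting" the number 1234 needs arithmetic tied to a base;
# divmod(100) peels off pairs of digits only because we chose base 10:
1234.divmod(100)  # => [12, 34]
```

The string slices are meaningful whatever the contents; the divmod result changes entirely if you pick a different divisor, because the digits only have meaning via the base.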

Ruby's design is said to follow the principle of least surprise; to me this
means that consistency and correctness should be maintained. Blurring the
boundaries between strings and numbers is a frequent cause of bugs for
beginners in some other languages, and I think Ruby does well to enforce
some separation between them to guide people in the right direction.

--
James Coglan
http://jcoglan.com