"Mary M. Inuet" <maryinuet / ya_hoo.com> wrote in message news:<2h1dfjF7uummU1 / uni-berlin.de>...
> "Michael Geary" <Mike / DeleteThis.Geary.com> wrote in message
> news:10allar4vlk2092 / corp.supernews.com...
> > > what makes zero
> > > so special that it would warrant being false?
> >
> > Because in the real world, zero means false.
[...]
> failure to correctly interpret some boundary condition.  If numbers always
> evaluate to true, it seems absurd to allow them to be used in conditionals
> predicating on truth - we should at least get a good diagnostic, IMHO.
> Arguments about the elegance or uniformity of zero's truth are pretty weak
> in my opinion, due to the fact that Ruby does not exist in a vacuum.  LOTS
> of external libraries, extensions, interfaces, etc, implicitly and
> explicitly require zero to be false.

I agree that if numbers always evaluate to true, it would be better
for Ruby to warn about the use of numbers as predicates.  ("Absurd"
is a bit too strong a word, in my opinion.)
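For anyone following along, here is what the current behavior looks like
(a minimal sketch; in Ruby only nil and false are treated as false, so
zero takes the true branch):

```ruby
# Every object except nil and false is truthy in Ruby,
# so zero falls through to the "true" branch:
if 0
  puts "0 is truthy in Ruby"
else
  puts "0 is falsy"
end
# prints "0 is truthy in Ruby"
```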

But I don't agree with the practical universality of zero as false.
If you think that zero is almost always treated as false in
practice, I guess that's because of your "culture" (perhaps related
to the C language and its cousins).  When I began programming in the
Bourne shell (/bin/sh), I was surprised that an exit status of zero
is treated as true in predicates, as in

    if grep -s hello file.txt; then ...

grep returns zero when it finds a match.  Almost all standard Unix
tools return zero to mean "success" and/or "true". . . . What's the
moral?  It's best to keep truthiness and numbers separate.  Also, I
don't buy the argument that since "LOTS" of libraries use the
zero-as-false convention, Ruby should follow suit.  I would
argue that in LOTS of other contexts zero is treated as the signal
for "true".  (Another example is, as someone mentioned, some assembly
languages have instructions which say "jump if such-and-such flag
is zero".)