>On Jun 15, 2004, at 10:19 AM, tony summerfelt wrote:
>I think the whole point here is that it isn't Ruby-ish to rely on a variable being defined or not. Ruby wasn't built to do things that way; a couple of examples in this thread show how variables are automagically defined when code is parsed. They aren't defined at runtime, they are defined at eval time.

For the sake of it (fixed this time, I hope):

def test_1()
  p defined? x               # => nil: no assignment to x parsed yet
  x = "hello" if defined? x  # the parser sees "x =" before the
                             # modifier, so the test already succeeds
  p x, defined? x            # => "hello", then "local-variable"
end
test_1() # => nil, "hello", "local-variable"

def test_2()
  p defined? x           # => nil: no assignment to x parsed yet
  if defined? x then     # still nil: parsed before any assignment
    x = "hello"          # never executed, but now the parser knows x
  elsif defined? x then  # "local-variable": parsed after x = "hello"
    x = "world"
  end
  p x, defined? x        # => "world", then "local-variable"
end
test_2() # => nil, "world", "local-variable"

According to the definition of "defined" implemented by
defined?(), it appears that a local variable becomes
defined at the textual point where the parser first sees
an assignment to it, whether or not that assignment is
ever executed. In test_1 the modifier condition is parsed
after the "x =" to its left, so it is already true before
the assignment runs. In test_2 the first condition is
parsed before any assignment and yields nil, while the
elsif condition, parsed after x = "hello", yields
"local-variable" even though that assignment never
executes.

In other words, the compiler determines at parse time (or
eval time) from which point of the code a given local
variable is visible; at runtime, defined?() merely reports
that parse-time decision.
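
For instance, this assignment can never run, yet it is
enough to make x defined for everything parsed after it:

def test_3()
  p defined? x          # => nil: parsed before the assignment
  x = "hello" if false  # never executed, but now parsed
  p x, defined? x       # => nil, then "local-variable"
end
test_3() # => nil, nil, "local-variable"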

This is a rather contrived definition, and I wonder why
local variables are not simply defined at method level
(and begin/end level). There should be enough information
at parse time to do that, I guess. Ruby's scoping rule
is, to me, an unusual, surprising mix of static and
dynamic scoping. The other languages I know use either
static or dynamic scoping in similar cases, never a mix
of the two. A purely static scoping rule is usually more
efficient because it makes it possible to compute at
compile time the number of "slots" that need to be
allocated, once only, at method invocation time.
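
To make that concrete, here is actual Ruby behavior, with
comments sketching what method-level definition would
report instead (the commented alternative is my proposal,
not anything Ruby does):

def test_4()
  p defined? x  # => nil today; would be "local-variable"
                # if locals were defined at method level
  x = "hello"
  p defined? x  # => "local-variable" either way
end
test_4() # => nil, "local-variable"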

This is a very minor thing, and I would be surprised to
see an example of code taking advantage of that unusual
(to me) behavior of defined?().
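
If anything, it seems easier to get bitten by it. Here a
contrived "assign a default unless already set" one-liner
silently fails, because the parser sees the assignment
before it evaluates the condition:

def test_5()
  x = "default" unless defined? x  # defined? x is already
                                   # true, so nothing happens
  p x  # => nil, not "default"
end
test_5() # => nil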

Thank you to those who flagged mistakes in my previous
post.

Yours,

JeanHuguesRobert



-------------------------------------------------------------------------
Web:  http://hdl.handle.net/1030.37/1.1
Phone: +33 (0) 4 92 27 74 17