Guillaume Cottenceau <gc@mandrakesoft.com> wrote:
>"Benjamin J. Tilly" <ben_tilly@operamail.com> writes:
>
>[...]
>
>> >Second, as far as I know[1], "." is used in C++, Java, Python, Beta,
>> >Cecil, Delphi, Eiffel, Sather, Modula-3, Visual Basic, Icon, whereas "->"
>> >is only used in Perl, PHP and C++. Thus, Henry's argument that "." is
>> >unknown to programmers is false.
>> >
>> What do you mean by "use"?
>
>What operator is used by languages to 'call' a method on a given receiver.
>
I would be more generous.  (In fact I was.)
>
>> Both "." and "->" are used in C.  "." to access a field in a struct,
>
>Let's limit to the method invocation operator in OO languages.

Fair enough, but I don't think that method invocation can be
separated from the rest of the language.  In particular, the
notation you use to access a method of an object bears a
suspicious similarity to accessing a field of a struct in C,
and I don't believe that is entirely a coincidence.

>> and "->" to access a field in a struct through a pointer.  I believe
>> that any C-derived language should preserve this basic usage.  The
>> choice of "." or "->" for method calls should reflect your concept
>> of what an object is.
>
>Not when identifiers are indeed references to objects, think Java and
>Ruby. Namely, don't think C, C++ or Perl.

I didn't mean the implementation detail.  I meant how you think
about things when you manipulate them.  For instance in higher
level languages you think in terms of variables holding strings,
and not in terms of variables holding a pointer to a string.  In
all probability it is implemented the latter way, but that isn't
how you think about it.

So in a language like Ruby, an identifier is always a reference
to an object.  But you don't think through that dereference.
Instead you think about having things and calling methods on
the things you have.  Like the way we teach physics, this model
is (of course) a lie, but it is a good enough approximation
most of the time.  (For an example of where the approximation breaks
down, think of the periodic discussion of why += for strings cannot
be as efficient as <<.)
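
To make that parenthetical concrete, here is a rough Ruby sketch of
the sort of thing that comes up in those discussions (the variable
names are mine, purely for illustration):

  s = "abc"
  t = s              # t and s now name the same String object

  t << "def"         # << mutates that one object in place
  puts s             # prints "abcdef" -- s sees the change too

  s += "ghi"         # += is really s = s + "ghi": it builds a brand
                     # new String and rebinds the name s to it
  puts t             # prints "abcdef" -- t still names the old object

The copy that += makes on every call is where the efficiency
complaint comes from, and it is one of the few places where the
"you just have things" model forces you to remember that a variable
is really a reference.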

>> For instance Perl's objects are references (ie Perl's version of
>> a pointer) and OO operations operate through dereferencing.
>> Therefore it uses "->" for method lookups.  This analogy goes very
>
>Actually Perl 5 needs to separate references to arrays and hashes, to
>"real" arrays and hashes, thus they have the similar problem to C/C++ and
>they've solved it the same way.

I don't think of it this way, but I can't really disagree
either. :-)

However if you sit down to write a class in Perl, at every step of
every manipulation you have to write down the fact that you are
operating through a reference.  If you do the same in Ruby, you
never write that fact down.  Now in fact you *are* working through
a reference, but you don't ever say that.  And - more importantly -
the programmer describing their code to someone else (or themselves)
doesn't have to say (or think) that.
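
As a rough sketch of that difference (the Point class and its
attributes are invented here purely for illustration), a Ruby class
never spells out the reference it is working through, whereas a
typical Perl 5 version of the same thing would have $self->{x} and
$self->{y} at every access:

  class Point
    def initialize(x, y)
      @x = x           # instance state; no $self->{x}-style dereference
      @y = y
    end

    def move(dx, dy)
      @x += dx
      @y += dy
      self             # 'self' is just the receiver; nothing to
    end                # dereference by hand
  end

  p = Point.new(1, 2)
  p.move(3, 4)         # plain "." everywhere; the reference stays unspoken

Every access goes through the same unspoken reference, but neither
the code nor the person reading it aloud ever mentions it.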

[...]
>> Given that in Ruby all things are objects, and Ruby does not
>> encourage people to think in terms of explicit dereferencing, I
>> absolutely believe that it should use a ".".  Not due to the fact
>
>Yes. That's actually the trend for references-only based objects, and the
>principle of least surprise comes up here, together with what's most used
>by OO languages currently in use as I tried to show. And it's easier to
>type in, as I also said :-).
>
>> that programmers are used to seeing method calls written that way,
>> but because the syntax tells people something about what the
>> language's model for an object is.
>
>The syntax doesn't tell anything here, or I failed to understand the
>bottom line: using only "." doesn't tell anyone that we're using only
>objects, or only references to objects. It just tells that we're not
>enjoying two ways of designation of objects.
>
If a programmer walks someone through the implementation of a class,
what words do they use?  The rule I am pointing out is that in
languages that use "->", the programmer will at some point be talking
about references or pointers.  In languages that use ".", they
generally won't.

Cheers,
Ben