On Tue, Oct 29, 2002 at 03:30:08AM +0900, George Ogata wrote:
> Brian Candler wrote:
> 
> > Somewhere deep in the innards of String#+ it is doing the equivalent of a
> > String.new; should it be doing a self.class.new instead?
> 
> But wouldn't this be a problem if the subclass's constructor had different 
> access/arity to that of the base class?

Hmm, good point. Or it might have the same arity, but completely different
semantics (just to be perverse).
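
A quick sketch of both halves of the problem (the class names here are
made up). Today, String#+ quietly discards the subclass:

    class S < String; end

    s = S.new("abc")
    (s + "def").class    # => String, not S

But if String#+ did a self.class.new(...) internally, a subclass whose
constructor takes different arguments would break:

    class Tagged < String
      def initialize(str, tag)   # extra mandatory argument
        super(str)
        @tag = tag
      end
    end

self.class.new("abcdef") inside '+' would then be calling Tagged.new
with one argument where two are required, and you'd get an
ArgumentError.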

In one sense, you just want the parent class (String) to do its job and then
coerce the result to the subclass (S). However, the subclass may have its own
additional instance variables, and simple coercion would leave them
uninitialised.
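
To make that concrete (the @log variable and the allocate/replace trick
are just mine, to simulate the coercion):

    class S < String
      def initialize(str)
        super
        @log = []                # extra state a plain String doesn't have
      end
    end

    result  = S.new("ab") + "cd"            # String does its job => plain String
    coerced = S.allocate.replace(result)    # stamp it as an S, skipping S.new
    coerced.class                           # => S
    coerced.instance_variable_get(:@log)    # => nil, never initialised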

Or, you could have a special constructor which takes an object of type
String and turns it into the corresponding object of class S - which in our
example would be a null procedure, but in general might have other things to
do. I think someone suggested a method 'from_parent' to do that. If you had
multiple levels of inheritance then you would have to arrange to call the
whole chain of from_parent's. I'm not sure how to do that cleanly.
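
One possible arrangement, just as a sketch (everything except the name
'from_parent' is invented): have each class fill in only its own state
and call super, the same way initialize chains already work:

    class S < String
      def self.from_parent(str)
        allocate.replace(str).send(:fill_in)
      end

      private
      def fill_in
        @log = []      # S's own extra state
        self
      end
    end

    class T < S
      private
      def fill_in
        super          # let S set up @log first
        @extra = 0     # then T's own state
        self
      end
    end

    t = T.from_parent("abc" + "def")   # => a fully set-up T

Whether that counts as "clean" is debatable, but at least the chaining
falls out of ordinary super calls rather than anything special.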

I guess what is emerging here is something fundamental but uncomfortable
about the "is_a" relationship: S is_a String, but also S isn't_a String
(because it's an S). Depending on what the implementor of class S has done,
S could be made more and more unlike_a String, to the point where it isn't
much like a String at all.

We know that S has all the instance variables of a String, so when we apply
the default inherited '+' method, it can set up everything the String part of
the result needs. However, S may have other instance variables, and unless we
explicitly call S's constructor, we are not (in the general case) going to
have a valid instance of S.

So I think I understand why the inherited S#+ _has_ to return a String, not
an S. It's still not particularly satisfying though :-)
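
Of course, S can always override '+' itself and route the String result
back through its own constructor (assuming S.new still takes a single
string, like String.new does):

    class S < String
      def +(other)
        self.class.new(super)   # let String do the work, then rebuild an S
      end
    end

But you have to do that by hand for every method you care about, which
is part of what makes it unsatisfying.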

Regards,

Brian.