On 1/1/06, gwtmp01 / mac.com <gwtmp01 / mac.com> wrote:
> On Dec 31, 2005, at 10:07 PM, Johannes Friestad wrote:
> > So in
> > a=4
> > 'a' does not hold a reference, technically speaking, but rather the
> > immediate value 4.
>
> I know I'm being pedantic (again) but I'd rather think of assignment
> as *always* copying references.  It is simpler that way.

You're describing the user model, I was talking about the
implementation. Immediate values are objects for (almost) all
practical purposes, so whether you think of fixnums as references to
immutable singleton objects or as immediate values doesn't make much
difference.
And as you say, it's simpler to use the same model for everything,
which is why many languages go to quite some lengths to hide the
difference.
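To make the point concrete, here is a small sketch (variable names are my
own) showing why the two mental models are indistinguishable in practice:
every occurrence of the same small integer is the *same* object, so
"copying a reference to an immutable singleton" and "copying an immediate
value" behave identically from the user's side.

```ruby
a = 4
b = a
puts a.equal?(b)   # true: both names refer to the one object 4
puts 4.equal?(4)   # true: integers act as immutable singletons

s = "hello"
t = "hello"
puts s.equal?(t)   # false: each string literal yields a fresh object
```

Since integers have no mutating methods, there is no experiment a program
can run that would tell the two models apart.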

I tend to think of programming languages (Ruby, C, assembler) as user
interfaces for the computer hardware. That makes 'usability' an
important factor for programming languages, and supporting simple (but
still powerful) user models for how the language works is one aspect
of usability.
Implementation details are IMO mostly relevant for explaining why
there are two classes for integers (Fixnum and Bignum) in the first
place, or for that matter why there are 'integers' and 'floats'
instead of simply 'numbers'. In day-to-day programming, they rarely
matter.
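One place the hidden split does show through, pleasantly: arithmetic
promotes between the two integer representations automatically, so
overflow never bites the way it does with C's fixed-width ints. A quick
sketch (note: newer Ruby versions have since unified Fixnum and Bignum
under a single Integer class, making the hiding complete):

```ruby
small = 2**30            # fits in an immediate/fixnum representation
big   = 2**100           # far too large for a machine word
puts small + big         # exact result; promotion is automatic
puts (big * big)         # still exact, no overflow
```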


jf