Paul Lutus <nospam / nosite.zzz> writes:

> One can't argue with results, but this is some kind of language-specific
> anomaly if one considers the operations (presumably) taking place behind
> the scenes. For arbitrary numbers n,p and the operation n^p, 
>
> n^p = e^(log(n)*p)

That's not the way anyone does exponentiation with integers.  Instead,
you represent the exponent as a sum of powers of 2 (which you already
have as its binary representation) and then repeatedly square, doing
something like this to find n ** 10:

n2  = n * n
n4  = n2 * n2
n8  = n4 * n4
n10 = n8 * n2    # n ** 10 in four multiplications
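
The same square-and-multiply idea works for any exponent by walking the
exponent's bits.  A minimal sketch in Ruby (the method name `power` is
mine, not anything from the standard library):

```ruby
# Exponentiation by squaring: consume the exponent one bit at a time,
# squaring the base for each bit and multiplying it in when the bit is set.
def power(n, p)
  result = 1
  base = n
  while p > 0
    result *= base if p.odd?  # current low bit of the exponent is 1
    base *= base              # square for the next bit position
    p >>= 1                   # shift to the next bit
  end
  result
end

power(3, 10)  # => 59049, same as 3 ** 10
```

For p = 10 (binary 1010) this performs exactly the n2/n4/n8 chain shown
above: four squarings and multiplications instead of nine multiplies.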

Still, n ** 2 does result in a multiplication, but I think what we're
seeing is that n * n involves two lookups of the value of the variable
"n", whereas n ** 2 involves only one retrieval of n.  That extra
lookup swamps the cost of the actual mathematical operations.  It also
explains why, with a large literal (where no variable lookup happens at
all), simple multiplication is faster than doing ** 2.
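
One way to check the lookup hypothesis is to time the two forms
directly.  A rough sketch with the standard Benchmark module -- the
specific numbers here will vary by Ruby version and machine, so this
only shows how to measure, not what you'll see:

```ruby
require 'benchmark'

n = 12345
reps = 1_000_000

# Time a million of each; realtime returns elapsed seconds as a Float.
t_mul = Benchmark.realtime { reps.times { n * n } }
t_pow = Benchmark.realtime { reps.times { n ** 2 } }

puts "n * n : #{t_mul}"
puts "n ** 2: #{t_pow}"
```

Whichever form wins, the gap per iteration is on the order of a single
variable lookup, which is consistent with the explanation above.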

-- 
s=%q(  Daniel Martin -- martin / snowplow.org
       puts "s=%q(#{s})",s.map{|i|i}[1]       )
       puts "s=%q(#{s})",s.map{|i|i}[1]