On Thu, 2004-10-28 at 04:23, Mark Hubbart wrote:

> Reconfigure your misconceptions: decimal numeric notation has no basis
> in reality. It is simply a way for humans to write down numbers.
 
> Decimal, like binary, is a notation, a way to represent finite
> numbers. It cannot represent all finite numbers satisfactorily,
> though; the notation has its failings. One of these shows up
> when representing rational numbers whose denominator, in lowest
> terms, has prime factors other than 2 and 5 (the prime factors
> of ten).
> 
> 1/5 => 0.2
> 1/2 => 0.5
> 1/4 => 0.25
> 1/7 => 0.142857142857143.... (whoops)
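
That condition is easy to check mechanically. Here is a minimal
sketch in Python using the standard fractions module (the helper
name terminates_in_base_10 is mine, not from Mark's post):

    from fractions import Fraction

    def terminates_in_base_10(frac):
        # A reduced fraction has a terminating decimal expansion
        # iff its denominator has no prime factors besides 2 and 5.
        d = frac.denominator
        for p in (2, 5):
            while d % p == 0:
                d //= p
        return d == 1

    for n in (5, 2, 4, 7):
        print(Fraction(1, n), terminates_in_base_10(Fraction(1, n)))

Fraction reduces to lowest terms automatically, so 1/5, 1/2, and
1/4 come back True and 1/7 comes back False, matching the list
above.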

Another funny quirk of the notation is that two strings which look
like different numbers can actually represent the same number. The
following is quite intriguing:

x = 0.99999... (an infinite series of 9s)
10*x = 9.99999...
10*x - x = 9.99999... - 0.99999... = 9, so 9*x = 9
So x = 1!
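
You can also see this numerically. A minimal sketch with Python's
exact fractions (the loop bound and variable names are mine): the
partial sums 0.9, 0.99, 0.999, ... fall short of 1 by exactly
10**-n, a gap that shrinks to zero, so the full infinite sum is 1.

    from fractions import Fraction

    # Partial sums of 9/10 + 9/100 + 9/1000 + ...
    # After n terms the gap to 1 is exactly 10**-n.
    s = Fraction(0)
    for n in range(1, 8):
        s += Fraction(9, 10 ** n)
        print(n, s, 1 - s)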

IIRC, this is why constructions of the reals from decimal
expansions usually exclude any 'word' that ends with an infinite
series of 9s: such a word always denotes the same number as some
terminating expansion (here, 0.999... = 1.000...), so dropping
these words makes the representation unique.

Guillaume.