On 6/23/07, Peña, Botp <botp / delmonte-phil.com> wrote:
> From: Michael W. Ryder [mailto:_mwryder / worldnet.att.net]
> # I have been using Business Basic for over 25 years and it is what I
> # consider fixed point.  For example, entering 'Print 14.95*.6'
> # results in
> # 8.97.  The language uses a set precision which can be changed.  For
> # example, if I tell it to print 120*.000003 with the default
> # precision of
> # 2 it displays 0.  If I change the precision to 6 and tell it to print
> # the same thing it displays .00036.  The computer stores the result in
> # the precision at the time of the operation so entering
> # 'a=120*.000003'
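
A rough sketch of that fixed-precision behaviour in current Ruby, just
for comparison (Float#round here only approximates whatever Business
Basic does internally with its precision setting):

a = (120 * 0.000003).round(2)   # precision 2
puts a                          # displays 0.0
a = (120 * 0.000003).round(6)   # precision 6
puts a                          # displays 0.00036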

Yes, basically what you are dealing with is how the object is
displayed (its representation), and not what it actually holds:

irb(main):001:0> 1.0/9.0 + 8.0/9.0 == 1
=> true
irb(main):002:0> 1.0/9.0
=> 0.111111111111111
irb(main):003:0>
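
To see more of what that Float actually holds than the default display
shows, you can ask printf for extra digits (an aside, not part of the
session above):

printf("%.20f\n", 1.0 / 9.0)              # many more digits than 0.111111111111111
printf("%.20f\n", 1.0 / 9.0 + 8.0 / 9.0)  # the addition happens to round to exactly 1.0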

It's pretty obvious what's going on there without looking at the
source code: irb is showing a shortened representation of the Float,
not the exact value it stores.  If you really want the value to be
exactly what you typed, you use BigDecimal.  Again, it's a
lowest-common-denominator trade-off: what is the easiest way to
appease your programmers?  You give them what you think they expect
most of the time, and most people don't need a BigDecimal.
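
For the original 14.95 example, something like this (a quick sketch,
assuming the stdlib bigdecimal library) shows the difference:

require 'bigdecimal'

f = 14.95 * 0.6
d = BigDecimal("14.95") * BigDecimal("0.6")
puts f            # binary Float: prints 8.969999999999999 on a typical IEEE 754 build
puts d.to_s("F")  # decimal arithmetic: exactly 8.97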

Todd