On Fri, 2001-09-21 at 21:47, Todd Gillespie wrote:
> You are making a classic marketing mistake, namely confusing lack of
> training with unwillingness.  There is no reason to believe that
> someone who is trying to learn a computer language is averse to learning
> mathematics at the same time.

Well, that would have been me, anyway. I started programming because I
hated math and wanted the computer to do it for me ;-) That's not
exactly how it happened, of course; but suffice it to say I've always
been behind in math.  Still, I picked up the _logic_ of programming
quickly, and now I have a Software Engineering job (doing C++) and a
working knowledge of at least a half dozen other languages.  All with
no more than high-school-level Algebra 1 math, most of which I've
forgotten.

If I had thought that programming well *required* advanced math skills
back when I began, I probably wouldn't have started at all. Things are
different now; after a few years of experience, I'm eager to learn the
more esoteric maths and only wish I had the time to do so.  But that
comes out of a purely personal desire to more fully understand the field
in which I work, *not* out of any professional need. My job has never
presented me with a problem that I was at a loss to solve because of
poor math training.  The times I've written recursive functions, I did
it because it just "made sense" for the task at hand, not because of
any formal mathematical training.

There seems to be a dichotomy between the CS view of programming and
the more pragmatic industry view of it.  The CS view holds that you
can't even begin to program without advanced math knowledge.  My
real-world experience suggests that only a fraction of the programming
jobs out there ever use that math knowledge.  On the
other hand, my experience also suggests that when you really strive to
master the art of programming you begin to get an intuitive grasp of
some of the relevant mathematical concepts, without necessarily knowing
the names for them.  But that's just conjecture...

I will readily grant that modern programming pretty much owes
everything to math and mathematicians.  But I've lost count of the
times I've told people I'm a programmer and they've responded "oh, you
must be really good at math then, huh?"  This strikes me as really
sad, because these people all assume they can't program since they
never got far in math.  A lot of people who might find their work made
somewhat easier by a smattering of programming knowledge are kept from
acquiring it by their fear of math.  They think that to gain entry to
this elite world, they first need to be able to fill a wall with
equations, which scares them to death because they hated math in high
school.
Whereas if you show them a simple Ruby (or Python, or even Perl) script,
they'll usually comprehend it quickly with a bit of help.
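Something as small as this, say (a made-up snippet, not from any
particular tutorial):

  # print a reminder for each item on a shopping list
  list = ["milk", "eggs", "bread"]
  list.each do |item|
    puts "Don't forget the #{item}!"
  end

No wall of equations required, just a bit of plain English and a
little punctuation.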

I would even go so far as to say that for some people, programming might
be a good introduction to math, rather than vice versa.  It certainly
seems to have been true of me.

Somewhat related questions:

1. Someone mentioned Ruby being "properly tail-recursive".  What's the
definition of "properly tail-recursive"?  I've heard the term before,
but I'm not sure what it means.  In every language I've coded in,
functions can call themselves; is that all the term means, or is it
something more?
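For what it's worth, here's the extent of my mental model (a trivial
made-up example, nothing from the earlier thread):

  def countdown(n)
    return if n <= 0
    puts n
    countdown(n - 1)   # the method's last act is to call itself
  end

  countdown(3)         # prints 3, 2, 1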

2. My impression has been that recursion ought to be used only in
cases where the function will recurse a predictable, reasonably small
number of times; otherwise the stack will eventually overflow.  The
"factorial" function in the examples seems to break that rule.  Is it
safe?  If so, why?
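I mean something along these lines (my own reconstruction from memory,
so the actual example may differ in the details):

  def fact(n)
    n <= 1 ? 1 : n * fact(n - 1)   # presumably one stack frame per call
  end

  fact(5)        # => 120
  fact(100_000)  # wouldn't a call this deep blow the stack?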

3. Todd: Since you have such a high regard for math (and you quoted
Heinlein :-), do you have any book recommendations for someone who
wants to learn the formal mathematics that relates to programming?

-- Avdi Grimm