On Mon, Apr 28, 2008 at 8:55 AM, stephenT <stiv.thomas / gmail.com> wrote:
>
>  >
>  > Ruby is not performant enough to use for an AI inference engine.
>  >
>  Neither is the 'processing' unit of the brain performant enough to be
>  used as an inference engine.
>
>
>  English, Japanese, German, .... are no more natural than any computer
>  language.
>
>  Don't let the surface structure of the problem lead you astray, nor
>  let the surface similarity of a tool (a computer language) to a
>  problem domain (language understanding) lead you to the idea that
>  one can be as effective as the other. Interpretation, understanding,
>  and generation of spoken and written language are, at the core, a
>  spatial-temporal pattern recognition problem, whereas computer
>  languages (which are usually a formal subset of a natural language)
>  require more of a pattern-matching approach. One could argue that
>  this is just a difference in the specificity of the meaning
>  associated with each individual language token, but I think it goes
>  beyond that. By this I mean that language, in the traditional sense,
>  depends on the greater context in which it is used. Context can
>  actually create unique new meanings for tokens. Computer languages
>  do not have this feature. (The runtime value of a variable is not
>  the same thing.)
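
To make that last parenthetical concrete (a throwaway sketch, no real
code implied):

  # The value bound to x changes from line to line, but the meaning of
  # the token x never does: it is always "look this name up in the
  # current binding", a rule fixed by the grammar. In English, by
  # contrast, context gives "bank" a new sense next to "river".
  x = 42
  x = "river bank"
  puts x    # => river bank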

That context dependence is why we rely on language conventions and
unit tests.  In other words, for people to play well together, they
have to agree on their own set of usage rules, a subset of what the
language allows.  An unfortunate consequence of programming in
general, just felt in spades with Ruby.
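
For instance, suppose a team has agreed that a (purely hypothetical)
parse_amount helper must always return a Float and must reject junk
input. A quick Test::Unit sketch is where that private subset of Ruby
actually gets written down:

  require 'test/unit'

  # Hypothetical team convention: parse_amount always returns a Float
  # and raises on non-numeric input, even though Ruby itself would
  # happily let it return anything at all.
  def parse_amount(str)
    Float(str)
  end

  class ParseAmountConventionTest < Test::Unit::TestCase
    def test_returns_a_float
      assert_kind_of Float, parse_amount("3.50")
    end

    def test_rejects_non_numeric_input
      assert_raise(ArgumentError) { parse_amount("three fifty") }
    end
  end

The grammar never forces that rule; the tests do.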

Todd