
On 10/16/06, Rick DeNatale <rick.denatale / gmail.com> wrote:
>
> On 10/16/06, Rich Morin <rdm / cfcl.com> wrote:
> > FYI, here is a quote which seems relevant to Ruby:
> >
> >   Although the possibility of program modification at
> >   run-time was heralded  as one of the great consequences
> >   of John von Neumann's profound idea of storing
> >   program and data in the same memory, it quickly turned
> >   out to enable a dangerous technique and to constitute an
> >   unlimited source of pitfalls.  Program code must remain
> >   untouched, if the search for errors was not to become a
> >   nightmare.  Program's self-modification was recognized as
> >   an extremely bad idea.
>
> Let's paraphrase this:
>
> "Although the possibility of travelling longer distances at greater speed
> was heralded  as one of the great consequences
> of Orville and Wilbur Wright's profound idea of combining an internal
> combustion engine with their newly uncovered mastery of controlling
> the flight of a glider in three axes, it quickly turned
> out to enable a dangerous technique and to constitute an
> unlimited source of pitfalls.  Man must stay on the ground and out of
> such dangerous flying machines, if the possibility of crashing was not
> to become a nightmare.  Aeroplanes were recognized as an extremely bad
> idea."


And it is; sometimes it is very brave not to go straight for the most
advanced thing immediately.
Human nature assures that we go there anyway :)
I think the merit of reading Wirth's article is not to agree with him,
but to become more critical, more sceptical.
Let me use Rick's paraphrase: great, we can fly now, and that is a
beautiful and important accomplishment, but...
maybe I should not use it *only* because it is new and sophisticated.
What are the shortcomings? Sometimes the old method, like *walking*, is
better.

I think that when Wirth says "bad ideas" he is plainly wrong, but he is
a well-recognized man in his science and not the youngest any more. So I
translated certain statements from "bad idea" --> "dangerous idea" etc.,
and in that light I found his POVs most interesting.

As a matter of fact he does not so much criticize the concepts as the
human way of using them -- well, that's how I read it ;)
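To make the Ruby relevance concrete (a small sketch of my own, not from
Wirth's paper): Ruby classes stay open at run time, which is exactly the
kind of program modification the quote warns about -- a great power and
an "unlimited source of pitfalls" in the same feature.

```ruby
# Run-time program modification, Ruby-style: classes stay open, so
# behaviour can be added -- or clobbered -- while the program runs.
class Calculator
  def add(a, b)
    a + b
  end
end

calc = Calculator.new
calc.add(1, 2)    # => 3

# The "great consequence": extend the class while it is in use;
# the existing instance sees the new method immediately.
class Calculator
  def mul(a, b)
    a * b
  end
end
calc.mul(3, 4)    # => 12

# The "pitfall": nothing stops us from silently redefining behaviour
# that every other caller of #add depends on.
class Calculator
  def add(a, b)
    a - b
  end
end
calc.add(1, 2)    # => -1, and no caller was warned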

Cheers
Robert


> Sure, self-modification is powerful, and anything powerful has its
> dangers; that just means that we have to figure out ways to contain
> and control the dangers.
>
> Professor Wirth is known for his strong, often obstinate opinions.
> Reading a bit further, he uses the referenced paragraph as a
> motivation for the introduction of various indirect addressing modes
> in computer architectures to remove the need for program modification
> to implement concepts like arrays.  He then goes on to disparage the
> notion of array descriptors for array bounds checking.
>
> This last argument seems to be based solely on perceived 'efficiency'.
>
> He seems to see things in stark black and white, techniques are either
> 'good' or 'bad' depending on his subjective assessment and without
> regard to context or the evolution of technology with its attendant
> shift in the economics of processors and memory.
>
> Look at his opinion about functional programming:
>
> "To postulate a state-less model of computation on top of a machinery
> whose most eminent characteristic is state, seems to be an odd idea,
> to say the least. The gap between model and machinery is wide, and
> therefore costly to bridge. No hardware support feature can wash this
> fact aside: It remains a bad idea for practice. This has in due time
> also been recognized by the protagonists of functional languages. They
> have introduced state (and variables) in various tricky ways. The
> purely functional character has thereby been compromised and
> sacrificed. The old terminology has become deceiving."
>
> This paragraph is deceiving. FP advocates have not "introduced state
> (and variables) in various tricky ways" because of the cost of
> implementation; they have done it because sometimes, as when you are
> doing IO, you NEED to have side-effects.
>
> Then he goes on to minimize OOP "After all, the old cornerstones of
> procedural programming reappear, albeit embedded in a new terminology:
> Objects are records, classes are types, methods are procedures, and
> sending a method is equivalent to calling a procedure. True, records
> now consist of data fields and, in addition, methods; and true, the
> feature called inheritance allows the construction of heterogeneous
> data structures, useful also without object-orientation. Was this
> change of terminology expressing an essential paradigm shift, or was
> it a vehicle for gaining attention, a 'sales trick'?"
>
> I've actually gone head to head with the good professor about his
> limited view of OOP.  Many years ago I attended a lecture he gave at
> UNC, as a guest of Fred Brooks.  His talk was on "object oriented
> programming with Oberon."  While I can't recall the exact details, he
> basically boiled OOP down to a stylized use of case statements using
> variant records.  When I suggested that perhaps OOP might be more
> about decoupling software components by using Kay's message semantics
> a la Smalltalk, he first tried to argue, but apparently failing to
> understand the question, quickly devolved to a statement like "All you
> American programmers are hacks."
>
> There's a famous story about a similar lecture he gave at Apple, where
> someone else pushed back in a similar way: "If Oberon doesn't have
> encapsulation, how can it be object-oriented?"  In this case, Wirth's
> ultimate rejoinder boiled down to "who can really say what
> object-oriented means."  To which the questioner responded, "Well, I
> suppose I do, I'm Alan Kay and I invented the term."
>
> This story has appeared in various forms. Here's a reference from the
> ruby-talk list:
> http://blade.nagaokaut.ac.jp/cgi-bin/scat.rb/ruby/ruby-talk/18422
>
> I do have to say though that the interpretation in this post seems a
> little off in that it stresses inheritance, which Kay doesn't seem to
> consider an essential feature of OOP. Here's a passage from his article
> about the early history of Smalltalk:
>
>    "By this time (1972) most of Smalltalk's schemes had been sorted out
>      into six main ideas that were in accord with the initial premises in
>      designing the interpreter.
>
>          1. Everything is an object
>          2. Objects communicate by sending and receiving messages
>             (in terms of objects)
>          3. Objects have their own memory (in terms of objects)
>          4. Every object is an instance of a class (which must be an
>             object)
>          5. The class holds the shared behavior for its instances (in
>             the form of objects in a program list)
>          6. To eval a program list, control is passed to the first
>             object and the remainder is treated as its message
>
>      The first three principles are what objects "are about" -- how
>      they are seen and used from "the outside." These did not require any
>      modification over the years. The last three -- objects from the
>      inside -- were tinkered with in every version of Smalltalk (and in
>      subsequent OOP designs)."
>
> So the lasting essentials of object-orientedness for Kay are
> everything being an object, computation built solely on messages
> between objects, and encapsulation of object state.  These are shared
> with Ruby; note that classes are an optional (even experimental)
> feature, and inheritance isn't even mentioned.
>
> Now there are some good things in the Wirth paper.  I like his
> observations about how computer architects often get things quite
> wrong when they define complex instructions to 'help', say, compiler
> writers.  This resonated with my early experience at IBM: on my first
> day on the job I was handed a specification for a new computer
> called FS, which was then supposed to be the replacement for the
> S/370.  For a cogent analysis of where THAT was likely to (and in fact
> did) end up, see http://www.jfsowa.com/computer/memo125.htm, a
> confidential IBM memo from 1974 which I never expected to see again.
>
> On the whole, though, despite his notable accomplishments, such as
> designing Pascal and popularizing interpreters using what we now call
> byte-codes, he seems to be stuck in the late 1960s/early 1970s, and
> dismisses anything which doesn't fit into his limited view, which seems
> to require software designs that are quite close to the hardware
> architectures he liked back then.
>
> He really seems to have been re-inventing Pascal ever since the first
> version; Modula and Oberon are really just slightly different Pascals.
>
> --
> Rick DeNatale
>
> My blog on Ruby
> http://talklikeaduck.denhaven2.com/
>
>
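Kay's second principle, by the way, is easy to see in Ruby, where every
call is literally a message send and the receiver decides at run time
how to answer it -- a small sketch of my own, with a made-up Logger
class, not anything from Rick's or Kay's text:

```ruby
# In Ruby a method call is a message send: the receiver, not the
# caller, resolves it -- which is Kay's point about decoupling.
puts 3.send(:+, 4)    # => 7, same as 3 + 4

# An object may even decide at run time how to answer a message it
# has no method for (hypothetical example class):
class Logger
  def method_missing(message, *args)
    "no handler for #{message}, logging instead: #{args.inspect}"
  end
end

puts Logger.new.frobnicate(42)
# => no handler for frobnicate, logging instead: [42]
```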
Nevertheless the following still holds ;)

-- 
The reasonable man adapts himself to the world; the unreasonable one
persists in trying to adapt the world to himself. Therefore all progress
depends on the unreasonable man.

- George Bernard Shaw
