Michael T. Richter wrote:
> Mathematically I agree with you, but in terms of hardware underlying all 
> this stuff it's basically a real-world Turing machine.  (Which is what 
> the von Neumann architecture is: Turing's machine turned into something 
> that could actually be implemented.  Things like "infinite tapes" and 
> "infinite decision tables" turned out, surprisingly, to be implausible 
> at point of implementation. :D)

Again, I have to plead ignorance on Turing's contributions to practical 
computing. But the Turing machine was introduced in the context of logic 
and the foundations of mathematics, *not* as a conceptual foundation 
for *computing*, either scientific or commercial. The same is true of 
Church's Lambda Calculus and Schönfinkel's Combinatory Logic, as so 
eloquently documented by (Haskell!) Curry and Fitch and numerous others.

The von Neumann machine, on the other hand, was designed from the ground 
up for the practical solution of equations -- linear, nonlinear, 
algebraic, differential, difference and integral. It was patterned on 
past successes in this domain using electromechanical (relay) or 
even mechanical technologies. Indeed, the original Burks/von Neumann 
design looks a lot like a programmable Marchant or Friden desk 
calculator, except that the mechanical calculators were always decimal 
and the von Neumann machine was binary.

> Church's model of calculation is far more appealing to me and the 
> languages based on it -- Lisp (arguably: there's some evidence that 
> McCarthy stumbled over this rather than deliberately trying to model 
> Church), Haskell, etc. -- are increasingly the way I like to work.  But 
> it's all smoke and mirrors.  Underneath it all is a von Neumann machine 
> masquerading as a Church lambda expression engine.

McCarthy and others have written extensively on the origins of Lisp, and 
I think most of the design decisions are well documented. But the key 
concept McCarthy and the others introduced with Lisp is analogous to von 
Neumann's. Von Neumann introduced the practical equivalence of programs 
and data by storing them in the same address space. McCarthy introduced 
the practical equivalence of programs and data by expressing programs as 
S-expressions, just like the data. I don't think either Turing's or 
Church's "logics" do this.
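The idea is easy to see in miniature. Here is a minimal sketch (in Python, 
and emphatically not McCarthy's actual eval) where S-expressions are nested 
lists: the program is the same kind of data structure as the data it 
operates on, so it can be built, inspected, or quoted like any other list. 
The `evaluate` function and the small operator set are my own illustrative 
choices, not anything from Lisp 1 or 1.5.

```python
def evaluate(expr, env):
    """Evaluate an S-expression represented as nested Python lists."""
    if isinstance(expr, str):              # a symbol: look it up
        return env[expr]
    if not isinstance(expr, list):         # a literal, e.g. a number
        return expr
    op, *args = expr
    if op == "quote":                      # return the expression itself, as data
        return args[0]
    if op == "lambda":                     # ["lambda", [params], body]
        params, body = args
        return lambda *vals: evaluate(body, {**env, **dict(zip(params, vals))})
    if op == "+":
        return sum(evaluate(a, env) for a in args)
    fn = evaluate(op, env)                 # application: evaluate operator, then operands
    return fn(*(evaluate(a, env) for a in args))

# This "program" is ordinary list data until we hand it to evaluate:
program = [["lambda", ["x", "y"], ["+", "x", "y"]], 1, 2]
print(evaluate(program, {}))               # 3

# And with quote, the very same structure is returned untouched, as data:
print(evaluate(["quote", ["+", "x", "y"]], {}))  # ['+', 'x', 'y']
```

The point is not the interpreter itself but that `program` is a plain list 
a program could have constructed, which is exactly the program/data 
equivalence in question.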

Lisp 1 and 1.5 were indeed von Neumann machines interpreting Church 
lambda expressions. But later generations of Lisp machines arose that 
carried out the interpretation at a lower level of hardware abstraction.

> /There are two ways of constructing a software design. One way is to 
> make it so simple that there are obviously no deficiencies. And the 
> other way is to make it so complicated that there are no obvious 
> deficiencies. (Charles Hoare)/

Charles Anthony Richard Hoare, most often referred to as C. A. R. Hoare, 
although I believe his friends call him "Tony". :)