Lloyd Linklater wrote:
> In the olden days, programs used to be a combination of assembly and a 
> low-level compiled language like C or Pascal.  Lotus was actually 
> written wholly in assembler back in the day.  The comparison of C to 
> assembler was, in the early days, much like the comparison of Java to 
> Ruby today.  However, the speed advantage of assembler over C did not 
> hold up for long.  As compilers improved, it became possible to write 
> wholly in C and have it execute faster than pure assembler.
> 
> Personal anecdote #1: actual testing
> 
> I was working at Quantum at the time (hard disk maker) and this very 
> controversy arose.  The managers listened to the philosophical debate 
> for a while and decided to settle it.  Volunteers from each side wrote 
> code in their own idiom and ran both theoretical and practical speed 
> tests.  By theoretical I mean do <this thing> X times and see how fast 
> it is.  By practical, I mean seeing how many seeks, reads/writes, etc. 
> you could get in a certain amount of time with the different 
> algorithms.
> 
> The pure language, no assembler, won hands down.

I'm really curious about two things:

1. The processor architecture, and
2. The language.

There once was an architecture called VLIW, embodied in a
mini-supercomputer built by a company called Multiflow. This
architecture was so complicated that it literally *had* to have a
compiler -- no human could even program it, let alone optimize code for
it. The compiler used a technique called "trace scheduling" to do the
optimization.
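
For a feel of the kind of problem such a compiler is chewing on, here is
a minimal sketch in C of its much simpler cousin: greedy list scheduling
of a dependence DAG within a single block, for a hypothetical machine
that can issue two operations per cycle. Trace scheduling proper works
across basic blocks along predicted execution paths; this toy (with
made-up instructions, latencies, and issue width) is only meant to show
the shape of the scheduling problem and the sort of greedy heuristic a
compiler falls back on.

/* Toy single-block list scheduler -- NOT Multiflow's trace scheduler.
 * All instructions, latencies, and the issue width are invented. */
#include <stdio.h>

#define N 6            /* number of instructions in the block */
#define ISSUE_WIDTH 2  /* ops the machine can start per cycle  */

/* deps[i][j] != 0 means instruction i must finish before j can start */
static const int deps[N][N] = {
    /*        ld add mul shf st  add2 */
    /* ld  */ { 0,  0,  1,  0,  0,  0 },
    /* add */ { 0,  0,  1,  1,  0,  0 },
    /* mul */ { 0,  0,  0,  0,  1,  0 },
    /* shf */ { 0,  0,  0,  0,  1,  1 },
    /* st  */ { 0,  0,  0,  0,  0,  0 },
    /* add2*/ { 0,  0,  0,  0,  0,  0 },
};
static const int latency[N] = { 2, 1, 3, 1, 1, 2 };
static const char *name[N]  = { "ld", "add", "mul", "shf", "st", "add2" };

/* Critical-path priority: longest latency chain starting at i.
 * (Naive recursion -- fine for a 6-instruction toy DAG.) */
static int priority_of(int i)
{
    int best = latency[i];
    for (int j = 0; j < N; j++)
        if (deps[i][j] && latency[i] + priority_of(j) > best)
            best = latency[i] + priority_of(j);
    return best;
}

int main(void)
{
    int finish[N] = { 0 };  /* cycle at which each scheduled op completes */
    int done[N]   = { 0 };  /* already issued?                            */
    int scheduled = 0;

    for (int cycle = 0; scheduled < N; cycle++) {
        for (int slot = 0; slot < ISSUE_WIDTH; slot++) {
            /* Greedy choice: among ops whose predecessors have all
             * finished, take the one with the longest remaining path. */
            int pick = -1;
            for (int i = 0; i < N; i++) {
                int ready = !done[i];
                for (int p = 0; p < N && ready; p++)
                    if (deps[p][i] && (!done[p] || finish[p] > cycle))
                        ready = 0;
                if (ready && (pick < 0 || priority_of(i) > priority_of(pick)))
                    pick = i;
            }
            if (pick < 0)
                break;  /* nothing ready -- stall until something retires */
            done[pick]   = 1;
            finish[pick] = cycle + latency[pick];
            scheduled++;
            printf("cycle %d: issue %-4s (done at %d)\n",
                   cycle, name[pick], finish[pick]);
        }
    }
    return 0;
}

Picking by critical-path length is the classic heuristic; finding the
truly optimal packing is where the combinatorics bite.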

The punchline is that the optimization problem for this beast was
NP-complete. Now *most* compiler optimization problems are NP-complete
once you express them as true combinatorial optimization, and the good
folks at Multiflow weren't oblivious to that fact. However, their
approximations were still slow relative to what simpler architectures
required, and Multiflow went out of business. They disappeared without a
trace.

<ducking>