Note: this has moved far beyond Ruby. This will, therefore, be my last
      post on the topic. This isn't a matter of having to have the last
      word; this is a matter of not continuing a discussion that's
      willfully fruitless.

On 10/15/06, Kevin Olemoh <darkintent / gmail.com> wrote:
> Having ten dialects of the same language does not nessecarily improve
> anything at all. Just as having only one thing does not nessecatily
> improve anything.

Do you even know what you're talking about here, or are you just
spouting off?

In terms of human language, one doesn't get to choose the number of
dialects that exist. They are created, destroyed, and merged on a
regular basis by separation and reconnection. Sometimes, these
distinctions become large enough or varied enough that a wholly new
language is formed. Would you say that Australian English, Canadian
English, American English, and British English are all the same? I
should hope not (because to do so would be pure ignorance). Would you
say that all of these (broad) dialects should be merged into one? Say,
mid-west American English? Or maybe we should all be talking like Eton
graduates? Or shall we put the dialects on the barbie, mate! Each of
these dialects of English -- and they are very broad dialects, because I
guarantee you that a Nova Scotian or a Newfoundlander doesn't sound like
a Torontonian or an Albertan -- serves multiple purposes and exists
despite (or in spite of!) the presence of modern communications.

In terms of computer languages, few programming languages are truly
dialects of one another. Sure, one could say that C++ is a dialect of C,
but that's an uninformed and limiting view. C, C++, Java, and C# are all
closely related, but they all bring different things to the table.

* C is little more than portable, low-level assembly.
* C++ adds objects, and later added templates, with which some truly
  amazing and scary things can be accomplished (generic programming is
  just the beginning; it is even possible to do functional programming
  with C++ templates).
* Java adds the JVM, restricts multiple inheritance, and adds a massive
  library.
* C# deviates from Java, but adds some niceties of its own and has a
  different underlying technology (it is compiled to MSIL running on the
  CLR). C# 3.0 will even be taking some cues from Ruby and other dynamic
  languages to make it less verbose than its prior incarnations or its
  kissing cousin, Java.

You can also make a close-kinship argument for Pascal, Modula-2,
Oberon, Delphi, and Ada (if you look at Ada, it looks a *lot* like
Pascal). You can clearly say that Oracle's PL/SQL is a member of the
same family; it is in fact descended from Ada. When Wirth developed each
of the first three, he had slightly different goals in mind; when
Hejlsberg developed Delphi, he needed the ability to do both component-
based and object-oriented programming and extended the Pascal language
to do so. Ada was developed by committee, but is a remarkably good
language despite that, even if it is a bit like a straitjacket at
times. PL/SQL didn't need everything that Ada gave, so Oracle developed
something that emphasized embedded SQL commands and gave faster access
to the database.

The C families and the Pascal families share almost no syntactical
similarities, however.

Then you have Lisp. The less I say about Lisp, the better, because I
don't know it. But there are dialects of Lisp because different people
had different ideas about how certain things should be represented.
Once Lisp machines went away, it became rather more important to think
about Lisp differently, and that probably contributed to the
fragmentation into Lisp, Scheme, Guile (a dialect of Scheme, IIRC), and
other similar languages. Lisp is a functional language, and the style
of thinking it fostered begat ML, Caml, OCaml, Haskell, and even
Oz/Mozart. Each of these is a functional programming language, but each
makes you think differently about the way you write a program.

Then we have the scripting languages. These things are meant to be glue
that ties other programs together, but somehow many of them have grown
into significant languages on their own. Some of these differences are
evolutionary (and purposefully so): sh to ksh to bash. Some of them are
revolutionary: sed to awk to Perl. Some sidestep: csh (and tcsh),
Python, and Ruby.

In every single one of the languages that I mentioned above -- and I've
programmed in C, C++, C#, Java, PL/SQL, Ada, Pascal, Object Pascal
(Delphi), sh/ksh/bash, awk, Perl, Ruby and several more that I
haven't mentioned here -- there is a usefully different way of looking
at how one would solve a problem. There's a different *purpose* for
each. Sometimes, I use a particular language because it's the most
useful for a particular platform; sometimes it's mandated. Sometimes I
use multiple languages in a single program. I have an installation shell
script at work that is written primarily in a Bourne shell dialect
(assuming minimal capabilities) that uses various standard utilities and
even has awk and sed miniprograms inside of it. This is written in the
lowest common denominator because I don't control the installation
platforms. My build scripts, however, are written to take advantage of
bash 2.x features, because I control the build platforms. (And yes, I
want to rewrite the scripts to Ruby. I'll have to get Ruby on all of
them, first.)

My point is that what you've argued is essentially ignorant and
willfully so. You're coming in and assuming that these differences just
exist because people are too lazy to make sure that they don't exist.
No, these differences exist *for a reason* in every case. And as people
lose the need for certain things, those things will go away. You don't
see many people programming in 6502 Assembler these days, do you? (You
don't see many people programming in assembler at all these days. It
happens, but usually at a very late stage.)

> It really is a balancing act for example as far as human biology is
> concerned there is only about a one percent difference between any two
> given people on earth. If there was too little differentiation at the
> genetic level we would all suffer greater and greater genetic damage
> due to inbreeding and eventually cease to exist. To call me arrogant
> for pointing out that too much or too little difference (depending on
> the situation) is a bad thing shows that you don't know what you are
> talking about. The inability or human beings to focus on the things
> that the have in common has been the source of so much greif.

Kevin, the above is so much nonsense. Yes, I know that the amount of
genetic variation among humans is a small value when you're talking
percentages. But percentages aren't useful here: IIRC, we share 50% of
our genetic code with most bacteria on Earth. Our genetic code is so
large and complex that it only takes a tiny fraction of difference to
express completely different phenotypes, for example. I call your
statement about human language or computer language arrogant and -- more
importantly -- painfully ignorant.

> Furthermore as far as computers go limiting things to a set of agreed
> upon standards is part of what allowed the desktop computer as we
> understand it to take off. If every manufacturer used their own
> standards people would not be able to easily replace parts and they
> would probably end up locked into one or two vendors. (the situation
> with laptops currently.) The vendor could also charge fairly high
> prices since they are the only ones who produce the hardware; however
> since we have an agreed upon standard for most hardware we have low
> prices and the buyer is more or less in control of his or her own
> machine rather than the vendors.

Twaddle -- and ultimately self-defeating if you're right. Limiting
things to standards retards growth. Not having standards retards growth.
If someone comes out with a new capability that isn't covered by a
standard, what are they to do? Wait until there is a standard? Right.
No, they put it out even without a standard. Other people put out
something that competes. Ultimately, this competition leads to something
that *can* be standardized. In the interim, though, people have been
using incompatible things which will need to be upgraded. I suggest you
look at the history of C++ to see how it went from Cfront to a proper
compiler, and how many compilers implemented different parts of an
emerging standard. This wouldn't matter, except that sometimes these
compilers are still in production. (!@#!!@#$! HP aCC.)

(Your parenthetical about laptops, by the by, is nonsense, much like the
rest of this paragraph. There *are* standards used, else you wouldn't be
able to do much with a laptop. They're just not things that are
necessarily made to be user serviceable or upgradeable. And I can extend
the power of my laptop just fine, thanks, with USB and expansion cards.
By and large, IME, by the time that you want to upgrade your laptop's
capabilities, the state of the art is so much better that you're better
off replacing it. Hell, at this point, I would be replacing my desktops
rather than upgrading them in any case. It's a mug's game, even though
there's a lot of standards-based extensibility.)

> Creating a common culture or language need not suppress diversity of
> culture and ways of thinking in the attempt to bridge the gaps between
> people and in the case of a programming language what can be done with
> the various languages.

Excuse me? You *don't* know what you're talking about here. Creating a
common culture almost *always* depends on suppressing the divergent
cultures. Read some history; you might learn something about the
subject you're spouting nonsense on. I suggest reading about the
British subjugation of Ireland and what it did to the Gaelic
subculture. Or
maybe the American and Canadian reservations and residential schools.
(The residential school stuff is a particularly shameful part of
Canadian history.) Or why not look at various cultural behaviours which
have been outlawed in history?

France is going through this right now. In France, the wearing of
religious symbols is illegal in public schools and government jobs. As
outward expressions of their religion (so strongly enjoined as to be
indistinguishable from mandatory), Sikhs wear turbans, have uncut hair
and unshorn beards, wear a special iron bracelet on the dominant hand,
and must carry a kirpan (a short blade). In France, the intersection of
the secular laws with these religious injunctions means that it is
highly unlikely that a Sikh would become a police officer or a
government worker. It's something that was created for good reason
(reducing control of the Catholic church over the French state) but has
since become a powerful tool of marginalisation against a wide variety
of groups. To be fully accepted in France, Sikhs would have to
*suppress* their religious differences. If they don't do that, they
aren't going to be fully accepted.

Monocultures require the suppression of the different. It's not a happy
circumstance, and reading a bit of history would tell you that.

-austin
-- 
Austin Ziegler * halostatue / gmail.com * http://www.halostatue.ca/
               * austin / halostatue.ca * http://www.halostatue.ca/feed/
               * austin / zieglers.ca