Marc Heiler wrote:
> Hi,
> 
> On http://www.gcn.com/print/27_8/46116-1.html Ada is touted briefly.
> 
> The sentence(s) that most jumped into my eye (and hurt my brain a bit)
> was this:
> 
> "[...] Ada has a feature called strong typing. This means that for every
> variable a programmer declares, he or she must also specify a range of
> all possible inputs.[...]"
> 
> "[...] This ensures that a malicious hacker can¡Çt enter a long string of
> characters as part of a buffer overflow attack or that a wrong value
> won¡Çt later crash the program. [...]"
> 
> But clearly that is simple to do in ruby as well (and I never heard of a
> buffer overflow outside of the C world anyway): Just specify which input
> range would be allowed and discard the rest, warn the programmer, or
> simply convert it to the nearest allowed value - am I missing on
> something? Maybe there are some other reasons why Ada is still so en
> vogue for aviation software but I dont really get it (other than legacy
> code that was sitting there for thousand of years already). Maybe it is
> a paradigm that is only possible in Ada.

You're right. The problem in C is that C strings do not carry a length;
they are just pointers, and the string has to be zero-terminated. That is
a very bad thing. Imagine there is no terminating zero: any call to a
string-related function will read on through memory and will most likely
crash the program. And determining the length of a string is O(n). But
the real security issue is that some functions that read input don't take
a maximum length. gets(3) is one example: it reads a line into a buffer
regardless of how long the buffer is.
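
To make this concrete, here is a minimal C sketch (my own illustration,
nothing from the article): gets() has no idea how big the buffer is,
while fgets() is told the size and stops in time.

    #include <stdio.h>

    int main(void)
    {
        char buf[16];

        /* UNSAFE: gets() does not know that buf holds only 16 bytes.
         * Any line longer than 15 characters is written past the end
         * of the buffer -- the classic buffer overflow.
         * (Left commented out on purpose.)
         *
         *     gets(buf);
         */

        /* Safer: fgets() is given the buffer size and truncates the
         * input instead of overrunning the buffer. */
        if (fgets(buf, sizeof buf, stdin) != NULL)
            printf("read: %s", buf);

        return 0;
    }

The C library has since deprecated gets() (C11 even removed it), but
nothing in the language stops you from writing the same bug with a
hand-rolled loop.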

But this is more a library-related problem than a language one. There
are string libraries for C that are safe.

Ada compilers have to pass an extensive conformance test suite before
they are certified. A huge problem in general is that you can't fully
trust the compiler, especially not an optimizing compiler: it might
produce buggy code even though your program is correct. That's one place
where Ada shines.

Then the language C is not type safe: you can do all kinds of type
casts, and there are numerous constructs in C that invite errors. Ada is
a lot better here too; for example, you can restrict an integer type to a
given range, and out-of-range values are rejected.
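
A rough C sketch of what I mean (again my own example, and month_t is
just a made-up name): the compiler accepts the out-of-range value and
the narrowing cast without complaint, so any range checking has to be
written, and remembered, by hand.

    #include <stdio.h>

    /* Suppose this value must always lie between 1 and 12 (a month).
     * In C that rule is only a comment; nothing enforces it. */
    typedef int month_t;

    int main(void)
    {
        month_t m = 42;          /* accepted without a warning */
        int big = 100000;
        short s = (short)big;    /* cast silently truncates the value */

        /* The only protection is a hand-written check that is easy to
         * forget.  An Ada range type performs this check automatically. */
        if (m < 1 || m > 12)
            fprintf(stderr, "month out of range: %d\n", m);

        printf("m = %d, s = %d\n", m, s);
        return 0;
    }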

Furthermore, Ada has built-in support for tasking and synchronization
primitives. C and C++ just can't do that reliably, as there is no
language support. That's one reason why C++0x, the upcoming revision of
C++, exists: one of its goals is to give C++ a well-defined memory model
for multi-threaded code.
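
As a hedged illustration, here is a classic data race using POSIX
threads (a library, not the C language itself); without language-level
rules there is not even a memory model that says what this program
means:

    #include <pthread.h>
    #include <stdio.h>

    static long counter = 0;     /* shared and unprotected */

    static void *worker(void *arg)
    {
        (void)arg;
        for (int i = 0; i < 1000000; i++)
            counter++;           /* unsynchronized read-modify-write */
        return NULL;
    }

    int main(void)
    {
        pthread_t a, b;

        pthread_create(&a, NULL, worker, NULL);
        pthread_create(&b, NULL, worker, NULL);
        pthread_join(a, NULL);
        pthread_join(b, NULL);

        /* Expected 2000000, but updates can be lost because nothing
         * orders the two threads' accesses to counter. */
        printf("counter = %ld\n", counter);
        return 0;
    }

Compile with -pthread. A pthread mutex (or, in Ada, a protected object)
would serialize the increments.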

And Ada's language specification is very detailed, whereas that of C
leaves many things open (undefined or implementation-defined behaviour),
which is not desirable. You don't want any surprises here. This problem
came up recently in the GNU Compiler Collection (GCC), where they changed
the behaviour of the generated code, simply because the C spec didn't
specify it. This broke some applications and operating systems, and
possibly introduced a lot of unknown bugs. That is nothing you can build
reliable software on.
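
A well-known example of the kind of surprise I mean (my own sketch, not
necessarily the exact change behind the GCC fallout): signed integer
overflow is undefined in C, so an optimizer may simply delete an
overflow check that looks perfectly reasonable.

    #include <stdio.h>
    #include <limits.h>

    /* Signed overflow is undefined behaviour, so an optimizing compiler
     * may assume it never happens and fold this check to "false". */
    static int will_overflow(int x)
    {
        return x + 1 < x;        /* relies on wrap-around, i.e. on UB */
    }

    int main(void)
    {
        /* May print 1 without optimization and 0 with -O2. */
        printf("%d\n", will_overflow(INT_MAX));
        return 0;
    }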

> Ruby being too slow would be something I could not quite understand
> insofar that, after all you could write parts in C anyway, or you could
> use (in the case of replacing ADA) Lua - I'd figure Lua would be quite
> fast. Somehow despite that Ada is still in use, to me it seems like a
> "dead" language (means noone really learns it because there are better
> alternatives available)

You will never ever be able to use Ruby for aviation software, nor Lua,
Python, Perl, etc.

It's not about slowness. Real-time systems can be slow as long as they
meet their deadlines. Indeed, a lot of real-time systems are very slow.
They use 20-year-old technology, no caches, no speculative execution,
etc., simply because in real-time systems you always have to budget for
the worst-case execution time, and modern processors mainly improve the
average execution time.

Ada is not that bad at all. It's a beautiful language, maybe a bit 
verbose, but very powerful. Personally, I like it more than C++.

> The biggest confusion I get here is simply that strong typing is touted
> as a very good thing to have. I dont know if this is the case or not,
> but it seems to me that this is more "behaviour" that is imposed onto
> the programmer anyway (as in, he must do extra work to ensure his
> variables are a certain way etc..)
> For example, the "strong typing" as described here appears to me more a
> "force the programmer to do this and that". This may have advantages in
> the long run, I dont know, maybe fewer bugs or no buffer overflow
> problems, but to me it still is forcing the programmer to comply. I dont
> get what is so great about having to worry about many details. And on
> blogs you do sometimes see proponents of this solution scold on the
> people that use another solution (not only typing, but also test driven
> development and so on...)

Well, in the case of safety-critical software, you don't want runtime
exceptions. This software must not have errors; at least that's the
goal ;-)

Duck typing doesn't guarantee you anything at compile time.

Regards,

   Michael