Wednesday, August 22, 2012

Ada: It's not just a different C

Rarely does a week pass at the large, bureaucratic nightmare of a defense contractor where I work without a coworker opining on the uselessness of Ada.  "What a waste!" they claim. "Everything would be so much easier if we just used C!"

And they are dead right.

(Emphasis on dead)

See, our little group of programmers writes safety-critical, avionics-type software, so of course we use Ada (which, as you probably already know, is a programming language typically associated with safety-critical software and embedded systems).  But most of the new programmers (and even a lot of the older ones) grew up learning to program in a world full of brackets and asterisks, so they view Ada as some sort of programming masochism: strict rules and extensive verbosity simply for the sake of being strict and verbose.

The thing is that while we use Ada, we use Ada wrong: we use it like C.  Back when I was still running up my parents' phone bills dialing up BBSes, a bunch of C programmers got the Ada mandate dumped in their laps and, like good C programmers everywhere, were far too prideful to admit that maybe they didn't know the best way to do something.  So they stuffed their square peg of C programming into the round hole of Ada, and a whole generation of new programmers learned Ada from their bad example.
 
Recently, after days of tie-loosened, sleeves-rolled-up debugging, we found the bug that several of us had been searching for: a simple copy/paste error.  The VxWorks API is of course done in C, which means that when you create a queue you have to tell it the size of each element that queue will hold.  But we use Ada, not C, so a programmer had of course gone to the time and trouble of creating Ada wrappers around the VxWorks calls.  And, of course, they were exact duplicates of the C calls, only done in Ada.  And, of course, nobody noticed when some errant programmer, too busy trying to bang the new college intern, forgot to change the size of the queue element, so each message was too big, and everything just silently failed.

Not unexpectedly, this revived the same stupid debate we always have: we tripled our development time, because we had to write all the wrapper code and train all the programmers in Ada, and still spent man-weeks of debugging time solving problems that would have been exactly the same had we just used C in the first place.

Clearly the problem here is not that Ada is junk; it's that we use it the wrong way.  The "Ada" way to do this is to make the queue generic on the element type and build the initialization into the creation (well, the right way is to just use a protected object, but I digress).  That way, the moment the errant programmer tried to compile the code, the compiler would have rejected it with a type mismatch.  And if anything goes wrong at runtime, it raises an exception, so that instead of spending weeks debugging you get "Hey stupid: line 123".
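To make that concrete, here's a rough sketch of what such a wrapper could look like.  All the names here are mine (simplified for the post, not our real code or the full VxWorks binding), and the imported C signature is trimmed down, so treat it as an illustration of the idea rather than drop-in code:

```ada
with Interfaces.C;
with System;

generic
   type Element is private;  --  the message type, supplied at instantiation
package Message_Queue is

   type Queue is limited private;

   Queue_Error : exception;  --  raised instead of silently failing

   procedure Create (Q : out Queue; Max_Messages : Positive);
   procedure Send   (Q : Queue; Item : Element);

private
   type Queue is new System.Address;  --  the underlying C MSG_Q_ID
end Message_Queue;

package body Message_Queue is

   use type System.Address;

   --  Thin import of the C call (signature simplified for this sketch).
   function msgQCreate
     (Max_Msgs, Max_Msg_Length, Options : Interfaces.C.int)
      return System.Address
     with Import, Convention => C, External_Name => "msgQCreate";

   procedure Create (Q : out Queue; Max_Messages : Positive) is
      --  The element size comes from the type itself, so there is
      --  nothing to copy, paste, and forget to change.
      Size : constant Interfaces.C.int :=
        Interfaces.C.int (Element'Size / System.Storage_Unit);
   begin
      Q := Queue (msgQCreate (Interfaces.C.int (Max_Messages), Size, 0));
      if System.Address (Q) = System.Null_Address then
         raise Queue_Error with "msgQCreate failed";
      end if;
   end Create;

   procedure Send (Q : Queue; Item : Element) is
   begin
      --  Body omitted: a similar thin wrapper around the C send call,
      --  again using Element'Size rather than a hand-typed constant.
      null;
   end Send;

end Message_Queue;
```

Instantiate it once per message type (`package Status_Queue is new Message_Queue (Status_Message);`) and a `Send` with the wrong message type simply doesn't compile.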

And this sort of crap is all over the place (not just at my company, but at others).  Everything passed around as either a float or an integer.  Exceptions "handled" by suppressing them.  Arrays passed by pointer and length.  Case-style blocks performed on tag values.  Programmers going out of their way to try and defeat every single feature and safeguard Ada provides.  And the worst part?  Without anyone to say different, it's tacitly assumed that this is the right way to program.
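Take the first one.  Here's the smallest example I can think of of letting the type system do its job instead of passing everything around as Float (the type and variable names are invented for illustration):

```ada
procedure Units_Demo is
   --  Give each quantity its own type.  Mixing them up is now a
   --  compile error, not a mid-flight surprise.
   type Feet   is new Float;
   type Meters is new Float;

   Altitude : Feet   := 10_000.0;
   Ceiling  : Meters := 3_000.0;
begin
   --  Altitude := Ceiling;             --  rejected: type mismatch
   Altitude := Feet (Ceiling * 3.2808); --  conversion must be explicit
end Units_Demo;
```

Two extra lines of declarations, and an entire class of bug is gone before the code ever runs.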

Whenever we have this debate, I take the opportunity to lambaste the low quality and general scarcity of information about using Ada.  Google for 'Ada' and you will get the Dental, Disabled, and Diabetic associations of America before you get anywhere near the language.  What little you do find is almost always out of date, completely sterile, and, most importantly, suffers from the same flaw as my coworkers: it teaches you to use Ada like C.  One of the Ada books on my shelf mentions that new-fangled "type system" near the back, as an "advanced feature" not really germane to the subject, but is more than happy to demonstrate the Integer type in chapter one.  Really?  Really?

So should we really be surprised that nobody uses Ada?  Books and tutorials teach just enough for a reluctant C programmer to get by for a few years until he is eventually promoted to manager and then shitcans Ada for C altogether.  Because the fact is they're right: if you use Ada like C, you might as well save some budget and just use C in the first place.

But if you use Ada like Ada, well, then it's a whole new ballgame.  Your programs will magically just start to, you know, work.  Those hours you spend staring at hex values through a debugger can be better spent staring at the college intern through her white mesh shirt.  You will have less code to write, less to test, and less to maintain.  You will focus on actually writing new features and new programs, and not just marching through a seemingly endless list of bugs.

The key is that Ada is not just a better C.  Strong typing is a totally different paradigm from weak typing, like switching from procedural to OOP, or from polling to event-driven.  You can't just dip your toes in the shallow end and expect to pick it up as you go.  You need to jump in the deep end without any reservations.  You need to completely rethink the way you write software.  Everything you learned about programming is wrong.

And when I casually said that to my C fanboy coworkers, they called my bluff.  If the lack of appropriate, relatable literature was all that was stopping Ada from ruling the programming world, then why not educate everyone myself?  So I came home, had a martini or two, and started a blog.  A blog full of nothing but my opinionated, truculent, perhaps misguided views on programming in Ada.  I didn't write the LRM, I don't work for AdaCore, and I have no claim to be an expert other than getting drunk and dared by coworkers. 

But I have always found that the best programming books, the ones that stay on my shelf long after the content is obsolete, are the ones that put things in context.  The ones that tell you why, and not just what.  Books like "Peter Norton's Assembly Language Book for the IBM PC" or "Michael Abrash's Graphics Programming Black Book" still teach me more twenty years later than anything you will find at your local bookstore today.  With any luck, agree or disagree, you will at least find something worth considering in the forthcoming posts.

Just remember though: opinions are like assholes. Everyone's got one, and yours stinks.
