Monday, December 26, 2005

Programming Languages

I've learned a lot of programming languages over my career. I started with BASIC, I think, back in the 1970's. There was Action! (a cartridge for the Atari 800), Pascal, DIBOL, C, sh, csh, and it's not even 1982 yet. In the 80's I learned PostScript, Lisp, and Objective-C. In the 90's there was Java, C++, Perl, Python, Object Pascal (!), and way too many conversions back and forth between C strings and Pascal strings. Now there's Ruby, PHP, and who knows what else?


Why do we (collectively) invent new programming languages? What's the point?


Well, I would say that each new programming language really did improve on some shortcoming of its predecessors, or at least that was the perception at the time. Some were more interesting than others.


Pascal is what's known as a strongly typed language. You can't assign a long integer to a short integer, even if the value is 0 or 1. The language won't let you. It's supposed to help you write better code. When you're learning to program, this is a good thing, which is why all of us at that time (late 1970's, early 1980's) were taught Pascal first.
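

Here's a minimal sketch of the idea, written in Java rather than Pascal (Java is the language of the example further down, and it enforces the same rule):


public class TypeDemo {
    public static void main(String[] args) {
        long big = 100000;
        // short small = big;       // won't compile: possible lossy conversion from long to short
        short forced = (short) big; // an explicit cast is allowed, but you get -31072, not 100000
        System.out.println(forced);
    }
}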


You can contradict me, but I claim that C was invented, or at least rose to popularity, because it eliminated all the rules imposed by Pascal. You could cast one type of variable into another, essentially jamming anything you wanted into memory. Kind of like assembly language, which has no "types". C is powerful, like a chain saw or a sharp knife. And yes, people have hurt themselves with C, particularly with memory management, which is explicit (you allocate memory when you need it, and you free it up when you're done). Get that wrong, and programs crash.


C and Pascal are compiled languages, which generate machine code that runs natively on the CPU. So you have to compile a C program separately to run on, say, an Intel CPU versus a Sun CPU, even if both are running Linux. This is both an advantage (fast performance) and a disadvantage (the need to recompile) of compiled languages.


Java was perhaps the most revolutionary of all these languages, because it "fixed" many of the issues with C: it is an interpreted language (compiled to bytecode for a virtual machine rather than to native machine code), so it can run (theoretically) without modification on any kind of CPU. Sun marketed this as "write once, run anywhere". In reality it's more like "write many times, run nowhere", but the problems with Java in the late 90's were less about the language and more that the toolkit kept changing as it tried to emulate every existing operating system. The language itself is intact, and it really is a nice language.


Java improved upon many other things that were missing or wrong in C, as well. Memory is "garbage collected", which means you never have to free what you allocate, and you can't really have a memory crash bug, which is nice. Lack of control over memory can sometimes lead to performance issues, but on balance it's probably a good tradeoff.
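

Here's a minimal sketch of what that buys you: the loop below allocates a million buffers and never frees a single one, and the program still runs in a small, steady amount of memory, because the collector reclaims the buffers once they become unreachable.


public class GcDemo {
    public static void main(String[] args) {
        for (int i = 0; i < 1000000; i++) {
            byte[] buffer = new byte[1024]; // allocated on the heap, never explicitly freed
            buffer[0] = 1;                  // touch it so the allocation does some work
        }
        // No free(), no dispose: the garbage collector reclaims every
        // unreachable buffer behind the scenes.
        System.out.println("done, and no memory crash");
    }
}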


My favorite thing in Java is actually the ability to glue strings together with the "+" operator (which can be done in C++ as well, but it's more of a hack and requires an object type other than plain C strings). In Java you can just add strings together, like this:


String myString = "this string " + 27 + " another string";
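

The 27 there isn't even a string; Java converts the integer to text automatically, so myString ends up holding "this string 27 another string".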


Java has become a web programming language, mostly because on a server the need for a consistent user-interface toolkit across operating systems largely goes away. Server-side web programming in particular has been dominated, until recently, by Java.


Now there are PHP and Ruby and other interpreted languages designed specifically for server-side web programming. PHP is the most innovative of these, since it can live in harmony with the HTML it's embedded in, which is a peculiarity of web programming.


Sorry about all the historical perspective here. I really did set out to make a point. So here it is...


Programming languages are just tools, means to an end. Depending on what problem you're trying to solve, you may be able to choose from quite a few different languages. This is a Good Thing.


But when I eavesdrop at conventions and in coffee shops and I hear young kids talking about programming languages, what I hear most is, "I learned it over the weekend" or something along those lines. Ease of learning is not a good reason to use a programming language, in my opinion. Balsa wood is "easy to learn" when you're woodworking because it's soft and you can cut it with a pair of scissors. It turns out not to be such a good wood for, say, building a dresser or a bed.


What I observe is that the craft of software programming is being polluted by throngs of young people who can "pick up" a programming language and "hack something together", and it sort of works: CPU's are so blindingly fast, and interpreted languages with garbage collection are so forgiving, that you never really have to learn any Computer Science to make things work.


But the skill and the judgement of choosing the right tools and materials to build the best possible software are disappearing rapidly, in the same sort of way that house-building has turned into a profession of stapling pre-formed lumber together, when 150 years ago almost everyone knew how to build their own house, could cut and square timbers, and built houses that lasted many, many generations. Those skills and that knowledge of materials are disappearing, too, along with the actual lumber, come to think of it.


Maybe nobody "needs" to know the difference between an 8-bit char data type and a 64-bit long integer, and maybe there's value in snap-together software. But I have a feeling that when we get to the point where everything is built out of Legos--houses, software, cars, everything--we will wish they still taught Industrial Arts and Programming in junior high school.
