Tuesday, December 27, 2005

Servers are the new Apps

My career has, until recently, been devoted to building applications. What this used to mean was programs that you ran on your computer to get stuff done. Photoshop. Microsoft Word. TurboTax. Excel. Most applications have been document-centric--you open and save documents, usually on your local hard drive.


All that has changed. Even Microsoft is starting to build the venerable Microsoft Office suite as a service instead of an old-style application.


What does that mean?


It means, fundamentally, that you should quit thinking in terms of documents and start thinking in terms of transactions. When you create a blog entry for your weblog, you're not "editing a document", you're posting a transaction to a database. The old way of doing this was to open your existing "web page" as a document in a "web page editor" like FrontPage or Dreamweaver, "save" your document with the new text in it, then "upload" the document.


All of that is out the window.


The word "server" is old-fashioned, but it's the closest word we have at the moment. What it really means is that your information is centralized (or, to put it a better way, "in the sky somewhere") and you access your data over the network, interacting with it in a series of transactions.


That makes the "server" into the "application", because that's where your data is being maintained, and it's the server that's doing the work of editing it. You're editing it by remote control, through a transaction metaphor.
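

To make the transaction metaphor concrete, here's a minimal sketch in Java (the JDBC URL, credentials, table, and columns are all invented for illustration): "posting" a weblog entry is a row inserted into a database somewhere, not a file saved and uploaded.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class PostEntry {
    public static void main(String[] args) throws Exception {
        // Load a JDBC driver (MySQL's Connector/J here, but any driver would do)
        Class.forName("com.mysql.jdbc.Driver");
        // Connect to the database "in the sky somewhere" (URL is hypothetical)
        Connection conn = DriverManager.getConnection(
            "jdbc:mysql://example.com/weblog", "user", "password");
        PreparedStatement post = conn.prepareStatement(
            "INSERT INTO entries (title, body) VALUES (?, ?)");
        post.setString(1, "Servers are the new Apps");
        post.setString(2, "My career has, until recently...");
        post.executeUpdate();  // the whole "edit" is this one transaction
        post.close();
        conn.close();
    }
}

Notice there is no file to open or save anywhere in that picture; the entry exists only as rows on the server.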


What this means is that all document-centric software will be history relatively soon. I mean all of it. The idea of "open" and "save" will just go away. Photoshop will have to learn how to edit photographs transactionally over a network, rather than loading the whole thing into memory and "saving" it. Word will learn to edit text through transactions to a remote database, not a monolithic document model.


The nay-sayer in you is saying that this will never happen, yet it is already happening. Web pages are increasingly built through this model, because they are already in the sky somewhere, so the whole round trip of download-edit-save-upload makes no sense. Other document types will follow, until there is no such thing as a document or even a "file" any more.

Monday, December 26, 2005

Programming Languages

I've learned a lot of programming languages over my career. I started with BASIC, I think, back in the 1970's. There was Action! (a cartridge for the Atari 800), Pascal, DIBOL, C, sh, csh, and it's not even 1982 yet. In the 80's I learned PostScript, Lisp, and Objective C. In the 90's there was Java, C++, Perl, Python, Objective Pascal (!), and way too many conversions back and forth between C strings and Pascal strings. Now there's Ruby, PHP, and who knows what else?


Why do we (collectively) invent new programming languages? What's the point?


Well, I would say that each programming language really did improve on some shortcoming of its predecessors, or at least that was the perception at the time. Some were more interesting than others.


Pascal is what's known as a strongly typed language. You can't assign a long integer to a short integer, even if the value is 0 or 1. The language won't let you. It's supposed to help you write better code. When you're learning to program, this is a good thing, which is why all of us at that time (late 1970's, early 1980's) were taught Pascal first.
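

Java, which comes up below, kept a milder version of that strictness. As a rough analogue (Pascal's own syntax differs, of course), here is the kind of assignment a strongly typed compiler refuses to accept:

public class Narrowing {
    public static void main(String[] args) {
        long big = 1;
        // int small = big;     // won't compile: possible loss of precision
        int small = (int) big;  // you have to say explicitly that you mean it
        System.out.println(small);
    }
}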


You can contradict me, but I claim that C was invented, or at least rose to popularity, because it eliminated all the rules imposed by Pascal. You could cast one type of variable into another, essentially jamming anything you wanted into memory. Kind of like assembly language, which has no "types". C is powerful, like a chain saw or a sharp knife. And yes, people have hurt themselves with C, particularly in memory management, which is explicit (you allocate memory when you need it, and you free it up when you're done). People hurt themselves this way, and programs crash because of it.


C and Pascal are compiled languages, which generate machine code that runs natively on the CPU. So you have to compile a C program separately to run on, say, an Intel CPU versus a Sun CPU, even if both are running Linux. This is both an advantage (fast performance) and a disadvantage (need to recompile) of compiled languages.


Java was perhaps the most revolutionary of all these languages, because it "fixed" many of the issues with C: instead of compiling to native machine code, it compiles to bytecode that runs on a virtual machine, so it can run (theoretically) without modification on any kind of CPU. Sun marketed this as "write once, run anywhere". In reality it's more like "write many times, run nowhere", but the problems with Java in the late 90's were more that the toolkit kept changing, trying to emulate all existing operating systems. The language itself is intact, and it really is a nice language.


Java improved upon many other things that were missing or wrong in C, as well. Memory is "garbage collected", which means you don't have to worry about allocating or disposing of it, and you can't really have a memory crash bug, which is nice. Lack of control over memory can sometimes lead to performance issues, but on balance it's probably a good tradeoff.


My favorite thing in Java is actually the ability to glue strings together with the "+" operator (which can be done in C++ as well but it's more of a hack and requires an object type other than C strings). In Java you can just add strings together like:


String myString = "this string " + 27 + " another string";


Java has become a web programming language, mostly because the need for a consistent toolkit across operating systems is greatly diminished. Server-side web programming in particular has been dominated, until recently, by Java.


Now there are PHP and Ruby and other interpreted languages designed specifically for server-side web programming. PHP is the most innovative of these, since it can live in harmony with the HTML it's embedded in, which is a peculiarity of web programming.


Sorry about all the historical perspective here. I really was starting out to make a point. So here it is...


Programming languages are just tools, means to an end. Depending on what problem you're trying to solve, you may be able to choose from quite a few different languages. This is a Good Thing.


But when I eavesdrop at conventions and in coffee shops and I hear young kids talking about programming languages, what I hear most is, "I learned it over the weekend" or something along those lines. Ease of learning is not a good reason to use a programming language, in my opinion. Balsa wood is "easy to learn" when you're woodworking because it's soft and you can cut it with a pair of scissors. It turns out not to be such a good wood for, say, building a dresser or a bed.


What I observe is that the craft of software programming is being polluted by throngs of young people who are able to "pick up" a programming language and "hack something together" and it sort of works, because CPU's are so blindingly fast and the safety of interpreted languages with memory garbage collection means you don't really ever have to learn any Computer Science to make things work.


But the skill and the judgement of choosing the right tools and materials to build the best possible software are disappearing rapidly, in the same sort of way that house-building has turned into a profession of stapling pre-formed lumber together, when 150 years ago almost everyone knew how to build their own house, could cut and square timbers, and built houses that lasted many, many generations. Those skills and that knowledge of materials are disappearing, too, along with the actual lumber, come to think of it.


Maybe nobody "needs" to know the difference between an 8-bit char data type and a 64-bit long integer, and maybe there's value in snap-together software. But I have a feeling that when we get to the point where everything is built out of Legos--houses, software, cars, everything--we will wish they still taught Industrial Arts and Programming in junior high school.

Saturday, December 17, 2005

Razor Blades

In the computer industry, a common topic of conversation in discussions on "how to make money" is the old razor blade analogy. The idea is that you give away the printer, and make money selling ink cartridges. I think we've all been on the receiving end of that, when you realize how expensive the ink cartridges are, and how quickly you need them. "But I only paid $49 for the printer!" That's exactly how it works.


The other day an actual razor arrived in our mail at home, addressed to my wife. It was from Schick, and it's called the Quattro for Women. It has four blades! I was sort of amazed that they'd send a whole razor, and some blades, which I knew cost something like $10 at the store. How often does somebody send you a $10 product unsolicited in the mail? Then I realized it was a gambit to try to get my wife hooked on it so she'd buy more blades. The exact same argument we constantly have in the computer industry. I actually chuckled to myself.


I suppose Schick is upset because they have early claim to the invention of the razor, by one Lieutenant Colonel Jacob Schick, in 1926. I am surprised to learn that Schick-Wilkinson Sword is now owned by Energizer (yes, the battery people) and that schick.com gives "404 not found". But I digress.


Gillette made a huge splash a few years ago with their Mach 3, which had three blades!


I remember when the twin-blade razors came out. Now that was a real revolution. They had two blades! And they had lots of diagrams and pictures to show you how the first blade bent the hair over, and the second blade clipped it. It worked on me, and the 3-blade update even worked on me. I use a Mach 3 from Gillette to this day.


But wait! The 4-bladed razor isn't enough. Gillette fires back with the Fusion razor, which, you guessed it, has five blades! Maybe this has something to do with Gillette's being acquired by Procter and Gamble. In researching this blog post I see now that the competing Quattro from Schick has been out since last year, but I didn't know about it. Maybe they started carpet-bombing the world with free razors because of Gillette's announcement in September of the 5-blade razor.


This is clearly getting ridiculous. I tried my wife's 4-blade razor and I can tell you it doesn't work as well as the 3-blade razor, and if I were objective and didn't like my Mach 3 so well, I'd probably have to admit that the 2-blade razors are fine, too. I don't know about you, but my face is not flat, and most surfaces that get shaved are not flat, so it's hard to see how more than two blades could be improving the situation much, especially on concave surfaces like underarms. But clearly that's not the point. We live in a culture where more is better almost by definition.


This escalation seems silly, yet there is big money chasing this industry, all because of the original concept of keeping the handle and having to buy the blades.


I'm sticking with my trusty Gillette Mach 3, and hoping that now that all these new-fangled 4- and 5-blade razors are out, my blades will get cheaper. And I have no doubt that Schick is busily at work on the 6-blade razor, and that both companies have skunkworks projects working on 7- and 8-blade razors. I can't wait.