Tuesday, December 27, 2005

Servers are the new Apps

My career has, until recently, been devoted to building applications. What this used to mean is programs that you ran on your computer to get stuff done. Photoshop. Microsoft Word. TurboTax. Excel. Most applications have been document-centric--you open and save documents, usually on your local hard drive.


All that has changed. Even Microsoft is starting to build the venerable Microsoft Office suite as a service instead of an old-style application.


What does that mean?


It means, fundamentally, that you should quit thinking in terms of documents and start thinking in terms of transactions. When you create a blog entry for your weblog, you're not "editing a document", you're posting a transaction to a database. The old way of doing this was to open your existing "web page" as a document in a "web page editor" like FrontPage or Dreamweaver, "save" your document with the new text in it, then "upload" the document.


All of that is out the window.


The word "server" is old-fashioned, but it's the closest word we have at the moment. What it really means is that your information is centralized or, as a better way to phrase it, "in the sky somewhere" and you access your data over the network, interacting with it in a series of transactions.


That makes the "server" into the "application", because that's where your data is being maintained, and it's the server that's doing the work of editing it. You're editing it by remote control, through a transaction metaphor.


What this means is that all document-centric software will be history relatively soon. I mean all of it. The idea of "open" and "save" will just go away. Photoshop will have to learn how to edit photographs transactionally over a network, rather than loading the whole thing into memory and "saving" it. Word will learn to edit text through transactions to a remote database, not a monolithic document model.


The nay-sayer in you is saying that this will never happen, yet it is already happening. Web pages are increasingly built through this model, because they are already in the sky somewhere, so the whole round trip of download-edit-save-upload makes no sense. Other document types will follow, until there is no such thing as a document or even a "file" any more.

Monday, December 26, 2005

Programming Languages

I've learned a lot of programming languages over my career. I started with BASIC, I think, back in the 1970's. There was Action! (a cartridge for the Atari 800), Pascal, DIBOL, C, sh, csh, and it's not even 1982 yet. In the 80's I learned PostScript, Lisp, and Objective C. In the 90's there was Java, C++, Perl, Python, Objective Pascal (!), and way too many conversions back and forth between C strings and Pascal strings. Now there's Ruby, PHP, and who knows what else?


Why do we (collectively) invent new programming languages? What's the point?


Well, I would say that each programming language really did improve on some shortcoming in predecessors, or at least that was the perception at the time. Some were more interesting than others.


Pascal is what's known as a strongly typed language. You can't assign a long integer to a short integer, even if the value is 0 or 1. The language won't let you. It's supposed to help you write better code. When you're learning to program, this is a good thing, which is why all of us at that time (late 1970's, early 1980's) were taught Pascal first.


You can contradict me, but I claim that C was invented, or at least rose to popularity, because it eliminated all the rules imposed by Pascal. You could cast one type of variable into another, essentially jamming anything you wanted into memory. Kind of like assembly language, which has no "types". C is powerful, like a chain saw or a sharp knife. And yes, people have hurt themselves with C, particularly with memory management, which is explicit (you allocate memory when you need it, and you free it when you're done). People make mistakes this way, and programs crash because of it.
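
The difference is easy to demonstrate. You don't even need C itself to see the idea; as a rough sketch (in Python, using the struct module), here is what "jamming bytes into memory as a different type" amounts to--packing an integer and rereading the same four bytes as a float:

```python
import struct

# Pack a 32-bit integer into four raw bytes, then reinterpret those
# same bytes as a 32-bit float -- the moral equivalent of a C cast
# like *(float *)&i. Pascal would refuse to compile either direction.
i = 1078530011                      # happens to be the bit pattern of pi
raw = struct.pack("<i", i)          # 4 bytes, little-endian integer
f = struct.unpack("<f", raw)[0]     # same 4 bytes, read as a float
print(round(f, 5))                  # 3.14159
```

No checking, no conversion: the bits just get a new interpretation. That freedom is exactly the power, and the danger, described above.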


C and Pascal are compiled languages, which generate machine code that runs native on the CPU. So you have to compile a C program separately to run on, say, an Intel CPU versus a Sun CPU, even if both are running Linux. This is both an advantage (fast performance) and a disadvantage (need to recompile) of compiled languages.


Java was perhaps the most revolutionary of all these languages, because it "fixed" many of the issues with C: it compiles to bytecode for a virtual machine rather than to native code, so it can run (theoretically) without modification on any kind of CPU. Sun marketed this as "write once, run anywhere". In reality it's more like "write many times, run nowhere", but the problems with Java in the late 90's were more that the toolkit kept changing, trying to emulate all existing operating systems. The language is intact, and it really is a nice language.


Java improved upon many other things missing or wrong in C, as well. Memory is "garbage collected", which means you don't have to worry about allocating or freeing it, and you can't really have a memory-crash bug, which is nice. Lack of control over memory can sometimes lead to performance issues, but on balance it's probably a good tradeoff.


My favorite thing in Java is actually the ability to glue strings together with the "+" operator (which can be done in C++ as well but it's more of a hack and requires an object type other than C strings). In Java you can just add strings together like:


String myString = "this string " + 27 + " another string";


Java has become a web programming language, mostly because the need for a consistent toolkit across operating systems is greatly diminished. Server-side web programming in particular has been dominated, until recently, by Java.


Now there is PHP and Ruby and other interpreted languages designed specifically for server-side web programming. PHP is the most innovative of these, since it can live in harmony with HTML code in which it's embedded, which is a peculiarity of web programming.


Sorry about all the historical perspective here. I really was starting out to make a point. So here it is...


Programming languages are just tools, means to an end. Depending on what problem you're trying to solve, you may be able to choose from quite a few different languages. This is a Good Thing.


But when I eavesdrop at conventions and in coffee shops and I hear young kids talking about programming languages, what I hear most is, "I learned it over the weekend" or something along those lines. Ease of learning is not a good reason to use a programming language, in my opinion. Balsa wood is "easy to learn" when you're woodworking because it's soft and you can cut it with a pair of scissors. It turns out not to be such a good wood for, say, building a dresser or a bed.


What I observe is that the craft of software programming is being polluted by throngs of young people who are able to "pick up" a programming language and "hack something together" and it sort of works, because CPU's are so blindingly fast and the safety of interpreted languages with memory garbage collection means you don't really ever have to learn any Computer Science to make things work.


But the skill and the judgement of choosing the right tools and materials to build the best possible software are disappearing rapidly, in much the same way that house-building has turned into a profession of stapling pre-formed lumber together, when 150 years ago almost everyone knew how to build their own house, could cut and square timbers, and built houses that lasted many, many generations. Those skills and that knowledge of materials are disappearing, too, along with the actual lumber, come to think of it.


Maybe nobody "needs" to know the difference between an 8-bit char data type and a 64-bit long integer, and maybe there's value in snap-together software. But I have a feeling that when we get to the point where everything is built out of Legos--houses, software, cars, everything--we will wish they still taught Industrial Arts and Programming in junior high school.

Saturday, December 17, 2005

Razor Blades

In the computer industry, a common topic of conversation in discussions on "how to make money" is the old razor blade analogy. The idea is that you give away the printer, and make money selling ink cartridges. I think we've all been on the receiving end of that, when you realize how expensive the ink cartridges are, and how quickly you need them. "But I only paid $49 for the printer!" That's exactly how it works.


The other day an actual razor arrived in our mail at home, addressed to my wife. It was from Schick, and it's called the Quattro for Women. It has four blades! I was sort of amazed that they'd send a whole razor, and some blades, which I knew cost something like $10 at the store. How often does somebody send you a $10 product unsolicited in the mail? Then I realized it was a gambit to try to get my wife hooked on it so she'd buy more blades. The exact same argument we constantly have in the computer industry. I actually chuckled to myself.


I suppose Schick is upset because they have an early claim to the invention of the razor, by one Lieutenant Colonel Jacob Schick, in 1926. I am surprised to learn that Schick-Wilkinson Sword is now owned by Energizer (yes, the battery people) and that schick.com gives "404 not found". But I digress.


Gillette made a huge splash a few years ago with their Mach 3, which had three blades!


I remember when the twin-blade razors came out. Now that was a real revolution. It had two blades! And they had lots of diagrams and pictures to show you how the first blade bent the hair over, and the second blade clipped it. It worked on me, and the 3-blade update even worked on me. I use a Mach 3 from Gillette to this day.


But wait! The 4-bladed razor isn't enough. Gillette fires back with the Fusion razor, which, you guessed it, has five blades! Maybe this has something to do with Gillette's being acquired by Procter and Gamble. In researching this blog post I see now that the competing Quattro from Schick has been out since last year, but I didn't know about it. Maybe they started carpet-bombing the world with free razors because of Gillette's announcement in September of the 5-blade razor.


This is clearly getting ridiculous. I tried my wife's 4-blade razor and I can tell you it doesn't work as well as the 3-blade razor, and if I were objective and didn't like my Mach 3 so well, I'd probably have to admit that the 2-blade razors are fine, too. I don't know about you, but my face is not flat, and most surfaces that get shaved are not flat, so it's hard to see how more than two blades could be improving the situation much, especially on concave surfaces like underarms. But clearly that's not the point. We live in a culture where more is better almost by definition.


This escalation seems silly, yet there is big money chasing this industry, all because of the original concept of keeping the handle and having to buy the blades.


I'm sticking with my trusty Gillette Mach 3, and hoping that now that all these new-fangled 4- and 5-blade razors are out, my blades will get cheaper. And I have no doubt that Schick is busily at work on the 6-blade razor, and that both companies have skunkworks projects working on 7- and 8-blade razors. I can't wait.

Friday, November 11, 2005

That's Palo Alto for ya...

I had a quick dinner in downtown Palo Alto tonight after soccer practice, with my wife and two kids and one of my daughter's friends. We went to the California Pizza Kitchen.


To my amazement, within a 20-foot radius, also having a quick dinner with their families, were Ross Mayfield, the CEO of SocialText, a popular wiki provider; David Hornik, noted VC and major investor in Six Apart; and Jeff Jordan, the President of PayPal.


And people wonder why so many amazing things happen in Silicon Valley. We could practically have put together a deal right there, over Thai Chicken pizza. Instead, of course, we all enjoyed our dinners and went on our way, spending valuable time with our families.


But who knew? Maybe the California Pizza Kitchen is the new Buck's :)

Wednesday, November 09, 2005

Little Things for the customer

We've made a few little changes that you might not notice unless you were one of the folks who wrote to our Support team asking for them.


One thing is that the viewer for photographs just got better. Super-large photos are now resized automatically so they display at a good size for computer screens, and there's a navigation link to click through all the photos on a page without closing and opening the windows individually. A little thing, but a welcome improvement.


Podcast RSS feeds can now be subscribed to directly in iTunes (including videos!). This is pretty cool, actually. If you drag some audio and/or video files onto a Bubbler page and click the "RSS Feed" box in web settings, you are now podcasting, and people can subscribe to your podcast directly from iTunes (Advanced menu).


Another is that the "URL" field is no longer required when posting comments. This seemed to confuse some people who don't have blogs or web sites of their own, so now it's an optional field instead of a required field. A little thing, but important.


The Files list used to also include Photos. It doesn't now; it lists the "other" files that aren't photos.


The elves are always working here at Five Across, and although you don't always see the changes, things are always getting better, and faster.

Thursday, October 13, 2005

If you go to our bubbler.net site and look around, you'll see that our web platform supports custom domain mapping.


It has been dawning on me, by reading some blogs (cybersonic) and from talking to customers, that the way we do this is really far superior to the Virtual Host mechanism in Apache. So let me geek out for a second and talk about this, because it's cool.


DNS servers are one of those things that everybody knows they need to configure to get on the internet, but most people aren't quite sure what they do. It's pretty simple: they look up a "name", and return a "number". Like a giant phone book. They resolve a name like 'bubbler.net' to an IP address (66.201.44.131). That's pretty much all they do.
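
That lookup is a one-liner in most languages. Here's a minimal sketch in Python (the numbers a live lookup returns today will of course differ from these 2005-era addresses):

```python
import socket

def resolve(name):
    """Ask the system resolver -- and ultimately DNS -- for an IPv4 address."""
    return socket.gethostbyname(name)

# "localhost" resolves without any network round trip:
print(resolve("localhost"))   # typically 127.0.0.1
```

Name in, number out: that really is the whole job.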


So when you type an address into a browser, it looks it up in the directory (DNS server) and opens an HTTP connection to that IP address. Simple, right?


But for "custom domains", what happens is that lots and lots of different domain names all map to the same IP address. In our system, not only does bubbler.net map to 66.201.44.131, so does blog.glennreid.com, m949.com, www.blodgettcommunications.com, and hundreds of others.


So how does the server sort this out, with all these domains coming to the exact same place, just an IP address?


The HTTP request header has the domain name that the user is surfing in on. The only question is what page to serve, as you don't want to give index.html to everybody. So you need to associate a user/page owner with each domain name.


With Apache, the administrator has to create a VirtualHost entry in the Apache config file, which is not for the faint of heart. The admin has to set up a "root directory" and permissions and decide whether or not to allow server-side includes and a bunch of stuff like that. It is not a trivial process. Then you have to restart the Apache server for it to take effect.


With our server, we just look at the incoming HTTP request header to see what domain it thinks it's being sent to, and we look that up on the fly to see if any of our users have claimed that domain for one of their pages, which they do by just typing it in, in the Rename Page dialog. It is instantaneous and highly scalable, unlike Apache, which reserves resources for each virtual host, so you can only host a relatively small number on any given server. We can host hundreds, or thousands, of unique domains on a single server, without the administrator doing anything, or even knowing about it. The users can do it themselves. The only hard part is configuring the domain registrar's interface, which can be a bit challenging (we have partnered with one registrar, PairNIC, which makes it fairly easy, and we have instructions on how to do it).
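
The mechanism is simple enough to sketch. This is not our actual server code, just an illustration in Python of dispatching on the Host header against an in-memory table (the table and helper names are made up for the example):

```python
# Name-based virtual hosting in miniature: many domains, one IP,
# one running server. DOMAIN_TABLE is a stand-in for a real
# user/page database that can be updated on the fly.
DOMAIN_TABLE = {
    "bubbler.net": "home",
    "blog.glennreid.com": "glenn/blog",
}

def route(raw_request):
    """Pick a page based on the Host: header of an HTTP/1.1 request."""
    for line in raw_request.split("\r\n")[1:]:
        if line.lower().startswith("host:"):
            host = line.split(":", 1)[1].strip().split(":")[0]
            return DOMAIN_TABLE.get(host, "not-found")
    return "not-found"

request = "GET / HTTP/1.1\r\nHost: blog.glennreid.com\r\n\r\n"
print(route(request))   # glenn/blog
```

Adding a new domain is just one more entry in the table--no config file, no restart--which is the heart of the scalability argument.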


Anyway, it occurred to me that we have a pretty great innovation in this area and as hosting providers discover this, I think they will be sort of amazed. It's not the kind of thing that gets a lot of attention in the press, to come up with a better way to do virtual host tables, but for the people who have to do this all day long--and there are a lot of them--it's pretty significant.

Saturday, October 08, 2005

Web 2.0: the energy is building

I am excited about a lot of the news coming out of the Web 2.0 show. Our server engine powers many of these kinds of services, and we're getting a ton of interest in it.


We're going to roll out a developer API relatively soon and we're looking for key developers to partner with to build innovative front-end applications on our platform. If you're interested, drop me an email.


We're about to announce some new OEM partners as well.


I do have to take issue with some of the press around Web 2.0 that is characterizing it as "frothy" or a "bubble economy". There is certainly some speculation happening, but it's too early to tell which things (voice over IP, social networking, blogging) will be the fabric of the new internet, and which are passing fads. We're agnostic, and believe that all of these things should be the basic building blocks of the next-generation internet. And we plan on powering it.

Tuesday, September 20, 2005

Software Design: Ease of Use vs. Discoverability

Almost every product you pick up, whether it's a hot glue gun or a $1000 impossibly complex piece of software, says this on it somewhere: "Easy to Use!"


But many products aren't so easy to use, as we all know. So this term has almost lost its meaning, like "intuitive" or "all natural". So what do you say about your product if it truly is easy to use? And what does that mean, anyway?


I want to make a distinction between ease of use, and ease of learning or discoverability.


Here's a simple example in software products: drag and drop. It may be easy to use, but how the heck do you discover that certain areas of certain windows can have things dropped on them? And just what kinds of things can you drop on them? The inverse of this is the File/Open menu item, which brings up an open panel in which you hunt around your hard drive in a tiny little modal window. That functionality is easy to discover (it's right there in the menu) but very tedious to use, particularly if you have a lot of files, in different places.


Another example is the use of styles in programs like InDesign and Word. It's really easy to figure out what they're supposed to do, and really hard to use them. The problems range from scope (what in the heck will change when you click Apply?) to the tedium of repeatedly selecting text and then going back to the palette to click the style. NeXT Computer innovated in this area with "Copy Style" and "Paste Style"; it's now in many MacOS X apps, and still nobody has noticed. You can load up the style you want to use and then, with one hand, select text; with the other, you "Paste Style" with the menu key equivalent. It's amazingly efficient and easy to accomplish. Why isn't this idea all over the place, in every product? Or Adobe Illustrator's "Transform Again" menu command. Brilliant.


In my view, the right choice is "easy to use, hard to discover", with some way to make it more discoverable. Tool tips, horrible as they are, were invented for this reason, to help you discover stuff that you might not otherwise have noticed. The dreaded "Tips" that pop up, with the handy check box, "Don't show me this again", are actually a really good idea. If they're telling you something you already know, at least you won't have to suffer through it again; and if you didn't know whatever it is they're telling you, often it's truly helpful. Microsoft really pioneered this approach, and, other than overdoing it with dancing paperclips and puppies, did a very good job of integrating it into their products. Their only mistake, I think, was having 38 small opt out preferences, rather than one big giant "leave me alone!" preference (I'm thinking of Word here, with all its helpful grammar checkers, date suggesters, "You seem to be writing a letter; want some help?" and stuff like that).


Ease of use boils down to this: can you remember, a month after you last used something, how to use it efficiently?


Discoverability is a bit different: can you figure out most of the functionality of a program within the first 10 minutes, "cruising the menus"?


I think one of the best (simple) ideas of all time in this category is, I believe, Microsoft's. It's the Recent Documents menu. It does exactly what you want 99% of the time, bringing back the documents you were most recently working on, regardless of where on your computer they might be living—and it is obvious how the feature works.

Monday, September 19, 2005

AppleMatters vs. iMovie

If you read this posting on the "AppleMatters" web site:



http://www.applematters.com/index.php/section/comments/540/


You would think that iMovie was just a modest adaptation of existing movie editing software. As the original author of iMovie, I wrote a comment on that site, which required me to register.


I received some spam as a result of registering, but my comment was never posted. This bothers me still, as it seems not to be such an open forum if comments are silently not posted. So I responded to the spam, which went as email to Hadley Stern, who never responded, and my comment is still not posted. I lost my original comment, but here's the text of my email to Hadley. At least I can comment over here on my blog and maybe somebody will read it :)




To: hadley@applematters.com

From: Glenn Reid <glenn@fiveacross.com>

Subject: Re: Win a Mac mini forum contest (really another issue)

Cc:


Hi Hadley,


Your site requested that I registered in order to post a comment, and you have no problem sending me spam as a result of my registration, yet my comment never appeared on this article:


http://www.applematters.com/index.php/section/comments/540/


As the author of iMovie at Apple, I was trying to take issue with Chris Seibold's contention that iMovie was somehow derived from Final Cut Pro. It wasn't. It was written ground-up (by me). iMovie 1.0 was a dramatic sea change and ushered in the era of Digital Media. I and a very small team (3 total) wrote iMovie in just 9 months in 1998, and it first shipped on the iMac DV. It was DV-only, brought a copy/paste metaphor to the previously hobbyist-only marketplace that was video editing, and the iMac DV showed for the first time why you might want a PC with a large hard drive (though it was only a 6GB drive!). To say that this was not innovation is a bit short-sighted. To say this:


"Not only was the original iMovie limited it was also a program based on an application conceived outside of Apple. The Cupertino giant did not invent iMovie they merely purchased an existing program and polished to a point where the masses would fall in love. Far from being the exception to the rule this is the norm."


Is just plain wrong.


iMovie was the first ground-up application in a very long time at Apple (10 years maybe?). Big companies do usually just buy and repurpose apps. Apple, starting with iMovie, is writing some new ones. I also was the principal author of iPhoto, also written from the ground up (iTunes, it is worth noting, was an acquisition: it was originally called SoundJam). iMovie, iPhoto, Keynote, and Pages were all written from scratch and are truly innovative. Even Adobe and Microsoft don't write apps from scratch any more, as far as I can tell, with the possible exception of InDesign, which took probably over 5 years.


Cheers,
Glenn Reid
Founder/CEO
Five Across, Inc.
http://www.fiveacross.com/company/team.shtml



>To the many hundreds of you who have registered over the past couple of days
>welcome to Apple Matters!
>
>To our older members hello! In case you haven't heard yet Apple Matters is
>having a competition running from now until October 31st. Each time you post
>to our new forums you get an entry to win a Mac mini setup worth over $700.
>
>Details can be found here:
>http://www.applematters.com/index.php/section/comments/win_a_mac_mini_and_a
>_ministack/
>
>Why are we doing this? Well, because we want your help building the best
>forums on the Apple web.
>
>So keep posting!
>http://www.applematters.com/index.php/forums/
>
>Hadley Stern
>Publisher, Editor-in-Chief
>AppleMatters
>http://www.applematters.com
>hadley@applematters.com



Wednesday, August 31, 2005

Serving our own Food

Have you heard the phrase "eat your own dog food" as applied to the technology business? If you expect your dog to eat it, you should eat it too. Which means, use your own software, if you expect other people to use it!


We do this, at Five Across, and our business is practically run on the collaboration and file sharing that is provided by our Bubbler Instant Messenger application.


But yesterday we made a more important transition: now we're serving our own dog food too.


We moved our main bubbler.com site from Apache to our own Five Across web server. So the whole site bubbler.net is served by our web server, not just the blogs themselves. It is how we structure the product for our OEM customers, so we use it too.


This is a "transition" because we built out the bubbler.com web site before our own software was ready, so we had two sites (bubbler.com, which was powered by Verio, and bubbler.net, which was our blog hosting server). Now the whole thing is powered by our Five Across Web Platform, and it's performing beautifully. It handles server-side includes. It handles multiple IP addresses per server. It is a total replacement for Apache, yet it does way more. Blogs, file serving, podcasting, instant messaging -- all built directly into the web server itself--compiled C code, blazingly fast. It's THOUSANDS of times faster and more scalable than our competition, scripted things like Moveable Type. Perhaps 10's of thousands. We're planning to do some performance benchmarking soon.


This is getting to be really fun.

Performance

Underneath the hood of our "fastest blogging engine in the world" is in fact an engine. We built our own web server from scratch, and it is massively parallel and breathes fire. We weren't sure exactly how fast it was, but we knew it was pretty fast...


One of our OEM's, about to be announced, happens to be expert in web servers, and routinely deploys several different ones--Apache, of course, but also some more innovative high-performance servers. They benchmarked our server with some sophisticated test tools as part of the evaluation process...


Our server can handle roughly 600 times the traffic load that Apache can handle. We were clocking about 6 million hits a second across 20,000 parallel users. That's cooking with gas.


And we don't even sell it as a web server. It's just the engine on which we build lots of innovative services like blogging, podcasting, file sharing, etc.


What I think this means is that if you're considering offering a blogging service, you may need to buy one piece of hardware to run our solution, or 600 pieces of hardware to handle a comparable load with Apache. Except that the test wasn't of Perl scripts running through CGI, it was flat HTML serving. So maybe multiply that number by another 1,000 or so.


If you can't tell, I'm proud of our server. Put simply, it kicks butt.

Monday, August 01, 2005

Livewire

Last week our new OEM partner launched a great new community site called Livewire, powered entirely by the Five Across Server and our publishing and messaging client applications.


I am especially proud of our web server, which serves their entire site and is wickedly fast, handling large amounts of traffic as well as driving the blog/site hosting. We are benchmarking much, much faster than Apache and able to handle far more concurrent transactions, which is cool. And it runs great on FreeBSD, the OS of choice for our OEM partner.


The Livewire team have a great vision for their service and we're working hard with them to make it a success. I encourage you to check it out.


And if you have a dream of creating and launching a blog hosting service, a social networking site, or giving your customer base a place to comment, share ideas, blog, or communicate--drop me a line. That's what we do.

Thursday, July 07, 2005

I Don't Like Dashboard or Spotlight

<rant>
I've upgraded to Apple's "Tiger" OS and I'm trying to like it, but really I don't.


I don't need or want Dashboard widgets, but there's NO WAY to turn them off, or opt out. The closest thing is to "Remove from the Dock" but all the widget processes are still running.


I also don't want Spotlight indexing everything on my computer, and there is no opt out for that either! I turned off all the check boxes for types of content in preferences but it's still indexing away and there's NO WAY to turn it off or stop it. That is totally bogus.


Do I have to go in and edit /etc/rc files to stop this?! That's crazy! New features are fine, but it's my CPU and I want an off switch for these new things!!


</rant>

Monday, July 04, 2005

Open Source

Thanks to Alan Kleymeyer for this pointer to an open-source "smackdown" posting on Forbes.com. Very interesting.


I have some opinions about open source. My old company, RightBrain Software, released several products as open source for NeXT computers in 1992, long before the current craze. It didn't work that well as an endeavor, mostly because people don't really want to see or touch the source code. "Open source" is a euphemism for "free". People want free stuff. It's as simple as that.


But is it? Most open source is almost complete and somewhat maintained. A long dig through sourceforge.net, one of the biggest open source repositories, will show you that a staggeringly large number of source projects are languishing (or rotting, frankly).


Open source projects usually come into existence when one extremely bright programmer builds something ambitious, but then tires of it, or gets a day job. Unable to just delete and forget the project, he/she pushes it into the open source community, where (if it's cool, and useful) it is swarmed by a half dozen programmers who "maintain" it for a while, or port it to BeOS, or fix compile errors in new versions of MacOS X or whatever it takes.


I've seen many good programmers use open source projects as "example code" to build commercial products. I've rarely seen open source actually become the foundation of anything useful. And when it does, it becomes extremely difficult to maintain, fix bugs, and move forward, because none of the original authors of the code are around to help.


It's not that open source is bad, it's just a lot more difficult to use effectively than most people think. And it's definitely not going to save the world.

Sunday, July 03, 2005

SuperNova conference

David Weinberger did some great video interviews for c|net at the SuperNova show. Here's the link to the whole page with lots of interesting folks.


There's an interview of me talking about how blogging is a symptom of a bigger problem, that it's still way too hard to get a web presence.

Tuesday, June 28, 2005

WEB: Tangle, Weave

I'm old enough to have seen a few things go by in the technology world, some good, some bad. What fascinates me is when younger people accidentally re-invent something that has been done before. This happens a lot more than you might think.


A historical case in point....


A couple of decades ago or so, the grandfather of Computer Science, Donald Knuth, chairman of the CS department at Stanford for many, many years, invented a typesetting language called TeX. It was one of the first markup languages, and was created mostly to typeset mathematics, which is very hard to do. Knuth was one of the first people to take an interest in digital typography and typesetting, and in fact created a Digital Typography degree program at Stanford, which graduated, I think, only about a half dozen people, most of whom I know personally (Dan Mills, Carol Twombly, David Siegel, Cleo Huggins). But I digress...


Knuth created a system he called WEB in about 1982 (10 years before the "world wide web"). It was a programming language (PASCAL, more or less), but the comments in the program were in fact instructions to a typesetting system that would typeset the text of the program itself.


The key idea here is that a single text file (a .web file) could be read by two different programs, and each would see something different:


  • the PASCAL compiler would see source code

  • the WEB compiler would see typesetting instructions

If I recall correctly, the interpreter that read the "program" code was called Tangle, and the interpreter that read the "typesetting" code was called Weave.


The notion of "structured comments" may, I think, have started here. By structured comments I mean information that is technically a "comment" (not interpreted as part of the source code) but is structured in such a way that other programs can make sense of it.
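To make the idea concrete, here is a toy sketch in Python of the Tangle/Weave split. This is not Knuth's actual WEB syntax; the "#:" marker is invented purely for illustration. One source file, two readers, two completely different views of it:

```python
# A toy illustration of the Tangle/Weave idea (not Knuth's actual WEB
# syntax): one source file, two readers, two different views.

SOURCE = """\
#: The |square| function multiplies a number by itself.
def square(x):
    return x * x
#: We use it to build a small table of squares.
table = [square(n) for n in range(4)]
"""

def tangle(text):
    """Like TANGLE: keep only the program, dropping documentation lines."""
    return "\n".join(line for line in text.splitlines()
                     if not line.startswith("#:"))

def weave(text):
    """Like WEAVE: keep only the documentation, dropping program lines."""
    return "\n".join(line[2:].strip() for line in text.splitlines()
                     if line.startswith("#:"))

print(tangle(SOURCE))   # the compiler's view: pure source code
print(weave(SOURCE))    # the typesetter's view: pure prose
```

The point is that neither reader has to understand the other's half; each simply skips the lines it regards as "comments."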


I borrowed this idea in 1985 or so to enhance and expand PostScript comments to contain information to be read by document processing systems. Inserting %%Page: 3 into a PostScript program allowed other programs to find the page boundaries.
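Here is a small sketch of how a document processing program might use those structured comments. The PostScript fragment below is invented for illustration, but %%Page: is a real convention from Adobe's Document Structuring Conventions:

```python
import re

# A fragment of a (made-up) PostScript file.  To the PostScript
# interpreter the %%Page: lines are just comments; to a document
# manager they mark where each page begins.
POSTSCRIPT = """\
%!PS-Adobe-3.0
%%Pages: 2
%%Page: 1 1
/Helvetica findfont 12 scalefont setfont
72 720 moveto (Hello) show showpage
%%Page: 2 2
72 720 moveto (World) show showpage
"""

# Find each page's label and the line it starts on, without
# understanding any PostScript at all.
pages = [(m.group(1), POSTSCRIPT[:m.start()].count("\n") + 1)
         for m in re.finditer(r"^%%Page:\s*(\S+)\s+\d+", POSTSCRIPT, re.M)]
print(pages)  # page labels paired with their line numbers
```

Note that the scanner never interprets the PostScript itself; the page boundaries are recoverable from the comments alone.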


This whole idea is being reinvented to some degree on the WWW now. What began as simple HTML tags has grown additional appendages, "tags" or "fields", which are used to communicate with other programs that read the HTML pages (search engine crawlers, or XML parsers, or whatever).


There are good and bad things about this. The good is obvious: an HTML page can contain hidden, embedded instructions for other programs to read. The bad is harder to spot, but real, and insidious. One problem is that validating these files is almost impossible: information in a "comment" cannot, by definition, be verified by the parser to which it is a comment. Even JavaScript programs look like "comments" to most HTML parsers, including search engines. You can't spot a JavaScript error except by running the code through a JavaScript interpreter, which in turn sees the HTML markup as "comments".


I chuckle to myself when I hear terms like "micro-formats" and I see entire companies like Technorati being built around the idea of hiding metadata inside HTML comments. It will certainly work for a while, but it's extremely hard to standardize and police, it's trivial for competing standards to pop up, and eventually the whole thing collapses under the weight of micro-variations, competing non-standard implementations, and sometimes, just plain old questionable motives.
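As a concrete illustration of the fragility, here is a sketch of metadata hidden in HTML comments. The "meta:" convention below is invented, not any real micro-format: a validator sees nothing at all, and only a program that happens to know the private convention can read it.

```python
import re

# A hypothetical page that hides machine-readable metadata inside
# HTML comments.  Browsers and validators ignore it entirely, so
# nothing can check it for errors; only a program that knows the
# private convention can extract it.
HTML = """\
<html><body>
<!-- meta:author=Jane Doe -->
<!-- meta:topic=typography -->
<p>Visible content.</p>
</body></html>
"""

metadata = dict(re.findall(r"<!--\s*meta:(\w+)=(.*?)\s*-->", HTML))
print(metadata)
```

A second tool with a slightly different comment format would silently see different metadata, or none, which is exactly the standardization problem described above.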

Thursday, June 16, 2005

Ease of Use

I have probably blogged about this before, but "easy to use" has become a meaningless phrase in today's technology world. I ran a test through Google for the literal phrase (with quotes) "ease of use" and it told me that about 10,100,000 pages match. "user friendly" does even better: 15,000,000 results.


I see that IBM is the first entry in the results for "ease of use". I don't think IBM has ever made a product that is truly easy to use. Intel is right behind them. Intel has easy-to-use products? They make microchips!


We actually do have a product that's easy to use, but unfortunately we can't really say that, because the phrase has lost all meaning.


If you read this blog post, and there are words that actually mean something to you in evaluating software, I'd love to know what they are. Add a comment to this post or send me email. I'm betting against compelling, fast, innovative, and all the other words we use to describe our products.


I'm considering a marketing initiative in which we borrow from another lexicon. How about tasty or oaky or loyal or frictionless or maybe XML-enabled?

Saturday, May 28, 2005

Drag and Drop Web Publishing

During our public beta period we learned a lot, mostly from our customers. Overwhelmingly they told us: you are a site-building tool, and blogging is just a small part of it.


This is fascinating to me, because we launched the beta as a blogging tool. I now believe that a lot of the people who are investigating the blogging phenomenon don't really want blogs, they want web sites, but they want them to be dramatically easier to build and maintain.


So we added more templates that have horizontal menu bars and enhanced the multiple page and multiple site support. And what we ended up with is truly drag and drop web publishing.


It simply does not get any easier to get content onto a web site. Drop photos (or audio, or video) onto Bubbler, and your files are instantly live on the internet. Nothing else to do. Not even an OK button, much less a "publish and wait a few minutes" button.


We think that all web sites should be built this way: just take a handful of content and throw it, and it will stick to the web. Simple. Powerful.

Tuesday, May 24, 2005

Fixing problems

We are a real-time company. If you have problems with our software, tell us about them. We are responsive: we will help you quickly, and fix problems as fast as we can.


In the traditional software world, you had to wait six months for a bug fix or a new release of your favorite productivity tool. We've learned to "live with" bugs in software because the hope of getting them fixed seems very dim. Can you imagine writing to a big software company's support address and either (a) getting an actual response from a human, or (b) seeing your problem addressed?


We've fixed a good number of problems within a few hours of hearing about them from our users. Horst-Dieter Schipporeit, one of our customers in Germany, writes to our support hotline and points things out, and we fix them. I think he would describe us as a responsive company, even if our response is sometimes "that's a great idea, but it's a few months down the road."


Try us.

Monday, May 16, 2005

Life's Persistent Questions


I'm staying at the Marriott Marquis hotel on Times Square on the 26th floor for the Syndicate conference that starts tomorrow.


I was talking to my almost-9-year-old daughter at home on the phone, telling her about the hotel, and she came out with this perfect radio voice...


"On the 26th floor of the Acme building, one man is still trying to find the answers to life's persistent questions--Guy Noir, private eye...."


I had to laugh. That's me, I guess :)