The XSL Challenge

Friday May 21st, 1999

C. David Tallman has news of an interesting article in which Michael Leventhal of CITEC (the group creating DocZilla) takes off the gloves and challenges XSL to best XML/CSS/DOM in a contest of functionality and usability in the web application space. Interestingly, Mozilla plays an important role in the contest. From the article:

"Anything XSL can do in the Web environment, I can do better using technologies supported by current W3C Recommendations. Of course, what is 'meaningful' in the Web environment is open to a variety of interpretations. Therefore, the subject of the challenge should be one that the XSL camp and I agree is meaningful. I am also ready to make this bet a little bit more than an academic exercise. If I lose, I will pledge that I, and my crack mozilla development team, will assist in implementing XSL in the mozilla open source project. If my opponents lose they will agree to desist from XSL advocacy, vote against an XSL Recommendation if they are members of the W3C, and will join me in calling for full, flawless, and unequivocal vendor support of CSS1 and CSS2, DOM Level 1, and XML 1.0 as the very first and top priority of the web community."

#4 Re: The XSL Challenge

by JavaScriptDOMTedious

Saturday May 22nd, 1999 6:34 PM


mozillaZineAdmin, it is standard practice to write articles and publish them to get PR for your product/company. Every editorial that decries a competing technology is a subtle PR move. If you've ever written an article for a technical magazine, you know that you get PR out of it even if your company isn't in that area.

Regardless, the DocZilla connection is this: DocZilla claims you can view/print SGML with just CSS/DOM, which is true, but it's a bogus argument, and he should know better. Maybe they are too lazy to write an XSL/DSSSL engine for their SGML browser?

1) You can't *EDIT* heavy DOM/JavaScript-laden documents the way you can documents styled with a declarative language like CSS, XSL, or DSSSL. No one is going to write a two-way WYSIWYG editor like Dreamweaver that can automatically parse complicated DOM-manipulation expressions and figure out what is going on, because DOM/JavaScript isn't side-effect free.
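To make the round-trip point concrete, here's a toy sketch (in Python, with a deliberately simplified rule syntax — none of this is DocZilla or Dreamweaver code): a declarative rule can be parsed into a plain data structure and serialized back out losslessly, which is exactly what a two-way WYSIWYG editor needs. There is no equivalent trick for arbitrary imperative DOM scripts.

```python
def parse_rule(text):
    """Parse a single 'selector { prop: value; ... }' rule into a dict."""
    selector, _, body = text.partition("{")
    body = body.rstrip("} \n")
    decls = {}
    for decl in body.split(";"):
        if ":" in decl:
            prop, _, value = decl.partition(":")
            decls[prop.strip()] = value.strip()
    return selector.strip(), decls

def serialize_rule(selector, decls):
    """Write the parsed rule back out -- the editor's 'save' path."""
    body = "; ".join(f"{p}: {v}" for p, v in decls.items())
    return f"{selector} {{ {body} }}"

sel, decls = parse_rule("h1 { color: navy; font-size: 18pt }")
# Round-trip: the editor can regenerate an equivalent rule from its model.
print(serialize_rule(sel, decls))
```

Because the rule is pure data with no side effects, parse and serialize are trivially inverse operations; a script that mutates the tree at runtime has no such stable textual form.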

2) Declarative languages lead to BETTER optimization, not worse, as the DocZilla guy claims.

For example: with XSL you can use the filter paradigm to transform a document on the fly without using up memory. With DOM trickery, you end up creating a whole new document in memory. Some transformations can turn a 10K XML or SGML file into a 100K document of flow objects.
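Here's a rough sketch of the difference (Python standard library; the document and the "flow object" output format are made up for illustration). A streaming, filter-style transform touches one element at a time and discards it, instead of materializing the whole transformed tree the way a DOM approach does:

```python
import io
import xml.etree.ElementTree as ET

def stream_transform(xml_text):
    """Emit transformed 'flow objects' one at a time, keeping at most
    one source element in memory instead of a whole result tree."""
    out = []
    for _event, elem in ET.iterparse(io.StringIO(xml_text), events=("end",)):
        if elem.tag == "para":
            out.append(f"<block font='serif'>{elem.text}</block>")
            elem.clear()  # discard the node as soon as it is transformed
    return "".join(out)

doc = "<doc>" + "".join(f"<para>text {i}</para>" for i in range(3)) + "</doc>"
print(stream_transform(doc))
```

The same shape works whether the input is 10K or 10MB; the DOM version's memory footprint grows with the size of the result document.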

Or, with XSL expressions, you can compile them to a finite-state automaton, as with regular expressions, and search a document tree *much, much* more efficiently than with JavaScript loops and DOM calls.
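As a rough illustration of the idea (this is not XSL syntax — the `//` pattern notation and helper below are invented for the sketch), an ancestor pattern can be compiled once into an automaton — here by leaning on Python's regex engine — and then matched against element paths with no hand-written traversal loops:

```python
import re

def compile_pattern(pattern):
    """Compile a 'chapter//para'-style ancestor pattern to a regex,
    where '//' means 'any number of intervening elements'."""
    parts = pattern.split("//")
    return re.compile("/" + "/(?:[^/]+/)*".join(parts) + "$")

matcher = compile_pattern("chapter//para")  # compiled once, reused per node
paths = ["/chapter/para", "/chapter/section/para", "/appendix/para"]
print([p for p in paths if matcher.search(p)])
```

The point is that the pattern is analyzed once up front; the per-node match is then a cheap automaton step, rather than re-interpreting a JavaScript loop and a chain of DOM calls at every node.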

There are so many technical arguments against DocZilla's article, including the existence of style transformation languages that are widely used in the SGML community, that I question the motives behind it. It smells like FUD.

Microsoft tried to pull the same stunt with SMIL, saying "anything you can do with SMIL to synchronize multimedia, you can do with JavaScript controlling plugins." Reason? Because they didn't have an implementation ready and RealNetworks had a good one.

Microsoft knows the right solution is specialized declarative languages; that's why they proposed Behavior Style Sheets and Vector Markup Language.

And what do you know, a few weeks later, Microsoft proposed HTML+TIME which competes with SMIL in some ways.

Finally, James Clark, Jon Bosak, Tim Bray, and everyone else with years of experience with SGML, DSSSL, and XML have chimed in and shot his argument down.

It is all too tempting to accomplish something new by hacking together a solution from the past. I mean, why do we need MathML? I can already transform LaTeX documents into HTML with GIFs substituting for complicated math expressions. Or, there are several Java applets out there that already parse and render MathML, so why does Mozilla need it built in?

It's precisely this attitude, of trying to accomplish something with a quick fix, that got Netscape into trouble in the first place. They should have rewritten their rendering engine two years ago, just like MS did, instead of trying to hack CSS into Netscape. That codebase was *beyond* ugly.

Hell, you can even see people on Mozilla's newsgroups trying to find ways to render MathML with CSS and Gecko, which is a totally bogus idea.

Do the work to support the right technical solution. Pay now, or pay later when you have to rewrite it anyway.