Web Hypertext Application Technology Working Group Launches Mailing List
Friday June 4th, 2004
Ian Hickson writes: "Some of you may be interested to hear that people from Opera and Mozilla have set up an open mailing list for discussing Web Forms 2 and other new specs that have been discussed in various places over the last few months."
The list is the public forum of the newly-formed Web Hypertext Application Technology Working Group, an organisation made up of contributors from several major Web browser development teams. Current invited members are Brendan Eich, David Baron, David Hyatt, Håkon Wium Lie, Ian Hickson, Johnny Stenback and Maciej Stachowiak.
The group is working on specifications for Web Forms 2.0, Web Apps 1.0, Web Controls 1.0 and a CSS object rendering model. This work will be largely done outside of the World Wide Web Consortium, though finalised versions of the specs are expected to be submitted for formal standardisation. While the decision to operate independently of the W3C may be seen as controversial, many feel that formal standards bodies move too slowly to react to proprietary technologies such as Microsoft's forthcoming XAML. In addition, many in the W3C are pushing for Web applications standards based on technologies such as XForms and Scalable Vector Graphics, whereas the members of the WHATWG favour backwards-compatible HTML-based solutions, which they believe would be easier to implement and more likely to be adopted by Web developers.
#1 A Rich Internet For Everyone (RICHIE)
Friday June 4th, 2004 8:38 PM
Allow me to highlight a similar initiative called RICHIE (short for a Rich Internet for Everyone) also known as the Open XUL Alliance. See [removed] for details.
#2 Re: A Rich Internet For Everyone (RICHIE)
Friday June 4th, 2004 8:35 PM
Allow me to inform you that if you spam my site again, your account will be turned off.
#3 Re: Re: A Rich Internet For Everyone (RICHIE)
Friday June 4th, 2004 9:05 PM
> Allow me to inform you that if you spam my site again, your account will be turned off.
Can you explain in more detail why you consider a link to the Rich Internet for Everyone (RICHIE) initiative to be spam? If you have ever checked out the RICHIE initiative, you will see that it launched a mailing list (that is, xul-talk) to work on interop for web applications more than a year ago. Even Ian Hickson, now the spokesperson for the WHAT WG, is subscribed to it.
#5 Re: A Rich Internet For Everyone (RICHIE)
Friday June 4th, 2004 10:04 PM
Amusingly enough, the WHATWG list, in its few hours of existence, has done more for interop than the xul-talk list has done in its entire existence. But that's another story...
#9 Is the RICHIE link spam?
by SpaceDogDN <email@example.com>
Saturday June 5th, 2004 7:28 AM
Indeed, in the context you introduced the link, it is spam. Effectively, you posted a message saying "Here's my competing website. [Link]". You didn't say how it was similar, how it differed, what the initiative is about, or why it was better than WHAT WG.
That's not to say that you don't have a point to make. It's just that you aren't really making one in your first posting. You're simply pimping your site.
> You didn't say how it was similar, how it differed, what the initiative is about, or why it was better than WHAT WG.
First, let's be clear that I welcome the new WHAT WG initiative. The RICHIE initiative is not better than the WHAT WG; it's just different. The RICHIE initiative is, surprise, surprise, about creating a RICH Internet for Everyone. The core difference is that the RICHIE initiative is somewhat more ambitious, as it tries to reach beyond HTML and web browsers (e.g. rich clients, rich browsers, rich portals).
#33 RICHIE - Another W3C Alternative
Monday June 7th, 2004 11:29 AM
> Allow me to inform you that if you spam my site again, your account will be turned off.
Any comments on why a link to the RICHIE initiative is spam? For those interested in the Rich Internet for Everyone initiative you can find out more @ <http://xul.sourceforge.net> and decide for yourself.
#38 Re: RICHIE - Another W3C Alternative
Tuesday June 8th, 2004 6:03 AM
You already had a comment telling you why, which you've just ignored.
You still haven't said how your link is related to this article about the WHATWG - you've just said that it's different to your project. Posting links without context tying them to the particular article means they're off topic. The admins can define spam however they like, but evidently Kerz feels that off-topic promotional posts fit the description. Obviously they'd lose the respect of some readers if other readers felt they were being unreasonable, but that's up to them and the other readers.
Personally I think doing some moderation of the posts out here would be good. Seems a little inconsistent to complain about your comments when they've posted your articles plugging your site previously, but there you are...
#8 Re: Spelling fix
Saturday June 5th, 2004 6:38 AM
OK, I was surprised enough by the "alliance", but this is another announcement on top of it... it's too much...
Is this announcement the end of a short love affair between Mozilla and SVG (<http://www.mozilla.org/ev…ozilla-futures/build.html>) that started with Brendan's future development talk? Yes, I know SVG work has been around for a long time, but it seemed to me that it got some more drive from Brendan's talk. Hixie's blog entry (<http://ln.hixie.ch/?start=1086387609&count=1>) seems to point in the same direction. It looks to me like the companies that drive SVG are primarily interested in getting rid of the browser itself. At least, requiring your own socket implementation and making it incompatible with CSS is a big step in this direction.
Reading the comments that Robert got in response to his mailing to the SVG mailing list indicates that the companies driving SVG would like to see a closed, self-sufficient standard. I can understand their motives: if you plan to develop proprietary content for mobile phones, where people need to pay for the content, this is the way to go.
The idea of vendor-driven, fast development of web technologies appeals to me. At least browser vendors should know by now what their customers would like to see or where they expect the market to go. It looks to me like browser vendors are trying to get their "freedom to innovate" back from a standards body. I believe this is good. It will put a stop sign on a path where people create unimplementable specs, remove attributes for purely "political" reasons (see how the XHTML people tried to remove the style attribute) and are in general slow. But this is what MS has been saying for years now.
#11 Re: WHATWG vs SVG
by SpaceDogDN <firstname.lastname@example.org>
Saturday June 5th, 2004 8:33 AM
I'm not really concerned about what the W3C does with SVG. SVG 1.0 is already out, and the browser makers can decide how or if they want to support it. If the W3C creates a newer spec that's too complicated to implement, the browser makers will simply ignore anything they don't want to implement.
Of course, what this may lead to is a company like Adobe creating a plug-in for SVG 1.2 that allows full-blown web applications. It's a sort of browser-on-top-of-a-browser situation similar to the way Java was going to be a platform on top of an OS. The various browser developers could do little to stop it, since they'd effectively have to cut out plug-in support. It's also possible such a plug-in would force Mozilla to implement at least some of SVG 1.2, since Mozilla would want to maintain its reputation as a supporter of web standards.
My major concern, though, is that the W3C may hold up the standardization of XBL while they're working on SVG 1.2. XBL, especially in combination with the use of alternative stylesheets, is an incredibly powerful tool for web developers. If we could get it standardized and put into browsers like Opera and Safari, we'd have a far more persuasive argument for people to move away from IE.
> It looks to me like browser vendors are trying to get their "freedom to innovate" back from a standards body. I believe this is good. It will put a stop sign on a way where people create unimplementable specs, remove attributes only by "political" reasons (see how the XHTML people did try to remove the style attribute) and are in general slow. But this is what MS is saying for years now.
Not exactly. Microsoft created its own proprietary extensions for the purposes of user lock-in and made little or no attempt to discuss the features with other browser developers beforehand. Then they complained that creating a standard would take too long. The reality was that they didn't care about making their extensions a standard, since it would only help other browser developers implement their proprietary features.
WHAT WG was created not because a specific developer wanted to do its own thing, but because the majority of W3C members aren't browser developers. They're plug-in developers. Some people within the W3C have even stated that the browser is dead. This kind of environment is openly hostile to the further development of existing browser-based standards. The only logical course of action in this situation would be for the various browser developers to form their own standards group, which is what happened.
The bulk of SVG is great and continues to make good sense. Mozilla already implements some 20% of it (that's a wild guess estimate) and will not be removing this support; Opera doesn't yet implement any of it but it is certainly something that has been looked at. As was mentioned at the workshop, many mobile vendors want SVG support, and Opera mainly caters for the mobile market.
The XForms standard appears cool, but in my mind I have the bad memory of this thing called CSS that the W3C devised. Sure, you can do all kinds of wacky things with overriding tags and absolute positioning of DIVs and such, but I still like to point out how difficult it is to make a simple column layout in CSS that does what a user would expect it to do. Sometimes these standards appear to have been created in a vacuum. One of the most common patterns a web designer uses is laying out columns for a web page. With all the time and effort that went into CSS, you would think at least a <COLUMN> tag could have been added that would allow text to flow from one container to another using an index attribute. It just makes me shake my head in disbelief sometimes, looking at the current schemes designers have to use to get DIVs to remain side by side: large margin sizes that force the columns to resize correctly. How did something that could have been spelled out so simply turn out to be as long-winded, unintuitive, and layout-unfriendly as CSS? I can only imagine what will happen to XForms once it becomes a 'standard'.
#15 Re: standards shmandards...
Saturday June 5th, 2004 10:28 AM
They managed to put something together for this in the CSS3 specs. Better late than never, I guess. Look at who worked on the document, though... It's edited by Håkon Wium Lie and has contributions from Hixie (and Glazman). The exact same people making this effort. In my view, this can only be a good thing. <http://www.w3.org/TR/2001…D-css3-multicol-20010118/>
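For the curious, the multi-column approach in that draft looks roughly like this (property names as in the working draft; treat this as illustration only, since no shipping browser implements them at the time of writing):

```css
/* A three-column layout per the CSS3 multi-column working draft. */
div.article {
  column-count: 3;              /* flow the content into three columns */
  column-gap: 1em;              /* gutter between columns */
  column-rule: 1px solid #999;  /* vertical rule drawn in each gutter */
}
```

Compare that with the margin-and-float contortions described above: the content flows between columns automatically, with no fixed-width DIVs.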
#13 What about Apple ?
by olace <email@example.com>
Saturday June 5th, 2004 9:33 AM
This is really great news. It is exactly how I think the web should develop: UA implementors develop a new technology, see if web developers use it, make it better, and submit it for standardisation once it's been proven useful and easy to use. That will be much better than trying to create a standard from nothing, without real experience to back it up.
I see that David Hyatt is part of the team. Does that mean that Apple is supporting this project too? If so, that's really good; it would mean that three of the four major browser developers are working together for a better web. If not, there is still time to convince them, or to go straight to the KHTML developers to see if they want to support this initiative. I think it is important that KHTML-based browsers are not left behind.
Google says that Maciej Stachowiak is also @apple.com. And counting Microsoft as a browser developer any more seems a bit of a stretch to me :-)
Tantek Çelik from Microsoft (IE Mac, W3C) is listed in the acknowledgements section of the web forms draft, so Microsoft is at least aware of this initiative.
#17 Grid widget alone worth price of admission
Saturday June 5th, 2004 1:49 PM
some sane, comfortable way to do tabular data entry would be fantastic.
would it be possible to define, say, a <select> just once and reuse it in each row of a tabular grid?
#19 Re: Grid widget alone worth price of admission
Saturday June 5th, 2004 10:55 PM
Tabular data is on the cards. As for reusing the data in a <select>, look at the "data" attribute in Web Forms 2.
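For the <select>-in-every-row case, the Web Forms 2.0 draft's repetition model is also relevant. A hedged sketch based on my reading of the draft (the syntax is draft-stage and may well change): a template row is marked with repeat="template", buttons with type="add" clone it, and [id] placeholders are replaced with the row index, so the <select> is written only once.

```html
<!-- Sketch of the Web Forms 2.0 repetition model (draft syntax). -->
<table>
  <tr id="order" repeat="template">
    <td><input type="text" name="item[order]"></td>
    <td>
      <!-- Defined once; cloned automatically for each added row. -->
      <select name="qty[order]">
        <option>1</option>
        <option>2</option>
        <option>3</option>
      </select>
    </td>
    <td><button type="remove">Delete row</button></td>
  </tr>
</table>
<button type="add" template="order">Add row</button>
```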
I can't understand why nobody adopts MS technologies, even when they don't comply with standards. The fact is that MS owns >90% of the browser market, so <10% care about Mozilla/Opera task groups. The average user has IE6 and is happy with it as long as he can browse the web. Competing products like Mozilla will only have a chance to grow significantly when they are backwards compatible with IE. So other browsers have to adopt IE behaviour completely and then implement features on top of it. In a few years every browser will have to deal with XAML. Not because it's a W3C standard or because it's cool, but simply because 90% of the market uses it.
Just forgot to mention that MS calls this strategy "embrace and extend". It's time that others hit them with their own weapons.
I disagree on two issues:
1. It is not just that they 'do not comply with standards'. The problem is that IE *controls* the technologies. If you allow that, you will *always* be running behind IE. Instead, you must aim to take the steering wheel while you're still able to. The more you allow the proprietary XAML technology to take hold, the less chance you have. Besides, there are nice things in this Web Forms 2 for M$ as well... The basic data typing, for example, is very useful for the Tablet PC support in Longhorn.
2. XAML will certainly not be adopted completely in a few years. It will take much longer. According to an interview with Microsoft employee Bob Muglia (on winsupersite.com), only now, after about 2 years, are companies really starting to migrate from Windows 2000 to Windows Server 2003. And there are still many around which use Win2k. The consumer market develops similarly. Just as the common/'dummy' user (the large part of those 90% you mention) doesn't download a better browser than IE, most people won't upgrade their OS until they get a new PC with whatever OS is delivered with it. And even after that, usually their older computer is still being used, given to a child or relative. It's really astonishing if you see the numbers on how many people are still using IE 5!! So, XAML will only get really commonly used in software and web applications once the backbone for it (Longhorn) has spread widely. After all, it is not a matter of just upgrading the software; it will require an OS upgrade. This is also obviously an opportunity for Mozilla to kick in, because it is much easier to tell people to 'download another browser' (or just 'a product to make this web application work', really) than to require them to make an OS upgrade.
p.s. 25% of the visitors to one of my sites use Netscape/Mozilla. Admittedly, it is a pretty technical site, but nevertheless it shows that there definitely *are* areas where the non-IE browsers have a pretty big share. If the experience of my site (and other related sites) were really much, much better in Firefox than in IE, it could definitely convince a number of people to choose Mozilla instead of IE. If the technical guys (who the 'dummies' turn to when they have a problem) start strongly recommending a different browser to those dummies, it may turn things around a little.
#24 Re: Re: When will they ever learn?
Sunday June 6th, 2004 12:24 PM
Note that Longhorn is still a few years away. Incubation time then adds another couple of years. So that adds up to 4 years, at least. But, indeed, time is still limited, so that's why this WHATWG is very nice as it moves (or is supposed to move) much faster than the W3C.
And no offense to the W3C, but I agree with buff to some extent in that it really *is* astonishingly difficult to make layouts like a common 3-column layout with header and footer. And it's not IE compatibility that's the pain here; even when doing it in Firefox it is difficult to find a clean method. OTOH, there are some very nice parts of CSS(3) that I'd very much like to see implemented in browsers today :). So some inventive input coming straight from a 'vacuum' would also still be appreciated... ^_^
#25 Re: Re: When will they ever learn?
Sunday June 6th, 2004 12:36 PM
And another thing about those browser usage numbers: Opera spoofs the user agent string of MSIE. Therefore, it is very difficult to count the number of Opera users, and, perhaps worse, those users are credited to MSIE instead! According to my page's tracker (eXtreme Tracking), I've had a grand total of 1 (!!) Opera user visiting my site. Clearly, this is nonsense :). So, among the remaining 60% of so-called IE6 visitors my site has, there might very well be 10-20% Opera users.
I think the 10-20% number is mainly wishful thinking. I don't think it will be nearly that high, for a site for the general public anyway. And AFAIK Opera has 'Opera' somewhere in the UA string, so you could count them. Easy test: install Opera and see if the counter works.
#28 It's very easy to count Opera numbers
Monday June 7th, 2004 2:17 AM
Opera numbers are very easy to count. Search for the string 'Opera' in the useragent. (Though I agree that the practice of spoofing useragents by default - something that nearly *all* browsers do to a greater or lesser extent, Opera being rather greater than most - is disturbing and not to be welcomed.)
If your tracker doesn't count Opera, which is the #3 browser engine (below MSIE/Win and Gecko) on the (non-technical) site stats I've looked at, then it's entirely broken. You should cease using that software and write to them to tell them why. I mean, if the Top 40 music charts only got as far as #2 wouldn't you consider that pretty poor?
The correct, relevant way to count is (a) to count engines or engine versions, and only (b), for mild interest, to track actual specific browser versions. This is something most log analysers have yet to realise. Damnit, maybe I should write one specifically for that...
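A minimal sketch of that engine-first counting in Python (hypothetical helper name; the example UA strings are abridged, not verbatim). The one non-obvious rule is to test for 'Opera' before 'MSIE', because Opera's default string impersonates IE:

```python
def classify_engine(ua: str) -> str:
    """Bucket a User-Agent string by rendering engine.

    Order matters: Opera's default UA string also claims to be MSIE,
    so the Opera token must be checked first.
    """
    if "Opera" in ua:
        return "Opera"
    if "Gecko/" in ua:                       # Mozilla, Firefox, Camino...
        return "Gecko"
    if "Safari" in ua or "Konqueror" in ua:  # KHTML-based browsers
        return "KHTML"
    if "MSIE" in ua:
        return "MSIE"
    return "other"

# Period-appropriate examples (abridged):
spoofing_opera = "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1) Opera 7.50"
firefox = "Mozilla/5.0 (Windows; U; rv:1.7) Gecko/20040614 Firefox/0.9"
ie6 = "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1)"
```

A real log analyser needs many more cases, but even this toy version avoids crediting Opera visits to MSIE.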
This is all good news. The WHATWG is a big step in exactly the right direction.
If the W3C goes in the wrong way, the WHATWG may even turn into an independent non-profit standards body, but that is idle speculation.
I was very pleased to see that Safari is with us. That's huge. With Opera, Safari, and Mozilla on board, we just may have enough to knock King Kong off the building. (Or is Microsoft "Mothra," after the butterfly?)
The WHATWG project is already pointing in a terrific new direction for the WWW, and it's just getting started.
Congratulations and godspeed to Hixie and the whole WHATWG.
The last paragraph smacks of 'we are finding it difficult to implement SVG, it's taken three years so far and still isn't at a releasable state, so let's knock up something inferior'. I wish we'd get SVG support some time this side of the next ice age; I was using vector graphics (Corel Draw!) when my computer ran Windows 3.1 so it's not like the concept is a new one. (And yeah yeah, I know, I'm not volunteering to develop it so I can't complain.)
I'm in favour of a 'clean break' for Web technology; remaining backwards-compatible for ever keeps sites messy. XHTML2 for example (which admittedly doesn't seem to have progressed recently) looks like a great improvement to me and, now that standards are XML-based, there really shouldn't be too much difficulty in implementing that; if and when all the major browsers support it, people can start developing pages using it, and we have a way forward. XHTML 1.0 with its 'backwards compatible' bodge-job simply made a mess of the situation; people continue to write broken HTML 4 but now they use the XHTML 1.0 doctype. Worse, they do exactly the same with 1.1.
Personally I'd prefer if new extensions to forms or other aspects were introduced only as part of the newer XML (well-formed) standards. The page isn't going to work in IE anyway... why not make a clean break?
In summary: (a) I think it's fine if Mozilla wants to do their own thing or if they want to form a cabal with the other 'browser opposition' groups to do their own collective thing (b) I think it would be unfortunate if they also do not, in parallel, make serious efforts to support appropriate W3C standards such as SVG (c) Compound XML documents with things like MathML and SVG (and any other format somebody wants to invent), via appropriate browser plugins/extensions that can handle namespaced DOM entries rather than only external files, could solve many problems in a coherent manner
PS Somebody suggested Safari was 'huge'; maybe they meant in terms of a PR coup, but if they were thinking actual numbers, that's probably an error. Safari's a percentage of all Mac users, which as a group are anything but 'huge'; there are fewer Mac Web users in total (in terms of pageviews, site stats) than there are Gecko users, and in my stats at least there are many fewer Safari users than Opera users; I'm willing to bet that's general. That doesn't mean it's not extremely worthwhile to have co-operation between the three main alternative (non-IE) browser engines, of course it is. Just don't overestimate the significance of Safari/Konqueror.
I think you've missed the point on the SVG stuff. This is about building applications on top of a future version of SVG that isn't finished yet, not just about doing vector graphics - SVG 1.0 support wouldn't enable that anyway. AIUI, the stuff in current drafts of SVG 1.2 isn't compatible with existing concepts, making it pretty much impossible to implement as part of the browser - it'd work more like a Java applet, suitable to run as a plugin or separate app, not as something you'd use in a compound document.
"if and when all the major browsers support it, people can start developing pages using it"
Well that's the big issue. If one of the major browsers doesn't support it (for example, the one with the largest market share which is produced by someone making their own standard), then you're stuffed unless you can grab the market from them before you start.
Hixie's blog entry at <http://ln.hixie.ch/?start=1085056751&count=1> makes the case in a more direct way than the document linked from the article.
> I'm in favour of a 'clean break' for Web technology; remaining backwards-compatible for ever keeps sites messy.
Messy, yes. But it also keeps sites working in the UAs that people actually use. The difficulty in breaking backwards compatibility is that it necessarily involves breaking your site for some subset of the possible audience. Of course, one could imagine two presentations of a site, one using the 'old' technology and one using the newer technology, but that is more expensive than just developing for the universally supported tech and may prevent the use of the more interesting or worthwhile aspects of the new development.
Given these limitations, the number of people who will actually use new technologies /on the web/ is very small.
> now that standards are XML-based, there really shouldn't be too much difficulty in implementing that
For browser authors? Sure - inasmuch as there will be no need to reimplement the parser, although presumably a lot of new code to actually display and interact with the documents will be needed. But browser authors don't write sites. People who write sites don't, as a rule, use XML-based languages, even where they have reasonably wide support. That's partially because XML isn't very widely supported, of course. I expect it's also partially because XML is an order of magnitude harder to write than HTML. The (lack of) error handling in XML documents requires an entirely different toolchain to HTML. One has to ensure at every stage that there is no possibility of trivial validation errors creeping into the document. As Evan Goer found <http://www.goer.org/Journal/2003/Apr/index.html#29>, even XML advocates find this to be a very difficult challenge on their personal weblogs. Scale this up to a large commercial site like eBay, with thousands of contributors, and you have a tough challenge just making sure that your site doesn't spend most of its life showing a parsing error.
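To make the toolchain point concrete, here is a small illustration using Python's standard library of how a single unescaped ampersand is fatal to an XML parser, while a tag-soup HTML parser carries on regardless:

```python
from html.parser import HTMLParser
from xml.etree import ElementTree

# One unescaped ampersand: a classic trivial mistake.
broken = "<p>Fish &amp; chips &jellied eels</p>"

# The XML toolchain rejects the entire document...
try:
    ElementTree.fromstring(broken)
    xml_ok = True
except ElementTree.ParseError:
    xml_ok = False

# ...while a forgiving HTML parser accepts it without complaint.
HTMLParser().feed(broken)
html_ok = True
```

In a browser, the XML failure mode is a parse-error page replacing the user's content, which is exactly the risk a thousand-contributor site has to engineer away.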
XHTML has failed because it offers no benefits significant enough to be worth replacing entire CMS installations and retooling websites to deal with the added complexity of XML.
The one significant benefit it does offer is support for compound documents. At the moment, that means MathML. While it's true that some people are using MathML in XHTML (e.g. <http://golem.ph.utexas.edu/string/>), sites like Wikipedia are choosing to ignore it and use inaccessible PNGs instead. Why is this? Well, I don't know the details of Wikipedia's decision, but I would guess some of the following issues led them toward bitmap graphics over XHTML+MathML:
1) Support. Actually, as of recently, one can serve the same XHTML+MathML documents to IE users and Mozilla (but not Camino!) users. But IE users need to download a plugin. And for Safari / Opera / pluginless IE users, all you get is dense, incomprehensible text - there's no provision for graceful degradation (unless you can create both MathML and PNG representations of an equation and content-type sniff. Or rather, UA sniff, because browsers don't actually advertise MathML in their Accept header).
2) Validity. Wikipedia appears to validate (at least, I tried a random page and it seemed to validate). If, however, it is at all possible to create an invalid page within the wiki, then XML is a problem, because it could lead to pages returning XML parsing errors. So they would need to hack validation-on-submit into the wiki.
The point of the new group is to avoid the first of these problems. The fact that the second will also be avoided is just a corollary of maintaining backwards compatibility. Sadly, that compatibility is necessary in order for any significant adoption to occur.
> The page isn't going to work in IE anyway... why not make a clean break?
As I understand it, the idea of the group is to extend into things that can be implemented in IE using HTCs or some other method of extending IE (short of binary plugins). At the very least, any new technologies are going to degrade gracefully enough that all enhanced pages will continue to work, albeit less smoothly, in IE 6. That's the essence of the backward compatibility - it's not just supposed to be a surface feature.
> appropriate browser plugins/extensions that can handle namespaced DOM entries rather than only external files, could solve many problems in a coherent manner
Such as the problem of getting people to download and install plugins? Flash is bundled with IE, and no other plugin is at all widely used. Otherwise, we could all use MathML and SVG already...
#37 Re: Re: Disappointing news, IMO
Tuesday June 8th, 2004 5:59 AM
Ok, some things I won't argue with (they're going to implement this stuff so it really is IE-compatible? okay, but that still seems pretty hacky)...
However, the point about 'it won't happen until it's supported in browsers, so nobody will use it' - I disagree. It's entirely possible that XHTML2 will be supported in future Microsoft browsers (operating systems). Should that come to pass (and assuming Mozilla etc. also have support), then give it a few years and people may well start using it. Look at UTF-8: plenty of people use UTF-8 on their sites now, whenever they have a need for internationalisation, because it just works. When XHTML 2 just works, people may use it.
As for XML - I do not see the need for well-formedness (nobody actually validates XML on reading it, so validation though nice isn't a practical issue any more than it is for HTML; the only issue is well-formedness) as any kind of problem. You're right that it requires a change in the tools people use, but creating WYSIWYG tools that will only output well-formed XML is hardly a major challenge.
It's not difficult to write XML by hand either using an XML-aware editor (one that warns you when and where your file isn't well-formed), certainly not the 'order of magnitude' suggested. People who write HTML by hand can certainly cope. With XHTML2 though, there might be less need to write code by hand.
The only major problem in outputting XML sites (and by the way, I'm working on one now which outputs XHTML as XHTML to Mozilla, XHTML as HTML to IE, and uses XML formats for internal communication between services) is really when you need to pull in unreliable content. Even in that case, it only took me an afternoon to write code that transforms the completely broken HTML produced by a conferencing system we use into guaranteed-valid XHTML with the crap stripped out.
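As a toy illustration of that kind of cleaner (a hypothetical sketch, not the actual code described above), the following Python re-emits tag soup as well-formed markup by escaping text and closing dangling tags:

```python
from html.parser import HTMLParser
from xml.sax.saxutils import escape, quoteattr

class Rewriter(HTMLParser):
    """Toy tag-soup cleaner: re-emits input as well-formed markup.

    Hypothetical sketch only; a real cleaner also needs comment
    handling, HTML's void elements, and a whitelist of allowed tags.
    """
    def __init__(self):
        super().__init__(convert_charrefs=True)
        self.out = []
        self.stack = []

    def handle_starttag(self, tag, attrs):
        text = "".join(f" {k}={quoteattr(v or '')}" for k, v in attrs)
        self.out.append(f"<{tag}{text}>")
        self.stack.append(tag)

    def handle_endtag(self, tag):
        if tag in self.stack:
            # Close any tags left dangling inside this one first.
            while self.stack:
                inner = self.stack.pop()
                self.out.append(f"</{inner}>")
                if inner == tag:
                    break

    def handle_data(self, data):
        self.out.append(escape(data))  # fixes bare '&' and '<' in text

def clean(soup: str) -> str:
    r = Rewriter()
    r.feed(soup)
    r.close()
    # Close anything still open at end of input.
    return "".join(r.out) + "".join(f"</{t}>" for t in reversed(r.stack))
```

Feeding it something like `<p>Fish & chips<p>Jellied eels` yields markup with the ampersand escaped and both paragraphs closed; the well-formedness is guaranteed by construction, not by validating afterwards.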
By the way, don't consider the results of that test linked (it's a fun read, I've seen it before) to reflect the difficulty or otherwise of XML. We're living in a world where the major browser doesn't support XHTML served using the correct MIME type; that significant issue aside, I expect most of those sites will actually work. Yes, hand-writing XHTML that actually validates (as opposed to just being well-formed) can be difficult, but the same is true of HTML, and the consequences are identical (the page still works).
As for not using MathML, this was probably due to lack of support in browsers (and possibly in some of the tools that people might use to create equations). MathML is clearly - and I mean there is simply no argument, given that the alternative is bitmap images - the best currently-defined way to include equations in web pages... if it were widely implemented (it isn't) and done correctly (which might be difficult due to lack of software). There might be technical limitations in the wiki software that let it output ill-formed code, but those are software issues; fixing them isn't a problem.
You're right that people don't like having to download plugins, but I think this is a surmountable problem (plenty of people manage to download Comet Cursor). And current plugins don't solve the problem, because they work with separate files, whereas the real power and convenience IMO is in embedded, inline XML formats - when you can put an equation or a bar of music notation in the middle of your essay, or when the server can generate a simple bar chart or pie chart using a couple of lines of SVG instead of a nasty hack with CSS and a bunch of DIVs (or tables and a bunch of spacer images).
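The inline bar chart case really can be a couple of lines of server-side code. A hedged Python sketch (hypothetical function name; element and attribute names from SVG 1.0) that emits a chart fragment suitable for embedding in a compound XML document:

```python
def bar_chart_svg(values, bar_width=30, gap=10, height=100):
    """Emit an SVG fragment: one <rect> per value, scaled to the tallest bar."""
    peak = max(values)
    bars = []
    for i, v in enumerate(values):
        h = round(v / peak * height)       # bar height in pixels
        x = i * (bar_width + gap)          # left edge of this bar
        bars.append(f'<rect x="{x}" y="{height - h}" '
                    f'width="{bar_width}" height="{h}" fill="navy"/>')
    total = len(values) * (bar_width + gap)
    return (f'<svg xmlns="http://www.w3.org/2000/svg" '
            f'width="{total}" height="{height}">' + "".join(bars) + "</svg>")
```

Compare the output of `bar_chart_svg([3, 1, 4])` - three rectangles in a dozen lines of code - with the equivalent DIV-and-spacer-image contraption.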
#40 Re: Re: Re: Disappointing news, IMO
Tuesday June 8th, 2004 7:32 AM
"It's entirely possible that XHTML2 will be supported in future Microsoft browsers (operating systems)."
Maybe it's possible, but it's not probable. It certainly isn't going to happen on anything less than Longhorn, and Microsoft's intentions for that are HTML4+XAML, at least initially. XHTML 2 isn't supported by anyone yet. As you say, we are living in a world where the major browser doesn't support XHTML served using the correct MIME type - you can't just put that aside; it's a reality, and based on the history, it's not likely to change.
"Should that come to pass (and assuming Mozilla etc. also have support), then give it a few years and people may well start using it."
If that came to pass, and assuming others had support, then yes. But that's not the real world, and by all accounts from people that know far more than I do, it's not going to become a reality, even if Mozilla goes down that road.
You say software issues aren't hard to fix. Actually writing the program code may not be too hard, but that's not the hard part - getting that software in the hands of all authors and users means either convincing a huge number of people to switch software, or convincing the likes of Microsoft, Adobe, etc that they should make the changes to their software.
#43 Re: Re: Re: Disappointing news, IMO
Tuesday June 8th, 2004 2:10 PM
> It's entirely possible that XHTML2 will be supported in future Microsoft browsers (operating systems).
"Another point that came out of the discussions is that, in case there was any doubt, Internet Explorer in Longhorn will not support XHTML or SVG." <http://ln.hixie.ch/?start=1086158925&count=1>
If it's not in Longhorn, it's irrelevant for the foreseeable future.
> Look at UTF-8: plenty of people use UTF-8 on their sites now, whenever they have a need for internationalisation, because it just works.
It also helps that UTF-8 is universally supported and backwards compatible with the most common existing encodings. The X* technologies have neither of these advantages.
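That backwards compatibility is worth spelling out, since it's exactly what the X* technologies lack. A quick sketch (illustrative strings only):

```python
# UTF-8 is a strict superset of ASCII: any ASCII document is already
# valid UTF-8, byte for byte, so existing content needs no conversion.
ascii_bytes = "Hello, world".encode("ascii")
assert ascii_bytes.decode("utf-8") == "Hello, world"

# Non-ASCII characters use multi-byte sequences whose bytes never
# collide with ASCII (continuation bytes all have the high bit set).
encoded = "naïve".encode("utf-8")
assert len(encoded) == 6              # the 'ï' takes two bytes
assert encoded.decode("utf-8") == "naïve"
```

Old content just works; new content degrades predictably. XHTML served with the correct MIME type offers no analogous migration path.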
> As for XML - I do not see the need for well-formedness as any kind of problem.
So you think companies will happily throw away their existing software and upgrade to 'XML compatible' software despite the huge cost and marginal benefit? I don't.
> creating WYSIWYG tools that will only output well-formed XML is hardly a major challenge
So where are all the WYSIWYG tools that output well-formed XML? I know Dreamweaver can be persuaded to produce decent code. Nvu doesn't do XML. Every other WYSIWYG tool I can think of produces dreadful code. And, in any case, the editor is the wrong place to enforce well-formedness. Most people consider the fact that Nvu eats certain kinds of preprocessor code and many hand edits to be a bug. But it's just an artefact of the fact that Nvu (as far as I know) writes out a serialisation of the in-memory DOM structure when it saves a file. That's the way a 'real XML' tool is supposed to work, but it turns out not to have the flexibility that most people require.
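The parse-then-serialise problem is easy to demonstrate. A tool that round-trips a file through a DOM necessarily writes back the parser's normalised view, not the author's original bytes (a minimal sketch using Python's minidom; the input markup is made up):

```python
# A DOM-based editor reads the file into a tree and writes the tree
# back out. Anything the parser normalises away - here, the author's
# extra whitespace between attributes - is silently lost on save.
import xml.dom.minidom as minidom

source = '<p   class="x">hi<br/></p>'
doc = minidom.parseString(source)
out = doc.documentElement.toxml()

assert out != source          # the original formatting is gone
assert 'class="x"' in out     # only the parsed content survives
```

Whitespace is the mildest case; anything the parser can't represent at all, like non-XML preprocessor syntax, simply never makes it into the tree in the first place.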
> The only major problem ... is really when you need to pull in unreliable content.
Where unreliable content includes: 100% of all existing ('legacy') content, any content that has been edited by a human, any content (e.g. trackbacks) that is automatically syndicated from another site (Jacques Distler, who has been successfully running an XML-based website for years, recently had a problem with syndicated content that led to the yellow parsing error of death until he could roll out a patch). That's pretty close to 100% of all content.
Incidentally, the 'edited by a human' thing is pretty important. If you have people contributing to your site, everything they enter will have to be validated (even if you use a tool, if it doesn't use a 'real XML processor', you need to be wary of bugs...), which means that you have to include a validation step in the publishing process and make sure that all the contributors are clued up enough to be able to fix validation errors.
If you can do that, fine. The problem with trying to base the web on a technology that requires validation is that most people *can't* do it. The situation is compounded by the fact that, at present, they can do what they're trying to do *without the extra effort*. I don't see people transitioning from a forgiving system to an unforgiving system without kicking and screaming. If XML becomes a successful language on the web, I expect it will be a direct result of the development of a liberal XML parser that lowers the barrier to authoring.
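For the record, the validation gate such a publishing process needs is simple in principle, which makes it all the more telling that people still trip over it. A hedged sketch (the helper name and test fragments are mine, not from any real pipeline):

```python
# Sketch of the gate an XML publishing pipeline needs: reject
# contributed content that is not well-formed before it reaches
# readers as a parser error page.
import xml.etree.ElementTree as ET

def is_well_formed(fragment):
    """True if the fragment parses as XML inside a dummy root."""
    try:
        ET.fromstring("<root>" + fragment + "</root>")
        return True
    except ET.ParseError:
        return False

assert is_well_formed("<p>Fish &amp; chips</p>")

# A single unescaped ampersand - the classic human-edited mistake -
# is fatal to a real XML parser:
assert not is_well_formed("<p>Fish & chips</p>")
```

The code is trivial; the hard part is getting every contributor, tool, and syndication feed on the far side of that check.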
> As for not using MathML, this was probably due to lack of support in browsers
> current plugins don't solve the problem because they work with separate files, whereas the real power and convenience IMO is in embedded, inline XML formats - when you can put an equation or a bar of music notation in the middle of your essay
MathML might be a failure because of the lack of native browser support. But if that's true, it is also necessarily the case that plugins that allow specialised inline content are a failure. MathPlayer <http://www.dessci.com/en/products/mathplayer/> will display inline MathML in IE. It will (now) work with real XHTML files. It will do all the things that you advocate that plugins do. But people still use bitmaps rather than MathML because bitmaps have close to universal support (well, non-visual UAs don't support them, but they don't support MathML either). MathML also suffers from being a pig to author. The only reasonable solution for heavy use is to use something like itex2MML to convert from a LaTeX-like syntax to MathML. Sadly, as with all automatic authoring tools, this places substantial limits on the quality of the final code (in particular, distinctions between, say, mi and mo are often lost).
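To see the authoring gap concretely: an expression a LaTeX-style syntax writes as `x^2 + 1` takes this much presentation MathML (a hand-written illustration, checked for well-formedness below):

```python
# The presentation-MathML equivalent of "x^2 + 1". Note the mi
# (identifier) / mo (operator) / mn (number) distinctions that
# automatic converters often get wrong.
import xml.etree.ElementTree as ET

mathml = """
<math xmlns="http://www.w3.org/1998/Math/MathML">
  <msup><mi>x</mi><mn>2</mn></msup>
  <mo>+</mo>
  <mn>1</mn>
</math>
"""
tree = ET.fromstring(mathml)
# Strip the namespace prefix to list the element names used.
tags = [el.tag.split("}")[-1] for el in tree.iter()]
```

Six elements and a namespace declaration for a three-token expression; nobody is going to type that by hand in the middle of an essay.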
> There might be technical limitations in the wiki software that let it output ill-formed code; those are technical limitations, it's a software issue, fixing it isn't a problem.
I have a strong suspicion that you're wrong and that, unless the tool was designed to produce valid markup from the start, it will often be non-trivial to ensure that it always produces valid markup under all circumstances. Even if it's technically possible, no one will do it because, unless certain specific technologies are required, XHTML offers nothing that HTML 4 doesn't, but comes with big strings attached.
The logo bears an uncanny resemblance to: <http://www.which.net>
I doubt they'd be happy about it.
#32 Short version of why XAML is threat
Monday June 7th, 2004 7:33 AM
Is there a good chance that web applications and/or individual pages will be written in XAML? Will users point their no-longer-IE browser at an XAML page hosted online in 2006 or 2007? It would be like XUL in that regard then, and if so, wouldn't adding XUL to this group's effort make sense? Or is XUL already part of a web-standardization effort? Even if it is, is there a snowball's chance that M$ would consider implementing it? Thanks, Dennis
#34 W3C SVG Chair Comments On WHAT WG
Monday June 7th, 2004 1:08 PM
Allow me to quote Chris Lilley who chairs the W3C SVG Working Group:
Given that Mozilla itself embraces XML, RDF, has a simple XLink implementation, does some SVG, and so on, it's a fair bet that this represents a few disgruntled, tired individuals trying to bring back the 'who needs standards, we have quick hacks' good old days rather than a radical change of direction for Mozilla as a whole.
#35 Re: W3C SVG Chair Comments On WHAT WG
Monday June 7th, 2004 4:06 PM
I strongly suspect that Chris is just blinded by his love affair with SVG. He's right that there is no radical change of direction, though -- Mozilla was at a crossroads and has picked a direction. SVG 1.2 is not that direction.
I investigated using Mozilla as a rich Internet client for database applications but decided to wait because there were a number of things missing in XUL/RDF to make it work easily. The suggestions here would make it much easier to do in HTML. The fact that you could use XML to transfer data means you could work with any database backend.
It would also be nice if <option> elements included a "picture" attribute so you could have drop down lists with pictures as well as text.
You wouldn't need a picture attribute. You would just need to be able to put <option><img src="" alt=""> Text</option>. I just tried it, and it didn't work. I don't see why it shouldn't, though, because you can put pictures on form buttons.