RUS-CERT Criticizes Mozilla Security Bugs Policy
Thursday March 11th, 2004

Simon Paquet sent us a link to a German article from the Computer Emergency Response Team of the University of Stuttgart (RUS-CERT) that criticizes Mozilla's security policy. Simon provides a translation of the key points of the article: "The RUS-CERT criticizes Mozilla's approach to handling security holes. They find fault in the fact that no official security advisories are published by the Mozilla Foundation. Instead, security holes are silently patched and incorporated into newer releases.

"They come to the conclusion, 'At the moment Mozilla is obviously no convincing alternative to the market leader [Microsoft].' In an update to their article, they state that this sentence was interpreted as saying that everyone should refrain from using Mozilla. This is not the case. They only state that Mozilla suffers from the same security problems as all other clients and that the use of Mozilla alone is no solution to those security problems."

The last major update to the Known Vulnerabilities in Mozilla page was in November. The Mozilla Security Bugs Policy explains how security flaws are handled.

"They find fault in the fact that no official security advisories are published by the Mozilla Foundation. Instead, security holes are silently patched and incorporated into newer releases."

It's interesting that that's exactly how MS handles security bugs, as we found out because of the leaked Windows code. Someone found a security hole in the leaked IE5 code, and MS then announced that the hole had already been fixed in IE6, but they never told anyone about it until we found out about it accidentally.

Yesterday a friend of mine said that he wanted to install Windows XP and Office XP on a new laptop, and he asked which patches should be applied. For Office XP, only Service Pack 3, released last Tuesday, was needed. For Windows XP, though, Service Pack 1 plus 31 patches were needed! Even if we restrict ourselves to the critical ones, we still have 17 patches! I don't think this is actually better…

XP SP2 is in beta stage afaik, and when it's released you just have to apply it and the additional 17 fixes that have been released after that =)

Although I'm a real Linux fan, I have to say that the number of patches alone is not indicative. Take a stock/vanilla Fedora (more recent than XP!) and count all the patches you need to get it secure. It's probably around 17. But with M$, each of those 17 patches can mean who-knows-how-many security issues fixed.

"The last major update to the Known Vulnerabilities in Mozilla page was in November."

And it's therefore out of date. I'm pretty sure there were security issues in 1.5 that have been fixed in 1.6. According to the policy, those should now be on that page.

"The Mozilla Security Bugs Policy explains how security flaws are handled."

It begins by saying that the first step to improving their handling is to appoint a security module owner, and goes on to say that it's Mitch Stoltz of Netscape, who (AFAIK) hasn't been around since Netscape laid everyone off.

Mozilla is not MS; MS would whine. Just acknowledge and fix it if something is wrong. If there are older versions with known security issues, this should be communicated to users properly. The problem with Mozilla's products right now is that there's no easy way to get security or any other bug fixes without upgrading to a nightly (or waiting for a new release). Both options are less than satisfactory. Although Fx is not yet 1.0, the suite is past due.
Perhaps instead of rewriting the download manager or password manager we should concentrate on the upgrade process.

Forgot to mention that, although I wouldn't trust MS's security, at least I can just download a patch. This is a serious problem for Mozilla, and I hope we don't just ridicule MS but try to do a bit better.

From discussions about patching before, I think they'd be happy to consider it if someone else contributed the work, but they didn't see it as beneficial enough to do the work themselves. Someone guesstimated that to move between releases (even alpha/beta releases), the patch would end up being at least 80-90% of the size of the whole thing. So (for Mozilla) you'd have a choice of downloading a 10MB "patch" version instead of a 12MB full version. Hardly worth the effort. For Firefox, it would make even less difference. There'd be more value in patching to go from a stable release to a x.x.1 security fix release, but that hasn't generally happened in the past because there haven't been any security issues that have been both serious and known about beyond a few developers. Of course that's not necessarily a reason not to do something in anticipation of future problems, but...

> you'd have a choice of downloading a 10MB "patch" version instead of a 12MB full version

As bz mentioned, you don't have to update all of Mozilla to a new version (no matter how large that leap is). You might apply the patch to the last one or two stable releases and a) only ship the changed files or b) make a binary patch in its true meaning, i.e. a patch that only changes the modified parts of the affected libraries, which results in relatively small patches, I think (see the sketch further down in this thread). I'm not sure setting up an environment to automate most of this takes a lot of time. (Nor am I saying it's easy and quick!)

Yes, that's why I wrote (although I screwed up that sentence) that there'd be more value in patching from a stable release to a x.x.1 release with a security fix. The hard bit isn't making the patch, it's making, testing and releasing a build of a previous version with one extra patch in it. If nobody is going to do the work to make the fixed builds, setting up the patching stuff isn't worth the effort; you need to do both tasks to make it worthwhile.

> Perhaps instead of rewriting the download manager or password manager we should concentrate on the upgrade process.

You should realize that Ben is doing this soon and that it will come before the next release.
http://bugzilla.mozilla.org/show_bug.cgi?id=214360
http://www.mozilla.org/projects/firefox/roadmap.html

It seems to me that the simplest way to deliver security fixes is with x.x.1 releases. Say a major security bug was found in Firefox 0.8; once there is a fix, that patch should be inserted into the 0.8 source and released as 0.8.1 (i.e. without any of the new development in the nightlies that might decrease stability). If the patch for the nightlies isn't compatible with the last release version (for whatever reason), concoct one that is, shove that in the 0.8 source and distribute that as 0.8.1. This should become easier once Firefox has an automatic update notification system.

> If the patch for the nightlies isn't compatible with the last release version (for whatever reason), concoct one that is, shove that in the 0.8 source and distribute that as 0.8.1

You skipped the "test extensively, because the patch required major changes to sensitive code" step (as happens all too often with security fixes).

I support Mozilla's policy all the way.
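For readers wondering what option (b) above, a binary patch "in its true meaning", could look like, here is a minimal Python sketch of the block-level idea. It is not anything Mozilla actually ships; the file names, the block size and the pickle-based delta format are assumptions made purely for illustration.

    # A minimal sketch of a block-level binary patch: record only the blocks of a
    # library that changed between the shipped build and the patched build, then
    # replay them on the user's copy. File names and the block size are illustrative
    # assumptions; real tools (xdelta, bsdiff) also handle insertions and deletions
    # and compress far better.

    import pickle

    BLOCK = 4096  # bytes per block; an arbitrary choice for this sketch

    def make_delta(old_path, new_path):
        """Return the new file's length plus (offset, bytes) for every changed block."""
        with open(old_path, "rb") as f:
            old = f.read()
        with open(new_path, "rb") as f:
            new = f.read()
        blocks = []
        for offset in range(0, len(new), BLOCK):
            if old[offset:offset + BLOCK] != new[offset:offset + BLOCK]:
                blocks.append((offset, new[offset:offset + BLOCK]))
        return {"length": len(new), "blocks": blocks}

    def apply_delta(old_path, delta, out_path):
        """Rebuild the patched file from the user's old copy and the delta."""
        with open(old_path, "rb") as f:
            data = bytearray(f.read())
        # Truncate or zero-pad to the new length, then overwrite the changed blocks.
        data = data[:delta["length"]] + bytearray(max(0, delta["length"] - len(data)))
        for offset, block in delta["blocks"]:
            data[offset:offset + len(block)] = block
        with open(out_path, "wb") as f:
            f.write(bytes(data))

    if __name__ == "__main__":
        # Hypothetical file names: one Mozilla library before and after a security fix.
        delta = make_delta("gklayout-old.dll", "gklayout-fixed.dll")
        with open("gklayout.delta", "wb") as f:
            pickle.dump(delta, f)  # this is what the user would download
        apply_delta("gklayout-old.dll", delta, "gklayout-rebuilt.dll")

In practice a recompile can shuffle code around and change more blocks than the source diff suggests, which is exactly why purpose-built delta tools exist; the sketch only illustrates why localized changes tend to produce patches much smaller than the full download.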
It doesn't take a genius to figure out how to create a virus or exploit that takes advantage of a Mozilla vulnerability, because all it takes is searching through Bugzilla to find one and implement it. Just look at the living example of an M$ vulnerability: when it was discovered or announced, a computer virus started swarming around a few days later. Which is better, less headache in the long run or more headache? You be the judge. Zook

"It doesn't take a genius to figure out how to create a virus or exploit that takes advantage of a Mozilla vulnerability, because all it takes is searching through Bugzilla to find one and implement it."

You shouldn't be able to do that. In theory, all security bugs should be secret. Alex

That's precisely one of the parts of the policy that the article in question attacks, no?

"You shouldn't be able to do that. In theory, all security bugs should be secret."

There are probably several crasher bugs (or parser bugs) which are security-sensitive, but no one took the time to figure out whether they could be exploited. They just fix them and move on. I read somewhere (a Slashdot post, so your guess is as good as mine whether it's accurate) that OpenBSD assumes all bugs are security bugs until a security expert decides otherwise, not the other way around. Also, the CVS checkins for a security bug are public. So users of the next nightly may be secure, but not the milestone builds. It wouldn't be too hard to figure out a vulnerability by looking at the fix (a small sketch of the first step appears right after this comment). In short, I would guess that the main reasons Mozilla is hacked less (that's a guess; where could I find stats on this?) are: A) It's probably coded more sanely than IE. (Again just a guess, but since so many of MS's publicly released or reverse-engineered protocols are so messy, I'm assuming their code must be as well.) B) It's a smaller target: if you want money, fame, or just to be malicious, write a tool to attack the most people possible. C) Mozilla exploits rarely make the press. Perception is reality, and attacking MS has become sport for the press recently. Mozilla may be more secure than IE, but for the common user who only uses milestones it's still vulnerable.
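To make the point about public checkins concrete, here is a small Python sketch (not a tool anyone in this thread actually wrote) of that first step: checking whether a bug number taken from a public checkin comment is itself viewable anonymously. The "not authorized" string is an assumption about Bugzilla's access-denied wording, not something quoted in the thread; the bug number is the one cited a few comments below.

    # A hidden bug behind a public checkin is a strong hint that the checkin is a
    # security fix. The "not authorized" phrase is an assumed marker of Bugzilla's
    # access-denied page, not anything confirmed by the thread.

    import urllib.request

    BUGZILLA = "http://bugzilla.mozilla.org/show_bug.cgi?id="

    def bug_is_public(bug_id):
        """Best-effort check of whether a Bugzilla bug is viewable without logging in."""
        with urllib.request.urlopen(BUGZILLA + str(bug_id)) as response:
            page = response.read().decode("utf-8", errors="replace")
        return "not authorized" not in page.lower()

    if __name__ == "__main__":
        for bug_id in (221526,):
            state = "public" if bug_is_public(bug_id) else "restricted (likely security-sensitive)"
            print(bug_id, state)

If the bug comes back restricted while its fix is sitting in public CVS, the secrecy has already done most of the attacker's triage for them, which is the commenter's point.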
"It's probably coded more sanely than IE."

If this were true, I would think it would be easier to make patches without recompiling the entire program. Is that not one of the main reasons for refactoring and object-oriented programming?

> Is that not one of the main reasons for refactoring and object-oriented programming?

Of course. There are a few things that lead to the "recompile the entire thing" problem (or rather the perception that the problem exists).

1) A large chunk of the Mozilla code is the layout library. This _could_ be split into multiple libraries (it used to be, in fact), but that introduces performance overhead, memory overhead and code maintainability overhead that make it not worth doing. Naturally, changes to the layout library require the layout library to be recompiled. Once you have the recompiled version, you can either ship the whole thing (about 3MB) or try to make a "binary patch". The latter shouldn't really be all that bad, in my opinion.... So distributing a typical security patch really shouldn't require recompiling all of Mozilla. It should only require shipping an update for the one library affected, IF the patch is applied to the version that shipped.

2) When people are talking about "the whole thing would need to be recompiled", they are more often than not talking about a wholesale update from one milestone to another. The problem there is that Mozilla is in active development and a large fraction of the libraries are touched between milestones. This is due not so much to poor separation between libraries (though there is some of that too) as to the fact that, with about 1000 changes (that's how many we tend to have per milestone) and about 100 libraries all told, most of the libraries end up with at least one change in them. Those that don't tend to be pretty small. So doing milestone upgrades via incremental update is really not worth it.

All that said, localized security fixes really should not require rebuilding the whole thing and could be distributed incrementally, as far as I can tell.

>> "It doesn't take a genius to figure out how to create a virus or exploit that takes advantage of a Mozilla vulnerability, because all it takes is searching through Bugzilla to find one and implement it."
>>
>> You shouldn't be able to do that. In theory, all security bugs should be secret.
>>
>> Alex

That is correct. Just imagine if Mozilla's policy were the other way around. That's all.

> In theory, all security bugs should be secret.

That's funny... e.g. look at bug 221526 and then look at http://bonsai.mozilla.org/cvsquery.cgi?treeid=default&module=SeaMonkeyAll&branch=HEAD&branchtype=match&dir=&file=&filetype=match&who=&whotype=match&sortby=Date&hours=2&date=explicit&mindate=10%2F07%2F2003+15%3A55&maxdate=10%2F07%2F2003+16%3A15&cvsroot=%2Fcvsroot

As long as security checkins are public and as long as they are called "Late-breaking security fix", you don't even need a script that automatically watches all checkins and alerts you when the respective bug report is not publicly visible. (OK, this has already been mentioned, but not that clearly.)

> You shouldn't be able to do that. In theory, all security bugs should be secret.

Only until the fix is incorporated into a release.

Regarding the original article, it comes from a university CERT and is targeted at that particular university. One fundamental issue in the article is that Linux distros don't distribute Mozilla security fixes as security updates. On top of that, the 1.6 release notes mention fixed security bugs, but there are no advisories, so one can't estimate the risk from either the release notes or the known-vulnerabilities page. I guess we should a) update the known-vulnerabilities page again (or drop the generic entry about security bug fixes from the release notes; I'm not sure if we just paste that entry from release to release) and b) sell security fixes more aggressively to distributors.

You can't sell them "security fixes", though, unless you're asking them to do the patching and regression testing themselves. The only way to get the security fixes for 1.5 is to go to 1.6, which is the same issue that mozilla.org binary users have.

> unless you're asking them to do the patching and regression testing themselves

Every single distributor already ships builds that are somewhat different from the SeaMonkey trunk. These builds include various changes ranging from "never submitted to Mozilla.org" to "rejected from inclusion in the Mozilla.org tree". The distributors already do regression testing on such patches (one hopes). So if they wished, they could add security patches and rerelease an interim package (a rough sketch of how to work out what such a package would need to contain follows below). Yes, they may have to do some testing, but that may be something they are willing to do for the sake of their customers (especially paying customers).
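As a rough illustration of how small such an interim package could be, the Python sketch below (hypothetical directory names, not any actual Mozilla or distributor tooling) hashes every file in the build that shipped and in the same build recompiled with one security patch, and lists only the files whose contents changed.

    # Work out the minimal contents of an interim security update: compare two
    # unpacked build trees file by file and keep only what actually changed.
    # Directory names are hypothetical.

    import hashlib
    import os

    def file_hashes(root):
        """Map each file path (relative to root) to an MD5 digest of its contents."""
        digests = {}
        for dirpath, _dirs, files in os.walk(root):
            for name in files:
                path = os.path.join(dirpath, name)
                with open(path, "rb") as f:
                    digests[os.path.relpath(path, root)] = hashlib.md5(f.read()).hexdigest()
        return digests

    def changed_files(old_root, new_root):
        """Files that are new or whose contents differ between the two trees."""
        old, new = file_hashes(old_root), file_hashes(new_root)
        return sorted(rel for rel, digest in new.items() if old.get(rel) != digest)

    if __name__ == "__main__":
        # Hypothetical directories: the shipped 1.5 build and the same build with
        # one security patch applied and recompiled.
        for rel in changed_files("mozilla-1.5", "mozilla-1.5-securityfix"):
            print(rel)

For a localized fix, bz's point 1 suggests that list might often be a single library of around 3MB rather than the whole 10-12MB download.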
A version number lets me know exactly what I have and why I have it (by looking at the release notes). A version number means testing and stability. With Microsoft, a patch (from IE 5.0 to 5.01) caused a bug (an illegal operation that required a reboot). A security patch may eventually fix it, or make it more stable. My father and I both had this happen to us, and I reproduced the 5.01 bug on two separate machines. A compiled release makes deployment and technical support easier. As multiple posts have demonstrated, patching really wouldn't be faster, and it would (for the reasons I have outlined) provide a false sense of security.

That said, the known-vulnerabilities page needs to be updated. Ideally, Bugzilla should incorporate a module for security patches where, if the patch is marked confidential, it is published on close, and if the patch is open and security-related, it is published immediately (a tiny sketch of that rule follows at the end). I assumed, because the security policy said so, that a notice is given only if the bug team thinks it's warranted *before* it is fixed. After it is fixed, it can be found in Bugzilla, so it's a moot point. It's more a matter of organization (notice placement) and clarity of policy, I think.

--Sam
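Sam's proposed publication rule is simple enough to state as code. The sketch below is only a restatement of the comment, not part of Bugzilla, and the parameter names are invented for illustration.

    # Sam's proposed rule: confidential security bugs get an advisory when they are
    # closed (i.e. once the fix has landed), open security bugs get one immediately.
    # Parameter names are made up; this is not a real Bugzilla module.

    def should_publish_advisory(security_related, confidential, closed):
        """Return True if, under the proposed rule, an advisory should go out now."""
        if not security_related:
            return False
        if confidential:
            return closed   # stay quiet until the bug is fixed and closed
        return True         # the bug is already public, so publish right away

    assert should_publish_advisory(True, confidential=True, closed=False) is False
    assert should_publish_advisory(True, confidential=True, closed=True) is True
    assert should_publish_advisory(True, confidential=False, closed=False) is True
    assert should_publish_advisory(False, confidential=False, closed=True) is False

The publish-on-close branch is also what reconciles this with the earlier "security bugs should be secret" position in the thread: secrecy ends, and an advisory appears, once the bug is fixed and closed.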