Tuesday, February 13, 2007

A Better Web Cache Timing Attack

I've been debating whether I should bother writing an actual paper on this, but when I found that Princeton had already written a pretty decent paper on web cache timing attacks back in 2000, which you can find here: http://www.cs.princeton.edu/sip/pub/webtiming.pdf, I decided against it.

If you read the paper you will see that the attack relies on the use of two images which are only ever displayed together, and determines whether they are cached by looking for a dramatic change in loading times: you time the load of the first image, load the page (which caches both images), and then time the load of the second image. If there is a significant difference between the two timings, the first image had not been cached, and the user had therefore not visited the page; whereas if there is no significant difference, the image was already cached, and the page had been viewed before.

Now, this suffers from the fact that you actually need two images which are ONLY displayed together, because otherwise your results will be erroneous; and not only that, it requires that the images be approximately the same size, so that your inferences about cache state are accurate.

A much better solution is to determine the time it takes to retrieve cached and non-cached versions of the same image by supplying request parameters, e.g. http://www.example.com/image?test=123456

The first thing we do is generate a random query string and make a request for the image, which gives us the approximate time it takes to load the image when it is not cached. We then make a second request for the same URL to see how long the image takes to load when it is cached. By generating a large number of query strings to test, we can average the timings and get more accurate figures.
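A minimal sketch of that measurement step (written in Python for clarity, though in practice this would run as JavaScript in the visitor's browser; the URL and function names are purely illustrative):

```python
import random
import string
import time
import urllib.request

def time_request(url, fetch=urllib.request.urlopen):
    """Return the wall-clock time taken to fully fetch a URL."""
    start = time.perf_counter()
    fetch(url).read()
    return time.perf_counter() - start

def average_timings(base_url, n=10, fetch=urllib.request.urlopen):
    """Estimate average uncached and cached load times for an image.

    For each round we invent a random query string, so the first request
    cannot be in the cache, and the second request for the same URL
    should be served from the cache.
    """
    uncached, cached = [], []
    for _ in range(n):
        token = "".join(random.choices(string.ascii_lowercase + string.digits, k=12))
        url = f"{base_url}?test={token}"
        uncached.append(time_request(url, fetch))  # first hit: not cached
        cached.append(time_request(url, fetch))    # second hit: cached
    return sum(uncached) / n, sum(cached) / n
```

The `fetch` parameter is just there so the timing logic can be exercised without real network access; against a live target you would drop it and let the requests go out normally.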

We then make a request for the image without any query parameters, see which averaged value the timing is closer to, and from that determine the cache state.

This has the benefit that only one image is needed, so we can find a page with a large photo or similar to give us a greater margin for error; and we do not need to find a page with two images of equal size, because we are always making requests for the same image.

Sadly it's not quite as effective as the attack against the SafeCache extension, but that's why it's a timing attack, I guess. :)


Sid Stamm said...

Something related to what you discuss will be published at the 16th annual WWW conference.

kuza55 said...

That's interesting, do you know if there will be video or audio of the talk available afterwards? Or any other kinds of material?

Sid Stamm said...

There will probably only be a publication and possibly a set of slides, unless they arrange to videotape it. The conference is pretty large.

Consider watching the authors' web sites for material that will probably be posted shortly after the conference.