I recently discovered a poorly documented (in fact, I couldn’t find any documentation about it) side effect of writing cookies server-side: they silently disable output caching, and they do so in a non-obvious way.
If your OutputCache directive looks like this:
<%@ OutputCache Duration="100" VaryByParam="none" Location="Any" %>
but somewhere in your page you have code that looks like this:
Response.Cookies["user"]["value"] = "something";
The page will not be cached. It will still be sent with the public Cache-Control and Expires headers, but the max-age will always be the full duration (100), never counting down, and IIS will not serve it from cache. There are no errors or warnings; it simply doesn’t cache. In our case it was even less obvious because the cookie-writing code had been written long ago in a base page and was only triggered on certain pages with certain querystring parameters. Even when I discovered that pages I had assumed were cached were not, I saw the cookie but dismissed it, because I mistakenly assumed the effect would run the other way: a cookie on a cached page simply wouldn’t work right, because the first request would write the cookie and every user thereafter would receive the cached page with the cached cookie value, effectively giving everyone the same cookie. You know what they say about assumptions.
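You can see the effect for yourself with a minimal repro. This is a hypothetical standalone page (the names and markup are mine, not from the original code): with the cookie line commented out, the rendered timestamp freezes for the cache duration; with it in, the timestamp changes on every request, showing the page is being regenerated each time.

```aspx
<%@ Page Language="C#" %>
<%@ OutputCache Duration="100" VaryByParam="none" Location="Any" %>
<script runat="server">
    protected void Page_Load(object sender, EventArgs e)
    {
        // Comment this line out and the timestamp below freezes for 100
        // seconds (the page is served from cache). Leave it in and the
        // timestamp changes on every request: output caching is disabled.
        Response.Cookies["user"]["value"] = "something";
    }
</script>
<html>
<body>
    Rendered at: <%= DateTime.Now.ToString("o") %>
</body>
</html>
```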
After wasting a lot of time with Failed Request Tracing, which proved impossible to decipher, I cycled back to my suspicious cookie logic. Once I removed it, caching behaved normally. A quick audit uncovered a second place where some old, rudimentary tracking logic was also writing cookies, with the same effect on caching.
If you’re combining output caching with server-side cookie writes, I’d suggest taking another look.
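One mitigation worth considering is to make cookie writes conditional, so cacheable pages that don’t actually need to change the cookie never touch `Response.Cookies` at all. This is a hypothetical base-page helper I’m sketching, not the code from our app:

```csharp
// Hypothetical base-page helper: only touch Response.Cookies when the
// value actually needs to change. An unconditional write on every request
// silently disables output caching, even if the value being written is
// identical to what the browser already has.
protected void SetUserCookie(string key, string value)
{
    HttpCookie existing = Request.Cookies["user"];
    if (existing != null && existing[key] == value)
        return; // nothing to change; keep the response cacheable

    Response.Cookies["user"][key] = value;
}
```

Note the trade-off: the page that does write the cookie still won’t be cached, but every subsequent request from that user (which already carries the cookie) can be served from cache.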