HTTP

This page is an attempt to document some discrepancies between browsers and RFC 2068 (and its successor, RFC 2616), because the HTTP WG seems unwilling to resolve these issues. Hopefully, one day someone will write HTTP5 and take this into account.

Redirects

For 301 and 302 redirects, browsers uniformly ignore the HTTP specification and use GET for the subsequent request if the initial request used an unsafe method such as POST. (The user is not prompted, even though RFC 2616 says the user agent must not automatically redirect such a request unless the user can confirm it.)

Raised: http://lists.w3.org/Archives/Public/ietf-http-wg/2007JanMar/thread.html#msg225
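
To illustrate (a minimal sketch, not from the original discussion; the function name and URLs are invented), this is what the browser behavior amounts to: on a 301/302 the request is reissued as a bodyless GET, with no user confirmation:

    import http.client
    from urllib.parse import urlsplit, urljoin

    def post_following_redirects(url, body, max_hops=5):
        """POST to `url`, then follow 301/302 responses the way browsers
        do: reissue the request as a GET with no body, never prompting."""
        method, payload = "POST", body
        for _ in range(max_hops):
            parts = urlsplit(url)
            path = parts.path or "/"
            if parts.query:
                path += "?" + parts.query
            conn = http.client.HTTPConnection(parts.netloc)
            conn.request(method, path, payload)
            resp = conn.getresponse()
            if resp.status in (301, 302) and resp.getheader("Location"):
                url = urljoin(url, resp.getheader("Location"))
                method, payload = "GET", None  # the rewrite in question
                resp.read()
                conn.close()
                continue
            return resp
        raise RuntimeError("too many redirects")

A spec-compliant client would instead preserve the method or ask the user; 307 exists precisely to force method preservation.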

Location header

Browsers handle relative URIs and URIs with invalid characters in the Location header in an interoperable fashion, even though RFC 2616 only allows an absolute URI there.

Raised: http://lists.w3.org/Archives/Public/ietf-http-wg/2009JanMar/thread.html#msg276
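
For instance (an illustrative sketch, not part of the original page), browsers resolve a relative Location value against the URL of the request, which is the same resolution Python's standard library exposes:

    from urllib.parse import urljoin

    # RFC 2616 requires an absolute URI in Location, but servers commonly
    # send relative ones; browsers resolve them against the request URL.
    request_url = "http://example.com/a/b?x=1"
    print(urljoin(request_url, "/login"))   # http://example.com/login
    print(urljoin(request_url, "c"))        # http://example.com/a/c
    print(urljoin(request_url, "//cdn.example.com/r"))  # http://cdn.example.com/r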

Content-Location header

Browsers cannot support this header.

Raised: http://lists.w3.org/Archives/Public/ietf-http-wg/2006OctDec/thread.html#msg190

This has apparently been fixed by making Content-Location have no UA conformance criteria. (It's not clear what it's good for at this point.)

Accept header

The Accept header should preferably be serialized without spaces, e.g. Accept: text/html,application/xml;q=0.9 rather than Accept: text/html, application/xml; q=0.9, since some server-side code fails to parse values that contain spaces.

(Not raised. odinho: I came across a site that didn't like the spaces; the developer said he'd gotten the parsing code off php.net or Stack Overflow. He fixed the site. This could be disputed.)
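
To show the failure mode concretely (a hypothetical parser, not the actual code from that site), a copy-pasted splitter that assumes no whitespace around separators breaks on the spaced form:

    def parse_accept_naive(value):
        """Fragile parser: assumes no whitespace around "," or ";q=",
        as some copy-pasted snippets do."""
        types = {}
        for item in value.split(","):
            media_type, _, q = item.partition(";q=")
            types[media_type] = float(q) if q else 1.0
        return types

    # Works for the space-free form:
    print(parse_accept_naive("text/html,application/xml;q=0.9"))
    # {'text/html': 1.0, 'application/xml': 0.9}

    # Mishandles the equally valid form with spaces:
    print(parse_accept_naive("text/html, application/xml; q=0.9"))
    # {'text/html': 1.0, ' application/xml; q=0.9': 1.0}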

Requiring two interoperable browser implementations

To prove that RFC 2616 can be implemented, there should be two compatible implementations in browsers.

Raised: http://lists.w3.org/Archives/Public/ietf-http-wg/2007JanMar/0222.html

Assume Vary: User-Agent

UAs and intermediary caches should act as if all responses had Vary: User-Agent specified, since many pages on the Web serve different content depending on the User-Agent header but do not bother specifying Vary: User-Agent.

Raised: http://lists.w3.org/Archives/Public/ietf-http-wg/2012OctDec/0114.html
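
Concretely (an illustrative sketch, not part of the proposal), acting as if Vary: User-Agent were always present means the cache key incorporates the full User-Agent string, so any difference in UA forks the cache entry:

    from hashlib import sha256

    def cache_key(url, request_headers, vary=("User-Agent",)):
        """Build a cache key as if `Vary: User-Agent` were always set.
        Two requests share a cached entry only when their UA strings
        match exactly, which is what the objection below is about."""
        parts = [url] + [request_headers.get(h, "") for h in vary]
        return sha256("\x00".join(parts).encode()).hexdigest()

    k1 = cache_key("http://example.com/",
                   {"User-Agent": "Mozilla/5.0 (X11; Linux x86_64) Firefox/16.0"})
    k2 = cache_key("http://example.com/",
                   {"User-Agent": "Mozilla/5.0 (Windows NT 6.1) Firefox/16.0"})
    assert k1 != k2  # minor UA differences defeat the shared cache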

You may as well not have a cache if you do this. It's hard to find two users with the same User-Agent string if you try. It varies based on minor browser version, major OS version, and in old IE doesn't it vary based on installed plugins? Yes, some pages will break if you run a transparent caching proxy and don't vary based on UA, but it will be a small minority and somewhat random, and generally they'll fix themselves if you force-refresh. (Browsers send Cache-Control: no-cache when you force-refresh, which will skip a normally-configured cache.) Even if you vary based on UA, caching proxies will break some pages, because some sites serve incorrect caching headers and a caching proxy will make you hit these more often even in the single-user case. (E.g., hitting refresh will skip browser cache for the current page but not proxy cache, right?)

So basically, this is a performance vs. correctness tradeoff, and the correct answer for the vast majority of users is not to have a caching proxy at all. Some will want a caching proxy that serves them some incorrect pages. No one wants a caching proxy that varies based on UA, because then the cache will be useless. The only case I could think of where this might make sense is in an office with a homogeneous browser environment, which wants caching for its standard browsers (which all have the same UA string), but still wants to be relatively correct for people using Wi-Fi on their laptops with different browsers. But it's not something that makes any sense to require across the board. Aryeh Gregor 08:45, 17 October 2012 (UTC)