I'm simply trying to use XHR to precache some resources, but the cache is not behaving as expected.
Here are the bounds of the problem:
I know the resource URLs in advance (of course).
I don't know their content-types (mix of CSS, images, and other).
They will always be same-origin.
I control the cache headers.
They can be cached forever for all I care.
I've always been under the impression that XHR used the browser cache more or less like any other resource, but I'd never rigorously tested that. Here's what I'm doing (sketched in code after the list):
Request all resources up-front with XHR.
Explicitly set request header Cache-Control: max-age=3600 (Chrome was setting max-age=0 for some reason).
Set the following response headers on the server:
Cache-Control: public, max-age=3600
Date: now
Expires: now + 1 hour
[Content-Type, Content-Length]
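For reference, here's roughly what that precache pass looks like (a minimal sketch; the URL list is a placeholder):

// Hypothetical resource list; content-types are unknown in advance.
var urls = ['/css/app.css', '/img/sprite.png', '/js/widget.js'];

urls.forEach(function (url) {
  var xhr = new XMLHttpRequest();
  xhr.open('GET', url);
  // Chrome was sending max-age=0, so override it explicitly (step 2 above).
  xhr.setRequestHeader('Cache-Control', 'max-age=3600');
  xhr.onload = function () {
    // The body is discarded; the only goal is to warm the browser cache.
  };
  xhr.send();
});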
Here's what I'm seeing:
XHR always fetches the resource (confirmed on server and with dev tools).
Subsequent requests (via image/css/etc. elements) always hit the network on a cold cache, even after the XHRs have completed.
But they always use the cache when it's warm.
I've poked at it in various ways, but this behavior never seems to change.
After much wailing and gnashing of teeth, I believe I've proven that this approach simply won't work on all browsers. Here's my experience thus far:
Firefox: Works a charm.
Chrome: Seldom works. It's as though XHR uses a different cache than the elements (even though I'm pretty sure that's not the case; I haven't had time to delve into the Chrome code to figure out exactly what's going on).
Safari: Apparently random. Sometimes resource requests kicked off from elements retrieve from the cache, sometimes not. I'm sure there's a method, but it appears to be madness from the outside.
In the end, I had to switch to the somewhat more craptastic-but-reliable approach of creating a hidden iframe, injecting script/img elements into it, then waiting on the iframe window's onload event. This works, but gives no fine-grained feedback about which elements have loaded (getting reliable onload events from the individual elements is more "cross-browser tricky" than just waiting on the whole frame).
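For completeness, the fallback looks roughly like this (a sketch; the helper name and element wiring are illustrative, not the exact original code):

// Illustrative helper, not the original code.
function precacheViaIframe(urls, done) {
  var iframe = document.createElement('iframe');
  iframe.style.display = 'none';
  document.body.appendChild(iframe);

  var doc = iframe.contentWindow.document;
  doc.open();
  urls.forEach(function (url) {
    // Scripts and images both populate the shared browser cache.
    doc.write(/\.js$/.test(url)
      ? '<script src="' + url + '"><\/script>'
      : '<img src="' + url + '">');
  });
  doc.close();

  // Coarse-grained: fires once when everything in the frame has loaded.
  iframe.contentWindow.onload = done;
}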
I'd love to understand more precisely what's going on in Chrome/Safari, but sadly don't have the time to dig in further.
Based on some reading, I've been led to understand the following:
JavaScript files loaded from a CDN will produce sanitized stack traces in error.stack which simply contain "Script error".
Actual traces can be obtained if BOTH the script tag includes crossorigin="anonymous" and the CDN sends Access-Control-Allow-Origin: *
I have a number of questions about this that I'm hoping someone can answer:
If I programmatically append a script to the DOM and set script.crossOrigin = "anonymous" on it first (see the sketch after these questions), does that "count"? Will it cause issues on any older browsers?
Any advice on getting Access-Control headers working with S3 => CloudFlare? It honestly seems impossible. Despite being configured, S3 only sends the headers if the REQUEST includes Origin, which never seems to happen. And even if it did, if the first request came from a browser that doesn't support CORS/Origin headers, the invalid version would be the one that gets cached.
Is it actually true that I even need these headers? I feel like I've seen stack traces in Chrome without the headers being there. Was the restriction removed from Chrome because it was unrealistic? Or does it not apply to JavaScript files added programmatically? Which browser versions actually sanitize the traces?
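For the first question, the injection I'm describing looks like this (a sketch; the CDN URL is a placeholder):

var script = document.createElement('script');
// Note the DOM property is camelCased; setAttribute('crossorigin', 'anonymous') also works.
script.crossOrigin = 'anonymous';
script.src = 'https://cdn.example.com/app.js';
(document.head || document.getElementsByTagName('head')[0]).appendChild(script);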
I need to handle an infinite HTTP response (with Transfer-Encoding: chunked header).
This response contains a stream of images, so it must be handled as efficiently as possible.
XMLHttpRequest is not a solution here since it keeps the whole response in memory. Plus, when reading an ArrayBuffer, the response isn't populated until streaming ends, which means never here.
So, since I am under Firefox OS, the TCPSocket API seems to be my only hope.
I already started implementing a dirty HTTP stack (here and here), taking inspiration from the IMAP/SMTP implementations, but it is still very slow.
So, two questions:
Is it worth spending time on this, or did I miss something easier?
If I want to implement it, what are the best practices not to forget about?
PS: I communicate with an external device, so changes on the server side are just not possible here.
As stated in the XMLHttpRequest documentation on MDN, Firefox (and therefore Firefox OS) makes extra responseType values available for streaming data, like moz-chunked-arraybuffer.
// mozSystem allows cross-origin requests; on Firefox OS it requires a
// privileged app with the "systemXHR" permission.
var xhr = new XMLHttpRequest({ mozSystem: true });
xhr.open('GET', deviceStreamingUrl);
// Firefox-only: deliver the body chunk by chunk as ArrayBuffers.
xhr.responseType = "moz-chunked-arraybuffer";
xhr.addEventListener('progress', event => {
  // With this responseType, xhr.response holds only the latest chunk,
  // not the accumulated body, so memory use stays bounded.
  processChunk(xhr.response);
});
xhr.send();
Thanks to fabrice in #fxos on irc.mozilla.org!
I'm conditionally loading JavaScript code using Modernizr's loader (the integrated yepnope.js). Here's my code:
Modernizr.load({
  test: Modernizr.geolocation,
  yep: ['js/get-native-geo-data.js', 'https://www.google.com/jsapi'],
  nope: ['js/get-geo-data-by-ip.js', 'https://www.google.com/jsapi'],
  complete: function () {
    google.load("maps", "3",
      { other_params: "sensor=false", 'callback': init });
  }
});
It works, but the Network tab in both Firebug and Google Developer Tools shows that it loads get-native-geo-data.js twice. I've added a console.log() to get-native-geo-data.js, and it prints the message only once. So what makes both Firebug and Dev Tools report two network calls, both returning 200 with a size of 3 KB?
This is how Firebug reports the response header for each GET (they are the same):
HTTP/1.1 200 OK
Date: Thu, 20 Dec 2012 19:39:52 GMT
Server: HttpComponents/4.1.3
Content-Length: 3054
Content-Type: application/x-javascript
Connection: keep-alive
After running it through the Charles monitoring tool I see the same results: the file is fetched twice. So where is the bug: in Modernizr, in yepnope, or in my head?
Not sure if you've seen it in the yepnope documentation:
I'm seeing two requests in my dev tools, why is it loading everything twice?

Depending on your browser and your server this could mean a couple different things. Due to the nature of how yepnope works, there are two requests made for every file. The first request is to load the resource into the cache and the second request is to execute it (but since it's in the cache, it should execute immediately). Seeing two requests is pretty normal as long as the second request is cached. If you notice that the second request isn't cached (and your script load times are doubling), then make sure you are sending the correct cache headers to allow the caching of your scripts. This is vital to yepnope. It will not work without proper caching enabled. We actually test to make sure things aren't loaded twice in our test suite, so if you think we may have a bug in your browser regarding double loading, we encourage you to run the test suite to see if the double loading test passes.
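In other words, the fix is on the server. A minimal sketch, assuming a Node/Express server (the question's server is HttpComponents, where the equivalent is sending the same Cache-Control header):

var express = require('express');
var app = express();

// Serve the scripts with explicit cache headers so yepnope's second
// (execute) request is answered from the browser cache.
app.use('/js', express.static('js', { maxAge: '1h' })); // Cache-Control: public, max-age=3600

app.listen(8080);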
I have an XML response on a server which I want to fetch with an AJAX request from my page. The server is properly configured so that cross-domain requests work.
The problem is the content type - it's fixed to 'text/html', and I can't change that.
Most browsers seem happy to accept an XML response with that content type. So far my code works in any recent version of Firefox, Chrome and Safari.
Internet Explorer 8 is giving me trouble, though.
I've prepared a jsfiddle trying to simulate my issue:
http://jsfiddle.net/LPa45/4/
On that jsfiddle, an AJAX request is made to the /echo/html service (which returns 'text/html' as the content type), and the result is then used as an XML response. The "accepts" parameter, even though it's set for this specific purpose, doesn't really affect anything; I can remove it and everything still works in Firefox and Chrome.
But I can't make it work on IE8. Does anyone have any hints?
Thanks!
Looking at the console output (I'm using IE9 in IE8 mode, so this may not be the same in pure IE8), it errors because the Array.prototype.map() function is unavailable. Googling suggests this was added in IE9.
See this answer for a map function you can use which should fix your issue.
Fiddle: http://jsfiddle.net/WvmBL/
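For reference, a compact map polyfill along those lines (the MDN polyfill is more thorough):

// ES5-style Array.prototype.map for IE8 and older.
if (!Array.prototype.map) {
  Array.prototype.map = function (callback, thisArg) {
    var result = new Array(this.length);
    for (var i = 0; i < this.length; i++) {
      if (i in this) {
        result[i] = callback.call(thisArg, this[i], i, this);
      }
    }
    return result;
  };
}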
The problem was that I thought the same-origin policy issue had been solved by adding the Access-Control-Allow-Origin "*" directive to Apache, but it turns out that IE8 doesn't respect it.
The only option I had was implementing a makeshift proxy server.
But!
Once the proxy server was up and running, I hit exactly the problem akiller mentions and answers: IE8 doesn't have Array.map.
So I'm marking his answer as correct, but keep in mind that it is not the whole answer! You need a proxy too.
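For anyone in the same spot, a same-origin proxy along those lines can be very small. A sketch, assuming Node and a hypothetical remote host:

var http = require('http');

// Forward every request to the remote host and pipe the reply back,
// so the browser only ever talks to its own origin.
http.createServer(function (req, res) {
  var upstream = http.request(
    { host: 'remote.example.com', path: req.url, method: req.method },
    function (remoteRes) {
      res.writeHead(remoteRes.statusCode, remoteRes.headers);
      remoteRes.pipe(res);
    });
  req.pipe(upstream);
}).listen(8080);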
I make an Ajax request in which I set the response cacheability and last modified headers:
// NOTE: this returns 304 whenever the client sends If-Modified-Since at all,
// without comparing it against a stored last-modified date (see Update below).
if (!String.IsNullOrEmpty(HttpContext.Current.Request.Headers["If-Modified-Since"]))
{
    HttpContext.Current.Response.StatusCode = 304;
    HttpContext.Current.Response.StatusDescription = "Not Modified";
    return null;
}

// Mark the response publicly cacheable and stamp it with a Last-Modified header.
HttpContext.Current.Response.Cache.SetCacheability(HttpCacheability.Public);
HttpContext.Current.Response.Cache.SetLastModified(DateTime.UtcNow);
This works as expected. The first time I make the Ajax request, I get 200 OK. The second time I get 304 Not Modified.
When I hard refresh in Chrome (Ctrl+F5), I get 200 OK - fantastic!
When I hard refresh in Internet Explorer/Firefox, I get 304 Not Modified. However, every other resource (JS/CSS/HTML/PNG) returns 200 OK.
The reason is that the "If-Modified-Since" header is sent for XMLHttpRequests regardless of a hard refresh in those browsers. I believe Steve Souders documents it here.
I have tried setting an ETag and conditioning on "If-None-Match" to no avail (it was mentioned in the comments on Steve Souders page).
Has anyone got any gems of wisdom here?
Thanks,
Ben
Update
I could check the "If-Modified-Since" against a stored last modified date. However, hopefully this question will help other SO users who find the header to be set incorrectly.
Update 2
The request is sent with the "If-Modified-Since" header each time; however, Internet Explorer won't even make the request if an expiry isn't set or is set to a future date. Useless!
Update 3
This might as well be a live blog now. Internet Explorer doesn't bother making the second request when the host is localhost. Using a real IP address or the loopback address works.
Prior to IE10, IE does not apply the Refresh Flags (see http://blogs.msdn.com/b/ieinternals/archive/2010/07/08/technical-information-about-conditional-http-requests-and-the-refresh-button.aspx) to requests that are not made as part of loading the document.
If you want, you can adjust the target URL to contain a nonce to prevent the cached copy from satisfying a future request. Alternatively, you can send max-age=0 to force IE to conditionally revalidate the resource before each reuse.
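A sketch of the nonce approach (the endpoint is a placeholder):

// Each request gets a unique URL, so a stale cached copy can never satisfy it.
var xhr = new XMLHttpRequest();
xhr.open('GET', '/api/data?_=' + new Date().getTime());
xhr.send();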
As for why the browser reuses a cached resource that didn't specify a lifetime, please see http://blogs.msdn.com/b/ie/archive/2010/07/14/caching-improvements-in-internet-explorer-9.aspx
The solution I came upon for consistent control was managing the cache headers for all request types.
So I forced standard requests to behave the same as XMLHttpRequests, telling IE to use the following cache policy: Cache-Control: private, max-age=0.
For some reason, IE was not honoring headers consistently across request types. For example, my cache policy for standard requests defaulted to the browser's, while for XMLHttpRequests it was set to the aforementioned policy. However, making a request to something like /url as a standard GET request rendered the result properly. Unfortunately, making the same request to /url as an XMLHttpRequest wouldn't even hit the server, because the GET response was cached and the XMLHttpRequest was hitting the same URL.
So, either force your cache policy on all fronts or make sure you're using different access points (URIs) for your request types. My solution was the former (sketched below).
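A minimal sketch of that force-it-on-all-fronts policy, assuming a Node/Express server (the question's stack is ASP.NET, where the equivalent is setting the same header globally):

var express = require('express');
var app = express();

// One cache policy for every response, so a standard GET and an
// XMLHttpRequest to the same URL are treated identically.
app.use(function (req, res, next) {
  res.setHeader('Cache-Control', 'private, max-age=0');
  next();
});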