Why does Chrome sometimes make 2 AJAX requests instead of 1? - javascript

I have encountered some strange behaviour in the current version of Google Chrome (44.0.2403.157) on OS X.
When I make an AJAX (XHR) GET request, Chrome sometimes makes 1 request and sometimes 2 (!).
My code:
var jsUrl = 'http://www.some-domain.com/xy.js';
var ajax = new XMLHttpRequest();
ajax.withCredentials = true;
ajax.open( 'GET', jsUrl, true ); // async, because a sync request cannot be sent with cookies
ajax.onreadystatechange = function () {
    var script = ajax.response || ajax.responseText;
    if (ajax.readyState === 4) {
        switch (ajax.status) {
            case 200:
            case 304:
                eval.apply( window, [script] );
                // no break on purpose here
            default:
                someFunction();
        }
    }
};
ajax.send(null);
A screenshot of my developer console when 2 requests are being made:
Server log confirms that the second request is being made - it's not just in the console.
The headers of the second request are practically the same as those of the first; the only difference is the value of a cookie that is changed by the 3rd request in the screenshot above.
Note that the current version of Firefox (40.0.3) on OS X doesn't show this behaviour: only 1 request is made, every time (tested both by watching requests in Firebug, with BFCache requests shown, and by watching the server logs).
For a while I thought that using send() vs. send(null) made a difference, i.e. that only one request was made when using send(), but after more tests I can see the strangeness is present with both syntaxes.
Can you give me a hint about what is happening here?

Related

XMLHttpRequest returning with status 200, but 'onreadystatechange' event not fired

We have been receiving an intermittent bug with the XMLHttpRequest object when using IE11. Our codebase is using legacy architecture, so this browser is required.
After clicking a button, the browser launches an out-of-band process by creating a new ActiveX control which integrates with a camera to capture an image. This control appears to be working fine... it allows the operator to capture the image, and the Base64 content of the image is returned out of the control back to the browser interface, so I think we can rule out a problem with this object.
Once the image is returned to the browser, the browser performs an asynchronous 'ping' to the web server to check whether the IIS session is still alive or has expired (because the out-of-band image capture process forbids control of the browser while it is open).
The ping to the server returns successfully (and running Fiddler I can see that the response has status 200), with the expected response data:
<sessionstate>ok</sessionstate>
There is a defined 'onreadystatechange' function which should be fired on this response, and the majority of the time it fires correctly. However, on the rare occasion that the problem does appear, it then continues to happen every time.
Here is a snippet of the code... we expect the 'callback()' function to be called on a successful response to Timeout.asp:
XMLPoster.prototype.checkSessionAliveAsync = function(callback) {
    var checkSessionAlive = new XMLHttpRequest();
    checkSessionAlive.open("POST", "Timeout.asp?Action=ping", true);
    checkSessionAlive.setRequestHeader("Content-Type", "application/x-www-form-urlencoded");
    checkSessionAlive.onreadystatechange = function() {
        if (checkSessionAlive.readyState == 4) {
            if (checkSessionAlive.responseText.indexOf("expired") != -1 || checkSessionAlive.status !== 200) {
                eTop.window.main.location = "timeout.asp";
                return;
            }
            callback(checkSessionAlive.responseText);
        }
    };
    checkSessionAlive.send();
}
Has anyone seen anything like this before? I appreciate that using legacy software is not ideal, but we are currently limited to using it.

How to read out XMLHttpRequest responses

I'm doing an XMLHttpRequest like this:
var url = "myurl.html";
var http_request = new XMLHttpRequest();
http_request.onreadystatechange = function() {
    if (http_request.readyState === 4) {
        console.log(http_request.response);
    }
};
http_request.withCredentials = true;
http_request.open('GET', url, true);
http_request.send(null);
In the console, http_request.response is empty, but when I look into Chrome's Network tab I can see the response data.
How do I get to this data from JavaScript?
The request goes to a different computer on the same network. The server at myurl.html doesn't send an 'Access-Control-Allow-Origin' header. For the response to be readable, its headers must contain something like this:
Access-Control-Allow-Origin: *
(* stands for a wildcard)
When the header is missing, Chrome blocks the response from JavaScript. Hence, the Network tab shows the response but the JS console doesn't. There are several methods to bypass CORS in different browsers; for Chrome, I used an extension from the Chrome Web Store.
To further ensure that the request was done correctly, the callback function can be modified to this:
http_request.onreadystatechange = function () {
    if (http_request.readyState === XMLHttpRequest.DONE && http_request.status === 200) {
        console.log(http_request.responseText);
    }
};
http_request.readyState should be 4; the status, however, will be 0 in the CORS-blocked case (even though the Network tab shows it as 200).
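As a sketch of that distinction, a small helper can separate the cases by status code; the function name and return labels here are illustrative, not any standard API:

```javascript
// Distinguishes a CORS-blocked response (status 0) from real HTTP results.
// The helper name and return strings are illustrative, not a standard API.
function classifyXhrResult(readyState, status) {
  if (readyState !== 4) return "pending";
  if (status === 0) return "blocked-or-network-error"; // CORS block, abort, etc.
  if (status >= 200 && status < 300) return "ok";
  return "http-error";
}

console.log(classifyXhrResult(4, 0));   // "blocked-or-network-error": what a CORS-blocked XHR reports
console.log(classifyXhrResult(4, 200)); // "ok": a readable successful response
```

This makes the status-0 case explicit instead of silently treating it as a failed 200 check.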

If statement causes parsing but nothing else inside the branch

This is an odd one. This function is called by a timer every 1 second in Qt. Check out the if statement: first it parses some JSON data, then it logs that it is parsing. As I would expect, the console.log only happens when fileshow.txt changes its contents. HOWEVER, the line that says var parsed = JSON.parse(t) reports a parsing error every 1 second, even when nothing else (including the logging) happens during that second:
function get() {
    var xhr = new XMLHttpRequest;
    xhr.open("GET", "/fileshow.txt");
    xhr.onreadystatechange = function () {
        var t = xhr.responseText;
        if (t != tt.lastData) {
            var parsed = JSON.parse(t);
            console.log("parsing");
            viewer.newImages(parsed.files);
            thetime.shouldRepeat = parsed.repeat;
            thetime.fps = parsed.fps;
            tt.lastData = t;
            thetime.running = true;
        }
    };
    xhr.send();
}
Even though I get a parse error (which is a different topic: the data is in fact parsing correctly despite the error, as it is routed through the assignments above just fine and other parts of the program receive the data as expected), I should not even see an error for that source line unless the if branch actually runs! How can it report a parsing error that could only happen in that if branch when that branch is not even running?!
There is no other parsing anywhere, and the error is reported for the specific line number of this JSON.parse call.
If you're asking why the if statement executes, it's because onreadystatechange is called every time the state of the XHR request changes. There are 5 states for XHR:
0: UNSENT (open() has not been called yet)
1: OPENED (send() has not been called yet)
2: HEADERS_RECEIVED (send() has been called and headers and status are available)
3: LOADING (downloading; responseText holds partial data)
4: DONE (operation complete)
Because onreadystatechange is also called when the response starts loading, you're getting passed a partial JSON file, which is extremely unlikely to validate as proper JSON. You need to check that the readyState is 4, that is, that it has finished loading:
if (xhr.readyState === 4 && xhr.status === 200 && t != tt.lastData) {
You also probably want to check that the request was successful by checking for an HTTP 200 response. For more information about XMLHttpRequest, see this MDN article.
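To see why partial data breaks the parse, here is a minimal sketch: JSON.parse throws a SyntaxError on a truncated string, which is exactly what the handler receives at readyState 3 (the payload below is made up for illustration):

```javascript
// A complete JSON body, shaped like the fileshow.txt data in the question.
var full = '{"files": ["a.jpg", "b.jpg"], "repeat": true, "fps": 24}';
// What readyState 3 might deliver: the same body, cut off mid-stream.
var partial = full.slice(0, 20);

var parseError = null;
try {
    JSON.parse(partial); // throws: truncated JSON is not valid JSON
} catch (e) {
    parseError = e;
}
console.log(parseError instanceof SyntaxError); // true
console.log(JSON.parse(full).fps);             // 24: the complete body parses fine
```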

Not Receiving Asynchronous AJAX Requests When Sent Rapidly

My script sends a GET request to a page (http://example.org/getlisting/) and the page, in turn, responds with a JSON object. ({"success":true, "listingid":"123456"})
Here's an example snippet:
var listingAjax = new XMLHttpRequest();
listingAjax.addEventListener("load", listingCallback, false);
function listingCallback(event) {
    if (this.readyState == 4 && this.status == 200) {
        console.log(this.responseText);
    }
}
listingAjax.open("GET", "http://example.org/getlisting/", true);
listingAjax.send();
Simple enough. The script works perfectly too! The issue arises when I want to do this:
var listingAjax = new XMLHttpRequest();
listingAjax.addEventListener("load", listingCallback, false);
function listingCallback(event) {
    if (this.readyState == 4 && this.status == 200) {
        console.log(this.responseText);
    }
}
window.setInterval(function() {
    listingAjax.open("GET", "http://example.org/getlisting/", true);
    listingAjax.send();
}, 250);
What I imagine should happen is that my script creates a steady flow of GET requests going out to the server, the server responds to each one, and my script receives the responses and passes them to the callback.
To be more exact, say I let this script run for 5 seconds and it sends out 20 GET requests in that time. I would expect my callback (listingCallback) to be called 20 times as well.
The issue is, it isn't. It almost seems that if I send out a second GET request before receiving the response to the first, the earlier response is ignored or discarded.
What am I doing wrong/misunderstanding from this?
Many browsers have a built-in maximum number of open HTTP connections per server; you might be hitting that wall.
Here is an example from Mozilla, but most browsers have something like this built in: http://kb.mozillazine.org/Network.http.max-connections-per-server
An earlier question regarding Chrome:
Increasing Google Chrome's max-connections-per-server limit to more than 6
If you are on Windows, take a look at a tool like Fiddler; you might be able to see whether all of the requests are actually being issued or whether the browser is queueing/killing some of them.
You can't reuse the same XMLHttpRequest object to open a new connection while one is already in progress; doing so abruptly aborts the in-flight request (tested in Chrome). Using a new XMLHttpRequest object for each call solves that:
function listingCallback(event) {
    if (this.readyState == 4 && this.status == 200) {
        console.log(this.responseText);
    }
}
window.setInterval(function() {
    var listingAjax = new XMLHttpRequest();
    listingAjax.addEventListener("load", listingCallback, false);
    listingAjax.open("GET", "http://example.org/getlisting/", true);
    listingAjax.send();
}, 250);
This will work nicely, queueing a new ajax request on each interval.
Fiddle
Note that calls made too frequently may cause a slowdown due to each browser's inherent limit on concurrent ajax calls.
Modern browsers, though, have a fairly generous limit and very good parallelism, so as long as you're fetching just a small JSON object they should be able to keep up, even on a dial-up connection.
Last time I made an ajax polling script, I'd start a new request in the success handler of the previous request instead of using an interval, in order to minimize ajax calls. Not sure if this logic is applicable to your app though.
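That chained pattern can be sketched with the transport factored out; requestListing below is a stand-in for whatever function issues the real XHR and invokes its callback with the parsed response:

```javascript
// Chained polling: the next request starts only after the previous one
// completes, so requests never overlap and no response gets discarded.
function poll(requestListing, onResult, delayMs, remaining) {
    if (remaining === 0) return;
    requestListing(function (result) {
        onResult(result);
        setTimeout(function () {
            poll(requestListing, onResult, delayMs, remaining - 1);
        }, delayMs);
    });
}

// Usage with a fake transport standing in for the XHR call:
var results = [];
var n = 0;
poll(
    function (done) { done({ success: true, listingid: String(++n) }); },
    function (r) { results.push(r.listingid); },
    250, // delay between polls, in ms
    3    // stop after three polls
);
```

In a real page, requestListing would create a fresh XMLHttpRequest, and done would be called from its load handler.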

Consecutive Ajax requests without jQuery/ JS library

I have an issue, mainly with IE.
I need to be able to handle n queries one after another, but if I simply call my function below in a for loop, IE does some strange things (like loading only some of the calls).
If I use an alert box, it proves that the function receives all of the calls, and surprisingly IT WORKS!
My guess is that IE needs more time than other browsers, and the alert box provides just that.
Here is my code:
var Ajax = function(all) {
    this.xhr = new XMLHTTPREQUEST(); // Function returns xhr object/ActiveX
    this.uri = function(queries) { // Takes an object and formats a query string
        var qs = "", i = 0, len = size(queries);
        for (value in queries) {
            qs += value + "=" + queries[value];
            if (++i <= len) { qs += "&"; }
        }
        return qs;
    };
    xhr.onreadystatechange = function() { // called when content is ready
        if (this.readyState === 4) {
            if (this.status === 200) {
                all.success(this.responseText, all.params);
            }
            this.abort();
        }
    };
    this.post = function() { // POST
        xhr.open("POST", all.where, true);
        xhr.setRequestHeader("Content-type", "application/x-www-form-urlencoded");
        xhr.send(uri(all.queries));
    };
    this.get = function() { // GET
        xhr.open("GET", all.where + "?" + uri(all.queries), true);
        xhr.send();
    };
    if (this instanceof Ajax) {
        return this.Ajax;
    } else {
        return new Ajax(all);
    }
};
This function works perfectly for a single request, but how can I get it to work when it is called many times within a loop?
I think the problem is related to the limit of 2 concurrent connections per server that most web browsers implement.
It looks like the latency of your web service's responses is making your AJAX requests overlap, which in turn exceeds the 2-concurrent-connection limit.
You may want to check out these articles regarding this limitation:
The Dreaded 2 Connection Limit
The Two HTTP Connection Limit Issue
Circumventing browser connection limits for fun and profit
This limit is also suggested in the HTTP/1.1 spec (section 8.1.4, last paragraph), which is probably the main reason why most browsers impose it.
To work around this problem, you may want to consider the option of relaunching your AJAX request ONLY after a successful response from the previous AJAX call. This will prevent the overlap from happening. Consider the following example:
function autoUpdate() {
    var ajaxConnection = new Ext.data.Connection();
    ajaxConnection.request({
        method: 'GET',
        url: '/web-service/',
        success: function (response) {
            // Add your logic here for a successful AJAX response.
            // ...
            // ...
            // Relaunch the autoUpdate() function in 100ms. (Could be less or more)
            setTimeout(autoUpdate, 100);
        }
    });
}
This example uses ExtJS, but you could very easily use just XMLHttpRequest.
Given that most browsers limit a single domain to 2 concurrent connections, launching more than 2 concurrent requests confers no speed advantage. Launch 2 requests, then dequeue and launch another each time one completes.
I'd suggest throttling your requests so you only have a few (4?) outstanding at any given time. You're probably seeing the result of multiple requests being queued and timing out before your code can handle them all. Just a guess, though. We have an ajax library with built-in throttling that queues the requests so we only have 4 outstanding at any one time, and we don't see any problems. We routinely queue lots per page.
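A throttle like the one described can be sketched as a small FIFO queue; makeThrottle and its limit are illustrative, not part of any particular library. Each task receives a done callback that it must invoke when its request finishes:

```javascript
// Minimal request throttle: at most `limit` tasks run at once,
// the rest wait in a FIFO queue. Each task is a function(done).
function makeThrottle(limit) {
    var active = 0, queue = [];
    function next() {
        if (active >= limit || queue.length === 0) return;
        active++;
        var task = queue.shift();
        task(function done() {
            active--;
            next(); // a finished task frees a slot for the next one
        });
    }
    return function enqueue(task) {
        queue.push(task);
        next();
    };
}

// Usage with stand-in tasks (a real task would issue an XHR and
// call done() from its onreadystatechange handler):
var order = [];
var pending = [];
var run = makeThrottle(2);
for (var i = 1; i <= 5; i++) {
    (function (id) {
        run(function (done) {
            order.push(id);       // record that this task started
            pending.push(done);   // complete it later by calling done()
        });
    })(i);
}
// Only the first two tasks have started; the other three are queued.
```

Completing a task (calling its done) immediately starts the next queued one, which keeps the number of in-flight requests bounded.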
Your code looks like it's put together using the constructor pattern. Are you invoking it with the new operator, like var foo = new Ajax(...), or are you just calling it directly, like var foo = Ajax(...)?
If the latter, you're likely overwriting state on your later calls. The function looks designed to be called with new to create an object on which the get/post methods are then called. This could be your problem if you're "calling it within a loop" as you say.

Categories

Resources