I have a problem with ajax requests that have a fixed timeout. I am just using this simple code:
$.ajax({
    url: "test.html",
    error: function() {
        // do something
    },
    success: function() {
        // do something
    },
    timeout: 7000
});
In my scenario it is possible that around 20 ajax requests are fired at once, each with a timeout of 7 seconds. As you may know, each browser has a different connection limit. And here is the problem: some of the ajax requests are queued because of the connection limit, but their timeout countdown has already started. So some requests time out before they even reach the server. Is there a way to make the timeout count down only once the request is actually transmitted to the server?
A quick and dirty "replacement" for $.ajax - it only covers the options you use in the example, but it should work as a "drop in" replacement.
function betterAjax(options) {
    var xhr = new XMLHttpRequest();
    // map the jQuery-style callback names onto native XHR events
    var setListener = function(which, fn) {
        if (typeof fn == 'function') {
            if (which == 'success') {
                which = 'load';
            }
            xhr.addEventListener(which, fn);
        }
    };
    xhr.open(options.method || 'GET', options.url);
    setListener('success', options.success);
    setListener('error', options.error);
    if (options.timeout) {
        xhr.timeout = options.timeout;
    }
    xhr.send();
}
Which is a simplified version of something I wrote some years back - basically it's https://jsfiddle.net/jaromanda/wpkeet34/
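Usage mirrors your original call - something like this (a sketch, reusing the options from your snippet):
betterAjax({
    url: "test.html",
    timeout: 7000,
    success: function() {
        // do something
    },
    error: function() {
        // do something
    }
});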
Related
My program will generate a large number (thousands) of content URLs that it will then loop through, making an ajax GET request for each.
The problem I am having is that IE can only operate on 6 requests at a time, so the remaining ones are stuck with readyState = "UNSENT". This is only problematic because some of these requests can take minutes, while others take only seconds.
Every time my program runs, right at the 300 second mark, all of the UNSENT XHR requests time out and throw an error in the console: XMLHttpRequest: Network Error 0x2ee2, Could not complete the operation due to error 00002ee2. After looking into this error, I noticed that it is essentially a network timeout.
A few things I have tried and confirmed after much testing:
This is strictly happening in IE11 (tested in Chrome and it worked fine)
I tried setting timeouts greater than 300 seconds in multiple different forms/ways. All of them only affected the individual Ajax calls, meaning if one started and took more than my timeout (let's say ten minutes) it would time out, but the UNSENT requests would still time out after five minutes.
I also tried setting the timeout to 0, with no effect (it still threw out the UNSENT requests after 300 seconds).
Attempt 1
sContentUrls.forEach(function(sContentUrl) {
    $.get({
        url: sContentUrl,
        timeout: 600000,
        error: function() {
        },
        success: function() {
            console.log("Success");
        },
        async: true
    });
});
Attempt 2
sContentUrls.forEach(function(sContentUrl) {
    var xhr = new XMLHttpRequest();
    xhr.open("GET", sContentUrl, true);
    xhr.onreadystatechange = function() {
        if (xhr.readyState === 4 && xhr.status === 200) {
            console.log("Success");
        }
    };
    xhr.onprogress = function() { };
    xhr.ontimeout = function() { };
    xhr.onerror = function() { };
    setTimeout(function() {
        xhr.send();
    }, 0);
});
I am wondering what I can do to stop IE11 from throwing this error and aborting thousands of requests. I have looked into setting up a manager/queue for the requests, so that a new one is only started when it can be sent right away.
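For reference, a rough sketch of the kind of queue I mean (the concurrency limit of 6 and the function names are just illustrative, nothing I have tested):
// only `maxConcurrent` requests are in flight at once; the rest wait in
// `pending` until a slot frees up
var maxConcurrent = 6; // IE11's per-host connection limit
var active = 0;
var pending = [];

function enqueueGet(url) {
    pending.push(url);
    runNext();
}

function runNext() {
    if (active >= maxConcurrent || pending.length === 0) {
        return;
    }
    active++;
    var url = pending.shift();
    $.get(url)
        .done(function() { console.log("Success"); })
        .always(function() {
            active--;
            runNext(); // start the next queued request, if any
        });
}

sContentUrls.forEach(function(sContentUrl) {
    enqueueGet(sContentUrl);
});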
Has anyone seen this issue before, or does anyone know of a potential workaround/solution? I can confirm that it does not occur if I make the requests synchronous, but due to the massive number of requests that isn't a great option for me.
Thanks!
I'm curious what some of your solutions are for elegantly dealing with blocking JS calls / ajax calls that take too long reaching out to third-party sites for data/info.
Lately, I've been contending with some scripts/ajax requests where the server is either down or not responding, and they literally block my page. They are supposed to be async.
So, I want to abort the call after x time.
var request = $.ajax({
    type: 'POST',
    url: 'someurl',
    success: function(result) {}
});
then use: request.abort() if it takes too long.
But I am thinking I can use a deferred/promise here, with a timeout ability, and call abort if my promise doesn't come back in, say, 1000ms.
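Something along these lines is what I had in mind (just a sketch; the 1000ms value is arbitrary):
var request = $.ajax({
    type: 'POST',
    url: 'someurl'
});

// abort the request if it hasn't settled within 1000ms
var guard = setTimeout(function() {
    request.abort();
}, 1000);

// cancel the pending abort once the request completes or fails
request.always(function() {
    clearTimeout(guard);
});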
Your thoughts?
My bad for missing the timeout attr of the ajax request. I didn't want to wrap the abort() in a setTimeout, but the jQuery ajax API with a timeout is what I need. I should have seen this. Thanks all.
Check the timeout option: http://api.jquery.com/jQuery.ajax/
var request = $.ajax({
    type: 'POST',
    url: 'someurl',
    timeout: 2000,
    success: function(result) {},
    error: function(xhr, status, message) {
        if (status == "timeout") {
            alert("Request timed out");
        }
    }
});
One way is to use setTimeout:
var abort_req = setTimeout(function() {
    request.abort();
}, 3000); // time in ms
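If the request finishes before the timer fires, you would also want to cancel the pending abort so it doesn't fire later (a sketch, reusing the request and abort_req variables above):
request.always(function() {
    clearTimeout(abort_req);
});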
Possible duplicate of jQuery $.ajax timeout setting. jQuery AJAX has an optional timeout option, specified in milliseconds. You can run logic when the AJAX request times out by passing a callback function to the error option, like so:
error: function(x, t, m) {
    // x = jqXHR, t = textStatus (e.g. "timeout"), m = errorThrown
    // process the error here
}
This question already has answers here:
Check if Internet Connection Exists with jQuery? [duplicate]
Is there a way to check whether an internet connection exists - like a jQuery method? Assuming there is no internet connection, of course.
I would try to make HEAD requests (no content downloaded) to a few servers you know are online. They will automatically fail if there is no network (no need to set a timeout).
$.ajax({
    type: "HEAD",
    url: 'http://www.google.com',
    error: function() {
        alert('world is gone !');
    }
});
DEMONSTRATION (unplug your network to test)
If you are dealing with ajax requests, you can catch the timeout error - see error(jqXHR, textStatus, errorThrown) in the $.ajax reference.
Sort of, yes. Before the actual submission, you can have an ajax call try to connect to a simple lightweight service on your site just to check if it is reachable.
If the call fails, you can assume there's no connection.
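A sketch of that pre-check (the '/ping' URL and the 3 second timeout are just placeholders for whatever lightweight resource you expose):
// '/ping' is a hypothetical lightweight endpoint on your own site
function checkReachable(onResult) {
    $.ajax({
        type: "HEAD",
        url: "/ping",
        timeout: 3000,
        success: function() { onResult(true); },
        error: function() { onResult(false); }
    });
}

// usage: only submit when the pre-check succeeds
checkReachable(function(reachable) {
    if (reachable) {
        // go ahead with the actual submission
    } else {
        // assume there is no connection
    }
});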
With the help of failed XHR requests you can determine the connection status.
Retry a few times; if the request does not go through, alert the user and fail gracefully.
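A sketch of that retry-then-give-up idea (the retry count and delay are arbitrary):
// try the request up to `maxTries` times before giving up
function requestWithRetry(settings, maxTries) {
    $.ajax(settings).fail(function() {
        if (maxTries > 1) {
            // wait a moment, then retry with the same settings
            setTimeout(function() {
                requestWithRetry(settings, maxTries - 1);
            }, 2000);
        } else {
            alert('No connection - please try again later.');
        }
    });
}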
You can use this function, which I got from http://jamiethompson.co.uk/web/2008/06/17/publish-subscribe-with-jquery/:
$.networkDetection = function(url, interval) {
    var online = false;

    this.StartPolling = function() {
        this.StopPolling();
        this.timer = setInterval(poll, interval);
    };
    this.StopPolling = function() {
        clearInterval(this.timer);
    };
    this.setPollInterval = function(i) {
        interval = i;
    };
    this.getOnlineStatus = function() {
        return online;
    };

    // poll the given URL and broadcast the result as a jQuery event
    function poll() {
        $.ajax({
            type: "POST",
            url: url,
            dataType: "text",
            error: function() {
                online = false;
                $(document).trigger('status.networkDetection', [false]);
            },
            success: function() {
                online = true;
                $(document).trigger('status.networkDetection', [true]);
            }
        });
    }
};
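Usage looks roughly like this (the poll URL and 5 second interval are only examples):
// poll a small page on your own server every 5 seconds
var detector = new $.networkDetection('/ping.php', 5000);
detector.StartPolling();

// react whenever a poll reports the online status
$(document).on('status.networkDetection', function(event, online) {
    console.log(online ? 'online' : 'offline');
});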
One way to go about it is to send your form using AJAX. You'll then have a handler that tells you it wasn't able to connect to the server, and you can save the data and inform the user that the server is not available. It doesn't really matter whether the problem is your internet connection or whether the server is down.
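For instance (a sketch only; the form selector and storage key are made up):
// submit the form via AJAX; on failure, keep the data and tell the user
$('#myForm').on('submit', function(e) {
    e.preventDefault();
    var data = $(this).serialize();
    $.ajax({
        type: 'POST',
        url: $(this).attr('action'),
        data: data,
        error: function() {
            // could not reach the server: save locally and inform the user
            localStorage.setItem('pendingForm', data);
            alert('The server is not available right now; your data has been saved locally.');
        }
    });
});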
if (navigator.onLine) {
    alert('online');
} else {
    alert('offline');
}
If I use Ajax to send a request and this request takes a long time, what should I do if I want to send another request?
The current behaviour is that the second request waits until the first request gets its response.
NOTE:
I want this behaviour across the whole application (any new request should execute immediately, not wait for the old one to finish first).
My application uses Ajax + PHP + jQuery + Symfony.
Assume this is the first request, which takes a long time:
$.ajax({
    type: "GET",
    url: url1,
    success: function(html) {
        // do some thing
    }
});
At any time, I want this request to execute and terminate the first one:
$.ajax({
    type: "POST",
    url: url,
    success: function(html) {
        // do some thing else
    }
});
var xhrReq;
xhrReq = $.ajax(...);
// then if you want to stop the request and exit, use:
xhrReq.abort();
It's sort of a manual process, but you can keep a global xhr object and test it on each request. If the readyState shows a request still in progress, abort it:
var xhr;
var loadUrl = function(url) {
    if (xhr && xhr.readyState > 0 && xhr.readyState < 4) {
        // there is a request in the pipe, abort it
        xhr.abort();
    }
    xhr = $.get(url, function() {
        console.log('success', this);
    });
};
loadUrl('/ajax/');
The XMLHttpRequest object has an abort function. You can use setTimeout to abort a request that is taking too long.
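A sketch of the timer approach (the URL and the 10 second limit are only examples):
var req = new XMLHttpRequest();
req.open("GET", "/some/slow/endpoint", true);
req.onload = function() {
    // handle the response
};
req.send();

// abort the request if it hasn't finished within 10 seconds
var abortTimer = setTimeout(function() {
    if (req.readyState !== 4) {
        req.abort();
    }
}, 10000);

// cancel the timer once the request has finished (load, error or abort)
req.onloadend = function() {
    clearTimeout(abortTimer);
};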
EDIT: In case you do not want to use a timer, and a new event occurs that should abort the prior request, the event handler should do the following:
if (!this.request) return; // this.request contains the XMLHttpRequest
this.request.onreadystatechange = function() {};
if (this.request.readyState != 4) {
    this.request.abort();
}
Then after that you can create the new XMLHttpRequest object.
I have been working on this in many ways and I feel I have found a working solution. I had a caching process that was causing a page to hang until it was done (5 seconds on average). Yes, this is better suited to a cron job, but I needed to create the caching process for the user without knowing the environment they are using for my CMS.
What I had done:
Create the call within a variable and then remove it with a hard delete. Deleting it seems to remove the wait. This "hack" seemed to cut the wait from a 5 second average down to about 325ms.
var ignore = $.ajax({
    url: "something/here.php",
    type: "GET",
    success: function() {}
});
delete ignore;
Defining the ajax request variable:
var xhr;
Making the ajax call:
xhr = $.ajax(...);
Aborting the ajax call:
xhr.abort();
Browsers allow only a limited number of simultaneous requests to the same host (2 or 3 as I remember, depending on the browser).
A workaround for the request limit is to create alias domains - like img1.domain.com, img2.domain.com, etc. - that point to the same host, and use them randomly in your requests. Then you can make as many requests as you need. The number of domains should be chosen based on the number of requests so that you stay within the limit of roughly 2 requests per domain; otherwise the 3rd request will wait until one of the active ones finishes.
This lets you receive responses from all of your requests.
For example, Google uses this technique to make images load faster.
EDIT:
Example: you have http://yourhost.com/ and an alias http://alias.yourhost.com that points to the same place.
Then:
$.ajax({
    type: "GET",
    url: 'http://yourhost.com/somescript.php',
    success: function(html) {
        // do some thing
    }
});
and then
$.ajax({
    type: "POST",
    url: 'http://alias.yourhost.com/somescript2.php',
    success: function(html) {
        // do some thing else
    }
});
I am using jQuery Mobile to create a webapp to look at and update a CRM type system.
The mobile app sends updates using jQuery.get and jQuery.post, and they work fine when a network connection is available.
How should I code this, or what can I use, to queue the jQuery.get and jQuery.post calls when the network connection is not available, so that they are sent when it becomes available again?
Edit: ah poo, I just noticed you said 'jQuery Mobile'; I initially read that as jQuery for mobile. Umm, this will probably only work as long as jQM supports ajax the same way normal jQuery does.
I had an idea involving a secondary ajax request, but you shouldn't need that. Just set up your AJAX like this and give it a timeout. If the server takes more than 4 seconds to respond (that should be enough for a broadband connection, but some phones may need ~10-15), it will retry the ajax request, up to retryLimit times. retryLimit can be set, and changed later once the 50 tries are used up (e.g. should it send when the program is idle and has no data, perhaps?). When it connects, it goes to the success function, which then sends the saved data to the server.
So it'd be like:
$.ajax({
    type: 'GET',
    // note: add your url (and any data) to this settings object
    timeout: 4000,
    tryCount: 0,
    retryLimit: 50,
    success: function(data) {
        sendSavedData();
    },
    error: function(xhr, textStatus, errorThrown) {
        if (textStatus == 'timeout') {
            this.tryCount++;
            if (this.tryCount <= this.retryLimit) {
                // retry with the same settings
                $.ajax(this);
                return;
            }
            var check = confirm('We have tried ' + this.retryLimit + ' times to do this and the server has not responded. Do you want to try again?');
            if (check) {
                this.timeout = 200000;
                $.ajax(this);
                return;
            } else {
                return;
            }
        }
    }
});
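The success handler above assumes a sendSavedData() function that flushes whatever was queued while offline; a minimal sketch of that might be (the savedRequests structure is hypothetical):
// hypothetical queue of {url, data} objects stored while the network was down
var savedRequests = [];

function sendSavedData() {
    while (savedRequests.length > 0) {
        var req = savedRequests.shift();
        $.post(req.url, req.data);
    }
}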