If I use Ajax to send a request and that request takes a long time, what should I do if I want to send another request in the meantime?
The current behaviour is that the second request waits until the first request gets its response.
NOTE:
I want this behaviour across the whole application (any new request should execute immediately, not wait for the old one to finish first).
My application uses Ajax + PHP + jQuery + Symfony.
Assume this is the first request, the one that takes a long time:
$.ajax
({
    type: "GET",
    url: url1,
    success: function (html)
    {
        // do something
    }
});
At any time, I want this second request to execute and terminate the first one:
$.ajax
({
    type: "POST",
    url: url,
    success: function (html)
    {
        // do something else
    }
});
var xhrReq;
xhrReq = $.ajax(...);
// then if you want to stop the request and exit, use:
xhrReq.abort();
It's sort of a manual process, but you can add a global xhr object and test it on each request. If the readyState shows a request still in flight (greater than 0 and less than 4), abort it:
var xhr;
var loadUrl = function(url) {
    if ( xhr && xhr.readyState > 0 && xhr.readyState < 4 ) {
        // there is a request in the pipe, abort
        xhr.abort();
    }
    xhr = $.get(url, function() {
        console.log('success', this);
    });
};
loadUrl('/ajax/');
The XMLHttpRequest object has an abort function. You can use setTimeout to abort a request that is taking too long.
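For instance, a minimal sketch of the timer approach (the url variable and the 5-second cutoff are placeholders):

var xhr = new XMLHttpRequest();
xhr.open('GET', url);
xhr.onload = function () {
    // handle the response
};
xhr.send();

// abort if no response has arrived within 5 seconds
var timer = setTimeout(function () {
    if (xhr.readyState != 4) {
        xhr.abort();
    }
}, 5000);

// cancel the pending abort once the request settles (load, error or abort)
xhr.onloadend = function () {
    clearTimeout(timer);
};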
EDIT: In case you do not want to use a timer, and a new event occurs that should abort the prior request, the event handler should do the following:
if(!this.request) return; // request contains the XMLHttpRequest
this.request.onreadystatechange = function() {};
if(this.request.readyState != 4) {
    this.request.abort();
}
Then after that you can create the new XMLHttpRequest object.
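Putting both steps together, a hypothetical sketch (the loader object and its request property are made up for illustration):

var loader = {
    request: null,
    load: function (url) {
        // abort the prior request if one is still in flight
        if (this.request && this.request.readyState != 4) {
            this.request.onreadystatechange = function() {};
            this.request.abort();
        }
        // then create the new XMLHttpRequest object
        this.request = new XMLHttpRequest();
        this.request.open('GET', url);
        this.request.onreadystatechange = function() {
            // handle the response
        };
        this.request.send();
    }
};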
I have been working on this many ways, and I feel I found a working solution. I had a caching process that was causing a page to hang until it was done (5 seconds on average). Yes, this is better suited to a CRON job, but I needed to create a caching process for the user without knowing the environment they are using for my CMS.
What I had done:
Create the call within a variable and then remove it with a hard delete. Deleting it seems to remove the wait. This "hack" seemed to pull the wait from a 5 second average down to about 325 ms.
var ignore = $.ajax({
    url: "something/here.php",
    type: "GET",
    success: function(){}
});
delete ignore;
Defining the ajax request variable:
var xhr;
Making the ajax call:
xhr = $.ajax(...);
Aborting the ajax call:
xhr.abort();
Browsers only allow a limited number of simultaneous requests to the same host (2 or 3, as I remember, depending on the browser).
A workaround for the request limit is to create alias domains - like img1.domain.com, img2.domain.com, etc. - that point to the same host, and use them randomly in requests. Then you can make as many requests as you need. The number of domains should be chosen based on the request volume, to stay within the bound of 2 requests per domain; otherwise a 3rd request will wait until one of the active ones finishes.
This lets you receive responses from all your requests.
For example, Google uses it to make images load faster.
EDIT:
Example: you have http://yourhost.com/ and an alias http://alias.yourhost.com that points to the same place.
Then:
$.ajax
({
    type: "GET",
    url: 'http://yourhost.com/somescript.php',
    success: function (html)
    {
        // do something
    }
});
and then
$.ajax
({
    type: "POST",
    url: 'http://alias.yourhost.com/somescript2.php',
    success: function (html)
    {
        // do something else
    }
});
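A hypothetical helper that spreads requests across the aliases at random (the alias list is made up for illustration):

var hosts = ['http://yourhost.com', 'http://alias.yourhost.com'];

function randomHost() {
    // pick one of the alias domains at random
    return hosts[Math.floor(Math.random() * hosts.length)];
}

$.ajax
({
    type: "GET",
    url: randomHost() + '/somescript.php',
    success: function (html)
    {
        // do something
    }
});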
I have a problem with ajax requests that have a fixed timeout. I am just using this simple code:
$.ajax({
    url: "test.html",
    error: function(){
        // do something
    },
    success: function(){
        // do something
    },
    timeout: 7000
});
In my scenario it is possible that around 20 ajax requests are fired at once, each with a timeout of 7 seconds. As you may know, each browser has a different connection limit. And here is the problem: some of the ajax requests are queued due to the connection limit, but the timeout countdown begins immediately, so some requests have already timed out before they even reach the server. Is there a way to have the timeout only count down once the request is actually transmitted to the server?
A quick and dirty "replacement" for $.ajax - it only covers what you use in the example, but it should work as a drop-in replacement:
function betterAjax(options) {
    var xhr = new XMLHttpRequest();
    var setListener = function(which, fn) {
        if (typeof fn == 'function') {
            // jQuery's "success" corresponds to the native "load" event
            if (which == 'success') {
                which = 'load';
            }
            xhr.addEventListener(which, fn);
        }
    };
    xhr.open(options.method || 'GET', options.url);
    setListener('success', options.success);
    setListener('error', options.error);
    if (options.timeout) {
        // the native timeout only starts counting once send() is called
        xhr.timeout = options.timeout;
        // fire the error handler on timeout too, roughly matching jQuery
        setListener('timeout', options.error);
    }
    xhr.send();
}
This is a simplified version of something I wrote some years back - basically it's https://jsfiddle.net/jaromanda/wpkeet34/
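Assuming the sketch above, it would be called much like $.ajax in the question:

betterAjax({
    method: 'GET',
    url: 'test.html',
    timeout: 7000,
    success: function () {
        // do something
    },
    error: function () {
        // do something (also fires on timeout in this sketch)
    }
});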
I'm curious what your solutions are for elegantly dealing with blocking JS/ajax calls that take too long reaching out to third-party sites for data.
Lately, I've been contending with some scripts/ajax requests where the server is either down or not responding, and they literally block my page. They are supposed to be async.
So, I want to abort the call after x time.
var request = $.ajax({
    type: 'POST',
    url: 'someurl',
    success: function(result){}
});
then use: request.abort() if it takes too long.
But I am thinking I could use a deferred/promise here, with a timeout, and call abort if my promise doesn't come back in, say, 1000 ms.
Your thoughts?
My bad for not referring to the timeout attribute of the ajax request. I didn't want to wrap abort() in a setTimeout, but the jQuery ajax API with its timeout option is exactly what I need. I should have seen this. Thanks, all.
Check the timeout option: http://api.jquery.com/jQuery.ajax/
var request = $.ajax({
    type: 'POST',
    url: 'someurl',
    timeout: 2000,
    success: function(result){},
    error: function(xhr, status, message) {
        if(status == "timeout") {
            alert("Request timed out");
        }
    }
});
One way is to use setTimeout:
var abort_req = setTimeout(function () {
    request.abort();
}, 3000); // time in ms
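Worth adding to that: clear the timer once the request settles, so a finished request is not pointlessly aborted. Here, request is assumed to be the jqXHR from the question:

request.always(function () {
    // the request settled (success or error), cancel the pending abort
    clearTimeout(abort_req);
});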
Possible duplicate of jQuery $.ajax timeout setting. jQuery AJAX has an optional timeout option given in milliseconds. You can run logic when the AJAX request times out by passing a callback function to the error option, like so:
error: function(x, t, m){
    // process error here
}
There are many questions out there about JSONP requests and the fact that they cannot be made synchronously. Most workarounds involve using callbacks, or the success function of an ajax request, to do what you want, but I don't think that's going to work for me.
Background: I'm working on a search application using Solr. I'm developing a javascript API for others to use to interact with Solr so they don't need to understand the ins and outs of a Solr search request.
In my api, I have a Request object, with a function called doRequest. The purpose of the function is to perform the call to the solr server (on another domain, thus the need for JSONP), and return a Response object.
Request.prototype.doRequest = function (){
    var res = new Response();
    $.ajax({
        url: this.baseURL,
        data: 'q=*:*&start=0&rows=10&indent=on&wt=json',
        success: function(data){
            res.response = data.response;
            res.responseHeader = data.responseHeader;
            /*
            other...
            stuff...
            */
        },
        dataType: 'jsonp',
        jsonp: 'json.wrf'
    });
    res.request = this;
    return res;
};
The "user" would use this function like so...
var req = new Request();
var res = req.doRequest();
and then do something or other with the results in res.
Given that I cannot do a synchronous JSONP request, and I cannot return from within the ajax function, I can't figure out any way to make sure res is fully populated for the user before they start using it.
Thanks,
It looks like using callback functions is the way to do this. Here's how I ended up doing it in case you're interested.
Request.prototype.doRequest = function (callback){
    var self = this; // "this" inside the jQuery callback would not be the Request
    var res = new Response();
    $.ajax({
        url: this.baseURL,
        data: 'q=*:*&start=0&rows=10&indent=on&wt=json',
        success: function(data){
            res.response = data.response;
            res.responseHeader = data.responseHeader;
            res.request = self;
            /*
            other...
            stuff...
            */
            callback(res);
        },
        dataType: 'jsonp',
        jsonp: 'json.wrf'
    });
};
And now the user uses the function like so:
var req = new Request();
var res = req.doRequest(parseResults);
Where parseResults is a callback function, defined by the user, that takes the response object as a parameter:
function parseResults(res){
    // doing work with the response
}
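Alternatively, since $.ajax returns a jqXHR that behaves like a promise, a sketch of the same idea without a custom callback parameter (a variation, not the code above):

Request.prototype.doRequest = function () {
    var self = this;
    return $.ajax({
        url: this.baseURL,
        data: 'q=*:*&start=0&rows=10&indent=on&wt=json',
        dataType: 'jsonp',
        jsonp: 'json.wrf'
    }).then(function (data) {
        // build the Response only once the data has arrived
        var res = new Response();
        res.response = data.response;
        res.responseHeader = data.responseHeader;
        res.request = self;
        return res;
    });
};

// usage:
new Request().doRequest().then(parseResults);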
Whether or not the rest of your program is synchronous, there are times when you need to touch a REST interface synchronously. An example is where you have a request that gets an array of asset IDs and you then hydrate each of the IDs to get the full data/asset. If you've used the Ooyala API, you know what I mean.
I created what seems to be the first synchronous JSONP module, which allows each request to run only once its previous request has finished. It uses recursion instead of Promises, which means the next request will not be sent until the previous one has succeeded. The API lives here: https://github.com/cScarlson/jsonpsync
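For reference, a minimal sketch of the recursive-chaining idea (the function below is made up for illustration and is not the module's actual API):

function fetchSequentially(urls, onDone, results) {
    results = results || [];
    if (urls.length == 0) {
        onDone(results);
        return;
    }
    $.ajax({
        url: urls[0],
        dataType: 'jsonp',
        success: function (data) {
            results.push(data);
            // recurse only once the previous request has finished
            fetchSequentially(urls.slice(1), onDone, results);
        }
    });
}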
Hope this helps.
Here are two implementations that illustrate the issue.
The first one is the method that works really slowly. It tries to get data from the server, but the request stays pending for too long; only after that does it return the data, and everything is fine (except for the terrible synchronous performance).
asyncMethod: function(doSmth, param) {
    var resp = $.ajax({
        type: 'GET',
        async: false,
        url: 'url'
    });
    var data = resp.responseText;
    doSmth(param, data);
}
Here is the same method, but asynchronous. The performance problem is eliminated here, but the code in success only executes when the page is reloaded. Probably the reload stops whatever executions were the bottleneck in the previous code sample.
asyncMethod: function(doSmth, param) {
    var resp = $.ajax({
        type: 'GET',
        url: 'url',
        success: function () {
            var data = resp.responseText;
            doSmth(param, data);
        }
    });
}
I don't need to use an asynchronous request if the synchronous one works fast (but right now it doesn't). There seem to be some executions that keep the request pending for too long, and I don't see which execution might be the bottleneck. Maybe it's somewhere in the libraries being used, but no other requests are active while resp is being processed.
What are the ways to fix the problem, or to analyze it? Any advice would be appreciated.
There are two main culprits when a response sits on "pending" for too long:
The application code that is fulfilling the ajax request is taking longer than expected.
Simple network latency (not much can be done about that in the application layer).
If you have access to the code that is fulfilling the request, I'd start there. Also, it's probably not a network issue if this request takes an unusually long time compared to all your other requests.
Have you tried the async method like this:
asyncMethod: function(doSmth, param) {
    $.ajax({
        type: 'GET',
        url: 'url',
        success: function (data, status, xhr) {
            // responseText lives on the jqXHR (third argument), not on the parsed data
            doSmth(param, xhr.responseText);
        }
    });
}
We have two domains (xyz.com and zxy.com) and use a jQuery ajax call to get data from one of them.
If the first one fails, how do I check for that and fall back to the other domain using javascript?
Could you not just use window.location.host to find out which domain the user is currently on?
That way you can make the correct ajax call from the start. There is no reason to make a request to the wrong domain, wait for it to fail, and then make another request.
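For example (the domain mapping and the /data path below are hypothetical):

// pick the API base that matches the domain the page was served from
var base = window.location.host === 'xyz.com'
    ? 'http://xyz.com'
    : 'http://zxy.com';

$.ajax({
    url: base + '/data',
    timeout: 5000
});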
If you do, however, wish to actually make an AJAX request, wait for it to fail, and then make a different request, it would look as follows (this answer is deliberately verbose):
var firstRequest = $.ajax({
    url: 'http://xyz.com',
    timeout: 5000
});

firstRequest.done(function (data) {
    // use the data from the first domain
});

firstRequest.fail(function () {
    var anotherRequest = $.ajax({
        url: 'http://zxy.com',
        timeout: 5000
    });

    anotherRequest.done(function (data) {
        // use the data from the fallback domain
    });

    anotherRequest.fail(function () {
        // both domains failed
    });
});