I have a list of URIs and while iterating over that list I send OData GET requests via OData.read().
My problem is that those calls are asynchronous and I want them to be synchronous.
Is there any way to accomplish that?
As far as I know the given parameters of OData.read() won't allow this. But maybe there is some kind of work-around?
I solved the problem with a recursive function that triggers the next Ajax request only after the response to the previous one has been received.
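A minimal sketch of that recursive approach, assuming the datajs-style signature OData.read(uri, success, error); handleData and listOfUris are placeholder names:

function readSequentially(uris, index) {
    if (index >= uris.length) {
        return; // all requests finished
    }
    OData.read(uris[index], function (data) {
        handleData(data);                  // placeholder handler for this response
        readSequentially(uris, index + 1); // fire the next request only now
    }, function (error) {
        console.error(error.message);
    });
}

readSequentially(listOfUris, 0);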
Related
I'm writing a script that uses Ajax. The script will call an API, and then use that data to call the API again, and then based on that a final request to the API a third time.
Currently the Ajax requests are chained, so if response status is 200, it will perform the other Ajax request and if that one is 200 it will do another. So basically nested requests.
They are asynchronous requests. Is this the correct way to do this? I can't help but think it's a little messy, and wrong.
With Ajax requests, chaining them with callbacks is the right way; it's the best way to make sure the second call starts only after the first one has finished successfully.
asyncCall1(function () {
    asyncCall2(function () {
        asyncCall3();
    });
});
On the JavaScript side, I would say it's the correct way.
But on the API side, instead of requiring multiple requests, your API could (or should) be able to respond with the end result (or merged results) on the first request, since the follow-up requests are based only on data retrieved by the previous ones.
How can I make several requests to a server all at the same time, while preventing the responses from getting mixed up?
Each of the ajax requests is made separately and you should set them up so that they go to different handlers when the ajax request is finished. The handlers may not be called in the same order since each may take longer than another.
If your code requires that the results come back in a particular order, you should create a single call that returns all the values you need, or queue the responses until they have all arrived and then process them in order.
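If you are using jQuery, one way to sketch this is $.when, which fires the requests in parallel but hands the responses back in a fixed argument order regardless of which one finishes first (the URLs are placeholders):

$.when(
    $.get('/data/first'),
    $.get('/data/second'),
    $.get('/data/third')
).done(function (first, second, third) {
    // each argument is an array: [data, statusText, jqXHR]
    console.log(first[0], second[0], third[0]);
});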
Hi, I am making multiple Ajax calls at random. How can I check that all the Ajax calls have completed and the values have been loaded into the combo box and the other boxes? Please give a solution other than checking the Ajax status. Is there any JavaScript event which triggers when all elements have loaded? I tried Prototype's document.observe("dom:loaded", function() {...}), but it doesn't work for Ajax calls.
How can I check that all Ajax calls have completed and the values have been loaded into the combo box and the other boxes? Please give a solution other than checking the Ajax status.
Why? What's wrong with using the AJAX request status, which is the canonical way to determine the status of the request (and thus success or failure)?
There might be a legitimate reason for this restriction (though at first glance it appears not), but if so then it's because you're doing something unusual, such as making requests that you expect to "fail". If this is the case, then you'd need to make clear exactly what the constraints are anyway.
Failing that, just check the status and ensure that the remote server is returning the right status for requests (if it's under your control).
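If the calls are made with jQuery, one sketch is to track outstanding requests with a counter, or to use the global ajaxStop handler, which fires once all jQuery Ajax requests have finished (loadBox, allRequestsFinished and the URL are placeholders):

var pending = 0;

function loadBox(url, $select) {
    pending++;
    $.get(url, function (data) {
        $select.html(data);          // fill the combo box / list box
        if (--pending === 0) {
            allRequestsFinished();   // hypothetical "everything is loaded" hook
        }
    });
}

// Alternatively, jQuery's global handler fires when no Ajax requests remain:
$(document).ajaxStop(function () {
    console.log('All jQuery Ajax calls have completed');
});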
I have a page with many actions on it, each triggered with $.get, but I want to run them one at a time rather than all at once, which causes a lot of load time. So what is the solution for me here?
Before you answer: I don't want to trigger the calls on a timer; I want to trigger each one only after the previous Ajax request has completely finished, and then continue the loop with the next Ajax call.
Do you want to execute synchronous requests? If so, you need to use jQuery's ajax method instead of get, setting async: false.
Check http://api.jquery.com/jQuery.ajax/
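For illustration only (see the caveat in the edit below), a synchronous call would look roughly like this; the URL is a placeholder:

$.ajax({
    url: 'ajax/first-call.html',
    async: false,                 // blocks here until the response arrives
    success: function (data) {
        console.log('Received:', data);
    }
});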
EDIT
As one commenter rightly pointed out, making synchronous requests hangs the single thread that JavaScript code has. That means no animations or other code can run while you wait for the requests to finish.
An interesting option would be to "chain" your requests, executing the next request in the previous one's callback, like this:
$.get('ajax/first-call.html', function (data) {
    $.get('ajax/second-call.html', function (data) {
        // etc.
    });
});
You can setup your own custom queue in jQuery.
http://api.jquery.com/queue/
Populate your queue with all the functions you want to execute.
Each function is a single call to $.get().
In the callback for each $.get function, call the dequeue() function to start up the next ajax call.
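A rough sketch of that queue pattern, using an empty jQuery object purely as a queue holder (the queue name and URLs are placeholders; named queues other than 'fx' do not start automatically, hence the final dequeue call):

var $queueHolder = $({});

$.each(['ajax/a.html', 'ajax/b.html', 'ajax/c.html'], function (i, url) {
    $queueHolder.queue('ajaxQueue', function () {
        $.get(url, function (data) {
            console.log('Loaded', url);
            $queueHolder.dequeue('ajaxQueue'); // start the next queued call
        });
    });
});

$queueHolder.dequeue('ajaxQueue'); // kick off the first request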
It sounds like you want an ajax queue.
I've used this plugin before, and it's pretty simple.
Most browsers will only make about four HTTP requests at once to the same domain; the others are queued up and executed in turn. So the browser already implements some queuing of these requests.
Is it possible to kill a previous ajax request?
We have tabular data with rows very close to each other. On the mouseover event of each row we make a request to our server using the jQuery Ajax object and show the result in a popup.
But when we move the mouse quickly across other rows, the previous Ajax responses are displayed inside the popup before the response that is actually meant for the row under the cursor.
I need the previous request/response to be killed whenever a new Ajax request is generated, so that the latest, expected response is always the one shown in the popup.
I am using jQuery, PHP and MySQL to serve the request.
Could you create a custom JavaScript sync object which is shared by the function making the subsequent Ajax calls?
Assign a sequentially generated id as a parameter on each outgoing request, and include the same id in the response. Every time you fire a request, assign a new id (incremented by 1, or whatever logic you prefer) and store it in the shared object. If the id in a response does not match the one currently in the shared object, ignore the response; otherwise render it.
This would cleanly solve the race condition. I am not sure myself whether there is a way to kill the request prematurely, but it would at least avoid the rendering problem you face now.
Another option would be not to initiate another request until the first is completed.
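A sketch of the sequence-id idea, simplified to track the id on the client side only (showPopup and the URL are placeholders):

var latestRequestId = 0;

function loadPopup(rowId) {
    var requestId = ++latestRequestId;   // new id for this request
    $.get('row-details.php', { row: rowId, reqId: requestId }, function (data) {
        if (requestId !== latestRequestId) {
            return;                      // a newer request was fired; ignore this response
        }
        showPopup(data);                 // hypothetical popup renderer
    });
}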
Yes and no. That is the point of Ajax: to be able to do something asynchronously. What you want to do is abort a request, which works against the idea of asynchrony. Perhaps what you can do is this: whenever you send another request, set a value somewhere indicating the number of outstanding requests, and then in the callbacks check whether that count is higher than 1; if so, ignore the response.
Check this AJAX Manager plugin. The underlying XMLHttpRequest has an abort() function, but jQuery doesn't provide a dedicated wrapper for it.
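A minimal sketch of that approach: the object returned by $.get()/$.ajax() exposes an abort() method, so keeping a reference to the latest request lets you cancel it before starting a new one (the selector and URL are placeholders):

var previousXhr = null;

$('td.cell').mouseover(function () {
    if (previousXhr) {
        previousXhr.abort();             // kill the earlier in-flight request
    }
    previousXhr = $.get('details.php', { id: this.id }, function (data) {
        $('#popup').html(data).show();   // render only the latest response
    });
});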