jQuery: calling a function after an .each loop with ajax calls - javascript

I've been searching online for hours and I know that I probably have to do something with the deferred objects, but I can't get it done.
Firstly, here is my code:
$('#uploadButton').on('click', function () {
    //some preparation stuff (deleted)
    var panels = boxesContainer.find('.panel');
    var ajaxes = [];
    $.each(panels, function (index, panel) {
        //more preparation + declaration of variables
        function getIdAndPrepareData() {
            $.when(
                $.ajax({
                    type: 'POST',
                    url: 'url',
                    data: {
                        'title': name
                    }
                })
            ).done(function (result, textStatus, jqXHR) {
                if (result.success) {
                    var id = result.id;
                } else {
                    getIdAndPrepareData();
                }
                console.log('Created Id: ' + id);
                $.each(panelForms, function (index, form) {
                    var filesInput = //again, prep and vars
                    $.each($(filesInput)[0].files, function (index, file) {
                        var formData = new FormData();
                        formData.append('file', file);
                        //more stuff
                        ajaxes.push({
                            'formData': formData,
                            'file': file
                        });
                    });
                });
            }).fail(function () {
                getIdAndPrepareData();
            });
        }
        getIdAndPrepareData();
    });
    $.ajax().promise().done(function () {
        console.log('bla bla bla');
    });
});
So, basically, I am looping through certain DIVs (.panel) and creating a new database entity via ajax for each of them. Then I want to send the prepared data to the server via ajax after all the loops complete. I need to send this data only after all the loops with ajax have finished, because the next ajax calls (which I plan to make after iterating through the .panel DIVs and preparing the data) are going to create further entities that will be related to the entities created for the .panel DIVs (I push all this data into the ajaxes array and plan to use it later on).
I am using jQuery's deferred objects inside the panels loop in order to get the newly created ID of each panel and hold it in the ajaxes array. But I do not know how to execute any code after the panels loop has finished.
I tried to make a promise (I am quite new to this technique) for all ajax calls at the end ($.ajax().promise().done), but it doesn't seem to work. Sometimes the console.log in the promise fires at the end, sometimes at the beginning.
I am not an expert in jQuery and JS, so I would like some explanation of how to work with asynchronous ajax calls inside loops, and what I should do in this situation. I want to execute some code at the end, after all the data is prepared.
Thank you.

You will get answers that use arrays of promises and evaluate them against $.when etc., but there is a handy shortcut: you can chain $.when calls with only a slight overhead.
Pseudo code below:
var promise; // undefined is treated as a resolved promise by $.when
for (items in a loop){
    promise = $.when(promise, $.ajax({...}));
}
promise.done(function(){
    // All done
});
Notes:
$.ajax returns a promise. That promise is to call you back on completion with the data or an error.
$.when calls you back when a number of promises have completed (or when any fails)
If you call $.when with an existing promise and a new promise you get back a third promise that will complete when both are done. These can be chained together in sequence.
The downside of this shortcut is that the final data values passed to done are more complex than expected with normal evaluation of an array of promises against done.
I use this technique, in preference to arrays of promises, when I just need to know overall completion and not all the individual data/results. It makes for far simpler code and the overhead of the extra promises is minimal. It also works great with sets of animations.
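Applied to the question's panels loop, a minimal sketch of this might look like the following (createPanelEntity is a hypothetical helper standing in for whatever per-panel ajax work getIdAndPrepareData does; the only requirement is that it returns the jqXHR/promise for that panel):
var allDone; // undefined is treated as an already-resolved promise by $.when
panels.each(function (index, panel) {
    allDone = $.when(allDone, createPanelEntity(panel)); // chain each panel's request onto the running promise
});
$.when(allDone).done(function () {
    // every per-panel request has finished; safe to send the follow-up requests built up in `ajaxes`
});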

Related

Queuing/throttling jQuery ajax requests

I need to fire a number of ajax requests at a server and then run a callback when they are finished. Normally this would be easy using jQuery's deferred.done(). However, to avoid overwhelming the server, I'm queuing the requests and firing one every X milliseconds.
e.g
var promisesList = [];
var addToQueue = function(workflow) {
    workflowQueue.push(workflow);
}
var startWorkflow = function(workflow) {
    return $.ajax($endointURL, {
        type: "POST",
        data: {
            action: workflow.id
        },
        success: function() {
        },
        error: function(jqXHR, textStatus, errorThrown) {
        }
    });
};
var startWorkflows = function() {
    var promisesList = [];
    if (workflowQueue.length > 0) {
        var workflow = workflowQueue.shift();
        promisesList.push(startWorkflow(workflow));
        setTimeout(startWorkflows, delay);
    }
};
startWorkflows();
$.when(promisesList).done(function(){
    //do stuff
});
The problem with this is that the promisesList array is initially empty, so the done() callback fires immediately, and only then do the ajax requests start getting sent by the setTimeout(). Is there an easy way to create the ajax requests initially and kind of "pause" them, then fire them using setTimeout()?
I've found various throttle/queue implementations to fire ajax requests sequentially, but I'm happy for them to be fired in parallel, just with a delay on them.
The first thing you're stumbling upon is that when() doesn't work this way with arrays. It accepts an arbitrary list of promises, so you can get around that by applying the array:
$.when.apply(null, promiseList).done(function(){
    // Do something
    // use the `arguments` magic property to get an ordered list of results
});
Secondly, the throttling method can be done with an $.ajax param of {delay:timeInSeconds}, but I've proposed a solution that sets up a new deferred which is immediately returned (to keep the order) but resolved after a timeout.
See http://jsfiddle.net/9Acb2/1/ for an interactive example
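A rough sketch of that second idea, reusing the question's names (startWorkflow, workflowQueue and delay are assumed from above; startWorkflowsThrottled is made up for illustration): each item gets a Deferred immediately, the real request is started after a staggered timeout, and the Deferred is settled with the request's outcome.
function startWorkflowsThrottled(workflowQueue, delay) {
    return $.map(workflowQueue, function (workflow, i) {
        var deferred = $.Deferred();
        setTimeout(function () {
            startWorkflow(workflow)      // returns the jqXHR promise
                .done(deferred.resolve)
                .fail(deferred.reject);
        }, i * delay);                   // stagger each request by `delay` ms
        return deferred.promise();
    });
}
$.when.apply(null, startWorkflowsThrottled(workflowQueue, 500)).done(function () {
    // all throttled requests have completed
});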

Javascript function for N ajax calls

I frequently need to load a function on a webpage after between two and five modest-sized data files have loaded. Let's say a maximum of 3MB of data split over a maximum of five files.
I try to optimize the load time by making all the AJAX calls at the same time and running an initialize() function after they have all loaded, like so:
var data1, data2;
$(document).ajaxStop(function() {
    $(this).unbind("ajaxStop"); //prevent running again when other calls finish
    initialize();
});
$.ajax({
    url: url_to_data_1,
    success: function (d) {
        data1 = d;
    }
});
$.ajax({
    url: url_to_data_2,
    success: function (d) {
        data2 = d;
    }
});
function initialize() { /* do something with data1 and data2 */ }
But I'm tired of pasting this in every time, so I want a function like this:
function multi_Ajax(urls, callback) {
    var data = {};
    //call init() after both the list of prayers and the word concordance index load
    $(document).ajaxStop(function() {
        $(this).unbind("ajaxStop"); //prevent running again when other calls finish
        callback(data);
    });
    for (var c = 0; c < urls.length; c += 1) {
        //data files
        $.ajax({
            url: urls[c],
            dataType: "json",
            success: function (d) {
                data[urls[c]] = d; console.log("Loaded " + urls[c]);
            }
        });
    }
}
This does not work, of course, since the ajax calls do not exist for ajaxStop to catch. But I do not understand how ajaxStop works well enough to get much further. Thank you!
I'm not sure why your second attempt wouldn't work. According to the documentation, whenever an ajax request completes, jQuery will check if there are any other requests still outstanding and fires ajaxStop if there are none. Calling $.ajax in a loop shouldn't be any different than hardcoding each call, as far as I can tell.
However, as Barmar suggested in the comments, $.when seems like a cleaner way to do what you want to do. Here's an example of how to pass an array (which you can populate in a loop) to $.when: Pass in an array of Deferreds to $.when()
Using $.when seems cleaner than $.ajaxStop because if someone later comes along and adds an unrelated ajax request somewhere before or after your loop, that would interfere with when ajaxStop triggers. $.when allows you to explicitly say which ajax requests you want to wait for.
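For example, a rough sketch of multi_Ajax built around $.when might look like this (an illustration, not the original code; it also sidesteps the loop-variable problem described below, because each request closes over its own url):
function multi_Ajax(urls, callback) {
    var data = {};
    var requests = $.map(urls, function (url) {
        return $.ajax({
            url: url,
            dataType: "json",
            success: function (d) {
                data[url] = d; // `url` is fixed per request
            }
        });
    });
    // fire the callback once every request in the array has resolved
    $.when.apply($, requests).done(function () {
        callback(data);
    });
}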
EDIT: Here's a fiddle showing ajaxStop working for multiple calls issued in a loop: http://jsfiddle.net/zYk5W/
It looks like the reason this wasn't working for you has nothing to do with .ajax or .ajaxStop; instead it's a scope issue in your success callback. Your success callback closes over c, but this is the same c that the outer (loop) scope uses. By the time any of the success callbacks runs, the for loop has completed and c has been incremented to urls.length. Thus every time your success callback runs, urls[c] is undefined. See the fiddle I linked or JavaScript closure inside loops – simple practical example for an example of how to give each success callback its own c in a scope separate from the loop's c.
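For reference, a minimal way to give each success callback its own copy of c in the original loop (keeping the ajaxStop approach and the rest of multi_Ajax unchanged) is to wrap the call in an immediately invoked function:
for (var c = 0; c < urls.length; c += 1) {
    (function (url) { // the IIFE captures the current urls[c] as `url`
        $.ajax({
            url: url,
            dataType: "json",
            success: function (d) {
                data[url] = d;
                console.log("Loaded " + url);
            }
        });
    })(urls[c]);
}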

dojo two xhr requests and callback function fired only when both are complete

Is there a way of handling multiple xhr requests like this? For a single xhr there's a built-in way, but what about multiple requests?
A dojo.xhr returns a dojo.Deferred; a Deferred is a wrapper for callbacks and the like. So the key function in your case would be dojo.xhr({parameters}).then(callbackFunction).
Whilst we often want to achieve parallel XHR with the easy shortcut of polls and counters, Dojo has also thought of this: as Craig points out, there is DeferredList.
You fire the XHR twice, and each call returns a handle; keep references to those. Create a DeferredList with those references collected in an array as its argument. Then, as the final touch, append a 'then' call to the list:
var dXhr1 = dojo.xhrGet({ url: ... });
var dXhr2 = dojo.xhrGet({ url: ... });
var dList = new dojo.DeferredList([dXhr1, dXhr2]);
dList.then(function(arrayOfValues) {
    var res = "Result: success? " + arrayOfValues[0][0].toString() + ": " + arrayOfValues[0][1] + ", " +
              "success? " + arrayOfValues[1][0].toString() + ": " + arrayOfValues[1][1];
    console.log(res);
});
or chained
new dojo.DeferredList([
    dojo.xhrGet({ url: ... }),
    dojo.xhrGet({ url: ... })
]).then(function(res) { console.log(res); });
A DeferredList will do what you need.
http://dojotoolkit.org/reference-guide/1.7/dojo/DeferredList.html
Couldn't you fire a callback that hits a function with an internal counter? When that counter reaches 2, then both callbacks have completed?

getJSON to string then loop through string

I have the following code which is included in a keypress function:
$.getJSON('dimensions.json', function(data) {
    $.each(data, function(index) {
        $('#div1').append(index);
    });
});
I'm trying to first get the JSON string, save it in a variable, and then run the each(). I basically want to separate the each() so it is not tied to the getJSON() function, because I don't want to fetch the JSON file on every keypress.
I've tried this, but it didn't work:
var JSONstr = $.getJSON('dimensions.json');
$.each(JSONstr, function(index) {
    $('#div1').append(index);
});
In your first example, you do the $.each in the callback. The callback is executed later, after the result is received, while $.getJSON returns immediately without waiting for the result (since there is no blocking in JavaScript by design).
Therefore the code in your second example can never work: the $.each begins before any result is received from the web server, probably even before the request is sent. Whatever the return value of $.getJSON is, it can't, by the design of JavaScript, be the result of the AJAX request.
UPD: Saw your comment, now I understand what you wanted to do. Here's a simple example of how to do this:
function ActualHandler(data) {
    $.each(data, function(index) {
        $('#div1').append(index);
    });
}

function KeypressHandler() {
    if (window.my_data) { // If we have the data saved, work with it
        ActualHandler(window.my_data);
    }
    else { // Otherwise, send the request, wait for the answer, then do something
        $.getJSON('dimensions.json', function(data) {
            window.my_data = data; // Save the data
            ActualHandler(data); // And *then* work on it
        });
    }
}
Here, the ActualHandler is not launched before the data is received, and once that happens, all subsequent clicks will be handled immediately.
The downside in this particular case is that if the user clicks again while the first request is running, one more request will be sent. But to fix that you would need to maintain some queue, which is kind of out of scope here.
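One way around that downside (a sketch, not part of the original answer) is to cache the jqXHR promise that $.getJSON returns instead of the parsed data, so every keypress attaches to the same single request:
var dimensionsPromise = null; // hypothetical cache for the one-and-only request

function KeypressHandler() {
    if (!dimensionsPromise) {
        dimensionsPromise = $.getJSON('dimensions.json'); // sent at most once
    }
    dimensionsPromise.done(function (data) {
        ActualHandler(data); // fires immediately if the data has already arrived
    });
}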
You fell into the asynchronous trap. Your $.each() function doesn't wait for your $.getJSON() call to get the data. You can get around this by using the good ol' $.ajax() function, like this:
function processJSON(data) {
    $.each(data, function(index) {
        $('#div1').append(index);
    });
}

$.ajax({
    url: 'dimensions.json',
    dataType: 'json',
    async: false,
    success: processJSON
});

How do you make javascript code execute *in order*

Okay, so I appreciate that Javascript is not C# or PHP, but I keep coming back to an issue in Javascript - not with JS itself but with my use of it.
I have a function:
function updateStatuses(){
    showLoader(); //show the 'loader.gif' in the UI
    updateStatus('cron1'); //performs an ajax request to get the status of something
    updateStatus('cron2');
    updateStatus('cron3');
    updateStatus('cronEmail');
    updateStatus('cronHourly');
    updateStatus('cronDaily');
    hideLoader(); //hide the 'loader.gif' in the UI
}
Thing is, owing to Javascript's burning desire to jump ahead in the code, the loader never appears because the 'hideLoader' function runs straight after.
How can I fix this? Or in other words, how can I make a javascript function execute in the order I write it on the page...
The problem occurs because AJAX is by its nature asynchronous. This means that the updateStatus() calls are indeed executed in order but return immediately, and the JS interpreter reaches hideLoader() before any data is retrieved from the AJAX requests.
You should perform the hideLoader() on an event where the AJAX calls are finished.
You need to think of JavaScript as event based rather than procedural if you're doing AJAX programming. You have to wait until the first call completes before executing the second. The way to do that is to bind the second call to a callback that fires when the first is finished. Without knowing more about the inner workings of your AJAX library (hopefully you're using a library) I can't tell you how to do this, but it will probably look something like this:
showLoader();
updateStatus('cron1', function() {
    updateStatus('cron2', function() {
        updateStatus('cron3', function() {
            updateStatus('cronEmail', function() {
                updateStatus('cronHourly', function() {
                    updateStatus('cronDaily', function() { hideLoader(); });
                });
            });
        });
    });
});
The idea is, updateStatus takes its normal argument, plus a callback function to execute when it's finished. It's a reasonably common pattern to pass a function to run onComplete into a function which provides such a hook.
Update
If you're using jQuery, you can read up on $.ajax() here: http://api.jquery.com/jQuery.ajax/
Your code probably looks something like this:
function updateStatus(arg) {
    // processing
    $.ajax({
        data : /* something */,
        url : /* something */
    });
    // processing
}
You can modify your functions to take a callback as their second parameter with something like this:
function updateStatus(arg, onComplete) {
    $.ajax({
        data : /* something */,
        url : /* something */,
        complete : onComplete // called when AJAX transaction finishes
    });
}
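Since $.ajax already returns a jqXHR promise, another option is to return that promise from updateStatus and let $.when decide when to call hideLoader. This is only a sketch (the /status URL is a made-up placeholder), and note that it runs the requests in parallel rather than strictly one after another, which is enough if the goal is just to show and hide the loader correctly:
function updateStatus(arg) {
    // return the jqXHR promise so callers can wait on it
    return $.ajax({
        url: '/status',
        data: { job: arg }
    });
}

showLoader();
$.when(
    updateStatus('cron1'),
    updateStatus('cron2'),
    updateStatus('cron3'),
    updateStatus('cronEmail'),
    updateStatus('cronHourly'),
    updateStatus('cronDaily')
).always(hideLoader); // hide the loader whether the calls succeed or fail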
I think all you need to do is have this in your code:
async: false,
So your Ajax call would look like this:
jQuery.ajax({
    type: "GET",
    url: "something.html for example",
    dataType: "html",
    async: false,
    context: document.body,
    success: function(response){
        //do stuff here
    },
    error: function() {
        alert("Sorry, The requested property could not be found.");
    }
});
Obviously some of this needs to change for XML, JSON etc., but async: false is the main point here: it tells the JS engine to wait until the success call has returned (or failed, depending) and then carry on.
Remember there is a downside to this, and that's that the entire page becomes unresponsive until the ajax returns!!! Usually that's within milliseconds, which is not a big deal, but it COULD take longer.
Hope this is the right answer and it helps you :)
We have something similar in one of our projects, and we solved it by using a counter: increase the counter for each call to updateStatus and decrease it in the AJAX request's response function (the details depend on which AJAX JavaScript library you're using).
Once the counter reaches zero, all AJAX requests are completed and you can call hideLoader().
Here's a sample:
var loadCounter = 0;
function updateStatuses(){
    updateStatus('cron1'); //performs an ajax request to get the status of something
    updateStatus('cron2');
    updateStatus('cron3');
    updateStatus('cronEmail');
    updateStatus('cronHourly');
    updateStatus('cronDaily');
}
function updateStatus(what) {
    loadCounter++;
    //perform your AJAX call and set the response method to updateStatusCompleted()
}
function updateStatusCompleted() {
    loadCounter--;
    if (loadCounter <= 0)
        hideLoader(); //hide the 'loader.gif' in the UI
}
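In jQuery, the body of updateStatus might look roughly like this (the /status URL is a made-up placeholder; complete fires whether the request succeeded or failed, so the counter always comes back down):
function updateStatus(what) {
    loadCounter++;
    $.ajax({
        url: '/status',
        data: { job: what },
        complete: updateStatusCompleted // runs on success and on error
    });
}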
This has nothing to do with the execution order of the code.
The reason that the loader image never shows is that the UI doesn't update while your function is running. If you make changes to the UI, they don't appear until you exit the function and return control to the browser.
You can use a timeout after setting the image, giving the browser a chance to update the UI before starting rest of the code:
function updateStatuses(){
    showLoader(); //show the 'loader.gif' in the UI
    // start a timeout that will start the rest of the code after the UI updates
    window.setTimeout(function(){
        updateStatus('cron1'); //performs an ajax request to get the status of something
        updateStatus('cron2');
        updateStatus('cron3');
        updateStatus('cronEmail');
        updateStatus('cronHourly');
        updateStatus('cronDaily');
        hideLoader(); //hide the 'loader.gif' in the UI
    },0);
}
There is another factor that can also make your code appear to execute out of order. If your AJAX requests are asynchronous, the function won't wait for the responses. The function that takes care of a response runs when the browser receives that response. If you want to hide the loader image after the responses have been received, you would have to do that when the last response handler function runs. As the responses don't have to arrive in the order you sent the requests, you would need to count how many responses you have received to know when the last one comes in.
As others have pointed out, you don't want to do a synchronous operation. Embrace Async, that's what the A in AJAX stands for.
I would just like to mention an excellent analogy on sync vs. async. You can read the entire post on the GWT forum; I am just including the relevant analogies.
Imagine if you will ...
You are sitting on the couch watching TV, and knowing that you are out of beer, you ask your spouse to please run down to the liquor store and fetch you some. As soon as you see your spouse walk out the front door, you get up off the couch and trundle into the kitchen and open the fridge. To your surprise, there is no beer! Well of course there is no beer, your spouse is still on the trip to the liquor store. You've gotta wait until [s]he returns before you can expect to have a beer.
But, you say you want it synchronous? Imagine again ...
... spouse walks out the door ... now, the entire world around you stops. You don't get to breathe, answer the door, or finish watching your show while [s]he runs across town to fetch your beer. You just get to sit there not moving a muscle, turning blue until you lose consciousness ... waking up some indefinite time later surrounded by EMTs and a spouse saying oh, hey, I got your beer.
That's exactly what happens when you insist on doing a synchronous server call.
Install Firebug, then add a line like this to each of showLoader, updateStatus and hideLoader:
console.log("event logged");
You'll see the calls to your functions listed in the console window, and they will be in order. The question is: what does your "updateStatus" method do?
Presumably it starts a background task, then returns, so you will reach the call to hideLoader before any of the background tasks finish. Your Ajax library probably has an "OnComplete" or "OnFinished" callback - call the next updateStatus from there.
Move the updateStatus calls to another function, then call setTimeout with the new function as the target.
If your ajax requests are asynchronous, you should have something to track which ones have completed. Each callback method can set a "completed" flag somewhere for itself and check whether it's the last one to do so. If it is, have it call hideLoader.
One of the best solutions for handling all async requests is the 'Promise'.
The Promise object represents the eventual completion (or failure) of an asynchronous operation.
Example:
let myFirstPromise = new Promise((resolve, reject) => {
    // We call resolve(...) when what we were doing asynchronously was successful, and reject(...) when it failed.
    // In this example, we use setTimeout(...) to simulate async code.
    // In reality, you will probably be using something like XHR or an HTML5 API.
    setTimeout(function(){
        resolve("Success!"); // Yay! Everything went well!
    }, 250);
});

myFirstPromise.then((successMessage) => {
    // successMessage is whatever we passed in the resolve(...) function above.
    // It doesn't have to be a string, but if it is only a succeed message, it probably will be.
    console.log("Yay! " + successMessage);
});
Promise
If you have 3 async functions and expect them to run in order, you can do something like the following. (Note that a Promise instance has no resolve() method of its own, so the resolvers are captured from each executor and called when the previous step finishes.)
let resolveSecond, resolveThird;

let FirstPromise = Promise.resolve("First!");
let SecondPromise = new Promise((resolve, reject) => {
    resolveSecond = resolve; // captured so the first ajax call can resolve it later
});
let ThirdPromise = new Promise((resolve, reject) => {
    resolveThird = resolve;
});

FirstPromise.then((successMessage) => {
    jQuery.ajax({
        type: "type",
        url: "url",
        success: function(response){
            console.log("First! ");
            resolveSecond("Second!");
        },
        error: function() {
            //handle your error
        }
    });
});

SecondPromise.then((successMessage) => {
    jQuery.ajax({
        type: "type",
        url: "url",
        success: function(response){
            console.log("Second! ");
            resolveThird("Third!");
        },
        error: function() {
            //handle your error
        }
    });
});

ThirdPromise.then((successMessage) => {
    jQuery.ajax({
        type: "type",
        url: "url",
        success: function(response){
            console.log("Third! ");
        },
        error: function() {
            //handle your error
        }
    });
});
With this approach, you can handle all the async operations as you wish.
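For comparison, the same ordering can be expressed more compactly by returning each jqXHR from a .then callback, since jqXHR objects are thenable and the chain waits for them (same placeholder type/url values as above):
Promise.resolve("First!")
    .then(function (msg) {
        console.log(msg);
        return jQuery.ajax({ type: "type", url: "url" });
    })
    .then(function (response) {
        console.log("Second!");
        return jQuery.ajax({ type: "type", url: "url" });
    })
    .then(function (response) {
        console.log("Third!");
    })
    .catch(function (err) {
        //handle your error
    });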
