Finding out when both JSON requests have completed - javascript

I currently have the following code:
function render(url1, url2, message) {
    utility.messageBoxOpen(message);
    $.getJSON(url1, function (items) {
        // Do something
        utility.messageBoxClose();
    });
    $.getJSON(url2, function (items) {
        // Do something
    });
}
When the function is executed, a modal window appears to inform the user that something is loading. Initially I only had one $.getJSON request, so the modal window closed when that request was done, as in the code above.
I am looking to add another $.getJSON request, but I want to close the modal window only when both $.getJSON requests have completed.
What is the best way of achieving this?

You're looking for $.when()
All jQuery AJAX requests (including shortcuts like getJSON) return Deferred objects, which can be used to coordinate other actions.
var dfd1 = $.getJSON(url1, function (items) {
    // Do something
});
var dfd2 = $.getJSON(url2, function (items) {
    // Do something
});
$.when(dfd1, dfd2).then(function () {
    // both succeeded
    utility.messageBoxClose();
}, function () {
    // one or more of them failed
});
If you don't care whether the getJSON calls succeed or fail, and only care that they have finished, you can use .always() instead:
$.when(dfd1, dfd2).always( utility.messageBoxClose );
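Putting it together with the render() function from the question, a minimal sketch (assuming the same utility.messageBoxOpen/messageBoxClose helpers) might look like this:
function render(url1, url2, message) {
    utility.messageBoxOpen(message);

    var dfd1 = $.getJSON(url1, function (items) {
        // Do something with the first response
    });
    var dfd2 = $.getJSON(url2, function (items) {
        // Do something with the second response
    });

    // Close the modal only once both requests have finished,
    // whether they succeeded or failed.
    $.when(dfd1, dfd2).always(function () {
        utility.messageBoxClose();
    });
}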

A variable
function render(url1, url2, message) {
    utility.messageBoxOpen(message);
    var isOneDone = false;
    $.getJSON(url1, function (items) {
        // Do something
        if (!isOneDone)
            isOneDone = true;
        else
            utility.messageBoxClose();
    });
    $.getJSON(url2, function (items) {
        // Do something
        if (!isOneDone)
            isOneDone = true;
        else
            utility.messageBoxClose();
    });
}

You can replace the getJSON() call with one using $.ajax, which accomplishes the same thing but gives you more flexibility:
$.ajax({
    url: 'http://whatever',
    dataType: 'json',
    async: false,
    data: { /* request data */ },
    success: function (data) {
        // do the thing
    }
});
Note the async: false part - it makes code execution pause until the request has completed, so you could simply make your two calls this way and close the dialog after the second call finishes. Be aware that synchronous requests block the browser while they run.
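A minimal sketch of that approach, keeping the render() signature from the question (again, both requests block the UI until they return):
function render(url1, url2, message) {
    utility.messageBoxOpen(message);

    // Both calls are synchronous, so execution pauses on each one.
    $.ajax({ url: url1, dataType: 'json', async: false, success: function (items) {
        // Do something with the first response
    }});
    $.ajax({ url: url2, dataType: 'json', async: false, success: function (items) {
        // Do something with the second response
    }});

    // Both requests have finished by the time we get here.
    utility.messageBoxClose();
}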

function render(url1, url2, message) {
    utility.messageBoxOpen(message);
    $.when($.getJSON(url1, function (items) {
        // Do something
    }), $.getJSON(url2, function (items) {
        // Do something
    })).then(function () {
        // Both complete
        utility.messageBoxClose();
    });
}
jQuery.when

Order of ajax requests is always different

I have JavaScript code that has to make requests to the database (AJAX). I discovered that the inserts were wrong even though the SQL requests were correct, so I added an alert to each AJAX call to find out when each one executes.
Here is the code:
$.post("/kohana-v3.3.5/ajax/update_simulation", {
id_simulation: id_simulation,
nom_simulation: nom_simulation,
sol_simulation: sol_simulation,
station_simulation: station_simulation,
iteration_simulation: iteration_simulation,
scenario_simulation: scenario_simulation
}
, function (result) {
console.log(result);
alert('update');
});
$.post("/kohana-v3.3.5/ajax/delete_pousses", {id_simulation: id_simulation}, function (result) {
console.log(result);
alert('delete');
});
$(this).prev('div').find('table .formRows').each(function (i) {
alert('here');
if (cpt % 2 == 1) {
//interculture
var $tds = $(this).find('td option:selected'),
culture = $tds.eq(0).val(),
date = $tds.eq(1).text();
itk = null;
} else {
//culture
var $tds = $(this).find('td option:selected'),
culture = $tds.eq(0).val(),
itk = $tds.eq(1).val();
date = null;
}
$.post("/kohana-v3.3.5/ajax/insert_pousses", {
id_simulation: id_simulation,
culture: culture,
date: date,
itk: itk,
rang: cpt
}, function (result) {
console.log(result);
alert('insert');
}); //Fin du post
cpt++;
}); //Fin du each
Each time I run that code, the order of the alerts is different! Sometimes it is "insert, update, delete", sometimes "update, delete, insert"...
And that's a problem, because if the delete runs last, the inserts are removed. Is this normal behaviour? How should I resolve it?
AJAX requests in JavaScript are executed asynchronously - that's why they don't always complete in the same order. You can set them to async: false (like here: jQuery: Performing synchronous AJAX requests) or use something like promises (https://api.jquery.com/promise/) to wait for each AJAX call to finish.
Greetings
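For example, since $.post returns a jqXHR promise, the calls from the question can be chained so that each one starts only after the previous one completes; a rough sketch (keeping the question's URLs and data):
$.post("/kohana-v3.3.5/ajax/update_simulation", { /* simulation fields as above */ })
    .then(function (result) {
        console.log(result);
        // The delete only starts once the update has finished.
        return $.post("/kohana-v3.3.5/ajax/delete_pousses", { id_simulation: id_simulation });
    })
    .then(function (result) {
        console.log(result);
        // Safe to run the inserts now: the delete is done.
    });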
AJAX requests are asynchronous, so you cannot guarantee an order if you trigger them as siblings like this.
In order to guarantee a fixed order, you need to make each subsequent call from the success callback of its predecessor. Something like this:
$.post('/ajax/method1', { params: params }, function (result) {
    $.post('/ajax/method2', { params: params }, function (result) {
        $.post('/ajax/method3', { params: params }, function (result) {
            // ...
        });
    });
});
You can use .promise to "observe when all actions of a certain type bound to the collection, queued or not, have finished."
https://api.jquery.com/promise/
Example Function
function testFunction() {
    var deferred = $.Deferred();
    $.ajax({
        type: "POST",
        url: "",
        success: function (data) {
            deferred.resolve(data);
        }
    });
    return deferred.promise();
}
Calling Function
function CallingFunction() {
    var promise = testFunction();
    promise.then(function (data) {
        // do bits / call next function
    });
}
Update
This may also help you out:
"Register a handler to be called when all Ajax requests have completed."
https://api.jquery.com/ajaxStop/
$(document).ajaxStop(function () {
    // fires once every outstanding AJAX request has completed
});
Final note:
As of jQuery 1.8, the use of async: false with jqXHR ($.Deferred) is deprecated; use the success/error/complete callback options instead of the corresponding methods of the jqXHR object.
You need to call the next $.post from the success callback of the previous one,
like this:
$.post("/kohana-v3.3.5/ajax/update_simulation", {
id_simulation: id_simulation,
nom_simulation: nom_simulation,
sol_simulation: sol_simulation,
station_simulation: station_simulation,
iteration_simulation: iteration_simulation,
scenario_simulation: scenario_simulation
}
, function (result) {
console.log(result);
alert('update');
dleteajax();
});
function dleteajax()
{
$.post("/kohana-v3.3.5/ajax/delete_pousses", {id_simulation: id_simulation}, function (result) {
console.log(result);
alert('delete');
});
}

Stopping Code until AJAX call completes inside $.each loop

I have a function that loads HTML from an external file via an AJAX call using jQuery.
This AJAX call runs inside a $.each loop, and I need the call to finish before the loop continues. Here is my code:
$('img').each(function(){
    var cotainer_html = $('.cotainer_html').clone();
    /* Get Image Content */
    $.ajax({
        url: '/assets/ajax/get_studio_image.php',
        type: 'GET',
        success: function(data){
            cotainer_html.find('.replaceme').replaceWith(data);
        }
    });
});
I know I can set async:false but I hear that is not a good idea. Any ideas?
To achieve this you can put each request into an array and apply() that array to $.when. Try this:
var requests = [];
$('img').each(function(){
    var cotainer_html = $('.cotainer_html').clone();
    /* Get Image Content */
    requests.push($.ajax({
        url: '/assets/ajax/get_studio_image.php',
        type: 'GET',
        success: function(data){
            cotainer_html.find('.replaceme').replaceWith(data);
        }
    }));
});

$.when.apply($, requests).done(function() {
    console.log('all requests complete');
});
Note that you're replacing the same content on each request, so the only request that has any effect on the UI is the last one; the preceding ones are redundant.
Also note that you should never, ever use async: false. It locks the browser's UI thread until the request completes, which makes the page look like it has crashed. It is terrible practice. Use callbacks.
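If the intent is one cloned container per image, a rough sketch of the fix (assuming the filled clones should be appended somewhere, e.g. a hypothetical #gallery element) would be:
var requests = [];
$('img').each(function () {
    var cotainer_html = $('.cotainer_html').clone();
    requests.push($.ajax({
        url: '/assets/ajax/get_studio_image.php',
        type: 'GET',
        success: function (data) {
            cotainer_html.find('.replaceme').replaceWith(data);
            // Append the filled clone so every request has a visible result.
            $('#gallery').append(cotainer_html);
        }
    }));
});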
The OP appears to want the calls to run in series, not in parallel.
If this is the case, you could use recursion:
function makeRequest($els, index) {
    var cotainer_html = $('.cotainer_html').clone();
    $.ajax({
        url: '/assets/ajax/get_studio_image.php',
        type: 'GET',
        success: function(data){
            cotainer_html.find('.replaceme').replaceWith(data);
            if ($els.eq(index + 1).length) {
                makeRequest($els, ++index);
            } else {
                console.log('all requests complete');
            }
        }
    });
}
makeRequest($('img'), 0);
You can use a pseudo-recursive loop:
var imgs = $('img').get();

var done = (function loop() {
    var img = imgs.shift();
    if (img) {
        var cotainer_html = $('.cotainer_html').clone();
        /* Get Image Content */
        return $.get('/assets/ajax/get_studio_image.php')
            .then(function(data) {
                cotainer_html.find('.replaceme').replaceWith(data);
            }).then(loop);
    } else {
        return $.Deferred().resolve(); // resolved when the loop terminates
    }
})();
This will take each element of the list, get the required image, and .then() start over until there's nothing left.
The immediately invoked function expression itself returns a promise, so you can chain a .then() call onto it that will be invoked once the loop has completed:
done.then(function() {
    // continue your execution here
    ...
});

Javascript esriRequest (dojo) in a function async issue

I am facing the following synchronization issue. I wouldn't be surprised if it has a simple solution/workaround. The BuildMenu() function is called from another block of code and it calls CreateMenuData(), which makes a request to a service that returns some data. The problem is that, since the call to the service is asynchronous, the data variable is undefined at the point where it is used. I have provided the JS log that also shows my point.
BuildMenu: function () {
    console.log("before call");
    var data = this.CreateMenuData();
    console.log("after call");
    // Doing more stuff with data that fail.
}
CreateMenuData: function () {
    console.log("func starts");
    data = [];
    dojo.forEach(config.layerlist, function (collection, colindex) {
        var layersRequest = esriRequest({
            url: collection.url,
            handleAs: "json"
        });
        layersRequest.then(
            function (response) {
                dojo.forEach(response.records, function (value, key) {
                    console.log(key);
                    data.push(key);
                });
            },
            function (error) {
            });
    });
    console.log("func ends");
    return data;
}
Console log writes:
before call
func starts
func ends
after call
0
1
2
3
4
FYI: using anything from the global "dojo." namespace is deprecated. Make sure you pull in all the modules you need via "require".
Ken has pointed you in the right direction; go through the link and get familiar with asynchronous requests.
However, I'd like to point out that you are not handling only one async request: there may potentially be several of them filling "data". To make sure you handle the results only when all of the requests have finished, you should use "dojo/promise/all".
CreateMenuData: function (callback) {
    console.log("func starts");
    var requests = [];
    var data = [];
    var scope = this;
    require(["dojo/_base/lang", "dojo/_base/array", "dojo/promise/all"], function (lang, array, all) {
        array.forEach(config.layerlist, function (collection, colindex) {
            var promise = esriRequest({
                url: collection.url,
                handleAs: "json"
            });
            requests.push(promise);
        });
        // Now use the dojo/promise/all object
        all(requests).then(function (responses) {
            // Check all the responses and add whatever you need to the data object.
            ...
            // once it's all done, invoke the callback. Watch the scope!
            if (typeof callback == "function")
                callback.call(scope, data);
        });
    });
}
So now that you have that method ready, call it:
BuildMenu: function () {
    console.log("before call");
    var dataCallback = function (data) {
        // do whatever you need to do with the data, or call another function that handles it
    };
    this.CreateMenuData(dataCallback);
}

jQuery Ajax / .each callback, next 'each' firing before ajax completed

Hi, the JavaScript below is called when I submit a form. It first splits a bunch of URLs from a textarea, then it:
1) Adds a row to a table for each URL, and in the last column (the 'status' column) it says "Not Started".
2) Loops through each URL again; first it makes an AJAX call to check on the status (status.php), which returns a percentage from 0 to 100.
3) In the same loop it kicks off the actual process via AJAX (process.php); when the process has completed (bearing in mind the continuous status updates), it says "Completed" in the status column and exits the auto_refresh.
4) It should then go to the next 'each' and do the same for the next URL.
function formSubmit(){
    var lines = $('#urls').val().split('\n');

    $.each(lines, function(key, value) {
        $('#dlTable tr:last').after('<tr><td>'+value+'</td><td>Not Started</td></tr>');
    });

    $.each(lines, function(key, value) {
        var auto_refresh = setInterval(function () {
            $.ajax({
                url: 'status.php',
                success: function(data) {
                    $('#dlTable').find("tr").eq(key+1).children().last().replaceWith("<td>"+data+"</td>");
                }
            });
        }, 1000);

        $.ajax({
            url: 'process.php?id='+value,
            success: function(msg) {
                clearInterval(auto_refresh);
                $('#dlTable').find("tr").eq(key+1).children().last().replaceWith("<td>completed rip</td>");
            }
        });
    });
}
What you want is to run several asynchronous actions in sequence, right? I'd build an array of the functions to execute and run it through a sequence helper.
https://github.com/michiel/asynchelper-js/blob/master/lib/sequencer.js
var actions = [];
$.each(lines, function(key, value) {
    actions.push(function(callback) {
        $.ajax({
            url: 'process.php?id='+value,
            success: function(msg) {
                clearInterval(auto_refresh);
                //
                // Perform your DOM operations here and be sure to call the
                // callback!
                //
                callback();
            }
        });
    });
});
As you can see, we build an array of scoped functions that take an arbitrary callback as an argument. A sequencer will run them in order for you.
Use the sequence helper from the github link and run,
var sequencer = new Sequencer(actions);
sequencer.start();
It is, btw, possible to define sequencer functions in a few lines of code. For example,
function sequencer(arr) {
    (function() {
        ((arr.length != 0) && (arr.shift()(arguments.callee)));
    })();
}
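Used with the actions array built above, this minimal version is invoked the same way (note that it consumes the array it is given via shift()):
// Runs each queued action in order; every action receives the "next" callback.
sequencer(actions);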
AJAX is asynchronous.
That's exactly what's supposed to happen.
Instead of using each, you should send the next AJAX request in the completion handler of the previous one.
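A rough sketch of that idea for the question's URLs, leaving out the status polling (processNext is a hypothetical helper, not from the original post):
function processNext(lines, index) {
    if (index >= lines.length) return; // nothing left to process

    $.ajax({
        url: 'process.php?id=' + lines[index],
        success: function (msg) {
            $('#dlTable').find("tr").eq(index + 1).children().last()
                .replaceWith("<td>completed rip</td>");
            // Only start the next URL once this one has finished.
            processNext(lines, index + 1);
        }
    });
}

processNext($('#urls').val().split('\n'), 0);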
You can also set AJAX to run synchronously using the "async" property. Add the following:
$.ajax({ "async": false, ... other options ... });
AJAX API reference here. Note that this will lock the browser until the request completes.
I prefer the approach in SLaks answer (sticking with asynchronous behavior). However, it does depend on your application. Exercise caution.
I would give the same answer as in this question: jquery json populate table.
The code below should give you an idea of how to use callbacks with loops and AJAX. I have not tested it and there will be bugs; I derived the following from my old code:
var processCnt; // Global variable - check if this is needed

function formSubmit() {
    var lines = $('#urls').val().split('\n');
    $.each(lines, function (key, value) {
        $('#dlTable tr:last').after('<tr><td>' + value + '</td><td>Not Started</td></tr>');
    });
    completeProcessing(lines, function (success) {
        $.ajax({
            url: 'process.php?id=' + value,
            success: function (msg) {
                $('#dlTable').find("tr").eq(key + 1).children().last().replaceWith("<td>completed rip</td>");
            }
        });
    });
}

// Following two functions added by me
function completeProcessing(lines, callback) {
    processCnt = 0;
    processingTimer = setInterval(function () {
        singleProcessing(lines, function (completeProcessingSuccess) {
            if (completeProcessingSuccess) {
                clearInterval(processingTimer);
                callback(true);
            }
        });
    }, 1000);
}

function singleProcessing(lines, callback) {
    var key = processCnt;
    var value = lines[key];
    if (processCnt < lines.length) { // files still to be processed
        $.ajax({
            url: 'status.php',
            success: function (data) {
                $('#dlTable').find("tr").eq(key + 1).children().last().replaceWith("<td>" + data + "</td>");
                processCnt++;
                callback(false);
            }
        });
    } else {
        callback(true);
    }
}

How to chain ajax requests?

I have to interact with a remote API that forces me to chain requests. That's callback hell in asynchronous mode:
// pseudocode: ajax(request_object, callback)
ajax(a, function() {
    ajax(b(a.somedata), function() {
        ajax(c(b.somedata), function() {
            c.finish()
        })
    })
})
It would be much more readable in sync mode:
sjax(a)
sjax(b(a.somedata))
sjax(c(b.somedata))
c.finish()
But Sjax is evil :) How do I do that in a nice not-so-evil and readable way?
You could have a single function which is passed an integer to indicate what step the request chain is at, then use a switch statement to figure out what request needs to be made next:
function ajaxQueue(step) {
    switch (step) {
        case 0:
            $.ajax({
                type: "GET",
                url: "/some/service",
                complete: function() { ajaxQueue(1); }
            });
            break;
        case 1:
            $.ajax({
                type: "GET",
                url: "/some/service",
                complete: function() { ajaxQueue(2); }
            });
            break;
        case 2:
            $.ajax({
                type: "GET",
                url: "/some/service",
                complete: function() { alert('Done!'); }
            });
            break;
    }
}
ajaxQueue(0);
Hope that helps!
Don't use anonymous functions. Give them names. I don't know if you're able to do what I wrote below though:
var step_3 = function() {
    c.finish();
};

var step_2 = function(c, b) {
    ajax(c(b.somedata), step_3);
};

var step_1 = function(b, a) {
    ajax(b(a.somedata), step_2);
};

ajax(a, step_1);
This function should chain together a list of ajax requests, if the callbacks always return the parameters necessary for the next request:
function chainajax(params, callbacks) {
    var cb = callbacks.shift();
    params.complete = function() {
        var newparams = cb(arguments);
        if (callbacks.length)
            chainajax(newparams, callbacks);
    };
    $.ajax(params);
}
You can define these callback functions separately and then chain them together:
function a(data) {
    ...
    return {type: "GET", url: "/step2.php?foo"};
}
// ...
function d(data) { alert("done!"); }

chainajax({type: "GET", url: "/step1.php"},
          [a, b, c, d]);
You could also declare the functions "inline" in the call to chainajax, but that might get a little confusing.
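For instance, an inline version of the same call (a sketch using the same hypothetical step URLs as above) could look like:
chainajax({type: "GET", url: "/step1.php"}, [
    function (data) {
        // first step finished; describe the next request
        return {type: "GET", url: "/step2.php?foo"};
    },
    function (data) {
        alert("done!");
    }
]);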
Maybe what you can do is write a server-side wrapper function. That way your javascript only does a single asynchronous call to your own web server. Then your web server uses curl (or urllib, etc.) to interact with the remote API.
Update: I've learned a better answer for this if you are using jQuery; see my update under the title: Using jQuery Deferred.
Old answer:
You can also use Array.reduceRight (when it's available) to wrap the $.ajax calls and transform a list like [resource1, resource2] into $.ajax({url: resource1, success: function(...) { $.ajax({url: resource2, ... (a trick I learned from Haskell and its fold/foldRight function).
Here is an example:
var withResources = function(resources, callback) {
    var responses = [];
    var chainedAjaxCalls = resources.reduceRight(function(previousValue, currentValue, index, array) {
        return function() {
            $.ajax({url: currentValue, success: function(data) {
                responses.push(data);
                previousValue();
            }});
        };
    }, function() { callback.apply(null, responses); });
    chainedAjaxCalls();
};
Then you can use:
withResources(['/api/resource1', '/api/resource2'], function(response1, response2) {
    // called only if the ajax call is successful with resource1 and resource2
});
Using jQuery Deferred
If you are using jQuery, you can take advantage of jQuery Deferred by using the jQuery.when() function:
jQuery.when($.get('/api/one'), $.get('/api/two'))
    .done(function(result1, result2) {
        /* one and two are done */
    });
Check out this FAQ item on the jQuery site, especially the callback reference and the complete method.
What you want is the data from A to be passed to B, and B's data passed to C, so you would chain each request in a callback of the previous one.
I haven't tried this, though.
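A rough sketch of that idea (with hypothetical /a, /b and /c endpoints, not from the original question):
$.ajax({
    url: '/a',
    success: function (aData) {
        $.ajax({
            url: '/b',
            data: { fromA: aData.somedata },      // pass A's data to B
            success: function (bData) {
                $.ajax({
                    url: '/c',
                    data: { fromB: bData.somedata },  // pass B's data to C
                    success: function (cData) {
                        // finish up with C's result
                    }
                });
            }
        });
    }
});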
I believe that implementing a state machine will make the code more readable:
var state = -1;
var error = false;

$.ajax({
    success: function() {
        state = 0;
        stateMachine();
    },
    error: function() {
        error = true;
        stateMachine();
    }
});

function stateMachine() {
    if (error) {
        // Error handling
        return;
    }
    if (state == 0) {
        state = 1;
        // Call stateMachine again in an ajax callback
    }
    else if (state == 1) {
    }
}
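For example, the commented step could issue the next request and advance the state when it completes; a hedged sketch with hypothetical /step2 and /step3 URLs:
function stateMachine() {
    if (error) {
        // Error handling
        return;
    }
    if (state == 0) {
        state = 1;
        $.ajax({
            url: '/step2',
            success: stateMachine, // advance to the next state on success
            error: function() { error = true; stateMachine(); }
        });
    }
    else if (state == 1) {
        state = 2;
        $.ajax({
            url: '/step3',
            success: stateMachine,
            error: function() { error = true; stateMachine(); }
        });
    }
    else if (state == 2) {
        // All requests finished
    }
}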
I made a method using Promises
// How to setup a chainable queue method
var sequence = Promise.resolve();

function chain(next) {
    var promise = new Promise(function(resolve) {
        sequence.then(function() {
            next(resolve);
        });
    });
    sequence = promise;
}

// How to use it
chain(function(next) {
    document.write("<p>start getting config.json</p>");
    setTimeout(function() {
        document.write("<p>Done fetching config.json</p>");
        next();
    }, 3000);
});

chain(function(next) {
    document.write("<p>start getting init.js</p>");
    setTimeout(function() {
        document.write("<p>starting eval scripting</p>");
        next();
    }, 3000);
});

chain(function(next) {
    document.write("<p>Everything is done</p>");
});
Bonus: an ultralight, 138-byte, limited Promise (it can only resolve - without parameters - and only calls the last then method)
Background:
I made this for Node.js at a point where it did not yet have promises. I didn't want a complete full-blown Promise library that I was dependent on and had to include in my package.json; I needed it to be fast and light and to do mostly one thing only. I only needed it for one thing (chaining things like you want to).
function Q(a,b){b=this;a(function(){b.then&&b.then();b.then=i});return b}function i(a){a&&a()}Q.prototype={then:function(a){this.then=a}};
How?
// Start with a resolved object
var promise = new Q(function(a){ a(); });
// equal to
// var promise = Promise.resolve();

// example usage
new Q(function(resolve){
    // do some async stuff that takes time
    // setTimeout(resolve, 3000);
}).then(function(){
    // it's done
    // cannot return a new Promise
}); // <- cannot add more then's (it only registers the last one)
and for the chainable queue method:
// How to setup a chainable queue method with the ultralight promise
var sequence = new Q(function(a){ a(); });

function chain(next) {
    var promise = new Q(function(resolve) {
        sequence.then(function() {
            next(resolve);
        });
    });
    sequence = promise;
}
The complete callback is what you're looking for:
$.ajax({
    type: 'post',
    url: "www.example.com",
    data: {/* Data to be sent to the server. It is converted to a query string, if not already a string. It's appended to the url for GET requests. */},
    success: function(data) {
        /* You can also chain requests here. This fires if the initial request succeeds, but before it has fully completed. */
    },
    complete: function() {
        /* For a more synchronous approach, use this callback. It fires when the first request has completed. */
    }
});
