I use jQuery, and I don't want parallel AJAX calls in my application; each call must wait for the previous one before starting. How can I implement this? Is there any helper?
UPDATE: If there is a synchronous version of XMLHttpRequest or jQuery.post, I would like to know about it. But sequential != synchronous, and I would like an asynchronous and sequential solution.
There's a much better way to do this than using synchronous ajax calls. jQuery's ajax already returns a deferred, so you can use pipe chaining to make sure that each ajax call finishes before the next one runs. Here's a working example, with a more in-depth version you can play with on jsFiddle.
// How to force async functions to execute sequentially
// by using deferred pipe chaining.

// The master deferred.
var dfd = $.Deferred(),     // Master deferred
    dfdNext = dfd,          // Next deferred in the chain
    x = 0,                  // Loop index
    values = [],

    // Simulates $.ajax, but with predictable behaviour.
    // You only need to understand that a higher 'value' param
    // will finish earlier.
    simulateAjax = function (value) {
        var dfdAjax = $.Deferred();

        setTimeout(
            function () {
                dfdAjax.resolve(value);
            },
            1000 - (value * 100)
        );

        return dfdAjax.promise();
    },

    // This would be a user function that makes an ajax request.
    // In normal code you'd be using $.ajax instead of simulateAjax.
    requestAjax = function (value) {
        return simulateAjax(value);
    };

// Start the pipe chain. You should be able to do
// this anywhere in the program, even at the end,
// and it should still give the same results.
dfd.resolve();

// Deferred pipe chaining.
// What you want to note here is that a new
// ajax call will not start until the previous
// ajax call is completely finished.
for (x = 1; x <= 4; x++) {
    values.push(x);
    dfdNext = dfdNext.pipe(function () {
        var value = values.shift();
        return requestAjax(value)
            .done(function (response) {
                // Process the response here.
            });
    });
}
Some people have commented that they have no clue what the code does. In order to understand it, you first need to understand JavaScript promises. I am pretty sure promises are soon to be a native JavaScript language feature, so that should give you a good incentive to learn.
You have two choices that I can think of. One is to chain them through callbacks. The other is to make the calls synchronous rather than async.
Is there a reason you want them sequential? That will slow things down.
To make the call synchronous, you'll set the async option in the Ajax call to false. See the documentation at http://docs.jquery.com/Ajax/jQuery.ajax#options (click options tab to see them).
(async () => {
    for (const f of ['1.json', '2.json', '3.json']) {
        const json = await $.getJSON(f);
        console.log(json);
    }
})();
Requests 3 JSON files with jQuery ajax calls.
Processes them in sequence (not in parallel) with await.
Works in Chrome/Firefox/Edge (as of 1/30/2018).
More at MDN.
The best way you could do this is by chaining callbacks, as Nosredna said. I wouldn't recommend using synchronous XMLHttpRequest, as it locks your entire application.
There aren't many helpers for this as far as I know, but you could do something resembling a callback FIFO.
You could give Narrative JavaScript a try: http://www.neilmix.com/narrativejs/doc/
I've never used it myself, though. If I wanted to do this, I would set up some kind of abstraction for chaining asynchronous actions, as sketched below. As others have said, the synchronous version of the ajax object blocks events from being processed while it's waiting for a response. This causes the browser to look like it's frozen until it receives a response.
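For illustration, here is a minimal sketch of such an abstraction: a hypothetical AjaxQueue helper (the name and shape are my own, not part of jQuery) that keeps a FIFO of request descriptions and only starts the next request once the previous one has completed.
// Minimal sketch of a callback FIFO for ajax calls (hypothetical helper).
function AjaxQueue() {
    var queue = [];
    var running = false;

    function next() {
        if (queue.length === 0) {
            running = false;
            return;
        }
        running = true;
        var item = queue.shift();
        $.ajax(item.options)
            .done(item.callback || $.noop)
            .always(next); // start the next request only when this one finishes
    }

    this.push = function (options, callback) {
        queue.push({ options: options, callback: callback });
        if (!running) {
            next();
        }
    };
}

// Usage (URLs are placeholders):
var q = new AjaxQueue();
q.push({ url: "/first" }, function (data) { /* ... */ });
q.push({ url: "/second" }, function (data) { /* ... */ });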
Set the async option to false, e.g.,
$.ajax({ async: false /*, your_other_ajax_options_here */ });
Reference: Ajax/jQuery.ajax
You can use promises to make ajax calls sequential. Pushing onto and popping from an array holding the tail of the promise chain makes sequential ajax calls a lot easier.
var promises = [Promise.resolve()];

function methodThatReturnsAPromise(id) {
    return new Promise((resolve, reject) => {
        $.ajax({
            url: 'https://jsonplaceholder.typicode.com/todos/' + id,
            dataType: 'json',
            success: function (data) {
                console.log("Ajax Request Id " + id);
                console.log(data);
                resolve();
            }
        });
    });
}

function pushPromise(id) {
    promises.push(promises.pop().then(function () {
        return methodThatReturnsAPromise(id);
    }));
}

pushPromise(1);
pushPromise(3);
pushPromise(2);
Look at this: http://docs.jquery.com/Ajax/jQuery.ajax (click on the "options" tab).
But remember a synchronous call will freeze the page until the response is received, so it can't be used in a production site, because users will get mad if for any reason they have to wait 30 seconds with their browser frozen.
EDIT: ok, with your update it's clearer what you want to achieve ;)
So, your code may look like this:
$.getJSON("http://example.com/jsoncall", function(data) {
process(data);
$.getJSON("http://example.com/jsoncall2", function (data) {
processAgain(data);
$.getJSON("http://example.com/anotherjsoncall", function(data) {
processAgainAndAgain(data);
});
});
});
This way, the second call will only be issued when the response to the first call has been received and processed, and the third call will only be issued when the response to the second call has been received and processed. This code is for getJSON but it can be adapted to $.ajax.
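For example, a sketch of the same chain adapted to $.ajax (with the same placeholder URLs) might look like this:
$.ajax({ url: "http://example.com/jsoncall", dataType: "json" })
    .done(function (data) {
        process(data);
        $.ajax({ url: "http://example.com/jsoncall2", dataType: "json" })
            .done(function (data) {
                processAgain(data);
            });
    });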
The modern way of sequencing jQuery asynchronous operations is to use the promises they already return and the flow control that promises support; this is not currently shown in any of the other answers here from prior years.
For example, let's suppose you wanted to load several scripts with $.getScript(), but the scripts must be loaded sequentially so the second one doesn't load/run until the first has finished, and so on, and you want to know when they are all done. You can directly use the promise that $.getScript() already returns. For simplicity, you can await that promise in a for loop like this:
async function loadScripts(scriptsToLoad) {
    for (const src of scriptsToLoad) {
        await $.getScript(src);
    }
}

loadScripts([url1, url2, url3]).then(() => {
    console.log("all done loading scripts");
}).catch(err => {
    console.log(err);
});
Since all jQuery Ajax-related asynchronous operations now return promises (and have for many years now), you can extend this concept to any of jQuery's Ajax-related operations.
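For instance, applying the same pattern to the original question's sequential JSON requests might look roughly like this (a sketch, with placeholder URLs and a hypothetical process() function):
async function fetchSequentially(urls) {
    const results = [];
    for (const url of urls) {
        // each $.getJSON only starts after the previous one has resolved
        results.push(await $.getJSON(url));
    }
    return results;
}

fetchSequentially(["/data/1.json", "/data/2.json", "/data/3.json"])
    .then(results => results.forEach(process))
    .catch(err => console.log(err));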
Also, note that all the other attempts in other answers here to wrap a jQuery operation in a new promise or in a jQuery deferred are obsolete and considered a promise anti-pattern because when the operation itself already returns a promise, you can just use that promise directly without trying to wrap it in your own new promise.
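As a sketch of the difference, compare wrapping the promise versus using it directly (the URL is just a placeholder):
// Anti-pattern: wrapping the promise $.ajax() already returns.
function getTodoWrapped(id) {
    return new Promise((resolve) => {
        $.ajax({ url: "/todos/" + id }).done(resolve); // errors are silently lost
    });
}

// Preferred: just return the promise $.ajax() gives you.
function getTodo(id) {
    return $.ajax({ url: "/todos/" + id }); // rejections propagate to the caller
}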
Synchronous calls aren't necessarily slower. If you have an app where AJAX calls open, post to, then close a socket, multiple calls to the socket don't make sense, as some sockets can only handle a single connection. In that case, queuing data so it's only sent when the previous AJAX call has completed means much higher data throughput.
How about using Node.js events?
var EventEmitter = require('events').EventEmitter;
var eventEmitter = new EventEmitter();
var $ = require('jquery');

var doSomething = function (responseData) {
    var nextRequestData = {};
    // do something with responseData
    return nextRequestData;
};

// ajax requests
var request1 = $.ajax;
var request2 = $.ajax;
var requests = [request1, request2];

eventEmitter.on('next', function (i, requestData) {
    requests[i](requestData).then(
        function (responseData) {
            console.log(i, 'request completed');
            if (i + 1 < requests.length) {
                var nextRequestData = doSomething(responseData);
                eventEmitter.emit('next', i + 1, nextRequestData);
            } else {
                console.log('completed all requests');
            }
        },
        function () {
            console.log(i, 'request failed');
        }
    );
});

var data = {
    // data to send with request 1
};

eventEmitter.emit('next', 0, data);
sequential != synchronous, and I would like an asynchronous and sequential solution
Synchronous execution generally means "using the same clock", while sequential execution means "following in order or sequence".
For your specific use case I think both conditions must be met, as asynchronous execution implies the possibility of a non-sequential result.
Related
I've got a complicated (at least for me) set up of nested loops, ajax calls, and deferreds. The code is calling an API, parsing out relevant data, then using it to make further calls to other APIs.
It works almost as intended. I used the answer to this question (Using $.Deferred() with nested ajax calls in a loop) to build it. Here's my code:
function a() {
    var def = $.Deferred();
    var req = [];
    for (var i = 0 /*...*/) {
        for (var j = 0 /*...*/) {
            (function (i, j) {
                req.push($.ajax({
                    //params
                }).done(function (resp) {
                    var def2 = $.Deferred();
                    var req2 = [];
                    for (var k = 0 /*...*/) {
                        for (var l = 0 /*...*/) {
                            req2.push(b(l));
                        }
                    }
                    $.when.apply($, req2).done(function () {
                        console.log("Got all data pieces");
                        def2.resolve();
                    })
                }));
            })(i, j);
        }
    }
    $.when.apply($, req).done(function () {
        console.log("Got all data");
        def.resolve();
    });
    return def.promise();
}

function b(j) {
    var def = $.Deferred();
    $.when.apply(
        $.ajax({
            //params
        })
    ).then(function () {
        console.log("Got data piece #" + l);
        def.resolve();
    });
    return def.promise();
}

function main() {
    //...
    $.when.apply($, a()).then(function () {
        console.log("All done");
        displayPage();
    })
    //...
}
Here's what I'm expecting to see when the calls complete
(In no specific order)
Got data piece #1
Got data piece #0
Got data piece #2
Got all data pieces
Got data piece #2
Got data piece #1
Got data piece #0
Got all data pieces
Got data piece #0
Got data piece #1
Got data piece #2
Got all data pieces
Got all data <-- These two must be last, and in this order
All done
Here's what I'm seeing
All done
Got data piece #0
Got data piece #1
Got data piece #2
Got all data pieces
Got data piece #0
Got data piece #1
Got data piece #2
Got all data pieces
Got data piece #0
Got data piece #1
Got data piece #2
Got all data pieces
I stepped through it in the debugger, and the 'Got all data' line in function a() gets printed in the correct sequence after everything else completes, after which def.resolve() should get called and resolve the returned promise.
However, in main(), a() is seen as resolved right away and the code jumps right into printing 'All done' and displaying the page. Any ideas as to why it doesn't wait as it's supposed to?
You have illustrated a set of code and said it isn't doing what you expected, but you haven't really described the overall problem. So, I don't actually know exactly what code to recommend. We do a lot better here with real problems rather than pseudo code problems. So, instead, what I can do is to outline a bunch of things that are wrong with your code:
Expecting serial order of parallel async operations
Based on what you say you are expecting, the basic logic for how you control your async operations seems to be missing. When you use $.when() on a series of promises that have already been started, you are running a whole bunch of async operations in parallel. Their completion order is completely unpredictable.
Yes, you seem to expect to be able to run a whole bunch of b(i) in parallel and have them all complete in order. That seems to be the case because you say you are expecting this type of output:
Got data piece #0
Got data piece #1
Got data piece #2
where each of those statements is generated by the completion of some b(i) operation.
That simply will not happen (or it would be blind luck if it did in the real world because there is no code that guarantees the order). Now, you can run them in parallel and use $.when() to track them and $.when() will let you know when they are all done and will collect all the results in order. But when each individual async operation in that group finishes is up to chance.
So, if you really wanted each of your b(i) operations to run and complete in order, then you would have to purposely sequence them (run one, wait for it to complete, then run the next, etc...). In general, if one operation does not depend upon the other, it is better to run them in parallel and let $.when() track them all and order the results for you (because you usually get your end result faster by running them all in parallel rather than sequencing them).
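If you did need that, one common sketch is to build the chain with reduce instead of starting everything at once (assuming jQuery 1.8+ where .then() chains, and that b(l) returns a promise as above):
// Run b(0), b(1), b(2), ... strictly one after another.
function runSequentially(count) {
    var indexes = [];
    for (var l = 0; l < count; l++) { indexes.push(l); }
    return indexes.reduce(function (chain, l) {
        return chain.then(function () {
            return b(l); // the next b() starts only after the previous one resolves
        });
    }, $.Deferred().resolve().promise());
}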
Creation of unnecessary deferreds in lots of places - promise anti-pattern
In this code, there is no need to create a deferred at all. $.ajax() already returns a promise. You can just use that promise. So, instead of this:
function b(j) {
    var def = $.Deferred();
    $.when.apply(
        $.ajax({
            //params
        })
    ).then(function () {
        console.log("Got data piece #" + l);
        def.resolve();
    });
    return def.promise();
}
You can do this:
function b(j) {
    return $.ajax({
        //params
    }).then(function (data) {
        console.log("Got data piece #" + l);
        return data;
    });
}
Note that you just directly return the promise that is already produced by $.ajax(), and no deferred needs to be created at all. This is also a lot more bulletproof for error handling. One of the reasons your method is called an anti-pattern is that you don't handle errors at all (a common mistake when using this anti-pattern). But the improved code propagates errors right back to the caller, just like it should. In your version, if the $.ajax() call rejects its promise (due to an error), your deferred is NEVER resolved and the caller never sees the error either. Now, you could write extra code to handle the error, but there is no reason to. Just return the promise you already have. When coding with async operations that return promises, you should pretty much never need to create your own deferred.
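As a sketch of that difference, with the direct-return version the caller can attach its own error handling (again assuming jQuery 1.8+):
b(0)
    .then(function (data) {
        // success: use the data
    })
    .fail(function (jqXHR, textStatus) {
        // with the deferred-wrapping version this handler would never fire,
        // because the wrapper deferred is never rejected
        console.log("request failed:", textStatus);
    });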
$.when() is only needed when you have more than one promise
In your b() function, there is no need to use $.when() in this piece of code:
$.when(
    $.ajax({
        //params
    })
).then(...);
When you have a single promise, you just use .then() directly on it.
$.ajax({
    //params
}).then(...);
Only use $.when() when you have more than one promise and you want to know when all of them are done. If you only have one promise, just use its own .then() handler.
More anti-pattern - just return promises from .then() handlers
In your inner loop, you have this:
$.when.apply($, req2).done(function () {
    console.log("Got all data pieces");
    def2.resolve();
})
There are several things wrong here. It's not clear what you're trying to do because def2 is a deferred that nothing else uses. So, it appears you're trying to tell someone when this req2 group of promises is done, but nobody is using it. In addition it's another version of the anti-pattern. $.when() already returns a promise. You don't need to create a deferred to resolve when $.when() completes. You can just use the promise that $.when() already returns.
Though I don't fully know your intent here, it appears that what you should probably do is to get rid of the def2 deferred entirely and do just this:
return $.when.apply($, req2).done(function () {
    console.log("Got all data pieces");
});
Returning this promise from the .then() handler that it is within will chain this sequence of actions to the parent promise and make the parent promise wait for this new promise to be resolved (which is tied to when all the req2 promises are done) before the parent promise will resolve. This is how you make parent promises dependent upon other promise within a .then() handler. You return a promise from the .then() handler.
And, the exact same issue is true for your outer $.when.apply($, req) also. You don't need a deferred there at all. Just use the promise that $.when() already returns.
Putting it together
Here's a cleaned up version of your code that gets rid of the anti-patterns in multiple places. This does not change the sequencing of the b(i) calls among themselves. If you care about that, it is a bigger change and we need to see more of the real/actual problem to know what best to recommend.
function a() {
    var req = [];
    for (var i = 0 /*...*/) {
        for (var j = 0 /*...*/) {
            (function (i, j) {
                req.push($.ajax({
                    //params
                }).then(function (resp) {
                    var req2 = [];
                    for (var k = 0 /*...*/) {
                        for (var l = 0 /*...*/) {
                            req2.push(b(l));
                        }
                    }
                    return $.when.apply($, req2).done(function () {
                        console.log("Got all data pieces");
                    });
                }));
            })(i, j);
        }
    }
    return $.when.apply($, req).done(function () {
        console.log("Got all data");
    });
}

function b(j) {
    return $.ajax({
        //params
    }).then(function (data) {
        console.log("Got data piece #" + l);
        return data;
    });
}

function main() {
    //...
    a().then(function () {
        console.log("All done");
        displayPage();
    });
    //...
}
P.S. If you want to process the b(i) results from within the same group in order, then don't use a .then() handler on the individual promise because those will execute in arbitrary order. Instead, use the results that come with $.when().then(result1, result2, ...) and process them all there. Though the individual promises complete in an arbitrary order, $.when() will collect the results into the original order so if you process the results in the $.when() handler, then you can process them all in order.
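A sketch of that approach, relying on $.when() passing each promise's result as a separate argument to the handler, in the original order:
$.when.apply($, req2).done(function () {
    // arguments[0], arguments[1], ... are the results of req2[0], req2[1], ...
    // in their original order, regardless of completion order.
    // Note: for $.ajax promises, each result is an array [data, textStatus, jqXHR].
    $.each(arguments, function (index, result) {
        console.log("Result for piece #" + index, result);
    });
});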
In Tornado we have the gen module, which allows us to write constructions like this:
def my_async_function(callback):
    return callback(1)

@gen.engine
def get(self):
    result = yield gen.Task(my_async_function)  # Calls async function and puts result into result variable
    self.write(result)  # Outputs result
Do we have the same syntax sugar in jQuery or other JavaScript libraries?
I want something like this:
function get_remote_variable(name) {
    var val = $.sweetget('/remote/url/', {}); // sweetget automatically gets a callback which passes data to val
    return val;
}
You describe the function as "my_async_function", but the way you use it is synchronous rather than asynchronous.
Your sample code requires blocking -- if "my_async_function" were truly asynchronous (non-blocking), the following line of code self.write(result) would execute immediately after you called "my_async_function". If the function takes any time at all (and I mean any) to come back with a value, what would self.write(result) end up writing? That is, if self.write(result) is executed before result ever has a value, you don't get expected results. Thus, "my_async_function" must block, it must wait for the return value before going forward, thus it is not asynchronous.
On to your question specifically, $.sweetget('/remote/url/', {}): In order to accomplish that, you would have to be able to block until the ajax request (which is inherently asynchronous -- it puts the first A in AJAX) comes back with something.
You can hack a synchronous call by delaying the return of sweetget until the XHR state has changed, but you'd be using a while loop (or something similar) and risk blocking the browser UI thread or setting off one of those "this script is taking too long" warnings. Javascript does not offer threading control. You cannot specify that your current thread is waiting, so go ahead and do something else for a minute. You could contend with that, too, by manually testing for a timeout threshold.
By now one should be starting to wonder: why not just use a callback? No matter how you slice it, Javascript is single-threaded. No sleep, no thread.sleep. That means that any synchronous code will block the UI.
Here, I've mocked up what sweetget would, roughly, look like. As you can see, your browser thread will lock up as soon as execution enters that tight while loop. Indeed, on my computer the ajax request won't even fire until the browser displays the unresponsive script dialog.
// warning: this code WILL lock your browser!
var sweetget = function (url, time_out) {
    var completed = false;
    var result = null;
    var run_time = false;
    if (time_out)
        run_time = new Date().getTime();
    $.ajax({
        url: url,
        success: function (data) {
            result = data;
            completed = true;
        },
        error: function () {
            completed = true;
        }
    }); // <---- that AJAX request was non-blocking
    while (!completed) { // <------ but this while loop will block
        if (time_out) {
            if (time_out >= run_time)
                break;
            run_time = new Date().getTime();
        }
    }
    return result;
};

var t = sweetget('/echo/json/');
console.log('here is t: ', t);
Try it: http://jsfiddle.net/AjRy6/
Versions of jQuery prior to 1.8 support sync ajax calls via the async: false setting. It's a hack with limitations (no cross-domain or JSONP, and it locks up the browser), and I would avoid it if possible.
There are several available libraries that provide some syntactic sugar for async operations in Javascript. For example:
https://github.com/caolan/async
https://github.com/coolaj86/futures
...however I don't think anything provides the synchronous syntax you are looking for - there is always a callback involved, because of the way JavaScript works in the browser.
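In practice, that means reshaping the hypothetical get_remote_variable into callback style, roughly like this (URL and parameter are placeholders):
// Callback style: the value is delivered to a callback instead of returned.
function get_remote_variable(name, callback) {
    $.get('/remote/url/', { name: name }, callback);
}

get_remote_variable('foo', function (val) {
    // use val here, once the response has arrived
    console.log(val);
});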
Given the following:
var doThings = (function ($, window, document) {
    var someScopedVariable = undefined,
        methods,
        _status;

    methods = {
        init: function () {
            _status.getStatus.call(this);
            // Do something with the 'someScopedVariable'
        }
    };

    // Local method
    _status = {
        getStatus: function () {
            // Runs a webservice call to populate the 'someScopedVariable'
            if (someScopedVariable === undefined) {
                _status.setStatus.call(this);
            }
            return someScopedVariable;
        },
        setStatus: function () {
            $.ajax({
                url: "someWebservice",
                success: function (results) {
                    someScopedVariable = results;
                }
            });
        }
    };

    return methods;
} (jQuery, window, document));
The issue is clear: this is an async situation where I would like to wait until someScopedVariable is not undefined, then continue.
I thought of using jQuery's .when() -> .done() deferred call, but I can't seem to get it to work. I've also thought of doing a loop that would just check to see if it's defined yet, but that doesn't seem elegant.
Possible option 1:
$.when(_status.getStatus.call(this)).done(function () {
    return someScopedVariable;
});
Possible option 2 (Terrible option):
_status.getStatus.call(this);
var i = 0;
do {
    i++;
} while (formStatusObject !== undefined);
return formStatusObject;
UPDATE:
I believe I stripped out too much of the logic in order to explain it so I added back in some. The goal of this was to create an accessor to this data.
I would suggest waiting for the complete/success event of the ajax call.
methods = {
    init: function () {
        _status.getStatus.call(this);
    },
    continueInit: function (data) {
        // populate 'someScopedVariable' from data and continue init
    }
};

_status = {
    getStatus: function () {
        $.post('webservice.url', methods.continueInit);
    }
};
You cannot block using an infinite loop to wait for the async request to finish, since your JavaScript is most likely running in a single thread. The JavaScript engine will wait for your script to finish before it tries to call the async callback that would change the variable you are watching in the loop. Hence, a deadlock occurs.
The only way to go is using callback functions throughout, as in your second option.
I agree with the other answer about using a callback if possible. If for some reason you need to block and wait for a response, don't use the looping approach; that's about the worst possible way to do it. The most straightforward option is to set async: false in your ajax call.
See http://api.jquery.com/jQuery.ajax/
async - Boolean, Default: true. By default, all requests are sent asynchronously (i.e. this is set to true by default). If you need synchronous requests, set this option to false. Cross-domain requests and dataType: "jsonp" requests do not support synchronous operation. Note that synchronous requests may temporarily lock the browser, disabling any actions while the request is active.
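Applied to the setStatus method from the question, that would look roughly like this (note the caveat quoted above about locking the browser):
setStatus: function () {
    // same webservice call as in the question, but blocking
    $.ajax({
        url: "someWebservice",
        async: false, // execution waits here until the response arrives
        success: function (results) {
            someScopedVariable = results;
        }
    });
}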
I'm trying to figure out the best way to get my functions executing in the correct order.
I have 3 functions
function 1 - squirts OPTIONs into a SELECT via JSON and marks them as selected
function 2 - squirts OPTIONS into a 2nd SELECT and marks them as selected
function 3 - gets the values from the above SELECTs along with some additional INPUT values, does an AJAX GET resulting in JSON data, which is read and populates a table.
With jQuery onload, I execute:
function1();
function2();
function3();
I'm finding function3 is executing before the SELECTs have been populated with OPTIONS and hence the table has no results, because the values sent in the GET were blank.
I know this is probably a very simple problem and that there are probably a dozen ways to accomplish this, but basically I need the best way to code this so that function3 only runs if function1 and 2 are complete.
I've come into JavaScript via the back door, having learnt the basics of jQuery first!
Thanks for your assistance.
Javascript executes synchronously, which means that function3 must wait for function2 to complete, which must wait for function1 to complete before executing.
The exception is when you run code that is asynchronous, like a setTimeout, setInterval or an asynchronous AJAX request.
Any subsequent code that relies on the completion of such asynchronous code needs to be called in such a manner that it doesn't execute until the asynchronous code has completed.
In the case of the setTimeout, you could just place the next function call at the end of the function you're passing to the setTimeout.
In the case of an AJAX call, you can place the next function call in a callback that fires upon a completed request.
If you don't want the execution of the subsequent function to occur every time, you can modify your functions to accept a function argument that gets called at the end of the asynchronous code.
Something like:
function function1(fn) {
    setTimeout(function () {
        // your code

        // Call the function parameter if it exists
        if (fn) {
            fn();
        }
    }, 200);
}

function function2() {
    // some code that must wait for function1
}
onload:
// Call function1 and pass function2 as an argument
function1(function2);

// ...or call function1 without the argument
function1();

// ...or call function2 independently of function1
function2();
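The same idea with an AJAX call instead of setTimeout might look roughly like this (the URL is a placeholder):
function function1(fn) {
    $.getJSON("/options1.json", function (data) {
        // populate the first SELECT from data here

        // then hand control to the next step, if one was supplied
        if (fn) {
            fn();
        }
    });
}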
I recommend you use a Promises library. You can hack simple solutions like other answers suggest, but as your application grows, you'll find you are doing more and more of these hacks. Promises are intended to solve these kinds of problems when dealing with asynchronous calls.
The CommonJS project has several Promises proposals which you should check out. Here is a question I asked on SO about Promises a while back with links to other solutions. Learn more about Promises in this Douglas Crockford video. The whole video is good, but skip to just past half way for promises.
I'm using the FuturesJS library currently as it suits my needs. But there are advantages to other implementations as well. It allows you to do sequences very easily:
// Initialize Application
Futures.sequence(function (next) {
    // First load the UI description document
    loadUI(next); // next() is called inside loadUI
})
.then(function (next) {
    // Then load all templates specified in the description
    loadTemplates(next); // next() is called inside loadTemplates
})
.then(function (next) {
    // Then initialize all templates specified in the description
    initTemplates();
});
Even more powerful is when you need to join async events together and do another action when all of the other async events have completed. Here's an example (untested) that will load a bunch of HTML files and then perform an action only once ALL of them have completed loading:
var path = "/templates/",
    templates = ["one.html", "two.html", "three.html"],
    promises = [];

$.each(templates, function (i, name) {
    promises[i] = Futures.promise();
    var $container = $("<div>");
    $container.load(path + name, function (response, status, xhr) {
        promises[i].fullfill();
    });
});

Futures.join(promises, {timeout: 10000}) // Fail if promises not completed in 10 seconds
    .when(function (p_arr) {
        console.log("All templates loaded");
    })
    .fail(function (p_arr) {
        console.log("Error loading templates");
    });
This might be overkill for your application. But if the application is growing in complexity, using promises will help you in the long run.
I hope this helps!
Invoke function2 inside of function1, and function3 inside of function2.
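In other words, something roughly like this (a sketch; the real bodies would be the OPTION-populating code from the question):
function function1() {
    // ...populate the first SELECT...
    function2();
}

function function2() {
    // ...populate the second SELECT...
    function3();
}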
It's not clear why f1 and f2 are executing before f3.
Also, are you using the preferred $(document).ready() or some variation of onload?
It might be helpful if you provide a reproducible test case.
fun3() will only run after both are ready. It might run twice. You can fix this with a lock inside fun3(); you would need a singleton to guarantee it works correctly.
var select1ready = false, select2ready = false;

function fun1()
{
    // do stuff
    select1ready = true;
    fun3();
}

function fun2()
{
    // do stuff
    select2ready = true;
    fun3();
}

function fun3()
{
    if (select1ready && select2ready)
    {
        // both selects are ready; safe to do the real work here
    }
}

fun1();
fun2();