JavaScript: how to make sure jQuery .append is displayed

I have the following JavaScript function, which uses .append from jQuery. However, it only works if I add alert() at the beginning of the function. I have found out that the reason for this behavior is the asynchronous way AJAX works.
How can I make sure that this function displays the HTML as intended?
this.printJSON = function(id) {
    //alert(id);
    $(id).append('<button id="#store">store</button>');
    for (key in params) {
        $(id).append('<p '...'</p>');
    }
};
my whole class, which is called this way:
params.parseJSON();
params.printJSON("#showdata");
function Parameters(parametersFile) {
    //private stuff
    var paramFile = parametersFile;
    var params = {};
    //public stuff
    this.parseJSON = function() {
        $.getJSON('inputFileParametersJSON.txt', function(json) {
            for (var param in json) {
                for (var key in json[param]) {
                    if (json[param].hasOwnProperty(key)) {
                        params[key] = {
                            filterIdentifier : json[param][key].filterIdentifier,
                            paramIdentifier : json[param][key].paramIdentifier,
                            param : json[param][key].param
                        };
                    }
                }
            }
        });
    };
    this.printJSON = function(id) {
        alert("");
        $(id).append('<button id="#store">store</button>');
        for (key in params) {
            $(id).append('<p id="' + key + '"> filterIdentifier: ' + params[key].filterIdentifier + '<br /> paramIdentifier: ' + params[key].paramIdentifier + '<br /> param= <input type="text" id="' + key + '" name="param" value="' + params[key].param + '"/></p>');
            //alert(params[key].filterIdentifier);
        }
    };
}

It looks like you may need to re-evaluate the way your functions are being called. Do you actually need to call them separately? I'd rename parseJSON to something like getJSON:
function Parameters(parametersFile) {
    // ...
    var printJSON = function(id) {
        $(id).append('<button id="#store">store</button>');
        for (key in params) {
            // ...
        }
    };
    //public stuff
    this.getJSON = function(id) {
        $.getJSON('inputFileParametersJSON.txt', function(json) {
            // process results ...
            printJSON(id);
        });
    };
}
This way you can simply call foo.getJSON('#someid') and it will not append until the request has been processed.

As jcm has said, you should be calling printJSON() from the response handler to enable it to work once the results of the request have been used to populate params.
Here be monsters
If you really need to wait for the result of an ajax request and can't use the result in a response handler (which is almost never the case), you can set async to false (see http://api.jquery.com/jQuery.ajax/); but since JS is executed in a single thread, this will halt execution of the JS until the request is completed.
Since you are using $.getJSON(), you would need to use $.ajaxSetup() (http://api.jquery.com/jQuery.ajaxSetup/) to change the behaviour of the ajax call.
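A rough sketch of what that would look like with the poster's Parameters object, purely for illustration; synchronous requests block the UI and are best avoided:
// force synchronous requests globally, run the calls, then restore the default
$.ajaxSetup({ async: false });
params.parseJSON();            // the $.getJSON inside now blocks until the response arrives
params.printJSON("#showdata"); // params is populated by the time this runs
$.ajaxSetup({ async: true });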

You must make sure that .printJSON() is called after .parseJSON() finishes parsing the data. This can be done by calling .printJSON() directly (see jcm's answer) or by adding support for a callback that gets executed when .parseJSON() is done.
function Parameters(parametersFile) {
    var that = this;
    // ...
    this.parseJSON = function(callback) {
        $.getJSON('inputFileParametersJSON.txt', function(json) {
            for (var param in json) {
                for (var key in json[param]) {
                    // ...
                }
            }
            callback.call(that);
        });
    };
    this.printJSON = function(id) {
        // ...
    };
}
// ...
parameters.parseJSON(function() {
    // ...
    // you can use "this" as if you were in Parameters
    this.printJSON(someId);
    // ...
});

Related

Return an object from a function that has nested ajax calls

I would like to write a JavaScript function that returns information about YouTube videos; to be more specific, I would like to get the ID and the length of the videos returned by a search, as a JSON object. So I took a look at the YouTube API and came up with this solution:
function getYoutubeDurationMap( query ){
    var youtubeSearchReq = "https://gdata.youtube.com/feeds/api/videos?q="+ query +
        "&max-results=20&duration=long&category=film&alt=json&v=2";
    var youtubeMap = [];
    $.getJSON(youtubeSearchReq, function(youtubeResult){
        var youtubeVideoDetailReq = "https://gdata.youtube.com/feeds/api/videos/";
        for(var i = 0; i < youtubeResult.feed.entry.length; i++){
            var youtubeVideoId = youtubeResult.feed.entry[i].id.$t.substring(27);
            $.getJSON(youtubeVideoDetailReq + youtubeVideoId + "?alt=json&v=2", function(videoDetails){
                youtubeMap.push({id: videoDetails.entry.id.$t.substring(27), runtime: videoDetails.entry.media$group.media$content[0].duration});
            });
        }
    });
    return youtubeMap;
}
The logic is fine, but as many of you will already have understood, because the AJAX calls are asynchronous I get an empty array when I call this function. Is there any way to get the complete object? Should I use a Deferred object? Thanks for your answers.
Yes, you should use deferred objects.
The simplest approach here is to create an array into which you can store the jqXHR result of your inner $.getJSON() calls.
var def = [];
for (var i = 0; ...) {
def[i] = $.getJSON(...).done(function(videoDetails) {
... // extract and store in youtubeMap
});
}
and then at the end of the whole function, use $.when to create a new promise that will be resolved only when all of the inner calls have finished:
return $.when.apply($, def).then(function() {
return youtubeMap;
});
and then use .done to handle the result from your function:
getYoutubeDurationMap(query).done(function(map) {
// map contains your results
});
See http://jsfiddle.net/alnitak/8XQ4H/ for a demonstration, using this YouTube API, of how deferred objects allow you to completely separate the AJAX calls from the subsequent data processing for your "duration search".
The code is a little long, but reproduced here too. Whilst it is longer than you might expect, note that the generic functions here are now reusable for any calls you might want to make to the YouTube API.
// generic search - some of the fields could be parameterised
function youtubeSearch(query) {
var url = 'https://gdata.youtube.com/feeds/api/videos';
return $.getJSON(url, {
q: query,
'max-results': 20,
duration: 'long', category: 'film', // parameters?
alt: 'json', v: 2
});
}
// get details for one YouTube vid
function youtubeDetails(id) {
var url = 'https://gdata.youtube.com/feeds/api/videos/' + id;
return $.getJSON(url, {
alt: 'json', v: 2
});
}
// get the details for *all* the vids returned by a search
function youtubeResultDetails(result) {
var details = [];
var def = result.feed.entry.map(function(entry, i) {
var id = entry.id.$t.substring(27);
return youtubeDetails(id).done(function(data) {
details[i] = data;
});
});
return $.when.apply($, def).then(function() {
return details;
});
}
// use deferred composition to do a search and then get all details
function youtubeSearchDetails(query) {
return youtubeSearch(query).then(youtubeResultDetails);
}
// this code (and _only_ this code) specific to your requirement to
// return an array of {id, duration}
function youtubeDetailsToDurationMap(details) {
return details.map(function(detail) {
return {
id: detail.entry.id.$t.substring(27),
duration: detail.entry.media$group.media$content[0].duration
}
});
}
// and calling it all together
youtubeSearchDetails("after earth").then(youtubeDetailsToDurationMap).done(function(map) {
// use map[i].id and .duration
});
As you have discovered, you can't return youtubeMap directly as it's not yet populated at the point of return. But you can return a Promise of a fully populated youtubeMap, which can be acted on with eg .done(), .fail() or .then().
function getYoutubeDurationMap(query) {
    var youtubeSearchReq = "https://gdata.youtube.com/feeds/api/videos?q=" + query + "&max-results=20&duration=long&category=film&alt=json&v=2";
    var youtubeVideoDetailReq = "https://gdata.youtube.com/feeds/api/videos/";
    var youtubeMap = [];
    var dfrd = $.Deferred();
    var p = $.getJSON(youtubeSearchReq).done(function(youtubeResult) {
        $.each(youtubeResult.feed.entry, function(i, entry) {
            var youtubeVideoId = entry.id.$t.substring(27);
            //Build a .then() chain to perform sequential queries
            p = p.then(function() {
                return $.getJSON(youtubeVideoDetailReq + youtubeVideoId + "?alt=json&v=2").done(function(videoDetails) {
                    youtubeMap.push({
                        id: videoDetails.entry.id.$t.substring(27),
                        runtime: videoDetails.entry.media$group.media$content[0].duration
                    });
                });
            });
        });
        //Add a terminal .then() to resolve dfrd when all video queries are complete.
        p.then(function() {
            dfrd.resolve(query, youtubeMap);
        });
    });
    return dfrd.promise();
}
And the call to getYoutubeDurationMap() would be of the following form:
getYoutubeDurationMap("....").done(function(query, map) {
alert("Query: " + query + "\nYouTube videos found: " + map.length);
});
Notes:
In practice, you would probably loop through map and display the .id and .runtime data.
Sequential queries are preferable to parallel queries, as the sequential approach is kinder to both client and server and more likely to succeed.
Another valid approach would be to return an array of separate promises (one per video) and to respond to completion with $.when.apply(...); however, the required data would be more awkward to extract (see the sketch below).
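For what it's worth, here is a rough sketch of that parallel alternative, written as a replacement for the body of the .done(function(youtubeResult) {...}) handler above; it reuses youtubeVideoDetailReq, youtubeMap, dfrd and query from that code. The extra indexing is needed because $.when passes each response through as a (data, textStatus, jqXHR) argument triple:
var detailPromises = youtubeResult.feed.entry.map(function(entry) {
    var id = entry.id.$t.substring(27);
    return $.getJSON(youtubeVideoDetailReq + id + "?alt=json&v=2");
});
$.when.apply($, detailPromises).done(function() {
    // each argument is the [data, textStatus, jqXHR] array of one request
    $.each(arguments, function(i, args) {
        var videoDetails = args[0];
        youtubeMap.push({
            id: videoDetails.entry.id.$t.substring(27),
            runtime: videoDetails.entry.media$group.media$content[0].duration
        });
    });
    dfrd.resolve(query, youtubeMap);
});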

jQuery deferred from callback of ajax calls

I'm trying to write a procedure that does something after two objects are returned from the callbacks of an AJAX function.
I know the classic example of using jQuery's when():
$.when($.get("http://localhost:3000/url1"),
       $.get("http://localhost:3000/url2")).done(/* do something */);
But in my case, I don't want to trigger the when on the execution of the ajax function, I want the when to trigger from the callback from the execution of the ajax function. Something like:
$.get("http://localhost:3000/url1", function(data){
function(){
//do something with the data, and return myobj1;
}
});
$.get("http://localhost:3000/url2", function(data){
function(){
//do something with the data, and return myobj2;
}
});
$.when(obj1, obj2).done(function(){
//do something with these 2 objects
});
But of course, that doesn't work. Ideas?
That actually should work. jQuery.when() takes multiple arguments and fires once they have all completed, passing each result's arguments through as an array:
var req1 = $.get("http://localhost:3000/url1");
var req2 = $.get("http://localhost:3000/url2");
$.when(req1, req2).done(function(res1, res2) {
//do something with these 2 objects
});
If you don't want to handle the requests together you can create your own deferreds and use those:
var deferred1 = $.Deferred(),
deferred2 = $.Deferred();
$.get("http://localhost:3000/url1", function(data){
function(){
//do something with the data, and return myobj1;
deferred1.resolve(myobj1);
}
});
$.get("http://localhost:3000/url2", function(data){
function(){
//do something with the data, and return myobj2;
deferred2.resolve(myobj2);
}
});
$.when(deferred1, deferred2).done(function(){
//do something with these 2 objects
});
or you can manage the control flow yourself:
$(function(){ $('body').addClass('doc-ready'); });
var thingsToLoad = ['blabla.js', 'blublu.js', 'weee.js'];
var launch = function(){
    // do whatever you want to do after loading is complete
    // this will be invoked after dom ready.
    // objectCollection will have everything you loaded.
    // and you can wrap your js files in functions, and execute whenever you want.
};
var loadTester = (function() {
    var loadCounter = 0,
        loadEnds = thingsToLoad.length; // total number of items that need to be loaded
    return function() {
        loadCounter += 1;
        if (loadCounter === loadEnds) {
            if ($('body').hasClass('doc-ready')) {
                launch();
            } else {
                /* if body doesn't have the ready class name, attach a ready event and launch the application */
                $(function() {
                    launch();
                });
            }
        }
    };
}());
$.each(thingsToLoad, function(i) {
    $.ajax({
        url : thingsToLoad[i],
        mimeType : 'application/javascript',
        dataType : 'script',
        success : function(data) {
            loadTester();
        }
    });
});
Add your files to the thingsToLoad array; it is iterated over at the end, and each successful load invokes loadTester.
loadTester compares the number of loaded files against the length of thingsToLoad; when they match and the DOM is in a ready state, it calls launch().
If you're just loading HTML or text files, you can pass the data from the ajax success handler into loadTester, accumulate it there (in a private variable alongside loadCounter and loadEnds), and pass the accumulated array or object to launch(), as sketched below.
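A hedged sketch of that accumulating variant (the loaded array and the data parameter are illustrative additions to the code above):
var loadTester = (function() {
    var loadCounter = 0,
        loadEnds = thingsToLoad.length,
        loaded = [];                    // accumulated responses
    return function(data) {
        loaded.push(data);              // whatever the ajax success handler passes in
        loadCounter += 1;
        if (loadCounter === loadEnds) {
            $(function() {              // waits for dom ready if it isn't ready yet
                launch(loaded);         // hand the accumulated responses to launch()
            });
        }
    };
}());
// and in the $.ajax options: success: function(data) { loadTester(data); }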

Success handling in custom javascript function

If I make an ajax call, I can add success handling. I want to add similar logic to my custom functions.
I have 6-10 custom functions that MUST be run sequentially or independently. They don't typically run independently so I have them daisy-chained now by calling the next function at the end of the previous but that is messy to read and does not allow for separate execution.
I would love to have something like this:
function runall(){
runfirst().success(
runsecond().success(
runthird()
))
}
I have had other situations were I would like to add .success() handling to a custom function, but this situation made it more important.
If there is another way to force 6-10 functions to run synchronously, that could solve this problem, but I would also like to know how to add success handling to my custom functions.
I tried the following based on #lanzz's suggestion:
I added .then() to my function(s):
$bomImport.updateGridRow(rowId).then(function () {
$bomImport.toggleSubGrid(rowId, false);
});
var $bomImport = {
updateGridRow: function (rowId) {
$('#' + rowId + ' td[aria-describedby="bomImport_rev"]').html($("#mxRevTxt").val());
$('#' + rowId + ' td[aria-describedby="bomImport_itemno"]').html($("#itemNoTxt").val());
$('#' + rowId + ' td[aria-describedby="bomImport_used"]').html($("#usedTxt").val());
$('#' + rowId + ' td[aria-describedby="bomImport_partSource"]').html($("#partSourceTxt").val());
$('#' + rowId + ' td[aria-describedby="bomImport_partClass"]').html($("#partClassTxt").val());
$('#' + rowId + ' td[aria-describedby="bomImport_partType"]').html($("#partTypeTxt").val());
$('#' + rowId + ' td[aria-describedby="bomImport_partno"]').html($("#mxPnTxt").val());
$('#' + rowId + ' td[aria-describedby="bomImport_descript"]').html($("#descTxt").val());
$('#' + rowId + ' td[aria-describedby="bomImport_qty"]').html($("#qtyTxt").val());
$('#' + rowId + ' td[aria-describedby="bomImport_custPartNo"]').html($("#custPartNoTxt").val());
$('#' + rowId + ' td[aria-describedby="bomImport_crev"]').html($("#custRevTxt").val());
$('#' + rowId + ' td[aria-describedby="bomImport_u_of_m"]').html($("#uomTxt").val());
$('#' + rowId + ' td[aria-describedby="bomImport_warehouse"]').html($("#warehouseTxt").val());
$('#' + rowId + ' td[aria-describedby="bomImport_standardCost"]').html($("#stdCostTxt").val());
$('#' + rowId + ' td[aria-describedby="bomImport_workCenter"]').html($("#wcTxt").val());
var defferred = new $.Deferred();
return defferred.promise();
}};
The code correctly goes to the end of updateGridRow, gives no errors, but never gets back to call the second function.
I also tried the following, as suggested by #Anand:
workSheetSaveExit(rowId, isNew).save().updateRow().toggle();
function workSheetSaveExit(){
this.queue = new Queue;
var self = this;
self.queue.flush(this);
}
workSheetSaveExit.prototype = {
save: function () {
this.queue.add(function (self) {
$bomImport.workSheetSave(rowId, isNew);
});
return this;
},
updateRow: function () {
this.queue.add(function (self) {
$bomImport.updateGridRow(rowId);
});
return this;
},
toggle: function () {
this.queue.add(function (self) {
$bomImport.toggleSubGrid(rowId, false);
});
return this;
}
};
Which didn't work.
Final Solution
For a great explanation of how to use deferred and make this work see here:
Using Deferred in jQuery
How to use Deferreds:
function somethingAsynchronous() {
var deferred = new $.Deferred();
// now, delay the resolution of the deferred:
setTimeout(function() {
deferred.resolve('foobar');
}, 2000);
return deferred.promise();
}
somethingAsynchronous().then(function(result) {
// result is "foobar", as provided by deferred.resolve() in somethingAsynchronous()
alert('after somethingAsynchronous(): ' + result);
});
// or, you can also use $.when() to wait on multiple deferreds:
$.when(somethingAsynchronous(), $.ajax({ something })).then(function() {
alert('after BOTH somethingAsynchronous() and $.ajax()');
});
If your functions simply make an AJAX request, you can just return the actual promise returned by $.ajax():
function doAjax() {
return $.ajax({ /* ajax options */ });
}
doAjax().then(function() {
alert('after doAjax()');
});
From what I can tell, you really just want a better way to organize these callbacks. You should use a FIFO array, or queue. Your runall should do the stacking for you and then execute the first function.
var RunQueue = function(queue){
this.init(queue);
}
var p = RunQueue.prototype = {};
p.queue = null;
p.init = function(queue){
this.queue = queue.slice(); //copy the array we will be changing it
// if this is not practical, keep an index
}
p.run = function(){
if(this.queue && this.queue.length) {
var first = this.queue[0];
this.queue.shift();
var runQueue = this;
first(function(){ /*success callback parameter*/
runQueue.run();
});
}
}
Usage:
var queue = [runFirst, runSecond, runThird, ...]
(new RunQueue(queue)).run();
If you really want to get fancy (and you may need to), you could pass in objects in the array containing your parameters and have RunQueue append the success callback as the last parameter. You could even pass in the context to run the function in that object, then call apply (which takes the parameters as an array) on your method; a sketch of such a run() follows the example object below.
{
method: runFirst,
context: someObject,
parameters: [param1, param2, param3];
}
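A sketch of how run() could be adapted to that object form, building on the RunQueue prototype above (illustrative only):
p.run = function() {
    if (this.queue && this.queue.length) {
        var entry = this.queue.shift(),
            runQueue = this,
            next = function() { runQueue.run(); },           // success callback appended last
            args = (entry.parameters || []).concat(next);
        entry.method.apply(entry.context || null, args);     // apply takes the arguments as an array
    }
};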
If each of your functions returns a state/object, then you could probably add a prototype to each state/object, and you would be able to call the functions in a fluent-API way (method chaining), like this:
runfirst().runSecond().runThird()
and so on.
Lemme try to build a sample.
EDIT
See this, if it fits your design
EDIT 2
I did not realise you were talking about async method chaining.
There is a very good example here. It was discussed in this Stack Overflow thread.

Titanium HTTPClient returns too 'fast'

I have the following function:
getTasks: function()
{
var taskRequest = Titanium.Network.createHTTPClient();
var api_url = 'http://myawesomeapi.heroku.com/users/' + Ti.App.Properties.getString("userID") + '/tasks';
var tasks = [];
taskRequest.onload = function() {
var response = JSON.parse(this.responseText),
len = response.length,
i = 0,
t;
for(; i < len; i++)
{
task = response[i];
var newTask = {};
newTask.rowID = i;
newTask.title = task.title;
newTask.description = task.description;
newTask.id = task.id;
newTask.hasChild = true;
tasks.push(newTask);
}
alert(tasks);
}
taskRequest.open('GET', api_url, false);
taskRequest.setRequestHeader('Content-Type', 'application/json');
taskRequest.send();
alert(tasks);
// return tasks;
}
This function is in my controller; I call it in my view when I need to load the data in. However, I wish to return this data so I can assign it to a variable in the view.
What happens now is that it returns an empty array. The last (bottom) alert seems to run too soon and shows an empty array, while the alert that fires only after the onload function is done contains what I need.
Now my obvious question: how can I get my function to return the array with the data, instead of without?
Putting a timer on it hardly seems like the right decision. Thanks!
"However, I wish to return this data so I can assign it to a variable in the view."
Aside from making the AJAX request synchronous (which you probably don't want), there isn't any way to return the data.
Whatever code relies on the response needs to be called from within the response handler.
Since functions can be passed around, you could have your getTasks method receive a callback function that is invoked and will receive the tasks Array.
getTasks: function( callback ) // receive a callback function
{
var taskRequest = Titanium.Network.createHTTPClient();
var api_url = 'http://myawesomeapi.heroku.com/users/' + Ti.App.Properties.getString("userID") + '/tasks';
taskRequest.onload = function() {
var tasks = [];
// code populating the tasks array
alert(tasks);
callback( tasks ); // invoke the callback
}
taskRequest.open('GET', api_url, false);
taskRequest.setRequestHeader('Content-Type', 'application/json');
taskRequest.send();
}
So you'd use it like this...
myObj.getTasks(function(tasks) {
alert('in the callback');
alert(tasks);
// Any and all code that relies on the response must be
// placed (or invoked from) inside here
some_other_function();
});
function some_other_function() {
// Some more logic that can't run until the tasks have been received.
// You could pass the tasks to this function if needed.
}
You are getting an empty alert because, when the bottom alert is executed, the server response is not yet available and the tasks array is empty.
When the server response comes, the tasks array is populated by the code you have in the onload handler, so you see the tasks in the second alert.

Parallel asynchronous Ajax requests using jQuery

I'd like to update a page based upon the results of multiple ajax/json requests. Using jQuery, I can "chain" the callbacks, like this very simple stripped down example:
$.getJSON("/values/1", function(data) {
// data = {value: 1}
var value_1 = data.value;
$.getJSON("/values/2", function(data) {
// data = {value: 42}
var value_2 = data.value;
var sum = value_1 + value_2;
$('#mynode').html(sum);
});
});
However, this results in the requests being made serially. I'd much rather a way to make the requests in parallel, and perform the page update after all are complete. Is there any way to do this?
jQuery's $.when() and .done() are exactly what you need:
$.when($.ajax("/page1.php"), $.ajax("/page2.php"))
.then(myFunc, myFailure);
Try this solution, which can support any specific number of parallel queries:
var done = 5; // number of total requests
var sum = 0;
/* Normal loops don't create a new scope */
$([1,2,3,4,5]).each(function() {
    var number = this;
    $.getJSON("/values/" + number, function(data) {
        sum += data.value;
        done -= 1;
        if (done == 0) $("#mynode").html(sum);
    });
});
Run multiple AJAX requests in parallel
When working with APIs, you sometimes need to issue multiple AJAX requests to different endpoints. Instead of waiting for one request to complete before issuing the next, you can speed things up by requesting the data in parallel, using jQuery's $.when() function:
JS
$.when($.get('1.json'), $.get('2.json')).then(function(r1, r2){
console.log(r1[0].message + " " + r2[0].message);
});
The callback function is executed when both of these GET requests finish successfully. $.when() takes the promises returned by two $.get() calls, and constructs a new promise object. The r1 and r2 arguments of the callback are arrays, whose first elements contain the server responses.
Here's my attempt at directly addressing your question
Basically, you just build up an AJAX call stack, execute them all, and a provided function is called upon completion of all the events, with the provided argument being an array of the results from all the supplied ajax requests.
Clearly this is early code - you could get more elaborate with this in terms of the flexibility.
<script type="text/javascript" src="http://jqueryjs.googlecode.com/files/jquery-1.3.2.min.js"></script>
<script type="text/javascript">
var ParallelAjaxExecuter = function( onComplete )
{
this.requests = [];
this.results = [];
this.onComplete = onComplete;
}
ParallelAjaxExecuter.prototype.addRequest = function( method, url, data, format )
{
this.requests.push( {
"method" : method
, "url" : url
, "data" : data
, "format" : format
, "completed" : false
} )
}
ParallelAjaxExecuter.prototype.dispatchAll = function()
{
var self = this;
$.each( self.requests, function( i, request )
{
request.method( request.url, request.data, function( r )
{
return function( data )
{
r.completed = true;
self.results.push( data );
self.checkAndComplete();
}
}( request ) )
} )
}
ParallelAjaxExecuter.prototype.allRequestsCompleted = function()
{
var i = 0;
while ( request = this.requests[i++] )
{
if ( request.completed === false )
{
return false;
}
}
return true;
};
ParallelAjaxExecuter.prototype.checkAndComplete = function()
{
if ( this.allRequestsCompleted() )
{
this.onComplete( this.results );
}
}
var pe = new ParallelAjaxExecuter( function( results )
{
alert( eval( results.join( '+' ) ) );
} );
pe.addRequest( $.get, 'test.php', {n:1}, 'text' );
pe.addRequest( $.get, 'test.php', {n:2}, 'text' );
pe.addRequest( $.get, 'test.php', {n:3}, 'text' );
pe.addRequest( $.get, 'test.php', {n:4}, 'text' );
pe.dispatchAll();
</script>
here's test.php
<?php
echo pow( $_GET['n'], 2 );
?>
Update: Per the answer given by Yair Leviel, this answer is obsolete. Use a promise library, like jQuery.when() or Q.js.
I created a general-purpose solution as a jQuery extension. It could use some fine tuning to make it more general, but it suited my needs. The advantage of this technique over the others in this post, as of the time of writing, was that any type of asynchronous processing with a callback can be used.
Note: I'd use Rx extensions for JavaScript instead of this if I thought my client would be okay with taking a dependency on yet-another-third-party-library :)
// jQuery extension for running multiple async methods in parallel
// and getting a callback with all results when all of them have completed.
//
// Each worker is a function that takes a callback as its only argument, and
// fires up an async process that calls this callback with its result.
//
// Example:
// $.parallel(
// function (callback) { $.get("form.htm", {}, callback, "html"); },
// function (callback) { $.post("data.aspx", {}, callback, "json"); },
// function (formHtml, dataJson) {
// // Handle success; each argument to this function is
// // the result of correlating ajax call above.
// }
// );
(function ($) {
$.parallel = function (anyNumberOfWorkers, allDoneCallback) {
var workers = [];
var workersCompleteCallback = null;
// To support any number of workers, use "arguments" variable to
// access function arguments rather than the names above.
var lastArgIndex = arguments.length - 1;
$.each(arguments, function (index) {
if (index == lastArgIndex) {
workersCompleteCallback = this;
} else {
workers.push({ fn: this, done: false, result: null });
}
});
// Short circuit this edge case
if (workers.length == 0) {
workersCompleteCallback();
return;
}
// Fire off each worker process, asking it to report back to onWorkerDone.
$.each(workers, function (workerIndex) {
var worker = this;
var callback = function () { onWorkerDone(worker, arguments); };
worker.fn(callback);
});
// Store results and update status as each item completes.
// The [0] on workerResult below assumes the client only needs the first parameter
// passed into the return callback. This simplifies the handling in allDoneCallback,
// but may need to be removed if you need access to all parameters of the result.
// For example, $.post calls back with success(data, textStatus, XMLHttpRequest). If
// you need textStatus or XMLHttpRequest then pull off the [0] below.
function onWorkerDone(worker, workerResult) {
worker.done = true;
worker.result = workerResult[0]; // this is the [0] ref'd above.
var allResults = [];
for (var i = 0; i < workers.length; i++) {
if (!workers[i].done) return;
else allResults.push(workers[i].result);
}
workersCompleteCallback.apply(this, allResults);
}
};
})(jQuery);
UPDATE And another two years later, this looks insane because the accepted answer has changed to something much better! (Though still not as good as Yair Leviel's answer using jQuery's when)
18 months later, I just hit something similar. I have a refresh button, and I want the old content to fadeOut and then the new content to fadeIn. But I also need to get the new content. The fadeOut and the get are asynchronous, but it would be a waste of time to run them serially.
What I do is really the same as the accepted answer, except in the form of a reusable function. Its primary virtue is that it is much shorter than the other suggestions here.
var parallel = function(actions, finished) {
var finishedCount = 0;
var results = [];
$.each(actions, function(i, action) {
action(function(result) {
results[i] = result;
finishedCount++;
if (finishedCount == actions.length) {
finished(results);
}
});
});
};
You pass it an array of functions to run in parallel. Each function should accept another function to which it passes its result (if any). parallel will supply that function.
You also pass it a function to be called when all the operations have completed. This will receive an array with all the results in. So my example was:
refreshButton.click(function() {
parallel([
function(f) {
contentDiv.fadeOut(f);
},
function(f) {
portlet.content(f);
},
],
function(results) {
contentDiv.children().remove();
contentDiv.append(results[1]);
contentDiv.fadeIn();
});
});
So when my refresh button is clicked, I launch jQuery's fadeOut effect and also my own portlet.content function (which does an async get, builds a new bit of content and passes it on), and then when both are complete I remove the old content, append the result of the second function (which is in results[1]) and fadeIn the new content.
As fadeOut doesn't pass anything to its completion function, results[0] presumably contains undefined, so I ignore it. But if you had three operations with useful results, they would each slot into the results array, in the same order you passed the functions.
you could do something like this
var allData = []
$.getJSON("/values/1", function(data) {
allData.push(data);
if(allData.length == 2){
processData(allData) // where process data processes all the data
}
});
$.getJSON("/values/2", function(data) {
allData.push(data);
if(allData.length == 2){
processData(allData) // where process data processes all the data
}
});
var processData = function(data){
var sum = data[0] + data[1]
$('#mynode').html(sum);
}
Here's an implementation using mbostock/queue:
queue()
.defer(function(callback) {
$.post('/echo/json/', {json: JSON.stringify({value: 1}), delay: 1}, function(data) {
callback(null, data.value);
});
})
.defer(function(callback) {
$.post('/echo/json/', {json: JSON.stringify({value: 3}), delay: 2}, function(data) {
callback(null, data.value);
});
})
.awaitAll(function(err, results) {
var result = results.reduce(function(acc, value) {
return acc + value;
}, 0);
console.log(result);
});
The associated fiddle: http://jsfiddle.net/MdbW2/
With the following extension of jQuery (it could also be written as a standalone function) you can do this:
$.whenAll({
val1: $.getJSON('/values/1'),
val2: $.getJSON('/values/2')
})
.done(function (results) {
var sum = results.val1.value + results.val2.value;
$('#mynode').html(sum);
});
The JQuery (1.x) extension whenAll():
$.whenAll = function (deferreds) {
function isPromise(fn) {
return fn && typeof fn.then === 'function' &&
String($.Deferred().then) === String(fn.then);
}
var d = $.Deferred(),
keys = Object.keys(deferreds),
args = keys.map(function (k) {
return $.Deferred(function (d) {
var fn = deferreds[k];
(isPromise(fn) ? fn : $.Deferred(fn))
.done(d.resolve)
.fail(function (err) { d.reject(err, k); })
;
});
});
$.when.apply(this, args)
.done(function () {
var resObj = {},
resArgs = Array.prototype.slice.call(arguments);
resArgs.forEach(function (v, i) { resObj[keys[i]] = v; });
d.resolve(resObj);
})
.fail(d.reject);
return d;
};
See jsbin example:
http://jsbin.com/nuxuciwabu/edit?js,console
The most professional solution for me would be to use async.js and Array.reduce, like so:
async.map([1, 2, 3, 4, 5], function (number, callback) {
$.getJSON("/values/" + number, function (data) {
callback(null, data.value);
});
}, function (err, results) {
$("#mynode").html(results.reduce(function(previousValue, currentValue) {
return previousValue + currentValue;
}));
});
If the result of one request depends on the other, you can't make them parallel.
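If one request only needs data from the other, a .then() chain expresses that dependency; here is a minimal sketch reusing the /values/ endpoints from the question (the dependent URL is illustrative):
// the second URL depends on the first response, so the calls run sequentially
$.getJSON("/values/1").then(function(data1) {
    return $.getJSON("/values/" + data1.value);
}).done(function(data2) {
    $('#mynode').html(data2.value);
});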
Building on Yair's answer.
You can define the ajax promises dynamically.
var start = 1; // starting value
var len = 2; // no. of requests
var promises = (new Array(len)).fill().map(function() {
return $.ajax("/values/" + i++);
});
$.when.apply($, promises)
.then(myFunc, myFailure);
Suppose you have an array of file names.
var templateNameArray = ["test.html", "test2.html", "test3.html"];
var htmlTemplatesLoadStateMap = {};
var deferreds = [];
for (var i = 0; i < templateNameArray.length; i++) {
    if (!htmlTemplatesLoadStateMap[templateNameArray[i]]) {
        deferreds.push($.get("./Content/templates/" + templateNameArray[i],
            function (response, status, xhr) {
                if (status == "error") { }
                else {
                    $("body").append(response);
                }
            }));
        htmlTemplatesLoadStateMap[templateNameArray[i]] = true;
    }
}
$.when.apply($, deferreds).always(function() {
    yourfunctionTobeExecuted(yourPayload);
});
I needed multiple, parallel ajax calls, and the jQuery $.when syntax wasn't amenable to the full $.ajax format I am used to working with. So I just created a setInterval timer to periodically check whether each of the ajax calls had returned. Once they had all returned, I could proceed from there (roughly as sketched below).
I read there may be browser limitations on how many simultaneous ajax calls you can have going at once (2?), but $.ajax is inherently asynchronous, so making the ajax calls one by one still results in parallel execution (within the browser's possible limitation).
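A rough sketch of that polling approach; the flag and variable names are illustrative, and the endpoints are borrowed from the question:
var done1 = false, done2 = false, result1, result2;
$.ajax({ url: "/values/1", success: function(d) { result1 = d; done1 = true; } });
$.ajax({ url: "/values/2", success: function(d) { result2 = d; done2 = true; } });
var poll = setInterval(function() {
    if (done1 && done2) {
        clearInterval(poll);
        // both calls have returned; proceed
        $('#mynode').html(result1.value + result2.value);
    }
}, 100);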
