Execute function in order in javascript? - javascript

I have this piece of code written in JavaScript:
preDefineListName = ['Applied', 'Test taken', 'SWS Interview', 'Candidate', 'Rejected'];
for (var i = 0; i < preDefineListName.length; i++) {
Trello.addList(data.id, preDefineListName[i]);
};
Trello.addList = function (trelloBoardId, listName) {
return $http.post('https://api.trello.com/1/lists', {
idBoard: trelloBoardId,
name: listName,
key: trelloKey,
token: trelloToken
});
};
Now, the Trello.addList call in the for loop creates a list on trello.com for each of the names in preDefineListName. The problem is that the lists do not appear in the order in which they were passed.
What should I do to make them appear in the proper order? I have to call the function in the loop, so I can't change that part.

Your Trello.addList returns a promise and is asynchronous (it executes an HTTP call), so a plain for loop (or a .forEach over preDefineListName) starts all the requests without waiting for any of them to finish.
You can use .map instead, which lets you collect the promises returned by the Trello.addList calls, and then use $q.all to wait until all addList calls are done:
$q.all(preDefineListName.map(function(name) {
return Trello.addList(data.id, name);
})).then(function success(results) {
// do something with the results
}, function error(reasons) {
// handle errors here
});
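Note that $q.all only waits for all the requests to finish; the .map call still starts them at the same time. If it turns out the lists also need to be created strictly one after another on Trello's side, a sketch of chaining the calls with reduce (using the same Trello.addList and $q as above) could look like this:
preDefineListName.reduce(function (prev, name) {
    // wait for the previous addList call before starting the next one
    return prev.then(function () {
        return Trello.addList(data.id, name);
    });
}, $q.when()).then(function () {
    // all lists created, in the original order
});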

Use promises and recursion. It looks a bit hacky, but it will make the calls run sequentially:
preDefineListName = ['Applied', 'Test taken', 'SWS Interview', 'Candidate', 'Rejected'];
Trello.addList(data.id, preDefineListName); // Initiate list adding (once Trello.addList below is defined)
Trello.addList = function(trelloBoardId, listNames) {
if(!listNames.length) {
return;
}
var listName = listNames[0];
listNames.splice(0, 1); // Remove first element from array
$http.post('https://api.trello.com/1/lists', {
idBoard: trelloBoardId,
name: listName,
key: trelloKey,
token: trelloToken
}).then(function(response) {
Trello.addList(trelloBoardId, listNames); // Call the function again after this request has finished.
});
}
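Note that splice empties the array that is passed in, so if preDefineListName is still needed afterwards, hand the function a copy instead:
Trello.addList(data.id, preDefineListName.slice()); // pass a copy so the original array survives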

Related

Async call from function, calling API

I'm still learning about async, and I can't get my asynchronous code to work. I have a function that calls an API.
I call this function from another function, so that I can manipulate and merge the data later.
But my code always moves on without waiting for the first function's answer.
Async / promise problem
var exchangeListing = {
"exchange":
[
{
"name": "Kraken",
"pair": ['XBTUSD','XETHUSD']
},{
"name": "Coinbase",
"pair": ["BTC-USD","ETH-USDC"]
},{
"name":"Bittrex",
"pair": ["BTC-USD","BTC-ETH"]
}
]
}
function BXgetPrice(currencies) {
var ret;
bittrex.getmarketsummary( { market : currencies}, function( data, err ) {
if(data!=null) {
ret = {
message: {
type: 'success',
data: {
last:data.result[0].Last,
volume:data.result[0].Volume,
date:data.result[0].TimeStamp,
}
}
};
return ret;
} else {
return;
}
});
return ret;
}
//Loop to get all data from all exchanges we need, and merge in a JSON, to send to HTML page
function exchangeListe(exchangeListing) {
var exchangeListing = JSON.stringify(exchangeListing);
var exchangeListing = JSON.parse(exchangeListing);
console.log(exchangeListing);
for(var i = 0, len = exchangeListing.exchange.length; i < len; i++) {
var a = exchangeListing.exchange[i].name;
for(var j=0, l=exchangeListing.exchange[i].pair.length; j<l; j++) {
if(a=="Kraken") {
//add code to manipulate data
} else if(a=="Bittrex") {
console.log("bittrex");
BXgetPrice(exchangeListing.exchange[i].pair[j], function(data,err){
console.log("hi"+data);
//add code to manipulate data
}) ;
} else if(a=="Coinbase") {
//add code to manipulate data
}
}
}
}
In exchangeListe, I need the loop to go through every available exchange and every pair, get the data from the API call, and merge it all afterwards.
For now, when I launch exchangeListe(exchangeListing), I don't get the data from the BXgetPrice function. The data is empty.
I tried adding async to both functions and declaring the result as a const with await, but nothing helped; in that case I just get Promise {}.
Thanks for your help
This behavior is by design. An asynchronous call means that you start something (like sending an AJAX request) and do not wait for it; your program keeps going while the result has not yet been received. You expect your code to run in a synchronous manner, but it does not. You could make your code synchronous, but that would be a bad idea, because while you wait for synchronous responses your page might freeze, which is bad UX. The right way is either to call exchangeListe from your callback (the function you pass to the asynchronous function is the callback; it is executed when the async job is done), or to refactor your code and use promises instead, with equivalent results but more elegant code.
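For illustration, here is a minimal sketch of the promise-based refactor, reusing the names from the question and assuming the bittrex client keeps the same (data, err) callback signature used there:
// Wrap the callback-style API call in a Promise so callers can wait for it.
function BXgetPrice(currencies) {
    return new Promise(function (resolve, reject) {
        bittrex.getmarketsummary({ market: currencies }, function (data, err) {
            if (data != null) {
                resolve({
                    last: data.result[0].Last,
                    volume: data.result[0].Volume,
                    date: data.result[0].TimeStamp
                });
            } else {
                reject(err);
            }
        });
    });
}

// The caller then awaits each result instead of reading an empty variable.
async function exchangeListe(exchangeListing) {
    for (const exchange of exchangeListing.exchange) {
        if (exchange.name === "Bittrex") {
            for (const pair of exchange.pair) {
                const price = await BXgetPrice(pair); // waits here before the next pair
                console.log("hi", price);
            }
        }
    }
}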

Call function after two promises resolve in angularJS, one using result from other

In a controller function, I perform several operations:
Get a list of organizations with a promise
In the then of this promise, I loop through each of them to extract some data and populate some of my controller attributes.
One of these operations is to call another promise to gather all users attached to the organization, with a loop inside it to extract the name and other data.
When I have ALL of it, i.e. every organization has been parsed, and within them all users too, I must call a function to update my view.
I got it working by setting some flags (orgParsed and usersParsed), but I find it to be... a code shame.
I heard there might be a way of doing this by using $q to wait for the two promises (and the loops inside their "then") to resolve before calling my view function. But I struggle to apply this change, since the second promise uses the result of the first to get the organization ID.
Here is my current code:
this.getOrgData = function () {
return Service.getList().then(function (result) {
var orgCount = result.Objects.length;
var orgParsed = 0;
_.forEach(result.Objects, function (org) {
org.Users = [];
// Some logic here using 'org' data
Service.getUsers(org.Id, 0, 0).then(function (userResult) {
usersParsed = 0;
_.forEach(userResult.Objects, function (user) {
// Some Logic here using 'user.Name'
usersParsed++;
});
orgParsed++;
if (orgParsed === orgCount && usersParsed === userResult.Objects.length) {
self.sortMenuList(); // My view Function
}
});
});
$scope.$broadcast("getOrgData");
});
};
Do you see any way to trigger my self.sortMenuList() function only when I can be sure all users of every company have been parsed, in a more elegant/efficient/safe way?
Yes, that counting should definitely be replaced by $q.all, especially as you did not bother to handle any errors.
this.getOrgData = function () {
return Service.getList().then(function (result) {
$scope.$broadcast("getOrgData"); // not sure whether you want that here before the results from the loop
return $q.all(_.map(result.Objects, function (org) {
org.Users = [];
// Some logic here using 'org' data
return Service.getUsers(org.Id, 0, 0).then(function (userResult) {
_.forEach(userResult.Objects, function (user) {
// Some Logic here using 'user.Name'
});
});
}));
}).then(function() {
self.sortMenuList(); // My view Function;
})
};
The problem you describe sounds like you want to wait until a certain number of promises have all resolved, and then do something with the result. That's really easy when you use Promise.all():
this.getOrgData = function () {
return Service.getList().then(function (result) {
var promises = [];
_.forEach(result.Objects, function (org) {
org.Users = [];
// Some logic here using 'org' data
// Store the promise for this user in the promises array
promises.push(Service.getUsers(org.Id, 0, 0));
});
// userResults is an array of all the results of the promises, in the same order as the getUsers was called
Promise.all(promises).then(function (userResults) {
_.forEach(userResults, function(userResult) {
_.forEach(userResult.Objects, function (user) {
// Some Logic here using 'user.Name'
});
});
self.sortMenuList();
});
$scope.$broadcast("getOrgData");
});
};

Multiple ajax request in sequence using recursive function and execute callback function after all requests completed

I have a list of names separated by commas. I want to make a server request for each name, in sequence, and store the results in an array. It works when I know how many names are in the string.
See Here - This is working when I know the number of names
Now I want to make this code generic: if I add one name to that string, it should be handled automatically without adding any code for the extra ajax request.
See Here - This is what I've tried. It's not working as expected.
shoppingList = shoppingList.split(",");
var result = [];
function fetchData(shoppingItem)
{
var s1 = $.ajax('/items/'+shoppingItem);
s1.then(function(res) {
result.push(new Item(res.label,res.price));
console.log("works fine");
});
if(shoppingList.length == 0)
{
completeCallback(result);
}
else
{
fetchData(shoppingList.splice(0,1)[0]);
}
}
fetchData(shoppingList.splice(0,1)[0]);
Problem
I am not getting how to detect that all the promise objects have been resolved, so that I can call the callback function.
To make the ajax requests in sequence, you have to put the recursive call in the callback:
function fetchList(shoppingList, completeCallback) {
var result = [];
function fetchData() {
if (shoppingList.length == 0) {
completeCallback(result);
} else {
$.ajax('/items/'+shoppingList.shift()).then(function(res) {
result.push(new Item(res.label,res.price));
console.log("works fine");
fetchData();
// ^^^^^^^^^^^
});
}
}
fetchData();
}
or you actually use promises and do
function fetchList(shoppingList) {
return shoppingList.reduce(function(resultPromise, shoppingItem) {
return resultPromise.then(function(result) {
return $.ajax('/items/'+shoppingItem).then(function(res) {
result.push(new Item(res.label,res.price));
return result;
});
});
}, $.when([]));
}
(updated jsfiddle)
Notice that there is nothing in the requirements of the task that says the ajax requests have to be made sequentially. You could also let them run in parallel and wait for all of them to finish:
function fetchList(shoppingList) {
return $.when.apply($, shoppingList.map(function(shoppingItem) {
return $.ajax('/items/'+shoppingItem).then(function(res) {
return new Item(res.label,res.price);
});
})).then(function() {
return Array.prototype.slice.call(arguments);
})
}
(updated jsfiddle)
// global:
var pendingRequests = 0;
// after each ajax request:
pendingRequests++;
// inside the callback:
if (--pendingRequests == 0) {
// all requests have completed
}
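Applied to the shopping list from the question, that counting approach might look roughly like this (a sketch, not the code behind the fiddle link); note that the items can still be pushed into result out of order, because the responses arrive in whatever order the server answers:
var result = [];
var pendingRequests = 0;
shoppingList.forEach(function (shoppingItem) {
    pendingRequests++;
    $.ajax('/items/' + shoppingItem).then(function (res) {
        result.push(new Item(res.label, res.price));
        if (--pendingRequests === 0) {
            completeCallback(result); // every request has completed
        }
    });
});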
I have modified your code minimally to make it work - Click here.
Please note your last assertion will fail, as the item promises are not resolved in a linear manner, so the order of the items will change.

How to capture results from end of FOR loop with Nested/Dependent APIs calls in Node JS

This is my first JavaScript & Node project and I am stuck….
I am trying to call a REST API that returns a set of Post IDs... and based on the set of retrieved IDs I am trying to call another API that returns details for each ID from the first API. The code uses Facebook API provided by Facebook-NodeSDK.
The problem I am having is that the second API fires off in a for loop…. As I understand it, the for loop executes each request asynchronously…. I can see both queries executing, however I can't figure out how to capture the end of the second set of requests so that I can return the final result to the user…
Following is the code…
exports.getFeeds = function(req, res) {
var posts = [];
FB.setAccessToken('SOME TOKEN');
var resultLength = 0;
FB.api(
//ARG #1 FQL Statement
'fql', { q: 'SELECT post_id FROM stream WHERE filter_key = "others"' },
//ARG #2 passing argument as a anonymous function with parameter result
function (result)
{
if(!result || result.error) {
console.log(!result ? 'error occurred' : result.error);
return;
} //closing if handling error in this block
var feedObj
console.log(result.data);
console.log(result.data.length);
for (var i = 0; i < result.data.length; i++) {
(function(i) {
feedObj = {};
FB.api( result.data[ i].post_id, { fields: ['name', 'description', 'full_picture' ] },
// fbPost is data returned by query
function (fbPost) {
if(!fbPost || fbPost.error) {
console.log(!fbPost ? 'error occurred' : result.error);
return;
}
// else
feedObj=fbPost;
posts.push(feedObj);
});
})(i);
}// end for
}//CLOSE ARG#2 Function
);// close FB.api Function
NOTE: I need to call res.send(posts) and have tried to call it at several places, but I just can't get all the posts. I have removed the console statements from the above code, which showed that the data is being retrieved...
Thanks a lot for your help and attention....
If you just stick res.send almost anywhere in your code, it will be certain to get called before your posts have returned.
What you want to do in a case like this is to declare a counter variable outside your for loop and set it to zero, then increment it inside the for loop. Inside your inner callback (in your case, the one that is called once for each post), you decrement the counter and test for when it hits zero. Below I apply the technique to an edited version of your code (I didn't see what feedObj was actually doing, nor did I understand why you were using the immediately-invoked function, so I eliminated both - please let me know if I missed something there).
var posts = [];
FB.setAccessToken('SOME TOKEN');
var resultLength = 0;
FB.api(
//ARG #1 FQL Statement
'fql', { q: 'SELECT post_id FROM stream WHERE filter_key = "others"' },
//ARG #2 passing argument as a anonymous function with parameter result
function (result)
{
if(!result || result.error) {
return;
} //closing if handling error in this block
var counter = 0;
for (var i = 0; i < result.data.length; i++) {
counter++;
FB.api( result.data[ i].post_id, { fields: ['name', 'description', 'full_picture' ] },
function (fbPost) { // fbPost is data returned by query
if(!fbPost || fbPost.error) {
return;
}
posts.push(fbPost);
counter--;
if (counter === 0){
// Let's render that page/send that result, etc
}
});
}// end for
}//CLOSE ARG#2 Function
);// close FB.api Function
Hope this helps.
Essentially, you want to do an async map operation for each id.
There is a really handy library for doing async operations on collections called async, which has an async.map method.
var async = require('async');
FB.api(
'fql', { q: 'SELECT post_id FROM stream WHERE filter_key = "others"' },
//ARG #2 passing argument as a anonymous function with parameter result
function (result) {
async.map(
result.data,
function (item, callback) {
FB.api(
item.post_id,
{ fields: ['name', 'description', 'full_picture' ] },
callback
);
},
function (err, allPosts) {
if (err) return console.log(err);
// Array of all posts
console.log(allPosts);
}
);
}
);
You definitely don't need to use this, but it simplifies your code a bit. Just run npm install --save async in your project directory and you should be good to go.
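Since the route handler in the question ultimately needs to call res.send with the collected posts, the final callback above is the natural place to do it; a sketch, assuming the Express-style res object from the surrounding getFeeds handler:
function (err, allPosts) {
    if (err) return console.log(err);
    res.send(allPosts); // async.map keeps the results in the same order as the post ids
}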

Sequencing ajax requests

I find I sometimes need to iterate some collection and make an ajax call for each element. I want each call to return before moving to the next element so that I don't blast the server with requests - which often leads to other issues. And I don't want to set async to false and freeze the browser.
Usually this involves setting up some kind of iterator context that I step through upon each success callback. I think there must be a cleaner, simpler way?
Does anyone have a clever design pattern for how to neatly work thru a collection making ajax calls for each item?
jQuery 1.5+
I developed an $.ajaxQueue() plugin that uses the $.Deferred, .queue(), and $.ajax() to also pass back a promise that is resolved when the request completes.
/*
* jQuery.ajaxQueue - A queue for ajax requests
*
* (c) 2011 Corey Frang
* Dual licensed under the MIT and GPL licenses.
*
* Requires jQuery 1.5+
*/
(function($) {
// jQuery on an empty object, we are going to use this as our Queue
var ajaxQueue = $({});
$.ajaxQueue = function( ajaxOpts ) {
var jqXHR,
dfd = $.Deferred(),
promise = dfd.promise();
// queue our ajax request
ajaxQueue.queue( doRequest );
// add the abort method
promise.abort = function( statusText ) {
// proxy abort to the jqXHR if it is active
if ( jqXHR ) {
return jqXHR.abort( statusText );
}
// if there wasn't already a jqXHR we need to remove from queue
var queue = ajaxQueue.queue(),
index = $.inArray( doRequest, queue );
if ( index > -1 ) {
queue.splice( index, 1 );
}
// and then reject the deferred
dfd.rejectWith( ajaxOpts.context || ajaxOpts,
[ promise, statusText, "" ] );
return promise;
};
// run the actual query
function doRequest( next ) {
jqXHR = $.ajax( ajaxOpts )
.done( dfd.resolve )
.fail( dfd.reject )
.then( next, next );
}
return promise;
};
})(jQuery);
jQuery 1.4
If you're using jQuery 1.4, you can utilize the animation queue on an empty object to create your own "queue" for your ajax requests for the elements.
You can even factor this into your own $.ajax() replacement. This plugin $.ajaxQueue() uses the standard 'fx' queue for jQuery, which will auto-start the first added element if the queue isn't already running.
(function($) {
// jQuery on an empty object, we are going to use this as our Queue
var ajaxQueue = $({});
$.ajaxQueue = function(ajaxOpts) {
// hold the original complete function
var oldComplete = ajaxOpts.complete;
// queue our ajax request
ajaxQueue.queue(function(next) {
// create a complete callback to fire the next event in the queue
ajaxOpts.complete = function() {
// fire the original complete if it was there
if (oldComplete) oldComplete.apply(this, arguments);
next(); // run the next query in the queue
};
// run the query
$.ajax(ajaxOpts);
});
};
})(jQuery);
Example Usage
So, we have a <ul id="items"> which has some <li> that we want to copy (using ajax!) to the <ul id="output">
// get each item we want to copy
$("#items li").each(function(idx) {
// queue up an ajax request
$.ajaxQueue({
url: '/echo/html/',
data: {html : "["+idx+"] "+$(this).html()},
type: 'POST',
success: function(data) {
// Write to #output
$("#output").append($("<li>", { html: data }));
}
});
});
jsfiddle demonstration - 1.4 version
A quick and small solution using deferred promises. Although this uses jQuery's $.Deferred, any other should do.
var Queue = function () {
var previous = new $.Deferred().resolve();
return function (fn, fail) {
return previous = previous.then(fn, fail || fn);
};
};
Usage, call to create new queues:
var queue = Queue();
// Queue empty, will start immediately
queue(function () {
return $.get('/first');
});
// Will begin when the first has finished
queue(function() {
return $.get('/second');
});
See the example with a side-by-side comparison of asynchronous requests.
This works by creating a function that will automatically chain promises together. The synchronous nature comes from the fact that we wrap the $.get calls in functions and push them onto a queue. The execution of these functions is deferred; each is only called when it gets to the front of the queue.
A requirement for the code is that each of the functions you give must return a promise. This returned promise is then chained onto the latest promise in the queue. When you call the queue(...) function it chains onto the last promise, hence the previous = previous.then(...).
You can wrap all that complexity into a function to make a simple call that looks like this:
loadSequantially(['/a', '/a/b', 'a/b/c'], function() {alert('all loaded')});
Below is a rough sketch (working example, except the ajax call). This can be modified to use a queue-like structure instead of an array
// load sequentially the given array of URLs and call 'funCallback' when all's done
function loadSequantially(arrUrls, funCallback) {
var idx = 0;
// callback function that is called when individual ajax call is done
// internally calls next ajax URL in the sequence, or if there aren't any left,
// calls the final user specified callback function
var individualLoadCallback = function() {
if(++idx >= arrUrls.length) {
doCallback(arrUrls, funCallback);
}else {
loadInternal();
}
};
// makes the ajax call
var loadInternal = function() {
if(arrUrls.length > 0) {
ajaxCall(arrUrls[idx], individualLoadCallback);
}else {
doCallback(arrUrls, funCallback);
}
};
loadInternal();
};
// dummy function replace with actual ajax call
function ajaxCall(url, funCallBack) {
alert(url)
funCallBack();
};
// final callback when everything's loaded
function doCallback(arrUrls, func) {
try {
func();
}catch(err) {
// handle errors
}
};
Ideally, a coroutine with multiple entry points, so that every callback from the server can call into the same coroutine, would be neat. Damn, this is about to be implemented in JavaScript 1.7.
Let me try using a closure...
function BlockingAjaxCall (URL,arr,AjaxCall,OriginalCallBack)
{
var nextindex = (function()
{
var i = 0;
return function()
{
return i++;
};
})(); // immediately invoked, so each call to nextindex() returns 0, 1, 2, ...
var AjaxCallRecursive = function(){
var currentindex = nextindex();
AjaxCall
(
URL,
arr[currentindex],
function()
{
OriginalCallBack();
if (currentindex + 1 < arr.length)
{
AjaxCallRecursive();
}
}
);
};
AjaxCallRecursive();
}
// suppose you always call Ajax like AjaxCall(URL,element,callback) you will do it this way
BlockingAjaxCall(URL,myArray,AjaxCall,CallBack);
Yeah, while the other answers will work, they are lots of code and messy looking. Frame.js was designed to elegantly address this situation. https://github.com/bishopZ/Frame.js
For instance, this will cause most browsers to hang:
for(var i=0; i<1000; i++){
$.ajax('myserver.api', { data:i, type:'post' });
}
While this will not:
for(var i=0; i<1000; i++){
Frame(function(callback){
$.ajax('myserver.api', { data:i, type:'post', complete:callback });
});
}
Frame.start();
Also, using Frame allows you to waterfall the response objects and deal with them all after the entire series of AJAX requests has completed (if you want to):
var listOfAjaxObjects = [ {}, {}, ... ]; // an array of objects for $.ajax
$.each(listOfAjaxObjects, function(i, item){
Frame(function(nextFrame){
item.complete = function(response){
// do stuff with this response or wait until end
nextFrame(response); // ajax response objects will waterfall to the next Frame()
};
$.ajax(item);
});
});
Frame(function(callback){ // runs after all the AJAX requests have returned
var ajaxResponses = [];
$.each(arguments, function(i, arg){
if(i!==0){ // the first argument is always the callback function
ajaxResponses.push(arg);
}
});
// do stuff with the responses from your AJAX requests
// if an AJAX request returned an error, the error object will be present in place of the response object
callback();
});
Frame.start()
I am posting this answer thinking that it might help other people in the future who are looking for a simple solution in the same scenario.
This is now also possible using the native Promise support introduced in ES6. You can wrap the ajax call in a promise and return it to the handler of the element.
function ajaxPromise(elInfo) {
return new Promise(function (resolve, reject) {
//Do anything as desired with the elInfo passed as parameter
$.ajax({
type: "POST",
url: '/someurl/',
data: {data: "somedata" + elInfo},
success: function (data) {
//Do anything as desired with the data received from the server,
//and then resolve the promise
resolve();
},
error: function (err) {
reject(err);
},
async: true
});
});
}
Now call the function recursively, from where you have the collection of the elements.
function callAjaxSynchronous(elCollection) {
if (elCollection.length > 0) {
var el = elCollection.shift();
ajaxPromise(el)
.then(function () {
callAjaxSynchronous(elCollection);
})
.catch(function (err) {
//Abort further ajax calls/continue with the rest
//callAjaxSynchronous(elCollection);
});
}
else {
return false;
}
}
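A hypothetical initial call, where elementInfos stands in for whatever array of element data you collected; since the function consumes the array with shift, pass a copy if you need the original afterwards:
callAjaxSynchronous(elementInfos.slice()); // .slice() keeps the original array intact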
I use http://developer.yahoo.com/yui/3/io/#queue to get that functionality.
The only solution I can come up with is, as you say, maintaining a list of pending calls/callbacks. Or nesting the next call in the previous callback, but that feels a bit messy.
You can achieve the same thing using then.
var files = [
'example.txt',
'example2.txt',
'example.txt',
'example2.txt',
'example.txt',
'example2.txt',
'example2.txt',
'example.txt'
];
nextFile().done(function(){
console.log("done",arguments)
});
function nextFile(text){
var file = files.shift();
if(text)
$('body').append(text + '<br/>');
if(file)
return $.get(file).then(nextFile);
}
http://plnkr.co/edit/meHQHU48zLTZZHMCtIHm?p=preview
I would suggest a slightly more sophisticated approach which is reusable for different cases.
I am using it, for example, when I need to slow down a call sequence while the user is typing in a text editor.
But I am sure it should also work when iterating through a collection. In this case it can queue the requests and send a single AJAX call instead of 12.
queueing = {
callTimeout: undefined,
callTimeoutDelayTime: 1000,
callTimeoutMaxQueueSize: 12,
callTimeoutCurrentQueueSize: 0,
queueCall: function (theCall) {
clearTimeout(this.callTimeout);
if (this.callTimeoutCurrentQueueSize >= this.callTimeoutMaxQueueSize) {
theCall();
this.callTimeoutCurrentQueueSize = 0;
} else {
var _self = this;
this.callTimeout = setTimeout(function () {
theCall();
_self.callTimeoutCurrentQueueSize = 0;
}, this.callTimeoutDelayTime);
}
this.callTimeoutCurrentQueueSize++;
}
}
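A hypothetical usage sketch: the URL and selector below are made up for illustration, and the only real requirement is that the work you queue is wrapped in a function passed to queueCall:
$('#editor').on('keyup', function () {
    queueing.queueCall(function () {
        $.ajax({ url: '/api/search', type: 'GET', data: { q: $('#editor').val() } });
    });
});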
There's a very simple way to achieve this by adding async: false as a property of the ajax call. This will make sure the ajax call completes before the rest of the code runs. I have used this successfully in loops many times.
E.g.
$.ajax({
url: "",
type: "GET",
async: false
...
