I almost banged my head against the wall because I can't get the following code to work. I'm trying to code a photo gallery with the Flickr API and have problems with multiple async calls. But perhaps there is a cleaner solution for this.
openPhotoset() is called when clicking the link of a photoset. Unfortunately, to get the description of a photo I need to use a different API method, which means another async call. I'm looping through the data, but because I make the call inside the loop (that's when I have the photo id available), the deferred of openPhotoset() resolves before the loop finishes rather than after. I have read about and seen examples of $.when() used with a loop, filling an array with deferreds and checking them with $.when(), but I seem to fail horribly at it. Is this the solution I need, or is there another road to salvation? ;)
I want to execute different functions after all calls within openPhotoset() have completed.
function openPhotoset(photosetId) {
    var currentPhotoset = [],
        deferred = $.Deferred();
    _requestPhotosOfSet(photosetId).done(function (data) {
        $(data.photoset.photo).each(function (i, photo) {
            var objPhoto = {};
            objPhoto.id = photo.id;
            objPhoto.title = photo.title;
            objPhoto.farm = photo.farm;
            objPhoto.server = photo.server;
            objPhoto.secret = photo.secret;
            // get photo description
            requestPhotoInfo(photo.id).done(function (data) {
                objPhoto.description = data.photo.description._content;
                currentPhotoset.push(objPhoto);
            }).then(function () {
                // TODO: renders with each iteration, shouldn't!
                var template = $('#li-gallery').html(),
                    result = Mustache.render(template, {currentPhotoset: currentPhotoset});
                showGallery();
                _$fyGallery.find('.gallery-list').html(result);
                deferred.resolve();
            });
        });
    });
    return deferred;
}
You can do this by changing .done() to .then() in a couple of places, and rearranging things a bit - well, quite a lot.
I think you've probably been searching for something like this :
function openPhotoset(photosetId) {
    return _requestPhotosOfSet(photosetId).then(function (data) {
        // jQuery's .map() on a wrapped set passes (index, element) to its callback
        var promises = $(data.photoset.photo).map(function (i, photo) {
            return requestPhotoInfo(photo.id).then(function (data) {
                return {
                    id: photo.id,
                    title: photo.title,
                    farm: photo.farm,
                    server: photo.server,
                    secret: photo.secret,
                    description: data.photo.description._content
                };
            });
        }).get(); // .get() is necessary to convert the jQuery object to a regular js array.
        return $.when.apply(null, promises).then(function () {
            var template = $('#li-gallery').html(),
                result = Mustache.render(template, {
                    currentPhotoset: Array.prototype.slice.apply(arguments)
                });
            showGallery();
            _$fyGallery.find('.gallery-list').html(result);
        });
    });
}
The main difference here is the creation of an array of promises as opposed to an array of photo objects, allowing the promises themselves to convey the data. This lets $.when() fire off a callback when all the promises are fulfilled, i.e. when data objects have been composed for all photos in the set.
Note the use of .map() instead of .each(), thus simplifying the creation of promises.
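For reference, a minimal sketch of the difference, assuming a hypothetical photos array; note that jQuery's .map() on a wrapped set passes (index, element) to its callback:

// .each() builds the array of promises by hand
var promises = [];
$(photos).each(function (i, photo) {
    promises.push(requestPhotoInfo(photo.id));
});

// .map().get() produces the same array in one expression
var promises = $(photos).map(function (i, photo) {
    return requestPhotoInfo(photo.id);
}).get();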
And finally, the overall promise returned by openPhotoset() allows whatever action to be taken on completion of the whole process. Just chain .then().
openPhotoset(...).then(function() {
    // here, do whatever
});
EDIT
The overall pattern is probably easier to understand if the inner workings are pulled out and rephrased as named promise-returning functions - getPhotoInfoObject() and renderData().
function openPhotoset(photosetId) {
    // jQuery's .map() passes (index, element), hence the extra `i` parameter
    function getPhotoInfoObject(i, photo) {
        return requestPhotoInfo(photo.id).then(function (data) {
            // $.extend() is much less verbose than copying `photo`'s properties into a new object longhand.
            return $.extend(photo, {description: data.photo.description._content});
        });
    }
    function renderData() {
        var template = $('#li-gallery').html(),
            currentPhotoset = Array.prototype.slice.apply(arguments),
            result = Mustache.render(template, {
                currentPhotoset: currentPhotoset
            });
        showGallery();
        _$fyGallery.find('.gallery-list').html(result);
    }
    // With the inner workings pulled out as getPhotoInfoObject() and renderData(),
    // the residual pattern is very concise and easier to understand.
    return _requestPhotosOfSet(photosetId).then(function (data) {
        var promises = $(data.photoset.photo).map(getPhotoInfoObject).get();
        return $.when.apply(null, promises).then(renderData);
    });
}
I was so blinded by the deferreds and the $.when() function that I didn't notice all I needed was a counter: count down each time requestPhotoInfo() is done, and render the HTML once it reaches zero.
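For completeness, a minimal sketch of that counter approach inside the original openPhotoset() loop, reusing the same currentPhotoset and deferred from the question; renderGallery() is a hypothetical helper wrapping the Mustache rendering:

var remaining = data.photoset.photo.length;

$(data.photoset.photo).each(function (i, photo) {
    requestPhotoInfo(photo.id).done(function (info) {
        currentPhotoset.push({
            id: photo.id,
            title: photo.title,
            description: info.photo.description._content
        });
        remaining -= 1;
        if (remaining === 0) {
            // all descriptions fetched: render once and resolve
            renderGallery(currentPhotoset); // hypothetical rendering helper
            deferred.resolve();
        }
    });
});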
Related
I have a REST-call helper built on the request module, restRequest(), which returns its response as a promise (so it's asynchronous). I need to call this method several times in sequence with different parameters, passing each result on to the next call.
Example code:
restRequest(url, "POST").then(function(response) {
    restRequest(secondUrl, 'GET', response).then(function(response2) {
    });
});
Will this work, or is there a better way to solve this?
I would use the async library for this
Specifically the waterfall method, which would work like this:
async.waterfall([
    function firstRequest(callback) {
        restRequest(url, "POST").then(function(response) {
            callback(null, response);
        });
    },
    function secondRequest(data, callback) {
        restRequest(secondUrl, 'GET', data).then(function(response2) {
            callback(null, response2);
        });
    }
], function (err, result) {
    // Handle err or result
});
You can read about how async.waterfall works from the link above.
Your method works, but depending on how many requests you have, you can end up in quite deep callback hell.
But since you are using promises, you can just return each promise to flatten the chain, like this:
restRequest(url, "POST")
    .then(function(resp) {
        return restRequest(secondUrl, "GET", resp);
    })
    .then(function(resp) {
        return restRequest(thirdUrl, "GET", resp);
    })
    .then(function(resp) {
        // do whatever; keep the chain going or stop here
    })
    .catch(function(error) {
        // if any of the promises reject, control jumps straight here
    });
With promises you can return a new promise from within a .then and just keep the chain going infinitely.
I'm just biased towards async as I think it really improves readability when used right.
you could do something like:
let requestParams = [
[url, 'POST'],
[secondUrl, 'GET'],
...
];
function callRecursive(response){
if(!requestParams.length) return Promise.resolve(response);
let params = requestParams.shift();
if(response) params.push(response);
return restRequest(...params).then(callRecursive);
}
callRecursive().then(successCallbk).catch(errCallBk);
You can use .bind() to supply one or more arguments up front, creating a partially applied function.
restRequest(url,"POST").then(restRequest.bind(this,secondUrl, "GET"))
.then(restRequest.bind(this,thirdUrl, "GET"));
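A minimal sketch of what the bound function amounts to; the response argument is whatever the previous promise resolved with:

// restRequest.bind(this, secondUrl, "GET") produces a new function with
// secondUrl and "GET" already filled in; .then() then supplies the final argument:
var step2 = restRequest.bind(this, secondUrl, "GET");
// step2(response) is equivalent to restRequest(secondUrl, "GET", response)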
Since these are fired off in serial, what you really have is a simple chain of functions (some return promises, some might not) that can be composed (or here, sequenced) together, which I find to be a neat way to isolate everything you want to happen and then combine behaviors as needed. It's still a Promise chain under the hood, but expressed as a series. First, a few utility methods to help:
var curry = (f, ...args) =>
    (f.length <= args.length) ? f(...args) : (...more) => curry(f, ...args, ...more);

var pipeP = (...fnlist) =>
    acc => fnlist.reduce((acc, fn) => acc.then(fn), Promise.resolve(acc));
then
// make restRequest only fire once it's given its 3rd argument
var restRequest = curry(restRequest);

// define what our requests look like
var request1 = restRequest('firstUrl', "POST"); // -> curried function, not yet called
var request2 = restRequest('secondUrl', 'GET'); // -> curried function, not yet called

// define some simple methods to process responses
var extractURL = x => x.url;   // -> plain function
var extractData = x => x.data; // -> plain function

// final behaviors, i.e. do something with data or handle errors
// var handleData = ...  // -> do something with "data"
// var handleError = ... // -> handle errors

// now, create a sort of lazy program chain waiting for a starting value;
// that value is passed to request1 as its 3rd arg, starting things off
var handleARequest = pipeP(request1, extractURL, request2, extractData);

// and execute it as needed by passing it a starting request
handleARequest({postdata: 5}).then(handleData).catch(handleError);
Recursion is the most obvious approach but it's not necessary. An alternative is to build a .then() chain by reducing an array of known parameters (urls and methods).
The process is presented here under "The Collection Kerfuffle".
function asyncSequence(params) {
    return params.reduce(function(promise, paramObj) {
        return promise.then(function(response) {
            return restRequest(paramObj.url, paramObj.method, response);
        });
    }, Promise.resolve(null)); // a promise resolved with the value that appears as `response` in the first iteration of the reduction
}
This will cater for any number of requests, as determined by the length of the params array.
Call as follows :
var params = [
    {url: 'path/1', method: 'POST'},
    {url: 'path/2', method: 'GET'},
    {url: 'path/3', method: 'POST'}
];

asyncSequence(params).then(function(lastResponse) {
    // all requests successfully completed
}).catch(function(e) {
    // something went wrong
});
In jQuery you can chain actions on the same selector very easily, but if you want to use a different selector for each action you need nested $.when() calls; with more than two actions this gets quite hard to read and maintain.
HTML:
<span id='a'>Hello</span>
<span id='b'>world</span>
<span id='c'>!!!</span>
CSS:
span {
display: none;
}
JS: based on this: how to hide multiple items but only call the handler once?
var d = 500; // duration

// Execute in parallel:
// $('#a').show(d).hide(d);
// $('#b').show(d).hide(d);
// $('#c').show(d).hide(d);

// Execute sequentially:
$.when($('#a').fadeIn(d).fadeOut(d)).done(function () {
    $.when($('#b').show(d).hide(d)).done(function () {
        $('#c').slideDown(d).slideUp(d);
    });
});
jsfiddle (old)
jsfiddle-2
I thought I could use the queue, but it seems to work only for the same selector.
Is there a way to write it in a more maintainable manner like:
pseudocode
var myActions = [];
myActions.push(function(){...});
myActions.push(function(){...});
myActions.push(function(){...});
something.executeSequentially(myActions);
EDIT:
I updated the demo so that it's a little bit harder.
If you really don't have to account for failures (and with animations that's hardly an issue, I suppose), you can use the following approach (kudos to @Esailija, as this solution is basically a simplified version of his answer):
var chainOfActions = [
    function() { return $('#a').fadeIn(d).fadeOut(d); },
    function() { return $('#b').fadeIn(d).fadeOut(d); },
    function() { return $('#c').fadeIn(d).fadeOut(d); }
];

chainOfActions.reduce(function(curr, next) {
    return curr.then(next);
}, $().promise());
Demo. There are three key points here:
each function in the chain of actions already returns a promise-compatible object (if not, you can 'promisify' it by returning the result of a .promise() call instead - see the sketch after this list)
at each step of the reduce a new link is added to the chain, since each callback supplied to then() creates a new promise
the whole chain is initiated by supplying an empty promise as the initial value of reduce accumulator.
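For the first point, a minimal sketch of 'promisifying' an action that doesn't already return a promise; the setTimeout step here is purely hypothetical:

var chainOfActions = [
    function() { return $('#a').fadeIn(d).fadeOut(d); }, // animations already expose .promise()
    function() {
        // a plain async action wrapped in a jQuery Deferred so the chain can wait on it
        var dfd = $.Deferred();
        setTimeout(dfd.resolve, 1000);
        return dfd.promise();
    }
];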
Edit, Updated
var d = 500,
    something = {},
    myActions = [];

myActions.push(function (next) {
    $('#a').fadeIn(d).fadeOut(d, next);
});
myActions.push(function (next) {
    return $('#b').show(d).hide(d, next);
});
myActions.push(function () {
    return $('#c').slideDown(d).slideUp(d);
});

something.executeSequentially = function (arr) {
    return $(this).queue("fx", arr);
};

something.executeSequentially(myActions);
jsfiddle http://jsfiddle.net/guest271314/2oawa1zn/
I am getting into programming with JavaScript and promises, right now using Q.js. I have finally gotten to a point where I understand what I am doing, but am having a difficult time with one specific behavior.
I have one situation where I have reasonably similar code repeated several times. It basically goes like this ...
{
    // start
    var deferred = Q.defer();
    // do something, then when it completes... {
        deferred.resolve();
    // }
    return deferred.promise;
}
Okay, that's all fine and good, but repeating all of this every time was getting annoying, so I attempted to wrap it up in something. This is just an example; it is not the entire JavaScript file, since most of the other parts are not relevant.
{
    var list = [];

    queue = function(f) {
        var deferred = Q.defer();
        list.push(f(deferred));
        return deferred.promise;
    }

    {
        queue(function(deferred) {
            // do some work
            // we want the deferred here so we can resolve it at the correct time
            deferred.resolve();
        });
    }
}
The problem is that I don't want this to run the instant I queue it up. I basically want to build the list and then run it later. I am running the list with Array reduce, seeded with an empty Q() promise:
{
    return list.reduce(function(i, f) {
        return i.then(f);
    }, Q());
}
But this is kind of counter to my goal, since I really don't intend to run them at the same time they are queued. Is there a way to save the execution for later and still pass the deferred object through the function?
Update
I was asked what I expect the code to do, which is a fair question. I'll try to explain. The purpose of this is to split up the logic because I am using ASP.NET MVC, so I have _Layout pages and then normal views - there is logic that cannot run until other things are completed, but sometimes that is on a per-page basis. This method was contrived to deal with that.
Essentially it works like this ...
Loader.js
This is, for lack of a better term or current implementation, a global object. I have plans to change that eventually, but one step at a time.
{
    var Loader = {};
    var list = [];

    initialize = function() {
        Q().then(step1).then(step2).then(process).then(finalStep);
    };

    queue = function(f) {
        // push the given function to the list
    };

    process = function() {
        return list.reduce(function(i, f) {
            return i.then(f);
        }, Q());
    };

    step1 = function() { // generic example
        // create a promise
        return deferred.promise;
    }; // other steps are similar to this.

    return Loader;
}
_Layout
<head>
    @RenderSection("scripts", false)
    <script type="text/javascript">
        // we have the loader object already
        Loader.initialize();
    </script>
</head>
Index
@section Scripts {
    <script type="text/javascript">
        Loader.promise(function(deferred) {
            // do something here.
            deferred.resolve();
        });
    </script>
}
You could use a closure.
queue(function(deferred) {
    return function() {
        // this is the actual function that will be run,
        // but it will have access to the deferred variable
        deferred.resolve();
    };
});
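A minimal sketch of how the queue from the question could store and later run these wrapped functions; the run() helper name is just a suggestion:

var list = [];

function queue(f) {
    var deferred = Q.defer();
    // f(deferred) returns the inner function; nothing runs yet
    list.push(f(deferred));
    return deferred.promise;
}

// later, when you actually want things to happen:
function run() {
    return list.reduce(function(prev, fn) {
        return prev.then(fn);
    }, Q());
}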
I think you should do something like
var Loader = {
    promise: function(construct) {
        var deferred = Q.defer();
        construct(deferred);
        return deferred.promise;
    },
    queue: function(f) {
        this.ready = this.ready.then(f);
    },
    ready: Q.Promise(function(resolve) {
        window.onload = resolve; // or whatever you need to do here
        // or assign the resolve function to Loader.initialize and call it later
    })
};
Then pass Loader.queue() functions that return other promises.
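For example, a minimal usage sketch; the Q.delay() step is just a placeholder for any promise-returning work:

// runs once the `ready` promise (window.onload here) has resolved
Loader.queue(function() {
    return Q.delay(100); // or any function returning a promise
});

// queued functions run in order, each waiting for the previous one
Loader.queue(function() {
    console.log('previous step finished');
});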
I'm new to the when.js javascript library, but I'm familiar with async programming in C#. That's why I find this code to be unwieldy:
filters.doFilter('filter1name', reqAndPosts).then(function(filter1) {
    filters.doFilter('filter2name', filter1).then(function(filter2) {
        filters.doFilter('filter3name', filter2).then(function (posts) {
            renderView(posts);
        });
    });
    return filter1;
});
I basically want three methods to be called in sequence, with the output of each being piped to the next method. Is there any way I can refactor this code to be more "sequence-like", i.e. get rid of the nesting? I feel like there's something I'm missing with the when framework here. I'm not doing it right, right?
Since doFilter returns a promise, we can do
filters.doFilter('filter1name', reqAndPosts)
    .then(function(filter1) {
        return filters.doFilter('filter2name', filter1);
    })
    .then(function(filter2) {
        return filters.doFilter('filter3name', filter2);
    })
    .then(renderView);
There is another option that gives you both advantages, cleaner indentation and access to previous results: using withThis.
filters.doFilter('filter1name', reqAndPosts).withThis({}).then(function(filter1) {
    this.filter1 = filter1; // since we used withThis, you can use `this` to store values
    return filters.doFilter('filter2name', filter1);
}).then(function(filter2) {
    // use "this.filter1" here if you want
    return filters.doFilter('filter3name', filter2);
}).then(renderView);
With a little thought you can write a generalised utility function that will take a start object and a filter sequence as its arguments, dynamically build the required .then chain, and return a promise of the multi-filtered result.
The function will look like this ...
function doFilters(filterArray, startObj) {
    return filterArray.reduce(function(promise, f) {
        return promise.then(function(result) {
            return filters.doFilter(f, result);
        });
    }, when(startObj));
}
... which is an adaptation of a pattern given here in the section headed "The Collection Kerfuffle".
For the operation you want, call as follows :
doFilters(['filter1name', 'filter2name', 'filter3name'], reqAndPosts).then(function(result) {
    // All filtering is complete.
    // Do awesome stuff with the result.
});
Providing it is not destroyed and is in scope, doFilters() remains available for use elsewhere in your code:
doFilters(['f1', 'f2', 'f3'], myOtherObject).then(function(result) {
    //...
});
With very little more effort, you could tidy things up by phrasing doFilters() as a method of filters. That would be best of all.
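For instance, a minimal sketch of that refactor; the method name doFilters is just a suggestion:

filters.doFilters = function(filterArray, startObj) {
    var self = this;
    return filterArray.reduce(function(promise, f) {
        return promise.then(function(result) {
            return self.doFilter(f, result);
        });
    }, when(startObj));
};

// usage
filters.doFilters(['filter1name', 'filter2name', 'filter3name'], reqAndPosts).then(renderView);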
I have the following function which queries a SQLite database and pushes results into an array that is used later. I need to wait for each statement to process the tables in self.projectSetEditList. However, it seems that the master deferred is not waiting for all the promises... Am I just going about this all wrong? I need to know when all the SQL results are ready before proceeding to the next function.
/// <summary>Validates user INSERTS to project_set entities</summary>
this.initProjectSetAdds = function () {
    var promises = [];
    var masterDeferred = new $.Deferred();
    /// var count = (self.projectSetEditList.length - 1);

    $.each(self.projectSetEditList, function (index, syncEntity) {
        var def = new $.Deferred();
        // get the config entity definition object
        var entityDefinition = self.getEntityDefinition(syncEntity.entity_name);
        self.db.executeSql(self.getAddsSql(entityDefinition)).done(function (tx, insertResults) {
            self.projectSetAdds.push({ definition: entityDefinition, addedObjects: dataUtils.convertToObjectArray(insertResults) });
            def.resolve(true);
        });
        promises.push(def);
    });

    // resolve all deferreds and return to caller
    $.when.apply($, promises).then(
        function () {
            masterDeferred.resolve(arguments);
        },
        function () {
            masterDeferred.reject(arguments);
        });

    return (masterDeferred.promise());
}
The only async function inside is executeSql. Any suggestions are greatly appreciated.
Your code appears unnecessarily complicated to me.
Since $.when also creates a promise, don't bother creating the masterDeferred yourself, just do:
return $.when.apply($, promises);
The only functional difference is that this version will pass the true results to the eventual callback as individual parameters, whereas your code will pass a single array of [true, true, ...] values.
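To illustrate that difference, a minimal sketch of consuming the result, assuming initProjectSetAdds() is reachable as in the question:

self.initProjectSetAdds().then(function () {
    // with `return $.when.apply($, promises);` each resolved value arrives
    // as a separate argument here:
    var results = Array.prototype.slice.call(arguments); // [true, true, ...]

    // with the original masterDeferred version, the same values arrive
    // bundled together as a single array in arguments[0].
});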