I'm working with a string of XMLHttpRequests that each depend on the one prior to it. Pseudocode:
xhr1.open('GET', 'http://foo.com');
xhr1.onload = function(e){
    xhr2.open('POST', xhr1.response.url)
    xhr2.onload = function(e){
        xhr3.open('GET', xhr2.response.url2);
        xhr3.onload = function(e){
            console.log('hooray! you have data from the 3rd URL!');
        }
        xhr3.send();
    }
    xhr2.send();
}
xhr1.send();
Is this the kind of situation where using promises would be a good idea to avoid all the callback muck?
Yes. If you return a promise from a then callback, the next chained then waits for that promise instead of resolving with the original one's value. Given that ajaxCall returns a promise, your code would then look like:
ajaxCall(1)
    .then(function(result1){
        return ajaxCall(2);
    })
    .then(function(result2){
        return ajaxCall(3);
    })
    .then(function(result3){
        // all done
    });
// Sample AJAX call
function ajaxCall(){
    return new Promise(function(resolve, reject){
        // xhr code; call resolve/reject accordingly
        // args passed into resolve/reject will be passed as the result in then
    });
}
Yes, definitely. Assuming a helper function like those from How do I promisify native XHR?, your code could be transformed into
makeRequest('GET', 'http://foo.com').then(function(response1) {
    return makeRequest('POST', response1.url);
}).then(function(response2) {
    return makeRequest('GET', response2.url2);
}).then(function(response3) {
    console.log('hooray! you have data from the 3rd URL!');
});
Still callbacks of course, but no more nesting required. You also get simple error handling, and the code looks much cleaner (partly because abstracting the XHR into its own function helps on its own, independent of promises).
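For reference, a minimal version of such a helper could look roughly like this (a sketch in the spirit of the answers to that question; the JSON parsing and the optional body parameter are assumptions, adjust them to what your endpoints actually return):
// Minimal promisified XHR helper (sketch only)
function makeRequest(method, url, body) {
    return new Promise(function(resolve, reject) {
        var xhr = new XMLHttpRequest();
        xhr.open(method, url);
        xhr.onload = function() {
            if (xhr.status >= 200 && xhr.status < 300) {
                // assumes the server answers with JSON
                resolve(JSON.parse(xhr.responseText));
            } else {
                reject(new Error('Request failed with status ' + xhr.status));
            }
        };
        xhr.onerror = function() {
            reject(new Error('Network error'));
        };
        xhr.send(body);
    });
}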
Related
I'm considering rewriting existing callback-based code to use promises. However, I'm unsure whether this makes sense and how to start. The following code snippet is a mostly self-contained example from that code:
function addTooltip($element, serverEndpoint, data) {
    'use strict';
    const DELAY = 300;
    const TOOLTIP_PARENT_CLASS = 'hasTooltip';
    let timeOutReference;
    $element.hover(function hoverStart() {
        if ($element.hasClass(TOOLTIP_PARENT_CLASS)) {
            return;
        }
        timeOutReference = setTimeout(function getToolTip() {
            const $span = jQuery('<span class="serverToolTip">');
            $span.html(jQuery('<span class="waiting">'));
            $element.append($span);
            $element.addClass(TOOLTIP_PARENT_CLASS);
            jQuery.get(serverEndpoint, data).done(function injectTooltip(response) {
                $span.html(response.data);
            }).fail(handleFailedAjax);
        }, DELAY);
    }, function hoverEnd() {
        clearTimeout(timeOutReference);
    });
};
Intended functionality: when the user hovers over $element for 300 ms, the tooltip content is requested from the server and appended to $element.
Does it make sense to rewrite that code with promises and how would I do it?
(jQuery is provided by the framework (dokuwiki), so we might as well use it.)
Prior research:
https://pouchdb.com/2015/05/18/we-have-a-problem-with-promises.html
other SO questions about this topic which left me unsure about whether this is a sensible idea and how to do it
First, you'd need to wrap setTimeout into a promise. Simply create a function that accepts a timeout and returns a promise that resolves after that timeout.
Next, since jQuery.get already returns a promise, you just need to call it inside the timer promise's then handler and return its promise. That way the next chained then waits for the request instead of the timer.
It would look something like:
function timer(n){
    return new Promise(function(resolve){
        setTimeout(resolve, n);
    });
}
timer(DELAY).then(function(){
    return jQuery.get(...)
}).then(function(response){
    // jQuery.get promise resolved
}).catch(function(error){
    // something failed somewhere
});
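Applied to the snippet from the question, the hoverStart handler might end up looking roughly like this (a sketch only, reusing the timer helper above; note that a plain promise-based timer cannot be cancelled, so the clearTimeout behaviour of hoverEnd is not reproduced here):
$element.hover(function hoverStart() {
    if ($element.hasClass(TOOLTIP_PARENT_CLASS)) {
        return;
    }
    let $span;
    timer(DELAY).then(function getToolTip() {
        $span = jQuery('<span class="serverToolTip">');
        $span.html(jQuery('<span class="waiting">'));
        $element.append($span);
        $element.addClass(TOOLTIP_PARENT_CLASS);
        // returning the jQuery promise makes the next then wait for the request
        return jQuery.get(serverEndpoint, data);
    }).then(function injectTooltip(response) {
        $span.html(response.data);
    }).catch(handleFailedAjax);
}, function hoverEnd() {
    // cancelling the pending timer/request would need extra work, see note above
});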
As for your question
Does it make sense to rewrite that code with promises and how would I do it?
That depends on you. I find promise-based code more readable, but it takes time to write properly, especially if you're used to writing pure callbacks and dealing with multiple async operations. I usually write my code callbacks-first if the API is simpler to write that way, and refactor later for readability.
To elaborate on my comment above, below is an example of how promises can make dependent-callback code (arguably) more readable; basically, it removes the nesting of callbacks-in-callbacks.
Again, in the case of the code snippet you posted, I hardly see how it's worth it (unless you're doing it as an exercise).
With Callbacks:
function someAsyncMethod(callback) {
    $.get({...})
        .then(callback);
}

function anotherAsyncMethod(callback) {
    $.get({...})
        .then(callback);
}

someAsyncMethod(function() {
    anotherAsyncMethod(function yourFunction() {
        //do something
    });
});
With Promises:
function someAsyncMethod() {
    return $.get({...});
}

function anotherAsyncMethod() {
    return $.get({...});
}

someAsyncMethod()
    .then(anotherAsyncMethod)
    .then(function yourFunction() {
        //do something
    });
The below non-functional example should explain what I'm trying to do; I just don't understand the pattern I need to use to accomplish it. I tried googling to understand polling and deferred, but I couldn't find anything I could understand.
I have a function which polls an API, and I want to wait for that polling to return an expected result (waiting for the endpoint to indicate something has changed) before continuing with my main function. What am I doing wrong?
Edit: I should add that what SEEMS to go wrong with the code below is that even though deferred.resolve() eventually gets called, it seems it isn't the same deferred that got returned, so the when never gets activated in main(). I'm guessing it has to do with the timeout, meaning the deferred gets clobbered on the first repeat. That is my assumption, anyways.
function pollAPI() {
    var deferred = $.Deferred();
    $.ajax({
        url: url,
        contentType: 'application/JSON',
        method: 'GET'
    }).done(function(data){
        if (!desiredResult) {
            setTimeout(function() {
                pollAPI();
            }, 1000);
        } else {
            deferred.resolve();
        }
    }).error(deferred.reject());
    return deferred.promise();
}
function main() {
    $.when(pollAPI()).then(function() {
        // do something now that the API has returned the expected result
    });
}
You can chain subsequent calls to pollAPI() onto the original promise, so the caller ends up with a single promise that only resolves once polling succeeds. That would work like this:
// utility function to create a promise that is resolved after a delay
$.promiseDelay = function(t) {
    return $.Deferred(function(def) {
        setTimeout(def.resolve, t);
    }).promise();
};

function pollAPI() {
    return $.ajax({
        url: url,
        contentType: 'application/JSON',
        method: 'GET'
    }).then(function(data) {
        // some logic here to test if we have desired result
        if (!desiredResult) {
            // chain the next promise onto it after a delay
            return $.promiseDelay(1000).then(pollAPI);
        } else {
            // return resolved value
            return someValue;
        }
    });
}

function main() {
    pollAPI().then(function(result) {
        // got desired result here
    }, function(err) {
        // ended with an error here
    });
}
This has the following benefits:
No unnecessary promise is created to try to surround the ajax call that already has a promise. This avoids one of the common promise anti-patterns.
Subsequent calls to the API just chain to the original promise.
There is no need to use $.when() when you just have a single promise. You can just use .then() directly on it.
All errors automatically percolate back to the original promise.
This uses the ES6-standard .then() (which actually becomes more legitimately standard in jQuery 3.x - though it works in jQuery 1.x and 2.x with its own non-standard quirks) which makes this logic more compatible with other promise-producing async functions.
Also, a number of other retry ideas here: Promise Retry Design Patterns
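For example, a retry limit can be added by threading an attempt counter through the same chaining pattern. A rough sketch (the attemptsLeft parameter and the error message are assumptions, not part of the original answer; url and desiredResult are the same placeholders as above, and $.promiseDelay is the helper defined earlier):
function pollAPIWithLimit(attemptsLeft) {
    return $.ajax({
        url: url,
        contentType: 'application/JSON',
        method: 'GET'
    }).then(function(data) {
        if (desiredResult) {
            // done, this value resolves the whole chain
            return data;
        }
        if (attemptsLeft <= 1) {
            // give up by returning a rejected promise
            return $.Deferred().reject(new Error('gave up polling')).promise();
        }
        // try again after a delay, chaining the next attempt
        return $.promiseDelay(1000).then(function() {
            return pollAPIWithLimit(attemptsLeft - 1);
        });
    });
}

pollAPIWithLimit(10).then(function(result) {
    // got desired result within 10 attempts
}, function(err) {
    // request failed or we gave up
});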
Tools: JavaScript ES6
I haven't seen a good succinct answer about the syntax of chaining multiple promises to execute in order. I thought this would be a good nail in the coffin question for all promise newbies out there. :)
My issue is that I want to call these in order: getPosts ---> getThreads ---> initializeComplete()
Here is what I am doing.
userPromise.then(getPostsPromise).then(getThreadsPromise).then(initializeComplete());
userPromise is a Promise object I returned from another part of the code
getPostsPromise returns a Promise and makes a fetch to the server for posts
getThreadsPromise returns a Promise and makes a fetch to the server for threads
initializeComplete is a callback to tell my program that it is initialized.
Here is an example of one of the promises in the chain:
var getPostsPromise = function(){
    //Firebase is just a simple server I'm using
    var firebasePostsRef = new Firebase("https://myfburl.firebaseio.com/posts");
    var postsRef = firebasePostsRef.child(localPlace.key);
    return new Promise(function(resolve, reject) {
        //Below is a Firebase listener that is called when data is returned
        postsRef.once('value', function(snap, prevChild) {
            var posts = snap.val();
            AnotherFile.receiveAllPosts(posts);
            resolve(posts);
        });
    });
}
But initializeComplete() is being called before getPostsPromise and getThreadsPromise have a chance to finish fetching.
Why is that happening and how do I write the promises to execute in order?
initializeComplete is getting called right away because you are invoking it when passing it to then. You have to omit the parentheses, just like you did for getPostsPromise and getThreadsPromise:
userPromise.then(getPostsPromise).then(getThreadsPromise).then(initializeComplete);
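To make the difference concrete, a minimal illustration:
// Invoking the function: it runs immediately, and only its return value
// (undefined here) is handed to then.
userPromise.then(initializeComplete());

// Passing the function reference: then calls it only after userPromise resolves.
userPromise.then(initializeComplete);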
While yts's answer is correct (the issue is you're invoking initializeComplete instead of passing the function), I'd rather format the calls a bit differently. Having each callback function call the next function is a bit against the design of promises. I'd rather each function return a promise, and then call then on the returned promise:
userPromise
    .then(function(){
        return getPostsPromise()
    }).then(function(){
        return getThreadsPromise()
    }).then(function(){
        return initializeComplete();
    });
or to pass the actual returned objects and not have to do any additional intermediate processing:
userPromise
    .then(getPostsPromise)
    .then(getThreadsPromise)
    .then(initializeComplete);
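One thing worth noting with either form (not part of the original answer): the value each promise resolves with is handed to the next then callback, so intermediate results such as the posts passed to resolve(posts) above are available without extra variables. A sketch:
userPromise
    .then(function(user) {        // whatever userPromise resolved with
        return getPostsPromise();
    })
    .then(function(posts) {       // the posts passed to resolve(posts)
        return getThreadsPromise();
    })
    .then(function(threads) {     // the threads from getThreadsPromise
        return initializeComplete();
    });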
I have several async functions with varying numbers of parameters; in each, the last param is a callback. I wish to call these in order. For instance:
function getData(url, callback){
}
function parseData(data, callback){
}
By using this:
Function.prototype.then = function(f){
    var ff = this;
    return function(){ ff.apply(null, [].slice.call(arguments).concat(f)) }
}
it is possible to call these functions like this, and have the output print to console.log.
getData.then(parseData.then(console.log.bind(console)))('/mydata.json');
I've been trying to use this syntax instead, and cannot get the Then function correct. Any ideas?
getData.then(parseData).then(console.log.bind(console))('/mydata.json');
Implementing a function or library that allows you to chain methods like above is a non-trivial task and requires substantial effort. The main problem with the example above is the constant context changing: it is very difficult to manage the state of the call chain without memory leaks (e.g. saving a reference to all chained functions in a module-level variable means the GC will never free those functions from memory).
If you are interested in this kind of programming strategy, I highly encourage you to use an existing, established and well-tested library, like Promise or Q. I personally recommend the former as it attempts to behave as closely as possible to ECMAScript 6's Promise specification.
For educational purposes, I recommend you take a look at how the Promise library works internally - I am quite sure you will learn a lot by inspecting its source code and playing around with it.
Robert Rossmann is right. But I'm willing to answer purely for academic purposes.
Let's simplify your code to:
Function.prototype.then = function (callback){
    var inner = this;
    return function (arg) { return inner(arg, callback); }
}
and:
function getData(url, callback) {
...
}
Let's analyze the types of each function:
getData is (string, function(argument, ...)) → null.
function(argument, function).then is (function(argument, ...)) → function(argument).
That's the core of the problem. When you do getData.then(function (argument) {}), it actually returns a function with the type function(argument). That's why .then can't be called on it, because .then expects to be called on a function(argument, function) type.
What you want to do is wrap the callback function. (In the case of getData.then(parseData).then(f), you want to wrap parseData with f, not the result of getData.then(parseData).)
Here's my solution:
Function.prototype.setCallback = function (c) { this.callback = c; }
Function.prototype.getCallback = function () { return this.callback; }

Function.prototype.then = function (f) {
    var ff = this;
    var outer = function () {
        var callback = outer.getCallback();
        return ff.apply(null, [].slice.call(arguments).concat(callback));
    };
    if (this.getCallback() === undefined) {
        outer.setCallback(f);
    } else {
        outer.setCallback(ff.getCallback().then(f));
    }
    return outer;
}
This looks like an excellent use for the Promise object. Promises improve reusability of callback functions by providing a common interface to asynchronous computation. Instead of having each function accept a callback parameter, Promises allow you to encapsulate the asynchronous part of your function in a Promise object. Then you can use the Promise methods (Promise.all, Promise.prototype.then) to chain your asynchronous operations together. Here's how your example translates:
// Instead of accepting both a url and a callback, you accept just a url. Rather than
// thinking about a Promise as a function that returns data, you can think of it as
// data that hasn't loaded or doesn't exist yet (i.e., promised data).
function getData(url) {
    return new Promise(function (resolve, reject) {
        // Use resolve as the callback parameter.
    });
}

function parseData(data) {
    // Does parseData really need to be asynchronous? If not, leave out the
    // Promise and write this function synchronously.
    return new Promise(function (resolve, reject) {
    });
}

getData("someurl").then(parseData).then(function (data) {
    console.log(data);
});

// or with a synchronous parseData
getData("someurl").then(function (data) {
    console.log(parseData(data));
});
Also, I should note that Promises currently don't have excellent browser support. Luckily you're covered since there are plenty of polyfills such as this one that provide much of the same functionality as native Promises.
Edit:
Alternatively, instead of changing the Function.prototype, how about implementing a chain method that takes as input a list of asynchronous functions and a seed value and pipes that seed value through each async function:
function chainAsync(seed, functions, callback) {
    if (functions.length === 0) return callback(seed);
    functions[0](seed, function (value) {
        chainAsync(value, functions.slice(1), callback);
    });
}

chainAsync("someurl", [getData, parseData], function (data) {
    console.log(data);
});
Edit Again:
The solutions presented above are far from robust, if you want a more extensive solution check out something like https://github.com/caolan/async.
I had some thoughts about that problem and created the following code, which kinda meets your requirements. Still, I know that this concept is far from perfect. The reasons are commented in the code and below.
Function.prototype._thenify = {
    queue: [],
    then: function(nextOne){
        // Push the item to the queue
        this._thenify.queue.push(nextOne);
        return this;
    },
    handOver: function(){
        // hand over the data to the next function, calling it in the same context (so we don't lose the queue)
        this._thenify.queue.shift().apply(this, arguments);
        return this;
    }
}

Function.prototype.then = function(){ return this._thenify.then.apply(this, arguments) };
Function.prototype.handOver = function(){ return this._thenify.handOver.apply(this, arguments) };

function getData(json){
    // simulate an asynchronous call
    setTimeout(function(){ getData.handOver(json, 'params from getData'); }, 10);
    // We can't call this.handOver() because a new context is created for every function call.
    // That means you have to do it like this, or bind the context of getData to the function itself,
    // which means every time the function is called you have the same context.
}

function parseData(){
    // simulate an asynchronous call
    setTimeout(function(){ parseData.handOver('params from parseData'); }, 10);
    // Here we could use this.handOver because parseData is called in the context of getData;
    // for clarity reasons I leave it like that.
}

getData
    .then(function(){ console.log(arguments); this.handOver(); }) // see how we can use this here
    .then(parseData)
    .then(console.log)('/mydata.json'); // Here we actually start the chain with the call of the function

// To call the chain in the getData context (so you can always do this.handOver()) do it like this:
// getData
//     .then(function(){ console.log(arguments); this.handOver(); })
//     .then(parseData)
//     .then(console.log).bind(getData)('/mydata.json');
Problems and Facts:
the complete chain is executed in the context of the first function
you have to use the function itself to call handOver, at least for the first element of the chain
if you create a new chain using a function you already used, the two chains will conflict when they run at the same time
it is possible to use a function twice in the chain (e.g. getData)
because of the shared context you can set a property in one function and read it in one of the following functions
At least the first problem could be solved by not calling the next function in the chain in the same context, and instead passing the queue as a parameter to the next function. I will try this approach later. That may also solve the conflicts mentioned in point 3.
For the other problem you could use the sample code in the comments.
PS: When you run the snippet make sure your console is open to see the output.
PPS: Every comment on this approach is welcome!
The problem is that then returns a wrapper for the current function, and successive chained calls will wrap that wrapper again instead of wrapping the previous callback. One way to achieve the latter is to use closures and overwrite then on each call:
Function.prototype.then = function(f){
    var ff = this;
    function wrapCallback(previousCallback, callback) {
        var wrapper = function(){
            previousCallback.apply(null, [].slice.call(arguments).concat(callback));
        };
        ff.then = wrapper.then = function(f) {
            callback = wrapCallback(callback, f); //a new chained call, so wrap the callback
            return ff;
        }
        return wrapper;
    }
    return ff = wrapCallback(this, f); //"replace" the original function with the wrapper and return that
}
/*
 * Example
 */
function getData(json, callback){
    setTimeout( function() { callback(json) }, 100);
}

function parseData(data, callback){
    callback(data, 'Hello');
}

function doSomething(data, text, callback) {
    callback(text);
}

function printData(data) {
    console.log(data); //should print 'Hello'
}

getData
    .then(parseData)
    .then(doSomething)
    .then(printData)('/mydata.json');
I'm looking to replace some of my existing code with JavaScript Promises and I just want to confirm that I'm not doing it wrong (or maybe there are better ways). I'm using the es6-promise library.
What I have are three functions that I've updated to use JavaScript Promises (what was previously a nested, callback-y mess). The functions are actually supposed to have dual modes, i.e. I can use them like regular functions and have them return a result, or I can chain them together.
function func_1()
{
    return new Promise(function(resolve, reject){
        if(condition)
        {
            resolve('1');
        }
        else
        {
            console.log('start');
            resolve(ajaxRequest(url));
        }
    });
}

function func_2(data)
{
    return new Promise(function(resolve, reject){
        if(condition)
        {
            resolve('2');
        }
        else
        {
            console.log(data.response);
            resolve(ajaxRequest(url));
        }
    });
}

function func_3(data)
{
    return new Promise(function(resolve, reject){
        if(condition)
        {
            resolve('3');
        }
        else
        {
            console.log(data.response);
            resolve(ajaxRequest(url));
        }
    });
}
func_1().then(func_2).then(func_3).then(function(data){});
The if is to check whether data is cached in localStorage/sessionStorage, and if so the function just returns the result. However, in circumstances where I am unsure if the values have been cached (e.g. first run of the script), I plan to chain them and then have each subsequent function persist the result from its preceding one to localStorage/sessionStorage (hence the promise chain on the last line). The final then gives me an opportunity to persist the data from func_3.
Based on the tests I have run, everything seems to be working ok, but I was just wondering if this was the best way of doing this? And how do I handle the AJAX errors that could happen on one or more of the 3 listed functions?
Note: My AjaxRequest function also uses the same Promise mechanism and 'resolves' a full XHR on success, and 'rejects' the same full XHR on failure/error.
EDIT
After a tip from #Bergi, I've updated the code to look like this (and it works just as well):
function func_1()
{
    if(condition)
    {
        return Promise.resolve('1');
    }
    else
    {
        console.log('start');
        return ajaxRequest(url);
    }
}

function func_2(data)
{
    if(condition)
    {
        return Promise.resolve('2');
    }
    else
    {
        console.log(data.response);
        return ajaxRequest(url);
    }
}

function func_3(data)
{
    if(condition)
    {
        return Promise.resolve('3');
    }
    else
    {
        console.log(data.response);
        return ajaxRequest(url);
    }
}
func_1().then(func_2).then(func_3).then(function(data){})['catch'](function(err){console.log(err)});
everything seems to be working ok, but I was just wondering if this was the best way of doing this?
You should never really need to use the Promise constructor except on the lowest level (i.e. when promisifying that ajax request). Your functions should simply be written like this:
function func_1() {
    if (condition) {
        return Promise.resolve('1');
    } else {
        console.log('start');
        return ajaxRequest(url);
    }
}
Strictly speaking, func_2 and func_3 might even just return '2' directly; the then method they are used as callbacks for can cope with that. Of course, it is cleaner to always return a promise object.
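For instance, this is perfectly valid with es6-promise or native promises:
func_1().then(function() {
    return '2';             // plain value instead of a promise
}).then(function(value) {
    console.log(value);     // logs '2'
});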
And how do I handle the AJAX errors that could happen on one or more of the 3 listed functions?
Pass a second function to then, or use catch. This callback will get called when the promise is rejected, and should handle the exception.
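A sketch of both options, using the functions from the question (the ['catch'](...) bracket notation from your edit is only needed for very old ES3-era parsers, where catch is a reserved word):
// Rejection handler as the second argument to then:
func_1().then(func_2).then(func_3).then(function(data) {
    // all three succeeded
}, function(err) {
    // a rejection anywhere in the chain above lands here
});

// Or append a catch at the end of the chain:
func_1().then(func_2).then(func_3).then(function(data) {
    // all three succeeded
}).catch(function(err) {
    console.log(err);
});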