Promise All in Node.js with a forEach loop - javascript

I have a function that reads a directory and copies and creates a new file within that directory.
function createFiles (countryCode) {
  fs.readdir('./app/data', (err, directories) => {
    if (err) {
      console.log(err)
    } else {
      directories.forEach((directory) => {
        fs.readdir(`./app/data/${directory}`, (err, files) => {
          if (err) console.log(err)
          console.log(`Creating ${countryCode}.yml for ${directory}`)
          fs.createReadStream(`./app/data/${directory}/en.yml`).pipe(fs.createWriteStream(`./app/data/${directory}/${countryCode}.yml`))
        })
      })
    }
  })
}
How do I do this using promises or Promise All to resolve when it's complete?

First, you need to wrap each file stream in a promise that resolves when the stream emits the finish event:
new Promise((resolve, reject) => {
  fs.createReadStream(`./app/data/${directory}/en.yml`).pipe(
    fs.createWriteStream(`./app/data/${directory}/${countryCode}.yml`)
  ).on('finish', resolve);
});
Then you need to collect these promises in an array. This is done by using map() instead of forEach() and returning the promise:
var promises = directories.map((directory) => {
  ...
  return new Promise((resolve, reject) => {
    fs.createReadStream( ...
    ...
  });
});
Now you have a collection of promises that you can wrap with Promise.all() and use with a handler when all the wrapped promises have resolved:
Promise.all(promises).then(completeFunction);
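Putting those pieces together, a minimal sketch of the whole function might look like the following (same directory layout as in the question; the 'fr' country code is just an example). The outer readdir is wrapped in a promise too, so createFiles itself returns something you can chain on:
const fs = require('fs')

function createFiles (countryCode) {
  return new Promise((resolve, reject) => {
    fs.readdir('./app/data', (err, directories) => {
      if (err) return reject(err)
      // One promise per directory, resolved when its write stream finishes.
      const promises = directories.map((directory) => new Promise((res, rej) => {
        fs.createReadStream(`./app/data/${directory}/en.yml`)
          .pipe(fs.createWriteStream(`./app/data/${directory}/${countryCode}.yml`))
          .on('error', rej)
          .on('finish', res)
      }))
      Promise.all(promises).then(resolve, reject)
    })
  })
}

// Usage sketch:
createFiles('fr').then(() => console.log('all copies finished'))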

In recent versions of Node (8.0.0 and later), there's a new util.promisify function you can use to get a promise. Here's how we might use it:
// Of course we'll need to require important modules before doing anything
// else.
const util = require('util')
const fs = require('fs')
// We use the "promisify" function to make calling promisifiedReaddir
// return a promise.
const promisifiedReaddir = util.promisify(fs.readdir)
// (You don't need to name the variable util.promisify promisifiedXYZ -
// you could just do `const readdir = util.promisify(fs.readdir)` - but
// I call it promisifiedReaddir here for clarity.)
function createFiles(countryCode) {
// Since we're using our promisified readdir function, we'll be storing
// a Promise inside of the readdirPromise variable..
const readdirPromise = promisifiedReaddir('./app/data')
// ..then we can make something happen when the promise finishes (i.e.
// when we get the list of directories) by using .then():
return readdirPromise.then(directories => {
// (Note that we only get the parameter `directories` here, with no `err`.
// That's because promises have their own way of dealing with errors;
// try looking up on "promise rejection" and "promise error catching".)
// We can't use a forEach loop here, because forEach doesn't know how to
// deal with promises. Instead we'll use a Promise.all with an array of
// promises.
// Using the .map() method is a great way to turn our list of directories
// into a list of promises; read up on "array map" if you aren't sure how
// it works.
const promises = directories.map(directory => {
// Since we want an array of promises, we'll need to `return` a promise
// here. We'll use our promisifiedReaddir function for that; it already
// returns a promise, conveniently.
return promisifiedReaddir(`./app/data/${directory}`).then(files => {
// (For now, let's pretend we have a "copy file" function that returns
// a promise. We'll actually make that function later!)
return copyFile(`./app/data/${directory}/en.yml`, `./app/data/${directory}/${countryCode}.yml`)
})
})
// Now that we've got our array of promises, we actually need to turn them
// into ONE promise, that completes when all of its "children" promises
// are completed. Luckily there's a function in JavaScript that's made to
// do just that - Promise.all:
const allPromise = Promise.all(promises)
// Now if we do a .then() on allPromise, the function we passed to .then()
// would only be called when ALL promises are finished. (The function
// would get an array of all the values in `promises` in order, but since
// we're just copying files, those values are irrelevant. And again, don't
// worry about errors!)
// Since we've made our allPromise which does what we want, we can return
// it, and we're done:
return allPromise
})
}
Okay, but there are probably still a few things that might be puzzling you..
What about errors? I kept saying that you don't need to worry about them, but it is good to know a little about them. Basically, in promise-terms, when an error happens inside of a util.promisify'd function, we say that that promise rejects. Rejected promises behave mostly the same way you'd expect errors to; they throw an error message and stop whatever promise they're in. So if one of our promisifiedReaddir calls rejects, it'll stop the whole createFiles function.
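In practice that means you can attach a single .catch() to whatever createFiles returns; a minimal sketch (the 'fr' code is just an example):
createFiles('fr')
  .then(() => console.log('all files copied'))
  .catch(err => console.error('a readdir or copy step failed:', err))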
What about that copyFile function? Well, we have two options:
Use somebody else's function. No need to re-invent the wheel! quickly-copy-file looks to be a good module (plus, it returns a promise, which is useful for us).
Program it ourselves.
Programming it ourselves isn't too hard, actually, but it takes a little bit more than simply using util.promisify:
function copyFile(from, to) {
// Hmm.. we want to copy a file. We already know how to do that in normal
// JavaScript - we'd just use a createReadStream and pipe that into a
// createWriteStream. But we need to return a promise for our code to work
// like we want it to.
// This means we'll have to make our own hand-made promise. Thankfully,
// that's not actually too difficult..
return new Promise((resolve, reject) => {
// Yikes! What's THIS code mean?
// Well, it literally says we're returning a new Promise object, with a
// function given to it as an argument. This function takes two arguments
// of its own: "resolve" and "reject". We'll look at them separately
// (but maybe you can guess what they mean already!).
// We do still need to create our read and write streams like we always do
// when copying files:
const readStream = fs.createReadStream(from)
const writeStream = fs.createWriteStream(to)
// And we need to pipe the read stream into the write stream (again, as
// usual):
readStream.pipe(writeStream)
// ..But now we need to figure out how to tell the promise when we're done
// copying the files.
// Well, we'll start by doing *something* when the pipe operation is
// finished. That's simple enough; we'll just set up an event listener:
writeStream.on('close', () => {
// Remember the "resolve" and "reject" functions we got earlier? Well, we
// can use them to tell the promise when we're done. So we'll do that here:
resolve()
})
// Okay, but what about errors? What if, for some reason, the pipe fails?
// That's simple enough to deal with too, if you know how. Remember how we
// learned a little on rejected promises, earlier? Since we're making
// our own Promise object, we'll need to create that rejection ourself
// (if anything goes wrong).
writeStream.on('error', err => {
// We'll use the "reject" argument we were given to show that something
// inside the promise failed. We can specify what that something is by
// passing the error object (which we get passed to our event listener,
// as usual).
reject(err)
})
// ..And we'll do the same in case our read stream fails, just in case:
readStream.on('error', err => {
reject(err)
})
// And now we're done! We've created our own hand-made promise-returning
// function, which we can use in our `createFiles` function that we wrote
// earlier.
})
}
..And here's all the finished code, so that you can review it yourself:
const util = require('util')
const fs = require('fs')
const promisifiedReaddir = util.promisify(fs.readdir)
function createFiles(countryCode) {
  const readdirPromise = promisifiedReaddir('./app/data')
  return readdirPromise.then(directories => {
    const promises = directories.map(directory => {
      return promisifiedReaddir(`./app/data/${directory}`).then(files => {
        return copyFile(`./app/data/${directory}/en.yml`, `./app/data/${directory}/${countryCode}.yml`)
      })
    })
    const allPromise = Promise.all(promises)
    return allPromise
  })
}
function copyFile(from, to) {
  return new Promise((resolve, reject) => {
    const readStream = fs.createReadStream(from)
    const writeStream = fs.createWriteStream(to)
    readStream.pipe(writeStream)
    writeStream.on('close', () => {
      resolve()
    })
    writeStream.on('error', err => {
      reject(err)
    })
    readStream.on('error', err => {
      reject(err)
    })
  })
}
Of course, this implementation isn't perfect. You could improve it by looking at other implementations - for example this one destroys the read and write streams when an error occurs, which is a bit cleaner than our method (which doesn't do that). The most reliable way would probably be to go with the module I linked earlier!
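For illustration, here's a sketch of that cleanup idea (not the linked module's exact code): destroy the streams before rejecting so no file descriptors are left open.
function copyFile (from, to) {
  return new Promise((resolve, reject) => {
    const readStream = fs.createReadStream(from)
    const writeStream = fs.createWriteStream(to)
    function onError (err) {
      // Tear both streams down before failing the promise.
      readStream.destroy()
      writeStream.destroy()
      reject(err)
    }
    readStream.on('error', onError)
    writeStream.on('error', onError)
    writeStream.on('close', resolve)
    readStream.pipe(writeStream)
  })
}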
I highly recommend you watch funfunfunction's video on promises. It explains how promises work in general, how to use Promise.all, and more; and he's almost certainly better at explaining this whole concept than I am!

First, create a function that returns a promise:
function processDirectory(directory) {
  return new Promise((resolve, reject) => {
    fs.readdir(`./app/data/${directory}`, (err, files) => {
      if (err) reject(err);
      console.log(`Creating ${countryCode}.yml for ${directory}`);
      fs.createReadStream(`./app/data/${directory}/en.yml`)
        .pipe(fs.createWriteStream(`./app/data/${directory}/${countryCode}.yml`))
        .on('finish', resolve);
    });
  });
}
Then use Promise.all:
Promise.all(directories.map(processDirectory))
.then(...)
.catch(...);
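One caveat: countryCode is a free variable inside processDirectory, so it has to be in scope when the function runs. A sketch of one way to arrange that is to define processDirectory inside a createFiles(countryCode) wrapper so it closes over the parameter:
function createFiles (countryCode) {
  function processDirectory (directory) {
    return new Promise((resolve, reject) => {
      fs.readdir(`./app/data/${directory}`, (err, files) => {
        if (err) return reject(err);
        fs.createReadStream(`./app/data/${directory}/en.yml`)
          .pipe(fs.createWriteStream(`./app/data/${directory}/${countryCode}.yml`))
          .on('finish', resolve);
      });
    });
  }

  return new Promise((resolve, reject) => {
    fs.readdir('./app/data', (err, directories) => {
      if (err) return reject(err);
      Promise.all(directories.map(processDirectory)).then(resolve, reject);
    });
  });
}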

Related

Promise result and its processing into two separate files

In my script I need to make an XML request, get the response from the server (if resolved), parse it and place the parsed result in a variable.
Only later, with this variable I can do lot of stuff.
So basically, I am trying to separate the creation of such parser object from the rest of my code, in order to:
place the above process into a function in an external script which I can call whenever I need.
not having all my code within a .then() method
organize my code into logical files each devoted to a specific task
So I came up with this piece of code (just a sample) to explain my needs.
As I am fairly new to asynchronous programming, would it be fine to do it like this, given that the response should be very fast to resolve (or to reject) in my case?
If it's ok, I would then put this inside a separate file to be able to import it and call testing() (or whatever the name will be) from anywhere I need.
function delay(input) {
  return new Promise(function(resolve, reject) {
    // some async operation here
    setTimeout(function() {
      // resolve the promise with some value
      resolve(input + 10);
    }, 500);
  });
}
function testing() {delay(5).then(result => {return document.write(result)})};
testing();
EDIT
Ok, so I think I hit the problem thanks to the answer of @DrewReese and the links in the comments.
I am trying to solve this situation: the code above was misleading to understand my point, but I guess there's no really simple solution.
Look at this code (it's basically the same as the one above, except for the last three lines):
function delay(input) {
  return new Promise(function(resolve, reject) {
    // some async operation here
    setTimeout(function() {
      // resolve the promise with some value
      resolve(input + 10);
    }, 500);
  });
}
function testing() {delay(5).then(result => {return result})};
var test = testing();
document.write(test);
So in this case, I know when I am defining test the output is 'undefined' because the Promise in testing() has not yet resolved.
The problem I'm trying to solve is: is there (if any) a way to define test only when the Promise is resolved without wrapping it in a then() and maybe outputting something different when it's not resolved (like "LOADING...").
Basically, I don't know if it is possible to check whether a variable holds a pending Promise and output two different values: one while it's waiting and one when it's resolved/rejected.
Hopefully I was clear enough, otherwise I'll test more and come back with another question if needed.
The whole point of promises (promise chains) is to escape from "nesting hell" by flattening your "chain", so yes, when your promise chain is resolving blocks of code, it is returning a promise which is then-able. I hope this helps illustrate:
someAsyncHttpRequest(parameter) // <- Returns Promise
  .then(result => {
    // do something with result data, i.e. extract response data,
    // mutate it, save it, do something else based on value
    ...
    // You can even return a Promise
    return Promise.resolve(mutatedData);
  })
  .then(newData => { // <- mutatedData passed as parameter
    // do new stuff with new data, etc... even reject
    let rejectData = processNewData(newData);
    return Promise.reject(rejectData);
  })
  .catch(err => {
    console.log('Any caught promise rejection no matter where it came from:', err);
  })
  .finally(() => { /* You can even run this code no matter what */ });
If you need to use any variable values you set within the chain, then you need to make the outer function async and await on the resolution of the promise chain:
const asyncFunction = async (parameter) => {
  let asyncValue;
  await someAsyncHttpRequest(parameter)
    .then(result => {
      ...
      asyncValue = someValue;
      ...
    });
  // safe to use asyncValue now
};
So for you:
function delay(input) {
  return new Promise(function(resolve, reject) {
    // some async operation here
    setTimeout(function() {
      // resolve the promise with some value
      resolve(input + 10);
    }, 500);
  });
}
async function testing() { // Declare this an asynchronous function
  let value = await delay(5); // Now you can await the resolution of the Promise
  console.log(value); // Outputs resolved value 15!
  return value; // Just returns resolved Promise
}
var test = testing();
console.log(test);
/** Outputs the Promise!
Promise {<pending>}
__proto__:Promise
[[PromiseStatus]]:"resolved"
[[PromiseValue]]:15
*/
test.then(console.log); // Outputs returned resolved value, 15
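To address the "LOADING..." idea from the edit, one sketch is to render a placeholder immediately and overwrite it when the promise settles (this assumes some element like <div id="output"></div> exists on the page):
const output = document.getElementById('output');
output.textContent = 'LOADING...';   // shown while the promise is pending
delay(5).then(result => {
  output.textContent = result;       // becomes 15 once the promise resolves
});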

Async function not returning value, but console.log() does: how to do? [duplicate]

This question already has answers here:
How do I return the response from an asynchronous call?
I have an es6 class, with an init() method responsible for fetching data, transforming it, then update the class's property this.data with newly transformed data.
So far so good.
The class itself has another getPostById() method, to just do what it sounds like. Here is the code for the class:
class Posts {
  constructor(url) {
    this.ready = false
    this.data = {}
    this.url = url
  }
  async init() {
    try {
      let res = await fetch( this.url )
      if (res.ok) {
        let data = await res.json()
        // Do bunch of transformation stuff here
        this.data = data
        this.ready = true
        return data
      }
    }
    catch (e) {
      console.log(e)
    }
  }
  getPostById(id){
    return this.data.find( p => p.id === id )
  }
}
Straightforward, except I have an async/await mechanism in the init() method.
Now, this code will work correctly:
let allPosts = new Posts('https://jsonplaceholder.typicode.com/posts')
allPosts.init()
.then( d => console.log(allPosts.getPostById(4)) )
// resulting Object correctly logged in console
but it only gets printed into the console:
How could I use allPosts.getPostById(4) as a return of a function ?
Like:
let myFunc = async () => {
  const postId = 4
  await allPosts.init() // I need to wait for this to finish before returning
  // This is logging correct value
  console.log( 'logging: ' + JSON.stringify(allPosts.getPostById( postId ), null, 4) )
  // How can I return the RESULT of allPosts.getPostById( postId ) ???
  return allPosts.getPostById( postId )
}
myFunc() returns a Promise but not the final value. I have read several related posts on the subject but they all give example of logging, never returning.
Here is a fiddle that includes two ways of handling init(): using Promise and using async/await. No matter what I try, I can't manage to USE the FINAL VALUE of getPostById(id).
The question of this post is: how can I create a function that will RETURN the VALUE of getPostById(id) ?
EDIT:
A lot of good answers trying to explain what Promises are in regards to the main execution loop.
After a lot of videos and other good reads, here is what I understand now:
my function init() correctly returns. However, within the main event loop: it returns a Promise, then it is my job to catch the result of this Promise from within a kinda parallel loop (not a new real thread). In order to catch the result from the parallel loop there are two ways:
use .then( value => doSomethingWithMy(value) )
use let value = await myAsyncFn(). Now here is the foolish hiccup:
await can only be used within an async function :p
thus itself returning a Promise, usable with await, which should be embedded in an async function, which will in turn be usable with await, etc...
This means we cannot really WAIT for a Promise: instead we keep catching results from the parallel loop, using .then() or async/await.
Thanks for the help !
As for your comment; I'll add it as answer.
The code you write in JavaScript runs on one thread; that means that if your code could actually wait for something, it would block any of your other code from executing. The event loop of JavaScript is explained very well in this video, and in this page if you prefer to read.
A good example of blocking code in the browser is alert("cannot do anything until you click ok");. Alert blocks everything, the user can't even scroll or click on anything in the page and your code also blocks from executing.
Promise.resolve(22)
  .then(x => alert("blocking") || "Hello World")
  .then(
    x => console.log(
      "does not resolve until you click ok on the alert:",
      x
    )
  );
Run that in a console and you see what I mean by blocking.
This creates a problem when you want to do something that takes time. In other frameworks you'd use a thread or processes but there is no such thing in JavaScript (technically there is with web worker and fork in node but that's another story and usually far more complicated than using async api's).
So when you want to make a http request you can use fetch but fetch takes some time to finish and your function should not block (has to return something as fast as possible). This is why fetch returns a promise.
Note that fetch is implemented by browser/node and does run in another thread, only code you write runs in one thread so starting a lot of promises that only run code you write will not speed up anything but calling native async api's in parallel will.
Before promises async code used callbacks or would return an observable object (like XmlHttpRequest) but let's cover promises since you can convert the more traditional code to a promise anyway.
A promise is an object that has a then function (and a bunch of stuff that is sugar for then but does the same), this function takes 2 parameters.
Resolve handler: A function that will be called by the promise when the promise resolves (has no errors and is finished). The function will be passed one argument with the resolve value (for http requests this usually is the response).
Reject handler: A function that will be called by the promise when the promise rejects (has an error). This function will be passed one argument, this is usually the error or reason for rejection (can be a string, number or anything).
Converting callback to promise.
The traditional api's (especially nodejs api's) use callbacks:
traditionalApi(
  arg,
  function callback(err, value) {
    err ? handleFail(err) : processValue(value);
  }
);
This makes it difficult for the programmer to catch errors or handle the return value in a linear way (from top to bottom). It gets even harder when you try to do things in parallel, or throttled parallel, with error handling (the code becomes nearly impossible to read).
You can convert traditional api's to promises with new Promise
const apiAsPromise = arg =>
  new Promise(
    (resolve, reject) =>
      traditionalApi(
        arg,
        (err, val) => (err) ? reject(err) : resolve(val)
      )
  )
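Consuming the wrapper then looks like any other promise (a sketch; traditionalApi, arg, processValue and handleFail are the hypothetical names from above):
apiAsPromise(arg)
  .then(value => processValue(value))
  .catch(err => handleFail(err));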
async await
This is what's called syntax sugar for promises. It makes promise consuming functions look more traditional and easier to read. That is if you like to write traditional code, I would argue that composing small functions is much easier to read. For example, can you guess what this does?:
const handleSearch = search =>
  compose([
    showLoading,
    makeSearchRequest,
    processResponse,
    hideLoading
  ])(search)
  .then(
    undefined, //don't care about the resolve
    compose([
      showError,
      hideLoading
    ])
  );
Anyway; enough ranting. The important part is to understand that async/await doesn't actually start another thread: async functions always return a promise, and await doesn't actually block or wait. It's syntax sugar for someFn().then(result=>...,error=>...) and looks like:
const someMethod = async () => {
  // syntax sugar for:
  //   return someFn().then(result => ..., error => ...)
  try {
    const result = await someFn();
    ...
  } catch (error) {
    ...
  }
}
The examples always show try catch but you don't need to do that, for example:
var alwaysReject = async () => { throw "Always returns rejected promise"; };
alwaysReject()
  .then(
    x => console.log("never happens, doesn't resolve"),
    err => console.warn("got rejected:", err)
  );
Any error thrown or await returning a rejected promise will cause the async function to return a rejected promise (unless you try and catch it). Many times it is desirable to just let it fail and have the caller handle errors.
Catching errors could be needed when you want the promise to succeed with a special value for rejected promises so you can handle it later but the promise does not technically reject so will always resolve.
An example is Promise.all, this takes an array of promises and returns a new promise that resolves to an array of resolved values or reject when any one of them rejects. You may just want to get the results of all promises back and filter out the rejected ones:
const Fail = function(details){this.details=details;},
  isFail = item => (item && item.constructor)===Fail;
Promise.all(
  urls.map(//map array of urls to array of promises that don't reject
    url =>
      fetch(url)
      .then(
        undefined,//do not handle resolve yet
        //when you handle the reject this ".then" will return
        //  a promise that RESOLVES to the value returned below (new Fail([url,err]))
        err => new Fail([url,err])
      )
  )
)
.then(
  responses => {
    console.log("failed requests:");
    console.log(
      responses.filter(//only Fail type
        isFail
      )
    );
    console.log("resolved requests:");
    console.log(
      responses.filter(//anything not Fail type
        response => !isFail(response)
      )
    );
  }
);
Your question and the comments suggest you could use a little intuition nudge about the way the event loop works. It really is confusing at first, but after a while it becomes second nature.
Rather than thinking about the FINAL VALUE, think about the fact that you have a single thread and you can't stop it — so you want the FUTURE VALUE -- the value on the next or some future event loop. Everything you write that is not asynchronous is going to happen almost immediately — functions return with some value or undefined immediately. There's nothing you can do about. When you need something asynchronously, you need to setup a system that is ready to deal with the async values when they return sometime in the future. This is what events, callbacks, promises (and async/await) all try to help with. If some data is asynchronous, you simply can not use it in the same event loop.
So what do you do?
If you want a pattern where you create an instance, call init() and then some function that further process it, you simply need to setup a system that does the processing when the data arrives. There are a lot of ways to do this. Here's one way that's a variation on your class:
function someAsync() {
  console.log("someAsync called")
  return new Promise(resolve => {
    setTimeout(() => resolve(Math.random()), 1000)
  })
}

class Posts {
  constructor(url) {
    this.ready = false
    this.data = "uninitialized"
    this.url = url
  }
  init() {
    this.data = someAsync()
  }
  time100() {
    // it's important to return the promise here
    return this.data.then(d => d * 100)
  }
}

let p = new Posts()
p.init()
processData(p)
// called twice to illustrate point
processData(p)

async function processData(posts) {
  let p = await posts.time100()
  console.log("random * 100:", p)
}
init() saves the promise returned from someAsync(). someAsync() could be anything that returns a promise. It saves the promise in an instance property. Now you can call then() or use async/await to get the value. It will either immediately return the value if the promise has already resolved or it will deal with it when it has resolved. I called processData(p) twice just to illustrate that it doesn't call someAsync() twice.
That's just one pattern. There are a lot more — using events, observables, just using then() directly, or even callbacks which are unfashionable, but still can be useful.
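For instance, "just using then() directly" on the stored promise (a sketch reusing the Posts class above) would look like:
let p2 = new Posts()
p2.init()
// time100() returns the promise chained off this.data, so .then() works here too:
p2.time100().then(value => console.log("random * 100:", value))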
NOTE: Wherever you use await it has to be inside an async function.
Check out the UPDATED FIDDLE
You need to use await myFunc() to get the value you expect from getPostById because an async function always returns a promise.
This sometimes is very frustrating, as the whole chain needs to be converted into async functions, but that's the price you pay for making asynchronous code read like synchronous code, I guess. I am not sure if that can be avoided, but I am interested in hearing from people who have more experience with this.
Try out the below code in your console by copying over the functions and then accessing final and await final.
NOTE:
An async function CAN contain an await expression.
https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Statements/async_function
There is no rule that it must have await in order to even declare an async function.
The example below uses an async function without await just to show that an async function always returns a promise.
const sample = async () => {
  return 100;
}
// sample() WILL RETURN A PROMISE AND NOT 100
// await sample() WILL RETURN 100

const init = async (num) => {
  return new Promise((resolve, reject) => {
    resolve(num);
  });
}

const myFunc = async (num) => {
  const k = await init(num);
  return k;
}

// const final = myFunc();
// final; This returns a promise
// await final; This returns the number you provided to myFunc
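Outside a console that supports top-level await, one common way to consume myFunc (a sketch) is an async IIFE:
(async () => {
  const final = await myFunc(42);
  console.log(final); // 42 - the number passed to myFunc
})();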

Add the result of a chained promise into promise.all()

I am trying to construct a promise, named downloadTextPromise, to download a file and then read the file content in Promise.all().
However, I still cannot read the file content in Promise.all().then.
I suspect that downloadTextPromise is a two-step promise, so only the first step (HTTP GET) is pushed into promises array. But I have little idea how to fix it.
Any hint? Any solution that works is welcome; the request-promise-native package is not compulsory.
My code is as below:
const request = require("request-promise-native");
let promises = [];
//put several promises into promises array;
.....
let downloadTextPromise = request.get("http://example.com/xyz.txt").pipe(fs.createWriteStream(`~/xyz.txt`));
promises.push(downloadTextPromise);
Promise.all(promises).then(() => {
  //I need xyz.txt content at this stage, but it is not ready immediately.
});
Firstly, check if downloadTextPromise is actually a promise. The library's documentation doesn't mention how pipe works. If it returns a native promise, you can do console.log(downloadTextPromise instanceof Promise). Otherwise, you can try console.log(!!downloadTextPromise.then).
If it's a promise, then one possibility is that the promise is resolved synchronously after the stream ends. The code to actually write the file is added to the message queue. In this case, you can fix it by doing:
Promise.all(promises).then(() => {
  setTimeout(() => {
    // Do stuff with xyz.txt
  }, 0);
});
The issue is the writeStream part, so I should build a function returning a correct promise.
I couldn't find a way to construct one that returns a promise from request.get().pipe(), so instead I use request.get().then() and fs.writeFile() as below.
However, another problem appears with the new solution: fs.writeFile works fine for txt files; for mp3 or mp4, the function returns no error, but the file content is invalid for further processing in Promise.all().then().
function downloadTextPromise() {
  return new Promise((resolve, reject) => {
    rp.get("http://example.com/xyz.txt").then(data => {
      fs.writeFile("./xyz.txt", data, err => {
        if (err) reject(err);
        resolve(true);
      });
    });
  });
}
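For reference, one way to get a promise out of the original request.get().pipe() approach (a sketch, mirroring the question's usage of request-promise-native and fs) is to resolve on the write stream's finish event; because the body is never decoded into a string, this also avoids the corruption problem with mp3/mp4 files:
const fs = require("fs");
const request = require("request-promise-native");

function downloadFilePromise(url, dest) {
  return new Promise((resolve, reject) => {
    request
      .get(url)
      .on("error", reject)              // network / HTTP errors
      .pipe(fs.createWriteStream(dest))
      .on("error", reject)              // file-system errors
      .on("finish", resolve);           // file fully flushed to disk
  });
}

// Usage sketch:
// promises.push(downloadFilePromise("http://example.com/xyz.txt", "./xyz.txt"));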

Error on Bluebird spread function

When I run the following code
import Promise from 'bluebird';
const mongodb = Promise.promisifyAll(require('mongodb'));
const MongoClient = mongodb.MongoClient;

MongoClient.connect(url).then((db) => {
  return Promise.all([new WorkerService(db)]);
}).spread((workerService) => (
  Promise.all([new WorkerRouter(workerService)])
)).spread((workerRouter) => {
  app.use('/worker', workerRouter);
}).then(() => {
  httpServer.start(config.get('server.port'));
}).catch((err) => {
  console.log(err);
  httpServer.finish();
});
I see this error:
}).spread(function (workerService) {
^
TypeError: MongoClient.connect(...).then(...).spread is not a function
Please help me. What am I doing wrong?
I see several things wrong here:
The root cause here is that MongoClient.connect(...).then(...) is not returning a Bluebird promise, thus, there is no .spread() method. When you promisify an interface with Bluebird, it does NOT change the existing methods at all. Instead, it adds new methods with the "Async" suffix on it.
So, unless you've somehow told Mongo to create only Bluebird promises, then
`MongoClient.connect(...).then(...)`
will just be returning whatever kind of promise Mongo has built in. You will need to use the new promisified methods:
MongoClient.connectAsync(...).then(...).spread(...)
Some other issues I see in your code:
1) Promise.all() expects you to pass it an array of promises. But when you do this: Promise.all([new WorkerRouter(workerService)]), you're passing it an array of a single object which unless that object is a promise itself, is just wrong.
2) In this code:
}).spread((workerService) => (
Promise.all([new WorkerRouter(workerService)])
))
you need to return the resulting promise in order to link it into the chain:
}).spread((workerService) => {
  return Promise.all([new WorkerRouter(workerService)])
})
3) But, there's no reason to use Promise.all() on a single element array. If it was a promise, then just return it.
4) It also looks like you're trying to use promises for different steps of synchronous code too. Besides just complicating the heck out of things, there's just no reason to do that as this:
}).spread((workerService) => (
  Promise.all([new WorkerRouter(workerService)])
)).spread((workerRouter) => {
  app.use('/worker', workerRouter);
}).then(() => {
  httpServer.start(config.get('server.port'));
})...
can be combined to this:
}).spread((workerService) => {
  app.use('/worker', new WorkerRouter(workerService));
  httpServer.start(config.get('server.port'));
})...
And, can probably be condensed even further since new WorkerService(db) probably also doesn't return a promise.
5) And then, we can't really tell why you're even attempting to use .spread() in the first place. It is useful only when you have an array of results that you want to turn into multiple named arguments, but you don't need an array of results here (there was no reason to use Promise.all() with a single item), and in modern versions of node.js there's really no reason to use .spread() any more, because you can use ES6 destructuring of an array function argument to accomplish the same thing with the regular .then().
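For example (a sketch with hypothetical taskA/taskB):
// Array destructuring in the .then() callback does what .spread() did:
Promise.all([taskA(), taskB()])
  .then(([resultA, resultB]) => {
    console.log(resultA, resultB);
  });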
I don't claim to follow exactly what you're trying to do with every line, but you may be able to do this:
import Promise from 'bluebird';
const mongodb = Promise.promisifyAll(require('mongodb'));
const MongoClient = mongodb.MongoClient;

MongoClient.connectAsync(url).then((db) => {
  let service = new WorkerService(db);
  app.use('/worker', new WorkerRouter(service));
  // since httpServer.start() is async and there's no promise returning here,
  // this promise chain will not wait for the server to start
  httpServer.start(config.get('server.port'));
}).catch((err) => {
  console.log(err);
  httpServer.finish();
});

Using Javascript native promises, resolve a promise after thenables attached

Is there a way using javascript native promises (docs) to create a promise and attach thenables, without knowing at constructor time how it will resolve?
var foo = new Promise(function(resolve, reject) {
  // I don't know how this will resolve yet as some other object will resolve it
});

foo.then(function(val) {
  console.log("first " + val);
});

foo.resolve("bar");

foo.then(function(val) {
  console.log("second " + val);
});

// result
// first bar
// second bar
Simply save them inside of a closure.
var makePromise = function () {
  var resolvePromise = null,
      rejectPromise = null,
      promise = new Promise(function (resolve, reject) {
        resolvePromise = resolve;
        rejectPromise = reject;
      });
  return { promise : promise, resolve : resolvePromise, reject : rejectPromise };
};
};
var deferredSomething = function () {
  var deferredThing = makePromise();
  waitAWhile()
    .then(doStuff)
    .then(function (result) {
      if (result.isGood) {
        deferredThing.resolve(result.data);
      } else {
        deferredThing.reject(result.error);
      }
    });
  return deferredThing.promise;
};
This is actually the majority of the difference between the "deferred" concept and the "promise" concept; one more level, on top, that has the actual remote-controls that you can give to someone else, while you hand the .then|.success|.done|etc... to your consumers.
Once you bring those functions out into your upstream process, you can happily lazy-load whatever you'd like, using the "thenable" which you'll return, and then succeed or fail your chain (or leave it hanging) at will...
UPDATE
Seeing as this is probably going to continue to be the chosen answer, and continue to be voted down, as the solution to the exact problem he was having (ie: retrofitting code which was not made with ES6 promises in mind), I figure I'll add a more detailed example of exactly why using this antipattern selectively can be better than naught:
MongoClient.connect("mongodb://localhost:21017/mydb", (err, db) => {
db.collection("mycollection", (err, collection) => {
collection.find().toArray((err, results) => {
doStuff(results);
});
});
});
If I were to write a library, here, hoping to reach the point where I could write:
let dbConnected = MongoClient.connect(dbURL);

dbConnected
  .then(db => db.collection(myCollection))
  .then(collection => collection.find(query))
  .then(stream => doStuff(stream));
...or alternatively:
composeAsync(
  (stream) => doStuff(stream),
  (collection) => collection.find(query),
  (db) => db.collection(myCollection)
)(dbConnected);
...for ease of use within the library, does it make sense to wrap every single function-body within an instantiated promise
// find = curry(query, collection)
return new Promise(function (resolve, reject) {
  /* whole function body, here */
  /* do lots of stuff which is irrelevant to the resolution of
     mongo.db.collection.find, but is relevant to its calling */
  collection.find(query).toArray(/* node-callback */ function (err, result) {
    if (err) {
      reject(err);
    } else {
      resolve(result);
    }
  });
});
...or in looking at the pattern of really only requiring the node-specific callback to be resolved, does it make more sense to have some form of promise-resolver, to save having to write out / copy-paste a half-dozen purely-redundant lines which should be completely DRYed up?
// find = curry(query, collection)
let resolver = new NodeResolver();
collection.find(query).toArray(resolver.resolve);
return resolver.promise;
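(NodeResolver is a hypothetical helper, not a real API; a minimal sketch of it could be:)
// Hypothetical helper: exposes a promise plus a node-style callback that settles it.
function NodeResolver () {
  this.promise = new Promise((resolve, reject) => {
    // Can be passed anywhere an (err, value) callback is expected.
    this.resolve = (err, value) => (err ? reject(err) : resolve(value));
  });
}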
Yes, that is an anti-pattern... ...yet, an antipattern which requires fewer keystrokes, restores the natural flow of the promise-chain, fixes a problem with Node's callback-only API, reduces the potential for errors, et cetera.
Yes, there are already libraries which do this...
...solutions specific to X library or Y...
...or solutions which globally override methods of various modules (scary)
...or solutions which, again, basically force you to pass in all of the details of the call you're making:
wrapNodeMethod(fs, "read", url, config).then(data => { /*...*/ });
But there is no simple solution for the case of inverting all of that pain, without either:
a) wrapping the entire function body in a promise, to feed the async callback a resolver
b) using an antipattern within a library, in order to pass Node callbacks a resolver that the rest of the function-body needs to know precisely nothing about.
Even if data needed to be transformed within the resolver, it would still make more sense to return a transformed set, in a new promise
let resolver = new NodeResolver();
somethingAsync(resolver.resolve);
return resolver.promise.then(transformData).then(logTransform);
...than to wrap the whole body, including transforms, etc, just for closure-scope reference, just for the sake of avoiding an "antipattern" which clearly goes against the grain of what has become a very prominent JS platform/paradigm.
Now, personally, I'd be happier if IO||Node methods returned a promise and/or a stream, as well as taking a callback, as a core part of the platform...
...that's not going to happen...
...but you can't possibly tell me that writing less, and keeping Node modules DRY, while still using ES6 Promises is an "antipattern", without providing me a more-eloquent solution, therefore.
If you can, indeed, provide me something that I can use in any NodeJS callback, which does, indeed, provide a more eloquent solution to this, such that I don't have to wrap every body of every method which contains an async callback in a new constructor, or use clunky dispatcher methods, or hijack entire modules to override their global functionality...
...I'd be more than willing to retract my statement that this particular pattern is still highly useful, as regards interfacing with APIs which are prone to pyramids o' doom.
If the result of a promise depends on another promise, you should just create the promise using then.
What was proposed by @Norguard, in direct form, doesn't make much sense (it's even known as the deferred anti-pattern). The code below does exactly the same, and no extra promise is needed:
var deferredSomething = function () {
  return waitAWhile()
    .then(doStuff)
    .then(function (result) {
      if (result.isGood) {
        return result.data;
      } else {
        throw result.error;
      }
    });
};
And even if, for whatever reason, you would need to create the promise upfront, with the constructor pattern it would be cleaner to do it this way:
var deferredSomething = function () {
  return new Promise(function (resolve, reject) {
    waitAWhile()
      .then(doStuff)
      .then(function (result) {
        if (result.isGood) {
          resolve(result.data);
        } else {
          reject(result.error);
        }
      });
  });
};
