await loop vs Promise.all [duplicate] - javascript

Given a set of async db operations to perform, I'm wondering what the difference is, performance-wise, between a "blocking" await loop and a Promise.all.
let insert = (id, value) => {
  return new Promise(function (resolve, reject) {
    connection.query(`insert into items (id, value) VALUES (${id}, "${value}")`, function (err, result) {
      if (err) return reject(err);
      return resolve(result);
    });
  });
};
Promise.all solution (it needs a for loop to build the array of promises):
let inserts = [];
for (let i = 0; i < SIZE; i++) inserts.push(insert(i, "..string.."));
Promise.all(inserts).then(values => {
  console.log("promise all ends");
});
await loop solution
let inserts = [];
(async function loop() {
  for (let i = 0; i < SIZE; i++) {
    await insert(i, "..string..");
  }
  console.log("await loop ends");
})();
Edit: thanks for the answers, but I would like to dig into this a little more.
await is not really blocking, we all know that; it only blocks within its own code block. An await loop fires requests sequentially, so if one request in the middle takes longer, the requests after it wait for it.
Well, this is similar to Promise.all: if one request takes longer, the callback is not executed until ALL the responses are returned.

Your example of using Promise.all will create all promises first before waiting for them to resolve. This means that your requests will fire concurrently and the callback given to Promise.all(...).then(thisCallback) will only fire if all requests were successful.
Note: the promise returned from Promise.all will reject as soon as one of the promises in the given array rejects.
const SIZE = 5;
const insert = i => new Promise(resolve => {
  console.log(`started inserting ${i}`);
  setTimeout(() => {
    console.log(`inserted ${i}`);
    resolve();
  }, 300);
});
// your code
let inserts = [];
for (let i = 0; i < SIZE; i++) inserts.push(insert(i, "..string.."));
Promise.all(inserts).then(values => {
  console.log("promise all ends");
});
// requests are made concurrently
// output
// started inserting 0
// started inserting 1
// started inserting 2
// ...
// started inserting 4
// inserted 0
// inserted 1
// ...
// promise all ends
Note: It might be cleaner to use .map instead of a loop for this scenario:
Promise.all(
  Array.from(Array(SIZE)).map((_, i) => insert(i, "..string.."))
).then(values => {
  console.log("promise all ends");
});
Your example of using await, on the other hand, waits for each promise to resolve before continuing and firing off the next one:
const SIZE = 5;
const insert = i => new Promise(resolve => {
  console.log(`started inserting ${i}`);
  setTimeout(() => {
    console.log(`inserted ${i}`);
    resolve();
  }, 300);
});
let inserts = [];
(async function loop() {
  for (let i = 0; i < SIZE; i++) {
    await insert(i, "..string..");
  }
  console.log("await loop ends");
})();
// no request is made until the previous one is finished
// output
// started inserting 0
// inserted 0
// started inserting 1
// ...
// started inserting 4
// inserted 4
// await loop ends
The implications for performance in the above cases are directly correlated to their different behavior.
If "efficient" for your use case means to finish up the requests as soon as possible, then the first example wins because the requests will be happening around the same time, independently, whereas in the second example they will happen in a serial fashion.
In terms of complexity, the time complexity for your first example is equal to O(longestRequestTime) because the requests will happen essentially in parallel and thus the request taking the longest will drive the worst-case scenario.
On the other hand, the await example has O(sumOfAllRequestTimes) because no matter how long individual requests take, each one has to wait for the previous one to finish and thus the total time will always include all of them.
To put things in numbers: ignoring all other potential delays due to the environment and application in which the code is run, for 1000 requests, each taking 1s, the Promise.all example would still take ~1s while the await example would take ~1000s.
Note: Promise.all won't actually run the requests exactly in parallel, and performance in general will depend greatly on the environment in which the code runs and its state (for instance, the event loop), but this is a good approximation.

The major difference between the two approaches is that
The await version issues server requests sequentially in the loop. If one of them errors without being caught, no more requests are issued. If request errors are trapped using try/catch blocks, you can identify which request failed and perhaps code in some form of recovery, or even retry the operation.
The Promise.all version will make server requests in or near parallel fashion, limited by browser restrictions on the maximum number of concurrent requests permitted. If one of the requests fails, the promise returned by Promise.all rejects immediately, and any data returned by requests that did succeed is lost. In addition, if any request fails, outstanding requests are not cancelled; they were initiated in user code (the insert function) when creating the array of promises.
As mentioned in another answer, await is non-blocking and returns to the event loop until its operand promise is settled. Both the Promise.all and the await-in-a-loop versions allow responding to other events while requests are in progress.
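To illustrate the recovery point, here is a minimal sketch (not from the original answers) of the await version retrying a failed insert once before recording the failure, reusing insert and SIZE from the question:
(async function loop() {
  for (let i = 0; i < SIZE; i++) {
    try {
      await insert(i, "..string..");
    } catch (err) {
      try {
        await insert(i, "..string.."); // one retry
      } catch (err2) {
        console.error(`insert ${i} failed twice:`, err2); // record the failure and move on
      }
    }
  }
})();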

Each approach has different advantages; which one you need depends on the problem you are solving.
await loop
for (let i = 0; i < SIZE; i++) {
  await promiseCall();
}
It awaits each promise sequentially: the next call does not start until the previous one has settled, and if a promise rejects, the loop throws at that point (later iterations won't run) unless you catch the error.
ES2018 simplified certain situations with for await...of: the loop body for each item runs only after that item's promise has resolved, in order. For example, reading files and logging their contents in order:
async function printFiles() {
  const filePaths = await getFilePaths();
  for await (const contents of filePaths.map(file => fs.promises.readFile(file, 'utf8'))) {
    console.log(contents);
  }
}
Note that the reads themselves start as soon as map creates the promises; only the loop body is sequenced.
Promise.all()
var p1 = Promise.resolve(32);
var p2 = 123;
var p3 = new Promise((resolve, reject) => {
  setTimeout(() => {
    resolve("foo");
  }, 100);
});
Promise.all([p1, p2, p3]).then(values => {
  console.log(values); // [32, 123, "foo"]
});
Promise.all waits for all of the promises concurrently and finally resolves with the combined array of resolved values, in input order.
If any one of these promises gets rejected, the returned promise rejects with the reason of that rejected promise only. See the following example:
var p1 = Promise.resolve(32);
var p2 = Promise.reject(123);
var p3 = new Promise((resolve, reject) => {
  setTimeout(() => {
    resolve("foo");
  }, 100);
});
Promise.all([p1, p2, p3])
  .then(values => console.log(values))
  .catch(reason => console.log(reason)); // 123


Best way to use async/promise/callbacks in for loop [duplicate]

I'm building a trading bot that needs to get stock names from separate files. But even though I have used an async function and await in my code, it doesn't work.
My index file's init method:
const init = async () => {
  const symbols = await getDownTrendingStock();
  console.log("All stocks that are down: " + symbols);
  const doOrder = async () => {
    //do stuff
  }
  doOrder();
}
My getDownTrendingStock file:
const downStocks = [];
function getDownTrendingStock() {
  for (let i = 0; i < data.USDTPairs.length; i++) {
    const USDTPair = data.USDTPairs[i] + "USDT";
    binance.prevDay(USDTPair, (error, prevDay, symbol) => {
      if (prevDay.priceChangePercent < -2) {
        downStocks.push(symbol);
      }
    });
  }
  return downStocks;
}
I have also tried to use async in the for loop, because the getDownTrendingStock function returns an empty array before the loop's work is finished. I didn't find the right way to do that because I was confused by all the async, promise and callback stuff. What is the right approach in this situation?
Output:
All stocks that are down:
Wanted output:
All stocks that are down: [BTCUSDT, TRXUSDT, ATOMUSDT...]
I think the main issue in the code you posted is that you are mixing callbacks and promises.
This is what's happening in your getDownTrendingStock function:
1. You start iterating over the data.USDTPairs array, picking the first element.
2. You call binance.prevDay. This does nothing yet, because it's an asynchronous function that takes a bit of time. Notably, no data is added to downStocks yet.
3. You continue doing 1-2; still, no data is added.
4. You return downStocks, which is still empty.
5. Your function is complete, and you print the empty array.
Now, at some point, the nodejs event loop continues and starts working on those asynchronous tasks you created earlier by calling binance.prevDay. Internally, it probably calls an API, which takes time; once that call is completed, it calls the function you provided, which pushes data to the downStocks array.
In summary, you didn't wait for the async code to complete. You can achieve this in multiple ways.
One is to wrap this in a promise and then await that promise:
const result = await new Promise((resolve, reject) => {
  binance.prevDay(USDTPair, (error, prevDay, symbol) => {
    if (error) {
      reject(error);
    } else {
      resolve({ prevDay, symbol });
    }
  });
});
if (result.prevDay.priceChangePercent < -2) {
  downStocks.push(result.symbol);
}
Note that you can probably also use promisify for this. Also, this means that you will wait for one request to finish before starting the next, which may slow down your code considerably, depending on how many calls you need; you may want to look into Promise.all as well.
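For reference, util.promisify converts a standard (err, result)-style callback API into a promise-returning one. Note that binance.prevDay calls back with two success values (prevDay, symbol), so plain promisify would keep only the first, which is why the manual wrapper above resolves with both. A minimal promisify sketch on a standard API:
const util = require('util');
const fs = require('fs');

// fs.readFile uses the (err, result) convention, so promisify works directly:
const readFileAsync = util.promisify(fs.readFile);
readFileAsync('package.json', 'utf8')
  .then(contents => console.log(contents.length))
  .catch(console.error);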
Generally speaking, I use two techniques:
const asyncFunc = (value) => { /* smthAsync */ };
const arrayToProcess = [];
// 1: chain sequentially via reduce
const result = await arrayToProcess.reduce(
  (acc, value) => acc.then(() => asyncFunc(value)),
  Promise.resolve(someInitialValue)
);
// 2: await in a plain for loop (some eslint configs flag await-in-loop here)
for (let i = 0; i < arrayToProcess.length; i += 1) {
  const processResult = await asyncFunc(arrayToProcess[i]);
  // do with processResult what you want
}
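For example, with a setTimeout-based stand-in for asyncFunc (an assumption for illustration), technique 1 (the reduce chain) processes items strictly in order:
const asyncFunc = value =>
  new Promise(resolve => setTimeout(() => {
    console.log("processed", value);
    resolve(value);
  }, 100));

const arrayToProcess = [1, 2, 3];
arrayToProcess
  .reduce((acc, value) => acc.then(() => asyncFunc(value)), Promise.resolve())
  .then(() => console.log("done")); // logs: processed 1, processed 2, processed 3, done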

Performance hit for multiple API call using await keyword

There is a huge performance hit when I use the await keyword for multiple API calls.
In the example below, suppose the persons array has 50 objects; the for..of loop therefore makes 50 API calls to fetch each person's transaction details (one API call each), one at a time, and does some operation on each result.
for (const person of persons) {
  const trx = await transactionHistory(person.id); // method which triggers a single API call (https endpoint)
  // do some business operation on trx object
}
It takes around 7+ seconds to execute everything.
But when I tweak it a little bit by making all the API calls in a forEach loop first, assigning each resulting promise to a property, and only later awaiting them, it takes roughly less than a second to execute everything.
persons.forEach(person => {
  person.trxHistory = transactionHistory(person.id); // method which triggers a single API call (https endpoint)
});
for (const person of persons) {
  const trx = await person.trxHistory;
  // do some business operation on trx object
}
I am a little confused and unable to understand how this improved the performance by such a big margin. Can anyone explain the difference between the two approaches? I am at a beginner level. :)
In the second snippet, you are triggering the operations one after another asynchronously in the forEach before they are awaited. So it happens in the background. The result of the transactionHistory(person.id) is a Promise which is then assigned to the person.trxHistory.
Since the operations were already started in the background, by the time you reach the for-of loop to await them they are already in progress, so you wait less time for each result.
But in the for-of loop of your first snippet, you were triggering the operations one by one:
1. Start the operation.
2. Wait for it to finish.
This was the sequence in your first example. Hence it was slow, as it was sequential, whereas the latter one was concurrent:
persons.forEach(person => {
  // operation already started in the background in an async way
  person.trxHistory = transactionHistory(person.id); // method which triggers a single API call (https endpoint)
});
for (const person of persons) {
  const trx = await person.trxHistory;
  // do some business operation on trx object
}
So when you eventually come to the for-of loop, you are only awaiting results that are already on their way.
You can trigger all the operations inside a Promise.all call and then process the results together:
Promise.all(persons.map(p => transactionHistory(p.id))).then(results => {
  for (const trx of results) {
    // do some business operation on trx object
  }
});
To make it clear, take this mock API for instance:
This first snippet is the concurrent one, where all tasks are triggered beforehand and then awaited, so the total time is the maximum of the individual task times (approx 7 secs):
const mockAPICall = (time) => {
  return new Promise((res, rej) => {
    console.log("Started execution");
    setTimeout(() => res("Mocked return value"), time);
  });
}
const waits = [{wait: 1000}, {wait: 2000}, {wait: 7000}];
waits.forEach(w => {
  w.result = mockAPICall(w.wait);
});
const start = performance.now();
(async () => {
  for (const w of waits) {
    let result = await w.result;
    console.log(result);
  }
  console.log(performance.now() - start);
})();
This snippet is sequential; it waits for each task to complete one by one, so the total time is the sum of all the waiting periods (approx 10 secs):
const mockAPICall = (time) => {
  return new Promise((res, rej) => {
    console.log("Started execution");
    setTimeout(() => res("Mocked return value"), time);
  });
}
const waits = [{wait: 1000}, {wait: 2000}, {wait: 7000}];
const start = performance.now();
(async () => {
  for (const w of waits) {
    let result = await mockAPICall(w.wait);
    console.log(result);
  }
  console.log(performance.now() - start);
})();

How to wait for this function to finish?

I need to wait for a mapping function to finish before I send the data to the console. I know it has something to do with Promises. I've been trying for hours, and I couldn't get it to work even after reading so much about promises and async functions...
async function inactiveMemberWarner() {
  var msg = "```javascript\nI have sent warnings to members that have been inactive for 2 weeks.\n\n"
  var inactiveMembers = '';
  var count = 0;
  var guildMembers = client.guilds.find(g => g.name === mainGuild).members;
  const keyPromises = await guildMembers.map(async (member) => {
    if (isMod(member)) {
      connection.query(`SELECT * from users WHERE userID='${member.id}'`, (err, data) => {
        if (data[0]) {
          if (!data[0].warnedForInactivity && moment().isSameOrAfter(moment(data[0].lastMSGDate).add('2', 'week'))) {
            count++;
            var updateWarning = {warnedForInactivity: 1}
            connection.query(`UPDATE users SET ? WHERE userID='${data[0].userID}'`, updateWarning);
            member.send(`**[*]** WARNING: You've been inactive on \`\`${mainGuild}\`\` for 2 weeks. Members that have been inactive for at least a month will be kicked.`);
            inactiveMembers += `${count}. ${member.user.tag}\n`;
            return inactiveMembers;
          }
        }
      });
    }
  });
  await Promise.all(keyPromises).then(inactiveMembersData => console.log(inactiveMembers)); // RETURNS AN EMPTY STRING
  setTimeout(() => console.log(inactiveMembers), 5000); // RETURNS THE INACTIVE MEMBERS AFTER WAITING FOR 5 SECONDS (PRIMITIVE WAY)
}
inactiveMemberWarner();
Thank you in advance!
You're close, but not quite there.
First, some notes:
await can be used on any value, but it is entirely pointless to use it on anything that isn't a Promise. Your guildMembers.map(...) returns an array, not a Promise.
Mixing await and .then(...) works, but is kinda messy. You're already using await - why bother dealing with callbacks?
Using guildMembers.map(async ...) like this will ensure that all the requests are fired more or less instantaneously, and they could finish in any order. This is fine, but it is kind of a race condition and results in a more or less random order of results.
This is not a good approach even just conceptually! Any time you ever have to loop queries, try and investigate ways to do it in only one query. SQL is quite powerful.
The reason your current code doesn't work is because your connection.query function escapes the async control flow. What I mean by this is that the whole point of using async/await and Promises is basically to keep track of the callbacks locally, and to make use of promise chaining to dynamically add callbacks. If you call an async function which returns a Promise, you can now carry that Promise anywhere else in your code and attach a success handler to it dynamically: either with .then() or with the sugar await.
But the connection.query function doesn't return a Promise, it just has you pass another naked callback - this one is not being tracked by a Promise! The Promise doesn't have a reference to that callback, it can't know when that callback is getting called, and thus your async/await control flow is escaped and your promises resolve long before the queries have run.
You can resolve this by making a new Promise in the async function:
async function inactiveMemberWarner() {
  var msg = "```javascript\nI have sent warnings to members that have been inactive for 2 weeks.\n\n"
  var inactiveMembers = '';
  var count = 0;
  var guildMembers = client.guilds.find(g => g.name === mainGuild).members;
  const keyPromises = guildMembers.map(async (member) => {
    if (isMod(member)) {
      return new Promise((resolve, reject) => {
        connection.query(`SELECT * from users WHERE userID='${member.id}'`, (err, data) => {
          if (err) reject(err); // make errors bubble up so they can be handled
          if (data[0] && !data[0].warnedForInactivity && moment().isSameOrAfter(moment(data[0].lastMSGDate).add('2', 'week'))) {
            count++;
            var updateWarning = {warnedForInactivity: 1}
            connection.query(`UPDATE users SET ? WHERE userID='${data[0].userID}'`, updateWarning);
            member.send(`**[*]** WARNING: You've been inactive on \`\`${mainGuild}\`\` for 2 weeks. Members that have been inactive for at least a month will be kicked.`);
            resolve(`${count}. ${member.user.tag}\n`);
          } else resolve(""); // make sure to always resolve or the promise may hang
        });
      });
    }
  });
  let inactiveMembersData = await Promise.all(keyPromises); // Returns an array of inactive member snippets.
  inactiveMembers = inactiveMembersData.filter(Boolean).join(""); // join snippets into one string, skipping undefined entries from non-mods
}
inactiveMemberWarner();
This will work, but there is a much much much better way. SQL supports the IN operator, which allows you to have conditions like WHERE userID IN (list_of_ids). In other words, you can do this in one query. You can even specify more conditions, such as warnedForInactivity = 0 and lastMSGDate BETWEEN (NOW() - INTERVAL 14 DAY) AND NOW(). This way you can offload all of your current processing logic onto the SQL server - something that you should try to do virtually every single time you can. It would simplify this code a lot too. I won't go any further as it's out of scope for this question but feel free to ask another if you can't figure it out.
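A rough sketch of that single-query idea (column names taken from the snippets above; the exact SQL and the array-expansion behavior of the ? placeholder depend on your driver, e.g. node-mysql):
const modIds = guildMembers.filter(isMod).map(m => m.id);
connection.query(
  `SELECT userID FROM users
   WHERE userID IN (?)
     AND warnedForInactivity = 0
     AND lastMSGDate <= NOW() - INTERVAL 14 DAY`,
  [modIds],
  (err, rows) => {
    if (err) throw err;
    // rows now holds every inactive moderator in a single round trip
  }
);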
I can't test this, but this is what normally works for me when waiting on something:
async function inactiveMemberWarner() {
  new Promise(function (cb, rj) {
    var msg = "```javascript\nI have sent warnings to members that have been inactive for 2 weeks.\n\n"
    var inactiveMembers = '';
    var count = 0;
    var guildMembers = client.guilds.find(g => g.name === mainGuild).members;
    guildMembers.map((member) => {
      if (isMod(member)) {
        connection.query(`SELECT * from users WHERE userID='${member.id}'`, (err, data) => {
          if (data[0]) {
            if (!data[0].warnedForInactivity && moment().isSameOrAfter(moment(data[0].lastMSGDate).add('2', 'week'))) {
              count++;
              var updateWarning = {warnedForInactivity: 1}
              connection.query(`UPDATE users SET ? WHERE userID='${data[0].userID}'`, updateWarning);
              member.send(`**[*]** WARNING: You've been inactive on \`\`${mainGuild}\`\` for 2 weeks. Members that have been inactive for at least a month will be kicked.`);
              inactiveMembers += `${count}. ${member.user.tag}\n`;
              cb(inactiveMembers);
            }
          }
        });
      }
    });
    cb('No Members');
  }).then(inactiveMembersData => console.log(inactiveMembersData)); // SHOULD RETURN THE INACTIVE MEMBERS
}
inactiveMemberWarner();

Node.js - Await for all the promises thrown inside a loop

I'm dealing with a loop in Node.js that performs two tasks for every iteration. To simplify, the code is summarized as:
1. Extract products metadata from a web page (blocking task).
2. Save all the products metadata to a database (asynchronous task).
The save operation (2) will perform about 800 operations in the database, and it doesn't need to block the main thread (I can still extract products metadata from the web pages).
So, that being said, awaiting for the products to be saved doesn't make any sense. But if I fire the promises without awaiting them, on the last iteration of the loop the Node.js process exits and all the pending operations are left unfinished.
Which is the best approach to solve this? Is it possible to achieve it without having a counter for finished promises or emitters? Thanks.
for (let shop of shops) {
  // 1
  const products = await extractProductsMetadata(shop);
  // 2
  await saveProductsMetadata(products);
}
Collect the promises in an array, then use Promise.all on it:
const storePromises = [];
for (let shop of shops) {
  const products = await extractProductsMetadata(shop); // (1)
  storePromises.push(saveProductsMetadata(products)); // (2)
}
await Promise.all(storePromises);
// ... all done (3)
Through that, (1) will run one after another, (2) will run in parallel, and (3) will run afterwards.
For sure you can also run (1) and (2) in parallel:
await Promise.all(shops.map(async shop => {
  const products = await extractProductsMetadata(shop); // (1)
  await saveProductsMetadata(products); // (2)
}));
And if an error occurred in one of the promises, you can handle it with a try/catch block, to make sure all other shops won't be affected:
await Promise.all(shops.map(async shop => {
  try {
    const products = await extractProductsMetadata(shop); // (1)
    await saveProductsMetadata(products); // (2)
  } catch (error) {
    // handle it here
  }
}));
How to signal node to finish the process?
You could manually call process.exit(0), but that hides the real problem: NodeJS exits automatically once there is nothing left keeping it alive (no pending handles or listeners). That means that you should close all database connections / servers / etc. after the code above is done.
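For example, with a typical mysql-style driver (a sketch; connection and its end() method are assumptions about your setup):
await Promise.all(storePromises); // (3) all saves finished
connection.end(); // release the open handle so the process can exit on its own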
Here we create packs of data to process. Within each pack, we do all the extracts sequentially and all the saves asynchronously.
I have not handled the failure part; I'll let you add it. An appropriate try/catch or function encapsulation will do it.
/**
 * Call the given functions that return promises in a queue
 * options = context/args
 */
function promiseQueue(promisesFuncs, options = {}, _i = 0, _ret = []) {
  return new Promise((resolve, reject) => {
    if (_i >= promisesFuncs.length) {
      return resolve(_ret);
    }
    // Call one
    (promisesFuncs[_i]).apply(options.context || this, options.args || [])
      .then((ret) => promiseQueue(promisesFuncs, options, _i + 1, [
        ..._ret,
        ret,
      ]))
      .then(resolve)
      .catch(reject);
  });
}
async function executePromisesAsPacks(arr, packSize, _i = 0) {
  const toExecute = arr.slice(_i * packSize, (_i + 1) * packSize);
  // Leave if we did execute all packs
  if (toExecute.length === 0) return true;
  // First we get all the data sequentially
  const products = await promiseQueue(toExecute.map(x => () => extractProductsMetadata(x)));
  // Then save the products asynchronously
  // We do not put await here so it's truly asynchronous
  Promise.all(toExecute.map((x, xi) => saveProductsMetadata(products[xi])));
  // Call next
  return executePromisesAsPacks(arr, packSize, _i + 1);
}
// Makes packs of data to process (we extract sequentially and save asynchronously)
// Made to handle huge datasets
await executePromisesAsPacks(shops, 50);

Is Node.js native Promise.all processing in parallel or sequentially?

I would like to clarify this point, as the documentation is not too clear about it:
Q1: Is Promise.all(iterable) processing all promises sequentially or in parallel? Or, more specifically, is it the equivalent of running chained promises like
p1.then(p2).then(p3).then(p4).then(p5)....
or is it some other kind of algorithm where all p1, p2, p3, p4, p5, etc. are being called at the same time (in parallel) and results are returned as soon as all resolve (or one rejects)?
Q2: If Promise.all runs in parallel, is there a convenient way to run an iterable sequentially?
Note: I don't want to use Q or Bluebird, but all native ES6 specs.
Is Promise.all(iterable) executing all promises?
No, promises cannot "be executed". They start their task when they are being created - they represent the results only - and you are executing everything in parallel even before passing them to Promise.all.
Promise.all does only await multiple promises. It doesn't care in what order they resolve, or whether the computations are running in parallel.
is there a convenient way to run an iterable sequentially?
If you already have your promises, you can't do much but Promise.all([p1, p2, p3, …]) (which does not have a notion of sequence). But if you do have an iterable of asynchronous functions, you can indeed run them sequentially. Basically you need to get from
[fn1, fn2, fn3, …]
to
fn1().then(fn2).then(fn3).then(…)
and the solution to do that is using Array::reduce:
iterable.reduce((p, fn) => p.then(fn), Promise.resolve())
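A runnable sketch of that reduce pattern, using setTimeout stand-ins (my own, for illustration) for the async functions:
const makeTask = (name, ms) => () =>
  new Promise(resolve => setTimeout(() => {
    console.log(name, "finished");
    resolve(name);
  }, ms));

const fns = [makeTask("fn1", 300), makeTask("fn2", 100), makeTask("fn3", 200)];

// runs strictly in order: fn1, fn2, fn3 (total ~600ms)
fns.reduce((p, fn) => p.then(fn), Promise.resolve())
  .then(() => console.log("all done"));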
In parallel
await Promise.all(items.map(async (item) => {
  await fetchItem(item)
}))
Advantages: Faster. All iterations will be started even if one fails later on. However, it will "fail fast". Use Promise.allSettled to complete all iterations in parallel even if some throw; a sketch follows. Technically, these are concurrent invocations, not parallel ones.
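A minimal sketch of the Promise.allSettled variant (ES2020), reusing items and fetchItem from above:
const results = await Promise.allSettled(items.map(item => fetchItem(item)));
for (const result of results) {
  if (result.status === "fulfilled") console.log("ok:", result.value);
  else console.log("failed:", result.reason);
}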
In sequence
for (const item of items) {
  await fetchItem(item)
}
Advantages: Variables in the loop can be shared by each iteration. Behaves like normal imperative synchronous code.
NodeJS does not run promises in parallel; it runs them concurrently, since it's a single-threaded event-loop architecture. There is a possibility to run things in parallel by creating new child processes to take advantage of multiple CPU cores.
Parallel vs Concurrent
In fact, what Promise.all does is stack the promise functions in the appropriate queue (see event loop architecture), running them concurrently (calling P1, P2, ...), then waiting for each result, then resolving the Promise.all with all the promises' results.
Promise.all will fail at the first promise that fails unless you manage the rejection yourself.
There is a major difference between parallel and concurrent: the first runs different computations in separate processes at exactly the same time, each progressing at its own rhythm, while the other executes the different computations one after another in interleaved slices, so they all make progress at the same time without depending on each other.
Finally, to answer your question, Promise.all will execute neither in parallel nor sequentially but concurrently.
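A small demonstration that Promise.all runs its inputs concurrently yet resolves with the results in input order (a sketch with setTimeout stand-ins):
const delay = (ms, value) =>
  new Promise(resolve => setTimeout(() => resolve(value), ms));

Promise.all([delay(300, "a"), delay(100, "b"), delay(200, "c")])
  .then(values => console.log(values)); // ["a", "b", "c"] after ~300ms, in input order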
Bergi's answer got me on the right track using Array.reduce.
However, to actually get the functions returning my promises to execute one after another I had to add some more nesting.
My real use case is an array of files that I need to transfer in order one after another due to limits downstream...
Here is what I ended up with:
getAllFiles().then((files) => {
  return files.reduce((p, theFile) => {
    return p.then(() => {
      return transferFile(theFile); // function returns a promise
    });
  }, Promise.resolve()).then(() => {
    console.log("All files transferred");
  });
}).catch((error) => {
  console.log(error);
});
As previous answers suggest, using:
getAllFiles().then((files) => {
  return files.reduce((p, theFile) => {
    return p.then(transferFile(theFile));
  }, Promise.resolve()).then(() => {
    console.log("All files transferred");
  });
}).catch((error) => {
  console.log(error);
});
didn't wait for one transfer to complete before starting another, and the "All files transferred" text came even before the first file transfer had started.
Not sure what I did wrong, but wanted to share what worked for me.
Edit: Since I wrote this post, I now understand why the first version didn't work: then() expects a function returning a promise, so you should pass in the function name without parentheses! My function wants an argument, so I need to wrap it in an anonymous function taking no arguments!
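A minimal illustration of that pitfall, using the transferFile example from above:
const p = Promise.resolve();

// WRONG: transferFile(theFile) runs immediately; .then receives its return
// value (a promise, not a function), which .then ignores, so nothing waits
p.then(transferFile(theFile));

// RIGHT: pass a function; transferFile runs only once p has settled
p.then(() => transferFile(theFile));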
You can also process an iterable sequentially with an async function using a recursive function. For example, given an array a to process with asynchronous function someAsyncFunction():
var a = [1, 2, 3, 4, 5, 6];
function someAsyncFunction(n) {
  return new Promise((resolve, reject) => {
    setTimeout(() => {
      console.log("someAsyncFunction: ", n);
      resolve(n);
    }, Math.random() * 1500);
  });
}
// You can run each array element sequentially with:
function sequential(arr, index = 0) {
  if (index >= arr.length) return Promise.resolve();
  return someAsyncFunction(arr[index])
    .then(r => {
      console.log("got value: ", r);
      return sequential(arr, index + 1);
    });
}
sequential(a).then(() => console.log("done"));
Just to elaborate on #Bergi's answer (which is very succinct, but tricky to understand ;)
This code will run each item in the array and add the next 'then chain' to the end:
function eachorder(prev, order) {
  return prev.then(function() {
    return get_order(order)
      .then(check_order)
      .then(update_order);
  });
}
orderArray.reduce(eachorder, Promise.resolve());
Using async/await, an array of promise-returning functions can easily be executed sequentially:
let a = [fn1, fn2, fn3]; // each entry is a function that returns a promise
async function func() {
  for (let i = 0; i < a.length; i++) {
    await a[i]();
  }
}
func();
Note: In the above implementation, if a promise is rejected, the rest won't be executed. If you want all of them to be attempted, wrap your await a[i](); in try/catch.
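A sketch of that try/catch variant, assuming a holds promise-returning functions as above:
async function func() {
  for (let i = 0; i < a.length; i++) {
    try {
      await a[i]();
    } catch (err) {
      console.error(`promise ${i} rejected:`, err); // keep going with the rest
    }
  }
}
func();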
parallel
see this example
const resolveAfterTimeout = async i => {
  return new Promise(resolve => {
    console.log("CALLED");
    setTimeout(() => {
      resolve(`RESOLVED ${i}`);
    }, 5000);
  });
};
const call = async () => {
  const res = await Promise.all([
    resolveAfterTimeout(1),
    resolveAfterTimeout(2),
    resolveAfterTimeout(3),
    resolveAfterTimeout(4),
    resolveAfterTimeout(5),
    resolveAfterTimeout(6)
  ]);
  console.log({ res });
};
call();
Running the code logs "CALLED" for all six promises immediately; when they resolve after the timeout, all six responses are logged at the same time.
I stumbled across this page while trying to solve a problem in NodeJS: reassembly of file chunks. Basically:
I have an array of filenames.
I need to append all those files, in the correct order, to create one large file.
I must do this asynchronously.
Node's 'fs' module does provide appendFileSync, but I didn't want to block the server during this operation. I wanted to use the fs.promises module and find a way to chain this stuff together. The examples on this page didn't quite work for me because I actually needed two operations: fsPromises.readFile() to read in the file chunk, and fsPromises.appendFile() to concat to the destination file. Maybe if I were better with JavaScript I could have made the previous answers work for me. ;-)
I stumbled across this and I was able to hack together a working solution:
/**
 * sequentially append a list of files into a specified destination file
 */
exports.append_files = function (destinationFile, arrayOfFilenames) {
  return arrayOfFilenames.reduce((previousPromise, currentFile) => {
    return previousPromise.then(() => {
      return fsPromises.readFile(currentFile).then(fileContents => {
        return fsPromises.appendFile(destinationFile, fileContents);
      });
    });
  }, Promise.resolve());
};
And here's a jasmine unit test for it:
const fsPromises = require('fs').promises;
const path = require('path');
const fsUtils = require( ... );
const TEMPDIR = 'temp';
describe("test append_files", function() {
  it('append_files should work', async function(done) {
    try {
      // setup: create some files
      await fsPromises.mkdir(TEMPDIR);
      await fsPromises.writeFile(path.join(TEMPDIR, '1'), 'one');
      await fsPromises.writeFile(path.join(TEMPDIR, '2'), 'two');
      await fsPromises.writeFile(path.join(TEMPDIR, '3'), 'three');
      await fsPromises.writeFile(path.join(TEMPDIR, '4'), 'four');
      await fsPromises.writeFile(path.join(TEMPDIR, '5'), 'five');
      const filenameArray = [];
      for (var i = 1; i < 6; i++) {
        filenameArray.push(path.join(TEMPDIR, i.toString()));
      }
      const DESTFILE = path.join(TEMPDIR, 'final');
      await fsUtils.append_files(DESTFILE, filenameArray);
      // confirm "final" file exists
      const fsStat = await fsPromises.stat(DESTFILE);
      expect(fsStat.isFile()).toBeTruthy();
      // confirm content of the "final" file
      const expectedContent = Buffer.from('onetwothreefourfive', 'utf8');
      var fileContents = await fsPromises.readFile(DESTFILE);
      expect(fileContents).toEqual(expectedContent);
      done();
    }
    catch (err) {
      fail(err);
    }
    finally {
    }
  });
});
You can do it with a for loop. An async function returns a promise:
async function createClient(client) {
  return await Client.create(client);
}
let clients = [client1, client2, client3];
If you write the following code, the clients are created in parallel:
const createdClientsArray = await Promise.all(clients.map((client) =>
  createClient(client)
));
But if you want to create the clients sequentially, you should use a for loop:
const createdClientsArray = [];
for (let i = 0; i < clients.length; i++) {
  const createdClient = await createClient(clients[i]);
  createdClientsArray.push(createdClient);
}
Bergi's answer helped me make the calls sequential. I have added an example below where we call each function after the previous one has completed:
function func1(param1) {
  console.log("function1 : " + param1);
}
function func2() {
  console.log("function2");
}
function func3(param2, param3) {
  console.log("function3 : " + param2 + ", " + param3);
}
function func4(param4) {
  console.log("function4 : " + param4);
}
const param4 = "Kate";
// adding 3 functions to an array
const a = [
  () => func1("Hi"),
  () => func2(),
  () => func3("Lindsay", param4)
];
// adding a 4th function
a.push(() => func4("dad"));
// below does func1().then(func2).then(func3).then(func4)
a.reduce((p, fn) => p.then(fn), Promise.resolve());
I've been using for...of to run promises sequentially. I'm not sure if it helps here, but this is what I've been doing:
async function run() {
  for (let val of arr) {
    const res = await someQuery(val);
    console.log(res);
  }
}
run().catch(console.error);
Yes, you can chain an array of promise returning functions as follows
(this passes the result of each function to the next). You could of course edit it to pass the same argument (or no arguments) to each function.
function tester1(a) {
  return new Promise(function(done) {
    setTimeout(function() {
      done(a + 1);
    }, 1000);
  });
}
function tester2(a) {
  return new Promise(function(done) {
    setTimeout(function() {
      done(a * 5);
    }, 1000);
  });
}
function promise_chain(args, list, results) {
  return new Promise(function(done, errs) {
    var fn = list.shift();
    if (results === undefined) results = [];
    if (typeof fn === 'function') {
      fn(args).then(function(result) {
        results.push(result);
        console.log(result);
        promise_chain(result, list, results).then(done);
      }, errs);
    } else {
      done(results);
    }
  });
}
promise_chain(0, [tester1, tester2, tester1, tester2, tester2]).then(console.log.bind(console), console.error.bind(console));
See this sample of Promise.all working in parallel:
const { range, random, forEach, delay } = require("lodash");
const run = id => {
  console.log(`Start Task ${id}`);
  let prom = new Promise((resolve, reject) => {
    delay(() => {
      console.log(`Finish Task ${id}`);
      resolve(id);
    }, random(2000, 15000));
  });
  return prom;
}
const exec = () => {
  let proms = [];
  forEach(range(1, 10), (id, index) => {
    proms.push(run(id));
  });
  let allPromis = Promise.all(proms);
  allPromis.then(
    res => {
      forEach(res, v => console.log(v));
    }
  );
}
exec();
