What is the purpose of this Promise wrapper in a producer-consumer queue?

I am reading Node.js Design Patterns and am trying to understand the following example of a producer-consumer pattern in implementing limited parallel execution (my questions in comments):
import { EventEmitter } from 'events'

export class TaskQueuePC extends EventEmitter {
  constructor(concurrency) {
    super();
    this.taskQueue = [];
    this.consumerQueue = [];
    for (let i = 0; i < concurrency; i++) {
      this.consumer();
    }
  }

  async consumer() {
    while (true) {
      try {
        const task = await this.getNextTask();
        await task();
      } catch (err) {
        console.error(err);
      }
    }
  }

  async getNextTask() {
    return new Promise((resolve) => {
      if (this.taskQueue.length !== 0) {
        return resolve(this.taskQueue.shift());
      }
      this.consumerQueue.push(resolve);
    });
  }

  runTask(task) {
    // why are we returning a promise here?
    return new Promise((resolve, reject) => {
      // why are we wrapping our task here?
      const taskWrapper = () => {
        const taskPromise = task();
        taskPromise.then(resolve, reject);
        return taskPromise;
      };
      if (this.consumerQueue.length !== 0) {
        const consumer = this.consumerQueue.shift();
        consumer(taskWrapper);
      } else {
        this.taskQueue.push(taskWrapper);
      }
    });
  }
}
In the constructor, we create queues for both tasks and consumers, then start a consumer for each slot up to the concurrency limit.
Each consumer pauses on const task = await this.getNextTask(), which returns a pending promise.
Because there are no tasks yet in the task queue, the promise's resolver is pushed onto the consumer queue.
When a task is added with runTask, a consumer (the pending promise's resolver) is shifted off the queue and called with the task. This resumes the consumer method, which runs task() and eventually loops back around to await another task or park itself on the queue again.
What I cannot grok is the purpose of the Promise and taskWrapper in the runTask method. It seems we would have the same behavior if both the Promise and taskWrapper were omitted:
runTask(task) {
  if (this.consumerQueue.length !== 0) {
    const consumer = this.consumerQueue.shift();
    consumer(task);
  } else {
    this.taskQueue.push(task);
  }
}
In fact, when I execute this version I get the same results. Am I missing something?
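For context, the wrapper's effect only becomes observable when a caller uses the value returned by runTask. A minimal sketch of that difference (the queue instance and task here are hypothetical):

const queue = new TaskQueuePC(2);

(async () => {
  // With the original runTask, the returned promise settles with the
  // task's own outcome, so the caller can await completion and get the result:
  const result = await queue.runTask(async () => 42);
  console.log(result); // 42

  // With the simplified runTask, nothing is returned, so the caller
  // can neither await the task's completion nor observe its rejection.
})();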

Related

Unordered resolution of a list of promises

How to convert a dynamic Set<Promise<T>> into AsyncIterable<T> (unordered)?
The resulting iterable must produce values as they get resolved, and it must end just as the source runs empty.
I have a dynamic cache of promises to be resolved, and values reported, disregarding the order.
NOTE: The source is dynamic, which means it can receive new Promise<T> elements while we progress through the resulting iterator.
UPDATE
After going through all the suggestions, I was able to implement my operator. And here're the official docs.
I'm adding a bounty to reward anyone who can improve it further, though at this point a PR is preferable (it is for a public library), or at least something that fits the same protocol.
Judging from your library implementation, you actually want to transform an AsyncIterable<Promise<T>> into an AsyncIterator<T> by racing up to N of the produced promises concurrently. I would implement that as follows:
async function* limitConcurrent<T>(iterable: AsyncIterable<Promise<T>>, n: number): AsyncIterator<T> {
  const pool = new Set();
  for await (const p of iterable) {
    const promise = Promise.resolve(p).finally(() => {
      pool.delete(promise); // FIXME see below
    });
    promise.catch(() => { /* ignore */ }); // mark rejections as handled
    pool.add(promise);
    if (pool.size >= n) {
      yield /* await */ Promise.race(pool);
    }
  }
  while (pool.size) {
    yield /* await */ Promise.race(pool);
  }
}
Notice that if one of the promises in the pool rejects, the returned iterator will end with the error and the results of the other promises that are currently in the pool will be ignored.
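For illustration, here is one way to consume it (a sketch; the source helper is hypothetical). Note that an async generator cannot produce the AsyncIterable<Promise<T>> source, because yield in an async generator awaits its operand, so the iterator is written out manually:

function source(count) {
  let i = 0;
  return {
    [Symbol.asyncIterator]() {
      return {
        async next() {
          if (i >= count) return { done: true, value: undefined };
          // hand out the promise object itself, not its resolved value
          const value = new Promise(res => setTimeout(res, Math.random() * 1000, i++));
          return { done: false, value };
        }
      };
    }
  };
}

(async () => {
  const it = limitConcurrent(source(10), 3);
  try {
    for (let r = await it.next(); !r.done; r = await it.next()) {
      console.log(r.value); // values arrive in resolution order, not source order
    }
  } catch (err) {
    console.error('a pooled promise rejected:', err);
  }
})();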
However, the above implementation presumes that the iterable is relatively fast, as it needs to produce n promises before the pool is raced for the first time. If it yields promises more slowly than they take to resolve, the results are held up unnecessarily.
And worse, the above implementation may lose values. If the returned iterator is not consumed fast enough, or the iterable is not yielding fast enough, multiple promise handlers may delete their respective promise from the pool during one iteration of the loop, and the Promise.race will consider only one of them.
So this would work for a synchronous iterable, but if you actually have an asynchronous iterable, you need a different solution. Essentially you have a consumer and a producer that are more or less independent, and what you need is some queue between them.
Yet with a single queue it still wouldn't handle backpressure: the producer just runs as fast as it can (given the iteration of promises and the concurrency limit) while filling the queue. What you really need is a channel that allows synchronisation in both directions, e.g. using two queues:
class AsyncQueue<T> {
  resolvers: null | ((res: IteratorResult<T> | Promise<never>) => void)[];
  promises: Promise<IteratorResult<T>>[];

  constructor() {
    // invariant: at least one of the arrays is empty.
    // when `resolvers` is `null`, the queue has ended.
    this.resolvers = [];
    this.promises = [];
  }

  putNext(result: IteratorResult<T> | Promise<never>): void {
    if (!this.resolvers) throw new Error('Queue already ended');
    if (this.resolvers.length) this.resolvers.shift()(result);
    else this.promises.push(Promise.resolve(result));
  }

  put(value: T): void {
    this.putNext({done: false, value});
  }

  end(): void {
    for (const res of this.resolvers) res({done: true, value: undefined});
    this.resolvers = null;
  }

  next(): Promise<IteratorResult<T>> {
    if (this.promises.length) return this.promises.shift();
    else if (this.resolvers) return new Promise(resolve => { this.resolvers.push(resolve); });
    else return Promise.resolve({done: true, value: undefined});
  }

  [Symbol.asyncIterator](): AsyncIterator<T> {
    // Todo: Use AsyncIterator.from()
    return this;
  }
}
function limitConcurrent<T>(iterable: AsyncIterable<Promise<T>>, n: number): AsyncIterator<T> {
  const produced = new AsyncQueue<T>();
  const consumed = new AsyncQueue<void>();
  (async () => {
    try {
      let count = 0;
      for await (const p of iterable) {
        const promise = Promise.resolve(p);
        promise.then(value => {
          produced.put(value);
        }, _err => {
          produced.putNext(promise); // with rejection already marked as handled
        });
        if (++count >= n) {
          await consumed.next(); // happens after any produced.put[Next]()
          count--;
        }
      }
      while (count) {
        await consumed.next(); // happens after any produced.put[Next]()
        count--;
      }
    } catch(e) {
      // ignore `iterable` errors?
    } finally {
      produced.end();
    }
  })();
  return (async function*() {
    for await (const value of produced) {
      yield value;
      consumed.put();
    }
  }());
}
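A usage sketch, reusing the hypothetical source helper from the earlier snippet; the producer is now throttled through the consumed queue, so a slow consumer no longer loses values:

(async () => {
  const it = limitConcurrent(source(10), 3);
  for (let r = await it.next(); !r.done; r = await it.next()) {
    console.log(r.value); // results stream out as they resolve
    await new Promise(res => setTimeout(res, 500)); // a slow consumer is fine here
  }
})();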
function createCache() {
  const resolve = [];
  const sortedPromises = [];
  const noop = () => void 0;
  return {
    get length() {
      return sortedPromises.length;
    },
    add(promiseOrValue) {
      const q = new Promise(r => {
        resolve.push(r);
        const _ = () => {
          resolve.shift()(promiseOrValue);
        };
        Promise.resolve(promiseOrValue).then(_, _);
      });
      q.catch(noop); // prevent q from throwing when rejected.
      sortedPromises.push(q);
    },
    next() {
      return sortedPromises.length ?
        { value: sortedPromises.shift() } :
        { done: true };
    },
    [Symbol.iterator]() {
      return this;
    }
  };
}
(async() => {
  const sleep = (ms, value) => new Promise(resolve => setTimeout(resolve, ms, value));
  const cache = createCache();
  const start = Date.now();
  function addItem() {
    const t = Math.floor(Math.random() ** 2 * 8000), // when to resolve
      val = t + Date.now() - start; // ensure that the resolved value is in ASC order.
    console.log("add", val);
    cache.add(sleep(t, val));
  }
  // add a few initial items
  Array(5).fill().forEach(addItem);
  // check error handling with a rejecting promise.
  cache.add(sleep(1500).then(() => Promise.reject("a rejected Promise")));
  while (cache.length) {
    try {
      for await (let v of cache) {
        console.log("yield", v);
        if (v < 15000 && Math.random() < .5) {
          addItem();
        }
        // slow down iteration, like if you'd await some API-call.
        // promises now resolve faster than we pull them.
        await sleep(1000);
      }
    } catch (err) {
      console.log("error:", err);
    }
  }
  console.log("done");
})();
This works with both for (const promise of cache) { ... } and for await (const value of cache) { ... }.
Error-handling:
for (const promise of cache) {
  try {
    const value = await promise;
  } catch (error) { ... }
}
// or
while (cache.length) {
  try {
    for await (const value of cache) {
      ...
    }
  } catch (error) { ... }
}
Rejected promises (in the cache) don't throw until you .then() or await them.
It also handles backpressure (when your loop is iterating slower than the promises resolve):
for await (const value of cache) {
  await somethingSlow(value);
}

JavaScript: wait for a for loop to finish before executing the next loop

I have a scenario where I have to delete attachments first and then upload new attachments. Here is my code:
var programs = this.UploadModel.getProperty("/programs/items");
// Delete files first
for (var i = 0; i < filesToDelete.length; i++) {
  oThis._callAttachmentWS("DELETE", proj, filesToDelete[i]);
}
// Then save new files
for (var i = 0; i < programs.length; i++) {
  oThis._callAttachmentWS("SAVE", proj, programs[i]);
}
How do I make the second for loop wait for the first loop to finish?
Since the OP within the comments states ...
"... it calls a web service and the return is true or false"
"... the function is coming from another controller. It can be changed. Since it's an ajax call, then callback back is most likely supported"
... and looking at how ...
oThis._callAttachmentWS("DELETE", proj, filesToDelete[i]);
... respectively ...
oThis._callAttachmentWS("SAVE", proj, programs[i]);
... are being used, one could assume the _callAttachmentWS method returns a Promise.
Promise.all and Promise.allSettled each operate upon the states of a list of promises and themselves return a promise.
The next example utilizes the latter method. The implementation mocks the behavior of an asynchronous (promise-returning) _callAttachmentWS method, and there are promise-returning helper functions for the (also mocked) file save/delete tasks. The main task, handleFileDeleteAndFileSave, shows one possible way of handling the promise chain(s) ...
function callAttachmentWS(action, project, fileName) {
  return new Promise((resolve, reject) => {
    setTimeout(() => {
      // file action completed.
      resolve({ action, fileName });
    }, 3000);
  });
}
// var programs = this.UploadModel.getProperty("/programs/items");
// // Delete files first
// for (var i = 0; i < filesToDelete.length; i++) {
//   oThis._callAttachmentWS("DELETE", proj, filesToDelete[i]);
// }
//
// // Then save new files
// for (var i = 0; i < programs.length; i++) {
//   oThis._callAttachmentWS("SAVE", proj, programs[i]);
// }
function triggerFileActions(action, fileList) {
  console.log(`+++ trigger ${ action.toLowerCase() } files +++`);
  // returns an array of promises.
  return fileList.map(fileName =>
    /*oThis._*/callAttachmentWS(action, 'my-project-name', fileName)
  );
}

function deleteFiles(fileList) {
  // returns a promise.
  return Promise.allSettled(triggerFileActions('DELETE', fileList));
}

function saveFiles(fileList) {
  // returns a promise.
  return Promise.allSettled(triggerFileActions('SAVE', fileList));
}

function handleFileDeleteAndFileSave(deleteList, saveList) {
  // returns a promise.
  return deleteFiles(
    deleteList
  ).then(deleteResultList => {
    deleteResultList.forEach(result => console.log(result));
    console.log('... delete files finished ...');
  }).then(() => {
    // returns a promise.
    return saveFiles(
      saveList
    ).then(saveResultList => {
      saveResultList.forEach(result => console.log(result));
      console.log('... save files finished ...');
    }).then(() => '+++ handleFileDeleteAndFileSave is settled +++');
  });
}
const filesToDelete = ['foo', 'bar', 'baz'];
const programs = ['bizz', 'buzz'];

handleFileDeleteAndFileSave(
  filesToDelete,
  programs,
)
.then(status => console.log(status));
As the above code shows, correctly sequencing the file delete/save handling relies on nested promise chains. To free programmers from writing and maintaining such structures, the async ... await syntax was introduced.
The next code example repeats the above code block, just in a more imperative programming style ...
async function callAttachmentWS(action, project, fileName) {
  return new Promise((resolve, reject) => {
    setTimeout(() => {
      // file action completed.
      resolve({ action, fileName });
    }, 3000);
  });
}

function triggerFileActions(action, fileList) {
  console.log(`+++ trigger ${ action.toLowerCase() } files +++`);
  // returns an array of promises.
  return fileList.map(fileName =>
    callAttachmentWS(action, 'my-project-name', fileName)
  );
}

async function deleteFiles(fileList) {
  // returns a promise.
  return Promise.allSettled(triggerFileActions('DELETE', fileList));
}

async function saveFiles(fileList) {
  // returns a promise.
  return Promise.allSettled(triggerFileActions('SAVE', fileList));
}

async function handleFileDeleteAndFileSave(deleteList, saveList) {
  // handles promises (async functions) via `await` syntax,
  // which makes it an async function too
  // and thus (implicitly) returns a promise.
  const deleteResultList = await deleteFiles(deleteList);
  deleteResultList.forEach(result => console.log(result));
  console.log('... delete files finished ...');

  const saveResultList = await saveFiles(saveList);
  saveResultList.forEach(result => console.log(result));
  console.log('... save files finished ...');

  return '+++ handleFileDeleteAndFileSave is settled +++';
}

const filesToDelete = ['foo', 'bar', 'baz'];
const programs = ['bizz', 'buzz'];

(async function () {
  const status =
    await handleFileDeleteAndFileSave(filesToDelete, programs);
  console.log(status);
}());
You can use async/await. See the MDN documentation on async functions for more about them.
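A minimal sketch of that approach for the original scenario (assuming _callAttachmentWS returns a promise):

async function deleteThenSave() {
  // delete files first, waiting for every DELETE to settle
  await Promise.all(filesToDelete.map(f => oThis._callAttachmentWS("DELETE", proj, f)));
  // only then save the new files
  await Promise.all(programs.map(p => oThis._callAttachmentWS("SAVE", proj, p)));
}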

Specific task for each async call and promise wait for all

Is it safe to use asyncs in javascript like this:
async coolFunction(users) {
  const firstPromise = findPrivilegesInOneDbAsync();
  const secondPromise = findPrivilegesInSecondDbAsync();
  // LABEL_1
  firstPromise.then(privilege => {
    users.forEach(user => {
      if (user.privCode === privilege.code) {
        user.privileges.push(privilege);
      }
    });
  });
  // LABEL_2
  secondPromise.then(privilege => {
    users.forEach(user => {
      if (user.altPrivCode === privilege.differentCode) {
        user.privileges.push(privilege);
        user.hasAlternativePrvis = true;
      }
    });
  });
  // LABEL_3
  Promise.all([firstPromise, secondPromise]).then(() => {
    console.log("DONE!");
    // do something
  });
}
The question is, is it guaranteed that LABEL_3 - Promise.all callback gonna execute after first and second promise (order in those two of course does not matter) callbacks are done?
...is it guaranteed that LABEL_3 - Promise.all callback gonna execute after first and second promise (order in those two of course does not matter) callbacks are done?
Yes, it is. The fulfillment handlers on a promise are called in order of registration. Since your earlier ones are registered before your Promise.all ones, they'll be run first.
Example:
function delay(ms, ...args) {
  return new Promise(resolve => {
    setTimeout(resolve, ms, ...args);
  });
}

const promise = delay(800);
promise.then(() => {
  console.log("first");
});
promise.then(() => {
  console.log("second");
});
But it would probably be more idiomatic to use the promises returned by then instead:
async coolFunction(users) {
  const firstPromise = findPrivilegesInOneDbAsync();
  const secondPromise = findPrivilegesInSecondDbAsync();
  Promise.all([
    firstPromise.then(privilege => {
      users.forEach(user => {
        if (user.privCode === privilege.code) {
          user.privileges.push(privilege);
        }
      });
    }),
    secondPromise.then(privilege => {
      users.forEach(user => {
        if (user.altPrivCode === privilege.differentCode) {
          user.privileges.push(privilege);
          user.hasAlternativePrvis = true;
        }
      });
    })
  ])
  .then(() => {
    console.log("DONE!");
    // do something
  });
}
That would also have the advantage of waiting for any promises returned by those fulfillment handlers before executing the "done" logic.
It's probably worth noting that there's no reason for that method to be async if you're going to do things in parallel like that and use .then handlers rather than await. You could do this, though, to keep the processing of the first two things in parallel but wait for them both to finish:
async coolFunction(users) {
  const firstPromise = findPrivilegesInOneDbAsync();
  const secondPromise = findPrivilegesInSecondDbAsync();
  await Promise.all([
    firstPromise.then(privilege => {
      users.forEach(user => {
        if (user.privCode === privilege.code) {
          user.privileges.push(privilege);
        }
      });
    }),
    secondPromise.then(privilege => {
      users.forEach(user => {
        if (user.altPrivCode === privilege.differentCode) {
          user.privileges.push(privilege);
          user.hasAlternativePrvis = true;
        }
      });
    })
  ]);
  console.log("DONE!");
  // do something
}
That would also wait for any promises returned by those fulfillment handlers before executing the "done" logic.

Why doesn't .then execute after previous .then in async function?

Multiple calls to _dispatch sometimes cause the promises passed to _dispatch to be executed at the same time. Isn't .then supposed to execute after the previous .then?
// Failing code
async _dispatch (promise) {
  // this._mutex is a Promise
  this._mutex = this._mutex.then(() => promise)
  return Promise.resolve(this._mutex)
}

// Possibly working code
async _dispatch (promise) {
  console.log('START_CS', promise)
  while (Atomics.load(this.done, 0) === 0) {
    await this.sleep(50)
  }
  Atomics.store(this.done, 0, 0)
  console.log('IN_CS', promise)
  const ret = await promise
  Atomics.store(this.done, 0, 1)
  console.log('END_CS', promise)
  return ret
}
_dispatch is used in the following manner:
async getStatus (ports) {
  const request = // ...
  return this._dispatch(someAsyncFunctionReturningArray(request, ports))
}

const polling = () => {
  const sleep = new Promise(resolve => setTimeout(resolve, 500))
  const status = this.getStatus().then(() => {}).catch(() => {})
  return Promise.all([sleep, status])
    .then(polling)
}

polling()
polling() and another similar block of code is running at the same time. I noticed that someAsyncFunctionReturningArray is called concurrently.
Promises carry information about the state of a task and allow you to act on that state. They don’t, in general, represent tasks themselves. Here’s the equivalent of what you’re doing:
async function foo() {
  console.log('foo() task ran');
}

function delay() {
  return new Promise(resolve => {
    setTimeout(resolve, 1000);
  });
}

const promise = foo();
delay().then(() => promise);
This isn’t going to delay promise by a second, because promise is just an object that can say “resolved with value X”, “rejected with error Y”, or “pending”. There’s no concept of delaying promises – you delay tasks. The work is done by foo and starts when you call foo().
It’s not quite clear what the correct replacement would be in your question. I think this is what you were going for:
_dispatch (action) {
  this._mutex = this._mutex.then(() => action())
  return this._mutex
}

async getStatus (ports) {
  const request = // ...
  return this._dispatch(() => someAsyncFunctionReturningArray(request, ports))
}
but it’s possible there’s an entirely different approach that works better, and we’d need more details on what you’re trying to accomplish with this queue to recommend one.
A Promise is not a task that generates a value; it is rather a value immediately returned by a task that takes a while, which at some point passes out the result. Chaining one promise onto another does not influence what the tasks are doing. However, a callback could be called when a promise resolves, like:
async _dispatch(callback) {
  this._mutex = this._mutex.then(() => callback());
  return this._mutex;
}
That can then be used as:
const getStatus = (ports) => this._dispatch(() => {
  const request = // ...
  return someAsyncFunctionReturningArray(request, ports);
});

const polling = () => {
  // create a fresh sleep promise on each iteration, so every poll waits 500 ms
  const sleep = new Promise(resolve => setTimeout(resolve, 500));
  const status = getStatus().then(() => {}).catch(() => {});
  return Promise.all([sleep, status])
    .then(polling);
};

polling();

Limit concurrency of pending promises

I'm looking for a promise function wrapper that can limit / throttle when a given promise is running so that only a set number of that promise is running at a given time.
In the case below, delayPromise should never run concurrently; each call should run one at a time in first-come-first-serve order.
import Promise from 'bluebird'

function _delayPromise (seconds, str) {
  console.log(str)
  return Promise.delay(seconds)
}

let delayPromise = limitConcurrency(_delayPromise, 1)

async function a() {
  await delayPromise(100, "a:a")
  await delayPromise(100, "a:b")
  await delayPromise(100, "a:c")
}

async function b() {
  await delayPromise(100, "b:a")
  await delayPromise(100, "b:b")
  await delayPromise(100, "b:c")
}

a().then(() => console.log('done'))
b().then(() => console.log('done'))
Any ideas on how to get a queue like this set up?
I have a "debounce" function from the wonderful Benjamin Gruenbaum. I need to modify this to throttle a promise based on it's own execution and not the delay.
export function promiseDebounce (fn, delay, count) {
  let working = 0
  let queue = []
  function work () {
    if ((queue.length === 0) || (working === count)) return
    working++
    Promise.delay(delay).tap(function () { working-- }).then(work)
    var next = queue.shift()
    next[2](fn.apply(next[0], next[1]))
  }
  return function debounced () {
    var args = arguments
    return new Promise(function (resolve) {
      queue.push([this, args, resolve])
      if (working < count) work()
    }.bind(this))
  }
}
I don't think there are any libraries to do this, but it's actually quite simple to implement yourself:
function sequential(fn) { // limitConcurrency(fn, 1)
  let q = Promise.resolve();
  return function(x) {
    const p = q.then(() => fn(x));
    q = p.reflect();
    return p;
  };
}
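Note that reflect() is Bluebird-specific. With native promises, a sketch of the same idea could swallow rejections explicitly so the chain stays alive:

function sequential(fn) {
  let q = Promise.resolve();
  return function (x) {
    const p = q.then(() => fn(x));
    q = p.catch(() => {}); // keep the queue going even if fn(x) rejects
    return p;
  };
}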
For multiple concurrent requests it gets a little trickier, but can be done as well.
function limitConcurrency(fn, n) {
  if (n == 1) return sequential(fn); // optimisation
  let q = Promise.resolve();
  const active = new Set();
  const fst = t => t[0];
  const snd = t => t[1];
  return function(x) {
    function put() {
      const p = fn(x);
      const a = p.reflect().then(() => {
        active.delete(a);
      });
      active.add(a);
      return [Promise.race(active), p];
    }
    if (active.size < n) {
      const r = put();
      q = fst(r);
      return snd(r);
    } else {
      const r = q.then(put);
      q = r.then(fst);
      return r.then(snd);
    }
  };
}
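A usage sketch, assuming the Bluebird import from the question (fn must return a Bluebird promise, since reflect() is called on its result):

const task = id => Promise.delay(100).then(() => console.log(id, 'done'));
const limited = limitConcurrency(task, 2);
['a', 'b', 'c', 'd', 'e'].forEach(id => limited(id)); // at most two tasks in flight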
Btw, you might want to have a look at the actor model and CSP. They can simplify dealing with such things, and there are a few JS libraries for them out there as well.
Example
import Promise from 'bluebird'

function sequential(fn) {
  var q = Promise.resolve();
  return (...args) => {
    const p = q.then(() => fn(...args))
    q = p.reflect()
    return p
  }
}

async function _delayPromise (seconds, str) {
  console.log(`${str} started`)
  await Promise.delay(seconds)
  console.log(`${str} ended`)
  return str
}

let delayPromise = sequential(_delayPromise)

async function a() {
  await delayPromise(100, "a:a")
  await delayPromise(200, "a:b")
  await delayPromise(300, "a:c")
}

async function b() {
  await delayPromise(400, "b:a")
  await delayPromise(500, "b:b")
  await delayPromise(600, "b:c")
}

a().then(() => console.log('done'))
b().then(() => console.log('done'))
// --> with sequential()
// $ babel-node test/t.js
// a:a started
// a:a ended
// b:a started
// b:a ended
// a:b started
// a:b ended
// b:b started
// b:b ended
// a:c started
// a:c ended
// b:c started
// done
// b:c ended
// done
// --> without calling sequential()
// $ babel-node test/t.js
// a:a started
// b:a started
// a:a ended
// a:b started
// a:b ended
// a:c started
// b:a ended
// b:b started
// a:c ended
// done
// b:b ended
// b:c started
// b:c ended
// done
Use the throttled-promise module:
https://www.npmjs.com/package/throttled-promise
var ThrottledPromise = require('throttled-promise'),
    promises = [
      new ThrottledPromise(function(resolve, reject) { ... }),
      new ThrottledPromise(function(resolve, reject) { ... }),
      new ThrottledPromise(function(resolve, reject) { ... })
    ];

// Run promises, but only 2 parallel
ThrottledPromise.all(promises, 2)
  .then( ... )
  .catch( ... );
I have the same problem. I wrote a library to implement it. Code is here. I created a queue to hold all the promises. When you push some promises to the queue, the first several promises at the head of the queue are popped and run. Once one promise is done, the next promise in the queue is popped and run, again and again, until the queue has no tasks left. You can check the code for details. Hope this library helps you.
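The library's code isn't reproduced here, but a minimal sketch of the scheme described above (all names hypothetical) could look like this:

function createLimitQueue(limit) {
  const queue = []; // tasks waiting to run; each task is a function returning a promise
  let running = 0;

  function runNext() {
    if (running >= limit || queue.length === 0) return;
    running++;
    const { task, resolve, reject } = queue.shift();
    task().then(resolve, reject).finally(() => {
      running--;
      runNext(); // one promise is done, pop the next task
    });
  }

  return {
    push(task) {
      return new Promise((resolve, reject) => {
        queue.push({ task, resolve, reject });
        runNext();
      });
    }
  };
}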
Advantages
you can define the amount of concurrent promises (near-simultaneous requests)
consistent flow: once one promise resolves, another request starts; no need to guess the server capability
robust against data choke: if the server stops for a moment, it will just wait, and the next tasks will not start just because the clock allowed it
does not rely on a 3rd-party module; it is vanilla Node.js
1st thing is to make https a promise, so we can use await to retrieve data (removed from the example)
2nd create a promise scheduler that submits another request as any promise gets resolved.
3rd make the calls
Limiting the request rate by limiting the amount of concurrent promises
const https = require('https')

function httpRequest(method, path, body = null) {
  const reqOpt = {
    method: method,
    path: path,
    hostname: 'dbase.ez-mn.net',
    headers: {
      "Content-Type": "application/json",
      "Cache-Control": "no-cache"
    }
  }
  if (method == 'GET') reqOpt.path = path + '&max=20000'
  if (body) reqOpt.headers['Content-Length'] = Buffer.byteLength(body);
  return new Promise((resolve, reject) => {
    const clientRequest = https.request(reqOpt, incomingMessage => {
      let response = {
        statusCode: incomingMessage.statusCode,
        headers: incomingMessage.headers,
        body: []
      };
      let chunks = ""
      incomingMessage.on('data', chunk => { chunks += chunk; });
      incomingMessage.on('end', () => {
        if (chunks) {
          try {
            response.body = JSON.parse(chunks);
          } catch (error) {
            reject(error)
          }
        }
        console.log(response)
        resolve(response);
      });
    });
    clientRequest.on('error', error => { reject(error); });
    if (body) { clientRequest.write(body) }
    clientRequest.end();
  });
}
const asyncLimit = (fn, n) => {
  const pendingPromises = new Set();
  return async function(...args) {
    while (pendingPromises.size >= n) {
      await Promise.race(pendingPromises);
    }
    const p = fn.apply(this, args);
    const r = p.catch(() => {});
    pendingPromises.add(r);
    await r;
    pendingPromises.delete(r);
    return p;
  };
};
// httpRequest is the function whose request rate we want to limit
// in this case, we set 8 requests running while not blocking other tasks (concurrency)
let ratedhttpRequest = asyncLimit(httpRequest, 8);

// this is our dataset and caller
let process = async () => {
  const patchData = [
    { path: '/rest/slots/80973975078587', body: { score: 3 } },
    { path: '/rest/slots/809739750DFA95', body: { score: 5 } },
    { path: '/rest/slots/AE0973750DFA96', body: { score: 5 } }]
  for (let i = 0; i < patchData.length; i++) {
    ratedhttpRequest('PATCH', patchData[i].path, patchData[i].body)
  }
  console.log('completed')
}
process()
The classic way of running async processes in series is to use async.js and its async.series(). If you prefer promise-based code, then there is a promise version of async.js: async-q
With async-q you can once again use series:
async.series([
  function() { return delayPromise(100, "a:a") },
  function() { return delayPromise(100, "a:b") },
  function() { return delayPromise(100, "a:c") }
])
.then(function() {
  console.log('done');
});
Running two of them at the same time will run a and b concurrently but within each they will be sequential:
// these two will run concurrently but each will run
// their array of functions sequentially:
async.series(a_array).then(()=>console.log('a done'));
async.series(b_array).then(()=>console.log('b done'));
If you want to run b after a then put it in the .then():
async.series(a_array)
  .then(() => {
    console.log('a done');
    return async.series(b_array);
  })
  .then(() => {
    console.log('b done');
  });
If, instead of running each sequentially, you want to limit each to a set number of concurrent processes, you can use parallelLimit():
// Run two promises at a time:
async.parallelLimit(a_array, 2)
  .then(() => console.log('done'));
Read up on the async-q docs: https://github.com/dbushong/async-q/blob/master/READJSME.md
