Call function after Promise.all inside a loop - javascript

I wrote some JavaScript using promises. I create two promises inside a for loop like this:
for (let i = 0; i < globalName.length; i++) {
    let debug = globalName[i];
    var promise1 = new Promise(function(resolve, reject) {
        var j = searchStart(startT, debug);
        resolve(j);
    }).then(function(result) {
        sxx = result;
    });
    var promise2 = new Promise(function(resolve, reject) {
        var k = searchEnd(endT, debug);
        resolve(k);
    }).then(function(result) {
        syy = result;
    });
    Promise.all([promise1, promise2]).then(function(values) {
        let localed = [];
        entry[i] = sxx;
        exit[i] = syy;
        localed.push({
            "name" : debug,
            "first" : entry[i],
            "last" : exit[i]
        });
        xtable.rows.add(localed).draw();
    });
}
In each promise I call searchStart(startT, debug) or searchEnd(endT, debug); inside each of those functions I also wrote promise code that returns a value from an API (an API already available on a device, which returns JSON data when called). The JSON data works fine, and I can access it in my functions and return the intended values.
In the Promise.all callback, once my functions have returned their values, I write the data into a table built with DataTables. But because that callback runs every time the two promises above resolve, it can only write to my table one row at a time.
What I want to ask is: can I somehow collect all the data first, and once the data is complete, call another function to write it to the table?

You can .map each debug to its associated Promise.all, so that you have an array of Promise.alls. Then, after calling Promise.all on that array, you can add all rows at once.
Note that since searchStart and searchEnd appear to already return Promises, there's no need for the explicit Promise constructor antipattern: simply use the existing Promises directly. Also, by returning a value inside a .then, you can avoid outer variables like sxx, syy, entry[i], and exit[i]:
const promiseAlls = globalName.map((debug, i) => {
    return Promise.all([
        debug, // see below for note
        searchStart(startT, debug),
        searchEnd(endT, debug)
    ]);
});
Promise.all(promiseAlls).then((allArrs) => {
    allArrs.forEach(([
        name,  // this is the same as the "debug" variable above
        first, // this is the same as `entry[i]`, or `sxx`, in your original code
        last   // this is the same as `exit[i]`, or `syy`, in your original code
    ]) => {
        const localed = [{ name, first, last }];
        xtable.rows.add(localed).draw();
    });
});
The debug is used in the initial Promise.all even though it's not a Promise so that it can be passed along and used with its other associated values, once they've been resolved.
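If async/await is available in your environment, a roughly equivalent sketch (still assuming searchStart and searchEnd return promises, as noted above; buildTable is just an illustrative name) would be:
async function buildTable() {
    // build one row object per name, running both lookups in parallel
    const rows = await Promise.all(globalName.map(async (debug) => {
        const [first, last] = await Promise.all([
            searchStart(startT, debug),
            searchEnd(endT, debug)
        ]);
        return { name: debug, first, last };
    }));
    // add every row in a single call once all the data is ready
    xtable.rows.add(rows).draw();
}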

I'm not entirely clear on what you want, but here are two approaches that may help.
Solution 1: resolves one promise at a time, then proceeds to the next
function searchStartAndEnd(flag = false, date, debug) {
    return new Promise((resolve, reject) => {
        var j;
        if (flag) {
            j = searchStart(date, debug);
        } else {
            // the original called searchStart in both branches;
            // presumably the else branch should call searchEnd
            j = searchEnd(date, debug);
        }
        resolve(j);
    });
}

// Note: this loop must run inside an async function for `await` to work.
for (let i = 0; i < globalName.length; i++) {
    let debug = globalName[i];
    let localed = []; // declared here per iteration, as in the question
    sxx = await searchStartAndEnd(true, startT, debug);
    syy = await searchStartAndEnd(false, endT, debug);
    localed.push({
        "name" : debug,
        "first" : sxx,
        "last" : syy
    });
    xtable.rows.add(localed).draw();
}
Solution 2: resolves both promises in parallel, then moves on to the next task and the next iteration
function searchStartAndEnd(flag = false, date, debug) {
    return new Promise((resolve, reject) => {
        var j;
        if (flag) {
            j = searchStart(date, debug);
        } else {
            // as above, presumably searchEnd was intended here
            j = searchEnd(date, debug);
        }
        resolve(j);
    });
}

// Again, this loop must be inside an async function for `await` to work.
for (let i = 0; i < globalName.length; i++) {
    let debug = globalName[i];
    let localed = [];
    [sxx, syy] = await Promise.all([
        searchStartAndEnd(true, startT, debug),
        searchStartAndEnd(false, endT, debug)
    ]);
    localed.push({
        "name" : debug,
        "first" : sxx,
        "last" : syy
    });
    xtable.rows.add(localed).draw();
}

Related

Saving data from spawned process into variables in Javascript

I am having issues saving the results from a spawned python process. After converting data into json, I push the data to an array defined within the function before the spawn process is called, but the array keeps returning undefined. I can console.log and show the data correctly, but the array that is returned from the function is undefined. Any input would be greatly appreciated. Thanks in advance.
function sonar_projects() {
    const projects = [];
    let obj;
    let str = '';
    const projects_py = spawn('python', ['sonar.py', 'projects']);
    let test = projects_py.stdout.on('data', function(data) {
        let projects = [];
        let json = Buffer.from(data).toString()
        str += json
        let json2 = json.replace(/'/g, '"')
        obj = JSON.parse(json2)
        console.log(json2)
        for (var dat in obj) {
            var project = new all_sonar_projects(obj[dat].key, obj[dat].name, obj[dat].qualifier, obj[dat].visibility, obj[dat].lastAnalysisDate);
            projects.push(project);
        }
        for (var i = 0; i < projects.length; i++) {
            console.log(projects[i].key + ' ' + projects[i].name + ' ' + projects[i].qualifier + ' ' + projects[i].visibility + ' ' + projects[i].lastAnalysisDate)
        }
        console.log(projects)
        return projects;
    });
}
First of all, going through the NodeJS documentation, we have
Child Process
child_process.spawn(command[, args][, options])
Child Process Class
Stream
Stream Readable Event "data"
Even though projects_py.stdout.on(event_name, callback) accepts a callback, it does not return whatever that callback returns; it returns the EventEmitter-like object the event was registered on (in this case stdout, whose on method was called), which belongs to the parent object (the ChildProcess named projects_py).
That is because the callback will be called every time the "data" event occurs. If registering the handler returned the callback's return value, it could only do so for a single occurrence; every later "data" event would still be processed by the callback, but nothing could be done with what it returns.
In this kind of situation, we need a way to collect and compile the data from the projects_py.stdout.on("data", callback) events once the stream is done.
You already have the collecting part. Now for the rest:
Where you currently register the "data" handler, create a promise that encapsulates the whole process:
// A promise says "we promise" to have something in the future,
// though it can also end up failing
var promise = new Promise((resolve, reject) => {
    // First, collect only the string data,
    // as things can arrive in parts
    projects_py.stdout.on('data', function(data) {
        let json = Buffer.from(data).toString();
        str += json;
    });
    // When the stream has been read completely,
    // we deliver what "we promised" by resolving with it
    projects_py.stdout.on("end", () => resolve(str));
    // When something bad occurs,
    // we say what went wrong by rejecting
    projects_py.stdout.on("error", e => reject(e));
    // With all the data collected,
    // we parse it (this part is mostly your original code)
}).then(str => {
    let json2 = str.replace(/'/g, '"');
    // obj renamed to arr because it appears to be an array
    let arr = JSON.parse(json2);
    //console.log(json2)
    const projects = [];
    // With for-of it's easier to iterate the elements of
    // an array (or any iterable)
    for (var dat of arr) {
        var project = new all_sonar_projects(
            dat.key, dat.name, dat.qualifier,
            dat.visibility, dat.lastAnalysisDate
        );
        projects.push(project);
    }
    // Template strings such as `${expression} text` make it
    // easier to build one big string, and are still fast
    for (var i = 0; i < projects.length; i++)
        console.log(
            `${projects[i].key} ${projects[i].name} ` +
            `${projects[i].qualifier} ${projects[i].visibility} ` +
            projects[i].lastAnalysisDate
        );
    console.log(projects);
    // Your projects array, now full of data
    return projects;
    // Finally, we catch any error that might have happened
    // and show it on the console
}).catch(e => console.error(e));
}
Now, if you want to do anything with your array of projects, there are two main options:
Promise (then / catch) way
// Your function
function sonar_projects() {
    // The new promise
    var promise = ...
    // Since the work to build the projects array
    // is already all set up, you just use it, in an inner scope
    promise.then(projects => {
        ...
    });
}
Also, you can just return the promise variable and do the promise things with it outside of sonar_projects (with then / catch and callbacks).
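For example, if you add return promise; at the end of sonar_projects, a caller could consume it like this (a minimal sketch):
sonar_projects()
    .then(projects => {
        // use the array of project objects here
        console.log(projects.length);
    })
    .catch(e => console.error(e));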
async / await way
// First of all, you need to convert your function into an async one:
async function sonar_projects() {
    // As before, we get the promise
    var promise = ...
    // We tell the function to 'wait' for its data
    var projects = await promise;
    // Do whatever you would do with the projects array
    ...
}

How do I parse multiple pages?

I have been attempting to parse a site's table data into a JSON file, which I can do if I do each page one by one, but seeing as there are 415 pages that would take a while.
I have seen and read a lot of StackOverflow questions on this subject but I don't seem able to modify my script so that it:
Scrapes each page and extracts the 50 items with item IDs per page
Does so in a rate-limited way so I don't negatively affect the server
Waits until all requests are done so I can write each item + item ID to a JSON file
I believe you should be able to do this using request-promise and Promise.all but I cannot figure it out.
The actual scraping of the data is fine; I just cannot make the code scrape a page, then go to the next URL with a delay or pause in between requests.
Code below is the closest I have got, but I get the same results multiple times and I cannot slow the request rate down.
Example of the page URLS:
http://test.com/itemlist/1
http://test.com/itemlist/2
http://test.com/itemlist/3 etc (upto 415)
for (var i = 1; i <= noPages; i++) {
    urls.push({url: itemURL + i});
    console.log(itemURL + i);
}

Promise.map(urls, function(obj) {
    return rp(obj).then(function(body) {
        var $ = cheerio.load(body);
        //Some calculations again...
        rows = $('table tbody tr');
        $(rows).each(function(index, row) {
            var children = $(row).children();
            var itemName = children.eq(1).text().trim();
            var itemID = children.eq(2).text().trim();
            var itemObj = {
                "id" : itemID,
                "name" : itemName
            };
            itemArray.push(itemObj);
        });
        return itemArray;
    });
}, {concurrency : 1}).then(function(results) {
    console.log(results);
    for (var i = 0; i < results.length; i++) {
        // access the result's body via results[i]
        //console.log(results[i]);
    }
}, function(err) {
    // handle all your errors here
    console.log(err);
});
Apologies if I misunderstand node.js and its modules; I don't really use the language, but I needed to scrape some data and I really don't like Python.
Since you need the requests to run only one at a time, Promise.all() will not help.
A recursive promise (I'm not sure if that's the correct name) will.
function fetchAllPages(list) {
    if (!list || !list.length) return Promise.resolve(); // trivial exit
    var urlToFetch = list.pop();
    return fetchPage(urlToFetch)
        .then(/* wrapper that returns a Promise resolved after a delay */)
        .then(function() {
            return fetchAllPages(list); // recursion!
        });
}
This code still lacks error handling.
Also I believe it can become much clearer with async/await:
for (let url of urls) {
    await fetchAndProcess(url);
    await /* wrapper around setTimeout */;
}
but you need to find or write your own implementations of fetchAndProcess() and the delay wrapper that return promises, so they can be awaited.
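A common way to write that delay wrapper is a small promise-based sleep helper; a sketch (sleep and the 1000 ms pause are illustrative, and fetchAndProcess is assumed to return a promise):
function sleep(ms) {
    return new Promise(resolve => setTimeout(resolve, ms));
}

async function fetchAllPagesSequentially(urls) {
    for (let url of urls) {
        await fetchAndProcess(url); // assumed to return a promise
        await sleep(1000);          // pause between requests
    }
}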
After input from @skyboyer suggesting recursive promises, I was led to a GitHub Gist called Sequential execution of Promises using reduce().
First I created my array of URLs:
for (var i = 1; i <= noPages; i++) {
    //example urls[0] = "http://test.com/1"
    //example urls[1] = "http://test.com/2"
    urls.push(itemURL + i);
    console.log(itemURL + i);
}
Then
var sequencePromise = urls.reduce(function(promise, url) {
    return promise.then(function(results) {
        //fetchIDsFromURL is async (it returns a promise);
        //when the promise resolves I have my page data
        return fetchIDsFromURL(url)
            // .then needs a function, so wrap the delay in one
            // (passing promiseWithDelay(9000) directly would be ignored by .then)
            .then(itemArr => promiseWithDelay(9000).then(() => itemArr))
            .then(itemArr => {
                results.push(itemArr);
                //returning inside the .then makes sure the data is passed on to the next step
                return results;
            });
    });
}, Promise.resolve([]));
// async (returns a promise)
function fetchIDsFromURL(url) {
    return new Promise(function(resolve, reject) {
        request(url, function(err, res, body) {
            //console.log(body);
            var $ = cheerio.load(body);
            rows = $('table tbody tr');
            $(rows).each(function(index, row) {
                var children = $(row).children();
                var itemName = children.eq(1).text().trim();
                var itemID = children.eq(2).text().trim();
                var itemObj = {
                    "id" : itemID,
                    "name" : itemName
                };
                //push the 50 scraped items per page into an array and resolve with
                //the array to send the data back from the promise
                itemArray.push(itemObj);
            });
            resolve(itemArray);
        });
    });
}
//returns a promise that resolves after the timeout
function promiseWithDelay(ms) {
    return new Promise(function(resolve, reject) {
        // no clearTimeout is needed: by the time resolve runs, the timer has already fired
        setTimeout(resolve, ms);
    });
}
Then finally I call .then on the sequence of promises. The only issue I had with this was that results contained multiple arrays with the same data in each; since all the data is in each array, I just take the first one, which has all my parsed items with IDs in it, and write it to a JSON file.
sequencePromise.then(function(results) {
    var lastResult = results.length;
    console.log(results[0]);
    writeToFile(results[0]);
});
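The repeated arrays come from pushing into the shared itemArray on every call. If you want each page's promise to resolve with only its own rows, one possible adjustment (a sketch, keeping request and cheerio as above) is to build a local array per call:
function fetchIDsFromURL(url) {
    return new Promise(function(resolve, reject) {
        request(url, function(err, res, body) {
            if (err) return reject(err);
            var $ = cheerio.load(body);
            var pageItems = []; // local to this call, so results no longer repeat
            $('table tbody tr').each(function(index, row) {
                var children = $(row).children();
                pageItems.push({
                    "id" : children.eq(2).text().trim(),
                    "name" : children.eq(1).text().trim()
                });
            });
            resolve(pageItems);
        });
    });
}
With that change, results would contain one distinct array per page instead of copies of the same accumulated array.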

Waiting for a set of workers to finish

I have an array of web workers, called workers. I'm starting them all in a single function, called activate. The problem is, I want activate to return the values that are posted by the workers. I either want it to return a promise of some kind or to wait until they are all done.
So the code could be:
// the web workers add stuff in this array with onmessage()
var globalArray = [];

function activate() {
    for (var i = 0; i < workers.length; i++) {
        workers[i].postMessage('do something');
    }
    return // Promise or filled globalArray;
}
So I could use it like this:
var values = await activate();
I don't want the workers to call a separate function once the last worker is finished. Is there any way I can achieve this?
What you want to do is create a Promise; inside the Promise's executor function, start all the workers and, when the last one finishes, call the promise's resolve function. Then return this promise from your activate function.
Would be something like this:
// the web workers add stuff in this array with onmessage()
var globalArray = [];

function activate() {
    var promise = new Promise(function(resolve, reject) {
        var counter = 0;
        var array = [];
        var callback = function(message) {
            counter++;
            //You can add here the values of the messages
            globalArray.push(message.data);
            //Or add them to an array in the function of the Promise
            array.push(message.data);
            //And when all workers end, resolve the promise
            if (counter >= workers.length) {
                //We resolve the promise with the array of results.
                resolve(array);
            }
        };
        for (var i = 0; i < workers.length; i++) {
            workers[i].onmessage = callback;
            workers[i].postMessage('do something');
        }
    });
    return promise;
}
The code has not been tested, but I hope you get the idea.
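For completeness, a small usage sketch matching the question (remember that await has to appear inside an async function):
async function run() {
    const values = await activate(); // resolves with the array of worker results
    console.log(values);
}
run();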
One way to do this is to wrap everything in a promise:
const workers = [new Worker("./worker1.js"), new Worker("./worker2.js")];

const activate = () => {
    return new Promise((resolve, reject) => {
        let result = [];
        for (let i = 0; i < workers.length; i++) {
            workers[i].onmessage = function(e) {
                result.push(e.data);
                // resolve only once every worker has replied;
                // resolving right after the loop would hand back an empty array
                if (result.length === workers.length) {
                    resolve(result);
                }
            };
            workers[i].postMessage("do something");
        }
    });
};
async function f() {
    let res = await activate();
    console.log(res);
}
f();

Resolving promises inside for loop

I am trying to read a JSON object using a for loop to format the JSON data and send it back to the client by putting the formatted response into a model object.
Inside the for loop, I am dealing with two promises based on a few conditions. There are two functions, each returning a promise. How can I get my final data after all the promises are resolved? Thanks in advance.
for (var i = 0; i < jsonData.length; i++) {
    if (someCOndition) {
        getSomeData().then(function(data) {
            //some operation using data
        })
    }
    if (someOtherCOndition) {
        getSomeOtherData().then(function(data) {
            //some operation using data
        })
    }
}
Use Promise.all([promise1, promise2]) (Promise.all() on MDN) in the case of standard JS promises (ES2015+). It returns a new promise, which resolves once all the passed promises resolve. But be aware: it rejects immediately as soon as any one promise rejects (it won't wait for the others).
You might do it as follows:
var promises = [],
    JSONData_1 = ["chunk_11", "chunk_12", "chunk_13"],
    JSONData_2 = ["chunk_21", "chunk_22", "chunk_23"],
    getJSONData = (b, i) => new Promise((resolve, reject) =>
        setTimeout(_ => b ? resolve(JSONData_1[i]) : resolve(JSONData_2[i]), 1000));

for (var i = 0; i < JSONData_1.length; i++) {
    if (Math.random() < 0.5) promises.push(getJSONData(true, i));
    else promises.push(getJSONData(false, i));
}

Promise.all(promises)
    .then(a => console.log(a));
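If you would rather not have the whole batch reject as soon as one promise rejects (the caveat mentioned above), Promise.allSettled (ES2020, so newer than what the question asks about) is one alternative; a brief sketch using the same promises array:
Promise.allSettled(promises)
    .then(results => {
        results.forEach(r => {
            if (r.status === 'fulfilled') console.log(r.value);
            else console.error(r.reason);
        });
    });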
You can use jQuery.when().
var deferredList = [];
for (var i = 0; i < jsonData.length; i++) {
    if (someCOndition) {
        deferredList.push(getSomeData().then(function(data) {
            //some operation using data
        }))
    }
    if (someOtherCOndition) {
        deferredList.push(getSomeOtherData().then(function(data) {
            //some operation using data
        }))
    }
}
// jQuery.when expects individual deferreds, so spread the array with apply
jQuery.when.apply(jQuery, deferredList).done(function() {
    // final to do..
}).fail(function() {
    // fails even if a single one fails! be aware of this
});
jQuery.when() documentation
You can do it in multiple ways. We can also use a for...of loop with async/await to get the results sequentially while looping, if that is a requirement. Something like this:
function downloadPage(url) {
    return Promise.resolve('some value');
}

// the async function needs a name (or to be assigned to a variable) to be
// valid syntax; urls is assumed to be an array of URLs defined elsewhere
async function processUrls(urls) {
    for (let url of urls) {
        let result = await downloadPage(url);
        // Process the result
        console.log(result);
    }
}
You could do something like this..
var arr = [], arr2 = [];
for (var i = 0; i < jsonData.length; i++) {
    if (someCOndition) {
        //push onto the array inputs for getSomeData()
        arr.push(jsonData[i]);
    }
    if (someOtherCOndition) {
        arr2.push(jsonData[i]);
    }
}
processArr(0);
processArr2(0);

function processArr(idx) {
    if (idx >= arr.length) {
        //done
    } else {
        getSomeData().then(function(data) {
            // some operation using data
            // optionally store in a results array
            // recurse
            processArr(idx + 1)
        })
    }
}

function processArr2(idx) {
    if (idx >= arr2.length) {
        //done
    } else {
        getSomeOtherData().then(function(data) {
            // some operation using data
            // recurse
            processArr2(idx + 1)
        })
    }
}

Best es6 way to get name based results with Promise.all

By default, the Promise.all([]) function returns a numerically indexed array that contains the results of each promise.
var promises = [];
promises.push(myFuncAsync1()); //returns 1
promises.push(myFuncAsync1()); //returns 2
Promise.all(promises).then((results) => {
    //results = [1, 2]
});
What is the best vanilla way to return a named index of results with Promise.all()?
I tried with a Map, but it returns results in an array this way:
[key1, value1, key2, value2]
UPDATE:
My question seems unclear, so here is why I don't like an order-based index:
it's crappy to maintain: if you add a promise to your code you may have to rewrite the whole results function because the indexes may have changed.
it's awful to read: results[42] (can be fixed with jib's answer below)
it's not really usable in a dynamic context:
var promises = [];
if (...)
    promises.push(...);
else {
    [...].forEach(... => {
        if (...)
            promises.push(...);
        else
            [...].forEach(... => {
                promises.push(...);
            });
    });
}
Promise.all(promises).then((resultsArr) => {
    /* Here I am basically stuck without clearly named results
       that don't rely on the promises' ordering in the array */
});
ES6 supports destructuring, so if you just want to name the results you can write:
var myFuncAsync1 = () => Promise.resolve(1);
var myFuncAsync2 = () => Promise.resolve(2);
Promise.all([myFuncAsync1(), myFuncAsync2()])
.then(([result1, result2]) => console.log(result1 +" and "+ result2)) //1 and 2
.catch(e => console.error(e));
Works in Firefox and Chrome now.
Is this the kind of thing?
var promises = [];
promises.push(myFuncAsync1().then(r => ({name : "func1", result : r})));
promises.push(myFuncAsync1().then(r => ({name : "func2", result : r})));
Promise.all(promises).then(results => {
    var lookup = results.reduce((prev, curr) => {
        prev[curr.name] = curr.result;
        return prev;
    }, {});
    var firstResult = lookup["func1"];
    var secondResult = lookup["func2"];
});
If you don't want to modify the format of result objects, here is a helper function that allows assigning a name to each entry to access it later.
const allNamed = (nameToPromise) => {
    const entries = Object.entries(nameToPromise);
    return Promise.all(entries.map(e => e[1]))
        .then(results => {
            const nameToResult = {};
            for (let i = 0; i < results.length; ++i) {
                const name = entries[i][0];
                nameToResult[name] = results[i];
            }
            return nameToResult;
        });
};
Usage:
var lookup = await allNamed({
    rootStatus: fetch('https://stackoverflow.com/').then(rs => rs.status),
    badRouteStatus: fetch('https://stackoverflow.com/badRoute').then(rs => rs.status),
});
var firstResult = lookup.rootStatus; // = 200
var secondResult = lookup.badRouteStatus; // = 404
If you are using TypeScript you can even specify the relationship between input keys and results using the keyof construct:
type ThenArg<T> = T extends PromiseLike<infer U> ? U : T;

export const allNamed = <
    T extends Record<string, Promise<any>>,
    TResolved extends {[P in keyof T]: ThenArg<T[P]>}
>(nameToPromise: T): Promise<TResolved> => {
    const entries = Object.entries(nameToPromise);
    return Promise.all(entries.map(e => e[1]))
        .then(results => {
            const nameToResult: TResolved = <any>{};
            for (let i = 0; i < results.length; ++i) {
                const name: keyof T = entries[i][0];
                nameToResult[name] = results[i];
            }
            return nameToResult;
        });
};
A great solution for this is to use async/await. Not exactly ES6 like you asked, but ES8! Since Babel supports it fully, here we go:
You can avoid using only the array index by using async/await as follows.
An async function allows you to literally pause your code inside of it: the await keyword, placed before a promise, suspends the function at that point. As soon as an async function encounters await on a promise that hasn't yet resolved, the function immediately returns a pending promise; that returned promise resolves once the function actually finishes later on. The function only resumes when the awaited promise has resolved, at which point the whole await expression evaluates to that promise's resolved value, which you can store in a variable. This effectively lets you pause your code without blocking the thread. It's a great way to handle asynchronous work in JavaScript in general, because it makes your code read more chronologically and therefore easier to reason about:
async function resolvePromiseObject(promiseObject) {
    await Promise.all(Object.values(promiseObject));
    const ret = {};
    for (const [key, value] of Object.entries(promiseObject)) {
        // These all resolve instantly thanks to the previous await
        ret[key] = await value;
    }
    return ret;
}
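A hypothetical usage sketch of the helper above:
async function example() {
    const results = await resolvePromiseObject({
        one: Promise.resolve(1),
        two: Promise.resolve(2)
    });
    console.log(results); // { one: 1, two: 2 }
}
example();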
As with anything above ES5: please make sure that Babel is configured correctly so that users on older browsers can run your code without issue. You can make async/await work flawlessly even on IE11, as long as your Babel configuration is right.
In regard to @kragovip's answer, the reason you want to avoid that approach is shown here:
https://medium.com/front-end-weekly/async-await-is-not-about-making-asynchronous-code-synchronous-ba5937a0c11e
"...it’s really easy to get used to await all of your network and I/O calls.
However, you should be careful when using it multiple times in a row as the await keyword stops execution of all the code after it. (Exactly as it would be in synchronous code)"
Bad example (don't follow):
async function processData() {
    const data1 = await downloadFromService1();
    const data2 = await downloadFromService2();
    const data3 = await downloadFromService3();
    ...
}
"There is also absolutely no need to wait for the completion of first request as none of other requests depend on its result.
We would like to have requests sent in parallel and wait for all of them to finish simultaneously. This is where the power of asynchronous event-driven programming lies.
To fix this we can use Promise.all() method. We save Promises from async function calls to variables, combine them to an array and await them all at once."
Instead:
async function processData() {
    const promise1 = downloadFromService1();
    const promise2 = downloadFromService2();
    const promise3 = downloadFromService3();
    const allResults = await Promise.all([promise1, promise2, promise3]);
}
