I have a function that tries to load web images and tracks how many images loaded and how many failed. I am loading the images with fetch and using Promise.allSettled to run some operations after all the images have been validated.
const data = ["/test1.png", "/test2.png", "/test3.png"];
let imagesValidated = 0;
let imagesFailed = 0;
const promiseArr = [];
data.forEach((item) => {
const imgPromise = fetch(item);
promiseArr.push(imgPromise);
imgPromise
.then((resp) => {
if (!resp.ok) imagesFailed += 1;
})
.catch((error) => {
imagesFailed += 1;
})
.finally(() => {
// For the last image `test3.png`, the finally block runs after `allSettled`.
imagesValidated += 1;
});
});
Promise.allSettled(promiseArr).then(() => {
// some operations
});
The issue I am facing is with the finally block. For the last image, the finally block runs after the allSettled callback. This causes imagesValidated to be less than the actual number of images scanned. I do not want to remove the finally block, since I will be adding more cleanup code to it in the future.
Is this the expected behavior of the Promise resolution methods? Is there a way I can fix this code without removing the finally block?
You're pushing the original fetch Promise to the array, not the chain that goes through .then and .finally. Promise.allSettled therefore only waits for the fetches themselves to settle, not for your handlers to run, which is why the last .finally can fire after the allSettled callback. Push the whole chained Promise to the array instead.
data.forEach((item) => {
promiseArr.push(
fetch(item)
.then((resp) => {
if (!resp.ok) imagesFailed += 1;
})
.catch((error) => {
imagesFailed += 1;
})
.finally(() => {
// Now part of the chain that allSettled waits for.
imagesValidated += 1;
})
);
});
Promise.allSettled(promiseArr).then(() => {
// some operations
});
Or, even better, use .map on the original data instead.
Promise.allSettled(
data.map(item => fetch(item)
.then((resp) => {
if (!resp.ok) imagesFailed += 1;
})
.catch((error) => {
imagesFailed += 1;
})
.finally(() => {
// Now part of the chain that allSettled waits for.
imagesValidated += 1;
})
)
).then(() => {
// some operations
});
Though note that Promise.allSettled isn't buying you much here: none of the promises can reject, because of the .catch. Consider either using Promise.all, or passing just the raw fetch promises to Promise.allSettled so you can increment the counters after all responses have come back, as in the sketch below.
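A minimal sketch of that second option, assuming the same data array as above; all counting happens once, after every request has settled:
const data = ["/test1.png", "/test2.png", "/test3.png"];
let imagesValidated = 0;
let imagesFailed = 0;

Promise.allSettled(data.map((item) => fetch(item))).then((results) => {
  // Each entry is either { status: "fulfilled", value: Response }
  // or { status: "rejected", reason: Error }.
  imagesFailed = results.filter((r) => r.status === "rejected" || !r.value.ok).length;
  imagesValidated = results.length;
  // some operations
});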
So what I have right now is:
window.addEventListener('load', fetchInfo)
function fetchInfo() {
const tableRows = //an array of results
tableRows.forEach((row) => {
const rowId = //get the id of each row
fetch(...) //fetch some stuff using the id
.then(() => {
//do some stuff
return rowId;
})
.then((id) => {
//do some stuff
})
})
}
Basically I'm using rowId to fetch information and populate each table row, so this happens a few times; the table maxes out at 10 rows, so at most 10 fetches.
I want to have an event listener to see when all the fetching is done, aka when the table is completely done loading. How should I go about that?
Edit: these fetches are API requests, so they take a few seconds to respond. I've tried using Promise.all(tableRows.map(row => ...)) and it returned results before the API could respond. So in the end, it still doesn't really detect when the table actually finishes loading information.
Use Promise.all:
async function fetchInfo() {
const tableRows = //an array of results
const results = await Promise.all(tableRows.map(row => fetch(...)))
console.log(results)
}
From the suggestion in the comments, checking the length was indeed the way to go; however, that was only half of the solution. The key point here is that I want to know whether or not the table has been fully populated with the requested information.
I tried Promise.all with map() and it didn't work out because, as stated in my edit, it returned while the fetches were still pending and didn't care whether each fetch came back 200 OK, which was what I needed.
So the solution was to use response.status:
function fetchInfo() {
const tableRows = ...
let successFetch = 0;
tableRows.forEach((row) => {
fetch(...)
.then((response) => {
if (response.status == 200) {
successFetch = successFetch + 1
}
if (successFetch == tableRows.length) {
//this point here was exactly what I needed
}
return response;
})
.then((response) => {
...
})
})
}
Use Promise.all to wait for all the promises to complete:
Promise.all(
tableRows.map((row) => {
return fetch(...).then(() => {
return rowId;
})
})
).then((results) => {
console.log(results);
})
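If you also want to count only successful responses while still having a single point where everything is known to be done, a possible combination of the two approaches is sketched below, assuming buildUrl(row) is a hypothetical helper standing in for however you derive each row's request URL:
async function fetchInfo() {
  const tableRows = [/* an array of results */];

  const okFlags = await Promise.all(
    tableRows.map((row) =>
      fetch(buildUrl(row))                // buildUrl is a hypothetical helper
        .then((response) => response.ok)  // true only for 2xx responses
        .catch(() => false)               // network errors count as failures
    )
  );

  const successFetch = okFlags.filter(Boolean).length;
  if (successFetch === tableRows.length) {
    // Everything came back OK; the table is completely done loading.
  }
}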
Sorry for the very confusing question. I have this code that gets information from a website without any Node modules or libraries. It is a list of users separated into different pages, using ?page= at the end of the URL. I have managed to iterate through the pages and split up the raw HTML just right. However, my promise resolves before all the data is collected. How can I wait for everything to finish before I resolve the promise? I have tried countless solutions, but none seem to work. Please don't ask me to use a Node package, as my goal is to not use one :) A friend helped with the regex and splitting it up. Here is the code I am using:
function getData() {
return new Promise((resolve, reject) => {
let final = [] //the array of users returned in the end
const https = require("https"), url = "https://buildtheearth.net/buildteams/121/members";
https.get(url + "?page=1", request => { //initial request, gets the number of user pages.
let rawList = '';
request.setEncoding("utf8"),
request.on("data", data => {rawList += data}),
request.on("end", () => {
if(request = (request = (request = rawList.substring(rawList.indexOf('<div class="pagination">'))).substring(0, request.indexOf("</div>"))).match(/<a(.+)>(.+)<\/a>/g)) {
for(let t = parseInt(request[request.length - 1].match(/(\d+)(?!.*\d)/g)), a = 1; a < t + 1; a++) { //iterates through member pages
https.get(url + "?page=" + a, request2 => { //https request for each page of members
let rawList2 = '';
request2.setEncoding('utf8'),
request2.on("data", data => {rawList2 += data}),
request2.on("end", () => {
let i = rawList2.match(/<td>(.+)<\/td>/g); //finds table in HTML
if (i)
for (var t = 1; t < i.length; t += 3) //iterates through rows in table
console.log(i[t].replace(/<td>/g, "").replace(/<\/td>/g, "")), /* logs element to the console (for testing) */
final.push(i[t].replace(/<td>/g, "").replace(/<\/td>/g, "")); //pushes element to the array that is resolved in the end
})
})
}
}
resolve(final) //resolves promise returning final array, but resolves before elements are added with code above
})
})
})
}
If this helps, here is the website I am trying to get info from.
I am still a little new to JS so if you could help, I would really appreciate it :)
I ended up turning each action into an async function with a try and catch block and then chained the functions together with .then(). For the base (getting data from a website) I took inspiration from an article on Medium. Here is the site I am pulling data from, and here is the function to get data from a website:
const https = require('https');
const http = require('http');

const getData = async (url) => {
const lib = url.startsWith('https://') ? https : http;
return new Promise((resolve, reject) => {
const req = lib.get(url, res => {
if (res.statusCode < 200 || res.statusCode >= 300) {
return reject(new Error(`Status Code: ${res.statusCode}`));
}
const data = [];
res.on('data', chunk => data.push(chunk));
res.on('end', () => resolve(Buffer.concat(data).toString()));
});
req.on('error', reject);
req.end();
});
};
and then I got the number of pages (which can be accessed by appending ?page=<page number> to the end of the URL) with this function:
const pages = async () => {
try {
let html = await getData('https://buildtheearth.net/buildteams/121/members',);
let pages = (html = (html = html.substring(html.indexOf('<div class="pagination">'))).substring(0, html.indexOf("</div>"))).match(/<a(.+)>(.+)<\/a>/g)
let pageCount = parseInt(pages[pages.length - 1].match(/(\d+)(?!.*\d)/g))
return pageCount
} catch (error) {
console.error(error);
}
}
and then I used the page count to iterate through the pages and add the HTML of each to an array with this function:
const getPages = async pageCount => {
let returns = []
try {
for (let page = 1; page <= pageCount; page++) {
try {
let pageData = await getData('https://buildtheearth.net/buildteams/121/members?page=' + page)
returns.push(pageData)
} catch (error) {
return error
}
}
} catch (error) {
return error
} finally {return returns}
}
and then I iterated through the array of strings of HTML of each page, and extracted the data I needed out of each with this function which would return the list of members I need:
const iteratePages = async pages => {
if (!Array.isArray(pages)) return
try {
let returns = []
pages.forEach(page => {
let list = page.match(/<td>(.+)<\/td>/g);
if (list)
for (var element = 1; element < list.length; element += 3)
returns.push(list[element].replace(/<td>/g, "").replace(/<\/td>/g, ""));
})
return returns
} catch (error) {
return error
}
}
And then it was a matter of chaining each together to get the array I needed:
pages().then(pageCount => getPages(pageCount)).then(pages => iteratePages(pages)).then(finalList => {console.log(finalList); console.log(finalList.length)})
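As an aside (not part of the original solution), because getData already returns one promise per page, getPages could also fetch all the pages concurrently with Promise.all instead of awaiting them one by one. A rough sketch using the same getData helper:
const getPages = async (pageCount) => {
  // One request promise per page; Promise.all resolves once every page has arrived.
  const requests = [];
  for (let page = 1; page <= pageCount; page++) {
    requests.push(getData('https://buildtheearth.net/buildteams/121/members?page=' + page));
  }
  return Promise.all(requests);
};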
First of all, I'm sorry if this is a basic question, but I have not been able to do this.
I have a promise that inserts data into a table (SQLite), and a for loop iterating over an array. I want to put all of that data into the table, but I also want to know when it is finished, to display a message at the end. I was checking (i == array.length - 1) and displaying the message there, but that doesn't seem correct. If I don't do it this way, the message displays before the inserts have finished.
I also have two other promises that should run alongside this one, and the solution above (which already seems like a bad solution) wouldn't work in this case, because they can finish before or after the iteration. How can I know when they are all done too?
Could you help me, please?
Here's my code:
for (let i = 0; i < this.data_array.length; i++) {
this.database.insertService(this.data_array[i]).then(async () => {
console.log('Inserting object number ' + this.data_array[i].id);
if( i == this.data_array.length - 1) {
console.log('done!');
}
}).catch(err => console.log('error inserting object into the table'));
}
Thank you.
It is better practice to use Promise.all when dealing with such cases. A simple example would be the following:
let i;
let promises = [];
for (i = 0; i < 5; ++i)
promises.push(someAsyncFunc(i));
Promise.all(promises)
.then((results) => {
console.log("All done", results);
})
.catch((e) => {
console.log(e)
});
You can solve this using Promise.all.
Here is an example, and here is some useful documentation.
The example uses map instead of a for loop to iterate over the array.
I haven't tested it, but it should work:
Promise.all(
this.data_array.map((data) => {
console.log('Inserting object number ' + data.id);
return this.database.insertService(data);
})
)
.then((result) => console.log('done', result))
.catch((error) => console.log(error));
You could use Promise.all([]), where .all() takes an array.
Promise.all(
// Map the items in data_array to the async insertService function
this.data_array.map(dataEntry => this.database.insertService(dataEntry))
).then((resultArray) => {
console.log(resultArray.length);
console.log('done!');
}).catch((e) => {
console.log(e)
});
I think that Promise.all is the best solution here:
async method() {
const promises = this.data_array.map(item =>
this.database.insertService(item));
try {
const result = await Promise.all(promises);
console.log({ result });
} catch (error) {
console.log({ error });
}
}
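As for the second part of the question (the two other promises that run alongside the inserts): since Promise.all accepts any array of promises, you can nest the insert batch inside another Promise.all. A rough sketch, where otherPromiseA and otherPromiseB are hypothetical stand-ins for those two promises:
const insertAll = Promise.all(
  this.data_array.map((item) => this.database.insertService(item))
);

// otherPromiseA and otherPromiseB stand in for the question's two other promises.
Promise.all([insertAll, otherPromiseA, otherPromiseB])
  .then(() => console.log('done!'))
  .catch((error) => console.log('error', error));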
I have some code that does this: First scrape this array of webpages. After that, scrape another array of webpages.
The following code does what I expect:
let bays=[];
let promises=promisesN=[];
for (let y=2019;y>=2015;y--)
promises.push(new Promise(resolve=>
curl.get(`/*url*/${y}.html`,null, (error,resp,body)=>
resp.statusCode==200? resolve(parse(body)):reject(error)
)));
Promise.all(promises).then(()=>{
bays.forEach(bay=>{
if (bay.no.match(/\d+/)<=103) return;
promisesN.push(new Promise(resolve=>
curl.get(`/*url*/${bay.code}/`,null, (error,resp,body)=>
resp.statusCode==200? resolve(image(bey,body)):reject(error)
)))});
Promise.all(promisesN).then(()=>{
bays.sort((a,b)=>{return parseInt(a.no.match(/\d+/))<parseInt(b.no.match(/\d+/))? -1:1});
console.log(bays);
});
}).catch(error=>console.log(error));
So I've read you can write a simpler nesting-free syntax:
doSomething()
.then(function(result) {
return doSomethingElse(result);
})
.then(function(newResult) {
return doThirdThing(newResult);
})
.then(function(finalResult) {
console.log('Got the final result: ' + finalResult);
})
.catch(failureCallback);
How to apply this to the code above?
correctness
let promises=promisesN=[];
This is really incorrect. It makes both variables reference the same array, and makes promisesN an implicit global. The fact that it appears to work means you aren’t in strict mode. Always use strict mode. The correct version of what you intended is:
let promises = [];
let promisesN = [];
cleanliness
new Promise(resolve=>
curl.get(`/*url*/${y}.html`,null, (error,resp,body)=>
resp.statusCode==200? resolve(parse(body)):reject(error)
))
You’re repeating this pattern, so make it into a function, or use a package that does the job for you, like request-promise[-native] or axios. (Also, please show your real code. reject isn’t defined here.)
const getAsync = url => new Promise((resolve, reject) => {
curl.get(url, null, (error, resp, body) => {
if (resp.statusCode === 200) {
resolve(body);
} else {
reject(error);
}
});
});
Notice how you’re free to make the function more readable when it isn’t repeated, and to extend it later.
let promises = [];
let promisesN = [];
for (let y = 2019; y >= 2015; y--) {
promises.push(getAsync(`/*url*/${y}.html`).then(parse));
}
Promise.all(promises).then(bays => {
bays.forEach(bay => {
if (bay.no.match(/\d+/) <= 103) return;
promisesN.push(getAsync(`/*url*/${bay.code}/`).then(body => image(bay, body)));
});
Promise.all(promisesN).then(() => {
bays.sort((a, b) => {return parseInt(a.no.match(/\d+/)) < parseInt(b.no.match(/\d+/)) ? -1 : 1;});
console.log(bays);
});
}).catch(error => console.log(error));
I had to take a few guesses at what your real code looks like again, because you’re surely doing something with the resolved value of Promise.all(promises). It doesn’t have any easily-accessible side-effects. bey also seemed likely enough to be bay.
Now you can give promisesN a more appropriate scope:
let promises = [];
for (let y = 2019; y >= 2015; y--) {
promises.push(getAsync(`/*url*/${y}.html`).then(parse));
}
Promise.all(promises).then(bays => {
let promisesN = bays
.filter(bay => bay.no.match(/\d+/) > 103)
.map(bay => getAsync(`/*url*/${bay.code}/`).then(body => image(bay, body)));
Promise.all(promisesN).then(() => {
bays.sort((a, b) => {return parseInt(a.no.match(/\d+/)) < parseInt(b.no.match(/\d+/)) ? -1 : 1;});
console.log(bays);
});
}).catch(error => console.log(error));
and use an expression-bodied arrow function where appropriate, since you’re already using them whenever they aren’t appropriate:
bays.sort((a, b) => parseInt(a.no.match(/\d+/)) < parseInt(b.no.match(/\d+/)) ? -1 : 1);
Now, if my guess about bays is right, then you can’t unnest. If it comes from somewhere else then you can. Normally I would leave a comment about that but I already wrote all this, so… please clarify that for further cleanup.
If you're looking to simplify your code, you might consider the use of async/await instead of promises.
The async/await syntax will greatly simplify the presentation and ease comprehension of the code, especially given that your logic relies on asynchronous iteration of arrays.
Consider the following revision of your code:
/* Define local helper that wraps curl() in async function declaration */
async function doRequest(url) {
  return new Promise((resolve, reject) => curl.get(url, null, (error, resp, body) =>
    resp.statusCode == 200 ? resolve(body) : reject(error)));
}
/* Re-define simplified scrape logic using await/async */
async function doScrape() {
try {
var bays = []
/* Iterate date range asynchronously */
for (let y=2019; y>=2015; y--) {
/* Use doRequest helper function to fetch html */
const response = await doRequest(`/*url*/${y}.html`)
const bay = parse(response)
bays.push(bay)
}
/* Iterate bays array that was obtained */
for(const bay of bays) {
/* Use doRequest helper again to fetch data */
const response = await doRequest(`/*url*/${bay.code}/`)
/* Await may not be needed here */
await image(bay, response)
}
/* Perform your sort (which is non asynchronous) */
bays.sort((a,b)=> parseInt(a.no.match(/\d+/))<parseInt(b.no.match(/\d+/))? -1:1);
console.log("Result", bays);
}
catch(err) {
/* If something goes wrong we arrive here - this is
essentially equivalent to your catch() block */
console.error('Scrape failed', err);
}
}
/* Usage */
doScrape()
Hope that helps!
Not entirely sure if this is what you want, but I've separated your code out a bit because I found it easier to read that way.
let bays = [];
let promises = [];
let promisesN = [];
for (let y = 2019; y >= 2015; y--) {
const promiseOne = new Promise((resolve, reject) => {
return curl.get(`/*url*/${y}.html`, null, (error, resp, body) => {
resp.statusCode === 200 ? resolve(parse(body)) : reject(error);
});
});
promises.push(promiseOne);
}
Promise.all(promises)
.then(() => {
bays.forEach((bay) => {
if (bay.no.match(/\d+/) <= 103) {
return;
}
const promiseTwo = new Promise((resolve, reject) => {
return curl.get(`/*url*/${bay.code}/`, null, (error, resp, body) => {
resp.statusCode === 200 ? resolve(image(bay, body)) : reject(error);
});
});
promisesN.push(promiseTwo);
});
return Promise.all(promisesN);
})
.then(() => {
bays.sort((a, b) => {
return parseInt(a.no.match(/\d+/), 10) < parseInt(b.no.match(/\d+/), 10) ? -1 : 1;
});
console.log(bays);
})
.catch((error) => {
console.log(error);
});
I am wondering about one thing, though: you are firing the requests immediately on each iteration of your for loop. This might be intentional, but if you would rather defer them until everything is set up, you could wrap each promise in a function, e.g. const promiseOne = () => somePromise, build the array of functions first, and then map over that array to fire them all at once (see the sketch below). The same goes for the second set of promises.
Not sure if this is helpful, let me know if it is. Feel free to ask more questions too.
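For illustration, a minimal sketch of that deferral pattern, using a hypothetical makeRequest wrapper around curl.get that returns a promise:
// Hypothetical wrapper: any function that returns a promise would work here.
const makeRequest = (url) =>
  new Promise((resolve, reject) =>
    curl.get(url, null, (error, resp, body) =>
      resp.statusCode === 200 ? resolve(parse(body)) : reject(error)));

// Build an array of promise-returning functions first; nothing fires yet.
const jobs = [];
for (let y = 2019; y >= 2015; y--) {
  jobs.push(() => makeRequest(`/*url*/${y}.html`));
}

// Fire them all at once and wait for every result.
Promise.all(jobs.map((job) => job()))
  .then((results) => console.log(results))
  .catch((error) => console.log(error));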
Utilizing the YouTube Data API, we can make a query to obtain the first 50 videos (the maximum number of results obtainable with a single query) belonging to a user with the following request: https://www.googleapis.com/youtube/v3/search?key={access_key}&channelId={users_id}&part=id&order=date&maxResults=50&type=video
If there are more than 50 videos, then the resulting JSON will have a nextPageToken field. To obtain the next 50 videos, we can append &pageToken={nextPageToken} to the above request, producing the next page of 50 videos. This is repeatable until the nextPageToken field is no longer present.
Here is a simple JavaScript function I wrote using the fetch API to obtain a single page of videos, specified by the nextPageToken parameter (or lack thereof).
function getUploadedVideosPage(nextPageToken) {
return new Promise((resolve, reject) => {
let apiUrl = 'https://www.googleapis.com/youtube/v3/search?key={access_key}&channelId={users_id}&part=id&order=date&maxResults=50&type=video';
if(nextPageToken)
apiUrl += '&pageToken=' + nextPageToken;
fetch(apiUrl)
.then((response) => {
response.json()
.then((data) => {
resolve(data);
});
});
});
}
Now we need a wrapper function that will iteratively call getUploadedVideosPage for as long as we have a nextPageToken. Here is my 'working' albeit dangerous (more on this later) implementation.
function getAllUploadedVideos() {
return new Promise((resolve, reject) => {
let dataJoined = [];
let chain = getUploadedVideosPage();
for(let i = 0; i < 20000; ++i) {
chain = chain
.then((data) => {
dataJoined.push(data);
if(data.nextPageToken !== undefined)
return getUploadedVideosPage(data.nextPageToken);
else
resolve(dataJoined);
});
}
});
}
The 'dangerous' aspect is the condition of the for loop; theoretically it should be infinite, for(;;), since we have no predefined way of knowing exactly how many iterations to make, and the only way to terminate the loop should be the resolve statement. Yet when I implement it this way, it truly is infinite and never terminates.
Hence I hard-coded 20,000 iterations, and it seems to work, but I don't trust the reliability of this solution. I was hoping somebody here could shed some light on how to implement this iterative Promise chain that has no predefined terminating condition.
You can do this all with one function that calls itself if applicable.
You are also using the explicit promise construction anti-pattern by wrapping fetch() in new Promise, since fetch() already returns a promise.
function getVideos(nextPageToken, results = []) {
let apiUrl = 'https://www.googleapis.com/youtube/v3/search?key={access_key}&channelId={users_id}&part=id&order=date&maxResults=50&type=video';
if (nextPageToken) {
apiUrl += '&pageToken=' + nextPageToken;
}
// return fetch() promise
return fetch(apiUrl)
.then(response => response.json())
.then(data => {
// merge new data into final results array
results = results.concat(data);
if (data.nextPageToken !== undefined) {
// return another request promise
return getVideos(data.nextPageToken, results);
} else {
// all done so return the final results
return results
}
});
}
// usage
getVideos().then(results=>{/*do something with all results*/})
.catch(err=>console.log('One of the requests failed'));
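If you prefer iteration over recursion, an equivalent sketch with async/await (same placeholder URL as above) keeps requesting pages until nextPageToken disappears, with no hard-coded iteration limit:
async function getAllUploadedVideos() {
  const baseUrl = 'https://www.googleapis.com/youtube/v3/search?key={access_key}&channelId={users_id}&part=id&order=date&maxResults=50&type=video';
  const results = [];
  let nextPageToken;

  do {
    // Append the page token once we have one.
    const url = nextPageToken ? baseUrl + '&pageToken=' + nextPageToken : baseUrl;
    const data = await fetch(url).then((response) => response.json());
    results.push(data);
    nextPageToken = data.nextPageToken;
  } while (nextPageToken !== undefined);

  return results;
}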
Get rid of the for loop altogether. Something like this:
function getAllUploadedVideos() {
return new Promise((resolve, reject) => {
let dataJoined = [];
function getNextPage(nextPageToken) {
return new Promise((resolve, reject) => {
getUploadedVideosPage(nextPageToken)
.then((data) => {
dataJoined.push(data);
if(data.nextPageToken !== undefined) {
resolve(data.nextPageToken);
}
else {
reject();
}
})
.catch((err) => {
// Just in case getUploadedVideosPage errors
reject(err);
});
});
}
// Keep requesting pages until getNextPage rejects, which signals there are none left.
function getRemainingPages(nextPageToken) {
  return getNextPage(nextPageToken)
    .then((token) => getRemainingPages(token));
}

getRemainingPages()
  .catch(() => {
    // This will hit when there are no more pages to grab
resolve(dataJoined);
});
});
}