Array shown as empty but data is available inside - javascript

I am getting a URL from Firebase using ".getDownloadUrl" and then pushing it into an array, but the array is shown as empty even though the data is available inside it in the console.
I am using the following code:
const getPostUrl = async (item) => {
  const url = await storage().ref(item.data?.postName).getDownloadUrl();
  return { ...item._data, URL: url };
};
const getUrls = async () => {
  let response = [];
  storage().list((data) => console.log(data, "knok knok tera baaap aaya"));
  PostCollection.doc(UserInfo.uid)
    .collection("post")
    .onSnapshot(documentSnapshot => {
      documentSnapshot.docs.forEach(async (item, i) => {
        const getdata = await getPostUrl(item);
        response.push(getdata);
        // const url = await storage().ref(item._data?.postName).getDownloadUrl()
        // const obj = { ...item._data, URL:res }
      });
      // setPosts(postArr);
      // setUrls(newArr);
    });
  let xyz = await Promise.resolve(response);
  console.log(response, xyz, "RESPONSE========>");
};
Note: Originally code provided as image: here

Diagnosis
Your code executes asynchronously. In simple terms, that means there is no guarantee of execution order among code portions flagged as async, nor with respect to the (synchronous) rest of the code.
Specifically, that applies to the handler passed as an argument to .forEach, where the array is filled, and to the console output statement.
What likely happens is that the array is written to the console before the code that fills it with downloaded information has executed.
Remedy
Add additional synchronization to guarantee that the array is filled before using/outputting it.
Implementation
Instead of await-ing the result of async execution in various places of the code, pass the abstraction of the future results around as Promises. In particular, collect the responses as an array of Promises instead of an array of results. Once all promises are resolved (in production code you would prefer to have them all 'settled', as some may fail), continue processing, e.g. output to the console.
Pitfalls
Promise.all (for the in-browser implementation, see here (MDN); available in most Promise libraries) processes the argument array as is, without regard to whether it contains the expected number of entries. This is no problem here, as the array-filling forEach loop executes synchronously, but it must be watched when modifying the code.
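As a side note on the 'settled' remark above, a minimal sketch of the difference between Promise.all and Promise.allSettled (the sample values are made up):
// Promise.all rejects as soon as any input promise rejects.
const pA = Promise.resolve("https://example.com/a.jpg");
const pB = Promise.reject(new Error("download failed"));
Promise.all([pA, pB]).catch(e => console.log("all rejected:", e.message));

// Promise.allSettled waits for every promise and reports each outcome,
// so one failed download does not discard the successful ones.
Promise.allSettled([pA, pB]).then(results => {
  const urls = results
    .filter(r => r.status === "fulfilled")
    .map(r => r.value);
  console.log(urls); // ["https://example.com/a.jpg"]
});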
Beware: The following code is untested!
const getPostUrl = (item) => {
  const p_url =
    storage().ref(item.data?.postName).getDownloadUrl()
      .then(
        (url) => ({ ...item._data, URL: url }),
        (e) => { /* flag error */ }
      );
  return p_url;
};
const getUrls = async () => {
  let p_response = [];
  storage().list((data) => console.log(data, "knok knok tera baaap aaya"));
  PostCollection.doc(UserInfo.uid)
    .collection("post")
    .onSnapshot(documentSnapshot => {
      documentSnapshot.docs.forEach((item, i) => {
        const getdata = getPostUrl(item);
        p_response.push(getdata);
        // const url = await storage().ref(item._data?.postName).getDownloadUrl()
        // const obj = { ...item._data, URL:res }
      });
      Promise.all(p_response)
        .then(
          (a_results) => {
            console.log(`Response ========> ${JSON.stringify(a_results)}.`);
          },
          (e) => { /* flag error */ }
        );
      // setPosts(postArr);
      // setUrls(newArr);
    });
};
Recommendation
Encapsulate the pattern of collecting an array of a predetermined number of promises and firing when they are all settled into a class of their own (or use a Promise/sync library that comes with it).
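As an illustration of that recommendation, a minimal sketch of such a helper; the name collectSettled is made up, and it simply wraps Promise.allSettled:
// Hypothetical helper: collect a known number of promises and fire once all are settled.
const collectSettled = (promises) =>
  Promise.allSettled(promises).then(results => ({
    fulfilled: results.filter(r => r.status === "fulfilled").map(r => r.value),
    rejected: results.filter(r => r.status === "rejected").map(r => r.reason),
  }));

// Usage sketch, with p_response filled exactly as in the code above:
// collectSettled(p_response).then(({ fulfilled, rejected }) => {
//   console.log("downloaded:", fulfilled, "failed:", rejected);
// });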

Related

Asynchronicity in a reduce() function WITHOUT using async/await

I am patching the exec() function to allow sub-populating in Mongoose, which is why I am not able to use async/await here -- my function will be chained off a db call, so there is no opportunity to call await on it, and within the submodule itself I can't add await outside of an async function.
With that out of the way, let's look at what I'm trying to do. I have two separate arrays (matchingMealPlanFoods and matchingMealPlanRecipeFoods) full of IDs that I need to populate. Both of them reside on the same array, foods. They each require a db call with aggregation, and the problem in my current scenario is that only one of the arrays populates because they are happening asynchronously.
What I am trying to do now is use the reduce function to return the updated foods array to the next run of reduce so that when the final result is returned, I can replace the entire foods array once on my doc. The problem of course is that my aggregate/exec has not yet returned a value by the time the reduce function goes into its next run. Is there a way I can achieve this without async/await here? I'm including the high-level structure here so you can see what needs to happen, and why using .then() is probably not viable.
EDIT: Updating code with async suggestion
function execute(model, docs, options, lean, cb) {
  options = formatOptions(options);
  let resolvedCount = 0;
  let error = false;
  (async () => {
    for (let doc of docs) {
      let newFoodsArray = [...doc.foods];
      for (let option of options) {
        const path = option.path.split(".");
        // ... various things happen here to prep the data
        const aggregationOptions = [
          // $match, then $unwind, then $replaceRoot
        ];
        await rootRefModel
          .aggregate(aggregationOptions)
          .exec((err, refSubDocuments) => {
            // more stuff happens
            console.log('newFoodsArray', newFoodsArray); // this is to check whether the second iteration is using the updated newFoodsArray
            const arrToReturn = newFoodsArray.map((food) => {
              const newMatchingArray = food[nests[1]].map((matchingFood) => {
                // more stuff
                return matchingFood;
              });
              const updatedFood = food;
              updatedFood[`${nests[1]}`] = newMatchingArray;
              return updatedFood;
            });
            console.log('arrToReturn', arrToReturn);
            newFoodsArray = [...arrToReturn];
          });
      }
      console.log('finalNewFoods', newFoodsArray); // this should log after the other two, but it is logging first.
      const document = doc.toObject();
      document.foods = newFoodsArray;
      if (resolvedCount === options.length) cb(null, [document]);
    }
  })();
}
EDIT: Since it seems it will help, here is what calls the execute function I have excerpted above.
/**
 * This will populate sub refs
 * @param {import('mongoose').ModelPopulateOptions[]|
 *         import('mongoose').ModelPopulateOptions|String[]|String} options
 * @returns {Promise}
 */
schema.methods.subPopulate = function (options = null) {
  const model = this.constructor;
  if (options) {
    return new Promise((resolve, reject) => execute(model, [this], options, false, (err, docs) => {
      if (err) return reject(err);
      return resolve(docs[0]);
    }));
  }
  return Promise.resolve();
};
We can use async/await just fine here, as long as we remember that async is the same as "returning a Promise" and await is the same as "resolving a Promise's .then or .catch".
So let's turn all those "synchronous-looking but callback-based" calls into awaitables: your outer code has to keep obeying the API contract, but since it's not meant to return a value, we can safely mark our own version of it as async, and then we can use await in combination with promises around any other callback-based function calls in our own code just fine:
async function execute(model, docs, options, lean, andThenContinueToThis) {
  options = formatOptions(options);
  let option, resolvedCount = 0;
  for (let doc of docs) {
    let newFoodsArray = [...doc.foods];
    for (option of options) {
      // ...things happen here...
      const aggregationOptions = [/*...data...*/];
      try {
        const refSubDocuments = await new Promise((resolve, reject) => rootRefModel
          .aggregate(aggregationOptions)
          .exec((err, result) => err ? reject(err) : resolve(result)));
        // ...do some work based on refSubDocuments...
      }
      // remember to forward errors and then stop:
      catch (err) {
        return andThenContinueToThis(err);
      }
    }
    // remember: bind newFoodsArray somewhere so it doesn't get lost next iteration
  }
  // As our absolutely last action, when all went well, we trigger the call forwarding:
  andThenContinueToThis(null, dataToForward);
}
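As an aside, Node's built-in util.promisify can replace the manual new Promise wrapper whenever the callback follows the standard (err, result) convention; whether the exec call here qualifies is an assumption, so treat this as a sketch:
const { promisify } = require("util");

// Assumption: .exec accepts a standard (err, result) callback, so it can be promisified.
// If the library already returns a promise when no callback is given, prefer that instead.
const execAsync = promisify(
  (opts, cb) => rootRefModel.aggregate(opts).exec(cb)
);

// const refSubDocuments = await execAsync(aggregationOptions);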

Best way to use async/promise/callbacks in for loop [duplicate]

This question already has answers here:
How do I convert an existing callback API to promises?
(24 answers)
Closed 2 years ago.
I'm building a trading bot that needs to get stock names from separate files. But even though I have used an async function and await in my code, it doesn't work.
My index file's init method:
const init = async () => {
  const symbols = await getDownTrendingStock();
  console.log("All stocks that are down: " + symbols);
  const doOrder = async () => {
    //do stuff
  }
  doOrder();
}
My getDownTrendingStock file:
const downStocks = []
function getDownTrendingStock () {
  for (i = 0; i < data.USDTPairs.length; i++) {
    const USDTPair = data.USDTPairs[i] + "USDT";
    binance.prevDay(USDTPair, (error, prevDay, symbol) => {
      if (prevDay.priceChangePercent < -2) {
        downStocks.push(symbol)
      }
    });
  }
  return downStocks;
}
I have also tried to use async in the for loop, because the getDownTrendingStock function returns an empty array before the loop's callbacks have finished. I didn't find the right way to do that because I was confused by all the async, promise and callback stuff. What is the right approach in this situation?
Output:
All stocks that are down:
Wanted output:
All stocks that are down: [BTCUSDT, TRXUSDT, ATOMUSDT...]
I think the main issue in the code you posted is that you are mixing callbacks and promises.
This is what's happening in your getDownTrendingStock function:
1. You start iterating over the data.USDTPairs array, picking the first element.
2. You call binance.prevDay. This does nothing yet, because it's an asynchronous function that takes a bit of time. Notably, no data is added to downStocks yet.
3. You continue doing steps 1-2; still, no data is added.
4. You return downStocks, which is still empty.
5. Your function is complete, and you print the empty array.
Now, at some point, the nodejs event loop continues and starts working on those asynchronous tasks you created earlier by calling binance.prevDay. Internally, it probably calls an API, which takes time; once that call is completed, it calls the function you provided, which pushes data to the downStocks array.
In summary, you didn't wait for the async code to complete. You can achieve this in multiple ways.
One is to wrap this in a promise and then await that promise:
const result = await new Promise((resolve, reject) => {
  binance.prevDay(USDTPair, (error, prevDay, symbol) => {
    if (error) {
      reject(error);
    } else {
      resolve({prevDay, symbol});
    }
  });
});
if (result.prevDay.priceChangePercent < -2) {
  downStocks.push(result.symbol)
}
Note that you can probably also use promisify for this. Also, this means that you will wait for one request to finish before starting the next, which may slow down your code considerably, depending on how many calls you need; you may want to look into Promise.all as well.
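For completeness, a minimal sketch of the Promise.all variant mentioned above; it is untested and assumes the same binance.prevDay callback signature as in the question:
// Wrap one prevDay call in a promise.
const prevDayAsync = (pair) =>
  new Promise((resolve, reject) =>
    binance.prevDay(pair, (error, prevDay, symbol) =>
      error ? reject(error) : resolve({ prevDay, symbol })));

async function getDownTrendingStock() {
  // Fire all requests in parallel, then wait for every one of them.
  const results = await Promise.all(
    data.USDTPairs.map(pair => prevDayAsync(pair + "USDT")));
  return results
    .filter(r => r.prevDay.priceChangePercent < -2)
    .map(r => r.symbol);
}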
Generally speaking, I use two techniques:
const asyncFunc = (value) => { /* smthAsync */ };
const arrayToProcess = [];

// 1. Chain the calls sequentially via reduce:
const result = await arrayToProcess.reduce(
  (acc, value) => acc.then(() => asyncFunc(value)),
  Promise.resolve(someInitialValue)
);

// 2. Or use a plain for loop (eslint may complain about await in a loop):
for (let i = 0; i < arrayToProcess.length; i += 1) {
  const processResult = await asyncFunc(arrayToProcess[i]);
  // do with processResult what you want
}

Returning a single array

I am struggling to understand how to return a single array from a function that calls another function several times. Currently, the console.log in the code below outputs a growing array each time scrapingfunction runs.
The output from the last run of scrapingfunction is actually what I want, but I want to find a way to return a single array at the end of the hello function so that I can drop each object into my database. I'm guessing this is just me not understanding javascript well enough yet.
const hello = async () => {
  //[launch puppeteer and navigate to page I want to scrape]
  await scrapingfunction(page)
  //[navigate to the next page I want to scrape]
  await scrapingfunction(page)
  //[navigate to the next page I want to scrape]
  await scrapingfunction(page)
}
const scrapingfunction = async (page) => {
  const html = await page.content()
  const $ = cheerio.load(html)
  const data = $('.element').map((index, element) => {
    const dateElement = $(element).find('.sub-selement')
    const date = dateElement.text()
    return {date}
  }).get()
  console.log(data)
}
hello();
The problem you've encountered is that Promises (which is what async/await uses "under the covers") cannot return values outside the Promise chain.
Think of it this way.
You ask me to write a StackOverflow article for you and immediately demand the result of that task, without waiting for me to finish it.
When you set me the task, I haven't yet completed it, so I cannot provide a response.
You will need to restructure your request to return values from your awaits which can then be operated upon by the surrounding async function, such as:
// Assume doubleValue() takes some unknown time to return a result, like
// waiting for the result of an HTTP query.
const doubleValue = async (val) => val * 2
const run = async () => {
  const result = []
  result.push(await doubleValue(2))
  result.push(await doubleValue(4))
  result.push(await doubleValue(8))
  console.log(result)
}
which will print [4, 8, 16] to the console.
You might think you could return the result from run() and print it to the console as in:
const run = async () => {
  const result = []
  result.push(await doubleValue(2))
  result.push(await doubleValue(4))
  result.push(await doubleValue(8))
  return result
}
console.log(run())
But since Node has no idea when run() has everything it needs to create a result, console.log will not print the array you want; it prints a pending Promise instead, because an async function always returns a Promise.
The rule is that you can await the result of other functions from within a function marked as async, but you cannot synchronously hand a useful result back to the surrounding context.
Since an async function does return a Promise, you could:
run().then(result => console.log(result))
But note that the result never leaves the Promise chain.
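Applying the same pattern back to the question's code might look roughly like this; it is an untested sketch that assumes the same puppeteer/cheerio setup as above: have scrapingfunction return its data, and let hello collect the three results.
const scrapingfunction = async (page) => {
  const html = await page.content()
  const $ = cheerio.load(html)
  // Same extraction as before, but return the data instead of only logging it.
  return $('.element').map((index, element) => {
    const date = $(element).find('.sub-selement').text()
    return { date }
  }).get()
}

const hello = async () => {
  const results = []
  // [navigate between pages as before]
  results.push(...await scrapingfunction(page))
  results.push(...await scrapingfunction(page))
  results.push(...await scrapingfunction(page))
  return results // one flat array, ready to be written to the database
}

hello().then(all => console.log(all))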

Async await with a forEach loop still running asynchronously [duplicate]

This question already has answers here:
Using async/await with a forEach loop
(33 answers)
Closed 4 years ago.
First of all I did read through similar questions, and still cannot see where I'm making my mistake.
Here's my code:
async function validateWebsites(website) {
  var result = url.parse(`http://${website}`);
  console.log(result.hostname);
  return await fetch(`http://www.${result.hostname}`)
    .then(() => console.log(true))
    .catch(() => console.log(false));
}
var wrongWebsites = [];
var i = 0;
websites.forEach(website => {
  i++;
  if (validateWebsites(website) === false) {
    wrongWebsites.push(i);
  }
});
console.log(wrongWebsites);
How it works:
The user passes an array of websites, and I want to validate whether they're valid websites, so as not to waste resources and to block other errors. Now to the console output:
digitlead.com
google.com
georgiancollege.ca
youtube.com
[]
true
true
true
true
So as you can see, it first prints out the hostnames and the empty array, and only then the responses. So it's still async. How do I make it wait? I changed the loop from a for to a forEach as suggested by many posts, I used await, and I am returning a promise. So what else do I have to do?
Edit:
I tried to do this:
async function validateWebsites(website) {
  var result = url.parse(`http://${website}`); // TODO figure out if filtering all the subpages is a good idea.
  console.log(result.hostname);
  return await fetch(`http://www.${result.hostname}`)
    .then(() => console.log(true))
    .catch(() => console.log(false));
}
But it doesn't change anything
I found a function called readFileSync. That's more or less what I'm looking for, but with the ability to call a different website.
Here is how you can get all valid websites.
The problem with your validateWebsites function is that it returns a promise which resolves to undefined, thanks to the promise chaining and your logging.
Also, using forEach to filter an array is unnecessary.
But if you wanted to, you could do something like this:
websites.forEach(async website => {
  i++;
  if (await validateWebsites(website) === false) { // now the value is a Boolean instead of a Promise
    wrongWebsites.push(i);
  }
});
Also note that if you use a global i with asynchronous functions to keep track of the index, this can lead to many errors; the index parameter that forEach already provides avoids that, as in the sketch below.
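A minimal sketch of that point, assuming the intent is to record the failing positions:
// The second callback parameter is the element's index, so no shared i is needed.
websites.forEach(async (website, index) => {
  if (await validateWebsites(website) === false) {
    wrongWebsites.push(index); // 0-based; add 1 if the original 1-based count is wanted
  }
});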
However, I think this solution should satisfy you:
async function validateWebsites(website) {
  var result = url.parse(`http://${website}`)
  return fetch(`http://www.${result.hostname}`)
    .then(() => true) // an async function returns a promise
    .catch(() => false)
}
const websites = ['digitlead.com',
  'google.com',
  'georgiancollege.ca',
  'youtube.com',
  '111.1',
  'foobarbaz']
async function filter(array, func) {
  const tmp = await Promise.all( // waits for all promises to resolve
    array.map(func) // executes the async function on each element and collects the resulting promises
  )
  return array.filter((_, i) => tmp[i]) // removes invalid websites
}
const validWebsites = filter(websites, validateWebsites)
validWebsites.then(console.log)
Get the indexes of the non-valid sites
async function filter(array, func) {
  const tmp = await Promise.all(array.map(func))
  return tmp
    .map((x, i) => !x && i) // flip true to false and assign the index when x is false
    .filter(x => x !== false) // keep only the indexes
}
destoryeris is saying you should do something like this:
websites.forEach(async website => {
  i++;
  if (await validateWebsites(website) === false) {
    wrongWebsites.push(i);
  }
});
But that alone is problematic because you have to wrap your async functions in a try/catch to handle their errors. So something more like this:
websites.forEach(async website => {
  i++;
  try {
    const validSites = await validateWebsites(website);
    if (validSites === false) {
      wrongWebsites.push(i);
    }
  } catch(e) {
    // handle e
  }
})
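If the array must be complete before the final console.log, note that forEach never waits for the async callbacks it spawns; a plain for...of loop inside an async function is a common alternative. A minimal sketch under that assumption:
async function collectWrongWebsites(websites) {
  const wrongWebsites = [];
  // for...of awaits each validation before moving on, unlike forEach.
  for (const [index, website] of websites.entries()) {
    if (await validateWebsites(website) === false) {
      wrongWebsites.push(index);
    }
  }
  return wrongWebsites; // only complete once every site has been checked
}

// collectWrongWebsites(websites).then(result => console.log(result));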

Best es6 way to get name based results with Promise.all

By default the Promise.all([]) function returns an array indexed by number that contains the results of each promise.
var promises = [];
promises.push(myFuncAsync1()); //returns 1
promises.push(myFuncAsync1()); //returns 2
Promise.all(promises).then((results) => {
  //results = [1, 2]
});
What is the best vanilla way to return a named index of results with Promise.all()?
I tried with a Map, but it returns results in an array this way:
[key1, value1, key2, value2]
UPDATE:
My question seems unclear, so here is why I don't like order-based indexing:
it's hard to maintain: if you add a promise in your code you may have to rewrite the whole results function because the indexes may have changed.
it's awful to read: results[42] (can be fixed with jib's answer below)
Not really usable in a dynamic context:
var promises = [];
if (...)
  promises.push(...);
else {
  [...].forEach(... => {
    if (...)
      promises.push(...);
    else
      [...].forEach(... => {
        promises.push(...);
      });
  });
}
Promise.all(promises).then((resultsArr) => {
  /* Here I am basically stuck without clear named results
     that don't rely on the promises' ordering in the array */
});
ES6 supports destructuring, so if you just want to name the results you can write:
var myFuncAsync1 = () => Promise.resolve(1);
var myFuncAsync2 = () => Promise.resolve(2);
Promise.all([myFuncAsync1(), myFuncAsync2()])
  .then(([result1, result2]) => console.log(result1 + " and " + result2)) //1 and 2
  .catch(e => console.error(e));
Works in Firefox and Chrome now.
Is this the kind of thing?
var promises = [];
promises.push(myFuncAsync1().then(r => ({name: "func1", result: r})));
promises.push(myFuncAsync1().then(r => ({name: "func2", result: r})));
Promise.all(promises).then(results => {
  var lookup = results.reduce((prev, curr) => {
    prev[curr.name] = curr.result;
    return prev;
  }, {});
  var firstResult = lookup["func1"];
  var secondResult = lookup["func2"];
});
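As an aside, modern runtimes can build the same lookup with Object.fromEntries instead of reduce; a sketch under the same naming scheme:
// Tag each promise with a [name, result] pair, then turn the pairs into an object.
const named = (name, promise) => promise.then(result => [name, result]);

Promise.all([
  named("func1", myFuncAsync1()),
  named("func2", myFuncAsync1()),
]).then(pairs => {
  const lookup = Object.fromEntries(pairs);
  console.log(lookup.func1, lookup.func2);
});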
If you don't want to modify the format of result objects, here is a helper function that allows assigning a name to each entry to access it later.
const allNamed = (nameToPromise) => {
  const entries = Object.entries(nameToPromise);
  return Promise.all(entries.map(e => e[1]))
    .then(results => {
      const nameToResult = {};
      for (let i = 0; i < results.length; ++i) {
        const name = entries[i][0];
        nameToResult[name] = results[i];
      }
      return nameToResult;
    });
};
Usage:
var lookup = await allNamed({
  rootStatus: fetch('https://stackoverflow.com/').then(rs => rs.status),
  badRouteStatus: fetch('https://stackoverflow.com/badRoute').then(rs => rs.status),
});
var firstResult = lookup.rootStatus; // = 200
var secondResult = lookup.badRouteStatus; // = 404
If you are using TypeScript, you can even specify the relationship between input keys and results using the keyof construct:
type ThenArg<T> = T extends PromiseLike<infer U> ? U : T;

export const allNamed = <
  T extends Record<string, Promise<any>>,
  TResolved extends {[P in keyof T]: ThenArg<T[P]>}
>(nameToPromise: T): Promise<TResolved> => {
  const entries = Object.entries(nameToPromise);
  return Promise.all(entries.map(e => e[1]))
    .then(results => {
      const nameToResult: TResolved = <any>{};
      for (let i = 0; i < results.length; ++i) {
        const name: keyof T = entries[i][0];
        nameToResult[name] = results[i];
      }
      return nameToResult;
    });
};
A great solution for this is to use async/await. Not exactly ES6 like you asked, but ES8! Since Babel supports it fully, here we go:
You can avoid relying on the array index by using async/await as follows.
An async function lets you literally halt your code inside of it, by allowing the await keyword, placed before a promise, inside the function. As soon as the async function encounters await on a promise that hasn't yet been resolved, the function immediately returns a pending promise. That returned promise resolves once the function actually finishes later on. The function only resumes when the awaited promise is resolved, at which point the whole await expression evaluates to that promise's resolved value, so you can put it into a variable. This effectively lets you halt your code without blocking the thread. It's a great way to handle asynchronous work in JavaScript in general, because it makes your code more chronological and therefore easier to reason about:
async function resolvePromiseObject(promiseObject) {
  await Promise.all(Object.values(promiseObject));
  const ret = {};
  for (const [key, value] of Object.entries(promiseObject)) {
    // All of these resolve instantly due to the previous await
    ret[key] = await value;
  }
  return ret;
}
As with anything above ES5: please make sure that Babel is configured correctly so that users on older browsers can run your code without issue. You can make async/await work flawlessly even on IE11, as long as your Babel configuration is right.
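A quick usage sketch of the function above; the URLs mirror the earlier fetch example and are placeholders:
(async () => {
  const results = await resolvePromiseObject({
    root: fetch('https://stackoverflow.com/').then(rs => rs.status),
    badRoute: fetch('https://stackoverflow.com/badRoute').then(rs => rs.status),
  });
  console.log(results.root, results.badRoute); // e.g. 200 404
})();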
In regard to @kragovip's answer, the reason you want to avoid that approach is shown here:
https://medium.com/front-end-weekly/async-await-is-not-about-making-asynchronous-code-synchronous-ba5937a0c11e
"...it’s really easy to get used to await all of your network and I/O calls.
However, you should be careful when using it multiple times in a row as the await keyword stops execution of all the code after it. (Exactly as it would be in synchronous code)"
Bad Example (DON'T FOLLOW)
async function processData() {
  const data1 = await downloadFromService1();
  const data2 = await downloadFromService2();
  const data3 = await downloadFromService3();
  ...
}
"There is also absolutely no need to wait for the completion of first request as none of other requests depend on its result.
We would like to have requests sent in parallel and wait for all of them to finish simultaneously. This is where the power of asynchronous event-driven programming lies.
To fix this we can use Promise.all() method. We save Promises from async function calls to variables, combine them to an array and await them all at once."
Instead
async function processData() {
  const promise1 = downloadFromService1();
  const promise2 = downloadFromService2();
  const promise3 = downloadFromService3();
  const allResults = await Promise.all([promise1, promise2, promise3]);
}
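Tying this back to the original question about named results: the Promise.all result can also be destructured into named variables directly, as in the destructuring answer above. A short sketch, with variable names that are purely illustrative:
async function processData() {
  // Start all downloads in parallel, then give each result a readable name.
  const [first, second, third] = await Promise.all([
    downloadFromService1(),
    downloadFromService2(),
    downloadFromService3(),
  ]);
  console.log(first, second, third);
}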
