const handleFileChange = async (e) => {
  const target = e?.target?.files;
  const attachments = await Array.from(target).reduce(async (acum, file) => {
    file.id = uniqid();
    // const format = file.name.split('.').pop();
    // if (IMAGE_FORMATS.includes(format)) {
    setIsLoading(true);
    if (file.type.startsWith('image/')) {
      const response = await channel.sendImage(file);
      file.src = response.file;
      acum.images.push(file);
    } else {
      const response = await channel.sendFile(file);
      file.src = response.file;
      acum.files.push(file);
    }
    setIsLoading(false);
    return acum;
  }, Promise.resolve({ files: [], images: [] }));
  setFilesList(prev => {
    console.log('files', [...prev, ...attachments.files]);
    return [...prev, ...attachments.files];
  });
  setImagesList(prev => {
    console.log('images', [...prev, ...attachments.images]);
    return [...prev, ...attachments.images];
  });
};
Running the above code throws an error. It looks like it's caused by the way I initialize the accumulator array, but how should I address it?
An async function returns a Promise, which makes it awkward to use with .reduce(): you would need to await the accumulator on every iteration to get at your data. As an alternative, you can create an array of Promises using the mapper function of Array.from() (which you can think of as calling .map() directly after Array.from()). The idea is that the map triggers an asynchronous call for each file via sendImage/sendFile; these calls run in parallel in the background. The value returned from the mapping function is a Promise that notifies us when the asynchronous call has completed (once it resolves). The mapping function also defines what the Promise resolves with, in our case the new object with the src property:
const isImage = file => file.type.startsWith('image/');

const filePromises = Array.from(target, async file => {
  const response = await (isImage(file) ? channel.sendImage(file) : channel.sendFile(file));
  return { ...file, type: file.type, src: response.file };
});
Above, filePromises is an array of Promises (since the async mapper function implicitly returns a Promise). We can use Promise.all() to wait for all of our Promises to resolve. This is faster than performing each asynchronous call one by one, only moving to the next once the previous has completed:
setIsLoading(true); // set loading to `true` before we start waiting for our asynchronous work to complete
const fileObjects = await Promise.all(filePromises);
setIsLoading(false); // complete asynchronous loading/waiting
Lastly, fileObjects is an array that contains all objects, both files and images. We can do one iteration to partition this array into separate arrays, one for images and one for files:
const attachments = { files: [], images: [] };
for (const fileObj of fileObjects) {
  if (isImage(fileObj))
    attachments.images.push(fileObj);
  else
    attachments.files.push(fileObj);
}
The reduce is not really necessary at this point.
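Putting those pieces together, the whole handler might look like the sketch below. Note this is illustrative: channel and setIsLoading are assumed to exist as in the question and are mocked here with plain stand-ins so the snippet is self-contained.

```javascript
// Mock stand-ins for the question's channel API (assumptions for the demo)
const channel = {
  sendImage: async (file) => ({ file: `uploaded-image:${file.name}` }),
  sendFile: async (file) => ({ file: `uploaded-file:${file.name}` })
};
const setIsLoading = () => {};

const isImage = (file) => file.type.startsWith('image/');

async function buildAttachments(files) {
  setIsLoading(true);
  // Map every file to a promise; all uploads run in parallel
  const filePromises = Array.from(files, async (file) => {
    const response = await (isImage(file) ? channel.sendImage(file) : channel.sendFile(file));
    return { ...file, src: response.file };
  });
  const fileObjects = await Promise.all(filePromises);
  setIsLoading(false);
  // One pass to partition the results into images and files
  const attachments = { files: [], images: [] };
  for (const fileObj of fileObjects) {
    (isImage(fileObj) ? attachments.images : attachments.files).push(fileObj);
  }
  return attachments;
}
```

In the real component you would then spread attachments.files and attachments.images into your state setters as in the question.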
Here is a solution that uses map to transform the elements into promises, and then Promise.all to wait for their execution:
const channel = {
  sendImage: async (file) => { return { file } },
  sendFile: async (file) => { return { file } }
};
const uniqid = () => Math.floor(Math.random() * 100);
const input = {
  target: {
    files: [
      { src: 'src', type: 'image/123' },
      { src: 'src', type: 'image/321' },
      { src: 'src', type: '123' },
      { src: 'src', type: '321' }
    ]
  }
};
const setIsLoading = () => null;

const handleFileChange = async (e) => {
  const target = e?.target?.files;
  setIsLoading(true);
  const attachments = {
    images: [],
    files: [],
  };
  await Promise.all(Array.from(target).map((file) => {
    file.id = uniqid();
    return new Promise(async (resolve) => {
      if (file.type.startsWith('image/')) {
        const response = await channel.sendImage(file);
        file.src = response.file;
        attachments.images.push(file);
      } else {
        const response = await channel.sendFile(file);
        file.src = response.file;
        attachments.files.push(file);
      }
      resolve();
    });
  }));
  setIsLoading(false);
  return attachments;
};

handleFileChange(input).then(res => console.log(res));
The problem is with uplines.push.
I always get an empty uplines array, so the last part of the code doesn't run. The promises resolve later and I do get the correct data. How should I go about doing this the correct way?
const getAllUplines = async () => {
  uplines = [];
  const findUser = async (userFid) => {
    const userDoc = await firestore.collection("users").doc(userFid).get();
    if (userDoc.exists) {
      const user = { ...userDoc.data(), id: userDoc.id };
      console.log(user);
      uplines.push(user);
      if (user.immediateUplineFid) {
        findUser(user.immediateUplineFid); //self looping
      }
    } else {
      console.log("No User Found");
      return null;
    }
  };
  sale.rens.forEach(async (ren) => {
    findUser(ren.userFid);
  });
  console.log(uplines);
  return uplines;
};

let uplines = await getAllUplines();
console.log(uplines);
uplines = uplines.filter(
  (v, i) => uplines.findIndex((index) => index === v) === i
); //remove duplicates
uplines.forEach((user) => {
  if (user.chatId) {
    sendTelegramMessage(user.chatId, saleToDisplay, currentUser.displayName);
    console.log("Telegram Message Sent to " + user.displayName);
  } else {
    console.log(user.displayName + " has no chatId");
  }
});
There are a few things you have missed out while implementing the async calls, which are explained in the inline comments in the code snippet below.
The short explanation of what happens in your code is that in sale.rens.forEach you pass an async function as the argument, but that makes no difference to forEach: it executes the callback without waiting for it to complete.
Therefore, in my answer I use Promise.all to wait for all of the async function calls to complete before returning the result.
// This is wrapped in an immediately executed async function because await at the root is not supported here
(async () => {
  const mockGetData = () => new Promise(resolve => setTimeout(resolve, 1000));
  const sale = {
    rens: [
      { userFid: 1 },
      { userFid: 2 },
      { userFid: 3 }
    ]
  };
  const getAllUplines = async () => {
    const uplines = [];
    const findUser = async (userFid) => {
      // Simulating an async function call
      const userDoc = await mockGetData();
      console.log("User data received");
      uplines.push(`User ${userFid}`);
    };
    const promises = [];
    sale.rens.forEach(ren => { // This callback does not have to be declared as async
      // findUser is an async function, which returns a promise, so we keep track of all the returned promises to be used later
      promises.push(findUser(ren.userFid));
    });
    await Promise.all(promises);
    return uplines;
  };
  let uplines = await getAllUplines();
  console.log(uplines);
})();
In order to get the results of getAllUplines() properly, you need to add await to all async functions called in getAllUplines().
const getAllUplines = async () => {
  uplines = [];
  const findUser = async (userFid) => {
    const userDoc = await firestore.collection("users").doc(userFid).get();
    if (userDoc.exists) {
      const user = { ...userDoc.data(), id: userDoc.id };
      console.log(user);
      uplines.push(user);
      if (user.immediateUplineFid) {
        await findUser(user.immediateUplineFid); //self looping
      }
    } else {
      console.log("No User Found");
      return null;
    }
  };
  // Use for...of instead of forEach: forEach ignores the promise
  // returned by an async callback, so it would not actually wait
  for (const ren of sale.rens) {
    await findUser(ren.userFid);
  }
  console.log(uplines);
  return uplines;
};
I'm trying to use a web worker to fetch some data and then pass it back to the main thread. I tried the following code, but it does not work as expected:
onmessage = (e) => {
  console.log(e);
  if (e.data[0] === 'fetchData') {
    fetch('https://example.com/platform/api/v1/endpoint')
      .then((res) => res.json())
      .then(async (data) => {
        const imagesData = await Promise.all(
          data.map(async (item) => {
            let res = await fetch(item.src);
            let img = await res.blob();
            let reader = new FileReader();
            reader.readAsDataURL(img);
            reader.onloadend = () => {
              return {
                title: item.title,
                link: item.link,
                src: reader.result
              }
            }
          })
        )
        postMessage(imagesData);
      })
  }
}
A console.log shows that imagesData contains an array of seven undefined elements. How can I fix this?
UPDATE
I've changed the code in the Vue front-end and in the worker, and now I'm able to get the data, but sometimes the worker does not work: I either get no data at all or only two entries, when the expected number is seven items for the front-end. Here is how I've modified the code. Maybe I need to terminate the worker before using another one?
NB: I'm creating a tab override chrome extension
vue front-end code
<script>
const worker = new Worker('services.js');

export default {
  name: 'App',
  beforeCreate() {
    worker.postMessage(['fetchData']);
  },
  created() {
    this.init();
    this.clock();
  },
  data() {
    return {
      mostVisited: [],
      imagesData: [],
      isLoading: true
    }
  },
  methods: {
    init() {
      worker.onmessage = (e) => {
        console.log(e);
        this.imagesData = e.data;
        this.isLoading = false;
      }
      browser.topSites.get().then((sites) => this.mostVisited = sites);
    } //end init
  }
};
</script>
web worker code
onmessage = (e) => {
  console.log(e);
  if (e.data[0] === 'fetchData') {
    fetch('https://example.com/platform/api/v1/endpoint')
      .then((res) => res.json())
      .then(async (data) => {
        let imagesData = [];
        await Promise.all(
          data.map(async (item) => {
            let res = await fetch(item.src);
            let img = await res.blob();
            let reader = new FileReader();
            reader.readAsDataURL(img);
            reader.onloadend = () => {
              imagesData.push({ title: item.title, link: item.link, src: reader.result });
            }
          })
        )
        postMessage(imagesData);
      }); // end then(data)
  }
}
You are not waiting for the asynchronous FileReader to have read your files before resolving the outer Promise, so the Promise.all Promise resolves after let img = await res.blob(); but before onloadend does.
Since you are in a Worker context, you can use the synchronous FileReaderSync API, which would give something like
.then(async (data) => {
  let imagesData = [];
  await Promise.all(
    data.map(async (item) => {
      let res = await fetch(item.src);
      let img = await res.blob();
      let reader = new FileReaderSync();
      const result = reader.readAsDataURL(img);
      imagesData.push({ title: item.title, link: item.link, src: result });
    })
  )
  postMessage(imagesData);
});
But I'm 99% confident that you don't even need that data: URL, and that it will do more harm than good.
Remember that the data: URL from a FileReader always encodes to base64, which produces a DOMString about 134% the size of the original data, and that you then multiply by 2 since DOMStrings are stored as UTF-16.
Then, to pass that data to the main context, the browser has to serialize the data using the structured clone algorithm, and for DOMStrings that means a plain copy in memory. Each image now takes up about 5 times its real size in memory, and that's even before the main context starts parsing it back to binary data so it can build the pixel data...
Instead, simply pass the Blob you get as img.
Blobs' inner data is passed by reference by the structured clone algorithm, so all you copy is the small js wrapper around.
Then, when you need to display these images in the main context, use URL.createObjectURL(img), which returns a blob: URL pointing directly to that same Blob's data, which is still stored only once in memory.
.then(async (data) => {
  let imagesData = [];
  await Promise.all(
    data.map(async (item) => {
      let res = await fetch(item.src);
      let img = await res.blob();
      const url = URL.createObjectURL(img);
      imagesData.push({ title: item.title, link: item.link, src: url, file: img });
    })
  )
  postMessage(imagesData);
});
In the writeFile function, the records returned by makeCsv are empty (I used await, but the array is still empty). How do I make all the code inside makeCsv block until it finishes, so that I get all the entries in the records array when I call makeCsv?

Expected code flow:
makeCsv -> reads a csv from local storage and calls getPubKey
getPubKey -> fetches a key for each account name passed by makeCsv by making a request to a url, and returns that key
makeCsv -> appends the key property to each object, pushes it to the results array, and returns the array
writeFile -> calls makeCsv and takes the array to write a new csv with the keys included

Issue: as soon as the call to getPubkey is made, the end event is triggered, resolving the promise with an empty array. writeFile therefore receives an empty array, because it runs before makeCsv has finished creating all the requests and adding keys to the data objects.
Code
const csv = require('csv-parser')
const createCsvWriter = require('csv-writer').createObjectCsvWriter;
const fetch = require('node-fetch');
const fs = require('fs');

let nkas = []

async function makeCsv(results) {
  const readable = fs.createReadStream('/home/vector/Desktop/test.csv')
  return new Promise((resolve, reject) => {
    readable.pipe(csv(['acc', 'balance']))
      .on('data', async (data) => {
        data.key = await getPubkey(data.acc)
        results.push(data)
      })
      .on('end', () => {
        return resolve(results);
      });
  })
}

async function getPubkey(account) {
  const body = { account_name: account }
  let key;
  await fetch('https://eos.greymass.com/v1/chain/get_account', {
    method: 'post',
    body: JSON.stringify(body),
    headers: { 'Content-Type': 'application/json' },
  })
    .then(res => res.json())
    .then(json => key = json.permissions[1].required_auth.keys[0].key)
    .catch(err => console.log(err))
  if (key && key.length > 0)
    return key
  else
    nkas.push(account);
  console.log(nkas);
}

async function writeFile() {
  const csvWriter = createCsvWriter({
    path: 'out.csv',
    header: [
      { id: 'acc', title: 'Account' },
      { id: 'balance', title: 'Balance' },
      { id: 'key', title: 'PubKey' }
    ]
  });
  let records = []
  records = await makeCsv(records)
  console.log(records)
  csvWriter.writeRecords(records) // returns a promise
    .then(() => {
      console.log('...Done');
    });
}

writeFile();
You need to wait for the getPubKey() promises to resolve before resolving the makeCsv() promise:
async function makeCsv() {
  const readable = fs.createReadStream('/home/vector/Desktop/test.csv')
  const pending = [];
  return new Promise((resolve, reject) => {
    readable.pipe(csv(['acc', 'balance']))
      .on('data', (data) => {
        const getKey = async (data) => {
          data.key = await getPubkey(data.acc)
          return data
        }
        pending.push(getKey(data))
      })
      .on('end', () => {
        Promise.all(pending).then(results => resolve(results));
      });
  })
}
I am trying to run the Node.js Lighthouse function serially (one at a time) over an array of URLs. My problem is that whenever I loop through the array, Lighthouse runs against all the URLs at once, which I imagine is problematic if you have a very large array of URLs.
The code:
for (let url of urls) {
  function launchChromeAndRunLighthouse(url, opts, config = null) {
    return chromeLauncher.launch({ chromeFlags: opts.chromeFlags }).then(chrome => {
      opts.port = chrome.port;
      return lighthouse(url, opts, config).then(results => {
        return chrome.kill().then(() => results.lhr)
      });
    });
  }
}

launchChromeAndRunLighthouse('https://example.com', opts).then(results => {
  // Use results!
});
Please help! And thank you for your time!
Your answer is correct but it can be improved. Since you have access to async and await, you should fully utilize it to make your code cleaner:
async function launchChromeAndRunLighthouse(url, opts, config = null) {
  const chrome = await chromeLauncher.launch({ chromeFlags: opts.chromeFlags });
  opts.port = chrome.port;
  const { lhr } = await lighthouse(url, opts, config);
  await chrome.kill();
  return lhr;
}

async function launchAudit(urls) {
  for (const url of urls) {
    const results = await launchChromeAndRunLighthouse(url, opts);
    // Use results!
  }
}

launchAudit(urls);
I believe I figured it out. What I did is below. Please continue to send feedback if you think this is wrong.
function launchChromeAndRunLighthouse(url, opts, config = null) {
  return chromeLauncher.launch({ chromeFlags: opts.chromeFlags }).then(chrome => {
    opts.port = chrome.port;
    return lighthouse(url, opts, config).then(results => {
      return chrome.kill().then(() => results.lhr)
    });
  });
}

async function launchAudit(urls) {
  for (let url of urls) {
    await launchChromeAndRunLighthouse(url, opts).then(results => {
      // Use results!
    });
  }
}

launchAudit(urls);
A variation on Patric Roberts' answer (which should be the accepted answer).
I was wondering whether it is necessary to kill Chrome on every iteration.
const lighthouse = require('lighthouse');
const chromeLauncher = require('chrome-launcher');

function launchChromeAndRunLighthouse(sites, opts, config = null) {
  return chromeLauncher.launch({ chromeFlags: opts.chromeFlags }).then(chrome => {
    opts.port = chrome.port;
    const siteResults = [];
    return new Promise((resolve, reject) => {
      // batch async functions.
      // C/O https://stackoverflow.com/questions/43082934/how-to-execute-promises-sequentially-passing-the-parameters-from-an-array
      const runBatch = async (iterable, action) => {
        for (const x of iterable) {
          await action(x)
        }
      }
      // func to run lighthouse
      const doLightHouse = (site) => new Promise((resolve, reject) => {
        lighthouse(site, opts, config).then(results => {
          siteResults.push(results.lhr);
          resolve();
        });
      });
      // go go go
      runBatch(sites, doLightHouse).then(d => {
        chrome.kill().then((result) => {
          resolve(siteResults)
        })
      });
    });
  });
}

const opts = {
  chromeFlags: ['--show-paint-rects'],
  onlyCategories: ['performance']
};

const sites = ['https://www.example.com', 'https://www.test.com']

launchChromeAndRunLighthouse(sites, opts).then(results => {
  // Use results!
  console.log(results);
});
Just to execute your code as a test, we'll use async/await and an IIFE.
Then we'll create a function that puts all our requests into an array of unresolved promises, so we can use it with Promise.all().
You need to rewrite the code into something like this:
(async () => {
  const promisesToExecute = [];
  const launchChromeAndRunLighthouse = async (url, opts, config = null) => {
    const chrome = await chromeLauncher.launch({ chromeFlags: opts.chromeFlags });
    opts.port = chrome.port;
    promisesToExecute.push(lighthouse(url, opts, config));
  }
  const results = await Promise.all(promisesToExecute);
  for (const result of results) {
    const resolvedResult = await result.kill();
    // here you can access your results.lhr
    console.log(resolvedResult.lhr);
  }
})()
Please note, this code wasn't tested, so there might be problems with kill() on result. But the main goal is to answer your question and explain how to execute promises.
Also, if you don't want to execute all promises at the same time, you could use a promise-waterfall pattern from an npm package.
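For completeness, running promises one at a time needs no extra package: a plain loop over task factories does it. A generic sketch, nothing Lighthouse-specific:

```javascript
// Runs async task factories one at a time, collecting results in order.
// Each element of tasks is a function returning a promise, so no work
// starts until the previous task has finished.
async function runSequentially(tasks) {
  const results = [];
  for (const task of tasks) {
    results.push(await task());
  }
  return results;
}
```

For the Lighthouse case this would be used as `runSequentially(urls.map(url => () => launchChromeAndRunLighthouse(url, opts)))`.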
I have an array of questions.
Each question has some answers, which are files to upload.
Everything goes well except for the fact that the API call does not wait for Promise.all to finish.
Here are the steps:
map through the questions array; if a question is of image type, get all its files and try to upload them.
after the upload, resolve all the promises from the upload and set the question's answer to the result of Promise.all().
after the loop through all the questions is done, make the API call to save into the DB, which currently does not wait for all the files to upload and for everything in that array to resolve.
export function sendReview (taskId, companyId, questions, navigation) {
  return async (dispatch) => {
    dispatch(actions.sendReview.pending());
    try {
      let user = await getUser();
      user = JSON.parse(user);
      questions.map(async question => {
        if (question.type === 'image') {
          let images = question.answer;
          if (images.length > 0) {
            const results = images.map(async image => {
              return await imageApi.upload(image).then(res => {
                return res.url;
              });
            });
            question.answer = await Promise.all(results).then(completed => {
              return completed;
            });
          }
        }
      });
      const data = await tasksApi.sendReview({
        task_id: taskId,
        company_id: companyId,
        user_id: user.id,
        questions: JSON.stringify(questions)
      });
      if (data.status === 201) {
        markAsCompleted(taskId);
        navigation.navigate('MyTasks');
        dispatch(actions.sendReview.success({}));
      }
      else {
        dispatch(actions.sendReview.error());
      }
    } catch (err) {
      dispatch(actions.sendReview.error(err));
    }
  };
}
Here is the function used.
How can I make sure that all the items in .map() are resolved, and only then make the API call?
To give you an example from code I made quite some time ago:
await Promise.all((await readdir(repoPath, "utf8")).map(async file => {
  if (!/\.mjs$/.test(file)) return;
  const filePath = `${repoPath}/${file}`;
  log(`importing "${file}"`);
  const module = await import(filePath);
  const meta = {
    repository,
    file,
    filePath,
    description: module.description || {}
  };
  module.default((...args) => createModule(meta, ...args));
}));
If you have asynchronous mapping handlers, you'll need to keep in mind that the resulting array contains promises.
Promise.all() will help you with that.
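A tiny illustration of that point, using a trivial async mapper in place of the real upload:

```javascript
// An async callback returns a promise, so .map() produces an array of
// pending promises; Promise.all resolves them into plain values.
const double = async (n) => n * 2;

const promises = [1, 2, 3].map(double); // [Promise, Promise, Promise]

Promise.all(promises).then((values) => {
  console.log(values); // [2, 4, 6]
});
```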
In your case, all you need to do is change:
questions.map(async (question) => {
  if (question.type === 'image') {
    let images = question.answer;
    if (images.length > 0) {
      const results = images.map(async (image) => {
        return await imageApi.upload(image).then(res => {
          return res.url;
        });
      });
      question.answer = await Promise.all(results).then((completed) => {
        return completed
      });
    }
  }
});
as follows:
await Promise.all(questions.map(async (question) => {
  if (question.type === 'image') {
    let images = question.answer;
    if (images.length > 0) {
      // Promise.all has already been awaited here, so results is a
      // plain array, not a promise: assign it directly
      const results = await Promise.all(images.map(async (image) => {
        return await imageApi.upload(image).then(res => {
          return res.url;
        });
      }));
      question.answer = results;
    }
  }
}));
Use Promise.all to await promises in an array
Promise.all(questions.map(...))
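To see the effect end to end, a self-contained sketch with a mock upload standing in for imageApi.upload (an assumption for the demo):

```javascript
// Mock upload standing in for imageApi.upload (hypothetical URL scheme)
const upload = async (image) => ({ url: `https://cdn.example/${image}` });

async function resolveImageAnswers(questions) {
  // The outer Promise.all waits for every per-question handler,
  // so callers only proceed once all uploads have finished
  await Promise.all(questions.map(async (question) => {
    if (question.type === 'image') {
      question.answer = await Promise.all(
        question.answer.map((image) => upload(image).then((res) => res.url))
      );
    }
  }));
  return questions;
}
```

Only after resolveImageAnswers resolves would you make the API call that persists the questions.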