JS/TS Promises - Resolving an Array (of objects) but Empty when Resolved - javascript

I've been learning about Promises in JS and although it's been pretty exciting, it's also been a bit frustrating. I'm working on some code that will allow users to drag-and-drop files and folders onto my web app. However, some processes are dependent on others completing first.
My idea is to capture all fileItems (for additional context, they're of type FileSystemEntry), convert each one to a 'FileUploadObject' (my own custom class), push them all into an array of that type, and finally display them on the screen.
Here is my attempt at a solution:
First I want to create a new instance of an object so I can push it into the array.
async returnFileUploadObject(file): Promise<FileUploadObject> {
  let fileModTime;
  let fileSize;
  return new Promise((resolve, reject) => {
    try {
      file.getMetadata((metadata) => {
        fileModTime = metadata.modificationTime;
        fileSize = metadata.size;
        const fileUploadObj = new FileUploadObject(file.name, fileModTime, fileSize, String(this.uploadState.Spinner), false, file.fullPath);
        resolve(fileUploadObj);
      });
    } catch (e) {
      fileModTime = Date.now();
      fileSize = 0;
      const fileUploadObj = new FileUploadObject(file.name, fileModTime, fileSize, String(this.uploadState.Spinner), false, file.fullPath);
      resolve(fileUploadObj);
    }
  });
}
Then, I want to push that new object into an array.
async createFileObjectList(fileEntries): Promise<FileUploadObject[]> {
  return new Promise(async (resolve, reject) => {
    let list = [];
    for (let file of fileEntries) {
      let fileObj = await this.returnFileUploadObject(file);
      list.push(fileObj);
    }
    console.log('Time to resolve this promise', list);
    resolve(list);
  });
}
Once the array is finished being built, I want to pass it back to another list that will then display the files in HTML:
async dropFiles(items) {
  // createFileObjectList
  this.fileUploadList = await this.createFileObjectList(items);
  console.log('File Upload List', this.fileUploadList);
}
I thought I was doing everything correctly with Promises, but when I console.log the results, the arrays appear to have items (28 FileUploadObjects) yet report a length of 0. I also want to note that the statement "console.log('Time to resolve this promise', list);" only sometimes prints out the items in the array.
Any kind of help would be greatly appreciated! I am really trying my best to understand Promises and I want my code to finally work. Also any tips for better coding practices when using Promises would be awesome too, thanks! If you need more clarification, I'd be happy to provide it to you.
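For comparison, here is a minimal sketch of the same flow that builds the list with Promise.all; it assumes fileEntries is a plain array, reuses the FileUploadObject class from above, and assumes getMetadata accepts an error callback as its second argument:
// Sketch only: getMetadata's assumed error callback replaces the try/catch,
// falling back to defaults instead of leaving the promise hanging.
async returnFileUploadObject(file): Promise<FileUploadObject> {
  return new Promise<FileUploadObject>((resolve) => {
    file.getMetadata(
      (metadata) => resolve(new FileUploadObject(file.name, metadata.modificationTime, metadata.size, String(this.uploadState.Spinner), false, file.fullPath)),
      () => resolve(new FileUploadObject(file.name, Date.now(), 0, String(this.uploadState.Spinner), false, file.fullPath))
    );
  });
}

// Sketch only: Promise.all resolves once every per-file promise has resolved,
// so the returned array is complete by the time the caller awaits it.
async createFileObjectList(fileEntries): Promise<FileUploadObject[]> {
  return Promise.all(fileEntries.map((file) => this.returnFileUploadObject(file)));
}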

Related

Mongoose inserting same data three times instead of iterating to next data

I am trying to seed the following data to my MongoDB server:
const userRole = {
  role: 'user',
  permissions: ['readPost', 'commentPost', 'votePost']
}
const authorRole = {
  role: 'author',
  permissions: ['readPost', 'createPost', 'editPostSelf', 'commentPost', 'votePost']
}
const adminRole = {
  role: 'admin',
  permissions: ['readPost', 'createPost', 'editPost', 'commentPost', 'votePost', 'approvePost', 'approveAccount']
}
const data = [
  {
    model: 'roles',
    documents: [userRole, authorRole, adminRole]
  }
]
When I try to iterate through this object/array and insert the data into the database, I end up with three copies of 'adminRole' instead of the three individual roles. I feel very foolish for being unable to figure out why this is happening.
My code to actually iterate through the object and seed it is the following, and I know it's actually getting every value, since I've done the console.log testing and can get all the data properly:
for (i in data) {
  m = data[i]
  const Model = mongoose.model(m.model)
  for (j in m.documents) {
    var obj = m.documents[j]
    Model.findOne({'role': obj.role}, (error, result) => {
      if (error) console.error('An error occurred.')
      else if (!result) {
        Model.create(obj, (error) => {
          if (error) console.error('Error seeding. ' + error)
          console.log('Data has been seeded: ' + obj)
        })
      }
    })
  }
}
Update:
Here is the solution I came up with after reading everyone's responses. Two private functions generate Promise objects for both checking if the data exists, and inserting the data, and then all Promises are fulfilled with Promise.all.
// Stores all promises to be resolved
var deletionPromises = []
var insertionPromises = []

// Fetch the model via its name string from mongoose
const Model = mongoose.model(data.model)

// For each object in the 'documents' field of the main object
data.documents.forEach((item) => {
  deletionPromises.push(promiseDeletion(Model, item))
  insertionPromises.push(promiseInsertion(Model, item))
})

console.log('Promises have been pushed.')

// We need to fulfil the deletion promises before the insertion promises.
Promise.all(deletionPromises).then(() => {
  return Promise.all(insertionPromises).catch(() => {})
}).catch(() => {})
I won't include both promiseDeletion and promiseInsertion as they're functionally the same.
const promiseDeletion = function (model, item) {
  console.log('Promise Deletion ' + item.role)
  return new Promise((resolve, reject) => {
    model.findOneAndDelete(item, (error) => {
      if (error) reject()
      else resolve()
    })
  })
}
Update 2: You should ignore my most recent update. I've modified the result I posted a bit, but even then, half of the time the roles are deleted and not inserted. It's very random as to when it will actually insert the roles into the server. I'm very confused and frustrated at this point.
You ran into a very common problem in JavaScript: you shouldn't define (async) functions inside a regular for(-in) loop. What happens is that while you loop through the three values, the first async find is being called. Since your code is async, Node.js does not wait for it to finish before it continues to the next loop iteration and counts up to the third value, here the admin role.
Now, since you defined your functions in the loop, by the time the first async call finishes, the for-loop has already advanced to the last value, which is why admin is inserted three times.
To avoid this, you can move the async functions out of the loop to force a call by value rather than by reference. Still, this can bring up other problems, so I'd recommend instead having a look at promises and how to chain them (e.g. put all Mongoose promises in an array and then await them using Promise.all), or using the more modern async/await syntax together with a for-of loop, which gives you both easy readability and sequential async instructions (see the sketch below).
Check this very similar question: Calling an asynchronous function within a for loop in JavaScript
Note: for-of is sometimes discussed as being performance-heavy, so check whether that applies to your use case.
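As a rough illustration of that for-of/await approach (a sketch only, reusing the data array from the question; Mongoose's findOne().exec() and create() return promises when no callback is passed):
// Sketch only: for-of + await keeps the iterations strictly sequential,
// so each role is checked and created before the next one is touched.
async function seed(data) {
  for (const group of data) {
    const Model = mongoose.model(group.model)
    for (const doc of group.documents) {
      const existing = await Model.findOne({ role: doc.role }).exec()
      if (!existing) {
        await Model.create(doc)
        console.log('Data has been seeded: ' + doc.role)
      }
    }
  }
}

seed(data).catch((error) => console.error('Error seeding. ' + error))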
Using async functions in loops can cause problems.
You should change the way you work with findOne so that it behaves like a synchronous call by awaiting it.
First you need to mark your function as async, and then use findOne like so:
async function myFunction() {
  // exec() fires the query and gives back a promise, which await can handle
  let res = await Model.findOne({'role': obj.role}).exec();
  // do what you need to do here with the result...
}

NodeJS - Need help understanding and converting a synchronous dependent MySQL query code into something usable

This is my second Node project. I am using Node, Express, and MySQL.
What I am doing is this: I have an array of names of users that have posted something. I then loop over those names, and for each of them I do a connection.query to get their posts and store those in an array (after that I do some other data manipulation, but that's not the important part).
The problem is: my code tries to do that data manipulation before it even receives the data from the connection.query!
I googled around and it seems async/await is the thing I need; the problem is, I couldn't fit it into my code properly.
// namesOfPeopleImFollowing is the array with the names
namesOfPeopleImFollowing.forEach(function(ele) {
  connection.query(`SELECT * FROM user${ele}posts`, function(error, resultOfThis) {
    if (error) {
      console.log("Error found" + error)
    } else {
      allPostsWithUsername.push([{username: ele}, resultOfThis])
    }
  })
})
console.log(JSON.stringify(allPostsWithUsername)) // This is EMPTY, it mustn't be empty.
So, how do I convert that into something which will work properly?
(In case you need the entire code, here it is: https://pastebin.com/dDEJbPfP, though I forgot to uncomment the code.)
Thank you for your time.
There are many ways to solve this. A simple one would be to wrap your function inside a promise and resolve when the callback is complete.
const allPromises = [];
namesOfPeopleImFollowing.forEach((ele) => {
  const myPromise = new Promise((resolve, reject) => {
    connection.query(`SELECT * FROM user${ele}posts`, (error, resultOfThis) => {
      if (error) {
        console.log(`Error found${error}`);
        reject(error);
      } else {
        // resolve with the same shape the question pushes into allPostsWithUsername
        resolve([{ username: ele }, resultOfThis]);
      }
    });
  });
  allPromises.push(myPromise);
});

Promise.all(allPromises).then((result) => {
  // your code here
})
You can read more about Promise.all here.
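For comparison, here is a sketch of the same flow using async/await and map (queryPosts and loadAllPosts are just illustrative names, reusing the connection and variables from the question):
// Sketch only: promisify one query, then run them all and await the results.
const queryPosts = (ele) =>
  new Promise((resolve, reject) => {
    connection.query(`SELECT * FROM user${ele}posts`, (error, resultOfThis) => {
      if (error) reject(error);
      else resolve([{ username: ele }, resultOfThis]);
    });
  });

async function loadAllPosts() {
  // runs the queries in parallel; the result keeps the question's original shape
  const allPostsWithUsername = await Promise.all(namesOfPeopleImFollowing.map(queryPosts));
  console.log(JSON.stringify(allPostsWithUsername)); // no longer empty
  return allPostsWithUsername;
}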

Recursively call promises

I've been scouring the web over this one for quite some time now.
I'm prototyping an Angular service for an Ionic app. The purpose of this service is to download an image. Now this is a problem that, in standard JS, I'd like to solve with some recursive calls to avoid duplicate code.
I've tried writing it using promises to get my feet wet with the concept of Promises and it's giving me a hard time.
Consider the following code:
public getBgForName = (name: string) => {
  name = name.toLowerCase();
  var instance = this;
  var dir = this.file.dataDirectory;
  return new Promise(function (fulfill, reject) {
    instance.file.checkDir(dir, name).then(() => {
      // directory exists. Is there a bg file?
      dir = dir + '/' + name + '/';
      instance.file.checkFile(dir, 'bg.jpg').then(() => {
        console.log('read file');
        fulfill(dir + '/' + 'bg.jpg')
      }, (err) => {
        // dl file and re-call
        console.log('needs to download file!')
        instance.transfer.create().download(encodeURI('https://host.tld/'+name+'/bg.jpg'), dir + 'bg.jpg', true, {})
          .then((data) => {
            return instance.getBgForName(name).then((url) => { return url });
          }, (err) => {
            console.log(err)
          })
      })
    }, (err) => {
      // create dir and re-call
      instance.file.createDir(dir, name, true).then(() => {
        instance.getBgForName(name).then((url) => { fulfill(url) });
      })
    })
  });
}
The promise, when called, never quite fully resolves. I think, after reading this article, that the problem lies in the promise resolution not being passed correctly back to the "original" promise chain - so that it resolves at some level, but not all the way to the top. This is supported by the promise resolving correctly when the following is assured:
the directory has already been created
the file has already been downloaded
So I reckon the return statements somehow break up the chain here, leading to the promise not being resolved after its first recursive call.
What is the correct way to call a promise recursively, ensuring that the original caller receives the result when it is ready?
Edit: Outlining the desired result, as suggested by David B.
The code is supposed to be a function that is called on a list of items. For each item there is a background image available, which is stored on a server. This background image will be cached locally. The goal of using recursive calls here is that no matter the state (downloaded, not downloaded), the function call will always return a URL to the image on the local filesystem. The steps for this are as follows:
create a directory for the current item
download the file to this directory
return a local URL to the downloaded file
subsequent calls thereafter will only return the image straight from disk (after checking that it exists), with no more downloading.
After reading about the benefits of async / await over promises (and falling in love with the cleaner syntax) I rewrote it using async / await. The refactored (but not perfect!) code looks like this:
public getBgForName = async (name: string) => {
  name = name.toLowerCase();
  let instance = this;
  let dir = this.file.dataDirectory;
  try {
    await instance.file.checkDir(dir, name)
    dir = dir + name + '/';
    try {
      await instance.file.checkFile(dir, 'bg.jpg')
      return dir + 'bg.jpg';
    } catch (err) {
      // download file
      await instance.transfer.create().download(encodeURI('https://host.tld/'+name+'/bg.jpg'), dir + 'bg.jpg', true, {})
      return this.getBgForName(name);
    }
  } catch (err) {
    // not catching the error here since if we can't write to the app's local storage something is very off anyway.
    await instance.file.createDir(dir, name, true)
    return this.getBgForName(name);
  }
}
and works as intended.

Firebase not receiving data before view loaded - empty array returned before filled

In the following code I save each item's key and an email address in one table, and then use that key to retrieve the object from the original table. I can see that the items are being put into the rawList array when I console.log, but the function is returning this.cartList before it has anything in it, so the view doesn't receive any of the data. How can I make it so that this.cartList waits for rawList to be filled before it is returned?
ionViewWillEnter() {
  var user = firebase.auth().currentUser;
  this.cartData.getCart().on('value', snapshot => {
    let rawList = [];
    snapshot.forEach(snap => {
      if (user.email == snap.val().email) {
        var desiredItem = this.goodsData.findGoodById(snap.val().key);
        desiredItem.once("value")
          .then(function(snapshot2) {
            rawList.push(snapshot2);
          });
        return false
      }
    });
    console.log(rawList);
    this.cartList = rawList;
  });
}
I have tried putting the this.cartList = rawList in a number of different locations (before return false, even inside the .then statement), but that did not solve the problem.
The following function call is asynchronous and you're falling out of scope before rawList has a chance to update because this database call takes a reasonably long time:
desiredItem.once("value").then(function(snapshot2) {
rawList.push(snapshot2);
});
You're also pushing the snapshot directly to this list, when you should be pushing snapshot2.val() to get the raw value.
Here's how I would fix your code:
ionViewWillEnter() {
  var user = firebase.auth().currentUser;
  this.cartData.getCart().on('value', snapshot => {
    // clear the existing `this.cartList`
    this.cartList = [];
    snapshot.forEach(snap => {
      if (user.email == snap.val().email) {
        var desiredItem = this.goodsData.findGoodById(snap.val().key);
        // arrow function keeps `this` bound to the component
        desiredItem.once("value")
          .then(snapshot2 => {
            // push directly to the cartList
            this.cartList.push(snapshot2.val());
          });
      }
      return false;
    });
  });
}
The problem is the Promise (the async .once() call to Firebase) inside the forEach loop (which is synchronous). The forEach loop is not going to wait for the then() callback, so on the next iteration the data of the previous iteration is just not there yet...
let snapshots = [1, 2, 3];
let rawList = [];
snapshots.forEach((snap) => {
  console.log(rawList.length)
  const fbCall = new Promise((resolve, reject) => {
    setTimeout(function() {
      resolve("Success!");
    }, 2500)
  });
  fbCall.then((result) => {
    rawList.push(result);
  });
})
You need the forEach to push the whole Promise to rawList, and then wait for them all to resolve and do something with the results.
var snapshots = [1, 2, 3];
var rawList = [];
var counter = 0;
snapshots.forEach((snap) => {
  console.log(rawList.length)
  var fbCall = new Promise((resolve, reject) => {
    setTimeout(function() {
      resolve("Success!" + counter++);
    }, 1500)
  });
  rawList.push(fbCall);
})

Promise.all(rawList).then((res) => {
  console.log(res[0]);
  console.log(res[1]);
  console.log(res[2]);
});
The thing is, it is still a bit awkward to assign this.cartList = Promise.all(rawList), as that makes it a Promise. So you might want to rethink your design and add something like a getCartList service (I don't know what your app is like :p); see the sketch below.
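As a rough sketch of that idea (getCartList is just an illustrative name; it reuses the snapshot handling from the question and resolves to the finished array):
// Sketch only: collect one promise per matching cart entry, then let
// Promise.all resolve to the complete array.
getCartList(snapshot, user) {
  const promises = [];
  snapshot.forEach(snap => {
    if (user.email == snap.val().email) {
      const desiredItem = this.goodsData.findGoodById(snap.val().key);
      // once('value') returns a promise; keep the promise, not the value
      promises.push(desiredItem.once('value').then(snap2 => snap2.val()));
    }
    return false;
  });
  return Promise.all(promises);
}

// usage inside the 'value' handler:
// this.getCartList(snapshot, user).then(list => this.cartList = list);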
Since you're using Angular, you should also be using angularfire2, which makes use of Observables and will solve this issue for you. You will still use the normal SDK for many things, but for fetching and binding data it is not recommended to use Firebase alone without angularfire2, as that makes these things less manageable.
The nice thing about this approach is that you can leverage any of the methods on Observable, such as filter, first, map, etc.
After installing it simply do:
public items$: FirebaseListObservable<any[]>;
this.items$ = this.af.database.list('path/to/data');
And in the view:
{{items$ | async}}
The async pipe waits for the data to appear before rendering.
Use AngularFire2 and RxJS; this will save you a lot of time, and you will do it in a proper and maintainable way by using the RxJS operators. You can learn about those operators at learnrxjs.
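For example, assuming the older angularfire2 API shown above together with RxJS 5 patch operators, the list could be filtered before the async pipe renders it (filteredItems$ and the email field are illustrative names, not part of the question):
// Sketch only: filter the emitted array before the async pipe sees it.
import 'rxjs/add/operator/map';

public filteredItems$: Observable<any[]>;

this.filteredItems$ = this.af.database.list('path/to/data')
  .map(items => items.filter(item => item.email === user.email));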

await for function with callback

I'm playing with streams and async/await functionality. What I have so far is:
let logRecord = ((record, callback) => {
  console.log(record);
  return callback();
});
let importCSVfromPath = async((csv_path) => {
  return new Promise(function(resolve, reject) {
    var parser = parse();
    var input = fs.createReadStream(csv_path);
    var transformer = transform(logRecord, {parallel: 1});
    input.on('error', (err) => {
      reject(err);
    });
    input.on('finish', () => {
      resolve();
    });
    input.pipe(parser).pipe(transformer);
  });
});
Now I want to replace logRecord with importRecord. The problem is that this function has to use functions that are already part of the async stack.
let importRecord = async((record) => {
  .......
  await(insertRow(row));
});
What's the right way to do this?
It's slightly more complicated than this - Node.js streams are not (at least not yet) adapted to the ES7 async/await methods.
If you'd like to develop this on your own, consider writing a class derived from a Readable stream. Implementing a promise-based interface is quite a task, but it is possible.
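Short of writing a full stream class, a rough sketch of a simpler route is to keep the callback-style transform handler and chain the promise inside it (importRecordAndLog is just an illustrative name; this assumes importRecord returns a promise):
// Sketch only: wait for the async import, then signal the stream via the callback.
let importRecordAndLog = (record, callback) => {
  importRecord(record)
    .then(() => {
      console.log(record);
      callback(null, record);   // pass the record on once the import has finished
    })
    .catch(callback);           // propagate errors to the stream
};

var transformer = transform(importRecordAndLog, {parallel: 1});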
If you're, however, fine with using a permissively licensed framework, take a look at Scramjet. With it, your code would look like this (most of the example is parsing the CSV - I'll add a helper in the next version):
fs.createReadStream("file.csv") // open your file
.pipe(new StringStream()) // pass to scramjet
.split("\n") // split by line
.parse((line) => line.split(",")) // convert lines to arrays
.map(async (line) => { // run asynchrounous mapping
await importRecord(line); // import log to DB
return logRecord(line); // return some log for the output
})
.pipe(process.stdout); // pipe the output wherever you like
I believe it's exactly what you're looking for and it will run your record imports in parallel, while keeping the output order.
