The problem:
I want to keep track of my uploaded files by writing the file information for each uploaded file of a multi-upload into my database. However, when I upload 2 files it usually creates 3 entries in the database, and when I upload 6 files it creates a lot more than 6 entries.
My db function:
function saveAssetInDatabase(project, fileInformation) {
    return new Promise((reject, resolve) => {
        let uploaded_file = {}
        uploaded_file = fileInformation
        uploaded_file.file_type = 'asset'
        uploaded_file.display_name = fileInformation.originalname
        project.uploaded_files.push(uploaded_file)
        project.save()
    })
}
The simplified code which calls the function:
for (var i = 0; i < req.files["sourceStrings"].length; i++) {
    // Unknown file format, let's save it as asset
    saveAssetInDatabase(project, fileInformation).then(result => {
        return res.status(200).send()
    }).catch(err => {
        logger.error(err)
        return res.status(500).send()
    })
}
I guess that there is something wrong with my db function as it leads to duplicate file entries. What am I doing wrong here? One file should get one entry.
If I read the specs for model.save on the mongoose website correctly, the problem with your save is rather that you are always reusing the original project, and not the newly saved project that contains the latest state.
So what you are essentially doing is:
project.files.push(file1);
// file1 is marked as new
project.save();
project.files.push(file2);
// file1 & file2 are marked as new
// (the project doesn't know file1 has been saved already)
// ...
Now, that actually brings quite an advantage, since you are currently doing one save per file, while you could save all files at once ;)
I guess the easiest way would be to put the project.save call outside of your for loop and change your first method like so:
function saveAssetInDatabase(project, fileInformation) {
    let uploaded_file = {};
    uploaded_file = fileInformation;
    uploaded_file.file_type = 'asset';
    uploaded_file.display_name = fileInformation.originalname;
    project.uploaded_files.push(uploaded_file);
}
with the for loop changed to
function saveSourceString(project, req, res) {
    for (var i = 0; i < req.files["sourceStrings"].length; i++) {
        // Unknown file format, let's save it as asset
        var fileInformation = req.files["sourceStrings"][i];
        saveAssetInDatabase(project, fileInformation);
    }
    // save once after all files were added
    return project.save().then(result => {
        return res.status(200).send()
    }).catch(err => {
        logger.error(err)
        return res.status(500).send()
    });
}
Note that project.save() returns a promise that resolves with the newly saved project. If you wish to manipulate this object at a later time, make sure you take the saved document, and not, as you have done until now, the non-saved model.
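For example, a minimal sketch (reusing the project, res and logger names from above) of consuming the saved document instead of the in-memory one:

// project.save() resolves with the persisted document
project.save().then(savedProject => {
    // savedProject reflects what is actually stored, including the pushed files
    console.log(savedProject.uploaded_files.length);
    return res.status(200).send();
}).catch(err => {
    logger.error(err);
    return res.status(500).send();
});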
Problem
Each iteration of your for loop creates a promise and saves the project object as it is at that moment. That is not the correct way: every time a promise runs, the whole project object is sent to the DB and stored again.
For example, say you have 3 assets.
On the first iteration, the first asset is pushed into the project object and the project is saved, so the DB stores the first asset.
On the second iteration, the second asset is pushed on top of the first and the project is saved again, so the DB stores the first and second assets once more.
On the third iteration, the third asset is pushed on top of the other two and the project is saved yet again, so the DB stores the first, second and third assets once more.
So you end up storing the same data several times in your DB.
Solution
Use Promise.all: create a promise per asset, and send the response only after all of them have resolved and your project data is stored in the DB.
// your DB function
function saveAssetInDatabase(project, fileInformation) {
    return new Promise((resolve, reject) => {
        let uploaded_file = {}
        uploaded_file = fileInformation
        uploaded_file.file_type = 'asset'
        uploaded_file.display_name = fileInformation.originalname
        project.uploaded_files.push(uploaded_file)
        // resolve only after the save has actually completed
        project.save().then(() => resolve()).catch(reject)
    })
}
// calls function
let promiseArray = [];
for (var i = 0; i < req.files["sourceStrings"].length; i++) {
    promiseArray.push(saveAssetInDatabase(project, fileInformation));
}
Promise.all(promiseArray).then(result => {
    return res.status(200).send();
}).catch(err => {
    logger.error(err);
    return res.status(500).send();
})
Related
I have a list of countries (an array) that I'd like to loop over, searching the API at https://restcountries.eu by country name. I'd like to store some of the returned country data in a variable which I will then export to a csv.
I am using two Node files: one to clean the countries and one to convert to csv.
First I use the fetch API to grab the data:
let list = [];
Promise.all(currentCountries.map((country) => {
    fetch(`https://restcountries.eu/rest/v2/name/${country}`)
        .then(response => response.json())
        .catch((err) => console.log(err))
}))
.then((data) => {
    list.push(data)
})
and then I use another file with the json-2-csv library to convert the list variable stored above to a csv
const fs = require('fs')
const converter = require('json-2-csv')
const list = require('./countryCleaning')

let json2csvCallback = function (err, csv) {
    if (err) throw err;
    fs.writeFile('cleanedCountryCodes.csv', csv, 'utf8', function (err) {
        if (err) {
            console.log("FS Error")
        } else {
            console.log("everything has worked")
        }
    })
};

converter.json2csv(list, json2csvCallback)
I keep getting nulls in the csv and I have no idea why. I can't tell if the problem is in the promises or in the conversion to csv. Do you see any problems? Is there a way for me to see the list in the first file? console.log doesn't work because it always runs before the async code finishes and gives me null, of course.
My understanding of promises/async is still very basic so I'd appreciate the help.
You need to return something from then (and from the map callback) so that Promise.all actually has promises to wait on.
Then export that promise from the file, the same way you export list now,
and use this promise instead of list in the other file.
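A rough sketch of what that could look like, assuming currentCountries and a working fetch are available in the first file, and json2csvCallback is the callback from the second file:

// countryCleaning.js
// Return the fetch promise from the map callback so Promise.all can wait on it,
// and export the resulting promise instead of the plain list array.
const countryData = Promise.all(
    currentCountries.map(country =>
        fetch(`https://restcountries.eu/rest/v2/name/${country}`)
            .then(response => response.json())
    )
);
module.exports = countryData;

// in the csv file, wait for the promise before converting
const converter = require('json-2-csv');
const countryData = require('./countryCleaning');
countryData.then(list => {
    converter.json2csv(list, json2csvCallback);
}).catch(err => console.log(err));

Note that each restcountries response is itself an array of matches, so you may want to flatten list before converting it.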
I have a doubt: if I read a JSON file with require and that JSON file is later updated, does the variable it is assigned to in the code also change?
This is an example of the constantly updated JSON file (the id value changes over time):
{
    "id": 45121521541,
    "name": "node",
    "status": "completed"
}
and I get the values of this JSON through this code:
var context = require('./context.json')
The code will update the JSON constantly and the data will change. While the code is running, will I get the current values of the JSON by means of the require? Is that possible, or will require not allow me to?
You should use fs.readFileSync() for this and not require(): when you use require the first time, the module is fetched into the require cache (require.cache) and subsequent calls are loaded from there, so you won't see timely updates. If you delete the key from require.cache it will also trigger a reload, but overall that is less efficient than just reading the file.
Using fs.readFileSync() will synchronously read the file contents (blocking) each time it is executed. You can combine it with your own caching, etc., to make this more efficient. If you can do this asynchronously then fs.readFile() is preferable, as it is non-blocking.
const fs = require('fs');

// your code

// load the file contents synchronously
const context = JSON.parse(fs.readFileSync('./context.json', 'utf8'));

// load the file contents asynchronously
fs.readFile('./context.json', 'utf8', (err, data) => {
    if (err) throw err;
    const context = JSON.parse(data);
});

// load the file contents asynchronously with promises
// (note: await must be used inside an async function, or with top-level await in an ES module)
const context = await new Promise((resolve, reject) => {
    fs.readFile('./context.json', 'utf8', (err, data) => {
        if (err) return reject(err);
        return resolve(JSON.parse(data));
    });
});
If you want to delete from the require.cache you can do it like so.
// delete from require.cache and reload, this is less efficient...
// ...unless you want the caching
// (require.cache keys are full paths, so resolve the module path first)
delete require.cache[require.resolve('./context.json')];
const context = require('./context.json');
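If you prefer promises throughout, fs.promises (available in recent Node versions) is an alternative to wrapping the callback yourself; a small sketch, where loadContext is just a hypothetical helper name:

const fsp = require('fs').promises;

async function loadContext() {
    // re-reads the file on every call, so updates to context.json are picked up
    const data = await fsp.readFile('./context.json', 'utf8');
    return JSON.parse(data);
}

// usage
loadContext().then(context => console.log(context.id));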
I have two asynchronous actions with callbacks. I would like to be sure that both of them succeed or both of them fail, not that one succeeds and the other fails. It should probably work like a single process for both actions that could be rolled back?
Let's illustrate:
// In this simplified code, I assume I uploaded a file into a temporary folder.
// Every validation passed and now the goal is to move the file from the temporary folder to the final destination folder and to save it into the database.
// This 'move' and this 'save' are my two async actions, each with a callback that fires when the action is completed.
// Maybe I am not using the right way / pattern to do it, thank you for enlightening me if necessary.
myController.create = function (req, res, next) {
    // I move the file from oldPath (temp directory) to newPath (final destination)
    mv(oldPath, newPath, function (err) {
        // If err, the file was not moved and I stop here. The temp directory is cleared later by the system every X period of time.
        if (err) { return next(err); }
        var file = new FileModel({
            // some properties (name, path...)
        });
        // The file is now moved, need to save it into the database
        file.save(function (err) {
            if (!err) { return next(); } // everything went fine
            // If err, nothing is stored in the database but the file is still in the final folder :o
            // I could unlink the file, but that can still fail and leave my file inside the destination folder with no database entry.
            fs.unlink(newPath, function (other_err) {
                if (other_err) { return next(other_err); }
                return next(err);
            });
        });
    });
};
In the code above, if the first action succeeds, nothing guarantees that my second action will succeed too, nor that I could revert the first action if the second one fails. The two actions are separate and independent instead of being linked / paired and working together.
If moving the file succeeds, the save in the database should succeed too. If the save in the database doesn't succeed, then I should move the file back to the temp directory or delete it from the destination folder to stay consistent with the database. In other words, if the second action fails, the first one should fail too.
What is a good way to achieve that?
EDIT: One solution I can see would be to periodically check whether each file in the final destination folder has an entry in the db and, if it doesn't, delete the file.
You need to use promises to implement this kind of thing. For example, say you need to create a user and then send a notification: both actions are async and need to be done one after the other.
const user = {};

// This function creates a user and resolves with an id
user.createUser = (data) => {
    return new Promise((resolve, reject) => {
        // ...your callbacks
        if (someConditionIsTrue) { // placeholder for your own success check
            return resolve(id);
        } else {
            return reject();
        }
    });
};

// This function finds the user details by id and sends a notification
user.sendNotification = (id) => {
    return new Promise((resolve, reject) => {
        // ...your callbacks
        return resolve();
    });
};
user.action = async () => {
    try {
        const userId = await user.createUser(); // Wait for the promise to resolve
        await user.sendNotification(userId);
        return true;
    } catch (err) {
        throw err;
    }
};

module.exports = user;
In the code above, you can see that the user.action() function calls the 2 separate functions one by one. async/await only works on promises, so we made those functions return promises and consumed them with the await keyword. So, in short, you need to use promises to handle this kind of thing.
I hope it helps. Happy Coding :)
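Applied to the original move-then-save scenario, a minimal sketch of the same idea might look like this. It assumes the mv function, FileModel, oldPath and newPath from the question; move and unlink are just promisified helpers introduced here:

const { promisify } = require('util');
const fs = require('fs');

const move = promisify(mv);            // mv(oldPath, newPath, cb) from the question
const unlink = promisify(fs.unlink);

myController.create = function (req, res, next) {
    move(oldPath, newPath)
        .then(() => {
            const file = new FileModel({ /* some properties (name, path...) */ });
            // if the database save fails, try to undo the move before reporting the error
            return file.save().catch(saveErr =>
                unlink(newPath).then(
                    () => { throw saveErr; },          // file removed, report the original error
                    unlinkErr => { throw unlinkErr; }  // the revert itself failed
                )
            );
        })
        .then(() => next())
        .catch(next);
};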
The put method of Firebase Storage seems to only take one file at a time. How do I get this to work with multiple files? I am trying to wait for each upload to finish and collect a download URL for each, then proceed and save these URLs in an array in a node in the Realtime Database, but I can't seem to figure out the best way to handle this.
I wrote a GitHub gist of this:
// set it up
firebase.storage().ref().constructor.prototype.putFiles = function (files) {
    var ref = this;
    return Promise.all(files.map(function (file) {
        return ref.child(file.name).put(file);
    }));
}

// use it!
firebase.storage().ref().putFiles(files).then(function (metadatas) {
    // Get an array of file metadata
}).catch(function (error) {
    // If any task fails, handle this
});
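To then collect download URLs and store them, as the question asks, a possible follow-up, assuming the namespaced Web SDK where each upload snapshot's ref exposes getDownloadURL(), and a hypothetical 'uploads' path in the Realtime Database:

firebase.storage().ref().putFiles(files).then(function (snapshots) {
    // turn each upload snapshot into a download URL
    return Promise.all(snapshots.map(function (snapshot) {
        return snapshot.ref.getDownloadURL();
    }));
}).then(function (urls) {
    // store the array of URLs under a new node in the Realtime Database
    return firebase.database().ref('uploads').push({ urls: urls });
}).catch(function (error) {
    // any failed upload or URL lookup lands here
    console.error(error);
});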
I'm currently implementing a Purchase Order type view, where I have a PurchaseOrder table and a PurchaseOrderLine table for the items. When the user presses the save button, I first save the PurchaseOrder, then I retrieve the PurchaseOrderID and save it to each individual PurchaseOrder item. The problem is the following:
Promise.resolve(app.PurchaseOrder.create(formData)).then(function (response) {
    purchaseOrderID = response.collection.models[0].attributes.id;
    for (var key in formLineData) {
        if (formLineData.hasOwnProperty(key)) {
            formLineData[key]['requestID'] = purchaseOrderID;
            app.PurchaseOrderLines.create(formLineData[key]);
        }
    }
}).catch(function (error) {
    console.log(error);
})
formData is the PurchaseOrder data, formLineData is the PurchaseOrderLine data (I do the for loop to insert the requestID into all items).
I am using a Promise because collection.fetch does not return a promise in Backbone (I think my implementation isn't quite correct, because Promise.resolve() is used to turn thenables into promises, and in this case it isn't one). The problem is that when the save button is clicked, the then part runs even though the PurchaseOrder hasn't been created yet. So when it gets to PurchaseOrderLines.create, all the items are saved without a PurchaseOrderID. I have two options:
Add a server-side validation for this. The problem with this is that it will return an error every time, which can be bothersome for the user.
Add a setTimeout to wait at least a couple of seconds for the write to be over on the server.
Could you please shed some light on this topic?
You can try something like this:
app.PurchaseOrder.create(formData).then(function (response) {
    var purchaseOrderID = response.collection.models[0].attributes.id;
    return new Promise(async function (resolve) {
        for (var key in formLineData) {
            if (formLineData.hasOwnProperty(key)) {
                formLineData[key]["requestID"] = purchaseOrderID;
                await app.PurchaseOrderLines.create(formLineData[key]);
            }
        }
        resolve();
    });
});
or maybe doing something like this using Promise.all:
Promise.all(Object.keys(formLineData).map((key) => app.PurchaseOrderLines.create(formLineData[key])))
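Since Backbone's create does not return a real promise by itself, another option is to pass {wait: true} so nothing is added until the server responds, and wrap the success/error callbacks in a promise. A sketch using the names from the question; createAsPromise is just a hypothetical helper:

function createAsPromise(collection, attrs) {
    return new Promise(function (resolve, reject) {
        collection.create(attrs, {
            wait: true, // wait for the server before adding the model to the collection
            success: function (model) { resolve(model); },
            error: function (model, response) { reject(response); }
        });
    });
}

createAsPromise(app.PurchaseOrder, formData).then(function (order) {
    var purchaseOrderID = order.get('id');
    // create the lines only after the order id is known
    return Promise.all(Object.keys(formLineData).map(function (key) {
        formLineData[key]['requestID'] = purchaseOrderID;
        return createAsPromise(app.PurchaseOrderLines, formLineData[key]);
    }));
}).catch(function (error) {
    console.log(error);
});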