dynamic variable with require - javascript

If I read a JSON file with "require" and that JSON file is later updated, does the variable it was assigned to in the code also change?
Here is an example. This is the constantly updated JSON file:
{
  "id": 45121521541,
  "name": "node",
  "status": "completed"
}
(the id value changes constantly)
and I get the values of this JSON through this code
var context = require('./context.json')
The JSON file will be updated constantly while the code is running. Can I keep getting the current values through the 'require', or will require not allow that?

You should use fs.readFileSync() for this, not require(). The first time you call require() it loads the file into the module cache (require.cache), and subsequent calls are served from there, so you won't see timely updates. Deleting the key from require.cache will trigger a reload, but overall that is less efficient than just reading the file.
Calling fs.readFileSync() will synchronously read the file contents (blocking) each time it is executed. You can combine it with your own caching to make this more efficient. If you can do this asynchronously, then fs.readFile() is preferable, as it is non-blocking.
const fs = require('fs');

// your code

// load the file contents synchronously
const context = JSON.parse(fs.readFileSync('./context.json'));

// load the file contents asynchronously
fs.readFile('./context.json', (err, data) => {
  if (err) throw err;
  const context = JSON.parse(data);
});
// load the file contents asynchronously with promises
const context = await new Promise((resolve, reject) => {
  fs.readFile('./context.json', (err, data) => {
    if (err) return reject(err);
    return resolve(JSON.parse(data));
  });
});
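On newer Node versions (10+) you can also skip the manual wrapper and use the built-in promise API instead (a minimal sketch; fs.promises is part of core Node):
// load the file contents asynchronously via the built-in fs.promises API
const fsPromises = require('fs').promises;
const context = JSON.parse(await fsPromises.readFile('./context.json', 'utf8'));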
If you want to delete from the require.cache you can do it like so.
// delete from require.cache and reload, this is less efficient...
// ...unless you want the caching
delete require.cache[require.resolve('./context.json')];
const context = require('./context.json');
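If you want the variable to track the file without re-reading it on every access, another option is to re-read only when the file changes on disk. A sketch using the built-in fs.watchFile (which polls the file; the error handling here is an assumption you may want to tune):
let context = JSON.parse(fs.readFileSync('./context.json'));
fs.watchFile('./context.json', () => {
  try {
    // re-read and replace the cached value on each change
    context = JSON.parse(fs.readFileSync('./context.json'));
  } catch (err) {
    // the file may be mid-write; keep the previous value
  }
});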

Related

Synchronize critical section in API for each user in JavaScript

I want to swap a user's profile picture. For this, I have to check the database to see whether a picture has already been saved; if so, it should be deleted. Then the new one should be saved and entered into the database.
Here is a simplified (pseudo) code of that:
async function changePic(user, file) {
  // remove old pic
  if (await database.hasPic(user)) {
    let oldPath = await database.getPicOfUser(user);
    filesystem.remove(oldPath);
  }
  // save new pic
  let path = "some/new/generated/path.png";
  file = await Image.modify(file);
  await Promise.all([
    filesystem.save(path, file),
    database.saveThatUserHasNewPic(user, path)
  ]);
  return "I'm done!";
}
I ran into the following problem with it:
If the user calls the API twice in a short time, serious errors occur. The database queries and the functions in between are asynchronous, so the changes from the first API call have not been applied yet when the second call checks for a profile picture to delete. I'm then left with a filesystem.remove request for a file that no longer exists, and an orphaned image left in the filesystem.
I would like to handle this safely by synchronizing this critical section of code. I don't want to reject requests just because the server hasn't finished the previous one, and I also want to synchronize per user, so users aren't affected by the actions of other users.
Is there a clean way to achieve this in JavaScript? Some sort of monitor like you know it from Java would be nice.
You could use a library like p-limit to control your concurrency. Use a Map to track the active/pending requests for each user, with the user's ID (which I assume exists) as the key and the limit instance as the value:
const pLimit = require('p-limit');

const limits = new Map();

function changePic(user, file) {
  async function impl(user, file) {
    // your implementation from above
  }
  const { id } = user; // or similar to distinguish them
  if (!limits.has(id)) {
    limits.set(id, pLimit(1)); // only one active request per user
  }
  const limit = limits.get(id);
  return limit(impl, user, file); // schedule impl for execution
}

// TODO clean up limits to prevent memory leak?
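One way to address that TODO is to drop a user's limiter once it has drained. A sketch relying on p-limit's activeCount/pendingCount properties and Promise.prototype.finally (Node 10+); the last two lines of changePic would become:
const limit = limits.get(id);
return limit(impl, user, file).finally(() => {
  // nothing running or queued for this user any more: forget the limiter
  if (limit.activeCount === 0 && limit.pendingCount === 0) {
    limits.delete(id);
  }
});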

Link two asynchronous actions for both to succeed or fail together (move file + database save)

I have two asynchronous actions with callbacks. I would like to be sure that either both of them succeed or both fail, never that one succeeds and the other fails. It should probably work like a single transaction covering both actions, which could be rolled back?
Let's illustrate:
// In this simplified code, I assume I uploaded a file to a temporary folder.
// Every validation passed; now the goal is to move the file from the temporary folder to its final destination and to save it into the database.
// This 'move' and this 'save' are my two async actions, each with a callback invoked when the action completes.
// Maybe I am not using the right pattern to do this; thank you for enlightening me if necessary.
myController.create = function (req, res, next) {
  // I move the file from oldPath (temp directory) to newPath (final destination)
  mv(oldPath, newPath, function (err) {
    // If err, the file was not moved, so I stop here. The temp directory is
    // cleared later by the system every X period of time.
    if (err) { return next(err); }
    var file = new FileModel({
      // some properties (name, path...)
    });
    // The file is now moved; it needs to be saved into the database
    file.save(function (err) {
      if (!err) { return next(); } // everything went fine
      // If err, nothing is stored in the database but the file is still in the final folder :o
      // I could unlink the file, but that can also fail, leaving the file inside
      // the destination folder with no database entry.
      fs.unlink(newPath, function (other_err) {
        if (other_err) { return next(other_err); }
        return next(err);
      });
    });
  });
};
In the code above, if the first action succeeds, nothing guarantees that the second action will succeed too, nor that I can revert the first one if the second fails. The two actions are separate and independent instead of being linked / paired and working together.
If moving the file succeeds, the save in the database should succeed too. If the save in the database fails, then I should move the file back to the temp directory, or delete it from the destination folder, to stay consistent with the database. In other words, if the second action fails, the first one should fail too.
What is a good way to achieve that?
EDIT: One solution I can see would be to check every X period of time whether each file in the final destination folder has an entry in the db, and to delete the file if it doesn't.
You can use promises to implement this kind of thing. For example, suppose you need to create a user and then send a notification; both actions are async and need to be done one after the other.
const user = {};

// This function creates a user and sends back an id
user.createUser = (data) => {
  return new Promise((resolve, reject) => {
    // ...your callbacks
    if (someConditionsAreTrue) {
      return resolve(id);
    } else {
      return reject();
    }
  });
};

// This function finds the user's details by id and sends a notification
user.sendNotification = (id) => {
  return new Promise((resolve, reject) => {
    // ...your callbacks
    return resolve();
  });
};

user.action = async () => {
  try {
    const userId = await user.createUser(data); // Wait for the promise to resolve
    await user.sendNotification(userId);
    return true;
  } catch (err) {
    throw err;
  }
};

module.exports = user;
In the code above, the user.action() function calls the two separate functions one after the other. async/await only works on promises, so we wrapped each function in a promise and wait for it with the await keyword. In short, you need promises to handle this kind of thing.
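Applied to the original move-then-save scenario, the same idea gives you a compensating action: move the file, try the save, and undo the move if the save fails. A sketch (assuming fs.promises.rename works for your move and that file.save() returns a promise, as it does in recent Mongoose versions):
const fsp = require('fs').promises;

async function moveAndSave(oldPath, newPath, file) {
  await fsp.rename(oldPath, newPath); // action 1: move the file
  try {
    await file.save();                // action 2: save it into the database
  } catch (err) {
    // compensate: undo the move so filesystem and database stay consistent;
    // if the compensation itself fails, the periodic cleanup job is the fallback
    await fsp.rename(newPath, oldPath).catch(() => {});
    throw err;
  }
}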
I hope it helps. Happy Coding :)

Push objects into object array

The problem:
I want to keep track of my uploaded files by writing the file information for each uploaded file of a multi-upload into my database. However, when I upload 2 files it usually creates 3 entries in the database, and when I upload 6 files it creates a lot more than 6 entries.
My db function:
function saveAssetInDatabase(project, fileInformation) {
  return new Promise((reject, resolve) => {
    let uploaded_file = {}
    uploaded_file = fileInformation
    uploaded_file.file_type = 'asset'
    uploaded_file.display_name = fileInformation.originalname
    project.uploaded_files.push(uploaded_file)
    project.save()
  })
}
The simplified code which calls the function:
for (var i = 0; i < req.files["sourceStrings"].length; i++) {
  // Unknown file format, let's save it as asset
  saveAssetInDatabase(project, fileInformation).then(result => {
    return res.status(200).send()
  }).catch(err => {
    logger.error(err)
    return res.status(500).send()
  })
}
I guess that there is something wrong with my db function as it leads to duplicate file entries. What am I doing wrong here? One file should get one entry.
If I read the specs on model.save on the Mongoose website correctly, the problem with your save is rather that you are always reusing the original project, not the newly saved project that contains the latest state.
So what you are essentially doing is:
project.files.push(file1);
// file1 is marked as new
project.save();

project.files.push(file2);
// file1 & file2 are marked as new
// (the project doesn't know file1 has been saved already)
// ...
Now, that actually brings an advantage: you are currently doing one save per file, while you could save all files at once ;)
I guess the easiest way would be to move the project.save call outside of your for loop, and change your first method like this:
function saveAssetInDatabase(project, fileInformation) {
  let uploaded_file = {};
  uploaded_file = fileInformation;
  uploaded_file.file_type = 'asset';
  uploaded_file.display_name = fileInformation.originalname;
  project.uploaded_files.push(uploaded_file);
}
with the for loop changed to
function saveSourceString(project, req) {
  for (var i = 0; i < req.files["sourceStrings"].length; i++) {
    var fileInformation = req.files["sourceStrings"][i]; // assuming this is where the file info comes from
    // Unknown file format, let's save it as asset
    saveAssetInDatabase(project, fileInformation);
  }
  // save after all files were added
  return project.save().then(result => {
    return res.status(200).send()
  }).catch(err => {
    logger.error(err)
    return res.status(500).send()
  });
}
Note that project.save() returns a promise that resolves with the newly saved project. If you wish to manipulate this object later, make sure you use the saved document, not, as you have done until now, the unsaved model.
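For example (a minimal sketch of that last point):
project.save().then(savedProject => {
  // work with savedProject (the up-to-date document), not the stale project
});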
Problem
Each iteration of your for loop creates a promise that pushes to the project object and immediately sends it to the DB to be stored. That is not the correct way: every time a promise resolves, the project object is saved in whatever state it is in at that moment.
For example, say you have 3 assets' details.
On the first loop iteration, the first asset's data is pushed into the project object; when that promise resolves, the project is stored with the first asset's details.
On the second iteration, the second asset's data is pushed into the project object alongside the first; when that promise resolves, the project is stored with the first and second assets' details.
On the third iteration, the third asset's data is pushed alongside the first and second; when that promise resolves, the project is stored with the first, second, and third assets' details.
So you end up storing the same data repeatedly in your DB.
Solution
Use Promise.all: resolve all the assets' promises, and only respond once your project data is fully stored in your DB.
// your DB function
function saveAssetInDatabase(project, fileInformation) {
  return new Promise((resolve, reject) => {
    let uploaded_file = {}
    uploaded_file = fileInformation
    uploaded_file.file_type = 'asset'
    uploaded_file.display_name = fileInformation.originalname
    project.uploaded_files.push(uploaded_file)
    // resolve only once the save has actually completed
    project.save().then(() => resolve()).catch(reject)
  })
}
// calling code
let promiseArray = [];
for (var i = 0; i < req.files["sourceStrings"].length; i++) {
  promiseArray.push(saveAssetInDatabase(project, fileInformation));
}
Promise.all(promiseArray).then(result => {
  return res.status(200).send();
}).catch(err => {
  logger.error(err)
  return res.status(500).send()
})

Meteor synchronous and asynchronous call to read a file

I am new to Meteor. I am using the following code to read a file stored on the server.
Client side
Meteor.call('parseFile', (err, res) => {
  if (err) {
    alert(err);
  } else {
    Session.set("result0", res[0]);
    Session.set("result1", res[1]);
    Session.set("result2", res[2]);
  }
});

let longitude = Session.get("result0");
let latitude = Session.get("result1");
var buildingData = Session.get("result2");
Server Side
Meteor.methods({
  'parseFile'() {
    var csv = Assets.getText('buildingData.csv');
    var rows = Papa.parse(csv).data;
    return rows;
  }
});
The problem is that the call takes time to send the result back, so wherever I use latitude and longitude they are undefined and the page breaks. Is there any solution to this problem? One solution could be to make a synchronous call and wait for the result to be returned.
You can make the server method run synchronously using the futures package, which should force the client to wait for the method to complete.
It might look something like this:
// fibers/future must be imported at the top of the file
const Future = require('fibers/future');

Meteor.methods({
  'parseFile'() {
    var future = new Future();
    var csv = Assets.getText('buildingData.csv');
    var rows = Papa.parse(csv).data;
    future.return(rows);
    return future.wait();
  }
});
This requires installing the futures package linked above and setting up your imports properly in the file containing your Meteor.methods() definitions. You might also look into proper error handling inside your method.
UPDATE:
The Future package is available as an NPM package, which you can read about here. The link above is to the Atmosphere package, which looks like an old wrapper package.
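Note that the client still receives the result asynchronously, so reading the Session values immediately after Meteor.call will still give undefined. One client-side way around this is to read them inside a reactive computation (a sketch, assuming the standard tracker and session packages):
Tracker.autorun(() => {
  const longitude = Session.get("result0");
  const latitude = Session.get("result1");
  const buildingData = Session.get("result2");
  if (longitude !== undefined && latitude !== undefined) {
    // the method result has arrived; it is now safe to use these values
  }
});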

Use Asynchronous IO better

I am really new to JS, and even newer to Node.js. So, using "traditional" programming paradigms, my file looks like this:
var d = require('babyparse');
var fs = require('fs');
var file = fs.readFile('SkuDetail.txt');
d.parse(file);
So this has many problems:
It's not asynchronous.
My file is bigger than the default max file size (this one is about 60mb), so it currently breaks (not 100% sure if that's the reason).
My question: how do I load a big file (and it will be significantly bigger than 60mb in future uses) asynchronously, parsing as I receive the information? Then, as a follow-up: how do I know when everything is completed?
You should create a ReadStream. A common pattern looks like this. You can parse data as it gets available on the data event.
const fs = require('fs');

function readFile(filePath, done) {
  var stream = fs.createReadStream(filePath, 'utf8'),
      out = '';

  // Make done optional
  done = done || function (err) { if (err) throw err; };

  stream.on('data', function (data) {
    // Parse data
    out += data;
  });

  stream.on('end', function () {
    done(null, out); // All data is read
  });

  stream.on('error', function (err) {
    done(err);
  });
}
You can use the function like this:
readFile('SkuDetail.txt', function (err, out) {
  // Handle error
  if (err) throw err;
  // File has been read and parsed
});
Since the parsed data is appended to the out variable, the entire parsed file will be passed to the done callback once reading finishes.
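If you want to parse as the data arrives instead of buffering everything into out, Node's built-in readline module can split the stream into lines for you (a sketch, assuming the file is line-oriented like a CSV):
const fs = require('fs');
const readline = require('readline');

function parseFileByLine(filePath, onLine, done) {
  const stream = fs.createReadStream(filePath, 'utf8');
  const rl = readline.createInterface({ input: stream });
  rl.on('line', (line) => onLine(line));   // handle each line as it arrives
  rl.on('close', () => done(null));        // the whole file has been read
  stream.on('error', (err) => done(err));  // surface read errors
}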
It already is asynchronous: I/O in Node's JavaScript is asynchronous, so no extra effort is needed on your part. Does your code even work, though? The parse should be inside a callback passed to readFile; otherwise readFile hasn't completed yet and file is undefined.
In normal situations, any async I/O call you write will be "skipped over", and the code after it will be executed first, before the I/O completes.
For the first question: since you want to process chunks, Streams might be what you are looking for. #pstenstrm has an example in his answer.
Also, you can check this Node.js documentation link for Streams: https://nodejs.org/api/fs.html#fs_fs_createreadstream_path_options
If you want an brief description and example for Streams check this link: http://www.sitepoint.com/basics-node-js-streams/
You can pass a callback to the fs.readFile function to process the content once the file read is complete. This would answer your second question.
fs.readFile('SkuDetail.txt', function (err, data) {
  if (err) {
    throw err;
  }
  processFile(data);
});
You can see Get data from fs.readFile for more details.
Also, you could use Promises for cleaner code with other added benefits. Check this link: http://promise-nuggets.github.io/articles/03-power-of-then-sync-processing.html
