How to share a variable between two files generated by a function - javascript

I am writing an API in NodeJS and I have run into a brick wall. I am trying to use a function to grab a variable and use module.exports to make that variable available in another file. However, it keeps coming up as undefined in the console.
I have already tried using return statements in different places in the file, but I keep getting undefined.
This is what the code looks like to grab the variable and export it.
File 1 (api.js)
const fs = require('fs');
const homeDir = require('os').homedir();

module.exports = {
  workingDirectory: () => {
    let dir;
    fs.access(`${homeDir}/.unitv`, fs.constants.F_OK, (err) => {
      if (err) throw err;
      fs.readFile(`${homeDir}/.unitv`, 'utf8', (readErr, data) => {
        if (readErr) throw readErr;
        let jsonData = JSON.parse(data);
        dir = jsonData.WorkingDirectory;
      });
    });
    return dir;
  }
};
File 2
const api = require('../api');
console.log(api.workingDirectory());
.unitv file
{
  "WorkingDirectory": "/home/user/UniTV",
  "Port": "3000"
}
In the console it turns up as undefined, when it should show the value of WorkingDirectory from /home/user/.unitv.
Any and all help is appreciated, thanks.

Your current code has a fundamental problem.
return dir; runs before fs.access/fs.readFile have finished. These are asynchronous functions and require callback-, promise-, or async/await-style code. The gist of it is that execution continues with other code while Node waits on I/O (such as reading a file), so the way you have written it, nothing is returned yet. See https://repl.it/#CodyGeisler/readFileCallback for a working callback example.
workingDirectory: () => {
  let dir;
  fs.access(`${homeDir}/.unitv`, fs.constants.F_OK, (err) => {
    if (err) throw err;
    fs.readFile(`${homeDir}/.unitv`, 'utf8', (readErr, data) => {
      if (readErr) throw readErr;
      let jsonData = JSON.parse(data);
      dir = jsonData.WorkingDirectory;
    });
  });
  return dir;
}
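For reference, a minimal callback-style sketch of the same export (my own rewrite, not the linked repl; note that the fs.access call is redundant, since fs.readFile already errors on a missing file):

const fs = require('fs');
const homeDir = require('os').homedir();

module.exports = {
  // the caller passes an error-first callback instead of reading a return value
  workingDirectory: (callback) => {
    fs.readFile(`${homeDir}/.unitv`, 'utf8', (readErr, data) => {
      if (readErr) return callback(readErr);
      callback(null, JSON.parse(data).WorkingDirectory);
    });
  }
};

// File 2
const api = require('../api');
api.workingDirectory((err, dir) => {
  if (err) throw err;
  console.log(dir); // "/home/user/UniTV"
});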

Related

Node async operations on child_process are not ending

I have a little script that executes a child process using execFile. This child process is a Node script too that runs async operations, but it seems the async operations never finish, so the terminal and all the processes hang.
This is the main script that runs the execFile for the child process:
fs.readdir(directoryPath, function(err, files) {
  if (err) console.log(`Error: ${err}`);
  files.map((file) => {
    execFile(`node`, ["updater.js", "BMW", file], (error, stdout, stderr) => {
      if (error) {
        red(`error: ${error.message}`);
        return;
      }
      if (stderr) {
        red(`stderr: ${stderr}`);
        return;
      }
      console.log(stdout);
    });
  });
});
And this is the node script executed as child process:
const args = process.argv.slice(2);
const brand = args[0];
const model = args[1];
const data = readJSON(`./json-export/${brand}/${model}`);
const generations = data.generations;
const generationsDB = await getGenerationsByModelAndBrand(brand, model);
console.log(generationsDB);
generations.map((generation) => {
  const lastModification =
    generation.modifications.modification[
      generation.modifications.modification.length - 1
    ];
  console.log(lastModification);
});
All the code works if I comment out the const generationsDB line and the next console.log. If I don't, execution gets stuck as soon as it hits the async request.
I tested getGenerationsByModelAndBrand in the main script and it works with no issue.
getGenerationsByModelAndBrand runs a query on the database and returns a Promise.
This is the getGenerationsByModelAndBrand method code:
export const getGenerationsByModelAndBrand = (brand, model) => {
  return new Promise((resolve, reject) => {
    const sql = `DATABASE SELECT`;
    connection.query(sql, function(error, result) {
      if (error) return reject(error);
      return resolve(result);
    });
  });
};
connection comes from mysql.createConnection method from the mysql package.
I believe the issue comes from the promise handling in the child process; it's like I'm missing something, but I couldn't find what it is.
Edit:
After researching I didn't find a solution or explanation for this issue, so in the meantime I moved the getGenerationsByModelAndBrand call to the parent script and pass the result as a parameter.
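For what it's worth, await is only valid inside an async function (or at the top level of an ES module), so in a plain CommonJS child script the logic has to be wrapped, for example in an async IIFE; the open mysql connection will also keep the child process alive until connection.end() is called. A rough sketch along those lines, assuming readJSON, getGenerationsByModelAndBrand and connection are accessible in the child script:

const args = process.argv.slice(2);
const brand = args[0];
const model = args[1];

// wrapping in an async IIFE makes `await` legal in a CommonJS script
(async () => {
  const data = readJSON(`./json-export/${brand}/${model}`);
  const generations = data.generations;

  const generationsDB = await getGenerationsByModelAndBrand(brand, model);
  console.log(generationsDB);

  generations.map((generation) => {
    const lastModification =
      generation.modifications.modification[
        generation.modifications.modification.length - 1
      ];
    console.log(lastModification);
  });

  // close the DB connection so the child process can exit
  connection.end();
})();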

fs.readfile changes scope of global array and it can't be used outside it

I have two sections of code: 1) a function that is called by 2) to populate an array and write it into a file.
async function timeSeries(obj) {
  data = [
    {
      original_value: []
    }
  ]
  //read file named as passed object's _id
  await fs.readFile("./api/assignment_data/" + obj._id + ".json", "utf-8", function read(err, datas) {
    if (err) {
      throw err;
    }
    const filedata = JSON.parse(datas)
    filedata.map(line => data[0].original_value.push(line.original_value))
  })
  setTimeout(() => {
    try {
      fs.writeFileSync("./api/timeseries.json", JSON.stringify(data), { encoding: 'utf8', flag: 'w' })
    } catch (error) {
      console.log(error)
    }
  }, 300);
}
The problem is that I can't access the global data array after filling it inside the fs.readFile callback (callback scope hell), so I had to use setTimeout before I could write it to a file with fs.writeFileSync (if I return the array I get a promise, but I want the data). How do I solve this? Instead of writing it into another file and reading it inside another route (below), how can I directly return the array in the second route and pass it as a JSON response?
section 2)
router.route("/api/debug/:num").get((req, res) => {
fs.readFile("./api/assignment_data/metrics.json", "utf8", function read(err, data) {
if (err) {
console.log(err);
}
const objdata = JSON.parse(data)
timeSeries(objdata[req.params.num])
})
fs.readFile("./api/timeseries.json", "utf8", function read(err, data) {
if (err) {
console.log(err);
}
const objdata = JSON.parse(data)
res.json(data)
})
})
If you use fs.readFile and want to do an action after the file has been read, you must do that action (write and then read a file, in your case) inside the callback function. Alternatively, you can use fs.readFileSync if reading synchronously is acceptable.
First off, we need to explain a few things:
fs.readFile() is non-blocking and asynchronous. That means that when you call it, it starts the operation, returns immediately, and starts to execute the code that comes right after it. Then, some time later, it calls its callback.
So, your code is:
Calling fs.readFile()
Then, immediately setting a timer
Then, it's an indeterminate race between the fs.readFile() callback and the timer to see who finishes first. If the timer finishes first, then it will call its callback and you will attempt to access data BEFORE it has been filled in (because the fs.readFile() callback has not yet been called).
You cannot write reliable code this way as you are guessing on the timing of indeterminate, asynchronous operations. Instead, you have to use the asynchronous result from within the callback because that's the only place that you know the timing for when it finished and thus when it's valid. So, one way to solve your problem is by chaining the asynchronous operations so you put the second one INSIDE the callback of the first:
function timeSeries(obj, callback) {
  //read file named as passed object's _id
  fs.readFile("./api/assignment_data/" + obj._id + ".json", "utf-8", function read(err, datas) {
    if (err) {
      console.log(err);
      // tell caller about our error
      callback(err);
      return;
    } else {
      let data = [{ original_value: [] }];
      const filedata = JSON.parse(datas);
      for (let line of filedata) {
        data[0].original_value.push(line.original_value);
      }
      fs.writeFile("./api/timeseries.json", JSON.stringify(data), { encoding: 'utf8' }, (err) => {
        if (err) {
          console.log(err);
          callback(err);
          return;
        } else {
          // give our data to the caller (error-first convention: null error slot)
          callback(null, data);
        }
      });
    }
  })
}
Then, to call this function, you pass it a callback and in the callback you can either see the error or get the data.
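For example (a hypothetical caller, following the error-first convention above):

timeSeries(obj, (err, data) => {
  if (err) {
    console.log(err);
    return;
  }
  // data is the same array that was written to timeseries.json
  console.log(data);
});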
In modern nodejs, it's a bit easier to use async/await and the promise-based interfaces in the fs module:
const fsp = require('fs').promises;

async function timeSeries(obj) {
  //read file named as passed object's _id
  try {
    let datas = await fsp.readFile("./api/assignment_data/" + obj._id + ".json", "utf-8");
    const filedata = JSON.parse(datas);
    let data = [{ original_value: [] }];
    for (let line of filedata) {
      data[0].original_value.push(line.original_value);
    }
    await fsp.writeFile("./api/timeseries.json", JSON.stringify(data), { encoding: 'utf8' });
    return data;
  } catch (e) {
    console.log(e);
    // handle error here or throw back to the caller
    throw e;
  }
}
For this version, the caller can use await and try/catch to get errors:
try {
  let data = await timeSeries(obj);
  // do something with data here
} catch (e) {
  // handle error here
}
Based on the code you have written, I modified it to use simple async/await; hope this helps.
import fs from 'fs'

async function timeSeries(obj) {
  const data = [{
    original_value: []
  }]
  const assData = fs.readFileSync(`./api/assignment_data/${obj._id}.json`, 'utf8')
  const filedata = JSON.parse(assData)
  filedata.map(line => data[0].original_value.push(line.original_value))
  // no need for a timeout
  fs.writeFileSync('./api/timeseries.json', JSON.stringify(data))
  // return data if you need it
  return data
}

router.route("/api/debug/:num").get(async (req, res) => {
  try {
    const metricData = fs.readFileSync('./api/assignment_data/metrics.json', 'utf8')
    const objdata = JSON.parse(metricData)
    const timeSeriesData = await timeSeries(objdata[req.params.num])
    // returning the time series data
    res.status(200).json(timeSeriesData)
  } catch (error) {
    res.status(500).send(error.message)
  }
})

Angular reading the value of a js function as undefined, even when the object has value

I created some javascript functions that read and write to a json file and are supposed to be invoked in Angular (from TypeScript code), using the jsonfile library.
Here is the code:
function savePatient(patient){
  const jsonfile = require('jsonfile')
  const file = 'src/resources/patients.json'
  jsonfile.writeFile(file, patient, {flag: 'a'}, function(err){
    if (err) console.error(err)
  })
}

function getPatients(){
  const jsonfile = require('jsonfile')
  const file = 'src/resources/patients.json'
  jsonfile.readFile(file, function(err, obj){
    if (err) console.error(err)
    console.dir(obj)
    return obj
  })
}
Here is the declaration of functions in Angular component:
declare function savePatient(patient: Patient);
declare function getPatients(): Patient[];
I managed to successfully call the savePatient() function, and it works as intended.
When I try to invoke console.log(getPatients()) from inside the Angular component, the output is undefined, but the getPatients() function itself produces correct console output from the console.dir(obj) line.
How am I supposed to get the correct value of the function inside the Angular component?
Also, this project runs inside an Electron container, in case that's relevant.
I found it interesting that the Angular component is the first to output information to the console, even though it would make sense for the js functions to log before it, considering the Angular component depends on the return value of the js function; I don't know what to make of that.
Your function
function getPatients(){
  const jsonfile = require('jsonfile')
  const file = 'src/resources/patients.json'
  jsonfile.readFile(file, function(err, obj){
    if (err) console.error(err)
    console.dir(obj)
    return obj
  })
}
works asynchronously (see the docs).
You have two options. The first one is to handle the file-reading asynchronously:
function getPatients(){
  const jsonfile = require('jsonfile')
  const file = 'src/resources/patients.json';
  // Create a new promise
  return new Promise((resolve, reject) => {
    jsonfile.readFile(file, function(err, obj){
      if (err) {
        console.error(err)
        return reject(err);
      }
      console.dir(obj)
      return resolve(obj);
    });
  });
}

...

// Prints the read object in the console, after the file reading is done
getPatients().then((obj) => {
  console.dir(obj);
});
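Equivalently, if the calling code is itself inside an async function, the promise can be awaited directly (a sketch, assuming an async context):

// inside some async method of the component
const patients = await getPatients();
console.dir(patients);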
The second option, and in my opinion the best solution for you, is using the synchronous way to read a file:
function getPatients(){
  const jsonfile = require('jsonfile')
  const file = 'src/resources/patients.json'
  try {
    const obj = jsonfile.readFileSync(file);
    console.dir(obj);
    return obj;
  } catch (e) {
    console.error(e);
  }
}
Please make sure that your function returns something. In this snippet I added a return statement before jsonfile.readFile().
function getPatients(){
  const jsonfile = require('jsonfile')
  const file = 'src/resources/patients.json'
  return jsonfile.readFile(file, function(err, obj){
    if (err) return err;
    return obj;
  });
}
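Note that, as written, this returns whatever jsonfile.readFile itself returns rather than the file contents; if I recall the jsonfile docs correctly, recent versions return a promise when the callback is omitted, so under that assumption the caller would consume it like this:

// assumes a jsonfile version that returns a promise when no callback is passed
function getPatients() {
  const jsonfile = require('jsonfile')
  const file = 'src/resources/patients.json'
  return jsonfile.readFile(file)
}

getPatients().then((patients) => console.dir(patients))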

Async readFile module.exports in node.js

I'm sorry for what might easily be a naive question, but I'm trying to figure out how node works, especially for a problem like this:
What I need to do is send an object/file from fs.readFile through require and module.exports. This is what I have tried:
in one file (call it app.js) the code for reading a file:
var fs = require('fs');
var file_contents = undefined;

var callback_reader = function(err, data) {
  if (err) return console.error(err);
  file_contents = data.toString().split('\n');
}

module.exports = {
  parseFile: function(file_path) {
    fs.readFile(file_path.toString(), 'utf-8', callback_reader);
  }
}
and in some other file (call it main.js) I need to use the contents of the file read by readFile, like this:
var file_importer = require('./app.js')
file_importer.parseFile(real_path_to_file);
but if I try to console.log that last line I always get an undefined object. Now I know it is because the callback does not execute before the console.log, but I'm unsure how to achieve this communication.
So I changed your code a little bit to use callbacks.
It seems that you can't use "return" from an asynchronous function in module.exports. However, the code below works as expected. Hope it helps.
main.js
var file_importer = require('./app.js')
file_importer.parseFile('./time.js', function(err, data){
  if (err) return console.log(err);
  console.log(data);
});
app.js
var fs = require('fs');
module.exports = {
  parseFile: function(file_path, callback) {
    fs.readFile(file_path.toString(), 'utf-8', function(err, data) {
      if (err) return callback(err);
      callback(null, data);
    });
  }
}

// much shorter version
exports.parseFile = function(file_path, callback) {
  fs.readFile(file_path.toString(), 'utf-8', callback);
}
This is how JavaScript works: it doesn't wait for the callback to be called before returning. You should do your console.log in your callback, like this:
fs.readFile(pathToFile, 'utf-8', function(err, data) {
  if (err) return err;
  console.log(data);
  // Continue your process here
})

Why is the following fs.writeFile only writing data from the last file?

I want to read the content of two files (same folder) and write them into a single one:
const input = process.argv[2]
fs.readdir(__dirname + `/${input}/`, (err, files) => {
  if (err) {
    return
  }
  files.forEach((file) => {
    fs.readFile(__dirname + `/${input}/` + file, 'utf8', (err, data) => {
      let items = []
      items.unshift(data)
      let result = items.join('\n\n')
      fs.writeFile("untitled2.html", result, (err) => {
        if (err) {
          console.log(err)
        } else {
          console.log(result)
        }
      })
    })
  })
})
console.log(result) outputs the content of the two files:
alex#alex-K43U:~/node/m2n/bin$ node index4.js folder
File 1
File 2
The file, however, has only the content from the second file:
File 2
What's happening here?
Don't use writeFile but appendFile when your goal is to append to a file without replacing the content (appendFile takes care of creating the file when necessary).
You're also not waiting for the appending to be finished, which might lead to errors. You have various solutions here:
promises
a recursive function handling files one after the other
use appendFileSync (assuming you're writing an utility rather than a server)
Example with the recursive function:
(function doOneFile(){
  var file = files.shift();
  if (!file) return;
  fs.readFile(__dirname + `/${input}/` + file, 'utf8', (err, data) => {
    let items = []
    items.unshift(data)
    let result = items.join('\n\n')
    fs.appendFile("untitled2.html", result, (err) => {
      if (err) {
        console.log(err)
      } else {
        console.log(result)
      }
      doOneFile();
    })
  })
})();
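For the promise option mentioned above, a sketch using the fs.promises API (Node 10+; it reads and appends sequentially to avoid concurrent writes):

const fsp = require('fs').promises;

async function concatFiles(dir, outFile) {
  const files = await fsp.readdir(dir);
  await fsp.writeFile(outFile, ''); // start from an empty file
  for (const file of files) {
    const data = await fsp.readFile(dir + '/' + file, 'utf8');
    await fsp.appendFile(outFile, data + '\n\n');
  }
}

concatFiles(__dirname + `/${input}/`, 'untitled2.html').catch(console.error);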
With the default options, writeFile erases previous contents every time. That's the "write mode". What you want is to switch to "append mode", like so:
fs.writeFile("untitled2.html", result, {flag:"a"}, callbacks...);
In the process, you'll need to take care to erase any existing file contents before your loop, or have the first access be in write mode. Otherwise you'll keep appending to previously existing contents.
Besides, in this case you'll be hitting problems with concurrent accesses. Either use the synchronous forms, or loop through files via a callback.
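A sketch of the synchronous form (fine for a one-off utility, not for a server):

const files = fs.readdirSync(__dirname + `/${input}/`);
fs.writeFileSync('untitled2.html', ''); // erase previous contents once
for (const file of files) {
  const data = fs.readFileSync(__dirname + `/${input}/` + file, 'utf8');
  fs.writeFileSync('untitled2.html', data + '\n\n', { flag: 'a' });
}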
