Cannot write to JSON in Node.js - javascript

I'm trying to make a live JSON database of IDs and a tag. The database refreshes by reading from the JSON file, and I have traced my problem to Node.js not writing to disk, but I don't quite know why.
This is my reading operation, and yes, there is a file there with valid syntax.
let dbraw = fs.readFileSync('db.json');
var db = JSON.parse(dbraw);
This is my writing operation, where I need to save the updates to disk.
var authorid = msg.author.id
db[authorid] = "M";
fs.writeFileSync('db.json', JSON.stringify(db));
Am I doing something wrong? Is it a simple error I am just forgetting? Is there an easier or more efficient way to do this that I am forgetting about? I can't figure out exactly what is going wrong, but it has something to do with these two bits. There are no errors in my console, just the blank JSON file it reads every time in the read operation.

There is a problem with your JSON file's path.
Try using __dirname.
__dirname tells you the absolute path of the directory containing the currently executing file.
— source (DigitalOcean)
Example:
If the JSON file is in the root directory:
let dbraw = fs.readFileSync(__dirname + '/db.json');
var db = JSON.parse(dbraw);
If the JSON file is in a subdirectory:
let dbraw = fs.readFileSync(__dirname + '/myJsonFolder/' + 'db.json');
var db = JSON.parse(dbraw);
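If you prefer, Node's built-in path module can build the same path without manual string concatenation; this is just a stylistic variant of the examples above:
const path = require('path');
let dbraw = fs.readFileSync(path.join(__dirname, 'db.json'));
var db = JSON.parse(dbraw);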
Side note: I suggest you read about Google Firestore, as it will be a faster way to work with real-time updates.

Here's a simple block that does what's desired:
const fs = require('fs');
let file_path = __dirname + '/db.json',
    dbraw = fs.readFileSync(file_path),
    db = JSON.parse(dbraw),
    authorid = 'abc';
console.log(db);
db[authorid] = "M";
fs.writeFileSync(file_path, JSON.stringify(db));
dbraw = fs.readFileSync(file_path);
db = JSON.parse(dbraw);
console.log(db);
I've added a couple of log statements for debugging. This works, so there may be something else that's missing or incorrect in your flow. The most probable issue is a difference in path references, as pointed out by jfriend00 in the comment to your question.
As for better solutions, here are a few suggestions (see the sketch after this list):
Use require for the JSON directly instead of a file read if the file is small; it does the parsing for you.
Use the async fs functions.
Stream the file if it's big.
See if you can use a cache like Redis or a database as the storage layer to reduce your app's serialization and deserialization overhead.
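Here is a rough sketch of the first two suggestions. It reuses the db.json file name from the question; treat it as an illustration rather than a drop-in replacement:
// 1) For a small file, require() reads and parses the JSON for you
//    (note: the result is cached for the lifetime of the process).
const db = require('./db.json');

// 2) The promise-based fs API avoids blocking the event loop on writes.
const fsp = require('fs').promises;

async function saveDb(db) {
  await fsp.writeFile(__dirname + '/db.json', JSON.stringify(db));
}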

Related

Passing data between two plain js files in ejs templating

My file structure is shown in a screenshot (not reproduced here).
I have two EJS views. I am taking a variable from index.ejs using document.querySelector; this variable is stored in the index.js file.
I need to access this variable in board.js.
How can I do so?
I have tried using:
module.exports = varName and then require in board.js, but it isn't working.
index.js file
const btn = document.querySelector(".level1");
var levelMode;
btn.addEventListener("click", () => {
levelMode = btn.innerHTML;
alert(levelMode);
});
module.exports = levelMode;
board.js file
var levelMode = require("./index")
The console shows the following error:
Uncaught ReferenceError: require is not defined
I'm a new developer, so take my advice with a grain of salt, but I need to start by saying that JS files don't "store" information other than their source code. Any information you feel you need to pull from a variable is held by the browser running the JS file. If you need to pull data from the client, my best advice is to first pass it to the server, as the sketch below shows. You should get more familiar with the module you are using for HTTP requests.
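For illustration, here is a minimal, hypothetical sketch of that idea: the browser sends the value to the server with fetch, and an Express route receives it. The /level route name and the express.json() middleware are assumptions, not part of the original code.
// index.js (runs in the browser)
const btn = document.querySelector(".level1");
btn.addEventListener("click", () => {
  const levelMode = btn.innerHTML;
  // send the value to the server instead of trying to module.exports it
  fetch("/level", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ levelMode })
  });
});

// server.js (runs in Node with Express)
const express = require("express");
const app = express();
app.use(express.json());
app.post("/level", (req, res) => {
  const levelMode = req.body.levelMode; // now available server side
  res.sendStatus(200);
});
app.listen(3000);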

How to pass it into require?

I am writing a Discord bot, and I have a database (in JSON) with a prefix for each guild.
{
  "guilds": {
    "721463236070866955": {
      "prefix": "!"
    }
  }
}
Then I require the database with:
const db = require("./database/json");
After getting the guild id with:
gld = new discord.Guild(client, data);
gid = gld.id();
I need to get the prefix from the JSON, but how?
db.guilds[gid].prefix would give you the prefix of the guild.
However, using a JSON database is poor practice. You would be better off using a real database; even something as simple as SQLite would suffice (the npm package better-sqlite3 is a pretty good one; see the sketch below).
This would solve problems like:
having to delete the require cache and re-require the file
having to copy the entire contents into the cache just to make simple changes to the file
Also, there is a chance that you're requesting the prefix before the guild has been added to the db, so you will need to check for that.
Simply: if (!db.guilds[gid].prefix) { ...doesn't exist }
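As a minimal sketch of the better-sqlite3 suggestion (the guilds.db file name and the table/column names are made up for illustration; gid comes from the question's code):
const Database = require('better-sqlite3');
const sql = new Database('guilds.db');

// one-time setup
sql.prepare('CREATE TABLE IF NOT EXISTS guilds (id TEXT PRIMARY KEY, prefix TEXT)').run();

// write or update a guild's prefix
sql.prepare('INSERT OR REPLACE INTO guilds (id, prefix) VALUES (?, ?)').run(gid, '!');

// read it back, falling back to a default if the guild isn't stored yet
const row = sql.prepare('SELECT prefix FROM guilds WHERE id = ?').get(gid);
const prefix = row ? row.prefix : '!';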

Node.js: Are Global Vars shared between instances?

I have this code:
app.post('/pst', function(req, res) {
  var url = req.body.convo;
  myAsyncFucntion(url).then(result => {
    console.log('TAKE A LOOK AT THIS!');
    // transforming the array to a string to pass to Buffer.from(),
    // then we replace ',' with newlines, so each index of the array is a new line
    var str = result.toString();
    result = str.split(',').join('\r\n');
    // clever way to send a text file to the client from the server's memory
    var fileContents = Buffer.from(result, 'ascii');
    var readStream = new stream.PassThrough();
    readStream.end(fileContents);
    res.set('Content-disposition', 'attachment; filename=' + fileName);
    res.set('Content-Type', 'text/plain');
    readStream.pipe(res);
    // garbage collecting; I don't know if it's necessary
    result = '';
    str = '';
  }).catch(err => {
    console.log(err);
    res.render('error.ejs');
  })
});
This code runs an async function and serves the user some data from the server's memory as a text file.
I am planning on using sockets to notify the client when the work is done.
The client will then enter a link and download a file.
So I plan to take the local variable result and export it to a global variable.
That way, app.get() will have access to it, and when the user follows that link, it will serve the file.
But a user told me that global variables are shared between instances.
Is this true? If two (or more) users try to get their results at the same time, will the global variables be the same for both of them?
Is this true?
Yes. (Well, effectively yes. The real answer is that there aren't "multiple instances" in the first place: You have one server which multiple users are making multiple requests to.)
If you want to associate data with a particular browser session, then use a session (NPM modules such as express-session handle this for Express); a minimal sketch follows.
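This sketch uses the express-session package; the route names, the secret value, and the default in-memory store are illustrative assumptions:
const express = require('express');
const session = require('express-session');

const app = express();
app.use(express.json());
app.use(session({ secret: 'change-me', resave: false, saveUninitialized: false }));

app.post('/pst', (req, res) => {
  // store this user's result on their session instead of a global variable
  req.session.result = 'text generated for this user';
  res.sendStatus(200);
});

app.get('/download', (req, res) => {
  // each browser session only ever sees its own result
  res.type('text/plain').send(req.session.result || '');
});

app.listen(3000);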
This really depends on where you declare those variables.
Please read some documentation on variable scope in JS; it's hard to explain here when there are plenty of well-written and well-illustrated explanations out there.
If a variable is declared within the code that runs anew for every user request, then that variable is different for each user.
If you declare the variable outside of that part, then its content is the same for every action of your Node application, as the sketch below illustrates.
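A tiny sketch of that difference (the route and variable names are illustrative, not from the original code):
const express = require('express');
const app = express();
app.use(express.json());

// module scope: one copy, shared by every request this process handles
let lastUrl = null;

app.post('/pst', (req, res) => {
  // function scope: each request gets its own copy of this variable
  const url = req.body.convo;

  lastUrl = url; // every user overwrites the same shared value
  res.send('ok');
});

app.listen(3000);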

Push new data to external JSON file

I'm trying to push a new object into an external JSON document, but I am having problems pulling in the JSON file to write to. I have an external, local file called check.json.
How do I call the external JSON file correctly in Node?
var newData = JSON.parse(check.json);
newData.check.push({
cheap: $el.text()
});
check.json = JSON.stringify(newData);
You can use the fs (file system) module to read and write files, specifically the readFile and writeFile methods.
For example, to read:
var fs = require('fs');
fs.readFile('/path/to/my/json.json', function (err, data) {
  if (err) throw err;
  var newData = JSON.parse(data);
});
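To complete the read-modify-write cycle from the question, the write half might look something like this (the ./check.json path and the $el.text() call are taken from the question's code, so they are assumptions about your setup):
fs.readFile('./check.json', 'utf8', function (err, data) {
  if (err) throw err;
  var newData = JSON.parse(data);
  newData.check.push({ cheap: $el.text() });
  fs.writeFile('./check.json', JSON.stringify(newData), function (err) {
    if (err) throw err;
  });
});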
That said, flat files are not a good format for storing data and being accessed like a database; you can run into race conditions and lose data. You would be better off with a real database that will take care of that sort of thing for you.
SQLite gives you a simple file (but puts lots of protection around it). If you really want JSON as the storage format, then you could look at something like CouchDB.

How do I read and write files to the server with Meteor?

I'm working on a NoDB CMS in Meteor, but I'm new to both Meteor and JavaScript frameworks.
How do I go about reading and writing files to the server?
Within the Node fs module you have a writeFile function.
getUser = Meteor.users.findOne({_id: Meteor.userId()});
userObject = JSON.stringify(getUser);
var path = process.env["PWD"] + "/public/";
fs.writeFile(path + Meteor.userId() + '.txt', userObject, function (err) {
  if (err) throw err;
  console.log('Done!');
});
The above snippet creates a file with all the information of the user. You could access the properties of your query result with something like getUser._id to prepare your data parameter (a String or Buffer) for pretty printing.
All of this, of course, is server side.
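For the reading half of the question, the matching call would be something like the following (the path mirrors the write example above, so adjust it to wherever you actually stored the file):
fs.readFile(process.env["PWD"] + "/public/" + Meteor.userId() + '.txt', 'utf8',
  function (err, data) {
    if (err) throw err;
    var userObject = JSON.parse(data);
    console.log(userObject);
  }
);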
You can try to use Npm.require inside the startup function, like so:
Meteor.startup(function () {
  fs = Npm.require('fs');
});
But you should definitely have a look at CollectionFS, which does what you are looking for: storing files on the server and allowing you to retrieve them.
An added advantage is that you can distribute everything over many nodes of a MongoDB cluster.
To manipulate image files, you can use ImageMagick with Node.js, which should let you transform them in any way you need.
The node fs module is a start. http://nodejs.org/api/fs.html
You might want to be a bit more specific with your question though, as it's kind of broad.
