Writing to a JSON file - javascript

So I want to save usernames and user scores in a JSON file. For testing I manually wrote one, and it currently looks like this:
[
  {
    "username": "Testing",
    "score": 2
  },
  {
    "username": "Testing123",
    "score": 3
  }
]
I can now read from this file and get the requested user's score with this:
for (const player of players) {
  if (player.username == message.author.tag) {
    message.reply(`Points: - **${player.score}** - `);
  }
}
Now I want to write to this JSON file (when a new user is registered with a score), just as I did manually for testing, but I can't figure out how to do it, even after hours of searching on the internet.

You need to use FS.writeFile to write the data to disk.
const FS = require('fs')
// you can't write an object, so first you need to serialize the object
// here, I serialize it as JSON
let data = JSON.stringify(MY_JSON)
// this example uses the sync version for simplicity; FS.writeFile is async
FS.writeFileSync( path, data, 'utf8' )
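For the registration case in the question, a minimal sketch of the full cycle (read the existing array, append a new player, write it back) could look like this; the file name and the starting score are assumptions, and message.author.tag comes from the question's Discord context:
const FS = require('fs')

// assumed file name; adjust to wherever your player file lives
const path = './players.json'

// read and parse the current array of players
const players = JSON.parse(FS.readFileSync(path, 'utf8'))

// append the newly registered user (score value is just an example)
players.push({ username: message.author.tag, score: 0 })

// serialize and write the whole array back, pretty-printed with 2-space indent
FS.writeFileSync(path, JSON.stringify(players, null, 2), 'utf8')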

Related

How to read and write to local JSON files from React.js?

I have looked at multiple resources for this; however, none of them seem to answer my question. I have a local JSON file in my React app called items.json. In that file is a list of objects which I want to be able to update. I have tried using fs, however this apparently doesn't work in React, as I received this error:
Unhandled Rejection (TypeError): fs.readFileSync is not a function
What I am trying to do is that when the code gets a new item, it looks through the JSON file to see if there is an existing object with a matching value in its name property. If there is, it increments that object's count property by 1; otherwise it creates a new object and appends it to the list in the JSON file. This is the code that I have written to do that. The logic seems sound (although it's not tested), but I can't figure out how to read/write the data.
let raw = fs.readFileSync("../database/items.json");
let itemList = JSON.parse(raw);
let found = false;
for (let item of itemList.averages) {
  if (item.name === this.state.data.item_name) {
    found = true;
    item.count += 1;
  }
}
if (!found) {
  let newItem = {
    name: this.state.data.item_name,
    count: 1,
  }
  itemList.averages.push(newItem);
}
let newRaw = JSON.stringify(itemList);
fs.writeFileSync("../database/items.json", newRaw);
The JSON file:
{
  "averages": [
    {
      "name": "Example",
      "count": 1
    }
  ]
}
First of all, the browser itself doesn't have access to the filesystem, so you won't be able to achieve that from your React app alone. However, it can be achieved if you use Node.js (or any other framework) on the backend and create an API endpoint that writes to the filesystem for you.
Secondly, if you want to do things only on the frontend, without creating an extra API just for saving the data in a JSON file (which I think is not necessary in your case), you can use localStorage to save the data and let the user download it as a text file using this:
TextFile = () => {
  const element = document.createElement("a");
  // pass data from the localStorage API to the Blob
  const textFile = new Blob([JSON.stringify('pass data from localStorage')], { type: 'text/plain' });
  element.href = URL.createObjectURL(textFile);
  element.download = "userFile.txt";
  document.body.appendChild(element);
  element.click();
}
To use the localStorage API, see https://developer.mozilla.org/en-US/docs/Web/API/Window/localStorage
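As a rough sketch of that approach (the storage key is an assumption, and itemList is the variable from the question's code), saving and restoring the list would look like:
// save the current list under an assumed key
localStorage.setItem('items', JSON.stringify(itemList));

// restore it later, falling back to an empty structure
const restored = JSON.parse(localStorage.getItem('items') || '{"averages":[]}');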
Reading and writing a JSON file on local disk is quite simple with Node.js, which means a tiny piece of backend API in Express would get this job done.
Here are a few pieces of code that might help you. Assume your JSON structure is as below:
{
  "name": "arif",
  "surname": "shariati"
}
Read JSON file:
// import * as fs from 'fs';
const fs = require('fs')

fs.readFile('./myFile.json', 'utf8', (err, jsonString) => {
  if (err) {
    return;
  }
  try {
    const customer = JSON.parse(jsonString);
  } catch (err) {
    console.log('Error parsing JSON string:', err);
  }
})
customer contains your JSON, and values can be accessed like customer.name.
Write to JSON file
Let's say you have an update to your JSON object, such as below:
const updatedJSON = {
  "name": "arif updated",
  "surname": "shariati updated"
}
Now you can write to your file. If the file does not exist, it will be created. If it already exists, it will be overwritten.
fs.writeFile('./myFile.json', JSON.stringify(updatedJSON), (err) => {
  if (err) console.log('Error writing file:', err);
})
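Tying this back to the Express suggestion above, a minimal endpoint the React app could POST the updated list to might look like this sketch; the route, port, and file path are assumptions:
const express = require('express');
const fs = require('fs');

const app = express();
app.use(express.json()); // parse JSON request bodies

// the React app sends the updated item list here
app.post('/api/items', (req, res) => {
  fs.writeFile('./database/items.json', JSON.stringify(req.body, null, 2), (err) => {
    if (err) return res.status(500).send('Error writing file');
    res.send('Saved');
  });
});

app.listen(3001);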
Importing and reading from JSON can be done like this:
import data from './data/data.json';
then use .map() to iterate over the data.
For writing locally you can use a library like https://www.npmjs.com/package/write-json-file
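For example, a sketch using that library (assuming its promise-returning API; the file name and contents are placeholders):
const writeJsonFile = require('write-json-file');

(async () => {
  // writes the object to disk as formatted JSON
  await writeJsonFile('items.json', { averages: [] });
})();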

Using transform/duplex stream in NodeJS

I am gathering data from an external source (Bluetooth Low Energy) using NodeJS (the noble module). I am streaming it to a JSON array in an external file. Let's call it fromSource.json. It looks like this:
fromSource.json
[
  {
    "id": 1,
    "name": "foo",
    "value": 123
  },
  {
    "id": 2,
    "name": "foo2",
    "value": 456
  },
  {
    "id": 3,
    "name": "foo3",
    "value": 789
  }
]
On the other hand, I am going to process these objects in real time and store the new values in a CSV file. Let's call it toDestination.csv. It looks like this:
toDestination.csv
id,name,convertedValue
1,"foo",123000
2,"foo2",456000
3,"foo3",789000
Every second, I am going to receive a new value (a new JSON object) from the source, push it into a writable stream (the JSON array file), then read it into a readable stream, do a transformation, and write it again to its final destination, the CSV file.
My questions are: are NodeJS streams suited to handling JSON objects? Should I stringify them before using them? Should I use a Duplex or a Transform stream in my case?
Based on your question, I think Transform is all you need. With Duplex you'd need to implement both reading and writing, which is not necessary in your case.
The code would look like this:
const { Transform } = require('stream');
const fs = require('fs');

const intermediate = measurementStream.pipe(
  new Transform({ transform: initialTransforms })
);
intermediate.pipe(someArrayifyStreamToJSON);
intermediate.pipe(
  new Transform({ transform: someMoreTransforms })
).pipe(
  new Transform({ transform: stringifyToCSV })
).pipe(
  fs.createWriteStream('some_path.csv')
);
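For reference, one of those transform steps could be an object-mode Transform along these lines (a sketch only; the field names and the x1000 conversion are assumptions taken from the sample data):
const { Transform } = require('stream');

// accepts measurement objects and emits CSV lines
const toCsvLine = new Transform({
  writableObjectMode: true,   // read JS objects from the previous stage
  readableObjectMode: false,  // emit plain strings
  transform(record, _encoding, callback) {
    callback(null, `${record.id},"${record.name}",${record.value * 1000}\n`);
  }
});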
I would also recommend taking a look at the framework I've created and support, scramjet. It is meant to deal with cases like yours and would make your code much simpler:
const { DataStream } = require('scramjet');
const fs = require('fs');

// Pass your original stream here; this could also be an
// AsyncGenerator in node v10 and up.
DataStream.from(measurementStream)
  // you can map with any async or sync operation on the data
  .map(async item => {
    const newData = await doSomeTransforms();
    return newData;
  })
  // tee creates a stream at this point in the transforms
  .tee(stream => stream
    .toJSONArray()
    .pipe(fs.createWriteStream('your-intermediate.json'))
  )
  // then you can add some more transforms
  .map(someMapper)
  .filter(someFilter)
  .CSVStringify()
  .pipe(fs.createWriteStream('your-final.csv'));
If you choose the first path anyway, I'd recommend a couple of modules that would make your life easier: JSONStream and papaparse, both available on NPM.
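As a small sketch of the JSONStream route (the file name is an assumption; measurementStream is the object stream from the question):
const fs = require('fs');
const JSONStream = require('JSONStream');

// serialize an object stream into a JSON array on disk
measurementStream
  .pipe(JSONStream.stringify())   // wraps the items in "[", ",", "]"
  .pipe(fs.createWriteStream('fromSource.json'));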

Create Simple Node.js API from JSON files

I have a folder of JSON files that I'd like to use to create a simple API from.
Here's a simplified version of my folder structure:
/clients.json
/clients/1/client.json
/clients/2/client.json
...
my /clients.json file looks like this:
[
  {
    "id": 1,
    "name": "Jon Parker"
  },
  {
    "id": 2,
    "name": "Gareth Edwards"
  },
  ...
]
and my /clients/1/client.json file looks like this:
[
  {
    "date": "2014-09-12",
    "score": 40,
    ...
  },
  {
    "date": "2015-02-27",
    "score": 75,
    ...
  },
  {
    "date": "2015-05-10",
    "score": 75,
    ...
  },
  {
    "date": "2016-08-27",
    "score": 60,
    ...
  }
]
The id from clients.json corresponds to the folder in which the associated details are stored.
I have a lot of JSON files in the clients folder, and rather than loading them all individually on the client side, I wanted to create an API using Node.js that gives me more flexibility, i.e...
returning a list of client names and id's
/clients
returning the client details
/clients/:id/details
and most importantly returning all clients with their names and associated details
/clients/all/details
I did begin playing with json-server; however, it requires that your JSON be an object rather than an array, and I'm stuck with the format of this JSON unfortunately.
Appreciate any help!
Use the built-in file system (fs) module to get files from the file system.
Here's an example.
var fs = require('fs');

exports.getClientDetail = function (id, callback) {
  fs.readFile('./clients/' + id + '/client.json', 'utf8', function (err, data) {
    if (err) throw err;
    // readFile is asynchronous, so hand the parsed result to a callback
    callback(JSON.parse(data));
  });
};

exports.getAllClientsDetail = function () {
  // if the ids are sequential, you can check whether './clients/' + id + '/client.json'
  // exists for every id from 1 upwards. If it does, push its contents to an array of
  // objects; if for a certain id it does not exist, stop the scan and return the array.
};
You can require the json directly as an object in node, something like this:
app.get('/clients/:id/details', (req, resp) => {
  const id = req.params.id;
  const data = require(`./clients/${id}/client.json`); // or whatever path
  resp.send(data);
});
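Building on that, the /clients/all/details route from the question could be sketched like this (the paths, port, and Express setup are assumptions):
const express = require('express');
const path = require('path');

const app = express();
const clients = require('./clients.json'); // the list of { id, name } objects

// all clients with their names and associated details
// note: register this before a '/clients/:id/details' route,
// otherwise Express will treat "all" as an id
app.get('/clients/all/details', (req, resp) => {
  const withDetails = clients.map(client => ({
    ...client,
    details: require(path.join(__dirname, 'clients', String(client.id), 'client.json'))
  }));
  resp.send(withDetails);
});

app.listen(3000);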
You aren't nearly as stuck as you think.
You'll have to wrap your arrays in an object; then, on the frontend, you just access the array property.
After all, JSON is an acronym for JavaScript Object Notation.
EDIT: Okay, let's try something new...
Perhaps before using code from json-server, do a little preprocessing. Assuming the variable clientJson holds the file contents you have already read, insert this code before you use any code from json-server:
clientJson = '{"root":' + clientJson + '}';
That will wrap the file contents in an object whose first property is root.
After that, it's pretty easy to get your array back:
clientData = clientData.root;
You should use read streams from the fs module to send the data to the client, catch possible errors, and free memory after sending.
You could do it without any code at all if you uploaded your folder structure to a cloud service (e.g. Amazon S3 or Dropbox) and served the files from there.

Read json file content with require vs fs.readFile

Suppose that for every response from an API, I need to map a value from the response to an existing JSON file in my web application and display a value from that JSON. What is the better approach in this case for reading the JSON file: require or fs.readFile? Note that there might be thousands of requests coming in at the same time.
Note that I do not expect any changes to the file during runtime.
request(options, function(error, response, body) {
  // compare response identifier value with json file in node
  // if identifier value exist in the json file
  // return the corresponding value in json file instead
});
I suppose you'll JSON.parse the JSON file for the comparison; in that case, require is better because it parses the file right away and is synchronous:
var obj = require('./myjson'); // no need to add the .json extension
If you have thousands of requests using that file, require it once outside your request handler and that's it:
var myObj = require('./myjson');
request(options, function(error, response, body) {
// myObj is accessible here and is a nice JavaScript object
var value = myObj.someValue;
// compare response identifier value with json file in node
// if identifier value exist in the json file
// return the corresponding value in json file instead
});
There are two ways to read the file with fs:
Asynchronous version
require('fs').readFile('path/test.json', 'utf8', function (err, data) {
  if (err) {
    // error handling
    return;
  }
  var obj = JSON.parse(data);
});
Synchronous version
var json = JSON.parse(require('fs').readFileSync('path/test.json', 'utf8'));
To parse a JSON file with require, do this:
var json = require('path/test.json');
But, note that
require is synchronous and only reads the file once; subsequent calls return the result from the cache
If your file does not have a .json extension, require will not treat its contents as JSON.
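A small illustration of that caching behaviour (the file name is a placeholder):
// require parses the file once and caches the result
const first = require('./test.json');

// deleting the cache entry forces the next require to re-read the file
delete require.cache[require.resolve('./test.json')];
const second = require('./test.json');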
Since no one ever cared to write a benchmark, and I had a feeling that require works faster, I made one myself.
I compared fs.readFile (promisified version) vs require (without cache) vs fs.readFileSync.
You can see benchmark here and results here.
For 1000 iterations, it looks like this:
require: 835.308ms
readFileSync: 666.151ms
readFileAsync: 1178.361ms
So what should you use? The answer is not so simple.
Use require when you need to cache the object forever, and preferably use Object.freeze on it to avoid mutating it elsewhere in the application.
Use readFileSync in unit tests or during blocking application startup - it is fastest.
Use readFile or the promisified version while the application is running and you don't want to block the event loop.
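A sketch of that promisified variant using the built-in fs.promises API (the path is a placeholder):
const fs = require('fs').promises;

// read and parse without blocking the event loop
async function loadJson(path) {
  const raw = await fs.readFile(path, 'utf8');
  return JSON.parse(raw);
}

loadJson('./test.json').then(obj => console.log(obj));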
Use node-fixtures if you are dealing with JSON fixtures in your tests.
The project will look for a directory named fixtures, which must be a child of your test directory, and load all the fixtures (*.js or *.json files):
// test/fixtures/users.json
{
  "dearwish": {
    "name": "David",
    "gender": "male"
  },
  "innaro": {
    "name": "Inna",
    "gender": "female"
  }
}
// test/users.test.js
var fx = require('node-fixtures');
fx.users.dearwish.name; // => "David"
I only want to point out that require seems to keep the file in memory even after the variables should have been released. I had the following case:
for (const file of fs.readdirSync('dir/contains/jsons')) {
  // this variable should be released after each iteration,
  // but it is not, apparently because of "require",
  // which leads to a "heap out of memory" error
  const json = require('dir/contains/jsons/' + file);
}

for (const file of fs.readdirSync('dir/contains/jsons')) {
  // this one with "readFileSync" works fine
  const json = JSON.parse(fs.readFileSync('dir/contains/jsons/' + file));
}
The first loop with require can't read all the JSON files because of a "heap out of memory" error. The second loop with readFileSync works.
If your file is empty, require will break. It will throw an error:
SyntaxError ... Unexpected end of JSON input.
With readFileSync/readFile you can deal with this:
let routesJson = JSON.parse(fs.readFileSync('./routes.json', 'utf8') || '{}');
or:
let routesJson;
fs.readFile('./dueNfe_routes.json', 'utf8', (err, data) => {
  routesJson = JSON.parse(data || '{}');
});
// countryInfo.json
{
  "country": [
    "INDIA",
    "USA"
  ],
  "codes": [
    "IN",
    "US"
  ]
}

const { country, codes } = require('./countryInfo.json');
console.log(country[0]); // "INDIA"
console.log(codes[0]); // "IN"

How to call the REC Registry API and store the returned JSONs in some kind of database

I'd like to break this into smaller, tighter questions, but I don't know what I don't know well enough to do that yet. So hopefully I can get specific answers that help me do that.
The scope of the solution requires receiving and parsing a lot of records: 2013 had ~17 million certificate transactions, while I'm only interested in a very small subset on the order of 40,000 records.
In pseudo code:
iterate dates(thisDate)
  send message to API for thisDate
  receive JSONs as todaysRecords
  examine todaysRecords to look for whatever criteria match inside the structure
  append a subset of todaysRecords to recordsOut
save recordsOut to a SQL/CSV file.
There's a large database of Renewable Energy Certificates for use under the Australian Government RET Scheme called the REC Registry, and as well as the web interface linked to here, there is an API provided with a simple call format, as follows:
http://rec-registry.gov.au/rec-registry/app/api/public-register/certificate-actions?date=<user provided date> where:
The date part of the URL should be provided by the user
Date format should be YYYY-MM-DD (no angle brackets & 1 date limit)
A JSON is returned (with potentially 100,000s of records on each day).
The API documentation (13pp PDF) is here, but it mainly explains the elements of the returned structure, which is less relevant to my question. It includes two sample JSON responses.
While I know some JavaScript (mostly not in a web context), I'm not sure how to send this message within a script, and I figure I'd need to do it server side to be able to process (filter) the returned information and then save the records I'm interested in. I'll have no issue parsing the JSON (if I can use JS) and copying the objects I wish to save, but I'm not sure where to even start doing this. Do I need a LAMP setup to do this (or MAMP, since I'm on OS X), or is there a more lightweight JS way I can execute this? I've never known how to save a file from within web-browser JS; I thought it was banned for security reasons, but I guess there are ways and means.
If I can rewrite this question to be clearer and more effective in soliciting an answer, I'm happy for edits to the question as well.
I guess maybe I'm after some boilerplate code for calling a simple API like this, and the stack or application context in which I need to do it. I realise there are potentially several ways to execute this, but I'm looking for the most straightforward for someone with JS knowledge and not much PHP/Python experience (but willing to learn what it takes).
Easy right?
Ok, to point you in the right direction.
Requirements
If the language of choice is JavaScript, you'll need to install Node.js. No server whatsoever is needed.
The same is valid for PHP or Python or whatever: no Apache needed, just the language interpreter.
Running a script with node
Create a file.js somewhere. To run it, you'll just need to type node file.js in the console (in the directory the file lives in).
Getting the info from the REC web service
Here's an example of a GET request:
var https = require('https');
var fs = require('fs');

var options = {
  host: 'rec-registry.gov.au',
  port: 443,
  path: '/rec-registry/app/api/public-register/certificate-actions?date=2015-06-03'
};

var jsonstr = '';
var request = https.get(options, function (response) {
  process.stdout.write("downloading data...");
  response.on('data', function (chunk) {
    process.stdout.write(".");
    jsonstr += chunk;
  });
  response.on('end', function () {
    process.stdout.write("DONE!");
    console.log(' ');
    console.log('Writing to file...');
    fs.writeFile("data.json", jsonstr, function (err) {
      if (err) {
        return console.error('Error saving file');
      }
      console.log('The file was saved!');
    });
  });
});

request.on('error', function (e) {
  console.log('Error downloading file: ' + e.message);
});
Transforming a json string into an object/array
use JSON.parse
Parsing the data
examine todaysRecords to look for whatever criteria match inside the structure
Can't help you there, but should be relatively straightforward to look for the correct object properties.
NOTE: Basically, what you get from the request is a string. You then parse that string with
var foo = JSON.parse(jsonstr)
In this case foo is an object. The resulting "certificates" are actually inside the property result, which is an array:
var results = foo.result;
In this example the array contains about 1700 records and the structure of a certificate is something like this:
"actionType": "STC created",
"completedTime": "2015-06-02T21:51:26.955Z",
"certificateRanges": [{
"certificateType": "STC",
"registeredPersonNumber": 10894,
"accreditationCode": "PVD2259359",
"generationYear": 2015,
"generationState": "QLD",
"startSerialNumber": 1,
"endSerialNumber": 72,
"fuelSource": "S.G.U. - solar (deemed)",
"ownerAccount": "Solargain PV Pty Ltd",
"ownerAccountId": 25782,
"status": "Pending audit"
}]
So, to access, for instance, the "ownerAccount" of the first "certificateRanges" entry of the first certificate, you would do:
var results = JSON.parse(jsonstr).result;
var ownerAccount = results[0].certificateRanges[0].ownerAccount;
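For the "examine todaysRecords" step from the pseudocode, a filtering sketch could look like this (the fuelSource criterion is just an example taken from the sample record above):
// keep only certificates that have at least one range matching the criterion
var recordsOut = results.filter(function (cert) {
  return cert.certificateRanges.some(function (range) {
    return range.fuelSource === 'S.G.U. - solar (deemed)';
  });
});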
Creating a csv
The best way is to create an abstract structure (one that meets your needs) and convert it to a CSV.
There's a good npm library called json2csv that can help with that.
Example:
var fs = require('fs');
var json2csv = require('json2csv');

var fields = ['car', 'price', 'color']; // csv titles
var myCars = [
  {
    "car": "Audi",
    "price": 40000,
    "color": "blue"
  }, {
    "car": "BMW",
    "price": 35000,
    "color": "black"
  }, {
    "car": "Porsche",
    "price": 60000,
    "color": "green"
  }
];
json2csv({ data: myCars, fields: fields }, function (err, csv) {
  if (err) console.log(err);
  fs.writeFile('file.csv', csv, function (err) {
    if (err) throw err;
    console.log('file saved');
  });
});
If you wish to append instead of writing to a new file, you can use
fs.appendFile('file.csv', csv, function (err) { });
