I'm trying to write JavaScript code into a .js file with the Node.js fs module. I managed to write a JSON file, but I couldn't wrap my head around how to write JavaScript to it.
fs.writeFile("config.json", JSON.stringify({ name: 'adman', tag: 'batsman', age: 25 }), { encoding: 'utf8', flag: 'wx' }, function (err) {
  if (err) {
    return console.log(err);
  }
  console.log("The file was saved!");
});
I need to create a .js file with the following data
const cricketers = [
  {
    name: 'adman',
    tag: 'batsman',
    age: 25
  },
  // other objects
];
module.exports = cricketers;
Two things:
If all you want is to be able to do let someData = require('someFile.json');, Node.js already supports requiring JSON files and treats them as JS objects.
Otherwise I don't know of a library that will do exactly this for you, BUT...
You can do this yourself. The fs.writeFile function takes a string, so you just have to generate the string you want to write to the file.
let someData = [{name: 'adman', tag: 'batsman', age: 25}];
let jsonData = JSON.stringify(someData);
let codeStr = `const cricketers = ${jsonData}; module.exports = cricketers;`;
fs.writeFile("someFile.js", codeStr, { encoding: 'utf8', flag: 'wx' }, function (err) {
  if (err) {
    return console.log(err);
  }
  console.log("The file was saved!");
});
Obviously this only works for a very specific use case, but the point is it can be done with simple (or complicated...) string manipulation.
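One refinement worth knowing: JSON.stringify takes an indentation argument, so the generated file can be human-readable rather than one long line. A minimal sketch:

```javascript
const someData = [{ name: 'adman', tag: 'batsman', age: 25 }];

// The third argument (2) pretty-prints with two-space indentation.
const codeStr = `const cricketers = ${JSON.stringify(someData, null, 2)};\n\nmodule.exports = cricketers;\n`;
console.log(codeStr);
```

The keys come out quoted ("name" rather than name), which is still perfectly valid JavaScript.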
Use string templating:
const data = `const cricketers = ${JSON.stringify(yourArray)};
module.exports = cricketers;
`
where yourArray is an array of objects.
I'm currently working on a Node.js project. It takes data from a JSON file and then uses it to fetch weather data from an API; after that I want to save it to a DB. I already asked about it and that question helped me fix some problems, but now I have others. I'm sending the data to a constant, but I don't know why I'm getting an error in the JSON parse. I want to use the lat and lon from the JSON (I have about a hundred coordinate entries) and insert them into the const. Any help would be appreciated. This is the error I'm getting:
Successful connection
[]
undefined:1
^
SyntaxError: Unexpected token in JSON at position 0
at JSON.parse (<anonymous>)
here is my function that takes data from JSON and gets data from the API:
async function calcWeather() {
  fs.readFile("./json/data.json", "utf8", function (err, data) {
    if (err) throw err;
    data2 = JSON.parse(data);
    console.log(typeof data2);
    for (let item of data2) {
      let base = `https://api.openweathermap.org/data/2.5/weather?lat=${item.latjson}&lon=${item.lonjson}&appid=${api_key}&units=metric&lang=sp`;
      fetch(base)
        .then((responses) => {
          return responses.json();
        })
        .then((data) => {
          var myObject = {
            Id_Oficina: item.IdOficina,
            Humedad: data.main.humidity,
            Nubes: data.clouds.all,
            Sensacion: data.main.feels_like,
            Temperatura: data.main.temp,
            Descripcion: data.weather.description,
          };
          // validation and saving data to array
          if (myObject.Temperatura < 99) {
            lstValid.push(myObject);
          }
        });
    }
  });
  console.log(lstValid);
}
here is the JSON where I take the data:
[
  {
    "latjson": 1,
    "lonjson": 1,
    "IdOficina": "1"
  },
  {
    "latjson": 2,
    "lonjson": 2,
    "IdOficina": "2"
  }
]
I think the issue is in the parse, but I don't see what I'm doing wrong.
Since you are reading the file with fs.readFile, you get a string, not a JavaScript object. You need to parse it before you can manipulate the content:
const fs = require('fs')
let rawdata = fs.readFileSync('./data.json')
let data = JSON.parse(rawdata)
Personally, I think it's way easier to require it (no need to use fs):
const jsonData = require('./json/data.json')
async function calcWeather() {
  for (let item of jsonData) {
    // ...
  }
}
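A separate pitfall in the original code: the loop fires all the fetches without waiting for them, so anything logged right after the loop runs too early, and the API returns weather as an array (weather[0].description). One way to sequence it, sketched with a stubbed fetch so it runs offline (the stub and its response shape are assumptions standing in for the real API call):

```javascript
// Stub standing in for the real fetch(base) call -- an assumption so this runs offline.
async function fakeFetch(url) {
  return {
    json: async () => ({
      main: { humidity: 40, temp: 21, feels_like: 20 },
      clouds: { all: 10 },
      weather: [{ description: 'clear' }],
    }),
  };
}

const coords = [{ latjson: 1, lonjson: 1, IdOficina: '1' }];

async function calcWeather() {
  const lstValid = [];
  // Promise.all resolves only after every request has finished,
  // so lstValid is complete by the time we return it.
  await Promise.all(coords.map(async (item) => {
    const response = await fakeFetch(`https://example.invalid/?lat=${item.latjson}&lon=${item.lonjson}`);
    const data = await response.json();
    const myObject = {
      Id_Oficina: item.IdOficina,
      Humedad: data.main.humidity,
      Temperatura: data.main.temp,
      // The real API returns weather as an array, hence weather[0].
      Descripcion: data.weather[0].description,
    };
    if (myObject.Temperatura < 99) lstValid.push(myObject);
  }));
  return lstValid;
}

calcWeather().then((lst) => console.log(lst.length)); // prints 1
```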
What I'm trying to do
Requests come into my server to download a file containing data. The downloading part is in the front-end and works. I grab the data on my backend and then I want to write it into an existing template and return the data.
This is the handler for the request.
async handle(request: Request, response: Response) {
  try {
    const fileName = 'test.xlsx'
    const binary = objectsToTemplateWorkBook()
    response.setHeader(
      'Content-Disposition',
      'attachment; filename=' + fileName
    )
    response.setHeader('Content-Type', 'application/vnd.openxmlformats')
    response.end(binary, 'binary')
  } catch (error) {
    console.log(error)
    response.send(error)
  }
}
This is the function that is supposed to write the data into the template.
export const objectsToTemplateWorkBook = (): Promise<any> => {
  var XlsxTemplate = require('xlsx-template')
  var dataBlob
  // Load an XLSX file into memory
  const blob = fs.readFile(
    path.join(__dirname, 'test_template.xlsx'),
    function (err, data) {
      console.log(__dirname)
      // Create a template
      var template = new XlsxTemplate(data)
      // Replacements take place on first sheet
      var sheetNumber = 1
      // Set up some placeholder values matching the placeholders in the template
      var values = {
        people: [
          { name: 'John Smith', age: 20 },
          { name: 'Bob Johnson', age: 22 },
        ],
      }
      // Perform substitution
      template.substitute(sheetNumber, values)
      // Get binary data
      dataBlob = template.generate()
      // ...
    }
  )
  return dataBlob
}
The function seems to write the data to the template, because if I log the dataBlob inside the fs.readFile callback it shows me the file. However, the return dataBlob always returns undefined. I know this is due to the async nature, but honestly I have no idea how to fix it. So my question to you is: how can I get the dataBlob to my handler to send it as a response?
You can't get the return value out of a callback the way you're doing it here: callbacks run asynchronously, so the outer return executes before the inner code, and the callback's value is never accessible.
To solve this specific problem you can use the fs.readFileSync function, which executes synchronously and returns a value: the buffer you need to pass to the XlsxTemplate constructor. The code then becomes:
export const objectsToTemplateWorkBook = (): any => {
  var XlsxTemplate = require('xlsx-template')
  // Load the XLSX file into memory (blocks until the read completes)
  const data = fs.readFileSync(path.join(__dirname, 'test_template.xlsx'))
  // Create a template
  var template = new XlsxTemplate(data)
  // Replacements take place on first sheet
  var sheetNumber = 1
  // Set up some placeholder values matching the placeholders in the template
  var values = {
    people: [
      { name: 'John Smith', age: 20 },
      { name: 'Bob Johnson', age: 22 },
    ],
  }
  // Perform substitution
  template.substitute(sheetNumber, values)
  // Get binary data and return it (note the return type is now a plain
  // value, not Promise<any>, since nothing is asynchronous anymore)
  var dataBlob = template.generate()
  return dataBlob
}
With this you get access to the file buffer returned from the synchronous read and are able to perform the rest of your operations. Hope it helps :D
Basically, this shouldn't be a very difficult question, but I've tried for 2 or 3 hours and couldn't reach my goal, especially for such an "easy" task. I'm using Node.js and my goal is to load data from a JSON file into a variable, add some new data to it, and store the data in the same JSON file again.
My JSON file looks like this:
[
  {
    "name": "Max",
    "date": "1.1.2020"
  },
  {
    "name": "Henry",
    "date": "2.2.2020"
  }
]
Here's my code:
const fs = require('fs');
const filename = './jsonFile.json';
const data = loadJSON();
// console.log(data) keeps saying undefined (without using .toString in loadJSON)

function loadJSON() {
  JSON.parse(fs.readFileSync(filename).toString); // toString doesnt work
}

function saveJSON(data) {
  fs.writeFileSync(filename, JSON.stringify(data));
}

function adduser(username) {
  var today = "3.3.2020"; // doesnt matter
  let obj = {
    name: username,
    date: today
  }
  vipJson.push(obj);
  saveVIP(vipJson);
}
It doesn't seem to work. Could anyone help me fix my problem so I can work with .json files? Thanks a lot!
You need to specify the BufferEncoding option so the file is read as a string, something like this:
const fs = require("fs")
const data = fs.readFileSync("./myfile.json", { encoding: "utf8", flag: "r" })
If you are sure the files are JSON, you can also read the data using require:
const fs = require('fs');
const filename = './jsonFile.json';
const data = loadJSON();
function loadJSON() {
  return require(filename)
}

function saveJSON(data) {
  fs.writeFileSync(filename, JSON.stringify(data));
}

function adduser(username) {
  var today = "3.3.2020"; // doesnt matter
  let obj = {
    name: username,
    date: today
  }
  data.push(obj);
  saveJSON(data);
}
Try the above code snippet. Be aware that require caches the file, so re-loading it within the same process returns the cached object rather than re-reading the disk.
I have a filename.config.js file with contents like these:
module.exports = {
  foo: [
    {bar: "bar"},
  ],
};
Does the native Node.js or some lib for it have any tools that can add another item to foo without resorting to regular expressions?
Found a solution myself in some googled schematics:
const { cosmiconfigSync } = require('cosmiconfig');
export function getConfig(path?: string): MyConfig {
  const explorer = cosmiconfigSync('filename');
  if (path) {
    path = `${process.cwd()}/${path}`;
  }
  const configSearch = explorer.search(path);
  return configSearch ? configSearch.config : {};
}
Use a .json file for configurations. Never dynamically write to a .js file.
filename.config.json
{
  "foo": [
    {"bar": "bar"}
  ]
}
Then in another js file you can read the file:
const fs = require('fs');
const path = require('path');
const config = JSON.parse(
  fs.readFileSync(path.resolve(__dirname, './filename.config.json'), 'utf8')
);
console.log(config);
To edit the file, you can edit the object and write it back to the file:
config.biz = 'baz';
fs.writeFileSync(path.resolve(__dirname, './filename.config.json'), JSON.stringify(config));
JSON.parse and JSON.stringify can be used to convert JSON objects from a string to a real object and back. No regex required.
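A minimal round trip showing that pattern (in-memory here; the real code reads and writes the file as above):

```javascript
const original = { foo: [{ bar: 'bar' }] };

// stringify -> parse gives back a real object we can mutate freely.
const onDisk = JSON.stringify(original, null, 2);
const config = JSON.parse(onDisk);
config.foo.push({ bar: 'baz' });
const updated = JSON.stringify(config, null, 2);
console.log(updated);
```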
This question already has answers here:
How to filter object array based on attributes?
(21 answers)
Closed 8 years ago.
I have seen very old answers to this question, and many of the technologies used two years back have changed.
What I have is JSON files sent by a database over to my server, and what I would like to know is how to filter that data.
I am running a server with node.js, and what I would like to do is something like:
var results = QueryLibrary.load(jsondata);
var filtered = results.query('select where user = "user1"');
How can I do something like this in javascript running in node?
underscore has a where function that does just this
var _ = require("underscore");
var json = '[{"user": "a", "age": 20}, {"user": "b", "age": 30}, {"user": "c", "age": 40}]';
var users = JSON.parse(json);
var filtered = _.where(users, {user: "a"});
// => [{user: "a", age: 20}]
Another utility library, Lo-Dash, has a where function that operates identically.
You can add underscore to your project using
$ npm install --save underscore
or lodash
$ npm install --save lodash
If you only care about the where function, lodash offers it as a separate module
// only install lodash.where
$ npm install --save lodash.where
To use it in your project
var where = require("lodash.where");
// ...
var filtered = where(users, {"user": "a"});
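These days the same query needs no library at all: Array.prototype.filter covers it (note that _.where was removed in Lodash 4 in favor of _.filter):

```javascript
const json = '[{"user": "a", "age": 20}, {"user": "b", "age": 30}, {"user": "c", "age": 40}]';
const users = JSON.parse(json);

// Built-in equivalent of _.where(users, {user: "a"}).
const filtered = users.filter((u) => u.user === 'a');
console.log(filtered); // [ { user: 'a', age: 20 } ]
```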
Even if you use a library to do this, a better approach is probably to set up a chain of streams that handles all of your data processing in smaller modules.
Without knowing what you actually want to do, I've created this as an example. For the purposes of this code, maybe think of a debug logging stream or something.
json-parser.js
input: string (JSON)
output: object
var Transform = require("stream").Transform;
function JsonParser() {
  Transform.call(this, {objectMode: true});

  this._transform = function _transform(json, enc, done) {
    try {
      this.push(JSON.parse(json));
    }
    catch (e) {
      return done(e);
    }
    done();
  }
}

JsonParser.prototype = Object.create(Transform.prototype, {
  constructor: {
    value: JsonParser
  }
});
module.exports = JsonParser;
obj-filter.js
input: object
output: object (result of where(data, filters))
var Transform = require("stream").Transform;
var where = require("lodash.where");
function ObjFilter(filters) {
  Transform.call(this, {objectMode: true});

  this._transform = function _transform(obj, enc, done) {
    this.push(where(obj, filters));
    done();
  }
}

ObjFilter.prototype = Object.create(Transform.prototype, {
  constructor: {
    value: ObjFilter
  }
});
module.exports = ObjFilter;
stringifier.js
input: object
output: string (JSON)
var Transform = require("stream").Transform;
function Stringifier() {
  Transform.call(this, {objectMode: true});

  this._transform = function _transform(obj, enc, done) {
    this.push(JSON.stringify(obj));
    done();
  }
}

Stringifier.prototype = Object.create(Transform.prototype, {
  constructor: {
    value: Stringifier
  }
});
module.exports = Stringifier;
app.js
// modules
var JsonParser = require("json-parser");
var ObjFilter = require("obj-filter");
var Stringifier = require("stringifier");
// run
var parser = new JsonParser();
// setup stream chain
parser.pipe(new ObjFilter({"user": "a"}))
  .pipe(new Stringifier())
  .pipe(process.stdout);
// send example json in
parser.write('[{"user": "a", "age": 20}, {"user": "b", "age": 30}, {"user": "c", "age": 40}]');
// output
// => [{"user":"a","age":20}]
Here, I made a Stringifier stream that converts objects back into JSON so that we can see them dumped to the console, though you could easily create any streams your app requires. Your stream endpoints will likely not write to the console.
As a last note, you would probably create a database stream that accepts some sort of query options and emits json. You would pipe that stream directly into parser.
Anyway, I hope this gives you a better idea of how to process data in node.js.
You can use any of JavaScript's normal built-in array/object functions. Normally that kind of query would be made when retrieving your data from the database, not after.
Something like:
for (let i = 0; i < objIdsArray.length; i++) {
  for (let j = 0; j < mockJSON.length; j++) {
    if (mockJSON[j]["id"] === parseInt(objIdsArray[i], 10)) {
      mockJSON.splice(j, 1); // to delete it, could be any other instruction
      j--; // step back so the element after the removed one isn't skipped
    }
  }
}
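Splicing inside a forward loop is easy to get wrong (each removal shifts the next element into the current index). Building a new array with filter avoids the pitfall entirely; the names below mirror the snippet above and are illustrative:

```javascript
const objIdsArray = ['1', '3'];
const mockJSON = [{ id: 1 }, { id: 2 }, { id: 3 }];

// Keep only the entries whose id is NOT in objIdsArray -- one pass, no mutation.
const ids = objIdsArray.map(Number);
const remaining = mockJSON.filter((obj) => !ids.includes(obj.id));
console.log(remaining); // [ { id: 2 } ]
```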