How to format the received JSON to make it look beautiful? - javascript

I get an array of objects and display it on the screen, but the output is not formatted. How do I format it with indentation so it looks nice? Please tell me how to do this with plain JS or some kind of library.
const express = require('express');
const Database = require('./db');
const app = express();
const port = 3000;
const db = new Database();

app.use(express.json());

app.get('/gallery', (req, res) => {
    db.pictures().then(data => {
        const pictures = JSON.parse(data);
        res.send(pictures);
    });
});
How it looks on the screen (screenshot of the unformatted output omitted):

Try
res.send(JSON.stringify(pictures, null, 4)); // stringify with 4 spaces at each level
Either format it on the backend or on the client side using JSON.stringify.
JSON.stringify takes additional optional arguments.
I recommend formatting it on the client side, as you can then keep using the original JSON for rendering the page.
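For example, a minimal client-side sketch (the '/gallery' URL is carried over from the question; the 'output' element id is an assumption):

fetch('/gallery')
    .then(res => res.json())
    .then(pictures => {
        // Render an indented copy into a <pre id="output"> element
        document.getElementById('output').textContent =
            JSON.stringify(pictures, null, 4); // 4-space indentation
    });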

Human readable
Actually, I am not sure why you would display data in JSON format to the user, but it's your requirement, so I will roll with it.
There are useful answers and comments here that give you what you want: the JSON displayed well formatted on the screen.
This answer intends to add a bit of an edge if you want to display the JSON data in an actual human-readable form in an easy way. Use it if you find it necessary.
json.human.js
It takes JSON input and gives you structured, human-readable markup as output (example screenshot omitted).
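A minimal usage sketch, assuming the JsonHuman.format API shown in the project's README (the 'output' element id is a placeholder):

var data = { name: "json.human.js demo", tags: ["json", "html"] };
// format() builds a DOM node from the object
var node = JsonHuman.format(data);
document.getElementById('output').appendChild(node);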

You can use JSONLint to beautify your JSON:
https://jsonlint.com/

Related

Parse a string representation of javascript without storing a file

There is a scenario where I let a client insert a JavaScript script that is then sent to my server. Actually, this script (well, it should...) will export some object.
So the frontend code is something like this:
<form action="/blabla.js" method="post">
    <textarea name="script"></textarea>
</form>
Then the frontend sends the input of the <textarea /> to the server.
A typical input will be something like:
module.exports = {
    glagla: {
        blabla: 2
    },
};
The frontend will send this script, as a string, to a server.
The next step is that the server needs to parse this string.
So right now, for example with the express package, it looks something like this:
import fs from 'fs';

const handler = async (req, res) => {
    const input = req.body.script;
    await fs.promises.writeFile('./somewhere.js', input);
    const parsedObject = require('./somewhere.js');
};
I'm trying not to use the file system, but I cannot find a way to do so.
Is there a pure way, without using the file system, to parse such a script in JS?
I think this package https://www.npmjs.com/package/vm2 is the solution.
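A minimal sketch of how that could look, assuming vm2's NodeVM API (which runs code in a sandbox and returns its module.exports; the script string below is a placeholder):

const { NodeVM } = require('vm2');

const vm = new NodeVM(); // require() is disabled inside the sandbox by default
const script = 'module.exports = { glagla: { blabla: 2 } };';
const parsedObject = vm.run(script, 'userscript.js');
console.log(parsedObject.glagla.blabla); // 2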
Basically, you don't need to use module.exports for this, because you don't need to create a file and then include it in the code with require. You can pass just the function body:
// String from frontend, which must
// contain function body
const strFn = 'return { glagla: { blabla: 2 }}';
// Create function from string
const obj = new Function(strFn);
// Run created function and access
// property of returned object
console.log(obj().glagla.blabla);
But keep in mind that this is pretty risky, because the user can pass anything from the frontend to your backend to run.
If you only need to pass an object with its properties and values, it's better to use JSON for this purpose:
// Object on the frontend side
const frontendObj = { glagla: { blabla: 2 }};

// While on the frontend, convert the object
// into a JSON string
const strJSON = JSON.stringify(frontendObj);

// Then send this string to the server

// On the server side, parse the received JSON
// string into a JS object
const obj = JSON.parse(strJSON);

// Now you can access properties of
// the returned object
console.log(obj.glagla.blabla);

How to append an item to a JSON stored in a file?

I'm trying to write a JSON object to a DB file with fs, but it's not working as expected. What I want to do is write the JSON data inside the []. Example:
[
    { "name": "jhon" },
    { "name": "jhonathan" }
]
Here is the code:
Thanks.
The comment you provided doesn't make much sense, because the JSON data you provided in the post and in the comments is completely different. But I get the gist of it. I guess you have a JSON file containing an array and you want to push new items to it. Let's do this.
The thing is, when you call fs.appendFile, you're only writing to the end of the file. You're not following JSON format by doing so.
You need to:
1. Read the file content.
2. Parse the JSON text into an object.
3. Update the object in memory.
4. Write the object in JSON format back to the file system.
I'll call the synchronous methods for simplicity's sake, but you should be able to convert it to async quite easily.
const fs = require('fs');
const path = __dirname + '/db.json'; // note the leading slash

// Reading items from the file system
const jsonData = fs.readFileSync(path);
const items = JSON.parse(jsonData);

// Add the new item to the item list
items.push(newItem);

// Writing back to the file system
const newJsonString = JSON.stringify(items);
fs.writeFileSync(path, newJsonString);
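For completeness, a possible async version using the fs.promises API (newItem remains a placeholder for your own data):

const fs = require('fs');
const path = __dirname + '/db.json';

async function addItem(newItem) {
    // Read and parse the current array
    const items = JSON.parse(await fs.promises.readFile(path, 'utf8'));
    // Append the new item and write the whole array back as JSON
    items.push(newItem);
    await fs.promises.writeFile(path, JSON.stringify(items));
}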

from csv to json, unknown symbols nodejs

Really, I don't know what to do; please help me solve this problem. I have a CSV file, and I convert it to an array:
import csvToJson from 'convert-csv-to-json';
import fs from 'fs';

const URL_01 = `${__dirname}/URL/price.csv`;

// Unnecessary code removed for clarity...

app.get('/all', (req, res) => {
    let json = csvToJson.getJsonFromCsv(URL_01);
    res.send(json);
});
I got garbled symbols instead of the expected characters (screenshot omitted).
How do I get the correct data in UTF-8?
Try the csvtojson library instead of convert-csv-to-json.
You will see examples here:
https://www.npmjs.com/package/csvtojson
Try wrapping the output from csvToJson in JSON.parse(JSON.stringify());
app.get('/all', (req, res) => {
let json = csvToJson.getJsonFromCsv(URL_01);
res.send(JSON.parse(JSON.stringify(json)));
});
JSON.stringify() may handle the data that is causing the errors. But of course, you'll want to convert it back from the string using JSON.parse(). Although, it's difficult to say without knowing the problem characters.
The tactic should be to decode from the source encoding before parsing:
var fs = require('fs');
var iconv = require('iconv-lite');
var csv = require('csv');

fs.createReadStream('file-in-win1251.txt')
    .pipe(iconv.decodeStream('win1251')) // decode Windows-1251 bytes to UTF-8 text
    .pipe(csv.parse())
    .pipe(csv.stringify())
    .pipe(process.stdout);
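Applied to the route from the question, a possible sketch (this assumes the CSV was saved as Windows-1251; adjust the encoding to match your file):

const fs = require('fs');
const iconv = require('iconv-lite');

app.get('/all', (req, res) => {
    const raw = fs.readFileSync(URL_01);          // raw bytes, no encoding assumed
    const utf8Csv = iconv.decode(raw, 'win1251'); // re-decode into a UTF-8 string
    // Feed utf8Csv to whichever CSV-to-JSON converter you prefer, then send the result
    res.type('text/plain').send(utf8Csv);
});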

Transforming JSON in a node stream with a map or template

I'm relatively new to JavaScript and Node, and I like to learn by doing, but my lack of awareness of JavaScript design patterns makes me wary of trying to reinvent the wheel. I'd like to know from the community whether what I want to do already exists in some form or another. I'm not looking for specific code for the example below, just a nudge in the right direction and what I should be searching for.
I basically want to create my own private IFTTT/Zapier for plugging data from one API to another.
I'm using the node module request to GET data from one API and then POST to another.
request supports streaming to do neat things like this:
request.get('http://example.com/api')
    .pipe(request.put('http://example.com/api2'));
In between those two requests, I'd like to pipe the JSON through a transform, cherry picking the key/value pairs that I need and changing the keys to what the destination API is expecting.
request.get('http://example.com/api')
    .pipe(apiToApi2Map)
    .pipe(request.put('http://example.com/api2'));
Here's a JSON sample from the source API: http://pastebin.com/iKYTJCYk
And this is what I'd like to send forward: http://pastebin.com/133RhSJT
The transformed JSON in this case takes its keys from the value of each object's "attribute" key and its values from each object's "value" key.
So my questions:
1. Is there a framework, library or module that will make the transform step easier?
2. Is streaming the way I should be approaching this? It seems like an elegant way to do it, as I've created some JavaScript wrapper functions with request to easily access API methods; I just need to figure out the middle step.
3. Would it be possible to create "templates" or "maps" for these transforms? Say I want to change the source or destination API; it would be nice to create a new file that maps the source to destination key/values required.
Hope the community can help and I'm open to any and all suggestions! :)
This is an Open Source project I'm working on, so if anyone would like to get involved, just get in touch.
Yes, you're definitely on the right track. There are two stream libs I would point you towards: through, which makes it easier to define your own streams, and JSONStream, which helps convert a binary stream (like what you get from request.get) into a stream of parsed JSON documents. Here's an example using both of those to get you started:
var through = require('through');
var request = require('request');
var JSONStream = require('JSONStream');
var _ = require('underscore');

// Our function(doc) here will get called to handle each
// incoming document in the attributes array of the JSON stream
var transformer = through(function(doc) {
    var steps = _.findWhere(doc.items, {
        label: "Steps"
    });
    var activeMinutes = _.findWhere(doc.items, {
        label: "Active minutes"
    });
    var stepsGoal = _.findWhere(doc.items, {
        label: "Steps goal"
    });
    // Push the transformed document into the outgoing stream
    this.queue({
        steps: steps.value,
        activeMinutes: activeMinutes.value,
        stepsGoal: stepsGoal.value
    });
});
request
    .get('http://example.com/api')
    // The attributes.* here will split the JSON stream into chunks
    // where each chunk is an element of the array
    .pipe(JSONStream.parse('attributes.*'))
    .pipe(transformer)
    .pipe(request.put('http://example.com/api2'));
As Andrew pointed out, there's through or event-stream; however, I made something even easier to use: scramjet. It works the same way as through, but its API is nearly identical to Arrays, so you can use the map and filter methods easily.
The code for your example would be:
const { DataStream } = require('scramjet');

DataStream
    .pipeline(
        request.get('http://example.com/api'),
        JSONStream.parse('attributes.items.*')
    )
    .filter((item) => item.attribute) // filter out ones without an attribute
    .reduce((acc, item) => {
        acc[item.attribute] = item.value;
        return acc;
    }, {})
    .then((result) => request.put('http://example.com/api2', result));
I guess this is a little easier to use; however, in this example you do accumulate the data into an object, so if the JSON documents are actually much longer than this, you may want to turn the result back into a JSONStream again.

How to parse JSON using Node.js? [closed]

How should I parse JSON using Node.js? Is there some module which will validate and parse JSON securely?
You can simply use JSON.parse.
The definition of the JSON object is part of the ECMAScript 5 specification. Node.js is built on Google Chrome's V8 engine, which adheres to the ECMA standard. Therefore, Node.js also has a global JSON object.
Note: JSON.parse can tie up the current thread because it is a synchronous method. So if you are planning to parse big JSON objects, use a streaming JSON parser.
You can require .json files.
var parsedJSON = require('./file-name');
For example if you have a config.json file in the same directory as your source code file you would use:
var config = require('./config.json');
or (file extension can be omitted):
var config = require('./config');
Note that require is synchronous and only reads the file once; subsequent calls return the result from cache.
Also note that you should only use this for local files under your absolute control, as it potentially executes any code within the file.
You can use JSON.parse().
You should be able to use the JSON object in any ECMAScript 5 compatible JavaScript implementation. And V8, upon which Node.js is built, is one of them.
Note: If you're using a JSON file to store sensitive information (e.g. passwords), that's the wrong way to do it. See how Heroku does it: https://devcenter.heroku.com/articles/config-vars#setting-up-config-vars-for-a-deployed-application. Find out how your platform does it, and use process.env to retrieve the config vars from within the code.
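For instance, a minimal sketch (DB_PASSWORD is just an example variable name):

// Read a secret from the environment instead of a committed JSON file
const dbPassword = process.env.DB_PASSWORD;
if (!dbPassword) {
    throw new Error('DB_PASSWORD is not set');
}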
Parsing a string containing JSON data
var str = '{ "name": "John Doe", "age": 42 }';
var obj = JSON.parse(str);
Parsing a file containing JSON data
You'll have to do some file operations with fs module.
Asynchronous version
var fs = require('fs');
fs.readFile('/path/to/file.json', 'utf8', function (err, data) {
if (err) throw err; // we'll not consider error handling for now
var obj = JSON.parse(data);
});
Synchronous version
var fs = require('fs');
var json = JSON.parse(fs.readFileSync('/path/to/file.json', 'utf8'));
You wanna use require? Think again!
You can sometimes use require:
var obj = require('path/to/file.json');
But, I do not recommend this for several reasons:
require is synchronous. If you have a very big JSON file, it will choke your event loop. You really need to use JSON.parse with fs.readFile.
require will read the file only once. Subsequent calls to require for the same file will return a cached copy. Not a good idea if you want to read a .json file that is continuously updated. You could use a hack. But at this point, it's easier to simply use fs.
If your file does not have a .json extension, require will not treat the contents of the file as JSON.
Seriously! Use JSON.parse.
load-json-file module
If you are reading a large number of .json files (and if you are extremely lazy), it becomes annoying to write boilerplate code every time. You can save some characters by using the load-json-file module.
const loadJsonFile = require('load-json-file');
Asynchronous version
loadJsonFile('/path/to/file.json').then(json => {
// `json` contains the parsed object
});
Synchronous version
let obj = loadJsonFile.sync('/path/to/file.json');
Parsing JSON from streams
If the JSON content is streamed over the network, you need to use a streaming JSON parser. Otherwise it will tie up your processor and choke your event loop until JSON content is fully streamed.
There are plenty of packages available in NPM for this. Choose what's best for you.
Error Handling/Security
If you are unsure whether what is passed to JSON.parse() is valid JSON, make sure to enclose the call to JSON.parse() inside a try/catch block. A user-provided JSON string could crash your application and could even lead to security holes. Make sure error handling is done when you parse externally-provided JSON.
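A minimal sketch of that guard (untrustedString is a placeholder for your input):

let obj = null;
try {
    obj = JSON.parse(untrustedString);
} catch (err) {
    // Handle invalid JSON instead of letting the exception crash the app
    console.error('Invalid JSON:', err.message);
}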
Use the JSON object:
JSON.parse(str);
Another example of JSON.parse:
var fs = require('fs');
var file = __dirname + '/config.json';
fs.readFile(file, 'utf8', function (err, data) {
if (err) {
console.log('Error: ' + err);
return;
}
data = JSON.parse(data);
console.dir(data);
});
I'd like to mention that there are alternatives to the global JSON object.
JSON.parse and JSON.stringify are both synchronous, so if you want to deal with big objects you might want to check out some of the asynchronous JSON modules.
Have a look: https://github.com/joyent/node/wiki/Modules#wiki-parsers-json
Include Node's built-in fs module.
var fs = require("fs");
var file = JSON.parse(fs.readFileSync("./PATH/data.json", "utf8"));
For more info on the fs module, refer to the documentation at http://nodejs.org/api/fs.html
Since you don't know whether your string is actually valid, I would first put it into a try/catch. Also, since try/catch blocks are not optimized by Node, I would put the entire thing into another function:
function tryParseJson(str) {
    try {
        return JSON.parse(str);
    } catch (ex) {
        return null;
    }
}
Or, in "async style":
function tryParseJson(str, callback) {
    process.nextTick(function () {
        try {
            callback(null, JSON.parse(str));
        } catch (ex) {
            callback(ex);
        }
    });
}
Parsing a JSON stream? Use JSONStream.
var request = require('request'),
    JSONStream = require('JSONStream'),
    es = require('event-stream'); // provides mapSync below

request({url: 'http://isaacs.couchone.com/registry/_all_docs'})
    .pipe(JSONStream.parse('rows.*'))
    .pipe(es.mapSync(function (data) {
        return data;
    }));
https://github.com/dominictarr/JSONStream
Everybody here has talked about JSON.parse, so I thought of saying something else. There is a great module, Connect, with many middlewares to make development of apps easier and better. One of the middlewares is bodyParser. It parses JSON, HTML forms, etc. There is also a specific middleware for JSON parsing only, noop.
Take a look at the links above, it might be really helpful to you.
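For reference, Express itself now bundles JSON body parsing, so a minimal sketch of the same idea looks like this:

const express = require('express');
const app = express();

app.use(express.json()); // parse incoming JSON request bodies

app.post('/data', (req, res) => {
    // req.body is already a parsed object here
    res.json({ received: req.body });
});

app.listen(3000);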
JSON.parse("your string");
That's all.
As other answers here have mentioned, you probably want to either require a local JSON file that you know is safe and present, like a configuration file:
var objectFromRequire = require('path/to/my/config.json');
or to use the global JSON object to parse a string value into an object:
var stringContainingJson = '\"json that is obtained from somewhere\"';
var objectFromParse = JSON.parse(stringContainingJson);
Note that when you require a file, the content of that file is evaluated, which introduces a security risk in case it's not a JSON file but a JS file.
Here, I've published a demo where you can see both methods and play with them online (the parsing example is in the app.js file; click the run button and see the result in the terminal):
http://staging1.codefresh.io/labs/api/env/json-parse-example
You can modify the code and see the impact...
Using JSON for your configuration with Node.js? Read this and get your configuration skills over 9000...
Note: People claiming that data = require('./data.json'); is a security risk and downvoting people's answers with zealous zeal: you're exactly and completely wrong.
Try placing non-JSON in that file... Node will give you an error, exactly like it would if you did the same thing with the much slower and harder-to-code manual file read and subsequent JSON.parse(). Please stop spreading misinformation; you're hurting the world, not helping. Node was designed to allow this; it is not a security risk!
Proper applications come in 3+ layers of configuration:
Server/Container config
Application config
(optional) Tenant/Community/Organization config
User config
Most developers treat their server and app config as if it can change. It can't. You can layer changes from higher layers on top of each other, but you're modifying base requirements. Some things need to exist! Make your config act like it's immutable, because some of it basically is, just like your source code.
Failing to see that lots of your stuff isn't going to change after startup leads to anti-patterns like littering your config loading with try/catch blocks and pretending you can continue without a properly set up application. You can't. If you can, that belongs in the community/user config layer, not the server/app config layer. You're just doing it wrong. The optional stuff should be layered on top when the application finishes its bootstrap.
Stop banging your head against the wall: Your config should be ultra simple.
Take a look at how easy it is to setup something as complex as a protocol-agnostic and datasource-agnostic service framework using a simple json config file and simple app.js file...
container-config.json...
{
    "service": {
        "type": "http",
        "name": "login",
        "port": 8085
    },
    "data": {
        "type": "mysql",
        "host": "localhost",
        "user": "notRoot",
        "pass": "oober1337",
        "name": "connect"
    }
}
index.js... (the engine that powers everything)
var config = require('./container-config.json'); // Get our service configuration.
var data = require(config.data.type); // Load our data source plugin ('npm install mysql' for mysql).
var service = require(config.service.type); // Load our service plugin ('http' is built-in to node).
var processor = require('./app.js'); // Load our processor (the code you write).
var connection = data.createConnection({ host: config.data.host, user: config.data.user, password: config.data.pass, database: config.data.name });
var server = service.createServer(processor);
connection.connect();
server.listen(config.service.port, function() { console.log("%s service listening on port %s", config.service.type, config.service.port); });
app.js... (the code that powers your protocol-agnostic and data-source agnostic service)
module.exports = function(request, response){
    response.end('Responding to: ' + request.url);
};
Using this pattern, you can now load community and user config stuff on top of your booted app; dev ops is ready to shove your work into a container and scale it. You're ready for multi-tenant. Userland is isolated. You can now separate the concerns of which service protocol you're using and which database type you're using, and just focus on writing good code.
Because you're using layers, you can rely on a single source of truth for everything, at any time (the layered config object), and avoid error checks at every step, worrying about "oh crap, how am I going to make this work without proper config?!?".
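A hypothetical sketch of that layering (the tenant override object is made up, and real apps often use a deep-merge helper instead of spreads):

const baseConfig = require('./container-config.json'); // immutable base layer

// Optional layer loaded after bootstrap (placeholder values)
const tenantConfig = { service: { port: 9090 } };

// Shallow merge per section: higher layers win
const config = {
    service: { ...baseConfig.service, ...tenantConfig.service },
    data: { ...baseConfig.data }
};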
If you need to parse JSON with Node.js in a secure way (aka: the user can input data, or a public API) I would suggest using secure-json-parse.
The usage is like the default JSON.parse but it will protect your code from:
prototype poisoning
and constructor abuse:
const badJson = '{ "a": 5, "b": 6, "__proto__": { "x": 7 }, "constructor": {"prototype": {"bar": "baz"} } }'
const infected = JSON.parse(badJson)
console.log(infected.x) // print undefined
const x = Object.assign({}, infected)
console.log(x.x) // print 7
const sjson = require('secure-json-parse')
console.log(sjson.parse(badJson)) // it will throw by default, you can ignore malicious data also
I just want to complete the answers (as I struggled with this for a while) by showing how to access the parsed JSON information; this example shows accessing a JSON array:
var request = require('request');

request('https://server/run?oper=get_groups_joined_by_user_id&user_id=5111298845048832', function (error, response, body) {
    if (!error && response.statusCode == 200) {
        var jsonArr = JSON.parse(body);
        console.log(jsonArr);
        console.log("group id:" + jsonArr[0].id);
    }
});
Just to make this as complicated as possible, and bring in as many packages as possible...
const fs = require('fs');
const bluebird = require('bluebird');
const _ = require('lodash');
const readTextFile = _.partial(bluebird.promisify(fs.readFile), _, {encoding:'utf8',flag:'r'});
const readJsonFile = filename => readTextFile(filename).then(JSON.parse);
This lets you do:
var dataPromise = readJsonFile("foo.json");
dataPromise.then(console.log);
Or if you're using async/await:
let data = await readJsonFile("foo.json");
The advantage over just using readFileSync is that your Node server can process other requests while the file is being read off disk.
JSON.parse does not ensure the safety of the JSON string you are parsing. You should look at a library like json-safe-parse or a similar library.
From json-safe-parse npm page:
JSON.parse is great, but it has one serious flaw in the context of JavaScript: it allows you to override inherited properties. This can become an issue if you are parsing JSON from an untrusted source (eg: a user), and calling functions on it you would expect to exist.
Leverage Lodash's attempt function to return an error object, which you can handle with the isError function.
// Returns an error object on failure
function parseJSON(jsonString) {
    return _.attempt(JSON.parse.bind(null, jsonString));
}

// Example usage
var goodJson = '{"id":123}';
var badJson = '{id:123}';

var goodResult = parseJSON(goodJson);
var badResult = parseJSON(badJson);

if (_.isError(goodResult)) {
    console.log('goodResult: handle error');
} else {
    console.log('goodResult: continue processing');
}
// > goodResult: continue processing

if (_.isError(badResult)) {
    console.log('badResult: handle error');
} else {
    console.log('badResult: continue processing');
}
// > badResult: handle error
Always be sure to wrap JSON.parse in a try/catch block, as Node will throw an unexpected error if you have corrupted data in your JSON. So use this instead of a plain JSON.parse:
try {
    JSON.parse(data);
} catch (e) {
    throw new Error("data is corrupted");
}
As mentioned in the above answers, we can use JSON.parse() to parse a string into JSON.
But before parsing, make sure the data is valid, or else it might bring your whole application down.
It is safer to use it like this:
let parsedObj = {};
try {
    parsedObj = JSON.parse(data);
} catch (e) {
    console.log("Cannot parse, because the data is not in proper JSON format");
}
Use JSON.parse(str);.
Here are some examples:
var jsonStr = '{"result":true, "count":42}';
obj = JSON.parse(jsonStr);
console.log(obj.count); // expected output: 42
console.log(obj.result); // expected output: true
If you want to allow some comments in your JSON as well as trailing commas, you might want to use the implementation below:
var fs = require('fs');

var data = parseJsData('./message.json');
console.log('[INFO] data:', data);

function parseJsData(filename) {
    var json = fs.readFileSync(filename, 'utf8')
        .replace(/\s*\/\/.+/g, '')  // strip // comments
        .replace(/,(\s*\})/g, '}'); // strip trailing commas before }
    return JSON.parse(json);
}
Note that it might not work well if you have something like "abc": "foo // bar" in your JSON. So YMMV.
If the JSON source file is pretty big, you may want to consider the asynchronous route via the native async/await approach available since Node.js 8.0, as follows:
const fs = require('fs');

const fsReadFile = (fileName) => {
    fileName = `${__dirname}/${fileName}`;
    return new Promise((resolve, reject) => {
        fs.readFile(fileName, 'utf8', (error, data) => {
            if (!error && data) {
                resolve(data);
            } else {
                reject(error);
            }
        });
    });
};

async function parseJSON(fileName) {
    try {
        return JSON.parse(await fsReadFile(fileName));
    } catch (err) {
        return { Error: `Something has gone wrong: ${err}` };
    }
}

parseJSON('veryBigFile.json')
    .then(res => console.log(res))
    .catch(err => console.log(err));
I use fs-extra. I like it a lot because, although it supports callbacks, it also supports Promises. So it enables me to write my code in a much more readable way:
const fs = require('fs-extra');

fs.readJson("path/to/foo.json")
    .then(obj => {
        // Do some stuff with obj
    })
    .catch(err => {
        console.error(err);
    });
It also has many useful methods which do not come along with the standard fs module and, on top of that, it also bridges the methods from the native fs module and promisifies them.
NOTE: You can still use the native Node.js methods. They are promisified and copied over to fs-extra. See notes on fs.read() & fs.write()
So it's basically all advantages. I hope others find this useful.
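For example, a small sketch of the promisified native methods (behavior as described in the fs-extra README):

const fs = require('fs-extra');

// The native fs.readFile returns a Promise here when no callback is given
fs.readFile('path/to/foo.json', 'utf8')
    .then(raw => JSON.parse(raw))
    .then(obj => console.log(obj))
    .catch(err => console.error(err));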
You can use JSON.parse() (a built-in function that will probably force you to wrap it in try/catch statements).
Or use some JSON-parsing npm library, something like json-parse-or.
Use this to be on the safe side
var data = JSON.parse(Buffer.concat(arr).toString());
Node.js is a JavaScript-based runtime, so you can do it the same way you would in pure JavaScript...
Imagine you have this JSON in Node.js...
var details = '{ "name": "Alireza Dezfoolian", "netWorth": "$0" }';
var obj = JSON.parse(details);
And you can do the above to get a parsed version of your JSON...
No further modules need to be required.
Just use
var parsedObj = JSON.parse(yourJsonString);
I don't think there are any security issues with this.
It's simple: you can convert JSON to a string using JSON.stringify(json_obj), and convert a string to JSON using JSON.parse("your json string").
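A quick round-trip example:

const obj = { a: 1, b: [2, 3] };
const str = JSON.stringify(obj); // '{"a":1,"b":[2,3]}'
const back = JSON.parse(str);    // { a: 1, b: [ 2, 3 ] }
console.log(back.b[1]);          // 3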
