How to read multiple objects from a text file using Node? - javascript

I have a text file which has multiple objects written to it. I need to read all the objects in the text file as JSON. What should I do?
Data in my file:
{"events":[...] },{"events":[....]},{},{}....
I tried to read it like this:
fs.readFile('gcyoi6.txt', function (err, data) {
  if (err) throw err;
  data = data.toString();
  console.log(data);
});
This gives the data as a string, but I need it as JSON objects.
Thanks in advance!

You can pass the data you get from the file to the JSON.parse function, which will convert the string read from the file into a JSON representation of the content of your .txt file.
fs.readFile('gcyoi6.txt', function (err, data) {
  if (err) throw err;
  data = JSON.parse(data);
  console.log(data);
});
This is valid text which can be converted to JSON:
const validJSONString = JSON.parse(`[{"event":"name"},{"event": "test"}]`);
console.log(validJSONString);
This is bad JSON:
const invalidJSONString = JSON.parse(`[{"event":"name"},{'event': 'test'}]`); // Throws an error
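Note that the sample in the question ({"events":[...]},{"events":[...]},...) is a comma-separated list of objects with no enclosing brackets, which JSON.parse will reject on its own. A minimal sketch, assuming the objects are separated only by commas and there is no trailing comma, is to wrap the file contents in brackets before parsing:
const fs = require('fs');

fs.readFile('gcyoi6.txt', 'utf8', function (err, data) {
  if (err) throw err;
  // Wrapping in [ ... ] turns the comma-separated objects into a JSON array.
  const objects = JSON.parse('[' + data + ']');
  console.log(objects.length);    // number of objects in the file
  console.log(objects[0].events); // the "events" array of the first object
});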

Related

Error creating json file in node js

I have followed many of the solutions provided in previous questions, but mine is not working. The problem is with the .json extension. Whenever I use filename.json, the app crashes with ERR_CONNECTION_RESET but successfully creates an empty .json file. However, if I change the extension to filename.txt, fs.writeFile successfully creates filename.txt with the data inside and the app works as expected. Did I miss any configuration needed to create the JSON file?
Here is the example code I used.
var jsonData = '{"persons":[{"name":"John","city":"New York"},{"name":"Phil","city":"Ohio"}]}';

// parse json
var jsonObj = JSON.parse(jsonData);
console.log(jsonObj);

// stringify JSON Object
var jsonContent = JSON.stringify(jsonObj);
console.log(jsonContent);

fs.writeFile("./public/output.json", jsonContent, 'utf8', function(err) {
  if (err) {
    console.log("An error occurred while writing JSON Object to File.");
    return console.log(err);
  }
  console.log("JSON file has been saved.");
});
So, ERR_CONNECTION_RESET means that the connection was closed midway. My guess, as in the comments, is that a reloading dev server restarts as soon as the new .json file appears in a watched directory, dropping the in-flight request.
Try using --ignore public/**/*.json and it should work.
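If the reloading server is nodemon (an assumption; the question doesn't name the tool, and app.js below stands for whatever the entry script is), the ignore rule can go on the command line:
nodemon --ignore 'public/**/*.json' app.js
or in a nodemon.json file next to the app:
{
  "ignore": ["public/**/*.json"]
}
Either way nodemon stops restarting when output.json is written, so the response is no longer cut off.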

D3.csv not loading the csv file

For some reason, I am not able to load any .csv file with the d3.csv(...) function. What I do is load a csv file using that function and print the data directly to the console.
This is what I do:
I create a file called irisData.csv using Notepad. It includes some data from the iris dataset at https://archive.ics.uci.edu/ml/datasets/Iris
sepal_length,sepal_width,petal_length,petal_width,species
5.1,3.5,1.4,0.2,Iris-setosa
4.9,3,1.4,0.2,Iris-setosa
4.7,3.2,1.3,0.2,Iris-setosa
4.6,3.1,1.5,0.2,Iris-setosa
5,3.6,1.4,0.2,Iris-setosa
5.4,3.9,1.7,0.4,Iris-setosa
I write this code in my irisTest.html to print the data to the console and check that it works correctly.
...
d3.csv("irisData.csv", type, function(error, data){
console.log(data);
});
...
Not sure if this is relevant, but I will put it up anyway: in order to run my HTML, I use Node.js to run a server. This is the code for the server, server.html:
var http = require('http');
var fs = require('fs');
http.createServer(function(req, res){
  fs.readFile('./irisTest.html', function(err, data){
    if (err) {
      res.writeHead(500, {'Content-type': 'text/plain'});
      res.end('500 - Internal Error');
    } else {
      res.writeHead(200, {'Content-type': 'text/html'});
      res.end(data);
    }
  });
}).listen(3000);
So what I would expect is that the console prints out an object consisting of the data from the csv file. Something like this:
[
{data...}
{data...}
{data...}
...
]
However, what I get is the code of my irisTest.html (the HTML code itself) wrapped into objects. I realize that it doesn't matter what I put instead of "irisData.csv" as the path in d3.csv("irisData.csv", ...); it always outputs my own code, such as below. So I thought it might be a problem with the path to the csv file, but there shouldn't be: all files are in the same folder.
[
...
{<!DOCTYPE html>: "d3.csv("irisData.csv", type, function(error, data){"}
{<!DOCTYPE html>: "console.log(data);"}
{<!DOCTYPE html>: "});}"}
...
]
Does anyone know what is going on?
As specified in the documentation here, an anonymous function is expected where type is passed. I quote the example from the docs:
d3.csv("example.csv", function(d) {
return {
year: new Date(+d.Year, 0, 1), // convert "Year" column to Date
make: d.Make,
model: d.Model,
length: +d.Length // convert "Length" column to number
};
}, function(error, rows) {
console.log(rows);
});
So, in your case, reading the csv file should be done this way:
d3.csv("irisData.csv",function(data){
console.log(data);
},function(error, rows){
console.log(rows);
});
Here is a working example in a gist; check the console to see the data object.
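As a side note on the question's server: it answers every request, including the browser's request for irisData.csv, with the contents of irisTest.html, which is why the parsed rows end up containing the page's own markup. A minimal sketch that also serves the CSV (assuming both files sit next to the server script; the routing here is only an illustration) could look like this:
var http = require('http');
var fs = require('fs');
var path = require('path');

http.createServer(function(req, res){
  // Serve the CSV when it is requested; everything else gets the HTML page.
  var file = req.url === '/irisData.csv' ? 'irisData.csv' : 'irisTest.html';
  var type = file === 'irisData.csv' ? 'text/csv' : 'text/html';
  fs.readFile(path.join(__dirname, file), function(err, data){
    if (err) {
      res.writeHead(500, {'Content-type': 'text/plain'});
      res.end('500 - Internal Error');
    } else {
      res.writeHead(200, {'Content-type': type});
      res.end(data);
    }
  });
}).listen(3000);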

MVC with Node.js, Express and Couchbase db

I started a Node.js application to implement Model View Controller.
The logic behind the application is relatively simple:
1) Get a JSON dump using request.get.
2) Put the JSON dump into the Couchbase db.
3) Get only the "id" value from the JSON.
4) Create an HTML table in the view to show how often I got each "id".
From what I read online, the model is where my data will be, the view will have a Jade template to dynamically generate the HTML table (I am using express.js), and the controller is supposed to react to events, making sure data is received and the view is updated.
My question is: which operations belong where? In my case, what are the Model, the View and the Controller supposed to do? Right now I have an app.js file, view.jade, and model.js. Is this correct?
Also, after I get the JSON dump, I connect to Couchbase but I am unable to set the document, since it does not find the key (the dump is valid JSON), using this code:
request.get(url, function(error, response, body){
  console.log(body);
  try {
    JSON.parse(body);
  } catch (e) {
    return -1;
  }
});

var couchbase = require('couchbase');
var db = new couchbase.Connection({host: "localhost:8091", bucket: "default"}, function(err) {
  if (err) throw err;
  db.set({"id"="176"}, body, function(err, result) {
    if (err) throw err;
  });
});
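For reference, with the 1.x couchnode API used above, set(key, value, callback) expects the key to be a plain string, and body only exists inside the request.get callback. A minimal sketch under those assumptions (url and the key "176" are taken from the snippet; the rest is illustrative, not a verified Couchbase example) might look like:
var request = require('request');
var couchbase = require('couchbase');

var db = new couchbase.Connection({host: "localhost:8091", bucket: "default"}, function(err) {
  if (err) throw err;
  request.get(url, function(error, response, body){
    if (error) throw error;
    var dump;
    try {
      dump = JSON.parse(body); // make sure the dump really is valid JSON
    } catch (e) {
      return console.error('Invalid JSON in response body', e);
    }
    // The key is a string, not an object literal, and the value is the parsed dump.
    db.set('176', dump, function(err, result) {
      if (err) throw err;
      console.log('Stored document under key 176', result);
    });
  });
});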

How do I run a Frisby.js test inside a function

I can't figure out why this Frisby test won't run!
Basically I'm trying to import JSON from a file and check it against a response from a request. The test runner doesn't seem to find any tests when I run this file.
Could anyone suggest a better way to do this? I'm thinking about trying a different way to handle the file reading. I know about readFileSync(), but I do not want to use that if I don't have to! Any help would be appreciated.
function readContent(callback, url, file) {
  fs.readFile(file, 'UTF8', function (err, content) {
    if (err) return callback(err);
    data = JSON.parse(content);
    callback(null, data);
  });
}

readContent(function (err, content) {
  frisby.create('Testing API')
    .get(url)
    .expectStatus(200)
    .expectBodyContains(content)
    .toss();
}, url, file)
Here's one I prepared earlier:
// Read a JSON file from disk, and compare its contents with what comes back from the API.
fs.readFile(path.resolve(__dirname, 'GET_ReferenceTypes.json'), 'utf-8', function(error, data){
  if (error) throw error;
  frisby.create('GET ReferenceTypes, inside readFile callback')
    .get(URL + 'ReferenceTypes?requestPersonId=2967&id=99')
    .expectStatus(200)
    // JSON.parse() is required to convert the string into a proper JSON object for comparison.
    // The .replace() strips the BOM character from the beginning of the unicode file.
    .expectJSON(JSON.parse(data.replace(/^\uFEFF/, '')))
    .toss();
});
Double check the encoding on your JSON file, because this whole thing comes apart without the .replace() call.
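Since the question mentions readFileSync(), the synchronous variant works here too: the Frisby test is then created while the spec file is still being loaded, so the runner can find it. A short sketch assuming the same file and URL as above:
var content = fs.readFileSync(path.resolve(__dirname, 'GET_ReferenceTypes.json'), 'utf-8');

frisby.create('GET ReferenceTypes, using readFileSync')
  .get(URL + 'ReferenceTypes?requestPersonId=2967&id=99')
  .expectStatus(200)
  // Strip the BOM and parse, as in the asynchronous version above.
  .expectJSON(JSON.parse(content.replace(/^\uFEFF/, '')))
  .toss();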

Node xml2js is returning 'undefined' when using parseString()

I'm using this package elsewhere and it's working just fine; however, in one particular example with one XML file I'm getting "undefined" errors.
Example:
fs.readFile('./XML/theXMLfile13mb.xml', 'ascii', function(err, data){
  if (err) {
    console.log("Could not open file " + err);
    process.exit(1);
  }
  parseString(data, function (err, result) {
    console.log(result); // Returns undefined
    var json1 = JSON.stringify(result); // Gives an error
    var json = JSON.parse(json1);
  });
});
The xml2js docs don't really mention how this might be possible or what it might mean. I've tried using other XML files and they work fine. This particular XML file is no bigger than the others, nor does it appear to be any less intact (it opens fine in my browser and all the data is presented as expected).
Any ideas on how I could troubleshoot this?
You need to convert the data from a Buffer to a String. Use this:
parseString(data.toString(), function (err, result) {
Instead of:
parseString(data, function (err, result) {
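Putting the suggested change in context, a sketch of the whole corrected callback (same file path as in the question; the encoding argument is dropped so data comes back as a Buffer, matching the advice above) might look like:
var fs = require('fs');
var parseString = require('xml2js').parseString;

fs.readFile('./XML/theXMLfile13mb.xml', function(err, data){
  if (err) {
    console.log("Could not open file " + err);
    process.exit(1);
  }
  // data is a Buffer here, so convert it to a string before parsing.
  parseString(data.toString(), function (err, result) {
    if (err) return console.error(err);
    console.log(result);               // the parsed XML as a plain JS object
    var json = JSON.stringify(result); // serialize if a JSON string is needed
    console.log(json.length);
  });
});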
