Error creating JSON file in Node.js

I have followed many solutions provided in previous questions but mine is not working. The problem is with the .json extension. Whenever I use filename.json, the app crashes with ERR_CONNECTION_RESET, although an empty .json file is created successfully. However, if I change the extension to filename.txt, fs.writeFile successfully creates filename.txt with the data inside and the app works as expected. Did I miss any configuration needed to create the JSON file?
Here is the example code I used.
var fs = require('fs');

var jsonData = '{"persons":[{"name":"John","city":"New York"},{"name":"Phil","city":"Ohio"}]}';

// parse the JSON string into an object
var jsonObj = JSON.parse(jsonData);
console.log(jsonObj);

// stringify the JSON object back into a string
var jsonContent = JSON.stringify(jsonObj);
console.log(jsonContent);

fs.writeFile("./public/output.json", jsonContent, 'utf8', function (err) {
    if (err) {
        console.log("An error occurred while writing JSON Object to File.");
        return console.log(err);
    }
    console.log("JSON file has been saved.");
});

So, ERR_CONNECTION_RESET means that the connection was closed midway. My guess, as in the comments, is that a file-watching reload server (nodemon, for example) restarts the app as soon as the new .json file appears in the watched public folder, which kills the in-flight request.
Try using --ignore public/**/*.json and it should work.
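If the watcher is nodemon, a minimal sketch of that fix (assuming an entry point named app.js, which is just a placeholder) is either the command-line flag:

nodemon --ignore 'public/**/*.json' app.js

or an "ignore" entry in a nodemon.json next to package.json:

{
    "ignore": ["public/**/*.json"]
}

With this in place the watcher no longer restarts when output.json is written, so the HTTP response can complete normally.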

Related

On submitting this POST request, the main code is not executed and it always reaches the catch block. Why?

I'm not handling the async operation properly. I'm trying to write the values received on the app.post('/receiver') route to an Excel file using the exceljs npm package. When I run the exceljs code on its own it works perfectly, but when I call it from inside app.post it does not work.
I want to read and write an Excel file when my client sends a POST request to the /receiver route, but the code below always reaches the catch block, even after I tried removing parts of it:
app.post('/receiver', function (req, res) {
    var data = [req.body.deviceid, req.body.purposeofvisit, req.body.componentsused,
                req.body.rectification, req.body.hoursspent, req.body.completed, req.body.reason];
    console.log(data);
    workBook.xlsx.readFile('/routes/book.xlsx').then(function () {
        var worksheet = workBook.getWorksheet(1);
        var row = worksheet.getRow(7);
        row.getCell(8).value = "AF22e-3852";
        row.commit();
        workBook.xlsx.writeFile('/routes/book.xlsx');
        res.sendFile(path.join(__dirname + '/in.html'));
    }).catch(function () {
        console.log('Catch reached');
    });
});
My current output: (screenshot, not reproduced here)
Required output:
Client request: /receiver (posts the data)
Server on reception: writes those values to an Excel file (I'm using the exceljs package)
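One likely culprit, for what it's worth: '/routes/book.xlsx' is an absolute path starting at the filesystem root, and the catch handler discards the rejection it receives, so the real error is never seen. Below is a minimal sketch of the handler with the error logged and the workbook path resolved relative to the script; it assumes an Express app object and reuses the file and route names from the question, which are otherwise placeholders.

var path = require('path');
var ExcelJS = require('exceljs');

var workBook = new ExcelJS.Workbook();
// resolve the workbook relative to this script instead of the filesystem root
var filePath = path.join(__dirname, 'routes', 'book.xlsx');

app.post('/receiver', function (req, res) {
    workBook.xlsx.readFile(filePath)
        .then(function () {
            var worksheet = workBook.getWorksheet(1);
            var row = worksheet.getRow(7);
            row.getCell(8).value = "AF22e-3852";
            row.commit();
            // writeFile also returns a promise; wait for it before responding
            return workBook.xlsx.writeFile(filePath);
        })
        .then(function () {
            res.sendFile(path.join(__dirname, 'in.html'));
        })
        .catch(function (err) {
            // log the actual rejection instead of swallowing it
            console.log('Catch reached:', err);
            res.status(500).send('Could not update the workbook');
        });
});

Logging the rejection reason usually shows immediately whether the problem is the path, a missing file, or something else in the promise chain.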

Problems Downloading files using Dropbox JavaScript SDK

I need to figure out where my files are downloading when I use the filesDownload(). I don't see an argument for file destination. Here's my code:
require('isomorphic-fetch');
var Dropbox = require('dropbox').Dropbox;
var dbx = new Dropbox({ accessToken: 'accessToken', fetch });

dbx.filesDownload({ path: 'filepath' })
    .then(function (response) {
        console.log(response);
    })
    .catch(function (error) {
        console.log(error);
    });
I'm getting a successful callback when I run the code but I don't see the file anywhere.
I need to know where my files are downloading to and how to specify the file destination in my function.
Thanks,
Gerald
I've used the function as described in the SDK's documentation (http://dropbox.github.io/dropbox-sdk-js/Dropbox.html#filesDownload__anchor) but I have no idea where my file goes.
Expected result: files are downloaded from Dropbox at the path that I have designated.
Actual result: I get a successful callback from Dropbox but I cannot find the downloaded files.
In Node.js, the Dropbox API v2 JavaScript SDK download-style methods return the file data in the fileBinary property of the object they pass to the callback (which is response in your code).
You can find an example of that here:
https://github.com/dropbox/dropbox-sdk-js/blob/master/examples/javascript/node/download.js#L20
So, you should be able to access the data as response.fileBinary. It doesn't automatically save it to the local filesystem for you, but you can then do so if you want.
You need to use the fs module to save the binary data to a file.
var fs = require('fs');

dbx.filesDownload({ path: YourfilePath })
    .then(function (response) {
        console.log(response.media_info);
        fs.writeFile(response.name, response.fileBinary, 'binary', function (err) {
            if (err) { throw err; }
            console.log('File: ' + response.name + ' saved.');
        });
    })
    .catch(function (error) {
        console.error(error);
    });

NeDB not loading or storing to file

I cannot get the simplest example of NeDB to run properly. My code only works in-memory, persistence to file keeps failing without any error messages.
The error callbacks for loadDatabase and insert always receive null as the error, so there is no information there. Oddly, it seems no one else has this issue, so I guess I'm missing something. All help is much appreciated.
Here is the code:
var Datastore = require('nedb'),
    db = new Datastore({ filename: 'test.db' });

db.loadDatabase(function (err) {
    alert(err); // err is null; with the autoload flag no error is thrown either
});

var doc = { hello: 'world' };

db.insert(doc, function (err, newDoc) {
    alert(err); // err is null here as well. The doc ends up in the in-memory store but is not persisted to file
});
Although this question is pretty old, I'd like to share my experience for anyone facing a similar issue.
The NeDB API does not accept a JSON string as input; you have to pass in a JavaScript object. When you pass a JSON string, no error is returned and nothing is persisted.
null is passed as the error in the callback to signal that no problem occurred. When the first JSON string is saved, it is indexed under the key undefined, because NeDB evaluates key = obj[fieldname[0]], which returns undefined when obj is just a (JSON) string; unfortunately, no error is returned. Inserting a second document then causes a unique-constraint violation in the callback, because the key undefined is already taken. Either way, nothing is persisted.
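As a minimal illustration of the difference (the variable names here are arbitrary), parse the string before inserting it:

var jsonString = '{ "hello": "world" }';

db.insert(jsonString, function (err) {
    // err is null, yet nothing usable is stored or persisted
});

db.insert(JSON.parse(jsonString), function (err, newDoc) {
    // the parsed object is stored and persisted as expected
});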
Try
var Datastore = require('nedb'),
    db = new Datastore({ filename: 'test.db' });

db.loadDatabase(function (error) {
    if (error) {
        console.log('FATAL: local database could not be loaded. Caused by: ' + error);
        throw error;
    }
    console.log('INFO: local database loaded successfully.');
});

// Creating the object with new, just to make it clear.
// var doc = { hello: 'world' }; should work too.
function MyDoc(greeting) {
    this.hello = greeting;
}

var doc = new MyDoc('world');

db.insert(doc, function (error, newDoc) {
    if (error) {
        console.log('ERROR: saving document: ' + JSON.stringify(doc) + '. Caused by: ' + error);
        throw error;
    }
    console.log('INFO: successfully saved document: ' + JSON.stringify(newDoc));
});
Maybe it helps someone. :)
This question is quite old, but since I had a very similar problem I thought I'd write up my resolution for anyone facing similar issues.
In my case I was writing an Electron app using electron-webpack as the application builder. It turned out that NeDB, when bundled by webpack, was running in browser mode without access to the file system.
To get it working I had to change the import statement from:
import DataStore from 'nedb';
to this:
const DataStore = require('nedb');
I also had to add NeDB to the webpack configuration as an external module (in package.json):
"electronWebpack": {
    "externals": {
        "nedb": "commonjs nedb"
    }
}
I found this resolution on the NeDB GitHub issue tracker: https://github.com/louischatriot/nedb/issues/329
All I had to do to fix this was delete the .db file and let the program create a new one by running it one more time.
The other thing I did that could have fixed it was making sure my package.json had all the required information; this can easily be done with a quick "npm init" in the terminal.

D3.csv not loading the csv file

For some reason, I am not able to load any .csv file with the d3.csv(...) function. What I do is load a CSV file using that function and print the data directly to the console.
This is what I do:
I create a file called irisData.csv using Notepad. It includes some data from the Iris dataset at https://archive.ics.uci.edu/ml/datasets/Iris:
sepal_length,sepal_width,petal_length,petal_width,species
5.1,3.5,1.4,0.2,Iris-setosa
4.9,3,1.4,0.2,Iris-setosa
4.7,3.2,1.3,0.2,Iris-setosa
4.6,3.1,1.5,0.2,Iris-setosa
5,3.6,1.4,0.2,Iris-setosa
5.4,3.9,1.7,0.4,Iris-setosa
I write this code in my irisTest.html to print the data to the console and check whether it works correctly.
...
d3.csv("irisData.csv", type, function(error, data){
    console.log(data);
});
...
Not sure if this is relevant, but I will put it up anyway: in order to run my HTML, I use Node.js to run a server. This is the code for the server, server.html:
var http = require('http');
var fs = require('fs');

http.createServer(function (req, res) {
    fs.readFile('./irisTest.html', function (err, data) {
        if (err) {
            res.writeHead(500, { 'Content-type': 'text/plain' });
            res.end('500 - Internal Error');
        } else {
            res.writeHead(200, { 'Content-type': 'text/html' });
            res.end(data);
        }
    });
}).listen(3000);
So what I would expect is that the console prints an array of objects built from the data in the CSV file. Something like this:
[
{data...}
{data...}
{data...}
...
]
However, what I get is the code of my irisTest.html (the HTML itself) wrapped into objects. I realized that it doesn't matter what I put as the path instead of "irisData.csv" in d3.csv("irisData.csv", ...); it always outputs my own code, as shown below. So I thought it might be a problem with the path to the CSV file, but there shouldn't be one, since all files are in the same folder.
[
...
{<!DOCTYPE html>: "d3.csv("irisData.csv", type, function(error, data){"}
{<!DOCTYPE html>: "console.log(data);"}
{<!DOCTYPE html>: "});}"}
...
]
Does anyone know what is going on?
As specified in the documentation, a row-conversion function (usually an anonymous function) is expected in the position where type is passed. I quote the example from the docs:
d3.csv("example.csv", function(d) {
return {
year: new Date(+d.Year, 0, 1), // convert "Year" column to Date
make: d.Make,
model: d.Model,
length: +d.Length // convert "Length" column to number
};
}, function(error, rows) {
console.log(rows);
});
So, in your case, reading the csv file should be done this way:
d3.csv("irisData.csv",function(data){
console.log(data);
},function(error, rows){
console.log(rows);
});
Here is a working example in a gist; check the console to see the data object.

MVC with Node.js, Express and Couchbase DB

I started a Node.js application to implement the Model-View-Controller (MVC) pattern.
The logic behind the application is relatively simple:
1) I get a JSON dump using request.get.
2) I put the JSON dump into a Couchbase database.
3) I get only the "id" value from the JSON file.
4) I create a table (HTML) in the view to show how often I got each "id".
From what I read online, the model is where my data lives, the view has a Jade template to dynamically generate the HTML table (I am using Express.js), and the controller is supposed to react to events, making sure data is received and the view is updated.
My question is: which operations belong where? In my case, what are the model, the view, and the controller each supposed to do? Right now I have an app.js file, view.jade, and model.js. Is this correct?
Also, after I get the JSON dump, I connect to Couchbase, but I am unable to store the document because it does not find the key (the dump is valid JSON). This is the code I use:
request.get(url, function (error, response, body) {
    console.log(body);
    try {
        JSON.parse(body);
    } catch (e) {
        return -1;
    }
});

var couchbase = require('couchbase');
var db = new couchbase.Connection({ host: "localhost:8091", bucket: "default" }, function (err) {
    if (err) throw err;
    db.set({"id"="176"}, body, function (err, result) {
        if (err) throw err;
    });
});
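For what it's worth, the db.set call above cannot run as written: {"id"="176"} is a syntax error, the SDK expects a key string plus a value rather than an object, and body only exists inside the request.get callback. Below is a minimal sketch of how the pieces might fit together, assuming the Couchbase Node.js SDK 1.x Connection API used in the question and that the dump exposes an id field (the field name is an assumption):

var request = require('request');
var couchbase = require('couchbase');

var db = new couchbase.Connection({ host: "localhost:8091", bucket: "default" }, function (err) {
    if (err) throw err;

    request.get(url, function (error, response, body) {
        if (error) throw error;

        var dump;
        try {
            dump = JSON.parse(body); // keep the parsed result instead of discarding it
        } catch (e) {
            return console.log('Response body is not valid JSON');
        }

        // the key must be a string; store the parsed document under its id
        db.set(String(dump.id), dump, function (err, result) {
            if (err) throw err;
            console.log('Stored document with id ' + dump.id);
        });
    });
});

Doing the HTTP request inside the connection callback also avoids referring to body outside the scope where it is defined.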
