How do I read a file in Node.js? - javascript

In Node.js, I want to read a file, and then console.log() each line of the file separated by \n. How can I do that?

Try this:
var fs = require('fs');
fs.readFile('/path/to/file', 'utf8', function (err, data) {
    if (err) throw err;
    var arr = data.split('\n');
    arr.forEach(function (v) {
        console.log(v);
    });
});

Try reading the fs module documentation.

Please refer to the File System API in Node.js. There are also a few similar questions on SO; here is one of them.

There are many ways to read a file in Node. You can learn about all of them in the Node documentation about the File System module, fs.
In your case, let's assume that you want to read a simple text file, countries.txt, that looks like this:
Uruguay
Chile
Argentina
New Zealand
First, you have to require() the fs module at the top of your JavaScript file, like this:
var fs = require('fs');
Then, to read your file with it, you can use the fs.readFile() method, like this:
fs.readFile('countries.txt', 'utf8', function (err, data) {});
Now, inside the {}, you can interact with the results of the readFile() method. If there was an error, it will be stored in the err variable; otherwise, the file contents will be stored in the data variable. You can log the data variable here to see what you're working with:
fs.readFile('countries.txt', 'utf8', function (err, data) {
    console.log(data);
});
If you did this right, you should get the exact contents of the text file in your terminal:
Uruguay
Chile
Argentina
New Zealand
I think that's what you want. Your input was separated by newlines (\n), and the output will be as well, since readFile doesn't change the contents of the file. If you want, you can transform the data before logging it:
fs.readFile('countries.txt', 'utf8', function (err, data) {
    // Split the file into an array of lines
    var lines = data.split('\n');
    // Log each line separately, followed by an extra newline
    lines.forEach(function (line) {
        console.log(line, '\n');
    });
});
That will add an extra blank line between each line of output:
Uruguay

Chile

Argentina

New Zealand
You should also account for any errors that happen while reading the file by adding if (err) throw err on the line right before you first access data. You can put all of that code together in a script called read.js, like this:
var fs = require('fs');
fs.readFile('countries.txt', 'utf8', function (err, data) {
    if (err) throw err;
    // Split the file into an array of lines
    var lines = data.split('\n');
    // Log each line separately, followed by an extra newline
    lines.forEach(function (line) {
        console.log(line, '\n');
    });
});
You can then run that script from your terminal. Navigate to the directory that contains both countries.txt and read.js, then type node read.js and hit Enter. You should see the results logged to the screen. Congratulations! You've read a file with Node!
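As the answer says, there are many ways to read a file. For reference, here is a minimal sketch of the synchronous variant, fs.readFileSync, which returns the contents directly and throws on error; it's fine for small scripts like this one, though it blocks the event loop:
var fs = require('fs');

// Returns a string because an encoding is supplied; throws on error
var data = fs.readFileSync('countries.txt', 'utf8');
data.split('\n').forEach(function (line) {
    console.log(line);
});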

Related

Access locally stored file in NodeJS/Express

I am looking to access a locally created PDF on the server (Node.js/Express) side. I am fairly new to JavaScript and haven't seen any method to access the file.
I think I can use something like ${__dirname}/${user_details.Invoice_No_latest}.pdf, but I am not sure if it is correct. Any suggestions, please?
As can be seen below, the PDF that is generated (using html-pdf) is getting saved locally in the folder D:\Programming\Repo\JavaScript-Project\ec-server\22.pdf
Once I get access to the file, I will use something similar to the code below to save it to the MySQL database.
Snippet of the code that I am looking to use afterwards:
var jsfile = Buffer.concat(chunks);
var query = 'INSERT INTO `files` SET ?',
    values = {
        type: 'pdf',
        data: jsfile
    };
mysql.query(query, values, function (err) {
    if (err) throw err; // TODO: improve
    // do something after successful insertion
});
I am just looking to see if there is any simple way to access and play with any file stored locally on the Node.js/Express side.
I found a way to use a locally stored file using the fs.readFile(/* Arguments Here */) function, as shown here:
fs.readFile(`${__dirname}\\${Invoice_No_Actual}.pdf`, (err, data) => {
    if (err) res.status(500).send(err);
    else {
        console.log(data);
        res.contentType("blob");
        // Buffer.from is a static method, not a constructor
        res.send(`data:application/pdf;base64,${Buffer.from(data).toString("base64")}`);
    }
});
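Putting the pieces together, here is a minimal, hypothetical sketch (assuming a mysql connection created with mysql.createConnection, and the Invoice_No_Actual variable from the snippet above) that reads the PDF from disk and inserts it into the files table:
var fs = require('fs');
var path = require('path');
var mysql = require('mysql');

// Assumed connection; fill in your own credentials
var connection = mysql.createConnection({ /* host, user, password, database */ });

fs.readFile(path.join(__dirname, Invoice_No_Actual + '.pdf'), function (err, data) {
    if (err) throw err;
    // data is a Buffer, which the mysql driver can store in a BLOB column
    connection.query('INSERT INTO `files` SET ?', { type: 'pdf', data: data }, function (err) {
        if (err) throw err; // TODO: improve
        console.log('PDF saved to database');
    });
});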

Capture messages in from python-shell in an array

I am new to JavaScript but am attempting to build an Electron app as a GUI for a Python script.
I am using python-shell to call my Python script and can successfully do this, printing the output in the console. However, I would like to capture multiple Python print statements in a JavaScript array.
Following the python-shell docs, I can access each Python print statement via the message event.
var res = [];
var ligation = PythonShell.run("ligation.py", options, function (err, results) {
    if (err) throw err;
    // console.log('results: %j', results);
});
ligation.on('message', function (message) {
    // console.log(message)
    res.push(message);
});
console.log(res)
However, when I try to assign the output of these events to an array it works, but I cannot access the values; the console shows the array where the 8 numbers are my output, yet when I try to access a single value (e.g. console.log(res[0])) it is reported as undefined.
I gather that the little blue i means that the values are only evaluated when the console displays them, and figure this may have something to do with it, but I do not know what.
I figure there must be something straightforward I am missing. Any help is appreciated.
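One likely explanation, as a sketch rather than a verified fix: console.log(res) runs synchronously, before any message events have fired, so the array is still empty at that point; DevTools later shows the live (filled) array, which is why res[0] evaluates as undefined at log time. The results argument of the run callback (commented out in the snippet above) already collects every printed line, so you can read the output there:
var res = [];
PythonShell.run("ligation.py", options, function (err, results) {
    if (err) throw err;
    // results is an array of all the lines the Python script printed
    res = results;
    console.log(res[0]); // safe: the script has finished by now
});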

Push new data to external JSON file

I'm trying to push a new object into an external JSON document, but I am having problems pulling in the JSON file to write to. I have an external, local file called check.json.
How do I call the external JSON file correctly in Node?
var newData = JSON.parse(check.json);
newData.check.push({
    cheap: $el.text()
});
check.json = JSON.stringify(newData);
You can use the File System module, fs, to read and write files. Specifically, the readFile and writeFile methods.
For example, to read:
var fs = require('fs');
fs.readFile('/path/to/my/json.json', function (err, data) {
    if (err) throw err;
    var newData = JSON.parse(data);
});
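To complete the read-modify-write cycle, here is a minimal sketch that writes the updated structure back (assuming the check array and the $el element from the question):
var fs = require('fs');

fs.readFile('check.json', 'utf8', function (err, data) {
    if (err) throw err;
    var newData = JSON.parse(data);
    newData.check.push({
        cheap: $el.text()
    });
    // Serialize the updated object and overwrite the file
    fs.writeFile('check.json', JSON.stringify(newData, null, 2), function (err) {
        if (err) throw err;
    });
});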
That said, flat files are not a good format for storing data and being accessed like a database. You can run into race conditions and lose data. You would be better off with a real database that will take care of that sort of thing for you.
SQLite gives you a simple file (but puts lots of protection in around it). If you really want JSON as the storage format, then you could look at something like CouchDB.

Use Asynchronous IO better

I am really new to JS, and even newer to Node.js. So, using "traditional" programming paradigms, my file looks like this:
var d = require('babyparse');
var fs = require('fs');
var file = fs.readFile('SkuDetail.txt');
d.parse(file);
So this has many problems:
It's not asynchronous
My file is bigger than the default max file size (this one is about 60 MB), so it currently breaks (though I'm not 100% sure that's the reason).
My question: how do I load a big file (and future files will be significantly bigger than 60 MB) asynchronously, parsing as I receive the information? And as a follow-up, how do I know when everything is completed?
You should create a ReadStream. A common pattern looks like this. You can parse data as it becomes available on the data event.
var fs = require('fs');

function readFile(filePath, done) {
    var
        stream = fs.createReadStream(filePath),
        out = '';
    // Make done optional
    done = done || function (err) { if (err) throw err; };
    stream.on('data', function (data) {
        // Parse data
        out += data;
    });
    stream.on('end', function () {
        done(null, out); // All data is read
    });
    stream.on('error', function (err) {
        done(err);
    });
}
You can use the method like:
readFile('SkuDetail.txt', function (err, out) {
    // Handle error
    if (err) throw err;
    // File has been read and parsed
});
If you add the parsed data to the out variable the entire parsed file will be sent to the done callback.
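If you want to parse line by line as the data arrives, rather than accumulating everything into out, here is a sketch that buffers partial lines across chunks (the last element of each split may be an incomplete line, so it is carried over to the next data event):
var fs = require('fs');

function readLines(filePath, onLine, done) {
    var stream = fs.createReadStream(filePath, 'utf8');
    var buffered = '';
    stream.on('data', function (chunk) {
        buffered += chunk;
        var lines = buffered.split('\n');
        buffered = lines.pop(); // keep the trailing partial line
        lines.forEach(onLine);
    });
    stream.on('end', function () {
        if (buffered) onLine(buffered); // flush the last line
        done(null);
    });
    stream.on('error', done);
}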
It already is asynchronous; JavaScript in Node is asynchronous, so no extra effort is needed on your part. Does your code even work, though? Your parse should be inside a callback of readFile; otherwise readFile returns before the file has been read, and file is undefined.
In normal situations, any I/O code you write will be "skipped", and the code after it will be executed first.
For the first question, since you want to process chunks, Streams might be what you are looking for. #pstenstrm has an example in his answer.
Also, you can check this Node.js documentation link for Streams: https://nodejs.org/api/fs.html#fs_fs_createreadstream_path_options
If you want a brief description and example of Streams, check this link: http://www.sitepoint.com/basics-node-js-streams/
You can pass a callback to the fs.readFile function to process the content once the file read is complete. This answers your second question.
fs.readFile('SkuDetail.txt', function (err, data) {
    if (err) {
        throw err;
    }
    processFile(data);
});
You can see Get data from fs.readFile for more details.
Also, you could use Promises for cleaner code with other added benefits. Check this link: http://promise-nuggets.github.io/articles/03-power-of-then-sync-processing.html
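For example, here is a minimal sketch of wrapping fs.readFile in a Promise, the pattern the linked article describes (processFile is the handler from the snippet above):
var fs = require('fs');

function readFileAsync(filePath) {
    return new Promise(function (resolve, reject) {
        fs.readFile(filePath, function (err, data) {
            if (err) reject(err);
            else resolve(data);
        });
    });
}

readFileAsync('SkuDetail.txt')
    .then(function (data) { processFile(data); })
    .catch(function (err) { console.error(err); });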

How do I read and write files to the server with Meteor?

I'm working on a NoDB CMS in Meteor, but I'm new to both Meteor and JavaScript frameworks.
How do I go about reading and writing files to the server?
Within the Node fs module you have a writeFile function.
var fs = Npm.require('fs'); // on the Meteor server, core Node modules come in via Npm.require
getUser = Meteor.users.findOne({ _id: Meteor.userId() });
userObject = JSON.stringify(getUser);
var path = process.env["PWD"] + "/public/";
fs.writeFile(path + Meteor.userId() + '.txt', userObject, function (err) {
    if (err) throw err;
    console.log('Done!');
});
The above snippet would create a file with all the information of the user. You could access the properties of the result of your query, for example getUser._id, to prepare your data parameter (String or Buffer) for pretty printing.
All of this, of course, is server side.
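For instance, JSON.stringify accepts optional replacer and spacing arguments, so a pretty-printed variant of the line above would be:
// Indent with two spaces so the written file is human-readable
userObject = JSON.stringify(getUser, null, 2);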
You can try to use Npm.require inside the startup function, like so:
Meteor.startup(function () {
    fs = Npm.require('fs');
});
But you should definitely have a look at collectionFS, which does what you are looking for: storing files on the server and allowing you to retrieve them.
An added advantage is that you can distribute everything over many nodes of a MongoDB cluster.
To manipulate image files, you can use imagemagick with Node.js; this should allow you to transform them in any way you need.
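As a minimal sketch with the imagemagick npm package (assuming both the package and the ImageMagick binaries are installed; filenames are placeholders):
var im = require('imagemagick');

// Resize a source image to 256px wide, writing a new file
im.resize({
    srcPath: 'photo.jpg',
    dstPath: 'photo-small.jpg',
    width: 256
}, function (err, stdout, stderr) {
    if (err) throw err;
    console.log('resized');
});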
The Node fs module is a start: http://nodejs.org/api/fs.html
You might want to be a bit more specific with your question though, as it's kind of broad.
