Read and write a file using JS and Node

I'm trying to write a function that reads a file and, when it finds a specific line, deletes everything below it and then appends another set of lines. I have managed to read the file and find the string I need:
function read() {
  // lineReader is a readline.Interface created over the input file
  lineReader.on('line', function (line) {
    console.log('Line from file: ' + line);
    if (line.trim() === 'Examples:') {
      console.log('Found what I needed, delete everything below this line');
    }
  });
}
What I can't figure out is how to delete everything below that line and then append the text I need. I am new to JS and Node.js.

You could do this by opening a file write stream at the same time.
In the lineReader event handler, write every line up to and including 'Examples:' into a separate file stream. When 'Examples:' comes along, append your desired set of lines to that second file stream and close the lineReader.
So add something like this:
// before the code
var writer = fs.createWriteStream('new.txt');
// load the custom lines to append from customLines.txt
var customLines = fs.readFileSync('customLines.txt');

// in the on('line') callback:
writer.write(line + '\n');

// if 'Examples:' was found:
writer.write(customLines);
writer.end();
lineReader.close();
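Put together, a complete runnable sketch of this approach (the file names input.txt, new.txt, and customLines.txt are placeholders):
var fs = require('fs');
var readline = require('readline');

var lineReader = readline.createInterface({
  input: fs.createReadStream('input.txt')
});
var writer = fs.createWriteStream('new.txt');
var customLines = fs.readFileSync('customLines.txt');

lineReader.on('line', function (line) {
  writer.write(line + '\n'); // copy every line up to and including the marker
  if (line.trim() === 'Examples:') {
    writer.write(customLines); // append the replacement block
    writer.end();
    lineReader.close(); // stop reading; everything below the marker is dropped
  }
});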

Related

Is it possible to write text in the middle of a file with fs.createWriteStream? (or in Node.js in general)

I'm trying to write in a text file, but not at the end like appendFile() does, and not by replacing the entire content...
I saw it's possible to choose where to start writing with the start parameter of fs.createWriteStream() -> https://nodejs.org/api/fs.html#fs_fs_createwritestream_path_options
But there is no parameter to say where to stop writing, right? So it overwrites the end of my file after the point where I write with this function.
const fs = require('fs');

var logger = fs.createWriteStream('result.csv', {
  flags: 'r+',
  start: 20 // start writing at the 20th character
});

logger.write('5258,525,98951,0,1\n'); // example: a new line to write
Is there a way to specify where to stop writing in the file, to end up with something like:
....
data from the beginning
....
5258,525,98951,0,1
...
data till the end
...
I suspect you mean, "Is it possible to insert in the middle of the file?" The answer to that is: No, it isn't.
Instead, to insert, you have to:
1. Determine how big what you're inserting is
2. Copy the data at your insertion point to that many bytes later in the file
3. Write your data
Obviously when doing #2 you need to be sure that you're not overwriting data you haven't copied yet, either by reading it all into memory first or by working in blocks, from the end of the file toward the insertion point, as sketched below.
(I've never looked for one, but there may be an npm module out there that does this for you...)
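For reference, here is a minimal synchronous sketch of that block-by-block approach; the function name insertIntoFile is made up for illustration:
const fs = require('fs');

function insertIntoFile(path, position, insertText) {
  const insertBuf = Buffer.from(insertText);
  const fd = fs.openSync(path, 'r+');
  const size = fs.fstatSync(fd).size;
  const blockSize = 64 * 1024;
  // Work from the end of the file toward the insertion point so we never
  // overwrite bytes that haven't been copied yet.
  let end = size;
  while (end > position) {
    const start = Math.max(position, end - blockSize);
    const buf = Buffer.alloc(end - start);
    fs.readSync(fd, buf, 0, buf.length, start);
    fs.writeSync(fd, buf, 0, buf.length, start + insertBuf.length); // shift block
    end = start;
  }
  fs.writeSync(fd, insertBuf, 0, insertBuf.length, position); // write new data
  fs.closeSync(fd);
}
With the earlier example, insertIntoFile('result.csv', 20, '5258,525,98951,0,1\n') would shift everything from byte 20 onward instead of overwriting it.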
You could read and parse your file first, then apply the modifications and save the new file.
Something like:
const fs = require("fs");
const fileData = fs.readFileSync("result.csv", { encoding: "utf8" });
const fileDataArray = fileData.split("\n");
const newData = "5258,525,98951,0,1";
const index = 2; // after each row to insert your data
fileDataArray.splice(index, 0, newData); // insert data into the array
const newFileData = fileDataArray.join("\n"); // create the new file
fs.writeFileSync("result.csv", newFileData, { encoding: "utf8" }); // save it

How can I extract (read + delete) from a text file in Node.js?

I'm building a script that reads log files, handles what needs to be handled, then writes the results to a database.
Some caveats:
Some log files have a lot of input, multiple times a second
Some log files have little to no input at all
What I'm trying to do, in simple words:
Read the first line of a file, then delete that line to move on to the next one; while I'm handling the first line, other lines could be added.
Issues I'm facing
When I try reading a file, then processing it, then deleting the file, some lines have been added in the meantime
When the app crashes while handling multiple lines at once, for any reason, I can't know what lines have been processed
Tried so far
const fs = require('fs');

fs.readdir('logs/', (err, filenames) => {
  filenames.forEach((filename) => {
    fs.readFile('logs/' + filename, 'utf-8', (err, content) => {
      // process all new lines (can take multiple ms)
      // then delete the file
      fs.unlink('logs/' + filename, (err) => {});
    });
  });
});
Isn't there a method (native or not) to 'take' the first line(s), or all lines, from a file at once?
Something similar to what the Array.shift() method does to arrays..
Why are you reading the whole file at once? Instead, you can use Node.js streams.
https://nodejs.org/api/fs.html#fs_class_fs_readstream
This will read the file and output it to the console:
var fs = require('fs');
var readStream = fs.createReadStream('myfile.txt');
readStream.pipe(process.stdout);
You can also use the npm package node-tail to read the contents of a file while new content is being written to it.
https://github.com/lucagrulla/node-tail
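A minimal sketch with node-tail, assuming a log file at logs/app.log:
var Tail = require('tail').Tail;

var tail = new Tail('logs/app.log');
tail.on('line', function (data) {
  // handle each new line as it is appended to the file
  console.log(data);
});
tail.on('error', function (error) {
  console.log('ERROR: ', error);
});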
If your log files are written as rotated logs, for example one file per hour (9AM.log, 10AM.log, ...), you can skip the file currently being written to and process the others. E.g. if it's now 10:30 AM, skip 10AM.log and handle the other files, as sketched below.
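A hypothetical sketch of that idea; the hourly naming scheme (e.g. 10AM.log) and the processing step are assumptions:
const fs = require('fs');

// Name of the log file currently being written to, per the hourly scheme.
function currentLogName(d = new Date()) {
  const h = d.getHours();
  const hour12 = h % 12 === 0 ? 12 : h % 12;
  return hour12 + (h < 12 ? 'AM' : 'PM') + '.log';
}

fs.readdirSync('logs/')
  .filter((name) => name !== currentLogName()) // skip the active file
  .forEach((name) => {
    const content = fs.readFileSync('logs/' + name, 'utf-8');
    // ...process content here...
    fs.unlinkSync('logs/' + name); // safe to delete: no longer being written to
  });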

Create a text file using JavaScript without using web interface

I'm looking for a way to create and write a text file using JavaScript, but I have only found methods that go through the browser. I don't need the browser to do this; I want a method in my code that writes data to a text file, nothing to do with the browser.
So, does anyone know how to do this?
Use Node.js:
Node.js Write a line into a .txt file
Something like:
var fs = require('fs');

var logger = fs.createWriteStream('log.txt', {
  flags: 'a' // 'a' means appending (old data will be preserved)
});

logger.write('some data'); // append string to your file
logger.write('more data'); // again
logger.write('and more'); // again
https://nodejs.org/api/fs.html

How to write big array to .txt file using node.js?

The obvious solution is fs.writeFile, I assume.
But the answer to this question suggests that I should be using a stream technique.
I'm currently trying this code in order to remove a line from the text file by converting it to an array:
var index = randomIntFromInterval(1, unfinished_searches.length - 1); // index to remove
unfinished_searches.splice(index, 1);

fs.truncate('unfinished_searches.txt', 0, function() {
  var file = fs.createWriteStream('unfinished_searches.txt');
  file.on('error', function(err) { /* error handling */ });
  unfinished_searches.forEach(function(v) { file.write(v.join(', ') + '\n'); });
  file.end();
})
which returns the following error:
TypeError: undefined is not a function
at join in this line:
unfinished_searches.forEach(function(v) { file.write(v.join(', ') + '\n'); });
If you want to remove a line from a big text file, consider using pipes. Just create a read stream, pipe it through your "checking function", and then pipe that to a write stream.
This way you don't have to load the whole file into memory; only a small amount of data is in memory at once. Of course you can't write to the same file you're reading, but after the operation completes you can rename the new file and delete the old one. A sketch follows below.
EDIT:
This post can be useful for you: Parsing huge logfiles in Node.js - read in line-by-line
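A minimal sketch of that pipeline using readline, assuming one entry per line and a hypothetical skipIndex for the line to drop:
const fs = require('fs');
const readline = require('readline');

const rl = readline.createInterface({
  input: fs.createReadStream('unfinished_searches.txt')
});
const out = fs.createWriteStream('unfinished_searches.tmp');
const skipIndex = 3; // hypothetical: index of the line to remove
let lineNo = 0;

rl.on('line', (line) => {
  if (lineNo++ !== skipIndex) out.write(line + '\n'); // copy all lines except the removed one
});

rl.on('close', () => {
  out.end(() => {
    // swap in the new file once the copy is complete
    fs.renameSync('unfinished_searches.tmp', 'unfinished_searches.txt');
  });
});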

How can you read a file line by line in JavaScript?

I'm writing a web app for the iPad that will load data from a text file. (A sample data set is around 400 KB.) I have everything set up except the file reading. The way I have set up my code, you pass in an object which reads a file line by line.
How can I read a file line by line?
If there is no direct way to read a file line by line, can someone please show me an example of how to read a file into a string object? (so that I can use the split method :P)
This could work, if I understood what you want to do:
var allText, lines;

var txtFile = new XMLHttpRequest();
txtFile.open("GET", "http://website.com/file.txt", true);
txtFile.onreadystatechange = function() {
  if (txtFile.readyState === 4) { // document is ready to parse
    if (txtFile.status === 200) { // file is found
      allText = txtFile.responseText;
      lines = txtFile.responseText.split("\n");
    }
  }
};
txtFile.send(null);
Mobile Safari doesn't have the File API, so I assume you're talking about reading from a web resource. You can't do that line by line. When you read a resource via Ajax, the browser first reads it fully into memory and then passes the entire thing to your Ajax callback as a string.
In your callback, you can take the string, break it into lines, and wrap that up in an object that has the API your code wants, but you're still going to have the whole string in memory at once. For example:
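A minimal sketch of such a wrapper; the method names hasNext and nextLine are made up for illustration:
// Wrap the fetched text in a shift-like line reader.
function makeLineReader(text) {
  var lines = text.split('\n');
  var i = 0;
  return {
    hasNext: function () { return i < lines.length; },
    nextLine: function () { return i < lines.length ? lines[i++] : null; }
  };
}

// Usage:
// var reader = makeLineReader(allText);
// while (reader.hasNext()) { console.log(reader.nextLine()); }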
With jQuery:
var myObject = {}; // myObject[lineNumber] = "text of that line"
$.get('path/myFile.txt', function(myContentFile) {
  var lines = myContentFile.split("\r\n");
  for (var i = 0; i < lines.length; i++) {
    // each line is lines[i]
    // save it in myObject:
    myObject[i] = lines[i];
    // print it to the console:
    console.log("line " + i + ": " + lines[i]);
  }
}, 'text');
I don't think that's possible unless you use Ajax to hit some server-side code.
