I am trying to append a string to a log file. However, writeFile erases the content each time before writing the string.
fs.writeFile('log.txt', 'Hello Node', function (err) {
  if (err) throw err;
  console.log('It\'s saved!');
}); // => log.txt erased, contains only 'Hello Node'
Any idea how to do this the easy way?
For occasional appends, you can use appendFile, which creates a new file handle each time it's called:
Asynchronously:
const fs = require('fs');
fs.appendFile('message.txt', 'data to append', function (err) {
  if (err) throw err;
  console.log('Saved!');
});
Synchronously:
const fs = require('fs');
fs.appendFileSync('message.txt', 'data to append');
But if you append repeatedly to the same file, it's much better to reuse the file handle.
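For illustration, here is a minimal sketch of reusing one handle for many appends with the fs.promises API (Node 10+); the path and the messages are placeholders:
const fsp = require('fs').promises;

// Open the file once, append many times through the same handle, close once.
async function writeLog(lines) {
  const handle = await fsp.open('log.txt', 'a'); // one open for the whole batch
  try {
    for (const line of lines) {
      await handle.appendFile(line + '\n'); // reuses the handle, no new fd per write
    }
  } finally {
    await handle.close(); // always release the descriptor
  }
}

writeLog(['first entry', 'second entry']).catch(console.error);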
When you want to write to a log file, i.e. append data to the end of a file, never use appendFile. appendFile opens a new file handle for each piece of data you add to the file; after a while, you get a beautiful EMFILE error.
I can add that appendFile is not easier to use than a WriteStream.
Example with appendFile:
const fs = require('fs');

console.log(new Date().toISOString());
[...Array(10000)].forEach(function (item, index) {
  fs.appendFile("append.txt", index + "\n", function (err) {
    if (err) console.log(err);
  });
});
console.log(new Date().toISOString());
On my computer, you can queue up to about 8,000 appends this way; after that, you get this:
{ Error: EMFILE: too many open files, open 'C:\mypath\append.txt'
at Error (native)
errno: -4066,
code: 'EMFILE',
syscall: 'open',
path: 'C:\\mypath\\append.txt' }
Moreover, appendFile writes whenever its file handle happens to be ready, so your logs will not be written in timestamp order. You can test this with the example: set 1000 in place of 10000 and the order will be random, depending on access to the file.
If you want to append to a file, you must use a writable stream like this:
var stream = fs.createWriteStream("append.txt", {flags:'a'});
console.log(new Date().toISOString());
[...Array(10000)].forEach( function (item,index) {
stream.write(index + "\n");
});
console.log(new Date().toISOString());
stream.end();
You end it when you want. You are not even required to call stream.end(): the default option is autoClose: true, so your file will be closed when your process ends and you avoid opening too many files.
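For clarity, the same stream with that default spelled out (a sketch; since autoClose: true is the default, this line is equivalent to the one above):
var stream = fs.createWriteStream("append.txt", { flags: 'a', autoClose: true });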
Unlike appendFile, which opens a new file descriptor on every write, createWriteStream opens one descriptor and reuses it. logStream.end() is convenient here because it writes the final line and asks Node to close the stream immediately after the write.
var fs = require('fs');
var logStream = fs.createWriteStream('log.txt', {flags: 'a'});
// use {flags: 'a'} to append and {flags: 'w'} to erase and write a new file
logStream.write('Initial line...');
logStream.end('this is the end line');
Besides appendFile, you can also pass a flag in writeFile to append data to an existing file.
fs.writeFile('log.txt', 'Hello Node', { flag: 'a' }, function (err) {
  if (err) {
    return console.error(err);
  }
});
By passing the 'a' flag, data is appended to the end of the file.
Use the a+ flag to append to a file and create it if it doesn't exist:
fs.writeFile('log.txt', 'Hello Node', { flag: "a+" }, (err) => {
  if (err) throw err;
  console.log('The file is created if not existing!!');
});
Docs: https://nodejs.org/api/fs.html#fs_file_system_flags
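As a quick reference, the most common flags from that page:
// 'r'  - open for reading; fails if the file does not exist
// 'w'  - open for writing; the file is created or truncated
// 'a'  - open for appending; the file is created if it does not exist
// 'a+' - open for reading and appending; the file is created if it does not exist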
You need to open it, then write to it.
var fs = require('fs'),
    str = 'string to append to file';

fs.open('filepath', 'a', 0o666, function (err, id) {
  if (err) throw err;
  fs.write(id, str, null, 'utf8', function (err) {
    if (err) throw err;
    fs.close(id, function () {
      console.log('file closed');
    });
  });
});
Here are a few links that help explain the parameters:
open
write
close
EDIT: This answer is no longer valid; look into the newer fs.appendFile method for appending.
My approach is rather special. I basically use the WriteStream solution but without actually 'closing' the fd via stream.end(). Instead I use cork/uncork. This has the benefit of low RAM usage (if that matters to anyone), and I believe it's safer to use for logging/recording (my original use case).
The following is a pretty simple example. Notice I just added a pseudo loop for showcase -- in production code I am waiting for websocket messages.
var stream = fs.createWriteStream("log.txt", {flags:'a'});
for(true) {
stream.cork();
stream.write("some content to log");
process.nextTick(() => stream.uncork());
}
uncork will flush the data to the file in the next tick.
In my scenario there are peaks of up to ~200 writes per second in various sizes. At night, however, only a handful of writes per minute are needed. The code works super reliably, even during peak times.
Node.js 0.8 has fs.appendFile:
fs.appendFile('message.txt', 'data to append', (err) => {
if (err) throw err;
console.log('The "data to append" was appended to file!');
});
Documentation
fs.appendFile and fsPromises.appendFile are among the fastest and most robust options when you need to append something to a file.
In contrast to what some of the other answers suggest, if a file path is supplied to the appendFile function, it closes the file by itself. Only when you pass in a file handle obtained via something like fs.open() do you have to take care of closing it.
I tried it with over 50,000 lines in a file.
Examples:
(async () => {
  // using appendFile.
  const fsp = require('fs').promises;
  await fsp.appendFile('/path/to/file', '\r\nHello world.');

  // using apickfs; handles errors and edge cases better.
  const apickFileStorage = require('apickfs');
  await apickFileStorage.writeLines('/path/to/directory/', 'filename', 'Hello world.');
})();
Ref: https://github.com/nodejs/node/issues/7560
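To illustrate the file-handle case mentioned above, here is a sketch of the variant where closing is your responsibility (paths are placeholders):
(async () => {
  const fsp = require('fs').promises;
  const handle = await fsp.open('/path/to/file', 'a'); // you opened it...
  try {
    await handle.appendFile('\r\nHello world.');
  } finally {
    await handle.close(); // ...so you must close it
  }
})();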
If you want an easy and stress-free way to write logs line by line in a file, then I recommend fs-extra:
const os = require('os');
const fs = require('fs-extra');
const file = 'logfile.txt';
const options = {flag: 'a'};
async function writeToFile(text) {
  await fs.outputFile(file, `${text}${os.EOL}`, options);
}

// Note: these calls are not awaited, so their completion order is not guaranteed.
writeToFile('First line');
writeToFile('Second line');
writeToFile('Third line');
writeToFile('Fourth line');
writeToFile('Fifth line');
Tested with Node v8.9.4.
const fs = require('fs');
const path = require('path');

const fd = fs.openSync(path.join(process.cwd(), 'log.txt'), 'a');
fs.writeSync(fd, 'contents to append');
fs.closeSync(fd);
I offer this suggestion only because control over open flags is sometimes useful. For example, you may want to truncate an existing file first and then append a series of writes to it - in which case use the 'w' flag when opening the file and don't close it until all the writes are done. Of course appendFile may be what you're after :-)
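A sketch of that truncate-then-append pattern, under the same assumptions as the snippet above (log.txt relative to the working directory):
const fd2 = fs.openSync(path.join(process.cwd(), 'log.txt'), 'w'); // 'w' truncates (or creates) the file
fs.writeSync(fd2, 'first write\n');  // a series of writes through the same descriptor
fs.writeSync(fd2, 'second write\n');
fs.closeSync(fd2);                   // close only after all the writes are done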
fs.open('log.txt', 'a', function (err, log) {
  if (err) throw err;
  fs.writeFile(log, 'Hello Node', function (err) {
    if (err) throw err;
    fs.close(log, function (err) {
      if (err) throw err;
      console.log('It\'s saved!');
    });
  });
});
Using the jfile package:
myFile.text += '\nThis is a new line to be appended'; // myFile = new JFile(path);
Try using flags: 'a' to append data to a file:
var stream = fs.createWriteStream("udp-stream.log", { flags: 'a' });
stream.once('open', function (fd) {
  stream.write(msg + "\r\n"); // msg is assumed to be defined elsewhere
});
Here's a full script. Fill in your file names, run it, and it should work!
Here's a video tutorial on the logic behind the script.
var fs = require('fs');

function ReadAppend(file, appendFile) {
  fs.readFile(appendFile, function (err, data) {
    if (err) throw err;
    console.log('File was read');

    fs.appendFile(file, data, function (err) {
      if (err) throw err;
      console.log('The "data to append" was appended to file!');
    });
  });
}

// edit this with your file names
var file = 'name_of_main_file.csv';
var appendFile = 'name_of_second_file_to_combine.csv';
ReadAppend(file, appendFile);
const fs = require('fs');

const inovioLogger = (logger = "") => {
  // Note: this opens a new stream on every call; for frequent logging, create the stream once and reuse it.
  const log_file = fs.createWriteStream(__dirname + `/../../inoviopay-${new Date().toISOString().slice(0, 10)}.log`, { flags: 'a' });
  const log_stdout = process.stdout;
  log_file.write(logger + '\n');
  log_stdout.write(logger + '\n'); // log_stdout was unused in the original; echo to stdout as well
};
In addition to denysonique's answer, sometimes you want appendFile and the other async methods in Node.js to return a promise instead of taking a callback. To do that, wrap the function with the promisify higher-order function, or import the async functions from the promises namespace:
const { appendFile } = require('fs').promises;
await appendFile('path/to/file/to/append', dataToAppend, optionalOptions);
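On older Node versions without the fs.promises namespace, a sketch of the promisify route (same placeholder arguments as above):
const { promisify } = require('util');
const fs = require('fs');

const appendFileAsync = promisify(fs.appendFile); // callback API wrapped into a promise
await appendFileAsync('path/to/file/to/append', dataToAppend, optionalOptions);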
I hope it'll help 😉
I wrapped the async fs.appendFile into a Promise-based function. Hope it helps others to see how this would work.
append(path, name, data) {
  return new Promise((resolve, reject) => {
    fs.appendFile(path + name, data, (err) => {
      if (err) {
        return reject(err);
      }
      return resolve(path + name);
    });
  });
}
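Usage might then look something like this (a sketch; logger is a hypothetical object carrying the method above, and the paths are placeholders):
logger.append('/var/log/', 'app.log', 'a new line\n')
  .then(function (fullPath) { console.log('Appended to ' + fullPath); })
  .catch(function (err) { console.error(err); });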
I have spent a couple of hours trying to get image data as a buffer. Search results led me to the "request" module; other suggestions pointed to modules available only in newer versions of Node, which I cannot use because we depend on Node v6.11 so far.
Here are my attempts:
request(imageURL).pipe(fs.createWriteStream('downloaded-img-1.jpg')).on('close', function () {
  console.log('ok');
});
// chunks and resolve come from the enclosing scope (a Promise executor, not shown)
var chunks = [];

request(imageURL, function (err, message, response) {
  fs.writeFile('downloaded-img-2.jpg', response, 'binary', function (err) {
    console.log('File saved.');
  });
  fs.writeFile('downloaded-img-3.jpg', chunks, 'binary', function (err) {
    console.log('File saved.');
  });
  resolve(response);
})
.on('data', function (chunk) {
  chunks.push(chunk);
})
.on('response', function (response) {
});
The "downloaded-img-1.jpg" gets downloaded correctly, but I have to avoid saving the file on disk, then read it as a stream, it's a PRD environment constraint. So the next option is to use image data, as demonstrated by "downloaded-img-2.jpg" and "downloaded-img-3.jpg", by waiting for the "response" or the hand-made "chunks", the problem is that these 2 images are always corrupted, and I don't know why?
What is the point behind all of that? I am trying to add the image behind the URL in a zip file, and the zip lib I use (js-zip) accepts buffer as an input. Any ideas why I am not getting the "chunks" or the "response" correctly?
I've tested the code below in Node 6.9.2; it downloads an image as a buffer. I also write the buffer to a file (just to test all is OK!). The body object is a buffer containing the image data:
"use strict";
var request = require('request');
var fs = require('fs');
var options = {
url: "https://upload.wikimedia.org/wikipedia/commons/thumb/5/52/Hubble2005-01-barred-spiral-galaxy-NGC1300.jpg/1920px-Hubble2005-01-barred-spiral-galaxy-NGC1300.jpg",
method: "get",
encoding: null
};
console.log('Requesting image..');
request(options, function (error, response, body) {
if (error) {
console.error('error:', error);
} else {
console.log('Response: StatusCode:', response && response.statusCode);
console.log('Response: Body: Length: %d. Is buffer: %s', body.length, (body instanceof Buffer));
fs.writeFileSync('test.jpg', body);
}
});
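Since the goal was feeding js-zip, here is a hedged sketch of passing that buffer straight to JSZip; the entry and output names are made up, while file() and generateAsync() are JSZip's documented calls:
var JSZip = require('jszip');

var zip = new JSZip();
zip.file('image.jpg', body); // body is the Buffer from the request callback above
zip.generateAsync({ type: 'nodebuffer' }).then(function (zipBuffer) {
  fs.writeFileSync('images.zip', zipBuffer); // just to verify the zip on disk
});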
I'm trying with the code below. I can write the file on the server, and the file has content, but when I download the file, it's blank.
var file = fs.createWriteStream('./tmp/text.txt');
file.on('error', function (err) { console.log(err); });

rows.forEach(function (v) {
  file.write(v + "\r\n");
  productExport.DeleteProduct(v.ProductId, function (err) {
    if (err) throw err;
  });
});
file.end();
var file = "./tmp/text.txt";
res.download(file); // Set disposition and send it.
How can I download the file with all the content?
The way your code is structured is incorrect for what you're trying to do. Primarily, your issue is that you're responding with res.download() before the file stream is done being written to. Additionally, you have var file being initialized twice in the same scope, which isn't correct either.
var file = "./tmp/text.txt";
var writeStream = fs.createWriteStream(file);
writeStream.on('error', err => console.log );
writeStream.on('finish', () => {
return res.download(file); // Set disposition and send it.
});
rows.forEach((v) => {
writeStream.write(v + "\r\n");
productExport.DeleteProduct(v.ProductId, function(err){
if(err) throw err;
});
});
writeStream.end();
If you're confused by this, and perhaps by async processing in general, this is the de facto answer on SO for understanding async in Node.js.
Writing data to a file through I/O is an asynchronous operation. You have to wait for the WriteStream to complete before you can download the file.
var file = fs.createWriteStream('./tmp/text.txt');
file.on('error', function (err) { console.log(err); });

rows.forEach(function (v) {
  file.write(v + "\r\n");
  productExport.DeleteProduct(v.ProductId, function (err) {
    if (err) throw err;
  });
});
file.end();

file.on('finish', function () {
  var filePath = "./tmp/text.txt";
  res.download(filePath); // Set disposition and send it.
});
Extra information:
FileSystem#createWriteStream returns a WriteStream object.
The WriteStream documentation states there are six events: close, drain, error, finish, pipe, and unpipe.
In the Node.js world, you should always expect to use a callback or to listen for a complete/finish event when I/O operations are involved.
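If you prefer promises over raw events, a sketch of wrapping the finish event (assuming the same rows and res as in the answers above):
function writeRows(path, rows) {
  return new Promise(function (resolve, reject) {
    var stream = fs.createWriteStream(path);
    stream.on('error', reject);
    stream.on('finish', resolve); // fulfilled only after all data is flushed
    rows.forEach(function (v) { stream.write(v + "\r\n"); });
    stream.end();
  });
}

// writeRows('./tmp/text.txt', rows).then(() => res.download('./tmp/text.txt'));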
I am working on a Node Express app, which calls a REST URL to download a zip and extract it.
Below are the steps that I am following
Using the request module, send a POST request to the REST API.
The REST API returns a zip file. Pipe it to save it to the filesystem.
Extract the zip file.
Clean up the zip file (this code is not there yet).
This is what my code looks like:
return new Promise(function (resolve, reject) {
  var fs = require('fs');
  var req = require('request');
  var extract = require('extract-zip');

  var postData = {
    username: "abc",
    password: "zyz"
  };

  var options = {
    method: 'post',
    body: postData,
    json: true,
    url: "http://localhost:8080/zipdownload"
  };

  // USER_DIR is defined elsewhere
  req(options, function (err, res, body) {
    if (err) {
      console.log('Error posting json ' + err);
      return;
    }
  }).pipe(fs.createWriteStream(USER_DIR + '/MY.zip')).on('finish', function () {
    console.log('Finished downloading ZIP file');
    extract(USER_DIR + '/MY.zip', {
      dir: USER_DIR + '/MyExtractedFolder'
    }, function (err) {
      if (err) {
        console.log('Error extracting zip ' + err);
      }
      resolve("Promise resolved after downloading and extracting zip");
    });
  }).on('error', function (error) {
    reject('Error in downloading content', error);
  });
});
I am able to download the zip file and save it to the filesystem using .pipe. However, when I try to extract it using extract-zip, it gives me the error below:
End of central directory record signature not found.
I suspect the extract code is not in the right place and the Promise gets resolved while the extract code is still running. As a result, the calling function tries to do something with the extracted files, which do not exist yet.
Also, I am not able to work out where I should put the code for the zip cleanup. It needs to run once the extraction is done, not in parallel with the extraction.
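One possible arrangement (a sketch, not a tested fix): resolve only inside the extract callback, and unlink the zip there too, so both extraction and cleanup finish before the caller continues. fs.unlink is the standard call for deleting a file.
extract(USER_DIR + '/MY.zip', { dir: USER_DIR + '/MyExtractedFolder' }, function (err) {
  if (err) {
    return reject(err); // do not resolve if extraction failed
  }
  // cleanup runs only after extraction has completed
  fs.unlink(USER_DIR + '/MY.zip', function (unlinkErr) {
    if (unlinkErr) {
      console.log('Error deleting zip ' + unlinkErr);
    }
    resolve('Downloaded, extracted, and cleaned up');
  });
});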