I'm trying to read/write to a file in an async function (example):
async function readWrite() {
  // Create a variable representing the path to a .txt
  const file = 'file.txt';
  // Write "test" to the file
  fs.writeFileAsync(file, 'test');
  // Log the contents to console
  console.log(fs.readFileAsync(file));
}
But whenever I run it I always get the error:
(node:13480) UnhandledPromiseRejectionWarning: Unhandled promise rejection (rejection id: 2): TypeError: Cannot read property 'map' of null
I tried using Bluebird by running npm install bluebird in my project directory and adding:
const Bluebird = require('bluebird');
const fs = Bluebird.promisifyAll(require('fs'));
to my index.js (main) file, as well as adding:
const fs = require('fs');
to every file where I want to use fs.
I still get the same error, and by commenting things out I can only narrow the problem down to fs.
Any help would be appreciated.
First of all: async functions return a promise. So by definition, you are already using a promise.
Second, there is no fs.writeFileAsync. You are looking for fs.writeFile https://nodejs.org/api/fs.html#fs_fs_writefile_file_data_options_callback
With promises, making use of the power of async functions
const fs = require('fs');
const util = require('util');
// Promisify the fs.writeFile and fs.readFile
const write = util.promisify(fs.writeFile);
const read = util.promisify(fs.readFile);
async function readWrite() {
  // Create a variable representing the path to a .txt
  const file = 'file.txt';
  // Write "test" to the file
  await write(file, 'test');
  // Log the contents to console
  const contents = await read(file, 'utf8');
  console.log(contents);
}
In the above, we used util.promisify to turn Node.js callback-style functions into functions that return promises. Inside an async function, you can use the await keyword to store the resolved value of a promise in a const/let/var.
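Since readWrite is itself an async function, the caller gets back a promise and should handle any rejection, for example:
readWrite().catch(err => console.error(err));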
Further reading material: https://ponyfoo.com/articles/understanding-javascript-async-await
Without promises, callback-style
const fs = require('fs');
function readWrite() {
  // Create a variable representing the path to a .txt
  const file = 'file.txt';
  // Write "test" to the file
  fs.writeFile(file, 'test', err => {
    if (!err) fs.readFile(file, 'utf8', (err, contents) => {
      console.log(contents);
    });
  });
}
Related
I need to launch external scripts sequentially, such that the following code runs only after the external script has completely finished, including its promises. I use shell.js, but perhaps there are other tools. I need to run the script without importing it into the "parent".
External script code (external.js):
const fs = require("fs");
const util = require("util");
const write = util.promisify(fs.writeFile);
(async () => {
  await write("Hello.txt", "Hello");
})();
The "parent" code from which it is called:
const fs = require("fs");
const util = require("util");
const shell = require("shelljs");
const read = util.promisify(fs.readFile);
(async () => {
  await shell.exec("node external.js", { async: true });
  const data = await read('Hello.txt');
  // do something with data...
})();
Please tell me whether it is possible to implement this, and how it can be done. Thanks for your attention!
If external.js performs a file operation, then the node external.js process will only exit after the file operation has completed. Therefore
await util.promisify(child_process.exec)("node external.js");
should fulfill your requirement.
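Put together, the parent could look roughly like this (a sketch of the same idea, reusing your file names):
const fs = require("fs");
const util = require("util");
const child_process = require("child_process");

const read = util.promisify(fs.readFile);
const exec = util.promisify(child_process.exec);

(async () => {
  // Wait for the external script (and therefore its file write) to finish.
  await exec("node external.js");
  const data = await read("Hello.txt", "utf8");
  // do something with data...
  console.log(data);
})();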
I have two methods.
The first one reads a file and writes to that file just as plain text. The second one writes a file as a stream.
In order to get this to work, I have had to require fs twice.
const fs = require('fs').promises;
const fs2 = require('fs');
I'm trying to understand the difference and why I need this twice. It seems that the fs with .promises doesn't have createWriteStream, and the one without .promises doesn't have an await-able writeFile.
/**
 * Serializes credentials to a file compatible with GoogleAuth.fromJSON.
 *
 * @param {OAuth2Client} client
 * @return {Promise<void>}
 */
async function saveCredentials(client) {
  const content = await fs.readFile(CREDENTIALS_PATH);
  const keys = JSON.parse(content);
  const key = keys.installed || keys.web;
  const payload = JSON.stringify({
    type: 'authorized_user',
    client_id: key.client_id,
    client_secret: key.client_secret,
    refresh_token: client.credentials.refresh_token,
  });
  await fs.writeFile(TOKEN_PATH, payload);
}
The second one writes to a file as a stream
/**
 * Download file
 * @param {OAuth2Client} authClient An authorized OAuth2 client.
 */
async function downloadFile(authClient) {
  const service = google.drive({version: 'v3', auth: authClient});
  const fileStream = fs2.createWriteStream("test.txt");
  fileId = FILEID;
  try {
    const file = await service.files.get({
      fileId: fileId,
      alt: 'media',
    }, {
      responseType: "stream"
    },
    (err, { data }) =>
      data
        .on('end', () => console.log('onCompleted'))
        .on('error', (err) => console.log('onError', err))
        .pipe(fileStream)
    );
  } catch (err) {
    // TODO(developer) - Handle error
    throw err;
  }
}
Note that this does work; I am just trying to wrap my head around Node.js.
fs.promises contains a subset of the interface for what's on fs, but with promise-based interfaces instead of plain callback-style interfaces.
Some things which don't translate well to promises or don't have a natural promise-based interface, such as fs.createReadStream(), are only available on fs. Note that fs.createReadStream() returns a stream and uses events on the stream, not plain callbacks (which don't translate well to promises). As such, its interface remains the same on fs and is not duplicated on fs.promises.
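For instance, a minimal sketch of the event-based usage (the file name is just a placeholder):
const fs = require('fs');

const stream = fs.createReadStream('file.txt', 'utf8');
stream.on('data', chunk => process.stdout.write(chunk));
stream.on('end', () => console.log('done reading'));
stream.on('error', err => console.error(err));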
Many things are available in either with different interfaces:
fs.writeFile(filename, data, callback); // plain callback interface
or
await fs.promises.writeFile(filename, data) // promise interface
Most of the time, I can use only the fs.promises interface and do:
const fsp = require('fs').promises;
But sometimes, you need both and I would do this:
const fs = require('fs');
const fsp = fs.promises;
Keep in mind fs.promises is not a complete replacement for fs. It's an alternate (promise-based) interface for some (but not all) of the methods in the fs module.
Other interfaces, such as fsp.open(), have been enhanced and converted to promises in the fs.promises interface: it now returns a promise that resolves to an object-oriented FileHandle object, whereas fs.open() just accepts a callback that will be passed a file descriptor.
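A short sketch of that difference (the file name is just a placeholder):
const fs = require('fs');
const fsp = fs.promises;

// Callback style: fs.open() hands your callback a numeric file descriptor.
fs.open('data.txt', 'r', (err, fd) => {
  if (err) throw err;
  fs.close(fd, () => {});
});

// Promise style: fsp.open() resolves to a FileHandle object with its own methods.
async function readWithHandle() {
  const handle = await fsp.open('data.txt', 'r');
  try {
    const contents = await handle.readFile('utf8');
    console.log(contents);
  } finally {
    await handle.close();
  }
}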
So, my mode of operation is to look in the fs.promises interface for what I'm doing. If it's there, I use it there and use promises with it.
If it's not there, then go back to the fs interface for what I need.
I would advise you to NOT write code that uses the symbol fs for fs.promises. That will confuse people reading or working on your code because they are likely to think that a symbol named fs is the fs interface. That's why I use fs for the fs interface and fsp for the fs.promises interface in my code.
I'm learning about puppeteer and firebase at the moment. What I am trying to do is create a pdf of a web page and upload to firebase storage. This is my code.
const puppeteer = require('puppeteer');
const fs = require('fs').promises;
const firebase = require('firebase');
require("firebase/storage");
const url = process.argv[2];
if (!url) {
  throw "Please provide URL as a first argument";
}

var firebaseConfig = {
  // Firebase config goes here
};

// Initialize Firebase
firebase.initializeApp(firebaseConfig);

// Function to generate PDF file
async function run () {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  //await page.goto(url);
  await page.goto(url, {waitUntil: 'domcontentloaded', timeout: 60000} );
  //await page.pdf({ path: 'api.pdf', format: 'A4' })
  const myPdf = await page.pdf();
  await browser.close()
  return myPdf;
}

const myOutput = run();

// Upload to Firebase based on the instructions here: https://firebase.google.com/docs/storage/web/upload-files
var storageRef = firebase.storage().ref();
// Create a reference to 'mountains.jpg'
storageRef.child("Name.pdf").put(myOutput)
However, I'm running into this error when executing my code
$ node screenshot.js https://google.com
Promise { <pending> }
(node:20636) UnhandledPromiseRejectionWarning: TypeError: Cannot read property 'byteLength' of undefined
at C:\Users\ppham\NucampFolder\Test\node_modules\#firebase\storage\dist\index.cjs.js:833:40
at Array.forEach (<anonymous>)
at Function.FbsBlob.getBlob (C:\Users\ppham\NucampFolder\Test\node_modules\#firebase\storage\dist\index.cjs.js:832:25)
at multipartUpload (C:\Users\ppham\NucampFolder\Test\node_modules\#firebase\storage\dist\index.cjs.js:1519:24)
at C:\Users\ppham\NucampFolder\Test\node_modules\#firebase\storage\dist\index.cjs.js:2003:31
at C:\Users\ppham\NucampFolder\Test\node_modules\#firebase\storage\dist\index.cjs.js:1900:21
at processTicksAndRejections (internal/process/task_queues.js:85:5)
(node:20636) UnhandledPromiseRejectionWarning: Unhandled promise rejection. This error originated either by throwing inside of an async function without a catch block, or by rejecting a promise which was not handled with .catch(). (rejection id: 1)
(node:20636) [DEP0018] DeprecationWarning: Unhandled promise rejections are deprecated. In the future, promise rejections that are not handled will terminate the Node.js process with a non-zero exit code.
This seems to imply that myOutput doesn't contain anything. I thought that I had created the PDF file by executing the run() function, assigned it to the myOutput variable, and passed it to the upload function. I've been reading the Puppeteer documentation and couldn't find any reason why this wouldn't work. Does anyone know why this is not valid?
Your code is very close to the example code at https://www.toptal.com/puppeteer/headless-browser-puppeteer-tutorial, but there's one significant difference:
run is defined as an async function -- in the example code, this may be inconsequential, as the last line of the example just calls run() and nothing happens afterwards. But, in your code, you expect to do something with that output. So, you'll need to call it with:
const myOutput = await run();
But Node won't let you use await at the top level -- await can only be used inside an async function. So you can either use then() or just define another async wrapper, probably something like this:
async function runAndUpload() {
  const myOutput = await run();
  console.log(myOutput); // let's make sure there's output
  var storageRef = firebase.storage().ref();
  storageRef.child("Name.pdf").put(myOutput);
}

runAndUpload();
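Or, if you prefer then() over the extra wrapper, something along these lines should be equivalent (a sketch, assuming put() can be treated like a promise as described in the Firebase upload docs):
run()
  .then(myOutput => {
    const storageRef = firebase.storage().ref();
    return storageRef.child("Name.pdf").put(myOutput);
  })
  .then(() => console.log('Upload complete'))
  .catch(err => console.error(err));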
I use require("fs").promises just to avoid using callback functions.
But now I also want to use fs.createReadStream to attach a file to a POST request.
How can I do this?
Or what is the alternative to createReadStream in this case?
Or should I use require("fs")?
So by using const fs = require('fs').promises; you're only gaining access to the promise version of the fs module. According to spec, there is no equivalent createReadStream entry in the File System Promises API. If you want that functionality, you'll need to store a reference to it in addition to the promisified version of fs.
I'd encourage anyone reading this to use the following at the top of your file to include both the promises api and ability to createReadStreams.
const fs = require('fs').promises;
const createReadStream = require('fs').createReadStream;
Creating read streams will then look like this (note that it no longer includes the fs. prefix):
createReadStream('/your/path/here');
Equally important to note:
According to the spec, you'll eventually want to use the following instead (disclaimer: current out-of-the-box Node can't do this without certain flags/dependencies):
import { createReadStream } from 'fs';
I do it like this:
import { promises as fs, createReadStream } from "fs"
await fs.mkdir("path...");
const buff = createReadStream("path ...")
Your question is a bit broad, but I can point you to a helpful package. It doesn't cover nearly all functions, but the fs-extra package automatically makes a lot of fs functions return promises; I definitely suggest it. You can still use the regular fs alongside it. As for fs.createReadStream(), you'll probably want to just wrap what you need in a new Promise().
You can "promisify" it like this:
function streamAsPromise(stream) {
  return new Promise((resolve, reject) => {
    let data = "";
    stream.on("data", chunk => data += chunk);
    stream.on("end", () => resolve(data));
    stream.on("error", error => reject(error));
  });
}

const text = await streamAsPromise(createReadStream('file.txt'));
Notice that await will wait until the whole file has been read before the promise resolves; if the file is very big, that might be a problem.
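As a side note, if the stream carries binary data rather than text, a small variation of the same idea collects Buffer chunks instead of concatenating strings (a sketch):
function streamAsBufferPromise(stream) {
  return new Promise((resolve, reject) => {
    const chunks = [];
    stream.on("data", chunk => chunks.push(chunk));
    stream.on("end", () => resolve(Buffer.concat(chunks)));
    stream.on("error", error => reject(error));
  });
}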
I am trying to read the contents of several files in Node.js using promises. Since the standard fs module does not provide a sufficient promise interface, I decided to use fs-extra instead, which provides pretty much the same functions as the default fs module with an additional promise interface.
Reading the contents of a single file as shown below works as desired and logs the file's contents to the console:
const fse = require('fs-extra')
const filePath = './foo.txt'
fse.readFile(filePath, 'utf8')
  .then(filecontents => {
    return filecontents
  })
  .then(filecontents => {
    console.log(filecontents)
  })
However, I need to handle several files inside a given directory. To do this I need to implement the following steps:
get an array of all files inside the directory using fse.readdir() - done
join each filename with the directory name to get a kind of base file path, using path.join() inside .map() to avoid a manual loop - done
read file contents using fse.readFile() inside another .map()
These three steps are implemented as follows:
const fse = require('fs-extra');
const path = require('path');
const mailDirectory = './mails'
fse.readdir(mailDirectory)
  .then(filenames => {
    return filenames.map(filename => path.join(mailDirectory, filename))
  })
  .then(filepaths => {
    // console.log(filepaths)
    return filepaths
      .map(filepath => fse.readFile(filepath).then(filecontents => {
        return filecontents
      }))
  })
  .then(mailcontents => {
    console.log(mailcontents)
  })
As stated above, steps 1 and 2 work quite nicely. However, I am unable to read the file contents using fse.readFile() inside the last .map(), which results in an
[ Promise { <pending> },
Promise { <pending> },
Promise { <pending> },
Promise { <pending> },
Promise { <pending> } ]
output indicating that the promises are not resolved yet. I assume these unresolved promises are the ones returned by fse.readFile(). However, I am unable to resolve them properly, even though the comparable approach in my very first snippet works like a charm.
How can I solve this issue, and where exactly does it come from? I am a newbie to JS and especially to Node.js.
You have an Array of Promises. You should wait on them using Promise.all():
const fse = require('fs-extra');
const path = require('path');
const mailDirectory = './mails'
fse.readdir(mailDirectory)
  .then(filenames => {
    return filenames.map(filename => path.join(mailDirectory, filename))
  })
  .then(filepaths => {
    // console.log(filepaths)
    return filepaths
      .map(filepath => fse.readFile(filepath).then(filecontents => {
        return filecontents
      }))
  })
  // Promise.all consumes an array of promises, and returns a
  // new promise that will resolve to an array of concrete "answers"
  .then(mailcontents => Promise.all(mailcontents))
  .then(realcontents => {
    console.log(realcontents)
  });
Also, if you don't want an additional dependency on fs-extra, you can use Node 8's util.promisify() to make fs follow a promise-oriented API.
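For example, the same pipeline without fs-extra might look roughly like this (a sketch using util.promisify):
const fs = require('fs');
const path = require('path');
const util = require('util');

const readdir = util.promisify(fs.readdir);
const readFile = util.promisify(fs.readFile);

const mailDirectory = './mails';

readdir(mailDirectory)
  .then(filenames => filenames.map(filename => path.join(mailDirectory, filename)))
  .then(filepaths => Promise.all(filepaths.map(filepath => readFile(filepath, 'utf8'))))
  .then(contents => console.log(contents))
  .catch(err => console.error(err));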