I have this fragment of code in a promise:
try {
  const newFile = fs.createWriteStream(filePath);
  response.pipe(newFile);
  newFile.on('finish', () => {
    newFile.close(resolve());
  });
} catch (err) {
  reject(err);
}
response is the result of the get method from the http module, and
filePath is a string like /a/b/c/file.mp3, where the folder /a/b/c does not exist.
Instead of the error being caught, the next line after this fragment is executed and it then crashes with:
ENOENT: no such file or directory, open '/a/b/c/file.mp3' at WriteStream.onerror ... at WriteStream.emit ... at lazyFs.open ... at FSReqWrap.oncomplete
Why is it behaving in this way?
The code works fine if filePath is a valid path.
A writable stream is asynchronous, so its errors are emitted after the surrounding try...catch has already finished; you can't catch them that way. Listen for the stream's error event instead.
const newFile = fs.createWriteStream(filePath);
response.pipe(newFile);
newFile.on('finish', () => {
  newFile.close(resolve); // pass resolve as the close callback rather than calling it immediately
});
newFile.on('error', reject);
or more verbosely
newFile.on('error', exception => {
  reject(exception);
});
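Putting the two pieces together, here is a minimal sketch of the whole download wrapped in a Promise; the http.get call and the downloadTo name are assumptions for illustration, not part of the original code:

const fs = require('fs');
const http = require('http');

// Hypothetical helper for illustration: resolves once the file is fully
// written, rejects on request, response, or write-stream errors.
function downloadTo(url, filePath) {
  return new Promise((resolve, reject) => {
    http.get(url, (response) => {
      const newFile = fs.createWriteStream(filePath);
      response.pipe(newFile);
      newFile.on('finish', () => newFile.close(resolve));
      newFile.on('error', reject); // e.g. ENOENT when the folder does not exist
      response.on('error', reject);
    }).on('error', reject);
  });
}

With this shape, a missing directory rejects the promise instead of crashing the process.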
I'm reading an index.html file to scrape the <style> tag content and save it as a file called fonts.css.
This is my style:
<style>#font-face{font-family:Averti;src:url(https://services.serving-sys.com/HostingServices/custdev/site-140253/Averti/Averti-Bold.woff) format('truetype');font-weight:700;font-style:normal}#font-face{font-family:Averti-Light;src:url(https://services.serving-sys.com/HostingServices/custdev/site-140253/Averti_Webfonts/Averti-Light.woff) format('truetype');font-weight:400;font-style:normal}</style>
There are no major errors with the function, but console.log is showing me this:
{ [Error: ENOENT: no such file or directory, open './dist/css/fonts.css']
errno: -2,
code: 'ENOENT',
syscall: 'open',
path: './dist/css/fonts.css' }
I'm not sure if that is right as the file is not even created yet.
See my function below and let me know what I am missing.
Thank you in advance.
const fs = require('fs')
const cheerio = require('cheerio')

async function createStyle() {
  var jsonObject = []
  setTimeout(function () {
    fs.readFile('./dist/index.html', 'utf8', function (err, html) {
      if (!err) {
        const $ = cheerio.load(html)
        var cssScript = $('head > style').map((i, x) => x.children[0])
          .filter((i, x) => x && x.data.match(/#font-face/)).get(0);
        jsonObject.push(cssScript)
        exportStyle(jsonObject)
      }
    })
  }, 2000);

  async function exportStyle(_json) {
    const stylePromise = new Promise(function (resolve, reject) {
      fs.writeFile('./dist/css/fonts.css', _json, err => {
        if (err) {
          reject();
        } else {
          resolve();
          console.log('Created: fonts.css');
        }
        console.log(err);
      });
    });

    (async function () {
      try {
        await stylePromise;
      } catch (err) {
        console.log(err);
      }
    })();
  }
}
I created the css and js folders inside the ./dist folder and the code worked like a charm. Without those folders the error message appears in the console, because fs.writeFile does not create missing directories for you.
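If you'd rather not create the folders by hand, a minimal sketch (assuming Node 10.12+, where fs.mkdir accepts the recursive option) is to create the directory just before writing; cssContent is a hypothetical variable standing in for whatever string you extracted with cheerio:

const fs = require('fs');
const path = require('path');

const target = './dist/css/fonts.css';

// Create ./dist/css (and any missing parents), then write the file.
// cssContent is whatever string you scraped (hypothetical variable for this sketch).
fs.mkdir(path.dirname(target), { recursive: true }, (mkdirErr) => {
  if (mkdirErr) return console.log(mkdirErr);
  fs.writeFile(target, cssContent, (writeErr) => {
    console.log(writeErr || 'Created: fonts.css');
  });
});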
I have been struggling with various FTP Node modules to try and get anything working in AWS Lambda. The best and most popular seems to be "Basic-FTP", which also supports async/await. But I just cannot get it to download files when any code is added beneath the FTP function.
I don't want to move the fs functions inside the FTP async function, because I need to work out what causes the break when any code is added below it, and I also have other bits of code to add later that work with the downloaded file and its contents:
FTP SUCCESS - when the async function is used on its own, with no fs code beneath it
FTP FAILURE - when the fs readdir/readFile functions, or any other code, are added below:
ERROR Error: ENOENT: no such file or directory, open '/tmp/document.txt'
https://github.com/patrickjuchli/basic-ftp
const ftp = require("basic-ftp");
const fs = require("fs");

var FileNameWithExtension = "document.txt";
var ftpTXT;

exports.handler = async (event, context, callback) => {

  example();

  async function example() {
    const client = new ftp.Client();
    //client.ftp.verbose = true;
    try {
      await client.access({
        host: host,
        user: user,
        password: password,
        //secure: true
      });
      console.log(await client.list());
      await client.download(fs.createWriteStream('/tmp/' + FileNameWithExtension), FileNameWithExtension);
    }
    catch (err) {
      console.log(err);
    }
    client.close();
  }

  // Read the contents of the /tmp/ directory to check the FTP download was successful
  fs.readdir("/tmp/", function (err, data) {
    if (err) {
      return console.error("There was an error listing the /tmp/ contents.");
    }
    console.log('Contents of AWS Lambda /tmp/ directory: ', data);
  });

  // Read the TXT file and convert it into string format
  fs.readFile('/tmp/' + FileNameWithExtension, 'utf8', function (err, data) {
    if (err) throw err;
    ftpTXT = data;
    console.log(ftpTXT);
  });

  // Do other Node.js coding with the downloaded txt file and its contents
};
The problem is that you are getting lost when creating an async function inside your handler. Since example() is async, it returns a Promise, but you don't await it, so as coded it is a fire-and-forget call. Also, your Lambda is terminated before your callbacks are triggered, so even if the download succeeded you would not be able to see it.
I suggest you wrap your callbacks in Promises so you can easily await on them from your handler function.
I have managed to make it work. I used https://dlptest.com/ftp-test/ for testing, so change the credentials accordingly. Also note that I upload the file myself first, so if you want to replicate this example, just create a readme.txt at the root of your project. If you already have a readme.txt file on your FTP server, just delete the line where it uploads the file.
Here's a working example:
const ftp = require("basic-ftp");
const fs = require("fs");

const FileNameWithExtension = "readme.txt";

module.exports.hello = async (event) => {
  const client = new ftp.Client();
  try {
    await client.access({
      host: 'ftp.dlptest.com',
      user: 'dlpuser@dlptest.com',
      password: 'puTeT3Yei1IJ4UYT7q0r'
    });
    console.log(await client.list());
    await client.upload(fs.createReadStream(FileNameWithExtension), FileNameWithExtension);
    await client.download(fs.createWriteStream('/tmp/' + FileNameWithExtension), FileNameWithExtension);
  }
  catch (err) {
    console.log('logging err');
    console.log(err);
  }
  client.close();

  console.log(await readdir('/tmp/'));
  console.log(await readfile('/tmp/', FileNameWithExtension));

  return {
    statusCode: 200,
    body: JSON.stringify({ message: 'File downloaded successfully' })
  };
};
const readdir = dir => {
  return new Promise((res, rej) => {
    fs.readdir(dir, function (err, data) {
      if (err) {
        return rej(err);
      }
      return res(data);
    });
  });
};

const readfile = (dir, filename) => {
  return new Promise((res, rej) => {
    fs.readFile(dir + filename, 'utf8', function (err, data) {
      if (err) {
        return rej(err);
      }
      return res(data);
    });
  });
};
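As a side note, on Node 10+ the two hand-rolled wrappers above can be replaced by the built-in promise API. A minimal sketch, where listAndRead is just a name for this illustration:

const fsp = require('fs').promises;

// Equivalent to the readdir/readfile wrappers above, to be awaited
// from inside the async handler.
async function listAndRead(dir, filename) {
  console.log('Contents of AWS Lambda /tmp/ directory: ', await fsp.readdir(dir));
  return fsp.readFile(dir + filename, 'utf8');
}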
The Lambda's output and the complete CloudWatch logs show the download working. My test file contains nothing but a 'hello' inside it; you can see that in the logs.
Do keep in mind that, in Lambda functions, you have a 512 MB limit on what you can download to /tmp. You can see the limits in the docs.
I'm trying to write a small app that installs some files and modules in a new folder, but I keep getting this error:
{ Error: ENOENT: no such file or directory, uv_chdir
at process.chdir (/home/aboardwithabag/LaunchProject/node_modules/graceful-fs/polyfills.js:20:9)
at cd (/home/aboardwithabag/LaunchProject/index.js:26:13)
Below is my code. Can someone help me out?
// node LaunchProject projectName
// Installs a server, node modules, and index page.
// Not working due to issues with chdir.

const cp = require('child_process');
const fse = require('fs-extra');
// const path = require('path');

const project = process.argv[2];
let server = "";
let home = "";

function make(cb) {
  fse.mkdirs(project, function (err) {
    if (err) {
      console.error(err);
    }
  });
  cb;
}

function cd(cb) {
  try {
    process.chdir('/' + project);
    cb;
  } catch (err) {
    console.error(err);
    return;
  }
}

function install(cb) {
  cp.exec('npm install express', function (err) {
    if (err) {
      console.error(err);
    } else {
      console.log('Express Installed.');
      cp.exec('npm install ejs', function (err) {
        if (err) {
          console.error(err);
        } else {
          console.log('Ejs Installed.');
          fse.outputFile('index.js', server);
          fse.outputFile('public/index.html', home);
        }
      });
    }
  });
  cb;
}

make(cd(install(console.log(project + ' created.'))));
Unless the folder you assign to the project variable is located at the root of your drive, the line below will give this error (the uv_chdir in the message is the internal libuv call behind process.chdir, not your folder name):

process.chdir('/'+project);

Make sure you pass the correct path in the program arguments (in this case argv[2]), or remove the leading '/' and make the path relative.
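For example, a small sketch of the relative version, resolving the new folder against the current working directory (path.resolve is used here purely for illustration):

const path = require('path');

// Change into the folder that was just created next to where the script
// was started, instead of looking for it at the filesystem root.
process.chdir(path.resolve(process.cwd(), project));
console.log('Now in: ' + process.cwd());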
It seems there are some issues with this code.
cb callbacks provided as function arguments need to be called not after the async calls, but inside the callbacks of these calls. For example:
function make(cb) {
  fse.mkdirs(project, function (err) {
    if (err) {
      console.error(err);
    }
    cb();
  });
}
The last call chain make(cd(install(console.log(project + ' created.')))); would only work with synchronous calls in reverse order, and only if each call returned the callback the outer function needs.
That is why your new dir is not ready when you try to use it: your async functions do not actually wait for each other.
Also, you never call your callbacks as cb(); you just mention them as cb. You should call them.
With minimal changes, your code can be refactored this way:
'use strict';

const cp = require('child_process');
const fse = require('fs-extra');

const project = process.argv[2];
let server = '';
let home = '';

make(cd, install, () => { console.log(project + ' created.'); });

function make(cb1, cb2, cb3) {
  fse.mkdirs(project, (err) => {
    if (err) {
      console.error(err);
    }
    cb1(cb2, cb3);
  });
}

function cd(cb1, cb2) {
  try {
    process.chdir('/' + project);
    cb1(cb2);
  } catch (err) {
    console.error(err);
  }
}

function install(cb1) {
  cp.exec('npm install express', (err) => {
    if (err) {
      console.error(err);
    } else {
      console.log('Express Installed.');
      cp.exec('npm install ejs', (err) => {
        if (err) {
          console.error(err);
        } else {
          console.log('Ejs Installed.');
          fse.outputFile('index.js', server);
          fse.outputFile('public/index.html', home);
          cb1();
        }
      });
    }
  });
}
But it is rather brittle and unnecessarily complicated in this form. Maybe it would be simpler to nest your functions directly inside one another.
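For comparison, a rough async/await sketch of the same flow, assuming fs-extra's promise-returning API (when no callback is passed) and util.promisify for child_process.exec; the launchProject name is just for this sketch, so treat it as an outline rather than a drop-in replacement:

const util = require('util');
const cp = require('child_process');
const fse = require('fs-extra');

const exec = util.promisify(cp.exec);
const project = process.argv[2];
let server = '';
let home = '';

// Sketch only: each step waits for the previous one to finish.
async function launchProject() {
  await fse.mkdirs(project);        // fs-extra returns a promise when no callback is given
  process.chdir(project);           // relative to the current working directory
  await exec('npm install express');
  console.log('Express Installed.');
  await exec('npm install ejs');
  console.log('Ejs Installed.');
  await fse.outputFile('index.js', server);
  await fse.outputFile('public/index.html', home);
  console.log(project + ' created.');
}

launchProject().catch(console.error);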
When I used PM2, I got this error: "no such file or directory, uv_chdir".
The fix was:
First, use 'pm2 delete' to delete the old process.
Second, use 'pm2 start'; then it works.
PS: just changing your code, or using 'pm2 reload' or 'pm2 restart', will not fix it.
For more detail, see https://blog.csdn.net/u013934914/article/details/51145134
When I try to load the following package in Meteor, https://github.com/vsivsi/meteor-job-collection, it downloads 100% and gets extracted, but at load time it throws the following error:
{ [Error: ENOTEMPTY: directory not empty, rmdir 'C:\Users\LALITS~1\AppData\Local\Temp\mt-16riklk\npm\job\node_modules']
  errno: -4051,
  code: 'ENOTEMPTY',
  syscall: 'rmdir',
  path: 'C:\\Users\\LALITS~1\\AppData\\Local\\Temp\\mt-16riklk\\npm\\job\\node_modules' }
I am using windows 8.1 64 bit.
I have tried deleting the folder manually, but it creates a new one again and throws the same error. Can anyone tell me what the problem is? Am I missing something?
Thanks in advance.
Your issue looks like this known Meteor bug:
https://github.com/meteor/meteor/issues/8663. This bug occurs under Windows when updating to the next Meteor version.
Maybe you can try the proposed solution, which is to edit the following file:
C:\Users\[yourName]\AppData\Local\.meteor\packages\meteor-tool\[yourMeteorVersion]\mt-os.windows.x86_32\tools\fs\files.js
...and replace functions files.rm_recursive_async and files.rm_recursive with this code:
files.rm_recursive_async = function (path) {
  return new Promise(function (resolve, reject) {
    rimraf(files.convertToOSPath(path), function (err) {
      err && console.log(err);
      resolve();
      //return err ? reject(err) : resolve();
    });
  });
};

// Like rm -r.
files.rm_recursive = Profile("files.rm_recursive", function (path) {
  try {
    rimraf.sync(files.convertToOSPath(path));
  } catch (e) {
    if (e.code === "ENOTEMPTY" && canYield()) {
      files.rm_recursive_async(path).await();
      return;
    }
    console.log(e);
    //throw e;
  }
});
I'm using the excellent Request library for downloading files in Node for a small command line tool I'm working on. Request works perfectly for pulling in a single file, no problems at all, but it's not working for ZIPs.
For example, I'm trying to download the Twitter Bootstrap archive, which is at the URL:
http://twitter.github.com/bootstrap/assets/bootstrap.zip
The relevant part of the code is:
var fileUrl = "http://twitter.github.com/bootstrap/assets/bootstrap.zip";
var output = "bootstrap.zip";

request(fileUrl, function (err, resp, body) {
  if (err) throw err;
  fs.writeFile(output, body, function (err) {
    console.log("file written!");
  });
});
I've tried setting the encoding to "binary" too but no luck. The actual zip is ~74KB, but when downloaded through the above code it's ~134KB and on double clicking in Finder to extract it, I get the error:
Unable to extract "bootstrap" into "nodetest" (Error 21 - Is a directory)
I get the feeling this is an encoding issue but not sure where to go from here.
Yes, the problem is with encoding. When you wait for the whole transfer to finish, body is coerced to a string by default. You can tell request to give you a Buffer instead by setting the encoding option to null:
var fileUrl = "http://twitter.github.com/bootstrap/assets/bootstrap.zip";
var output = "bootstrap.zip";

request({ url: fileUrl, encoding: null }, function (err, resp, body) {
  if (err) throw err;
  fs.writeFile(output, body, function (err) {
    console.log("file written!");
  });
});
Another more elegant solution is to use pipe() to point the response to a file writable stream:
request('http://twitter.github.com/bootstrap/assets/bootstrap.zip')
  .pipe(fs.createWriteStream('bootstrap.zip'))
  .on('close', function () {
    console.log('File written!');
  });
A one liner always wins :)
pipe() returns the destination stream (the WriteStream in this case), so you can listen to its close event to get notified when the file was written.
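As with the write-stream question above, the one-liner swallows failures unless you add error listeners. A minimal sketch with handlers on both the request and the file stream (same URL, nothing else assumed):

request('http://twitter.github.com/bootstrap/assets/bootstrap.zip')
  .on('error', function (err) { console.error('request failed:', err); })   // network/DNS errors
  .pipe(fs.createWriteStream('bootstrap.zip'))
  .on('error', function (err) { console.error('write failed:', err); })     // e.g. ENOENT, EACCES
  .on('close', function () { console.log('File written!'); });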
I was looking for a function that requests a zip and extracts it without creating any file on my server. Here is my TypeScript snippet; it uses the JSZip module and Request:
// Assumes the 'request' and 'jszip' packages are installed.
import * as request from 'request';
import * as JSZip from 'jszip';

let bufs: any = [];
let buf: Uint8Array;

request
  .get(url)
  .on('end', () => {
    buf = Buffer.concat(bufs);
    JSZip.loadAsync(buf).then((zip) => {
      // zip.files contains a list of files;
      // check the JSZip documentation.
      // Example of getting a text file: zip.file("bla.txt").async("text").then...
    }).catch((error) => {
      console.log(error);
    });
  })
  .on('error', (error) => {
    console.log(error);
  })
  .on('data', (d) => {
    bufs.push(d);
  });
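To give an idea of what can go inside that then() callback, here is a small sketch that lists every entry and reads one of them as text; the readme.txt name is only an example:

JSZip.loadAsync(buf).then(async (zip) => {
  // List every entry contained in the archive.
  zip.forEach((relativePath, entry) => {
    console.log(relativePath, entry.dir ? '(dir)' : '(file)');
  });

  // Read a single entry as text, if it exists (readme.txt is a hypothetical name).
  const readme = zip.file('readme.txt');
  if (readme) {
    console.log(await readme.async('text'));
  }
});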