EISDIR: illegal operation on a directory, read - javascript

When I try to upload an image into a bucket on the server side, I get the error above. I checked with the debugger that the file parameter contains the file's path and not the folder's path. Here's the code:
function uploadFile(file, directory) {
  return new Promise((resolve, reject) => {
    try {
      const bucket = storage.bucket(BUCKET_NAME);
      const bucketFile = bucket.file(directory ? `${directory}/${file.originalname}` : file.originalname);
      const blobStream = bucketFile.createWriteStream();
      blobStream.on('error', err => {
        const status = err.status || 500;
        console.log(err, status);
        reject(err);
      });
      blobStream.on('finish', async () => {
        // The public URL can be used to directly access the file via HTTP.
        await bucketFile.makePublic();
        const publicUrl = `https://storage.googleapis.com/${bucket.name}/${bucketFile.name}`;
        resolve(publicUrl);
      });
      blobStream.end(file.buffer);
    } catch (err) {
      reject(err);
    }
  });
}
Can you help me?

It turned out the path of the file was right, but the path to the credentials was wrong.
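For anyone hitting the same symptom: this EISDIR can appear when the Storage client is pointed at a directory instead of the service-account JSON key file. A minimal sketch of the client setup, with a placeholder project ID and key path (not from the original post):
const { Storage } = require('@google-cloud/storage');

const storage = new Storage({
  projectId: 'my-project-id',                 // placeholder
  keyFilename: './keys/service-account.json'  // must be the key file itself, not its folder
});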

Related

How to delete a zip file after sending the response in Express

I want to delete the zip folder after the response has been sent, so I am looking for an alternative solution.
Here is my code (it is a GET request):
exports.Download = async (req, res) => {
  try {
    var zp = new admz();
    zp.addLocalFolder(`./download/${folder}`);
    const file_after_download = 'downloaded.zip';
    const data = zp.toBuffer();
    res.set('Content-Type', 'application/octet-stream');
    res.set('Content-Disposition', `attachment; filename=${file_after_download}`);
    res.set('Content-Length', data.length);
    return res.send(data);
    // HERE I want to execute this code
    let dir = `./download/${folder}`;
    if (fse.existsSync(dir)) {
      fse.rmdirSync(dir, { recursive: true });
    }
  } catch (err) {
    console.log(err);
    return res.render('pages/404');
  }
};
Update
If I send the data without return (res.send(data);), I get this error: Error [ERR_HTTP_HEADERS_SENT]: Cannot set headers after they are sent to the client.
If I put return res.send(data); at the end of the block, the downloaded zip file is empty, because the folder has already been deleted.
From the Express docs, you can use the res.download() function, which takes a callback that is executed once the download is done.
res.download(filePath, 'yourFileName', function (err) {
  if (err) {
    next(err);
  } else {
    console.log('Delete:', filePath);
  }
});
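Applied to the original handler, a minimal sketch could look like the following. It assumes admz is the adm-zip constructor and fse is fs-extra, as in the question; the route parameter and temporary zip path are assumptions for illustration:
exports.Download = async (req, res, next) => {
  const folder = req.params.folder;           // assumption: folder name comes from the route
  const dir = `./download/${folder}`;
  const zipPath = `./download/${folder}.zip`; // hypothetical temp location for the archive
  try {
    const zp = new admz();
    zp.addLocalFolder(dir);
    zp.writeZip(zipPath); // write the archive to disk so res.download can stream it
    res.download(zipPath, 'downloaded.zip', (err) => {
      // this callback runs once the transfer has finished (or failed)
      if (err) return next(err);
      fse.removeSync(dir);     // now it is safe to delete the source folder
      fse.removeSync(zipPath); // and the temporary zip
    });
  } catch (err) {
    console.log(err);
    return res.render('pages/404');
  }
};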

Zip image stream using archiver and send as express response

This is on Node/Express/TypeScript. I'm trying to get an image from my file system, stream it to a zip file, and then stream that zip file to the client. Every step has to be streamed, since this will expand to zipping up multiple files, which need to be streamed to the client as a zip.
I have the following code:
import express, { Application, Request, Response } from "express";
import fs from "fs";
import stream from "stream";
import archiver from "archiver";

const app: Application = express(); // app setup (omitted in the original snippet)

app.get("/images", async (req: Request, res: Response) => {
  const r = fs.createReadStream("appicon.png");
  const ps = new stream.PassThrough();
  // stream the image
  stream.pipeline(r, ps, (err) => {
    if (err) {
      console.log(err);
      return res.sendStatus(400);
    }
  });
  // zip the image and send it
  let archive = archiver("zip");
  archive.on("end", () => {
    console.log(archive.pointer() + " total bytes");
    console.log("archiver finalized");
  });
  archive.on("error", (err) => {
    return res.status(500).send({ message: err });
  });
  res.attachment("output.zip");
  ps.pipe(archive);
  archive.pipe(res);
  archive.finalize();
});
However, when I access my /images route, I get an output.zip file which is empty.
I feel like I'm messing up the order of my pipes somehow.
What am I missing?
I figured out the issue. Here is the code that works:
app.get("/images", async (req: Request, res: Response) => {
const r = fs.createReadStream("appicon.png");
const ps = new stream.PassThrough();
stream.pipeline(
r,
ps,
(err) => {
if (err) {
console.log(err);
return res.sendStatus(400);
}
}
);
//r.pipe(ps); // alternative way to do it without pipeline
// zip the image and send it
let archive = archiver("zip");
archive.on("end", () => {
console.log(archive.pointer() + " total bytes");
console.log("archiver finalized");
})
archive.on('error', (err) => {
return res.status(500).send({
message: err
});
})
// name the output file
res.attachment("output.zip");
// pipe the zip to response
archive.pipe(res);
// add the image from stream to archive
archive.append(ps, {name: "image.png"});
archive.finalize();
});
I had to use archive.append(ps, { name: "image.png" }); to add my image stream to the zip archive. Piping the PassThrough straight into the archiver instance doesn't register it as a named zip entry, which is why the archive came out empty.
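Since the route is meant to grow to multiple files, the same pattern extends by appending one named stream per file; a sketch with a hypothetical file list:
const files = ["appicon.png", "banner.png"]; // hypothetical list of files to zip
for (const name of files) {
  archive.append(fs.createReadStream(name), { name });
}
archive.finalize(); // finalize only after every entry has been appended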

FTP in AWS Lambda - Issues Downloading Files (Async/Await)

I have been struggling with various FTP Node modules to try and get anything working in AWS Lambda. The best and most popular seems to be "Basic-FTP", which also supports async/await. But I just cannot get it to download files when any code is added beneath the FTP function.
I don't want to add the fs functions within the FTP async function, as I need to work out what causes the break when any code is added below; I also have other bits of code to add later that work with the downloaded file and its contents:
FTP SUCCESS - When the async function is used on its own, with no fs code beneath it
FTP FAILURE - When the fs readdir/readFile functions or any other code is added below
ERROR Error: ENOENT: no such file or directory, open '/tmp/document.txt'
https://github.com/patrickjuchli/basic-ftp
const ftp = require("basic-ftp");
const fs = require("fs");

var FileNameWithExtension = "document.txt";
var ftpTXT;

exports.handler = async (event, context, callback) => {
  example();

  async function example() {
    const client = new ftp.Client();
    // client.ftp.verbose = true;
    try {
      await client.access({
        host: host,
        user: user,
        password: password,
        // secure: true
      });
      console.log(await client.list());
      await client.download(fs.createWriteStream('/tmp/' + FileNameWithExtension), FileNameWithExtension);
    }
    catch (err) {
      console.log(err);
    }
    client.close();
  }

  // Read the contents of the /tmp/ directory to check the FTP was successful
  fs.readdir("/tmp/", function (err, data) {
    if (err) {
      return console.error("There was an error listing the /tmp/ contents.");
    }
    console.log('Contents of AWS Lambda /tmp/ directory: ', data);
  });

  // Read the TXT file and convert it into string format
  fs.readFile('/tmp/' + FileNameWithExtension, 'utf8', function (err, data) {
    if (err) throw err;
    ftpTXT = data;
    console.log(ftpTXT);
  });

  // Do other Node.js coding with the downloaded txt file and its contents
};
The problem is that you are getting lost when creating an async function inside your handler. Since example() is async, it returns a Promise, but you never await it, so as written it is a fire-and-forget call. On top of that, your Lambda is terminated before your callbacks are triggered, so even if the download succeeded you would not be able to see it.
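A minimal illustration of the difference:
// fire and forget: the handler can return before the FTP work finishes
example();

// awaited: the handler stays alive until the Promise settles
await example();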
I suggest you wrap your callbacks in Promises so you can easily await them from your handler function.
I have managed to make it work. I used https://dlptest.com/ftp-test/ for testing, so change the connection details accordingly. Also note that I upload the file myself first, so if you want to replicate this example, just create a readme.txt in the root of your project; if you already have a readme.txt on your FTP server, delete the line that uploads it.
Here's a working example:
const ftp = require("basic-ftp");
const fs = require("fs");

const FileNameWithExtension = "readme.txt";

module.exports.hello = async (event) => {
  const client = new ftp.Client();
  try {
    await client.access({
      host: 'ftp.dlptest.com',
      user: 'dlpuser@dlptest.com',
      password: 'puTeT3Yei1IJ4UYT7q0r'
    });
    console.log(await client.list());
    await client.upload(fs.createReadStream(FileNameWithExtension), FileNameWithExtension);
    await client.download(fs.createWriteStream('/tmp/' + FileNameWithExtension), FileNameWithExtension);
  }
  catch (err) {
    console.log('logging err');
    console.log(err);
  }
  client.close();

  console.log(await readdir('/tmp/'));
  console.log(await readfile('/tmp/', FileNameWithExtension));

  return {
    statusCode: 200,
    body: JSON.stringify({ message: 'File downloaded successfully' })
  };
};
const readdir = dir => {
  return new Promise((res, rej) => {
    fs.readdir(dir, function (err, data) {
      if (err) {
        return rej(err);
      }
      return res(data);
    });
  });
};

const readfile = (dir, filename) => {
  return new Promise((res, rej) => {
    fs.readFile(dir + filename, 'utf8', function (err, data) {
      if (err) {
        return rej(err);
      }
      return res(data);
    });
  });
};
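On newer Node.js runtimes these hand-rolled wrappers can be replaced by the built-in promise-based fs API; a sketch with the same behavior:
const fsp = require('fs').promises;

console.log(await fsp.readdir('/tmp/'));
console.log(await fsp.readFile('/tmp/' + FileNameWithExtension, 'utf8'));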
The output of the Lambda function and the complete CloudWatch logs (screenshots omitted here) confirm the download: my file contains nothing but a 'hello' inside it, and you can see it in the logs.
Do keep in mind that in Lambda functions you have a 512 MB limit when downloading anything to /tmp. You can see the limits in the docs.

Download file via FTP, write to /tmp/ and output .txt contents to the console with AWS Lambda

I am using just a single Node package, basic-ftp, to try to download a TXT file and write its contents to the console. Further down the line I will be editing the text, so I will need to use fs. I'm just struggling to work with the output of createWriteStream from within the FTP program.
Can anyone help me write a TXT file to /tmp/ within AWS Lambda, and show the correct syntax to open and edit the file after createWriteStream has been used?
var fs = require('fs');
const ftp = require("basic-ftp");
var path = require('path');

exports.handler = (event, context, callback) => {
  var fullPath = "/home/example/public_html/_uploads/15_1_5c653e6f6780f.txt"; // File Name FULL PATH -------
  const extension = path.extname(fullPath); // Used to calculate filenames below
  const wooFileName = path.basename(fullPath, extension); // Uploaded filename with no path or extension eg. filename
  const myFileNameWithExtension = path.basename(fullPath); // Uploaded filename with the file extension eg. filename.txt
  const FileNameWithExtension = path.basename(fullPath); // Uploaded filename with the file extension eg. filename.txt

  example();

  async function example() {
    const client = new ftp.Client();
    client.ftp.verbose = true;
    try {
      await client.access({
        host: "XXXX",
        user: "XXXX",
        password: "XXXX",
        // secure: true
      });
      await client.download(fs.createWriteStream('./tmp/' + myFileNameWithExtension), myFileNameWithExtension);
    }
    catch (err) {
      console.log(err);
    }
    client.close();
  }

  // Read the contents of the /tmp directory to check it's empty
  fs.readdir("/tmp/", function (err, data) {
    console.log(data);
    console.log('Contents of AWS Lambda /tmp/ directory');
  });

  /*
  downloadedFile = fs.readFile('./tmp/' + myFileNameWithExtension)
  console.log(downloadedFile)
  console.log("Raw text:\n" + downloadedFile.Body.toString('ascii'));
  */
};
Pretty sure your fs.createWriteStream() has to use an absolute path to /tmp in Lambdas. Your actual working directory is /var/task, not /, so a relative './tmp/...' points somewhere that doesn't exist.
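A quick illustration, reusing the question's myFileNameWithExtension variable:
console.log(process.cwd()); // '/var/task' inside the Lambda container
// './tmp/' + myFileNameWithExtension would resolve under /var/task and fail
const target = '/tmp/' + myFileNameWithExtension; // absolute path into the writable /tmp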
Also, if you're using fs.createWriteStream() you'll need to wait for the finish event before reading from the file. Something like this...
async function example() {
  var finalData = '';
  const client = new ftp.Client();
  client.ftp.verbose = true;
  try {
    await client.access({
      host: "XXXX",
      user: "XXXX",
      password: "XXXX",
      // secure: true
    });
    let writeStream = fs.createWriteStream('/tmp/' + myFileNameWithExtension);
    await client.download(writeStream, myFileNameWithExtension);
    finalData = await new Promise((resolve, reject) => {
      writeStream
        .on('finish', () => {
          fs.readFile("/tmp/" + myFileNameWithExtension, function (err, data) {
            if (err) {
              reject(err);
            } else {
              console.log('Contents of AWS Lambda /tmp/ directory', data);
              resolve(data);
            }
          });
        })
        .on('error', (err) => {
          console.log(err);
          reject(err);
        });
    });
  }
  catch (err) {
    console.log(err);
  }
  client.close();
  return finalData;
}
You'll also need to access the file using fs.readFile(). What you were using, fs.readdir(), gives you a list of files in the directory, not the file's contents.
If you wanted to use readdir() you could do it like this, but as you can see it is redundant in your case. To handle errors I would suggest just handling the error event on the initial createWriteStream() instead of adding this extra overhead (added to the previous example)...
writeStream
  .on('finish', () => {
    fs.readdir('/tmp', (err, files) => {
      let saved = files.find(file => file === myFileNameWithExtension);
      fs.readFile("/tmp/" + saved, function (err, data) {
        if (err) throw new Error();
        console.log(data);
        console.log('Contents of AWS Lambda /tmp/ directory');
      });
    });
  })
  .on('error', (err) => {
    console.log(err);
    throw new Error();
  });
NOTE: Please log out the result of saved; I can't remember if the files array holds absolute or relative paths.

Electron Dialog not saving the file

Electron version: 1.3.3
Operating system: Ubuntu 14.04
I want to save an XML object into a .xml file with Electron. I tried this:
const {dialog} = require("electron").remote;
dialog.showSaveDialog(myObj)
A new window opens and I fill in the name of the file, but nothing gets saved.
In newer versions of Electron, it's recommended to use the path returned from dialog.showSaveDialog to get the file path (which is result.filePath in the code below):
dialog.showSaveDialog({}).then(result => {
  const filename = result.filePath;
  if (filename === undefined) {
    alert('the user clicked the btn but didn\'t create a file');
    return;
  }
  // content: the XML string you want to save
  fs.writeFile(filename, content, (err) => {
    if (err) {
      alert('an error occurred with file creation ' + err.message);
      return;
    }
    alert('WE CREATED YOUR FILE SUCCESSFULLY');
  });
  alert('we End');
}).catch(err => {
  alert(err);
});
The showSaveDialog() API does not save the file for you. You must use the returned path and use Node to save your file.
const { dialog } = require('electron').remote;
const fs = require('fs');

dialog.showSaveDialog({}).then((result) => {
  fs.writeFile(result.filePath, MyFileData, (err) => {
    // file saved or err
  });
}).catch((err) => {
  // err
});
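The resolved result also carries a canceled flag, so a slightly safer guard looks like this (a sketch; MyFileData again stands in for the data to save):
dialog.showSaveDialog({}).then(({ canceled, filePath }) => {
  if (canceled || !filePath) return; // the user dismissed the dialog
  fs.writeFile(filePath, MyFileData, (err) => {
    if (err) console.error(err);
  });
});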
