NodeJS download file from AWS S3 Bucket - javascript

I'm trying to make an endpoint in NodeJS/Express for downloading content from my AWS S3 Bucket.
It works well: I can download the file on the client side, but I can also see the stream preview in the Network tab, which is annoying...
QUESTION
I'm wondering if what I'm doing is correct and a good practice.
I'd also like to know if it's normal to see the output stream in the Network tab.
How should I properly send a file from S3 to my client application using NodeJS/Express?
I'm pretty sure requests on other websites don't let you preview the content, showing instead a "Fail to load response data".
This is what I do in my NodeJS application to get the stream file from AWS S3:
download(fileId) {
  const fileObjectStream = app.s3
    .getObject({
      Key: fileId
    })
    .createReadStream();

  this.res.set("Content-Type", "application/octet-stream");
  this.res.set(
    "Content-Disposition",
    'attachment; filename="' + fileId + '"'
  );

  fileObjectStream.pipe(this.res);
}
And on the client side I can see this:

I think the issue is with the header:
// This line sets the proper header for the file and makes it downloadable in the client's browser
res.attachment(key);

// This executes the download
s3.getObject(bucketParams)
  .createReadStream()
  .pipe(res);
So the code should look like this (this is what I do in my project: the file is returned with res.attachment, or res.json in case of an error so the client can display the error to the end user):
router.route("/downloadFile").get((req, res) => {
  const query = req.query; // params from client
  const key = query.key; // param from client
  const bucketName = query.bucket; // param from client

  const bucketParams = {
    Bucket: bucketName,
    Key: key
  };

  // I assume you are using the AWS SDK
  const s3 = new AWS.S3({ apiVersion: "2006-03-01" });

  s3.getObject(bucketParams, function(err, data) {
    if (err) {
      // Cannot get the file; err = AWS error response.
      // Return JSON to the client.
      return res.json({
        success: false,
        error: err
      });
    } else {
      res.attachment(key); // sets the correct header (fixes your issue)
      // If all is fine (bucket and file exist), stream the file to the client.
      s3.getObject(bucketParams)
        .createReadStream()
        .pipe(res);
    }
  });
});
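For what it's worth, the same idea can be written with a single getObject call and an error handler on the stream, so a missing key doesn't need the extra lookup. This is only a sketch assuming the AWS SDK v2 and Express; the route and query parameter names are illustrative.

// Sketch only: stream the object once and handle stream errors in place.
const express = require("express");
const AWS = require("aws-sdk");

const router = express.Router();
const s3 = new AWS.S3({ apiVersion: "2006-03-01" });

router.get("/downloadFile", (req, res) => {
  const { bucket, key } = req.query; // illustrative query parameters

  const stream = s3
    .getObject({ Bucket: bucket, Key: key })
    .createReadStream();

  // Errors (missing key, access denied, ...) are emitted on the stream.
  stream.on("error", (err) => {
    if (!res.headersSent) {
      res.status(err.statusCode || 500).json({ success: false, error: err.message });
    } else {
      res.end();
    }
  });

  res.attachment(key); // sets Content-Disposition so the browser downloads the file
  stream.pipe(res);
});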

Related

Backend gets a pdf from a third-party. How can I send it via REST api to my client?

I get a pdf from a third party.
I save the file on S3. I can see the file on S3 and when I open it I can see my PDF.
However, when I pass the pdf to the client and check it in Postman, I get an empty PDF.
Here is my code:
public async getReportFromThirdParty(token) {
  const params = {
    headers: { Authorization: `Bearer ${token}` },
    responseType: "arraybuffer",
  };
  let report = {};
  report = await axios.get(`https://api.thirdparty.com/api/get-pdf`, params);
  return report.data;
}
app.post("/download", async (req, res) => {
  const token = 'abcde-secret-token';
  const pdf = await getReportFromThirdParty(token);
  await saveToS3(pdf); // <---- I checked and it saves the file properly on S3 as PDF
  res.contentType("application/pdf");
  return res.status(200).send(pdf); // <--- this returns an empty pdf file
});
Any ideas?
I'm going to guess your app is a Node Express app, so you likely need to use
res.sendFile();
More info can be found on the Express site.
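A minimal sketch of that suggestion, assuming the PDF is first written to disk so res.sendFile has a path to serve; the local path and file name are illustrative and not part of the original code.

// Hypothetical sketch: persist the fetched PDF locally, then let Express send it.
const path = require("path");
const fs = require("fs/promises");

app.post("/download", async (req, res) => {
  const token = "abcde-secret-token";
  const pdf = await getReportFromThirdParty(token); // ArrayBuffer (responseType: "arraybuffer")

  const filePath = path.join(__dirname, "report.pdf"); // illustrative local path
  await fs.writeFile(filePath, Buffer.from(pdf));

  // res.sendFile sets the Content-Type from the extension and streams the file.
  res.sendFile(filePath);
});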

"Access denied" accessing images in myS3 bucket from my express server

I have some problems accessing my S3 images via GET request from my express server.
I have a Mongo database where I store text information for the items on my webpage and save the image key that I send to my S3 bucket. Now when I try to get all the items and their respective png images, this error comes up:
...aws-sdk\lib\request.js:31
throw err;
^
AccessDenied: Access Denied ...
even though my user authorization in S3 is fine.
Because I need to fetch all the items for a productPage component, I do it like this:
//ROUTER FILE
router.get("/cust/test", async (req, res) => {
  try {
    let tests;
    tests = await Test.find();
    tests.map((t) => {
      const png = t.png;
      const readStream = s3DwnPng(png);
      readStream.pipe(res);
      console.log(png);
    });
    res.status(200).json(tests);
    console.log(tests);
  } catch (err) {
    res.status(500).json(err);
  }
});
//S3 FILE
function s3DwnPng(fileKey) {
  const dwnParams = {
    Bucket: process.env.AWS_BUCKET_NAME,
    Key: `png/${fileKey}`,
  };
  return s3.getObject(dwnParams).createReadStream();
}
exports.s3DwnPng = s3DwnPng;
but this does not work for me.
Could someone help me?
Also, is it worth insisting on accessing the images through my server? I'm considering switching to a public policy with private CORS access to make the load on my server lighter; is it really secure to do so?

Upload stream to azure blob storage

I want to upload an audio file to Azure blob storage.
First I make an HTTP request to a URL to get the file.
Then I want to "directly" save it to Azure blob storage without needing to store it on the server first and then upload it.
Here's my code:
request
  .get(url, {
    auth: {
      bearer: token
    }
  })
  .on("response", async function(response) {
    const res = await blobService.createAppendBlobFromStream(
      "recordings", // container name
      "record.wav",
      response, // should be a stream
      177777, // stream length
      function(err, res) {
        try {
          console.log(res);
        } catch (err) {
          console.log(err);
        }
      }
    );
  });
Actually, when I upload a file to blob storage and check my database, I get an empty file with no data inside; I think I'm not sending the data stream correctly.
What I expect is to get the audio file in blob storage with the data I receive from the GET request.
I also need to specify the stream length, but I don't know how to get it; I put in a random number, but it should be the actual stream length. I have checked the response object but haven't found that information.
I think you can't upload a file directly, so first create a folder in your blob storage, then create a file.
Read the selected file's data and write it into the created file using a file stream or similar.
Here is the upload file code:
app.post('/upload', function(req, res) {
  if (req.files.fileInput) {
    var file = req.files.fileInput,
      name = file.name,
      type = file.mimetype;
    var uploadpath = __dirname + '/uploads/' + name;
    file.mv(uploadpath, function(err) {
      if (err) { res.send("Error Occured!"); }
      else { res.send('Done! Uploading files'); }
    });
  } else {
    res.send("No File selected !");
    res.end();
  }
});
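The snippet above only saves the file locally; a hedged sketch of the follow-up step, assuming the legacy azure-storage SDK, could look like the following. Container and blob names are illustrative, and uploading from a local file also sidesteps the stream-length question, since the SDK can read the size from disk.

// Sketch only: upload the locally saved file with the legacy "azure-storage" SDK.
var azure = require('azure-storage');
var blobService = azure.createBlobService(); // reads AZURE_STORAGE_CONNECTION_STRING

function uploadToBlob(localPath, blobName, done) {
  blobService.createBlockBlobFromLocalFile(
    'recordings',  // container name (illustrative)
    blobName,      // e.g. 'record.wav'
    localPath,     // the path written by file.mv above
    function(err, result) {
      if (err) { return done(err); }
      done(null, result);
    }
  );
}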

Using SSH2 and SFTPStream to stream file from server to AWS S3 Bucket

I'm trying to use the ssh2 module to take a file from a server and add it to an S3 bucket in AWS. I would like to be able to stream the file so that I don't have to have it in memory. I tried the following:
const Client = require('ssh2').Client;
const aws = require('aws-sdk');
const s3 = new aws.S3();

exports.handler = function(event, context, callback) {
  let connSettings = {
    host: event.serverHost,
    port: event.port,
    username: event.username,
    password: event.password
  };

  let conn = new Client();
  conn.on('ready', function() {
    conn.sftp(true, function(err, sftp) {
      if (err) throw err;
      let stream = sftp.createReadStream(filename);

      let putParams = {
        Bucket: s3Bucket,
        Key: s3Key,
        Body: stream
      };

      s3.putObject(putParams, function (err) {
        if (err) throw err;
        console.log("Uploaded!");
      });
    });
  }).connect(connSettings);
};
However, the method sftp.createReadStream(filename) seems to be looking at my local directory and not the server. Other than that, it works.
Is there a way I can stream a file from a server to S3?
I know I could use the sftp.fastGet method to download the file from the server, save it locally, and then upload it to S3. But I would prefer not to have to save the file locally. The s3 SDK accepts a stream, so it would be much more convenient to just stream it.
UPDATE: the method sftp.createReadStream(filename) is correctly reading from the server and not locally. It is the s3.putObject method that is for some reason trying to get the file locally even though I'm giving it a stream.
For some reason the s3.putObject method is looking for the file locally even though I give it a stream. The stream contains the path from the server, but for whatever reason, when s3.putObject reads the stream, it tries to read the file locally.
I fixed this by instead using the s3.upload method.
s3.upload(putParams, function (err) {
  if (err) throw err;
  console.log("Uploaded!");
});
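Putting the fix together, a hedged end-to-end sketch could look like the following; the event fields (filename, s3Bucket, s3Key) are placeholders, as in the original handler.

// Sketch only: stream the remote file over SFTP straight into s3.upload,
// which accepts a readable stream as Body.
const Client = require('ssh2').Client;
const aws = require('aws-sdk');
const s3 = new aws.S3();

exports.handler = function(event, context, callback) {
  const conn = new Client();

  conn.on('ready', function() {
    conn.sftp(function(err, sftp) {
      if (err) return callback(err);

      const stream = sftp.createReadStream(event.filename);

      s3.upload(
        { Bucket: event.s3Bucket, Key: event.s3Key, Body: stream },
        function(err, data) {
          conn.end();
          if (err) return callback(err);
          callback(null, data.Location);
        }
      );
    });
  }).connect({
    host: event.serverHost,
    port: event.port,
    username: event.username,
    password: event.password
  });
};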

Generate Download URL After Successful Upload

I have successfully uploaded files to Firebase's storage via Google Cloud Storage through JS! What I noticed is that, unlike files uploaded directly, the files uploaded through Google Cloud only have a Storage Location URL, which isn't a full URL and therefore cannot be read! I'm wondering if there is a way to generate a full URL on upload for the "Download URL" part of Firebase's actual storage.
Code being used:
var filename = image.substring(image.lastIndexOf("/") + 1).split("?")[0];
var gcs = gcloud.storage();
var bucket = gcs.bucket('bucket-name-here.appspot.com');

request(image).pipe(bucket.file('photos/' + filename).createWriteStream(
  {metadata: {contentType: 'image/jpeg'}}))
  .on('error', function(err) {})
  .on('finish', function() {
    console.log(imagealt);
  });
When using the GCloud client, you want to use getSignedUrl() to download the file, like so:
bucket.file('photos/' + filename).getSignedUrl({
  action: 'read',
  expires: '03-17-2025'
}, function(err, url) {
  if (err) {
    console.error(err);
    return;
  }
  // The file is now available to read from this URL.
  request(url, function(err, resp) {
    // resp.statusCode = 200
  });
});
You can either:
a) create a download URL through the Firebase console, or
b) if you attempt to get the download URL programmatically from a Firebase client, one will be created on the fly for you.
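A minimal sketch of option (b), assuming the Firebase web client SDK (v8-style API); the file path mirrors the upload code above and is illustrative.

// Sketch only: ask a Firebase web client for a download URL on the fly.
var storageRef = firebase.storage().ref();

storageRef.child('photos/' + filename).getDownloadURL()
  .then(function(url) {
    // The file can now be fetched from this URL.
    console.log(url);
  })
  .catch(function(err) {
    console.error(err);
  });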
