"Access denied" accessing images in myS3 bucket from my express server - javascript

I have some problems accessing my S3 images via a GET request from my Express server.
I have a MongoDB database where I store text information for the items on my webpage, along with the key of the image that I send to my S3 bucket. When I try to get all the items and their respective PNG images, I get this error:
...aws-sdk\lib\request.js:31
throw err;
^
AccessDenied: Access Denied ...
even though my user's permissions on S3 are correct.
Because I need to fetch all the items for a productPage component, I do the following:
//ROUTER FILE
router.get("/cust/test", async (req, res) => {
  try {
    let tests;
    tests = await Test.find();
    tests.map((t) => {
      const png = t.png;
      const readStream = s3DwnPng(png);
      readStream.pipe(res);
      console.log(png);
    });
    res.status(200).json(tests);
    console.log(tests);
  } catch (err) {
    res.status(500).json(err);
  }
});
//S3 FILE
function s3DwnPng(fileKey) {
  const dwnParams = {
    Bucket: process.env.AWS_BUCKET_NAME,
    Key: `png/${fileKey}`,
  };
  return s3.getObject(dwnParams).createReadStream();
}
exports.s3DwnPng = s3DwnPng;
but this does not work for me.
Could someone help me?
Also, is it worth insisting on serving the images through my server? I'm considering switching to a public bucket policy with restricted CORS access to lighten the load on my server; is that really secure?
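For context on the code above: an Express response can only carry one body, so piping several image streams and then also calling res.json on the same res cannot work. A common pattern is to return the item metadata as JSON and stream each image from its own route; here is a minimal sketch reusing the s3DwnPng helper (the route path and parameter name are illustrative, not from the original code):

router.get("/cust/test/png/:key", (req, res) => {
  // Stream exactly one object per request.
  const readStream = s3DwnPng(req.params.key);
  readStream.on("error", (err) => {
    // Surface S3 errors such as AccessDenied or NoSuchKey instead of crashing.
    if (!res.headersSent) res.status(500).json(err);
  });
  readStream.pipe(res);
});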

Related

Amazon S3 Fetching Error: "NoSuchKey" However Key Does Exist

I am trying to fetch an S3 object using AWS Amplify Storage:
fetchAvatar = async () => {
  try {
    const imageData = await Storage.get("public/public/us-east-2:/1597842961073/family.jpg")
    console.log(imageData)
  } catch (err) {
    console.log('error fetching avatar: ')
    console.log(err)
  }
}
When I click on the link that imageData provides, I get a NoSuchKey error, even though the key does exist.
I've made sure that the image is public and accessible by everyone, so there shouldn't be any authentication problems. I've also looked at similar issues and made sure there are no spaces or unusual special characters in my image keys. I am kind of stumped on this...
So I figured out the reason, and it has something to do with AWS S3 management. For some reason, every time I upload an image, the folder resets and becomes private again. When I manually make the folders and the image public again, the image renders properly... So I guess it is more of an AWS issue or bug that they need to fix.
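One way to avoid re-publishing objects by hand is to set an ACL at upload time; a minimal sketch with the plain S3 SDK (the bucket name, key, and file name are illustrative, and the bucket must allow public ACLs):

const fs = require("fs");
const AWS = require("aws-sdk");
const s3 = new AWS.S3();

s3.putObject({
  Bucket: "your-bucket-name",          // illustrative
  Key: "public/family.jpg",            // illustrative
  Body: fs.readFileSync("family.jpg"), // file contents as a Buffer
  ACL: "public-read"                   // keep the object publicly readable after upload
}, function(err, data) {
  if (err) console.log(err, err.stack);
  else console.log("uploaded", data.ETag);
});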
I suggest using the JavaScript AWS SDK; you can get an object from the bucket like below:
var params = {
  Bucket: "your-bucket-name",
  Key: "yourFileName.jpg"
};
s3.getObject(params, function(err, data) {
  if (err) console.log(err, err.stack); // an error occurred
  else console.log(data);               // successful response
});
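In the success callback, data.Body contains the object bytes as a Buffer in Node.js; a minimal sketch of forwarding it from an Express handler (res is assumed to be an Express response):

s3.getObject(params, function(err, data) {
  if (err) return res.status(500).json(err);
  res.set("Content-Type", data.ContentType); // e.g. image/jpeg
  res.send(data.Body);                       // data.Body is a Buffer
});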
UPDATE:
You can define your region when you create an S3 instance, like:
const s3 = new S3({
  region: 'eu-central-1',
});

NodeJS download file from AWS S3 Bucket

I'm trying to make an endpoint in NodeJS/Express for downloading content from my AWS S3 Bucket.
It works well: I can download the file on the client side, but I can also see the stream preview in the Network tab, which is annoying...
QUESTION
I'm wondering if what I'm doing is correct and a good practice.
Also would like to know if it's normal to see the output stream in the Network tab.
How should I properly send a file from S3 to my client application using NodeJS/Express?
I'm pretty sure requests on other websites don't let you preview the content and instead show "Failed to load response data".
This is what I do in my NodeJS application to get the file stream from AWS S3:
download(fileId) {
  const fileObjectStream = app.s3
    .getObject({
      Key: fileId
    })
    .createReadStream();

  this.res.set("Content-Type", "application/octet-stream");
  this.res.set(
    "Content-Disposition",
    'attachment; filename="' + fileId + '"'
  );
  fileObjectStream.pipe(this.res);
}
And on the client side I can see this:
I think the issue is with the header:
// this line will set the proper header for the file and make it downloadable in the client's browser
res.attachment(key);

// this will execute the download
s3.getObject(bucketParams)
  .createReadStream()
  .pipe(res);
So the code should be like this (this is what I do in my project: the file is returned via res.attachment, or res.json in case of an error so the client can display the error to the end user):
router.route("/downloadFile").get((req, res) => {
  const query = req.query;          // params from client
  const key = query.key;            // param from client
  const bucketName = query.bucket;  // param from client

  var bucketParams = {
    Bucket: bucketName,
    Key: key
  };

  // I assume you are using the AWS SDK
  const s3 = new AWS.S3({ apiVersion: "2006-03-01" });

  s3.getObject(bucketParams, function(err, data) {
    if (err) {
      // cannot get file, err = AWS error response,
      // return json to client
      return res.json({
        success: false,
        error: err
      });
    } else {
      res.attachment(key); // sets the correct header (fixes your issue)
      // if all is fine, bucket and file exist, return the file to the client
      s3.getObject(bucketParams)
        .createReadStream()
        .pipe(res);
    }
  });
});

Catch AWS S3 Get Object Stream Errors Node.js

I'm trying to build an Express server that will send items in an S3 bucket to the client using Node.js and Express.
I found the following code in the AWS documentation.
var s3 = new AWS.S3({apiVersion: '2006-03-01'});
var params = {Bucket: 'myBucket', Key: 'myImageFile.jpg'};
var file = require('fs').createWriteStream('/path/to/file.jpg');
s3.getObject(params).createReadStream().pipe(file);
I have changed it slightly to the following:
app.get("/", (req, res) => {
  const params = {
    Bucket: env.s3ImageBucket,
    Key: "images/profile/abc"
  };
  s3.getObject(params).createReadStream().pipe(res);
});
I believe this should work fine. The problem I'm running into is when the file doesn't exist or S3 returns some type of error. The application crashes and I get the following error:
NoSuchKey: The specified key does not exist
My question is: how can I catch or handle this error? I have tried a few things, such as wrapping that s3.getObject line in a try/catch block, none of which have worked.
How can I catch an error and handle it my own way?
I suppose you can catch the error by listening for the 'error' event first.
s3.getObject(params)
  .createReadStream()
  .on('error', (e) => {
    // NoSuchKey & others
  })
  .pipe(res)
  .on('data', (data) => {
    // data
  })
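Note that a synchronous try/catch cannot catch this, because the stream emits the error asynchronously. Another option, as a sketch, is to check the key first with headObject and only stream when it exists (same s3 client, bucket, and key as above):

app.get("/", async (req, res) => {
  const params = {
    Bucket: env.s3ImageBucket,
    Key: "images/profile/abc"
  };
  try {
    await s3.headObject(params).promise(); // rejects if the key is missing or access is denied
    s3.getObject(params).createReadStream().pipe(res);
  } catch (err) {
    res.status(404).send("Not found");
  }
});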

Using SSH2 and SFTPStream to stream file from server to AWS S3 Bucket

I'm trying to use the ssh2 module to take a file from a server and add it to an S3 bucket in AWS. I would like to be able to stream the file so that I don't have to have it in memory. I tried the following:
const Client = require('ssh2').Client;
const aws = require('aws-sdk');
const s3 = new aws.S3();

exports.handler = function(event, context, callback) {
  let connSettings = {
    host: event.serverHost,
    port: event.port,
    username: event.username,
    password: event.password
  };

  let conn = new Client();
  conn.on('ready', function() {
    conn.sftp(true, function(err, sftp) {
      if (err) throw err;
      let stream = sftp.createReadStream(filename);

      let putParams = {
        Bucket: s3Bucket,
        Key: s3Key,
        Body: stream
      };

      s3.putObject(putParams, function (err) {
        if (err) throw err;
        console.log("Uploaded!");
      });
    });
  }).connect(connSettings);
};
However, the method sftp.createReadStream(filename) is looking at my local directory and not the server. Other than that, it works.
Is there a way I can stream a file from a server to S3?
I know I could use the sftp.fastGet method to download the file from the server, save it locally, and then upload it to S3. But I would prefer not to have to save the file locally. The s3 SDK accepts a stream, so it would be much more convenient to just stream it.
UPDATE: the method sftp.createReadStream(filename) is correctly reading from the server and not locally. It is the s3.putObject method that is for some reason trying to get the file locally even though I'm giving it a stream.
For some reason the s3.putObject method is looking for the file locally even though I give it a stream. The stream contains the path on the server, but for whatever reason, when s3.putObject reads the stream, it tries reading the file locally.
I fixed this by instead using the s3.upload method.
s3.upload(putParams, function (err) {
  if (err) throw err;
  console.log("Uploaded!");
});
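For context, a sketch of the whole handler with upload() in place of putObject (filename, s3Bucket, and s3Key are assumed to come from the event); unlike putObject, upload() accepts a stream of unknown length and sends it in parts:

conn.on('ready', function() {
  conn.sftp(true, function(err, sftp) {
    if (err) return callback(err);
    let putParams = {
      Bucket: s3Bucket,
      Key: s3Key,
      Body: sftp.createReadStream(filename) // stream straight from the remote server
    };
    s3.upload(putParams, function(err, data) {
      conn.end();
      if (err) return callback(err);
      console.log("Uploaded!", data.Location);
      callback(null, data.Location);
    });
  });
}).connect(connSettings);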

AWS S3 JavaScript SDK getSignedUrl returns base path only

I have some very simple code for generating an S3 URL. The URL I get back from the SDK only has the base path for S3. It doesn't contain anything else. Why is this happening?
var AWS = require('aws-sdk');
var s3 = new AWS.S3();
console.log(s3.getSignedUrl('getObject', {
  Bucket: 'test',
  Key: 'test'
}));
// Returns "https://s3.amazonaws.com/"
Node.js v0.12.0, AWS SDK 2.1.15 or 2.1.17, Windows 7 64-bit.
The problem wasn't with the code. It turns out that when you don't have your AWS credentials set up properly in your environment, the AWS SDK doesn't complain. Fixing the credentials in ~/.aws/credentials resolved the issue.
I too had the same problem. I got the correct output by changing the following:
from AWS_Access_Key_Id = myaccesskey to aws_access_key_id=myaccesskey
Similarly for the secret key. That means you should not use upper case, and there should be no space before or after the =.
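For reference, a ~/.aws/credentials file in that format would look like this (placeholder values):

[default]
aws_access_key_id=myaccesskey
aws_secret_access_key=mysecretkey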
I had the same problem.
I inserted a correct access token, but some requests received only the base path while others received normal URLs.
I was able to get the correct URL when I changed getSignedUrl to await getSignedUrlPromise.
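A minimal sketch of that change, assuming the same bucket and key as above (getSignedUrlPromise is available in newer releases of the v2 SDK and must be awaited inside an async function):

const url = await s3.getSignedUrlPromise('getObject', {
  Bucket: 'test',
  Key: 'test'
});
console.log(url); // full signed URL once credentials have been resolved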
To trace whether your issue is that the bucket doesn't exist, the permissions are wrong, the credentials in your ~/.aws/credentials file are incorrect, or some other AWS access problem, I just used the headBucket operation as per the documentation.
Ref: https://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/S3.html#getSignedUrlPromise-property
To achieve this programmatically:
/* This operation checks to see if a bucket exists. Put it into your aws.ts file. */
var params = {
  Bucket: "acl1"
};
s3.headBucket(params, function(err, data) {
  if (err) console.log(err, err.stack); // an error occurred
  else console.log(data);               // successful response
});
Meanwhile, with the full parameters and callback:
var params = {
  Bucket: 'STRING_VALUE',              /* required */
  ExpectedBucketOwner: 'STRING_VALUE'  /* the owner's AWS account ID */
};
s3.headBucket(params, function(err, data) {
  if (err) console.log(err, err.stack); // an error occurred
  else console.log(data);               // successful response
});
This will throw an exception such as:
CredentialsError: Missing credentials in config, if using AWS_CONFIG_FILE...
