Using signed requests with AWS S3 and uploading photos? - javascript

So I have a React Native application that's kind of like Slack, and I'm trying to do image uploads to S3.
I went with the getSignedUrl route.
So the client picks a photo,
fetches a signed URL for the bucket,
then updates the URL on the server for that user,
then makes a PUT request to the signed URL that was fetched.
It mostly works: the files end up in the right bucket and they are photos, but
A) the link makes me download the file instead of displaying it in the browser, and
B) the file isn't an image... it's an XML file and can only be opened in Photoshop.
I've tried changing the type in data.append,
adding a header to the signed request,
adding x-amz- headers to the signed request,
hard-coding the file type on the server, and
converting the image to a base64 string with a native module, but it still comes up wrong.
Client-side calls to the server
uploadToServer() {
  // alert('coming soon!');
  // Go back to profile page
  this.props.navigation.goBack();
  // grab user from navigator params
  let user = this.props.navigation.state.params.user;
  let pic = this.state.selected;
  // turn uri into base64
  NativeModules.ReadImageData.readImage(pic.uri, (image) => {
    console.log(image);
    var data = new FormData();
    data.append('picture', {
      uri: image,
      name: pic.filename,
      type: 'image/jpeg'
    });
    // get the signed Url for uploading
    axios.post(api.getPhotoUrl, {fileName: `${pic.filename}`}).then((res) => {
      console.log("get Photo URL response", res);
      // update the user with the new url
      axios.patch(api.fetchUserByID(user.id), {profileUrl: res.data.url}).then((resp) => {
        console.log("Update User response", resp.data);
      }).catch(err => errorHandler(err));
      // upload the photo using the signed request url given to me.
      // DO I NEED TO TURN DATA INTO A BLOB?
      fetch(res.data.signedRequest, {
        method: 'PUT',
        body: data
      }).then((response) => {
        console.log("UPLOAD PHOTO RESPONSE: ", response);
      }).catch(err => errorHandler(err));
    }).catch((err) => errorHandler(err));
  });
}
GET SIGNED URL logic from here on out
router.post('/users/sign-s3', (req, res) => {
  const s3 = new aws.S3({signatureVersion: 'v4', region: 'us-east-2'});
  const fileName = `${req.user.id}-${req.body.fileName}`;
  const fileType = req.body.fileType;
  const s3Params = {
    Bucket: AWS_S3_BUCKET,
    Key: `images/${fileName}`,
    Expires: 60,
    ContentType: 'image/jpeg',
    ACL: 'public-read'
  };
  s3.getSignedUrl('putObject', s3Params, (err, data) => {
    if (err) {
      console.log(err);
      return res.end();
    }
    const returnData = {
      signedRequest: data,
      url: `https://${AWS_S3_BUCKET}.s3.amazonaws.com/${s3Params.Key}`
    };
    res.write(JSON.stringify(returnData));
    res.end();
    return null;
  });
});

You need to change the content type from XML to a supported image format if you want it to be displayed in the browser.
Refer to this and set the content type accordingly.
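A minimal client-side sketch of that fix, not from the answer above: assuming the URL was signed with ContentType: 'image/jpeg' (as in the server code) and a React Native version whose fetch can read a local file URI into a Blob, PUT the raw bytes rather than a FormData wrapper and send a matching Content-Type header:
// Sketch only: PUT the raw bytes to the presigned URL instead of wrapping them in FormData.
// Assumes the URL was signed with ContentType: 'image/jpeg', as in the server code above.
const uploadToSignedUrl = async (signedRequest, localUri) => {
  // In recent React Native versions, fetch() on a local file URI resolves to the file contents as a Blob.
  const blob = await (await fetch(localUri)).blob();
  const response = await fetch(signedRequest, {
    method: 'PUT',
    headers: { 'Content-Type': 'image/jpeg' }, // must match the ContentType used when signing
    body: blob
  });
  console.log('UPLOAD PHOTO RESPONSE:', response.status);
};
S3 stores the body of a presigned PUT verbatim, so a FormData-wrapped body is saved as multipart-encoded text rather than as an image, which is why the object won't open as a picture.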

Related

express.js: pass an uploaded image to s3

I am trying to pass an image uploaded from a react app through express to a managed s3 bucket. The platform/host I am using creates and manages the s3 bucket and generates upload and access urls. This all works fine (I have tested a generated upload url in postman with an image in a binary body and it worked perfectly).
My problem is passing the image through express. I am using multer to get the image from the form but I am assuming multer is turning that image into some kind of file object and s3 is expecting some sort of blob or stream.
In the following code, the image in req.file exists, I get a 200 response from s3 with no errors, and when I visit the asset url the url works, but the image itself is missing.
const router = Router();
const upload = multer();

router.post('/', upload.single('file'), async (req, res) => {
  console.log(req.file);
  const asset = req.file;
  const assetPath = req.headers['asset-path'];
  let s3URLs = await getPresignedURLS(assetPath);
  const sendAsset = await fetch(
    s3URLs.urls[0].upload_url, // the s3 upload url
    {
      method: 'PUT',
      headers: {
        "Content-Type": asset.mimetype
      },
      body: asset,
      redirect: 'follow'
    }
  );
  console.log("s3 response", sendAsset);
  res.status(200).json({"url": s3URLs.urls[0].access_url });
});

export default router;
I am not sure what to do to convert what multer gives me to something that aws s3 will accept. I am also open to getting rid of multer if there is an easier way to upload binary files to express.
Instead of multer, you can use multiparty to get the file data from the request object. And to upload the file to the s3 bucket you can use aws-sdk.
const AWS = require("aws-sdk");
const multiparty = require("multiparty");
const { readFileSync } = require("fs");
const { extname, join } = require("path");
/**
 * Helper method which takes the request object and returns a promise with the form data.
 */
const getDataFromRequest = (req) =>
  new Promise((resolve, reject) => {
    const form = new multiparty.Form();
    form.parse(req, (err, fields, files) => {
      if (err) return reject(err);
      const bucketname = fields.bucketname[0];
      const subfoldername = fields.subfoldername[0];
      const file = files["file"][0]; // get the file from the returned files object
      if (!file) reject("File was not found in form data.");
      else resolve({
        file,
        bucketname,
        subfoldername
      });
    });
  });
/**
 * Helper method which takes the parsed file and returns a promise with the AWS S3 upload details.
 */
const uploadFileToS3Bucket = (
  file,
  bucketname,
  subfoldername,
  options = {}
) => {
  const s3 = new AWS.S3();
  // turn the file into a buffer for uploading
  const buffer = readFileSync(file.path);
  var originalname = file.originalFilename;
  var attach_split = originalname.split(".");
  var name = attach_split[0];
  // keep the original base name as the file name
  const fileName = name;
  // the extension of your file
  const extension = extname(file.path);
  console.log(`${fileName}${extension}`);
  const params = {
    Bucket: bucketname, // Bucket name
    ACL: "private", // Permission
    Key: join(`${subfoldername}/`, `${fileName}${extension}`), // File name you want to save as in S3
    Body: buffer, // Content of file
  };
  // return a promise
  return new Promise((resolve, reject) => {
    return s3.upload(params, (err, result) => {
      if (err) reject(err);
      else resolve(result); // return the values of the successful AWS S3 request
    });
  });
};
router.post('/', async (req, res) => {
  try {
    // extract the file from the request object
    const {
      file,
      bucketname,
      subfoldername
    } = await getDataFromRequest(req);
    // Upload the file to the specified bucket
    const {
      Location,
      ETag,
      Bucket,
      Key
    } = await uploadFileToS3Bucket(
      file,
      bucketname,
      subfoldername
    );
    let response = {};
    response["Location"] = Location;
    response["ETag"] = ETag;
    response["Bucket"] = Bucket;
    response["Key"] = Key;
    res.status(200).json(response);
  } catch (error) {
    res.status(500).json({ error: error.message });
  }
});
The request body will be form data with the following fields:
bucketname:
subfoldername:
file: FileData
For anyone that ever stumbles across this question, the solution was to create a custom multer storage engine. Inside the engine you get access to the file with a stream property, which s3 accepted (with the correct headers).
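A minimal sketch of that idea (not the poster's actual engine): multer's documented storage-engine hooks are _handleFile and _removeFile, and inside _handleFile the file.stream readable can be handed to aws-sdk's s3.upload. The bucket name and key scheme below are placeholders:
const multer = require("multer");
const AWS = require("aws-sdk");

const s3 = new AWS.S3();

// Minimal custom storage engine: hand multer's file.stream straight to S3.
class S3StorageEngine {
  _handleFile(req, file, cb) {
    const params = {
      Bucket: "YOUR_BUCKET_NAME",          // assumption: replace with your bucket
      Key: `uploads/${file.originalname}`, // assumption: choose your own key scheme
      Body: file.stream,                   // the readable stream multer exposes
      ContentType: file.mimetype
    };
    s3.upload(params, (err, result) => {
      if (err) return cb(err);
      cb(null, { location: result.Location, key: result.Key }); // merged into req.file
    });
  }

  _removeFile(req, file, cb) {
    // nothing was written locally, so there is nothing to clean up
    cb(null);
  }
}

const upload = multer({ storage: new S3StorageEngine() });
// usage: router.post('/', upload.single('file'), (req, res) => res.json({ url: req.file.location }));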

Busboy file event not firing, received file as buffer in request body

I am developing a react application which uses the react-dropzone hook to implement a drag-and-drop file upload feature. I am then sending the file via a POST request to a firebase cloud function.
I am using BusBoy to write the received file to a temp dir and then upload it to firebase storage.
BusBoy file event is not firing in my cloud function.
Frontend
const onDrop = useCallback((acceptedFiles) => {
  // Check whether an excel file was uploaded
  if (
    acceptedFiles[0].type == 'application/vnd.ms-excel' ||
    acceptedFiles[0].type == 'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet'
  ) {
    auth.currentUser.getIdToken().then((token) => {
      let bodyFormData = new FormData();
      bodyFormData.append('file', acceptedFiles[0]);
      axiosInstance
        .post('/upload', bodyFormData, {
          crossdomain: true,
          headers: {
            Authorization: `Bearer ${token.toString()}`,
            'Access-Control-Allow-Origin': '*',
            'Access-Control-Allow-Methods': 'GET,PUT,POST,DELETE,PATCH,OPTIONS',
          },
        })
        .then((res) => {
          console.log(res);
        })
        .catch((err) => {
          console.log(err);
        });
    });
  } else {
    setFileTypeError(true);
  }
}, []);
The request header Content-Type is multipart/form-data; boundary=----WebKitFormBoundaryRWo4LrjStmEjoZll
and the form data is file: {binary}
Backend
app.post('/upload', verifyAuth, (req, res) => {
  const BusBoy = require('busboy');
  const path = require('path');
  const os = require('os');
  const fs = require('fs');
  const busboy = new BusBoy({ headers: req.headers });
  let tempFileName;
  let fileToBeUploaded = {};
  busboy.on('file', (fieldname, file, filename, encoding, mimetype) => {
    console.log('file received');
    const fileExtension = filename.split('.')[filename.split('.').length - 1];
    tempFileName = `bulk-upload-${shortid.generate()}.${fileExtension}`;
    const filePath = path.join(os.tmpdir(), tempFileName);
    fileToBeUploaded = { filePath, mimetype };
    file.pipe(fs.createWriteStream(filePath));
  });
  busboy.on('finish', () => {
    bucket
      .upload(fileToBeUploaded.filePath, {
        resumable: false,
        metadata: {
          metadata: {
            contentType: fileToBeUploaded.mimetype,
          },
        },
      })
      .then(() => {
        return res.status(200).json({ success: true });
      })
      .catch((err) => {
        return res.status(400).json({ error: err });
      });
  });
});
Busboy's on('file') event is not getting fired. I also checked req.file, which returns undefined. Could anyone help me figure out where I have gone wrong?
I had the same problem: the file event doesn't fire when I send a file with a multipart form to an Express Firebase cloud function. I read other posts that suggest removing any other middleware that can read the file stream from the request, but that did not work for me and I still haven't fixed it.
But a possible alternative is to use the Firebase API to first upload the picture from the client app directly to the bucket and then do something with it (in my case, save a record in the database):
var photoRef = storage.ref().child(photoName);
var uploadTask = photoRef.put(this.post.photo);
uploadTask.on(
  "state_changed",
  (snapshot) => {
    // Observe state change events such as progress, pause, and resume
    // Get task progress, including the number of bytes uploaded and the total number of bytes to be uploaded
  },
  (error) => {
    // Handle unsuccessful uploads
  },
  () => {
    // Handle successful uploads on complete
    uploadTask.snapshot.ref.getDownloadURL().then((downloadURL) => {
      console.log("File available at", downloadURL);
      db.collection("images")
        .doc(fields.id)
        .set({
          id: fields.id,
          caption: fields.caption,
          location: fields.location,
          date: parseInt(fields.date),
          imageUrl: downloadURL,
        })
        .then(() => {
          console.log("Post added: " + fields.id);
        });
    });
  }
);
If you need to do something with the file on the server, you can upload it to the bucket first and then retrieve it in a cloud function.
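A minimal sketch of that second step, assuming the Firebase Admin SDK and a storage-triggered function; the uploads/ prefix and the processing step are illustrative:
const functions = require("firebase-functions");
const admin = require("firebase-admin");
const os = require("os");
const path = require("path");

admin.initializeApp();

// Fires after the client has uploaded the file directly to the bucket.
exports.processUpload = functions.storage.object().onFinalize(async (object) => {
  if (!object.name.startsWith("uploads/")) return null; // assumption: clients upload under uploads/

  // Download the file from the bucket into the function's temp dir to work on it.
  const tempPath = path.join(os.tmpdir(), path.basename(object.name));
  await admin.storage().bucket(object.bucket).file(object.name).download({ destination: tempPath });

  console.log("Downloaded", object.name, "to", tempPath, "contentType:", object.contentType);
  // ...do something with the file here (parse the spreadsheet, save a record, etc.)...
  return null;
});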

s3 isn't uploading file and getting error of SignatureDoesNotMatch

I'm trying to add images to my s3 bucket in aws, but it doesn't seem to work. I get a SignatureDoesNotMatch error.
Here's how I'm uploading the file/image:
Frontend
const file = e.target.files[0];
const fileParts = file.name.split('.');
const fileName = fileParts[0];
const fileType = fileParts[1];
const response = axios.post('api/aws/sign_s3', { fileName, fileType });
Backend
router.post('/sign_s3', async (req, res) => {
  aws.config.update({
    accessKeyId: config.aws.accessKey,
    secretAccessKey: config.aws.secretKey,
    region: 'us-west-1'
  });
  const s3 = new aws.S3(); // Create a new instance of S3
  const fileName = req.body.fileName;
  const fileType = req.body.fileType;
  const s3Params = {
    Bucket: config.aws.bucketName,
    Key: fileName,
    Expires: 500,
    ContentType: fileType,
    ACL: 'public-read'
  };
  s3.getSignedUrl('putObject', s3Params, (err, data) => {
    if (err) return res.send(err);
    const returnData = {
      signedRequest: data,
      url: `https://sim-to-do.s3.amazonaws.com/${fileName}`
    };
    res.json({ success: true, responseData: returnData });
  });
});
I get two URLs. When I go to the first one, I get the following error code:
SignatureDoesNotMatch
Error message: The request signature we calculated does not match the signature you provided. Check your key and signing method.
What am I doing wrong? What's the correct way of uploading a file to aws s3?
I was able to fix this issue after removing the Content-Type from the headers.
If you get "SignatureDoesNotMatch", it's highly likely you used a wrong secret access key. Can you double-check the access key and secret access key to make sure they're correct?
(from awendt's answer)
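For completeness, a sketch of the client-side PUT the question doesn't show; the point both answers circle around is that the request headers must agree with what was signed, so either omit Content-Type on both sides or set it identically on both (names below follow the question's code):
// Sketch only: upload the file to the presigned URL returned by /sign_s3.
// The Content-Type header must match the ContentType used in getSignedUrl,
// otherwise S3 responds with SignatureDoesNotMatch.
const uploadFile = async (file) => {
  const fileParts = file.name.split('.');
  const fileName = fileParts[0];
  const fileType = fileParts[1];
  const { data } = await axios.post('api/aws/sign_s3', { fileName, fileType });
  const { signedRequest, url } = data.responseData;

  await axios.put(signedRequest, file, {
    headers: { 'Content-Type': fileType } // must be identical to the signed ContentType
  });
  return url; // the public URL of the uploaded object
};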

Angular 2 Http POST request to send image file to server for s3 upload

I am new to angular 2 and nodejs, and so far I have only ever needed to send json objects to the backend with POST requests, but now I need to send an image file and I am running into problems. I want a user to input an image file, which is then sent to the backend as the body of a POST request; then, in the backend, the image is uploaded to s3. On the frontend I have the following html
<div>
  <input type="file" (change)="testOnChange($event)" multiple/>
</div>
which triggers the following code in the associated component
testOnChange(event){
  var files = event.srcElement.files;
  this.jobService.s3Upload(files[0]).subscribe(data => {
    console.log(data);
  });
}
which uses the s3Upload function in the jobService which is
s3Upload(file){
  const body = file;
  const headers = new Headers({'Content-Type': 'image/jpg'});
  return this.http.post('http://localhost:3000/job/s3upload', body, {headers: headers})
    .map((response:Response) => {
      console.log(response);
      return response;
    })
    .catch((error:Response) => {
      this.errorService.handleError(error.json());
      return Observable.throw(error.json());
    });
}
Then on the backend
router.post('/s3upload', function(req, res){
  console.log(req.body);
  const file = req.body;
  aws.config.update({
    "accessKeyId": process.env.AWS_ACCESS_KEY_ID,
    "secretAccessKey": process.env.AWS_SECRET_ACCESS_KEY,
    "region": "us-west-2"
  });
  const s3 = new aws.S3();
  var params = {
    Bucket: 'labeller-images',
    Key: file.name,
    ContentType: file.type,
    Body: file,
    ACL: 'public-read'
  };
  console.log(params);
  s3.putObject(params, function (err, data) {
    if (err) {
      console.log("upload error");
      return res.status(500).json({
        title: 'An error occurred',
        error: err
      });
    } else {
      console.log("Successfully uploaded data");
      res.status(200).json({
        message: 'Success',
        obj: `https://labeller-images.s3.amazonaws.com/${file.name}`
      });
    }
  });
});
The specific error that arises is "There were 2 validation errors: * MissingRequiredParameter: Missing required key 'Key' in params * InvalidParameterType: Expected params.Body to be a string, Buffer, Stream, Blob, or typed array object". But the logging statements reveal that req.body is just {}, so params is
{ Bucket: 'labeller-images',
Key: undefined,
ContentType: undefined,
Body: {},
ACL: 'public-read' }
So it looks like the req.body is empty and the image file is never actually getting to the backend. Can anyone see what I'm doing wrong here? Is it some kind of file serialization issue?
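No accepted answer is recorded here, but a common pattern for this setup (a sketch only, not from the thread) is to send the file as multipart/form-data and parse it on the Express side with multer's memory storage, so the bytes actually reach req.file and can be handed to putObject as a Buffer:
// Sketch only (not from the thread). Angular side: post FormData instead of the raw file body:
//   const formData = new FormData();
//   formData.append('photo', file, file.name);
//   return this.http.post('http://localhost:3000/job/s3upload', formData);
//
// Express side: let multer parse the multipart body so the bytes reach the handler.
const multer = require('multer');
const upload = multer({ storage: multer.memoryStorage() });
const s3 = new aws.S3(); // assumes aws.config.update(...) as in the question

router.post('/s3upload', upload.single('photo'), function (req, res) {
  const params = {
    Bucket: 'labeller-images',
    Key: req.file.originalname,     // multer supplies the original file name
    ContentType: req.file.mimetype, // and the MIME type
    Body: req.file.buffer,          // file contents as a Buffer
    ACL: 'public-read'
  };
  s3.putObject(params, function (err, data) {
    if (err) return res.status(500).json({ title: 'An error occurred', error: err });
    res.status(200).json({ message: 'Success', obj: `https://labeller-images.s3.amazonaws.com/${params.Key}` });
  });
});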

React-Native send image file with XHR to Api backend

We are building a photo upload feature in our application, which is built using react native.
I am using this: https://github.com/marcshilling/react-native-image-picker
Upon selecting the image, I get the URI of the image, something like this:
file:///storage/151A-3C1B/Pictures/image-c47d8624-8530-43df-873e-e31c2d27d0e9.jpg
I can also get the base64 encoded string of the image, but I do not want to deal with base64, since it slows down the app and the resulting request is about 1/3 bigger.
So my question is, given the URI above, how can I send the contents of the file to my API backend? It expects multipart/form-data with the name "photo".
I wanted to try with this:
var formData = new FormData();
formData.append('photo', CONTENTS);
But I do not know how to get the contents of the file, or how to pass the file URI to the formData object, so the contents would be sent, not the URI string itself. Any help please?
You can try something like this. It worked for me:
// api.js file
'use strict';
import request from 'superagent';
import {NativeModules} from 'react-native';

var api = (method, URL) => {
  var r = request(method, URL);
  return r;
};

api.uploadPhoto = (fileName, fileUri, uri, callback) => {
  var upload = {
    uri: fileUri, // either an 'assets-library' url (for files from photo library) or an image dataURL
    uploadUrl: uri, // your backend url here
    fileName: fileName,
    mimeType: 'image/jpeg',
    headers: {},
    data: {}
  };
  NativeModules.FileTransfer.upload(upload, (err, res) => {
    console.log(err, res);
    if (err || res.status !== 200) {
      return callback(err || res.data || 'UNKNOWN NETWORK ERROR');
    }
    callback();
  });
};

export default api;
// you can then call your action this way
'use strict';
import request from './api';

request.uploadPhoto('picture', uri, apiURL, (err) => {
  if (err) {
    console.log(err);
    return;
  }
});
This is an old post, but if it helps others this will do the trick:
var formData = new FormData();
formData.append("photo", {
  uri: imageUri,
  type: "image/jpg"
});
axios.post(serviceUrl, formData);
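One small addition worth noting (an assumption, not part of the answer above): backends that parse the multipart body with multer, formidable, or busboy generally expect a filename on the part, which in React Native comes from a name field on the appended object:
// Sketch only: same call with an explicit filename; "photo.jpg" is illustrative.
var formData = new FormData();
formData.append("photo", {
  uri: imageUri,
  name: "photo.jpg",   // filename for the multipart part, used by multer/formidable/busboy
  type: "image/jpeg"
});
axios.post(serviceUrl, formData);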
