How do I store/upload base64 string as a png file? - javascript

The base64 string is what I'm getting in the controller. I'm trying to save it as a PNG file or upload it directly to S3, but the uploaded file is blank on AWS.
Meanwhile, the same encoded string displays the image perfectly in an online base64 image viewer. Here is the code.
imageCropped(image: string) {
    this.croppedImage = image;
    let base64Image = this.croppedImage.split(';base64,').pop();
    var the_blob = new Blob([window.atob(base64Image)], { type: 'image/jpg' });
    var reader = new FileReader();
    var url = URL.createObjectURL(the_blob);
    // AWS CONFIGURATIONS
    s3.upload({ Key: 'upload_2.png', Bucket: bucketName, Body: the_blob, ContentEncoding: 'base64', ACL: 'public-read' }, function (err, data) {
        if (err) {
            console.log(err, 'there was an error uploading your file');
        } else if (data) {
            console.log(data, 'File has been uploaded successfully!');
        }
    });
}
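A likely cause, and a minimal sketch of a fix (assuming the same s3 client and bucketName as above; the helper name uploadCroppedImage is made up for illustration): window.atob returns a binary string, and placing that string directly into a Blob re-encodes it as UTF-8, corrupting the PNG bytes. Decoding into a Uint8Array first keeps the bytes intact, and an explicit ContentType replaces the unnecessary ContentEncoding:

function uploadCroppedImage(dataUrl) {
    // Strip the data-URL prefix and decode the base64 payload to raw bytes.
    const base64 = dataUrl.split(';base64,').pop();
    const binary = window.atob(base64);
    const bytes = new Uint8Array(binary.length);
    for (let i = 0; i < binary.length; i++) {
        bytes[i] = binary.charCodeAt(i);
    }
    const blob = new Blob([bytes], { type: 'image/png' });

    // Same upload call as above, but with raw bytes and an explicit ContentType.
    s3.upload({ Key: 'upload_2.png', Bucket: bucketName, Body: blob, ContentType: 'image/png', ACL: 'public-read' }, function (err, data) {
        if (err) {
            console.log(err, 'there was an error uploading your file');
        } else if (data) {
            console.log(data, 'File has been uploaded successfully!');
        }
    });
}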

Related

Uploading images to S3 bucket from browser via NodeJs

I'm trying to upload images from the browser to Amazon S3. The code below sends some sort of blob to S3 just fine, but I can't read the resulting file in a browser; it doesn't seem to be recognized as an image file.
I send it to NodeJS from the browser:
let myReader = new FileReader();
myReader.onloadend = (e) => { app.ws.send(myReader.result); };
myReader.readAsDataURL(e.target.files[0]);
In NodeJS I send it to S3:
const s3 = new AWS.S3();
const params = { Bucket: <bucketName>, Key: fileName, Body: imgData, ACL: "public-read", ContentEncoding: 'base64' };
s3.putObject(params, (err, data) => {
    if (err) throw err;
});
Check the AWS S3 guide; this doc contains the logic needed to upload an image from the browser to an S3 bucket:
https://docs.aws.amazon.com/sdk-for-javascript/v2/developer-guide/s3-example-photo-album.html
It turns out you need to strip the data-URL prefix from the incoming base64 image data, decode it, and explicitly set the ContentType:
const s3 = new AWS.S3();
const type = imgData.split(';')[0].split('/')[1];
imgData = Buffer.from(imgData.replace(/^data:image\/\w+;base64,/, ""), 'base64');
let params = {
    Bucket: <bucketName>, Key: fileName, Body: imgData,
    ACL: "public-read", ContentType: "image/" + type, ContentEncoding: 'base64'
};
s3.upload(params, (err, data) => {
    if (err) throw err;
    // ... Do something ...
});

"Unsupported body payload object" when trying to upload to Amazon S3

I want to upload a file from my frontend to my Amazon S3 (AWS) bucket.
I'm using Dropzone, so I convert my file and send it to my backend.
In my backend the file looks like this:
{ fieldname: 'file',
  originalname: 'test.torrent',
  encoding: '7bit',
  mimetype: 'application/octet-stream',
  buffer: { type: 'Buffer', data: [Array] },
  size: 7449 },
and when I try to upload my file with my function:
var file = data.patientfile.file.buffer;
var params = { Bucket: myBucket, Key: data.patientfile.file.fieldname, Body: file };
s3.upload(params, function (err, data) {
    if (err) {
        console.log("******************", err);
    } else {
        console.log("Successfully uploaded data to myBucket/myKey");
    }
});
I get the error:
Unsupported body payload object
Do you know how I can send my file?
I have tried to send it with putObject and get a similar error.
I think you might need to convert the file content (which in this case is probably data.patientfile.file.buffer) to binary:
var base64data = Buffer.from(data, 'binary');
so the params would be like:
var params = { Bucket: myBucket, Key: data.patientfile.file.fieldname, Body: base64data };
Or if I'm mistaken and the buffer is already in binary, then you can try:
var params = { Bucket: myBucket, Key: data.patientfile.file.fieldname, Body: data.patientfile.file.buffer};
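If the buffer arrives JSON-serialized (the { type: 'Buffer', data: [...] } shape shown in the question), a minimal sketch of rebuilding a real Buffer before the upload; it reuses the question's field names and is an assumption, not part of the original answer:

// Rebuild a Node Buffer from the serialized { type: 'Buffer', data: [...] } form.
var raw = data.patientfile.file.buffer;
var body = Buffer.isBuffer(raw) ? raw : Buffer.from(raw.data);

var params = { Bucket: myBucket, Key: data.patientfile.file.fieldname, Body: body };
s3.upload(params, function (err, uploaded) {
    if (err) {
        console.log("******************", err);
    } else {
        console.log("Successfully uploaded data to myBucket/myKey");
    }
});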
This is my production code, which is working.
Note that the issue can occur at data1111; to give the full picture, all the key parts of the working code are included below.
client:
// html (React JSX)
<input
    type="file"
    onChange={this.onFileChange}
    multiple
/>

// javascript
onFileChange = event => {
    const files = event.target.files;
    var file = files[0];
    var reader = new FileReader();
    reader.onloadend = function(e) {
        // save this data1111 and send to server
        let data1111 = e.target.result; // reader.result // ----------------- data1111
    };
    reader.readAsBinaryString(file);
};
server:
// node.js / javascript
const response = await s3
    .upload({
        Bucket: s3Bucket, // bucket
        Key: s3Path, // folder/file
        // receiving at the server - data1111 - via request body (or other)
        Body: Buffer.from(req.body.data1111, "binary") // ----------------- data1111
    })
    .promise();
return response;
It took a full two days to get the above code working.
Hope this helps someone in the future.
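For completeness, a minimal sketch of how the client might send data1111 to that endpoint; the URL, the JSON field name, and the use of fetch are assumptions, not part of the answer:

// Hypothetical client call: POST the binary string produced by readAsBinaryString.
// Assumes the server parses JSON bodies (e.g. express.json()) and reads req.body.data1111.
fetch('/api/upload', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ data1111: data1111 })
});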
I implemented Glen k's answer with Node.js and it worked for me:
const AWS = require('aws-sdk');
const s3 = new AWS.S3({
    accessKeyId: process.env.AWSAccessKeyID,
    secretAccessKey: process.env.AWSSecretAccessKey,
});
let base64data = Buffer.from(file.productImg.data, 'binary');
const params = {
    Bucket: BUCKET_NAME,
    Key: KEY,
    Body: base64data
};
s3.upload(params, function(err, data) {
    if (err) {
        console.log(err);
        throw err;
    }
    console.log(data);
    console.log(`File uploaded successfully. ${data.Location}`);
});
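For context, a minimal sketch of where file.productImg.data could come from; the express-fileupload middleware and the route are assumptions, not part of the answer:

// Hypothetical Express route using express-fileupload, where req.files.productImg.data is a Buffer.
const express = require('express');
const fileUpload = require('express-fileupload');

const app = express();
app.use(fileUpload());

app.post('/upload', (req, res) => {
    const file = req.files; // file.productImg.data feeds the upload code above
    // ... run the s3.upload code from this answer with `file` ...
    res.sendStatus(200);
});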

Error uploading data:The request signature we calculated does not match the signature you provided. Check your key and signing method

I'm trying to upload an image to an AWS S3 bucket from React Native, but I'm getting: Error uploading data: Error: The request signature we calculated does not match the signature you provided. Check your key and signing method.
Has anyone tried uploading images this way?
JavaScript code to upload images to AWS S3:
var uniqueFileName = image.fileName;
console.log("File Name", uniqueFileName);
var bodyData = image.data;
console.log("File Json", bodyData);
var filetype = image.type;
console.log("File Type", filetype);

var AWS3 = require('aws-sdk/dist/aws-sdk-react-native');
AWS3.config.update({
    "accessKeyId": AWS.accessKeyId,
    "secretAccessKey": AWS.secretAccessKey,
    "region": "us-east-1"
});
var s3 = new AWS3.S3();
var params = {
    Bucket: AWS.bucketName,
    Key: uniqueFileName,
    ContentType: filetype,
    Body: bodyData,
    ContentEncoding: 'base64'
};
s3.upload(params, function (err, res) {
    if (err) {
        console.log("Error uploading data: ", err);
    } else {
        console.log("Successfully uploaded data");
    }
});

Node.js upload to Amazon S3 works but file corrupt

I am submitting a form via my CMS which contains a file picker for an image and some text. The code runs and an object is created in my S3 account with the correct name, but it is corrupt. For example, I am uploading JPG images, but when I view them in the S3 dashboard I just see a black screen.
Any help is greatly appreciated.
My HTML form:
<form enctype="multipart/form-data" action="updateSchedule" method="POST">
    <input type="file" name="schedulepicture" id="schedulepicture">
    <textarea rows="4" cols="50" id="ScheduleText" name="ScheduleText" maxlength="2000"></textarea>
    <button type="submit" id="updateschedulebutton">Update</button>
</form>
My Node.JS script:
router.post('/updateschedule', isLoggedIn, upload.single('schedulepicture'), function(req, res) {
    var scheduleImageToUpload;
    // Check if image was uploaded with the form & process it
    if (typeof req.file !== "undefined") {
        // Create Amazon S3 specific object
        var s3 = new aws.S3();
        // This uploads the file but the file cannot be viewed.
        var params = {
            Bucket: S3_BUCKET,
            Key: req.file.originalname, // This is what S3 will use to store the data uploaded.
            Body: req.file.path, // the actual *file* being uploaded
            ContentType: req.file.mimetype, // type of file being uploaded
            ACL: 'public-read', // Set permissions so everyone can see the image
            processData: false,
            accessKeyId: S3_accessKeyId,
            secretAccessKey: S3_secretAccessKey
        };
        s3.upload(params, function(err, data) {
            if (err) {
                console.log("err is " + err);
            }
            res.redirect('../adminschedule');
        });
    }
});
I do believe you need to pass a stream instead of the file path; you can use fs.createReadStream like this:
router.post('/updateschedule', isLoggedIn, upload.single('schedulepicture'), function(req, res) {
    var scheduleImageToUpload;
    // Check if image was uploaded with the form & process it
    if (typeof req.file !== "undefined") {
        // Create Amazon S3 specific object
        var s3 = new aws.S3();
        // Read the uploaded file from disk as a stream
        var stream = fs.createReadStream(req.file.path);
        var params = {
            Bucket: S3_BUCKET,
            Key: req.file.originalname, // This is what S3 will use to store the data uploaded.
            Body: stream, // the actual *file* being uploaded
            ContentType: req.file.mimetype, // type of file being uploaded
            ACL: 'public-read', // Set permissions so everyone can see the image
            processData: false,
            accessKeyId: S3_accessKeyId,
            secretAccessKey: S3_secretAccessKey
        };
        s3.upload(params, function(err, data) {
            if (err) {
                console.log("err is " + err);
            }
            res.redirect('../adminschedule');
        });
    }
});
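As a small follow-up (an assumption about the setup, not part of the answer): multer's disk storage leaves the temporary file at req.file.path, so you may want to remove it once the upload completes:

s3.upload(params, function(err, data) {
    if (err) {
        console.log("err is " + err);
    }
    // Clean up multer's temporary file now that S3 has a copy (sketch; assumes fs is required).
    fs.unlink(req.file.path, function(unlinkErr) {
        if (unlinkErr) {
            console.log("could not remove temp file: " + unlinkErr);
        }
    });
    res.redirect('../adminschedule');
});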

Upload a binary file to S3 using AWS SDK for Node.js

Update: For future reference, Amazon has now updated the documentation from what was there at the time of asking. As per @Loren Segal's comment below:
We've corrected the docs in the latest preview release to document this parameter properly. Sorry about the mixup!
I'm trying out the developer preview of the AWS SDK for Node.js and want to upload a zipped tarball to S3 using putObject.
According to the documentation, the Body parameter should be...
Body - (Base64 Encoded Data)
...therefore, I'm trying out the following code...
var AWS = require('aws-sdk'),
    fs = require('fs');

// For dev purposes only
AWS.config.update({ accessKeyId: 'key', secretAccessKey: 'secret' });

// Read in the file, convert it to base64, store to S3
fs.readFile('myarchive.tgz', function (err, data) {
    if (err) { throw err; }
    var base64data = new Buffer(data, 'binary').toString('base64');
    var s3 = new AWS.S3();
    s3.client.putObject({
        Bucket: 'mybucketname',
        Key: 'myarchive.tgz',
        Body: base64data
    }).done(function (resp) {
        console.log('Successfully uploaded package.');
    });
});
Whilst I can then see the file in S3, if I download it and attempt to decompress it I get an error that the file is corrupted. Therefore it seems that my method for 'base64 encoded data' is off.
Can someone please help me to upload a binary file using putObject?
You don't need to convert the buffer to a base64 string. Just set Body to data and it will work.
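In other words, a minimal sketch adapted from the question's code (using the current SDK's s3.putObject callback form rather than the preview s3.client ... done API), with the Buffer passed straight through:

var AWS = require('aws-sdk'),
    fs = require('fs');

fs.readFile('myarchive.tgz', function (err, data) {
    if (err) { throw err; }
    var s3 = new AWS.S3();
    s3.putObject({
        Bucket: 'mybucketname',
        Key: 'myarchive.tgz',
        Body: data // the raw Buffer from fs.readFile, no base64 step
    }, function (err) {
        if (err) { throw err; }
        console.log('Successfully uploaded package.');
    });
});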
Here is a way to send a file using streams, which might be necessary for large files and will generally reduce memory overhead:
var AWS = require('aws-sdk'),
    fs = require('fs');

// For dev purposes only
AWS.config.update({ accessKeyId: 'key', secretAccessKey: 'secret' });

// Stream the file to S3 instead of reading it all into memory
var fileStream = fs.createReadStream('myarchive.tgz');
fileStream.on('error', function (err) {
    if (err) { throw err; }
});
fileStream.on('open', function () {
    var s3 = new AWS.S3();
    s3.putObject({
        Bucket: 'mybucketname',
        Key: 'myarchive.tgz',
        Body: fileStream
    }, function (err) {
        if (err) { throw err; }
    });
});
I was able to upload my binary file this way.
var fileStream = fs.createReadStream("F:/directory/fileName.ext");
var putParams = {
    Bucket: s3bucket,
    Key: s3key,
    Body: fileStream
};
s3.putObject(putParams, function(putErr, putData) {
    if (putErr) {
        console.error(putErr);
    } else {
        console.log(putData);
    }
});
