AWS S3 Image Upload Error: Unsupported Body Payload Object - javascript

I'm trying to upload a local image to an S3 bucket and keep getting
Error: Unsupported body payload object
The image can be jpg, jpeg, or png. I've read that images can only be uploaded to S3 as base64, so I read the file as a base64 string using readAsDataURL and then create a Buffer from it.
const reader = new FileReader()
reader.readAsDataURL(file)
const base64str = reader.result.replace(/^data:image\/\w+;base64,/, "");
const fileContent = Buffer.from(base64str,'base64')
The above code is executed on the react frontend where fileContent is set to a variable through a hook, and that variable gets PUT-ed to my server with
static uploadImage = (id, fileContent) => {
  return axios.put(ROOT_URL + "/images/" + id, fileContent);
}
Then it's uploaded with
await s3.upload({
  Bucket: BUCKET_NAME,
  Key: KEY,
  Body: fileContent,
  ContentType: TYPE,
}).promise();
I have tried many different solutions I've found on this website, and still receive the error. Not sure what I am missing.
EDIT: Here is one of the threads that I found helpful, but did not fix the error.
Uploading base64 encoded Image to Amazon S3 via Node.js

Not sure where you read that images must be base64, but you can simply read the file content with something like fs and upload it to S3 as it is.
const fs = require('fs');

// Read content from the file
const fileContent = fs.readFileSync(fileName);

// Setting up S3 upload parameters
const params = {
  Bucket: BUCKET_NAME,
  Key: 'cat.jpg', // File name you want to save as in S3
  Body: fileContent
};

// Uploading files to the bucket
await s3.upload(params).promise();
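For context, the error in the question means that by the time s3.upload runs, Body is not a Buffer, string, or stream, which typically happens when the Buffer gets JSON-serialized on its way through the PUT request. Below is a minimal sketch (not from the answer) of an Express route that keeps the uploaded bytes as a Buffer; the route path, environment variable, and size limit are assumptions.
const express = require('express');
const AWS = require('aws-sdk');

const app = express();
const s3 = new AWS.S3();

// express.raw() leaves req.body as a Buffer instead of a parsed JSON object,
// which is a Body type s3.upload accepts.
app.put('/images/:id', express.raw({ type: 'image/*', limit: '10mb' }), async (req, res) => {
  try {
    const result = await s3.upload({
      Bucket: process.env.BUCKET_NAME,      // assumed env var
      Key: req.params.id,                   // assumed key scheme
      Body: req.body,                       // a real Buffer
      ContentType: req.get('Content-Type'),
    }).promise();
    res.json({ location: result.Location });
  } catch (err) {
    res.status(500).send(err.message);
  }
});
With a route like this, the frontend can send the File object directly, e.g. axios.put(ROOT_URL + "/images/" + id, file, { headers: { 'Content-Type': file.type } }), so no base64 round-trip is needed.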

Related

How to upload images to firebase without using an HTML form?

How do I upload images to Firebase without an HTML form? I need to use code only.
I have tried some approaches myself, but the files fail to preview; they're corrupted, I guess.
I'm developing my app in React, and I need a way to upload images to Firebase without an HTML form.
I tried:
await uploadBytes(storageRef, '../images/image.jpg')
I also tried:
const metadata = { contentType: 'image/jpeg' }
and also
const metadata = {
  contentType: 'image/svg+xml'
}
await uploadBytes(storageRef, '../images/image.jpg', metadata)
There is no way to upload a file to Cloud Storage for Firebase with just a local path as you do here uploadBytes(storageRef, '../images/image.jpg').
If you pass a string as the second argument, you will have to call uploadString and the second argument has to be the base64 encoded data that you want to upload.
If you want to upload a file based on its path, you will have to either create a File reference to that file, or read its data into a Blob and then pass that to uploadBytes.
All of these are covered in the Firebase documentation on uploading data, so I recommend keeping that handy.
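As a rough sketch of the uploadString route mentioned above (not from the answer; the storage path and data URL argument are placeholders):
import { getStorage, ref, uploadString } from "firebase/storage";

const storage = getStorage();

async function uploadDataUrl(dataUrl) {
  const storageRef = ref(storage, "images/image.jpg"); // placeholder path
  // "data_url" tells the SDK the string is a data: URL; the content type is taken from the URL itself.
  await uploadString(storageRef, dataUrl, "data_url");
}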
Use the Firebase SDK for Cloud Storage together with the FileReader API in JavaScript:
import { getStorage, ref, uploadBytes } from "firebase/storage";

const storage = getStorage();

const handleFileUpload = (file) => {
  const storageRef = ref(storage, "images/" + file.name);
  const metadata = {
    contentType: file.type,
  };

  const reader = new FileReader();
  reader.readAsArrayBuffer(file);
  reader.onload = async (event) => {
    const buffer = event.target.result;
    await uploadBytes(storageRef, buffer, metadata);
  };
};
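Since uploadBytes also accepts a File or Blob directly, the FileReader round-trip can usually be skipped. A shorter variant (a sketch under the same assumptions as above):
import { getStorage, ref, uploadBytes } from "firebase/storage";

const storage = getStorage();

const handleFileUpload = async (file) => {
  const storageRef = ref(storage, "images/" + file.name);
  // A File is a Blob, so it can be passed to uploadBytes as-is.
  await uploadBytes(storageRef, file, { contentType: file.type });
};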

Upload file from external url to S3

I'm trying to upload an image to S3 from an external url.
Currently, I can successfully upload a file, but after I download and open it I see that it is corrupted.
Here is the code I'm using (got is just what I use to fetch the resource):
const got = require('got');
const AWS = require('aws-sdk');
const Web3 = require('web3');

const s3 = new AWS.S3({
  accessKeyId: AWS_ACCESS_KEY_ID,
  secretAccessKey: AWS_SECRET_ACCESS_KEY,
});

const response = await got('https://example.com/image.jpg');

const uploadedFile = await s3
  .upload({
    Bucket: 'my_bucket',
    Key: 'images/',
    Body: response.body,
    ContentType: 'image/jpeg',
  })
  .promise();
I tried to create a buffer and use putObject instead of upload, but I end up with files that are only a few bytes on S3 instead.
The request to get the object is converting it to a string. Pretty much whatever encoding you pick to do that will corrupt it, since a JPG is binary data not meant to be represented with a string's encoding.
The documentation for the got library states:
encoding
Type: string
Default: 'utf8'
Encoding to be used on setEncoding of the response data.
To get a Buffer, you need to set responseType to buffer instead. Don't set this option to null.
In other words, if you change your download to:
const response = await got('https://example.com/image.jpg', {'responseType': 'buffer'});
You'll get and upload a Buffer object without changing it by encoding it as a string.
Your key is wrong when you are uploading the file to S3:
Key: 'images/'
It cannot be images/ because that would upload the image to an object that represents a folder. While that might work with the local file system on your Windows/Mac laptop, it doesn't work with object stores. It needs to be a key that represents a file, for example:
Key: 'images/image.jpg'
Doing it through streams as mentioned by Ermiya Eskandary seems to work:
const response = got.stream('https://example.com/image.jpg');
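For reference, here is a sketch (not from the answer) of how that stream could be handed to the managed uploader; s3.upload() accepts a readable stream as Body, and the bucket and key below are placeholders:
const got = require('got');
const AWS = require('aws-sdk');

const s3 = new AWS.S3();

async function uploadFromUrl(url, key) {
  // got.stream() yields the raw bytes as a readable stream, so nothing is
  // decoded into a string along the way.
  return s3.upload({
    Bucket: 'my_bucket',      // placeholder
    Key: key,                 // e.g. 'images/image.jpg'
    Body: got.stream(url),
    ContentType: 'image/jpeg',
  }).promise();
}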

Read File Uploaded Via PostMan or other clients From NodeJS without using Multer

I am trying to read an avatar file uploaded via Postman or any other client to my Express.js API.
So far, the recommendations I have been getting have all been to use Multer.
I don't want to use Multer, as I am having some issues with it. I want to be able to read the file directly and upload it to a remote location of my choice.
Here is the code I have, but it is not working:
const getS3Params = (file) => {
  let fileName = file.name;
  let fileType = file.mimetype;
  let fileContent = fs.readFileSync(file); /** Getting an error that says TypeError [ERR_INVALID_ARG_TYPE]: The "path" argument must be of type string or an instance of Buffer or URL. Received an instance of Object **/
  return {
    Bucket: process.env.S3_BUCKET_NAME,
    Key: fileName,
    ACL: 'public-read',
    ContentType: fileType,
    Body: fileContent
  };
};
Is there a way to read the content of the file without using Multer?
Thanks.
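One possible direction (an assumption on my part, not an answer from this thread): if the request is parsed with a middleware such as express-fileupload, the file object already carries its bytes in file.data as a Buffer, so fs.readFileSync is not needed at all. A sketch under that assumption:
// Sketch assuming express-fileupload; file would be e.g. req.files.avatar.
const getS3Params = (file) => ({
  Bucket: process.env.S3_BUCKET_NAME,
  Key: file.name,
  ACL: 'public-read',
  ContentType: file.mimetype,
  Body: file.data, // already a Buffer, no fs call required
});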

Upload image to strapi with external link

What is the proper way to upload an external image via URL into Strapi on the backend side?
I tried loading the image with node-fetch, processing it with buffer()/blob()/blob().stream(), and then passing it into strapi.plugins['upload'].services.upload.upload(). I also tried generating FormData in Node.js and passing it into the upload service, but that didn't help either.
How to convert image buffer from fetch into suitable type for upload service?
I used axios and this was on the client, but I think you can try it on the server too.
This worked for me:
Fetch an image and create a File instance from it:
async getImage(imageUrl, imageName) {
  const response = await axios.get(imageUrl, { responseType: 'blob' });
  const mimeType = response.headers['content-type'];
  const imageFile = new File([response.data], imageName, { type: mimeType });
  return imageFile;
}
GraphQL API query
{
  query: `
    mutation($files: [Upload!]!) {
      multipleUpload(files: $files) {
        id
      }
    }
  `,
  variables: {
    files: [
      // your files to upload
    ]
  }
}
Then I called this mutation and it worked perfectly.
Resources that I used to find this solution:
https://www.freecodecamp.org/news/how-to-manage-file-uploads-in-graphql-mutations-using-apollo-graphene-b48ed6a6498c/
Client side convert png file stream into file object
https://github.com/jaydenseric/graphql-multipart-request-spec
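For reference, a sketch (not from the answer) of wiring the File returned by getImage into a multipart request as described in the spec linked above; the endpoint URL is a placeholder:
async function uploadViaGraphQL(endpoint, imageUrl, imageName) {
  const file = await getImage(imageUrl, imageName);

  // Per the GraphQL multipart request spec: operations + map + the file parts.
  const form = new FormData();
  form.append('operations', JSON.stringify({
    query: 'mutation($files: [Upload!]!) { multipleUpload(files: $files) { id } }',
    variables: { files: [null] },
  }));
  form.append('map', JSON.stringify({ '0': ['variables.files.0'] }));
  form.append('0', file);

  const response = await fetch(endpoint, { method: 'POST', body: form });
  return response.json();
}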

AWS SDK JS: Multipart upload to S3 resulting in Corrupt data

I'm trying to upload an mp4 file using the AWS JS SDK, initiating a multipart upload, and I keep getting a file-corrupted error when I try to download and play it locally.
The gist of my code:
Initiating the multipart upload with params:
const createMultipartUploadParams = {
  Bucket: bucketname,
  Key: fileHash.file_name,
  ContentType: 'video/mp4' // TODO: Change hardcode
};
Call:
s3Instance.createMultipartUpload(createMultipartUploadParams, function(err, data) {
  // ...
});
Doing the chunking:
Params:
const s3ChunkingParams = {
  chunkSize,
  noOfIterations,
  lastChunk,
  UploadId: data.UploadId
}
Reading the file:
const reader = new FileReader();
reader.readAsArrayBuffer(file)
Uploading each chunk:
reader.onloadend = function onloadend() {
  console.log('onloadend');
  const partUploadParams = {
    Bucket: bucketname,
    Key: file_name,
    PartNumber: i, // Iterating over all parts
    UploadId: s3ChunkingParams.UploadId,
    Body: reader.result.slice(start, stop) // Chunking up the file
  };
  s3Instance.uploadPart(partUploadParams, function(err, data1) {
    // ...
  });
};
Finally completing the multipartUpload:
s3Instance.completeMultipartUpload(completeMultipartParams, function(err, data) {
  // ...
});
I am guessing the problem is how I am reading the file, so I have tried Content-Encoding it to base64, but that makes the size unusually large. Any help is greatly appreciated!
Tried this too
The only thing that could corrupt the data is if you are uploading additionally padded content for your individual parts, which leads to the final object being wrong. I do not believe S3 is doing anything fishy here.
You can verify, after uploading the file, what the final size of the object is; if it doesn't match your local copy, then you know you have a problem somewhere.
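A quick way to do that check (a sketch reusing the question's variable names):
s3Instance.headObject({ Bucket: bucketname, Key: fileHash.file_name }, function(err, head) {
  if (err) return console.log(err);
  // Compare with the size of the local file you chunked up.
  console.log('S3 object size:', head.ContentLength, 'local size:', file.size);
});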
Are you trying to upload from the browser?
Alternatively, you can look at https://github.com/minio/minio-js. It has a minimal set of abstracted APIs implementing the most commonly used S3 calls.
Here is a Node.js example of a streaming upload.
$ npm install minio
$ cat >> put-object.js << EOF
var Minio = require('minio')
var fs = require('fs')
// find out your s3 end point here:
// http://docs.aws.amazon.com/general/latest/gr/rande.html#s3_region
var s3Client = new Minio({
  url: 'https://<your-s3-endpoint>',
  accessKey: 'YOUR-ACCESSKEYID',
  secretKey: 'YOUR-SECRETACCESSKEY'
})

var localFile = 'your_localfile.zip';
var fileStream = fs.createReadStream(localFile);

fs.stat(localFile, function(e, stat) {
  if (e) {
    return console.log(e)
  }
  s3Client.putObject('mybucket', 'hello/remote_file.zip', 'application/octet-stream', stat.size, fileStream, function(e) {
    return console.log(e) // should be null
  })
})
EOF
putObject() here is a fully managed single function call; for file sizes over 5MB it automatically does multipart internally. You can also resume a failed upload, and it will start from where it left off by verifying previously uploaded parts.
So you don't necessarily have to go through the trouble of writing lower-level multipart calls.
Additionally, this library is isomorphic and can be used in browsers as well.
