Upload file from external url to S3 - javascript

I'm trying to upload an image to S3 from an external url.
Currently, I can successfully upload a file, but after I download and open it I see that it is corrupted.
Here is the code I'm using (got is just what I use to fetch the resource):
const got = require('got');
const AWS = require('aws-sdk');

const s3 = new AWS.S3({
  accessKeyId: AWS_ACCESS_KEY_ID,
  secretAccessKey: AWS_SECRET_ACCESS_KEY,
});
const response = await got('https://example.com/image.jpg');

const uploadedFile = await s3
  .upload({
    Bucket: 'my_bucket',
    Key: 'images/',
    Body: response.body,
    ContentType: 'image/jpeg',
  })
  .promise();
I tried to create a buffer and use putObject instead of upload, but I end up with files that are only a few bytes on S3 instead.

The request to get the object is converting it to a string. Pretty much any encoding you pick to do that will corrupt it, since a JPG is binary data not meant to be represented with a string's encoding.
The documentation for the got library states:
encoding
Type: string
Default: 'utf8'
Encoding to be used on setEncoding of the response data.
To get a Buffer, you need to set responseType to buffer instead. Don't set this option to null.
In other words, if you change your download to:
const response = await got('https://example.com/image.jpg', { responseType: 'buffer' });
You'll get, and upload, a Buffer object without corrupting it by encoding it as a string.

Your key is wrong when you are uploading the file to S3:
Key: 'images/'
It cannot be images/ because that would upload the image to an object that represents a folder. While that might work with the local file system on your Windows/Mac laptop, it doesn't work with object stores. It needs to be a key that represents a file, for example:
Key: 'images/image.jpg'
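Putting both fixes together, a corrected version of the original snippet might look like this (the bucket and key are placeholders carried over from the question):
const got = require('got');
const AWS = require('aws-sdk');

const s3 = new AWS.S3({
  accessKeyId: AWS_ACCESS_KEY_ID,
  secretAccessKey: AWS_SECRET_ACCESS_KEY,
});

// Ask got for a Buffer so the binary data is never decoded as a string
const response = await got('https://example.com/image.jpg', { responseType: 'buffer' });

const uploadedFile = await s3
  .upload({
    Bucket: 'my_bucket',        // placeholder bucket name
    Key: 'images/image.jpg',    // key must name an object, not a "folder"
    Body: response.body,        // now a Buffer, not a string
    ContentType: 'image/jpeg',
  })
  .promise();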

Doing it through streams as mentioned by Ermiya Eskandary seems to work:
const response = got.stream('https://example.com/image.jpg');
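For completeness, a minimal sketch of passing that stream on to S3 (same placeholder bucket and key as above); s3.upload accepts a readable stream as Body:
// `response` is the readable stream returned by got.stream() above
const uploadedFile = await s3
  .upload({
    Bucket: 'my_bucket',        // placeholder
    Key: 'images/image.jpg',    // placeholder
    Body: response,
    ContentType: 'image/jpeg',
  })
  .promise();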

Related

Read File Uploaded Via PostMan or other clients From NodeJS without using Multer

I am trying to read an avatar file uploaded via PostMan or any other client to my ExpressJS API.
So far, the recommendations I have been getting have all been to use Multer.
I don't want to use Multer as I am having some issues with it. I want to be able to read the file directly and upload it to a remote location of my choice.
Here is the code I have, but it is not working:
const getS3Params = (file) => {
  let fileName = file.name;
  let fileType = file.mimetype;
  let fileContent = fs.readFileSync(file); /** Getting an error that says TypeError [ERR_INVALID_ARG_TYPE]: The "path" argument must be of type string or an instance of Buffer or URL. Received an instance of Object **/
  return {
    Bucket: process.env.S3_BUCKET_NAME,
    Key: fileName,
    ACL: 'public-read',
    ContentType: fileType,
    Body: fileContent
  };
};
Is there a way to read the content of the file without using Multer?
Thanks.
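One hedged possibility, assuming the file object comes from a body parser such as express-fileupload (the question doesn't say which one is in use): that middleware already exposes the uploaded bytes as a Buffer on file.data, so there is no path to hand to fs.readFileSync at all. A minimal sketch under that assumption:
// Sketch assuming express-fileupload, where req.files.<field> is the uploaded file
// and file.data is already a Buffer (adjust the property names for your parser).
const getS3Params = (file) => ({
  Bucket: process.env.S3_BUCKET_NAME,
  Key: file.name,
  ACL: 'public-read',
  ContentType: file.mimetype,
  Body: file.data, // the file's bytes; no fs.readFileSync needed
});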

AWS S3 Image Upload Error: Unsupported Body Payload Object

I try to upload a local image to an s3 bucket and keep getting
Error: Unsupported body payload object
The image can be jpg, jpeg, or png. I've read that images can only be uploaded to s3 as base64 so I read the file as a base64 string using readAsDataURL, and then create a Buffer with it.
const reader = new FileReader()
reader.readAsDataURL(file)
const base64str = reader.result.replace(/^data:image\/\w+;base64,/, "");
const fileContent = Buffer.from(base64str,'base64')
The above code is executed on the react frontend where fileContent is set to a variable through a hook, and that variable gets PUT-ed to my server with
static uploadImage = (id, fileContent) => {
  return axios.put(ROOT_URL + "/images/" + id, fileContent);
}
Then its uploaded with
await s3.upload({
  Bucket: BUCKET_NAME,
  Key: KEY,
  Body: fileContent,
  ContentType: TYPE,
}).promise();
I have tried many different solutions I've found on this website, and still receive the error. Not sure what I am missing.
EDIT: Here is one of the threads that I found helpful, but did not fix the error.
Uploading base64 encoded Image to Amazon S3 via Node.js
Not sure where you read that images must be base64, but you can simply read the file content with something like fs and upload it to S3 as it is.
// Read content from the file
const fileContent = fs.readFileSync(fileName);

// Setting up S3 upload parameters
const params = {
  Bucket: BUCKET_NAME,
  Key: 'cat.jpg', // File name you want to save as in S3
  Body: fileContent
};

// Uploading files to the bucket
await s3.upload(params).promise();
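If the upload has to happen from the browser rather than the server, the browser build of the SDK also accepts a Blob or File directly as Body, so a hedged alternative (assuming browser credentials are already configured, e.g. via a Cognito identity pool) is to skip the base64 round trip entirely:
// Browser-side sketch: upload the File object from an <input type="file"> as-is.
// Assumes AWS credentials are already configured in the browser (assumption).
await s3.upload({
  Bucket: BUCKET_NAME,
  Key: 'images/' + file.name,
  Body: file,             // a File/Blob works as Body in the browser
  ContentType: file.type,
}).promise();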

Storing and retrieving a base64 encoded string in Firebase storage

I have a Base64 encoded string (this is AES encrypted string).
I am trying to store it in Firebase Storage and then download it from it.
I have tried multiple options e.g
pathReference.putString(data, 'base64')
This does not retain the base64 string in storage but converts it into integers. I have also tried providing a {contentType: "application/Base64"}, but putString doesn't seem to work.
I then tried making it a blob
blob = new Blob([data], {type: "application/Base64"})
await pathReference.put(blob)
With this I am able to get the base64 encoded string into storage (though there are newlines added to the string).
When I download it with ES6 fetch I am not getting back the string
const url = await pathReference.getDownloadURL()
const response = await fetch(url)
const data = await response.blob()
Instead getting an error Unhandled promise rejection: URIError: URI error
I am just looking for a very simple upload and download sample for base64 encoded string to firebase storage.
Any help is greatly appreciated.
I was able to make it work, though some of the Firebase / fetch behavior under React Native is still unclear to me.
To upload a base64 encoded string to firebase storage I used the following snippet.
Here "data" is already a Base64 encoded string.
const pathReference = storage.ref(myFirebaseStorageLocation)
const blob = new Blob([data], {type: "application/Base64"})
await pathReference.put(blob)
I verified the contents in Firebase storage and downloaded the file manually which also looked fine.
Then, to download it in a React Native / Expo project, there were several roadblocks, but what finally worked was this:
I had to add a btoa() function to the global namespace.
I then used the following code to download the file and read it back as a Base64 string (which was surprisingly hard to get to):
const fetchAsBlob = url => fetch(url)
  .then(response => response.blob());

const convertBlobToBase64 = blob => new Promise((resolve, reject) => {
  const reader = new FileReader();
  reader.onerror = reject;
  reader.onload = () => {
    resolve(reader.result);
  };
  reader.readAsDataURL(blob);
});
const url = await pathReference.getDownloadURL()
const blob = await fetchAsBlob(url)
const doubleBase64EncodedFile = await convertBlobToBase64(blob)
const doubleEncodedBase64String = doubleBase64EncodedFile.split(',')[1]
const myBase64 = Base64.atob(doubleEncodedBase64String)
The caveat is that the FileReader reads the content and encodes it again into Base64 (so there is double encoding). I had to use Base64.atob() to get back my original Base64 encoded string.
Again this may be unique to the situation where there is fetch being called under a React Native Expo project, both of which have some additional quirks when it comes to handling blobs or Base64.
(PS: I tried response.blob() and response.buffer(), and tried everything including libraries that convert Blobs to Base64 strings, but ran into one issue or another each time. I also tried Expo FileSystem, downloading the file locally and reading it with FileSystem.readAsStringAsync, but that ran into native issues on iOS. tl;dr: the above solution worked, but if someone can provide an explanation or clarity on the other attempts, or a better solution, it would be greatly appreciated.
Also unclear is why Firebase Storage putString(data, 'base64') does not work.)
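For reference, a minimal sketch of that btoa()/atob() global shim under React Native / Expo, assuming the base-64 package from npm (an assumption; the answer doesn't name the library it used):
// Sketch: polyfill btoa/atob globally in the app's entry file (React Native / Expo).
// Assumes `npm install base-64` (assumption; any equivalent Base64 implementation works).
import { encode, decode } from 'base-64';

if (!global.btoa) {
  global.btoa = encode;
}
if (!global.atob) {
  global.atob = decode;
}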

Chrome Extension Upload Stream to S3

I'm trying to use S3 multipart upload to upload data to S3 using a stream on the client-side.
I'm using Browserify to convert the Nodejs code into a single file that can be loaded by the Chrome Extension.
Here is my code:
const Stream = require('stream');

var inputBytesReadable = new Stream.Readable();
// add data to the Stream

var s3 = new AWS.S3({
  params: { Bucket: bucketName }
});

var params = {
  Bucket: bucketName,
  Key: fileName,
  PartNumber: partNumber,
  UploadId: uploadId,
  Body: inputBytesReadable
};

s3.uploadPart(params, function(err, data) {
  if (err) {
    appendMessage("Error in uploading " + fileName + " part " + partNumber);
    console.log(err, err.stack); // an error occurred
  }
});
However, this results in the error: InvalidParameterType: Expected params.Body to be a string, Buffer, Stream, Blob, or typed array object
What am I doing incorrectly? Is there any way that I can pass a Stream to S3 in client-side JavaScript?
It seems that the AWS JavaScript SDK does not support stream inputs unless it is running in a Node.js environment; the parameter validation includes an environment check that only allows streams under Node.js.
Therefore, it seems that it is not possible to use streams with the AWS SDK on the front end.
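A hedged workaround in the browser is to buffer each part in memory and pass a Blob or typed array as Body instead of a stream, which the browser build of the SDK does accept. A sketch, assuming the part's data is already available as an ArrayBuffer named chunkArrayBuffer (a hypothetical variable for illustration):
// Sketch: wrap this part's bytes in a Blob instead of a Node stream.
var partBody = new Blob([chunkArrayBuffer]); // chunkArrayBuffer is assumed to hold this part's data

s3.uploadPart({
  Bucket: bucketName,
  Key: fileName,
  PartNumber: partNumber,
  UploadId: uploadId,
  Body: partBody // string, Buffer, Blob, or typed array are accepted
}, function(err, data) {
  if (err) {
    console.log(err, err.stack);
  }
});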

AWS SDK JS: Multipart upload to S3 resulting in Corrupt data

Trying to upload an mp4 file using the AWS JS SDK with a multipart upload, I keep getting a file corrupt error when I download it and try to play it on my local machine.
Gists of my code:
Initiating the multipart upload with params:
const createMultipartUploadParams = {
  Bucket: bucketname,
  Key: fileHash.file_name,
  ContentType: 'video/mp4' // TODO: Change hardcode
};
Call:
s3Instance.createMultipartUpload(createMultipartUploadParams, function(err, data) {
});
Doing the chunking:
Params:
const s3ChunkingParams = {
  chunkSize,
  noOfIterations,
  lastChunk,
  UploadId: data.UploadId
}
Reading the file:
const reader = new FileReader();
reader.readAsArrayBuffer(file)
Uploading each chunk:
reader.onloadend = function onloadend() {
  console.log('onloadend');
  const partUploadParams = {
    Bucket: bucketname,
    Key: file_name,
    PartNumber: i, // Iterating over all parts
    UploadId: s3ChunkingParams.UploadId,
    Body: reader.result.slice(start, stop) // Chunking up the file
  };
  s3Instance.uploadPart(partUploadParams, function(err, data1) {
  });
};
Finally completing the multipartUpload:
s3Instance.completeMultipartUpload(completeMultipartParams, function(err, data)
I am guessing the problem is how I am reading the file, so I tried content-encoding it to base64, but that makes the size unusually large. Any help is greatly appreciated!
Tried this too
The only thing that could corrupt the result is if you are uploading additionally padded content for your individual parts, which leads to the final object being wrong. I do not believe S3 is doing anything fishy here.
After uploading the file, you can verify the final size of the object; if it doesn't match your local copy, you know you have a problem somewhere.
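A minimal sketch of that size check with the SDK (reusing the bucket and key from the question) compares the uploaded object's ContentLength with the local file's size:
// Sketch: compare the uploaded object's size with the local file's size.
const head = await s3Instance.headObject({
  Bucket: bucketname,
  Key: fileHash.file_name,
}).promise();

console.log('S3 size:', head.ContentLength, 'local size:', file.size);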
Are you trying to upload from the browser?
Alternatively, you can look at https://github.com/minio/minio-js. It has a minimal set of abstracted APIs implementing the most commonly used S3 calls.
Here is a Node.js example of a streaming upload:
$ npm install minio
$ cat >> put-object.js << EOF
var Minio = require('minio')
var fs = require('fs')

// find out your s3 end point here:
// http://docs.aws.amazon.com/general/latest/gr/rande.html#s3_region
var s3Client = new Minio({
  url: 'https://<your-s3-endpoint>',
  accessKey: 'YOUR-ACCESSKEYID',
  secretKey: 'YOUR-SECRETACCESSKEY'
})

var file = 'your_localfile.zip'
var fileStream = fs.createReadStream(file)

fs.stat(file, function(e, stat) {
  if (e) {
    return console.log(e)
  }
  s3Client.putObject('mybucket', 'hello/remote_file.zip', 'application/octet-stream', stat.size, fileStream, function(e) {
    return console.log(e) // should be null
  })
})
EOF
putObject() here is a fully managed single function call; for file sizes over 5MB it automatically does a multipart upload internally. You can resume a failed upload as well, and it will start from where it left off by verifying the previously uploaded parts.
So you don't necessarily have to go through the trouble of writing lower-level multipart calls.
Additionally, this library is isomorphic and can be used in browsers as well.
