Upload Image to Azure Blob container using Axios [VueJs] - javascript

I am trying to use Axios's PUT function to upload an image to Azure Storage.
What I did is the following:
I created a storage account in Azure, then I added a CORS rule as follows: CORS rule
I created a blob container named user-pic.
I use Axios to make the request for me.
code:
function upload(formData) {
  // get the current date
  var currentdate = new Date();
  var Curr_date = currentdate.getDay() + '-' + currentdate.getMonth() + '-' + currentdate.getFullYear();
  // Here I am trying to convert the image to binary encoding.
  var data = btoa(formData);
  // The image URL, [ below there is an example from where I take this url ].
  const url = "https://XXXXXXX.blob.core.windows.net/XXXXXXXXXXXX";
  // Headers are required for me; do I need to add all headers in my code also in the CORS of the storage account?
  axios.put(url, data, {
    headers: {
      "Access-Control-Allow-Origin": "*",
      'Access-Control-Allow-Methods': 'GET, POST, PATCH, PUT, DELETE, OPTIONS',
      "Access-Control-Allow-Headers": "Origin, Content-Type, x-ms-*",
      "Content-Type": "image/png",
      "Content-Length": data.length, // here I am trying to get the size of the image
      "x-ms-date": Curr_date,
      "x-ms-version": sv,
      "x-ms-blob-type": "BlockBlob",
    }
  })
  .then(response => { console.log(response); console.log('correct!!'); })
  .catch(error => { console.log(error); console.log('error here!!'); });
}
What I mean by the comments inside the code:
The image URL should be in the same format as this: Blob SAS Url
Is the format of Curr_date correct for the x-ms-date header?
Is btoa the right function to convert the image to binary encoding?
Should I add all of the headers used in Axios to the storage account's CORS settings (in the allowed-headers field)?
What is the correct way to get the size of the image? (The .size property? I am actually passing the formData after appending all images to it.)
After running the program, I got two error messages in the console:
How can I solve these problems?
Update:
I made these changes:
I changed the CORS settings: CORS
I got this error message: Error Msg

The CORS rule defined for your Azure Storage account is missing your http://localhost origin.
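Beyond CORS, several of the asker's questions can be answered with a corrected sketch. Assumptions: the URL is a SAS URL (so no manual signing is needed) and axios is loaded in the browser; the function names and the x-ms-version value below are illustrative, not from the original post.

```javascript
function azureBlobHeaders(file) {
  return {
    'x-ms-blob-type': 'BlockBlob',
    // Azure expects an RFC 1123 date ("Mon, 01 Jan 2024 00:00:00 GMT"),
    // which toUTCString() produces -- not a d-m-yyyy string.
    'x-ms-date': new Date().toUTCString(),
    'x-ms-version': '2020-04-08', // example service version
    'Content-Type': file.type || 'application/octet-stream',
  };
}

function uploadToAzure(sasUrl, file) {
  // Send the File/Blob object itself as the body. btoa(formData) only
  // base64-encodes the string "[object FormData]", and the browser sets
  // Content-Length itself (it is a forbidden header you cannot set by hand).
  return axios.put(sasUrl, file, { headers: azureBlobHeaders(file) });
}
```

Note also that the Access-Control-Allow-* headers are response headers the server sends; they belong in the storage account's CORS rule, not in the request.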

Related

Can not PUT file to google cloud storage bucket from JS in browser

I am using google-cloud-storage to generate a presigned URL to upload a file from the browser.
BUCKET.cors = [
    {
        "origin": ["*"],
        "responseHeader": [
            "Content-Type",
            "x-goog-resumable"],
        "method": ['PUT', 'POST'],
        "maxAgeSeconds": 3600
    }
]
BUCKET.patch()
blob = BUCKET.blob(blob_name)
url = blob.generate_signed_url(
    version="v4",
    # This URL is valid for 15 minutes
    expiration=datetime.timedelta(minutes=15),
    # Allow PUT requests using this URL.
    method="PUT",
    content_type="application/octet-stream",
)
The URL I get back looks like this. The blob_name I use is "<UUID>/<UUID>/test.png".
I then PUT with fetch to the returned URL.
const { signed_url } = await (await fetch(baseUrl + "/upload/signed?blob_name=" + path, {
  credentials: "include",
})).json();
console.log('url', signed_url);
// upload to google
const url = await (await fetch(signed_url, { method: "PUT", body: file })).json();
Two problems occur
I get a CORS error: the PUT response does not include the Access-Control-Allow-Origin: "*" header.
When I bypass the first problem by using a CORS plugin I get the following response:
<?xml version='1.0' encoding='UTF-8'?><Error><Code>SignatureDoesNotMatch</Code><Message>The request signature we calculated does not match the signature you provided. Check your Google secret key and signing method.</Message><StringToSign>GOOG4-RSA-SHA256
20210910T200720Z
20210910/auto/storage/goog4_request
3df68a505fbb635cc4092462461b715ad31b4e83db668726ca5d87ebe1d64d9a</StringToSign><CanonicalRequest>PUT
/my-project-dev/edb5b48fcd724ab4a46afff3da5efa20/5b76eed38ed2473ea7c0b83e0e1d081c/52b675f15e2147849c911fe4f35951e9/test.png
X-Goog-Algorithm=GOOG4-RSA-SHA256&X-Goog-Credential=dev-laptop%40my-project.iam.gserviceaccount.com%2F20210910%2Fauto%2Fstorage%2Fgoog4_request&X-Goog-Date=20210910T200720Z&X-Goog-Expires=900&X-Goog-SignedHeaders=content-type%3Bhost
content-type:image/png
host:storage.googleapis.com
content-type;host
UNSIGNED-PAYLOAD</CanonicalRequest></Error>
I tried finding JS fetch examples on stackoverflow and the gcloud documentation site. I also checked if there is an option for the python library that I am missing. Now I am out of options. It seems that the storage buckets were not designed for this use case. Help would be greatly appreciated.
As mentioned by Mabel A. in the comments:
The problem was a mismatch of the content type in the creation of the presigned URL.
I needed to send content-type: application/octet-stream when uploading the file.
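The signature error above boils down to one rule: every header signed into a V4 URL must be sent with exactly the signed value. A small helper (the names below are illustrative, not from the thread) can read the signed-header list off the URL so the client knows what it must match:

```javascript
// List which headers were signed into a V4 signed URL. The browser must
// send each of these with exactly the value used at signing time,
// otherwise GCS answers SignatureDoesNotMatch.
function signedHeaders(signedUrl) {
  const params = new URL(signedUrl).searchParams;
  return (params.get('X-Goog-SignedHeaders') || '').split(';');
}

// Sketch of the matching client-side PUT (assumes the URL was generated
// with content_type="application/octet-stream", as in the resolution above).
async function putToSignedUrl(signedUrl, file) {
  const res = await fetch(signedUrl, {
    method: 'PUT',
    headers: { 'Content-Type': 'application/octet-stream' }, // must match the signed content_type
    body: file,
  });
  if (!res.ok) throw new Error(`upload failed: ${res.status}`);
  return res; // a successful PUT has an empty body -- don't call .json() on it
}
```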

Sending file to s3 bucket using presigned url gives 403 error (SignatureDoesNotMatch)

I'm generating a presigned URL on the server side of my application that looks like the following:
const s3 = new AWS.S3({
  region: 'us-west-2',
  signatureVersion: 'v4',
});
const params = {
  Bucket: 'bucketName',
  Key: fileName,
  ContentType: 'text/csv',
  Expires: 120
};
return s3.getSignedUrl('putObject', params);
Then I am trying to put the file into my S3 bucket using an ajax call that looks like this:
$.ajax({
  url: url,
  type: 'PUT',
  data: file,
  processData: false,
  contentType: 'text/csv',
  success: function () {
    console.log('Uploaded data successfully.');
  },
  error: function (xhr) {
    console.log(xhr);
  }
});
However, when I try to do this, I get a 403 Forbidden error and the XML says SignatureDoesNotMatch and The request signature we calculated does not match the signature you provided. Check your key and signing method.
I have made sure that the ContentType is the same for when I am generating the presigned URL AND when I am putting the file in the S3 Bucket. No matter what I do, nothing works and I still get this 403 error. I have tried to do binary/octet-stream for the ContentType and that didn't work. I tried to do ACL: 'public-read' and that didn't work. CORS is configured in my bucket as well so I know that isn't the issue (I had a different error for this). I did notice that in the Network calls, it says this:
Request URL: https://bucket-name.s3.us-west-2.amazonaws.com/fileName?Content-Type=text%2Fcsv&X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=ASIAXBWQUNDKKHHYM5GH%2F20210219%2Fus-west-2%2Fs3%2Faws4_request&X-Amz-Date=20210219T015308Z&X-Amz-Expires=120&X-Amz-Security-Token=FwoGZXIvYXdzEIv%2F%2F%2F%2F%2F%2F%2F%2F%2F%2FwEaDOQ1EV3FIT00%2Fuo1BCKyAROysQ9G5PY9RFLeS8GwBLPEo7LCEJWstwki7nZatfddDczn0GKO7GwNEe5Qs%2BsLtMZv2xPTXo3Bwur%2BIhH7jV35HHQm976s1mOf8JZe2g%2BimUGNwLxBKY%2BrhWsN8yryNrd6k1VBRf1R9No9Jh%2FIumuwiVEoFLvVBHtILB9i53FdDo%2BJ8T%2BMCliV22SGBAwPQnYk8xvbo1%2B%2B%2B%2BAu%2FwVFl3tvG2yo7PHLzPpKqcpyJq4pMwko3aK8gQYyLUl0hZCTtit2cvBD5YAo57aMZBdTlpN5Wx3q27PSQZ1d8Bq1lQY%2BIQVkPlxZ%2Fw%3D%3D&X-Amz-Signature=446c16abde4d278c42c72373c85a6d44f959330468076e6bd888a8e2816b2b86&X-Amz-SignedHeaders=host
Request Method: PUT
Status Code: 403 Forbidden
Remote Address: 52.218.246.105:443
Referrer Policy: strict-origin-when-cross-origin
For Response Headers:
Referrer Policy: strict-origin-when-cross-origin
Access-Control-Allow-Methods: GET, PUT, POST, DELETE
Access-Control-Allow-Origin: *
Connection: close
Content-Type: application/xml
Date: Fri, 19 Feb 2021 01:53:09 GMT
Server: AmazonS3
Transfer-Encoding: chunked
Vary: Origin, Access-Control-Request-Headers, Access-Control-Request-Method
And Request Headers has Content-Type: text/csv... Not sure if this matters at all though
Any help will be greatly appreciated. I literally searched all over Google, and for some reason nothing people suggested worked for me.
Adding an answer because I cannot comment.
Can you check why content-type isn't included in your "X-Amz-SignedHeaders" along with "host"?
Since it is part of "params" and therefore part of the signature calculation, you should see
X-Amz-SignedHeaders=content-type%3Bhost
instead of ?Content-Type=text%2Fcsv
Also check your API key and secret, as a SignatureDoesNotMatch error usually indicates an API key and secret mismatch; see this previous answer:
https://stackoverflow.com/a/38837566/6687588
Turns out that something was wrong with my bucket, which is why I got this error. What exactly is different about the bucket I'm trying to write to is still unknown. I tried another S3 bucket and it worked fine.
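Whatever the bucket-specific issue was, the general defense against this class of SignatureDoesNotMatch errors is to make the signed ContentType and the uploaded Content-Type come from one place. A sketch, assuming aws-sdk v2 and jQuery as in the question (the constant and helper names are illustrative):

```javascript
// One shared constant rules out a ContentType mismatch between
// signing and uploading.
const CSV_CONTENT_TYPE = 'text/csv';

// Server side: the ContentType here becomes part of the signature.
function signedUrlParams(bucket, key) {
  return {
    Bucket: bucket,
    Key: key,
    ContentType: CSV_CONTENT_TYPE,
    Expires: 120,
  };
}
// const url = s3.getSignedUrl('putObject', signedUrlParams('bucketName', fileName));

// Client side: the PUT must send the identical Content-Type.
// $.ajax({ url, type: 'PUT', data: file, processData: false,
//          contentType: CSV_CONTENT_TYPE, ... });
```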

Send file with form-data and axios

I am trying to send a video to a video site. I am able to upload the video using the REST API and Postman, so I know the API works as intended. Now I want to do exactly the same request using axios. I have code that looks like the example on how to use form-data with axios:
const form = new FormData();
const stream = fs.createReadStream(PATH_TO_FILE);
form.append('image', stream);
// In a Node.js environment you need to set the boundary in the 'Content-Type' header by calling the method `getHeaders`
const formHeaders = form.getHeaders();
axios.post('http://example.com', form, {
  headers: {
    ...formHeaders,
  },
})
.then(response => response)
.catch(error => error);
I get the error data: 'Content-Length is required'.
Any ideas?
Maybe I got your question wrong, but it sounds like you want to add Content-Length to the header.
I can see you are uploading a video stream, so first you have to calculate the length of the data chunk:
('Content-Length', File.getSize(stream))
Reference: Can I stream a file upload to S3 without a content-length header?
You can make the POST request multipart: 'Content-Type': 'multipart/form-data'.
That is the preferable way to send large data to a server.
You can check this link: How do I set multipart in axios with react?
If I got your question wrong, please comment or reply. Thanks
The solution to my problem was to set Content-Length accordingly:
"Content-Length": fs.statSync(filePath)['size']
I think the best way to handle this is to actually use the FormData's own method:
const headers = { 'content-length': formData.getLengthSync(), ...formData.getHeaders() }
This will be more accurate because it includes any other data you may add.
To expound, if you are using a ReadStream, you must use the async function instead.
const { promisify } = require('util')
const getLength = promisify(formData.getLength.bind(formData))
const contentLength = await getLength()
const headers = { 'content-length': contentLength, ...formData.getHeaders() }

Dropbox download file API stopped working with 400 error

I use the Dropbox download-file API and I have a token, but it returns a 400 Bad Request error:
"Error in call to API function "files/download": Must provide HTTP header "Authorization" or URL parameter "authorization"
I followed the Dropbox API docs, but it doesn't work.
How do I fix it?
This is my code (Angular 2):
downloadFile(fileid) {
  let headers = new Headers();
  headers.append('Authorization', 'Bearer ' + this.accessToken);
  headers.append('Dropbox-API-Arg', 'path:' + fileid);
  return this.http.post('https://content.dropboxapi.com/2/files/download',
    new RequestOptions({ headers: headers, responseType: ResponseContentType.ArrayBuffer })
  ).map((res) => {
    let arrayBuffer = res.arrayBuffer();
    let contentType = res.headers.get('content-type');
    return {
      fileid: fileid,
      blob: new Blob([arrayBuffer], { type: contentType })
    };
  });
}
I use the Dropbox v2 API on Android. Just like you, I got a 400 Bad Request. It turns out that Android's HttpUrlConnection sets a default "Content-Type" header value, and the Dropbox download API requires "Content-Type" to be missing or empty. I don't have the same issue on iOS, though.
So maybe in Angular 2 you need to do something like:
headers.append('Content-Type','');
Also, the 'Dropbox-API-Arg' header needs to look like:
headers.append('Dropbox-API-Arg','{\"path\": \"/filepath\"}');
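Putting the answer's two points together: Dropbox content endpoints read their arguments from the Dropbox-API-Arg header as a JSON string (not as "path:&lt;id&gt;"), and Content-Type must be absent or empty. A framework-agnostic sketch (the helper name is illustrative; accessToken is yours):

```javascript
function dropboxDownloadHeaders(accessToken, path) {
  return {
    'Authorization': 'Bearer ' + accessToken,
    // JSON.stringify builds and escapes the argument JSON for us.
    'Dropbox-API-Arg': JSON.stringify({ path: path }),
    // Deliberately no Content-Type: the endpoint requires it to be
    // missing or empty.
  };
}

// Usage with fetch (the file id / path value is an example):
// const res = await fetch('https://content.dropboxapi.com/2/files/download',
//   { method: 'POST', headers: dropboxDownloadHeaders(token, '/folder/file.png') });
```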

Request.post on already uploaded image file

I am using AngularJS and Node.js in the project and I am working on file uploads. I want to send the POST request to the URL endpoint in a secure way, as I need to attach an access token to the request. So what I did was add a directive to choose the file from the UI, and once it gets the file, I append it to a FormData object like this in the controller:
var fd = new FormData();
fd.append('file', myFile);
and send this FormData object to the Node.js server as described here: http://uncorkedstudios.com/blog/multipartformdata-file-upload-with-angularjs
I expect this request to go to my Node.js server URL, and from there I will make another POST request to the external web service:
$http.post('api/collections/upload', fd, {
  transformRequest: angular.identity,
  headers: {
    'Content-type': undefined
  }
});
So it will attach the right content type and boundaries to the request. I am getting the file on the Node.js server side when I do:
function(req, res) {
  console.log(req.files); // I am able to see the file content
}
It is uploaded on the Node.js server side.
Now I want to make a POST request using req.files to a different endpoint, along with the proper accessToken and headers. I tried many things but was not able to make the request go through. I am not sure how to attach the image data / req.files to the request. I tried these two things mentioned in the request npm module docs https://www.npmjs.org/package/request
1)
request.post({
  url: 'https://www.example.com/uploadImage',
  headers: {
    'Authorization': <accessToken>,
    'Content-type': 'multipart/form-data'
  },
  body: req.files
});
I don't know how to attach binary data to this request or how to set the boundary. Is a boundary needed when you want to send the uploaded image with this request?
2)
fs.createReadStream(req.files.file.path, { encoding: 'base64' }).pipe(request.post({
  url: 'https://www.example.com/uploadImage',
  headers: {
    'Content-type': 'multipart/form-data'
  }
}));
Can someone please suggest some ideas on how can I send this post request using request npm module? Thanks.
Documentation here has lots of examples of doing exactly what you describe:
https://github.com/mikeal/request#streaming
As can be seen in that link, the request library adds a .pipe() method to your http's req object, which you should be able to use like the examples in the link:
function(req, res) {
  req.pipe(request.post('https://www.example.com/uploadImage'));
}
Or something similar.
You were nearly there with your #2 try, but that would only work if you had previously written the file out to disk and were reading it in with fs.createReadStream().
Your suggestion helped me to at least know that what I was trying was right. Another article that solved my problem was this: http://aguacatelang.wordpress.com/2013/01/05/post-photo-from-node-js-to-facebook/. Basically, here is what I did, and it worked. Thanks for your suggestion.
var form = new FormData();
form.append('file', fs.createReadStream(__dirname + '/image.jpg'));
var options = {
  url: 'https://www.example.com/uploadImage?access_token=' + <accesstoken>,
  headers: form.getHeaders()
};
form.pipe(request.post(options, function(err, res) {
  if (err) {
    log.debug(err);
  }
  else {
    log.debug(res);
  }
}));
