I would like to load an image from an AWS S3 bucket using a pre-signed URL for image cropping purposes. I'm using ngx-image-cropper and here's my code:
app-cropper.component.ts:
getBase64FromImageUrl(url: string) {
  this.httpClient
    .get(url, { responseType: 'blob' })
    .subscribe((blob) => {
      const reader = new FileReader();
      reader.addEventListener('load', () => {
        this.imageBase64 = reader.result;
      });
      reader.readAsDataURL(blob);
    });
}
app-cropper.component.html:
<image-cropper
class="ly-cropper"
id="cropper"
*ngIf="imageBase64; else spinner"
[imageFile]="imageBase64"
[maintainAspectRatio]="isMaintainAspectRatio$ | async"
[aspectRatio]="ratio"
[autoCrop]="true"
format="{{ asset?.fileType }}"
(imageCropped)="onImageCropped($event)"
(imageLoaded)="onImageLoaded()"
(cropperReady)="onCropperReady()"
(loadImageFailed)="onImageLoadFailed()"
[cropper]="cropper"
[canvasRotation]="canvasRotation"
></image-cropper>
<ng-template #spinner>
<ly-spinner mode="indeterminate" color="gray"> </ly-spinner>
</ng-template>
<div *ngIf="isLoading$ | async" class="loading-overlay">
<ly-spinner mode="indeterminate" color="gray" [diameter]="200" [showLabel]="false"></ly-spinner>
</div>
With most browsers, the image loads perfectly fine. However, in Chrome only, I keep getting an error saying:
Access to XMLHttpRequest at 'xxx' from origin 'http://localhost:4200' has been blocked by CORS policy: No 'Access-Control-Allow-Origin' header is present on the requested resource.
I checked most of the posts related to CORS issues with S3 buckets, and the bucket's CORS configuration is set up as follows, but the issue that happens specifically in Chrome still couldn't be fixed. Does anybody know how I can fix this?
[
{
"AllowedHeaders": [
"*"
],
"AllowedMethods": [
"GET"
],
"AllowedOrigins": [
"*"
],
"ExposeHeaders": []
}
]
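For reference, the same CORS configuration can also be applied programmatically with the AWS SDK v3; here is a minimal sketch (the bucket name and region are placeholders, not my actual setup):
const { S3Client, PutBucketCorsCommand } = require('@aws-sdk/client-s3');

// Sketch: apply the CORS rules shown above to the bucket.
const s3 = new S3Client({ region: 'us-east-1' }); // placeholder region

async function applyCorsRules() {
  await s3.send(new PutBucketCorsCommand({
    Bucket: 'my-bucket', // placeholder bucket name
    CORSConfiguration: {
      CORSRules: [
        { AllowedHeaders: ['*'], AllowedMethods: ['GET'], AllowedOrigins: ['*'], ExposeHeaders: [] },
      ],
    },
  }));
}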
Related
My Amazon S3 buckets were working fine until I decided to update my AWS SDK from v2 to the modular v3.
I am able to programmatically upload the file using the SDK, but I am not able to upload files using the pre-signed URL it generates.
const { getSignedUrl } = require('@aws-sdk/s3-request-presigner');
const { S3Client, PutObjectCommand } = require('@aws-sdk/client-s3');

const s3Client = new S3Client({ region: 'us-east-2' });
const params = {
  Bucket: '<bucket>',
  Key: '1234567890.jpg',
  ACL: 'private',
  ContentType: 'image/jpg',
  // Body: '<base64 encoded image content>'
};
const command = new PutObjectCommand(params);
// await s3Client.send(command); // works fine
const signedUrl = await getSignedUrl(s3Client, command); // generated signed URL fails to upload image
When I try to make a PUT request using the generated pre-signed URL, I get a 403 HTTP error code and a SignatureDoesNotMatch message.
Please guide me on what I might be missing, because I've been working on this for two days now.
Have you tried this?
https://www.digitalocean.com/community/questions/signature-does-not-match-when-putting-to-presigned-spaces-upload-url
Here is a link with more details:
https://www.msp360.com/resources/blog/s3-pre-signed-url-guide/
I got mine to work after I created an IAM user with programmatic access (aws-access-key-id, aws-secret-access-key) and a policy attached that allows putting objects to the bucket:
{
"Version": "2012-10-17",
"Statement": [
{
"Sid": "VisualEditor0",
"Effect": "Allow",
"Action": "s3:PutObject",
"Resource": "arn:aws:s3:::bucket-name/*"
}
]
}
I also added this IAM user's credentials to the client config:
const credentials = { accessKeyId, secretAccessKey }
const params = { credentials, region }
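To make the flow concrete, here is a minimal sketch of how those pieces fit together with the v3 presigner (the bucket name, key, and expiry are placeholders):
const { S3Client, PutObjectCommand } = require('@aws-sdk/client-s3');
const { getSignedUrl } = require('@aws-sdk/s3-request-presigner');

// Credentials of the IAM user created above (placeholders).
const credentials = { accessKeyId, secretAccessKey };
const s3Client = new S3Client({ credentials, region });

// Sign a plain PutObject request; note that ACL/ContentType are left out.
async function createUploadUrl() {
  const command = new PutObjectCommand({ Bucket: 'my-bucket', Key: '1234567890.jpg' });
  return getSignedUrl(s3Client, command, { expiresIn: 3600 });
}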
I also applied CORS rules to the bucket as follows:
[
{
"AllowedHeaders": [],
"AllowedMethods": [
"GET"
],
"AllowedOrigins": [
"*"
],
"ExposeHeaders": [],
"MaxAgeSeconds": 0
},
{
"AllowedHeaders": [
"*"
],
"AllowedMethods": [
"PUT"
],
"AllowedOrigins": [
"*"
],
"ExposeHeaders": [],
"MaxAgeSeconds": 3600
}
]
I didn't need to add ACL, ContentType to the config.
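With those CORS rules in place, the browser-side upload with the pre-signed URL is then a plain cross-origin PUT; a small sketch follows (signedUrl comes from the server code above, file from a file input):
// Sketch of the cross-origin PUT that the second CORS rule above allows.
// Since ACL/ContentType were left out of the signing params, no extra
// headers need to match the signature here.
async function uploadWithSignedUrl(signedUrl, file) {
  const response = await fetch(signedUrl, {
    method: 'PUT',
    body: file, // a File/Blob picked by the user
  });
  if (!response.ok) {
    throw new Error('Upload failed with status ' + response.status);
  }
}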
I have spent about 2 weeks trying to debug this but no luck.
I have created a Lambda function in Python that creates a charge. This works fine with Stripe Checkout's simple script: it invokes the function and returns the response without any issues.
try:
    stripe.api_key = "*******PRIVATE KEY***********"
    Tokenstring = event.get('body')
    Stripe_List = Tokenstring.split('=')
    Token = Stripe_List[1].split('&')[0]
    Email = Stripe_List[-1]
    Email = Email.replace('%40', '@')  # decode the URL-encoded @ in the email address
    charge = stripe.Charge.create(
        amount=100,
        currency="gbp",
        description="Example charge",
        source=Token,
        receipt_email=Email
    )
    print('Full SUCCESSFUL Transaxn Info ==== {}'.format(event))
    return {
        "statusCode": 302,
        "headers": {
            "Location": "https://example.com/#success"
        }
    }
# (except clause omitted in this excerpt)
This is invoked very simply in the HTML body with <form action="https://XXXXXXX.execute-api.eu-central-1.amazonaws.com/beta" method="POST">.
Now, when I try to use the custom Stripe Checkout code, I get:
Response to preflight request doesn't pass access control check: No 'Access-Control-Allow-Origin' header is present on the requested resource. Origin 'null' is therefore not allowed access.
My JavaScript code is:
var handler = StripeCheckout.configure({
key: '*****PRIVATE KEY*****',
image: 'logo.png',
locale: 'auto',
token: function(token) {
var xhr = new XMLHttpRequest();
xhr.open("POST","https://XXXXXXX.execute-api.eu-central-1.amazonaws.com/beta", true);
xhr.setRequestHeader('Content-Type','application/json');
xhr.setRequestHeader('Access-Control-Allow-Origin','*');
xhr.onreadystatechange = handler;
xhr.send(JSON.stringify({
body : token
}));
}
});
I have set up an OPTIONS method in Amazon API Gateway to respond to preflight requests and have enabled CORS there.
How can I pass the preflight request and let the Lambda function execute?
Aside from enabling CORS on the API Gateway Console, your Lambda function itself has to return those CORS headers.
return {
"statusCode": 302,
"headers": {
"Location": "https://example.com/#success",
"Access-Control-Allow-Origin": "*",
}
}
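On the client side, note that Access-Control-Allow-Origin is a response header, not something the request should send; the headers returned by the Lambda are what satisfy the browser's check. A minimal sketch of the token POST under that assumption (the endpoint is the question's placeholder URL):
// Sketch: send the Stripe token to the API Gateway endpoint without trying to
// set any CORS headers on the request; CORS is handled by the response above.
function sendToken(token) {
  return fetch('https://XXXXXXX.execute-api.eu-central-1.amazonaws.com/beta', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ body: token })
  }).then(function (response) {
    console.log('Charge request finished with status', response.status);
    return response;
  });
}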
I'm trying to write a small add-on for Firefox using the WebExtensions structure.
This add-on should read a local file's content by its absolute path:
"/home/saba/desktop/test.txt"
manifest.json
{
"manifest_version": 2,
"name": "Test - load files",
"version": "0.0.1",
"description": "Test - load files",
"permissions": [ "<all_urls>" ],
"background": {
"scripts": [ "main.js" ]
}
}
Here is what I tried so far (inside main.js):
Using XMLHttpRequest
function readFileAjax(_path){
var xhr = new XMLHttpRequest();
xhr.onloadend = function(event) {
console.log("onloadend", this);
};
xhr.overrideMimeType("text/plain");
xhr.open("GET", "file:///"+_path);
xhr.send();
}
readFileAjax("/home/saba/desktop/test.txt");
This failed. I can't figure out why it always returns an empty response
(test.txt contains "test", and the path is correct):
onloadend XMLHttpRequest {
onreadystatechange: null,
readyState: 4,
timeout: 0,
withCredentials: false,
upload: XMLHttpRequestUpload,
responseURL: "",
status: 0,
statusText: "",
responseType: "",
response: ""
}
Using FileReader
function readFileFR(_path){
var reader = new FileReader();
reader.addEventListener("loadend", function() {
console.log("loadend", this.result)
});
reader.readAsText(file); // file ????
}
readFileFR("/home/saba/desktop/test.txt");
but here I got stuck because of the file argument.
This method usually goes along with an input type="file" tag, which gives back a .files array (but I only have a local path string).
I searched whether it was possible to create a new Blob or File variable from an absolute local file path, but it seems that isn't possible.
Using WebExtensions API
I didn't find any clue from the documentation pages on how to do this.
Isn't there (maybe) some kind of WebExtensions API which makes this possible like in the SDK?
https://developer.mozilla.org/en-US/Add-ons/SDK/Low-Level_APIs/io_file
https://developer.mozilla.org/en-US/Add-ons/SDK/Low-Level_APIs/io_text-streams
What am I doing wrong or missing?
...is it possible to get the content of a local file by its absolute path with a WebExtensions add-on?
I finally found a way to do this using the Fetch and FileReader APIs.
Here is what I came up with:
function readFile(_path, _cb){
fetch(_path, {mode:'same-origin'}) // <-- important
.then(function(_res) {
return _res.blob();
})
.then(function(_blob) {
var reader = new FileReader();
reader.addEventListener("loadend", function() {
_cb(this.result);
});
reader.readAsText(_blob);
});
};
Using the example in my question, this is how to use it:
readFile('file:///home/saba/desktop/test.txt', function(_res){
console.log(_res); // <-- result (file content)
});
ES6 with promises
If you prefer to use Promises rather than callbacks:
let readFile = (_path) => {
return new Promise((resolve, reject) => {
fetch(_path, {mode:'same-origin'})
.then(function(_res) {
return _res.blob();
})
.then(function(_blob) {
var reader = new FileReader();
reader.addEventListener("loadend", function() {
resolve(this.result);
});
reader.readAsText(_blob);
})
.catch(error => {
reject(error);
});
});
};
Using it:
readFile('file:///home/saba/desktop/test.txt')
.then(_res => {
console.log(_res); // <-- result (file content)
})
.catch(_error => {
console.log(_error);
});
This doesn't work, or at least not any longer, taking the accepted answer into consideration.
Add-ons run in a fake root, meaning you can only ever access files which have been:
Shipped with your extension [1], using e.g. fetch(), or
Opened interactively (meaning initiated by the user, using either the file picker or drag & drop) through the File() constructor [2] (see the sketch after the links below).
Everything else will lead to a Security Error: Content at moz-extension://... may not load data from file:///..., causing fetch() to throw the aforementioned TypeError: NetworkError when attempting to fetch resource.
[1] https://developer.mozilla.org/en-US/docs/Mozilla/Add-ons/WebExtensions/manifest.json/web_accessible_resources
[2] https://developer.mozilla.org/en-US/docs/Mozilla/Add-ons/WebExtensions/Working_with_files#open_files_in_an_extension_using_a_file_picker
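For completeness, a minimal sketch of option 2, reading a file the user picked through a file input (the element id is made up):
// Sketch: read a user-selected file in an extension page.
// "file-picker" is a hypothetical <input type="file"> in the extension's UI.
document.getElementById('file-picker').addEventListener('change', (event) => {
  const file = event.target.files[0]; // a File object handed over by the browser
  const reader = new FileReader();
  reader.addEventListener('loadend', () => {
    console.log(reader.result); // the file content as text
  });
  reader.readAsText(file);
});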
I'm attempting to fetch a Vine thumbnail following their docs, with the following code:
var onGetVineThumbnailSuccess = function( videoUrl ) {
return function( response ) {
var args = { videoUrl: videoUrl };
args.thumbnailUrl = response['thumbnail_url']; // jshint ignore:line
$rootScope.$broadcast( 'event:onGetVineThumbnailSuccess', args);
};
};
var getVineThumbnail = function ( videoUrl ) {
$http
.get( 'https://vine.co/oembed.json?url=' + encodeURIComponent( videoUrl ) )
.then( onGetVineThumbnailSuccess( videoUrl ) );
};
but in the console I get this error:
XMLHttpRequest cannot load https://vine.co/oembed.json?url=https%3A%2F%2Fvine.co%2Fv%2FeV1mMuab7Mp. Response to preflight request doesn't pass access control check: No 'Access-Control-Allow-Origin' header is present on the requested resource. Origin 'http://localhost:9000' is therefore not allowed access.
By the way, this link works if I put it directly in the browser's URL bar: https://vine.co/oembed.json?url=https%3A%2F%2Fvine.co%2Fv%2FeV1mMuab7Mp. I obtain this JSON:
{
"version": 1.0,
"type": "video",
"cache_age": 3153600000,
"provider_name": "Vine",
"provider_url": "https://vine.co/",
"author_name": "Evengelia",
"author_url": "https://vine.co/u/1204040590484971520",
"title": "Everything was beautiful on this day. #6secondsofcalm",
"thumbnail_url": "https://v.cdn.vine.co/r/videos/59734161E81269170683200901120_45a46e319ea.1.1.8399287741149600271.mp4.jpg?versionId=tc3t.oqGtjpJNlOX1AeM1CAnWONhbRbQ",
"thumbnail_width": 480,
"thumbnail_height": 480,
"html": "<iframe class=\"vine-embed\" src=\"https://vine.co/v/eV1mMuab7Mp/embed/simple\" width=\"600\" height=\"600\" frameborder=\"0\"><\/iframe><script async src=\"//platform.vine.co/static/scripts/embed.js\"><\/script>",
"width": 600,
"height": 600
}
Sounds like a CORS issue. But as I have no control over Vine, how should I call this service?
Access-Control-Allow-Origin is set on the response by the server, not on the client request, to allow clients from different origins to access the response.
In your case, http://www.vine.co/ does not allow your origin to access the response, therefore you cannot read it.
For more information about CORS: https://developer.mozilla.org/en-US/docs/Web/HTTP/Access_control_CORS
That said, the Chrome Web Store has an extension that adds the 'Access-Control-Allow-Origin' header for you when an asynchronous call in the page tries to access a different host than yours.
The name of the extension is: "Allow-Control-Allow-Origin: *" and this is the link: https://chrome.google.com/webstore/detail/allow-control-allow-origi/nlfbmbojpeacfghkpbjhddihlkkiljbi
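If a development-only browser extension isn't an option, one common workaround is to request the oEmbed JSON from your own backend, which isn't subject to the browser's same-origin policy, and serve it to the page from your own origin. A rough Node sketch with a made-up route:
// Hypothetical proxy endpoint (Express, Node 18+ for global fetch). The page
// calls /api/vine-thumbnail?url=... on your own origin, so no cross-origin
// request to vine.co happens in the browser.
const express = require('express');
const app = express();

app.get('/api/vine-thumbnail', async (req, res) => {
  const target = 'https://vine.co/oembed.json?url=' + encodeURIComponent(req.query.url);
  const response = await fetch(target);
  const json = await response.json();
  res.json({ thumbnailUrl: json.thumbnail_url });
});

app.listen(9000);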
I would like to upload files to a cross-domain server from ExtJS using the form.submit() method. When I call form.submit(), the request reaches my RESTful web service and the file gets uploaded successfully, but the response is blocked in the browser with the message: Blocked a frame with origi…host:1841" from accessing a cross-origin frame.
From older posts and the form submit code, I found that doSubmit() sends the Ajax request without the cors: true setting, due to which the cross-domain response is blocked.
I thought of sending a normal Ajax request instead, but I don't know how the file data can be read and sent to the server through an Ajax request.
Is there any way in PHP to send the file data to the server as form.doSubmit() does? Can someone help me with this problem?
Thanks.
Solution is: What does document.domain = document.domain do? and http://codeengine.org/extjs-fileuplaod-cross-origin-frame/
In case someone faces the same issue... ExtJS 6.6
Objective: Using fileUpload and form.submit with CORS.
Issue: ExtJS form.submit fails due to "accessing a cross-origin frame". The file gets successfully uploaded, however it ALWAYS returns FAILURE on form.submit. Reason: "Blocked a frame with origin "http://localhost:57007" from accessing a cross-origin frame."
Solution: Don't use form.submit, use fetch instead.
View
{
xtype: 'form',
reference: 'fileForm',
items: [
{
xtype: 'fileuploadfield',
buttonOnly: true,
name: 'file',
buttonConfig: {
text: 'Attach',
iconCls: 'x-fa fa-plus green',
ui: 'default-toolbar-small'
},
width: 80,
listeners: {
change: 'onAttachFile'
}
}
]
},
View Controller
/**
 * Handles the fileuploadfield change event: validates the selected file's
 * size and uploads it to the server via fetch with FormData.
 */
onAttachFile: function (cmp, newValue) {
const self = this;
const fileForm = self.lookupReference('fileForm');
if (Ext.isEmpty(newValue) === false && fileForm.isValid()) {
const file = cmp.fileInputEl.dom.files[0];
const fileSizeInMB = parseFloat((file.size / (1024*1024)).toFixed(2));
// Validating file size
if (fileSizeInMB > 4)
alert('File size exceeds the allowable limit: 4MB');
else {
const url = '' // URL goes here
const headers = {} // Any special headers that you may need, ie auth headers
const formData = new FormData();
formData.append('file', file);
fetch(url, {
method: 'POST',
headers,
credentials: 'include',
body: formData
})
.then(response => {
response.json().then(json => {
if (response.ok) {
console.log(json);
}
else {
console.error(json);
}
});
})
.catch((error) => {
console.error(error);
});
}
}
},
Related Posts:
cross origin problems with extjs 6.01
extjs form.submit failed due to "accessing a cross-origin frame"
extjs file uploads through form submit for cross domain
ExtJS 6.6.0 Enable CORS in form submit
https://forum.sencha.com/forum/showthread.php?368824-extjs-form-submit-failed-due-to-%E2%80%9Caccessing-a-cross-origin-frame%E2%80%9D
https://forum.sencha.com/forum/showthread.php?298504-Extjs-5-Fileupload-submit-error
https://forum.sencha.com/forum/showthread.php?294852
https://forum.sencha.com/forum/showthread.php?343448-Cross-origin-file-upload
Ajax calls do not work with downloading files and, I presume, with uploading them either.
Have you tried setting this before doSubmit?
Ext.Ajax.cors = true;
Ext.Ajax.useDefaultXhrHeader = false;
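A rough sketch of combining that with the form submit (the URL and handlers are placeholders):
// Sketch: set the global Ajax flags suggested above, then submit the form.
Ext.Ajax.cors = true;
Ext.Ajax.useDefaultXhrHeader = false;

form.submit({
    url: 'https://api.example.com/upload', // placeholder cross-domain endpoint
    waitMsg: 'Uploading...',
    success: function (fp, action) { console.log(action.result); },
    failure: function (fp, action) { console.error(action.result); }
});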