I have a project in ReactJS where I use web3.js to read from a smart contract through an Ethereum client provided by Infura. When I access my project in Google Chrome, everything works fine.
If I access my project via Microsoft Edge, I am getting the following error:
SEC7120: [CORS] The origin 'http://localhost:3000' did not find
'http://localhost:3000' in the Access-Control-Allow-Origin response
header for cross-origin resource at
'https://rinkeby.infura.io/v3/censored'.
I looked into other posts and found this one, which didn't solve my problem.
Other things I have noticed:
If I access the website from Internet Explorer, I can read data as normal.
If I go to Google Chrome and don't use MetaMask, I can read data as normal.
If I activate MetaMask and don't select the correct Infura network (Ropsten), choosing for example Main Network, it doesn't work as expected.
UPDATE:
It seems the issue is not on my side but comes from Infura. I asked them whether Microsoft Edge is a limitation of their service and whether they are willing to do anything about it.
Ah, the good old CORS problem. Since you don't control the server's CORS settings, you're better off creating a small server-side application that proxies these requests on behalf of your React application.
I'm curious if anyone else has encountered this issue.
I am building an application that will authenticate users using Google OAuth 2.0 + OpenID.
I've built a simple site with just HTML and CSS for the UI, and I'm viewing it with Live Server in VS Code.
In the Google developer console for OAuth, you must set Authorised JavaScript origins for client-side applications. I assumed I would just set this to http://localhost:5500 (the port Live Server uses), but I always get the following error:
Authorization Error
Error 400: invalid_request
Permission denied to generate login hint for target domain.
I have got around the issue by getting a domain and hosting for a test site and setting that as the "Authorised JavaScript origin". However, it seems really clunky, and I have to FTP all my files to my hosting provider every time I want to change my code.
I could also host everything on a Node.js server on my local machine, but this would just cause the same issue as before.
My question isn't so much how to stop getting this error, but what is the proper way of developing with OAuth 2.0, and is there any way to speed up the process or create a local environment that doesn't hit the same errors?
Thanks for your help.
There is an answer to this Google question here that may help you.
The way I have always set up an OAuth environment is to avoid localhost and use more real-world URLs on a developer PC. I also like to split them like this, which helps me visualize the architecture:
Base URL                    Represents
http://www.example.com      Your web UIs
http://api.example.com      Your APIs
http://login.example.com    The Authorization Server
This approach can also give you more confidence that your code will work well in browsers, in areas such as these:
CORS
Cookies
Content Security Policy
To set this up you just need to edit your hosts file and add an entry like this. It can also work very well for demos to business people.
127.0.0.1 localhost www.example.com api.example.com login.example.com
::1 localhost
ADVANCED SCENARIOS
At Curity we provide some quite advanced developer setups, and this approach scales well, as in the articles below. The second of these also provides a script you can use to run locally over SSL, in case that is ever useful:
Single Page Apps End to End Developer Setup
Kubernetes End to End Developer Setup
The prologue:
I am trying to pull a layer into my arcgis-js-api application and I'm having a hard time. The layer is here:
https://maps.disasters.nasa.gov/ags04/rest/services/ca_fires_202008/sentinel2/MapServer
And I am trying to add it in this way:
export const SC2Sept29 = new MapImageLayer({
  url: 'https://maps.disasters.nasa.gov/ags04/rest/services/ca_fires_202008/sentinel2/MapServer/547',
});
When running my app, I get the classic CORS error
Access to fetch at 'https://maps.disasters.nasa.gov/ags04/rest/services/ca_fires_202008/sentinel2/MapServer?f=json' from origin 'https://cdpn.io' has been blocked by CORS policy: No 'Access-Control-Allow-Origin' header is present on the requested resource. If an opaque response serves your needs, set the request's mode to 'no-cors' to fetch the resource with CORS disabled.
This is coming from this codepen, but the same happens when developing locally with VS Code Live Server or webpack-dev-server. It does not happen with other layers, just with layers on the maps.disasters.nasa.gov server.
Trying to set up a proxy
I got some advice from the thread CORS error when trying to access NASA layer that I need to set up a proxy for anything coming from this server. Their advice was to follow these instructions to set up a proxy, and to use one of ArcGIS's ready-made proxies. Personally, I am finding the instructions in the proxies repo to be lacking. All of my experience in setting up server-side apps is with Node.js, and I am not understanding the instructions for how to do this. The codepen I linked, for now, tries to use the CORS Anywhere proxy by setting it in esri/core/urlUtils:
urlUtils.addProxyRule({
  urlPrefix: 'maps.disasters.nasa.gov',
  proxyUrl: 'https://cors-anywhere.herokuapp.com',
});
But this gives an error saying Unexpected token T in JSON at position 0. I can see in the network tab that the browser is indeed attempting to access the correct layer URL, correctly prefixed by the CORS Anywhere proxy URL, but the response itself is just the landing-page text of the CORS Anywhere proxy, hence the error.
As I mentioned, my dev environments are VS Code Live Server and a webpack dev server, depending on which part of the app I'm building. My target production environment is GitHub Pages; I didn't really expect this app to need a back end. If I need a server side to provide a proxy, I can host it on Heroku or even AWS as a full-stack app. When trying to use the pre-provided ArcGIS proxies, I get the same issue. For example, I cloned their proxies repo to my directory.
When configuring urlUtils to refer to one of these proxies, it does so, but it just returns the text content of the proxy file and gives me the Unexpected token T in JSON at position 0 error. There's a lot of chat on the Esri forums about IIS, but I'm a Mac guy and have no experience with that. Esri offers proxies in .NET, Java, or PHP, none of which I have experience in.
How can I get rid of these CORS errors and properly pull layers from the NASA server into my app? If I need a proxy, how can I set one up that will work for both my dev and production environments? I have had a hard time finding tutorials at my level that apply to this scenario. Thanks for reading.
OK, I think we can now summarize.
In order for this to work you need to set up a proxy. As you mention, ESRI provides implementations in several technologies.
I forked their repository to include an easy test setup using docker and docker-compose: resource-proxy fork
After cloning it, run:
sudo docker-compose -f docker-compose.php.yml up -d --build
Test the proxy:
http://localhost:8082/proxy?ping
In there you will also find an example, nasa-service.html, that shows a correct configuration of the application and the proxy (all in the PHP folder).
You just need to open http://localhost:8082/nasa-service.html.
The key thing here is that the application needs to be on the same origin as the proxy.
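Since the question mentions Node experience: the client side of that same-origin arrangement can be sketched in a few lines. The page asks its own origin for `/proxy?<target>` (the ESRI resource-proxy URL convention), and only URLs on the NASA host get rewritten — the helper name and host list are illustrative:

```javascript
// Hosts that must go through the same-origin proxy (illustrative list).
const PROXIED_HOSTS = ['maps.disasters.nasa.gov'];

// Rewrite a resource URL to the ESRI-style proxy form,
// e.g. '/proxy?https://maps.disasters.nasa.gov/...'.
// URLs on other hosts are returned unchanged.
function proxify(url) {
  const host = new URL(url).hostname;
  return PROXIED_HOSTS.includes(host) ? '/proxy?' + url : url;
}
```

Because the rewritten URL starts with `/`, the request goes to the page's own origin, so no CORS headers are needed; the proxy running there then fetches the NASA resource server-side.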
We have a legacy website which I'd like to supplement with a chrome extension as a temporary measure.
The site is set up with Spring Security, and each request is authenticated with an access token.
I have tried creating an extension that makes a request, with the correct data, to the endpoint visible in the network requests, but the endpoint does not respond at all.
Running the same code directly on the website through DevTools gives me a proper response, so I'm thinking the error is due to CORS.
Is there a way to make a chrome extension run code directly on the page so there wouldn't be any CORS issues or should I approach this issue a different way?
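Yes — a content script declared in the extension's manifest runs in the page's origin, so its requests to the site's own endpoints are same-origin and avoid CORS. A sketch, where the endpoint path and token-passing details are assumptions about your site, not anything Chrome prescribes:

```javascript
// content.js — runs in the page's origin. Declared in manifest.json via:
//   "content_scripts": [{ "matches": ["https://legacy.example.com/*"],
//                         "js": ["content.js"] }]
// (the match pattern is a placeholder for your actual site)

// Build the request init, attaching the access token however the
// page normally supplies it (a bearer header is assumed here).
function buildRequest(token, body) {
  return {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'Authorization': 'Bearer ' + token,
    },
    body: JSON.stringify(body),
  };
}

// Inside the real content script this would run against a relative URL,
// which is same-origin and therefore not subject to CORS:
// fetch('/api/endpoint', buildRequest(tokenFromPage, { some: 'data' }))
//   .then(r => r.json())
//   .then(console.log);
```

The alternative is to keep the request in the extension's background context and declare the site under `host_permissions` in the manifest, but running in the page via a content script is usually the smaller change for a temporary measure like this.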
The Very Short Version: is anybody successfully requesting local resources via AJAX, in IE, over SSL? I cannot solve getting an "access denied" error.
The Longer Version:
I am using AJAX to retrieve JSON from an application that runs a local web service. The web service channel is encrypted so that if the remote site is being served over HTTPS, no "insecure resource on a secure page" errors appear.
So, in the address bar is a remote site of some sort... mysite.com. It is receiving information from https://localhost/.
The web service is setting correct headers for CORS and everything works in Chrome and Firefox. In IE, if I put my https://localhost resource into the address bar, the correct resource is returned and displayed. However, when using AJAX (not just the address bar), a security setting in IE is denying access. This is documented (in part) here:
Access denied in IE 10 and 11 when ajax target is localhost
The only proper solution in one reply is to add the requesting domain (mysite.com in this case) to the trusted sites. This works, but we would prefer to avoid user intervention: pointing users to a knowledge base article on how to add a trusted site is hardly a great experience. The other replies to that question are invalid for the same reasons as below.
Some more stumbling around and I discovered this:
CORS with IE, XMLHttpRequest and ssl (https)
Which had a reply containing a wrapper for AJAX requests in IE. It seemed promising, but as it turns out, IE11 has now deprecated the XDomainRequest API. This was probably the right thing for Microsoft to do... but now the "hack" workaround of adding a void onProgress handler to the XDR object is obviously not an option and the once-promising workaround wrapper is rendered null and void.
Has anybody come across either:
a) a way to get those requests through without needing to modify the trusted sites in IE? In other words, an updated version of the workaround in the second link?
b) as a "next best" case: a way to prompt the user to add the site to their trusted zone? "mysite.com wishes to be added to your trusted zones. Confirm Yes/No" and have it done, without them actually needing to open up their native settings dialogues and doing it manually?
For security reasons, Internet Explorer's XDomainRequest object blocks access (see #6 here) to the Intranet Zone from the Internet Zone. I would not be surprised to learn that this block was ported into the IE10+ CORS implementation for the XMLHTTPRequest object.
One approach which may help is to simply change from localhost to 127.0.0.1 as the latter is treated as Internet Zone rather than Intranet Zone and as a consequence the zone-crossing is avoided.
However, you should be aware that Internet Explorer 10+ will block all access to the local computer (via any address) when a site is running in Enhanced Protected Mode (EPM)-- see "Loopback blocked" in this post. Currently, IE uses EPM only for Internet sites when run in the Metro/Immersive browsing mode (not in Desktop) but this could change in the future.
No, there's no mechanism to show the Zones-Configuration UI from JavaScript or to automatically move a site from one zone to another. However, the fact that you have a local server implies that you are running code on the client already, which means you could use the appropriate API to update the Zone Mapping on the client. Note that such a change requires that you CLEARLY obtain user permission first, lest your installer be treated as malware by Windows Defender and other security products.
So, in summary, using the IP address should serve as a workaround for many, but not all platforms.
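The suggested switch from localhost to the loopback IP can be done with a one-line rewrite on the client before issuing the AJAX request — a sketch, assuming the service URL is built in JavaScript:

```javascript
// Replace a 'localhost' hostname with 127.0.0.1 so IE treats the
// request as Internet Zone rather than Intranet Zone, avoiding the
// zone-crossing block. Other hosts pass through unchanged.
function useLoopbackIp(url) {
  const u = new URL(url);
  if (u.hostname === 'localhost') u.hostname = '127.0.0.1';
  return u.toString();
}
```

Note that the local web service's SSL certificate must also be valid for 127.0.0.1 (not just localhost), or IE will reject the connection for a different reason.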
Since those are two different domains, one solution would be to create an application which proxies the requests in the direction you want.
If you have control over the example.com end, and want to support users who bring their own localhost service, this would be harder, as you would have to provide more requirements for what they bring.
If however, you have control over what runs in localhost, and want to access example.com, and have it access the localhost service, set up redirection in your web server of preference, or use a reverse proxy. You could add an endpoint to the same localhost app which doesn't overlap paths, for example, route http://localhost/proxy/%1 to http://%1, leaving the rest of localhost alone. Or, run a proxy on e.g. http://localhost:8080 which performs a similar redirection, and can serve example.com from a path, and the API from another.
This winds up being a type of "glue" or integration code, which should allow you to mock interactions up to a point.
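The `/proxy/%1` routing mentioned above boils down to a small path-mapping rule. A sketch of that rule in isolation — the route shape is the illustrative one from the text, not a standard:

```javascript
// Map '/proxy/<host>/<path>' on localhost to 'http://<host>/<path>',
// leaving every other local path untouched. A server handler would
// forward requests whose path resolves to a non-null target.
function resolveProxyPath(path) {
  const m = path.match(/^\/proxy\/(.+)$/);
  return m ? 'http://' + m[1] : null;
}
```

With this in place, the page can call `http://localhost/proxy/example.com/api/...` and the local server forwards it, so the browser never makes a cross-origin request itself.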
I have set up Fine Uploader on a site and included a checkbox that allows users to upload files over HTTPS if they want to.
Unfortunately, if the user accesses the site over HTTP and then tries to use SSL, it errors out. I assume this is a CORS issue, because if I access the site over HTTPS and upload using SSL it works fine.
I found some documentation about enabling CORS support, but it appears you have to configure it so that either all requests are CORS requests or none are. In my situation there will be CORS requests some of the time and not others.
Does anyone know of a good workaround? Should I just reload the entire page over HTTPS when the checkbox is clicked?
If you're uploading straight to Amazon S3, see the note in the official docs: "SSL is also supported, in which case your endpoint address must start with https://". In the script within your uploaderpage.html file:
request: {
  endpoint: 'https://mybucket.s3.amazonaws.com', // note "https://" before the bucket name, to work over https
  accessKey: 'AKIAblablabla' // as per the specific IAM account
},
This will still work if uploaderpage.html is served over HTTP (or you could populate the endpoint value dynamically if you need flexibility regarding the endpoint).
This will help you avoid the mixed-content error when uploading over HTTPS ("requested an insecure XMLHttpRequest endpoint"), which happens if the page is HTTPS but you request an HTTP endpoint.
Just to reiterate what I've mentioned in my comments (so others can easily see this)...
Perhaps you can instantiate Fine Uploader after the user selects HTTP or HTTPS as the upload protocol. If you must, you can enable the CORS feature via the `expected` property of the `cors` option. Keep in mind that there are some server-side details you must address when handling CORS requests, especially if IE9 or older is involved. Please see my blog post on the CORS feature for more details.
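A sketch of that deferred instantiation — `qq.FineUploader` and the `request`/`cors` options are Fine Uploader's documented API, while the endpoint host, function name, and checkbox wiring are assumptions for illustration:

```javascript
// Build the Fine Uploader options once the user has picked a protocol.
// cors.expected is enabled only when the chosen scheme differs from the
// page's own, i.e. only when the upload will actually be cross-origin.
function makeUploaderOptions(useHttps, pageProtocol) {
  const scheme = useHttps ? 'https' : 'http';
  return {
    request: { endpoint: scheme + '://uploads.example.com/upload' }, // hypothetical endpoint
    cors: { expected: scheme + ':' !== pageProtocol },
  };
}

// In the page, instantiate only after the checkbox choice is known:
// const uploader = new qq.FineUploader(
//   makeUploaderOptions(sslCheckbox.checked, location.protocol)
// );
```

This keeps a single code path: same-protocol uploads stay plain same-origin requests, and only the mixed http-page/https-endpoint case turns on CORS handling.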