Can't read from server. It may not have the appropriate access-control-origin settings.
I've had this error before, but I could really do with Swagger UI working this time. The JSON file whose URL I'm entering is served from the same host, so it shouldn't require CORS. I have in fact enabled CORS on the server side, but it isn't adding the headers because the browser isn't sending an Origin header (presumably because it knows CORS isn't required).
The JSON file returns a 200, and I can see its content in the response in the Chrome dev tools Network tab. Infuriatingly, there is no failed request there, so I don't know what's broken, other than that the petstore URL does work.
ETA: Swagger Editor can call my API without these issues, even when it is hosted on another server, but it's Swagger UI that I want to share with users.
Whatever the issue was, it had nothing to do with CORS. Replacing the JSON file with one taken from the petstore worked, so it must have been some issue parsing it.
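Since the failure turned out to be a parse problem rather than CORS, a quick way to rule this out is to validate the spec file before pointing Swagger UI at it. A minimal sketch (the version-field check is just an illustrative heuristic, not Swagger UI's actual validation):

```javascript
// Sanity-check a Swagger/OpenAPI spec before loading it in Swagger UI,
// which may surface a parse failure only as "Can't read from server".
function validateSpec(text) {
  let spec;
  try {
    spec = JSON.parse(text); // throws on malformed JSON
  } catch (err) {
    return { ok: false, error: "Invalid JSON: " + err.message };
  }
  // Minimal structural check: a spec should declare its version.
  if (!spec || typeof spec !== "object" || (!spec.swagger && !spec.openapi)) {
    return { ok: false, error: "Missing 'swagger' or 'openapi' version field" };
  }
  return { ok: true, spec };
}
```

Running the raw response body from the Network tab through something like this (or an online validator) pinpoints the offending character instead of leaving you guessing at CORS.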
I have made a project that sends an XHR request to fetch a locally saved JSON file (the file is in the same folder).
I use VS Code with 'live server' extension.
The request, response, and everything else work perfectly fine when I open the HTML file with Live Server. But when I open the file without starting any kind of local server, the request doesn't return any response and instead logs an error:
(I am using Chrome)
Access to XMLHttpRequest at 'file:///G:/_PROJECTS/Graph%20Plotter/sample_data.json' from origin 'null' has been blocked by CORS policy: Cross origin requests are only supported for protocol schemes: http, data, chrome, chrome-extension, https.
I searched online and found some Google documentation, but I didn't quite understand it. I want to know what the error is about and how I can fix it. It would also be a great help if you could simplify the explanation so that I understand it.
Thanks in advance!
I need an HTTP server pointing at my local files to be able to fetch them. Since XHR sends HTTP requests, I cannot simply fetch local files without some kind of server (either on the internet or running locally). The resulting Chrome error occurs because Chrome has disabled fetching local files over the file:// scheme due to security and privacy concerns.
I am working on a project with some colleagues using ASP MVC 5, and we're using a lot of common JavaScript libraries from public CDNs. I've configured and enabled CORS, and everything works fine except in this one specific case, where we're somewhat stumped at this point.
The app uses ASP.NET Identity 2 and some functions rely on the Impersonation feature. The backend implementation generally follows the answers here: How do I use ASP.NET Identity 2.0 to allow a user to impersonate another user?
The frontend uses an AJAX post with antiforgery tokens to a WebAPI endpoint (following this specific implementation: https://stackoverflow.com/a/24394578/5330050). To have the new identity take effect, the app does a window.location.reload(true);.
It is at this point that Firefox blocks all the CORS requests (all of which are requests for the libraries and frameworks hosted on CDNs). This is the specific error (the same for all the requests, different library, same domain):
Cross-Origin Request Blocked: The Same Origin Policy disallows reading the remote resource at https://cdnjs.cloudflare.com/{some.js}. (Reason: CORS request did not succeed)
This issue only happens in Firefox, and it continues blocking these resources even if I navigate to a different page in the app, unless I clear the cache (but not cookies, so the identity remains); then everything is fine again.
There's nothing special about how these resources are called. It's not a POST request. It's just up there in the <head>, for example:
<script src="https://cdnjs.cloudflare.com/ajax/libs/jquery/3.3.1/jquery.min.js" integrity="sha256-FgpCb/KJQlLNfOu91ta32o/NMZxltwRo8QtmkMRdAu8=" crossorigin="anonymous"></script>
Things that work to resolve it (but are unacceptable as far as user experience is concerned):
- Clear the cache (but not cookies)
- Wait 5 minutes (for Firefox to forget? The session is set to 12 hours on the server)
I don't really know what the cause could be, and I'd appreciate any help in finding either a workaround or a clue as to where I could look for a solution.
This is not really an answer, but I have found out specifically what is going on in my case above. I have a follow-up question, but before that I will share what I found:
https://developer.mozilla.org/en-US/docs/Web/HTTP/CORS#Requests_with_credentials
When responding to a credentialed request, the server must specify an origin in the value of the Access-Control-Allow-Origin header, instead of specifying the "*" wildcard.

Because the request headers in the above example include a Cookie header, the request would fail if the value of the Access-Control-Allow-Origin header were "*". But it does not fail: because the value of the Access-Control-Allow-Origin header is "http://foo.example" (an actual origin) rather than the "*" wildcard, the credential-cognizant content is returned to the invoking web content.

Note that the Set-Cookie response header in the example above also sets a further cookie. In case of failure, an exception—depending on the API used—is raised.
This is exactly what is going on, because the issue does not occur until the user is logged in. cdnjs (or any other CDN out there) sends the wildcard header, because that's the standard way it's done.
I'm still confused about why it's happening with a script tag, though, so my follow-up question will be along those lines.
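The MDN rule quoted above can be sketched as a server-side header decision; the origin allow-list below is hypothetical:

```javascript
// Sketch of the rule quoted above: browsers reject a wildcard
// Access-Control-Allow-Origin when the request is credentialed (cookies,
// Authorization header, or TLS client certificates). A server supporting
// credentials must echo the specific origin instead. The allow-list is
// illustrative, not part of any real CDN's configuration.
const ALLOWED_ORIGINS = new Set(["https://app.example.com"]);

function corsHeaders(origin, requestHasCredentials) {
  if (!requestHasCredentials) {
    // Anonymous requests (e.g. <script crossorigin="anonymous">) may use
    // the wildcard; this is what public CDNs like cdnjs send.
    return { "Access-Control-Allow-Origin": "*" };
  }
  if (!ALLOWED_ORIGINS.has(origin)) return {}; // no CORS headers: blocked
  return {
    "Access-Control-Allow-Origin": origin, // a specific origin, not "*"
    "Access-Control-Allow-Credentials": "true",
    Vary: "Origin", // caches must not reuse the response across origins
  };
}
```

This also hints at why caching matters here: if a cache mixes up a credentialed and an anonymous response for the same URL, the browser can end up rejecting a resource it previously accepted.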
I'm trying to process OneDrive files in client-side JavaScript, but first I need a way to use XMLHttpRequest to download the file. OneDrive supports CORS for a lot of operations, but for downloading a file into JavaScript there is the following problem:
As mentioned here: onedrive rest api manual
I can send a request to:
GET https://apis.live.net/v5.0/FILE_ID/content?access_token=ACCESS_TOKEN
and it will reply with a Location header redirecting the browser to the file. The problem is that when I send these requests through XHR, the browser always sends the Origin header with the request. For the first request described above, OneDrive also replies with an Access-Control-Allow-Origin: * header, so the request is allowed in the browser. However, when the browser is redirected to the actual location of the file, that resource does not have the Access-Control-Allow-Origin header, so the XHR request is denied by the browser (Chrome sends an Origin header set to null for the redirect request).
I've also tried getting the location without redirecting automatically and then sending another XHR request; this sets the Origin header to my site's domain, but the result is the same.
As I mentioned at the beginning, I need to process the data in JavaScript, so I'm not asking how to download OneDrive files to the hard drive. I need the data to be accessible to JavaScript in the webpage. I know that I could use server-side code to fetch the file data and then send it to the client, but for my application this is not an option (at least it is not what I'm asking for at the moment).
If there is no way to do this, does anyone have an idea why they would implement their API this way: allowing JavaScript to get the location through CORS and follow the redirect, but not including a CORS header on the redirected resource? Why not just deny CORS in the first place? Is this a bug?
The answer, as best as I can tell, is that downloading content cannot be done purely by JavaScript in a browser. Why did they do it this way? You'd have to ask them, but I would guess either a bug, or some unspecified "security concerns". For what it's worth, they seem to think that downloading content is CORS compliant in the documentation here: https://dev.onedrive.com/misc/working-with-cors.htm:
To download files from OneDrive in a JavaScript app you cannot use the /content API, since this responds with a 302 redirect. A 302 redirect is explicitly prohibited when a CORS preflight is required, such as when providing the Authorization header.

Instead, your app needs to select the #content.downloadUrl property, which returns the same URL that /content would have redirected to. This URL can then be requested directly using XMLHttpRequest. Because these URLs are pre-authenticated they can be retrieved without a CORS preflight request.
However, to the best of my knowledge, they are wrong. Just because you don't need a preflight request doesn't mean that the response is CORS-compliant. You still need an Access-Control-Allow-Origin header on the response.
For anyone wondering, this is still an issue in the new Graph API (which, as I understand it, is essentially a proxy API in front of the OneDrive API). The same basic issue is still present: you can get a download URL from your items, but that URL points to a non-CORS-compliant resource, so it doesn't do you a whole lot of good.
I have an active issue open with Microsoft here about this issue. There has been some response to my issue (I got them to expose the download URL through the graph API), but I'm still waiting to see if they'll come up with a real solution to downloading content from JavaScript.
If I get a solution or real answer on that issue, I'll try to report back here so others in the future can have a real answer to reference.
This is not an answer; I cannot comment yet.
Last week the new OneDrive API was released: http://onedrive.github.io/index.htm. Unfortunately, it does not solve the problem.
https://api.onedrive.com/v1.0/drive/root:{path and name}:/content?access_token={token}
will still redirect to a resource somewhere at https://X.files.1drv.com/.X, which will not contain any Access-Control-Allow-Origin headers. The same goes for the URL in "#content.downloadUrl" in the JSON response.
I hope that Microsoft will address this problem in the very near future, because at the moment the API is of very limited use: you cannot process file contents from OneDrive in HTML5 apps, apart from building the usual file browser.
The only solution I can see at the moment would be a Chrome app, which can request the URL without CORS; see https://developer.chrome.com/apps/angular_framework
Box does exactly the same thing for download requests. I have not found any way around this problem without involving a server, because the browser will not let your program access the contents of the 302 redirect response. For security reasons I am not convinced by, browsers follow redirects automatically without allowing user intervention.
The way we finally worked around this was:
1. The browser app sends the GET request to the server, which forwards it to the cloud provider (Box/OneDrive).
2. The server DOES NOT follow the 302 redirect response from Box or OneDrive.
3. The server instead returns to the browser app the content of the Location field in the 302 response header, which contains the download URL.
4. The JavaScript in the browser app then downloads the file using that URL.
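The server-side half of those steps can be sketched in Node (assuming Node 18+ for the global fetch; the token handling is simplified):

```javascript
// Sketch of the workaround above, assuming Node 18+ (global fetch).
// The server asks the provider for the file but does NOT follow the 302;
// it hands the Location header back to the browser app instead.
async function getDownloadUrl(contentUrl, accessToken) {
  const res = await fetch(contentUrl, {
    redirect: "manual", // do not follow the 302 automatically
    headers: { Authorization: "Bearer " + accessToken },
  });
  // With redirect: "manual", the 3xx response itself is returned.
  if (res.status < 300 || res.status >= 400) {
    throw new Error("Expected a redirect, got " + res.status);
  }
  return res.headers.get("location"); // pre-authenticated download URL
}
```

Note that redirect: "manual" behaves differently server-side than in browsers: Node exposes the 302 response and its Location header, whereas a browser would give an opaque redirect, which is exactly why this step has to happen on the server.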
You can now just use the "#content.downloadUrl" property of the item in your GET request. Then there is no redirection.
From https://dev.onedrive.com/items/download.htm:
Returns a 302 Found response redirecting to a pre-authenticated download URL for the file. This is the same URL available through the #content.downloadUrl property on an item.
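A minimal sketch of that approach; the property key is written as in the quote above, though it appears as @content.downloadUrl in actual API responses, so this illustrative helper accepts both spellings:

```javascript
// Sketch: read the pre-authenticated download URL from the item metadata
// and fetch it directly, avoiding the 302 from /content. The helper is an
// illustration, not part of the OneDrive API.
function pickDownloadUrl(item) {
  // Bracket access because the key is not a valid JS identifier.
  return item["@content.downloadUrl"] || item["#content.downloadUrl"] || null;
}

async function downloadItem(item) {
  const url = pickDownloadUrl(item);
  if (!url) throw new Error("Item metadata has no download URL");
  const res = await fetch(url); // pre-authenticated: no Authorization header,
                                // so no CORS preflight is triggered
  return res.arrayBuffer();     // raw file bytes, usable from JavaScript
}
```

Whether the fetch itself succeeds still depends on the download host sending an Access-Control-Allow-Origin header, which is the sticking point discussed in the answers above.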
I have set up FineUploader on a site, and I included a checkbox that allows users to upload files using HTTPS if they want to.
Unfortunately, if the user accesses the site using HTTP and then tries to use SSL, it errors out, I assume because of CORS issues. I assume it is a CORS issue because if I access the site using HTTPS and try to upload using SSL, it works fine.
I found some documentation about enabling CORS support, but it appears that either all requests must be CORS requests or none may be. In my situation there will sometimes be CORS requests and sometimes not.
Does anyone know of a good work around for this? Should I just reload the entire page using HTTPS when the checkbox is clicked?
If you're uploading straight to Amazon S3, see the note in the official docs, "SSL is also supported, in which case your endpoint address must start with https://", in the script within your uploaderpage.html file:

    request: {
        // Note the "https://" added before the bucket name, to work over HTTPS
        endpoint: 'https://mybucket.s3.amazonaws.com',
        accessKey: 'AKIAblablabla' // as per the specific IAM account
    },
This will still work if uploaderpage.html is served over http (or you could populate the endpoint value dynamically if you need flexibility re endpoint).
This will help you avoid the mixed-content error when uploading over HTTPS ("requested an insecure XMLHttpRequest endpoint"), which happens when the page is served over HTTPS but you request an HTTP endpoint.
Just to reiterate what I've mentioned in my comments (so others can easily see this)...
Perhaps you can instantiate Fine Uploader after the user selects HTTP or HTTPS as the upload protocol. If you must, you can enable the CORS feature via the expected property of the cors option. Keep in mind that there are some server-side details you must address when handling CORS requests, especially if IE9 or older is involved. Please see my blog post on the CORS feature for more details.
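A sketch of that approach, assuming a checkbox-driven protocol choice; the endpoint host and element id are placeholders, not Fine Uploader defaults:

```javascript
// Sketch: create Fine Uploader only after the user has picked a protocol,
// so the endpoint (and whether CORS is expected) is known up front.
function buildEndpoint(useSsl) {
  // Placeholder host; substitute your real upload endpoint.
  return (useSsl ? "https" : "http") + "://uploads.example.com/upload";
}

function createUploader(useSsl) {
  const endpoint = buildEndpoint(useSsl);
  return new qq.FineUploader({
    element: document.getElementById("uploader"), // placeholder element id
    request: { endpoint: endpoint },
    // Requests are cross-origin when the chosen protocol differs from the
    // page's own, so tell Fine Uploader to expect CORS in that case.
    cors: { expected: !endpoint.startsWith(window.location.protocol) },
  });
}
```

If the checkbox is toggled later, the simplest option is to tear down and re-create the uploader (or, as suggested in the question, reload the page over the chosen protocol).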
Regarding source maps, I came across a strange behavior in chromium (build 181620).
In my app I'm using minified jQuery, and after logging in I started seeing HTTP requests for "jquery.min.map" in the server log file. Those requests lacked cookie headers (all other requests were fine).
Those requests are not even shown in the Network tab in Developer Tools (which doesn't bug me that much).
The point is, the JS files in this app are only supposed to be available to logged-in clients, so with this setup the source maps either won't work, or I'd have to move them to a public directory.
My question is: is this a desired behavior (meaning - source map requests should not send cookies) or is it a bug in Chromium?
The implementation of String InspectorFrontendHost::loadResourceSynchronously(const String& url) in InspectorFrontendHost.cpp, which is called to load sourcemap resources, uses the DoNotAllowStoredCredentials flag, which I believe results in the behavior you are observing.
This method is potentially dangerous, so this flag is there for us (you) to be on the safe side and avoid leaking sensitive data.
As a side note, serving jquery.min.js only to logged-in users (that is, not from a cookieless domain) is not a very good idea in a production environment. I'm not sure of your reasoning behind this, but if you definitely need to avoid giving the file to clients not visiting your site, you may resort to checking the Referer HTTP request header.
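A sketch of such a Referer check (remember the header can be absent or spoofed, so it is a deterrent, not real access control):

```javascript
// Sketch: serve the script only when the request claims to come from a
// page on your own site. The Referer header is optional and forgeable,
// so treat this as a speed bump, not authentication.
function isAllowedReferer(refererHeader, allowedHost) {
  if (!refererHeader) return false; // header absent: deny (or choose to allow)
  try {
    return new URL(refererHeader).host === allowedHost;
  } catch {
    return false; // malformed Referer value
  }
}
```

A request handler would call this with the incoming Referer header and return 403 when it fails, while still serving the file normally to pages on the allowed host.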
I encountered this problem and became curious as to why certain authentication cookies were not sent in requests for .js.map files to our application.
In my testing with Chrome 71.0.3578.98, if the SameSite cookie attribute is set to either Strict or Lax, Chrome will not send that cookie when requesting the .js.map file. When there is no SameSite restriction, the cookie is sent.
I'm not aware of any specification of the intended behavior.
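For reference, the variants look like this when serializing a Set-Cookie header; per the observation above, only cookies without a SameSite restriction were sent with the .js.map requests (this helper is illustrative, not a library API):

```javascript
// Sketch: build a Set-Cookie header value with an optional SameSite
// attribute ("Strict", "Lax", or omitted for no restriction).
function setCookieHeader(name, value, sameSite) {
  let header = name + "=" + value + "; Secure; HttpOnly";
  if (sameSite) header += "; SameSite=" + sameSite;
  return header;
}
```

So if source maps must work for authenticated resources, the session cookie would need to be issued without a SameSite restriction, with the CSRF implications that entails.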