Cross-Origin Stack Traces - javascript

Based on some reading, I've been led to understand the following:
JavaScript files loaded from a CDN will produce sanitized stack traces in error.stack which simply contain "Script Error"
Actual traces can be obtained if BOTH the script tag includes crossorigin="anonymous" and the CDN sends Access-Control-Allow-Origin: *
I have a number of questions about this that I'm hoping someone can answer:
If I programmatically append a script to the DOM and set script.crossOrigin = "anonymous" on it first, does that "count"? Will it cause issues in any older browsers?
Any advice on getting Access-Control headers working with S3 => CloudFlare? It honestly seems impossible. Despite being configured, S3 only sends the headers if the REQUEST includes Origin, which never seems to happen. And even if it did, if the first request came from a browser that doesn't support CORS/Origin headers, the invalid version would be the one that gets cached.
Is it actually true that I even need these headers? I feel like I've seen stack traces in Chrome without the headers being there. Was the restriction removed from Chrome because it was unrealistic? Or does it not apply to JavaScript files added programmatically? Which browser versions actually sanitize the traces?
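For what it's worth, here is a minimal sketch of the programmatic approach asked about above (the CDN URL is a placeholder, and it assumes the CDN actually responds with Access-Control-Allow-Origin: *):

var script = document.createElement('script');
script.crossOrigin = 'anonymous'; // reflected as the crossorigin="anonymous" attribute
script.src = 'https://cdn.example.com/app.js'; // placeholder URL
document.head.appendChild(script);

As far as I know, browsers that predate the crossorigin attribute simply ignore the property and load the script normally; in those browsers you just get the sanitized traces.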

Related

Azure Storage Javascript library "createBlobServiceWithSas" throws error: Refused to set unsafe header "user-agent"

Using the Azure Storage JS Client library to upload an image throws an error: "Refused to set unsafe header "user-agent""
All requests in the network tab are 200 or 201, it appears like the xhr requests are working. Is it possible to not set this header or filter it out before the post call? I would like to avoid this error in the console.
https://github.com/Azure/azure-storage-node#azure-storage-javascript-client-library-for-browsers
I have tested the sample azurestoragejs-2.9.100-preview in the link you mentioned, and it causes no error on my side (both Chrome and Firefox).
Open the azure-storage.blob.js lib file, search for the variable var unsafeHeaders, and check whether user-agent is in its list. I saw it on my side and reproduced your problem after deleting it, so it might be missing in your file.
If your lib is unbroken, you can ignore this "error", as nothing actually goes wrong; it's all handled by the storage lib and the browser.
Explanation:
When an HTTP request executes, a method in this lib makes sure that headers in the unsafeHeaders list won't be set on the XHR. Otherwise, browsers throw warnings like the one you have seen, because it's a requirement of the XHR standard.
See remarks in this lib.
This check is not necessary, but it prevents warnings from browsers about setting unsafe headers. To be honest I'm not entirely sure hiding these warnings is a good thing, but http-browserify did it, so I will too.
The fact that everything works on your side may prove the check is not necessary. Also, in the current XHR standard, user-agent is no longer an unsafe header, but browsers haven't caught up.
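To illustrate the mechanism (a sketch only, not the storage lib's actual code), a wrapper around XHR might filter forbidden headers like this:

// Illustrative subset of an unsafeHeaders list
var unsafeHeaders = ['user-agent', 'host', 'content-length'];
function setSafeHeaders(xhr, headers) {
  Object.keys(headers).forEach(function (name) {
    // Skip headers the browser refuses to let scripts set,
    // which avoids the "Refused to set unsafe header" warning
    if (unsafeHeaders.indexOf(name.toLowerCase()) === -1) {
      xhr.setRequestHeader(name, headers[name]);
    }
  });
}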

Using XHR to precache resources not behaving as expected

I'm simply trying to use XHR to precache some resources, but the cache is not behaving as expected.
Here are the bounds of the problem:
I know the resource URLs in advance (of course).
I don't know their content-types (mix of CSS, images, and other).
They will always be same-origin.
I control the cache headers.
They can be cached forever for all I care.
I've always been under the impression that XHR used the browser cache more or less like any other resource, but had never rigorously tested that. Here's what I'm doing (sketched in code after the list):
Request all resources up-front with XHR.
Explicitly set request header Cache-Control: max-age=3600 (Chrome was setting max-age=0 for some reason).
Set the following response headers on the server:
Cache-Control: public, max-age=3600
Date: now
Expires: now + 1 hour
[Content-Type, Content-Length]
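Concretely, the precache step looks roughly like this (a sketch; the resource URLs are placeholders):

var urls = ['/css/site.css', '/img/hero.png']; // placeholder resource URLs
urls.forEach(function (url) {
  var xhr = new XMLHttpRequest();
  xhr.open('GET', url, true);
  // Chrome was sending max-age=0 on its own, so override it explicitly
  xhr.setRequestHeader('Cache-Control', 'max-age=3600');
  xhr.onload = function () {
    // The response should now, in theory, be sitting in the HTTP cache
  };
  xhr.send();
});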
Here's what I'm seeing:
XHR always fetches the resource (confirmed on server and with dev tools).
Subsequent requests (via image/css/etc elements) always fetch (even after the XHRs have completed) on a cold cache.
But they always use the cache when it's warm.
I've poked at it in various ways, but this behavior never seems to change.
After much wailing and gnashing of teeth, I believe I've proven that this approach simply won't work on all browsers. Here's my experience thus far:
Firefox: Works a charm.
Chrome: Seldom works -- it's as though XHR uses a different cache than the elements (even though I'm pretty sure that's not the case; I haven't had time to delve into Chrome code to figure out exactly what's going on).
Safari: Apparently random. Sometimes resource requests kicked off from elements retrieve from the cache, sometimes not. I'm sure there's a method, but it appears to be madness from the outside.
In the end, I had to switch to the somewhat more craptastic-but-reliable approach of creating a hidden iframe, injecting script/img elements into it, then waiting on the iframe window's onload event. This works, but gives no fine-grained feedback in terms of which elements are loaded (getting reliable onload events from the individual elements is more "cross-browser tricky" than just waiting on the whole frame).
I'd love to understand more precisely what's going on in Chrome/Safari, but sadly don't have the time to dig in further.
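For reference, a rough sketch of that iframe fallback (the URLs are placeholders, and cross-browser quirks such as the initial about:blank load event may need extra handling):

function precacheViaIframe(urls, done) {
  var iframe = document.createElement('iframe');
  iframe.style.display = 'none';
  document.body.appendChild(iframe);
  var finished = false;
  iframe.onload = function () {
    var imgs = iframe.contentDocument.getElementsByTagName('img');
    // Ignore premature load events (e.g. the initial about:blank one)
    for (var i = 0; i < imgs.length; i++) {
      if (!imgs[i].complete) return;
    }
    if (imgs.length === urls.length && !finished) {
      finished = true;
      done();
    }
  };
  var doc = iframe.contentDocument || iframe.contentWindow.document;
  // Rewriting the framed document restarts its load cycle, so the frame's
  // load event fires again once everything written into it has loaded
  doc.open();
  doc.write('<body>' + urls.map(function (u) {
    return '<img src="' + u + '">';
  }).join('') + '</body>');
  doc.close();
}

precacheViaIframe(['/css/site.css', '/img/hero.png'], function () { /* all precached */ });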

Refused to get unsafe header "Content-Length" / reading ID3 tags from external mp3's

I am trying to write a script that loads ID3 tags from an .mp3 file.
After searching for hours, I found one script that had a small size and did what I wanted; however, I can't get it to work with cross-domain .mp3 files.
I have tried using CSP headers to bypass this, but without luck.
Is there a way to get around this, like downloading the file in the background and showing a progress bar, or can I bypass this another way?
The error I get when loading an external file:
Refused to get unsafe header "Content-Length"
It's connected to a .getResponseHeader() call, as far as I remember.
Content-Length is now a CORS-safelisted response header defined in the Fetch spec: https://fetch.spec.whatwg.org/#cors-safelisted-response-header-name, so the library or browser you are using may be outdated.
There are two things you can try:
Use the updated library for reading ID3 tags
Follow this stackoverflow answer to get around the CORS issue
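For example, with an up-to-date browser the length can be read straight off a cross-origin XHR like this (a sketch; the mp3 URL is a placeholder, and the host still has to allow the request with Access-Control-Allow-Origin):

var xhr = new XMLHttpRequest();
xhr.open('GET', 'https://other-origin.example/track.mp3', true); // placeholder URL
xhr.responseType = 'arraybuffer';
xhr.onload = function () {
  // Content-Length is CORS-safelisted in current browsers; older ones also need
  // the server to send Access-Control-Expose-Headers: Content-Length
  console.log('Content-Length:', xhr.getResponseHeader('Content-Length'));
};
xhr.send();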

CORS: PHP bypass not working

I'm working on a Chrome extension, part of which is a function that manipulates images on a page using canvas and its context.getImageData function. That's when I ran into CORS issues. It's my understanding that a server serving an image has to serve said image with appropriate CORS headers in order for cross-domain requests to be successful. I started reading up on this (to me) new and unfamiliar technology (tutorial). A substantial number of servers don't employ CORS, and it's very important for the function of my extension that every image is processed. I've spent a whole day trying to circumvent this issue using client-side scripting but came to the conclusion that the only way is to send the image URL to a server and then serve it back with the needed CORS headers (Access-Control-Allow-Origin: *). Now, before I get into explaining my implementation, I'd like to quote a paragraph from the tutorial page I linked previously.
Cross-Domain from Chrome Extensions
Chrome extensions support cross-domain requests in two different ways:
Include domain in manifest.json - Chrome extensions can make cross-domain requests to any domain if the domain is included in the "permissions" section of the manifest.json file:
"permissions": [ "http://*.html5rocks.com"]
The server doesn't need to include any additional CORS headers or do any more work in order for the request to succeed.
This should mean that "permissions": "<all_urls>" circumvents same-origin policy restrictions. However, it does not work.
My solution
An XMLHttpRequest passes the image url and callback function to the server on localhost (for testing purposes) which first sets the appropriate header:
header('Access-Control-Allow-Origin: *');
and then, using file_get_contents, imagecreatefromstring, and base64_encode, prints a JSON-encoded array containing the image width and height and the equivalent of context.getImageData, then calls the callback function.
The callback function sets the src property of an Image object (which has crossOrigin set to Anonymous) that is used for drawing the images onto the canvas, and sets its width and height properties.
Result
The expected result was for every image to be loaded and processed without raising a "Cross-origin image load denied by Cross-Origin Resource Sharing policy" error; however, now every image seems to be served without the needed CORS headers, crippling my extension. I checked the headers sent by the localhost page that processes this request and they seem to be okay. (screenshot)
Conclusion
My implementation of this solution seems like it should work and I really have no idea why it doesn't. The server is sending the Access-Control-Allow-Origin header, the image data is good and the callback function is called. This is the only issue left to resolve before release. This is a really intriguing issue. I realise the header I'm sending isn't the only one I might want to send but it's sufficient for testing purposes.
I hope this question was clear, and detailed enough for someone to help me resolve this issue. Please do not hesitate to ask for more information and/or code snippets as I didn't really include any code in an attempt to keep this concise.
If your image src is a data URI (base64-encoded image data), then there are no headers on which to set access control.
Just set the image source to the URL you're calling in AJAX and send back the image unencoded (echo file_get_contents).
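A minimal sketch of the client side of that approach (the proxy path and image URL are placeholders; it assumes the proxy echoes the raw image bytes with Access-Control-Allow-Origin: * set):

var originalImageUrl = 'http://example.com/photo.jpg'; // placeholder remote image
var img = new Image();
img.crossOrigin = 'Anonymous'; // the proxy's CORS header lets this cross-origin load succeed
img.onload = function () {
  var canvas = document.createElement('canvas');
  canvas.width = img.width;
  canvas.height = img.height;
  var ctx = canvas.getContext('2d');
  ctx.drawImage(img, 0, 0);
  // The canvas is not tainted, so getImageData no longer throws
  var data = ctx.getImageData(0, 0, canvas.width, canvas.height);
};
img.src = 'http://localhost/proxy.php?url=' + encodeURIComponent(originalImageUrl); // placeholder proxy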

Hard refresh and XMLHttpRequest caching in Internet Explorer/Firefox

I make an Ajax request in which I set the response cacheability and last modified headers:
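// Reply 304 whenever the client sent an If-Modified-Since header (no date comparison here; see the updates below)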
if (!String.IsNullOrEmpty(HttpContext.Current.Request.Headers["If-Modified-Since"]))
{
HttpContext.Current.Response.StatusCode = 304;
HttpContext.Current.Response.StatusDescription = "Not Modified";
return null;
}
HttpContext.Current.Response.Cache.SetCacheability(HttpCacheability.Public);
HttpContext.Current.Response.Cache.SetLastModified(DateTime.UtcNow);
This works as expected. The first time I make the Ajax request, I get 200 OK. The second time I get 304 Not Modified.
When I hard refresh in Chrome (Ctrl+F5), I get 200 OK - fantastic!
When I hard refresh in Internet Explorer/Firefox, I get 304 Not Modified. However, every other resource (JS/CSS/HTML/PNG) returns 200 OK.
The reason is that the "If-Modified-Since" header is sent for XMLHttpRequests regardless of a hard refresh in those browsers. I believe Steve Souders documents it here.
I have tried setting an ETag and conditioning on "If-None-Match" to no avail (it was mentioned in the comments on Steve Souders' page).
Has anyone got any gems of wisdom here?
Thanks,
Ben
Update
I could check the "If-Modified-Since" against a stored last modified date. However, hopefully this question will help other SO users who find the header to be set incorrectly.
Update 2
Whilst the request is sent with the "If-Modified-Since" header each time, Internet Explorer won't even make the request if an expiry isn't set or is set to a future date. Useless!
Update 3
This might as well be a live blog now. Internet Explorer doesn't bother making the second request when the host is localhost. Using a real IP or the loopback address will work.
Prior to IE10, IE does not apply the Refresh Flags (see http://blogs.msdn.com/b/ieinternals/archive/2010/07/08/technical-information-about-conditional-http-requests-and-the-refresh-button.aspx) to requests that are not made as part of loading the document.
If you want, you can adjust the target URL to contain a nonce to prevent the cached copy from satisfying a future request. Alternatively, you can send max-age=0 to force IE to conditionally revalidate the resource before each reuse.
As for why the browser reuses a cached resource that didn't specify a lifetime, please see http://blogs.msdn.com/b/ie/archive/2010/07/14/caching-improvements-in-internet-explorer-9.aspx
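A sketch of the nonce approach mentioned above (the /url endpoint is a placeholder):

var xhr = new XMLHttpRequest();
// A unique query-string value means IE's cached copy can never satisfy this request
xhr.open('GET', '/url?nonce=' + new Date().getTime(), true);
xhr.onload = function () {
  // Always a full fetch; the nonce URL has never been cached
};
xhr.send();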
The solution I came upon for consistent control was managing the cache headers for all request types.
So, I forced standard requests to use the same policy as XMLHttpRequests, which meant telling IE to use the following cache policy: Cache-Control: private, max-age=0.
For some reason, IE was not honoring headers for various request types. For example, my cache policy for standard requests defaulted to the browser's, while for XMLHttpRequests it was set to the aforementioned control policy. However, making a request to something like /url as a standard GET request rendered the result properly. Unfortunately, making the same request to /url as an XMLHttpRequest would not even hit the server, because the GET request was cached and the XMLHttpRequest was hitting the same URL.
So, either force your cache policy on all fronts or make sure you're using different access points (uri's) for your request types. My solution was the former.
