PUT request to Cross Domain URL fails only in IE - javascript

I want to make a PUT request to a different domain, but the script fails only in IE.
I figured out the cause: in IE, under Internet Options > Security tab > Custom level > Miscellaneous, the "Access data sources across domains" option was set to Disable. The only way I was able to get my PUT request working was to set that option to Allow.
So my question: is there a way I can get this working without forcing end users to change that option?
There is XDomainRequest(), which can be used for cross-domain requests in IE, but it doesn't support PUT.

IE9 and older do not support the PUT method in cross-domain requests; only GET and POST are available.
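If you also control the server, one common workaround is to tunnel the PUT through a POST with a method-override parameter. This is a sketch, not a standard: the `_method` parameter name is only a convention (used e.g. by some Rails/Express middleware), and your server must be written to honor it. Since XDomainRequest cannot set custom headers, the override has to travel in the URL rather than in an X-HTTP-Method-Override header.

```javascript
// Sketch: tunnel a PUT through XDomainRequest as a POST, assuming the
// server honors a "_method" override parameter (a convention, not a
// standard -- the server must be taught to treat the POST as a PUT).
function tunnelMethod(url, method) {
  var sep = url.indexOf('?') === -1 ? '?' : '&';
  return url + sep + '_method=' + encodeURIComponent(method);
}

function xdrSend(url, method, body, onLoad) {
  // IE8/IE9 only; feature-detect XDomainRequest before calling this.
  var xdr = new XDomainRequest();
  xdr.onload = function () { onLoad(xdr.responseText); };
  xdr.open('POST', tunnelMethod(url, method)); // real verb is POST
  xdr.send(body);
}
```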

You could use a library like Xdomain or EasyXDM to get a CORS alternative built on the postMessage hack.
I prefer Xdomain because it hijacks the native XMLHttpRequest and provides a "drop-in" solution. EasyXDM forces you to use its API, which means more conditional coding overhead; however, it supports IE6/IE7.
The main takeaway? Don't stop supporting CORS! Just make IE behave itself and opt in to the future.

Related

Should AJAX use hashtag /#!/ or not?

I've made a webpage that has the URL-form http://www.example.com/module/content
It's a very dynamic webpage, actually it is a web app.
To make it as responsive as possible, I want to use AJAX instead of normal page requests. This is also enabling me to use JavaScript to add a layer to provide offline capabilities.
My question is only: How should I make the URLs? Should they be http://www.example.com/module/content or http://www.example.com/#!/module/content?
What follows are just my thoughts in both directions. You don't need to read it if you already have a clear opinion about this.
I want to use the first version because I want to support the new HTML5 standard. It is easy to use, and the URLs look pretty. But more important is that it allows me to do this:
If the user requests a page, it will get a full HTML page back.
If the user then clicks a link, it will insert only the contents into the container div via AJAX.
This will enable users without JavaScript to use my website, since it does not REQUIRE the user to have JavaScript; it will simply use the plain old "click a link, send a request, get a full HTML page back" approach.
While this is great, the problem is of course Internet Explorer. Only the latest version of IE supports the History API, and to get this working in older versions, one needs to use a hashtag. (50 % of my users will be using IE, so I need to support it...) So then I have to use /#!/ to get it working.
If one uses both of these URL versions, a problem arises: if an IE user posts such a link somewhere, Google will request it from the server as an _escaped_fragment_ URL (per its AJAX crawling scheme) and index the IE version with the hashbang, while other pages will be indexed without it.
And as we remember, a non-hashbang URL is better in many different ways. So, can this search engine problem be circumvented? Is it possible to tell GoogleBot that if it crawls the hashbang version of the website, it should be redirected to the non-hashbang version of the page? That way, one could get all the benefits of a non-hashbang URL and still support IE6-IE9.
What do you think is the best approach? Maybe you have tried it in practice yourself?
Thanks for your answer!
If you want Google to index your AJAX content, then you should use the #!key=value style; that is what Google prefers for AJAX navigation.
If you really prefer the pretty HTML5 URL without #! then, yes, you can support both without indexing problems! Just add:
<link rel="canonical" href="preferredurl" />
to the <head> section of each page (for the initial load), to help Google know which version of the URL you would prefer it to index. Read more about canonical URLs here.
In that case the solution is very easy. You use the first URL scheme, and you don't use AJAX enhancements for older IE browsers.
If your precious users don't want to upgrade, it's their problem, but they can't complain about not having these kewl effects and dynamics.
You can throw a "Your browser is severely outdated!" notice for legacy browsers as well.
I would not use /#!/ in the URL. First make sure the page works normally, with full page requests (that should be easy). Once that works, you can check for the window.history object and, if it is present, add AJAX. The AJAX calls can go to the same URL; the main difference is in the server-side handling. The check is simple: if HTTP_X_REQUESTED_WITH is set, the request is an AJAX request; if it is not set, you're dealing with a standard request.
You don't need to worry about duplicate content, because GoogleBot does not set the HTTP_X_REQUESTED_WITH request header.
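The server-side branch described above can be sketched as follows. The Express-style handler and the renderFragment/renderFullPage helpers are hypothetical names; the header check itself is the point (jQuery and most AJAX libraries send "X-Requested-With: XMLHttpRequest", while plain page loads and GoogleBot do not).

```javascript
// Returns true when the request came from an AJAX library that sets
// the conventional X-Requested-With header.
function isAjax(headers) {
  var value = headers['x-requested-with'] || headers['X-Requested-With'];
  return value === 'XMLHttpRequest';
}

// Hypothetical Express-style handler using the check:
function handleContent(req, res) {
  if (isAjax(req.headers)) {
    res.send(renderFragment(req.path));  // just the container contents
  } else {
    res.send(renderFullPage(req.path));  // full HTML document
  }
}
```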

What triggers "Internet Explorer has modified this page to help prevent cross-site scripting."?

I'm trying to implement a workaround for missing CORS functionality in Internet Explorer. For GET requests I use JSONP, no problem here. For small POST/DELETE/PUT requests I also use JSONP by tunneling the requests through GET but this does not work for larger requests (Because the length of the GET URL is limited). So for large data I try to implement a form POST via an iframe. I can't read the response from this POST because of the same-origin policy so I fetch the response via a JSONP GET request after posting the data. Works great but sometimes I get a strange warning in IE 9:
Internet Explorer has modified this page to help prevent cross-site scripting.
First I wondered what the hell IE was doing there, because even when this warning appears everything still works correctly. Then I found out that IE replaces the content of the hidden iframe AFTER the POST answer (which I can't read and don't need anyway) with a "#" character.
So my workaround still works even when this warning appears but I would like to know what exactly triggers this warning so maybe I can modify my CORS workaround to get rid of this warning. Any hints?
You can configure the X-XSS-Protection header on your server. Sending X-XSS-Protection: 0 tells IE to disable its XSS filter for your site, which should suppress that warning.
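As a minimal sketch, assuming a plain Node server (any server stack can send the same header):

```javascript
// Adds "X-XSS-Protection: 0" to a response, which tells IE to disable
// its XSS filter for that page.
function setXssHeader(res) {
  res.setHeader('X-XSS-Protection', '0');
}

// Usage with Node's built-in http module:
// var http = require('http');
// http.createServer(function (req, res) {
//   setXssHeader(res);
//   res.end('ok');
// }).listen(8080);
```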

cross-origin header in IE8/IE9

Since jQuery's ajax is not working for CORS in IE, I'm using XDomainRequest to retrieve data from another server. Works fine, but I would like to send some headers ('Authorization', 'Content-Type').
Is there a chance to add/change headers in XDomainRequest?
Or does someone know a workaround?
This is what we did for IE.
If you have control over the target domain, host a (static) html file there and include it using an iframe.
Now this iframe does have same-origin access to the target domain, so you can communicate between the parent and child frame to get what you need.
This worked much better than XDomainRequest for us.
window.postMessage is the best way to set up the communication:
But I'm pretty sure that only started working in IE8. If you need to support older browsers as well, you must use a different hack.
In our case, this was our 3-layer system:
CORS, for browsers that support it
An iframe & window.postMessage as a primary fallback
A server-side proxy script as the secondary fallback
All of these options work well, are reliable and didn't feel too much like a hack. The secondary fallback was barely ever used.
Keep in mind that the 'Authorization' header specifically is special, and I would not be shocked if it were blocked under certain circumstances anyway. We added a custom 'X-Authenticate' header, as it passed through every time.
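A minimal sketch of the postMessage bridge in that primary fallback, with all names illustrative. The essential part is checking event.origin so arbitrary pages can't inject data into your handler:

```javascript
// Returns a "message" event handler that ignores messages from any
// origin other than the trusted one (the security-critical check).
function makeMessageHandler(trustedOrigin, onData) {
  return function (event) {
    if (event.origin !== trustedOrigin) return;
    onData(event.data);
  };
}

// Parent page (on your domain), names hypothetical:
// var frame = document.getElementById('bridge').contentWindow;
// window.addEventListener('message',
//   makeMessageHandler('https://api.example.com', handleResponse), false);
// frame.postMessage(JSON.stringify({ method: 'PUT', url: '/items/1' }),
//   'https://api.example.com');
// The iframe on api.example.com does a same-origin XHR and posts the
// result back with event.source.postMessage(result, event.origin).
```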
IE's XDomainRequest does not allow custom headers to be set; see item #3 here: http://blogs.msdn.com/b/ieinternals/archive/2010/05/13/xdomainrequest-restrictions-limitations-and-workarounds.aspx. The XDomainRequest object is locked down to the point where it is difficult to make authenticated requests.

Stop link from sending referrer to destination

I have a page where I don't want the outbound links to send a referrer so the destination site doesn't know where they came from.
I'm guessing this isn't possible, but I just want to make sure there isn't any hidden JavaScript magic that could do it and that would work with some (if not most) browsers.
Maybe some clever HTTP status code redirecting kung-fu?
Something like this would be perfect
link
The attribute you are looking for is rel="noreferrer": https://html.spec.whatwg.org/multipage/links.html#link-type-noreferrer
According to https://caniuse.com/rel-noreferrer, all the major browsers have supported it since at least 2015, though Opera Mini does not (and, of course, some users may be using older browser versions).
For anyone who's visiting in 2015 and beyond, there's now a proper solution gaining support.
The HTTP Referrer Policy spec lets you control referrer-sending for links and subresources (images, scripts, stylesheets, etc.) and, at the moment, it's supported on Firefox, Chrome, Opera, and Desktop Safari 11.1.
Edge, IE11, iOS Safari, and desktop versions of Safari prior to 11.1 support an older version of the spec with never, always, origin, and default as the options.
According to the spec, both can be supported by specifying multiple policy values: unrecognized values are ignored, and the last recognized one wins.
<meta name="referrer" content="never">
<meta name="referrer" content="no-referrer">
Also, if you want to apply it to audio, img, link, script, or video tags which require a crossorigin attribute, prefer crossorigin="anonymous" where possible, so that only the absolute minimum (the Origin header) will be shared.
(You can't get rid of the Origin header while using CORS because the remote sites need to know what domain is making the request in order to allow or deny it.)
HTML 5 includes rel="noreferrer", which is supported in all major browsers. So for these browsers, you can simply write:
<a href="http://example.com/" rel="noreferrer">link</a>
There's also a shim available for other browsers: https://github.com/knu/noreferrer
Bigmack is on the right track, but a JavaScript location change still sends a referrer in Firefox. Using a meta refresh seems to solve the problem for me.
<a href='data:text/html;charset=utf-8, <html><meta http-equiv="refresh" content="0; url=http://google.ca"></html>'>Link</a>
I was trying to figure this out too.
The solution I thought of was to use a data url to hide the actual page I am coming from.
<a href='data:text/html;charset=utf-8, <html><script>window.location = "http://google.ca";</script></html>'>Link</a>
This link opens a page that only contains javascript to load a different page.
In my testing no referrer is given to the final destination. I don't know what it could send as a referrer if it tried anyway; maybe the data URL? That wouldn't give away where you came from.
This works in Chrome, which is my only concern for my current problem, but for browsers that don't like JavaScript in data-URL pages you could probably try a meta refresh.
In addition to the information already provided, there's lots more on the topic here: https://w3c.github.io/webappsec-referrer-policy/#referrer-policy-no-referrer
Specifically, it allows you to send or withhold referrer information under different rules for same-origin and cross-origin requests.
Something to consider depending on your specific use case: if you are pulling in images/CSS/JavaScript from third-party websites, you may not want to identify the URL you are doing this from, and hence would use the no-referrer option. Whereas if you are linking out to other websites from your own site, you may want them to know that you are sending them traffic. Always think through the implications on both sides. If these two needs conflict, there are other options, such as adding UTM tracking parameters to the end of URLs, which may come in handy for some people. Full details here: https://www.contradodigital.com/2014/06/03/importance-utm-tracking-parameters-social-media/
I don't know if I'm missing something here, and am very happy to be corrected, but wouldn't a URL shortening service meet your needs here?
Presumably the logs at the destination site would only show the domain of the shortening service, not the initial referring domain, so you would remain hidden.

javascript cross-domain issue

I am building a small widget that I give to users to embed in their websites and blogs. The widget loads a JavaScript file from my server into the page where it is embedded, which in turn makes an XMLHttpRequest back to my server to obtain data. Due to the same-origin policy this request is blocked when the script runs on any server other than mine. I need a solution for this.
I have searched a lot for a solution.
I am sure I cannot use the proxy solution here, as the domains on which the script will be running are not controlled by me.
Also, I cannot use iframe due to Search engines.
What could be a possible solution?
Thanks,
happyhardik
To my knowledge, using JSONP is the only way to do this.
Also, I cannot use iframe due to Search engines.
This I don't understand, though: If your widget is JavaScript driven, it won't turn up in any search engines anyway, will it?
This is pretty much a duplicate of any number of related queries. e.g.
Cross Domain Scripting Issues & JSONP
Basically you want to use JSONP.
EDIT: I see Pekka has already said this.
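For what it's worth, a minimal JSONP sketch, assuming you control the widget's server so it can wrap its JSON responses in the requested callback (all names here are illustrative):

```javascript
// Server side: wrap the JSON payload in the callback name the client
// asked for, so the response is executable JavaScript, not bare JSON.
function jsonpWrap(callbackName, data) {
  return callbackName + '(' + JSON.stringify(data) + ');';
}

// Client side (in the embedded widget): inject a <script> tag, since
// script tags are not subject to the same-origin policy.
function jsonp(url, callbackName, onData) {
  window[callbackName] = onData;
  var s = document.createElement('script');
  s.src = url + (url.indexOf('?') === -1 ? '?' : '&') +
          'callback=' + callbackName;
  document.getElementsByTagName('head')[0].appendChild(s);
}

// Usage: jsonp('http://widgetserver.example.com/data', 'myWidgetCb',
//              function (data) { /* render the widget */ });
```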
Browsers do not support cross-domain AJAX requests due to the same-origin policy. You can check out this JavaScript library: ACD.
