I am developing a web page that needs to display, in an iframe, a report served by another company's SharePoint server. They are fine with this.
The page we're trying to render in the iframe is giving us X-Frame-Options: SAMEORIGIN which causes the browser (at least IE8) to refuse to render the content in a frame.
First, is this something they can control or is it something SharePoint just does by default? If I ask them to turn this off, could they even do it?
Second, can I do something to tell the browser to ignore this http header and just render the frame?
If the 2nd company is happy for you to access their content in an IFrame then they need to take the restriction off - they can do this fairly easily in the IIS config.
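If they are on IIS 7 or later, removing it can be as small as a web.config change. A sketch, assuming the header is being added via IIS custom headers (SharePoint can also add it by other means, in which case this alone may not be enough):
<configuration>
  <system.webServer>
    <httpProtocol>
      <customHeaders>
        <remove name="X-Frame-Options" />
      </customHeaders>
    </httpProtocol>
  </system.webServer>
</configuration>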
There's nothing you can do to circumvent it, and anything that does work should get patched quickly in a security hotfix. You can't tell the browser to just render the frame if the source content's header says it's not allowed in frames; that would make session hijacking much easier.
If the content is GET-only and you don't post data back, you could fetch the page server-side and proxy the content without the header, but then any post back would get invalidated.
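A minimal sketch of that proxy approach, assuming Node.js with Express and node-fetch (the report URL is a placeholder):
const express = require('express');
const fetch = require('node-fetch');

const app = express();
const REPORT_URL = 'https://other-company.example/report.aspx'; // placeholder

app.get('/report-proxy', async (req, res) => {
    const upstream = await fetch(REPORT_URL);
    // X-Frame-Options is simply never copied onto our response, so the
    // browser sees /report-proxy as ordinary same-origin content.
    res.set('Content-Type', upstream.headers.get('content-type') || 'text/html');
    res.send(await upstream.text());
});

app.listen(3000);
As noted above, this only works for GET-style content; anything the page posts back will go to your proxy instead of the original server.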
UPDATE: 2019-12-30
It seems that this tool is no longer working! [Request for update!]
UPDATE 2019-01-06: You can bypass X-Frame-Options in an <iframe> using my X-Frame-Bypass Web Component. It extends the IFrame element by using multiple CORS proxies and it was tested in the latest Firefox and Chrome.
You can use it as follows:
(Optional) Include the Custom Elements with Built-in Extends polyfill for Safari:
<script src="https://unpkg.com/#ungap/custom-elements-builtin"></script>
Include the X-Frame-Bypass JS module:
<script type="module" src="x-frame-bypass.js"></script>
Insert the X-Frame-Bypass Custom Element:
<iframe is="x-frame-bypass" src="https://example.org/"></iframe>
The X-Frame-Options header is a security feature enforced at the browser level.
If you have control over your user base (IT dept for corp app), you could try something like a greasemonkey script (if you can a) deploy greasemonkey across everyone and b) deploy your script in a shared way)...
Alternatively, you can proxy their result. Create an endpoint on your server, and have that endpoint open a connection to the target endpoint, and simply funnel traffic backwards.
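If that endpoint sits behind NGINX, the funnel can be a couple of lines of config (a sketch; the upstream URL is a placeholder):
location /report/ {
    proxy_pass https://target-endpoint.example/;
    # Drop the offending header on the way back so the browser will frame it.
    proxy_hide_header X-Frame-Options;
}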
Yes, Fiddler is an option for me:
Open Fiddler menu > Rules > Customize Rules (this effectively edits CustomRules.js).
Find the function OnBeforeResponse
Add the following lines:
oSession.oResponse.headers.Remove("X-Frame-Options");
oSession.oResponse.headers.Add("Access-Control-Allow-Origin", "*");
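For context, those lines end up inside the OnBeforeResponse function that already exists in CustomRules.js (a sketch; the rest of the function body is elided):
static function OnBeforeResponse(oSession: Session) {
    // ... Fiddler's existing response handling ...
    oSession.oResponse.headers.Remove("X-Frame-Options");
    oSession.oResponse.headers.Add("Access-Control-Allow-Origin", "*");
}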
Remember to save the script!
As for the second question - you can use Fiddler filters to set the response X-Frame-Options header manually to something like ALLOW-FROM with your own origin (note that ALLOW-FROM takes a single URI, not a wildcard). But, of course, this trick will work only for you - other users still won't be able to see the iframe content (unless they do the same).
So my ISP (Smartfren; Indonesia) has decided to start injecting all non-SSL pages with an iframing script that allows them to insert ads into pages. Here's what's happening:
My browser sends a request to the server. ISP intercepts it and instead returns a javascript that loads the requested page inside an iframe.
Aside from being annoying in principle, this injection also breaks any number of standard page features and presents possible security hazards.
What I've tried to do so far:
Using a GreaseMonkey script to nix away the injected code and redirect to the original URL. Result: Breaks some legitimate iframes. Also, the ISP's code gets executed, because GreaseMonkey only kicks in after the page is loaded.
Using Privoxy for a local proxy and setting up a filter to clean up the injection and replace it with a plain javascript redirect to the original URL. Result: Breaks some legitimate iframes. ISP's code never gets to the browser.
You can view the GreaseMonkey and Privoxy fixes I've been working on at the following paste: http://pastebin.com/sKQTvgY2 ... along with a sample of the ISP's injection.
Ideally I could configure Privoxy to immediately resend the request when the alteration is detected, instead of filtering out the injected JS and replacing it with a JS redirection to the original URL. (The ISP injection gets switched off when the same request is resent without delay.) I have yet to figure out how to accomplish that. I believe it'd fix the iframe-breaking problem.
I know I could switch to a VPN or use the Tor browser. (Or change the ISP.) I'm hoping there's another way around. Any suggestions on how to eliminate this nuisance?
Actually now I have a solution:
The ISP proxy reacts to the Accept: header that the browser sends.
So this is the default for firefox:
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Now we are going to change this default and set it to:
Accept: */*
Here is how to set this up with the Header Hacker extension for Google Chrome:
Set the title to anything you like: NO IFRAME
Under Append/replace, select Replace
Set String to */*
Set Match string to .* and then click Add.
Under Permanent header switches, set the domain to .* and select the rule you just created.
PS: changing it in the Firefox settings does not work 100%, because some requests (like AJAX) seem to bypass it, so a plugin is the only way, as it literally intercepts every outgoing browser request.
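For the curious, such a plugin boils down to very little code. A sketch using Chrome's Manifest V2 webRequest API (it assumes the "webRequest", "webRequestBlocking" and "<all_urls>" permissions in the manifest):
// background.js
chrome.webRequest.onBeforeSendHeaders.addListener(
    function (details) {
        for (var i = 0; i < details.requestHeaders.length; i++) {
            if (details.requestHeaders[i].name.toLowerCase() === 'accept') {
                // A generic Accept value keeps the ISP proxy from reacting.
                details.requestHeaders[i].value = '*/*';
            }
        }
        return { requestHeaders: details.requestHeaders };
    },
    { urls: ['<all_urls>'] },
    ['blocking', 'requestHeaders']
);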
That's it, no more iframes!!!
Hope this helps!
UPDATE: Using DNSCrypt is the best solution 😁
OLD ANSWER
I'm using this method:
Find the resource that contains the iframe code (use the Chrome dev tools)
Block the URL with a proxy or hosts file
I'm using Linux, so I edited my hosts file at
/etc/hosts
Example :
127.0.0.1 ibnads.xl.co.id
I've recently got an email from Google, saying that they are going to ban my AdSense account because I'm sending Personally Identifiable Information to them with my Google AdSense tag requests. It says that around 1% of requests from my website have a referrer of:
some_user@my_website.com/some/subpage
and they consider some_user@my_website.com to be PII (even though it can be completely made up, like abcd1234@my_website.com). More on this here: https://support.google.com/adsense/answer/6163366?hl=en .
I never link to this kind of URL (the only form I use is my_website.com/some/subpage), but I guess my users sometimes enter it manually (since product-wise my website provides an email service, it may seem reasonable by some logic).
I figured a URI of some_user@my_website.com/some/subpage is legal, since HTTP basic auth allows specifying a user like this. When I entered it manually in Firefox, the some_user@ part disappeared from the location bar, but in the Net panel of Firebug I could see all files were indeed requested from some_user@my_website.com/some/subpage, and that's how Google sees it too.
I thought that as a brute-force solution even something like:
if uri contains '@':
    redirect to my_website.com
would do.
I'm using NGINX/UWSGI/Python Paste + JS. I've tried to implement the above condition both on the server side and in JS, but my URI always says my_website.com/some/subpage even if I manually put some_user@my_website.com/some/subpage in the browser address bar.
I've also tried configuring basic_auth in NGINX to disallow providing any user but with no effect.
How do I get rid of these requests?
How do I get the FULL URI (with some_user@) in JS? I tried document.URI and window.location.href but they didn't contain the user part...
Apparently the presence of the user@ part in the URI can be detected by examining window.location.href. I hadn't noticed it before since window.location.href only contains user@ in WebKit-based browsers (e.g. Chrome, Opera, Safari) but not in Firefox!
To resolve the problem I've added a check for that in JS + a JS redirect to a URL without the user[:password]@ part.
Hopefully Google uses the same variable to figure out the referrer for the ad requests, so it gets PII only from WebKit browsers, and fixing it for WebKit suffices. Will keep you posted.
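A minimal sketch of that check (it only fires in WebKit-based browsers, since Firefox strips user@ from window.location.href; the rebuilt URL simply drops the userinfo part):
(function () {
    var href = window.location.href;
    var at = href.indexOf('@');
    // Treat the '@' as userinfo only if it appears before the host part.
    if (at !== -1 && at < href.indexOf(window.location.host)) {
        window.location.replace(
            window.location.protocol + '//' + window.location.host +
            window.location.pathname + window.location.search + window.location.hash
        );
    }
})();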
In my https application I am loading some images. If the img src is not available, I will show a default image - I got this JS that will do the job for IE also:
$(document).ready(function() {
    $("img").each(function(index) {
        $(this).error(function() {
            this.src = "/shared/ts/web/images/missing_img.gif";
        });
        $(this).attr("src", $(this).attr("src"));
    });
});
It works great with IE10, but with IE8 I have a security warning:
Do you want to view only the webpage content that was delivered securely?
If I click Yes, I cannot see missing_img.gif; if I click No, everything is OK.
Can you tell me why I am getting this warning and what I can do to remove it?
I don't want to change the client default IE8 settings.
From the looks of it you are not delivering that particular image via https. Any resource that is not delivered by https can potentially be modified or read by an attacker whilst in transit. This will throw errors in some browsers, whilst others (such as Chrome) will 'cross out' your 'https' bit in the url to show that not all elements are secure.
To fix it, make sure that all the resources you deliver are via https. So in this case you might want to force it by doing
this.src = "https://mysite.com/shared/ts/web/images/missing_img.gif";
I'm not entirely sure why this isn't automatically using https. Maybe someone else can enlighten you on that one.
It's because your image (since you didn't specify https:// as the protocol) gets the http:// protocol.
Since this is on your secure site, it'll try to load "http://domain.com/shared/ts/web/images/missing_img.gif", and that's "not secure content on a secure site".
I think you can fix this by either making the path absolute (include the entire thing, including https://your-domain.com), or using "//" (double slashes) and omitting the http(s): part, so it'll take the same protocol as the site (https when the site is secure, http otherwise).
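Applied to the snippet above, the protocol-relative form of the fallback image path would look like this (your-domain.com stands in for wherever the image is actually hosted):
this.src = "//your-domain.com/shared/ts/web/images/missing_img.gif";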
There are a few ways to include jQuery and jQuery UI and I'm wondering what people are using?
Google JSAPI
jQuery's site
your own site/server
another CDN
I have recently been using Google JSAPI, but have found that it takes a long time to set up an SSL connection or even just to resolve google.com. I have been using the following for Google:
<script src="https://www.google.com/jsapi"></script>
<script>
google.load('jquery', '1.3.1');
</script>
I like the idea of using Google so it's cached when visiting other sites and to save bandwidth from our server, but if it keeps being the slow portion of the site, I may change the include.
What do you use? Have you had any issues?
Edit: Just visited jQuery's site and they use the following method:
<script type="text/javascript" src="http://ajax.googleapis.com/ajax/libs/jquery/1.3/jquery.min.js"></script>
Edit2: Here's how I've been including jQuery without any problems for the last year:
<script src="//ajax.googleapis.com/ajax/libs/jquery/1.4.3/jquery.min.js"></script>
The difference is the removal of http:. By removing this, you don't need to worry about switching between http and https.
Without a doubt I choose to have jQuery served by the Google API servers. I didn't go with the jsapi method since I don't leverage any other Google APIs; however, if that ever changed then I would consider it...
First: The Google API servers are distributed across the world instead of my single server location: closer servers usually mean faster response times for the visitor.
Second: Many people choose to have jQuery hosted on Google, so when a visitor comes to my site they may already have the jQuery script in their local cache. Pre-cached content usually means faster load times for the visitor.
Third: My web hosting company charges me for the bandwidth used. No sense consuming 18k per user session if the visitor can get the same file elsewhere.
I understand that I place a portion of trust on Google to serve the correct script file, and to be online and available. Up to this point I haven't been disappointed with using Google and will continue this configuration until it makes sense not to.
One thing worth pointing out... If you have a mixture of secure and insecure pages on your site you might want to dynamically change the Google source to avoid the usual warning you see when loading insecure content in a secure page:
Here's what I came up with:
<script type="text/javascript">
document.write([
"\<script src='",
("https:" == document.location.protocol) ? "https://" : "http://",
"ajax.googleapis.com/ajax/libs/jquery/1.2.6/jquery.min.js' type='text/javascript'>\<\/script>"
].join(''));
</script>
UPDATE 9/8/2010 -
Some suggestions have been made to reduce the complexity of the code by removing the HTTP and HTTPS and simply use the following syntax:
<script type="text/javascript">
document.write("\<script src='//ajax.googleapis.com/ajax/libs/jquery/1.2.6/jquery.min.js' type='text/javascript'>\<\/script>");
</script>
In addition you could also change the url to reflect the jQuery major number if you wanted to make sure that the latest Major version of the jQuery libraries were loaded:
<script type="text/javascript">
document.write("\<script src='//ajax.googleapis.com/ajax/libs/jquery/1/jquery.min.js' type='text/javascript'>\<\/script>");
</script>
Finally, if you don't want to use Google and would prefer jQuery's own host, you could use the following source path (keep in mind that, at the time of writing, code.jquery.com didn't support SSL connections):
<script type="text/javascript">
document.write("\<script src='http://code.jquery.com/jquery-latest.min.js' type='text/javascript'>\<\/script>");
</script>
One reason you might want to host on an external server is to work around the browser limitations on concurrent connections to a particular server.
However, given that the jQuery file you are using will likely not change very often, the browser cache will kick in and make that point moot for the most part.
A second reason to host it on an external server is to lower the traffic to your own server.
However, given the size of jQuery, chances are it will be a small part of your traffic. You should probably try to optimize your actual content.
jQuery 1.3.1 min is only 18k in size. I don't think that's too much of a hit to ask on the initial page load. It'll be cached after that. As a result, I host it myself.
If you want to use Google, the direct link may be more responsive. Each library has the path listed for the direct file. This is the jQuery path
<script src="http://ajax.googleapis.com/ajax/libs/jquery/1.3.1/jquery.min.js"></script>
Just reread your question - is there a reason you are using https? This is the script tag Google lists in their example:
<script src="http://www.google.com/jsapi"></script>
I wouldn't want any public site that I developed to depend on any external site, and thus, I'd host jQuery myself.
Are you willing to have an outage on your site when the other (Google, jquery.com, etc.) goes down? Less dependencies is the key.
Pros: hosting on Google has benefits:
Probably faster (their servers are more optimised)
They handle the caching correctly - 1 year (we struggle to be allowed to make the changes to get the headers right on our servers)
Users who have already had a link to the Google-hosted version on another domain already have the file in their cache
Cons:
Some browsers may see it as XSS cross-domain and disallow the file.
Particularly users running the NoScript plugin for Firefox
I wonder if you can include it from Google, then check for the presence of some global variable (or somesuch), and if it's absent, load it from your server?
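One common way to do exactly that is to check window.jQuery after the CDN script tag and fall back to a local copy. A sketch (the local path is a placeholder):
<script src="//ajax.googleapis.com/ajax/libs/jquery/1.4.3/jquery.min.js"></script>
<script>
    // If the CDN copy failed to load, window.jQuery is undefined.
    window.jQuery || document.write('<script src="/js/jquery.min.js"><\/script>');
</script>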
There are a few issues here. Firstly, the async load method you specified:
<script type="text/javascript" src="https://www.google.com/jsapi"></script>
<script type="text/javascript">
google.load('jquery', '1.3.1');
google.setOnLoadCallback(function() {
// do stuff
});
</script>
has a couple of issues. Script tags pause the page load while they are retrieved (if necessary). Now if they're slow to load this is bad, but jQuery is small. The real problem with the above method is that because the jquery.js load happens asynchronously, many pages will finish loading and render before jQuery has loaded, so any jQuery styling you do will appear as a visible change to the user.
The other way is:
<script src="http://ajax.googleapis.com/ajax/libs/jquery/1.3.1/jquery.min.js"></script>
Try a simple example: take a simple table and change the background of the cells to yellow, once with the setOnLoadCallback() method and once with $(document).ready() and a static jquery.min.js load. The second method will have no noticeable flicker; the first will. Personally I think that's not a good user experience.
As an example run this:
<html>
<head>
    <title>Layout</title>
    <style type="text/css">
        .odd { background-color: yellow; }
    </style>
</head>
<body>
    <table>
        <tr><th>One</th><th>Two</th></tr>
        <tr><td>Three</td><td>Four</td></tr>
        <tr><td>Five</td><td>Six</td></tr>
        <tr><td>Seven</td><td>Eight</td></tr>
        <tr><td>Nine</td><td>Ten</td></tr>
    </table>
    <script src="http://www.google.com/jsapi"></script>
    <script>
        google.load("jquery", "1.3.1");
        google.setOnLoadCallback(function() {
            $(function() {
                $("tr:odd").addClass("odd");
            });
        });
    </script>
</body>
</html>
You (should) see the table appear and then the rows go yellow.
The second problem with the google.load() method is that it only hosts a limited range of files. This is a problem for jQuery since it is extremely plug-in dependent. If you try to include a jQuery plugin with a <script src="..."> tag while loading jQuery via google.load(), the plugin will probably fail with "jQuery is not defined" because jQuery hasn't loaded yet. I don't really see a way around this.
The third problem (with either method) is that they are one external load. Assuming you have some plugins and your own Javascript code you're up to a minimum of two external requests to load your Javascript. You're probably better off getting jquery, all relevant plug-ins and your own code and putting it into one minified file.
From Should You Use Google's Ajax Libraries API for Hosting?:
As to load times, you're actually loading two scripts - the jsapi script and the mootools script (the compressed version from above). So that is two connections, rather than one. In my experience, I found that the load time was actually 2-3 times slower than loading from my own personal shared server, even though it was coming from Google, and my version of the compressed file was a couple of K larger than Google's. This, even after the file had loaded and (presumably) cached. So for me, since the bandwidth doesn't matter much, it isn't going to matter.
Lastly you have the potential problem of a paranoid browser flagging the request as some sort of XSS attempt. It's not typically a problem with default settings, but on corporate networks, where the user may not have control over which browser they use, let alone the security settings, you may have a problem.
So in the end I can't really see myself using the Google AJAX API, for jQuery at least (the more "complete" APIs are a different story in some ways), except to post examples here.
In addition to the people who advise hosting it on your own server, I'd propose keeping it on a separate domain (e.g. static.website.com) to allow browsers to load it in a thread separate from other content. This tip also works for all static stuff, say images and CSS.
I have to vote -1 for the libraries hosted on Google. They are collecting data, Google Analytics style, with their wrappers around these libraries. At a minimum, I don't want a client browser doing more than I'm asking it to do, much less anything else on the page. At worst, this is Google's "new version" of not being evil - using unobtrusive javascript to gather more usage data.
Note: if they've changed this practice, super. But the last time I considered using their hosted libraries, I monitored the outbound http traffic on my site, and the periodic calls out to google servers were not something I expected to see.
I might be old-school about this, but I still frown on hotlinking. Maybe Google is the exception, but in general, it's really just good manners to host the files on your own server.
I will add this as a reason to locally host these files.
Recently a node in Southern California on TWC has not been able to resolve the ajax.googleapis.com domain (for users with IPv4 only), so we are not getting the external files. This had been intermittent up until yesterday (now it is persistent). Because it was intermittent, I was having tons of problems troubleshooting SaaS user issues. I spent countless hours trying to track down why some users were having no issues with the software while others were tanking. In my usual debugging process I'm not in the habit of asking users whether they have IPv6 turned off.
I stumbled on the issue because I myself was using this particular "route" to the file and am also using only IPv4. I discovered the issue when developer tools told me jQuery wasn't loading, then started doing traceroutes etc. to find the real issue.
After this, I will most likely never go back to externally hosted files, because Google doesn't have to go down for this to become a problem, and any one of the nodes in between can be compromised with DNS hijacking and deliver malicious JS instead of the actual file. I always thought I was safe in that a Google domain would never go down; now I know any node between a user and the host can be a point of failure.
I just include the latest version from the jQuery site: http://code.jquery.com/jquery-latest.pack.js It suits my needs and I never have to worry about updating.
EDIT: For a major web app, certainly control it; download it and serve it yourself. But for my personal site, I could not care less. Things don't magically disappear; they are usually deprecated first. I keep up with it enough to know what to change for future releases.
Here are some useful resources that may help you choose your CDN.
MS has recently added a new domain for delivering libraries through their CDN.
Old Format: http://ajax.microsoft.com/ajax/jQuery/jquery-1.5.1.js
New Format: http://ajax.aspnetcdn.com/ajax/jQuery/jquery-1.5.1.js
This way requests should not send the cookies set for microsoft.com.
http://www.asp.net/ajaxlibrary/cdn.ashx#Using_jQuery_from_the_CDN_11
Here are some statistics about the most popular technologies used on the web:
http://trends.builtwith.com/
Hope this helps you choose.
If I am responsible for the 'live' site, I'd better be aware of everything that is going on and into my site. For that reason I host the jquery-min version myself, either on the same server or on a static/external server, but either way in a location where only I (or my program/proxy) can update the library after having verified/tested every change.
In head:
(function() {
    var jsapi = document.createElement('script');
    jsapi.type = 'text/javascript';
    jsapi.async = true;
    jsapi.src = ('https:' == document.location.protocol ? 'https://' : 'http://') + 'www.google.com/jsapi?key=YOUR KEY';
    // Append to <head>, falling back to <body> if no <head> exists.
    (document.getElementsByTagName('head')[0] || document.getElementsByTagName('body')[0]).appendChild(jsapi);
})();
End of Body:
<script type="text/javascript">
google.load("jquery", "version");
</script>
I host it with my other JS files on my own server and, at that point, combine and minify them (with django-compresser here, but that's not the point) to be served as just one JS file with everything the site needs in it. You'll need to serve your own JS files anyway, so I see no reason not to add the extra jQuery bytes there too - a few more KBs are much cheaper to transfer than additional requests. You are not dependent on anyone, and as soon as your minified JS is cached, you're super fast as well.
On first load, a CDN-based solution might win if the visitor already has the CDN copy cached from another site, since otherwise you must load the additional jQuery kilobytes from your own server (though without an additional request). I doubt the difference is noticeable, though. And on a first load with a cleared cache, your self-hosted solution will probably always be much faster, because of the extra request (and DNS lookup) needed to fetch the CDN jQuery.
I wonder why this point is almost never mentioned, and how CDNs seem to be taking over the world :)