I've been reading a lot about third-party JavaScript/cookies, and I have an implementation question.
I've seen this slide show (slide #45); the context of the slide is cross-domain communication, but the same technique is also used by advertisers.
AFAIK it goes like this:
First visit to site #1: the site has a page, and that page also holds an advertiser iframe from TotalNotTrackingYou.com.
The src of the iframe carries general info about that particular page's content (if any).
So TotalNotTrackingYou.com sets a cookie with an identification token when page #1 loads.
This way, when you browse to other pages besides site #1 (notice: the user hasn't clicked any ad yet!), TotalNotTrackingYou.com learns which topics you are interested in.
Now the user leaves site #1 and goes to site #2, which also holds an iframe from TotalNotTrackingYou.com. The same happens here: the cookie that was set during the site #1 request (which in turn loaded an iframe) is sent back to TotalNotTrackingYou.com, which again reads the referrer and the relevant query string (in the src) for that iframe.
TotalNotTrackingYou.com sends you cookies when its iframe loads at sites #1..#5 and so far learns only your navigation habits (via the referrer: which site you were on). But when you click on an advertising ad, TotalNotTrackingYou.com now knows for sure what you are interested in, and adds that to its DB.
From now on, all sites that hold the TotalNotTrackingYou.com iframe will show ads relevant to the user's interest list.
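To make the flow concrete, here is a minimal sketch of what each publisher page might embed; the domain, path and query-string parameters are made up for illustration, not the markup of any real ad network:
(function () {
    // Hypothetical publisher-side snippet: the iframe src carries page context,
    // and the browser attaches TotalNotTrackingYou.com's cookie to the request.
    var frame = document.createElement('iframe');
    frame.src = 'https://www.totalnottrackingyou.com/ad' +
                '?topic=' + encodeURIComponent(document.title) +    // page content info, if any
                '&page=' + encodeURIComponent(location.href);       // also visible to them via the Referer header
    frame.width = 300;
    frame.height = 250;
    frame.style.border = 'none';
    document.body.appendChild(frame);
})();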
Question
A script reference like <script src='www.TotalNotTrackingYou.com/cookiecreator.ashx' /> can also send/receive cookies. So why do advertisers use iframes rather than scripts?
Additional info.
I know that 3rd-party cookies are disabled by default in Safari, but there is a hack: create an iframe and a form, and post that form to that iframe, which will write the cookie.
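Roughly, the hack looks like this (just a sketch; the endpoint and field name are placeholders):
var sink = document.createElement('iframe');
sink.name = 'cookieSink';
sink.style.display = 'none';
document.body.appendChild(sink);

var form = document.createElement('form');
form.method = 'POST';
form.action = 'https://www.totalnottrackingyou.com/cookiecreator.ashx';  // 3rd-party endpoint that replies with Set-Cookie
form.target = 'cookieSink';                                              // response loads inside the hidden iframe

var field = document.createElement('input');
field.type = 'hidden';
field.name = 'token';
field.value = 'some-identifier';
form.appendChild(field);

document.body.appendChild(form);
form.submit();  // old Safari treated the POST as user-initiated and accepted the cookie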
This Safari hack of writing 3rd-party cookies by posting has since been fixed. (Btw, Google also received a hefty fine from the FTC for exploiting this "hack": http://www.theverge.com/2012/7/31/3207388/fcc-approval-google-fine-safari-cookies)
In any case, the reason they use iframes is that the preferred method for storing data associated with the 3rd-party domain is no longer a 3rd-party cookie, but localStorage instead. To access the localStorage of the 3rd-party domain, the JavaScript code has to be running inside a document from that domain, which is why an iframe works and a script loaded on the 1st-party domain does not.
The benefit of localStorage over the cookie is that it isn't blocked even when the user requests blocking of 3rd-party cookies. See for example this thread from Firefox development:
https://bugzilla.mozilla.org/show_bug.cgi?id=536509 or this article running through the code itself: http://log.scalemotion.com/2012/10/how-to-trick-safari-and-set-3rd-party.html
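A minimal sketch of the idea, assuming a made-up message protocol between the publisher page and the tracker's iframe (nothing here is from a real ad network):
// Runs inside the 3rd-party iframe (a document served from TotalNotTrackingYou.com),
// so the localStorage below belongs to that domain, not to the publisher.
window.addEventListener('message', function (event) {
    if (!event.data || event.data.type !== 'getVisitorId') return;
    var id = localStorage.getItem('visitorId');
    if (!id) {
        id = Date.now() + '-' + Math.random().toString(16).slice(2);
        localStorage.setItem('visitorId', id);
    }
    // Reply to the embedding page with the persistent identifier.
    event.source.postMessage({ type: 'visitorId', id: id }, event.origin);
});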
Related
I've been working on a requirement that involves a website fetching/manipulating data stored on a different domain. There didn't seem to be a way except enabling CORS on the other server to allow me to get and modify data from a different domain. However, that caused some issues with Office 365 apps, and I had to take a different approach.
The approach is to use postMessage to talk to a hidden iframe on the page that is running on the target domain (not a good approach, but I was pressed to use it). The source page posts a message with information about the REST call to the hidden iframe, which makes the request on behalf of the parent page and uses postMessage to return the results.
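Stripped down, the approach looks roughly like this; the origins, element id and message shape are illustrative, not the actual code:
// --- In the parent page ---
var proxy = document.getElementById('apiProxyFrame');   // hidden iframe served from the target domain
proxy.contentWindow.postMessage(
    { id: 1, method: 'GET', url: '/_api/lists' },
    'https://target.example.com'
);
window.addEventListener('message', function (event) {
    if (event.origin !== 'https://target.example.com') return;
    console.log('REST result for request', event.data.id, event.data.body);
});

// --- Inside the hidden iframe (served from target.example.com) ---
window.addEventListener('message', function (event) {
    if (event.origin !== 'https://parent.example.com') return;
    var req = event.data;
    var xhr = new XMLHttpRequest();
    xhr.open(req.method, req.url);   // same-origin from the iframe's point of view
    xhr.onload = function () {
        event.source.postMessage({ id: req.id, body: xhr.responseText }, event.origin);
    };
    xhr.send();
});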
Everything works fine except when the website is used on an iPhone. It turned out that placing alert calls in the script running inside the target iframe makes it work, but removing the alert calls effectively stops the target iframe from making those cross-origin network calls.
My theory is that this is down to mobile Safari's security: in order to make cross-origin calls from an iframe running on a different domain, the user needs to give consent by interacting at least once with the embedded iframe. Does that sound correct?
The comment by diodeus-james-macfarlane was the closest we could get, but with the iframe being hidden there was no way to place a control for the user to interact with, even if that were only to make it work.
To my surprise, turning off a setting on the SharePoint site made it work. The setting was around mobile view compatibility; with it turned off, the iframe is able to make HTTP requests and to send and receive messages to and from the parent webpage.
So recently I have been tasked with making standard hyperlinks on a website open pages on an intranet site. This works in Chrome in the same environment but not in IE 8, which is the client's currently supported browser.
The issue is that in IE the setting "Websites in less privileged web content zone can navigate into this zone" is disabled by policy settings, so if you attempt to open a link from the site to an intranet site, in IE 8 you get an Access Denied error.
Now obviously given this is disabled and can't be enabled in the short term (if at all) and given that providing them a link to copy is not an agreeable solution, I have been asked to see what is possible.
One url points to the sitemap of the intranet site. I also do not have access to the intranet site code.
Things I have tried:
Using JavaScript to open a window - Access Denied
Using JavaScript to open a blank window and injecting JavaScript to update the window.location; this also resulted in an Access Denied because the new window appears to be on the original domain (rather than blank).
IIS Reverse proxy, clicking on the link appears to host the intranet site from within the current site. This worked well but two features failed to work on the intranet site due to internal redirects, so wasn't feasible.
Performing a redirect from a mapped internal link to the required intranet link at IIS and .Net Controller levels, both of these fail as redirects are disabled on the client machine.
IFrame, eurgh - Access Denied
So my question is, given that the feature is disabled by IE, can anyone think of a way around this?
Thanks
Just in case anyone finds this useful: I managed to resolve this using the IIS reverse proxy. There was an issue with some internal redirects not being handled, which sat under a different sub-folder from the rest of the site. But the main issue in getting this to work was that calls to .aspx files on the external site were actually being caught by the internal site's page handler, which meant that the rewrite rules for certain calls were handled as pages internally and returned a 404 page.
As our internal site uses Razor under MVC, the page handler isn't used, so I was luckily able to remove the page handlers within the web.config. If that hadn't been the case, I would have had to reorder the handlers so that the redirect rules handled the call before the page handler did.
I'm currently working on a content generator and I have objects which allow users to add custom scripts to the page.
I'm concerned about the preview of my plugin. Pages cannot be saved in the preview, but can the user mess with my preview page permanently if I allow him to use dynamically added javascript?
I'd also like to mention that the JavaScript is sent via AJAX to a PHP file, then appended to the body.
Pages cannot be saved in the preview, but can the user mess with my preview page permanently if I allow him to use dynamically added javascript?
Not permanently, no. He can only mess up his own current page.
If the custom scripts and pages don't leave the client's computer, or you can make sure they will not be served to other people (which implies they're not stored on the server) then you're safe from XSS attacks.
However, notice that as soon as your plugin leaves "preview" and you allow saving pages that are shown to other visitors, you will have that problem.
Yes, this is a big attack vector known as Cross Site Scripting (XSS). You should never run JavaScript provided by your users on arbitrary pages, unless you absolutely must.
For instance, I could add:
document.body.style.display = 'none';
and that would hide the entire page.
Although your script is only displayed to the current user, your page may be vulnerable to a Cross-Site Scripting attack. The way to handle it in this case (as you are allowing scripts) is to use a mechanism similar to Cross-Site Request Forgery prevention (although CSRF and XSS are completely different).
e.g. if your page https://www.example.com/preview displays all content (HTML and script) POSTed to it (for this example, assume the POST parameter is called content), an attacker may include the following code on their page and then entice the victim to visit it whilst logged into your website.
On www.evil.com:-
<form method="post" action="https://www.example.com/preview">
<input type="hidden" name="content" value="<script>alert('foo');</script>" />
</form>
and this form could be submitted automatically via JavaScript (document.forms[0].submit()).
This will cause the script in content to be executed in the context of your site, possibly passing cookie values of the user's session to www.evil.com rather than my benign example of an alert box.
However, as you are POSTing the content value to your own site using AJAX, you can prevent this attack by checking that the X-Requested-With request header is set to XMLHttpRequest. This header cannot be sent in a cross-domain POST (without your server agreeing to it via CORS).
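A rough sketch of the client side (the /preview endpoint is yours; the rest is illustrative, and jQuery's $.ajax sets this header for you automatically):
var userScript = "alert('preview');";   // stands in for the user-entered content
var xhr = new XMLHttpRequest();
xhr.open('POST', '/preview');
xhr.setRequestHeader('Content-Type', 'application/x-www-form-urlencoded');
xhr.setRequestHeader('X-Requested-With', 'XMLHttpRequest');   // a plain cross-site <form> post cannot carry this header
xhr.onload = function () {
    document.getElementById('previewArea').innerHTML = xhr.responseText;   // hypothetical preview container
};
xhr.send('content=' + encodeURIComponent(userScript));
// On the server, reject any POST to /preview whose X-Requested-With header is missing
// or not equal to "XMLHttpRequest".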
Also, if your page is for a preview - what is the preview for if your preview cannot be saved? If this is related to your full save functionality, then it is possible to allow a user to save scripts safely by running the entered content within a sandbox.
I've recently stumbled upon a website called Overlay101 which allows you to create tours for other websites.
I was very interested to see the technique they use to load the third party websites for editing.
When you type the address of the website, it is loaded as a sub domain of the overlay101.com website.
For example, if I type https://stackoverflow.com/questions/111102/how-do-javascript-closures-work - it is loaded as http://stackoverflow.com.www.overlay101.com/questions/111102/how-do-javascript-closures-work
I was wondering how that subdomain creation is achieved, and I saw in the source code of the page that JavaScript is injected - I was wondering how that is possible too.
What intrigued me most is that Stackoverflow.com does not allow pages to be loaded within frames - I was wondering how they managed to load up the page so that tour popups could be added.
They simply use wildcard DNS entries to make all subdomains work. They then use the Host header to get the original domain name and download the HTML code of the site. Since they do this on the server side they do not need any frames etc.
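Their actual stack isn't visible, but the idea can be sketched in a few lines of Node.js; the port, the injection point and the script name are all assumptions, and redirects and error handling are ignored:
const http = require('http');
const https = require('https');

http.createServer(function (req, res) {
    // e.g. Host: stackoverflow.com.www.overlay101.com -> original domain stackoverflow.com
    const host = req.headers.host || '';
    const target = host.replace(/\.www\.overlay101\.com$/, '');

    https.get({ host: target, path: req.url, headers: { 'User-Agent': 'overlay-proxy' } }, function (upstream) {
        let html = '';
        upstream.on('data', function (chunk) { html += chunk; });
        upstream.on('end', function () {
            // Inject the tour/editor script before serving the page from our own domain,
            // so the original site's frame restrictions never come into play.
            html = html.replace('</body>', '<script src="/overlay.js"></script></body>');
            res.writeHead(200, { 'Content-Type': 'text/html' });
            res.end(html);
        });
    }).on('error', function () {
        res.writeHead(502);
        res.end('upstream error');
    });
}).listen(80);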
I'm having a few problems with an application that integrates SharePoint, SQL Reporting Services and a bunch of custom forms built using ASP.NET MVC.
Assuming my servers are as follows;
MOSS
SSRS
Custom forms
In MOSS, my portal needs on occasion to pop up a custom form to capture user input. I've done this using a jQuery dialog (using Boxy), which iframes the custom form in and passes the URL of the portal into it. When the custom form is finished, it navigates the parent window (the MOSS portal) to the URL passed in, which effectively refreshes the page.
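The navigation itself is just the child frame reading the URL it was given and pointing the parent at it, roughly like this (the parameter name is illustrative):
// Inside the custom form loaded in the Boxy iframe.
function finishAndRefreshPortal() {
    var match = /[?&]returnUrl=([^&]+)/.exec(window.location.search);
    if (match) {
        // Navigating the parent window sends the MOSS portal back to the URL that was passed in.
        window.parent.location.href = decodeURIComponent(match[1]);
    }
}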
This was working fine until we threw in the complexity of SSRS.
Now in MOSS, I have a report that lists some data, but the SSRS Report Viewer web part seems to iframe its report content in, which means the hyperlinks from the report can't ask the parent to overlay the same dialogs (as it's cross-domain), and if it were to perform the overlay itself, it would just overlay the iframe.
Sorry for the long post; getting to the point - this is an internal intranet application only. Is it possible to allow cross-domain scripting somehow, so that the popup dialogs can all be controlled from JavaScript within the SharePoint portal, and SSRS and my custom forms can just invoke JavaScript methods on the parent?
Preferably I wouldn't want to have to do configuration in the client browser to allow this to happen, as I'd have to roll that change out to all the machines within the estate - which is a significant number.
Thanks in advance, beer available to anyone who can solve my woes ;)
Cheers,
Tony
IE8, Firefox 3, recent Opera and Safari/Chrome support postMessage which allows cooperating pages on different domains to talk to each other:
http://ajaxian.com/archives/cross-window-messaging-with-html-5-postmessage
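A minimal sketch of the two halves (the origins and message format are placeholders; note that IE8 needs window.attachEvent('onmessage', ...) instead of addEventListener, and only passes strings):
// In the embedded page (e.g. the custom form or the report frame):
window.parent.postMessage(
    JSON.stringify({ action: 'openDialog', url: '/forms/capture' }),
    'https://moss.portal.local'       // only deliver to the expected parent origin
);

// In the parent portal page:
window.addEventListener('message', function (event) {
    if (event.origin !== 'https://forms.server.local') return;   // only trust known children
    var msg = JSON.parse(event.data);
    if (msg.action === 'openDialog') {
        // open the Boxy/jQuery dialog here
    }
}, false);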
If you are stuck with older browsers, you have a few options. The cleanest is to route everything that needs to communicate through the same proxy, although in the OP's situation it looks like this isn't possible.
The next cleanest is to use Flash's cross-domain facility.
Another option is xssinterface, which wraps postMessage where available and uses some voodoo involving cookies and polling where it isn't.
The only other option is to use hidden iframes - to send a message to a page, change the iframe's location to one on the destination page's domain and poll in the destination page - but again I think the proxying in the OP's case makes this unworkable.
There is another option in addition to those Andrew provides. You can dynamically inject script tags into the DOM, wherein the src attribute can point to a javascript file on any domain.
In jQuery you accomplish this by specifying "jsonp" as the dataType for the AJAX request. You can read more about this approach here:
http://blog.ropardo.ro/2009/09/23/cross-domain-ajax-calls/
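For example (the URL is a placeholder):
$.ajax({
    url: 'https://other-domain.example.com/api/data',
    dataType: 'jsonp',   // jQuery injects a <script> tag and wires up the callback for you
    success: function (data) {
        console.log('cross-domain payload', data);
    }
});
// The server must reply with JavaScript that wraps the JSON in the callback name
// it receives as a query parameter, e.g. jQuery123456({"items": []});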
I finally got around these issues by using hidden iframes as suggested. I posted an article on my blog with more details and pushed the code onto codeplex:
http://www.deepcode.co.uk/2009/11/overcoming-cross-domain-issues-between.html