IE8 - IE10 cross domain JSONP cookie headache - javascript

Due to decisions that are completely outside of my control, I am in the following situation:
I have a product listing on catalog.org
Clicking the "Add to Cart" button on a product makes an AJAX JSONP request to secure.com/product/add/[productKey], which saves the cart record to the database, sets a cookie with the cart ID, and returns a true response (or false if it failed)
Back on catalog.org, if the response is true, another AJAX JSONP request is made to secure.com/cart/info, which reads the cart ID cookie, fetches the record, and returns the number of items in the cart
Back on catalog.org once again, the response is read and an element on the page is updated showing the number of items in the cart (if any)
At this point, clicking the "Go to Cart" button on catalog.org displays the cart summary on secure.com
This works beautifully in Firefox 17, Chrome 32 and IE 11. It also works in IE8 - IE10 on our development and test environments, where catalog.org is catalog.development.com and catalog.test.com and secure.com is secure.development.com and secure.test.com respectively.
However, after we deployed to production, this stopped working in IE8 - IE10. After adding a product to the cart, the number of items in the cart is updated successfully on catalog.org. Then, after clicking the "Go to Cart" button on catalog.org, the cart summary on secure.com shows nothing because it can't read the cookie. Going to Cache > "View cookie information" in the IE developer tools shows no cart ID cookie. It should be there, just like it is there in other browsers and in our development and test environments.
I believe what's happening is IE is blocking third party cookies. We have added a P3P compact policy header to all requests on secure.com, but the cookie is still not being set. The header we are setting is:
P3P: CP="CAO PSA OUR"
Why doesn't adding the compact policy header fix this in IE8 - IE10? How can I fix this to work in all versions of IE?
Solution
There are several good ideas posted below. I accepted sdecima's answer because it sounded the most promising. We ended up combining some of these ideas but managed to avoid XDomainRequest:
Clicking the "Add to Cart" button on a product makes an AJAX JSONP
request to secure.com/product/add/[productKey], which saves the cart
record to the database, sets a cookie with the cart ID, and returns a
true response (or false if it failed)
We changed the action at secure.com/product/add to return a JSON object with a boolean indicating success or failure and the cart ID.
Back on catalog.org, if the response is true, another AJAX JSONP request is made to secure.com/cart/info, which reads the cart ID cookie, fetches the record, and returns the number of items in the cart
We changed the callback function to check for both properties in the response object. If success is true and the cart ID is present, we create a hidden iframe on the page. The src attribute of the iframe is set to a new endpoint we added to secure.com. This action accepts a cart ID parameter and saves the cart ID cookie. We no longer need to save the cookie in the secure.com/product/add action.
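Here is a rough sketch of what that callback on catalog.org looks like. The endpoint name /cart/set-cookie, the response property names and the updateCartCount helper are placeholders for illustration, not the exact names we used:

// JSONP callback for secure.com/product/add/[productKey]
function onAddToCart(response) {
    if (!response.success || !response.cartId) {
        return; // the add failed; nothing to persist
    }

    // The hidden iframe points at a new secure.com endpoint that saves the
    // cart ID cookie in its own response (the P3P header is still required for IE).
    var iframe = document.createElement('iframe');
    iframe.style.display = 'none';
    iframe.src = 'https://secure.com/cart/set-cookie?cartId=' +
        encodeURIComponent(response.cartId);
    document.body.appendChild(iframe);

    // We also keep the cart ID around in script, because we cannot tell
    // from catalog.org when the iframe has finished loading.
    updateCartCount(response.cartId); // hypothetical UI helper
}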
Next, we changed the action at secure.com/cart/info to accept a cart ID parameter. This action will use the cart ID parameter if present to fetch the cart information, otherwise it will still attempt to read the cookie. This extra check would be unnecessary if we could guarantee that the iframe had finished loading and the cookie had been saved on secure.com, but we have no way of knowing when the iframe has finished loading on catalog.org due to browser security restrictions.
Finally, the P3P header CP="CAO PSA OUR" is still required for this to work in IE7 - IE10. (Yes, this works in IE7 now too :)
We now have a solution (albeit an incredibly complex one) for saving and accessing cross-domain cookies that works in all major browsers, at least as far back as we can reliably test.
We will probably refactor this some more. For one thing, the second AJAX JSONP request to secure.com/cart/info is redundant at this point, since we can return all the information we need from the original secure.com/product/add action (a side benefit of changing that action to return a JSON object; plus we can return an error message indicating exactly why it failed if there was an error).

In short
Cookies will NOT go through a cross-origin request on IE 8 and 9. It should work on IE 10 and 11 though.
IE 8 and 9
IE8/9 only partially support CORS: XMLHttpRequest cannot make cross-origin requests, so they have to go through the XDomainRequest object, which does NOT send cookies with the request.
You can read more about this on the following official MSDN Blog post:
http://blogs.msdn.com/b/ieinternals/archive/2010/05/13/xdomainrequest-restrictions-limitations-and-workarounds.aspx
Particularly this part:
5. No authentication or cookies will be sent with the request
In order to prevent misuse of the user's ambient authority (e.g. cookies, HTTP credentials, client certificates, etc.), the request will be stripped of cookies and credentials and will ignore any authentication challenges or Set-Cookie directives in the HTTP response. XDomainRequests will not be sent on previously-authenticated connections, because some Windows authentication protocols (e.g. NTLM/Kerberos) are per-connection-based rather than per-request-based.
IE 10+
Starting with IE10, full CORS support was added to XMLHttpRequest, and it should work fine with a correct Access-Control-Allow-Origin header on the response from the server that wishes to set the cookie on the browser. Note that for cookies to flow, the request must be made with withCredentials set to true, the response must include Access-Control-Allow-Credentials: true, and the allowed origin cannot be the * wildcard.
More about this here:
http://blogs.msdn.com/b/ie/archive/2012/02/09/cors-for-xhr-in-ie10.aspx
And here:
http://www.html5rocks.com/en/tutorials/cors/
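As a minimal sketch of what that looks like in IE10+ (and other modern browsers); the URL is a placeholder and the itemCount property is assumed:

var xhr = new XMLHttpRequest();
xhr.open('GET', 'https://secure.com/cart/info');
xhr.withCredentials = true; // send and accept cookies on this cross-origin request
xhr.onload = function () {
    // the server must reply with Access-Control-Allow-Origin set to the exact
    // calling origin (not *) and Access-Control-Allow-Credentials: true
    var cart = JSON.parse(xhr.responseText);
    console.log('Items in cart:', cart.itemCount); // property name assumed
};
xhr.send();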
Workarounds on IE 8 and 9
The only way to work around this on IE8/9 is, quoting the same MSDN post as above:
Sites that wish to perform authentication of the user for cross-origin requests can use explicit methods (e.g. tokens in the POST body or URL) to pass this authentication information without risking the user's ambient authority.
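Applied to this question, that means passing the cart ID explicitly instead of relying on a cookie. A sketch of the IE8/9 branch, assuming the cart ID is kept somewhere first-party on catalog.org and that secure.com/cart/info accepts it as a parameter:

if (window.XDomainRequest) { // IE8/9 only
    var xdr = new XDomainRequest();
    // XDomainRequest strips cookies, so the cart ID travels in the URL instead;
    // the scheme must match the calling page's scheme
    xdr.open('GET', 'http://secure.com/cart/info?cartId=' + encodeURIComponent(cartId));
    xdr.onload = function () {
        var cart = JSON.parse(xdr.responseText);
        // update the cart badge on catalog.org here
    };
    xdr.send();
}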

Bottom line: third party cookies are commonly blocked by privacy/advertisement blocking extensions and should be considered unreliable. You'll be shooting yourself in the foot leaving it in production.
The syntax suggests that the endpoint has ambitions to one day become RESTful. The only problem with that is using cookies, which throws the whole "stateless" concept out of the window! Ideally, changes should be made to the API. If you are not integrating with a third party (i.e. "secure.com" is operated by your company) this is absolutely the correct way to deal with the issue.
Move the cartId out of the secure.com cookie into its querystring:
secure.com/product/add/9876?cartId=1234 //should be a POST
Where to get a valid cartId value? We can persist it in some secure-com-cart-id cookie set for the catalog domain, which avoids any cross-domain issues. Check that value and, if present, append it to every secure.com request as above:
// needs the jQuery.cookie plugin
$.post('secure.com/product/add/9876', {
    cartId: $.cookie('secure-com-cart-id')
});
If you don't have a valid cartId, treat it as a new user and make the request without the parameter. Your API should then assign a new id and return it in the response. The "local" secure-com-cart-id cookie can then be updated. Rinse and repeat.
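A sketch of that flow, assuming the API returns the assigned id in a cartId property (the property name and the jQuery.cookie plugin are assumptions):

var cartId = $.cookie('secure-com-cart-id');
var data = cartId ? { cartId: cartId } : {}; // new users simply omit the parameter

$.post('secure.com/product/add/9876', data, function (response) {
    if (response.cartId) {
        // remember the id the API assigned, scoped to the catalog domain
        $.cookie('secure-com-cart-id', response.cartId, { expires: 7 });
    }
});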
Voila, you've just persisted an active user cart without polluting API calls with cookies. Go yell at your architect. If you can't do that (changing the API syntax or yelling), you'll have to set up a tunnel to the secure.com endpoint so that there is no cross-domain request: basically something sitting at catalog.org/secure-com-endpoint which channels the requests to secure.com verbatim. It's a workaround specifically to avoid making changes to the API; just don't do it in application code, and have proper Apache/IIS/F5 rules set up to handle it instead. A quick search comes up with several explanations, this one looks pretty good to me.
P.S.: this is a classic XY problem in my opinion. The solution isn't necessarily about persisting 3rd party cookies but about passing necessary parameters to a 3rd party while persisting the data somewhere.

Although a correct solution would be a change of architecture, if you're looking for a quick, temporary solution:
JSONP responses are actually just JavaScript, so you could add a line of code that sets the cookie to the front of your JSONP.
E.g. instead of:
callback({"exampleKey": "exampleValue"});
Your JSONP could look like:
document.cookie="cartID=1234";
callback({"exampleKey": "exampleValue"});

If you control the DNS records, create new entries so both servers sit under the same parent domain (e.g. catalog.example.com and secure.example.com); the cookie can then be scoped to that shared parent domain.

Is there one database serving catalog.org and secure.com, or can they communicate?
If so, then you've got it.
When catalog.org serves a cookie, save it in the db.
When secure.com serves a cookie, save it in the db.
Then you can determine whose cart belongs to which user.
This is a fun problem to consider...
Update 2:
When a user goes to catalog.org:
check if he has a cat_cookie_id cookie; if not, then:
in catalog.org:
create a key/value pair and save it in the db {cat_cookie_id, unique_number}
set cat_cookie_id in the browser
instruct the browser to ajax visit secure.com/register/unique_number
in secure.com:
read unique_number from the url
create a secure_cookie_id
save it in the db {cat_cookie_id, unique_number, secure_cookie_id}
delete unique_number, as this is a one-time use key
Now the db can map cat_cookie_id to secure_cookie_id and vice versa.
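A sketch of the catalog.org side of that handshake, assuming a jQuery setup and hypothetical /session/start and /register endpoints that follow the steps above:

$(function () {
    if (!$.cookie('cat_cookie_id')) { // needs the jQuery.cookie plugin
        // ask catalog.org for a fresh {cat_cookie_id, unique_number} pair
        $.getJSON('/session/start', function (session) {
            $.cookie('cat_cookie_id', session.catCookieId);
            // register the one-time key with secure.com (JSONP, since it is cross-domain)
            $.ajax({
                url: 'http://secure.com/register/' + session.uniqueNumber,
                dataType: 'jsonp'
            });
        });
    }
});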

Related

HTML form seems to be submitting *both* POST and GET?

This is not a duplicate of questions such as this, but rather the opposite: I have a form that I'm submitting via jQuery
$('<form>', {
    action: 'service',
    method: 'post',
    target: '_blank'
}).append(
    $('<input>', {
        type: 'hidden',
        name: 'payload',
        value: JSON.stringify(payload)
    })
).appendTo('body').submit().remove();
This is done so that the HTML response opens in a different page (hence target: '_blank').
Since I need to submit quite a lot of complex information, what I actually do is serialize them all into a big JSON string, then create a form with only one field ("payload") and submit that.
The receiving end has a filter that goes like this:
if the method is POST,
and there is only one submitted variable,
and the name of that one variable is "payload",
then JSON-decode its value and use it to create fake GET data.
So when the GET data grows too much I can switch methods without modifying the actual script, which notices no changes at all.
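Purely for illustration, such a filter could look like the following Express-style middleware; the question's actual server stack is not specified, so every name here (including req.fakeGet) is hypothetical:

const express = require('express');
const app = express();
app.use(express.urlencoded({ extended: false })); // parse classic form bodies

app.use((req, res, next) => {
    const keys = Object.keys(req.body || {});
    if (req.method === 'POST' && keys.length === 1 && keys[0] === 'payload') {
        // expose the decoded payload to later handlers as fake GET data
        req.fakeGet = JSON.parse(req.body.payload);
    }
    next();
});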
It always worked until today.
What should happen
The server should receive a single POST submission, and open the appropriate response in a popup window.
What actually happens instead
The server does receive the correct POST submission...
...apparently ignores it...
...and immediately after that, the browser issues a GET with no parameters, and it is the result of that parameterless GET that gets (pardon the pun) displayed in the popup window.
Quite unsurprisingly, this is always a "You did not submit any parameters" error. Duh.
What I already did
verified that this method works, and has always worked for the last couple of years with different forms and different service endpoints
tried replacing the form with a hardcoded <FORM> in HTML, without any jQuery whatsoever. Same results. So, this is not a jQuery problem.
tried with different browsers (it would not have helped if it only worked on some browsers: I need to support most modern browsers. However, I checked. Luckily, this failure reproduces in all of them, even on iPhones).
tried sending very little data (just "{ test: 0 }").
tried halting the endpoint script as soon as it receives anything.
checked Stack Overflow. I found what seems to be the same problem, in various flavours, but it's of little comfort. This one has an interesting gotcha but no, it does not help.
checked firewalls, proxies, adblockers and plugins (I'm now using plain vanilla Firefox).
called the IT guys and asked pointed questions about recent SVN commits. There were none.
What I did not yet do
Check the HTTPS conversation at low level (I don't have sufficient access).
Compare the configuration, step by step, of a server where this works and the new server where it does not.
Quite clearly, put my thinking hat on. There must be something obvious that I'm missing and I'm setting myself up for a sizeable facepalm.
Use a tool like hurl.it or Postman to manually send a request to the server. The tools will nicely display the response from the server including all HTTP headers. I suspect the server responds with a redirect (Status code 30X) which leads to a GET request being issued after the POST completes.
Update: HTTP redirects
HTTP redirects do not necessarily use the same HTTP method or even the same data to issue a request to the redirect target. Especially for non-idempotent requests this could be a security issue (you don't generally want your form submission to be automatically re-submitted to another address). However, HTTP gives you both options:
[...] For this reason, HTTP/1.1 (RFC 2616) added the new status codes 303 and 307 [...], with 303 mandating the change of request type to GET, and 307 preserving the request type as originally sent. Despite the greater clarity provided by this disambiguation, the 302 code is still employed in web frameworks to preserve compatibility with browsers that do not implement the HTTP/1.1 specification.
[from Wikipedia: HTTP 302]
Also for 301s:
If the 301 status code is received in response to a request of any type other than GET or HEAD, the client must ask the user before redirecting.
[from Wikipedia: HTTP 301]
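Purely as an illustration of that difference (assuming an Express-style endpoint; the question's stack is not specified), the status code decides what the browser does with the original POST:

const express = require('express');
const app = express();

app.post('/service', (req, res) => {
    // 303 tells the browser to follow up with a parameterless GET,
    // which is exactly the symptom described above
    res.redirect(303, '/somewhere-else');

    // 307 would instead replay the POST, body and all, against the target:
    // res.redirect(307, '/somewhere-else');
});

app.listen(3000);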

How to properly cache AJAX

I'm doing some research on how to properly cache AJAX responses, since that speeds up a page with lots of AJAX requests. I found this piece on the Yahoo website:
Let's look at an example. A Web 2.0 email client might use Ajax to download the user's address book for autocompletion. If the user hasn't modified her address book since the last time she used the email web app, the previous address book response could be read from cache if that Ajax response was made cacheable with a future Expires or Cache-Control header. The browser must be informed when to use a previously cached address book response versus requesting a new one. This could be done by adding a timestamp to the address book Ajax URL indicating the last time the user modified her address book, for example, &t=1190241612. If the address book hasn't been modified since the last download, the timestamp will be the same and the address book will be read from the browser's cache eliminating an extra HTTP roundtrip. If the user has modified her address book, the timestamp ensures the new URL doesn't match the cached response, and the browser will request the updated address book entries.
This makes it only less clear. The reason I want to know all this, is that I'm building a simple webpage where users can add shortcuts to websites. They see a grid of icons and can click on or search for the website they need. This is only meant as a project to get to know PHP and most importantly AJAX a lot better; nothing that actual users will ever see.
As you can imagine, the search function slows the website down a lot. Especially since it's performing an AJAX request after every typed letter. Therefore, I think it would greatly improve the website if some parts of this would be cached.
You have to make up your mind first:
Do you want a cached response used, or do you want the server to create a fresh one each time?
It isn't clear to me what you are asking exactly. What do you want to be cached, and how long?
If you want a cached response, simply use a URL you used before (in combination with an Expires header).
For example:
http://www.example.com/myAjaxHelper.php?q=ab
If the server responds with an Expires header that says "one week from now", you will get the cached response for a week. Your browser takes care of that.
(Remember: YOU can set the Expires value in the response headers.)
There are problems with browsers NOT respecting the Expires header, and sometimes they fetch the cached version instead of the new one even though it expired in the past. (Notably, older IE needed a few extra headers to fix that, and it is highly confusing.)
If you want a fresh response (and this can lead to slowness, because on each key-up you make a roundtrip to the server), make a NEW URL.
E.g.:
http://www.example.com/myAjaxHelper.php?q=ab&time=<?php echo time(); ?>
This will lead to a new URL each second, and that one will never be cached, because the URL differs (only in the time=... part, but that is enough to force a new request).

Are cookies guaranteed to be ready on time for the redirected page?

I have the following scenario :
user goes to www.mysite.com/someProduct
this is rewritten internally to /products.php?p=someProduct
in products.php, I detect that this user is currently logged in, so I want to switch to his specific url but I still want to show the requested product, so I do:
set a "productRequest" cookie with "someProduct" in it
redirect with a location header to www.mysite.com/theUser
the new location is internally rewritten to /users.php?u=theUser
in users.php, I may access the "productRequest" cookie (although that's not a use case right now)
once the client gets served the user page, the javascript code will need to access the "productRequest" cookie to perform some Ajax calls and fetch "someProduct" info.
Note that I do not want to pass "someProduct" in the user specific URL.
Also note that I could keep the request for "someProduct" in the user's SESSION on the server side but since that info is ultimately destined to javascript on the client's side, I find it somewhat ugly.
Now here is my question: are there any guarantees (coming from the HTTP protocol, I guess) that the cookie will always be received by the client before the redirect, so that it is sent back with the newly requested page?
Thank you all!
Yes. As soon as you set that cookie (which is done with an HTTP response header) the browser will begin using it in subsequent requests and it will be available to any JavaScript trying to access it.
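For the client-side part, reading that cookie before the Ajax calls could look like the sketch below; the readCookie helper is generic and not from any particular library:

function readCookie(name) {
    var match = document.cookie.match(new RegExp('(?:^|; )' + name + '=([^;]*)'));
    return match ? decodeURIComponent(match[1]) : null;
}

var requestedProduct = readCookie('productRequest');
if (requestedProduct) {
    // fetch the "someProduct" info via Ajax here
}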

Disable browser cache

I implemented a REST service and I'm using a web page as the client.
My page has some javascript functions that perform the same http get request to the REST server several times and process the replies.
My problem is that the browser caches the first reply and doesn't actually send the following requests.
Is there some way to force the browser to execute all the requests without caching?
I'm using Internet Explorer 8.0
Thanks
Not sure if it can help you, but sometimes I add a random parameter to the URL of my request in order to avoid it being cached.
So instead of having:
http://my-server:8080/myApp/foo?bar=baz
I will use:
http://my-server:8080/myApp/foo?bar=baz&random=123456789
Of course, the value of the random parameter is different for every request. You can use the current time in milliseconds for that.
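A sketch of the same idea in the browser, using the current time in milliseconds (the URL is the example one from above):

var url = 'http://my-server:8080/myApp/foo?bar=baz&random=' + new Date().getTime();

var xhr = new XMLHttpRequest();
xhr.open('GET', url);
xhr.onreadystatechange = function () {
    if (xhr.readyState === 4 && xhr.status === 200) {
        // every call gets a unique URL, so the cache is skipped
        console.log(xhr.responseText);
    }
};
xhr.send();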
Not really. This is a known issue with IE; the classic solution is to append a random parameter to the end of the query string for every request. Most JS libraries do this natively if you ask them to (jQuery's cache: false AJAX option, for instance).
Well, of course you don't actually want to disable the browser cache entirely; correct caching is a key part of REST and the fact that it can (if properly followed by both client and server) allow for a high degree of caching while also giving fine control over the cache expiry and revalidation is one of the key advantages.
There is though an issue, as you have spotted, with subsequent GETs to the same URI from the same document (as in DOM document lifetime, reload the page and you'll get another go at that XMLHttpRequest request). Pretty much IE seems to treat it as it would a request for more than one copy of the same image or other related resource in a web page; it uses the cached version even if the entity isn't cacheable.
Firefox has the opposite problem, and will send a subsequent request even when caching information says that it shouldn't!
We could add a random or time-stamped bogus parameter at the end of a query string for each request. However, this is a bit like screaming "THIS IS SPARTA!" and kicking our hard-won download into a deep pit that no Health & Safety inspector considered putting a safety rail around. We obviously don't want to repeat a full unconditional request when we don't need to.
However, this behaviour has a time component. If we delay the subsequent request by a second, then IE will re-request when appropriate, while Firefox will honour the max-age and expires headers and not re-request needlessly.
Hence, if two requests could be within a second of each other (either we know they are called from the same function, or there's the chance of two events triggering it in close succession) using setTimeout to delay the second request by a second after the first has completed will make it use the cache correctly, rather than in the two different sorts of incorrect behaviour.
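A sketch of that one-second spacing; the /status URL and the handlers are placeholders:

function getStatus(callback) {
    var xhr = new XMLHttpRequest();
    xhr.open('GET', '/status');
    xhr.onreadystatechange = function () {
        if (xhr.readyState === 4) { callback(xhr.responseText); }
    };
    xhr.send();
}

getStatus(function (first) {
    // wait a second before the follow-up request so both IE and Firefox
    // apply their caching rules sensibly
    setTimeout(function () {
        getStatus(function (second) { /* use the fresh response */ });
    }, 1000);
});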
Of course, a second's delay is a second's delay. This could be a big deal or not, depending primarily on the size of the downloaded entity.
Another possibility is that something that changes so rapidly shouldn't be modelled as GETting the state of a resource at all, but as POSTing a request for a current status to a resource. This does smell heavily of abusing REST and POSTing what should really be a GET though.
Which can mean that on balance the THIS IS SPARTA approach of appending random stuff to query strings is the way to go. It depends, really.

JavaScript/Greasemonkey: Avoiding FireFox Security Warning when Submitting a Form from a Secure Page

I'm writing a Greasemonkey script to connect two company-internal webpages. One is SSL, and the other is insecure and can only be accessed via a POST request. If I create a hidden form on the secure page and submit it via an onclick() in an <a>, it works fine, but FF gives a warning:
Although this page is encrypted, the information you have entered is to be sent over an unencrypted connection and could easily be read by a third party.
Are you sure you want to continue sending this information?"
The insecure page can't be accessed via SSL and the other one can't be accessed w/o it, and I can't modify either server =\ Is there any way to avoid this warning by doing some kind of JavaScript/Greasemonkey redirect magic? Thanks!
EDIT: The warning can't be disabled (for rather good reasons, since it's hard to tell if what you're about to send is secure, otherwise). I'm mostly wondering if there's a way to POST in JavaScript without looking like you're submitting a form.
This may be possible by doing a GM_xmlhttpRequest. e.g.,
GM_xmlhttpRequest({
    method: 'POST',
    url: 'http://your.insecure.site.here',
    // data and Content-Type are only needed if you have form fields to send;
    // adjust them to whatever the insecure page expects
    data: 'field1=value1&field2=value2',
    headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
    onload: function (details) {
        // look in the JavaScript console
        GM_log(details.responseText);
        /* This function will be called when the page (url) has been loaded.
           Do whatever you need to do with the remote page here. */
    }
});
API/more info here: GM_xmlhttpRequest wiki
You could set up a new SSL site as a proxy that just passes data back to the insecure site. Or just have all your users turn off that particular security warning. (Sorry FF team, but that's not a terribly useful message to start with.)
That's a browser configuration setting, which can't (or shouldn't) be changeable by JavaScript.
Unless the script needs to be used by more than one user, go to Tools -> Options -> Security. You can click on Settings to choose which warning messages are displayed. Note that this currently affects all sites rather than just your internal system.
