Does Chrome violate the standards in caching? - javascript

We noticed Chrome caches files locally and doesn't even send a request to our server to check if there's a newer version of the javascript file.
Example of HTTP response headers for a js file that Chrome cached:
Accept-Ranges:bytes
Access-Control-Allow-Headers:Content-Type
Access-Control-Allow-Methods:GET, POST, PUT, DELETE, OPTIONS
Access-Control-Allow-Origin:*
Content-Encoding:gzip
Content-Length:5479
Content-Type:application/javascript
Date:Tue, 12 Jan 2016 22:46:07 GMT
ETag:"7d68e1ceb647d11:0"
Last-Modified:Tue, 05 Jan 2016 12:44:25 GMT
Server:Microsoft-IIS/8.5
Vary:Accept-Encoding
x-robots-tag:noindex
Is it valid for Chrome to cache the file? There is no Cache-Control header or anything else declaring that the file can be cached locally; it only has ETag and Last-Modified.
BTW
Is there a way (maybe a header) to instruct Chrome to check whether the cached file has changed, without appending a version to the file name? Setting no-cache is not an option since I do want the file cached, but I want the ETag and Last-Modified headers to be used as intended.

Unless specifically constrained by a cache-control (section 14.9)
directive, a caching system MAY always store a successful response
(see section 13.8) as a cache entry, MAY return it without validation
if it is fresh, and MAY return it after successful validation.
You can always use the must-revalidate directive.
When the must-revalidate directive is present in a response received
by a cache, that cache MUST NOT use the entry after it becomes stale
to respond to a subsequent request without first revalidating it with
the origin server.
Source
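As a rough illustration (this sketch assumes a Node/Express static file server rather than the IIS 8.5 setup from the question), you could keep the file cacheable but force ETag/Last-Modified revalidation once it goes stale:
const express = require('express');
const app = express();

// Serve the JS files with ETag and Last-Modified, and tell the browser it may
// cache them for a day but must revalidate (If-None-Match / If-Modified-Since)
// once that day is up. A 304 response then lets it reuse the cached copy.
app.use('/js', express.static('public/js', {
  etag: true,
  lastModified: true,
  setHeaders: function (res) {
    res.setHeader('Cache-Control', 'max-age=86400, must-revalidate');
  }
}));

app.listen(3000);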

Related

How to completely prevent image caching for a resource served via HTTP

In a project, client-side in the browser, I dynamically create an <img> tag and set its source to an image. It is served by apache2 on the host.
The user can then make changes, and I sometimes need to reload the image (as the source on the server has changed). I do that by changing the src attribute to the new image.
The problem is, the old (first) image remains in the cache and no further changes are ever reflected.
I did of course try to prevent caching by the regular means:
I change the URL of the source image on every reload, by adding a parameter to the url and setting its value to the current time. I checked to confirm and yes, every load actually requests a different URL from the server, but the image is still served as a cached version.
I'm returning a variety of headers to prevent caching. Here is what the response headers look like:
Access-Control-Allow-Headers: origin, x-requested-with, content-type
Access-Control-Allow-Methods: PUT, GET, POST, DELETE, OPTIONS
Cache-control: no-cache, no-store, must-revalidate
Connection: Keep-Alive
Content-Length: 48503
Content-Type: image/png
Date: Wed, 05 Sep 2018 15:51:08 GMT
Expires: 0
Keep-Alive: timeout=5, max=100
Pragma: no-cache
Server: Apache/2.4.25 (Debian)
Set-Cookie: locale=de; Domain=c.test; Path=/; Expires=Mon, 04 Mar 2019 15:51:08 GMT
Set-Cookie: session_id=563bbb7d216d4edf7aed7e38427e15aec584414a605df6d2481223f840bf13f7; Domain=c.test; Path=/
A requested URL looks like this:
/event/590c713b5fd3197a0a16c851/reg/data/streamThumbnail?file=93c180702fd9926d40f77dd19ae48cee.crop.jpg&t=0478533001536162394&dimensions=130x181
Unfortunately I am out of options. I tried debugging this by loading the image src directly in a new tab, making changes on the server and then reloading, but the image remains the same, even though it doesn't exist on the server anymore at all.
Can anyone point me in the right direction here? Does anyone know what is going on or what I've been missing?
I'm sorry I cannot provide any testing outlet for this, so I guess this one's up to those who have encountered this problem themselves.
Thank you.
Tried this? It should work in .htaccess, httpd.conf, and in a VirtualHost block.
<filesMatch "\.(html|htm|js|css)$">
    FileETag None
    <ifModule mod_headers.c>
        Header unset ETag
        Header set Cache-Control "max-age=0, no-cache, no-store, must-revalidate"
        Header set Pragma "no-cache"
        Header set Expires "Wed, 11 Jan 1984 05:00:00 GMT"
    </ifModule>
</filesMatch>
And optionally add the extension of the template files you are retrieving, if they use an extension other than .html.
You can do this easily client-side by adding a URL query parameter, e.g.:
<img src="folder/my-image.jpg?cache=1">
Every time you want to refresh the image, increase the number in your cache parameter, e.g. make it ?cache=2. A lot of people use a date/time value in this field so the image is never served from cache.
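For example, a small sketch (the element id here is hypothetical) that bumps the parameter with a timestamp on every reload:
// Re-request the image under a unique URL so the browser cache is skipped.
function reloadImage(img) {
  var base = img.src.split('?')[0];        // drop any previous cache parameter
  img.src = base + '?cache=' + Date.now(); // unique value on every call
}

reloadImage(document.getElementById('preview')); // hypothetical element id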
EDIT:
Okay, sorry, just realized you tried this. Another option is to use data URIs. If you have the ability to read the image as source, you can use a data URI in the image tag as a base64-encoded string:
<img src="data:image/gif;base64,R0lGODdhnQAmAPcAAM3OzpKTlP759P3u4va3hvawe/Wpbr/AwPjGoPrYvvKSSPKQRfzo2P/+/frWu/KNQD5AQm5wcSwuMYWGiI+Qkvayf/3x5/OWT/OUS/vfyfnPrv/8+fa2hPKPQ/nOrP///zw+QF5gYjg6PfKPRPzr3jo8PvSdWvaueJGSlGlrbbi5uva0gtzc3aioqe3u7uPk5MDBwr2+v7m6uvrUtfOaVv769vSeXvrRsvz9/dLT1O7u71ZYWk9RVPe+k/707PKRR/WmavWrcfe7jKCiooSFh/rVuP/9++rq6/nNqve6inJzdVtcX/rUtykrLkVGSfHx8fvawvn5+pGSk9XV15ucnoqLjPWnbPvcxL6/wHR1d1hZXH+Bg8bHyFBSVNvb3IaHiYyNjm1ucISGh8PExf3+/rS2tn6AgWxub4KEhczNzdfY2LCxslJUViYoKjY4OoGChPX19Xx+fzk7PjAyNfn6+kNFRyUnKuTk5ZSVlj4/QiQmKUhKTKqrrE5QUpqbnPb39/zj0d3e3peZmvzp2ry9vlhaXGFiZImKjP7483p7fTs9QHV3eUxOUEpMTo2OkEdJS2RmaN7f4M7P0GJkZvKOQfrTteXm5sbHx/zhzZWWmPjIo7O0tTQ2ONna2oeIitDQ0aeoqfa1hPShYv39/icpLO3t7sDBwcrLy7a3uO/w8P37+oyOj/728dDS0jI0Nvf4+CstMJaYmayurra3t3R2d6ytrmJjZcrLzCgqLZmanFpcXcTExa+wsVxeYJ+gouLi4vvgzPnIpNTV1ePj5JWWlzEzNs7Oz4CBgkxNUG5wcvHy8ubn5/rOrvOYU/aud/jEnPrawP7w5v7z676+v9XW156foC8wM5+goeHi4iMmKHN1dkBCRTc5PKOkpXZ4evOUTP7278fIyC8xNCMlKLu8vWBiY8HCw+zs7JydnsTFxqKjpKysrVVXWLq6u8bFxfe4iPOXUK6vsH1/gCosL9vc3ImLjEZHSrKytKChoqmqq9DR0vrXvd/g4KqsrdLS0yZFySH5BAEAAP8ALAAAAACdACYAAAj/AD8IHEiwoMGDCBMqXMiwocOHECNKnEixosWLGDNq3Mixo8ePIEOKHClxAyKSKFOqJDhA1AkcMGPKnEmzps2bOHPq3MmzJ89REw08eIAkCo5RPpMqXcq06VKgEckM+DGUAUykTrNq3crVJlSHDciQGVVkKA04o9J2Xcu27c+HgESR+EAGB4GhJ9Kqdcu3b9uvCy1ceKAgg9hXNoYiwfEBq9/HkJkCVqhqxVBKHsaSGDF0UOTPoHtORigWB4KhD5LQwVHJLNrQsGPLHG1QbF0cCTg/MMCKTqihFT536lZKNs0okbCQw4EKVDfHXGkTJDMDnECYDJqZJQGHxlANUaC7/wU1TpL42Od2jFuPAxl7ttIFkmFAuIcFHHWfiBpaePOD4EfhFMIbTnUzjj7nhTZKIuutFwUPDibYVHxiJYEaJRyQAFMNlimmgSbh6XUTN5NIqNMo5JlnXEyKjKNHJ+H1wZ6JT5FGlyo3eIdaEIDUddpQwNBBhnx7zTSKCJDQmJOBCOb0hw5KpQLHTkeOw01aH0A4TngyvVLchAeJRdcHMDmwH2pAQEMHNCP08IptYo5CjhI7TCLPGEeJwAM55KgxCjaHKJECGjDIFMUmcaSwxS1HpRjTFN7EIxM5WnKyhTJRcDOOBK/EtM162LyCQovj2JNPTJoiE4sTesgRBxya6v/RhyGjyDhOgDiEs4Md48zhTXGf6qEMTGqsN8VRyvC6DVKjDeksmXXRkYFQqNnAhCWqMAbnKPWM88gW3jRihyQ4aFpqKbfMQ6IS7nmD1B8hjMOJFpqiMIqB5OKgTzFaKIPVJg02uMQoE6ynAlJTrKcFDmEEvJ4jMJkb8A7KyNHglVpyyQWvATcSxRfrkYMUBQ8PGcN6XzBbkJhkCIAJIjPRwYAzF94H55BH2OFJgKMMk5YIKUgCpTKHTHnUIePcMgoFdpxKpiMHOypJMTughRSs45CCQj7ujXNPseMoMfJ6oKSzXi9ccLGEi5GMYm4emXjTb5USkGNOrRHi8IiLh9T/osV6VNyyHhpI2bOeE2IVPA4XV01Hl1jAmFWBJhn4QMflJBDQAQEwsUxXDuOUMRZ03CQC1JB60YXNOCj84ateWKJYXhrWaPHaUQCPEwdSnaw3sHucjrJH1so0jMsTaenAKwVVjnMHUnQIJIKVUNl6lODjJElGKbw6gUMx4+yBA9jrqUFGI+MUYxSW8j1Lhgeoxd+MM89Ac84T9+Hn7JB/cNLHNKnYy5FMNyaklOIIy0gYBbgwDkIQRC8GyoUExiEIAYJhPajAiqaKMQoqhCwS60kGDvIwDjeg4IQoAB8kPjA9bhwFdtMTQWPwtqVR4GM9vmgMGbqwnlcsYj1HIJkc/1pEgVKsZxH4Yd+QsMFEsTjACvCInxQJAw4cMBEbN7uErSSgBTz8oVwEJAMcEgELhzmCHOPwB+xmSJ5x7CAE82gbVpA2jhhghYTF+EDOwpaL9cBgFJxwWINK1ELxaEoEWOnaUXyxnnxAhQ3r+YMK1nMPJ4zjCyBzQhkwCBjbjOGTpbkcHBgwg2esQBQKiN8XPzkG2wykActwRy3i0IQwDPB0DPJDJ5axjECM44wHAkrqZEexX8wjBLDzRMj0Ysk5NAYS47CGjLhhlEC6wREUwGY21zAKiyFyNi2Mia3CUw2y6QWS44ADHEjhxvVIQhLrgSQp0OLKIX3AHfhsX7RwcP85UQKiEs/AQQPw6Y79laYB8iEZNkp3OlykAHbLGAcFQOhI2KWFPMZISybGUQu95GM9b0jL6sbBg8bkDmX40cU4xGG11H3AYi6ciaZcmBZGRGga63noB0qBCyvBZBINEkEU6ECqcdDqcc+6Zz7JgICmOvWpCHhGU/GDz10k9QgIbUBYUDAONcghC7aRQApeKCeJkoEHfQjRKM6RFnylJQr24MQ5BFKKJowDF5mohZZ8kZZXWKNBfiJDP0I4jCeEQwt3q1JMY6LYmBRiPbnIwR/c8LB4rO2SjWrQFvCDhgaBQpgDsSdBxTLF0j5ASLtwxy7GRIY0NCEOpgjEKagwDy3/jCIFnCDHET6wCD0EIG34oCwFyCC4XkyDBfdQxO7Io48ZCk5sGhVkHf6gFwaNAxlHIRM6A0YKFjRWpj5Ny0cbNAo+CJITOhhLRNfDqNY2yBLsk49A8NlKMph2iqh1xxge9wFVgAIC2VhPE2iRig8Eog7jkGEqoLmeYnB1uGSQQQka1AgvyA5BevHGOGBwnVzYdT08wAbsjAG4tKg3BQHjxD1+5lNwhhcpnujprUaRj7+uxx5TON0oIMSJ1dRletgV0enmm1q6RPUZSE6ykpGc3/o6S6su6MQv6NAAE9MhENQQyCiU4YVhDHUZyrAnHVjgjmGQ1aKwe+EH/pCDdNwB/z8zTMsdlrG+gYyiFJIwRyTUisAj0KTPs4kCL/XyCjWYw8xYGpIOlpFeMS06FVdpnD1
3QWnb8LOfmO5n51K7WvmEpQFGCDVCHbfG2qAuzmiOnUVnI8w0q7kxNEm1kQSo41Sv8WauROrosussgmLjF1fkxxWHzUR+CJvT8jFCAwT6Sq1qVSy2XiNooz3MIjG21WmOXU1sfZNEa9nWSdUnUu3Ma7HsYgypVe25WcnudlNatXT59OPwM2q6KBuhtm5MnFEd7TjPmt+r3ra0rc3qAp7ZpfbUNZyIxCwxKZXS6SaoxCdeVf2GxQhkQOiQtCqQZxeQ2rEDeL5FdG2R0wjN/qmOda0PDrubKXzcKnscsYuNDWHbvOY4F/YVWRvvD2R11BqvMrX1jW2QY7vk0e42yklecNSxvJO4xrVaTufyqFs96mTKqs+37mxnm3zpRq82VlTNbYGDXTyJnvrQScMyveCa5/Vkmdzr6fWMa73e3wZ53sNOdjWD3Oxnbzqs+830krM20Quv+tzHLd+E293nWrcz3ydvUX5/vfBXIXyR0s4szaOd6mP6QEAAADs=">
I believe there's a size limit on some older versions of Internet Explorer.
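A sketch of that approach (assuming the image URL is same-origin and fetch/FileReader are available): download the bytes yourself and hand the <img> a data URI, so the element never consults the HTTP cache at all:
function setImageAsDataUri(img, url) {
  // 'no-store' skips the HTTP cache for this download entirely.
  fetch(url, { cache: 'no-store' })
    .then(function (response) { return response.blob(); })
    .then(function (blob) {
      var reader = new FileReader();
      reader.onload = function () {
        img.src = reader.result; // data:image/...;base64,...
      };
      reader.readAsDataURL(blob);
    });
}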
EDIT 2:
If you are truly looking for an HTTP header option, did you also try the "Expires" value? If I recall, you have to use a negative value, i.e. Expires: -1. There are also a few other cache controls you can play with, like Cache-Control: max-age=0 or Cache-Control: must-revalidate.
EDIT 3:
Ok, you did try the Expires as well, BUT you are using ZERO. You need to change this to -1. Full explanation here:
HTTP Expires header values "0" and "-1"

Ajax response data to vue resource served via cloudflare not being parsed from json in Chrome

I have a site deployed to Heroku with which I use Cloudflare as a proxy.
For some front-end behaviour I'm using Vue with Vue-resource to perform some ajax requests. The responses are in the form of JSON with a Content-Type:application/json header.
Strangely, my scripts are erroring because the responses are not being parsed into a JavaScript array/object by vue-resource, despite this being the expected behaviour. However, this problem only occurs when the site is accessed via the Cloudflare URL. If I access it from the Heroku URL (e.g. foo-bar-1234.herokuapp.com) everything works fine. It also works fine on my local dev environment (Vagrant, Nginx).
The problem also only occurs in Chrome. It works fine in Safari and Firefox regardless of whether it's Cloudflare or not.
I believe the problem must lie with something that CloudFlare is doing to the response from the Heroku server that is stopping Vue Resource in Chrome from properly parsing the response.
For reference here is the response from the CloudFlare-served request:
cache-control:no-cache
cf-ray:2d2f1980573d3476-LHR
content-encoding:gzip
content-type:application/json
date:Mon, 15 Aug 2016 19:37:10 GMT
server:cloudflare-nginx
set-cookie:sam_session=[redacted]; expires=Mon, 15-Aug-2016 21:37:10 GMT; Max-Age=7200; path=/; HttpOnly
status:200
via:1.1 vegur
x-ratelimit-limit:60
x-ratelimit-remaining:59
vs the heroku-served one:
Cache-Control:no-cache
Connection:keep-alive
Content-Type:application/json
Date:Mon, 15 Aug 2016 19:40:10 GMT
Server:Apache
Set-Cookie:sam_session=[redacted]; expires=Mon, 15-Aug-2016 21:40:10 GMT; Max-Age=7200; path=/; HttpOnly
Transfer-Encoding:chunked
Via:1.1 vegur
X-Ratelimit-Limit:60
X-Ratelimit-Remaining:58
The headers seem pretty similar so I can't think how the responses differ in a way that would cause this...
Any debug suggestions appreciated.
Update: The compiled javascript can be seen here (large file): https://github.com/samarkanddesign/samarkand-web/blob/master/public/js/admin.js
It's a large file so better to look at the repo: https://github.com/samarkanddesign/samarkand-web/tree/master/resources/assets/js
Update 2: example of a vue-resource ajax request executed in both situations:
CloudFlare
Direct from Heroku:
We can see in the Heroku example that the response data is resolved as an array of objects, whilst in the CloudFlare one it's a string.
I believe the problem lies with vue-resource and the fact that CloudFlare (SPDY protocol) uses lower-case response headers, whilst Apache on Heroku uses capitalised ones.
vue-resource incorrectly ignores lowercase headers in its interceptors, which results in it failing to parse the response body.
This is referenced in this bug issue: https://github.com/vuejs/vue-resource/issues/317
And a quick way to patch it is via a custom interceptor:
// Mirror the lowercase content-type header onto the capitalised key that
// vue-resource looks for, so the JSON body gets parsed.
Vue.http.interceptors.unshift(function (request, next) {
  next(function (response) {
    if (typeof response.headers['content-type'] !== 'undefined') {
      response.headers['Content-Type'] = response.headers['content-type'];
    }
  });
});

Forcing AJAX request to revalidate cache with server, without reloading completely

I have a web application that lets the browser cache AJAX request results for a long time. I have found out how to make a request that bypasses the cache entirely when probable modifications are detected. But I also want to let the user trigger a data refresh.
In this scenario, I'd like the browser to check with the server whether the cache is stale, but use it if it is not (that is, if the server responds with a 304 code). The goal is to save loading time because the data is huge.
The server includes the following headers in all responses:
Cache-Control: private, max-age=604800
Last-Modified: ... # actual last modification date
I managed to bust the cached object entirely in Chrome (not tested in other browsers yet) by using the following HTTP headers in the request:
Cache-Control: max-age=0
If-Last-Modified: Tue, 01 Jan 1970 01:00:00 +0100
The If-Last-Modified line is the one that really has an effect. Chrome seems to ignore the Cache-Control header in the request.
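For reference, a sketch of such a bypassing request with XMLHttpRequest (the endpoint here is hypothetical, and the standard request header is spelled If-Modified-Since):
var xhr = new XMLHttpRequest();
xhr.open('GET', '/huge-data.json', true); // hypothetical endpoint
// An epoch date makes any cached copy look outdated, so Chrome goes to the network.
xhr.setRequestHeader('Cache-Control', 'max-age=0');
xhr.setRequestHeader('If-Modified-Since', 'Tue, 01 Jan 1970 01:00:00 +0100');
xhr.onload = function () {
  console.log(JSON.parse(xhr.responseText));
};
xhr.send();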
I have also found that using Cache-Control: must-revalidate in the server response forces the browser to validate its cache with the server for each request.
But is there any way to revalidate for just one precise request, decided on the client-side?
Note that I'm not specially attached to doing this with HTTP headers, so any other method that I would not be aware of is welcome!
You can add a URL parameter whose value is based on the time, to bypass the cache for just that one request.
$.ajax({
  // A unique timestamp in the query string means the cached copy is never matched.
  url: "/questions?nocache=" + Date.now(),
  success: function (data) {
    console.log(data);
  }
});
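An alternative sketch (not part of the answer above): the Fetch API's 'no-cache' mode keeps the cached entry but always revalidates it with the server, so a 304 lets the browser reuse the cached body instead of re-downloading it:
fetch('/questions', { cache: 'no-cache' }) // conditional request; 304 reuses the cache
  .then(function (response) { return response.json(); })
  .then(function (data) { console.log(data); });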

Removing HTTP headers from an XMLHttpRequest

I am working on an ajax long polling type application, and I would like to minimize the amount of bandwidth I am using. One of the big costs right now are the client side HTTP headers. Once I have a connection established and a session id stored on the client, I don't really want to squander any more bandwidth transferring redundant http information (such as browser type, accept encodings, etc.). Over the course of many connections, this quickly adds up to a lot of data!
I would really like to just take my XMLHttpRequest and nuke all of the headers so that only the absolute minimum gets transmitted to the server. Is it possible to do this?
You have very little control over request headers, but you can still do a few things -
Reduce the size of the cookie. In general, you only want the session id, everything else can be eliminated and stored server side.
Minimize http referrer by keeping a short URL. The longer your page url, the more data will have to be sent via the http referrer. One trick is to store data in the fragment identifier (the portion of the url after the #). The fragment identifier is never sent to the server, so you save a few bytes over there.
Some request headers are only sent if you had previously set corresponding response headers. For example, you can indirectly control the If-None-Match and If-Modified-Since request headers by choosing whether your responses carry ETag and Last-Modified.
You may want to consider Web Sockets. Support is pretty good (IE10+).
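A rough sketch of the WebSocket route (the endpoint and message shape here are made up): after the one-time handshake, each message carries only a few bytes of framing instead of a full set of HTTP headers:
var socket = new WebSocket('wss://example.com/poll'); // hypothetical endpoint

socket.addEventListener('open', function () {
  // Send the session id once; no cookies or headers are repeated per message.
  socket.send(JSON.stringify({ sessionId: 'abc123' }));
});

socket.addEventListener('message', function (event) {
  console.log('server push:', event.data);
});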
You may be able to override some of the standard headers using setRequestHeader() before sending the request, but the browser may not allow overriding some of them, and there seems to be no way to get a list of headers (besides asking the server to echo them back to you) to know which ones to try to override.
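For what it's worth, a sketch of that approach (the endpoint is hypothetical, the savings are marginal, and many headers such as Cookie, Referer and Host cannot be touched at all):
var xhr = new XMLHttpRequest();
xhr.open('GET', '/poll?session=abc123', true); // hypothetical endpoint
// Setting Accept yourself replaces the browser's longer default Accept header.
xhr.setRequestHeader('Accept', '*/*');
xhr.send();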
I think it's possible to remove all headers, at least in some browsers.
Take a look at the communication between the Gmail/Calendar apps and Google's backend in Chrome (it's not the same in Firefox).
It's possible Google has some hidden API for the XMLHttpRequest object.
You'll see something like the output below (notice there is no request headers section):
Request URL:https://mail.google.com/mail/u/0/channel/bind?XXXXXXXXXXXXXX
Request Method:POST
Status Code:200 OK
Query String Parameters
OSID:XXXXXXXXXXXXX
OAID:XXXXXXXXX
VER:8
at:XXXXXXXXXXXXXX
it:30
SID:XXXXXXXXXXXX
RID:XXXXXXXXX
AID:XXXXXXXXXX
zx:XXXXXXXXXXXX
t:1
Request Payload
count=1&ofs=211&req0_type=cf&req0_focused=1&req0__sc=c
Response Headers
cache-control:no-cache, no-store, max-age=0, must-revalidate
content-encoding:gzip
content-type:text/plain; charset=utf-8
date:Tue, 09 Oct 2012 08:52:46 GMT
expires:Fri, 01 Jan 1990 00:00:00 GMT
pragma:no-cache
server:GSE
status:200 OK
version:HTTP/1.1
x-content-type-options:nosniff
x-xss-protection:1; mode=block

Can browsers react to Set-Cookie specified in headers in an XSS jquery.getJSON() request?

(Note: This is a follow up to my question Can jQuery.getJSON put a domain's cookies in the header of the request it makes? and covers the XSS case of Setting a cookie in an AJAX request?)
I've been told I'm unable to set cookies to be read by other domains that are not subdomains of the current domain using $.cookie(..., ..., {domain: ...}). But in a comment on a response to my last question, #zanlok said "The server's reply, however, can definitely set a cookie" and it got two upvotes.
So I thought I'd try using a service created for the explicit purpose of setting cookies, Freebase's "touch" API. The call looks like:
$.getJSON("http://api.sandbox-freebase.com/api/service/touch",
{}, // URL parameters
afterCookieIsSetCallback); // Callback function
Looking in Firebug at the response headers, it looks like this:
Date Wed, 24 Nov 2010 03:35:28 GMT
Server Apache
X-Metaweb-Cost [...]
Etag [...]
Expires Wed, 24 Nov 2010 03:35:29 GMT
Cache-Control no-store
Vary Accept-Encoding
Content-Encoding gzip
Set-Cookie mwLastWriteTime=1290569730|10325_9202a8c04000641f80000000199eff96|sandbox; expires=Thu, 25-Nov-2010 03:35:28 GMT; Path=/
Last-Modified Wed, 24 Nov 2010 03:35:28 GMT
Content-Length 134
Content-Type text/plain; charset=utf-8
X-Cache MISS from cache01.sandbox.sjc1.metaweb.com
Connection keep-alive
X-Metaweb-TID cache;cache01.sandbox.sjc1:8101;2010-11-24T03:35:28Z;0001
So there's definitely a Set-Cookie in there, and the script runs the response handler. Yet the cookie is not present in the request headers for later JSON requests this script makes to .sandbox-freebase.com.
(By contrast, simply typing the touch api URL into the address bar and loading it that way does set the cookie for future requests. That applies even in other tabs.)
This seems to be a deviation from a prior "expected behavior", because there was a toolkit published by MetaWeb circa "2007-2009" which seemed to think such an approach could work:
http://www.google.com/codesearch/p?hl=en#v099O4eZ5cA/trunk/src/freebase/api.js&q=touch%20package:http://mjt%5C.googlecode%5C.com&l=340
Without knowing much about it, I'm wondering if it was a recent change that Firefox adopted and then WebKit followed suit. Perhaps the one mentioned here:
http://trac.webkit.org/browser/trunk/WebCore/xml/XMLHttpRequest.cpp#L856
So is there any canonical documentation on this particular issue?
The AJAX call you are making is a request to a domain outside the domain of the top-level URL (the URL in the address bar). This makes the cookie a third-party cookie, and by default Internet Explorer won't persist a third-party cookie. That means the cookie will come back in the Set-Cookie header on the first response, but subsequent requests that you make to that server will not have the cookie sent with them.
Like you said, if you go directly to the URL in your browser it works. That is because in this case it's a first-party cookie.
In order for IE to accept third-party cookies, the server that is sending the Set-Cookie header on its response must also have a P3P policy header set.
Here is an example: when you navigate to CNN, you will notice one of the requests it makes is to the domain b.scorecardresearch.com. scorecardresearch is dropping a tracking cookie, but this cookie is considered a third-party cookie. So in order to make it work they also had to include a P3P header; see the headers below:
HTTP/1.1 200 OK
Content-Length: 43
Content-Type: image/gif
Date: Thu, 02 Dec 2010 19:57:16 GMT
Connection: keep-alive
Set-Cookie: UID=133a68a4-63.217.184.91-1288107038; expires=Sat, 01-Dec-2012 19:57:16 GMT; path=/; domain=.scorecardresearch.com
P3P: policyref="/w3c/p3p.xml", CP="NOI DSP COR NID OUR IND COM STA OTC"
Expires: Mon, 01 Jan 1990 00:00:00 GMT
Pragma: no-cache
Cache-Control: private, no-cache, no-cache=Set-Cookie, no-store, proxy-revalidate
Server: CS
If you were to copy this header and add it to the response, you would notice that the cookies start working:
P3P: policyref="/w3c/p3p.xml", CP="NOI DSP COR NID OUR IND COM STA OTC"
It's best that you craft a P3P header specific for your business, but the above should work for testing purposes.
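For testing, a minimal sketch of sending that header alongside the cookie (this assumes a Node/Express backend, which is not what the services above actually use):
const express = require('express');
const app = express();

app.get('/api/touch', function (req, res) {
  // The compact policy lets IE accept the cookie in a third-party context.
  res.setHeader('P3P', 'policyref="/w3c/p3p.xml", CP="NOI DSP COR NID OUR IND COM STA OTC"');
  res.cookie('mwLastWriteTime', String(Date.now()), { path: '/' });
  res.json({ ok: true });
});

app.listen(3000);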
If I correctly understand you, you are wondering why the server sends Set-Cookie only on the first request. If that is true, then it's by design - take a look here:
http://en.wikipedia.org/wiki/HTTP_cookie
Set-Cookie is like a setter: the server sends it for the browser to store the value locally. It could send it on every response, but there is no need to, so it sends it again only when it needs to change the value stored locally.
The browser, on the other hand, will send the Cookie header with every request, with the contents set by the last Set-Cookie issued by the server.
