Check browser's cache for a js file - javascript

How can I check for a JavaScript file in a user's cache? If they refresh the page or visit the site after some time, I don't want to download that JS file again. Do JS files get cleaned up after a site is closed?

Whether a JavaScript file is cached depends on how your web server is set up, how the user's browser is set up, and also how any HTTP proxy servers between your server and the user are set up. The only part you can control is how your server is set up.
If you want the best chance of your JavaScript being cached, then your server needs to send the right HTTP headers with the JavaScript file. Exactly how you do that depends on which web server you are using.
Here are a couple of links that might help:
Apache - http://httpd.apache.org/docs/2.0/mod/mod_expires.html
IIS - http://www.microsoft.com/technet/prodtechnol/WindowsServer2003/Library/IIS/0fc16fe7-be45-4033-a5aa-d7fda3c993ff.mspx?mfr=true
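For example, a response that permits long-lived caching might carry headers like these (the values are illustrative, not prescriptive):
Cache-Control: public, max-age=31536000
Expires: Thu, 01 Jan 2026 00:00:00 GMT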

Checking whether a JavaScript file is cached only works if the file is from the same domain, AFAIK.
You could then check it like this:
// Note: await must run inside an async function (or a module with top-level await).
try {
  await fetch("https://YOUR-DOMAIN.org/static/build/js/YOUR-FILE.js", {
    method: 'HEAD',
    cache: 'only-if-cached', // answer from the HTTP cache only; fail if not cached
    mode: 'same-origin'      // 'only-if-cached' requires same-origin mode
  });
} catch (error) {
  console.error(error);
}
You will get a 200 back if it is cached, otherwise an error is thrown.

The browser will automatically look after what it has in its own cache. There are various mechanisms you can use for controlling it, however.
Look into the various HTTP caching headers, such as:
Last-Modified
Expires
ETag
The Expires header is the most critical of these when it comes to client-side caching. If you set a far-future Expires header (e.g., 10 years), the browser will not (in theory) look to the server again for that file. Of course, you then need a method of changing the file name when the contents of the file change. Most people manage this by adding a build number to the file path.
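A rough sketch of that cache-busting pattern on the client (the build number and path are made up):
// BUILD_NUMBER would normally be injected by your build tooling.
const BUILD_NUMBER = '1024'; // hypothetical value
const script = document.createElement('script');
script.src = '/static/build/' + BUILD_NUMBER + '/app.js'; // a new path each build bypasses old caches
document.head.appendChild(script);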

The browser will take care of caching for you.
When the cache is emptied depends partly on the browser settings and partly on the headers you send. If you set an Expires header, then the browser shouldn't re-request the file until it has expired. Reading about HTTP Headers might help you.

Related

Server side rendering issue over a CDN

I have recently launched a site that uses server-side rendering (with Next.js). The site has login functionality where, if an authentication cookie is present in a user's request, it will render a logged-in view for that user on the server and return the rendered logged-in view to the user's browser. If the user does not have an authentication cookie present, then it renders a logged-out view on the server and returns that to the user's browser.
Currently it works great, but I have hit a snag when trying to serve the site over a CDN. My issue is that the CDN will cache a server's response to speed it up, so the first user to hit the website on the CDN will have their logged-in view cached and returned to the browser. Because it is cached, other users who hit the site also see that user's logged-in view, as opposed to their own, since that is what the CDN has cached. Not ideal.
I'm trying to think of what the best way to solve this problem would be. I would love to hear any suggestions on the best-practice way to get around this.
One way I have thought of would be to always return a logged-out view on the first page visit, do the authentication/logging-in client side, and from then on always do the authentication on the server. This method would only work, however, if Next.js only does server-side rendering on the first request and lets subsequent requests do all rendering on the client, and I'm not sure if that's the case.
Thanks and would love all the help/ suggestions I could get!
UPDATE
From what I can gather so far from the answers, it seems the best way to get around this will be to serve a CDN-cached logged-out view to every user when they first visit the site. I can then log them in manually from the frontend if an authentication token is present in their cookies. All pages after the first page they land on will have to return a logged-in view - is this possible with Next.js? Would this be a good way to go about it? Here is a summary of these steps:
The user lands on any webpage.
A request is made to the server for that page along with the user's cookies.
Because this is the first page they are visiting, the cookies are ignored and a "logged out" view is returned to the user's browser (that will have been cached in the CDN).
The frontend then loads the logged-out view. Once loaded, it checks for an authentication token and makes a call to the API to log them in if one is present (see the sketch after this list).
Any other page navigation after that is returned from the server as a "logged in" view (i.e. the authentication cookie is not ignored this time). This avoids having to do step 4 again, which would be annoying for the user on every page.
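A client-side sketch of step 4 (the cookie name, API endpoint, and render function are made-up placeholders):
// Runs once the logged-out view has loaded.
const hasAuthToken = document.cookie
  .split('; ')
  .some(part => part.startsWith('auth_token=')); // 'auth_token' is hypothetical

if (hasAuthToken) {
  // Hypothetical endpoint that exchanges the cookie for session data.
  fetch('/api/me', { credentials: 'include' })
    .then(res => res.json())
    .then(user => renderLoggedInView(user));
}

function renderLoggedInView(user) {
  // Placeholder: swap the logged-out UI for the logged-in one.
  document.body.dataset.loggedIn = 'true';
}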
For well-behaved caching proxies (which your CDN should be), there are two response headers you should use:
Cache-Control: private
Setting this response header means that intermediary proxies are not allowed to cache the response. (The browser can still cache it, if it's appropriate to do so. If you want to prevent any caching, you'd use no-store instead.)
See also: https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Cache-Control
Vary: Cookie
This response header indicates that the data in the response is dependent on the Cookie request header. That is, if my request has the header Cookie: asdf and your request has the header Cookie: zxcv, then the requests are considered different, and will be cached independently. Note that using this response header may drastically impact your caching if cookies are used for anything on your domain... and I'd bet that they are.
See also: https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Vary
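A minimal sketch of sending those two headers from an Express-style server (the route and render logic are illustrative, not your actual app):
const express = require('express');
const app = express();

app.get('/dashboard', (req, res) => {
  res.set('Cache-Control', 'private'); // intermediary caches (the CDN) must not store this
  res.set('Vary', 'Cookie');           // responses differ per Cookie request header
  // Placeholder for your real SSR call:
  const loggedIn = Boolean(req.headers.cookie);
  res.send(loggedIn ? '<p>logged-in view</p>' : '<p>logged-out view</p>');
});

app.listen(3000);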
An alternative...
A common alternative approach these days is to handle all the user-facing dynamic data client-side. That way, you can make a request to some API server which has no caching CDN in front of it at all. The page is then filled client-side with the data needed. The static parts of the site are served directly from the CDN.
CDNs cache and distribute data based on the cache headers in the HTTP response. You should consider these two simple notes to get the best performance without losing the power of the CDN.
1. No-cache header for dynamic content (HTML response, APIs,...):
You should make sure the cache response header for all dynamic content (HTML responses, APIs, ...) is Cache-Control: no-cache.
If you're using Next.js, you can use a custom server (Express.js) to serve your app and take full control of the response headers, or you can change the Next.js config.
2. Set cache header for static content (js, CSS, images, ...)
You should make sure the cache response header for all static content (JS, CSS, images, ...) is Cache-Control: max-age=31536000.
If you're using Next.js, every build gives all assets a unique name, so you can set a long-term cache for static assets.
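A rough sketch of both rules using a Next.js custom server with Express (assuming the standard custom-server setup; adjust the paths to your build):
const express = require('express');
const next = require('next');

const app = next({ dev: false });
const handle = app.getRequestHandler();

app.prepare().then(() => {
  const server = express();

  // Rule 2: long-term caching for fingerprinted static assets.
  server.use('/_next/static', (req, res, nextFn) => {
    res.set('Cache-Control', 'max-age=31536000');
    nextFn();
  });

  // Rule 1: no caching of dynamic HTML responses.
  server.get('*', (req, res) => {
    res.set('Cache-Control', 'no-cache');
    return handle(req, res);
  });

  server.listen(3000);
});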
Try adding a cache-control header to your auth-required pages:
Cache-Control: private
The private response directive indicates that a resource is user specific—it can still be cached, but only on a client device. For example, a web page response marked as private can be cached by a desktop browser, but not a content delivery network (CDN).
What I understand from your question is that when a user logs in, the logged-in view gets cached on the CDN, and then even for users who are logged out, the site is shown in the logged-in view from the CDN cache.
Some solutions to this issue are as follows:
Set some TTL (Time To Live) for the CDN so that it will automatically invalidate the cached data after a specific time.
As you want to deliver the site quickly, you want to achieve low latency. For this you can do one thing: cache only the big files from the website, like images, videos, documents, etc., on the CDN, and don't cache the entire website there. Then, every time a user request comes in, the page itself will be served from the regular server while the media files are taken from the CDN. As the media files come from the CDN cache, the site will still load quickly, and the authentication will be done on the server side.
Another solution would be to invalidate the cookie and the authentication after a certain time of inactivity, after which the site should render a logged-out view.

Detect Javascript Tampering in Ajax call

We have a Javascript file that we have developed for our clients to use. The Javascript snippet takes a screenshot of the website it is run on and then sends it back to our server via jQuery.post()
The nature of our industry means that we have to ensure there is no way that the file can be tampered with by the client.
So the challenge is that we need to make sure that the screenshot was generated by the javascript file hosted on our server, and not one that's been copied or potentially tampered with in any way.
I know that I can get the script location using:
// Grab the src of the last <script> element on the page
// (assumes the currently executing script is the last one parsed).
var scripts = document.getElementsByTagName("script"),
    src = scripts[scripts.length - 1].src;
But this won't help if a client tampers with that part of the SRC.
What methods can I employ to make sure that:
1) The post was made from the javascript file hosted on our server
2) The javascript was not tampered with in any way.
Short answer:
You can't.
You can't.
Both stem from the fact that once you hand something over to the client side, it's out of your hands. Nothing will prevent the user from putting a proxy between you and their machine, a process that intercepts content, or an extension that tampers with content, headers, cookies, requests, responses, etc.
You could, however, harden your app by preventing XSS (prevent injection of scripts via user input), using SSL (prevent tampering of the connection), applying CSP (only allow certain content on the page), add CSRF tokens (ensure the form is authorized by the server) and other practices to make it harder for tampered content to get through.
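For instance, a Content-Security-Policy response header restricting script sources might look like this (the policy value is purely illustrative):
Content-Security-Policy: script-src 'self' https://scripts.example.com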
But again, this won't prevent a determined hacker from finding an opening.

Browser Caching - Why do I see repetitive http requests for javascript files if they are cached?

I am trying to improve the performance of a few web pages and wanted to understand whether the JavaScript files are cached by IE or not for my internal application. So, I used Fiddler to watch the requests going to the server.
I can see, every single time the create-customer page is loaded, the same number of requests in Fiddler, for the same files, with result '200' (and not 304 - Not Modified), to fetch the JavaScript files. These include jQuery, Knockout, and a few custom ones.
I studied the request and response headers (below), and the Cache-Control looks OK; nothing conveys that the file is not cached. But I don't understand why these same HTTP requests show up in Fiddler (which means a request is actually being made to the server) if the file is cached.
I can see the same requests going to the server every time, which makes me wonder:
Is the browser caching these or not?
If not, are these at least cached in IIS?
How can I avoid these unnecessary HTTP requests, since these JavaScript files don't change at all?
Many Thanks.
Your request for the file has a Pragma: no-cache header (up at the top of your image, two lines under "Request Headers"), which tells the browser and the server that you don't want to use the cached copy.
You'll want to look at how you're making that request to find out why that header is there, and get rid of it.
Possibilities:
You're loading it via some kind of AMD or other dynamic loading mechanism that is configured to not use cache
You're running with developer tools open and the "disable cache" option (which most of them have) turned on
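For example, a loader that sets the header explicitly on its own requests would produce exactly this trace (hypothetical code, not from any particular library):
var xhr = new XMLHttpRequest();
xhr.open('GET', '/scripts/app.js');
xhr.setRequestHeader('Pragma', 'no-cache'); // the header that bypasses the cache
xhr.send();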

Leverage browser caching for some CSS and JavaScript files only

Is there any way to enable browser caching for only some CSS and JavaScript files through the htaccess file?
I have three css files
http://www.example.com/css/main.css
http://www.example.com/css/star_rating.css
http://www.example.com/js/jquery.autocomplete.css
"main.css" may be chaged day by day. I want caching for star_rating.css and jquery.autocomplete.css only, not for main.css. How can I achieve this?
Also is there any way to caching google adsense javascript file.
https://www.gstatic.com/swiffy/v7.1/runtime.js
http://pagead2.googlesyndication.com/pagead/js/adsbygoogle.js
https://pagead2.googlesyndication.com/pagead/osd.js
Set a cache-control header in your HTTP Response, in .htaccess, already answered here: How can i add cache control code to htaccess?
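A minimal .htaccess sketch of that idea (assuming mod_headers is enabled; the intervals are illustrative):
# Long cache for static assets by default
<FilesMatch "\.(css|js)$">
  Header set Cache-Control "public, max-age=86400"
</FilesMatch>

# Shorter cache for main.css, which changes frequently
<FilesMatch "^main\.css$">
  Header set Cache-Control "public, max-age=300"
</FilesMatch>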
The second block above is the kind of subsequent rule you need to reduce the cache interval of main.css to whatever suits you. However, before you go ahead with that...
Personally, I wouldn't bother with such sophisticated granularity; just set your cache time so the resources are only requested once for a typical browsing session (24 hours?). Although some browser caches can be rather large, there's no guarantee a busy user will still have your resources cached the next time they visit your site; if they fill their cache, the less frequently used/stale items will be removed.
For long-term caching strategies I would just check that ETag support is working on your servers. If a browser already has one of your items cached, it will request it with an If-None-Match header carrying the ETag it holds for your resource.
If the resource has not been modified (if the ETag values match), your server will respond with a 304 (Not Modified) instead of a 200, a good saving for large resources.
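The exchange looks roughly like this (the ETag value is illustrative):
GET /css/star_rating.css HTTP/1.1
If-None-Match: "abc123"

HTTP/1.1 304 Not Modified
ETag: "abc123"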
You cannot influence the response headers if hot-linking to the Google AdSense JavaScript files and not hosting them yourself, but they should have sensible cache-control headers (set by Google) anyway I would expect.

source map HTTP request does not send cookie header

Regarding source maps, I came across a strange behavior in Chromium (build 181620).
In my app I'm using minified jQuery and, after logging in, I started seeing HTTP requests for "jquery.min.map" in the server log file. Those requests were lacking cookie headers (all the other requests were fine).
Those requests are not even exposed in net tab in Developer tools (which doesn't bug me that much).
The point is, js files in this app are only supposed to be available to logged-in clients, so in this setup, the source maps either won't work or I'd have to change the location of source map to a public directory.
My question is: is this a desired behavior (meaning - source map requests should not send cookies) or is it a bug in Chromium?
The String InspectorFrontendHost::loadResourceSynchronously(const String& url) implementation in InspectorFrontendHost.cpp, which is called for loading sourcemap resources, uses the DoNotAllowStoredCredentials flag, which I believe results in the behavior you are observing.
This method is potentially dangerous, so this flag is there for us (you) to be on the safe side and avoid leaking sensitive data.
As a side note, giving jquery.min.js out only to logged-in users (that is, not serving it from a cookieless domain) is not a very good idea in a production environment. I'm not sure about your reasoning behind this, but if you definitely need to avoid giving the file to clients not visiting your site, you may resort to checking the Referer HTTP request header.
I encountered this problem and became curious as to why certain authentication cookies were not sent in requests for .js.map files to our application.
In my testing using Chrome 71.0.3578.98, if the SameSite cookie attribute is set to either Strict or Lax for a cookie, Chrome will not send that cookie when requesting the .js.map file. When there is no SameSite restriction, the cookie will be sent.
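For reference, the attribute in question is set on the cookie by the server, along these lines (the name and value are made up):
Set-Cookie: session=abc123; SameSite=Strict; Secure; HttpOnly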
I'm not aware of any specification of the intended behavior.
