We have multiple pages on a website that require many of the same JavaScript and CSS files.
How do we avoid those files being downloaded again if they have already been downloaded while the user was browsing some other page?
If the file is served from the same URL, the browser should cache it automatically. You may also want to explicitly specify the cache expiry time, if possible, via your web server or programming environment.
If you use an HTTP traffic analyzer like Fiddler you should see that requests for JavaScript and CSS resources return an HTTP code 304 (Not Modified). This tells the browser "the version of the resource you have in your cache is the same as the one on the server so you don't need to download it again".
For even better performance you can explicitly set caching headers for these resources.
This caching tutorial has great info.
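For illustration, here is a minimal Node.js sketch of this (the file name, port, and max-age value are made up for the example; this is not the only way to do it). It sets an explicit cache lifetime and answers conditional requests with 304 Not Modified:

const http = require('http');
const fs = require('fs');
const crypto = require('crypto');

http.createServer((req, res) => {
  const body = fs.readFileSync('./myScript.js'); // hypothetical script file
  const etag = '"' + crypto.createHash('md5').update(body).digest('hex') + '"';

  // If the browser already has this exact version, say so and send no body.
  if (req.headers['if-none-match'] === etag) {
    res.writeHead(304);
    res.end();
    return;
  }

  res.writeHead(200, {
    'Content-Type': 'application/javascript',
    'Cache-Control': 'max-age=86400', // let the browser reuse it for a day
    'ETag': etag,
  });
  res.end(body);
}).listen(8080);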
You should explicitly set caching headers if you want caching. www.fiddler2.com/redir/?id=httpperf
What is the fundamental difference between running a file through a server on localhost and opening a file directly, such as file:///Users/$user_name/$your_directory/index.html, assuming no backend is used and the site is frontend-only (HTML/CSS/JS)?
How does this also affect interactions with other servers, e.g. Ajax requests?
I am sorry if this is too broad, but I haven't found a solid answer to these underlying questions.
Fundamentally, assuming at some point you're going to host the result on an actual web server, the former matches the target environment while the latter doesn't. Browsers treat local files and files served from web servers (even localhost web servers) differently, although only subtly in most respects. One aspect of this is encoding: when you retrieve a file from a web server, the process of determining what encoding the data is in differs from that for opening a local file.
How does this also affect interactions with other servers, e.g. Ajax requests?
This is one of the primary ways in which they're handled differently, and it even varies from browser to browser. A page loaded from a file:// URL has origin null from a Same-Origin Policy standpoint. Some browsers (like Chrome) disallow Cross-Origin Resource Sharing entirely for origin null, even when the server you're trying to talk to has a wide-open CORS policy (Access-Control-Allow-Origin: *). Others (like Firefox) allow origin null to match the wildcard.
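As a quick illustration (the endpoint URL here is hypothetical), the very same request may succeed from a page served over http:// and fail from one opened via file://, depending on the browser:

fetch('https://api.example.com/data') // hypothetical endpoint with a wide-open CORS policy
  .then(response => response.json())
  .then(data => console.log(data))
  .catch(err => console.error('Request blocked or failed:', err));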
In general, for best results, ensure that your development environment matches your deployment environment in the important ways. That means doing your development using a web server process rather than local files. Most IDEs will happily provide that process for you; if not, Apache or Nginx aren't hard to install.
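If you want to see what that looks like with nothing but Node.js installed, here is a minimal (and deliberately naive) static-file server sketch; the port and file names are arbitrary:

const http = require('http');
const fs = require('fs');
const path = require('path');

const types = { '.html': 'text/html', '.js': 'application/javascript', '.css': 'text/css' };

http.createServer((req, res) => {
  // Note: no path sanitization here; this is only for local development.
  const file = path.join(__dirname, req.url === '/' ? 'index.html' : req.url);
  fs.readFile(file, (err, data) => {
    if (err) { res.writeHead(404); res.end('Not found'); return; }
    res.writeHead(200, { 'Content-Type': types[path.extname(file)] || 'text/plain' });
    res.end(data);
  });
}).listen(8080, () => console.log('Serving at http://localhost:8080'));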
The answer is simple:
if you haven't built an active backend for "index.html" yet, it makes little difference; "localhost" and opening "index.html" directly will behave much the same at this point.
But when you start working with the backend, most backend processes need an active server (need localhost).
e.g.:
1.
fetch('local.json') // fetching JSON (or any other file) will not work from a page opened as a local file
2.
You may not be able to interact with MySQL/Django etc. databases,
which means it causes errors in signup/login, storing any image/video/docs in the database, etc.
So it is better to work on localhost. The easiest way is:
VS Code (IDE) >> Extensions >> Live Server (just click a button to start
localhost and click again to stop it)
https://marketplace.visualstudio.com/items?itemName=ritwickdey.LiveServer
It won't make any difference, I think.
But there is an exception when using Chrome! Sometimes I have seen that if an HTML file pulls in some CDN link, it doesn't load specifically in Chrome, but if you try the same file in Firefox or Internet Explorer, it works.
I have faced this problem, and hence I always put such files under the local IIS default website.
We have a client with thousands of users (who all use Internet Explorer) and a large number of JavaScript files that enhance their user experience with our product.
The problem I'm having is that any time we update one of these scripts there is no way to know whether the client is seeing the latest version. What we're having to do is tell our client to do a hard refresh (ctrl+f5) before viewing any changes. Obviously this approach is not ideal.
I know that browsers cache based on the URL, so one could use something like
<script src='myScript.js?ver=1.2'>
to get around the issue, but this is not an option for us.
I was hoping that there's some kind of header property or something similar that we could use to tell IE not to cache these scripts.
Any ideas?
You can also version the filename itself like jQuery does:
<script src='myScript-v1-2.js'>
Then, each time you revise the script, you bump the version number and modify the pages that include it to point to the name of the new script. This is foolproof vs. caching, yet still allows your viewers to receive the maximum benefit of caching and requires no server configuration changes for the .js file.
A full solution will typically include setting a relatively short cache lifetime for your host web page and then allowing the various resources (stylesheet files, JS files, images, etc.) to have longer cache lifetimes for maximum caching. Anything that is fingerprinted can have a very long cache lifetime. See the reference that fabianhjr posted for ways to set the cache lifetime of the host web page. It can be done in the web page itself (<meta> settings) or in the HTTP headers via the server.
If you turn off caching for your script file (which would likely have to be done at the web server level for a script file) then all your viewers will lose the performance benefit of caching and you will lose the bandwidth and load-saving benefit of caching. If you use a common .JS file across many pages (a common design pattern), your viewers will see slower performance on every page.
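As a sketch of that tiered strategy (the URL pattern and lifetimes below are illustrative assumptions, not a prescription):

// Decide cache headers per resource type.
function cacheHeadersFor(url) {
  // Fingerprinted/versioned assets (e.g. myScript-v1-2.js) never change,
  // so they can be cached for a very long time.
  if (/-v\d[\w-]*\.(js|css)$/.test(url)) {
    return { 'Cache-Control': 'max-age=31536000' }; // one year
  }
  // The host HTML page should be revalidated so new asset names are picked up.
  return { 'Cache-Control': 'no-cache' };
}

console.log(cacheHeadersFor('/myScript-v1-2.js')); // { 'Cache-Control': 'max-age=31536000' }
console.log(cacheHeadersFor('/index.html'));       // { 'Cache-Control': 'no-cache' }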
Everything you need to know about caching: http://www.mnot.net/cache_docs/
HTTP headers specifically: http://www.mnot.net/cache_docs/#CACHE-CONTROL
I understand that the browser is forced to fetch a new version of the cached JS file when the file name is changed or a query string is added to it.
We don't do this and until now we've never had issues with browser serving stale files. Recently, we are seeing some users using IE9 who complain about the browser serving cached JS/CSS files. This issue is not consistent across everyone using the site.
My understanding is that when the file name or query string is not changed, but the JS file content is changed, the browser would fetch the new version.
Why is this happening and why is it not consistent?
Any thoughts?
Setting an expiry date or a maximum age in the HTTP headers for static resources instructs the browser to load previously downloaded resources from local disk rather than over the network.
This is good if we actually want to cache the resource. If we want to force a fresh check, set no-cache, which forces caches to submit the request to the origin server for validation before releasing a cached copy, every time. This is useful to ensure that authentication is respected (in combination with public), or to maintain rigid freshness, without sacrificing all of the benefits of caching.
HTTP Server-Specified Expiration - specs
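In Node.js terms, the two behaviors described above might look something like this (the values are illustrative):

const http = require('http');

http.createServer((req, res) => {
  // Option 1: explicit expiry -- the browser may serve from disk until this date.
  // res.setHeader('Expires', new Date(Date.now() + 3600 * 1000).toUTCString());

  // Option 2: always revalidate with the origin before using a cached copy.
  res.setHeader('Cache-Control', 'no-cache');
  res.end('/* script body */');
}).listen(8080);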
Yes, when the content seems the same (i.e. same file names), users may get a cached version of those files on subsequent visits.
It is really beyond your control... it's up to each specific browser to decide how to handle caching, and it's also up to the user... some dump their cache regularly or refresh the page if something doesn't seem right. In particular, when a response carries no explicit freshness headers, browsers are free to apply their own heuristics to decide how long a cached copy stays fresh, which is one reason the behavior differs from user to user.
If you want to force the user to see your updated CSS or JS content, change the CSS or JS file name... otherwise it may be inconsistent for a short, but unknown, period of time.
This tutorial may help you...
http://www.mnot.net/cache_docs/
For example, Chrome caches scripts until Shift+F5 is pressed or the cache entry expires (it ignores the fact that the file has changed on the server; it doesn't even send a request).
Other browsers do the same (when caching is enabled), though I cannot describe exactly when it happens.
As mentioned on this site:
"Note that while JavaScript files are not reliably cached by browsers, CSS files are."
http://www.websiteoptimization.com/speed/tweak/http/
The browsers I know of "reliably cache" all kinds of static data (including JS and CSS, as well as images, HTML, etc.) as long as they're served with proper cache-support headers. Maybe the text means something different from actual caching, such as parsing just once and then keeping some efficient internal format...? I don't know which browsers do or don't do that for different kinds of files, but at least under this hypothesis I can see why (e.g.) CSS might be easier for the browser to keep in preprocessed form than JS.
I can't see any reason to make that claim expressly for JavaScript files. I can, however, see an argument made that caching can be unreliable in general regardless of the file type sent, depending on the server configuration, additional headers that are sent, proxies and caches, and how the end-user's browser is configured.
I was wondering: when using jQuery (or any other JavaScript include) on my website,
does the browser cache it after the first download for all pages (I assume yes), or will it download it every time?
Second, when the user quits the browser and starts it again (to load my website), will the jQuery JS file still be cached, or will it be downloaded again completely?
Thanks
This depends on the browser and on how your server is set up. Have a look at the headers sent by the server along with the file (you can use a tool like Firebug to inspect them). A good idea is to use the jQuery file hosted by Google, since many other sites (including Stack Overflow) use the same file; the browser can then cache that file and never download it from your server. This page has a list of files hosted by Google, and this page explains how to properly set your server up to (tell your browser to) cache files.
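For example (the version number here is only illustrative; use whichever release your site actually depends on):

<script src="https://ajax.googleapis.com/ajax/libs/jquery/3.7.1/jquery.min.js"></script>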
1: Yes, the browser caches all JavaScript/CSS includes.
2: If the user does not clear his/her cache, yes, it will still be in the browser's cache, even after closing and reopening it.
If your web server serves jquery.js with a proper Expires header, then yes, the browser will cache it.
http://developer.yahoo.com/performance/rules.html#expires
Yes, the scripts will get cached between page views, along with the CSS files and images.
Yes as well, in general. The cache is normally maintained between browser restarts.
It will typically not be downloaded again, but unless your server explicitly tells the browser to cache it for a while, the browser will send a request on each page load asking "was jquery.js updated?", which is almost as slow as just downloading it again.
You can test how it works on your site with Google's Page Speed or Yahoo's YSlow.