Is there a difference between using files that are already hosted somewhere (like Google Hosted Libraries or Google Fonts) and files stored on our own servers for our web pages, for example from a performance perspective?
Yes, there are a few differences. In most scenarios, your page will load much faster because:
If your browser already has a cached version of the library for some other site, it won't load it again.
Your browser may connect to Google or other CDN servers simultaneously, which may not happen if you are loading from your server.
The latency in getting the file from a Google server is almost always significantly shorter than with your server.
You will also have reduced load on your server.
The main scenario where you would want to host the files on your own site is when you serve content to countries where Google, Google's CDN, or similar services are blocked.
Yes, there is a small difference.
Modern browsers have limits, for example no more than 4 parallel downloads from one domain. So if you load fonts from Google's CDN servers, it is possible the page will load a little bit faster.
Related
Is it possible to force caching of certain JavaScript library files (e.g. react.min.js) when navigating between pages of a website that isn't an SPA?
I'm looking at the feasibility of a more componentized structure while not going full-on SPA. The website I'm working on often has people visit a single page and then leave, but in the cases where they do stick around, I don't want them to have to reload each and every library on every page load.
Background You Should Understand
There are literally thousands of articles on the web about this topic, but here is a very good summary from MakeUseOf's Everything You Need to Know About the Browser Cache.
The browser cache is a temporary storage location on your computer for files downloaded by your browser to display websites. Files that are cached locally include any documents that make up a website, such as html files, CSS style sheets, JavaScript scripts, as well as graphic images and other multimedia content.
When you revisit a website, the browser checks which content was updated in the meantime and only downloads updated files or what is not already stored in the cache. This reduces bandwidth usage on both the user and server side and allows the page to load faster. Hence, the cache is especially useful when you have a slow or limited Internet connection.
TL;DR
I don't know if you're really looking for a way to force the browser to cache your files or if you've just misunderstood how the cache works. In general, the visitor's browser is the one that makes that decision and handles everything for you. If it sees that a needed resource was already downloaded in the past, it won't request it again; it will just use its cache. So no, your libraries will not get reloaded over and over. Just once.
Now, if you really do need to force the browser to cache your files, take a look at the answer(s) to Caching a jquery ajax response in JavaScript/browser. That should get you on a good path to a solution.
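For illustration only, here is a minimal sketch of the general idea behind that linked question: keep a fetched resource in localStorage so repeat visits can skip the network entirely. The URL, storage key, and function name below are placeholders for this example, not part of any existing API.

// Fetch a script once and keep its source in localStorage for later visits.
function getCachedScript(url, key) {
    var cached = localStorage.getItem(key);
    if (cached) {
        // Stored on a previous visit: resolve immediately, no network request
        return $.Deferred().resolve(cached).promise();
    }
    return $.get(url, null, null, "text").done(function (source) {
        try {
            localStorage.setItem(key, source);  // remember it for next time
        } catch (e) {
            // Storage may be full or disabled; just fall back to normal loading
        }
    });
}

getCachedScript("/js/react.min.js", "lib:react").done(function (source) {
    $.globalEval(source);  // run the library source once it is available
});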
There are two ways to include Bootstrap's CSS and JS files:
From a CDN
From a local (root) file
My question is: which approach is best?
This question is essentially "Should I use a CDN?".
The answer boils down to the following factors.
You will need an internet connection all the time, even to test your code.
If you're on the move and don't have an internet connection, or one of your clients doesn't have one when you're demoing your code, then you're in trouble if you're using a CDN.
The CDN will most probably be faster.
CDNs are designed for the sole purpose of serving files, fast. Most of the time, there are several mirrors serving the content, so files can be served fast to users around the world. Also, if you host it from your own domain, you might also include several cookies every time you serve the file, while the CDN will not.
The CDN might go offline.
This is obvious, but it's a concern nonetheless. Check the reputation of the CDN you're planning to use.
Your bandwidth usage will be minimised if you use the CDN.
If you host your site up somewhere, and your host has a limit on the amount of data it will transfer for you, commonly called 'bandwidth', then a CDN will help, since it won't count against your usage.
Keeping these factors in mind, the final choice is yours. However, for JS at least, I recommend using a CDN with a local fallback. An example done using jQuery can be found at https://stackoverflow.com/a/1014251/2141058.
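The pattern from that linked answer generally looks like the snippet below (a sketch only; the version number and local path are just examples):

<script src="http://ajax.googleapis.com/ajax/libs/jquery/1.7.1/jquery.min.js"></script>
<script>
    // If the CDN copy failed to load (offline, blocked, outage), fall back
    // to a copy served from your own site.
    window.jQuery || document.write('<script src="/js/jquery-1.7.1.min.js"><\/script>');
</script>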
The CDN serves all the files for you and will probably be faster than hosting them yourself. However, all your clients must have an internet connection. If your clients will have an internet connection (i.e. you are just writing a normal public website), you should probably use the CDN.
If your clients might not have an internet connection (for example, if the site is internal to a company network), you should serve the files yourself, as those clients won't be able to reach the CDN over the internet.
A fundamental point about a Content Delivery Network is that a different URL/IP address than your website's primary domain can receive client requests and respond with resources.
This means the client's browser can, and will, make requests for resources from different URLs/IPs in parallel. So when you have a JS file on your website, say www.mywebsite.com/site.js, the client's browser requests that resource and queues up the next one; if the next resource lives on a CDN, i.e. a different URL/IP, the browser can make that request to the other server in parallel. So if you request cdn.bootstrap.com/bootstrap3-min.js, that file can be downloaded to the client without waiting for www.mywebsite.com/site.js to finish downloading.
So the overall website load time will be faster, but will it be perceptibly faster to a human? If this is a website built for mobile devices, then a CDN may well be worth it, because you need to optimize load times wherever you can. If the primary use is a desktop website with mild traffic, in a geographic region where broadband or otherwise decent connections are commonplace, it is not really worth having a CDN unless it is your own CDN server.
The Bootstrap CDN server cannot be relied on; it is not part of your infrastructure. You should not assume that the people who maintain the Bootstrap CDN will always have it up when visitors come to your website. Sure, most likely the CDN will be up and there will never be an issue, but if the Bootstrap CDN were to go down and your website didn't look right, or didn't work at all... you wouldn't have the level of control you should have over the situation.
Also, using a CDN on a different domain together with IE9, IE10, and iframes can cause issues; I wrote about this on my blog here:
http://www.sandiegosoftware.net/blog/script5-access-is-denied-error-in-internet-explorer-loading-jquery/
The best rule of thumb is to either use your own CDN server on a subdomain of your primary website domain, or not use a CDN at all.
I use a website that has a very high response time. Most of that time is spent loading the page's JavaScript and CSS files.
I want to write a Google Chrome extension that can save/cache all the JS files for a really long time.
I know JS on its own would not provide this feature. Does Google Chrome have an API to do this?
Are there any other options?
It doesn't sound like a good idea
Do you expect people to install this extension, for the sole purpose of visiting your site? I visit hundreds of sites regularly, should I install hundreds of extensions?
Will you also make an extension for Firefox / Opera / Safari / whatever browser I like?
If the webserver correctly places the HTTP headers, the browser (Chrome and all others) will cache all resources correctly.
See How can I improve loading times on a static HTML site?
Edit: Now that I better understand your need, what you can do is create an extension that:
is applied on the site in question
removes the existing script/CSS loading, for instance $('head > link[rel="stylesheet"]').remove()
injects the same scripts/CSS from a local copy, and optionally improves how the scripts start by using $(document).ready(main_function) rather than <body onload="main_function()"> (a rough sketch follows)
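A very rough content-script sketch of those steps, assuming a manifest that matches the target site and lists the bundled files under web_accessible_resources (the selectors and file name below are placeholders):

// Drop the remote copies the page is trying to load...
document.querySelectorAll('script[src*="jquery"], link[rel="stylesheet"]').forEach(function (node) {
    node.remove();
});

// ...and inject the equivalent files bundled with the extension instead.
var local = document.createElement('script');
local.src = chrome.runtime.getURL('jquery.min.js');
document.head.appendChild(local);

For this to actually avoid the network, the content script would also need to run early enough (run_at: document_start) that the original requests have not already been sent.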
We have a client with thousands of users (who all use Internet Explorer) and a large number of JavaScript files that enhance their user experience with our product.
The problem I'm having is that any time we update one of these scripts there is no way to know whether the client is seeing the latest version. What we're having to do is tell our client to do a hard refresh (ctrl+f5) before viewing any changes. Obviously this approach is not ideal.
I know that browsers cache based on the URL, so one could use something like
<script src='myScript.js?ver=1.2'>
to get around the issue, but this is not an option for us.
I was hoping that there's some kind of header property or something similar that we could use to tell IE not to cache these scripts.
Any ideas?
You can also version the filename itself like jQuery does:
<script src='myScript-v1-2.js'>
Then, each time you revise the script, you bump the version number and modify the pages that include it to point to the name of the new script. This is foolproof vs. caching, yet still allows your viewers to receive the maximum benefit of caching and requires no server configuration changes for the .js file.
A full solution will typically include setting a relatively short cache lifetime for your host web page and then allowing the various resources (stylesheet files, JS files, images, etc...) to have longer cache lifetimes for maximum caching. Anything that is fingerprinted can have a very long cache lifetime. See the reference that fabianhjr posted for ways to set the cache lifetime of the host web page. It can be done in the web page itself (<meta> settings) or in the HTTP headers via the server.
If you turn off caching for your script file (which would likely have to be done at the web server level for a script file) then all your viewers will lose the performance benefit of caching and you will lose the bandwidth and load-saving benefit of caching. If you use a common .JS file across many pages (a common design pattern), your viewers will see slower performance on every page.
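As a purely hypothetical illustration of that split, on a Node/Express server it might look like the following; the paths and lifetimes are only examples, and any server or CDN has its own way of setting the same headers:

// Hypothetical Express setup: long cache lifetime for versioned assets,
// short lifetime for the HTML pages that reference them.
var express = require('express');
var app = express();

// Versioned files like myScript-v1-2.js are safe to cache for a long time,
// because every revision gets a new file name.
app.use('/static', express.static('static', { maxAge: '365d', immutable: true }));

// The host page keeps a short lifetime so new asset names are picked up quickly.
app.get('/', function (req, res) {
    res.set('Cache-Control', 'max-age=300');
    res.sendFile(__dirname + '/index.html');
});

app.listen(3000);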
Everything you need to know about cache http://www.mnot.net/cache_docs/
http://www.mnot.net/cache_docs/#CACHE-CONTROL <-- HTTP Headers
I would like to know which approach is faster and better for my web pages: importing a JavaScript file from an external source, or hosting it locally. What are the pros and cons of each?
For example, which one is the best:
<script type="text/javascript" src="http://ajax.googleapis.com/ajax/libs/jquery/1.4.2/jquery.min.js"></script>
or
<script type="text/javascript" src="../jquery.js"></script>
(same for json2.js)
I could not find any tips on Google.
Thanks!
The main benefit of using a CDN (Content Delivery Network) is that given their widespread use, the chances are your visitor may already have a cached copy of the script you're trying to load on their browser. This will completely negate any loading time. If they don't have a cached copy, the chances are the CDN would deliver it to them faster than your server could anyway. In my opinion it's best to use a CDN where possible.
Even with that in mind, CDNs aren't infallible, and you don't want your site to rely 100% on someone else's server. I'd advise having a local copy of your scripts to use as a backup where possible. For jQuery, this is quite easy:
<script type="text/javascript" src="http://ajax.googleapis.com/ajax/libs/jquery/1.7.1/jquery.min.js"></script>
<script type="text/javascript">
    if (typeof jQuery == 'undefined') {
        document.write(unescape("%3Cscript src='/Scripts/jquery-1.7.1.min.js' type='text/javascript'%3E%3C/script%3E"));
    }
</script>
Other libraries may vary in their methods for testing if they're loaded, but the idea is the same.
It's also worth noting that if you are loading from Google's CDN, ALWAYS use the full version number; otherwise the script will not be cached.
That is to say, if your request URL looks like one of these:
"http://ajax.googleapis.com/ajax/libs/jquery/1.4/jquery.min.js" // highest 1.4 version (1.4.4)
"http://ajax.googleapis.com/ajax/libs/jquery/1/jquery.min.js" // latest version (1.7.1)
then the Expires header is set to a date before the current date, so the effect of caching is nullified.
If you import JavaScript from http://ajax.googleapis.com/ajax/libs/jquery/1.4.2/jquery.min.js, you improve data access: Google has a CDN, which means content is delivered more efficiently to users (depending on their location).
Read more about CDNs: http://developer.yahoo.com/performance/rules.html
The fastest option is definitely your own server, at least in most cases (that is, in pure download speed).
However, there is a much greater chance that a visitor already has Google's version of jQuery cached in their browser from visiting another site that uses the same library. For the most common libraries it therefore probably makes more sense to use the Google-hosted copy, since a cached library is much faster than one that has to be downloaded from your server.
Also, these days you can request the library by using just the first number of the version:
http://ajax.googleapis.com/ajax/libs/jquery/1/jquery.min.js
And automagically get the latest version ;)
Using a CDN has some advantages:
If the user has already visited another site that uses the same script from the same location, they may have it in the browser cache already. The page loading speeds up when they don't have to re-download it.
The CDN provider probably has its servers set up to maximize the efficiency of serving the scripts, for example by sending the file from the server physically closest to the user.
You save bandwidth.
The disadvantages:
You are dependent on the service provider: if their service is down, your site breaks. (This can be helped by serving a local copy of the file if the external script couldn't be loaded.)
You have to trust the service provider for serving the correct file and nothing malicious.
If it is some well-known resource like googlePlusOne or another stable web service (or an external ad), it is better to use the external link. This way it will always be up to date.
If it is a JS library (like jQuery or Ext), it is better to download the source.
Loading libraries from a local repository will always be faster, which would suggest that local is always better. However, loading libraries from external sources, for example jQuery from a CDN, allows your site to always load the most up-to-date version of the library.