I would like to know which solution is faster and better for my web pages: importing a JavaScript file from an external source, or hosting it locally. What are the pros and cons of each solution?
For example, which one is the best:
< script type="text/javascript" src="http://ajax.googleapis.com/ajax/libs/jquery/1.4.2/jquery.min.js"></script>
or
< script type="text/javascript" src="../jquery.js"></script>
(same for json2.js)
I could not find any tips on Google.
Thanks!
The main benefit of using a CDN (Content Delivery Network) is that, given their widespread use, there is a good chance your visitor already has a cached copy of the script you're trying to load in their browser. This eliminates the loading time entirely. If they don't have a cached copy, the chances are the CDN will deliver it to them faster than your server could anyway. In my opinion it's best to use a CDN where possible.
Even with that in mind, CDNs aren't infallible, and you don't want your site to rely 100% on someone else's server. I'd advise keeping a local copy of your scripts and using it as a backup where possible. For jQuery, this is quite easy:
<script type="text/javascript" src="http://ajax.googleapis.com/ajax/libs/jquery/1.7.1/jquery.min.js"></script>
<script type="text/javascript">
if (typeof jQuery == 'undefined') {
document.write(unescape("%3Cscript src='/Scripts/jquery-1.7.1.min.js' type='text/javascript'%3E%3C/script%3E"));
}
</script>
Other libraries may vary in their methods for testing if they're loaded, but the idea is the same.
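For example, for json2.js (mentioned in the question), the check can be against the global JSON object, since json2.js only defines it when the browser lacks a native implementation. Below is a rough sketch of that fallback; the CDN URL and local path are placeholders, not real endpoints:
<script type="text/javascript" src="https://some-cdn.example.com/json2.min.js"></script>
<script type="text/javascript">
// If JSON is still undefined here, the browser has no native JSON and the CDN
// copy of json2.js did not load, so fall back to a local copy (placeholder path).
if (typeof JSON == 'undefined') {
    document.write(unescape("%3Cscript src='/Scripts/json2.js' type='text/javascript'%3E%3C/script%3E"));
}
</script>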
It's also worth noting that if you are loading from Google's CDN, ALWAYS use the full version number, otherwise the script will not be cached for long.
That is to say, if your request URL looks like one of these:
"http://ajax.googleapis.com/ajax/libs/jquery/1.4/jquery.min.js" // highest 1.4 version (1.4.4)
"http://ajax.googleapis.com/ajax/libs/jquery/1/jquery.min.js" // latest version (1.7.1)
then the Expires header is set to a very short lifetime rather than far in the future, so the effect of caching is nullified.
If you import JavaScript from http://ajax.googleapis.com/ajax/libs/jquery/1.4.2/jquery.min.js, you improve data access: Google runs a CDN, which means it can deliver the content more efficiently to users (depending on their location).
Read more about CDNs: http://developer.yahoo.com/performance/rules.html
The fastest is definitely from your own server, at least in most cases (that is, in pure download speed).
However, there is a much greater chance that a visitor already has Google's version of jQuery cached in their browser from visiting another site that uses the same library. For the most common libraries it therefore probably makes more sense to use the Google API, since serving from cache will be much faster than having to download the library from your server.
Also, these days you can do this and request the version by just using the first number:
http://ajax.googleapis.com/ajax/libs/jquery/1/jquery.min.js
And automagically get the latest version ;)
Using a CDN has some advantages:
If the user has already visited another site that uses the same script from the same location, they may have it in the browser cache already. The page loading speeds up when they don't have to re-download it.
The CDN provider probably has the server set up to maximize the efficiency of serving the scripts, for example by sending the file from a server physically closest to the user.
You save bandwidth.
The disadvantages:
You are dependent on the service provider: if their service is down, your site breaks. (This can be helped by serving a local copy of the file if the external script couldn't be loaded.)
You have to trust the service provider for serving the correct file and nothing malicious.
If it is some known resource like googlePlusOne or another stable web service (or an external ad), it is better to use the external link. This way it will always be up to date.
If it is a library JS file (like jQuery or Ext), it is better to download the source.
Loading libraries from a local repository will always be faster, which would suggest that local is always better. However, loading libraries from external sources (jQuery, for example) allows your site to always load the most up-to-date version of the library.
Browsers cache static files. It's what they're designed to do. 99% of the time, that's a good thing. Until we as developers update that static content.
If a developer updates a javascript file, but a user's browser pulls the cached version of it, then:
Best case, it'll be missing some new functionality until the browser decides to update its cache.
Worst case, if you also updated the HTML page to call a JavaScript function that didn't exist in the older version of the JavaScript file that the browser cached, your page breaks.
As developers, we know to hit Ctrl+Shift+R, or Ctrl+F5, or open dev console, disable cache on the Network tab, and reload. But users don't know that.
What is the best practice to handle updates to static content?
Is it to make sure that when you add new functions to a .js file, you push the update out to production a few hours/days before you update the HTML to call that function in <script> tags, allowing browsers to update their cache over that time?
Is it to not call javascript functions from HTML within <script> tags at all?
Is there a way to somehow force browsers to expire cache on a specific static file when you update it?
Something else?
Obviously disabling all caching on your site is possible, but not a realistic solution.
PS. I'm not using any kind of frontend framework - just raw JavaScript/jQuery. If the situation is different with frontend frameworks, I'd love to hear about that too, at a high level.
If I understand correctly, you want the JavaScript file to be updated for the user whenever you deploy an update. You should use the Service Worker API to manage a cache version for specific files, or use Google's Workbox library on top of it.
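As a rough illustration of that approach (not code from the original answer), here is a minimal service worker; the file name sw.js, the cache name, and the asset list are assumptions you would replace with your own, and bumping CACHE_NAME on each release is what invalidates the old files:
// sw.js - bump CACHE_NAME on every release so old entries are discarded.
var CACHE_NAME = 'static-v2';

self.addEventListener('install', function (event) {
  // Pre-cache the files we want to control (placeholder paths).
  event.waitUntil(
    caches.open(CACHE_NAME).then(function (cache) {
      return cache.addAll(['/index.html', '/scripts/app.js']);
    })
  );
});

self.addEventListener('activate', function (event) {
  // Delete caches left over from previous versions.
  event.waitUntil(
    caches.keys().then(function (keys) {
      return Promise.all(keys
        .filter(function (key) { return key !== CACHE_NAME; })
        .map(function (key) { return caches.delete(key); }));
    })
  );
});

self.addEventListener('fetch', function (event) {
  // Serve from the current cache, falling back to the network.
  event.respondWith(
    caches.match(event.request).then(function (cached) {
      return cached || fetch(event.request);
    })
  );
});
The page itself only needs navigator.serviceWorker.register('/sw.js') (guarded by a 'serviceWorker' in navigator check) to start using it.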
Some years ago, location.reload(true) allowed bypassing the cache the way Ctrl/Command+Shift+R does. Only Firefox still supports this feature; a hard reload from JavaScript is no longer supported in Chromium-based browsers. (The spec no longer describes this feature.)
This change was also discussed in an issue on github.com/Microsoft/TypeScript and in several other places on the web.
jQuery uses a simple workaround to be compatible with almost everything. If you load something with jQuery's jQuery.ajax({ url, cache: false }), it appends a _=TIMESTAMP parameter to the URL, which has a similar effect but may bloat the cache.
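The same idea without jQuery is just a changing query parameter. A small sketch; the parameter name _ mirrors jQuery's convention and the URL is a placeholder:
// Append a timestamp so the browser treats every request as a new URL
// and skips its cached copy.
function bustCache(url) {
  var sep = url.indexOf('?') === -1 ? '?' : '&';
  return url + sep + '_=' + Date.now();
}

fetch(bustCache('/scripts/app.js')).then(function (res) { return res.text(); });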
You can make use of the entity tag header (ETag). Entity tags are similar to fingerprints: if the resource at a given URL changes, a new ETag value must be generated by the server, which is similar in behavior to the Last-Modified header. (caniuse: etag)
Entity tags in: Apache, IIS, nginx (nginx docs), nodejs
It is also possible to clear the sites cache with a Clear-Site-Data: "cache" header. (mdn, caniuse:clear-site-data)
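To make the ETag flow concrete, here is a bare-bones sketch in plain Node.js (no framework; the file name app.js and the MD5 fingerprint are assumptions): the server hashes the file and answers 304 Not Modified when the browser's If-None-Match matches, so the cached copy is reused only while the content is unchanged.
// Serve app.js with a content-based ETag and honour If-None-Match.
var http = require('http');
var fs = require('fs');
var crypto = require('crypto');

http.createServer(function (req, res) {
  var body = fs.readFileSync('./app.js');
  var etag = '"' + crypto.createHash('md5').update(body).digest('hex') + '"';

  if (req.headers['if-none-match'] === etag) {
    res.writeHead(304, { 'ETag': etag });   // unchanged: let the browser reuse its cache
    return res.end();
  }
  res.writeHead(200, { 'Content-Type': 'application/javascript', 'ETag': etag });
  res.end(body);
}).listen(8080);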
I have a ReactJS app hosted in S3 and using Cloudflare as DNS & CDN.
I have a huge issue: a lot of visitors have an old version of the application stored in their browser cache (index.html only). I have configured advanced cache control in the newest version, but it cannot take effect because the older version is shown instead.
Static file (CSS, JS) versioning is done by create-react-app, but I have discovered that the index.html file is the only one being cached.
What should I do now?
How to purge visitors cache now?
PS: I have purged Cloudflare cache already and setup rule to bypass cache.
Unfortunately there is no real solution for this.
The only way is to wait until users' caches empty (expire).
It is technically impossible to clear a user's cache from an external resource (a JS script, etc.), for security reasons.
Even if it were possible, there would be no way to tell users to download the latest JS (including cache-purging code), because they have the old version of index.html (which links to the old .js files).
You are stuck, and the only option is to wait.
A better approach would be, whenever your build changes, change the JS link so that the browser downloads the new version from the server, no matter the user's or the server's caching policy.
For example, the way Stack Exchange does it is, whenever the build changes, the HTML goes from something like:
<script src="https://cdn.sstatic.net/Js/stub.en.js?v=1bac371ac78f"></script>
to
<script src="https://cdn.sstatic.net/Js/stub.en.js?v=f83b2f654"></script>
Whenever there's a new build, you can randomize the parameter in the query string in the HTML, and still only have the single (most recent) built .js on your server.
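One way to automate that is a small build step that stamps the query string with a hash of the built file, so the link only changes when the content does. A sketch, assuming the page is index.html, the bundle is app.js (both placeholder names), and index.html already contains a ?v= parameter to rewrite:
// build-stamp.js - rewrite the ?v= value in index.html so it changes
// whenever app.js changes, forcing browsers to fetch the new bundle.
var fs = require('fs');
var crypto = require('crypto');

var hash = crypto.createHash('md5')
  .update(fs.readFileSync('app.js'))
  .digest('hex')
  .slice(0, 12);

var html = fs.readFileSync('index.html', 'utf8')
  .replace(/app\.js\?v=[^"']*/g, 'app.js?v=' + hash);

fs.writeFileSync('index.html', html);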
There are two ways to include Bootstrap's CSS and JS files:
From a CDN
From a file in your site root
My question is: which is the better way to work?
This question is essentially "Should I use a CDN?".
The answer boils down to the following factors.
You will need an internet connection all the time, even to test your code.
If you're on the move and don't have an internet connection, or one of your clients doesn't have one when you're demoing your code, then you're in trouble if you're using a CDN.
The CDN will most probably be faster.
CDNs are designed for the sole purpose of serving files, fast. Most of the time, there are several mirrors serving the content, so files can be served fast to users around the world. Also, if you host it from your own domain, you might also include several cookies every time you serve the file, while the CDN will not.
The CDN might go offline.
This is obvious, but it's a concern nonetheless. Check the reputation of the CDN you're planning to use.
Your bandwidth usage will be minimised if you use the CDN.
If you host your site up somewhere, and your host has a limit on the amount of data it will transfer for you, commonly called 'bandwidth', then a CDN will help, since it won't count against your usage.
Keeping these factors in mind, the final choice is yours. However, for JS at least, I recommend using a CDN with a local fallback. An example done using jQuery can be found at https://stackoverflow.com/a/1014251/2141058.
The CDN serves all the files for you, and will probably be faster than hosting them yourself. However, all your clients must have an internet connection. If your clients will have an internet connection (if you are just writing a normal website), you should probably use the CDN.
If your clients might not have an internet connection (if this site is internal to a company network), you should serve the files yourself, as your clients won't be able to reach the CDN via the internet.
A foundation of a Content Delivery Network is that a different URL/IP address than your website's primary domain can receive client requests and respond with resources.
This means the client's browser can, and will, make requests for resources from different URLs/IPs in parallel. So when you have a JS file on your website, say www.mywebsite.com/site.js, the client's browser requests this resource and queues up the next one; if that next resource lives on a CDN, i.e. a different URL/IP, the browser can make the request to a different server in parallel. So if you are requesting cdn.bootstrap.com/bootstrap3-min.js, this resource can be downloaded to the client without waiting for www.mywebsite.com/site.js to finish downloading.
So the overall website load time will be faster. But will it be perceivably faster to a human? If this is a website built for mobile devices, then you may want to take advantage of a CDN, because you need to optimize load times wherever you can. If the primary use of this website is as a desktop site, with mild traffic, in a geographic region where broadband/decent connections are commonplace, it is not really worth having a CDN unless it is your own CDN server.
The Bootstrap CDN server cannot be relied on; it is not part of your infrastructure. You should not trust that those who maintain the Bootstrap CDN will always have it up when people visit your website. Sure, most likely the CDN will be up and there will never be an issue, but if the Bootstrap CDN were to go down and your website didn't look right or didn't even work... you don't have the level of control you should have over the situation.
Also, using a CDN on a different domain together with IE9/IE10 and iframes can cause issues; I wrote about this on my blog here:
http://www.sandiegosoftware.net/blog/script5-access-is-denied-error-in-internet-explorer-loading-jquery/
The best rule of thumb is to use your own CDN server on a subdomain of your primary website domain, or not to use a CDN at all.
I am currently testing a web site as development goes on, and almost every time the client script is updated I need to clear the browser cache for the new functionality to become available, so that the browser downloads a fresh copy of the .js file.
What if in production I roll out a new version of a script? How do I get the client browsers to get it as soon as it is uploaded to the server?
I am using an ASP.NET MVC 4 site.
The easiest way is to add the version number to the script file name (say, script_1.6.js, etc.).
Rename the file to create versioning:
so
<script src="myscript.js"></script>
becomes
<script src="myscript-9-5-2012.js"></script>
Also per https://developers.google.com/speed/docs/best-practices/caching#LeverageProxyCaching
It's not recommended to use query strings for versioning (i.e. myscript.js?v=1.1.0), specifically because:
Most proxies, most notably Squid up through version 3.0, do not cache resources with a "?" in their URL ...
The best way to stop your scripts from being cached is to add a random query-string value at the end of each script URL, e.g.
<script src="/path-to-your-script/script.js?v=0b1"></script>
This is great in development as your scripts never get cached, although in production you do really want the browser to cache the scripts to speed things up.
So for production, you would probably want to introduce some versioning like jQuery does, for instance jquery-1.8.0.js.
We have a client with thousands of users (who all use Internet Explorer) and a large amount of javascript files that enhance their user experience with our product.
The problem I'm having is that any time we update one of these scripts there is no way to know whether the client is seeing the latest version. What we're having to do is tell our client to do a hard refresh (ctrl+f5) before viewing any changes. Obviously this approach is not ideal.
I know that browsers cache based on the url, so one could use something like
<script src='myScript.js?ver=1.2'>
to get around the issue, but this is not an option for us.
I was hoping that there's some kind of header property or something similar that we could use to tell IE not to cache these scripts.
Any ideas?
You can also version the filename itself like jQuery does:
<script src='myScript-v1-2.js'>
Then, each time you revise the script, you bump the version number and modify the pages that include it to point to the name of the new script. This is foolproof vs. caching, yet still allows your viewers to receive the maximum benefit of caching and requires no server configuration changes for the .js file.
A full solution will typically include setting a relatively short cache lifetime for your host web page and then allow the various resources (stylesheet files, JS files, images, etc...) to have longer cache lifetimes for maximum caching. Anything that is fingerprinted can have a very long cache lifetime. See the reference that fabianhjr posted about for ways to set the cache lifetime of the host web page. It can be done in the web page itself (<meta> settings) or in the http headers via the server.
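As a sketch of that split (short lifetime for the host page, long lifetime for versioned script files), here is roughly how the headers could be set from a plain Node.js server; the durations and file names are illustrative assumptions, and in an ASP.NET MVC/IIS setup the same headers would typically be configured in web.config instead:
// Short cache for the page itself, long "immutable" cache for the versioned script.
var http = require('http');
var fs = require('fs');

http.createServer(function (req, res) {
  if (req.url === '/' || req.url === '/index.html') {
    res.writeHead(200, {
      'Content-Type': 'text/html',
      'Cache-Control': 'max-age=300'   // revalidate the host page often
    });
    res.end(fs.readFileSync('./index.html'));
  } else if (req.url === '/myscript-9-5-2012.js') {
    res.writeHead(200, {
      'Content-Type': 'application/javascript',
      'Cache-Control': 'public, max-age=31536000, immutable'   // versioned file: cache for a year
    });
    res.end(fs.readFileSync('./myscript-9-5-2012.js'));
  } else {
    res.writeHead(404);
    res.end();
  }
}).listen(8080);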
If you turn off caching for your script file (which would likely have to be done at the web server level), then all your viewers will lose the performance benefit of caching, and you will lose the bandwidth and load-saving benefits too. If you use a common .js file across many pages (a common design pattern), your viewers will see slower performance on every page.
Everything you need to know about caching: http://www.mnot.net/cache_docs/
http://www.mnot.net/cache_docs/#CACHE-CONTROL <-- HTTP Headers