When serving static CSS/JS/etc. resources you have the option of using a CDN or serving them from your own server.
With a CDN you end up with many separate includes, but each file is typically already minified.
Alternatively you may minify and concatenate all the files into one CSS bundle and one JS bundle.
Which method is more efficient/performant, and why?
Interesting question. I'd say it depends.
Using a CDN can be slower than using your own server. This is primarily because the CDN resides on another domain, hence each request to the asset would result in an additional DNS round trip. This may cause the entire operation to become even slower than if you had just served it using your own server. That said, after a one-time DNS lookup penalty, DNS round-trip largely becomes a non-issue; the speed then depends on other network factors - how congested the CDN is, distance between the client and the CDN node (relative to your server).
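If that one-time DNS lookup is the main concern, a dns-prefetch hint can hide most of its cost. This is just an illustration not mentioned above, and the CDN hostname is only an example:
<link rel="dns-prefetch" href="//ajax.googleapis.com">
<!-- The browser can resolve the CDN hostname while still parsing the page,
     so the later script request skips most of the DNS round-trip cost. -->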
The value of using a CDN only becomes apparent as your site grows - and your server begins to choke under heavy traffic. A CDN can be used to offload all static requests and prevent them from ever hitting your server. This is especially useful if your server bandwidth is limited and you're serving large files. If you're planning to scale, a CDN is almost a necessity.
CDNs also help propagate your content to nodes nearer to clients in other parts of the world. So if you don't have servers sprawled across multiple continents, a CDN can provide fast access for a wider audience, because it usually serves content from the node closest to the client.
All in all, whether you should use an external CDN depends on these factors:
how large are the static files in your site?
how much bandwidth does your server(s) have?
where are your servers located?
what is your site's target audience?
how much traffic are you catering for?
I'd say most of the time, going with a CDN is the better option, but there are cases where "going solo" can be justified. In my experience, the free CDNs are not exactly blistering fast compared to even modest AWS or GCE servers. The network speed is also quite unpredictable, suffering from occasional lags (up to a few seconds) - possibly because of periodic DDoS attacks keeping the CDN busy. For small sites not expecting high traffic volume, it's usually faster to simply serve files off your own server, especially if you are catering mostly to local traffic.
You should always minify and concatenate files, regardless of whether you use a CDN. Don't bother keeping individual library files separate. A false promise you may have heard regularly: if the user has visited another site that happened to use the same library file from the same CDN, their browser will have already cached that file, and you'd save an HTTP round trip for fetching it.
Sadly, more often than not, this does not happen. Even with jQuery, the most popular JS library, it seldom happens. There are simply too many different versions of jQuery in deployment as websites are developed at different times. Furthermore, different sites use different CDNs - Akamai, CloudFlare, Amazon CloudFront, cdnjs, MaxCDN, the list goes on... It's actually quite rare to find two popular sites using the same CDN and the exact same library file.
So, if you don't concatenate files in the hope of the occasional browser cache hit (you save what - 28KiB?), you incur the HTTP request overhead for EVERY single file. I'd say just go ahead and concatenate all your JS into one big file, including jQuery and all other vendor libraries - it's going to be faster 99% of the time.
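If you don't have a build tool handy, the concatenation step itself is only a few lines. A rough Node.js sketch (the file names are made up, and you would still run a minifier such as UglifyJS over the result as a separate step):
var fs = require('fs');

// Order matters: jQuery first, then plugins, then your own code.
var files = ['vendor/jquery.js', 'vendor/plugins.js', 'app/main.js'];
var bundle = files.map(function (f) { return fs.readFileSync(f, 'utf8'); }).join(';\n');

fs.writeFileSync('build/bundle.js', bundle);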
In my opinion, using a CDN is more efficient because you benefit from parallelism: while your server serves the main content, requests for static assets are made to the CDN in parallel.
Looking for some information here mainly.
Recently changed my setup to work with Grunt and LiveReload. It's great for creating small minified CSS files from SCSS files.
However, what's best in terms of JS? I currently use CDNs for most things such as jQuery, Isotope, TagIt and so on.
I have a few custom JS files which contain my own code for my site, so it's fine for me to concatenate and minify those. I understand that having many requests affects performance, hence the minifying and concatenating.
But is it better to download all the libraries and compile them into one JS file to include on my site? Or keep the external ones linked to a CDN?
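For illustration, a minimal Gruntfile sketch of the concatenate-and-minify step described above (the paths are hypothetical; assumes the grunt-contrib-concat and grunt-contrib-uglify plugins are installed):
module.exports = function (grunt) {
  grunt.initConfig({
    concat: {
      dist: {
        // Vendor libraries first, then your own scripts
        src: ['js/vendor/*.js', 'js/src/*.js'],
        dest: 'build/bundle.js'
      }
    },
    uglify: {
      dist: {
        files: { 'build/bundle.min.js': ['build/bundle.js'] }
      }
    }
  });

  grunt.loadNpmTasks('grunt-contrib-concat');
  grunt.loadNpmTasks('grunt-contrib-uglify');
  grunt.registerTask('build', ['concat', 'uglify']);
};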
There are really two considerations here:
Security
Performance
Security
In terms of security, hosting JavaScript yourself puts you in charge, whereas relying on JavaScript from an external domain essentially trusts that domain with the security of your domain (which, depending on your level of trust in that third party, may or may not be acceptable). If you use advanced security settings such as Content-Security-Policy, you may need to do some extra work to allow these externally hosted scripts (e.g. specifically whitelisting the CDN domains in the script-src directive).
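As a rough example (the exact policy depends on your site; ajax.googleapis.com is just a sample CDN host), a policy allowing your own scripts plus one CDN might look like:
<!-- Usually sent as an HTTP response header; the meta form also works for script-src -->
<meta http-equiv="Content-Security-Policy"
      content="script-src 'self' https://ajax.googleapis.com">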
Performance
In terms of performance, using a CDN-hosted version -- especially a popular one -- may give you better caching; however, hosting it yourself and combining it with your other scripts may produce fewer requests. In terms of which is actually faster, you will need to do some measurement of your own (I recommend an A/B test with real user traffic, which will give you a better idea of whether real users already have the CDN-hosted version of the resource cached or not).
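One way to gather that real-user data is the Resource Timing API; a small sketch (the 'jquery' filter is only an example) that logs how long the library took to fetch, where a duration near zero usually means it came from cache:
<script>
window.addEventListener('load', function () {
  performance.getEntriesByType('resource')
    .filter(function (entry) { return entry.name.indexOf('jquery') !== -1; })
    .forEach(function (entry) {
      // In a real A/B test you would report this to your analytics, not the console
      console.log(entry.name, Math.round(entry.duration) + ' ms');
    });
});
</script>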
When page-load speed is the priority, is it better to use a minimal, lightweight javascript library (hosted on a CDN), or is it better to use something like jQuery, hosted on Google's CDN that the browser more than likely already has loaded?
Edit: What my question really boils down to is whether the cross-site caching effect of using jQuery hosted on Google's CDN outweighs the benefits of using an ultra-light library, also on a CDN.
jQuery is not heavy compared to other JavaScript libraries at present, considering the number of features and browsers it supports.
You can consider this factor when selecting the plugins to use on the page: they are written by various authors, and some may write them intelligently with this in mind, while others may not.
Yes, if you use a CDN like Google's for jQuery, it is very likely the library is already cached by the browser, and Google also has a number of servers in different locations, so you don't have to worry about that.
Decreased Latency
A CDN distributes your static content across servers in various, diverse physical locations. When a user’s browser resolves the URL for these files, their download will automatically target the closest available server in the network.
In the case of Google’s AJAX Libraries CDN, what this means is that any users not physically near your server will be able to download jQuery faster than if you force them to download it from your arbitrarily located server.
There are a handful of CDN services comparable to Google’s, but it’s hard to beat the price of free! This benefit alone could decide the issue, but there’s even more.
Increased parallelism
To avoid needlessly overloading servers, browsers limit the number of connections that can be made simultaneously. Depending on which browser, this limit may be as low as two connections per hostname.
Using the Google AJAX Libraries CDN eliminates one request to your site, allowing more of your local content to be downloaded in parallel. It doesn't make a gigantic difference for users whose browser allows six concurrent connections, but for those still running a browser that only allows two, the difference is noticeable.
Better caching
Potentially the greatest benefit of using the Google AJAX Libraries CDN is that your users may not need to download jQuery at all.
No matter how well optimized your site is, if you’re hosting jQuery locally then your users must download it at least once. Each of your users probably already has dozens of identical copies of jQuery in their browser’s cache, but those copies of jQuery are ignored when they visit your site.
However, when a browser sees references to CDN-hosted copies of jQuery, it understands that all of those references refer to the exact same file. Because all of these CDN references point to exactly the same URL, the browser can trust that those files truly are identical and won't waste time re-requesting the file if it's already cached. Thus, the browser is able to use a single copy that's cached on disk, regardless of which site the CDN reference appears on.
This creates a potent "cross-site caching" effect which all sites using the CDN benefit from. Since Google's CDN serves the file with headers that attempt to cache the file for up to one year, this effect truly has amazing potential. With many thousands of the most trafficked sites on the Internet already using the Google CDN to serve jQuery, it's quite possible that many of your users will never make a single HTTP request for jQuery when they visit sites using the CDN.
Even if someone visits hundreds of sites using the same Google-hosted version of jQuery, they will only need to download it once!
It's better to use the library that best suits the needs of your application and your development team. A super-lightweight library might save you a few hundred milliseconds of load time, but may end up costing you in development hours if your team has significantly more experience with jQuery/MooTools/Dojo etc.
If new feature implementation and bug fixing is hindered by using a second-rate tool solely to improve load times, your users are ultimately going to suffer.
I'd simply like to know, from someone who has documented this well: if I have 10 external JS/CSS files to include on my site, is it better to combine them into a single file, or is it fine to have 10 or 20 external links anyway, from a page-speed point of view?
I also ask because the Firebug add-ons Google Page Speed and Yahoo YSlow conflict on this point: Google says to separate files, Yahoo says to combine everything into one :| Normally I would trust big G, but who knows :|
One of the key performance enhancements you can make is to reduce the number of HTTP requests. Each external resource means one extra request, so grouping them together will have a positive impact on page performance.
If you want to learn more about front-end performance, check out Steve Souders' books. You can find a simplified overview of the topics in the books on the Yahoo! Developer Network (he was at Yahoo! when he wrote the first book).
First of all, compressing files is (nearly) always a good idea, both by using dedicated JS and/or CSS minifiers and by enabling GZIP compression at the HTTP level.
Deciding whether to combine files or not is not so easy; you need to juggle different goals:
Minimize the total number of bytes loaded; this includes making sure files can be retrieved from cache
Make sure the files loaded arrive in as few requests as possible.
Combining files optimizes for #2, but can be at the cost of #1. If different pages use different CSS / JS, then every page might get a different combined file (permutation of component files), making caching impossible.
A quick-and-dirty solution is to include all generic JavaScript and all CSS used on all the pages in two single compressed files (one JS, one CSS). If your visitors stay on your site for a longer time they will have the best experience, since all CSS/JS needs to be loaded only once, and that one time is as quick as possible.
If you combine your scripts into one script on one host and this host is slow -- your page will be slow. If you have your script broken up into a few scripts hosted on a CDN with different sub domains for the scripts, your browser will download more of them in parallel. Read that site on boosting download times. It has the conclusion "boosting parallel downloads can realize up to a 40% improvement in web page latency. You can use two or three hostnames to serve objects from the same server to fool browsers into multithreading more objects."
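For illustration, domain sharding just means referencing the same server through different hostnames (the subdomains below are hypothetical):
<!-- static1 and static2 point at the same server; older browsers treat them
     as separate hosts and open extra parallel connections -->
<link rel="stylesheet" href="http://static1.example.com/css/bundle.min.css">
<script src="http://static2.example.com/js/bundle.min.js"></script>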
Make sure you have the Google Page Speed, YSlow and Firebug addons installed, then use them. They will help you make your website faster.
We're currently pulling the jQuery and jQueryUI (and jQueryUI CSS) libraries from the Google CDN. I like this because I can call google.load("jquery", "1"); and the latest jQuery 1.x.x will be used.
Now I have to pull the libraries locally because of security concerns.
I'm happy to pull them locally but I'm wondering what are some of the other benefits and pitfalls to watch out for?
I always use the CDN (Content Delivery Network) from Google. But just in case it's offline:
<script src="http://ajax.googleapis.com/ajax/libs/jquery/1.4.2/jquery.min.js"></script>
<script>!window.jQuery && document.write('<script src="jquery-1.4.2.min.js"><\/script>')</script>
Grab Google CDN's jQuery and fallback to local if necessary
Edit:
If you don't need to support IE6 and your site has partial https usage you can remove the http as well:
<script src="//ajax.googleapis.com/ajax/libs/jquery/1.4.2/jquery.min.js"></script>
The main benefit of having them on a CDN is that the files can be downloaded in parallel with files downloaded from your own website. This reduces latency on every page. The flip side is a pitfall of hosting locally - increased latency. The main reason for that is that browsers are limited in the number of connections they can make at the same time to the same domain. In IE6 this defaulted to 2 concurrent connections to the same domain - shared between all open windows of IE!! In IE8+ it improved, defaulting to 6, which is in line with FF/Chrome, but still, if you have a lot of images and you are not using sprites, you will experience heavy latency.
Using a CDN, I would always set the library version explicitly rather than getting the latest one. This reduces the risk of new versions breaking your code. Not very likely with jQuery, but possible.
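With the Google loader mentioned in the question, that just means passing a full version string instead of a floating major version (1.4.2 here is only an example):
<script src="http://www.google.com/jsapi"></script>
<script>
  // "1" would float to the newest 1.x.x release; pinning "1.4.2" keeps behaviour stable
  google.load("jquery", "1.4.2");
</script>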
The other main benefit of using a CDN is reduced traffic on your site. If you pay per GB or you are on a virtual server with limited resources, you might find that overall site performance increases and hosting costs come down when you farm off some of your content to a public CDN.
Make sure you also read the other answer to this question by @Xaver. This is a very good trick.
Others have covered the benefits. Pitfalls:
If you only include content from your own server, that's one server that needs to be running—and not blocked by firewalls etc—to make your site work. Pull script from a third party and now that's two servers that need to be running and unblocked to make your site work.
Any site you pull <script> from can completely control the user's experience on your site. If Google were feeling evil they could put something in their copy of jQuery to log your keypresses, steal personal information from the page you're on to tie into their web tracking database, make you post “I love Google!” comments to every form, and so on.
Google probably aren't actually going to do that, but it's a factor that's out of your control, and certainly something to worry about with other script-hosting services. There have been incidents before where stats scripts have been compromised with malware loaders.
Before including any script from a third party—even on one single page of your site—you must 100% trust them with all user-accessible functionality visible on that hostname (including web-facing admin functions).
Google CDN:
caching, good for performance, more users likely to have it already, and it downloads in parallel
if, heaven forbid, the CDN ever goes down, you're screwed
if a new version breaks your existing plugins or site, you'll know about it possibly too late
Locally:
development without being connected to the net is possible
can still get some performance benefits by gzipping, in addition to minifying
I prefer to use my local version, because I don't have control about what they will provide.
For example I don't want my users to get affected by google-analytics or anything similar, because this is a legal problem in my country.
Benefits: (Specifically for Google's CDN)
Downloads in parallel with your files. Other answers address this further
Google's Servers are likely to be able to physically deliver the content faster
Common libraries and frameworks might already be on the user's machine, as the HTTP cache for a CDN is universal across all sites
Your bandwidth wouldn't have to go towards serving large library files
Virtually every way you look at it, using Google's CDN is a good thing.
Performance will be improved (albeit fairly marginally, unless your site is really busy), and the amount of data your servers have to transmit will go down (although jQuery isn't exactly a massive thing to download), etc.
The only reason you wouldn't want to use it is if you don't trust Google. By using it, you are effectively giving Google an additional window of information into your site's traffic profile, including knowledge of URLs that you may otherwise not want to make public (e.g. secure areas of your site).
If you are paranoid about security then this may be enough to persuade you not to use them (after all, hosting it yourself isn't exactly going to slow your site down to a crawl), but in general most people would take the pragmatic view that Google knows enough about their site already that adding this won't make much difference.
Probably I'm in the minority nowadays, but I'd say that you don't want to use a CDN unless you really need to. The key factors for starting to use one are:
Cross-geo users. If you host your website in the US but have a noticeable share of European users, a CDN will improve their loading time.
A large number of users and/or heavy content, so that one main server is no longer enough. Think of any porn-video website (or Netflix, if you prefer). Video streaming is a heavy load; with a CDN there would be much, much less load on the main server.
But... the point is that these factors don't really apply to 90% of websites in the world. I bet you're not Facebook with millions of online users around the globe, and you're not Pornhub with hundreds of GB transferred every second.
If your website targets users in your own city or country and one server has enough capacity for your audience - why would you ever want a CDN? It's quicker for your local users and simpler for you to serve everything from your main server.
That was about CDNs in general; now let me get closer to the actual question about jQuery or any other library.
If you want your website to stay accessible and working without maintenance for, say, more than a year - host the files locally. Libraries nowadays are updated at a crazy pace, which you probably don't want to follow, and old versions eventually get deleted. Moreover, a whole library can die (probably not applicable to jQuery, though).
From my recent experience: I updated TinyMCE on a website I maintain from 3.x.x (dated 2012) to 5.x.x (dated spring 2019). That website had been working for 7 (seven!) years without any maintenance in this part of the logic. There was no "minifying" concept back then, and CDNs were not as common as they are now. But even if they had been common - you never know what will happen in 3, 5, or 10 years. Usually you want your website to stay alive even without you maintaining it, don't you? If you pull jQuery from a CDN today, that link may (and probably will) break in 5 years.
A solution with a CDN and a fallback to a local version, as @Xaver suggested, can be a good compromise. But... maybe just get rid of the CDN link? ;)
To me it really depends on how much control you want to have. If you are like me and need to develop on localhost while working and traveling, having the jQuery files locally is better than having them hosted on Google or elsewhere.
I'm looking for the pros/cons of pulling jQuery & other JS libraries from Google API's cloud as opposed to downloading files and deploying directly.
What say you?
My decision
The likelihood of the lib already cached on the users system is the overriding factor for me, so I'm going with a permalink to googleapis.com (e.g. ajax.googleapis.com/ajax/libs/…). I agree with others here that loss of access to the Google server cloud is a minimal concern.
Con
Users in countries embargoed by the U.S. (e.g. Iran) won't get a response from Google
Pros: It may already be cached on the user's system. Google has big pipes. You don't pay for the bandwidth.
Cons: You now have two different ways for your site to become unavailable: A service interruption on your server or one on Google's server.
I've been looking at the real-world performance of the Google loader for jQuery, particularly, and here's what I've found:
Google's servers are quick and plenty reliable.
They are serving from a CDN, which means if you have a lot of overseas users they'll get much better load times.
They are not serving gzipped files. So they're serving a lot more bytes than they need to.
If you know what you're doing in Apache, Lighttpd, or whatever you're serving files with, you could set your cache headers just like Google's and significantly reduce the amount of data your end user has to download by serving it from your own server. You could also combine your scripts at that point and reduce your overall HTTP requests.
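As a rough sketch of that idea (not the original poster's setup; paths and port are made up), a tiny Node.js server that serves a pre-combined bundle gzipped and with a year-long cache header might look like this:
var http = require('http');
var fs = require('fs');
var zlib = require('zlib');

// Read the pre-concatenated bundle once and compress it at startup
var bundle = fs.readFileSync('build/bundle.min.js');
var gzipped = zlib.gzipSync(bundle);

http.createServer(function (req, res) {
  if (req.url === '/js/bundle.min.js') {
    // A real server would check the client's Accept-Encoding header first
    res.writeHead(200, {
      'Content-Type': 'application/javascript',
      'Content-Encoding': 'gzip',
      // Far-future caching, similar to what Google's CDN sends
      'Cache-Control': 'public, max-age=31536000'
    });
    res.end(gzipped);
  } else {
    res.writeHead(404);
    res.end();
  }
}).listen(8080);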
Bottom line: Google's performance is good but not great. If you have many many overseas users then Google is probably better, if your users are mostly US-based and maximum performance is your concern, learn about caching, Etags, gzipping, etc. and serve it yourself.
Pros:
Google's connectivity is probably way better than yours
It's a free CDN (content distribution network)
Your webapp might load faster, since you're using a CDN
Cons:
If/when you need to optimize by repackaging a subset of that third-party JS library, you're on your own, and your webapp might then load slower
In addition to points made by others I'll point out two additional cons:
An additional external HTTP request, so assuming you have a Javascript file of your own (almost certain) that's two minimum instead of one minimum; and
IMHO, because the jQuery load is asynchronous, your entire page can render before the library has loaded, so the effects you apply on document ready are sometimes visibly noticeable to the user when they kick in. I think this is not a great user experience.
The pros are quite obvious and are covered in the other answers:
you save bandwidth
google is probably more reliable than your server
probably cached in most browsers (any stats on this?)
But the cons can be very tricky:
If your site runs over https and you reference the CDN copy over plain http, most browsers will show mixed-content warnings or block the script entirely, so you must use the https (or protocol-relative) CDN URL. This is a major consideration for https sites.
I think what would be cool to do is run A/B tests and see what the latency is to load minified version of jquery from Google's servers vs your server. Hopefully that'll put things into perspective. Chances are the Google server might be faster, but in terms of accepting responsibility of down time, nothing beats hosting it yourself.
Pro:
Google's Ajaxlibs offer very fine-grained "version control" for the included libraries. You can enforce a certain version (e.g. jQuery 1.3.2) or automatically request the latest version from a certain branch (e.g. the jQuery 1.3 series -> would currently deliver 1.3.2, but maybe soon 1.3.3).
The latter definitely has benefits: you'll profit from minor bugfixes and performance improvements without breaking your scripts/plugins.
Maintaining such a multi-library repository on your own can become quite resource-intensive.
Con:
When afraid of DNS poisoning, or when afraid that some public wireless network might not be trusted, then the non-SSL versions might actually not be served by Google at all, opening up drive-by installation of malware. (But: caching is set to be a full year, so even though many browsers will issue a If-Modified-Since request for cached content when hitting refresh, this might still be a theoretical issue as most users will already have cached the resources while using another network.)
When taking extreme care for your visitors' privacy, you might not want Google to record visits to your site by using their CDN. (Quite theoretical as well, as the same note on caching applies.)