I'm having difficulty deciding which approach is better for site performance: put all the required jQuery plugins in one file that is included on every page of the site, or split the plugins into individual files and use the jQuery.getScript() method to load them as and when required.
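For reference, the lazy-loading version I have in mind would look something like this (the plugin file and init call are placeholders):

// load a plugin only when the page actually needs it
jQuery.getScript('/js/plugins/jquery.carousel.min.js', function () {
    // the script has loaded and executed, so it's safe to use the plugin now
    jQuery('#gallery').carousel();
});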
Is there any real benefit to loading the scripts asynchronously compared with one HTTP request?
All my JavaScript will be minified and gzipped.
Thanks!
It's not so simple; it depends on how the JavaScript is distributed across your site. Have a look at this question: Which is better for JavaScript load-time: Compress all in one big file or load all asynchronously?
From my point of view, the best solution so far is ControlJS.
Read the complete post: http://www.stevesouders.com/blog/2010/12/15/controljs-part-1/
One request will be better for performance. Period. The only downside is that every time one of the files changes, the whole thing changes (and has to be downloaded again). Plugins won't change much, so I'd put everything (as much as possible) into one file.
Put the jQuery core in that file as well, along with your custom JavaScript. Just make sure it's all in the right order :)
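If it helps, the concatenation step can be a few lines of Node (the file names here are just an example; note jQuery core comes first):

// build.js: concatenate scripts in dependency order into one file
var fs = require('fs');
var files = ['jquery.min.js', 'jquery.plugin.min.js', 'site.js'];
var combined = files.map(function (name) {
    return fs.readFileSync(name, 'utf8');
}).join(';\n'); // the ';' guards against scripts that omit a trailing semicolon
fs.writeFileSync('all.js', combined);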
Try the YSlow plugin for Firefox and test your different setups.
That said, minifying your JS into one file is an easy approach with great results. You get a single file, and it's often much smaller than the sum of its parts.
Related
A typical website consists of one index.html file and a bunch of JavaScript and CSS files. To improve the performance of the website, one can:
Minify the JavaScript and CSS files to reduce the file sizes.
Concatenate the JavaScript files into one file, and likewise for the CSS files, to reduce the number of requests to the server. For commonly used (and shared) libraries like jQuery it makes sense to leave them external, allowing the browser to cache the library and reuse it across different web applications.
I'm wondering if it makes sense to go further and inline the concatenated JavaScript and CSS in one single HTML file, reducing the number of requests even more. Will this improve the performance of the site? Or will it backfire, making it impossible for the browser to cache anything?
Concatenating your CSS and JS files into one file will reduce the number of requests and make the page load faster. But as commented, it won't make much sense unless you have a one-page site and the load time of that page is very critical. So you're better off keeping CSS separate from JavaScript, in my opinion.
Here's a book where you can learn more about the topic:
High Performance Web Sites
This tool may help you.
It turns your web page into a single HTML file with everything inlined, which is perfect for appcache manifests on mobile devices where you want to cut down those HTTP requests.
https://github.com/remy/inliner
It would cut down on the number of requests, but it would also mean no caching of those resources for use on other pages. Think of an external file as a way of telling the browser "this part of the site is reusable". You'd be taking that ability away, so the CSS and JS would have to be downloaded again with every page. As jackwanders said, it's great if you only have one page.
This is not a good idea for the following reasons:
You will not enjoy the benefit of caching
You will load unneeded resources on all of your pages
You will have a hard time developing your website because of large files full of unrelated code
If you work in a team, you and your teammates will always be editing the same files, which means a lot of merge conflicts.
You can have a single CSS file for all your pages; since it will be cached, subsequent pages will pull it from the cache without an extra request.
Putting all the JavaScript into one file, however, is contextual.
Most probably you are using libraries like jQuery plus various plugins, and merging them might throw up conflicts between plugins. So before you combine everything at once, try merging a few files at a time and check whether errors pop up.
To improve the performance of our web pages, we are advised to use CDNs to serve .js files. That makes sense.
We are also advised to bundle our .js files to reduce the number of requests made to the server on load.
So we need to sit down and decide between using a CDN and bundling our .js files.
What are the pros and cons? Which one makes more sense?
Why can't you bundle them and place them on the CDN? It should hardly be a decision of one or the other.
If you do have to choose, it depends on how many .js files you are including. For a small number of files, I'd suggest that a CDN would be quicker, whereas for a greater number of files, a bundle of .js files would definitely be quicker. Where the switch-over point lies is something for you to experiment with.
My answer: both. Bundle them and place them on a CDN.
The downside of doing this? It depends. What does your build process look like? Can you easily automate the bundling and minification? Are you using Yahoo YUI, Google Closure, or something else?
Also, if there is a lot of GUI-dependent jQuery, there might be some time-consuming friction due to constantly changing elements/effects/CSS.
Testing is important too, because of possible minification quirks.
Bottom line: 5 JavaScript files safely bundled into 1 file === 4 fewer requests.
A page with just plain old HTML and one external JavaScript reference === 2 requests to your server. A page with the same HTML and one external JavaScript reference on a CDN === 1 request to your server.
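To make the arithmetic concrete, a hypothetical before and after (the CDN host and file names are made up):

<!-- before: 5 separate requests to your server -->
<script type="text/javascript" src="/js/jquery.js"></script>
<script type="text/javascript" src="/js/plugin-a.js"></script>
<script type="text/javascript" src="/js/plugin-b.js"></script>
<script type="text/javascript" src="/js/plugin-c.js"></script>
<script type="text/javascript" src="/js/site.js"></script>
<!-- after: 1 bundled request, and 0 to your server if it lives on a CDN -->
<script type="text/javascript" src="//cdn.example.com/js/all.min.js"></script>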
Currently we are using the Google Closure tools. The Google Closure Inspector helps with the following:
Closure Compiler modifies your original JavaScript code and produces code that's smaller and more efficient than the original, but harder to read and debug. Closure Inspector helps by providing a source mapping feature, which identifies the line of original source code that corresponds to the compiled code.
As others have already stated, the answer is both if possible. Bundling (and minifying) benefits your users because it decreases page weight. The CDN benefits your servers because you are offloading work. Generally speaking, you need not optimize either unless you have observed performance issues or you just have nothing better to do.
There are a few things you need to think about...
How much of the JS do you need to load early in the page load, and how much can you delay until later?
If you can delay loading JS (e.g. put it at the bottom of the page) or load it asynchronously as Google Analytics does, then you will minimise the amount of time downloading the JS spends blocking the UI thread.
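The Google Analytics style of asynchronous loading looks roughly like this (the file path is a placeholder):

(function () {
    // create the script element and let it download without blocking the UI thread
    var s = document.createElement('script');
    s.src = '/js/non-critical.js';
    s.async = true;
    // insert it before the first existing script tag
    var first = document.getElementsByTagName('script')[0];
    first.parentNode.insertBefore(s, first);
}());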
After working out how the load of the JS can be split, I'd deal with the merge / minify of the various JS files - cutting down HTTP requests is key to improving performance.
Then look at moving to a CDN, and make sure the CDN can serve the JS content compressed and allows you to set headers so it's "cached forever" (you'll need to version the files if you cache forever). A CDN helps reduce latency, and serving from a cookieless domain also trims request size.
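As a sketch of the "cache forever" idea while the files are still on your own server, here's how it might look with Node/Express (the paths, version number and cache lifetime are all assumptions):

var express = require('express');
var app = express();
// serve e.g. /static/all.v42.js; bump the version in the file name on each release,
// so the year-long cache lifetime below stays safe
app.use('/static', express.static(__dirname + '/public', { maxAge: 365 * 24 * 60 * 60 * 1000 }));
app.listen(3000);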
Another thing you might want to consider is setting up a separate domain for static content; point it at your server(s) while you sort things out, and then switch to a CDN if it looks worthwhile.
Andy
So I know it's best to have one JavaScript file for an entire site to limit HTTP requests, but obviously only some of that JavaScript is required on some pages. What is the best way of running only the JavaScript required for the current page?
E.g.
if (page == 'home') {
    // run the JavaScript required for the home page
}
Maybe this isn't an issue; if the targeted elements aren't found on the page, will the JavaScript just fail gracefully? I would just like to know the best practice for structuring this JavaScript.
Encapsulate your logic in functions. Then just call the function(s) you need in each page, either via "onload" or an embedded function call in the page:
<script type="text/javascript">
yourFunctionForThisPage();
</script>
Edit: just to clarify, my answer assumes the (implied) constraint of a single .js file. As others have pointed out, although you save on HTTP requests, this is not necessarily a good idea: the browser still has to parse all the code in the file for each page, whether it's used or not. To be honest, it's pretty unusual to have a global site-wide JS resource with everything in it. It's probably a much better idea to logically split your JS into various files, i.e. libraries. These libraries could be page-based (specific code for a particular page) or algorithm/task-based libraries that you include in whatever pages need them.
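For example, one common pattern for running only a page's own code is to dispatch on a marker like a body class (the page names and init functions here are invented):

// each page gets <body class="home">, <body class="contact">, etc.
var inits = {
    home: function () { /* home page setup */ },
    contact: function () { /* contact page setup */ }
};
var page = document.body.className;
if (inits[page]) {
    inits[page]();
}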
Is this feasible?
While it is best to have just a single JavaScript file per page to lower the number of requests, it may not be feasible, especially the way you'd like to do it.
If you're asking how to join the scripts of various pages into a single script and then run only the parts related to a particular page, that's something you shouldn't do. What good is one huge file with lots of scripts (think of maintainability, too) compared to a few short, focused scripts? If you keep the number of scripts low (i.e. below 10) you shouldn't be too worried.
The big downside is also that the browser will load the complete script file, which means more time to parse it and a lot more resources consumed to use it. I'd strongly advise against this technique even though it may look interesting...
Other possibilities
The thing is to keep the number of JavaScript files per page low. Depending on the server-side technology you're using, there are tools that can combine multiple script files into one, so every page requests a single script file that combines exactly the scripts that particular page uses (see the sketch after the list below). There is a bit of overhead on the server to accomplish this, but there will be just one script request.
What do you gain?
every page only has scripts that it needs
individual script files are smaller hence easier to maintain
script size per request is small
browser parsing and resource consumption is kept low
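A toy version of such a server-side combiner, sketched in Node (the whitelist, directory and URL scheme are all invented for illustration):

var http = require('http');
var fs = require('fs');
var url = require('url');
// only whitelisted files may be combined
var allowed = { 'jquery.min.js': true, 'home.js': true, 'gallery.js': true };

http.createServer(function (req, res) {
    // e.g. GET /combine?files=jquery.min.js,home.js
    var names = (url.parse(req.url, true).query.files || '').split(',');
    var body = names.filter(function (n) { return allowed[n]; })
                    .map(function (n) { return fs.readFileSync('js/' + n, 'utf8'); })
                    .join(';\n');
    res.writeHead(200, { 'Content-Type': 'application/javascript' });
    res.end(body);
}).listen(8080);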
Know what you will need on the page and use a script loader like LABjs.
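With LABjs that might look like this (the file names are placeholders; .wait() preserves execution order where it matters):

$LAB
    .script('/js/jquery.min.js').wait()   // plugins depend on jQuery, so wait here
    .script('/js/jquery.plugin.min.js')
    .script('/js/page.js')
    .wait(function () {
        // everything has loaded and executed
    });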
Also, remember that your specific case might differ from what others have found, so you might want to run some tests to verify whether, for example, having 5 little files is better (or worse) than 1 big file.
The only way to be sure is to test different options yourself and come up with a fitting solution.
So I have a web app with multiple JS files (jQuery, my own JS code, and more). Say I have a page named index.html. What would be the best practice to include/preload my JS files? I was thinking about creating a separate JS file that does the preloading (includes all the other scripts and calls jQuery.noConflict()). What do you guys suggest? Is this possible? How would you implement it?
Thanks!
In general, combine your script files into one file (and minify or compress them, or even compile them, though note that this last item is not zero-impact; there are pain points). See notes here and here. Basically, one of the first guidelines you'll see for a good fast page load is "minimize HTTP requests." So you don't want six separate script tags where you could have one.
For popular scripts, though, you may benefit from serving them from Google's CDN. Google is kind enough to host most popular JavaScript libraries on its CDN for free. The advantage is not only that the CDN will be fairly fast, but also that the target user's browser may well have a cached copy of the script you want, even if they've never been to your site before.
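Using it is a one-liner; pin whichever version you actually target:

<script type="text/javascript" src="//ajax.googleapis.com/ajax/libs/jquery/1.7.2/jquery.min.js"></script>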
Check out RequireJS, a smart and robust script loader for JavaScript. It's designed to work well with jQuery and comes with an optimization tool to combine all of your scripts into one.
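A minimal RequireJS setup might look like this (the paths config is an assumption about where you keep your files):

// main.js
require.config({
    // map the 'jquery' module id to libs/jquery.min.js
    paths: { jquery: 'libs/jquery.min' }
});
require(['jquery'], function ($) {
    $(function () {
        // DOM is ready and jQuery has loaded
    });
});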
The best way is to minify all the JS files and combine them into one script. This means less work for the browser, as it doesn't have to make multiple requests to the server.
If you are going to load everything up at the same time, you could put it all into a single compressed file.
What are the pros of using an external JavaScript file? I just can't figure it out; I see big websites using them all over instead of server-side includes. Is it just for caching?
If it's a matter of clean code and separation of concerns, you can still include it server-side in the HTML. For example, I use Smarty, and I can just include the file with {include file='javascript.js'} inside <script></script> tags.
If it's for performance, I can't see anything other than an extra HTTP request making the external file slower. I'm sure I must be missing something, because all the big websites still do this.
Is it because the file gets cached? My JavaScript is dynamic and shouldn't be cached anyway.
Could someone help me make the right decision about what to do with my JavaScript files?
PS: can a 1.5K user create a tag for external-javascript?
The most important reason is that the file gets cached by the browser. The fewer bytes that need to be sent from the server, the better; this is a big part of web performance.
Second to that, it provides modularity.
I'm not sure why your JavaScript is dynamic, but I suggest you rewrite it in a way that removes that need; it might become an issue for you down the road.
In your case, where there is no caching because the entire JavaScript file is generated dynamically, inline is probably superior: it saves you the HTTP overhead.
Source: http://developer.yahoo.com/performance/rules.html#external
They also help developers separate different conceptual areas of their code. It can get really annoying looking at hundreds to thousands of lines of JS in a single file, on top of complicated HTML.
Besides what @Gabriel said, it also lets you reuse the same function on different pages without making each .html document larger.