What are the benefits to concatenating JS files? [duplicate]

For example, if you have
<body>
<script src="someLibrary.js"></script>
<script src="someLibrary2.js"></script>
<script src="someLibrary3.js"></script>
<script src="someLibrary4.js"></script>
<script src="myApp"></script>
</body>
What is the benefit, aside from prettiness in the HTML, of having all of those concatenated and minified by a task runner (Grunt/Gulp) before sending them to the client in the form of
<body>
<script src="allTheJavascripts.js"></script>
</body>

Combining multiple JS files into one file has the following benefits:
Browsers can download a single file more efficiently than multiple smaller files. One HTTP connection downloading one file is usually faster than many HTTP connections downloading smaller files.
The browser has a limit on how many simultaneous connections it will make to the same domain and, if it reaches that limit, some connections have to wait until others finish. This causes delays in download. Downloading fewer files makes it less likely to hit this limit. This limit applies to all connections to a domain (download of JS files, download of CSS files, download of frames, ajax calls, etc...).
Server scalability can be increased because each page download requires fewer http connections to serve the content.
There are cases where version control and the interaction between version upgrades and browsing JS file caching can be simpler with one larger JS file. When all your JS files are concatenated, you can assign a single version number to that combined JS file (like jQuery does with its versions). Then, any change to the JS anywhere causes a bump in the version number for the master combined file. Since a given browser gets the entire combined file all or nothing, there is never an opportunity for a browser to accidentally get one version of one file fresh from the server and another version of another file from a stale browser cache. Also, maintaining one master version number is a lot simpler than versioning lots of smaller files.
Minifying a JS file makes it smaller to download and parse, which improves download performance.
If you are both combining multiple files AND minifying, the minifying can be more effective. When minifying multiple small files separately, you cannot minify variable names that are shared between the different files - they must retain their original names. But, if you combine all the JS files and then minify, you can minify all symbols that are shared among the different JS files (as long as they aren't shared externally).
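For reference, here is a minimal sketch of the kind of Gulp task the question mentions, assuming the common gulp-concat and gulp-uglify plugins (file paths and names are illustrative):
// gulpfile.js - a sketch, not a definitive setup
const { src, dest } = require('gulp');
const concat = require('gulp-concat');
const uglify = require('gulp-uglify');

// Combine all source scripts into one file, then minify the result.
function scripts() {
  return src(['src/js/someLibrary*.js', 'src/js/myApp.js']) // illustrative paths
    .pipe(concat('allTheJavascripts.js'))
    .pipe(uglify())
    .pipe(dest('dist/js'));
}

exports.default = scripts;
Running gulp then emits dist/js/allTheJavascripts.js, which the single script tag above would reference.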
Obviously, there are some limits here and things don't get arbitrarily better if the whole world puts their JS into one file. Some things to think about when deciding what to package together into one file:
You don't want a large group of your pages to be parsing and executing a large block of code that they will not use. This is obviously a tradeoff because if the code is being effectively cached, then it's not so much a download issue, but rather just a runtime efficiency issue. Each use case will have to decide where to draw that tradeoff line.
You may not want to package code that is revised fairly regularly with code that hardly ever changes because this degrades the efficiency of browser caching if the large combined JS is always changing.
In a team environment with multiple projects sharing code, it is very important to think about packaging things into combined and minified chunks that work for the largest number of projects sharing the code. You generally want to optimize the packaging for the broader needs, not just for a single project.
Mobile access often involves smaller caches, slower CPUs and slower connections, so it's important to consider the needs of your most-accessed mobile pages in how you package things too.
And some downsides to combining and minifying:
Directly debugging the minified site can be quite difficult as many symbols have lost their meaningful names. I've often found it necessary to have a way of serving an unminified version of the site (or at least some files) for debugging/troubleshooting reasons. Source maps, sketched after this list, can also help here.
Error messages in browsers will refer to the combined/minified file, not to the actual source files, so it can be more difficult to track down which code is causing a given browser error that has been reported.
The combined and minified site has to be tested to make sure no issues were caused by these extra steps.
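One common mitigation for these debugging downsides is to emit a source map alongside the minified bundle, so browser devtools can map errors back to the original files and lines. A sketch extending the Gulp task above with the gulp-sourcemaps plugin (paths illustrative):
const { src, dest } = require('gulp');
const concat = require('gulp-concat');
const uglify = require('gulp-uglify');
const sourcemaps = require('gulp-sourcemaps');

function scripts() {
  return src('src/js/*.js')
    .pipe(sourcemaps.init())        // start recording original file/line positions
    .pipe(concat('allTheJavascripts.js'))
    .pipe(uglify())
    .pipe(sourcemaps.write('.'))    // emit allTheJavascripts.js.map next to the bundle
    .pipe(dest('dist/js'));
}

exports.default = scripts;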

Many browsers limit the number of concurrent HTTP requests to a particular domain. Concatenating the JavaScript files reduces the number of HTTP requests needed, allowing the files to be downloaded faster.
The same is true for CSS files.
Separately, such combined files are sometimes put through a minification process that produces functionally equivalent files that are smaller.
One downside is that, if any component file changes, the cache for the entire combined file must be invalidated and the combined file reloaded. This is a very small downside for most scenarios.
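A common way to manage that invalidation is to put a version number or content hash in the combined file's name, so the URL (and therefore the cache entry) changes whenever any component file changes. A sketch, with an illustrative hash:
<body>
<!-- the hash in the filename changes whenever any source file changes -->
<script src="allTheJavascripts.3f8a91.js"></script>
</body>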

Very simple:
Reduced latency (one file means one HTTP GET) traded for wasted bandwidth and unnecessary consumption of memory resources (scripts need to be loaded, parsed and executed even if they are not needed).
More difficult to debug (one import) vs. easier to read.
If your page is definitely going to use them all, then go ahead and load them as one. But that's a sweeping assumption that generally breaks down. As a general rule, a monolithic code base is bad.

Related

Referencing separate JS files vs one JS file

Which would result in greater speed/efficiency: Referencing one JavaScript file for all files in the directory OR referencing a different JavaScript file for each file in the directory?
So basically, referencing the same JavaScript file in all web pages vs a unique JavaScript file for every webpage.
Note: I thought that referencing the single file would be slower, as there is code in there that is irrelevant to some pages, thus running useless code and causing the page to run less efficiently.
There are tradeoffs involved so you may ultimately need to measure your specific circumstances to be sure. But, I'll explain some of the tradeoffs.
If you have giant amounts of data or giant amounts of code that are only used in one or a few pages, then you will probably want to separate that out into its own file just so you can ONLY load it, initialize it, and have it take memory when it's actually needed. But note that, with the amount of memory in modern computers (even phones these days), the data or code has to be pretty large to warrant a separate download.
Other than item 1, you pretty much always want to optimize for maximum caching efficiency. Retrieving a file (even a larger file than needed) from the cache is so massively faster than retrieving any file (even a small file) over the network that you really want to optimize for caching. And the time to retrieve these files generally dwarfs any JS parse time (CPUs are pretty fast these days), so triggering an extra download to save some JS parse time is unlikely to be faster.
The best way to optimize for caching is to have most of your pages reference the same common script files. Then, they get loaded once when the viewer first hits your site and all subsequent loads come right from the browser cache. This is ideal. This caching efficiency easily overcomes having some unused or untriggered code in the master file that is not used in some pages.
Lots of small downloads (even from the cache) are less efficient than one larger download. More separate requests generally just aren't as efficient for either the browser or the server. So combining JS files into larger concatenated files is generally a good thing.
There are limits to all of this. If you had completely separate code for 100 separate pages all concatenated together, and each piece of code would search the DOM for multiple page elements (and not find them 99% of the time), then that's probably not an efficient way to do things either. But usually you can make your shared code smarter than that by breaking things into categories based on a high-level class name. So, for example, based on the presence of a class name on the <body> tag, you would then run only part of the initialization code, skipping the rest because its class name is not present (as sketched below). So, when combining code, much of which won't be relevant on any given page, it's wise to be smart in how you decide what initialization code in the shared file to actually run.
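Here is a minimal sketch of that body-class pattern; the class names and initializers are illustrative:
// Run only the initialization code relevant to the current page,
// selected by class names on the <body> tag.
const initializers = {
  'page-checkout': function () { /* checkout-only setup */ },
  'page-gallery': function () { /* gallery-only setup */ }
};

document.addEventListener('DOMContentLoaded', function () {
  document.body.classList.forEach(function (cls) {
    if (initializers[cls]) initializers[cls]();
  });
});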
You need to measure for your specific case, as every site/page has its own balance between loading fewer files and loading extra unnecessary scripts (the same applies to CSS too).
Generally, a single file is faster over HTTP/1.x, as there are restrictions on the total number of parallel downloads; HTTP/2 should remove most of the difference.

RequireJS (AMD) working under the hood on minification of files into a single file

Require.js looks like a perfect solution for having module-based code. As mentioned on its website, it loads dependencies on demand (so it's faster). But when we use its r.js plugin to minify the code base into a single file, it loads the entire combined file.
So does that mean it is a normal async load and not an on-demand load? Or does r.js have some optimization technique internally to do this efficiently?
Loading individual JS modules on demand is a bit of a red herring, because it's often a better idea to put all your code into a single minified bundle even if most users will only use a subset of it. GZipping your resources will reduce the size of your scripts by a further 33%-50% (at least), so you may end up with a total that's smaller than a single JPEG file (depending on how big your project is, of course).
When loading individual files suddenly the network itself becomes the biggest bottleneck - the browser limits the number of parallel downloads (so you end up loading files sequentially) and there's the connection negotiation overhead for each resource. Together with the fact that it complicates r.js configuration I'd say on-demand loading should be considered only when the modules/libraries are really big and are only used by a tiny fraction of users.
The real benefit of AMD modules isn't actually asynchronous loading but code organisation - with neatly organised modules with well-defined dependencies it's much harder to write bad or untestable code.
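For context, a minimal r.js build profile that produces the single combined, minified file looks roughly like this (paths and module names are illustrative):
// build.js - a sketch of an r.js optimizer profile
({
  baseUrl: "src",            // where the AMD modules live
  name: "main",              // entry module
  out: "dist/main-built.js", // the single combined, minified output
  optimize: "uglify2"        // minify with UglifyJS2
})
Run it with node r.js -o build.js; the page then loads dist/main-built.js instead of the individual modules.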

Optimising js for production - Lots of small or one large js file

I have an AngularJS app, which I am looking to optimise for speed.
I am currently uglifying and concatenating all my bower_components which I require into a vendor.js file.
I am currently uglifying and concatenating all my custom js into a scripts.js file.
As such, when a user downloads a page, there are very few resources in the GET requests (presently 6 in total without image assets). The disadvantage is that I have two large-ish JS documents to download, about half a megabyte in total, and the entire download has to finish before any page rendering can be done.
My main concern is with the vendor.js file. Is it better to use cdn provided, minified javascript files (approx 10 in total), or is it better to use my concatenated & uglified vendor.js?
The former would mean that the total resources would increase to 16 without image assets; however, they would be served by different vendor-provided CDNs, allowing parallel downloading.
Even though the former would allow parallel downloading, it means 16 different TCP connections with HTTP on top, so 16 sets of HTTP headers are sent too, and opening new TCP connections is somewhat costly. So I think it might be better to agglomerate everything into one file.
What you could do anyway is test both cases and look at the network panel of your favourite web browser's developer tools to see which is faster on average. It's at least doable with Firefox.
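Beyond eyeballing the network panel, you could also measure script download times in the page itself with the Resource Timing API (a sketch):
// Log how long each script resource took to download
window.addEventListener('load', function () {
  performance.getEntriesByType('resource').forEach(function (entry) {
    if (entry.initiatorType === 'script') {
      console.log(entry.name, Math.round(entry.duration) + ' ms');
    }
  });
});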

Bundling .js files vs CDN

In order to improve performance of our web pages, we are recommended to use CDNs to serve .js files on our web pages. That makes sense.
Also, we are recommended to bundle our .js files in order to reduce the number of requests which are being made to server on load.
So, we need to sit down and make decision between if we use CDN or bundle .js files.
What are the pros and cons? Which ones make more sense?
Why can't you bundle them and place them on the CDN? It hardly needs to be a decision of one or the other.
If you have to choose one or the other, it depends on how many .js files you are including. For a small number of files, I'd suggest that a CDN would be quicker, whereas for a greater number of files, a bundle of .js files would definitely be quicker. Where the switch-over is, is something for you to experiment with.
My answer: both. Bundle them and place them on a CDN.
The downside of doing this? It depends. What does your build process look like? Can you easily automate the bundling and minification? Are you using Yahoo YUI or Google Closure or something else?
Also, if there is a lot of GUI dependent jQuery there might be some time consuming friction due to constantly changing elements/effects/css.
Testing is important too, because of possible minification quirks.
Bottom line: 5 javascript files safely bundled into 1 file === 4 fewer requests.
A page with just plain old HTML and one external JavaScript reference === 2 requests to your server. However, a page with just plain old HTML and one external JavaScript reference on a CDN === 1 request to your server.
Currently we are using the Google Closure tools. The Google Closure Inspector helps with the following:
Closure Compiler modifies your original JavaScript code and produces code that's smaller and more efficient than the original, but harder to read and debug. Closure Inspector helps by providing a source mapping feature, which identifies the line of original source code that corresponds to the compiled code.
As others have already stated, the answer is both if possible. Bundling (and minifying) benefits your users because it decreases the page weight. The CDN benefits your servers because you are offloading work. Generally speaking, you need not optimize either unless you have observed performance issues or you just have nothing better to do.
There are a few things you need to think about...
How much of the JS do you need to load early in the page load, and how much can you delay until later?
If you can delay loading JS (e.g. put it at the bottom of the page) or load it asynchronously as Google Analytics does, then you will minimise the amount of time that downloading the JS spends blocking the UI thread.
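For example, marking a script deferred or asynchronous keeps its download from blocking rendering (a sketch; the analytics file name is illustrative):
<!-- downloads in parallel, executes after the document is parsed -->
<script src="allTheJavascripts.js" defer></script>
<!-- downloads and executes independently, like analytics snippets -->
<script src="analytics.js" async></script>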
After working out how the load of the JS can be split, I'd deal with the merge / minify of the various JS files - cutting down HTTP requests is key to improving performance.
Then look at moving to the CDN and ensure the CDN can serve the JS content compressed and allows you to set headers so it's "cached forever" (you'll need to version the files if you cache forever). A CDN helps reduce latency but will also reduce size by being cookieless.
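For illustration, here is a sketch of those "cached forever" headers using an Express static server; this assumes the filenames are versioned (e.g. content-hashed), since the immutable directive tells browsers never to revalidate:
const express = require('express');
const app = express();

// Serve versioned static assets with long-lived cache headers.
app.use('/static', express.static('dist', {
  maxAge: '1y',    // Cache-Control: max-age=31536000
  immutable: true  // browsers skip revalidation; safe only with versioned filenames
}));

app.listen(3000);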
Another thing you might want to consider is setting up a separate domain for static content, pointing it to your server(s) while you sort things out, and then switching to a CDN if it looks worthwhile.
Andy

Multiple JavaScript files, combine into one

I am developing in ASP.NET MVC and using multiple JavaScript files. E.g. jQuery, jQuery UI, Google Maps, my own JavaScript files, etc.
For performance, should I combine these into one? If so, how?
The reason you want to combine many files into one is to minimize the latency of setting up and tearing down HTTP requests; the fewer you make, the better. However, many newer browsers download JavaScript files in parallel (while still executing them sequentially). The consequence is that downloading a single 1 MB file may be slower than downloading three 350 KB files. What I've come to do is separate my files into three bundles:
External lib files (jquery, flot, plugins)
Internal lib files (shared by multiple pages)
Page specific files (used only by that page, maybe by two pages)
This way, I get the best of both worlds: not an excessive number of HTTP requests at startup, but also not a single file that can't benefit from parallel downloads.
The same applies to CSS: each page loads three CSS bundles. So in total, our pages download six bundled files plus the main HTML file. I find this to be a good compromise. You may find that a different grouping of files works better for you, but my advice is: don't create a single bundle unless it's a one-page app. If you find yourself putting the same file into different bundles a lot, it's time to re-think the bundling strategy, since you're downloading the same content multiple times.
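A sketch of that three-bundle split using Gulp (globs and bundle names are illustrative):
const { src, dest, parallel } = require('gulp');
const concat = require('gulp-concat');
const uglify = require('gulp-uglify');

// Build one concatenated, minified bundle from a set of globs.
const bundle = (globs, name) => () =>
  src(globs).pipe(concat(name)).pipe(uglify()).pipe(dest('dist/js'));

exports.default = parallel(
  bundle('vendor/**/*.js', 'external.js'),  // external lib files
  bundle('shared/**/*.js', 'internal.js'),  // internal lib files shared by pages
  bundle('pages/home/*.js', 'home.js')      // page-specific files
);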
What to use? Martijn's suggestions are on the money. YUI is the most widely used from my experience, that's what we used at my previous and current jobs.
For the question of whether you should, check out the link in Shoban’s comment.
For the question of how:
Google’s Closure Compiler
Yahoo!’s YUI Compressor
If they are all going to be included on all of your pages, then it doesn't really make a difference. But if some are only relevant to certain pages, it would technically be better to keep them separated and only include the relevant ones on relevant pages.
As far as I know, you should indeed: fewer files means fewer HTTP GETs, hence better performance for the user when they first load the page.
So they will save a split second when they come to your page for the first time. But after that, these files are cached, and it then makes no difference at all...
I haven't dug into the JavaScript engines themselves, but a function in one file will be handled the same way whether it is in a big file or a small file. So it makes no difference in execution.
So, save your time, don't change anything, as it'll cost you too much time for too little reward, especially when you discover that you want the latest version of jQuery (a new version came out today btw) and that you have to re-concatenate everything...
