Tools for JS and CSS file concatenation - javascript

I recently started working with a large code base with many (15-20) js requests per page. I'm tasked with optimization and performance improvement of these sites.
I've been using tools like Google's PageSpeed and Yahoo's YSlow in conjunction with WebPageTest.org's tests to determine a baseline speed of the site and areas for improvement. I'm curious whether there are standard or best-practice solutions for concatenating and minifying JS and CSS files.
I watched http://www.youtube.com/watch?v=30_AIEhar-I and the first 20 minutes do a good job of making the case for mod_pagespeed as a good target.
I'm currently considering mod_pagespeed with YUI Compressor and perhaps a sprite generator on top of it all.
What are some good tools that I may have missed or things that I should be concerned about with my current build?
Edit: It should be noted that this is one page of many (possibly hundreds), and the site receives a new build every two weeks, so being able to automate this concatenation and minification is a must; I can't just do it once and call it good.
Edit 7/30/2012 -
I spent some time looking at different tools. It's hard to say which ones are best, but at this time not very many people use mod_pagespeed.
Closure is certainly more widely used, but even that is lacking. It seems the optimal way to do this is to just use a build plugin with YUI Compressor.
Other places suggest Packer, but it seems many believe the smaller file sizes are offset by the need to unpack the scripts on the client machine. This Stack Overflow response is a good read regarding these types of tools.

Google's Closure Compiler is quite nice for concatenating and minifying JavaScript. It has the added bonus of linting your code when you compile, it removes dead code, and it can also perform compile-time type checking if you include type hints in docblocks.
In certain cases, the dead code removal feature gives Closure a huge advantage over other minifiers... for example, think of cases where you include a library, but only use about 10% of the functionality. The other 90% can be removed if you compress the library along with the rest of the project.
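As a minimal sketch of that (the file name is made up; the flags are the standard Closure Compiler ones):

```
// math.js -- input to the Closure Compiler. A typical invocation would be:
//   java -jar compiler.jar --compilation_level ADVANCED_OPTIMIZATIONS \
//        --js math.js --js_output_file math.min.js

/**
 * Type hints in the docblock let Closure type-check calls at compile time.
 * @param {number} a
 * @param {number} b
 * @return {number}
 */
function add(a, b) {
  return a + b;
}

/** Never referenced, so ADVANCED mode strips it from the output entirely. */
function unusedHelper(s) {
  return s.toUpperCase();
}

alert(add(1, 2)); // in advanced mode this compiles down to roughly alert(3)
```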
As for CSS, YUI Compressor is probably your best bet if you want something fancy. Otherwise, you could just concatenate the files together using cat and take the hit of a few extra bytes of whitespace.
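And since you need this to run on every build, even the cat approach is easy to automate; a small Node sketch that does the same thing (file names are illustrative):

```
// concat-css.js -- the "just concatenate" approach as a repeatable build step.
var fs = require("fs");

// Order matters in CSS, so list the files explicitly.
var files = ["reset.css", "layout.css", "theme.css"];

var bundle = files.map(function (name) {
  return fs.readFileSync(name, "utf8");
}).join("\n");

fs.writeFileSync("all.css", bundle);
console.log("Wrote all.css (" + bundle.length + " bytes)");
```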

Related

JavaScript dead code elimination

I'm looking for a way to reduce a JavaScript library/framework with the objective of sending to the client (browser) only the necessary code. I've searched and found that, for example, the Closure Compiler can't remove jQuery dead code.
How can I write a web application, using JavaScript or any other languages/frameworks/tools, to achieve this goal? I have nothing against frameworks, but on mobile devices performance counts a lot. I've tested HTML5 mobile apps on cheap Android devices, and they're terrible to use due to performance issues. I want to extract maximum performance.
My 2 cents:
In terms of performance, follow good practices with a linter such as JSLint, JSHint, or ESLint.
To check which approach is faster than another, you can look at jsPerf or write your own benchmarks with benchmark.js (see the sketch at the end of this answer).
For loading time, jQuery is a bit big for mobile. Zepto is a good alternative. IMO, it's better to concatenate all JS files on mobile than to load them asynchronously.
MicroJs is a site that lists small libraries, each of which fits a specific need.
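As promised above, here is roughly what a benchmark.js suite looks like; the two test cases are the stock example from its documentation:

```
// Assumes Node with benchmark installed: npm install benchmark
var Benchmark = require("benchmark");
var suite = new Benchmark.Suite();

suite
  .add("String#indexOf", function () {
    "Hello World!".indexOf("o") > -1;
  })
  .add("RegExp#test", function () {
    /o/.test("Hello World!");
  })
  .on("cycle", function (event) {
    console.log(String(event.target)); // e.g. "String#indexOf x 12,345 ops/sec"
  })
  .on("complete", function () {
    console.log("Fastest is " + this.filter("fastest").map("name"));
  })
  .run();
```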

What Are The Cons (Ill-Effects) Of HTML, JavaScript and CSS Minification?

First, I see that Google suggests minifying everything (HTML, JS and CSS) to increase performance.
But I doubt it does only good and no bad, especially because many popular websites haven't enabled even HTML minification (and some haven't enabled JS and CSS minification either).
So, can someone knowledgeable please enlighten me on the ill-effects / cons of enabling the following on a website:
HTML minification
JS minification
CSS minification
For example, I heard that HTML minification could cause issues with Google Analytics and AdSense (or any ad) code in the page. Is that true?
If done right, minification can be completely without side effects. The thing is, it’s not that easy to get things right.
For example, Google's JS compiler, Closure, generally works fine but breaks more complex scripts. There is always a tradeoff: better compression at the cost of less compatibility, or the other way around.
Also, by enabling gzip you achieve somewhat more compression than minification does, all without touching your code; the tradeoff is that it burns server CPU.
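If you want to see the difference for yourself, a quick sketch using Node's built-in zlib (point it at whatever bundle you already serve):

```
// Compare raw vs. gzipped size of an existing script file.
var fs = require("fs");
var zlib = require("zlib");

var source = fs.readFileSync("app.js"); // any JS/CSS file you serve
var gzipped = zlib.gzipSync(source);    // what the server does per request

console.log("raw:     " + source.length + " bytes");
console.log("gzipped: " + gzipped.length + " bytes");
```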
Bottom line: if you're not sure you need minification, you probably don't.
Deployment will be more complex and error-prone. Where you could previously just upload your whole code, you now have to run it through the compilers first.
Development becomes more complex; developers will probably have to install the minification tools.
Debugging becomes more complex; you now have to use a utility to find the original line an error came from.
Minifiers can have bugs and introduce erroneous code (which, of course, can cause various issues).
For large software projects, the first three criteria are all but irrelevant, and bugs in minifiers can be mitigated by careful (automated) testing. For a small/personal project, you probably don't need minification unless bandwidth or website performance is an issue.

Properly managing javascripts for large website / code concatenation

The site I am developing has a large amount of javascript that is shared across various functionality, and an equally large amount of feature-specific javascript. I've read all about using one monolithic javascript file vs. many smaller ones.
For my purposes, the huge-file approach would not only result in a script that is difficult to maintain, but would also contain a lot of unneeded JavaScript. At the same time, separating the JavaScript so that only the required code is included would result in an excessive number of files / HTTP requests. The idea of including even a moderate amount of unneeded code seems contrary to the concepts of proper software design, besides the additional file-size overhead for the user.
I have found the mod_concat module for Apache, which seems like it would solve my problem entirely: I could separate my JavaScript into as many files as I want, include only those necessary, and take almost no performance hit.
Is this actually the case? Is the only potential drawback the need to manage many files? I know mod_concat has not been around forever, so I'm also looking for a bit of background on a) how this was handled before, and b) whether, even with code concatenation, including a moderate amount of unneeded JavaScript is considered acceptable (or even a best practice).
Thanks, Brian
I don't think you need an Apache module for that. Creating one minified JS file for production should be the best way to go, because it is only loaded once and then cached by the browser. For development, of course, it makes sense to have your application split into separate files.
My personal favorite for JavaScript module management and compression is Steal JS, which is part of the great JavaScript MVC framework (generally interesting for larger JS applications). It can load module files dynamically during development, and for production you can create one compressed JavaScript file (it can do CSS, too).
Another alternative is RequireJS but I only had a quick look at it.

Which JavaScript framework is generally used for high performance websites?

There are different JavaScript frameworks like jQuery, Dojo, MooTools, Google Web Toolkit (GWT), YUI, etc.
Which of these is suitable for high-performance websites?
(Full disclosure: I am a Dojo developer, and this is my unofficial perspective.)
All major libraries can be used in high load scenarios. There are several things to consider:
Initial load
The initial load affects your response time: from requesting a web page to being responsive and in working mode. Trivial things to do are:
concatenate several JavaScript files together (works for CSS files too)
minify and/or compress your JavaScript
The idea is to send less — good for the server, and good for the client.
The less trivial thing to do:
structure your program so that it is operational before all modules are loaded
Example of the latter: divide your modules into essential (e.g., the core logic), and non-essential (e.g., helpers: tooltips, hints, verifiers, help facilities, various "gradual enhancers", and so on). The idea is that frequently there are things which are not important for frequent users, but nice for casual users ⇒ they can be delayed.
We can load essential modules first and load the rest asynchronously. Example: if a user wants to edit an object, we need to show it first; after that we have several hundred milliseconds to load the rest: lookup tables, hints, and so on.
Obviously it helps when asynchronous loading of modules is supported by the framework you use. Dojo has this facility built-in.
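Without framework support, the same idea is a few lines of DOM scripting; a minimal sketch (the module URLs are illustrative):

```
// Inject a non-essential script once the core UI is already usable.
function loadScript(url, callback) {
  var script = document.createElement("script");
  script.src = url;
  if (callback) {
    script.onload = callback;
  }
  document.getElementsByTagName("head")[0].appendChild(script);
}

// Core modules are in the page already; fetch the enhancers afterwards.
window.onload = function () {
  loadScript("/js/tooltips.js");
  loadScript("/js/validators.js", function () {
    // wire up verification only after the module has arrived
  });
};
```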
Distribute files
Everybody knows that due to browser restrictions on the number of parallel downloads from the same host, it is beneficial to load resources (images, CSS, JavaScript) from different domains:
we can download more in parallel, if the user's connection has enough bandwidth (these days it almost always does)
we can set up web servers optimized for serving static files: huge disk cache, small workers, keep-alive, async serving, and so on
we can remove all unnecessary features we don't need when serving static files: sessions, cookies, and so on
One frequently overlooked optimization in JavaScript applications is to use a CDN:
your web site can benefit from the geographical distribution of CDN (files can be served from the closest/fastest server)
the user may already have the required files in her cache, if another application used them from the same CDN
intermediate/corporate caches increase the probability that required files are already cached
last but not least: these are files that your servers don't have to serve at all (think about it)
Again, Dojo has supported CDNs for a long time and is distributed publicly by the AOL CDN and the Google CDN; the latter carries practically all popular JavaScript toolkits too. Obviously you can create your own CDN, and your very own CDN- and app-specific Dojo build, if you feel you need it; it is trivial and well documented.
Communication bandwidth
How can that be different for different toolkits? XHR is XHR.
You need to reduce the load on your servers as much as possible. Analyze all traffic and consider how much static/immutable stuff is sent over the pipe. For example, typically a lot of HTML is redundant across several pages: a header, a footer, a menu, and so on. Do you really need all of these to be sent over every time?
One obvious solution is to move from static HTML plus "gradual enhancements" with JavaScript to real "one page" JavaScript applications. Again, this is frequently overlooked, but it is one of the most rewarding optimizations.
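In miniature, the shift looks like this: instead of reloading a whole page (header, footer, menu and all), request only the data and render it on the client. A small sketch (the URL and markup are illustrative):

```
// Fetch just the payload; the static chrome around it never travels again.
var xhr = new XMLHttpRequest();
xhr.open("GET", "/api/orders?page=2", true);
xhr.onreadystatechange = function () {
  if (xhr.readyState === 4 && xhr.status === 200) {
    var orders = JSON.parse(xhr.responseText);
    document.getElementById("orders").innerHTML = orders.map(function (o) {
      return "<li>" + o.name + "</li>";
    }).join("");
  }
};
xhr.send();
```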
While the idea sounds easy, in reality it is not as simple as it seems. As soon as we go from one-liners to apps we have a plethora of questions, and the biggest of them is the packaging: what your components are, what components are provided by the toolkit, and how to package and deliver them.
Dojo provides modules, good OOP for general classes, widgets (a combination of an optional HTML and related behaviors), and a lot of facilities to work with them. You can:
load modules on demand rather than in the head
load modules asynchronously
find all dependencies between modules automatically and create a "build" — one file in simple cases, or more, if your app is big and requires several layers
while doing the "build" it can inline all HTML snippets for your widgets, optimize CSS, and minify/compress JavaScript
Dojo can automatically find and instantiate widgets in HTML, saving a lot of boilerplate code
and much much more
All these features help greatly when building applications on the client side. That's why I like Dojo.
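To make that concrete, here is a small sketch of the (pre-AMD, Dojo 1.x) module machinery in use; the node id is made up:

```
// Load Dojo itself from the Google CDN (cross-domain build):
//   <script src="//ajax.googleapis.com/ajax/libs/dojo/1.6.1/dojo/dojo.xd.js"></script>

dojo.require("dijit.form.Button"); // pulled in on demand, not hard-coded in <head>

dojo.addOnLoad(function () {
  // Instantiate a widget programmatically on an existing node.
  var button = new dijit.form.Button({ label: "Save" }, "saveButtonNode");
  button.startup();
});
```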
Obviously there are more ways to optimize high load web sites but according to my practice these are the most specific for JavaScript frameworks.
Quite simply: all of them.
All frameworks have been built in order to provide the fastest performance possible and provide the developers with useful functions and tools. Your choice should be based on your requirements.
JavaScript runs on the client side, so none of them will affect your server performance. The only server-side difference is the amount of bandwidth used to transfer the .js files to the client.
I'm personally fond of MooTools because it answers my requirements and also sticks to my coding ideals. A lot of people have adopted jQuery (I personally don't like it, but that doesn't mean it's not great). I haven't used the other ones.
But none is better than the others; it's all a question of requirements and personal preference.
I do not really think it makes much of a difference. The big ones seem to use a mixture of jQuery and Prototype, along with others.
Quite frankly, it makes no difference what you use for heavily visited websites, as we are talking about client technologies. After the file is loaded, there is not really any overhead. So, if you just want to do one simple thing and multiple frameworks support it, use whichever has the smallest file size (of course, if it performs really badly, use another!).
That being said, Google hosts a lot of the frameworks, so even this is really a non-issue. I use jQuery hosted by Google and am very happy.
http://code.google.com/apis/ajaxlibs/
The backend, and what the server should be using, is a whole different question where you will get a thousand different answers!
I'd recommend you look into Dojo.
Dojo 1.6 is also the first (and only) popular JavaScript library that can be successfully used with the Closure Compiler's Advanced mode, with the massive size, performance and obfuscation benefits attached to it (other than Google's own Closure Library, that is).
http://dojo-toolkit.33424.n3.nabble.com/file/n2636749/Using_the_Dojo_Toolkit_with_the_Closure_Compiler.pdf?by-user=t
In other words, a program using Dojo can be 100% obfuscated -- even the library itself.
Compiled code has exactly the same behavior as the plain-text code, except that it is much smaller (on average 25% smaller than minifier output), runs much faster (especially on mobile devices), and is almost impossible to reverse-engineer, even after passing through a beautifier, because the entire code base (including the library) is obfuscated.
Code that is only "minified" (e.g. with YUI Compressor or Uglify) can be easily reverse-engineered after passing through a beautifier.
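An illustrative comparison of the two levels (hand-written to show the shape of the output, not actual compiler output):

```
// Original source:
function calculateTotal(price, taxRate) {
  return price * (1 + taxRate);
}
var total = calculateTotal(100, 0.2);

// Minifier-style output (YUI Compressor / Uglify): local names shortened,
// but the function and call structure survive, so a beautifier gets you
// most of the way back:
//   function calculateTotal(a,b){return a*(1+b)}var total=calculateTotal(100,.2);

// Closure ADVANCED-style output: globals renamed and the call folded away,
// so the original structure is essentially gone:
//   var a=120;
```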
Well, as an example, Stack Overflow relies on jQuery (and uses the Google APIs link). It's one of the speediest and most popular libraries, and I'd say it's also the easiest to use. What type of behavior are you going to have on the site? It really all depends on your needs.
The answer, as always, is: it depends. What kind of performance are you talking about? Download speed? Use a minimiser and there's probably not a lot of difference. Or client-side performance, and what are you doing with it?
But I would suggest that if you're after raw performance, you shouldn't use a framework at all; write low-level JavaScript instead, accepting that it will be far more difficult to maintain.
Some good information can be found on the YUI site.
As other answers already explained, the framework's not going to be the bottleneck in your site's performance -- rather, many other factors are. If you use popular frameworks and load them from popular URLs for them (e.g. AOL's or Google's) they're likely to be cached in your users' browsers, so you don't have to worry much about that, either.
If you care at all about performance, however, absolutely DO check out Steve Souders' work, including both of his books, "High Performance Web Sites" and "Even Faster Web Sites".
I'm biased, as Steve is a friend and a colleague (and we share publishers as well), but I praised and admired his work even before we met in person and became colleagues. I'm mostly a back-end person, as he used to be, so I can't help admiring somebody who, coming from the same background, had the integrity and courage to switch almost entirely to a front-end focus once he realized that was by far the bottleneck for user-perceived performance. In other words, somebody who had the gumption to put user experience first: something we all pay homage to, of course, but don't always practice when that "overriding priority" gets in the way of our own professional specialties, interests, and skills.

How many lines of code are in your custom jQuery script on your site? And how much is too much?

For our site, I'm using a lot of jQuery. Right now I'm looking at 340 lines of jQuery code on top of the base library. How much is too much? I will be adding more; when do I start trying to condense the code and eventually move to OOP?
The number of lines doesn't mean anything - what matters is what you're actually doing. You could have 10 lines of supremely inefficient code that would do much more damage than a meticulously crafted 1000 lines of code.
Optimally, you should keep your script size to a minimum, but with today's 'Web 2.0' websites, you will most probably accumulate quite a lot of JavaScript code.
The important thing is that before you deploy your website, you minify and gzip your script files to reduce their size as much as possible.
If you are really interested in optimizing and improving your website performance, I highly recommend taking a look at Steve Souders' High Performance Web Sites: Essential Knowledge for Front-End Engineers
How much is too much depends a lot on your application.
You should strive to be concise, but not at the expense of readability or user experience.
I would pay attention to script loading time more than lines of code. If it gets to be too big, break the file down into page- or section-specific files. "Too much" is based solely on application performance and what you deem acceptable for your users.
340 lines is nothing; try using a few Telerik controls, and you'll soon get to 15k+ lines!
It depends on the project you are working on. You should keep your code efficient and readable. Once you deploy your website, just compress and gzip your scripts and that would improve performance.
I wouldn't concern yourself with the length of your JavaScript. You have multiple options available to you like using Packer to compress your JavaScript for release (you'll want to practice with it some since it does have a few rules for how it works).
Focus on making sure your code is understandable and easy to maintain. Heavy use of JavaScript in websites can get hairy in a big hurry.
Concerning yourself with trying to make it short or small can hurt you more than if a user has to wait an extra second for the page to load.
For development it becomes absolutely essential to separate out code into separate .js files or things will get messy.
HOWEVER,
Do not leave a ton of script references in a production page. Most browsers are limited to two simultaneous HTTP requests per host. Those script references will slow down your page load and far outweigh any possible benefit of caching components separately.
You can concatenate your development files into one file using JS Builder:
http://code.google.com/p/js-builder/
Edit: By script references I mean the <script src="blah.js"> tags. Each of those needs to be loaded via HTTP when the page loads.
340 lines of JavaScript is nothing, but as your JavaScript code base grows, I'd spend some time looking into frameworks for compressing and concatenating JavaScript on the fly. If you're on Java, I'd recommend JAWR, which lets you switch between multiple references in development mode and a single, minified script in production. Just make sure you test your app in production mode before you go live, as the minification algorithm could screw up your code in some obscure cases (if you write clean code and remember to end every line with a ';', you should be fine).
If you're not on Java I don't know of any frameworks, but implementing something similar yourself actually isn't that hard. I think I have some code lying around somewhere for doing it in eZ Publish, which is written in PHP.
