What Are The Cons (Ill-Effects) Of HTML, JavaScript and CSS Minification?

First, I see that Google suggests minifying everything (HTML, JS and CSS) to increase performance.
But I doubt it does only good and no bad, especially since many popular websites haven't enabled HTML minification (and some haven't enabled JS and CSS minification either).
So, can someone knowledgeable please enlighten me on the ill-effects / cons of enabling the following on a website:
HTML minification
JS minification
CSS minification
For example, I heard that HTML minification could cause issues with the Google Analytics and AdSense (or any ad) code in the page. Is that true?

If done right, minification can be completely without side effects. The thing is, it’s not that easy to get things right.
For example, Google's JS compiler, Closure, generally works fine but can break more complex scripts. It's always a tradeoff: better compression at the cost of less compatibility, or the other way around.
Also, by enabling gzip you achieve somewhat more compression than with minification alone, all without touching your code, at the cost of some server CPU.
Bottom line: if you're not sure you need minification, you probably don't.
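To make the gzip point concrete, here is a minimal sketch of on-the-fly gzip compression in Node.js (no web framework assumed; the served file name is just an example):

```javascript
// Minimal sketch: serve a JS file gzipped on the fly when the client accepts it.
// The trade-off from the answer above: smaller transfer, some CPU per request.
const http = require("http");
const fs = require("fs");
const zlib = require("zlib");

http.createServer((req, res) => {
  const raw = fs.createReadStream("./app.js"); // illustrative file name
  const acceptsGzip = /\bgzip\b/.test(req.headers["accept-encoding"] || "");

  res.setHeader("Content-Type", "application/javascript");
  if (acceptsGzip) {
    res.setHeader("Content-Encoding", "gzip");
    raw.pipe(zlib.createGzip()).pipe(res); // compression happens per request (CPU cost)
  } else {
    raw.pipe(res); // fall back to the uncompressed file
  }
}).listen(8080);
```

In practice you would normally let the web server (Apache, nginx) handle this, but the sketch shows where the CPU cost comes from.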

Deployment will be more complex and error-prone. Where you could previously just upload your code as-is, you now have to run it through a compiler/minifier first.
Development becomes more complex; developers will probably have to install the minification tools.
Debugging becomes more complex: you now have to use a utility, such as a source map, to map a reported line back to the original source (a sketch follows below).
Minifiers can have bugs and introduce erroneous code (which could of course cause various issues).
For large software projects, the first three points are all but irrelevant, and bugs in minifiers can be mitigated by careful (automated) testing. For a small/personal project, you probably don't need minification unless bandwidth or website performance is an issue.
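On the debugging point: most minifiers can emit a source map next to the minified file so stack traces can be translated back to the original lines. A minimal sketch, assuming the terser npm package (file names are illustrative):

```javascript
// Minify app.js and emit a source map so minified stack traces can be mapped
// back to the original lines (terser is one example tool; others work similarly).
const { minify } = require("terser");
const fs = require("fs");

async function build() {
  const source = fs.readFileSync("app.js", "utf8");
  const result = await minify({ "app.js": source }, {
    sourceMap: {
      filename: "app.min.js",
      url: "app.min.js.map", // appended to the output as a sourceMappingURL comment
    },
  });
  fs.writeFileSync("app.min.js", result.code);
  fs.writeFileSync("app.min.js.map", result.map);
}

build();
```

Browser developer tools pick up the sourceMappingURL comment automatically, so errors are shown against the original file.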

Related

What is the security of javascript minification

So for a long time now I have been under the assumption that, while it does yield performance gains, one of the primary reasons we minify JavaScript/CSS is to give it a modicum of obfuscation so that it is harder to reverse engineer.
However, a friend of mine just showed me that it is not only possible, but extremely simple, to reverse the minification of minified JavaScript and CSS.
So my question is: other than performance gains, what is the point? Is there any other actual way to protect JavaScript from being simply stolen right from your site?
JavaScript minification is done primarily to increase performance. Upon minification, it's not uncommon to see a >25% reduction in script size. On top of this, some minifiers/compilers will obfuscate your code a little as well, renaming functions and variables to less obvious names.
As you've pointed out, it can always be unminified or pretty-printed, but since JavaScript is a non-compiled, client-side language, there isn't a whole lot you can do to protect it.
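To make that concrete, here is roughly the kind of transformation a typical minifier applies (the exact output varies by tool; the function is just an example):

```javascript
// Original source:
function addTax(subtotal, taxRate) {
  var total = subtotal * (1 + taxRate);
  return Math.round(total * 100) / 100;
}

// Typical minified output (whitespace stripped, locals renamed):
//   function addTax(n,t){var r=n*(1+t);return Math.round(100*r)/100}
//
// Pretty-printing restores the layout, but the original local names and
// comments are gone for good -- that is the extent of the "obfuscation".
```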
See this link on javascript obfuscation.
If you have proprietary code or code you really don't want users seeing, you'll have to keep it server side. Consider moving it to a server side language such as PHP, Python, C, etc and expose the functions via web services.
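For illustration, a minimal sketch of that server-side approach in Node.js (the endpoint path, payload shape, and pricing logic are all hypothetical):

```javascript
// The proprietary calculation lives on the server; the browser only calls an endpoint.
const http = require("http");

function proprietaryPrice(basePrice, customerTier) {
  // ...the logic you do not want to ship to the browser...
  return customerTier === "gold" ? basePrice * 0.8 : basePrice;
}

http.createServer((req, res) => {
  if (req.method === "POST" && req.url === "/api/price") {
    let body = "";
    req.on("data", (chunk) => (body += chunk));
    req.on("end", () => {
      const { basePrice, customerTier } = JSON.parse(body);
      res.setHeader("Content-Type", "application/json");
      res.end(JSON.stringify({ price: proprietaryPrice(basePrice, customerTier) }));
    });
  } else {
    res.statusCode = 404;
    res.end();
  }
}).listen(8080);

// Browser side: only the request and the result are visible, never the formula.
// fetch("/api/price", { method: "POST", body: JSON.stringify({ basePrice: 100, customerTier: "gold" }) })
//   .then((r) => r.json())
//   .then(({ price }) => console.log(price));
```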
There is no way to prevent JavaScript from being stolen directly off your site. It is "stolen" the instant someone visits your site and loads the HTML page or file containing the JavaScript code. From a security perspective, minification does nothing more than obfuscate your code from a casual browser. Its primary purpose is performance.
Rule of thumb: If you don't want the user to have access to it, don't send it to the client/browser.

Tools for JS and CSS file concatenating

I recently started working with a large code base with many (15-20) js requests per page. I'm tasked with optimization and performance improvement of these sites.
I've been using tools like Google's PageSpeed and Yahoo's YSlow in conjunction with WebPageTest.org's tests to determine a baseline speed of the site and area of improvement. I'm curious if there are some standard or best-practice solutions for concatenation and minification of JS and CSS files.
I watched: http://www.youtube.com/watch?v=30_AIEhar-I and the first 20 minutes were really good at hammering mod_pagespeed as a good target.
I'm currently considering mod_pagespeed with a YUI compressor and perhaps a sprite generator on top of it all.
What are some good tools that I may have missed or things that I should be concerned about with my current build?
Edit: It should be noted that this is one page of many (possibly hundreds) and the site receives a new build every two weeks, so being able to automate this concatenation and minification is a must; I can't just do it once and call it good.
Edit 7/30/2012 -
I spent some time looking at different tools. It's hard to say which ones are the best, but at this time not very many people use mod_pagespeed.
Closure is more widely used for certain, but even that is lacking. It seems the optimal way to do this is to just use a plugin together with YUI.
There are other places that suggest Packer, but it seems many believe the smaller file sizes are offset by the need to unpack the scripts on the client machine. This Stack Overflow answer is a good read regarding these types of tools.
Google's Closure Compiler is quite nice for concatenating and minifying JavaScript. It has the added bonus of linting your code for you when you compile, it will remove dead code, and it can also perform compile-time type checking if you include type hints in docblocks.
In certain cases, the dead code removal feature gives Closure a huge advantage over other minifiers... for example, think of cases where you include a library, but only use about 10% of the functionality. The other 90% can be removed if you compress the library along with the rest of the project.
As for CSS, YUI compressor is probably your best bet if you want something fancy. Otherwise, you could just concatenate the files together using cat and take the hit of a few extra bytes from whitespace.
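Since the build runs every two weeks, the concatenate-and-minify step is worth scripting. A hedged sketch of such a step in Node.js (assumes the terser npm package; the file lists and output paths are illustrative):

```javascript
// Concatenate the source files and minify the JavaScript into a single bundle.
const fs = require("fs");
const { minify } = require("terser");

const jsFiles = ["src/vendor.js", "src/app.js", "src/widgets.js"];
const cssFiles = ["css/reset.css", "css/layout.css", "css/theme.css"];

async function build() {
  fs.mkdirSync("dist", { recursive: true });

  // JavaScript: concatenate, then minify the combined source.
  const jsSource = jsFiles.map((f) => fs.readFileSync(f, "utf8")).join(";\n");
  const { code } = await minify(jsSource);
  fs.writeFileSync("dist/bundle.min.js", code);

  // CSS: plain concatenation already removes the per-file request overhead;
  // a CSS minifier (e.g. the YUI compressor) can be dropped in here later.
  const cssSource = cssFiles.map((f) => fs.readFileSync(f, "utf8")).join("\n");
  fs.writeFileSync("dist/bundle.css", cssSource);
}

build();
```

Hooking a script like this into the fortnightly build keeps the optimization from becoming a manual chore.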

Should I inline CSS & JS in mobile sites to save bandwidth?

Is there a reason not to inline CSS & JS when I make a mobile-ONLY site, to save bandwidth?
The only possible benefit I can think of is a couple fewer HTTP requests, but you totally give up the benefit of having the files cached if you do so.
Caching is a good thing and it saves bandwidth, so I can't see why you'd want to lose that advantage.
Besides that (not related to performance), maintenance will be a nightmare with everything inline, as it would be with any site.
I wouldn't be the least bit surprised if there were even more compelling reasons not to.
Use separate files.
Yes. First of all, you'll either have to code like that or inline them dynamically. Dynamically = waste of processing power. Code like that = hard to maintain and bad practice. And for what? You barely save any bandwidth at all, and it makes caching impossible and might actually slow you down. Now minification, on the other hand... that's what you should do instead. Minify your CSS and JavaScript, combine them into one file, and it's okay if you do this dynamically because the benefits outweigh the problems.
In-lining everything has different effects:
Reduces number of requests -- but increases your HTML file size
Increased HTML file size -- Load time increases considerably
No caching -- you have lost a good opportunity
Maintenance is like hell -- unless you inline as a step of your development process
A good blog post you can read - Why inlining everything is not the answer
There he recommends inlining only very small files (less than 1 KB).
By the way, why not inline? Google does it on their homepage; anyone who has viewed Google's source has seen it. But still, it's your choice.
If you are still thinking of reducing the number of HTTP requests, then it is better to use a build tool to do the inlining automatically. Otherwise you'll have to go through the 'maintenance hell'.
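A hedged sketch of what such a build step could look like in Node.js, inlining only stylesheets under 1 KB and leaving larger ones as cacheable external files (the file names, regex, and threshold are illustrative):

```javascript
// Replace <link rel="stylesheet" href="..."> tags with inline <style> blocks,
// but only when the referenced file is small enough to be worth inlining.
const fs = require("fs");

const INLINE_LIMIT = 1024; // bytes; the "less than 1 KB" rule of thumb from above

function inlineSmallCss(html) {
  // Simplified pattern; a real build tool would use a proper HTML parser.
  return html.replace(
    /<link\s+rel="stylesheet"\s+href="([^"]+)"\s*\/?>/g,
    (tag, href) => {
      const css = fs.readFileSync(href, "utf8");
      return Buffer.byteLength(css) < INLINE_LIMIT
        ? `<style>${css}</style>` // small enough: inline it, save a request
        : tag;                    // keep the external, cacheable link
    }
  );
}

const page = fs.readFileSync("index.html", "utf8");
fs.writeFileSync("dist/index.html", inlineSmallCss(page));
```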
Yes, and the reason is called caching :-) CSS and JS that are not inlined will be cached (mobile browsers with HTML support do use a cache).

Properly managing javascripts for large website / code concatenation

The site I am developing has a large amount of javascript that is shared across various functionality, and an equally large amount of feature-specific javascript. I've read all about using one monolithic javascript file vs. many smaller ones.
For my purposes, the huge-file approach would not only result in a script difficult to maintain, but contain a lot of unneeded javascript as well. At the same time, separating the javascript so that only the required code is included would result in an excessive number of files / HTTP requests. The idea of including even a moderate amount of unneeded code seems contrary to the concepts of proper software design, besides the additional file size overhead for the user.
I have found the mod_concat module for Apache which seems like it would solve my problem entirely - I could separate my javascripts into as many files as I want, include only those necessary, and take almost no hit on performance.
Is this actually the case? Is the only potential drawback the need to manage many files? I know mod_concat has not been around forever, so I'm also looking for a bit of background on a) how this was handled before, and b) if, even with code concatenation, including a moderate amount of unneeded javascript is considered acceptable (or even a best practice).
Thanks, Brian
I don't think you need an Apache module for that. Creating one minified JS file for production should be the best way to go, because it is only loaded once and then cached by the browser. For development, of course, it makes sense to have your application split into separate files.
My personal favorite for JavaScript module management and compression is Steal JS which is part of the great JavaScript MVC framework (could be generally interesting for larger JS applications). It can load module files dynamically during development and for production you can create one compressed JavaScript file (it can do CSS, too).
Another alternative is RequireJS but I only had a quick look at it.
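If you end up without a dedicated loader, the underlying idea is simple enough to sketch in plain JavaScript: keep shared code in the main bundle and fetch feature-specific files only when a feature is used (the element id, file path, and the EditorFeature global below are hypothetical):

```javascript
// Load a script on demand and resolve once it has executed.
function loadScript(src) {
  return new Promise((resolve, reject) => {
    const s = document.createElement("script");
    s.src = src;
    s.onload = resolve;
    s.onerror = reject;
    document.head.appendChild(s);
  });
}

// Shared code ships in the page's main bundle; feature-specific code is fetched
// only when the user actually reaches that feature.
document.querySelector("#open-editor").addEventListener("click", async () => {
  await loadScript("/js/editor-feature.js"); // one extra request, but only on demand
  window.EditorFeature.init();               // global exposed by that file (illustrative)
});
```

Module loaders like Steal JS and RequireJS do essentially this, plus dependency tracking and a production build that collapses the files into one.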

Which JavaScript framework is generally used for high performance websites?

There are different JavaScript frameworks like jQuery, Dojo, MooTools, Google Web Toolkit (GWT), YUI, etc.
Which of these is suitable for high performance websites?
(Full disclaimer: I am a Dojo developer and this is my unofficial perspective).
All major libraries can be used in high load scenarios. There are several things to consider:
Initial load
The initial load affects your response time: from requesting a web page to being responsive and in working mode. Trivial things to do are:
concatenate several JavaScript files together (works for CSS files too)
minimize and/or compress your JavaScript
The idea is to send less — good for the server, and good for the client.
The less trivial thing to do:
structure your program in such a way so it is operational without all modules loaded
Example of the latter: divide your modules into essential (e.g., the core logic), and non-essential (e.g., helpers: tooltips, hints, verifiers, help facilities, various "gradual enhancers", and so on). The idea is that frequently there are things which are not important for frequent users, but nice for casual users ⇒ they can be delayed.
We can load essential modules first and load the rest asynchronously. Example: if the user wants to edit an object, we need to show it first; after that we have several hundred milliseconds to load the rest: lookup tables, hints, and so on.
Obviously it helps when asynchronous loading of modules is supported by the framework you use. Dojo has this facility built-in.
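A hedged illustration of the "essential first, the rest later" idea using Dojo's AMD-style loader (dijit/Tooltip is a real Dojo widget; the app/* module names are just examples):

```javascript
// Essential module: render the object the user asked for right away.
require(["app/core"], function (core) {
  core.renderEditor();

  // Non-essential enhancers are requested asynchronously afterwards;
  // the user already sees a working page while these download.
  require(["dijit/Tooltip", "app/validators"], function (Tooltip, validators) {
    validators.attach(core.form);
  });
});
```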
Distribute files
Everybody knows that due to browser restrictions on number of parallel downloads from the same site it is beneficial to load resources (images, CSS, JavaScript) from different domains:
we can download more in parallel, if user's line has enough bandwidth — these days it is almost always true
we can set up web servers optimized for serving static files: huge disk cache, small workers, keep-alive, async serving, and so on
we can remove all unnecessary features we don't need when serving static files: sessions, cookies, and so on
One frequently overlooked optimization in JavaScript applications is to use a CDN:
your web site can benefit from the geographical distribution of CDN (files can be served from the closest/fastest server)
user may have required files in her cache, if they were used by other application
intermediate/corporate caches increase the probability that required files are already cached
the last but not least: these are files that you don't serve — think about it
Again, Dojo has supported CDNs for a long time and is distributed publicly by the AOL CDN and the Google CDN. The latter carries practically all popular JavaScript toolkits too. Obviously you can create your own CDN and your very own CDN- and app-specific Dojo build if you feel you need it; it is trivial and well documented.
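A small, hedged sketch of CDN loading with a same-origin fallback in case the CDN is unreachable (the CDN URL, version, and local path are illustrative):

```javascript
// Try the public CDN first; if the script fails to load, fall back to a local copy.
(function () {
  var s = document.createElement("script");
  s.src = "https://ajax.googleapis.com/ajax/libs/dojo/1.10.4/dojo/dojo.js";
  s.onerror = function () {
    var local = document.createElement("script");
    local.src = "/static/dojo/dojo.js"; // same-origin fallback copy
    document.head.appendChild(local);
  };
  document.head.appendChild(s);
})();
```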
Communication bandwidth
How can that be different for different toolkits? XHR is XHR.
You need to reduce the load on your servers as much as possible. Analyze all traffic and consider how much static/immutable stuff is sent over the pipe. For example, typically a lot of HTML is redundant across several pages: a header, a footer, a menu, and so on. Do you really need all of these to be sent over every time?
One obvious solution is to move from static HTML + "gradual enhancements" with JavaScript to real "one page" JavaScript applications. Again, this is frequently overlooked, yet it is one of the most rewarding optimizations.
While the idea sounds easy, in reality it is not as simple as it seems. As soon as we go from one-liners to apps we have a plethora of questions, and the biggest of them is the packaging: what your components are, what components are provided by the toolkit, and how to package and deliver them.
Dojo provides modules, good OOP for general classes, widgets (a combination of an optional HTML and related behaviors), and a lot of facilities to work with them. You can:
load modules on demand rather than in the head
load modules asynchronously
find all dependencies between modules automatically and create a "build" — one file in simple cases, or more, if your app is big and requires several layers
while doing the "build" it can inline all HTML snippets for your widgets, optimize CSS, and minify/compress JavaScript
Dojo can automatically find and instantiate widgets in HTML saving a lot of boilerplate code
and much much more
All these features help greatly when building applications on the client side. That's why I like Dojo.
Obviously there are more ways to optimize high load web sites but according to my practice these are the most specific for JavaScript frameworks.
Quite simply: all of them.
All frameworks have been built in order to provide the fastest performance possible and provide the developers with useful functions and tools. Your choice should be based on your requirements.
JavaScript runs on the client-side, so none will affect your server performance. The only difference server-side is the amount of bandwidth used to transfer the .js files to the client.
I'm personally fond of MooTools because it answers my requirements and also sticks to my coding ideals. A lot of people adopted jQuery (I personally don't like it, doesn't mean it's not great). I haven't used the other ones.
But none is better than the other, it's all a question of requirements and personal preference.
I do not really think it makes a bit of difference. The big sites seem to use a mixture of jQuery and Prototype, along with others.
Quite frankly, it makes no difference what you use for heavily visited websites, as we are talking about client technologies. After the file is loaded, there is not really any overhead. So, if you just want to do one simple thing and multiple frameworks support it, use whichever has the smaller file size (and of course, if it performs really badly, use another!).
That being said, Google hosts a lot of the frameworks, so even this is really a non-issue. I use jQuery hosted by Google and am very happy.
http://code.google.com/apis/ajaxlibs/
Backend and what the server should be using is a whole different question where you will get a thousand different answers!
I'd recommend you look into Dojo.
Dojo 1.6 is also the first (and only) popular JavaScript Library that can be successfully used with the Closure Compiler's Advanced mode, with the massive size, performance and obfuscation benefits attached to it -- other than Google's own Closure Library, that is.
http://dojo-toolkit.33424.n3.nabble.com/file/n2636749/Using_the_Dojo_Toolkit_with_the_Closure_Compiler.pdf?by-user=t
In other words, a program using Dojo can be 100% obfuscated -- even the library itself.
Compiled code has exactly the same behavior as plain-text code, except that it is much smaller (on average 25% smaller than minifiers achieve), runs much faster (especially on mobile devices), and is almost impossible to reverse-engineer, even after passing through a beautifier, because the entire code base (including the library) is obfuscated.
Code that is only "minified" (e.g. YUI compressor, Uglify) can be easily reverse-engineered after passing through a beautifier.
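For context, the reason Advanced mode needs library support is that it renames object properties, which breaks code that mixes dot access with string-based access. A small illustration of the kind of pattern that survives or breaks (the variable names are arbitrary):

```javascript
// Survives Advanced optimizations: declaration and access are renamed consistently.
var settings = { pollInterval: 5000 };
console.log(settings.pollInterval);

// Breaks under Advanced optimizations: the string is not renamed, so after
// compilation it no longer matches the (renamed) property.
console.log(settings["poll" + "Interval"]);

// Data that crosses the compilation boundary (JSON from a server, third-party
// APIs) must be accessed with quoted names, or declared in externs, so the
// compiler leaves those names untouched.
var fromServer = JSON.parse('{"userId": 42}');
console.log(fromServer["userId"]); // quoted access survives property renaming
```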
Well, as an example, Stack Overflow relies on jQuery (and uses the Google APIs link). It's one of the speediest and most popular libraries, and I'd say it's also the easiest to use. What type of behavior are you going to have on the site? It really all depends on your needs.
The answer, as always, is: it depends. What kind of performance are you talking about? Download speed? Use a minimiser and there's probably not a lot of difference. Or client-side performance, and what are you doing with it?
But I would suggest that if you're after raw performance, you not use a framework at all and instead write low-level JavaScript, which will be far more difficult to maintain.
Some good information can be found on the YUI site.
As other answers already explained, the framework's not going to be the bottleneck in your site's performance -- rather, many other factors are. If you use popular frameworks and load them from popular URLs for them (e.g. AOL's or Google's) they're likely to be cached in your users' browsers, so you don't have to worry much about that, either.
If you care at all about performance, however, absolutely DO check out Steve Souders' work -- including both of his books, "High Performance Web Sites" and "Even Faster Web Sites".
I'm biased, as Steve is a friend and a colleague (and we share publishers as well), but I praised and admired his work even before we met in person and became colleagues. I'm mostly a back-end person, as he used to be, so I can't help admiring somebody who, coming from the same background, had the integrity and courage to switch almost entirely to a front-end focus once he realized that was by far the bottleneck for user-perceived performance (i.e., somebody who had the gumption to put user experience first, something we all pay homage to, of course, but don't always practice when that "overriding priority" gets in the way of our own professional specialties, interests and skills...).
