Javascript and website loading time optimization - javascript

I know that best practice for including javascript is having all code in a separate .js file and allowing browsers to cache that file.
But when we begin to use many jQuery plugins which have their own .js, and our functions depend on them, wouldn't it be better to dynamically load only the JS functions and the required .js files for the current page?
Wouldn't it be faster, on a page where I only need one function, to load that function dynamically by embedding it in the HTML with a script tag, instead of loading the whole JS file along with the plugin scripts?
In other words, aren't there any cases in which there are better practices than keeping our whole javascript code in a separate .js?

It would seem at first glance that this would be a good idea, but in fact it would actually make matters worse. For example, if one page needs plugins 1, 2 and 3, then a file would be built server-side with those plugins in it. Now the browser goes to another page that needs plugins 2 and 4, so another file is built. This new file is different from the first one, but it also contains the code for plugin 2, so the same code ends up getting downloaded twice, bypassing the version that the browser already has.
You are best off leaving the caching to the browser, rather than trying to second-guess it. However, there are options to improve things.
Top of the list is using a CDN. If the plugins you are using are fairly popular ones, then the chances are that they are being hosted on a CDN. If you link to the CDN-hosted plugins, then for any visitor who is hitting your site for the first time but has already hit another site using the same plugins from the same CDN, the plugins will already be cached.
There are, of course, other things you can do to speed your javascript up. Best practice includes placing all your script include tags as close to the bottom of the document as possible, so as not to hold up page rendering. You should also look into lazy initialization. This involves, for anything that needs significant setup to work, attaching a minimalist event handler that, when triggered, removes itself and sets up the real event handler.
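A rough jQuery sketch of that lazy-initialization pattern (the selector and the richEditor() plugin call are placeholders, not from the answer above):
// Attach a cheap one-shot handler; jQuery's .one() removes it after the first call.
$('#comments').one('click', function (e) {
    $(this).richEditor();     // hypothetical plugin that needs expensive setup
    $(this).trigger(e.type);  // replay the click so the real handlers respond
});
Until the user actually interacts with the element, none of the expensive setup runs.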

One problem with having separate js files is that it will cause more HTTP requests.
Yahoo have a good best practices guide on speeding up your site: http://developer.yahoo.com/performance/rules.html
I believe Google's Closure Library has something for combining javascript files and dependencies, but I haven't looked too much into it yet, so don't quote me on it: http://code.google.com/closure/library/docs/calcdeps.html
Also there is a tool called jingo http://code.google.com/p/jingo/ but again, I haven't used it yet.

I keep separate files for each plug-in and page during development, but during production I merge-and-minify all my JavaScript files into a single JS file loaded uniformly throughout the site. My main layout file in my web framework (Sinatra) uses the deployment mode to automatically either generate script tags for all JS files (in order, based on a manifest file) or perform the minification and include a single querystring-timestamped script inclusion.
Every page is given a body tag with a unique id, e.g. <body id="contact">.
For those scripts that need to be specific to a particular page, I either modify the selectors to be prefixed by the body:
$('body#contact form#contact').submit(...);
or (more typically) I have the onload handlers for that page bail early:
jQuery(function($){
if (!$('body#contact').length) return;
// Do things specific to the contact page here.
});
Yes, including code (or even a plug-in) that may only be needed by one page of the site is inefficient if the user never visits that page. On the other hand, after the initial load the entire site's JS is ready to roll from the cache.

The network latency is the main problem. You can get a very responsive page if you reduce the HTTP calls to one.
It means all the JS and CSS are bundled into the HTML page. And if you can forget IE6/7, you can put the images in as data:image/png;base64.
When we release a new version of our web app, a shell script minifies and bundles everything into a single HTML page.
Then there is a second call for the data, and we render all the HTML client-side using a JS template library: PURE
Ensure the page is cached and gzipped. There is probably a limit in size to consider. We try to stay under 400kb unzipped, and load secondary resources later when needed.

You can also try a service like http://www.blaze.io. It automatically performs most front-end optimization tactics and also couples in a CDN.
They're currently in private beta, but it's worth submitting your website to.

I would recommend you group common bits of functionality into individual javascript module files and load them only on the pages where they are used, with RequireJS / head.js or a similar dependency management tool.
For example, if you are using lightbox popups, contact forms, tracking, and image sliders in different parts of the website, you would separate these into 4 modules and load each only where needed. That way you optimize caching and make sure your site has no unnecessary flab.
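A rough sketch of that approach with RequireJS (the paths and module names below are invented for illustration): the contact-form behaviour lives in its own module file, js/modules/contactForm.js,
// js/modules/contactForm.js
define(function () {
    return {
        init: function () {
            // attach validation, password strength metering, etc.
        }
    };
});
and only the contact page requests it, so the other pages never download that file:
require(['modules/contactForm'], function (contactForm) {
    contactForm.init();
});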
As a general rule it's always best to have fewer files rather than more. It's also important to work on the timing of each JS file, as some are needed BEFORE the page completes loading and some AFTER (i.e., when the user clicks something).
See a lot more tips in the article: 25 Techniques for Javascript Performance Optimization.
Including a section on managing Javascript file dependencies.
Cheers, hope this is useful.

Related

If grouping front-end code helps reduce requests, why aren't more websites written on one html document?

I guess what I'm asking is: if grouping JavaScript is considered good practice, why don't more websites place the JavaScript and CSS directly into one HTML document?
why don't more websites place the JavaScript and CSS directly into one HTML document
Individual file caching.
External files have the advantage of being cached. Since scripts and styles rarely change (they're static) and/or are shared between pages, it's better to separate them from the page, making the page itself lighter.
Instead of downloading 500kb of page data with embedded JS and CSS, why not load the 5kb page and pull the 495kb worth of JS and CSS from the cache? That saves you 495kb of bandwidth, and once the files are cached the two extra HTTP requests cost nothing.
Although you could embed the JS and CSS into the page, the page will most likely be dynamic, so it can't be cached; the browser has to load a new copy of everything on every request, making each request very heavy.
Modular code
Imagine a WordPress site. They are built using a ton of widgets made by different developers around the world. Handling that much code stuffed into one page is possible, but barely manageable.
If some code short-circuits or just doesn't work on your site, it's easier to take out the link to the external file than to scour the page for the related code and possibly remove code from another widget by accident.
Separation of concerns
It's also best practice to separate HTML from CSS and JS. That way, it's not spaghetti you are dealing with.
When you have a lot of code in a single document, it's harder to work with the code because you need more time to find the necessary string to change.
That is why it's good practice to divide code into separate files, with each of them solving its own special task, and then include them in code where it's necessary.
However, you can write a script which joins your files from the development version, which has many files, into a release version, which has fewer files (a sketch of such a script is shown after the list below), but this brings two problems:
People are often too lazy to do the additional coding to create this script and then to change it when the structure of the project becomes more complex.
If you find a bug or add a small feature, you will need to rebuild your project again in both the development and release versions.
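A rough sketch of such a join script, here with Node.js (the file names are made up; the order of the list is assumed to reflect the dependencies):
// build.js - concatenate development files into a single release file
var fs = require('fs');

var sources = ['js/lib/jquery.plugin.js', 'js/nav.js', 'js/contact.js'];

var bundle = sources.map(function (path) {
    return fs.readFileSync(path, 'utf8');
}).join(';\n'); // a separating semicolon guards against files that lack one

fs.writeFileSync('js/release/site.js', bundle);
console.log('Wrote js/release/site.js (' + bundle.length + ' bytes)');
You would still want to run the result through a minifier before deploying it.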
They separated them so that multiple webpages can use the same file. When you change a single file, multiple pages are automatically updated as well. In addition, a big HTML file takes a long time to download.

Organizing Javascript w/ jQuery

I am tinkering around with jQuery and am finding it very useful and almost exciting.
As of now, I am referencing the jQuery script via Google's CDN and I store plugins I use locally in a static/scripts directory.
Naturally, each page has its own individual implementation of components that are required for the features it currently offers. E.g. the main page has the Twitter plugin whereas the login page has form validation logic and password strength metering. However, certain components (the navigation bar, for example) use the same script across multiple pages.
Admittedly so, I am not a fan of putting javascript code in the header of a page, but I rather prefer to have it in an external file (for caching, re-usability, and optimization purposes).
My question is: what is the preferred route for organizing the external files? I wanted to try and keep it to one javascript file for the entire site to reduce IO requests. However, I am not sure how to implement document ready functions on a conditional per-page basis.
$(document).ready(function () { ... });
Is there some way to reference a page by some method (preferably id-based rather than a URL conditional)?
Thank you in advance for your time!
You should try RequireJS.
This will allow you to load those plugins only on the pages where you need them, and to unload them again if they are not needed anymore.
Then again, it might be overkill. It really depends on the size of your project.
Paul Irish:
http://paulirish.com/2009/markup-based-unobtrusive-comprehensive-dom-ready-execution/
This will allow you to block your scripts by body class/ID and execute them automatically.
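A stripped-down sketch of that pattern (the object and page names here are illustrative, not taken from the article): keep one site-wide object whose members match body IDs, and a tiny dispatcher that runs the matching init function on DOM ready.
var SITE = {
    common: {
        init: function () { /* behaviour shared by every page */ }
    },
    contact: {
        init: function () { /* contact-page-only setup, e.g. form validation */ }
    }
};

jQuery(function () {
    var pageId = document.body.id;  // e.g. <body id="contact">
    SITE.common.init();
    if (SITE[pageId] && SITE[pageId].init) {
        SITE[pageId].init();        // page-specific code runs only here
    }
});
All of this can live in the single cached site-wide file; only the relevant branch executes on each page.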
First you might want to use YUI Compressor or some other JS compressing tool. Then perhaps creating a resource file (resx) for your JavaScript is the way to go. Then just reference the resource within your code. This is the approach Telerik took for their RadControl ASP.NET AJAX control framework.

Put CSS and JavaScript in files or main HTML?

Although it is always recommended to put JavaScript and CSS code into appropriate files (as .js and .css), most major websites (like Amazon, Facebook, etc.) put a significant part of their JavaScript and CSS code directly within the main HTML page.
Where is the best choice?
Place your .js in multiple files, then package, minify and gzip that into one file.
Keep your HTML in multiple separate files.
Place your .css in multiple files, then package, minify and gzip that into one file.
Then you simply send one css file and one js file to the client to be cached.
You do not inline them with your HTML.
If you inline them, then any change to the CSS or HTML or JS forces the user to download all three again.
The main reason major websites have JS & CSS inline in their pages is that major websites suffer from code rot. Major companies don't uphold standards and best practices; they just hack it until it works, then say "why waste money on our website, it works doesn't it?".
Don't look at examples of live websites, because 99% of all examples on the internet show bad practices.
Also for the love of god, Separation of concerns please. You should never ever use inline javascript or inline css in html pages.
http://developer.yahoo.com/performance/rules.html#external
Yahoo (even though they have many inline styles and scripts) recommends making them external. I believe Google Page Speed used to (or still does?) do the same as well.
It's really a logical thing to have them separate. There are so many benefits to keeping CSS and JS separate from the HTML: logical code management, caching of those files, lower page size (would you rather a ~200ms request for a 400kb cached resource, or a 4000ms delay from having to download that data on every page?), SEO options (less crap for Google to look through when scripts/styles are external), easier minification of external scripts (online tools etc), and you can load them in parallel from different servers....
That should be your primary objective in any website. All styles that make up your whole website should be in the one file (or files for each page, then merged and minified when updated), the same for javascript.
In the real world (not doing a project for yourself, doing one for a client or stakeholder that wants results), the only time where it doesn't make sense to load in another javascript resource or another stylesheet (and thus use inline styles/javascript) is if there's some kind of dynamic information that is on a kind of per-user, per-session or per-time-period that can't be accomplished as simply any other way. Example: when my website has a promotion, we dump a script tag with a small JSON object of information. Because we don't minify and merge multiple files, it makes more sense to just include it in the page. Sure there are other ways to do this, but it costs $20 to do that, whereas it could cost > $100 to do it another way.
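For illustration only (the field names are invented, not from the post above), the kind of JSON dump described might look like this inline in the page, with the cached site-wide script reading window.PROMO if it exists:
window.PROMO = {
    name: "summer-sale",   // hypothetical promotion identifier
    discount: 0.15,        // hypothetical discount rate
    endsAt: "2011-08-01"
};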
Perhaps the reason Amazon/Facebook/Google etc use so much inline code is so their servers aren't taxed so much. I'm not too sure on the benchmarking between requesting a 1MB file in one hit or requesting 10 100KB files (presuming 1MB/10 = 100KB for example's sake), but what would be faster? Potentially the 1MB file, BUT smaller requests can be loaded in parallel, meaning each of those 10 requests could potentially come from a separate server/domain, thus reducing overall load time.
Further, Google homepages for example seem to dump a JSON array of information for the widgets, presumably because it compiles all that information from various sources, minifies it, caches it, then puts it on the page, and the javascript functions build the layout (client-side processing power rather than server-side).
An interesting investigation might be whether they include various .css files regardless of the style blocks you're also seeing. Perhaps it's overhead or perhaps it's convenience.
I've found, while working with different styles of interface developer (and content deployers), that convenience/authority often wins in the face of deadlines and "getting the job done". In a project of a large scale there could be factors involved like "No, you ain't touching our stylesheets", or perhaps, if there isn't already a stylesheet costing an HTTP request, convenience has won a battle against good practice.
If your CSS and JavaScript code is for global usage, then it is best to put it into appropriate files.
Otherwise, if the code is used just by a certain page, like the home page, putting it directly into the HTML is acceptable, and is good for maintenance.
Our team keeps it all separate. All resources like this go into a folder called _Content.
CSS goes into _Content/css/xxx.css
JS goes into _Content/js/lib/xxx.js (For all the library packages)
Custom page events and functions get called from the page, but are put into a main JS file in _Content/js/Main.js
Images will go into the same place under _Content/images/xxx.x
This is just how we lay it out as it keeps the HTML markup as it should be, for markup.
I think putting CSS and JS into the main HTML makes the page load fast.

Integrate Javascript resources in HTML: what's the best way?

I'm working on a project which uses many scripts (Google Maps, jQuery, jQuery plugins, jQuery UI...). Some pages have almost 350 kB of Javascript.
We are concerned about performance and I'm asking myself what is the best way to integrate those heavy scripts.
We have 2 solutions:
Include all scripts in the head, even if they are not utilized on the page.
Include some common scripts in the head, and include page specific ones when they are needed.
I would like to have your advice.
Thanks.
For the best performance I would create a single static minified javascript file (using a tool like YUI Compressor) and reference it from the head section. For good tips on website performance check out Google's website optimization page.
Note that the performance penalty of retrieving all your javascript files only happens on the first page, as the browser will use the cached version of the file on subsequent pages.
For even better responsiveness you could split your javascript into two files. Load the first, with all the javascript you need when the page loads; then, after the page loads, load the second file in the background.
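A minimal sketch of that deferred second load (the file name is a placeholder):
window.onload = function () {
    // The page has rendered; now fetch the script for features that are not
    // needed for the initial display.
    var script = document.createElement('script');
    script.src = '/js/secondary.js';
    document.getElementsByTagName('head')[0].appendChild(script);
};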
If you're interested, I have an open source AJAX javascript framework that simplifies, compresses and concatenates all your html, css and javascript (including 3rd party libraries) into a single javascript file.
If you think it's likely that some users will never need the Google Maps JavaScript for example, just include that in the relevant pages. Otherwise, put them all in the head - they'll be cached soon enough (and those from Google's CDN may be cached already).
Scripts in the <head> tag do (I think) stop the page from rendering further until they’ve finished downloading, so you might want to move them down to the end of the <body> tag.
It won’t actually make anything load faster, but it should make your page content appear more quickly in some situations, so it should feel faster.
I’d also query whether you’ve really got 350 KB of JavaScript coming down the pipe. Surely the libraries are gzipped? jQuery 1.4 is 19 KB when minified and gzipped.
1) I would recommend gathering all the common and most important scripts, like jQuery etc., into one file to reduce the number of requests for these files, and compressing it; for that I would recommend Google Closure, you will find it here.
2) Do the loading on a page the user has to open at the beginning anyway, like the login page, and put the scripts at the end of the page to let all the content render first; this is recommended by most of the performance tools, like YSlow and Page Speed.
3) Don't write scripts in your page; try to write everything in a file to make it easier later on for compression and encryption.
4) Put the scripts and all static files like images and CSS on another domain to separate that loading from your server.

Javascript file inclusion in html pages- what happens underneath in the browser?

I think this may be a browser-dependent question. Suppose I have 10 Javascript files and several HTML pages. Suppose HTML pageA needs only JS1.js and JS3.js, and similarly HTML pageB needs JS4.js and JS1.js. I want to know what would be the effect of including all 10 javascript files in all HTML pages? Will it directly relate to the memory consumption by the browser?
I am facing this problem particularly with the YUI javascript library. There are several components like datatable, event, container, calendar, dom-event etc. The order in which they are included also seems to matter a lot. For example, the dom-event js should be included before the rest for it to work. So to avoid all this confusion, I thought of including all these js files in a header file that gets included in all HTML pages.
The thing that I am worried about is the memory bloat and performance problems that it may cause. Please provide your suggestions on the same..
Thanks,
-Keshav
Any script you load into your page, even once downloaded and cached, must still be parsed before the rest of the page can load. So in that sense there is a memory penalty, and there's still a potential for something in the script to significantly delay rendering.
However, in the case of a conscientiously designed library such as YUI I would expect the parsing time to be minimised.
If you can load all your scripts in at the end of the page, that can vastly improve performance as the entire page can render before being blocked by javascript execution, and your site will feel a lot snappier.
I would suggest investigating the Firebug Net panel and the YSlow extension to get specific performance stats for your website.
External scripts delay the display of the following html until they have loaded and executed. The impact is much less after the first page load, since they're already cached, though browsers will occasionally check for new versions, which still carries a delay. I try to limit the number of scripts and move the script tags to the bottom of the page when possible. Users won't notice the script loading delay if the page has already fully displayed.
If a given script does nothing, it will not affect the performance.
Obviously the first page will load slowly, but the rest will not need to load all the scripts because they will be cached, so the next pages will load faster.
Tips:
1) Load the script at the bottom of the page (just before the closing BODY tag).
2) Use a non-blocking way of loading the scripts. This is the one I'm using:
<script type="text/javascript">
// Create a script element for the given source and append it to the body,
// so the file loads without blocking the initial rendering of the page.
function AttachScript(src) {
    var script = document.createElement("script");
    script.type = "text/javascript";
    script.src = src;
    document.getElementsByTagName("body")[0].appendChild(script);
}
AttachScript("/js/jquery.min.js");
AttachScript("/js/ndr.js");
AttachScript("/js/shadowbox.js");
AttachScript("/js/libraries/sizzle/sizzle.js");
AttachScript("/js/languages/shadowbox-es.js");
AttachScript("/js/players/shadowbox-img.js");
AttachScript("/js/adapters/shadowbox-jquery.js");
</script>
Can't find the source web page though :-(
Memory Consumption:
Assuming the scripts are well written then memory consumption and performance issues should be nominal. Your biggest problem with including all scripts at once will be the latency in the user experience first time through, or if you make changes, because they will have to download all of them in one hit. I think you should only include the scripts you need per page, not all scripts at once.
You can assess the impact yourself using simple tools like Task Manager/Processes in Windows to monitor memory/processor usage, or plug-ins like Firebug for Firefox.
You can also look into something called minification to help make your script files as small as possible.
Dependencies:
The order in which you include the scripts is important, as some scripts may depend on functionality in other scripts. So if the code in one script attempts to run and it requires code in another script that has not been downloaded yet, it will fail. My advice would be to actually understand those dependencies in your script files rather than just downloading everything at once because it seems easier.
Use the YUI Configurator to help determine the required file includes and order, as well as how to use the Yahoo! CDN combo service to combine all YUI files into a single script tag.
http://developer.yahoo.com/yui/articles/hosting/
External assets to the HTML page are typically cached by the browser. External assets are anything requested from the HTML, such as images, CSS, JavaScript, and anything else. So if you load all 10 script files up front you are forcing a one-time massive download hit on your user. After this one time the user does not need to download the scripts again unless the modified timestamp on the files changes.
Your page will only use what it requires. If a particular page requests js4.js and js5.js then all the functions in those files will be loaded into the interpreter, in the order in which they are first requested from the HTML and then by the order in which they are specified in each of those files. If there are any namespace conflicts, whatever is loaded into the interpreter last wins. The interpreter will clear out the functions once the page is unloaded from the browser.
For efficiency I would suggest using a server-side inclusion process to read each of the js files and combine the contents into a new single js file. This will reduce the number of HTTP requests to the server and save your users a great amount of bandwidth on HTTP headers and GET requests. Also, put the request for this new single script file directly prior to the closing body tag of your HTML. Downloading of scripts blocks parallel downloads in IE, so you want to load scripts at the lowest possible point in the page.
Scriptaculous implements a nice way to handle js dependencies. Guess you could check it out and "re-implement" it. ;D
As for memory bloat and performance issues... as long as your JS doesn't leak a lot (YUI probably doesn't) memory won't be much of a problem, although it will make your pages load slower, especially if loaded in the header.
You can read up on caching methods that use PHP to serve several javascript files as one big JS file which includes everything you need. For additional performance gains, you can make the browser cache the file locally in addition to sending it gzipped (if the browser has support for the encoding, using something like ob_start("ob_gzhandler");). By using gzip encoding, you can severely reduce the filesize of the main JS file you're sending, which includes all your JS code (since plain text compresses so well). I recently had to do this on my own website and it's worked like a charm for both JS and CSS files.
http://www.ejeliot.com/blog/72
Note that by following the instructions on that tutorial, your JS file will only be sent once and the browser on the client's machine will keep a local copy stored which will also improve performance of every visit thereafter.
Also, consider googling "Minify" which should be hosted on Google Code.
