I'm working on a project that uses many scripts (Google Maps, jQuery, jQuery plugins, jQuery UI...). Some pages have almost 350 kB of JavaScript.
We are concerned about performance, and I'm wondering what the best way is to include these heavy scripts.
We have 2 solutions:
Include all scripts in the head, even if they are not utilized on the page.
Include some common scripts in the head, and include page specific ones when they are needed.
I would like to have your advice.
Thanks.
For the best performance I would create a single static minified JavaScript file (using a tool like YUI Compressor) and reference it from the head section. For good tips on website performance, check out Google's website optimization page.
Note that the performance penalty of retrieving all your JavaScript files is only paid on the first page, as the browser will use the cached version of the file on subsequent pages.
For even better responsiveness you could split your JavaScript into two files: load the first, containing everything needed at page load, up front, and then load the second file in the background once the page has loaded.
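As a rough illustration of that second step (the file name is hypothetical), the deferred bundle can be kicked off from the window load event so it never delays the initial render:

// Minimal sketch: load a non-critical bundle only after the page has finished
// loading ("/js/secondary.min.js" is a hypothetical path).
window.onload = function () {
    var script = document.createElement("script");
    script.src = "/js/secondary.min.js"; // deferred, non-critical code
    document.getElementsByTagName("body")[0].appendChild(script);
};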
If you're interested, I have an open source AJAX JavaScript framework that compresses and concatenates all your HTML, CSS and JavaScript (including 3rd-party libraries) into a single JavaScript file.
If you think it's likely that some users will never need the Google Maps JavaScript for example, just include that in the relevant pages. Otherwise, put them all in the head - they'll be cached soon enough (and those from Google's CDN may be cached already).
Scripts in the <head> tag do (I think) stop the page from rendering further until they’ve finished downloading, so you might want to move them down to the end of the <body> tag.
It won’t actually make anything load faster, but it should make your page content appear more quickly in some situations, so it should feel faster.
I’d also query whether you’ve really got 350 KB of JavaScript coming down the pipe. Surely the libraries are gzipped? jQuery 1.4 is 19 KB when minified and gzipped.
1) I would recommend gathering all the common and most important scripts, like jQuery, into one file to reduce the number of requests for these files, and compressing it; for that I would recommend Google Closure, you will find it here.
2) Do the loading on a page the user has to open at the beginning, like the login page, and put the scripts at the end of the page to let all the content render first; this is recommended by most of the performance tools, like YSlow and Page Speed.
3) Don't write scripts in your page; try to keep everything in files, to make compression and minification easier later on.
4) Put the scripts and all static files, like images and CSS, on another domain to take that serving load off your server.
Related
I have too much JavaScript in the beta version of my site (which will become the real one), so what I am thinking is to make the JavaScript external, BUT this raises some really important questions.
What is the standard size for an external JavaScript file (e.g. should it never be more than 50 KB per file)?
How many JavaScript files should I make? If the answer to the above is that size doesn't matter, does that mean I should put all scripts (including the jQuery library) in one external file?
If a JavaScript file is already minified and it is combined with other files that are not minified, will that affect the one that is already minified?
What is the best minifier for JavaScript and CSS ("best" meaning one that maintains the standards)?
If I place the external script just after <body>, is it good enough? (If I go any lower, some scripts might stop working.)
Here is the link to beta site just incase if you want to check http://bloghutsbeta.blogspot.com/
There is no "standard size" per say, but it's always good to keep file sizes at a minimum. Gmail, Google Maps etc. for instance, load many MBs of Javascript.
The fewer the JS files, the better in general, as the number of connections to the web server serving them is reduced, resulting in reduced load.
No, mixing minified and un-minified files should not be a problem.
JSMin and UglifyJS are popular compressors.
You should attach your code to appropriate events, such as document.ready, so that scripts don't stop working just because the page hasn't fully loaded.
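For instance (the form selector here is just an illustration):

// Hypothetical selector: handlers attached inside ready() work no matter
// where in the page the script file is included.
$(document).ready(function () {
    $("#signup-form").submit(function (e) {
        e.preventDefault();
        // handle the submission here
    });
});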
What is the standard size for an external JavaScript file (e.g. should it never be more than 50 KB per file)?
There isn't one.
How many JavaScript files should I make? If size doesn't matter, does that mean I should put all scripts (including the jQuery library) in one external file?
In general, the fewer the better. There are exceptions. (e.g. it might be more efficient to load jQuery from a public CDN that visitors might already have a cached copy from).
If a JavaScript file is already minified and it is combined with other files that are not minified, will that affect the one that is already minified?
Not if done properly.
What is the best minifier for JavaScript and CSS ("best" meaning one that maintains the standards)?
Subjective.
If I place the external script just after <body>, is it good enough? (If I go any lower, some scripts might stop working.)
It depends on the script. Some schools of thought say that you should front-load all the markup (if scripts "stop working", then fix the scripts); others say you should register event handlers using event delegation as soon as possible, so users don't interact with controls that aren't wired up with JS before the page has finished loading.
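As a rough sketch of the delegation approach (jQuery 1.7+, with a hypothetical selector), a single early-registered handler covers controls that render before the rest of the scripts arrive:

// One handler on the document catches clicks on any current or future
// ".delete-row" element (hypothetical class), so the control responds even
// while page-specific scripts are still loading.
$(document).on("click", ".delete-row", function (e) {
    e.preventDefault();
    // perform the delete here
});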
I use external JavaScript files on a website, as I always try to keep JavaScript at the bottom and external.
But Google page speed is giving this suggestion
The following external resources have small response bodies. Inlining
the response in HTML can reduce blocking of page rendering.
http://websiteurl/ should inline the following small resources:
http://websiteurl/script.js
This external JS file has only this content:
$(document).ready(function() {
    $("#various2").fancybox({
        'width': 485,
        'height': 691
    });
});
But in YSlow I get this suggestion:
Grade n/a on Make JavaScript and CSS external
Only consider this if your property is a common user home page.
There are a total of 3 inline scripts
JavaScript and CSS that are inlined in HTML documents get downloaded
each time the HTML document is requested. This reduces the number of
HTTP requests but increases the HTML document size. On the other hand,
if the JavaScript and CSS are in external files cached by the browser,
the HTML document size is reduced without increasing the number of
HTTP requests.
Which is right, Google or Yahoo?
This is a bit of a problematic example, on quite a few fronts.
You can organise your scripts in such a way that you do not need to inline that JS. For example, you could have a common.js file that runs that snippet and other similar snippets, which also simplifies your code.
Additionally, this seems to have awakened the "never inline any JavaScript EVER" architecture police. It turns out that sometimes it is best practice to inline JavaScript; for example, look at the common snippet from Google Analytics.
Why are Google suggesting you should inline this tiny script?
Because 20% of the page visits you get have an unprimed cache
If you have a cache miss, it is likely a new connection to your site will need to be opened (1 round trip) and then the data delivered in the 2nd round trip (if you are lucky you get to use a keepalive connection and it is cut to 1 round trip).
For a general "global" English web application you are looking at a typical 110ms round trip time for a service hosted in the US. If you are using a CDN the number would probably be halved.
Even if the resource is local, the web browser may still need to access the disk to grab that tiny file.
Script tags without async or defer are blocking: if this script is somewhere in the middle of your page, it will be stuck there until the script downloads.
From a performance perspective, if the only two options are:
Place a 50-character JavaScript snippet inline
Place the 50 characters in a separate file and serve it
Considering that you are a good web citizen and compress all your content, the additional payload option #1 adds is negligible compared to the 20% risk of giving people a considerable delay. I would always choose #1.
In an imperfect world it is very rare to have such a clear and easy set of options. There is an option 3 that involves async-loading jQuery and grabbing this functionality from a common area.
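A loose sketch of that third option (the CDN URL and the init call are assumptions, and old-IE load-event quirks are ignored): inject jQuery asynchronously and run the page's snippet from shared code once it arrives.

// Inject jQuery without blocking rendering, then run shared init code once
// it has loaded (initCommon is a hypothetical function in common code).
(function () {
    var s = document.createElement("script");
    s.src = "//ajax.googleapis.com/ajax/libs/jquery/1.7.2/jquery.min.js";
    s.async = true;
    s.onload = function () {
        // e.g. initCommon(); -- the fancybox snippet would live there
    };
    document.getElementsByTagName("head")[0].appendChild(s);
}());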
Making scripts inline can have some detrimental effects:
a) Code organization - your code gets scattered in between your markup, thus affecting readability
b) Code minification and obfuscation become difficult
It's best to keep your JS in separate files, and then at build time combine all of them into a single file, and minify and obfuscate that.
This is not quite true. You can configure the web server (well, at least Apache) to inline the scripts/CSS when they are served.
Here is a useful link
http://www.websiteoptimization.com/speed/tweak/mod_pagespeed/
There are two factors to consider here. One is download time, the other is maintainability. Both of these are impacted by how many times a piece of Javascript is needed.
With respect to download time, you obviously have two choices: include the JS in the body of the page, or as an external file. Including the JS in the body does save an extra HTTP request, although it also bloats the HTML a bit and can be a pain to maintain if you have several scripts you're putting inline on several different pages.
Another important consideration is whether or not the JS is needed immediately on the page. If a small piece of JS is needed as soon as the page loads, then putting it inline may be a good idea. If it's being used for something asynchronous in the future, then putting it in an external file may still be a good choice.
I usually write JavaScript inline, especially if the script is this small. I would say just paste it into your code; it won't increase the HTML document size by much.
While inlining the script will save a request, as YSlow suggests it increases the HTML document size, and mixes content/markup with code/logic, which you generally want to avoid as much as possible.
The reason Yslow gives this caveat:
Only consider this if your property is a common user home page.
Is that if the page is loaded frequently, it's worth it to have the javascript external, since the files will be cached in the browser. So, if you combine your JS into one file, on the first request you incur one extra request, and on subsequent requests the file is loaded from the cache.
Aaron Peters' talk from last year's Velocity EU gives a good insight into the options and the course you should choose - http://www.slideshare.net/startrender/fast-loading-javascript
For really small snippets of JS it's really not worth putting them in an external file, as the network overhead of retrieving them will dwarf the benefits.
Depending on the latency it may even be worth inlining large scripts, e.g. Bing mobile includes loads of JS in the first page load, which it then caches in localStorage for later pages.
Addy Osmani recently put together an experimental library to help people play with caching scripts in localStorage - http://addyosmani.github.com/basket.js/
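This is not basket.js's actual API, just a rough sketch of the underlying idea; it assumes the script is served from the same origin so XHR can fetch it, and that the browser supports localStorage and XHR onload:

// Fetch the script text once, keep it in localStorage, and on later visits
// execute the cached copy without touching the network.
function loadCachedScript(url) {
    var key = "script-cache:" + url;

    function execute(source) {
        var s = document.createElement("script");
        s.text = source; // inline script element, runs in global scope
        document.getElementsByTagName("head")[0].appendChild(s);
    }

    var cached = localStorage.getItem(key);
    if (cached !== null) {
        execute(cached); // cache hit: no network request at all
        return;
    }

    var xhr = new XMLHttpRequest();
    xhr.open("GET", url, true);
    xhr.onload = function () {
        localStorage.setItem(key, xhr.responseText);
        execute(xhr.responseText);
    };
    xhr.send();
}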
I know that best practice for including JavaScript is having all code in a separate .js file and allowing browsers to cache that file.
But when we begin to use many jQuery plugins which have their own .js files, and our functions depend on them, wouldn't it be better to dynamically load only the functions and the required .js files for the current page?
Wouldn't a page be faster if, when I only need one function, I loaded it dynamically by embedding it in the HTML with a script tag, instead of loading the whole JS along with the plugins?
In other words, aren't there any cases in which there are better practices than keeping our whole javascript code in a separate .js?
It would seem at first glance that this would be a good idea, but in fact it would actually make matters worse. For example, if one page needs plugins 1, 2 and 3, then a file would be built server-side with those plugins in it. Now the browser goes to another page that needs plugins 2 and 4. This would cause another file to be built; this new file would be different from the first one, but it would also contain the code for plugin 2, so the same code ends up getting downloaded twice, bypassing the version that the browser already has.
You are best off leaving the caching to the browser, rather than trying to second-guess it. However, there are options to improve things.
Top of the list is using a CDN. If the plugins you are using are fairly popular ones, then the chances are that they are being hosted on a CDN. If you link to the CDN-hosted plugins, then for any first-time visitors who happen to have already visited another site using the same plugins from the same CDN, the plugins will already be cached.
There are, of course, other things you can do to speed your JavaScript up. Best practice includes placing all your script include tags as close to the bottom of the document as possible, so as not to hold up page rendering. You should also look into lazy initialization: for anything that needs significant setup to work, attach a minimalist event handler that, when triggered, removes itself and sets up the real event handler.
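A small sketch of that lazy-initialization pattern, assuming a hypothetical "#search" field whose real behaviour needs expensive one-time setup:

// The first focus does the expensive setup, swaps in the real handler,
// then hands the triggering event over to it.
var box = document.getElementById("search");

function realHandler(e) {
    // normal per-focus work goes here
}

function lazyHandler(e) {
    box.removeEventListener("focus", lazyHandler, false);
    // ...expensive one-time setup (build indexes, load data, etc.)...
    box.addEventListener("focus", realHandler, false);
    realHandler(e); // don't drop the event that triggered the setup
}

box.addEventListener("focus", lazyHandler, false);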
One problem with having separate JS files is that it will cause more HTTP requests.
Yahoo have a good best practices guide on speeding up your site: http://developer.yahoo.com/performance/rules.html
I believe Google's Closure library has something for combining JavaScript files and dependencies, but I haven't looked too much into it yet, so don't quote me on it: http://code.google.com/closure/library/docs/calcdeps.html
Also there is a tool called jingo http://code.google.com/p/jingo/ but again, I haven't used it yet.
I keep separate files for each plug-in and page during development, but during production I merge-and-minify all my JavaScript files into a single JS file loaded uniformly throughout the site. My main layout file in my web framework (Sinatra) uses the deployment mode to automatically either generate script tags for all JS files (in order, based on a manifest file) or perform the minification and include a single querystring-timestamped script inclusion.
Every page is given a body tag with a unique id, e.g. <body id="contact">.
For those scripts that need to be specific to a particular page, I either modify the selectors to be prefixed by the body:
$('body#contact form#contact').submit(...);
or (more typically) I have the onload handlers for that page bail early:
jQuery(function($){
    if (!$('body#contact').length) return;
    // Do things specific to the contact page here.
});
Yes, including code (or even a plug-in) that may only be needed by one page of the site is inefficient if the user never visits that page. On the other hand, after the initial load the entire site's JS is ready to roll from the cache.
Network latency is the main problem. You can get a very responsive page if you reduce the HTTP calls to one.
That means all the JS and CSS are bundled into the HTML page. And if you can forget IE6/7, you can embed the images as data:image/png;base64 URIs.
When we release a new version of our web app, a shell script minifies and bundles everything into a single HTML page.
Then there is a second call for the data, and we render all the HTML client-side using a JS template library: PURE
Ensure the page is cached and gzipped. There is probably a size limit to consider; we try to stay under 400 kB unzipped, and load secondary resources later when needed.
You can also try a service like http://www.blaze.io. It automatically performs most front-end optimization tactics and also couples in a CDN.
They're currently in private beta, but it's worth submitting your website to.
I would recommend you group common bits of functionality into individual JavaScript module files and load them only on the pages where they are used, using RequireJS / head.js or a similar dependency-management tool.
An example: if you are using lightbox popups, contact forms, tracking, and image sliders in different parts of the website, you could separate these into 4 modules and load them only where needed. That way you optimize caching and make sure your site has no unnecessary flab.
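As a rough sketch with RequireJS (the module names and the open() call are hypothetical), a page pulls in only the modules it actually uses:

// Only pages containing a gallery ever download the lightbox module.
if (document.getElementById("gallery")) {
    require(["jquery", "lightbox"], function ($, lightbox) {
        $("#gallery a").click(function (e) {
            e.preventDefault();
            lightbox.open(this.href);
        });
    });
}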
As a general rule it's always best to have fewer files rather than more. It's also important to work on the timing of each JS file, as some are needed BEFORE the page completes loading and some AFTER (i.e. when the user clicks something).
See a lot more tips in the article: 25 Techniques for Javascript Performance Optimization.
Including a section on managing Javascript file dependencies.
Cheers, hope this is useful.
I have a PHP application in which there is a div container that includes the app specified by a $_GET parameter in the URL string.
Currently, there are two applications which require TinyMCE, and one application which requires jQuery and jQuery UI.
Question is, where should I include the files on the page?
With the scripts in the header, the page loads really slowly, >30 seconds (now <10 seconds, using a different router), and at the bottom, the pages which require the scripts fail to load correctly.
The JS files have already been minified and compressed.
TinyMCE won't gzip because Zlib is installed (as a result of GD), so how should I optimise the situation?
The Yahoo! Exceptional Performance team recommends to put the script elements at the end of the body element.
At the bottom, and run your scripts when the document is fully loaded (using the "onload" event).
By placing the JavaScript file just before the closing BODY tag, you are allowing the rest of the page to load while the JavaScript file is loading. If you place it in the HEAD section, the page will hang until the script loads.
If it's taking 30 seconds to load in the header, though, you are probably facing a different issue. TinyMCE should not take 30 seconds to load.
There probably is no one correct answer for this.
Generally placing JavaScript in the <head> works fine, but 30 seconds is way too much. I'm developing a JavaScript app which dynamically loads about 70 uncompressed JavaScript files (some quite large) and it doesn't take anywhere near 30 seconds.
Too little information to solve this.
How many JS files is it? If it's many, then you may want to look at Steve Souders' slides for Even Faster Websites. Downloading a JS file is a blocking action; Souders has a nice solution for dealing with script blocking. Check the PPT from http://www.thebitsource.com/2009/03/14/sxsw-interactive-2009-steve-souders-on-even-faster-web-sites/
Also, where are you serving the JS from? Try serving jQuery from the Google AJAX Libraries API. It uses their CDN and caches for a long time, so the user will only have to download the files once.
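A common companion to that approach (a hedged sketch, with a hypothetical local path) is a one-line check placed right after the CDN script tag, so the site still works if the CDN copy fails to load:

// If the CDN copy of jQuery didn't load, fall back to a local copy.
if (!window.jQuery) {
    document.write('<script src="/js/jquery.min.js"><\/script>');
}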
I want my JS code to be completely separated from the XHTML, so putting it inline in the HTML before the closing body tag won't do for me.
I declare one single JS file in the HTML head, then copy/paste all libraries etc. into that JS file. This results in one HTTP request, which speeds things up on mobile browsers too.
I then use Prototype to get DOM-sensitive functions started:
document.observe("dom:loaded", function() { /* code goes here */ });
I think this may be a browser-dependent question. Suppose I have 10 JavaScript files and several HTML pages. Suppose HTML pageA needs only JS1.js and JS3.js; similarly, HTML pageB needs JS4.js and JS1.js. I want to know what the effect of including all 10 JavaScript files in all HTML pages would be. Will it directly relate to the memory consumption by the browser?
I am facing this problem particularly with the YUI JavaScript library. There are several components like datatable, event, container, calendar, dom-event etc. The order in which they are included also seems to matter a lot; for example, the dom-event JS should be included before the rest for it to work. So to avoid all this confusion, I thought of including all these JS files in a header file that gets included in all HTML pages.
The thing that I am worried about is the memory bloat and performance problems that it may cause. Please provide your suggestions on the same..
Thanks,
-Keshav
Any script you load into your page, even once downloaded and cached, must still be parsed before the rest of the page can load. So in that sense there is a memory penalty, and there's still a potential for something in the script to significantly delay rendering.
However, in the case of a conscientiously designed library such as YUI I would expect the parsing time to be minimised.
If you can load all your scripts in at the end of the page, that can vastly improve performance as the entire page can render before being blocked by javascript execution, and your site will feel a lot snappier.
I would suggest investigating the Firebug Net panel and the YSlow extension to get specific performance stats for your website.
External scripts delay the display of the following html until they have loaded and executed. The impact is much less after the first page load, since they're already cached, though browsers will occasionally check for new versions, which still carries a delay. I try to limit the number of scripts and move the script tags to the bottom of the page when possible. Users won't notice the script loading delay if the page has already fully displayed.
If a given script does nothing, it will not affect the performance.
Obviously the first page will load slowly, but the rest will not need to load all the scripts because they will be cached, so the next pages will load faster.
Tips:
1) Load the script at the bottom of the page (just before the closing BODY tag).
2) Use a non-blocking way of loading the scripts. This is the one I'm using:
<script type="text/javascript">
    // Injects a script element so the file downloads without blocking rendering.
    // Note: scripts added this way start downloading in the order given, but
    // their execution order is not guaranteed in every browser.
    function AttachScript(src) {
        var script = document.createElement("script");
        script.type = "text/javascript";
        document.getElementsByTagName("body")[0].appendChild(script);
        script.src = src;
    }
    AttachScript("/js/jquery.min.js");
    AttachScript("/js/ndr.js");
    AttachScript("/js/shadowbox.js");
    AttachScript("/js/libraries/sizzle/sizzle.js");
    AttachScript("/js/languages/shadowbox-es.js");
    AttachScript("/js/players/shadowbox-img.js");
    AttachScript("/js/adapters/shadowbox-jquery.js");
</script>
Can't find the source web page though :-(
Memory Consumption:
Assuming the scripts are well written then memory consumption and performance issues should be nominal. Your biggest problem with including all scripts at once will be the latency in the user experience first time through, or if you make changes, because they will have to download all of them in one hit. I think you should only include the scripts you need per page, not all scripts at once.
You can assess the impact yourself using simple tools like the Task Manager's process list in Windows to monitor memory/processor usage, or plug-ins like Firebug for Firefox.
You can also look into something called minification to help make your script files as small as possible.
Dependencies:
The order in which you include the scripts is important, as some scripts may depend on functionality in other scripts. So if the code in one script attempts to run and it requires code in another script that has not been downloaded, then it will fail. My advice would be to actually understand those dependencies in your script files rather than just downloading everything at once because it seems easier.
Use the YUI Configurator to help determine the required file includes and order, as well as how to use the Yahoo! CDN combo service to combine all YUI files into a single script tag.
http://developer.yahoo.com/yui/articles/hosting/
External assets to the HTML page are typically cached by the browser. External assets are anything requested from the HTML, such as images, CSS, JavaScript, and anything else. So if you load all 10 script files up front you are forcing a one-time massive download hit on your user. After this one time the user does not need to download the scripts again unless the modification timestamps on the files change.
Your page will only use what it requires. If a particular page requests js4.js and js5.js, then all the functions in those files will be loaded into the interpreter, first in the order in which they are requested from the HTML and second in the order in which they are specified in each of those files. If there are any namespace conflicts, whatever is loaded into the interpreter last wins. The interpreter will clear out the functions once the page is unloaded from the browser.
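A tiny illustration of that "last one wins" point, with hypothetical file contents:

// js4.js (hypothetical contents)
function formatDate(d) { return d.toDateString(); }

// js5.js (hypothetical contents), included after js4.js
function formatDate(d) { return d.toISOString(); }

formatDate(new Date()); // uses the js5.js version -- the later definition wins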
For efficiency I would suggest using a server-side inclusion process to read each of the JS files and combine their contents into a new single JS file. This will reduce the number of HTTP requests to the server and save your users a great deal of bandwidth with regard to HTTP headers and GET requests. Also, put the request for this new single script file directly before the closing body tag of your HTML. Downloading of scripts blocks parallel downloads in IE, so you want to load scripts at the lowest possible point in the page.
Scriptaculous implements a nice way to handle js dependencies. Guess you could check it out and "re-implement" it. ;D
As for memory bloat and performance issues... as long as your JS doesn't leak a lot (YUI probably doesn't) memory won't be much of a problem, although it will make your pages load slower, especially if loaded in the header.
You can read up on caching methods using PHP to serve several JavaScript files as one big JS file that includes everything you need. For additional performance gains, you can make the browser cache the file locally, in addition to sending it gzipped (if the browser supports the encoding, using something like ob_start("ob_gzhandler");). By using gzip encoding, you can severely reduce the file size of the main JS file you're sending, which includes all your JS code (since plain text compresses so well). I recently had to do this on my own website and it's worked like a charm for both JS and CSS files.
http://www.ejeliot.com/blog/72
Note that by following the instructions in that tutorial, your JS file will only be sent once, and the browser on the client's machine will keep a local copy stored, which will also improve performance on every visit thereafter.
Also, consider googling "Minify" which should be hosted on Google Code.