How can I optimize my web-page (which is quite large)? - javascript

I'm working on a web application.
The application runs fine, but the problem is that the first time I open it in the browser it shows a blank page, and I have to hit refresh three or four times before the page loads completely and correctly.
I think my application is too heavy to load; however, once it is loaded it's good to go.
I have about 5 JavaScript files, which total around 1.3 MB in size, as well as some UI components.
Is there a way to control this so that when I load the application it returns the entire application, without having to hit refresh again and again?
Is there a way to optimize this page?
Please help.
Thank you in advance.
Hi again,
is there a way to automatically reload the page if it didn't load the first time?

Check whether you can optimize your JavaScript code. Do you need all of the functions defined in those 5 JavaScript files? If not, you can split them up and load each part only on the pages that need that functionality.
Try to find out which part of the code is making it slow.

1.3 MB of JavaScript is too much. Try compressing (minifying) your JavaScript (see the before/after illustration below):
http://jscompress.com/
After compression, try deferring the loading of whichever JavaScript files you can:
http://www.websiteoptimization.com/speed/tweak/defer/
Run the YSlow add-on to gather more information about possible optimizations:
http://developer.yahoo.com/yslow/
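As a rough illustration of what minification does (this is a made-up function; the exact output varies by tool):

// Before: readable source
function calculateTotal(prices) {
    var total = 0;
    for (var i = 0; i < prices.length; i++) {
        total += prices[i];
    }
    return total;
}

// After: whitespace stripped and local names shortened by the minifier
function calculateTotal(a){for(var b=0,c=0;c<a.length;c++)b+=a[c];return b}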

The easiest method is to run YSlow in Firefox (it runs as a Firebug add-on).
You should also compress your JavaScript files using the YUI Compressor.

Have you minified your JavaScript? This makes it more difficult for humans to read, but can significantly reduce the file size. If one of those scripts is jQuery, consider referencing the copy hosted by Google rather than hosting it on your own server (see the snippet below). Google's servers are probably faster than yours, and a lot of users will already have Google's copy of jQuery cached.
If the site is image-heavy and uses PNGs, consider stripping unneeded data from them with a tool like pngcrush to make them smaller.
As mentioned by a few others, running the page through YSlow is very likely to help find issues that could cause slow performance.
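For the Google-hosted jQuery suggestion, a minimal sketch (the version number and the local fallback path are only examples):

<script src="https://ajax.googleapis.com/ajax/libs/jquery/1.7.2/jquery.min.js"></script>
<script>
// Fall back to a local copy if the CDN copy failed to load for some reason
window.jQuery || document.write('<script src="/js/jquery-1.7.2.min.js"><\/script>');
</script>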


My web site is freezing - probably JavaScript, but I can't identify why

I know this is a long shot, so I'm not counting on an answer, but I just can't for the life of me figure out why this web design keeps freezing my browsers.
http://fuzionve.com/test
I ran the Audit tool in Chrome Developer Tools, but nothing seems that substantial. When I load it in the browser, it freezes, then loads completely at once.
Any suggestions?
Thanks so much for your help.
1) Remove/comment out some code
2) Load the page and test whether it still freezes
3) If it still freezes, repeat from step 1
4) If it no longer freezes, examine the code you just removed.
It works fine here.
Some advice on "easy" optimizations:
<link rel="stylesheet" href="http://fonts.googleapis.com/css?family=PT+Sans:400,700,400italic" /> - this is an external stylesheet. Your browser has to set up a connection with another server. Consider downloading the font (and stylesheet) to your own server.
You load 4 stylesheets, which requires 4 round trips to the server. Consider merging them into 1 stylesheet. If you prefer using multiple stylesheets in development, merge them into a single file in the release version.
Same thing for scripts: you have 3 scripts. Consider merging them into 1 file in the release version (see the sketch below).
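In other words, the release build ends up with something like this (the file names are just placeholders):

<!-- instead of 4 <link> tags and 3 <script> tags -->
<link rel="stylesheet" href="/css/site.min.css" />
<script src="/js/site.min.js"></script>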
When your browser detects a <script> element, it stops rendering your page until that script file has been downloaded from your server and executed. This is because scripts can use document.write() to write HTML, changing your page.
If you know your scripts don't write HTML (usually they don't, at least until the document.ready event is fired), consider using the HTML5 async or defer attributes (https://developer.mozilla.org/en-US/docs/HTML/Element/script). Note that these attributes are only supported in modern browsers.
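A minimal sketch of those two attributes (the file names are just placeholders):

<!-- defer: download in parallel, execute in document order once parsing is finished -->
<script defer src="/js/app.js"></script>

<!-- async: download in parallel, execute as soon as it arrives (order not guaranteed) -->
<script async src="/js/analytics.js"></script>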
If those things don't help, follow Brad M's advice to trace your problem.
I had the same problem when using multiple (and heavy) plugins. What you can do is change some of their code so that each one starts after another finishes, like callbacks; I changed the code of 3-4 plugins to make that work.
It's possible to accomplish this by polling with setInterval to check whether a given plugin is done.
Or implement a callback in the code: after one piece is done, it calls the next.
I've found this library, which can help you:
https://github.com/caolan/async
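For example, a minimal sketch of chaining initialization with that library (initHeavyPluginA and initHeavyPluginB are hypothetical stand-ins for your plugins' setup code):

async.series([
    function (done) {
        initHeavyPluginA(); // hypothetical plugin setup
        done(null);
    },
    function (done) {
        initHeavyPluginB(); // runs only after the first task has called done()
        done(null);
    }
], function (err) {
    // everything has finished initializing at this point
});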

Put javascript and css inline in a single minified html file to improve performance?

A typical website consists of one index.html file and a bunch of javascript and css files. To improve the performance of the website, one can:
Minify the javascript and css files, to reduce the file sizes.
Concatenate the javascript files into one file and similar for the css files, to reduce the number of requests to the server. For commonly used (and shared) libraries like jquery it makes sense to leave them external, allowing the browser to cache the library and reuse it in different web applications.
I'm wondering if it makes sense to put the concatenated JavaScript and CSS inline in one single HTML file, which would reduce the number of requests even further. Will this improve the performance of your site? Or will it have the opposite effect, making it impossible for the browser to cache anything?
Concatenating your CSS and JS files into one file will reduce the number of requests and make the page load faster. But as commented, it won't make much sense unless you have a one-page site and the load time of that page is very critical. So you're better off keeping CSS separate from JavaScript, in my opinion.
Here's a book where you can learn more about the topic:
High Performance Web Sites
This tool may help you.
It turns your web page into a single HTML file with everything inlined - perfect for appcache manifests on mobile devices where you want to reduce HTTP requests.
https://github.com/remy/inliner
It would cut down on the number of requests, but it would also mean no caching of those resources for use on other pages. Think of defining an external file as a way of telling the browser "this section of the site is reusable". You'd be taking that ability away, so the CSS and JS would have to be loaded again with every page. Like jackwanders said, it's great if you only have one page.
This is not a good idea for the following reasons:
You will not enjoy the benefit of cache
You will load unneeded resources in all of your pages
You will have a hard time while developing your website because of large files with unrelated code branches
If you work in a team you will have to work with your teammates on the same files always, which means that you will have a lot of merge conflicts.
You can have a single CSS for all your pages and since it will be cached, the subsequent pages will refer it from cache without sending extra request.
However, whether to put all JavaScript files into one depends on the context.
Most probably you are using libraries like jQuery and related plugins. Combining them 'might' cause conflicts between plugins. So before you merge everything at once, try merging a few files at a time and check whether any errors pop up.

Which is better for JavaScript load-time: Compress all in one big file or load all asynchronously?

A simple question that I'm not sure if it has a short answer!
Description
I have a set of JavaScript files to be loaded on a website; here are some notes about them:
They all come from the same domain (no cross-domain loading needed).
They are identical across the website.
There are several files, like jQuery and 5 other plugins, plus my own application script that is based on them.
Their total size, compressed, is 224 KB (I combine all the files into one file, then compress them at once using YUI Compressor 2).
Problem
I've heard that 224 KB is not ideal for one file and that it should be split into several files of at most 44 KB each. I can't recall where I heard this, and I'm not sure whether splitting it into more files is effective, but it's true that 224 KB takes a long time to load the first time, considering that the page is also loaded with images and CSS, of course.
I've minimized the need for loading the JavaScript file early and put it at the bottom; so far this is good progress, but I need to load it asynchronously with the HTML to gain time (source). The decision to make is:
Yes or no?
Keep it in one big compressed file, or split it into many compressed files and load them asynchronously (I'm aware of how to handle the dependency-related problems)?
It depends on what the site is and how important first load time is for it.
Regardless of that though, I'd probably load jQuery and similar libraries from a public CDN. One big benefit is that they might already be cached even if the user has never been to your site.
http://encosia.com/2008/12/10/3-reasons-why-you-should-let-google-host-jquery-for-you/
The Cappuccino team is a big proponent of one file -- they make a javascript framework. Apps made with their tool are expected to have some load time.
http://cappuccino.org/discuss/2009/11/11/just-one-file-with-cappuccino-0-8/
Another benefit of loading jQuery and related libraries from a public CDN is more parallel downloads across hosts. Browsers limit the number of concurrent requests per domain (historically as few as 2), so by loading jQuery from Google, a plugin from the jQuery site, and your custom app code from your own domain, the browser can fetch them concurrently rather than waiting for the first two before issuing a third request.
I guess this adds another performance improvement over one large file as well. Even if you just split that 1 file into 2, it could be retrieved with 2 concurrent requests from the browser, potentially improving load time.
Here's what we did to make our web app fast.
The main JS and CSS files are compressed and put inline with the HTML markup.
The whitespace in the HTML is removed and the images are converted to data:image/png URIs by a shell script.
The size is ~400 KB, but it is cached and gzipped.
The mobile version of the web app is the same but at ~250 KB.
It means the whole app is ready to run, like an executable, in a single HTTP call.
Then a second HTTP call gets the data (JSON), and we use PURE to render it in HTML, using the existing markup in the page as templates.
The app is divided into modules; only the common modules are preloaded this way. The others are loaded when requested by the user.
There is no exact answer to this question. It pretty much depends on how and when you are making use of those files.
Typically, you only want to download on page load the JS files that are universally required by the web app. Module-specific or page-specific JS files shouldn't be bundled into the main JS download and would ideally be loaded on demand.
Also, this question is only relevant if you are concerned about the experience of first-time visitors. The JS files will be cached anyway for every subsequent visit.

Integrate Javascript resources in HTML: what's the best way?

I'm working on a project which uses many scripts (Google Maps, jQuery, jQuery plugins, jQuery UI...). Some pages have almost 350 kB of Javascript.
We are concerned about performance and I'm asking myself what is the best way to integrate those heavy scripts.
We have 2 solutions:
Include all scripts in the head, even if they are not utilized on the page.
Include some common scripts in the head, and include page specific ones when they are needed.
I would like to have your advice.
Thanks.
For the best performance I would create a single static minified JavaScript file (using a tool like YUI Compressor) and reference it from the head section. For good tips on website performance check out Google's website optimization page.
Note that the performance penalty of retrieving all your JavaScript files only happens on the first page, as the browser will use the cached version of the file on subsequent pages.
For even better responsiveness you could split your JavaScript into two files: load the first, with all the JavaScript you need at page load, normally, then after the page has loaded, load the second file in the background (see the sketch below).
If you're interested, I have an open-source AJAX JavaScript framework that simplifies, compresses and concatenates all your HTML, CSS and JavaScript (including 3rd-party libraries) into a single JavaScript file.
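A minimal sketch of the split-file idea above (the file names are just placeholders):

<script src="/js/core.js"></script>
<script>
// After the page has finished loading, pull in the non-critical script in the background.
// (A sketch only; real code would avoid clobbering any existing onload handler.)
window.onload = function () {
    var s = document.createElement("script");
    s.src = "/js/extras.js";
    document.getElementsByTagName("body")[0].appendChild(s);
};
</script>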
If you think it's likely that some users will never need the Google Maps JavaScript for example, just include that in the relevant pages. Otherwise, put them all in the head - they'll be cached soon enough (and those from Google's CDN may be cached already).
Scripts in the <head> tag do (I think) stop the page from rendering further until they’ve finished downloading, so you might want to move them down to the end of the <body> tag.
It won’t actually make anything load faster, but it should make your page content appear more quickly in some situations, so it should feel faster.
I’d also query whether you’ve really got 350 KB of JavaScript coming down the pipe. Surely the libraries are gzipped? jQuery 1.4 is 19 KB when minified and gzipped.
1) I would recommend gathering all the common and most important scripts, like jQuery etc., into one file to reduce the number of requests for these files, and compressing it; for that I would recommend Google Closure Compiler, which you will find here.
2) Do the loading on a page the user has to open at the beginning, like the login page, and put the scripts at the end of the page so all the content renders first; this is recommended by most performance tools like YSlow and Page Speed.
3) Don't write scripts inline in your pages; try to put everything in a file to make compression and minification easier later on.
4) Put the scripts and all static files like images and CSS on another domain to separate that load from your server.

Javascript file inclusion in html pages- what happens underneath in the browser?

I think this may be a browser-dependent question. Suppose I have 10 JavaScript files and several HTML pages. Suppose HTML pageA needs only JS1.js and JS3.js, and similarly HTML pageB needs JS4.js and JS1.js. I want to know what the effect would be of including all 10 JavaScript files in all HTML pages. Will it directly affect the browser's memory consumption?
I am facing this problem particularly with the YUI JavaScript library. There are several components like datatable, event, container, calendar, dom-event etc., and the order in which they are included also seems to matter a lot - for example, the dom-event JS should be included before the rest for them to work. So to avoid all this confusion, I thought of including all these JS files in a header file that gets included in all HTML pages.
The thing that I am worried about is the memory bloat and performance problems that it may cause. Please provide your suggestions on this.
Thanks,
-Keshav
Any script you load into your page, even once downloaded and cached must still be parsed before the rest of the page can load. So in that sense there is a memory penalty, and there's still a potential for something in the script to significantly delay rendering.
However, in the case of a conscientiously designed library such as YUI I would expect the parsing time to be minimised.
If you can load all your scripts in at the end of the page, that can vastly improve performance as the entire page can render before being blocked by javascript execution, and your site will feel a lot snappier.
I would suggest investigating the Firebug Net panel and the YSlow extension to get specific performance stats for your website.
External scripts delay the display of the following html until they have loaded and executed. The impact is much less after the first page load, since they're already cached, though browsers will occasionally check for new versions, which still carries a delay. I try to limit the number of scripts and move the script tags to the bottom of the page when possible. Users won't notice the script loading delay if the page has already fully displayed.
If a given script does nothing, it will not affect performance.
Obviously the first page will load slowly, but the rest will not need to load all the scripts, because they will be cached. So the subsequent pages will load faster.
Tips:
1) Load the scripts at the bottom of the page (just before the closing BODY tag).
2) Use a non-blocking way of loading the scripts. This is the one I'm using:
<script type="text/javascript">
    // Create a <script> element and append it to <body> so the file loads
    // without blocking rendering of the rest of the page.
    function AttachScript(src) {
        var script = document.createElement("script");
        script.type = "text/javascript";
        script.src = src;
        document.getElementsByTagName("body")[0].appendChild(script);
    }
    AttachScript("/js/jquery.min.js");
    AttachScript("/js/ndr.js");
    AttachScript("/js/shadowbox.js");
    AttachScript("/js/libraries/sizzle/sizzle.js");
    AttachScript("/js/languages/shadowbox-es.js");
    AttachScript("/js/players/shadowbox-img.js");
    AttachScript("/js/adapters/shadowbox-jquery.js");
</script>
Can't find the source web page though :-(
Memory Consumption:
Assuming the scripts are well written, memory consumption and performance issues should be nominal. Your biggest problem with including all scripts at once will be the latency in the user experience the first time through, or whenever you make changes, because users will have to download all of them in one hit. I think you should only include the scripts you need per page, not all scripts at once.
You can assess the impact yourself using simple tools like Task Manager/processes in Windows to monitor memory/processor usage, or plug-ins like Firebug for Firefox.
You can also look into something called minification to help make your script files as small as possible.
Dependencies:
The order in which you include the scripts is important, as some scripts may depend on functionality in other scripts. So if the code in one script attempts to run and it requires code in another script that has not been downloaded yet, it will fail. My advice would be to actually understand those dependencies in your script files rather than just downloading everything at once because it seems easier.
Use the YUI Configurator to help determine the required file includes and order, as well as how to use the Yahoo! CDN combo service to combine all YUI files into a single script tag.
http://developer.yahoo.com/yui/articles/hosting/
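As a rough illustration of the ordering issue (the paths and file names here are only placeholders; the Configurator will give you the exact list for your components):

<!-- yahoo-dom-event must load first: the other components depend on it -->
<script src="/js/yui/yahoo-dom-event.js"></script>
<script src="/js/yui/element-min.js"></script>
<script src="/js/yui/datasource-min.js"></script>
<script src="/js/yui/datatable-min.js"></script>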
External assets to the HTML page are typically cached by the browser. External assets are anything requested from the HTML, such as images, CSS, JavaScript, and anything else. So if you load all 10 script files up front you are forcing a one-time massive download hit on your user. After this one time the user does not need to download the scripts again unless the modification timestamps on the files change.
Your page will only use what it requires. If a particular page requests js4.js and js5.js, then all the functions in those files will be loaded into the interpreter, in the order in which they are first requested from the HTML and then by the order in which they are specified within each of those files. If there are any namespace conflicts, whatever is loaded into the interpreter last wins. The interpreter will clear out the functions once the page is unloaded from the browser.
For efficiency I would suggest using a server-side inclusion process to read each of the JS files and combine their contents into a new single JS file (a sketch follows below). This will reduce the number of HTTP requests to the server and save your users a large amount of bandwidth on HTTP headers and GET requests. Also, place the request for this new single script file directly before the closing body tag of your HTML. Downloading scripts blocks parallel downloads in IE, so you want to load scripts at the lowest possible point in the page.
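A minimal sketch of that concatenation step, written here as a one-off Node.js build script (the answer doesn't name a specific tool, and the file names are placeholders):

// build.js - run with: node build.js
var fs = require("fs");

var files = ["js/jquery.min.js", "js/plugins.js", "js/app.js"]; // placeholder list
var combined = files.map(function (f) {
    return fs.readFileSync(f, "utf8");
}).join(";\n"); // a separating semicolon guards against files missing a trailing one

fs.writeFileSync("js/all.js", combined);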
Scriptaculous implements a nice way to handle js dependencies. Guess you could check it out and "re-implement" it. ;D
As for memory bloat and performance issues... as long as your JS doesn't leak a lot (YUI probably doesn't) memory won't be much of a problem, although it will make your pages load slower, especially if loaded in the header.
You can read about caching methods that use PHP to serve several JavaScript files as one big JS file which includes everything you need. For additional performance gains, you can make the browser cache the file locally, in addition to sending it gzipped (if the browser supports the encoding, using something like ob_start("ob_gzhandler");). By using gzip encoding, you can severely reduce the file size of the main JS file you're sending, which includes all your JS code (since plain text compresses so well). I recently had to do this on my own website and it has worked like a charm for both JS and CSS files.
http://www.ejeliot.com/blog/72
Note that by following the instructions on that tutorial, your JS file will only be sent once and the browser on the client's machine will keep a local copy stored which will also improve performance of every visit thereafter.
Also, consider googling "Minify" which should be hosted on Google Code.
