I'm working on a new project based entirely on Java EE 6 technologies, including JSF/EJB/JPA. In previous projects I used json/ajax extensively. But this time I'm reconsidering whether I really need json/ajax any more. JSF gives me a great way to build a web UI. It seems the only benefit left on the json/ajax side is refreshing data without reloading the page, at the cost of complex data processing in javascript. If I cut off json/ajax, I'm cutting off both the benefit and the javascript work, which doesn't really seem to hurt much.
I especially do not like jQuery. I think jQuery is like a fabulous promise: it always starts out in an attractive way, but as the project grows more and more complex and more jQuery elements are added, weird javascript errors emerge here and there. Even if I use json/ajax, I will not use jQuery anyway.
What do you think about it? Will you continue to work on json/ajax when using JSF?
On one of my web projects, I use a lot of javascript/jQuery code, which is pretty slow in browsers (Windows 7 x64), especially in IE.
I use 3 Ajax requests at the same time on the Home page alone.
On the Search page I also use ajax requests, which are fired on the scroll event, on any 'search tag' (simple anchor tag) click event, etc., which in general makes data loading very slow.
I use jQuery plugins such as AnythingSlider, the jQuery cookies plugin, Raty (a rating plugin), Tipsy, jQuery coreUISelect, jScrollPane, mousewheel, etc. I have minified and combined all those 3rd party plugins into a jquery.plugins.js file, which is almost 80KB.
I select a lot of DOM elements with jQuery. For example I use the following code:
$("#element")
instead of:
document.getElementById('element');
I also have one big CSS file of more than 5,000 lines, because I have combined all the 3rd party jQuery plugins' css files into one file, for caching and fewer HTTP requests.
Well, I wonder, what can I do to optimize my code for better performance and speeding up web page load?
What kind of tools can I use to debug and profile my JS code? I forgot to mention that when I refresh the page in Google Chrome or Firefox with Firebug or Chrome's native developer tools open, the page also loads very slowly. Sometimes Firefox even crashes.
Will selecting DOM elements with raw js give me a better and faster way to work with the document, or should I stick with jQuery selecting? We're talking about roughly 50 elements.
Should I separate out external plugins such as AnythingSlider and minify them individually, or is it better to have an 'all in one' js file?
Is it better to also separate the jQuery plugins' css code from the main style.css? Because even hovering over an element and triggering the :hover state from the css file is pretty slow.
Well guys, I'm really counting on you.
I've been googling all night to find answers to my questions and really hope to find them here.
Thanks.
1) minify it
2) all the browsers come with built in debugging tools
3) reduce access to the dom by storing references in variables, don't look up the same element twice (see the sketch after this list)
4) separate and use a well known cdn
5) separate, just 'cos it's easier to manage
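For point 3, a minimal sketch of what caching a reference looks like (the #results id is hypothetical):

// wasteful: jQuery re-queries the DOM on every call
$("#results").addClass("active");
$("#results").find("li").hide();

// better: look the element up once and reuse the reference
var $results = $("#results");
$results.addClass("active");
$results.find("li").hide();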
More jQuery tips here: jquery-performance-rules, and here: improve-your-jquery-25-excellent-tips.
Make sure all your static resources are cached with no disk lookup
This library is pretty cool
You can compare selector performance here: http://jsperf.com/
Just set up your HTML code, include the jQuery lib, and add each selector you want to compare as a separate test case.
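If you want a rough in-browser comparison without setting up a jsPerf test case, something along these lines works in any modern developer console (the element id is just an example, and the absolute numbers will vary by browser):

// time 10,000 lookups with the native API
console.time('getElementById');
for (var i = 0; i < 10000; i++) { document.getElementById('element'); }
console.timeEnd('getElementById');

// time the same lookups through jQuery
console.time('jQuery #id selector');
for (var j = 0; j < 10000; j++) { $('#element'); }
console.timeEnd('jQuery #id selector');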
Many of the jquery-performance-rules still apply.
Also have a look here: jquery-proven-performance-tips-tricks.
Since there are a lot of ways to improve code, especially on a website as big as yours, I think it will be more useful to just post the external links, since these are very nicely written and concise tutorials and tools. Here are some of them:
Yahoo's tutorial, one of the most complete tutorials I know
W3Schools' tutorial on using image sprites, especially useful when there are a lot of small images on the page
Tips on efficient jQuery usage
Firebug, Firefox plugin for debugging javascript, html, css
HTML validator, can be very useful to quickly find errors in markup
HTML compressor for minifying your HTML code (there are a lot of tools on the web for this purpose, it's just a matter of finding the best one)
CSS compressor, same as for HTML
I would also recommend JetBrains' PhpStorm, an IDE for building web applications/websites. It is commercial software, but definitely worth every cent, since it gives hints and tips for improvement as you type.
Raw performance may not be your issue. If you notice poor performance on page load, see this article I wrote specifically about minimizing javascript execution on page initialization.
http://blog.lavoie.sl/2013/12/optimizing-page-loads-by-reducing-javascript.html
I am working on a web app in which I am using javascript on the client side to handle validation, calling backend cgi scripts, and that kind of thing. Now my problem is that the web GUI has become somewhat slower than it was earlier. Actually everything is working fine as expected, except for this issue.
I know that there may be many issues which directly affect the speed of the application.
After all, the whole application depends on the cgi script response, but is it still possible to make the app faster by taking care of certain javascript functions?
So can you please suggest what steps I should take to make javascript execution somewhat faster? (i.e. fewer lines of code)
Thanks in advance...
As long as you do not post some js code, it will be kind of difficult to help you.
Still, speaking in general: keep your DOM manipulations to a minimum, especially when dealing with lots of data. Do not call jQuery functions that affect the DOM (append, insertBefore, insertAfter) in a loop - try to do all your preparation in the loop and then call the DOM-affecting function once, with all the changes together.
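As a rough sketch of the difference (the list id and the items here are made up):

var items = ['a', 'b', 'c']; // imagine a few hundred entries

// slow: touches the DOM on every iteration
for (var i = 0; i < items.length; i++) {
    $('#list').append('<li>' + items[i] + '</li>');
}

// faster: build the markup first, touch the DOM once
var html = '';
for (var j = 0; j < items.length; j++) {
    html += '<li>' + items[j] + '</li>';
}
$('#list').append(html);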
But of course, I do not know if this is the case in any code of your application.
For our site, I'm using a lot of jQuery - right now I'm looking at 340 lines of jQuery code on top of the base library. How much is too much? I will be adding more; when do I start trying to condense the code and eventually move to OOP?
The number of lines doesn't mean anything - what matters is what you're actually doing. You could have 10 lines of supremely inefficient code that would do much more damage than a meticulously crafted 1000 lines of code.
Optimally, you should keep your script size to a minimum, but with today's 'Web 2.0' websites, you will most probably accumulate quite a lot of JavaScript code.
The important thing is that before you deploy your website, you make sure to minify and gzip your script files to reduce their size as much as possible.
If you are really interested in optimizing and improving your website performance, I highly recommend taking a look at Steve Souders' High Performance Web Sites: Essential Knowledge for Front-End Engineers
How much is too much depends a lot on your application.
You should strive to be concise, but not at the expense of readability or user experience.
I would pay attention to script loading time more than lines of code. If it gets to be too big, break the file down into page or section specific files. "Too much" is based solely on application performance and what you deem to be acceptable for your users.
340 lines is nothing - try using a few Telerik controls... it soon gets to 15k+ lines!
It depends on the project you are working on. You should keep your code efficient and readable. Once you deploy your website, just compress and gzip your scripts and that would improve performance.
I wouldn't concern yourself with the length of your JavaScript. You have multiple options available to you like using Packer to compress your JavaScript for release (you'll want to practice with it some since it does have a few rules for how it works).
Focus on making sure your code is understandable and easy to maintain. Heavy use of JavaScript in websites can get hairy in a big hurry.
Concerning yourself with trying to make it short or small can hurt you more than if a user has to wait an extra second for the page to load.
For development it becomes absolutely essential to separate out code into separate .js files or things will get messy.
HOWEVER,
Do not leave a ton of script references in a production page. Most browsers are limited to 2 simultaneous HTTP requests. Those script references will slow down your page load and far outweigh any possible benefit of caching components separately.
You can concatenate your development files into one file using JS Builder:
http://code.google.com/p/js-builder/
Edit: By script references I mean the <script src="blah.js"> tags. Each of those needs to be loaded via HTTP when the page loads.
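In other words, something like the hypothetical set of references below (the file names are made up) should collapse into a single file for production:

<!-- development: separate files, easy to manage, but several HTTP requests -->
<script src="validation.js"></script>
<script src="slider.js"></script>
<script src="tracking.js"></script>

<!-- production: one concatenated, minified file -->
<script src="all.min.js"></script>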
340 lines of javascript is nothing, but as your javascript code base grows I'd spend some time looking into frameworks for compressing and concatenating javascript on the fly. If you're on Java I'd recommend using JAWR, which lets you switch between multiple references in development mode and a single, minified script in production. Just make sure you test your app in production mode before you go live, as the minification algorithm could screw up your code in some obscure cases (if you write clean code and remember to end every line with a ';' you should be fine).
If you're not on Java I don't know of any frameworks, but implementing something similar yourself actually isn't that hard. I think I have some code lying around somewhere for doing it in eZ Publish, which is written in PHP.
I would like to know when I should include external scripts or write them inline with the html code, in terms of performance and ease of maintenance.
What is the general practice for this?
Real-world scenario - I have several html pages that need client-side form validation. For this I use a jQuery plugin that I include on all these pages. But the question is, do I:
write the bits of code that configure this script inline?
include all bits in one file that's shared among all these html pages?
include each bit in a separate external file, one for each html page?
Thanks.
At the time this answer was originally posted (2008), the rule was simple: All script should be external. Both for maintenance and performance.
(Why performance? Because if the code is separate, it can easier be cached by browsers.)
JavaScript doesn't belong in the HTML code and if it contains special characters (such as <, >) it even creates problems.
Nowadays, web scalability has changed. Reducing the number of requests has become a valid consideration due to the latency of making multiple HTTP requests. This makes the answer more complex: in most cases, having JavaScript external is still recommended. But for certain cases, especially very small pieces of code, inlining them into the site’s HTML makes sense.
Maintainability is definitely a reason to keep them external, but if the configuration is a one-liner (or in general shorter than the HTTP overhead you would get for making those files external) it's performance-wise better to keep them inline. Always remember, that each HTTP request generates some overhead in terms of execution time and traffic.
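As a purely hypothetical illustration (the variable and file names below are made up), a page-specific one-liner like the first snippet is hardly worth an extra request, while anything longer or reusable belongs in an external file:

<!-- inline: a single page-specific setting, cheaper than an extra HTTP request -->
<script>var galleryAutoplay = false;</script>

<!-- external: shared, reusable logic that browsers can cache -->
<script src="/js/gallery.js"></script>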
Naturally this all becomes irrelevant the moment your code is longer than a couple of lines and is not really specific to one single page. The moment you want to be able to reuse that code, make it external. If you don't, look at its size and decide then.
If you only care about performance, most of the advice in this thread is flat out wrong, and is becoming more and more wrong in the SPA era, where we can assume that the page is useless without the JS code. I've spent countless hours optimizing SPA page load times and verifying these results with different browsers. Across the board, the performance increase from re-orchestrating your html can be quite dramatic.
To get the best performance, you have to think of pages as two-stage rockets. These two stages roughly correspond to <head> and <body> phases, but think of them instead as <static> and <dynamic>. The static portion is basically a string constant which you shove down the response pipe as fast as you possibly can. This can be a little tricky if you use a lot of middleware that sets cookies (these need to be set before sending http content), but in principle it's just flushing the response buffer, hopefully before jumping into some templating code (razor, php, etc) on the server. This may sound difficult, but then I'm just explaining it wrong, because it's near trivial. As you may have guessed, this static portion should contain all javascript inlined and minified. It would look something like
<!DOCTYPE html>
<html>
<head>
<script>/*...inlined jquery, angular, your code*/</script>
<style>/* ditto css */</style>
</head>
<body>
<!-- inline all your templates, if applicable -->
<script type='template-mime' id='1'></script>
<script type='template-mime' id='2'></script>
<script type='template-mime' id='3'></script>
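<!-- the <dynamic> portion described below (server-rendered markup and data) is flushed here as the second chunk -->
</body>
</html>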
Since it costs you next to nothing to send this portion down the wire, you can expect that the client will start receiving it somewhere around 5ms + latency after connecting to your server. Assuming the server is reasonably close, this latency could be between 20ms and 60ms. Browsers will start processing this section as soon as they get it, and the processing time will normally dominate transfer time by a factor of 20 or more, which is now your amortized window for server-side processing of the <dynamic> portion.
It takes about 50ms for the browser (chrome, rest maybe 20% slower) to process inline jquery + signalr + angular + ng animate + ng touch + ng routes + lodash. That's pretty amazing in and of itself. Most web apps have less code than all those popular libraries put together, but let's say you have just as much, so we would win latency+100ms of processing on the client (this latency win comes from the second transfer chunk). By the time the second chunk arrives, we've processed all js code and templates and we can start executing dom transforms.
You may object that this method is orthogonal to the inlining concept, but it isn't. If you, instead of inlining, link to cdns or your own servers the browser would have to open another connection(s) and delay execution. Since this execution is basically free (as the server side is talking to the database) it must be clear that all of these jumps would cost more than doing no jumps at all. If there were a browser quirk that said external js executes faster we could measure which factor dominates. My measurements indicate that extra requests kill performance at this stage.
I work a lot with optimization of SPA apps. It's common for people to think that data volume is a big deal, while in truth latency, and execution often dominate. The minified libraries I listed add up to 300kb of data, and that's just 68 kb gzipped, or 200ms download on a 2mbit 3g/4g phone, which is exactly the latency it would take on the same phone to check IF it had the same data in its cache already, even if it was proxy cached, because the mobile latency tax (phone-to-tower-latency) still applies. Meanwhile, desktop connections that have lower first-hop latency typically have higher bandwidth anyway.
In short, right now (2014), it's best to inline all scripts, styles and templates.
EDIT (MAY 2016)
As JS applications continue to grow, and some of my payloads now stack up to 3+ megabytes of minified code, it's becoming obvious that at the very least common libraries should no longer be inlined.
Externalizing javascript is one of Yahoo's performance rules:
http://developer.yahoo.com/performance/rules.html#external
While the hard-and-fast rule that you should always externalize scripts will generally be a good bet, in some cases you may want to inline some of the scripts and styles. You should however only inline things that you know will improve performance (because you've measured this).
I think the 'specific to one page, short script' case is the only defensible case for inline script.
Actually, there's a pretty solid case to use inline javascript. If the js is small enough (one-liner), I tend to prefer the javascript inline because of two factors:
Locality. There's no need to navigate to an external file to validate the behaviour of some javascript.
AJAX. If you're refreshing some section of the page via AJAX, you may lose all of your DOM handlers (onclick, etc) for that section, depending on how you bound them. For example, using jQuery you can use the live or delegate methods to circumvent this, but I find that if the js is small enough it is preferable to just put it inline.
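For the AJAX point, a minimal sketch of the difference (the selectors here are made up; note that live is deprecated in newer jQuery versions, where on covers the delegated case):

// direct binding: the handler is lost when #results is replaced via AJAX
$('#results a.search-tag').click(function () { /* ... */ });

// delegated binding: survives the refresh, because the handler lives on a stable ancestor
$(document).on('click', '#results a.search-tag', function () {
    // handle the click here
});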
Another reason why you should always use external scripts is for easier transition to Content Security Policy (CSP). CSP defaults forbid all inline script, making your site more resistant to XSS attacks.
I would take a look at the required code and divide it into as many separate files as needed. Every js file would only hold one "logical set" of functions etc. eg. one file for all login related functions.
Then during site developement on each html page you only include those that are needed.
When you go live with your site you can optimize by combining every js file a page needs into one file.
The only defense I can offer for inline javascript is that when using strongly typed views with .NET MVC you can refer to C# variables mid-javascript, which I've found useful.
On the point of keeping JavaScript external:
ASP.NET 3.5 SP1 recently introduced functionality to create a composite script resource (merging a bunch of js files into one). Another benefit is that when web server compression is turned on, downloading one slightly larger file will have a better compression ratio than many smaller files (also less http overhead, fewer roundtrips, etc...). I guess this saves on the initial page load, then browser caching kicks in as mentioned above.
ASP.NET aside, this screencast explains the benefits in more detail:
http://www.asp.net/learn/3.5-SP1/video-296.aspx
Three considerations:
How much code do you need (sometimes libraries are a first-class consumer)?
Specificity: is this code only functional in the context of this specific document or element?
All code inside the document tends to make it longer and thus slower. Besides that, SEO considerations make it obvious that you should minimize internal scripting...
External scripts are also easier to debug using Firebug. I like to Unit Test my JavaScript and having it all external helps. I hate seeing JavaScript in PHP code and HTML it looks like a big mess to me.
Another hidden benefit of external scripts is that you can easily run them through a syntax checker like jslint. That can save you from a lot of heartbreaking, hard-to-find, IE6 bugs.
In your scenario it sounds like writing the external stuff in one file shared among the pages would be good for you. I agree with everything said above.
During early prototyping keep your code inline for the benefit of fast iteration, but be sure to make it all external by the time you reach production.
I'd even dare to say that if you can't place all your Javascript externally, then you have a bad design under your hands, and you should refactor your data and scripts
Google has included load times in its page ranking measurements. If you inline a lot, it will take longer for the spiders to crawl through your page; this may influence your page ranking if you have too much included. In any case, different strategies may have an influence on your ranking.
Well, I think that you should use inline scripts when making single-page websites, as the scripts will not need to be shared across multiple pages.
Having internal JS pros:
It's easier to manage & debug
You can see what's happening
Internal JS cons:
People can change it around, which really can annoy you.
External JS pros:
No changing around
You can look more professional (or at least that's what I think)
External JS cons:
Harder to manage
It's hard to know what's going on.
Always try to use external js, as inline js is always difficult to maintain.
Moreover, it is considered more professional to use external js, since the majority of developers recommend it.
I myself use external js.