How can I improve my site's IE6/7 JS performance? - javascript

So I was involved in a site rewrite recently and we've gone live with what's a big improvement on the previous version in every way (no, it's not perfect; we live by deadlines and are always improving :D), with one exception: in IE6/7 it will lock up after the page has shown. I know it's the JS, as with it disabled the site is fast, and I'm aware that some things, like the simplegallery plugin we use, are very slow, but even with that and the Google ads removed it's still at a crawl (+8 sec). I've looked through the Firebug profiler and made loads of JS/CSS changes such as:
Moving all JS except our img error handling to the bottom of the page
Improving all jQuery selectors' specificity for best performance
Moving to jQuery 1.4
Running our core custom JS (main.js) through JSLint
Spriting commonly used images
Reducing CSS selector complexity
Doing this was good for all browsers, and I know I can do even more, but I'm not seeing the major improvement in IE6/7 that I need. We do use DD_roundies_0.0.2a.js for IE7, but not for IE6. I tried DynaTrace but couldn't see anything obvious, though I did get a bit lost in its depth.
A sample listing page
A sample search page
Can anyone see what I might be missing here and/or point to some good IE profiling tools?
Edit: I should have mentioned that I've been through YSlow, PageSpeed, and Chrome's Developer Tools, all of which I used as the basis for most of the improvements mentioned above. At this point I'm not saying the site is fully optimised, but it's OK and moving in the right direction. However, I have an issue in IE6/7 and I believe it to be the JS execution.
Edit 2: We already send down the Chrome Frame meta tag for IE6 from the server. It's not a solution, but I see it doing more good than harm for IE6. I'm after JS-specific feedback at this point, as I think I've covered all the other bases.

You can manually profile your "common.js" script in IE6.
Just grab a new time-stamp at strategic places and alert them at the end.
e.g.
function ts() { return (new Date).getTime(); }
var t0 = ts();
// some of your code
var t1 = ts();
// rest of your code
var t2 = ts();
alert(t1 - t0); // milliseconds between t0 and t1
alert(t2 - t0); // milliseconds between t0 and t2
Maybe one part of the script is that much slower than the rest.
Or maybe it's just IE6.

You're including jQuery from http://ajax.googleapis.com/ajax/libs/jquery/1.4/jquery.js; it'll download faster if you host it on your own website.
Also, check out the YSlow addon for Firebug; it gives you lots of information about what you can do to improve your site's load time.

If you want to be drastic, force your users to install Google Chrome Frame; it makes IE use the Chrome renderer and JavaScript engine:
http://code.google.com/chrome/chromeframe/
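For reference, the page-level opt-in is just a meta tag (or the equivalent HTTP header), which matches what the question's author says they already send for IE6:
<!-- asks IE to render this page with Chrome Frame when the plugin is installed -->
<meta http-equiv="X-UA-Compatible" content="chrome=1">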

Currently the only thing there that seemed suspicious was "omniture.js".
Here's a blog post I found regarding omniture.js and IE6.
You can use Speed Tracer in Chrome to debug speed issues. Of course, the JS speed you measure will be that of the V8 engine.

The JavaScript itself is often not the issue; it's when you modify the DOM that you end up in trouble, for example when animating, adding, and removing elements. Be extra careful with opacity.
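If DOM churn does turn out to be the bottleneck, one common mitigation is to batch insertions into a detached DocumentFragment so old IE reflows once rather than once per element. A minimal sketch (the 'results' element id is made up for illustration):
// build everything off-DOM, then attach once: one reflow instead of N
var frag = document.createDocumentFragment();
for (var i = 0; i < 100; i++) {
  var li = document.createElement('li');
  li.appendChild(document.createTextNode('item ' + i));
  frag.appendChild(li);
}
// 'results' is a hypothetical <ul> already on the page
document.getElementById('results').appendChild(frag);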

Take a look at "LABjs & RequireJS: Loading JavaScript Resources the Fun Way", which talks about how you can load scripts in parallel.
Until the latest generation of browsers, the <script> tag had some really unfavorable performance characteristics to it. Namely, <script> tags "block", meaning they call a halt to everything else that's loading/happening on the page, while they load and while they execute. Not only do they halt all other parts of the page, they even used to halt the loading of any other <script> tags.
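As a rough illustration of the pattern the article describes, LABjs downloads scripts in parallel and lets you serialize execution only where order matters. A sketch against the LABjs API (the file names and initPage() are placeholders, not from the question's site):
// all three files download in parallel; .wait() only orders execution
$LAB
  .script('jquery.js').wait()   // jQuery must execute before the plugins
  .script('plugin-a.js')
  .script('plugin-b.js')
  .wait(function () {
    initPage(); // everything above has executed by this point
  });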

Related

Authoritative JavaScript validation to a standard

I am trying to validate the JavaScript on my website. The scripts do not throw any errors and run fine on Chrome and Firefox (latest stable version). However, the animated parts absolutely do not work on IE (9)[*1][*solved]. I have used jQuery and jQueryUI for the animation and had hoped that it would be cross browser compatible.
Usually I would not have cared about IE users, but I liked the idea of the folks at anybrowser.org and thought of sticking to a standard instead of using the lowest common denominator and leaving the graceful degradation to the browsers / JS engines. My page is HTML5-compliant according to the W3C validator, and I wanted to do the same with the JavaScript, but could not find an acceptable way to do it.
Why JSLint did not work:
It will not take the page and check it like the W3C validator does.
So there is no way to check a jQuery-referenced script. (There are a few SO posts on this.)
I did find an ECMA-262 test page that would check the browser, but not my JavaScript.
The question after all the preamble is: how do I validate the JavaScript on my webpage? Is there an authoritative validator for JavaScript?
*1 Added: The minification has nothing to do with the script not working on IE9. Even the unminified version does not work.
*Solved: The problem with the specific page was solved by this comment.
This started as "What is Javascript?", got closed here, and was migrated to Programmers. Thanks to YannisRizos for helping me split the original question into something that was acceptable between Programmers and SO.
A test suite. No really.
If you want to ensure your javascript runs perfectly in every target environment, then you write a test suite for your code and make sure it passes in all your target environments.
A test suite helps find where your code breaks when run on different platforms. And therefore helps you fix it.
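As a sketch of what that can look like, here is a minimal QUnit-style test; slugify() is a hypothetical function standing in for your own code, and the assertions are invented for illustration:
// runs inside the standard QUnit HTML harness; open that page
// in Chrome, Firefox and IE to see exactly where behaviour diverges
test('slugify strips and hyphenates spaces', function () {
  equal(slugify('Hello World'), 'hello-world');
  equal(slugify('  trim me  '), 'trim-me');
});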
But assuming you want something that is less work, http://www.jshint.com/ is your best bet.
Or http://www.jslint.com/ if you like getting yelled at.
But honestly, no validator will ever be able to ensure that your code runs without error, and exactly how you expect, everywhere. Just because your code follows best practices, that doesn't mean it will work in an old version of IE.
Standards evolve, but an old browser that never updates gets stuck with whatever standard was out at the time, which the browser maker may or may not have implemented successfully or accurately.
Testing, automated or manual, is the only way to ensure things work universally.

Is Javascript size a performance concern after it is cached?

I'm writing a project which will use some fairly large JS libraries, including jQuery UI. The project will be run within an intranet, though, so download time is not really an issue for me, and most people should only have to download the libraries once, since I assume they will remain in the browser's cache.
My question is about how modern browsers (IE9, FF5, etc.) handle the processing of the JavaScript code. I imagine at some point it is compiled, but is this done on each page load, or is the compiled code cached too? If so, is it cached even after the browser is closed?
This web app may run on some low powered portable devices so I wanted to be reasonably efficient. I wanted to combine all the javascript files into one large one that is linked to on every page of the app.
But depending on how much work the browser must do to process all the JS I'm wondering if I should split them up so not all pages must load all the JS. Obviously that's more work though.
Any thoughts or info would be appreciated. Thank you.
Yes, JavaScript size is still a performance concern if it is cached for the following reasons:
Most browsers don't cache the byte code that they execute. So the script must still be re-parsed on every page load. Modern browsers are doing this faster, but it still may be a concern for very large JavaScript files.
The code in your JavaScript files is actually executed on every page load. Even if browsers start caching byte code, they still have to re-execute the JavaScript every time the page is loaded.
You may get a lower cache rate than expected, for reasons beyond your control. Users may disable their cache, or visit so many sites that your files get expired from the cache quickly. You should check your logs to make sure that the ratio of page loads to JavaScript file loads is high.
Loading a file from cache is usually pretty fast, but it's not always trivial. It can take upwards of 200ms in some cases.
You can do a pretty quick test to get a rough idea of how long your scripts take to parse and execute like this:
<script>
var startTime = (new Date()).getTime();
</script>
<script src="cachedFile1.js"></script>
<script src="cachedFile2.js"></script>
<!--all your scripts included this way-->
<script>
var endTime = (new Date()).getTime();
alert("Took " + (endTime - startTime) + " milliseconds to parse and execute");
</script>
Make sure to test on all the target browsers you support; JavaScript parse and execution time can vary wildly between different browsers. Also make sure that you test on a computer that is as slow as the ones your users will have. If you do find performance problems, you probably will need to solve them in a profiler. Minification won't help much for improving parse and execution time.
Minify your JavaScript files. This takes up less space.
Also, although JavaScript is often described as an interpreted language, modern engines do compile it at runtime; the compiled form just isn't persisted, so the work is redone on each load.
http://en.wikipedia.org/wiki/Minification_(programming)
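To make the size saving concrete, minification mostly strips comments and whitespace and shortens local names. A hand-minified example for illustration:
// before (readable source)
function distance(pointA, pointB) {
  var dx = pointA.x - pointB.x;
  var dy = pointA.y - pointB.y;
  return Math.sqrt(dx * dx + dy * dy);
}
// after (roughly what a minifier emits)
function distance(a,b){var c=a.x-b.x,d=a.y-b.y;return Math.sqrt(c*c+d*d)}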
It doesn't really matter how much code there is, but how heavy it is. Nowadays browsers can run JS quite fast. Just try opening Gmail, for example (which is almost all JavaScript), with IE7 and then with IE9 or Chrome.
You need to look into the JavaScript engines each browser uses to get a better understanding, for instance Google Chrome uses V8: http://code.google.com/p/v8/
Also this may help you understand a great deal better:
How do browsers execute javascript
How do browsers handle JavaScript?
If your use case is only an intranet environment, then compared to other optimization techniques, code size is not really a problem for you. Presumably the target browsers are fairly new and have JS engines that can handle modern full-blown JavaScript sites, in which case the size of the code won't really affect the speed, since parsing takes so little time compared to execution. In older browsers, however, there might be a small speed difference compared to more optimized code, since they were never meant to run JavaScript that is thousands and thousands of lines long.
Compared to code-execution optimization, code-length optimization probably won't even be noticeable to the end user. You might be able to knock off a few ms by compressing the code, but most modern engines create a "map" when they parse the code, so that at execution time the location of a function, variable, etc. doesn't really matter. So worry more about overall code optimization than about library sizes.
You can listen to a talk about V8 internals here.
I never thought about this from that point of view, but I can't really see why bundling everything up in one big file wouldn't help*; the JavaScript build tools I know focus on doing precisely this.
*unless you have a big module that is only rarely used. In that case, keep it separate so not everyone has to use it.

Can a Javascript error slow the website's load time

I have a web portal, cricandcric.com, which I built using PHP, JavaScript, and MySQL.
I don't see the JavaScript error in Firefox, but I do see it in IE.
I observe that the site loads faster in Firefox than in IE.
So my question: can JavaScript errors slow down the website's loading time, even though the JavaScript is placed at the end of the page (per YSlow strategies)?
It depends. If the error happens early on and a lot of your script code is bypassed, it could actually make it faster. But every time an error occurs, there is some overhead (the exception object has to be built and sent up the call stack to look for any catches), so if it happens at the end, the script would run slower.
But I doubt your change in load time is noticeably impacted by script errors. How long the script takes to execute in the browser's JS engine, or a host of other factors, will have more impact.
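If you want to gauge the error overhead for yourself, a rough micro-benchmark sketch (the loop count is arbitrary and results vary wildly by browser):
function time(label, fn) {
  var t0 = new Date().getTime();
  for (var i = 0; i < 10000; i++) fn();
  alert(label + ': ' + (new Date().getTime() - t0) + ' ms');
}
time('no error', function () { var x = 1 + 1; });
time('thrown error', function () {
  try { throw new Error('boom'); } catch (e) { /* swallowed */ }
});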
IE's javascript engine has always been notably behind the other common browsers in performance, so it really might come down to that more than anything else. One of the many improvements in IE9 is JS execution speed that's actually competitive.
That said, the JS error probably is worth looking into, since it's occurring every time the image slideshow advances which happens once every couple of seconds.
If you're concerned about performance in general, there are a couple of tools, like YSlow and the recently open-sourced DOM Monster bookmarklet, that give suggestions on general ways to speed up websites. You might also want to check out some of the writings of Steve Souders.

Javascript acceleration?

Is there any way to speed up JS scripts (I mean some complex DOM manipulations, like games or animations)?
There's really no way to just speed it up. You can compact it, but it won't be much faster.
Use the V8 javascript engine? :P
The only way you can do that is to reduce the amount of DOM and scope access in your code.
For example, when accessing an element in the DOM multiple times, instead of
document.something.mydiv.myelement.property1 = this
document.something.mydiv.myelement.property2 = bla
document.something.mydiv.myelement.property3 = foo
document.something.mydiv.myelement.property4 = bar
do
var myel = document.something.mydiv.myelement
myel.property1 = this
myel.property2 = bla
myel.property3 = foo
myel.property4 = bar
Note that some JavaScript engines will automatically make this optimization for you, but in older browsers the latter will definitely be faster than the former, because it only has to walk the access chain once to reach myel.
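The same idea applies to plain scope lookups: in hot loops, copy frequently used globals or deep property chains into local variables. A small sketch:
// each use of Math.sin walks the scope chain to the global object;
// the local alias is resolved only once
function sumOfSines(n) {
  var sin = Math.sin; // cache the global lookup
  var total = 0;
  for (var i = 0; i < n; i++) {
    total += sin(i);
  }
  return total;
}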
Have a look at this video for more JavaScript optimization techniques:
http://www.youtube.com/watch?v=mHtdZgou0qU
If you mean outside the browser, then you should use the fastest engine around, i.e. Chrome's V8 JavaScript engine.
Inside the browser there is a wealth of optimization techniques for faster-loading JavaScript; here is a good place for optimization techniques by Google.
Compress your JavaScript using a tool like YUI Compressor, then serve it gzipped.
Only load the bare minimum you need.
Complex animations are still best served by rich-UI plugins, i.e. Flash/Silverlight.
For animations, look at using the HTML5 canvas element for browsers that support it, and fall back to Flash for ones that don't.
Google Maps is a good example of what's possible with pure JavaScript, although they've spent a wealth of resources optimizing the performance for each browser. As always, the best way to improve your speed is to benchmark different approaches; e.g. div.innerHTML = "" is most of the time quicker than using the DOM to dynamically add elements, etc.
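For the innerHTML point, the classic pattern on old IE is to build one string (via an array join) and assign it in a single operation rather than appending nodes one by one. A sketch with a made-up 'list' element:
// build the markup as strings, then touch the DOM exactly once
var rows = [];
for (var i = 0; i < 500; i++) {
  rows.push('<li>row ' + i + '</li>');
}
document.getElementById('list').innerHTML = rows.join('');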
The best you can do is optimize your code. Use a profiler: for Firefox there's Firebug, and Safari and Windows IE 8 have JavaScript debuggers and profilers built in (at least I believe IE 8 does; someone correct me if I'm wrong...). Running a profile on your code will show you where the slowest parts are, and those are the sections you can focus on optimizing, perhaps with more questions that are a lot more specific.
That's a very vague question. There are a million things you can do to speed up your code (Ajaxian has 157 articles on the topic at the time of this writing), but there is no "Make Me Faster" button that magically makes all scripts run faster. If there were, life would be so much easier.
The closure project from Google makes some claims along those lines, although I haven't tried it personally.
The Closure Compiler compiles JavaScript into compact, high-performance code. The compiler removes dead code and rewrites and minimizes what's left so that it downloads and runs quickly. It also checks syntax, variable references, and types, and warns about common JavaScript pitfalls.
Try to make animation and display changes to positioned or 'offscreen' elements; redraw the page the fewest number of times.
Make multiple style changes by changing the cssText or the className, not one property at a time (see the sketch after these tips).
If you need to look up an element or property twice in the same process, you should have made a local reference the first time.
And remember to turn off the debugger, if you are not debugging.
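For the cssText/className tip above, a minimal sketch (the element id and class name are made up):
var el = document.getElementById('panel'); // hypothetical element
// slow: three separate writes, each of which may trigger a reflow
el.style.left = '10px';
el.style.top = '20px';
el.style.width = '300px';
// better: one write that replaces the whole inline style...
el.style.cssText = 'left:10px; top:20px; width:300px;';
// ...or toggle a class that is predefined in the stylesheet
el.className = 'panel-expanded';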

JavaScript being loaded asynchronously in Firefox 3 (according to Firebug)?

I'm trying to profile the performance of a web site that I'm fairly confident is being slowed down by the loading of JavaScript files on the page.
The same JavaScript files are included several times on the page, and <script /> tags are scattered throughout the page instead of being included at the bottom.
As I suspected, when looking at Firebug's "Net" tab, most of the time (not all), when JavaScript is being loaded, no other files are requested; the browser waits for the JavaScript to complete loading.
There are a few exceptions however. There are a few occasions where JavaScript is loaded, but then at the same time, other resources appear to get loaded, such as other JavaScript files and images.
I always thought that JavaScript blocks the loading of other resources on the page. Am I incorrect in thinking this, or does this behavior vary depending on the browser or browser version?
UPDATE:
To those who have explained how loading a script blocks the loading of other resources, I'm already aware of this. My question is why a script wouldn't block the loading of other resources. Firebug is showing that some JavaScript files do not block loading other resources. I want to know why this would happen.
JavaScript resource requests are indeed blocking, but there are ways around this (to wit: DOM-injected script tags in the head, and AJAX requests), which, without my seeing the page, is likely to be what's happening here.
Including multiple copies of the same JS resource is extremely bad but not necessarily fatal, and is typical of larger sites which might have been accreted from the work of separate teams, or just plain old bad coding, planning, or maintenance.
As far as Yahoo's recommendation to place scripts at the bottom of the body goes, this improves perceived response times and can improve actual loading times to a degree (because all the previous resources are allowed to load first), but it will never be as effective as non-blocking requests (though those come with a high barrier of technical capability).
Pretty decent discussion of non-blocking JS here.
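A minimal sketch of the DOM-injection technique mentioned above (the callback handling is simplified; a production loader would also handle old IE's onreadystatechange, and the path is a placeholder):
// injecting a script element lets the download proceed
// without blocking the HTML parser
function loadScriptAsync(src, onload) {
  var s = document.createElement('script');
  s.type = 'text/javascript';
  s.src = src;
  if (onload) s.onload = onload; // old IE fires onreadystatechange instead
  document.getElementsByTagName('head')[0].appendChild(s);
}
loadScriptAsync('/js/analytics.js');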
I'm not entirely sure that Firebug offers a true reflection of what is going on within the browser. Its timing for resource loading seems to be good, but I am not sure that it is a true reflection of exactly what is going on. I've had better luck using HTTP sniffers/proxy applications to monitor the actual HTTP requests coming from the browser. I use Fiddler, but I know there are other tools out there as well.
In short, this may be an issue with the tool and not with how resources are actually being loaded ... at least it's worth ruling out.
I suppose you're using Firefox 3.0.10 and Firebug 1.3.3 since those are the latest releases.
The Firebug 1.4 beta has done many improvements on the net tab, but it requires Firefox 3.5. If you want to test it in Firefox 3.0, use one of the previous 1.4 alpha versions. But even with the improvements I still struggle to understand the result. I wish the Firebug developers would document more precisely what each part of a download means. It doesn't make sense to me why queuing is after connecting.
My conclusion was not to trust the results in Firebug, so I ended up using WebPageTest instead. Which was surprisingly good to come from AOL ;-)
Also, what kind of resources are being loaded at the same time as the JavaScript? Try tracing down the resources that are loaded at the same time, and see if they're referenced from CSS, an iframe, or HTML/AJAX. I'm guessing the reason nothing else is loaded is that the browser stops parsing the current HTML when it sees a script tag (without defer). Since it can't continue parsing the HTML, it has nothing more to request.
If you could provide a link to the page you're talking about, it would help everyone give a more precise answer.
I believe that the content is downloaded, but not rendered until the JavaScript has finished loading.
This is, from the server's POV, not much of a deal, but to the user it can make a huge difference in speed.
If you think about it, a <script> tag has to finish processing before you can continue to render content. What if the script used document.write or some other wonderfully dumb thing? Until everything within the script tag has finished running, the page can't be sure what it's going to display.
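To make that concrete, here is the classic case that forces the parser to stop, with hypothetical content:
<p>Before the script.</p>
<script type="text/javascript">
  // the parser cannot know what follows until this runs:
  // document.write splices markup into the document mid-parse
  document.write('<p>Injected while parsing.</p>');
</script>
<p>After the script.</p>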
Browsers usually have a set number of connections opened to a single domain.
So, if you load all your scripts from the same domain you will usually load them one after the other.
But, if those scripts are loaded from several domains, they will be loaded in parallel.
The reason the browser blocks during JavaScript downloading is that it suspects there will be DOM nodes created inside the script.
For example, there might be "document.write()" calls inside the script.
A way to hint to the browser that the script does not contain any DOM generation is with the "defer" attribute.
So,
<script src="script.js" type="text/javascript" defer="defer"></script>
should allow the browser to continue parallelizing the requests.
References:
http://www.w3.org/TR/REC-html40/interact/scripts.html#adef-defer
http://www.websiteoptimization.com/speed/tweak/defer/
As others have stated, the script is probably loading other resources through DOM injection.
Script.aculo.us actually loads its child components/scripts itself by doing this -- injecting other <script> tags into the DOM for them.
If you want to see whether or not this is the case, use Firebug's profiler and take a look at what the scripts are doing.
Like others said, one non-blocking way is to inject <script> tags in the page head.
But Firefox can also execute loaded <script>s in parallel:
Copy the two lines below:
http://streetpc.free.fr/tmp/test.js
http://streetpc.free.fr/tmp/test2.js
Then go to this page, paste in the input textarea, click "JavaScript", then "Load Scripts" (which builds and adds a <script> child element to the head).
Try that in FF: you'll see "test2 ok"; move the dialog box to see "test ok".
In other browsers, you should see "test ok" (with no other dialog behind it) and then "test2 ok" (except for Safari 4, which showed me test2 before test).
Firefox 3 introduced a connection-parallelism feature to improve performance while loading a web page; I bet this is the source of your problem ;)
When you open a web page that has many different objects on it, like images, Javascript files, frames, data feeds, and so forth, the browser tries to download several of them at once to get better performance.
Here's the ZDNET blogpost about it.
