Javascript acceleration? - javascript

Is there any way to speed up JS scripts (I mean complex DOM manipulations, like games or animations)?

There's really no way to speed it up. You can compact it, but it won't be much faster.

Use the V8 javascript engine? :P
The only way you can do that is to reduce the amount of DOM and scope access in your code.
E.g. when accessing an element in the DOM multiple times, instead of
document.something.mydiv.myelement.property1 = this
document.something.mydiv.myelement.property2 = bla
document.something.mydiv.myelement.property3 = foo
document.something.mydiv.myelement.property4 = bar
do
var myel = document.something.mydiv.myelement
myel.property1 = this
myel.property2 = bla
myel.property3 = foo
myel.property4 = bar
Note that some JavaScript engines will make this optimization for you automatically, but in older browsers the latter will be noticeably faster than the former, because it only has to walk the access chain once to reach myel.
Have a look at this video for more JavaScript optimization techniques:
http://www.youtube.com/watch?v=mHtdZgou0qU

If you mean outside the browser, then you should use the fastest engine around, i.e. Chrome's V8 JavaScript engine.
Inside the browser there is a wealth of techniques for loading JavaScript faster; Google publishes a good collection of optimization techniques.
Minify your JavaScript using a tool like YUI Compressor, then serve it gzipped.
Only load the bare minimum you need
Complex animations are still best served by rich UI plugins, i.e. Flash/Silverlight.
For animations, look at using the HTML5 Canvas element in browsers that support it, and fall back to Flash in ones that don't.
Google Maps is a good example of what's possible with pure JavaScript, although they've spent a wealth of resources optimizing the performance for each browser. As always, the best way to improve your speed is to benchmark different approaches: e.g. building a string and assigning it to div.innerHTML is usually quicker than using DOM methods to add elements dynamically, etc.
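For instance, a quick hand-rolled benchmark of that innerHTML comparison might look like the sketch below; the "container" element id and the loop count are assumptions for illustration:
var container = document.getElementById('container');

var t0 = new Date().getTime();
var html = '';
for (var i = 0; i < 1000; i++) {
  html += '<div>item ' + i + '</div>';
}
container.innerHTML = html;      // a single DOM write
var t1 = new Date().getTime();

for (var j = 0; j < 1000; j++) {
  var div = document.createElement('div');
  div.appendChild(document.createTextNode('item ' + j));
  container.appendChild(div);    // one DOM write per element
}
var t2 = new Date().getTime();

alert('innerHTML: ' + (t1 - t0) + ' ms, DOM methods: ' + (t2 - t1) + ' ms');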

The best you can do is optimize your code. Use a profiler: for Firefox there's Firebug, and Safari and Windows IE 8 have JavaScript debuggers and profilers built in (at least I believe IE 8 does; someone correct me if I'm wrong...). Running a profile on your code will show you where the slowest parts are, and those are the sections you can focus on optimizing... perhaps coming back with more questions that are a lot more specific.

That's a very vague question. There are a million things you can do to speed up your code (Ajaxian has 157 articles on the topic at the time of this writing), but there is no "Make Me Faster" button that magically makes all scripts run faster. If there were, life would be so much easier.

The Closure project from Google makes some claims along those lines, although I haven't tried it personally.
The Closure Compiler compiles JavaScript into compact, high-performance code. The compiler removes dead code and rewrites and minimizes what's left so that it downloads and runs quickly. It also checks syntax, variable references, and types, and warns about common JavaScript pitfalls.
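As a rough, hand-written illustration of the kind of rewriting such a tool performs (this is not actual Closure Compiler output):
// Before: verbose but readable
function calculateTotalPrice(itemPrice, itemQuantity) {
  var totalPrice = itemPrice * itemQuantity;
  return totalPrice;
}
// After renaming and dead-code removal, something like:
function a(b,c){return b*c}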

Try to make animation and display changes to positioned or 'offscreen' elements, and redraw the page the fewest number of times.
Make multiple style changes by changing the cssText or the className, not one property at a time (see the sketch below).
If you need to look up an element or property twice in the same process, you should have made a local reference the first time.
And remember to turn off the debugger if you are not debugging.
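A minimal sketch of those tips, assuming an element with id "box" and a "highlighted" class already defined in the stylesheet:
var box = document.getElementById('box'); // look it up once, keep a local reference

// Slower: three separate style writes, each of which may trigger a reflow
// box.style.color = 'red';
// box.style.border = '1px solid black';
// box.style.padding = '4px';

// Faster: one write via cssText...
box.style.cssText = 'color: red; border: 1px solid black; padding: 4px;';

// ...or one write via a class defined in the stylesheet
box.className = 'highlighted';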


Is Javascript size a performance concern after it is cached?

I'm writing a project which will use some fairly large JS libraries, including jQuery UI. The project will be run within an intranet, though, so download time is not really an issue for me, and most people should only have to download the libraries once, since I assume they will remain in the browser's cache.
My question is about how modern browsers (IE9, FF5, etc.) handle the processing of the JavaScript code. I imagine at some point it is compiled, but is this done on each page load, or is the compiled code cached too? If so, is it cached even after the browser is closed?
This web app may run on some low powered portable devices so I wanted to be reasonably efficient. I wanted to combine all the javascript files into one large one that is linked to on every page of the app.
But depending on how much work the browser must do to process all the JS I'm wondering if I should split them up so not all pages must load all the JS. Obviously that's more work though.
Any thoughts or info would be appreciated. Thank you.
Yes, JavaScript size is still a performance concern if it is cached for the following reasons:
Most browsers don't cache the byte code that they execute. So the script must still be re-parsed on every page load. Modern browsers are doing this faster, but it still may be a concern for very large JavaScript files.
The code in your JavaScript files is actually executed on every page load. Even if browsers start caching byte code, they still have to re-execute the JavaScript every time the page is loaded.
You may get a lower cache rate than expected, for reasons beyond your control. Users may disable their cache, or visit so many sites that your files get expired from the cache quickly. You should check your logs to make sure that the ratio of page loads to JavaScript file loads is high.
Loading a file from cache is usually pretty fast, but it's not always trivial. It can take upwards of 200ms in some cases.
You can do a pretty quick test to get a rough idea of how long your scripts take to parse and execute like this:
<script>
var startTime = (new Date()).getTime();
</script>
<script src="cachedFile1.js"></script>
<script src="cachedFile2.js"></script>
<!--all your scripts included this way-->
<script>
var endTime = (new Date()).getTime();
alert("Took " + (endTime - startTime) + " milliseconds to parse and execute");
</script>
Make sure to test on all the target browsers you support; JavaScript parse and execution time can vary wildly between different browsers. Also make sure that you test on a computer that is as slow as the ones your users will have. If you do find performance problems, you probably will need to solve them in a profiler. Minification won't help much for improving parse and execution time.
Minify your JavaScript files. This takes up less space.
Also, JavaScript is traditionally an interpreted language, so it is never compiled ahead of time; modern engines parse (and JIT-compile) it when the page loads.
http://en.wikipedia.org/wiki/Minification_(programming)
It doesn't really matter how much code there is, but how heavy it is. Nowadays browsers can run JS quite fast. Just try opening Gmail, for example (which is almost all JavaScript), with IE7 and then with IE9 or Chrome.
You need to look into the JavaScript engines each browser uses to get a better understanding, for instance Google Chrome uses V8: http://code.google.com/p/v8/
Also this may help you understand a great deal better:
How do browsers execute javascript
How do browsers handle JavaScript?
If your use-case is only in an intranet environment, then compared to other optimization techniques, code size is not really a problem for you. Presumably, the target browsers are fairly new and have JS engines that can handle modern full-blown JavaScript sites, in which case the size of the code won't really affect the speed, since parsing takes so little time compared to execution. In older browsers, however, there might be a small speed difference compared to more optimized code length, since they were never meant to run JavaScript that's thousands and thousands of lines long.
Compared to code execution optimization, code-length optimization probably won't even be noticeable to the end user. You might be able to knock off a few ms by compressing the code, but most modern engines create a "map" when they parse the code, so that at execution time the location of a function, variable, etc. doesn't really matter. So worry more about overall code optimization than about the library sizes, etc.
You can listen to a talk about V8 internals here.
I never thought about this from that point of view, but I can't really see why bundling everything up in a big file wouldn't help* - the JavaScript build tools I know focus on doing precisely this.
*unless you have a big module that is only rarely used. In that case, keep it separate so not everyone has to use it.

How can I improve my site's IE6/7 JS performance?

So I was involved in a site rewrite recently, and we've gone live with what's a big improvement on the previous version in every way (no, it's not perfect; we live by deadlines and are always improving :D), with one exception: in IE6/7 it will lock up after the page has been shown. I know it's the JS, as it's fast with JS disabled, and I'm aware of some things being very slow, like the simplegallery plugin that we use, but even with that and Google ads removed it's still at a crawl (+8 sec). I've looked through the Firebug profiler and made loads of JS/CSS changes, such as:
Moving all JS except our img error handling to the bottom of the page
Making all jQuery selectors as specific as possible for best performance
Moving to jQuery 1.4
Running our core custom JS (main.js) through JSLint
Spriting commonly used images
Reducing CSS selector complexity
Doing this was good for all browsers, and I know I can do even more, but I'm not seeing the major improvement in IE6/7 which I need. We do use DD_roundies_0.0.2a.js for IE7, but not for IE6. I tried DynaTrace but couldn't see anything obvious, though I did get a bit lost in its depth.
A sample listing page
A sample search page
Can anyone see what I might be missing here and/or point to some good IE profiling tools?
Edit: I should have mentioned that I've been through YSlow, PageSpeed, and Chrome's Developer Tools, all of which I used to base most of the improvements mentioned above on. At this point I'm not saying the site is fully optimised, but it's OK and moving in the right direction. However, I have an issue in IE6/7, and I believe it to be the JS execution.
Edit 2: We already send down the Chrome Frame meta tag for IE6 from the server. It's not a solution, but I see it doing more good than harm for IE6. I'm after JS-specific feedback at this point, as I think I've covered all the other bases.
You can manually profile your "common.js" script in IE6.
Just grab a new time-stamp at strategic places and alert them at the end.
e.g.
function ts() { return (new Date).getTime(); }
var t0 = ts();
// some of your code
var t1 = ts();
// rest of your code
var t2 = ts();
alert(t1-t0); // milliseconds between t0 and t1
alert(t2-t0); // ms between t0 and t2
Maybe one part of the script is that much slower than the rest.
Or maybe it's just IE6.
You're including jQuery from http://ajax.googleapis.com/ajax/libs/jquery/1.4/jquery.js; it'll download faster if you host it on your own website.
Also, check out the YSlow addon for Firebug; it gives you lots of information about what you can do to improve the load time of your site.
If you want to be drastic, force your users to install Google Chrome Frame; it will make IE use the Chrome renderer and JavaScript engine:
http://code.google.com/chrome/chromeframe/
Currently the only thing there that seemed suspicious was "omniture.js".
Here's a blog post I found regarding omniture.js and IE6.
You can use Speed Tracer in Chrome to debug speed issues. Of course, the JS speed it reports will reflect the V8 engine, not IE's.
The JavaScript itself is often not the issue; it's when you modify the DOM that you end up in trouble, for example when animating, adding, and removing elements. Be extra careful with opacity.
Take a look at LABjs & RequireJS: Loading JavaScript Resources the Fun Way, which talks about how you can load scripts in parallel.
Until the latest generation of browsers, the <script> tag had some really unfavorable performance characteristics to it. Namely, <script> tags "block", meaning they call a halt to everything else that's loading/happening on the page, while they load and while they execute. Not only do they halt all other parts of the page, they even used to halt the loading of any other <script> tags.
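A minimal sketch of the non-blocking loading technique such loaders build on: injecting script elements so they download without halting the rest of the page. The file names and the callback chaining are illustrative assumptions, not LABjs's actual API:
function loadScript(src, callback) {
  var script = document.createElement('script');
  var done = false;
  script.src = src;
  // onload for most browsers, onreadystatechange for older IE
  script.onload = script.onreadystatechange = function () {
    if (!done && (!this.readyState ||
        this.readyState === 'loaded' || this.readyState === 'complete')) {
      done = true;
      callback();
    }
  };
  document.getElementsByTagName('head')[0].appendChild(script);
}

loadScript('jquery.js', function () {
  loadScript('main.js', function () {
    // both scripts have loaded; chaining preserved the execution order
  });
});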

jQuery or javascript to find memory usage of page

Is there a way to find out how much memory is being used by a web page, or by my jquery application?
Here's my situation:
I'm building a data heavy webapp using a jquery frontend and a restful backend that serves data in JSON. The page is loaded once, and then everything happens via ajax.
The UI provides users with a way to create multiple tabs within the UI, and each tab can contain lots and lots of data. I'm considering limiting the number of tabs they can create, but was thinking it would be nice to only limit them once memory usage has gone above a certain threshold.
Based on the answers, I'd like to make some clarifications:
I'm looking for a runtime solution (not just developer tools), so that my application can determine actions based on memory usage in a user's browser.
Counting DOM elements or document size might be a good estimation, but it could be quite inaccurate since it wouldn't include event binding, data(), plugins, and other in-memory data structures.
2015 Update
Back in 2012 this wasn't possible if you wanted to support all major browsers in use. Unfortunately, right now this is still a Chrome-only feature (a non-standard extension of window.performance).
window.performance.memory
Browser support: Chrome 6+
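For example (the three fields below are what Chrome exposes, all reported in bytes):
if (window.performance && window.performance.memory) {
  var mem = window.performance.memory;
  console.log('JS heap size limit:   ' + mem.jsHeapSizeLimit);
  console.log('Total allocated heap: ' + mem.totalJSHeapSize);
  console.log('Heap currently used:  ' + mem.usedJSHeapSize);
} else {
  console.log('performance.memory is not supported in this browser');
}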
2012 Answer
Is there a way to find out how much memory is being used by a web page, or by my jquery application? I'm looking for a runtime solution (not just developer tools), so that my application can determine actions based on memory usage in a user's browser.
The simple but correct answer is no. Not all browsers expose such data to you. And I think you should drop the idea, simply because the complexity and inaccuracy of a "handmade" solution may introduce more problems than it solves.
Counting DOM elements or document size might be a good estimation, but it could be quite inaccurate since it wouldn't include event binding, data(), plugins, and other in-memory data structures.
If you really want to stick with your idea you should separate fixed and dynamic content.
Fixed content is not dependent on user actions (memory used by script files, plugins, etc.)
Everything else is considered dynamic and should be your main focus when determining your limit.
But there is no easy way to summarize them. You could implement a tracking system that gathers all this information. All operations should call the appropriate tracking methods, e.g. (see the sketch after this list):
Wrap or overwrite jQuery.data method to inform the tracking system about your data allocations.
Wrap html manipulations so that adding or removing content is also tracked (innerHTML.length is the best estimate).
If you keep large in-memory objects they should also be monitored.
As for event binding you should use event delegation and then it could also be considered a somewhat fixed factor.
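A minimal sketch of the first idea, wrapping jQuery.data; the MemoryTracker object and its crude size heuristic are hypothetical:
var MemoryTracker = {
  bytes: 0,
  add: function (n) { this.bytes += n; },
  report: function () { return this.bytes; }
};

var originalData = jQuery.data;
jQuery.data = function (elem, name, value) {
  if (value !== undefined) {
    // rough estimate: the length of the JSON form of the stored value
    try { MemoryTracker.add(JSON.stringify(value).length); } catch (e) {}
  }
  return originalData.apply(this, arguments);
};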
Another aspect that makes it hard to estimate your memory requirements correctly is that different browsers may allocate memory differently (for Javascript objects and DOM elements).
You can use the Navigation Timing API.
Navigation Timing is a JavaScript API for accurately measuring performance on the web. The API provides a simple way to get accurate and detailed timing statistics—natively—for page navigation and load events.
window.performance.memory (a non-standard extension exposed alongside it in Chrome) gives access to JavaScript memory usage data.
Recommended reading
Measuring page load speed with Navigation Timing
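A minimal sketch of reading the timing data; loadEventEnd is only populated after the load event completes, hence the setTimeout:
window.addEventListener('load', function () {
  setTimeout(function () {
    var t = window.performance.timing;
    console.log('Page load took ' + (t.loadEventEnd - t.navigationStart) + ' ms');
  }, 0);
});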
This question is 5 years old, and both JavaScript and browsers have evolved incredibly in this time. Since this is now possible (in at least some browsers), and this question is the first result when you Google "javascript show memory usage", I thought I'd offer a modern solution.
memory-stats.js: https://github.com/paulirish/memory-stats.js/tree/master
This script (which you can run at any time on any page) will display the current memory usage of the page:
var script=document.createElement('script');
script.src='https://rawgit.com/paulirish/memory-stats.js/master/bookmarklet.js';
document.head.appendChild(script);
I don't know of any way that you could actually find out how much memory is being used by the browser, but you might be able to use a heuristic based on the number of elements on the page. Using jQuery, you could do $('*').length and it will give you the count of DOM elements. Honestly, though, it's probably easier just to do some usability testing and come up with a fixed number of tabs to support.
Use the Chrome Heap Snapshot tool
There's also a Firebug tool called MemoryBug, but it seems it's not very mature yet.
If you just want to see memory use for testing, there is a way in Chrome via the developer tools to track memory use, but I'm not sure how to do it in JavaScript directly.
I would like to suggest an entirely different solution from the other answers, namely to observe the speed of your application and, once it drops below defined levels, either show tips to the user to close tabs, or disable new tabs from opening. A simple class which provides this kind of information is, for example, https://github.com/mrdoob/stats.js.
Aside from that, it might not be wise for such an intensive application to keep all tabs in memory in the first place. E.g. keeping only the user state (scroll position) for all but the last two tabs, and reloading the data each time one of those tabs is reopened, might be a safer option.
Lastly, WebKit developers have been discussing adding memory information to JavaScript, but they have gotten into a number of arguments about what should and should not be exposed. Either way, it's not unlikely that this kind of information will be available in a few years (although that information isn't too useful right now).
Perfect question timing with me starting on a similar project!
There is no accurate way of monitoring JS memory usage in-app, since it would require higher-level privileges. As mentioned in the comments, checking the number of all elements, etc. would be a waste of time since it ignores bound events and so on.
This would be an architecture issue if memory leaks manifest or unused elements persist. Making sure that closed tabs' content is deleted completely, without lingering event handlers, would be perfect; assuming that's done, you could just simulate heavy usage in a browser and extrapolate the results from memory monitoring (type about:memory in the address bar).
Protip: if you open the same page in IE, FF, Safari... and Chrome, and then navigate to about:memory in Chrome, it will report memory usage across all the other browsers. Neat!
What you might want to do is have the server keep track of the bandwidth for that session (how many bytes of data have been sent to them). When they go over the limit, instead of sending data via Ajax, the server should send an error code which the JavaScript will use to tell the user they've used too much data.
You can get document.documentElement.innerHTML and check its length. It gives you the number of characters in your page's markup, which is only a rough proxy for the memory used by your web page.
This may not work in all browsers, so you can enclose all your body elements in a giant div and call innerHTML on that div. Something like <body><div id="giantDiv">...</div></body>
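Something like this, with the caveat that it counts characters of serialized markup, not bytes of memory:
var approxSize = document.getElementById('giantDiv').innerHTML.length;
alert('Approximate markup size: ' + approxSize + ' characters');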

Is it possible to write a JavaScript library that makes all browsers standards compliant?

I'm not a JavaScript wiz, but is it possible to create a single embeddable JavaScript file that makes all browsers standards compliant? Like a collection of all known JavaScript hacks that force each browser to interpret the code properly?
For example, IE6 does not recognize the :hover pseudo-class in CSS for anything except links, but there exists a JavaScript file that finds all references to :hover and applies a hack that forces IE6 to do it right, allowing me to use the hover command as I should.
There is an unbelievable amount of time (and thus money) that every webmaster has to spend on learning all these hacks. Imagine if there were an open source project where all one had to do was add one line to the header embedding the code, and then they'd be free to code their site per accepted web standards (XHTML Strict, CSS3).
Plus, this would give an incentive for web browsers to follow the standards or be stuck with a slower browser due to all the JavaScript code being executed.
So, is this possible?
Plus, this would give an incentive for web browsers to follow the standards or be stuck with a slower browser due to all the JavaScript code being executed.
Well... That's kind of the issue. Not every incompatibility can be smoothed out using JS tricks, and others will become too slow to be usable, or retain subtle incompatibilities. A classic example is the many scripts to fake support for translucency in PNG files on IE6: they worked for simple situations, but fell apart or became prohibitively slow for pages that used such images creatively and extensively.
There's no free lunch.
Others have pointed out specific situations where you can use a script to fake features that aren't supported, or a library to abstract away differences. My advice is to approach this problem piecemeal: write your code for a decent browser, restricting yourself as much as possible to the common set of supported functionality for critical features. Then bring in the hacks to patch up the browsers that fail, allowing yourself to drop functionality or degrade gracefully when possible on older / lesser browsers.
Don't expect it to be too easy. If it was that simple, you wouldn't be getting paid for it... ;-)
Check out jQuery; it does a good job of standardizing browser JavaScript.
Along those same lines, explorercanvas brings support for the HTML5 canvas tag to IE browsers.
You can't get full standards compliance, but you can use a framework that smooths over some of the worst breaches. You can also use something called a reset style sheet.
There's a library for IE to make it act more like a standards-compliant browser: Dean Edwards' IE7.
Like a collection of all known javascript hacks that force each browser to interpret the code properly
You have two choices: read browser compatibility tables, learn each exception a browser has, and create one yourself, or use available libraries.
If you want a javascript correction abstraction, you can use jQuery.
If you want a css correction abstraction, you can check /IE7/.
I usually don't like to use CSS corrections made by JavaScript. It's another complexity in my code, another library that can introduce bugs into already buggy browsers. I prefer creating conditional comments for IE6, IE7, and such, and creating separate stylesheets for each of them. This approach works and doesn't generate a lot of overhead.
EDIT: (I know that we have problems in other browsers, but since IE is the major browser out there and usually we need really strange hacks to make it work, CSS conditional comments are a good approach IMO.)
Actually you can; there are lots of libraries to handle this issue. From the beginning, JavaScript compliance has always been a problem for developers, and thanks to innovative ones, libraries were developed to get over this problem...
One of them, and my favorite, is jQuery.
Before JavaScript 1.4 there was no global arguments array, and it is impossible to implement the arguments array yourself without a highly advanced source filter. This means it is going to be impossible for the language to maintain backwards compatibility with Netscape 4.0 and Internet Explorer 4.0. So right away I can say that no, you cannot make all browsers standards-compliant.
Post-Netscape, you can implement nearly all of the features in the core of the language in JavaScript itself. For example, I coded all the methods of the Array object in 100% JavaScript code.
http://openjsan.org/doc/j/jh/jhuni/StandardLibrary/1.81/index.html
You can see my implementation of Array here if you go to the link and then go down to Array and then "source."
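As an illustration of the idea, here is the common fallback pattern for one such method (this is not jhuni's actual implementation):
if (!Array.prototype.indexOf) {
  Array.prototype.indexOf = function (item, from) {
    var i = from || 0;
    if (i < 0) { i = Math.max(0, this.length + i); }
    for (; i < this.length; i++) {
      if (i in this && this[i] === item) { return i; }
    }
    return -1;
  };
}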
What most of you are probably referring to is implementing the DOM objects yourself, which is much more problematic. Using VML you can implement the Canvas tag across all the modern browsers; however, you will get buggy, barely-working performance in Internet Explorer because VML is markup, which is not a good format for implementing the Canvas tag...
http://code.google.com/p/explorercanvas/
Flash/Silverlight: Using either of these you can implement the Canvas tag and it will work quite well, you can also implement sound. However, if the user doesn't have any browser plugins there is nothing you can do.
http://www.schillmania.com/projects/soundmanager2/
DOM abstractions: On the issue of the DOM, you can abstract away from it by implementing your own Event object, as in the case of QEvent, or even implementing your own Node object, as in the case of YAHOO.util.Element. However, these usually make some subtle changes to the standard API, so people are usually just abstracting away from the standard, and there are hundreds of libraries that do so.
http://code.google.com/p/qevent/
This is probably the best answer to your question. It makes browsers as standards-compliant as possible.
http://dean.edwards.name/weblog/2007/03/yet-another/

Executing JavaScript to Render HTML for Server-Side Caching

There are lots of widgets provided by sites that are effectively bits of JavaScript that generate HTML through DOM manipulation or document.write(). Rather than slow the browser down even more with additional requests and trust yet another provider to be fast, reliable and not change the widget output, I want to execute* the JavaScript to generate the rendered HTML, and then save that HTML source.
Things I've looked into that seem unworkable or way too difficult:
The Links Browser (not lynx!)
Headless use of Xvfb plus Firefox plus Greasemonkey (yikes)
The all-Java browser toolkit Cobra (the best bet!)
Any ideas?
* Obviously you can't really execute the JavaScript completely, as it doesn't necessarily have an exit path, but you get the idea.
Wikipedia's "Server-side JavaScript" article lists numerous implementations, many of which are based on Mozilla's Rhino JavaScript engine (written in Java), or its cousin SpiderMonkey (the same engine as found in Firefox and other Gecko-based browsers). In particular, something simple like mod_js for Apache may suit your needs.
If you're just using plain JS, Rhino should do the trick. But if the JS code is actually calling DOM methods and so on, you're going to need a full-blown browser. Crowbar might help you.
Is this really going to make things faster for users without causing compatibility issues?
There's John Resig's project Bringing the Browser to the Server: "browser/DOM environment, written in JavaScript, that runs on top of Rhino; capable of running jQuery, Prototype, and MochiKit (at the very least)."
