I have a single web page application that is all JavaScript. I noticed the JavaScript heap size goes up on each AJAX call that returns a new view. Is there something I should be doing to clean up the older views?
I would recommend taking a look at the following resources. They show how to inspect the browser timeline in order to detect leaking references and performance problems.
The first one is a talk by a Chrome developer at this year's Google I/O:
http://www.youtube.com/watch?v=3pxf3Ju2row
The second one is Paul Irish speaking in "The Breakpoint", episode 2:
http://www.youtube.com/watch?v=PPXeWjWp-8Y
I'm sure you will find a lot of clues!!
Anyway ... if you share a test case of your code at jsfiddle.net, we can take a look :)
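In the meantime, the usual suspect for a growing heap in a single-page app is old views that have been replaced in the DOM but are still referenced by event handlers, timers, or module-level variables. A minimal teardown sketch (the names here are illustrative, not taken from your code):

var currentView = null;

function showView(container, html, wireUp) {
  // Give the previous view a chance to detach its listeners and timers
  // so its DOM nodes and closures can be garbage collected.
  if (currentView && typeof currentView.destroy === 'function') {
    currentView.destroy();
  }
  container.innerHTML = html;        // replaces the old view's nodes
  currentView = wireUp(container);   // should return { destroy: function () { ... } }
}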
I have a heap profile taken in Chrome Dev Tools, but I am not sure how to interpret the data. As a test, I created 10,000 WidgetBuilder objects, each with their own methods. I would like to compare storing methods on instances versus on the prototype and see how that affects memory and performance when my page loads.
Should I focus on Retained Size or Shallow Size?
Are the values listed in these columns in bytes?
What is considered a lot of memory?
You might want to start here:
https://developers.google.com/chrome-developer-tools/docs/heap-profiling
It goes into detail on how to understand what you're reading. As for what is considered a lot of memory, that's a tricky question. If your website is targeted at mobile devices, I would start there as a constraint. To come up with a good comparison, I'd suggest running the profiler against sites that you use every day and observing the memory consumption there.
If you find you're using more memory than gmail you might want to rethink ;)
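For quick sanity checks between snapshots, Chrome also exposes a non-standard performance.memory object whose values are reported in bytes (the Shallow Size and Retained Size columns in the heap profiler are in bytes as well). A small sketch:

if (window.performance && performance.memory) {
  // All three values are in bytes; log them before and after creating
  // your 10,000 objects to eyeball the difference.
  console.log('used heap:  ' + performance.memory.usedJSHeapSize);
  console.log('total heap: ' + performance.memory.totalJSHeapSize);
  console.log('heap limit: ' + performance.memory.jsHeapSizeLimit);
}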
I also recommend checking out jsPerf:
http://jsperf.com/prototype-vs-instance-functions
There is a LOT of prior work on that site with regard to performance testing. You might be able to save yourself some time.
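For reference, the two patterns being compared look roughly like this (WidgetBuilder is just the name from the question; the method body is made up):

// Per-instance methods: every object gets its own copy of build().
function WidgetBuilderInstance(name) {
  this.name = name;
  this.build = function () { return '<div>' + this.name + '</div>'; };
}

// Prototype methods: one shared copy of build() for all instances.
function WidgetBuilderProto(name) {
  this.name = name;
}
WidgetBuilderProto.prototype.build = function () {
  return '<div>' + this.name + '</div>';
};

// Create 10,000 of each and take a heap snapshot after each batch;
// the per-instance version should show a noticeably higher retained size.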
I am currently reading High Performance JavaScript by Nicholas C. Zakas and in the book he says things like:
Comparing the two pieces of code shows that using the Selectors API is 2 to 6 times faster across browsers (Figure 3-6).
What I'm looking for is a browser-based tool that lets me capture and measure the performance of a given piece of JavaScript and compare it against another script that uses a different approach (e.g., using the Selectors API vs. getElementsByTagName).
I've used Chrome and Firebug, but neither of them really seem to give me the kind of comparisons he's doing here. Am I using these tools incorrectly or is there a new tool I'm not familiar with that I should be using?
The most popular approach is to just use the free online service at http://jsperf.com/.
Or clone it from GitHub.
It has one big advantage over manual testing: it uses a Java applet that gives access to a nanosecond timer, while JS timers (Date objects) can only resolve to milliseconds.
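If you just want a rough in-browser comparison without jsPerf, something like the following works (assuming the page has a reasonable number of div elements; note that getElementsByTagName returns a live collection lazily, so the comparison is only approximate):

function time(label, fn, iterations) {
  var start = new Date().getTime();
  for (var i = 0; i < iterations; i++) { fn(); }
  console.log(label + ': ' + (new Date().getTime() - start) + ' ms');
}

time('querySelectorAll', function () { document.querySelectorAll('div'); }, 10000);
time('getElementsByTagName', function () { document.getElementsByTagName('div'); }, 10000);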
The Chrome developer tools are the way to go. There are three awesome features:
Timeline. This will show you the rendering time of any executing JavaScript on a timeline graph.
Heap snapshot. This will take a snapshot of your current JS heap, including the whole retaining chain. It also shows you how much memory each element is taking, which gives you a good way to find the places where your code is chewing up memory.
CPU profile. Shows how much CPU time each function is eating up; also useful for finding places to optimize and perhaps introduce web workers.
If you use the Chrome beta channel, check out the Speed Tracer extension (by Google). It's basically an enhanced timeline. If you're a jQuery guy, the beta also has CSS selector profiling. I have yet to use it, so I can't speak to how useful it is.
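You can also start and stop a CPU profile from your own code while the dev tools are open, which is handy for measuring one specific interaction (the function name below is made up):

console.profile('render');    // starts a named CPU profile in the dev tools
renderCalendar();             // the work you want to measure
console.profileEnd('render');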
A web application I am currently working on appears to be leaking memory: with each refresh of a page, IE's memory usage goes up and it never releases that memory. Since certain pages are meant to be kept open in a browser and auto-refresh, this is quickly turning into a problem.
The application is very JavaScript-heavy and relies on a combination of old 'plain' JavaScript code and newer jQuery code. I'm looking for tools that would allow me to keep track of the JavaScript objects and memory allocated, but I'm coming up short. Any suggestions?
In some of the newer versions of Chrome you can get a heap snapshot with the built-in developer tools.
When working with IE, you can use DynaTrace Ajax to get a bunch of diagnostic information.
Sorry for the short answer, but I'm not going to copy pasta what others have written better than me.
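One classic source of this kind of leak in older IE is a circular reference between a DOM node and a JScript closure, which IE6/7's garbage collectors never break on their own. A minimal sketch of the pattern and how to defuse it (the element id is made up):

function leaky() {
  var el = document.getElementById('status-panel');
  el.onclick = function () {
    // The handler closes over `el`, and `el` holds the handler:
    // a DOM <-> JS cycle that old IE never collects.
    el.className = 'clicked';
  };
}

function cleanUp() {
  // Breaking the cycle (or using `this` inside the handler instead of `el`)
  // lets IE release the memory before the page refreshes.
  var el = document.getElementById('status-panel');
  if (el) { el.onclick = null; }
}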
So I was involved in a site rewrite recently and we've gone live with what's a big improvement on the previous site in every way (no, it's not perfect; we live by deadlines and are always improving :D), with one exception: in IE6/7 it locks up after the page has shown. I know it's the JS, because with it disabled the site is fast, and I'm aware of some things like the simplegallery plugin we use being very slow, but even with that and the Google ads removed it's still at a crawl (+8 sec). I've looked through the Firebug profiler and made loads of JS/CSS changes such as:
Moving all JS except our img error handling to the bottom of the page
Making all jQuery selectors as specific as possible for best performance
Moving to jQuery 1.4
Running our core custom JS (main.js) through JSLint
Spriting commonly used images
Reducing CSS selector complexity
Doing this was good for all browsers, and I know I can do even more, but I'm not seeing the major improvement in IE6/7 that I need. We do use DD_roundies_0.0.2a.js for IE7 but not for IE6. I tried DynaTrace but couldn't see anything obvious, though I did get a bit lost in its depth.
A sample listing page
A sample search page
Can anyone see what I might be missing here and/or point to some good IE profiling tools?
Edit: I should have mentioned that I've been through YSlow, PageSpeed and Chrome's Developer Tools, all of which I used to base most of the improvements mentioned above on. At this point I'm not saying the site is fully optimised, but it's OK and moving in the right direction. However, I still have an issue in IE6/7 and I believe it to be the JS execution.
Edit 2: We already send down the Chrome Frame meta tag for IE6 from the server. It's not a solution, but I see it doing more good than harm for IE6. I'm after JS-specific feedback at this point, as I think I've covered all the other bases.
You can manually profile your "common.js" script in IE6.
Just grab a new timestamp at strategic places and alert the elapsed times at the end.
e.g.
function ts() { return (new Date).getTime(); }
var t0 = ts();
// some of your code
var t1 = ts();
// rest of your code
var t2 = ts();
alert(t1-t0); // milliseconds between t0 and t1
alert(t2-t0); // ms between t0 and t2
Maybe one part of the script is that much slower than the rest.
Or maybe it's just IE6.
You're including jQuery from http://ajax.googleapis.com/ajax/libs/jquery/1.4/jquery.js; it'll download faster if you host it on your own website.
Also, check out the YSlow add-on for Firebug; it gives you lots of information about what you can do to improve the load time of your site.
If you want to be drastic, force your users to install Google Chrome Frame; it will make IE use the Chrome renderer and JavaScript engine:
http://code.google.com/chrome/chromeframe/
Currently the only thing there that seemed suspicious was "omniture.js".
Here's a blog post I found regarding omniture.js and IE6.
You can use Speed Tracer on Chrome to debug speed issues. Of course, the JS speed you measure there will be that of the V8 engine, not IE's.
The JavaScript itself is often not the issue; it's when you modify the DOM that you end up in trouble, for example when animating, adding, and removing elements. Be extra careful with opacity.
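If DOM churn is the problem, batching the writes helps a lot in old IE; for example, building rows into a DocumentFragment and inserting them once instead of appending in a loop (a sketch, not taken from the site in question):

function appendRows(tbody, rows) {
  var fragment = document.createDocumentFragment();
  for (var i = 0; i < rows.length; i++) {
    var tr = document.createElement('tr');
    var td = document.createElement('td');
    td.appendChild(document.createTextNode(rows[i]));
    tr.appendChild(td);
    fragment.appendChild(tr);
  }
  tbody.appendChild(fragment); // one insertion, one reflow
}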
Take a look at "LABjs & RequireJS: Loading JavaScript Resources the Fun Way", which talks about how you can load scripts in parallel:
Until the latest generation of browsers, the <script> tag had some really unfavorable performance characteristics to it. Namely, <script> tags "block", meaning they call a halt to everything else that's loading/happening on the page, while they load and while they execute. Not only do they halt all other parts of the page, they even used to halt the loading of any other <script> tags.
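The basic technique those loaders build on is dynamic script injection: scripts added this way download in parallel and don't block the parser the way plain script tags do. A stripped-down sketch:

function loadScript(url, callback) {
  var script = document.createElement('script');
  var done = false;
  script.src = url;
  script.onload = script.onreadystatechange = function () {
    if (!done && (!this.readyState || this.readyState === 'loaded' || this.readyState === 'complete')) {
      done = true; // guard against the callback firing twice in old IE
      callback();
    }
  };
  document.getElementsByTagName('head')[0].appendChild(script);
}

loadScript('/js/main.js', function () { /* main.js has loaded and executed */ });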
Consider a complex rich internet application with lots of user interaction. I'm thinking of extensive drag-drop support, server-side user input validation, custom drawn UI controls such as an Outlook-like calendar, real-time UI feedback, etc... Would such an application be debuggable? I mean, can you easily step through the source code, place breakpoints, view the contents of variables, see the current call stack, use a profiler to pinpoint performance issues, etc...
Yes, why wouldn't it be?
Complexity just means more code to dig through, but tools like console.trace() from Firebug make that easier.
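For example, a console.trace() dropped into a deeply nested handler prints the call stack that got you there (the function names below are made up):

function onDrop(item) { positionItem(item); }

function positionItem(item) {
  console.trace(); // logs the current call stack: positionItem <- onDrop <- ...
  // ...
}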
Yes, it would be debuggable.
If you're using IE8 to test your site, you could use the Developer Tools to inspect individual HTML elements and change their CSS on the fly. There's also the ability to break into JavaScript from the same interface.
If you're using Firefox, Firebug has almost identical abilities with a different interface.
Safari also has developer tools installed by default, you just have to go through the hoops of enabling them.
When you are designing your application, design it with debuggability and testability in mind. Make sure that individual parts are independently testable, that you have enough test data, that you have appropriate debug/probe points in your program logic, and so on. Essentially, if the complexity is properly managed, debuggability won't be an issue at all.
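One cheap way to build in those probe points is a switchable logger that modules can call without worrying about whether a console exists (a sketch, with made-up names):

var DEBUG = true; // flip to false for production builds

function probe(label, data) {
  if (DEBUG && window.console && console.log) {
    console.log('[probe] ' + label, data);
  }
}

// e.g. inside a drag-and-drop module:
// probe('drop', { target: targetId, position: pos });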
If your job depended on it, you would find a way! :)
Seriously... a passenger jet has literally millions of parts and yet there are regular routine maintenance checks and if it breaks down it gets fixed. It's a very rare piece of software that approaches that much complexity.
Web app front ends tend to be relatively simple. Essentially you're just pushing some text from the server to the browser and making it pretty; and you're using various parts of the in-browser display as controls, some of which initiate some more text conversations with the server. There are lots of little things that can go wrong, of course, but much of the hardship is simply getting the browser (all of them!) to Do What You Mean.
The only truly difficult problems are those that are intermittent and/or timing sensitive. Those can be a bear to reproduce and trace. That calls for in-depth logical analysis of your source code and/or some specialized testing methods.