I want to make my web applications faster.
For now I profile my web applications on a few test machines and test some parts on jsPerf. However, this only gives me limited insight into how the applications run when used by my clients.
What I would want is to measure performance "in the wild", when the scripts are in production.
I understand this will incur some overhead. However, by profiling only specific parts of the code for each user and then combining those reports, it should be possible to get a complete picture with next to no performance hit.
Are there already solutions like this? Does Google Analytics or Google Webmaster Tools provide anything like this? I can't find any such thing.
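Something like the following sketch is what I have in mind; the endpoint and function names are just placeholders, not an existing service or library:

```js
// Instrument only a small random sample of sessions and send the timings
// back to the server for aggregation. '/perf-report' and reportTiming are
// placeholders.
var PROFILE_THIS_SESSION = Math.random() < 0.01; // ~1% of users

function reportTiming(name, fn) {
  if (!PROFILE_THIS_SESSION) return fn();
  var start = performance.now();
  var result = fn();
  var payload = JSON.stringify({ name: name, ms: performance.now() - start });
  navigator.sendBeacon('/perf-report', payload); // aggregated server-side
  return result;
}

// reportTiming('renderSidebar', renderSidebar);
```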
There is no proper JS profiling in Google Analytics. You can use User Timings to measure how long your scripts take to run or to load. And if you use Google Tag Manager, you can implement the JavaScript error listener, which tracks unhandled JavaScript exceptions (i.e. it gives the error message, the offending script, and the line on which the error happened).
This provides at least some insight into how your JavaScript behaves in the wild, but (IMO) it's not actually "profiling".
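For the User Timings part, a minimal sketch with analytics.js could look like this (renderDashboard and the category/variable names are just examples):

```js
// Time a piece of code and report it as a User Timing to Google Analytics.
// Assumes the standard analytics.js snippet has already defined ga().
var start = performance.now();

renderDashboard(); // whatever you want to measure (illustrative name)

var elapsed = Math.round(performance.now() - start);
if (window.ga) {
  // ga('send', 'timing', category, variable, value in ms, [label])
  ga('send', 'timing', 'JS Performance', 'renderDashboard', elapsed);
}
```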
Related
We use PhantomJS 2.0 to take screenshots of web pages. We've found that one particular page takes several minutes to process. This page does not appear to have this issue (or at least not to any comparable degree) when loaded in Chrome.
I believe this is because the JavaScript is hanging or running very slowly. During the hang, Phantom uses a lot of CPU (although only one core). It does not appear to be using an abnormal amount of memory. I am fairly confident that JavaScript is the culprit because I can see from logging that all requests complete quickly, but after the page loads Phantom hangs for a while and won't run anything (I think this is because Phantom is single-threaded, so if the page is still running JavaScript my Phantom script won't execute).
I'd like to debug and work out which part of the JS is taking so long, but I can't figure out how to get at this in Phantom. For example, I can't seem to collect any output from console.profile/console.profileEnd. How can I profile the JavaScript running in Phantom to find the bottleneck?
I use Phantomas, via grunt-phantomas. It's a tool that integrates with PhantomJS to profile a wide variety of performance-related metrics. Definitely worth checking out. If it doesn't give you exactly what you need, you can look at the source and see how they integrate with PhantomJS and get data out.
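If you just want rough numbers straight out of PhantomJS rather than Phantomas's full metrics, one workaround for console.profile output not being captured is to forward the page's console messages to the Phantom process and time things by hand. A minimal sketch (the URL and the timed step are placeholders):

```js
// run with: phantomjs profile-page.js
var page = require('webpage').create();
var started = Date.now();

// Relay anything the page logs to the Phantom process's stdout.
page.onConsoleMessage = function (msg) {
  console.log('[page] ' + msg);
};

page.open('http://example.com/slow-page', function (status) {
  console.log('load finished (' + status + ') after ' +
              (Date.now() - started) + 'ms');

  // Run code inside the page; its console.log output comes back
  // through onConsoleMessage above.
  page.evaluate(function () {
    var t0 = Date.now();
    // ... call the code you suspect is slow here ...
    console.log('suspect step took ' + (Date.now() - t0) + 'ms');
  });

  phantom.exit();
});
```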
I'm refactoring the code of an existing "home-made" website. When I open some pages, my CPU consumption jumps by 20%...
I'm used to the Firefox development tools, but everything seems to be calm after the page is loaded. I guess it's a piece of JavaScript running in a loop, but I don't know how to catch it.
Do you know a way to spot the offending script? Is there an existing tool similar to the resource-consumption monitor, but at the scale of the whole website?
Test your website for performance.
Some websites and browser plugins can show you exactly which JavaScript is taking time (if JavaScript really is the issue; if other things are also contributing to CPU utilization, these plugins can point you to them too):
GTMetrix - a website that can help you find bottlenecks in your site.
PageSpeed - a Chrome plugin that is really good at finding bottlenecks and suggesting improved JavaScript. It can pinpoint the culprit script and provide an optimized/minified version of it - of course, only if JavaScript is the actual bottleneck.
Firebug - a Mozilla plugin for Firefox that can do the same job but won't suggest optimizations.
Meanwhile, monitor your system resource utilization, and make sure that nothing else is running on the client machine besides the website, so you know the utilization really comes from those 2-3 pages.
Based on the suggestions these tools/profilers provide, decide on your optimization candidates.
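As a complement to the tools above, if you suspect a timer loop is keeping the CPU busy, a quick hand-rolled check is to wrap setTimeout/setInterval and log slow callbacks. A rough sketch (the 16 ms threshold is arbitrary, and string callbacks aren't handled):

```js
// Wrap setTimeout/setInterval so every callback logs how long it ran.
// Load this before the page's own scripts schedule their timers.
(function (global) {
  var SLOW_MS = 16; // anything longer than roughly one frame

  function wrapTimer(name) {
    var original = global[name];
    global[name] = function (callback, delay) {
      var extraArgs = Array.prototype.slice.call(arguments, 2);
      return original.call(global, function () {
        var start = Date.now();
        var result = callback.apply(this, extraArgs);
        var elapsed = Date.now() - start;
        if (elapsed > SLOW_MS) {
          console.warn(name + ' callback took ' + elapsed + 'ms', callback);
        }
        return result;
      }, delay);
    };
  }

  wrapTimer('setTimeout');
  wrapTimer('setInterval');
})(window);
```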
I'm currently testing JavaScript visualization toolkits and want to measure execution time, memory consumption, etc.
I know how to profile JavaScript with the Chrome dev tools, Google's speed analyzer and so on, but I want users to perform the tests on their own and display the results (without using dev tools or installing an extension).
Is there a library or something that can be used to achieve this? Subtracting start and end time for each function does not seem like a good solution.
Best case scenario would be a Library to profile individual functions.
Caveat: you will not be able to get a CPU profile or memory usage using a JS-based testing solution. If this is what you are after, a Chrome extension may very well be the way forward.
If, however, this doesn't bother you and you are after a ready-made solution, Benchmark.js may prove to be a good starting point.
The method it uses is akin to what you mentioned - taking time differences in execution. However, it does so many times (100 or more) in order to average out statistical error. This makes your results free of truly random errors (which does not, however, mean that your data will be meaningful).
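A minimal suite, adapted from the Benchmark.js documentation, looks roughly like this:

```js
// Compare two ways of finding a character; results are logged per cycle.
var suite = new Benchmark.Suite();

suite
  .add('RegExp#test', function () {
    /o/.test('Hello World!');
  })
  .add('String#indexOf', function () {
    'Hello World!'.indexOf('o') > -1;
  })
  .on('cycle', function (event) {
    console.log(String(event.target)); // e.g. "RegExp#test x 12,345 ops/sec ..."
  })
  .on('complete', function () {
    console.log('Fastest is ' + this.filter('fastest').map('name'));
  })
  .run({ async: true });
```

The per-cycle strings can be written into the DOM instead of the console, which covers letting users run the tests themselves and see the results.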
I have a huge web app that's switching from an HTML-rendered-on-the-server-and-pushed-to-the-client approach to a let-the-client-decide-how-to-render-the-data-the-server-sends one, which means performance on the client mattered in the past, but it's critical now. So I'm wondering whether, in the current state of affairs, it's possible to profile web apps and extract the same data (like call stacks, "threads", event handlers, number of calls to certain functions, etc.) that we use for server-side performance work.
I know every browser implements some of this to some extent (the IE dev tools have an embedded profiler, so does Firefox [with Firebug], and Google Chrome has Speed Tracer), but I was wondering if it'd be possible to get, for example, stack traces of sessions. Is it advisable to instrument the code and have a knob to turn the instrumentation on/off? Or is it simply not that useful to go to that level when analyzing JavaScript performance?
Fireunit is decent, and YUI also provides a profiler, but neither provides stack traces or call frames. Unfortunately, there aren't many JS profiling tools out right now, and none of them are particularly great.
I think it's very important to go to that level of performance analysis, especially considering that users deal with the JS app directly more than 90% of the time.
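On the instrumentation knob question: a hand-rolled version can be as small as a wrapper that times method calls only when a flag is set. A sketch (all names here are illustrative, and reporting is left as a console.log placeholder you would replace with a beacon/XHR to your server):

```js
// Turn profiling on per session, e.g. via a query-string knob.
var PROFILING_ENABLED = /profile=1/.test(window.location.search);

// Replace obj[methodName] with a timed wrapper when profiling is on.
function instrument(obj, methodName) {
  if (!PROFILING_ENABLED) return;
  var original = obj[methodName];
  obj[methodName] = function () {
    var start = Date.now();
    try {
      return original.apply(this, arguments);
    } finally {
      // Swap this for a beacon/XHR to aggregate timings across users.
      console.log(methodName + ' took ' + (Date.now() - start) + 'ms');
    }
  };
}

// Usage (illustrative): instrument(MyApp.views, 'render');
```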
I'm using Google Chrome's new Speed Tracer extension to profile my app.
It appears my app is constantly reporting "Sluggish (events) 100%", which means the browser is blocking HTML rendering.
I don't understand enough about how to interpret the Speed Tracer tool to fix this issue.
Any help appreciated. My web app is: bit.ly/7J0U
Doesn't seem sluggish to me (MacBook 10.5.8 / FF 3.5.5). My initial thought was to direct you to some event-delegation information, but that doesn't seem applicable after visiting your app.
Try to get a baseline for Speed Tracer's reports by running the tool on different sites and seeing how they compare with your own. That may help you get a better sense of how to interpret the reports. Also, don't forget that these things (YSlow included) are tools, not rules.