Measuring performance of clients - javascript

I'm currently researching ways to gather analytics/metrics on the performance of client machines running our webapp. The app is heavily AJAX-based, and we are hoping to gather some stats about how well our clients' machines are handling it.
We don't necessarily want to put performance-monitoring code throughout the application (for a great number of reasons this may not be feasible anyway). Rather, we would like to be able to run some kind of test when a user submits feedback that could give us an idea of how well their browser/computer performs.
This has been a slightly tricky thing to research, as it keeps bringing up discussions about profiling. That is obviously useful, but only to a point, since our development machines are massively overpowered. We are hoping to get metrics on the kinds of machines our clients are actually connecting with.
Does any kind of library/framework or best practice exist for this? So far my best thought is to run some kind of CPU-intensive process through JavaScript for a few seconds and measure the performance that way.
Thoughts or suggestions? May be an interesting discussion.
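For what it's worth, here is a minimal sketch of that timing idea. The workload, iteration count, and result shape below are arbitrary choices, not an established benchmark; the numbers are only comparable across clients if every client runs identical code:

```javascript
// Time a fixed CPU-bound workload and report the elapsed milliseconds.
function benchmarkCpu() {
  var iterations = 1000000; // arbitrary; keep identical on every client
  var start = new Date().getTime();
  var acc = 0;
  for (var i = 0; i < iterations; i++) {
    acc += Math.sqrt(i) * Math.sin(i);
  }
  var elapsed = new Date().getTime() - start;
  return { iterations: iterations, ms: elapsed, checksum: acc };
}

// e.g. attach benchmarkCpu().ms to the feedback form before submitting
```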

Here is what we do to monitor and analyze client usage data:
Use Google Analytics to capture information about users (platforms, browsers, connection speeds, site usage, etc.)
Use Google Webmaster Tools to get additional site stats and optimization suggestions
Use the PageSpeed plugin to analyze and fine-tune high-volume and/or slow pages
Use Apache ab or JMeter to run basic load tests against high-volume pages
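If you are already loading Google Analytics, the classic ga.js queue can also carry custom client-side timings back into your reports. A hedged sketch, assuming ga.js is on the page; the category, variable, and label names are invented:

```javascript
// Measure how long an AJAX-heavy view takes to render, then push the
// timing into the ga.js queue via _trackTiming.
var renderStart = new Date().getTime();
// ... render the view ...
var renderTime = new Date().getTime() - renderStart;

var _gaq = _gaq || [];
_gaq.push(['_trackTiming', 'ClientPerf', 'viewRender', renderTime, 'feedback']);
```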

This is an interesting question; as you brought up, most developers profile on their own machines. I am not sure there is any way other than putting a performance profiler in your code. The interesting part you raised is that this is based on the user's feedback, so it doesn't necessarily have to be sent to the server all the time.
We could develop a Profiler JavaScript class that basically collects:
Function name
Network round trip time
Total Function execution time
UserMachineProcessingTime = total function execution time - network round-trip time
Other useful info (similar to what YSlow or similar tools provide)
As you mention, since this is based on user feedback, we don't need to send this information every time a function gets called (which would make the app very chatty). We instead aggregate this information on the client side and possibly store it somewhere (maybe using HTML5 local storage?).
Only when the user gives their consent to submit the performance profile do we send this information to the server, where you get the data you need. It would also be interesting to see how users react if we show a tiny message saying "We noticed that your performance is below that of our average user. Would you like to send your performance profile so we can learn and make it better?" (different wording may be necessary, I am bad with this, but that is basically the message). Upon saying yes, the Profiler sends out the aggregate information it has collected, plus additional information that JavaScript can collect (user agent, etc.). Of course, the question is how many users would opt in to send their profile info, but it is one approach we could try.
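A minimal sketch of what such a Profiler class could look like, assuming the metrics listed above; the storage key, field names, and endpoint are invented for illustration:

```javascript
// Collects per-function timings client-side, persists them to
// localStorage, and only uploads once the user has consented.
function Profiler() {
  this.samples = JSON.parse(localStorage.getItem('perfSamples') || '[]');
}

// Record one AJAX-backed call: total time minus network time
// approximates the user machine's processing time.
Profiler.prototype.record = function (fnName, totalMs, networkMs) {
  this.samples.push({
    fn: fnName,
    total: totalMs,
    network: networkMs,
    processing: totalMs - networkMs, // UserMachineProcessingTime
    userAgent: navigator.userAgent
  });
  localStorage.setItem('perfSamples', JSON.stringify(this.samples));
};

// Call this only after the user opts in.
Profiler.prototype.submit = function (url) {
  var xhr = new XMLHttpRequest();
  xhr.open('POST', url, true);
  xhr.setRequestHeader('Content-Type', 'application/json');
  xhr.send(JSON.stringify(this.samples));
};
```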

Related

Data in Javascript Object Performance

The more proficient I become with JavaScript, the more performance-aware I want to be. For example, I had an autocomplete box that would hit the server every so often to try to auto-complete the user's request, exactly how Google and most websites enhance their search. Now my question is: if I am querying the FB API for a list of friends and storing these in a JavaScript object (±700 entries with full name and user id), how will this affect performance? I can't imagine it would be worse than constantly hitting the server with requests; storing this info locally and then querying it seems much more efficient. Where do I draw the line in storing information in a JavaScript object? I would like my server to do as little as possible and have adopted the philosophy of letting the client do 80% of the work and the server the remaining 20%. Obviously I want the client to experience a smooth application. How do the JS ninjas test the performance of their applications?
The most efficient solution - in terms of reduced network latency - would be to retrieve the friend list, store it using something like the localStorage API, and update the list in the background every once in a while. Keeping a list of this size in memory really shouldn't affect page performance much if at all (although on mobile this may not be the case). If someone has thousands of friends then you may see a slow-down, especially when reading the data from localStorage, but still, I doubt it.
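A sketch of that approach, assuming an asynchronous fetchFriends(callback) wrapper around the FB API call; the storage key and refresh interval are invented:

```javascript
// Serve the friend list from localStorage when possible and refresh it
// in the background once it is older than REFRESH_MS.
var REFRESH_MS = 60 * 60 * 1000; // one hour, arbitrary

// Placeholder for your FB API call, e.g. via the JS SDK:
// FB.api('/me/friends', function (resp) { callback(resp.data); });
function fetchFriends(callback) { /* ... */ }

function getFriends(callback) {
  var cached = localStorage.getItem('fb_friends');
  if (cached) {
    var entry = JSON.parse(cached);
    callback(entry.friends); // answer from cache immediately
    if (new Date().getTime() - entry.fetchedAt < REFRESH_MS) return;
  }
  fetchFriends(function (friends) {
    localStorage.setItem('fb_friends', JSON.stringify({
      friends: friends,
      fetchedAt: new Date().getTime()
    }));
    if (!cached) callback(friends); // first load: answer from network
  });
}
```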
Deciding where to draw the line is harder; it depends on so many factors. TBH this is something you may only know once your product is released and your users can give you feedback on performance, allowing you to fine-tune. A/B testing can help a lot in this regard.

Turn based multiplayer for iOS using CouchDB and IrisCouch

My startup app company and I are working on a turn-based multiplayer iPhone application. Let it be said that none of us have any database or server knowledge whatsoever, though we are willing to learn.
The flow of the game will be similar to games such as WordFeud, WordsWithFriends, Rumble, etc.
Let me start off with where a lot of searching on the web has gotten us:
We have decided to use CouchDB for storing information about users, game sessions and other things. CouchDB is an open-source NoSQL database system. The reason is that we have been told it should support a lot of concurrent users and that it scales - we are hoping to go big, of course.
Our CouchDB is hosted on IrisCouch, an in-cloud hosting service designed for running CouchDB.
So, we've got a CouchDB server up and running, and we know the basics of how to query data from it.
Our biggest confusion right now is how we should set up the system according to best practices. Right now we are at the point where we can receive and submit data to the server.
Our game is supposed to have Facebook integration so that users can register via our app or through Facebook. After that they can play with randomly matched opponents, or play with friends. After a match is started, one player will get a set of questions to answer; after he has answered, the other player should be notified through a push notification that it's their turn. After a few rounds the game is finished.
At this point, we think this might be the best solution for the flow of the application:
A user connects to another user -> a game session is opened as a document in a database called "games".
The newly created document contains both player names, questions, answers, etc.
A field named "whos_turn" decides which of the two players' turn it is.
After the game has ended, the session is erased.
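To make that concrete, here is a rough sketch of what such a document in the "games" database might look like; apart from "whos_turn", every field name here is invented:

```javascript
// Illustrative shape of one game-session document.
var exampleGameSession = {
  "_id": "game-8f3a2c",
  "type": "game_session",
  "players": ["alice", "bob"],
  "whos_turn": "alice",
  "round": 1,
  "questions": [
    { "text": "Sample question?", "answers": { "alice": "Sample answer" } }
  ],
  "finished": false
};
```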
Again, as you may see, we are in the dark as to how to really do it, but this is the general idea.
So, my questions go as follows:
Is it best to query the data directly from the iPhone application, or through a web service?
What is the best way to set up the database, to best manage the flow of the application?
Any information that could lead us closer will be gladly appreciated :)
In advance, thank you!
Olav Gundersen
EDIT #1: Our Objective-C programmer managed to connect two iPhone devices using CouchDB. The iPhone application consists of a table view with a continuous connection to the database, so that when someone POSTs to the database, it shows up in the table view of all the other connected phones. Behold: a severely ineffective chat system.
Since it is multiplayer, you will need the app to communicate with the remote IrisCouch DB, but I am concerned by the point where you state that none of you have any database experience. You are willing to learn, so I think the best place to start is:
http://guide.couchdb.org/editions/1/en/index.html
There are several issues you might run into with scalability if you plan to erase documents continuously. DB size can be considerable on CouchDB, and you will need to compact and clean up the DB regularly. But I don't think that is a major issue for now, as this is at a start-up level.
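As a hedged illustration: CouchDB exposes compaction through its HTTP API as a POST to /{db}/_compact. The host and database name below are placeholders, and the call normally requires admin credentials:

```javascript
// Trigger compaction of a CouchDB database over its HTTP API.
var xhr = new XMLHttpRequest();
xhr.open('POST', 'https://example.iriscouch.com/games/_compact', true);
xhr.setRequestHeader('Content-Type', 'application/json');
xhr.send();
```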
The question "best way to setup the database and best manage the flow of the application" should be addressed by your team. If you do not have someone with any database experience you should try to find someone willing to help you. It should be someone with extensive experience in databases. You might find some fairly reasonable professionals at http://www.odesk.com
In total honestly I don't think you will be successful if you don't have such a figure - either as a freelancer or contributor - to help you having a solid database logic in the game that will ensure a great user experience.
For example: have you considered the latency issue of using a DB based in the USA (IrisCouch) versus where your users are located?
For this reason you might want to do as much as possible client-side, with an embedded database such as SQLite or TouchDB, which is essentially CouchDB for iPhone. For an iPhone application, TouchDB is made exactly for that:
https://github.com/couchbaselabs/TouchDB-iOS (caveat: since you need connectivity to check turns etc., this might not be the ideal solution, but it could work for storing some information locally).
To lay this out, you would need someone with CouchDB experience to set up a proper, usable application. There is nothing wrong with being enthusiastic about your idea, but to make it a success you need a technical mind on the database side. Of course, you might well be capable of learning this yourself; after reading the CouchDB book you should be in a position to create a basic flow that fits your needs.
Of course, other more experienced users might come along with a more comprehensive answer or a sample layout, but I don't think that would be the best approach. Even if someone posts a full layout of the document structure and how to query it, how are you going to service the app if something goes awry (e.g. sessions don't get deleted, conflicts arise, etc.)? Hence my sincere advice to get some ad hoc expertise for your case.
This might also lead you to analyze suitable alternatives. I don't think you should buy into the idea that CouchDB can scale and is hence the best/only option for you (of course this might well be the case, and if you feel it is a good option, go for it). For example, Twitter, Google AdWords and many other online apps use MySQL to store their data, so CouchDB is certainly not the only database that can scale!
I think this demo app could be a good example to follow: iOS Couchbase Demo

Opinions on possible optimization for a web application using javascript

I'm thinking of implementing my web application in a certain way as an optimization, and I'd like to get people's opinions on whether this is a good idea or not.
Here are the details:
For most of my pages, instead of determining server-side whether the user is logged in and then modifying the page I send based on that, I want to send the same page to everyone. This way I can make use of my reverse caching proxy and, for most requests, not even have to run any dynamic code at all.
The differences needed for logged-in users will be handled in JavaScript. The necessary information to make the changes (their user name, their user id, and whether they are logged in) will be stored in a cookie that JavaScript can read.
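A minimal sketch of that client side, assuming the cookie is not HttpOnly and holds, say, a user_name value; the cookie name and element ids are invented:

```javascript
// Read login info from a cookie and adjust the cached, anonymous page.
function readCookie(name) {
  var match = document.cookie.match(new RegExp('(?:^|; )' + name + '=([^;]*)'));
  return match ? decodeURIComponent(match[1]) : null;
}

var userName = readCookie('user_name');
if (userName) {
  document.getElementById('login-link').style.display = 'none';
  // createTextNode avoids injecting markup from the cookie value
  document.getElementById('greeting').appendChild(
    document.createTextNode('Welcome back, ' + userName));
}
```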
I don't need to worry about users not having JavaScript, because my web app requires JavaScript anyway.
Only the most popular pages, accessible to both logged-in and logged-out users, will do this.
What do you guys think? Pros/cons? Is this something that websites commonly do?
Doing it for 100% of your application would be a little problematic; however, it sounds like you are seeking to use something called the Model-View-Presenter pattern:
http://en.wikipedia.org/wiki/Model_View_Presenter
Be wary that, with JavaScript, your code is exposed, meaning that any security measure you take is potentially hackable through the browser. Add protection on the server side and you are set.
Also, since you are going to rely heavily on JavaScript, I really recommend using MooTools, which takes an object-oriented approach to JavaScript. That way you can keep your code really modular and work around messy implementations using custom and class events, etc.
Major con: If you are determining what content a viewer can access with JavaScript alone, it stands to reason that a malicious user can potentially access premium content with just a little glance at your source code.
I'm not sure what you are really optimizing - you need to fetch the user data anyway, and only the server has that. Do you plan on sending an AJAX request for the data and using JavaScript to format it? You are only saving on output generation, which is usually not the bottleneck in web applications. Much more often the database, IO (files), or network (HTTP requests) are the bottlenecks.
The major con here is that by moving all output generation to JavaScript, you will substantially increase the download size and reduce overall responsiveness. Since none of the big sites use this approach, you can be sure it doesn't solve scalability problems.

How much external data is too much? (XML or JSON)

I have written pure JavaScript front ends before and started noticing performance decrease when working with large stores of data. I have tried using XML and JSON, but in both cases it was a lot for the browser to handle.
That poses my question, which is how much is too much?
You can't know, not exactly and not always. You can make a good guess.
It depends on the browser, OS, RAM, CPU, what else is running at that moment, how fast their connection is, what else they're transferring, etc.
Figure out several situations you expect for your average user, and test those. Add for various best, worst, and interesting (e.g. mobile, tablet) cases.
You can, of course, apply experience and extrapolate from your specific cases, and the answer will change for the future.
But don't fall into the trap of "it works for me!"
I commonly see this with screen resolutions: as those have increased, it's much more popular to have multiple windows visible at the same time. In 1995 it was rare for me to not have something maximized; now fifteen years later, it's exactly the opposite.
Yet sometimes people will design some software or a website, use lower contrast[1], maximize it, and connect to a server on localhost—and that's the only evaluation they do.
[1] Because they know what the text says and don't need to read it themselves, so lower contrast looks aesthetically better.
In my opinion, if you need to stop and think about this issue, then the data is too much. In general you should design your applications so that users with low-end netbooks and/or slow internet connections are still able to run them. Also keep in mind that, more often than not, your application isn't the only page your users are visiting at the same time.
My recommendation is to use Firefox with Firebug to do some measurements. See how long a request takes to complete in a modest configuration. If it takes noticeable time for the browser to render the data, then you'd be better off doing a redesign.
A good guiding principle should be that instead of worrying about whether the browser can handle the volume of data you're sending it, worry about whether your user can handle it. It all depends on the presentation of course (i.e., a lot of data bound for a visualization tool that'll render a complex graph in a canvas is different than a lot of raw numbers bound for a gigantic table), but in my experience a user's brain reaches data overload before the browser/network/client computer.
It really depends on the form your external data will take in your JavaScript. If you want to load all your data at once and keep it in memory as a large object with lots of properties (an associative array), then you will find that most current desktops can only handle about 100k entries (with small key-value pairs) before performance really degrades.
If possible, you should see whether there are ways to load only the data the user needs for a given request/interaction. You can use AJAX to request needed data and prefetch data that you think the user may need.
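A sketch of that on-demand pattern; the URL scheme, page size, and cache shape are invented:

```javascript
// Fetch pages of data only when needed and memoize them; prefetching
// is just a request whose result we keep but don't render yet.
var pageCache = {};

function getPage(pageNumber, callback) {
  if (pageCache[pageNumber]) {
    callback(pageCache[pageNumber]);
    return;
  }
  var xhr = new XMLHttpRequest();
  xhr.open('GET', '/data?page=' + pageNumber + '&size=100', true);
  xhr.onreadystatechange = function () {
    if (xhr.readyState === 4 && xhr.status === 200) {
      pageCache[pageNumber] = JSON.parse(xhr.responseText);
      callback(pageCache[pageNumber]);
    }
  };
  xhr.send();
}

// Prefetch the page the user will probably ask for next.
function prefetch(pageNumber) {
  getPage(pageNumber, function () {});
}
```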

To Ajaxify Or Not?

I really love the way AJAX makes a web app perform more like a desktop app, but I'm worried about the hits on a high-volume site. I'm developing a database app right now that's intranet-based and that no more than 2-4 people will be accessing at one time. I'm AJAXing the hell out of it, but it got me wondering: how much AJAX is too much?
At what point does the volume of hits outweigh the benefits of using AJAX? It doesn't really seem like it would, versus a whole-page refresh, since you are, in theory, only updating the parts that need updating.
I'm curious whether any of you have used AJAX on high-volume sites, in what capacity you used it, and whether it created scaling issues.
On my current project, we do use Ajax and we have had scaling problems. Since my current project is a J2EE site that does timekeeping for the employees of a large urban city, we've found that it's best if the browser side can cache data that won't change for the duration of a user session. Fortunately we're moving to a model where we have a single admin process the timekeeping for as many employees as possible. This would be akin to how an ERP application might work (or an email application). Consequently our business need is that the browser-side can hold a lot of data, but we don't expect the volume of hits to be a serious problem. So we've kept an XML data island on the browser-side. In addition, we load data only on an as-needed basis.
I highly recommend the book Ajax Design Patterns or their site.
AJAX should help your bandwidth on a high-volume site if that is your concern, since, as you said, you are only updating the parts that need updating. My problem with AJAX is that your site can be rendered useless if visitors do not have JavaScript enabled, and most of the time I do not feel like coding the site again for non-JavaScript users.
Look at it this way: AJAX must not be the only option, because of the possibility of !script; it must exist as a layer on top of an existing architecture to provide a superior experience in some regards. Given that, it is impossible for AJAX to create more requests or more work than plain HTML, because it is handling the exact same data transfer.
Where it can save you bandwidth and server load is that AJAX gives you the ability to transfer only the data. You can save on redundant HTML, image, CSS, etc. requests with every page refresh whilst providing a snappier user experience.
As mike nvck points out, polling is a big exception to this rule, but that's about the technique, not the tech: you would have the same kind of impact with a simple page poll.
Understand the tool and use it for what it was designed for. If your AJAX implementation is reducing performance, you've done something wrong.
(fwiw, my experience of profiling AJAX vs simple HTML tends to result in ~60% bandwidth, ~80-90% performance benefits)
The most common scaling issue with AJAX apps arises when they are set up to check back with the server for updated content without the user actively requesting it. 5 clients checking every 10 seconds is not the same as 5000 clients checking every 10 seconds.
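A sketch of a common mitigation: back the polling interval off while nothing is changing. The endpoint and intervals are invented:

```javascript
// Poll for updates; double the delay while the server reports nothing
// new, and snap back to the base rate when there is activity.
var delay = 10000; // base interval: 10 seconds

function poll() {
  var xhr = new XMLHttpRequest();
  xhr.open('GET', '/updates', true);
  xhr.onreadystatechange = function () {
    if (xhr.readyState !== 4) return;
    if (xhr.status === 200 && xhr.responseText !== '[]') {
      delay = 10000;                        // activity: poll eagerly
    } else {
      delay = Math.min(delay * 2, 300000);  // quiet: back off, cap at 5 min
    }
    setTimeout(poll, delay);
  };
  xhr.send();
}

setTimeout(poll, delay);
```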
On one side, AJAX reduces the server workload because it usually shows or refreshes just part of the page; on the other, it increases the number of hits to the server. It all depends on the architecture of your web application. If your application needs a lot of processing for every hit (like database access) regardless of the size of the response, then AJAX will hit you a lot.
