I am looking to build an in-house debugging system so we can see how users react to certain things.
What would be the best way to communicate all mouse clicks, moves, etc. back to the server?
One way I've thought of is to bind a handler on body for every event and push each one into an array that gets sent at page unload, but I figured this could seriously kill a browser if the user has decided to click everything in sight, or has sat there at work for four hours just moving their mouse around the page.
Ideally I want to avoid web sockets.
I'm sure this has been done before so I'd love to know how it's been done.
Thanks
For those of you wondering, I used the answer found here (How do you log all events fired by an element in jQuery?)
as a wrapper, combined with #hallleron's approach: storing the values in a string separated by |, firing off an AJAX request every 3.5 seconds, then setting the buffer back to null. On page unload, the AJAX request fires one final time.
I'm also considering having the unload script dynamically create an iframe containing a (again, dynamically generated) form that auto-posts its contents, just in case the AJAX hangs for whatever reason.
Each batch carries its own CSRF token and a randomly generated client-side ID, which is hashed on the server side and used to check whether that batch has already been received, just to stop any possible duplicate AJAX requests.
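For illustration, a minimal sketch of that batching approach (jQuery assumed; `/log`, `makeBatchId()`, and `csrfToken` are hypothetical stand-ins for your own endpoint and CSRF plumbing):

```javascript
// Buffer events client-side and flush in batches, rather than posting
// on every click/move. Values are joined with | as described above.
var buffer = [];

$('body').on('click mousemove', function (e) {
  buffer.push(e.type + ',' + e.pageX + ',' + e.pageY + ',' + Date.now());
});

function flush() {
  if (buffer.length === 0) return;
  $.post('/log', {
    id: makeBatchId(),        // random client-side ID, hashed server-side for dedup
    csrf: csrfToken,          // assumed to be rendered into the page
    events: buffer.join('|')
  });
  buffer = [];
}

setInterval(flush, 3500);        // fire off an AJAX query every 3.5 seconds
$(window).on('unload', flush);   // and one final time on page unload
```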
On the server side, the data is stored in a MySQL table using the ARCHIVE storage engine and written with INSERT DELAYED.
Eventually I'll probably move the logging system to its own EC2/RDS group.
The reasoning behind all this is to be able to see the most popular features of the website and who is clicking where (say there are two home buttons: which one is more popular, etc.).
Hope this helps anyone else stuck in this predicament.
I am building a WebApp (ERP) and I need to display the people currently logged in and active on the page. I managed to get something pretty accurate by listening for mouse/keyboard events and periodically reporting to the DB.
I don't know how to mark people offline when they close the page. I tried using onbeforeunload, but it also fires when the user simply changes pages (clicks a link inside the ERP that points to another page in the ERP).
I then tried to use WebSockets, but the problem is the same: every time the page is reloaded, the WebSocket connection is closed.
So I can think of two ways:
Use WebSockets after all, and replace all links with a call to a JavaScript function that would somehow tell the server that the user is about to change pages (so that the server doesn't mark them as offline). But that doesn't feel right; semantically speaking, links should be links that simply point to another location.
Use either WebSockets or AJAX and never actually change pages: links are replaced by a function that fetches the content and displays it on screen (updating the DOM with JavaScript). But that doesn't feel right either, because semantically the page would have no meaning and the URL would never change, so the user couldn't copy-paste a link to the page to refer to it, right?
So, is there a proper, clean way of doing this? Thanks for your help.
If each of your pages has a webSocket connection to your server, then on the server you can see when any given page is closed by seeing that the webSocket gets closed.
To avoid thinking that a user has left the site when they are just navigating from one page in your site to another, you simply need to add a delay server-side so that you only report that the user has left your site if there has been no webSocket connection from this user for some time period (probably at least a few seconds).
So, on your server when you detect that the last webSocket connection for this user has been closed, you set a timer for some number of seconds. If the user opens up another page on your site (either via navigation or just opens another page) before the timer goes off, you cancel the timer and count the user as still connected. If the timer goes off, then you now know that the user has been away from your site for whatever time period you picked (say 10 seconds) and to you, this will signify that they have left the site. You can control how long you want that time period to be before you decide that, yes they are gone.
All attempts at trying to "see" the user leaving your page IN the browser with Javascript are going to have holes in them because there are always ways for a web page to get closed without your client-side javascript having a chance to tell your server. The beauty of the webSocket solution is that the browser automatically and reliably tells your server when the page is now gone because it closes the webSocket and your server receives the notification that the socket has been closed.
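To make that concrete, here's a minimal sketch of the server-side grace timer, assuming Node with socket.io (any WebSocket library works the same way; `getUserId()` and `markUserOffline()` are hypothetical helpers you'd supply):

```javascript
var offlineTimers = new Map();  // userId -> pending "user left" timer
var connections = new Map();    // userId -> number of currently open sockets

io.on('connection', function (socket) {
  var userId = getUserId(socket);
  connections.set(userId, (connections.get(userId) || 0) + 1);

  // The user came back before the grace period expired: cancel the timer.
  if (offlineTimers.has(userId)) {
    clearTimeout(offlineTimers.get(userId));
    offlineTimers.delete(userId);
  }

  socket.on('disconnect', function () {
    var remaining = (connections.get(userId) || 1) - 1;
    connections.set(userId, remaining);
    if (remaining === 0) {
      // Last socket closed: wait 10 s before declaring the user offline,
      // in case they are just navigating between pages.
      offlineTimers.set(userId, setTimeout(function () {
        offlineTimers.delete(userId);
        markUserOffline(userId);  // hypothetical: update DB / notify others
      }, 10000));
    }
  });
});
```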
As I understand it, you want to count the users active on your website/pages.
Identify the user (a computed ID that is ~99% unique):
http://valve.github.io/blog/2013/07/14/anonymous-browser-fingerprinting/ (you can use another library; there are a few).
On each page, send the computed ID (from the fingerprinting library) to the server via WebSocket/AJAX: once at page load (meaning the user is navigating), and then periodically (say every 60 seconds, or a shorter interval, meaning the user is staying on the page).
On the server, keep a list of IDs with a last-seen timestamp (stored in a database or session), refreshed each time an ID checks in.
On your website, periodically retrieve (every 60 seconds via AJAX/WebSocket) the count of IDs whose timestamp falls within, say, the last 120 seconds.
To know whether a user is logged in, and on which page:
send the server an object like {fingerprint: 123123124234, logged: true, page: 'home'}.
Clear your list if you are not storing the users in a database:
a separate server-side thread accesses the object and destroys all entries older than 10 minutes, or whatever your page session is set to.
js timer: http://www.w3schools.com/jsref/met_win_setinterval.asp
Hope it's helpful; I did something similar, using a timer at 5 minutes to tell the server the user is still on the page, plus a signal at page load. That gave me the count of users within a 60-second window, and even the names of the users present on each page :)
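A rough client-side sketch of that heartbeat, where `computeFingerprint()` stands in for whatever fingerprinting library you pick (its exact API is an assumption here) and `/presence` is a hypothetical endpoint:

```javascript
var fingerprint = computeFingerprint();  // ~99% unique ID, as described above

function ping() {
  // Tell the server this user is still here, and on which page.
  $.post('/presence', {
    fingerprint: fingerprint,
    logged: true,
    page: window.location.pathname
  });
}

ping();                    // once at page load (the user is navigating)
setInterval(ping, 60000);  // then every 60 s while they stay on the page
```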
Somebody has already posted this kind of question; hope it helps:
Detect if user has closed ALL windows for a website?
I have an HTML5/JavaScript application in which multiple users can be viewing the same set of data at any given time. For the sake of a real-world example, let's say it's a calendar-type page.
So user1 has the browser open and is looking at the calendar page, and user2 is also on the calendar page. User2 makes a change to the calendar, and I'd like (as quickly as possible) for those changes to be recognized and refreshed on user1's screen. What is the best way to do this?
I'm thinking about having a MySQL table for active users that stores the page they are currently on and a timestamp of its last update, then using AJAX calls to ping the server every few seconds and check for an updated timestamp; if it's newer than what they have client-side, the new data gets sent and the page "reloaded." I am putting reloaded in quotes because the actual browser window will not be refreshed; rather, a function will be called via JavaScript that reloads the content. Sort of the way Stack Overflow performs its update checks, but instead of telling the user the page has changed and providing a button to reload, it should happen automatically. If user1 is working away on the calendar, it seems it might be quite annoying for user2's screen to constantly be refreshing...
Is this a horrible idea? Is pinging the server with an AJAX request every few seconds going to cause major slowdowns? Is there a better way to do this? I would like the views on both users' sides to be real-time, because it's important that user1 not be able to update an element on the calendar page that user2 has already changed.
Update: based on some WebSockets research, it doesn't seem like the proper solution. First, it's not compatible with older browsers, and I support IE8+; second, I don't need real-time updates for all users on the site. The site is an account-based application, and an account can have multiple users. The data needs to sync between those users only. Any other recommendations would be great.
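For what it's worth, a minimal sketch of the polling idea from the question, which works on IE8 via jQuery (`/calendar/last-modified`, `accountId`, `initialTimestamp`, and `reloadCalendar()` are all hypothetical):

```javascript
var lastSeen = initialTimestamp;  // assumed: rendered into the page server-side

function checkForChanges() {
  $.ajax({
    url: '/calendar/last-modified',
    data: { account: accountId },          // sync only within this account
    success: function (serverTimestamp) {
      if (serverTimestamp > lastSeen) {
        lastSeen = serverTimestamp;
        reloadCalendar();                  // re-render the view via JavaScript
      }
    }
  });
}

setInterval(checkForChanges, 5000);        // ping the server every few seconds
```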
You need a realtime app for this. You should have a look at Socket.IO. Every time a user logs in, you make them listen for changes on the server. Then, when something changes on the server, every listening user is notified.
You can find examples on the official website: http://socket.io/
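A minimal sketch of that pattern with Socket.IO (the event name and `applyChange()` are made up for illustration):

```javascript
// Server side (Node + socket.io): relay one user's edit to everyone else.
io.on('connection', function (socket) {
  socket.on('calendar-change', function (change) {
    socket.broadcast.emit('calendar-change', change);
  });
});

// Client side: listen for other users' edits and update the view in place.
var socket = io();
socket.on('calendar-change', function (change) {
  applyChange(change);  // hypothetical: patch the DOM with the new data
});
```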
I'm working with a 3rd-party payment service that sends my user back to my site after completing a payment on their site. It sends the user back with JavaScript (setTimeout and location='my_site.com?data'), but there's also a button the user can click if it takes too long. The double request happens when the browser is already loading my site because of the JavaScript and the user clicks the button.
Can I prevent the request from being handled twice? Or should I simply handle it twice?
Right now, I show a "request is already handled" page when the same request comes in twice, but because the two requests arrive nearly simultaneously, the user never sees the page that says the request was handled correctly.
Ideally the button should be disabled at your service provider, but otherwise you'd have to manage it gracefully on your side...
In this case you want your user to get the correct feedback and from what you've said you are aware that they currently lose the response to the first request.
So what you probably want to do is, for any subsequent request (possibly only within a certain time frame), check whether the event has already been handled. If it has, you may want to verify that the handling is consistent (i.e. that the data is the same both times, and deal with it appropriately if the data is in fact different).
If it's all the same, you can display the "handled correctly" page whether it is the first, second, or tenth time they went to the page.
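A rough Express-style sketch of that idempotent handling (`orders` and `sameData()` are hypothetical; in production you'd also want a unique constraint on the payment reference so truly simultaneous requests can't both insert):

```javascript
app.get('/payment-return', async function (req, res) {
  var ref = req.query.ref;                 // reference sent back by the provider
  var existing = await orders.find(ref);   // was this payment already handled?

  if (existing) {
    // Second (or tenth) request: verify consistency, then show the same
    // "handled correctly" page instead of an error.
    if (!sameData(existing, req.query)) {
      return res.status(409).send('Conflicting payment data');
    }
    return res.render('payment-success', existing);
  }

  var order = await orders.create(ref, req.query);
  res.render('payment-success', order);
});
```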
Other techniques to handle this may include making sure that your landing page returns faster. Presumably the problem is that a reasonable amount of work is being done on your side, which is causing them to give up and go for the click as well. If you can reduce the time to display content, that may solve your problem in a slightly better way. This might involve returning a small page that has your site branding and says "please wait while we process your order" or similar. That puts control in your hands and allows you to deal more intelligently with the double-click scenario from the beginning.
The exact solution will depend on your situation, though.
Just disable the submit button before calling form.submit():
document.getElementById("submit").disabled = true;
Using Python, I built a scraper for an ASP.NET site (specifically a Jenzabar course searching portlet) that would create a new session, load the first search page, then simulate a search by posting back the required fields. However, something changed, and I can't figure out what, and now I get HTTP 500 responses to everything. There are no new fields in the browser's POST data that I can see.
I would ideally like to figure out how to fix my own scraper, but that is probably difficult to ask about on StackOverflow without including a ton of specific context, so I was wondering if there was a way to treat the page as a black box and just fire click events on the postback links I want, then get the HTML of the result.
I saw some answers on here about scraping with JavaScript, but they mostly seem to focus on waiting for JavaScript to load and then returning a normalized representation of the page. I want to simulate the browser actually clicking on the links and following the same path to execute the request.
Without knowing any specifics, my hunch is that you are using a hardcoded session ID, and the web server's app domain recycled and created new encryption/decryption keys, rendering your hardcoded session ID (which was encrypted with the old keys) useless.
You could try using Firebug's Net tab to monitor all requests, browse around manually, and then diff the requests you generate by hand against the ones your screen scraper is generating.
If you are just trying to simulate a real browser, you might want to check out something like Selenium, which drives an actual browser and handles postbacks the way a browser does.
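As a sketch of that approach, here's Selenium driving a real browser through a postback (shown with the Node selenium-webdriver bindings for consistency with the rest of this page; the Python bindings are equivalent, and the URL, link text, and title are placeholders):

```javascript
const { Builder, By, until } = require('selenium-webdriver');

(async function run() {
  const driver = await new Builder().forBrowser('firefox').build();
  try {
    await driver.get('https://example.edu/course-search');    // placeholder URL
    // Clicking the real link lets ASP.NET fill in __VIEWSTATE, __EVENTTARGET,
    // etc. exactly as a browser would, instead of you reconstructing them.
    await driver.findElement(By.linkText('Search')).click();  // placeholder text
    await driver.wait(until.titleContains('Results'), 10000); // placeholder title
    const html = await driver.getPageSource();                // post-postback HTML
    console.log(html.length);
  } finally {
    await driver.quit();
  }
})();
```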
I know a few sites (such as my bank and my school) that kill a session after it has been idle for a set amount of time. It is my understanding that session activity is determined by users following links, or at the very least by some kind of active interaction, like updating a form via AJAX. Basically, the server gets a request to do something during the session and extends the session another 15 minutes.
But on some occasions I have lost major amounts of time and info while filling out a text box or reading some long set of instructions along the way.
So why not have an ajax script that listens for keyboard activity and mouse movement and lets the server know that the user is still there and active, even if they aren't clicking a submit button or following a link?
I was wondering if anyone knew of respectable sites that already do this, or if I was overlooking some major security hazard with this idea.
The only things I can imagine being risky are the random acts of cats, vibrating electronics nearby, or a hyperactive child.
But in all of the above, the user is most likely at home and (unless they are actively trying to get exploited) has probably minimized the window, so these things are very unlikely to trigger an event.
Does anybody see any other major risks?
Typical AJAX sites are making posts back to the server anyway; those requests already renew the user's session.
If you fire a request on every keystroke or mouse click, how many times are you going to be posting to the server? If I am typing in a form field like I am now, you could potentially have a ping to the server for each letter I type; not a very efficient solution. On the other end, what if your user is just sitting there reading, or typing their text into an external editor to copy into your form later?
I think the more typical solution for a friendly UI, so that long posts do not get dropped because of session expiration, is an auto-save feature. Google Docs does this: every few seconds/minutes, it posts the contents of the editor back to the server without the user actually clicking save/submit. The other option is to inform the user that their session is about to expire (this could be done with a JavaScript timeout) and provide a link to ping the server and renew the session.
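Both options are only a few lines with jQuery; a sketch (`/draft`, `/keep-alive`, `#editor`, and the 15-minute timeout are all assumptions):

```javascript
// Auto-save: periodically post the draft so nothing is lost if the
// session does expire.
setInterval(function () {
  $.post('/draft', { body: $('#editor').val() });
}, 30000);  // every 30 seconds

// Or: warn the user shortly before the session expires and let them renew it.
setTimeout(function () {
  if (confirm('Your session is about to expire. Stay signed in?')) {
    $.get('/keep-alive');  // lightweight request that renews the session
  }
}, 14 * 60 * 1000);        // one minute before a 15-minute timeout
```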
Your solution lends itself to the same problem: you are relying on user behavior. In the first case, navigating between pages and in the second, mouse clicks.
I used to work for a company that ran a lot of online contests where users would have to enter essay content as well as shorter blocks of data; we used to raise the session timeout (session.setMaxInactiveInterval()) for the user's session when they hit the "long-winded" page so that they would have more time to edit, and then set it back to normal after the submit.
Later at that company, and at a couple of others I worked at, I proposed a solution similar to what you are describing, but for various reasons it was never accepted. It was never considered a bad idea, just not one we chose. Basically it was going to be an AJAX call on a timer, so that just before the session timed out, it would fire off a lightweight AJAX "ping" and keep the session alive as long as that page was open. I have never had the chance to implement it in the "real world", so perhaps there are negatives I have not thought of.
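For reference, that timed ping is only a couple of lines (the endpoint and session length are assumptions):

```javascript
var SESSION_TIMEOUT = 15 * 60 * 1000;  // assumed server-side session length

setInterval(function () {
  $.get('/keep-alive');                // lightweight ping that renews the session
}, SESSION_TIMEOUT - 60 * 1000);       // fire one minute before the timeout
```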
Good luck.