Handling server-side aborted long-poll/comet updates in jquery - javascript

I have an application which uses an open jQuery Ajax connection to do long-polling/comet handling of updates.
Sometimes the browser and the server lose this connection (server crashes, network problems, and so on).
I would like the client to detect that the updates have stopped and inform the user to refresh the page.
It originally seemed that I had 2 options:
handle the 'error' condition in the jQuery ajax call
handle the 'complete' condition in the jQuery ajax call
On testing, however, it seems that neither of these handlers is triggered when the server aborts the query.
How can I get my client to understand that the server has gone away?

Isn't it possible to add a setInterval() function that runs every few seconds or minutes? That way you can trigger a script that checks whether the server is still up, and if not, reset the comet connection. (I don't know what you use for the long-polling exactly though, so I don't know if it's possible to reset that connection without a page reload. If not, you can still display a message to the user).
So something like this:
var check_server = setInterval(function () {
    // Hit a lightweight endpoint to see whether the server is still up.
    // '/ping' is a placeholder; point this at any cheap URL on your server.
    $.ajax({ url: '/ping', timeout: 5000 })
        .fail(function () {
            // offline: reset the comet connection, or tell the user to refresh
        });
}, 60000);

Related

Run a bash script through CGI on closing of browser window [duplicate]

I'm trying to find out when a user left a specified page. There is no problem finding out when they used a link inside the page to navigate away, but I also need to detect when they closed the window or typed another URL and pressed enter. The second one is not so important, but the first one is. So here is the question:
How can I see when a user closed my page (capture the window.close event), and then... the rest doesn't really matter (I need to send an AJAX request, but if I can get it to run an alert, I can do the rest).
Updated 2021
TL;DR
Beacon API is the solution to this issue (on almost every browser).
A beacon request is supposed to complete even if the user exits the page.
When should you trigger your Beacon request?
This will depend on your use case. If you are looking to catch any user exit, visibilitychange (not unload) is the last event reliably observable by developers in modern browsers.
NB: As long as implementation of visibilitychange is not consistent across browsers, you can detect it via the lifecycle.js library.
<!-- lifecycle.js (1K) for cross-browser compatibility -->
<!-- https://github.com/GoogleChromeLabs/page-lifecycle -->
<script defer src="/path/to/lifecycle.js"></script>
<script defer>
lifecycle.addEventListener('statechange', function(event) {
    if (event.originalEvent == 'visibilitychange' && event.newState == 'hidden') {
        var url = "https://example.com/foo";
        var data = "bar";
        navigator.sendBeacon(url, data);
    }
});
</script>
Details
Beacon requests are supposed to run to completion even if the user leaves the page - switches to another app, etc - without blocking user workflow.
Under the hood, it sends a POST request along with the user credentials (cookies), subject to CORS restrictions.
var url = "https://example.com/foo";
var data = "bar";
navigator.sendBeacon(url, data);
The question is when to send your Beacon request. Especially if you want to wait until the last moment to send session info, app state, analytics, etc.
It used to be common practice to send it during the unload event, but changes to page lifecycle management - driven by mobile UX - killed this approach. Today, most mobile workflows (switching to new tab, switching to the homescreen, switching to another app...) do not trigger the unload event.
If you want to do things when a user exits your app/page, it is now recommended to use the visibilitychange event and check for transitioning from passive to hidden state.
document.addEventListener('visibilitychange', function() {
    if (document.visibilityState == 'hidden') {
        // send beacon request
    }
});
The transition to hidden is often the last state change that's reliably observable by developers (this is especially true on mobile, as users can close tabs or the browser app itself, and the beforeunload, pagehide, and unload events are not fired in those cases).
This means you should treat the hidden state as the likely end to the user's session. In other words, persist any unsaved application state and send any unsent analytics data.
Details of the Page Lifecycle API are explained in this article.
However, implementation of the visibilitychange event, as well as the Page lifecycle API is not consistent across browsers.
Until browser implementation catches up, using the lifecycle.js library and page lifecycle best practices seems like a good solution.
For more numbers about the reliability of vanilla page lifecycle events (without lifecycle.js), there is also this study.
Adblockers
Adblockers seem to have options that block sendBeacon requests.
Cross site requests
Beacon requests are POST requests that include cookies and are subject to CORS spec. More info.
There are unload and beforeunload JavaScript events, but these are not reliable for an Ajax request (it is not guaranteed that a request initiated in one of these events will reach the server).
Therefore, doing this is strongly discouraged, and you should look for an alternative.
If you definitely need this, consider a "ping"-style solution: send a request every minute, basically telling the server "I'm still here". Then, if the server doesn't receive such a request for more than two minutes (you have to take into account latencies etc.), you consider the client offline.
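A minimal client-side sketch of that ping, in the same jQuery style as the rest of this page (the /ping endpoint and the client id are placeholders, not something from the answer):
var clientId = "abc123"; // placeholder identifier for this client

// Tell the server "I'm still here" once a minute; '/ping' is a
// hypothetical endpoint the server would expose for this purpose.
setInterval(function () {
    $.post('/ping', { id: clientId });
}, 60 * 1000);
The server-side half of this scheme is sketched further down the page, under the answer that lists the steps explicitly.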
Another solution would be to use unload or beforeunload to do a Sjax request (Synchronous JavaScript And XML), but this is strongly discouraged. Doing this will basically freeze the user's browser until the request completes, which users will not like (even if the request takes little time).
1) If you're looking for a way that works in all browsers, then the safest way is to send a synchronous AJAX request to the server. It is not a good method, but at least make sure that you are not sending too much data to the server, and that the server is fast.
2) You can also use an asynchronous AJAX request, and use the ignore_user_abort function on the server (if you're using PHP). However, ignore_user_abort depends a lot on server configuration. Make sure you test it well.
3) For modern browsers you should not send an AJAX request. You should use the new navigator.sendBeacon method to send data to the server asynchronously, without blocking the loading of the next page. Since you want to send data to the server before the user moves away from the page, you can use this method in an unload event handler.
$(window).on('unload', function() {
var fd = new FormData();
fd.append('ajax_data', 22);
navigator.sendBeacon('ajax.php', fd);
});
There also seems to be a polyfill for sendBeacon. It resorts to sending a synchronous AJAX request if the method is not natively available.
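The fallback presumably looks something like this rough sketch (illustrative only, not the actual polyfill's code):
// Illustrative fallback: a synchronous XHR blocks page teardown
// just long enough for the request to leave the browser.
if (!('sendBeacon' in navigator)) {
    navigator.sendBeacon = function (url, data) {
        var xhr = new XMLHttpRequest();
        xhr.open('POST', url, false); // false = synchronous
        try {
            xhr.send(data);
            return true;
        } catch (e) {
            return false;
        }
    };
}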
IMPORTANT FOR MOBILE DEVICES: Please note that the unload event handler is not guaranteed to fire on mobile. But the visibilitychange event is guaranteed to be fired. So for mobile devices, your data collection code may need a bit of tweaking.
You may refer to my blog article for the code implementation of all the 3 ways.
I also wanted to achieve the same functionality and came across this answer from Felix ("it is not guaranteed that a request initiated in one of these events will reach the server").
To make the request reach the server, we tried the code below:
onbeforeunload = function() {
    // Your code goes here.
    return "";
}
We are using the IE browser, and now when the user closes the browser, they get a confirmation dialogue because of the return "";. The wait for the user's confirmation gives the request enough time to reach the server.
Years after posting the question, I made a much better implementation using Node.js and socket.io (https://socket.io). (You can use any kind of socket for that matter, but that was my personal choice.)
Basically I open up a connection with the client, and when it hangs up I just save data / do whatever I need. Obviously this cannot be used to show anything / redirect the client (since you are doing it server-side), but it is what I actually needed back then.
io.on('connection', function(socket){
    socket.on('disconnect', function(){
        // Do stuff here
    });
});
So... nowadays I think this would be a better approach than the unload version (although harder to implement, because you need Node, sockets, etc.; it is not that hard, though, and should take 30 minutes or so your first time).
The selected answer is correct that you can't guarantee that the browser sends the xhr request, but depending on the browser, you can reliably send a request on tab or window close.
Normally, the browser closes before xhr.send() actually executes. Chrome and Edge look like they wait for the JavaScript event loop to empty before closing the window. They also fire the xhr request in a different thread than the JavaScript event loop. This means that if you can keep the event loop full for long enough, the xhr will successfully fire. For example, I tested sending an xhr request, then counting to 100,000,000. This worked very consistently in both Chrome and Edge for me. If you're using AngularJS, wrapping your call to $http in $apply accomplishes the same thing.
IE seems to be a little different. I don't think IE waits for the event loop to empty, or even for the current stack frame to empty. While it will occasionally send a request correctly, what seems to happen far more often (80%-90% of the time) is that IE will close the window or tab before the xhr request has completely executed, which results in only a partial message being sent. Basically the server receives a post request, but there's no body.
For posterity, here's the code I used attached as the window.onbeforeunload listener function:
var xhr = new XMLHttpRequest();
xhr.open("POST", <your url here>);
xhr.setRequestHeader("Content-Type", "application/json;charset=UTF-8");
var payload = {id: "123456789"};
xhr.send(JSON.stringify(payload));
// Busy-wait to keep the event loop occupied so the browser
// doesn't tear the page down before the request goes out.
for(var i = 0; i < 100000000; i++) {}
I tested in:
Chrome 61.0.3163.100
IE 11.608.15063.0CO
Edge 40.15063.0.0
Try this one. I solved this problem in JavaScript by sending an ajax call to the server on browser or tab closing. I had a problem with page refreshes, because the onbeforeunload function also fires on refresh. performance.navigation.type == 1 should isolate a refresh from a close (at least in Mozilla browsers).
$(window).bind('mouseover', (function () { // pointer is over the DOM: don't treat as leaving
    window.onbeforeunload = null;
}));

$(window).bind('mouseout', (function () { // pointer left the DOM: arm the handler
    window.onbeforeunload = ConfirmLeave;
}));

function ConfirmLeave() {
    if (performance.navigation.type == 1) { // detecting page refresh (doesn't work in every browser)
    }
    else {
        logOutUser();
    }
}

$(document).bind('keydown', function (e) { // detecting Alt+F4 closing
    if (e.altKey && e.keyCode == 115) {
        logOutUser();
    }
});

function logOutUser() {
    $.ajax({
        type: "POST",
        url: GWA("LogIn/ForcedClosing"), // example controller/method
        async: false
    });
}
I agree with Felix's idea, and I solved my problem with that solution. Now I want to spell out the server-side part:
1. Send a request from the client side to the server.
2. Save the time of the last received request in a variable.
3. Check the server time and compare it to the variable of the last received request.
4. If the result is more than the time you expect, start running the code you want to run when the window is closed (a sketch of these steps follows below)...
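A minimal Node.js/Express sketch of those four steps (the /ping route, the two-minute threshold, and the 30-second sweep are assumptions for illustration, not part of the answer):
var express = require('express');
var app = express();

var lastSeen = {}; // client id -> timestamp of last received request (step 2)

// Step 1: the client hits this endpoint periodically.
app.post('/ping', function (req, res) {
    lastSeen[req.query.id] = Date.now();
    res.sendStatus(204);
});

// Steps 3 and 4: compare server time against each client's last request.
setInterval(function () {
    var now = Date.now();
    Object.keys(lastSeen).forEach(function (id) {
        if (now - lastSeen[id] > 2 * 60 * 1000) { // threshold is an assumption
            delete lastSeen[id];
            // ...run the "window was closed" code for this client here
        }
    });
}, 30 * 1000);

app.listen(3000);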
Use:
<body onUnload="javascript:">
It should capture everything except shutting down the browser program.

Node.js error handling with socket.emit

I have some Node.js client-side code like this:
socket.emit('clickAccept', { myrecid: recid });
Server side node.js code gets it fine and all is well.
If I take the server down to simulate a server side outage, then click the button that fires this socket.emit on the client side, this happens:
Nothing really, I guess it might eventually time out
When I bring the server back up, the clicks end up being sent to the server and the server acts on them (TCP-like, I guess).
What I want is for those socket.emit calls to die after a short timeout and not be sent when the server comes back up. Otherwise it causes all sorts of confusion: if they click 3 times, nothing happens, then when/if the connection or server comes back up they get 3 reactions all at once.
Also, if they click and it times out because the server is down, I would like to show an error to the client user to let them know that basically the click didn't work and to try again.
I know how to act on and show an error if the socket goes down but I don't want to do this if they aren't trying to click something at that time. No sense is firing errors at the user because the socket went down briefly if they have no need to do anything at that moment.
So, to be clear, I only want to show an error if they click on the button and the socket between the client and server is down. AND... If they get an error, I want to kill that emit, not save it all up and fire it and all the other clicks when the server comes back up a few seconds later.
Thanks in advance and I hope that was at least reasonably clear.
The root of your issue is that socket.io attempts to buffer any data that it can't currently send to the server (because the connection is down) and, when the server comes back up and the connection is restored, it then sends that data.
You can see the technical details for how this works here: socket.io stop re-emitting event after x seconds/first failed attempt to get a response
You have several implementation options:
If socket.io already knows the client is not connected to the server, then don't buffer the data (perhaps even give you back an error to show to your user).
When socket.io reconnects and there was data buffered while the connection was down, clear that data and throw it away so old data isn't sent on a reconnect (see the sketch after this list).
Implement a timeout and do one of the above after it fires.
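For the second option, a hedged sketch: socket.io-client keeps queued packets in socket.sendBuffer, so emptying it on reconnect drops the stale clicks (the property and where the 'reconnect' event fires vary across socket.io versions, so verify against yours):
// Drop anything socket.io queued while the connection was down,
// so old clicks are not replayed on reconnect. sendBuffer is
// socket.io-client internal state; confirm it exists in your version.
socket.on('reconnect', function () {
    socket.sendBuffer = [];
});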
So, to be clear, I only want to show an error if they click on the button and the socket between the client and server is down. AND... If they get an error, I want to kill that emit, not save it all up and fire it and all the other clicks when the server comes back up a few seconds later.
Probably, the simplest way to do that is to implement a version of what is shown in the above referenced answer:
Socket.prototype.emitWhenConnected = function(msg, data) {
    if (this.connected) {
        this.emit(msg, data);
        return null;
    } else {
        return new Error("not connected");
    }
}
Then, switch your code from using .emit() to use .emitWhenConnected() and check the return value when using it. If the return value is null, then no error was detected. If the return value is not null, then there was an error.
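Usage with the clickAccept emit from the question might look like this (the error display is just a placeholder):
var err = socket.emitWhenConnected('clickAccept', { myrecid: recid });
if (err !== null) {
    // connection is down: the click was not sent and will not be replayed
    alert('Not connected - please try again.'); // placeholder error display
}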
Thanks for the other answers and help. I ended up solving this in a super simple way. See below:
if (socket.connected) {
    // Do your thing here
} else {
    // Throw error here that tells the user their internet is likely down
}
Hope this helps someone out there. It was a huge improvement in our code to make sure that users get proper feedback when they have brief network/internet outages.

Will ajax in beforeunload reliably execute?

I have an HTML5 application that needs to send a disconnect ajax request when the user changes/refreshes the page. I am currently using this code:
window.addEventListener("beforeunload", function(event) {
    $.ajax({
        url: api_disconnect,
        data: { identifier: token },
        method: "GET"
    });
});
I don't need to process the response, or even ensure that the browser receives a response. My question is, can I rely on the server receiving the request?
And if not, how can I accomplish this? Currently I have the app send an "I'm alive!" request every 15 seconds (which already feels like too much). I want the server to know the second the user disconnects.
To clarify, I know that if the browser/computer crashes there's nothing I can do about that. That's what the heartbeat is for. I just mean in a normal use case, when the user closes/changes/refreshes the page.
You cannot 100% rely on the ajax call getting through. You can test many browsers and operating systems and determine which ones will usually get the ajax call sent before the page is torn down, but it is not guaranteed to do so by any specification.
The heartbeat like you are using is the most common work-around. That will also cover you for a loss in network connection or a power-down or computer sleep mode or browser crash which the beforeunload handler will not.
Another work-around I've seen discussed is to use a socket.io connection to the server. Since the socket.io connection has a small, very efficient heartbeat and the server will see the socket get closed when the page is closed, you kind of get the best of both worlds: you will see an abnormal shut-down via the heartbeat, and you will see a normal shut-down immediately via the WebSocket connection getting closed.
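A minimal sketch of that second work-around on the server, assuming socket.io (the helper is hypothetical, and the exact reason strings differ between socket.io versions):
io.on('connection', function (socket) {
    socket.on('disconnect', function (reason) {
        // A reason like 'transport close' usually means the tab was closed
        // or navigated away; 'ping timeout' usually means the heartbeat
        // caught an abnormal drop. Strings vary by socket.io version.
        markUserDisconnected(socket.id, reason); // hypothetical helper
    });
});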

Is there any JavaScript event which allows an action to be executed continually by the browser?

I'm trying to make a websocket using jQuery which is triggered continually after the page loads. The idea is to get information continually from the server and display it on the web page without any refresh.
What kind of event is this?
Any brilliant ideas, please?
setInterval can be dangerous because of the timing of your http request/response. Better to use chained setTimeouts, e.g.
var tick = function() {
    // do something here
    $('foo').toggle();
    setTimeout(tick, 1000); // wait 1 second and call again
};
tick();
I think an HTML5 Websocket is what you are looking for.
The WebSocket specification defines an API establishing "socket" connections between a web browser and a server. In plain words: there is a persistent connection between the client and the server, and both parties can start sending data at any time.
http://html5rocks.com/en/tutorials/websockets/basics
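A bare-bones sketch of that API (the URL and the #updates element are placeholders):
// 'wss://example.com/updates' is a placeholder endpoint.
var ws = new WebSocket('wss://example.com/updates');

ws.onmessage = function (event) {
    // display server-pushed data without any page refresh
    $('#updates').append('<li>' + event.data + '</li>');
};

ws.onclose = function () {
    // the connection dropped: reconnect or notify the user here
};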

Node.js how to keep connection between page loads

On my website I have a list of all online users, updated in real-time by node.js (I'm using now.js)
The problem is, when a user navigates my site, they of course disconnect for a couple of seconds while the new page is loading, which means they disappear from the list for all other clients, only to pop back in seconds later.
Is there any way to set a timeout on the disconnect function, e.g. if user has not reconnected in 30 seconds, remove from the list otherwise don't?
Or if there is a better way to accomplish this? Can someone please point me in the right direction :)
EDIT:
Came up with a working solution, if anyone would like to know. On server side I have this function
nowjs.on('disconnect', function() {
    everyone.now.clientDisconnected();
});
which whenever a user disconnects calls this function on the client
now.clientDisconnected = function() {
    setTimeout(function() { now.serverUpdateUsers(); }, 20000);
}
So instead of updating the users right away, we wait 20 seconds. By then the user should have finished loading the new page, and no difference will show for all other clients.
serverUpdateUsers() is the server-side function that gathers all user data and pushes it out to all clients.
I'm not exactly sure if you can modify Socket.IO's settings through now.js (which uses Socket.IO), but if you can (not sure, I've never used now.js), you should set the heartbeat interval to a larger value:
https://github.com/LearnBoost/Socket.IO/wiki/Configuring-Socket.IO
heartbeat interval defaults to 20 seconds
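For the 0.x-era API that the linked wiki describes, raising it would look roughly like this (a sketch; later Socket.IO versions renamed these options, e.g. to pingInterval/pingTimeout, and the numbers here are illustrative):
// Socket.IO 0.x style configuration (matches the linked wiki).
io.configure(function () {
    io.set('heartbeat interval', 40); // seconds between heartbeats
    io.set('heartbeat timeout', 60);  // seconds before the client is considered gone
});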
