AJAX onreadystatechange: navigate away and save changes at the same time

When a user clicks a link, I would like to send an AJAX request to save the contents of the current page, and navigate away at the same time.
Typically, when the window navigates away, all pending AJAX requests are stopped, but that may or may not mean that the server has processed the request. If the AJAX request is aborted too soon, the changes will not be saved.
The valid readyState values, according to W3Schools, are:
1: server connection established
2: request received
3: processing request
4: request finished and response is ready
Should I wait for number 2 or number 3 to ensure the request goes through in major browsers before navigating away?
I acknowledge the risk that by not confirming a successful save at number 4, I risk not letting the user know about a failure to save the changes. But the code is very stable, so once the server receives the request, I am almost 100% sure that if the changes are not saved, the user will have no recourse anyway (the post was deleted or locked or something like that, and the changes are not that important anyway).
The only problem is: if there is an internet connection failure, I need to at least know about that failure in major browsers.
Do I have to wait for number 4 to know about that?
Assuming I don't even care about connection failures, which one should I wait for?

Yes, wait for 4 and check the response. You could pass something back from your server in the POST/GET response to indicate success, then change window.location. Be sure to call preventDefault if you're clicking a link to trigger your AJAX request.
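For illustration, here's a minimal sketch of that approach with a raw XMLHttpRequest; the /save endpoint, the save-link id, and the getPageContents() helper are assumptions, not part of the question:

document.getElementById('save-link').addEventListener('click', function (evt) {
    evt.preventDefault(); // stop the navigation until the save completes
    var href = this.href;
    var xhr = new XMLHttpRequest();
    xhr.open('POST', '/save'); // hypothetical save endpoint
    xhr.setRequestHeader('Content-Type', 'application/x-www-form-urlencoded');
    xhr.onreadystatechange = function () {
        if (xhr.readyState === 4) { // request finished and response is ready
            // check xhr.status / the response body here if you want to warn
            // the user about a failed save before leaving
            window.location = href;
        }
    };
    xhr.send('content=' + encodeURIComponent(getPageContents())); // getPageContents() is hypothetical
});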

Related

Differentiate browser close event and logout

1) I need the following requirement to be satisfied:
The client's request (a long-running process) should wait until the server has finished serving the request.
Current solution:
The client initiates the request, followed by a ping request every 5 seconds to check the request status; the ping also maintains the session.
2) If the client moves to another tab in the application and comes back, the client should still show the process status, and the server should continue working on the request.
3) If the client closes the browser or logs out, the server should stop the process.
PS: The functionality is needed in all browsers after IE 9, plus Chrome and Firefox.
There are many ways to skin a cat, but this is how I would accomplish it.
1. Assign a unique identifier to the request (you have most likely done this already, since you're requesting the ready state every few seconds).
2. Set a member of the user's session data to the unique ID.
3. Have all your pages load the JS needed to continually check the process, but the JS should NOT use any identifier (a sketch of the ping loop follows this list).
4. In the script that parses the AJAX request, have it check the session for the unique identifier, update an internal system (file or database) with the unique identifier and the time of the last request, and push back details if there are details to be pushed.
5. In another system (a cron job, for example) or within the process itself (if it runs in a loop), check the same database or file system for the last timestamp recorded against the unique identifier. If the timestamp is too old, say 15 seconds (remember that page load times may delay the 5-second interval), kill the process from cron, or have the process terminate itself if the check lives inside the process script.
Logging out will destroy the session data, making the updating of the table/file impossible (and a check should be there for this), so within a few seconds of logout the process stops.
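As a rough sketch of the client side of this scheme (the /process-status endpoint and the updateStatusDisplay() helper are assumptions):

// poll every 5 seconds; the server resolves the process from the session,
// so no identifier is passed from the JS
var pollTimer = setInterval(function () {
    var xhr = new XMLHttpRequest();
    xhr.open('GET', '/process-status');
    xhr.onreadystatechange = function () {
        if (xhr.readyState === 4 && xhr.status === 200) {
            updateStatusDisplay(xhr.responseText); // hypothetical UI helper
            if (xhr.responseText === 'done') {
                clearInterval(pollTimer); // stop polling once the process finishes
            }
        }
    };
    xhr.send();
}, 5000);

Because every page loads this script and the server resolves the ID from the session, the polling survives tab changes, and it stops feeding the timestamp as soon as the session is destroyed at logout.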
You will not be able to find a reliable solution for logout. window.onbeforeunload will not allow you to communicate with the server (you can only prompt the user with the built-in dialog, and that's pretty much it). Perhaps, instead of trying to capture logout/abandon, add some logic to the server's process to wait for those pings (maybe allow 30 seconds of no communication before abandoning); that way you're not wasting the server's cycles too much, and you still have the monitoring working as before.
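A minimal sketch of that server-side idea, in Node-style JavaScript with an assumed in-memory map and killProcess() helper:

// record a timestamp on every ping, and abandon the work if the client
// has gone quiet for more than 30 seconds (an assumed threshold)
var lastPing = {}; // processId -> time of the most recent ping

function onPing(processId) {
    lastPing[processId] = Date.now();
}

setInterval(function () {
    var now = Date.now();
    Object.keys(lastPing).forEach(function (processId) {
        if (now - lastPing[processId] > 30000) {
            killProcess(processId); // hypothetical: stop the long-running work
            delete lastPing[processId];
        }
    });
}, 5000);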

handle HTTP timeout for ajax save

I have a JavaScript application that regularly saves new and updated data. However, I need it to work on slow connections as well.
Data is submitted in one single HTTP POST request. The response returns the newly inserted IDs for any newly created records.
What I'm finding is that the submitted data is fully saved, but sometimes the response times out. The browser application therefore does not know the data was submitted successfully and will try to save it again.
I know I can detect the timeout in the browser, but how can I make sure the data is saved correctly?
What are some good methods of handling this case?
I see from here https://dba.stackexchange.com/a/94309/2599 that I could include a pending state:
Get a transaction number from the server
Send the data, which gets saved as pending on the server
If a pending transaction already exists, do not overwrite its data, but send the same results back
If success is received, commit the pending transaction
If an error comes back, retry later
If it times out, retry later
However, I'm looking for a simpler solution.
Really, it seems you need to get to the bottom of why the client thinks the data was not saved when it actually was. If the issue is purely one of timing, then perhaps the client timeout just needs to be lengthened so it doesn't give up too soon, or the amount of data you're sending back in the response needs to be reduced so the response comes back more quickly on a slow link.
But, if you can't get rid of the problem that way, there are a bunch of possibilities to program around the issue:
The server can keep track of the last save request from each client (or a hash of such request) and if it sees a duplicate save request come in from the same client, then it can simply return something like "already-saved".
The code flow in the server can be modified so that a small response is sent back to the client immediately after the database operation has committed (no delays for any other types of back-end operations), thus lessening the chance that the client would timeout after the data has been saved.
The client can coin a unique ID for each save request; if the server sees the same saveID used on multiple requests, it knows the client is just trying to save the same data again.
After any type of failure, before retrying, the client can query the server to see whether the previous save attempt succeeded or failed (a rough sketch of these last two ideas follows).
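To illustrate, a rough sketch combining those two ideas; the /save and /save-status endpoints are assumptions:

// each save gets a client-coined ID; on failure, ask the server whether
// that ID actually landed before resending
function save(data) {
    var saveId = Date.now() + '-' + Math.random().toString(36).slice(2);
    attempt(saveId, data);
}

function attempt(saveId, data) {
    jQuery.ajax('/save', {
        type: 'POST',
        data: { saveId: saveId, payload: JSON.stringify(data) },
        timeout: 30000, // give a slow link extra time before giving up
        error: function () {
            // before retrying, check whether the previous attempt succeeded
            jQuery.get('/save-status', { saveId: saveId }, function (status) {
                if (status !== 'saved') {
                    attempt(saveId, data); // safe: the server dedupes on saveId
                }
            });
        }
    });
}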
You can keep the retry count in a simple global int.
You can also retry automatically, but this isn't great for an auto-save app.
A third option is to use one of the auto-save plugins for jQuery.
A few suggestions:
Increase the timeout, and don't treat a timeout as success.
You can flush the output for each record as soon as you have it, using ob_flush and flush.
Since you are making requests at a regular interval, check the connection_aborted method on each API call; if the client has disconnected, you can save the response in a temp file, and on the next request append the last response to the new response. This method is more resource-consuming, though.

Is an AJAX request killed when a link gets clicked?

I have a website with an AJAX cart. The concept is pretty simple: you end up on a page with a product and can click the Buy Now button. When you do so, the JavaScript code adds the product to the cart, changes the cart visual with the new count, and sends an AJAX request to the server to report the change.
What I'm wondering about, since the client and the server may take a while to process the AJAX request, is... will the client clicking a link to move to another page (i.e. "next product") before the AJAX is reported as successful stop the AJAX request at all?
// prepare request...
...snip...
// send request (usually a POST)
jQuery.ajax(uri, ajax_options);
// return to user
// will a click on a link cancel the AJAX call after this point?
Further, I have timed AJAX requests. If the user clicks on a link before those timed requests happen, they will be lost for sure. Assuming that the click does not cancel an AJAX request, would starting one in the unload event work? Would using a cookie be better/safer than attempting another AJAX request? (although if the user clicks an external link, the unload is really the only solution I can think of to save that data...)
As a side note: I do not want to darken the screen when the user adds an item to the cart so that way the user can continue to do things... but if the AJAX needs to be confirmed before a link can be clicked, I'd have to make sure clicks cannot be used until then.
Update:
I think some of you are missing the point. I do not care about the done() or completed() functions getting called on the client side. What I do care about is making sure that, in the end, all the data reaches the server.
I understand that it's asynchronous, but what I want to make sure of is avoiding loss of data, especially if the link goes to another website (for links within the same website, I am really thinking of using a cookie to make sure that the data from delayed AJAX requests gets to the server no matter what).
Also, the idea of timed data requests is to avoid heavy loads on the server. With a properly timed set of AJAX requests, the client and server both work a lot better!
#meagar summed this up pretty well in the comments
Any pending AJAX requests will be aborted when the browser navigates away from the page.
So depending on how you define "killing" an AJAX request, that means the request may be started, but it also might not have finished. If it's a short request, most likely it won't be aborted by the time it finishes. But if it's a long request (lots of data processing, takes a second or two to complete), then most likely it's going to be aborted somewhere in the middle.
Of course, this all depends on the browser. Typically the request makes it to the server, but the browser aborts it before the response comes through; what happens then depends on the server and how it processes the data.
Some servers will interrupt the execution of your view, where the request's data is being processed and the response is being generated. Many servers will just let the code run and trigger an error when you try to write output to the response, because there is nobody on the other end; you're writing the response to a closed connection.
although if the user clicks an external link, the unload is really the only solution I can think of to save that data
From my own experience, most browsers will allow you to send out a request during the beforeunload event. This is not always true for unload, though, as by that time the page change typically cannot be stopped.
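For illustration, a sketch of firing a save during beforeunload; navigator.sendBeacon (where available) is designed to outlive the page change, while the synchronous-XHR fallback is the older technique and is blocked during unload in some modern browsers (the /cart-sync endpoint and getPendingCartData() are assumptions):

window.addEventListener('beforeunload', function () {
    var payload = JSON.stringify(getPendingCartData()); // hypothetical helper
    if (navigator.sendBeacon) {
        // the browser queues the POST and completes it even as the page unloads
        navigator.sendBeacon('/cart-sync', payload);
    } else {
        // fallback: a synchronous XHR holds the page until the request is sent
        var xhr = new XMLHttpRequest();
        xhr.open('POST', '/cart-sync', false); // false = synchronous
        xhr.setRequestHeader('Content-Type', 'application/json');
        xhr.send(payload);
    }
});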
One way to get around this, especially when the response matters, is to halt the page change after the user clicks the link. This can be as simple as calling evt.preventDefault() on the link's click event, and then redirecting the user to where they wanted to go once the request is finished. Make sure to indicate to the user that their click has not just been ignored and that they're waiting on something to finish first. Users don't want to be left in the dark, so give them some feedback (change the button text, disable it, etc.).
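A minimal sketch of that pattern with jQuery (which the question already uses); the /cart/add endpoint is an assumption:

var pending = null; // the in-flight cart request, if any

function addToCart(productId) {
    pending = jQuery.post('/cart/add', { product: productId })
        .always(function () { pending = null; });
}

// if the user clicks a link while the cart request is in flight,
// hold the navigation and resume it once the request settles
jQuery(document).on('click', 'a', function (evt) {
    if (pending) {
        evt.preventDefault();
        var destination = this.href;
        jQuery(this).text('Saving your cart...'); // feedback, so the click doesn't feel ignored
        pending.always(function () {
            window.location = destination;
        });
    }
});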

Will ajax call return successfully to reloaded web page?

I have a beginner question. If I make an ajax call using JavaScript, and then successfully reload the page before the ajax request gets a response, will the response still work?
It depends what you mean by 'work'. The request will still be sent to the server, and the server will send back a response, but any callback functions which you have assigned to be executed when the response is received will not run, because the function objects that were assigned will have been lost when the page was reloaded.
No.
By reloading the page, all JavaScript (and its parser/engine) is terminated, reset, and initialized again, so any pending AJAX calls are aborted. Hence, existing calls will not work anymore.
Furthermore, all active (HTTP) connections are (or should be) reset, so the server might still process the request (if it arrived in time), but the response is lost due to the aborted connection.
The fact that these things do not work after a reload is a good thing, as the alternative would result in unexpected, error-prone situations!

Ajax patterns - assume success or wait for response

Typically an AJAX interaction involves sending the request, providing feedback to the user that the request is in progress, then handling the response once it arrives.
Waiting for the response is obviously unavoidable when the next action requires the data sent from the server, but what if the interaction is an update to some data on the server, such as sorting the order of a list? Would it be bad practice to assume success? You'd make the request and simply update the DOM on the assumption that the sort will succeed. I imagine you'd then have to provide a rollback method should the request fail, as well as notify the user, but 99% of the time the request should go through and appear instantaneous to the user.
Is this a common pattern and are there any other factors that should be considered other than a rollback and notification method?
Any advice would be much appreciated,
Rich
You probably want to take a look at the command pattern: http://en.wikipedia.org/wiki/Command_pattern. It sounds like you want to modify some data locally and assume that the server gets modified as well. If the AJAX handler fails, you can then roll back the command (and notify the user).
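A minimal sketch of that idea, assuming a hypothetical list-sorting UI with a renderList() helper and a /sort endpoint:

// command object: apply the change optimistically, keep enough state to undo it
function makeSortCommand(currentOrder, newOrder) {
    var previousOrder = currentOrder.slice(); // remember the old state for rollback
    return {
        execute: function () { renderList(newOrder); }, // hypothetical DOM update
        undo: function () { renderList(previousOrder); }
    };
}

function sortList(currentOrder, newOrder) {
    var command = makeSortCommand(currentOrder, newOrder);
    command.execute(); // update the DOM immediately, assuming the server will succeed
    jQuery.ajax('/sort', {
        type: 'POST',
        data: { order: newOrder.join(',') },
        error: function () {
            command.undo(); // roll back the optimistic update
            alert('Sorting failed, so your previous order was restored.');
        }
    });
}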
If the user is to receive feedback about success or failure, then it would be bad practice to assume success.
Just return a success/fail response from the server, and let the user know that their action was successful. You're not really losing anything by this extra trip back from the server, and any request could potentially fail.
