How to perform an ajax call on page unload? - javascript

I have a dashboard where users can toggle some input values in order to configure the appearance of the page.
I want to store those changes in a DB table, so when user comes back, the dashboard appears according to the specific user configuration retrieved from the DB.
I know how to send those values to DB and how to retrieve them
I don't want to make ajax calls every time the user changes a configuration.
Instead, I think this scenario would be better:
Page load (retrieve the DB configuration if it exists)
User toggles the configuration UI items (e.g. checkboxes, selects, etc.) and the appropriate client-side changes take place (some divs get hidden, others are shown, and the config input values are stored in a hidden field), but no ajax call takes place.
When user clicks a link to another page, the configuration input values (which have been stored to the hidden field) are sent to the DB via ajax call.
MY QUESTION
One solution(?) would be the use of onbeforeunload event like this:
window.onbeforeunload = function(e) {
    // Perform the ajax call
    return 'Do you want to save the configuration changes?';
};
But if the user's browser blocks popups, will the function not get executed?
Is there a way to perform an ajax call on onbeforeunload event, without calling a dialog box?

No. During unload, the browser will kill all pending requests, so the AJAX request might or might not arrive at the server. You also can't do it inside the handler of the beforeunload event, because the first A in AJAX means Asynchronous: the request is just queued to be executed eventually, so it will only be looked at after the handler returns. But very soon after that, the browser will kill anything related to the page.
The solution is to always send updates to the server while the user makes changes. Put those into a special "temporary" table from which you can restore the state later.
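A minimal sketch of that approach, assuming jQuery (used elsewhere on this page), a hypothetical /saveConfig endpoint, and configuration inputs marked with a .config-input class; debouncing keeps the server from being hit on every single toggle:
// Debounced autosave: send at most one request per second of inactivity.
// "/saveConfig", ".config-input" and "#configField" are illustrative names.
var saveTimer = null;
$(".config-input").on("change", function () {
    clearTimeout(saveTimer);
    saveTimer = setTimeout(function () {
        $.post("/saveConfig", { config: $("#configField").val() });
    }, 1000);
});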
You could also use something like localStorage in the browser, but then the data wouldn't move with the user. For example, if they were working on a tablet and it broke or had a power loss, they could move to their PC to continue where they left off.

You can't guarantee that an Ajax call will complete in this way; see Aaron's response. My suggestion would be to use something like localStorage to read / write the user's settings instead.
If you need a user's appearance to persist across multiple devices, add a periodic request to read / write the recent updates from localStorage to the central DB.
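A rough sketch of that idea; the dashboardConfig key and the /syncConfig endpoint are invented for illustration:
// Restore settings on page load (falling back to an empty object).
var config = JSON.parse(localStorage.getItem("dashboardConfig") || "{}");

// Write settings whenever the user changes something.
function saveConfig(newConfig) {
    config = newConfig;
    localStorage.setItem("dashboardConfig", JSON.stringify(config));
}

// Periodically push the local copy to the central DB so it follows the user.
setInterval(function () {
    fetch("/syncConfig", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify(config)
    });
}, 60000);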

Try using Navigator.sendBeacon on the visibilitychange event.
document.addEventListener("visibilitychange", () => {
    if (document.visibilityState === "hidden") {
        navigator.sendBeacon("/ajaxUrl");
    }
});
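If the saved configuration itself has to travel with the beacon, sendBeacon() accepts a body as its second argument; here the hidden field from the question is assumed to have the id configField:
document.addEventListener("visibilitychange", () => {
    if (document.visibilityState === "hidden") {
        // "configField" is the hidden field from the question (id assumed).
        navigator.sendBeacon("/ajaxUrl", document.getElementById("configField").value);
    }
});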
Regarding Navigator.sendBeacon(), MDN details that when leaving a page,
the browser may choose not to send asynchronous XMLHttpRequest requests.
With the sendBeacon() method, the data is transmitted asynchronously when the user agent has an opportunity to do so, without delaying unload or the next navigation. This means:
The data is sent reliably
Regarding the visibilitychange event, MDN recommends:
Web sites often want to send analytics or diagnostics to the server when the user has finished with the page. The most reliable way to do this is to send the data on the visibilitychange event

Related

Is there a way to send POST data to another form and return the result of that form?

I need to send form data to another page that will allow the user to do something in a form and return the result of that form back to the original page. Is this possible? I know it's not ideal, but the issue is that I need to make a "drop-in" solution that does not need to be integrated with other code. I know it's a very specific request and scenario.
I know how to send POST data that doesn't require any user input on the processing page. i.e. I can send POST data to 'calculate.php' which will do the math and send it back, but if I need additional user input on 'calculate.php', how can I still send it back?
An example of expected results would be:
Page #1: User enters a number and presses submit to go to next page.
Page #2: User enters a second number and presses submit to finish.
Back to Page #1: User receives sum of both numbers.
Obviously, this is a really redundant thing to do, but I'm trying to simplify the problem as much as possible.
EDIT: There are a few restrictions I forgot to add.
Page #1 is not my application, I am developing Page #2 as a "drop-in" solution for Page #1. Essentially, I can only use Page #1 to call Page #2 and receive a response from it. The problem is that I need to be able to allow for user input on Page #2.
I know I can post to Page #2 and then post to Page #1 again, but what if I need to maintain the state of Page #1? For example, if there's an open WebSocket connection.
Please note, I understand that this may be impossible or extremely difficult, but if I don't ask I'll never know right?
You can do this with PHP or any other language. If you are running PHP on the server side, you can use the superglobals $_GET and $_POST.
Page #1: Use the POST/GET method to send data to the second page.
Page #2: Receive all field values using the superglobals ($_GET and $_POST). You can use these values as default values of the form fields. Now submit this data to Page #1 using the POST or GET method.
Back to Page #1: Here you will receive the data of the first page from the second page, plus the newly posted data from Page #2.
Either of these should work:
Never leave the page - use AJAX / XMLHttpRequest to call out to other pages to process chunks of data
Do everything on page 1 using "postbacks" -- the form targets are the same page, there is a state variable like "stage=1", and you use JavaScript to set hidden variables for any additional state that's needed (a sketch follows this list).
... PHP state validation and processing for the different stages ...
... one or more blocks of HTML for the page (PHP if / else can be used to choose between multiple page views) ...
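A sketch of the hidden-variable part mentioned above; the calcForm and stage names are invented for illustration:
// Before submitting, advance the state variable so the server-side if / else
// can choose the right page view. All ids here are illustrative.
document.getElementById("calcForm").addEventListener("submit", function () {
    var stageField = document.getElementById("stage");
    stageField.value = String(Number(stageField.value) + 1);
});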
Edit for added restrictions:
Have page 2 use postbacks or AJAX to collect the additional information
I figured out a few ways to do it.
Update a Database (or Data Store of some sort, depends on security needs) and have Page #1 listen for events from a separate page (on the same server as the database). Very similar to the way PayPal's Instant Payment Notification (IPN) works. I was actually able to set up server sent events with it as well.
Essentially, Page #1 sends data to Page #2 where the user will perform the function, and then Page #2 will send POST data to a listener somewhere (either on the same server or Page #1's server); the listener will update a database, and Page #1 will be listening (or polling) for an event that fires once the database updates.
Use JavaScript child/parent window functions. This is okay if Page #1 and Page #2 are on the same server, but it can get messy; browsers impose a lot of restrictions, and behavior varies between them.
Page #1 will open Page #2 in a child window; after the user performs the function, Page #2 will call a function on Page #1 that accepts the result data.
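A sketch of the child/parent approach, assuming both pages are on the same origin; the function and element names are invented for illustration:
// Page #1: open Page #2 and expose a callback for it to call with the result.
function openCalculator() {
    window.resultCallback = function (result) {
        document.getElementById("sum").textContent = result;
    };
    window.open("/page2.html", "calc", "width=400,height=300");
}

// Page #2: after the user submits, hand the result back and close the window.
function finish(result) {
    if (window.opener && window.opener.resultCallback) {
        window.opener.resultCallback(result);
    }
    window.close();
}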

Is an AJAX request killed when a link gets clicked?

I have a website with an AJAX cart. The concept is pretty simple: you end up on a page with a product and can click the Buy Now button. When you do so, the JavaScript code adds the product to the cart, changes the cart visual with the new count, and sends an AJAX request to the server to report the change.
What I'm wondering about, since the client and the server may take a while to process the AJAX request, is... will the client clicking a link to move to another page (i.e. "next product") before the AJAX is reported as successful stop the AJAX request at all?
// prepare request...
...snip...
// send request (usually a POST)
jQuery.ajax(uri, ajax_options);
// return to user
// will a click on a link cancel the AJAX call after this point?
Further, I have timed AJAX requests. If the user clicks on a link before those timed requests happen, they will be lost for sure. Assuming that the click does not cancel an AJAX request, would starting one in the unload event work? Would using a cookie be better/safer than attempting another AJAX request? (although if the user clicks an external link, the unload is really the only solution I can think of to save that data...)
As a side note: I do not want to darken the screen when the user adds an item to the cart so that way the user can continue to do things... but if the AJAX needs to be confirmed before a link can be clicked, I'd have to make sure clicks cannot be used until then.
Update:
I think that some of you are missing the point. I do not care about the done() or completed() functions getting called on the client side. What I do care about is making sure that in the end I get all the data on the server.
I understand that it's asynchronous, but what I want to make sure of is avoiding loss of data, especially if the link goes to another website. (For links within the same website, I am really thinking of using a cookie to make sure that the data of delayed AJAX requests gets to the server no matter what.)
Also, the idea of timed data requests is to avoid heavy loads on the server. With a properly timed set of AJAX requests, the client and server both work a lot better!
#meagar summed this up pretty well in the comments
Any pending AJAX requests will be aborted when the browser navigates away from the page.
So depending on how you define "killing" an AJAX request, that means the request may be started, but it also might not have finished. If it's a short request, most likely it won't be aborted by the time it finishes. But if it's a long request (lots of data processing, takes a second or two to complete), then most likely it's going to be aborted somewhere in the middle.
Of course, this all depends on the browser. The issue typically is that the request makes it to the server, but the browser aborts the request before the response comes through. This all depends on the server and how it processes the data.
Some servers will interrupt the execution of your view, where the request's data is being processed and the response is being generated. Many servers will just let the code run and trigger an error when you try to write output to the response. This is because there is nobody on the other end, so you're writing the response to a closed connection.
although if the user clicks an external link, the unload is really the only solution I can think of to save that data
From my own experience, most browsers will allow you to send out a request during the beforeunload event. This is not always true for unload though, as by that time the page change cannot typically be stopped.
One way to get around this, especially when the response matters, is to halt the page change after the user clicks the link. This can be as simple as calling evt.preventDefault() on the click event for the link, and then later redirecting the user to where they wanted to go when the request is finished. You should make sure to indicate to the user that their request has not just been ignored, but that they're waiting on something to finish first. Users don't want to be left in the dark, so make sure to give them some feedback (like changing the button text, disabling it, etc.).
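A sketch of that pattern with jQuery, reusing the uri and ajax_options variables from the question's snippet; the link selector and button text are invented for illustration:
$("a.product-link").on("click", function (evt) {
    evt.preventDefault();                  // halt the navigation for now
    var destination = this.href;
    $(this).text("Saving cart...");        // feedback so the user knows what they are waiting for

    jQuery.ajax(uri, ajax_options)         // the request from the question
        .always(function () {
            window.location.href = destination;  // continue once the request settles
        });
});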

When doing AJAX edit to the database, should I update the interface immediately with the new data

I'm using inline-edit to update text in the database with AJAX. This is basically the process, pretty usual stuff:
text is not editable
I click the text, it becomes editable
I type the new text
then click to send the updated text to the database
then return the text to non-editable format
My question is when should I update the interface with the new data? Should I update it immediately before the ajax call, or should I wait for the update response to return from the database?
My concern:
If I don't update the interface immediately and wait to get the response from the database, then I've lost the asynchronous benefit that comes with ajax.
But if I update it immediately, then if the database response has an error, I somehow have to track the change I already made, and reverse it, which is a lot more work.
So how is this sort of thing usually done?
I think it is completely reasonable to wait for the response and update as a result of a callback. Doing so does not detract from the async approach. It is still fully async because you are not blocking the entire page or reloading it completely.
Plenty of times in apps, especially in mobile ones where the bandwidth might be limited, I will see a spinner indicating that the field is submitting. This does not hold up any other part of the app. Even stackoverflow does this when I use the mobile view. Rely on the callbacks in order to stay async and still be synced to database return values.
AJAX calls are pretty quick, excepting network issues of course. Personally, I don't think you will lose the benefit of AJAX by waiting for a response from the database. That is, unless you plan on it being slow because of server-side processing, etc...
Now if you were to set the textfield to a non-editable state, the user might think that his change has been accepted and will be confused when the server returns an error and the value is reverted to its original state. I would leave the field editable until the server returns.
If you are using jQuery it's pretty simple, but if you are using your homebrew ajax call script you will have to add some mechanism to check whether everything went well or not.
$.ajax({
    url: '/ajax/doSomething.php',
    type: 'POST',
    dataType: 'json',
    data: {
        'q': $("#editableText").val()
    },
    success: function(json) {
        $("#editableText").html(json.value);
    },
    error: function() {
        alert('something went wrong!');
    }
});
So when your doSomething.php reports success or failure, the ajax call acts accordingly.
Yes, ajax calls are pretty fast, but before changing the data displayed on the page you should be sure that everything went OK; otherwise the user might leave the page without knowing whether the edit actually happened.
The case that you have mentioned is an optimistic UI update. In this case you are assuming (implicitly) that the update will be performed on the server without any error.
The disadvantage of this approach shows up in the following scenario:
User clicks on non-editable text
Text becomes editable
User types in new text
User clicks send
The UI changes to the new text and makes it uneditable
User closes the browser window (or navigates away from the page) before the reply ever comes back (assuming that the change was performed)
Next time the user logs in (or comes back to the page) they are confused as to why the change did not apply!
However you also want to use the asynchronous nature of ajax and make sure that the user can still interact with your app (or the rest of the page) as this change is being performed.
The way we do that (at my work-place) would typically be (using long polling or http push)
The user interacts with non-editable text
The text becomes editable
User types in new text
User clicks send
Instead of updating the text optimistically we show some kind of spinner (only on the text) that indicates to the user that we are waiting for some kind of response from the server. Note that since we are only disabling just the part of the page that shows this text the user does not have to wait for the ajax call to return in order to interact with the rest of the page. (In fact we have many cases that support updating multiple parts of the page in parallel). Think of gmail where you might be chatting with someone in the browser window and at the same time you get an email. The chat can go on and the email counter is also incremented. Both of them are on the same page but do not depend on each other (or wait for each other in any way)
When the update is complete we take away the spinner (which is usually shown using css - toggle class) and replace the element's value with the new text.
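A minimal sketch of that spinner-and-wait flow with jQuery; the /updateText endpoint and the pending class are invented for illustration:
function saveField($field, newText) {
    $field.addClass("pending").prop("disabled", true);   // spinner on, only this field blocked

    $.post("/updateText", { value: newText })
        .done(function (response) {
            $field.val(response.value);                  // value confirmed by the server
        })
        .fail(function () {
            alert("The update failed; your text was not saved.");
        })
        .always(function () {
            $field.removeClass("pending").prop("disabled", false);
        });
}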
Lastly, if you are doing something like this, make sure that your server also supports adding a pending class to the text field (which will cause a spinner to appear). This is done for the following reason:
User updates text and submits
User immediately navigates to a different page
User then comes back to the original page again (and let us assume that the update is not yet complete)
Since your server knows that the field is being updated and can add a pending CSS class to the label / display field, the page comes up loaded with a spinner. (On the other hand, if there is no pending request, no spinner will be shown.)
Finally, when the long-poll message comes back from the server, the spinner is taken off and the current value is displayed.

how to silently guarantee executing an ASP.NET MVC3 action on page unload

I need to execute an action of a controller when a user leaves a page (close, refresh, go to a link, etc.). The action code is like:
public ActionResult WindowUnload(int token)
{
    MyObjects[token].Dispose();
    return Content("Disposed");
}
On window unload I do an Ajax request to the action:
$(window).unload(function ()
{
    $.ajax({
        type: "POST",
        url: "@Url.Action("WindowUnload")",
        data: { token: "@ViewData["Token"]" },
        cache: false,
        async: true
    });
    //alert("Disposing.");
});
The above ajax request does not reach my controller, i.e., the action is not executed.
To make the code above work, I have to uncomment the alert line, but I don't want to fire an alert on the user.
If I change the async option to false (with the alert commented out), then it sometimes works. For example, if I refresh the page several times too fast, the action will not be executed for every unload.
Any suggestions how to execute the action on every unload without alert?
Note, I don't need to return anything from action to the page.
Updated: answers summary
It is not possible to reliably do a request on unload, since that is not proper or expected behavior during unload. So it is better to redesign the application and avoid doing an HTTP request on window unload.
If it is not avoidable, then there are common solutions (described in the question):
Call ajax synchronously, i.e., async: false.
Pros: works in most cases.
Pros: silent
Cons: does not work in some cases, e.g., when a user refreshes the window several times too fast (observed in Firefox)
Use alert on success or after ajax call
Pros: seems to work in all cases.
Cons: is not silent and fires pop up alert.
According to the unload documentation, with async: false it should work as expected. However, this will always be a bit shaky - for example, the user can leave your page by killing/crashing the browser and you will not receive any callback. Also, browser implementations vary. I fear you won't get any failproof event.
HTTP is stateless and you can never get a reliable way to detect that the user has left your page.
Suggested events:
Session timeout (if you are using sessions)
The application is going down
A timer (needs to be combined with the previous suggestion)
Remove the previous token when a new page is visited.
Why does this need to happen at all?
From the code snippet you posted, you appear to be using this to dispose of objects server-side. You are supposed to call Dispose to free up any unmanaged resources your objects are using (such as database connections).
This should be done during the processing of each request. There shouldn't be any unmanaged resources awaiting disposal when the client closes the browser window.
If you are attempting this in the manner noted above, the code needs to be reworked.
Have you tried onbeforeunload()?
$(window).bind('beforeunload', function() {
    alert('unloading!');
});
or
window.onbeforeunload = function() {
    alert('unloading!');
};
From the comment you made to #Frazzell's answer, it sounds like you are trying to manage concurrency. So, on the chance that this is the case, here are two common methods for managing it.
Optimistic concurrency
Optimistic concurrency adds a timestamp to the table. When the client edits the record the timestamp is included in the form. When they post their update the timestamp is also sent and the value is checked to make sure it is the most recent in the table. If it is, the update succeeds. If it is not then someone else got in sooner with an update so it is discarded. How you handle this is then up to you.
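Independent of the server stack, the check itself is small; a conceptual sketch where the record shape and function name are invented for illustration:
// Reject the update if someone else saved after this client loaded the record.
function applyUpdate(record, submitted) {
    if (record.timestamp !== submitted.timestamp) {
        return { ok: false, reason: "conflict" };   // another update got in sooner
    }
    record.value = submitted.value;
    record.timestamp = Date.now();                  // new version for the next editor
    return { ok: true };
}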
Pessimistic concurrency
If you often experience concurrency clashes then pessimistic concurrency may be better. Here, when the client edits the record, a flag is set on that row to lock it. This will remain until the client completes the edit, and no other user can edit that row in the meantime. This method avoids users losing changes but adds administrative overhead to the application. Now you need a way to release unwanted locks. You also have to inform the user through the UI that a row is locked for editing.
In my experience it is best to start with optimistic concurrency. If I have lots of people reporting problems, I will try to find out why people are having these conflicts. It may be that I have to break some entities down into smaller types, as they have become responsible for doing too many jobs.
This won't work, and even if you are somehow able to make it work it will give you lots of headaches later on, because this is not how the browser/HTTP is supposed to be used. When the page is unloading, the browser fires the unload event and then unloads the page (you cannot make it wait, not even by making synchronous ajax calls); if a call is still in flight when the page unloads, that call also gets cancelled, which is why the request sometimes reaches the server and sometimes doesn't. If you could tell us why you want to do this, we could suggest a better approach.
You can't. The only thing you can do is prompt the user to stay and hope for the best. There are a whole host of security concerns here.

Browser cache and history back button (hashing, history.js)

I have a problem with the browser back button. The problem exists in IE and Google Chrome.
I'm creating an autoload mechanism for a search engine. The mechanism works like Google search.
Procedure of building:
1) typing keywords
2) ajax search request
3) response as json
4) building results list using json
5) append list to container
If I have built the results and navigate to another page and then back to the results page, the results disappear. I have tried a lot of solutions described by developers, like hashing, history.js and many more, but none of them work.
When you go back, the original HTML of the page is loaded from the cache. Everything you added through JavaScript is not stored, so you will have to restore those modifications on page load.
For that you can use the popstate event. When that event is fired, you'll need to restore the information. You can do that by re-executing the AJAX request and processing the result. Therefore you must save enough information in the URL (or hash) to be able to execute the same request again.
That also means you may need to redo earlier requests! For example, if you execute an ajax request to get item X of a list, where X increments after each request (so you can get the next item on each click), you will need to make sure that you load all items again. If you don't do that, you will only get the original items on the cached page plus the latest item that was AJAXed, while the items in between will be missing.
But if you use pushState or replaceState to store states, you can also store additional data. You can use this to store the JSON result with the state, so you don't need an additional request. Anyway, this is an optimization and is not strictly needed, so you should start with implementing the AJAX request being fired on popstate. You'll need that anyway, because the data may not always be stored with the state, so you will always need the AJAX request as a fallback.
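A sketch of that flow; runSearch stands in for the existing AJAX request plus list building, and keeping the query in the URL is one way to make the state restorable:
// When a search runs, record the query so the results can be rebuilt later.
function search(query) {
    history.pushState({ query: query }, "", "?q=" + encodeURIComponent(query));
    runSearch(query);   // hypothetical: your existing AJAX request + list rendering
}

// When the user comes back via the back button, re-run the request as a fallback.
window.addEventListener("popstate", function (event) {
    var query = (event.state && event.state.query) ||
        new URLSearchParams(location.search).get("q");
    if (query) {
        runSearch(query);
    }
});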
