Ajax patterns - assume success or wait for response - javascript

Typically an ajax interaction involves sending the request, providing feedback to the user that the request is in progress, and then handling the response once it arrives.
Waiting for the response is obviously unavoidable when the next action requires the data sent from the server, but what if the interaction is an update to some data on the server, such as sorting the order of a list? Would it be bad practice to assume success? You'd make the request and simply update the DOM on the assumption that the sort will succeed. I'd imagine you'd then have to provide a rollback method should the request fail, as well as notify the user, but 99% of the time the request should go through and appear instantaneous to the user.
Is this a common pattern, and are there any other factors that should be considered besides a rollback and notification method?
Any advice would be much appreciated,
Rich

You probably want to take a look at the command pattern: http://en.wikipedia.org/wiki/Command_pattern. It sounds like you want to modify some data locally and assume that the server gets modified as well. If the AJAX handler fails, then you can roll back the command (and notify the user).
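A minimal sketch of that idea in JavaScript (the endpoint, renderList, and notifyUser are all illustrative names, not a prescribed API):

// A "command" pairs the optimistic DOM update with its inverse.
function makeSortCommand(list, newOrder) {
  const previousOrder = [...list.order]; // snapshot for rollback
  return {
    execute() { list.order = newOrder; renderList(list); },
    undo() { list.order = previousOrder; renderList(list); },
  };
}

function sortOptimistically(list, newOrder) {
  const command = makeSortCommand(list, newOrder);
  command.execute(); // update the UI immediately, assuming success

  fetch('/api/lists/' + list.id + '/order', { // hypothetical endpoint
    method: 'PUT',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ order: newOrder }),
  })
    .then(res => { if (!res.ok) throw new Error('save failed'); })
    .catch(() => {
      command.undo(); // roll back the optimistic change
      notifyUser('Could not save the new order.'); // hypothetical UI helper
    });
}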

If the user is to receive feedback about success or failure, then it would be bad practice to assume success.
Just return a success/fail response from the server, and let the user know that their action was successful. You're not really losing anything by the extra trip back from the server, and any request could potentially fail.
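For example (a sketch; the URL and the showMessage helper are made up):

const payload = JSON.stringify({ action: 'sort' }); // whatever you're saving

fetch('/api/save', { // illustrative endpoint
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: payload,
})
  .then(res => res.json())
  .then(result => {
    // Only report success once the server has actually confirmed it.
    showMessage(result.success ? 'Saved.' : 'Save failed: ' + result.error);
  })
  .catch(() => showMessage('Network error, please try again.'));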

Related

Node.js Request drops before Response is received

The project that I am working on receives a request in which the main part consists of data coming from a database. Upon receiving it, my system parses all the data and ultimately concatenates the needed information to form a query, then inserts that data into my local database using the query.
It works fine, with no issues at all, except that it takes too long to process when the request has over 6,000,000 characters and over 200,000 lines (or possibly fewer, but still large numbers).
I have tested this with my system being used as a server (the intended production setup), and with Postman as well, but both drop the connection before the final response is built and sent. I have already verified that although the connection drops, my system still proceeds with processing the data, all the way through the query and even until it sends its supposed response. But since the connection dropped somewhere in the middle of the processing, the response is ignored.
Is this about connection timeout in nodejs?
Or limit in 'app.use(bodyParser.json({limit: '10mb'}))'?
I really only see one way around this; I have done something similar in the past. Allow the client to send as much as you need/want. However, instead of having the client wait around for some undetermined amount of time (at which point the client may time out), send an immediate response that is basically "we got your request and we're processing it".
Now the not-so-great part, but it's the only way I've ever solved this type of issue: in your "processing" response, send back some sort of id. The client can then check once in a while whether its request has finished by sending you that id. On the server end, you store the result for the client under the id you gave them. You'll have to make a few decisions about things like how long a response id is kept around and whether it can be requested more than once.
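A sketch of that pattern with Express (the route names, processLargeRequest, and the in-memory job store are all illustrative; a real app would persist results and expire old ids):

const express = require('express');
const crypto = require('crypto');
const app = express();
app.use(express.json({ limit: '10mb' }));

const jobs = {}; // id -> { done, result }; use a real store in production

app.post('/import', (req, res) => {
  const id = crypto.randomUUID();
  jobs[id] = { done: false, result: null };
  res.status(202).json({ id }); // answer immediately: "we're processing it"

  processLargeRequest(req.body) // hypothetical long-running function
    .then(result => { jobs[id] = { done: true, result }; })
    .catch(err => { jobs[id] = { done: true, result: { error: err.message } }; });
});

app.get('/import/:id', (req, res) => {
  const job = jobs[req.params.id];
  if (!job) return res.sendStatus(404);
  res.json(job.done ? { status: 'done', result: job.result } : { status: 'processing' });
});

app.listen(3000);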

Vue js and favorite/like button

I'm wondering how favorite, subscribe or like buttons work.
I don't understand something.
For example:
A user likes a post with id 243.
An ajax request is sent to the server with the id of the post (243) [here comes the back-end stuff: the user's favorite list is updated to include that post] and the server sends back a success response.
Now, how am I supposed to modify the like button to actually display that the post is liked (permanently, including after a refresh)?
How can I achieve that in Vue.js? How do things get updated? I don't understand this part.
If the server sends back a successful response you can increment the number that is already there.
This initial number is something you have gotten either through a prop, directly from the server or through an initial AJAX request.
If you want to "permanently" update the number of likes on your button, you have to persist it to a database (or some other storage medium). On your server you could have a route that accepts a post id as an argument and increments the likes for that specific post:
/incrementlike/243
That is where you would make a POST ajax request to. Most of the time in an MVC framework you would have a controller action/method mapped to this route that holds the logic to respond to this call.
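In an Express app, for instance, that controller might look like this (a sketch; app and the db call are placeholders for whatever your backend uses):

// POST /incrementlike/243
app.post('/incrementlike/:id', async (req, res) => {
  try {
    await db.incrementLikes(req.params.id); // placeholder for your persistence layer
    res.json({ success: true });
  } catch (err) {
    res.status(500).json({ success: false });
  }
});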
If you are interested in the part that happens after you make an AJAX request to the server to increment your like on the backend side, I suggest you read up on routing or MVC structure.
How you would do this is really decided on a case-by-case basis. It depends on a number of things, for example what your backend does to a post when it is liked.
If you would like a general 'explanation' of the process, I've attached it below. This is not really Vue-specific, but the general idea is the same:
Frontend side:
Modify the local state of your post to set the proper flag, e.g. post1.liked = true, immediately when it is clicked, before sending the request to the server.
Make sure your GUI reflects this change, e.g. base the color of the button on the 'liked' property of each post.
If a failure response is received from the server, notify the user and allow them to 'try again' or something similar.
When refreshing the page, make sure changes are fetched from the server. If you have done the backend part correctly, the modified state of the post will be present in the data you receive from your backend (post1.liked will be true).
Backend side:
When the request comes in, modify the state of the post the correct way and make sure that the next time the post is fetched, the new state is sent.
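A minimal Vue sketch of the front-end steps above (Vue 2 style; the /incrementlike endpoint is the assumed route from earlier, and the liked field must already exist on the post object for Vue's reactivity to pick it up):

Vue.component('like-button', {
  props: ['post'],
  template: '<button :class="{ liked: post.liked }" @click="toggle">Like</button>',
  methods: {
    toggle() {
      this.post.liked = true; // optimistic local update; the template colors itself off this flag
      fetch('/incrementlike/' + this.post.id, { method: 'POST' }) // assumed endpoint
        .then(res => { if (!res.ok) throw new Error('save failed'); })
        .catch(() => {
          this.post.liked = false; // roll back and let the user try again
          alert('Could not save your like, please try again.');
        });
    },
  },
});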

handle HTTP time out for ajax save

I have a JavaScript application that regularly saves new and updated data. However, I need it to work on slow connections as well.
Data is submitted in one single HTTP POST request. The response will return newly inserted ids for newly created records.
What I'm finding is that the submitted data is fully saved, but sometimes the response times out. The browser application therefore does not know the data has been submitted successfully and will try to save it again.
I know I can detect the timeout in the browser, but how can I make sure the data is saved correctly?
What are some good methods of handling this case?
I see from here https://dba.stackexchange.com/a/94309/2599 that I could include a pending state:
Get transaction number from server
send data, gets saved as pending on server
if pending transaction already exists, do not overwrite data, but send same results back
if success received, commit pending transaction
if error back, retry later
if timeout, retry later
However, I'm looking for a simpler solution.
Really, it seems you need to get to the bottom of why the client thinks the data was not saved when it actually was. If the issue is purely one of timing, then perhaps the client timeout just needs to be lengthened so it doesn't give up too soon, or the amount of data you're sending back in the response needs to be reduced so the response comes back quicker on a slow link.
But, if you can't get rid of the problem that way, there are a bunch of possibilities to program around the issue:
The server can keep track of the last save request from each client (or a hash of such request) and if it sees a duplicate save request come in from the same client, then it can simply return something like "already-saved".
The code flow in the server can be modified so that a small response is sent back to the client immediately after the database operation has committed (no delays for any other types of back-end operations), thus lessening the chance that the client would timeout after the data has been saved.
The client can coin a unique ID for each save request and if the server sees the same saveID being used on multiple requests, then it can know that the client thinks it is just trying to save this data again.
After any type of failure, before retrying, the client can query the server to see if the previous save attempt succeeded or failed.
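A sketch of the unique-ID idea from the client side (the endpoint and retry policy are illustrative; the server has to recognize a repeated saveId and answer "already-saved" instead of saving twice):

function saveWithRetry(data, attempt = 0) {
  // Coin the ID once; every retry of this same save reuses it,
  // so the server can detect duplicates.
  data.saveId = data.saveId || crypto.randomUUID();

  return fetch('/save', { // illustrative endpoint
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(data),
  })
    .then(res => res.json())
    .catch(() => {
      if (attempt < 3) return saveWithRetry(data, attempt + 1); // same saveId goes out again
      throw new Error('save failed after retries');
    });
}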
You can track the number of retries with a simple global counter.
You can also retry automatically, but this isn't good for an auto-save app.
A third option is to use one of the auto-save plugins for jQuery.
A few suggestions:
Increase the timeout, and don't treat a timeout as success.
You can flush the output for each record as soon as you have it, using ob_flush() and flush().
Since you are making requests at regular intervals, check connection_aborted() on each API call; if the client has disconnected, you can save the response in a temp file, and on the next request append the last response to the new one. This method is more resource-consuming, though.

Is an AJAX request killed when a link gets clicked?

I have a website with an AJAX cart. The concept is pretty simple: you end up on a page with a product and can click the Buy Now button. When you do so, the JavaScript code adds the product to the cart, changes the cart visual with the new count, and sends an AJAX request to the server to report the change.
What I'm wondering about, since the client and the server may take a while to process the AJAX request, is... will the client clicking a link to move to another page (i.e. "next product") before the AJAX is reported as successful stop the AJAX request at all?
// prepare request...
...snip...
// send request (usually a POST)
jQuery.ajax(uri, ajax_options);
// return to user
// will a click on a link cancel the AJAX call after this point?
Further, I have timed AJAX requests. If the user clicks on a link before those timed requests fire, they will be lost for sure. Assuming that the click does not cancel an AJAX request, would starting one in the unload event work? Would using a cookie be better/safer than attempting another AJAX request? (Although if the user clicks an external link, unload is really the only solution I can think of to save that data...)
As a side note: I do not want to darken the screen when the user adds an item to the cart, so that the user can continue to do things... but if the AJAX needs to be confirmed before a link can be clicked, I'd have to make sure clicks cannot be used until then.
Update:
I think some of you are missing the point. I do not care about the done() or completed() functions getting called on the client side. What I do care about is making sure that in the end I get all the data on the server.
I understand that it's asynchronous, but what I want is to avoid losing data, especially if the link goes to another website. (For links to the same website, I am really thinking of using a cookie to make sure the data of delayed AJAX requests gets to the server no matter what.)
Also, the idea of timed data requests is to avoid heavy loads on the server. With a properly timed set of AJAX requests, the client and server both work a lot better!
#meagar summed this up pretty well in the comments
Any pending AJAX requests will be aborted when the browser navigates away from the page.
So depending on how you define "killing" an AJAX request, that means the request may be started, but it also might not have finished. If it's a short request, most likely it won't be aborted by the time it finishes. But if it's a long request (lots of data processing, takes a second or two to complete), then most likely it's going to be aborted somewhere in the middle.
Of course, this all depends on the browser. The issue typically is that the request makes it to the server, but the browser aborts it before the response comes through. What happens then depends on the server and how it processes the data.
Some servers will interrupt the execution of your view, where the request's data is being processed and the response is being generated. Many servers will just let the code run and trigger an error when you try to write output to the response. This is because there is nobody on the other end, so you're writing the response to a closed connection.
although if the user clicks an external link, the unload is really the only solution I can think of to save that data
From my own experience, most browsers will allow you to send out a request during the beforeunload event. This is not always true for unload though, as by that time the page change cannot typically be stopped.
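One option here is navigator.sendBeacon, a browser API designed for exactly this situation: it queues a small POST that the browser sends even as the page is being torn down. A sketch (the endpoint and pendingCartChanges are app-specific placeholders):

window.addEventListener('beforeunload', function () {
  if (navigator.sendBeacon) {
    // pendingCartChanges is whatever data still needs to reach the server
    navigator.sendBeacon('/cart/sync', JSON.stringify(pendingCartChanges));
  }
});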
One way to get around this, especially when the response matters, is to halt the page change after the user clicks the link. This can be as simple as calling evt.preventDefault() on the click event for the link, and then later redirecting the user to where they wanted to go when the request is finished. You should make sure to indicate to the user that their request has not just been ignored, but that they're waiting on something to finish first. Users don't want to be left in the dark, so make sure to give them some feedback (like changing the button text, disabling it, etc.).
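A sketch of that interception (the selector, flag, and feedback text are illustrative):

let saving = false; // true while the add-to-cart request is in flight
let pendingNavigation = null;

document.addEventListener('click', function (evt) {
  const link = evt.target.closest('a');
  if (link && saving) {
    evt.preventDefault(); // halt the page change
    pendingNavigation = link.href; // remember where the user wanted to go
    link.textContent = 'Saving cart…'; // feedback so the click doesn't feel ignored
  }
});

// In the AJAX completion handler:
function onCartSaved() {
  saving = false;
  if (pendingNavigation) window.location.href = pendingNavigation;
}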

jQuery: Using a single Ajax call, receive progressive statuses instead of one single response?

I'm just wondering... is it possible to receive multiple responses from a single ajax call?
I'm thinking purely for aesthetic purposes to update the status on the client side.
I have a single ajax method that's called on form submit
$.ajax({
    url: 'ajax-process.php',
    data: data,
    dataType: 'json',
    type: 'post',
    success: function (j) {
    }
});
I can only get one response from the server side. Is it possible to retrieve intermediate statuses, such as:
Default (first): Creating account
Next: Sending email confirmation
Next: Done
Thanks for your help! :)
From a single ajax call, I don't think it is possible.
What you could do is check frequently where the process is (this is what the upload bars in Gmail use, for example). You make a first ajax request to launch the process, and then a series of ajax requests to ask the server how it is doing. When the server answers "I'm done", you're good to go; until then, you have the server respond with the current state.
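A sketch of that polling approach with jQuery (ajax-status.php and the id/message/done fields are assumptions about how the server reports progress):

$.post('ajax-process.php', data, function (resp) {
  // The server starts the work and immediately returns a job id.
  var poll = setInterval(function () {
    $.getJSON('ajax-status.php', { id: resp.id }, function (status) {
      $('#status').text(status.message); // e.g. "Sending email confirmation"
      if (status.done) clearInterval(poll);
    });
  }, 1000); // ask the server once a second
}, 'json');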
There is something called Comet, which you can set up to "push" data to the client; however, it is probably way more than what you are wanting to invest in, time-wise.
You can open up a steady stream from the server so that it continues to output; however, I'm not sure how client-side script can handle these as individual "messages". Think of it as a server that outputs some info to the browser, does more work, outputs some more, does more work, and so on. This shows up more or less in real time in the browser as printed text. It is one long response, but it is still one response. I think ajax only handles a response once it has finished being sent, but maybe someone else knows more than me on the topic.
But you couldn't have the server output several individual responses without reloading itself, at least not with PHP, because once you start outputting the response, the response has begun, and you can't chop it up without finishing it, which happens when the script is done executing.
Your best bet is the steady stream, but again, I'm not sure how ajax handles getting responses in chunks.
Quick Update
Based on the notes for this plugin:
[http://plugins.jquery.com/project/ajax-http-stream]
things don't look promising. Specifically:
Apparently the trend is to disallow access to the xmlhttprequest.responseText before the request is complete (stupid imo). Sorry there's nothing I can do to fix this
Thus, not only can you not get what you want in one request, you probably can't get it multiple requests, unless you want to break up the actual server-side process into several parts, and only have it continue to the next step when an ajax function triggers it.
Another option would be to have your script write its status at specific points to another file on the server, call it "status.xml" or "status.txt". Have your first ajax function initialize the process, and a second ajax function that queries this status file and outputs it to the user.
It is possible, but it has more to do with your backend script. As Anthony mentioned, there is a technique called Comet. Another term I've heard is "long polling". The idea is that you delay the time at which your PHP (insert language of choice) script finishes processing.
In PHP you can do something like this (check_status() is a stand-in for however you re-read the job's state):
while ($response !== "I'm done") {
    sleep(1); // pause for one second before checking again
    $response = check_status(); // hypothetical: re-read the current state
}
return $some_value;
This code stops your script from completely finishing. sleep(1) pauses the script and lets the server rest for one second before it loops back through. You can adjust the sleep time based on your needs. In PHP, the time the script spends sleeping is not counted against your server timeout.
You'll obviously need to add more checks to your code. You'll probably also want to allow for an abort call, something like sending a GET request to kill the backend script, maybe on the JavaScript unload event.
In the tests that I've done, I made the initial ajax call, and when the value was returned, I made another ajax call; that way your backend script won't time out.
I've only played around with this on my local server, so I'm not sure how real-world this is, but it works.
