I noticed that when I've been away from the keyboard for 5-10 minutes and then make a call to the backend (GET, POST, or similar), the first call goes out as the actual request directly; the browser does not send the OPTIONS request first, so the call remains pending. If I make a second call, it works fine, because the browser sends the OPTIONS request first (it realizes it hasn't sent one yet) and starts the GET after a successful response...
I think the browser caches the authorization of the request, and that is why the first call after some minutes stays pending: the browser doesn't send the OPTIONS call first...
I use Chrome and have never tested with Firefox or anything else...
It's a big problem, because if a user is AFK for some reason, they need to press a button twice to do what they wanted.
Does anyone know a fix/trick to avoid this? I'm working on AngularJS applications, and the HTTP requests are made with $http.
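One trick that might help, sketched under the assumption that the stall comes from the first request dying on a stale connection: give every $http call a timeout and transparently retry it once on a timeout/network error. The module name and timeout value below are made up for illustration.

angular.module('myApp').config(function ($httpProvider) {
  $httpProvider.interceptors.push(function ($q, $injector) {
    return {
      request: function (config) {
        // fail fast instead of leaving the call pending forever
        config.timeout = config.timeout || 10000;
        return config;
      },
      responseError: function (rejection) {
        var config = rejection.config;
        // status <= 0 usually means a timeout or network-level failure
        if (config && !config._retried && rejection.status <= 0) {
          config._retried = true;
          // $injector avoids the circular dependency on $http
          return $injector.get('$http')(config);
        }
        return $q.reject(rejection);
      }
    };
  });
});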
I have a REST API backend server (NodeJs/TypeScript) to which I am making a POST request, which returns me a response. The thing is that I am trying to use my frontend to make about 8 POST requests at the same time with the same data (weird, I know), but that is the requirement of the project.
Background: When I make one POST call at the press of a button, then refresh the page and press the button again, the backend runs both of those requests in parallel. This is what I want. So I changed the frontend code to make 5 POST requests at the press of the button, but for some reason those requests get executed in sequence, meaning that I get one response before the next request starts its execution, as opposed to the page-refresh approach, where they all start at the same time.
I want to do this because the server won't get any requests otherwise, and with this approach I am hoping to get some sort of parallelization from the Node environment.
Each browser has a limit on the number of requests that can be in flight to the same host at once. When the limit is reached, further requests are queued.
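For illustration, a hedged sketch of firing the POSTs concurrently from the frontend; the endpoint and payload are hypothetical. Even when all the requests are created up front, the browser still only keeps a limited number of connections per host open at once (typically 6 over HTTP/1.1) and queues the rest:

// Fire 8 identical POSTs without awaiting any of them first; the
// browser opens as many connections as its per-host limit allows.
// '/api/process' and the payload are made up for this sketch.
const payload = { job: 'same-data-for-all' };

const requests = Array.from({ length: 8 }, () =>
  fetch('/api/process', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(payload),
  }).then((res) => res.json())
);

// All 8 are in flight (or queued by the browser) before any response
// is handled.
Promise.all(requests).then((results) => console.log(results));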
Can anybody please suggest what might be going wrong in the following scenario?
When a page loads there are e.g. 5 async calls being made to the server.
The first call fails and sends an error code to the browser.
Upon failure, the JavaScript code tries to redirect to the login page; however, the responses of the previous 4 calls are all handled before the redirect happens.
My guess is that the redirect lands last in the browser's callback queue and hence runs only after the responses of all the other calls (which are also error responses) have been handled.
Now, I do not want to show the error message box 4 times before the user is redirected to the login page.
Is there any workaround for the issue? Is my understanding correct?
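One possible workaround, sketched under the assumption that all 5 calls share an error handler: keep a one-shot flag so only the first failure shows a message and triggers the redirect, and the later failures are swallowed. The handler name and login URL are hypothetical.

// Shared AJAX error handler with a one-shot redirect guard.
var redirecting = false;

function handleAjaxError(xhr) {
  // the other failed calls arrive after the first; ignore them
  // instead of stacking 4 identical alerts
  if (redirecting) return;
  redirecting = true;
  alert('Your session has expired. Redirecting to login...');
  window.location.href = '/login';
}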
I've read a bunch of other SO threads but none of them had an answer that worked for me.
I have a simple jQuery script that sends some data over ajax to a Node.js server. Basically this:
$.ajax
  url: '/v2/send/[id]'
  method: 'POST'
  data: 'test'
  success: ->
    console.log 'Complete', arguments
When I run this code, the request appears in the network tab in Chrome, but it's "stalled". It stays that way for minutes. I have read posts from other people where it eventually did send, but so far I haven't noticed that happening.
I'm developing the server as well, and I'm logging every request to the application, so I can see that in fact nothing comes in.
I currently have nothing else in my script (this is the whole thing).
So far I've checked and verified:
That jQuery is loaded ($ is set)
That the 'submit' event is in fact triggered (by adding an alert which popped up)
That the request is being created (AjaxStart and AjaxSend happen)
That the URL is correct (right-clicking the request in the network tab -> 'open in new tab' shows me the data I wanted)
That there are no other requests / connections getting in the way (I only have 5 other items in the network tab)
That my server actually logs all requests (I hit it with a bunch of random requests using Postman and they all showed)
Am I missing something obvious here, or is jQuery or Chrome at fault? How can I fix this to actually send the request?
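One thing worth checking, shown here as a plain-JavaScript version of the snippet with an error handler added so failures actually surface: jQuery only honors the method option from version 1.9.0 onward; on older versions you must use type, otherwise the option is silently ignored and the request is sent as a GET. The URL placeholder is kept from the original snippet.

$.ajax({
  url: '/v2/send/[id]', // placeholder id, as in the original snippet
  type: 'POST',         // 'type' works on all jQuery versions
  data: 'test',
  success: function () {
    console.log('Complete', arguments);
  },
  error: function (jqXHR, textStatus, errorThrown) {
    // without this, a failed request dies silently
    console.error('Failed:', textStatus, errorThrown);
  }
});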
I have a website with an AJAX cart. The concept is pretty simple: you land on a page with a product and can click the Buy Now button. When you do so, the JavaScript code adds the product to the cart, updates the cart visuals with the new count, and sends an AJAX request to the server to report the change.
What I'm wondering, since the client and the server may take a while to process the AJAX request, is this: will the client clicking a link to move to another page (i.e. "next product") before the AJAX call is reported as successful stop that AJAX request?
// prepare request...
...snip...
// send request (usually a POST)
jQuery.ajax(uri, ajax_options);
// return to user
// will a click on a link cancel the AJAX call after this point?
Further, I have timed AJAX requests. If the user clicks on a link before those timed requests happen, they will be lost for sure. Assuming that the click does not cancel an AJAX request, would starting one in the unload event work? Would using a cookie be better/safer than attempting another AJAX request? (although if the user clicks an external link, the unload is really the only solution I can think of to save that data...)
As a side note: I do not want to darken the screen when the user adds an item to the cart so that way the user can continue to do things... but if the AJAX needs to be confirmed before a link can be clicked, I'd have to make sure clicks cannot be used until then.
Update:
I think that some of you are missing the point. I do not care about the done() or completed() functions getting called on the client side. What I do care about is making sure that in the end I get all the data on the server.
I understand that it's asynchronous, but what I want to make sure of is avoiding loss of data, especially if the link goes to another website (for links to the same website, I am seriously considering using a cookie to make sure that the data of delayed AJAX requests gets to the server no matter what.)
Also, the idea of timed data requests is to avoid heavy loads on the server. With a properly timed set of AJAX requests, the client and server both work a lot better!
#meagar summed this up pretty well in the comments
Any pending AJAX requests will be aborted when the browser navigates away from the page.
So depending on how you define "killing" an AJAX request: the request may well have been started, but it might not have finished. If it's a short request, it will most likely finish before the abort takes effect. But if it's a long request (lots of data processing, takes a second or two to complete), then most likely it will be aborted somewhere in the middle.
Of course, this all depends on the browser. Typically the request makes it to the server, but the browser aborts it before the response comes through. What happens then depends on the server and how it processes the data.
Some servers will interrupt the execution of your view, where the request's data is being processed and the response is being generated. Many servers will just let the code run and trigger an error only when you try to write output to the response. This is because there is nobody on the other end, so you're writing the response to a closed connection.
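A minimal Node.js sketch of that second behavior: the handler keeps running after the client aborts, and only the final write is affected. The route, delay, and port are hypothetical.

const http = require('http');

http.createServer((req, res) => {
  let aborted = false;
  // Node emits 'aborted' on the request when the client disconnects
  // before the response is finished (newer versions also emit 'close').
  req.on('aborted', () => { aborted = true; });

  // Simulate slow processing, e.g. saving cart data.
  setTimeout(() => {
    if (aborted) {
      // the work above still completed; there is just no one to answer
      console.log('Client navigated away; response not sent.');
      return;
    }
    res.end('saved');
  }, 2000);
}).listen(3000);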
although if the user clicks an external link, the unload is really the only solution I can think of to save that data
From my own experience, most browsers will allow you to send out a request during the beforeunload event. This is not always true for unload though, as by that time the page change cannot typically be stopped.
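When you don't need to inspect the response, one modern option (not mentioned above, so treat it as a swapped-in technique) is navigator.sendBeacon, which queues a small POST that the browser delivers even while the page unloads. The endpoint and the cartState variable are hypothetical.

// Stand-in for the app's cart data.
var cartState = { items: [] };

window.addEventListener('beforeunload', function () {
  // sendBeacon returns immediately; the browser finishes the POST
  // in the background even as the page goes away.
  navigator.sendBeacon('/cart/save', JSON.stringify(cartState));
});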
One way to get around this, especially when the response matters, is to halt the page change after the user clicks the link. This can be as simple as calling evt.preventDefault() on the click event for the link, and then later redirecting the user to where they wanted to go when the request is finished. You should make sure to indicate to the user that their request has not just been ignored, but that they're waiting on something to finish first. Users don't want to be left in the dark, so make sure to give them some feedback (like changing the button text, disabling it, etc.).
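A minimal sketch of that "halt the page change" pattern; the selector, endpoint, and payload are hypothetical.

$('a.product-link').on('click', function (evt) {
  evt.preventDefault();            // stop the navigation for now
  var href = this.href;
  $(this).addClass('saving');      // give the user some feedback

  jQuery.ajax('/cart/save', { method: 'POST', data: { /* cart data */ } })
    .always(function () {
      // success or failure, send the user where they wanted to go
      window.location.href = href;
    });
});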
When a user clicks a link, I would like to send an AJAX request to save the contents of the current page, and navigate away at the same time.
Typically, when the window is navigating away, all AJAX requests are stopped, but that may or may not mean that the server has processed the request. If the AJAX call is aborted too soon, the changes will not be saved.
The valid readyState values, according to W3Schools:
1: server connection established
2: request received
3: processing request
4: request finished and response is ready
Should I wait for number 2 or number 3 to ensure the request goes through on major browsers before navigating away?
I acknowledge the risk that by not confirming a successful save at number 4, I risk not letting the user know about a failure in saving changes. But the code is very stable, so once the server receives the request, I am almost 100% sure that if the changes are not saved, the user will have no recourse anyway (post deleted or locked or something like that, and the changes are not that important anyway).
But the only problem is, if there is an Internet Connection Failure, I need to at least know about that failure in major browsers.
Do I have to wait for number 4 to know about that?
Assuming I don't even care about connection failures, which one should I wait for?
Yes, wait for 4 and check the response. You could pass something back from your server in the POST/GET response to indicate success, then change window.location. Be sure to call preventDefault if you're clicking a link to trigger your AJAX call.
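A minimal sketch of that answer using a raw XMLHttpRequest, so the readyState check is explicit; the element id, endpoint, and destination URL are hypothetical.

document.querySelector('#save-link').addEventListener('click', function (evt) {
  evt.preventDefault(); // don't navigate until the save is confirmed

  var xhr = new XMLHttpRequest();
  xhr.open('POST', '/save');
  xhr.onreadystatechange = function () {
    if (xhr.readyState === 4) {          // finished, response ready
      if (xhr.status === 200) {
        window.location = '/next-page';  // navigate only after a confirmed save
      } else {
        alert('Save failed; staying on this page.');
      }
    }
  };
  xhr.send('content=' + encodeURIComponent(document.title));
});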