Is it possible to cancel an asynchronous call independently of its current state? - javascript

When I type text into my textfield widget, I send a request for every character typed in order to get matching data from the server.
When I type really fast, I swarm the server with requests, and this causes my control to freeze. I managed to create a throttling mechanism where I set how many milliseconds the client should wait before sending a request, but this requires picking an arbitrary number of milliseconds to wait. A better solution would be to simply cancel the previous request whenever the next key is pressed.
Is it possible to cancel an AJAX request independently of its current state? If yes, how can this be achieved?

Call XMLHttpRequest.abort()
See: https://developer.mozilla.org/en-US/docs/Web/API/XMLHttpRequest/abort
You'll have to track your requests somehow, perhaps in an array.
var requests = [];
var xhr = new XMLHttpRequest();
xhr.open("GET", "https://developer.mozilla.org/");
xhr.send();
requests.push(xhr);
// Later, e.g. on the next keystroke, abort everything still in flight:
requests.forEach(function (r) { r.abort(); });

MDN says:
The XMLHttpRequest.abort() method aborts the request if it has already
been sent. When a request is aborted, its readyState is set to 0
(UNSENT), but the readystatechange event is not fired.
What's important to note here is that while you will be able to .abort() requests on the client side (and thus not have to worry about the server's response), you are still swarming your server because all those requests are still being sent.
My opinion is that you had it right the first time, by implementing a mechanism that limits the frequency of AJAX requests. You mentioned that you had a problem with this freezing your control (I assume you mean that the browser is either taking longer to respond to user actions, or stops responding completely), and this could be a sign that there is a problem with the way your application handles asynchronous code.
Make sure you are using async APIs like Promise correctly, avoid loops that do heavy processing or just busy-wait in client code, and keep your event processing (i.e. your AJAX callback) simple and fast to reduce the impact on the user.
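To make that concrete, here is a minimal sketch of a debounced lookup that also cancels the previous in-flight request, using fetch with AbortController; the searchField element, the /search endpoint, the updateSuggestions callback, and the 300 ms delay are all assumptions for illustration:

let controller = null;
let timer = null;

searchField.addEventListener("keyup", function (event) {
  clearTimeout(timer);                    // restart the quiet-period timer
  timer = setTimeout(function () {
    if (controller) controller.abort();   // cancel the previous request, if any
    controller = new AbortController();
    fetch("/search?q=" + encodeURIComponent(event.target.value),
          { signal: controller.signal })
      .then(function (res) { return res.json(); })
      .then(updateSuggestions)            // hypothetical DOM-update callback
      .catch(function (err) {
        if (err.name !== "AbortError") console.error(err);
      });
  }, 300);                                // wait 300 ms after the last keystroke
});

This combines both ideas: the debounce keeps the request count down, and the abort ensures at most one lookup is ever in flight.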

Related

Running Multiple Post Request in parallel from the same source

I have a REST API backend server (NodeJS/TypeScript) to which I am making a POST request that returns a response. The thing is that I am trying to use my frontend to make about 8 POST calls at the same time with the same data (weird, I know), but that is a requirement of the project.
Background: when I make one POST call at the press of a button, then refresh the page and press the button again, the backend runs both of those requests in parallel. This is what I want. So I tried changing the frontend code to make 5 POST requests at the press of the button, but for some reason these requests then get executed in sequence, meaning that I get one response before the next request starts executing, as opposed to the page-refresh approach where they all start at the same time.
I want to do this because the server won't be getting any other requests, and with this approach I am hoping to get some sort of parallelization from the Node environment.
Each browser has a limit on the number of requests that can be fired at the same host. When the limit is reached, the requests are queued.
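For reference, a sketch of firing several POSTs at once and letting the browser schedule them; the /api/task endpoint and the payload are assumptions:

const payload = { id: 42 };              // assumed request body

// Start all 8 requests immediately; the browser queues any that exceed
// its per-host connection limit and releases them as slots free up.
const calls = Array.from({ length: 8 }, function () {
  return fetch("/api/task", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(payload),
  }).then(function (res) { return res.json(); });
});

Promise.all(calls).then(function (results) { console.log(results); });

If the requests still appear sequential, check that nothing in the frontend awaits one call before starting the next.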

Does jquery post ever timeout?

The official jQuery documentation does not mention it.
Possible confusion: I know I can use ajax to gain control over timeout, but my question is different.
Scenario:
I am using post to grab data from a backend which I know will take a long (sometimes very very long) time to load.
Question:
Will my javascript request ever timeout or will it always wait until backend is loaded, even if it takes a few minutes?
jQuery uses the native XMLHttpRequest object to make requests.
The XMLHttpRequest.timeout property is an unsigned long representing the number of milliseconds a request can take before automatically being terminated. The default value is 0, which means there is no timeout.
Reading the source code of the jQuery library, the ajax method does not set a timeout in any way, hence it is safe to say that the request does not time out.
But you can explicitly set a timeout in both jQuery and the native module.
This does not mean that your request will never time out, though, since the server usually imposes its own timeout strategy; long responses commonly time out on the server side. You could consider chunking or streaming as a safer and more convenient solution.
GitHub jQuery ajax source:
https://github.com/jquery/jquery/tree/2d4f53416e5f74fa98e0c1d66b6f3c285a12f0ce/src/ajax
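For completeness, a sketch of setting an explicit timeout both ways; the /slow-endpoint URL and the 5-second value are arbitrary assumptions:

// jQuery: per-request timeout in milliseconds.
$.ajax({
  url: "/slow-endpoint",
  timeout: 5000,                         // give up after 5 s
  success: function (data) { console.log(data); },
  error: function (jqXHR, textStatus) {
    if (textStatus === "timeout") console.warn("request timed out");
  }
});

// Native XMLHttpRequest equivalent.
var xhr = new XMLHttpRequest();
xhr.open("GET", "/slow-endpoint");
xhr.timeout = 5000;
xhr.ontimeout = function () { console.warn("request timed out"); };
xhr.send();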
The timeout of a request is, by default, controlled by the browser and the receiving server, whichever cancels the request first. I believe most browsers have a 60 second timeout by default. The server can be any arbitrary value.
Will my javascript request ever timeout or will it always wait until backend is loaded, even if it takes a few minutes?
The answer to this is therefore, yes, your request will timeout at an arbitrary point. If you want to control the amount of time you force your users to wait for a request then you can specifically set this time by using the timeout property of the $.ajax call. This overrides any timeout set in the browser or on the server.
15 seconds should be more than enough. If a request is taking longer than that I'd suggest you change the pattern you're using to generate the response.
HTTP request timeout is a server-side configuration, not a client-side configuration. Requests submitted via jQuery are no different.
You might want to test the return code of the last request and add exception handling to your code (e.g. resubmit the request).
Always check the response code; a common strategy is to retry. https://www.lifewire.com/troubleshooting-network-error-messages-4102727
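A minimal sketch of that check-and-retry strategy, assuming a fetch-based call and a fixed number of attempts:

async function fetchWithRetry(url, options, retries) {
  for (let attempt = 1; attempt <= (retries || 3); attempt++) {
    try {
      const res = await fetch(url, options);
      if (res.ok) return res;                      // 2xx: success
      console.warn("attempt " + attempt + ": HTTP " + res.status);
    } catch (err) {
      console.warn("attempt " + attempt + " failed:", err);  // network error
    }
    // Simple linear backoff before the next attempt.
    await new Promise(function (r) { setTimeout(r, 1000 * attempt); });
  }
  throw new Error("all retries failed");
}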

Is an AJAX request killed when a link gets clicked?

I have a website with an AJAX cart. The concept is pretty simple: you end up on a page with a product and can click the Buy Now button. When you do so, the JavaScript code adds the product to the cart, changes the cart visual with the new count, and sends an AJAX request to the server to report the change.
What I'm wondering about, since the client and the server may take a while to process the AJAX request, is... will the client clicking a link to move to another page (i.e. "next product") before the AJAX is reported as successful stop the AJAX request at all?
// prepare request...
// ...snip...

// send request (usually a POST)
jQuery.ajax(uri, ajax_options);

// return to user
// will a click on a link cancel the AJAX call after this point?
Further, I have timed AJAX requests. If the user clicks on a link before those timed requests happen, they will be lost for sure. Assuming that the click does not cancel an AJAX request, would starting one in the unload event work? Would using a cookie be better/safer than attempting another AJAX request? (although if the user clicks an external link, the unload is really the only solution I can think of to save that data...)
As a side note: I do not want to darken the screen when the user adds an item to the cart so that way the user can continue to do things... but if the AJAX needs to be confirmed before a link can be clicked, I'd have to make sure clicks cannot be used until then.
Update:
I think that some of you are missing the point. I do not care about the done() or completed() functions getting called on the client side. What I do care about is making sure that in the end I get all the data on the server.
I understand that it's asynchronous, but what I want to make sure of is avoiding loss of data, especially if the link goes to another website. (For links within the same website, I am really thinking of using a cookie to make sure that the data of delayed AJAX requests gets to the server no matter what.)
Also, the idea of timed data requests is to avoid heavy loads on the server. With a properly timed set of AJAX requests, the client and server both work a lot better!
#meagar summed this up pretty well in the comments
Any pending AJAX requests will be aborted when the browser navigates away from the page.
So depending on how you define "killing" an AJAX request, that means the request may be started, but it also might not have finished. If it's a short request, most likely it won't be aborted by the time it finishes. But if it's a long request (lots of data processing, takes a second or two to complete), then most likely it's going to be aborted somewhere in the middle.
Of course, this all depends on the browser. The issue typically is that the request makes it to the server, but the browser aborts the request before the response comes through. This all depends on the server and how it processes the data.
Some servers will interrupt the execution of your view, where the request's data is being processed and the response is being generated. Many servers will just let the code run and trigger an error when you try to write output to the response. This is because there is nobody on the other end, so you're writing the response to a closed connection.
although if the user clicks an external link, the unload is really the only solution I can think of to save that data
From my own experience, most browsers will allow you to send out a request during the beforeunload event. This is not always true for unload though, as by that time the page change cannot typically be stopped.
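As a sketch of that hand-off, navigator.sendBeacon is designed for exactly this case: the browser delivers the queued data even after the page is torn down. The /cart/sync endpoint and the pendingCartChanges queue are assumptions:

window.addEventListener("beforeunload", function () {
  var data = JSON.stringify(pendingCartChanges);  // hypothetical queued cart changes
  // sendBeacon queues a small POST that outlives the page itself.
  navigator.sendBeacon("/cart/sync", data);
});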
One way to get around this, especially when the response matters, is to halt the page change after the user clicks the link. This can be as simple as calling evt.preventDefault() on the click event for the link, and then later redirecting the user to where they wanted to go when the request is finished. You should make sure to indicate to the user that their request has not just been ignored, but that they're waiting on something to finish first. Users don't want to be left in the dark, so make sure to give them some feedback (like changing the button text, disabling it, etc.).
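A sketch of that intercept-and-resume pattern; the selector, the endpoint, and the cartData payload are assumptions:

document.querySelector("a.next-product").addEventListener("click", function (evt) {
  evt.preventDefault();                        // halt the navigation for now
  var target = evt.currentTarget.href;
  evt.currentTarget.textContent = "Saving…";   // feedback so the user isn't left in the dark

  jQuery.ajax("/cart/update", { method: "POST", data: cartData })  // hypothetical payload
    .always(function () {
      window.location.href = target;           // continue once the request settles
    });
});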

XMLHttpRequest times out before the response arrives

I am calling a CGI script which takes 50 minutes to send a response. I need to keep my XHR alive until the response comes back from that CGI script.
How do I do that?
It seems like the request automatically times out after a certain default time.
You can use:
xhr.timeout = 10000;           // limit in milliseconds; 0 means no timeout
xhr.ontimeout = timeoutFired;  // handler invoked when the limit is hit
for this purpose
I don't think you should be doing this inline with the request. The alternative is to submit your request as a job to some back-end process/thread/message queue and store a reference so that you can access the results, e.g. in a database or even a file, once the processing is complete.
Until the results are available, you could show some 'updating' text or spinning icon to let the user know something is happening.
For that length of time I wouldn't rely on an open HTTP connection.
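A client-side sketch of that job-queue approach, assuming hypothetical /jobs and /jobs/{id} endpoints on the back end:

async function runLongJob(params) {
  // Submit the work and get a job id back immediately.
  const submitted = await fetch("/jobs", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(params),
  });
  const { jobId } = await submitted.json();

  // Poll for the result instead of holding one connection open for 50 minutes.
  for (;;) {
    await new Promise(function (r) { setTimeout(r, 10000); });  // every 10 s
    const res = await fetch("/jobs/" + jobId);
    const job = await res.json();
    if (job.status === "done") return job.result;
    if (job.status === "failed") throw new Error(job.error);
    // Otherwise keep the spinner visible and poll again.
  }
}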

How to perform Ajax requests, a few at a time

I am not really sure it is possible in JavaScript, so I thought I'd ask. :)
Say we have 100 requests to be done and want to speed things up.
What I was thinking of doing is:
Create a loop that will launch the first 5 ajax calls
Wait until they all return (success - call a function to update the dom / error) - not sure how, maybe with a global counter?
Repeat until all requests are done.
Considering that browser JavaScript does not support threads, can we "exploit" the async functionality to do that?
Do you think it would work, or there are inherent problems doing that in JavaScript?
Yes, I have done something similar to this before. The basic process (sketched in code after the list) is:
Create a stack to store your jobs (requests, in this case).
Start out by executing 3 or 4 of the requests.
In the callback of the request, pop the next job out of the stack and execute it (giving it the same callback).
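A sketch of that pattern with fetch; the concurrency limit, the allUrls list, and the updateDom callback are assumptions:

function runPool(urls, concurrency, onResult) {
  const queue = urls.slice();               // jobs waiting to run

  function next() {
    const url = queue.shift();              // pop the next job off the stack
    if (!url) return;                       // queue empty: this worker stops
    fetch(url)
      .then(function (res) { return res.json(); })
      .then(function (data) { onResult(null, data); })
      .catch(function (err) { onResult(err); })
      .finally(next);                       // the same callback picks up the next job
  }

  // Kick off the first few requests; each one chains the next.
  for (let i = 0; i < concurrency; i++) next();
}

// Usage: 100 URLs, at most 5 in flight at once.
runPool(allUrls, 5, function (err, data) { if (!err) updateDom(data); });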
I'd say, the comment from Dancrumb is the "answer" to this question, but anyway...
Current browsers do limit HTTP requests, so you can even easily just start all 100 requests immediately; the browser will take care of sending those requests as fast as possible, but limited to a decent number of parallel requests.
So, just start them all immediately and trust the browser.
However, this may change in the future (the number of parallel requests that a browser sends increases as end-user internet bandwidth increases and technology advances).
EDIT: You should also think and read about the meaning of "asynchronous" in a JavaScript context. Asynchronous here just means that you give up control of something to some other part of the system. So "sending" an async request just means that you tell the browser to do so! You do not control the browser; you just tell it to send that request and to please notify you about the outcome.
It's actually slower to break up 100 requests and post them in batches of 5, waiting for each batch to complete before you send the next. You might be better off simply sending all 100 requests; remember, JavaScript is single-threaded, so it can only resolve one response at a time anyway.
A better way is set up a batch request service that accepts something like:
/ajax_batch?req1=/some/request.json&req2=/other/request.json
And so on. Basically you send multiple requests in a single HTTP request. The response of such a request would look like:
[
  { "reqName": "req1", "data": {} },
  { "reqName": "req2", "data": {} }
]
Your ajax_batch service would resolve each request and send back the results in proper order. Client side, you keep track of what you sent and what you expect, so you can match up the results to the correct requests. Downside, it takes quite some coding.
The speed gain would come entirely from a massive reduction of HTTP requests.
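A rough client-side sketch of that batching idea; the /ajax_batch endpoint mirrors the example above, and the request map is an assumption:

function sendBatch(requests) {
  // requests, e.g.: { req1: "/some/request.json", req2: "/other/request.json" }
  const query = Object.keys(requests)
    .map(function (name) { return name + "=" + encodeURIComponent(requests[name]); })
    .join("&");

  return fetch("/ajax_batch?" + query)
    .then(function (res) { return res.json(); })
    .then(function (list) {
      // Match each result back to the request it answers.
      const byName = {};
      list.forEach(function (item) { byName[item.reqName] = item.data; });
      return byName;
    });
}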
There's a limit on how many requests you can batch this way, because URL length is limited, IIRC.
DWR does exactly that afaik.
