JavaScript: cancel all kinds of requests

My website makes a lot of requests. I often need to cancel all current requests, so that the browser is not blocking relevant new requests.
I have 3 kinds of requests:
Ajax
inserted script tags (which do JSONP communication)
inserted image tags (which cause the browser to request data from various servers)
For Ajax it's no problem, as the XMLHttpRequest object supports canceling.
What I need is a way to make any browser stop loading resources requested via DOM objects.
It looks like simply removing an object (e.g. an image tag) from the DOM only avoids a request if the request is not already running.
UPDATE: a way to cancel only the irrelevant requests, rather than literally every request, would be perfect.

window.stop() should cancel any pending image or script requests.
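For instance, a small sketch of how that might be wired up; the execCommand('Stop') branch is a fallback for older Internet Explorer versions, which did not implement window.stop:

function stopAllLoading() {
    if (window.stop !== undefined) {
        window.stop();                 // standard: same effect as the browser's Stop button
    } else if (document.execCommand) {
        document.execCommand('Stop');  // fallback for older Internet Explorer
    }
}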

I think document.close() stops all requests, but I'm not so sure about it.

Related

Is it possible to cancel asynchronous call independently of its current state?

When I type text into my textfield widget, I send a request with every character typed to get matching data from the server.
When I type really fast I swarm the server with requests, and this causes my control to freeze. I managed to create a throttling mechanism where I set how many ms the client should wait before sending a request, but this requires choosing an arbitrary constant number of ms to wait. A better solution would be to simply cancel the previous request when the next key is pressed.
Is it possible to cancel an AJAX request independently of its current state? If yes, how can I achieve this?
Call XMLHttpRequest.abort()
See: https://developer.mozilla.org/en-US/docs/Web/API/XMLHttpRequest/abort
You'll have to track your requests somehow, perhaps in an array.
var requests = [];

var xhr = new XMLHttpRequest();
xhr.open("GET", "https://developer.mozilla.org/");
xhr.send();
requests.push(xhr);

// Later, when the responses are no longer needed:
requests.forEach(function (request) { request.abort(); });
MDN says:
The XMLHttpRequest.abort() method aborts the request if it has already
been sent. When a request is aborted, its readyState is set to 0
(UNSENT), but the readystatechange event is not fired.
What's important to note here is that while you will be able to .abort() requests on the client side (and thus not have to worry about the server's response), you are still swarming your server because all those requests are still being sent.
My opinion is that you had it right the first time, by implementing a mechanism that limits the frequency of AJAX requests. You mentioned that you had a problem with this freezing your control (I assume you mean that the browser is either taking longer to respond to user actions, or stops responding completely), and this could be a sign that there is a problem with the way your application handles asynchronous code.
Make sure you are using async APIs like Promise correctly, avoid loops that do heavy processing or just busy-wait in client code, and make your event processing (i.e. your AJAX callback) simple and fast to reduce the impact on the user.
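As a rough illustration of that throttling idea, here is a debounce sketch; the 250 ms delay and the sendSearchRequest helper are placeholders, not values from the question:

var debounceTimer = null;

function onKeyUp(event) {
    var text = event.target.value;              // capture the current input value
    clearTimeout(debounceTimer);                // drop the previously scheduled request
    debounceTimer = setTimeout(function () {
        sendSearchRequest(text);                // hypothetical function that issues the AJAX call
    }, 250);                                    // arbitrary wait; tune to taste
}

With this in place, only the last keystroke in a fast burst actually triggers a request.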

What happens when an XMLHttpRequest is aborted?

Will an aborted XMLHttpRequest still download the response from the server?
At what point in the request lifecycle does it differ from a regular request?
Do different browsers behave differently?
Is it bad practice to abort requests?
No, the download will (should) cancel (does in my browser at least)
When a request is aborted, its readyState is changed to XMLHttpRequest.UNSENT (0) and the request's status code is set to 0. -- MDN
No, at least hopefully not. They should be following the spec.
In my opinion, definitely not. It's a waste of bandwidth and other resources to have requests you no longer need running in the background. Much better to abort them.
Two recent use-cases from personal experience:
A table with various parameters for filtering. Depending on the parameters selected, the resulting request sometimes took a while to complete. If you selected a slow set of parameters A, and then a fast set of parameters B before A completed, you'd first see the results of B in the table, but then A would eventually complete and "replace" the contents of the table so you'd suddenly see A instead.
Solution: Abort the previous incomplete request before starting the next one.
An SPA with pages that sometimes have long-running requests, for example the previously mentioned table. When navigating away to a different page, there were sometimes several requests running in the background for stuff no longer needed.
Solution: Register those requests to be aborted when the page/component was unmounted.
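A rough sketch of both patterns using plain XMLHttpRequest; loadTable, renderTable and the /table-data endpoint are made-up placeholders, and the same idea works with fetch plus AbortController:

var pendingRequests = [];

function loadTable(params) {
    // Use case 1: abort the previous, still-incomplete request before starting the next one.
    pendingRequests.forEach(function (r) { r.abort(); });
    pendingRequests = [];

    var xhr = new XMLHttpRequest();
    xhr.open('GET', '/table-data?' + params);                      // hypothetical endpoint
    xhr.onload = function () { renderTable(xhr.responseText); };   // hypothetical render helper
    xhr.send();
    pendingRequests.push(xhr);
}

// Use case 2: abort everything still running when the page/component goes away.
function onUnmount() {
    pendingRequests.forEach(function (r) { r.abort(); });
    pendingRequests = [];
}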

Race conditions during simultaneous link click and asynchronous AJAX request?

I'm currently facing a situation similar to the relatively-simple example shown below. When a user clicks on a link to a third-party domain, I need to capture certain characteristics present in the user's DOM and store that data on my server. It's critical that I capture this data for all JS-enabled users, with zero data loss.
I'm slightly concerned that my current implementation (shown below) may be problematic. What would happen if the external destination server was extremely fast (or my internal /save-outbound-link-data endpoint was extremely slow), and the user's request to visit the external link was processed before the internal AJAX request had enough time to complete? I don't think this would be a problem (because in this situation, the browser doesn't care about receiving a response from the AJAX request), but getting some confirmation from fellow developers would be much appreciated.
Also, would the answer to the question above vary if the <a> link pointed to an internal URL rather than an external one?
<script type="text/javascript">
    $(document).ready(function() {
        $('.record-outbound-click').on('click', function(event) {
            var link = $(this);
            $.post(
                '/save-outbound-link-data',
                {
                    destination: link.attr('href'),
                    category: link.data('cat')
                },
                function() {
                    // Link tracked successfully.
                }
            );
        });
    });
</script>
<a href="http://www.stackoverflow.com" class="record-outbound-click" data-cat="programming">
Visit Stack Overflow
</a>
Please note that using event.preventDefault(), along with window.location.href = var.attr('href') inside $.post's success callback, isn't a viable solution for me. Neither is sending the user to a preliminary script on my server (for instance, /outbound?cat=programming&dest=http://www.stackoverflow.com), capturing their data, and then redirecting them to their destination.
Edit 2
Also consider the handshake step (Google's docs):
Time it took to establish a connection, including TCP handshakes/retries and negotiating an SSL.
I don't think you and the server you're sending the AJAX request to can complete the handshake if your client is no longer open for connection to the server (i.e., you're already at Stackoverflow or whatever website your link navigates to.)
Edit 1
More broadly, though, I was hoping to understand from a theoretical point of view whether or not the risk I'm concerned about is a legitimate one.
That's an interesting question, and the answer may not be as obvious as it seems.
That's just a sample request/response timing from my network tab (the screenshot isn't included here); it definitely shouldn't be taken as representative of requests/responses in general.
I think the gap we might be most concerned with is the 1.933 ms stall time. There are also other steps that need to happen before the actual request is sent (which itself took about 0.061 ms).
I'd be worried if there's an interruption in any of the 3 steps leading up to the actual request (which took about 35ms give or take).
I think the question is, if you go somewhere else before the "stalled", "DNS Lookup", and "Initial connection" steps happen, is the request still going to be sent? That part, I don't know. But what about any general computer or browser lag beforehand?
Like you mentioned, the idea that somehow the req/res cycle to/from Stackoverflow would be faster than what's happening on your client (i.e., the initiation itself -- not even the complete cycle -- of a network request to your server) is probably a bit ridiculous, but I think theoretically (as you mentioned, this is what you're interested in), it's probably a bad idea in general to depend on these types of race conditions.
Original answer
What about making the AJAX request synchronous?
$.ajax({
    type: "POST",
    url: url,
    async: false
});
This is generally a terrible idea, but if, in your case, the legacy code is so limiting that you have no way to modify it and this is your last option (think, zombie apocalypse), then consider it.
See jQuery: Performing synchronous AJAX requests.
The reason it's a bad idea is because it's completely blocking (in normal circumstances, you don't want potentially un-completeable requests blocking your main thread). But in your case, it looks like that's actually exactly what you want.

Disable browser cache

I implemented a REST service and I'm using a web page as a client.
My page has some JavaScript functions that perform the same HTTP GET request to the REST server several times and process the replies.
My problem is that the browser caches the first reply and doesn't actually send the following requests.
Is there some way to force the browser to execute all the requests without caching?
I'm using Internet Explorer 8.0.
Thanks
Not sure if it can help you, but sometimes I add a random parameter to the URL of my request in order to avoid the response being cached.
So instead of having:
http://my-server:8080/myApp/foo?bar=baz
I will use:
http://my-server:8080/myApp/foo?bar=baz&random=123456789
Of course, the value of the random parameter is different for every request. You can use the current time in milliseconds for that.
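For example, a sketch using plain XMLHttpRequest against the URL above, with the current time as the cache-busting value:

var url = 'http://my-server:8080/myApp/foo?bar=baz&random=' + new Date().getTime();
var xhr = new XMLHttpRequest();
xhr.onreadystatechange = function () {
    if (xhr.readyState === 4 && xhr.status === 200) {
        console.log(xhr.responseText);   // process the (uncached) reply
    }
};
xhr.open('GET', url);
xhr.send();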
Not really. This is a known issue with IE, the classic solution is to append a random parameter at the end of the query string for every request. Most JS libraries do this natively if you ask them to (jQuery's cache:false AJAX option, for instance)
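If you happen to be using jQuery for these GETs, the option looks like this (a sketch, reusing the endpoint from the question); jQuery appends a _=<timestamp> parameter to the query string for you:

$.ajax({
    url: 'http://my-server:8080/myApp/foo',
    data: { bar: 'baz' },
    cache: false,    // jQuery adds "_=<timestamp>" so IE cannot serve a cached reply
    success: function (reply) {
        console.log(reply);
    }
});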
Well, of course you don't actually want to disable the browser cache entirely; correct caching is a key part of REST and the fact that it can (if properly followed by both client and server) allow for a high degree of caching while also giving fine control over the cache expiry and revalidation is one of the key advantages.
There is though an issue, as you have spotted, with subsequent GETs to the same URI from the same document (as in DOM document lifetime, reload the page and you'll get another go at that XMLHttpRequest request). Pretty much IE seems to treat it as it would a request for more than one copy of the same image or other related resource in a web page; it uses the cached version even if the entity isn't cacheable.
Firefox has the opposite problem, and will send a subsequent request even when caching information says that it shouldn't!
We could add a random or time-stamped bogus parameter at the end of a query string for each request. However, this is a bit like screaming "THIS IS SPARTA!" and kicking our hard-won download into a deep pit that no Health & Safety inspector considered putting a safety rail around. We obviously don't want to repeat a full unconditional request when we don't need to.
However, this behaviour has a time component. If we delay the subsequent request by a second, then IE will re-request when appropriate while Firefox will honour the max-age and expires headers and not re-request when needless.
Hence, if two requests could be within a second of each other (either we know they are called from the same function, or there's the chance of two events triggering it in close succession) using setTimeout to delay the second request by a second after the first has completed will make it use the cache correctly, rather than in the two different sorts of incorrect behaviour.
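A sketch of that workaround, assuming a made-up refresh(done) helper that performs the GET and invokes done when the reply has been handled:

refresh(function done() {
    // Delay the follow-up request by a second: IE will then re-request when
    // appropriate, and Firefox will honour max-age/expires instead of firing
    // a needless unconditional request.
    setTimeout(function () {
        refresh(function done() {
            // handle the second reply
        });
    }, 1000);
});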
Of course, a second's delay is a second's delay. This could be a big deal or not, depending primarily on the size of the downloaded entity.
Another possibility is that something that changes so rapidly shouldn't be modelled as GETting the state of a resource at all, but as POSTing a request for a current status to a resource. This does smell heavily of abusing REST and POSTing what should really be a GET though.
Which can mean that on balance the THIS IS SPARTA approach of appending random stuff to query strings is the way to go. It depends, really.

Intercepting XHRs

Is it possible to intercept when the browser does a XHR?
Regardless of JavaScript libraries used?
Like
setTimeout(function() {
    // jQuery XHR
    $('#container').load('foo.html');
}, 5000);
When the jQuery .load() fires, I want to intercept it and add a URL parameter to the request.
Thanks in advance for tips and info.
Best regards
tan
Not possible for all frameworks, no; you'll have to do it on a per-framework basis, as they all build their requests differently. If this is only to be used by your own code, then there is nothing stopping you from writing a wrapper for your XHR calls which could tag the additional data on the end.
What is it you are wanting to send? A cache breaker? Tracking? Also why for all frameworks?
Also if you want to just debug the requests you can use Firebug in Firefox to see all XHR requests that get sent.
Well, each jQuery AJAX method returns an XHR object (except for load), so you could set your own onreadystatechange handler if you wanted to.
Yes you can: http://github.com/jpillora/xhook implements a wrapper over XMLHttpRequest, allowing you to view and modify a request before it is made and after it is returned.
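A minimal sketch of the same wrapper idea without a library, patching XMLHttpRequest.prototype.open so that every XHR, whichever framework created it, goes out with an extra query parameter; the tracking=1 parameter is just an illustrative placeholder:

(function () {
    var originalOpen = XMLHttpRequest.prototype.open;
    XMLHttpRequest.prototype.open = function (method, url) {
        // Copy the arguments so the original async/user/password values are preserved.
        var args = Array.prototype.slice.call(arguments);
        var separator = url.indexOf('?') === -1 ? '?' : '&';
        args[1] = url + separator + 'tracking=1';   // append the extra parameter
        return originalOpen.apply(this, args);
    };
})();

Since jQuery's .load() uses XMLHttpRequest under the hood, requests it issues would pick up the parameter as well.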
