Using jQuery, is it possible to cancel/abort an Ajax request whose response I have not yet received?
Most of the jQuery Ajax methods return an XMLHttpRequest (or the equivalent) object, so you can just use abort().
See the documentation:
abort Method (MSDN). Cancels the current HTTP request.
abort() (MDN). If the request has been sent already, this method will abort the request.
var xhr = $.ajax({
    type: "POST",
    url: "some.php",
    data: "name=John&location=Boston",
    success: function(msg) {
        alert("Data Saved: " + msg);
    }
});

//kill the request
xhr.abort();
UPDATE:
As of jQuery 1.5 the returned object is a wrapper for the native XMLHttpRequest object called jqXHR. This object appears to expose all of the native properties and methods so the above example still works. See The jqXHR Object (jQuery API documentation).
UPDATE 2:
As of jQuery 3, the ajax method now returns a promise with extra methods (like abort), so the above code still works, though the object being returned is not an xhr any more. See the 3.0 blog here.
UPDATE 3: xhr.abort() still works on jQuery 3.x; don't take Update 2 to mean otherwise. More info in the jQuery GitHub repository.
You can't recall the request, but you can set a timeout value after which the response will be ignored. See this page for jQuery AJAX options. I believe your error callback will be called if the timeout period is exceeded. There is already a default timeout on every AJAX request.
You can also use the abort() method on the request object but, while it will cause the client to stop listening for the event, it probably will not stop the server from processing it.
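For illustration, here is a minimal sketch of the timeout option (the URL is a placeholder); if no response arrives within the given time, jQuery aborts the request and calls the error callback with a status of "timeout":
var xhr = $.ajax({
    url: "/slow-endpoint.php", // placeholder URL
    timeout: 5000,             // give up after 5 seconds
    success: function (data) {
        console.log("Response arrived in time:", data);
    },
    error: function (jqXHR, textStatus) {
        if (textStatus === "timeout") {
            console.log("No response within 5s; jQuery aborted the request");
        }
    }
});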
Save the calls you make in an array, then call xhr.abort() on each.
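A minimal sketch of that idea (the URLs are placeholders):
var pendingRequests = [];

// store each jqXHR as you fire it
pendingRequests.push($.get("/search?q=a"));
pendingRequests.push($.get("/search?q=ab"));

// later, abort everything that is still pending
$.each(pendingRequests, function (i, xhr) {
    xhr.abort();
});
pendingRequests = [];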
HUGE CAVEAT: You can abort a request, but that's only the client side. The server side could still be processing the request. If you are using something like PHP or ASP with session data, the session data is locked until the ajax has finished. So, to allow the user to continue browsing the website, you have to call session_write_close(). This saves the session and unlocks it so that other pages waiting to continue will proceed. Without this, several pages can be waiting for the lock to be removed.
It's an asynchronous request, meaning once it's sent it's out there.
In case your server is starting a very expensive operation due to the AJAX request, the best you can do is open your server to listen for cancel requests, and send a separate AJAX request notifying the server to stop whatever it's doing.
Otherwise, simply ignore the AJAX response.
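A rough sketch of that pattern, assuming a hypothetical /cancel-job.php endpoint that your server implements to stop the expensive work:
var jobId = "some-job-id"; // assumed identifier shared with the server
var xhr = $.post("/start-expensive-job.php", { id: jobId });

function cancelJob() {
    xhr.abort();                              // client stops listening
    $.post("/cancel-job.php", { id: jobId }); // ask the server to stop working
}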
AJAX requests may not complete in the order they were started. Instead of aborting, you can choose to ignore all AJAX responses except for the most recent one:
Create a counter
Increment the counter when you initiate AJAX request
Use the current value of counter to "stamp" the request
In the success callback compare the stamp with the counter to check if it was the most recent request
Rough outline of code:
var xhrCount = 0;

function sendXHR() {
    // sequence number for the current invocation of function
    var seqNumber = ++xhrCount;

    $.post("/echo/json/", { delay: Math.floor(Math.random() * 5) }, function() {
        // this works because of the way closures work
        if (seqNumber === xhrCount) {
            console.log("Process the response");
        } else {
            console.log("Ignore the response");
        }
    });
}

sendXHR();
sendXHR();
sendXHR();
// AJAX requests complete in any order but only the last
// one will trigger "Process the response" message
Demo on jsFiddle
We just had to work around this problem and tested three different approaches:
1. cancels the pending request, as suggested by #meouw
2. executes all requests but only processes the result of the last one
3. prevents new requests as long as another one is still pending
var Ajax1 = {
    call: function() {
        if (typeof this.xhr !== 'undefined')
            this.xhr.abort();
        this.xhr = $.ajax({
            url: 'your/long/running/request/path',
            type: 'GET',
            success: function(data) {
                //process response
            }
        });
    }
};

var Ajax2 = {
    counter: 0,
    call: function() {
        var self = this,
            seq = ++this.counter;
        $.ajax({
            url: 'your/long/running/request/path',
            type: 'GET',
            success: function(data) {
                if (seq === self.counter) {
                    //process response
                }
            }
        });
    }
};

var Ajax3 = {
    active: false,
    call: function() {
        if (this.active === false) {
            this.active = true;
            var self = this;
            $.ajax({
                url: 'your/long/running/request/path',
                type: 'GET',
                success: function(data) {
                    //process response
                },
                complete: function() {
                    self.active = false;
                }
            });
        }
    }
};

$(function() {
    $('#button').click(function(e) {
        Ajax3.call();
    });
});
<script src="https://ajax.googleapis.com/ajax/libs/jquery/2.1.1/jquery.min.js"></script>
<input id="button" type="button" value="click" />
In our case we decided to use approach #3 as it produces less load for the server. But I am not 100% sure whether jQuery guarantees that the complete() callback is always called; if it were ever skipped, this could produce a deadlock situation. In our tests we could not reproduce such a situation.
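If that guarantee is the concern, one variant of approach #3 (a sketch only, using the same placeholder URL) resets the flag from the jqXHR's .always() handler, which jQuery runs on success, error and abort alike:
var Ajax3b = {
    active: false,
    call: function () {
        if (this.active) { return; }
        this.active = true;
        var self = this;
        $.ajax({ url: 'your/long/running/request/path', type: 'GET' })
            .done(function (data) {
                //process response
            })
            .always(function () {
                self.active = false;
            });
    }
};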
It is always best practice to do something like this.
var $request;
if ($request != null) {
    $request.abort();
    $request = null;
}

$request = $.ajax({
    type: "POST",
    url: "yourfile.php",
    data: "data"
}).done(function(msg) {
    alert(msg);
});
But it is much better to first check, with an if statement as above, whether the Ajax request is null before aborting it.
Just call xhr.abort(), whether it's a jQuery Ajax object or a native XMLHttpRequest object.
example:
//jQuery ajax
$(document).ready(function() {
    var xhr = $.get('/server');
    setTimeout(function() { xhr.abort(); }, 2000);
});

//native XMLHttpRequest
var xhr = new XMLHttpRequest();
xhr.open('GET', '/server', true);
xhr.send();
setTimeout(function() { xhr.abort(); }, 2000);
You can abort a continuously fired Ajax call (for example, from a keyup handler) by using this:
<input id="searchbox" name="searchbox" type="text" />
<script src="http://code.jquery.com/jquery-1.11.0.min.js"></script>
<script type="text/javascript">
var request = null;
$('#searchbox').keyup(function () {
    var id = $(this).val();
    request = $.ajax({
        type: "POST",
        url: "index.php",
        data: { 'id': id },
        success: function () {
        },
        beforeSend: function () {
            // abort the previous request, if any, before this one is sent
            if (request !== null) {
                request.abort();
            }
        }
    });
});
</script>
As many people on the thread have noted, aborting the request only affects the client side; the server will still process it. This creates unnecessary load on the server because it's doing work that we've quit listening to on the front-end.
The problem I was trying to solve (that others may run into as well) is that when the user entered information in an input field, I wanted to fire off a request for a Google Instant type of feel.
To avoid firing unnecessary requests and to maintain the snappiness of the front-end, I did the following:
var xhrQueue = [];
var xhrCount = 0;

$('#search_q').keyup(function() {
    xhrQueue.push(xhrCount);
    setTimeout(function() {
        xhrCount++;
        if (xhrCount === xhrQueue.length) {
            // Fire Your XHR //
        }
    }, 150);
});
This will essentially send one request every 150ms (a variable that you can customize for your own needs). If you're having trouble understanding what exactly is happening here, log xhrCount and xhrQueue to the console just before the if block.
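For example, that logging could be dropped in like this (same code as above, with one extra line):
setTimeout(function() {
    xhrCount++;
    // inspect the counter and the queue before deciding whether to fire
    console.log(xhrCount, xhrQueue);
    if (xhrCount === xhrQueue.length) {
        // Fire Your XHR //
    }
}, 150);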
I was doing a live search solution and needed to cancel pending requests that may have taken longer than the latest/most current request.
In my case I used something like this:
//On document ready
var ajax_inprocess = false;

$(document).ajaxStart(function() {
    ajax_inprocess = true;
});

$(document).ajaxStop(function() {
    ajax_inprocess = false;
});

//Snippet from live search function
if (ajax_inprocess == true) {
    request.abort();
}
//Call for new request
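For completeness, the new request mentioned in that last comment might look something like this (the URL and the term variable are placeholders):
request = $.ajax({
    url: '/live-search.php',   // placeholder URL
    type: 'GET',
    data: { q: term },         // 'term' stands in for the current search text
    success: function (results) {
        // render the search results
    }
});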
Just use ajax.abort(). For example, you could abort any pending Ajax request before sending another one, like this:
//check for existing ajax request
if (ajax) {
    ajax.abort();
}
//then you make another ajax request
$.ajax(
    //your code here
);
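For this check to work across calls, the jqXHR needs to be stored in that ajax variable when the request is made; a minimal sketch (the URL is a placeholder):
var ajax = null;

function loadData() {
    // abort any pending request before sending another one
    if (ajax) {
        ajax.abort();
    }
    ajax = $.ajax({
        url: '/data.php', // placeholder URL
        success: function (data) {
            // handle the response
        }
    });
}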
There is no reliable way to do it, and I would not even try once the request is on its way; the only way to react reasonably is to ignore the response.
In most cases it happens in situations like this: a user clicks too often on a button, triggering many consecutive XHRs. Here you have several options: block the button until the XHR returns, don't trigger a new XHR while another is running (hinting to the user to lean back), or discard any pending XHR response except the most recent one.
The following code shows initiating as well as aborting an Ajax request:
function libAjax() {
    var req;

    function start() {
        req = $.ajax({
            url: '1.php',
            success: function(data) {
                console.log(data);
            }
        });
    }

    function stop() {
        req.abort();
    }

    return { start: start, stop: stop };
}

var obj = libAjax();

$(".go").click(function() {
    obj.start();
});
$(".stop").click(function() {
    obj.stop();
});
<script src="https://ajax.googleapis.com/ajax/libs/jquery/2.1.1/jquery.min.js"></script>
<input type="button" class="go" value="GO!" >
<input type="button" class="stop" value="STOP!" >
If xhr.abort() causes a page reload, you can set onreadystatechange to null before calling abort to prevent it:
// ↓ prevent page reload by abort()
xhr.onreadystatechange = null;
// ↓ may cause page reload
xhr.abort();
I had a problem with polling: once the page was closed the poll continued, so in my case a user would miss an update, as a MySQL value kept being set for the next 50 seconds after the page closed, even though I had killed the Ajax request. I found a way around it. Using $_SESSION to set a var won't take effect inside the poll itself until it has ended and a new one has started, so instead I set a value in my database, 0 = offpage. While I'm polling I query that row and return false when it's 0, since querying inside the poll obviously gets you the current values.
I hope this helped.
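One client-side way to set such an off-page flag (a sketch only; the endpoint name is an assumption, and the server-side code that updates the database row is not shown) is to notify the server when the page is being closed:
window.addEventListener('beforeunload', function () {
    // sendBeacon delivers the notification even while the page is unloading
    if (navigator.sendBeacon) {
        navigator.sendBeacon('/offpage.php', 'onpage=0'); // hypothetical endpoint
    }
});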
I have shared a demo that demonstrates how to cancel an AJAX request if data is not returned from the server within a predefined wait time.
HTML :
<div id="info"></div>
JS CODE:
var isDataReceived = false, waitTime = 1000;

$(function() {
    // Ajax request sent.
    var xhr = $.ajax({
        url: 'http://api.joind.in/v2.1/talks/10889',
        data: {
            format: 'json'
        },
        dataType: 'jsonp',
        success: function(data) {
            isDataReceived = true;
            $('#info').text(data.talks[0].talk_title);
        },
        type: 'GET'
    });

    // Cancel ajax request if data is not loaded within 1sec.
    setTimeout(function() {
        if (!isDataReceived)
            xhr.abort();
    }, waitTime);
});
This is my implementation based on many answers above:
var activeRequest = false; //global flag
var $request;              //global var holding the jqXHR
var filters = {...};
apply_filters(filters);

//function triggering the ajax request
function apply_filters(filters) {
    //prepare data and other functionalities
    var data = {};

    //limit the ajax calls
    if (activeRequest === false) {
        activeRequest = true;
    } else {
        //abort if another ajax call is pending
        $request.abort();
        //set the flag again in case the previous ajax completed in the meantime and reset activeRequest to false
        activeRequest = true;
    }

    $request = $.ajax({
        url: window.location.origin + '/your-url.php',
        data: data,
        type: 'POST',
        beforeSend: function() {
            $('#ajax-loader-custom').show();
            $('#blur-on-loading').addClass('blur');
        },
        success: function(data_filters) {
            data_filters = $.parseJSON(data_filters);
            if (data_filters.posts) {
                $(document).find('#multiple-products ul.products li:last-child').after(data_filters.posts).fadeIn();
            } else {
                return;
            }
            $('#ajax-loader-custom').fadeOut();
        },
        complete: function() {
            activeRequest = false;
        }
    });
}
I have a section of content that allows a user to edit it upon double-clicking on it. If the user changes the content and then stops for 2 seconds, the updated content is sent to the server to be saved.
To do so, I have bound an input event listener to that section that starts a 2 seconds countdown, and if there is already a countdown, the former will be cancelled and a new one will start instead. At the end of the countdown an http POST request is sent to the server with the new data.
The problem is that sometimes at the end of the countdown I see 2 or more requests sent, as if a countdown was not cancelled before a new one was inserted, and I can't figure out why.
The code in question is as follows:
//this function is bound to a double-click event on an element
function makeEditable(elem, attr) {
    //holder for the timeout promise
    var toSaveTimeout = undefined;

    elem.attr("contentEditable", "true");

    elem.on("input", function () {
        //if a countdown is already in place, cancel it
        if (toSaveTimeout) {
            //I am worried that sometimes this line is skipped for some reason
            $timeout.cancel(toSaveTimeout);
        }

        toSaveTimeout = $timeout(function () {
            //The following console line will sometimes appear twice in a row, only milliseconds apart
            console.log("Sending a save. Time: " + Date.now());
            $http({
                url: "/",
                method: "POST",
                data: {
                    action: "edit_content",
                    section: attr.afeContentBox,
                    content: elem.html()
                }
            }).then(function (res) {
                $rootScope.data = "Saved";
            }, function (res) {
                $rootScope.data = "Error while saving";
            });
        }, 2000);
    });

    //The following functions will stop the above behaviour if the user clicks anywhere else on the page
    angular.element(document).on("click", function () {
        unmakeEditable(elem);
        angular.element(document).off("click");
        elem.off("click");
    });

    elem.on("click", function (e) {
        e.stopPropagation();
    });
}
Turns out (with help from the commentators above) that the function makeEditable was called more than once.
Adding the following two lines of code at the beginning of the function fixed the issue:
//if element is already editable - ignore
if(elem.attr("contentEditable") === "true")
return;
I have the following jQuery code; the point of it is to create a short time delay so the AJAX request gets time to execute properly:
$('#form_id').submit(function(e) {
    e.preventDefault();

    $submit_url = $(this).data('submitUrl');
    $submit_url = $submit_url.replace('http://', '').replace(window.location.host, '');

    if ($(this).data('toBeAjaxSubmitted') == true) {
        $.ajax($submit_url, {
            type: $(this).attr('method'),
            data: $(this).serialize(),
            complete: function(data) {
                $(this).data('toBeAjaxSubmitted', false);
                $('#form_id').submit();
            }
        });
    }
});
What happens is, the form starts off with a submit url that I need to submit to in order for the component to save an entry to the database. But I also need user input to submit directly to a payment gateway URL where the user then makes a payment.
The code above creates the AJAX request, but does not return to normal postback behaviour (via $('#form_id').submit()).
It keeps submitting the form over and over, but never posts to the gateway URL or redirects out.
What am I doing wrong?
The following worked for me after some more debugging:
$('#chronoform_Online_Submission_Step8_Payment').submit(function(e) {
    var form = this;
    e.preventDefault();

    $submit_url = $(this).data('submitUrl');
    $submit_url = $submit_url.replace('http://', '').replace(window.location.host, '');

    if ($(this).data('toBeAjaxSubmitted') == true) {
        $.ajax($submit_url, {
            type: $(this).attr('method'),
            data: $(this).serialize(),
            complete: function(data, status) {
            }
        }).done(function() {
            form.submit();
        });
    }
});
What really put me on the wrong path was that in Chrome's Developer Tools I had the option 'Disable cache (while DevTools is open)' enabled, and this caused some headaches with inconsistent behaviour between Safari and Firefox (which worked) and Chrome (which did not).
What about some fiddling with this approach?
$('#form_id').submit(function(e) {
    // closures
    var $form = $(this);

    var fAjaxComplete = function(data) {
        // don't send the ajax again
        $form.data('toBeAjaxSubmitted', 'false');
        // maybe do some form manipulation with data...
        // re-trigger submit
        $form.trigger('submit');
    };

    var oAjaxObject = {
        type: $form.attr('method'),
        data: $form.serialize(),
        complete: fAjaxComplete
    };

    var sSubmitUrl = $form.data('submitUrl');
    // scrub url
    sSubmitUrl = sSubmitUrl.replace('http://', '').replace(window.location.host, '');

    // if ajax needed
    if ($form.data('toBeAjaxSubmitted') != 'false') {
        // go get ajax
        $.ajax(sSubmitUrl, oAjaxObject);
        // don't submit, prevent native submit behavior, we are using ajax first!
        e.preventDefault();
        return false;
    }

    // if you got here, go ahead and submit
    return true;
});
I'm trying to add versioning functionality to a custom entity, MFAs, and I'm running into a very odd problem. I have a javascript webresource being called from two places: an onSave event on the form, and as the action of a custom ribbon button. Specifically, the onSave event calls captureSave, while the ribbon button calls makeARevision.
When called from the save button/event, everything works as expected: all information, including the new changes, is pulled into a new record and saved there, while the original record is closed without the changes being saved and without a prompt to save. However, when called via the custom ribbon button, any unsaved changes do not get brought over to the new record, and the old record prompts for saving. Furthermore, even if the user chooses to save the changes to the old record, the changes are not saved, and the form doesn't automatically close.
The following code is the webresource in question. company_MFASaveOrRevise is just an HTML page that asks the user whether they want to save the record or create a new revision. Any ideas on what's causing the differences or how to resolve them are appreciated.
function captureSave(executionContext) {
    if (Xrm.Page.ui.getFormType() != 1 && Xrm.Page.data.entity.getIsDirty()) {
        var retVal = showModalDialog(Xrm.Page.context.getServerUrl() + '/Webresources/company_MFASaveOrRevise', null, 'dialogWidth: 300px; dialogHeight: 100px');
        if (retVal == "Revise") {
            executionContext.getEventArgs().preventDefault();
            makeARevision();
        }
        else if (retVal == "Save") {
        }
    }
}

function createLookupValue(oldLookup) {
    var lookupVal = new Object();
    lookupVal.Id = oldLookup.id;
    lookupVal.LogicalName = oldLookup.entityName;
    lookupVal.Name = oldLookup.Name;
    return lookupVal;
}

function makeARevision() {
    var revisedMFA = {};
    revisedMFA['company_mfaname'] = Xrm.Page.data.entity.attributes.get('company_mfaname').getValue();
    revisedMFA['company_mfadate'] = Xrm.Page.data.entity.attributes.get('company_mfadate').getValue();
    revisedMFA['company_estimatedliqdate'] = Xrm.Page.data.entity.attributes.get('company_estimatedliqdate').getValue();
    revisedMFA['company_actualliqdate'] = Xrm.Page.data.entity.attributes.get('company_actualliqdate').getValue();
    revisedMFA['company_mfanumber'] = Xrm.Page.data.entity.attributes.get('company_mfanumber').getValue();
    revisedMFA['company_revisionno'] = Xrm.Page.data.entity.attributes.get('company_revisionno') == null ? 0 : Xrm.Page.data.entity.attributes.get('company_revisionno').getValue() + 1;
    revisedMFA['company_requester'] = createLookupValue(Xrm.Page.data.entity.attributes.get('company_requester').getValue()[0]);
    revisedMFA['company_mfapreviousrev'] = Xrm.Page.data.entity.attributes.get('company_totalmfatodate').getValue();
    revisedMFA['company_contract'] = createLookupValue(Xrm.Page.data.entity.attributes.get('company_contract').getValue()[0]);

    $.ajax({
        type: 'POST',
        contentType: 'application/json; charset=utf-8',
        datatype: 'json',
        url: getODataUrl() + '/' + 'company_mfaSet',
        data: JSON.stringify(revisedMFA),
        beforeSend: function (XMLHttpRequest) {
            //Specifying this header ensures that the results will be returned as JSON.
            XMLHttpRequest.setRequestHeader('Accept', 'application/json');
        },
        success: function (data, textStatus, request) {
            Xrm.Utility.openEntityForm("company_mfa", data.d.company_mfaId.toUpperCase());
            var attributes = Xrm.Page.data.entity.attributes.get();
            for (var i in attributes) {
                attributes[i].setSubmitMode('never');
            }
            Xrm.Page.ui.close();
        },
        error: function (request, textStatus, errorThrown) {
            alert(errorThrown);
            //alert("There was an error creating the revision");
        }
    });
}
Edit: I had debugger; inserted in various places and was using VS2012 debugger, and found that the attributes were being properly set not to submit, but apparently that didn't stop the confirmation dialog from popping up (even though it works when the webresource is called through the save button). Additionally, Xrm.Page.data.entity.attributes.get(attributeName) returns the post-changes values when called during onSave event, but pre-change values when called from the ribbon. I still don't know why or how to fix it though. Is there something else I should look for?
Use F12 to debug your code when being called from the ribbon (just remember since it is in the ribbon, your javascript code will be in a dynamic script / script block).
I have the following code built using jQuery. When a user copies and pastes a YouTube URL, I am supposed to extract the video id with the getVideoId(str) method. Using the id, I get the video image, title and contents.
When the textbox ("#url") has a length of more than 10 characters, I make an Ajax request, and the Ajax request itself is working. But now I have another problem: the very first time the textbox has more than 10 characters, two Ajax requests are fired (tested using Firebug). Then, as the user enters more characters, many more Ajax requests are fired.
This slows down the processing of the last Ajax request. I just want to get the data of the YouTube link and show a suggestion where the user can add the title and content, like the old Facebook video link feature. Does anyone have a better suggestion for improving the code?
jQuery(document).ready(function() {
    $("#url").keyup(function() {
        var $t1 = $("#url").val();
        var $length = $t1.length;
        var $data;

        $("#title").val($length);
        $("#content").val($t1);

        if ($length == 0) {
            alert('zero value');
            return;
        }

        if ($length > 10) {
            $data = $.ajax({
                url: '<?php echo $this->Html->url(array("action" => "retrieveVideoFeed"));?>',
                dataType: "json",
                data: {
                    vid: getVideoId($t1)
                },
                success: function(data) {
                    alert('success in getting data');
                }
            });
            return;
        }
    });

    function getVideoId(str) {
        var urlRegex = /(http|https):\/\/(\w+:{0,1}\w*#)?(\S+)(:[0-9]+)?(\/|\/([\w#!:.?+=&%#!\-\/]))?/;
        if (urlRegex.test(str) && str.indexOf('v=') != -1) {
            return str.split('v=')[1].substr(0, 11); // get 11-char youtube video id
        } else if (str.length == 11) {
            return str;
        }
        return null;
    }
});
You could cache the calls and use the blur event instead of keyup: you are firing a lot of AJAX calls because keyup() fires an event each time a key is pressed, whereas blur fires an event only when the input loses focus.
If you cache the calls in an object you can avoid a lot of repeated calls:
var cacheUrls = {};

$("#url").blur(function() {
    var $t1 = $("#url").val();
    var $length = $t1.length;
    var $data;

    $("#title").val($length);
    $("#content").val($t1);

    if ($length == 0) {
        alert('zero value');
        return;
    }

    //only call the server if this URL has not been fetched before
    if (cacheUrls[$t1] === undefined && $length > 10) {
        $data = $.ajax({
            url: '<?php echo $this->Html->url(array("action" => "retrieveVideoFeed"));?>',
            dataType: "json",
            data: {
                vid: getVideoId($t1)
            },
            success: function(data) {
                //save the data to avoid a future call
                cacheUrls[$t1] = data;
                alert('success in getting data');
            }
        });
        return;
    } else if ($length > 10) {
        //you already have the data in cacheUrls[$t1]
    }
});
EDIT: if you want to use the Enter key to start the search, you can trigger the blur event when Enter is pressed, like this:
$("#url").keypress(function(e){
if(e.which == 13){
$(this).blur();
return false;
}
});
I think many Ajax requests are fired because you are using $("#url").keyup(function(), so the function executes for every key event in the url input. As far as I know, it is better to use the focusout method instead of keyup.
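For example (a sketch only, reusing the #url field from the question):
$("#url").focusout(function () {
    var url = $(this).val();
    if (url.length > 10) {
        // make the single Ajax request here
    }
});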
If you stay with the keyup event, you may want to use an ajaxmanager plugin for jQuery, which can manage queues or limit the number of simultaneous requests.
$.manageAjax.create('myAjaxManager', {
    queue: true,
    cacheResponse: false,
    maxRequests: 1,
    queue: 'clear'
});
....
if ($length > 10) {
    $data = $.manageAjax.add({ ...
This will prevent having a lot of Ajax requests active at the same time while the user is typing. As soon as they stop typing, the pending request will not be aborted and the results will show up.