I am working on a platform to enable auto-ticketing functionality, for which a REST API request is used for ticket creation. Unfortunately, two requests fire simultaneously, which results in duplicate tickets being created.
How can I handle such a case and send only one of these requests?
I tried adding the second request in the response callback of the first, though this does not seem to work.
if (flag == 1) {
    logger.debug("Node-down alarm - Request raised - " + sitn_id);
    clearTimeout(mouseoverTimer);
    mouseoverTimer = setTimeout(function () {
        logger.debug("Inside callback function - ");
        // function call for ticket creation
        incidentRequest(sitn_id, confUtil.config.mule_url);
    }, 10);
}
You really should show more of the code that makes the request. It seems as if you are doing some AJAX inside your 'incidentRequest', so I will presume that (if that isn't what you are doing, then please show your code). And since your tags say javascript and jquery, well, here goes...
To stop the 'double send' in an AJAX call, it is simple:
function incidentRequest(sitn_id, mule_url) {
    // stop the double by clearing the cache
    $.ajaxSetup({cache: false});
    // continue on with the AJAX call
    // presuming the url you want is the second argument (confUtil.config.mule_url at the call site)
    // and the data you want to send is sitn_id
    $.post(mule_url, 'sitn_id=' + sitn_id, function (data) {
        // do cool stuff
    });
}
Hopefully that will help you get moving. If not, then we will need more code of what is going on around all this.
Related
$('#gd').on('click', function(){
// move up and down DOM elements
// some ajax procedure to store new values on database (php/mysql)
});
Is there any danger to repeating this click very quickly for a long time?
For example, if the connection is poor, will the AJAX call fail to complete each time?
I tested on my live server and there seems to be no problem, but I'm still concerned.
And what is the way to avoid possible problems in this scenario, i.e. if a user keeps clicking very quickly on the #gd button?
This "Danger" would be more accurately described as undesired behavior. However, it is indeed issue which should be treated - as sending multiple request when only 1 is required would consume resources on both client and server with no reason.
If you would like to prevent the user from clicking the button while the request is being processed, disable the button after the client send it it, and re-enable it after response processing complete:
$('#gd').on('click', function(){
    // 1. do some stuff with DOM
    // 2. disable button + make ajax call
    $.ajax({someRequestOptions})
        .always(function() {
            // 3. re-enable button
        });
});
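For instance, a minimal sketch of that pattern (the request options and endpoint below are placeholders, not taken from the question):

$('#gd').on('click', function(){
    var $btn = $(this);
    // 1. do some stuff with DOM
    // 2. disable button + make ajax call
    $btn.prop('disabled', true);
    $.ajax({
        type: 'POST',
        url: '/store-values',        // placeholder endpoint for the php/mysql procedure
        data: { /* new values */ }
    })
    .always(function() {
        // 3. re-enable button once the request finishes (success or error)
        $btn.prop('disabled', false);
    });
});

Using .always() ensures the button comes back even when the request fails, so a poor connection cannot leave it permanently disabled.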
I am currently working on web-based time tracking software. I'm developing in Grails, but this question is solely related to JavaScript and asynchronous requests.
The time tracking tool shall enable users to choose a day for the current month, create one or multiple activities for each day and save the entire day. Each activity must be assigned to a project and a contract.
Upon choosing "save", the partial day is saved to the database, the hours are calculated and a table is updated at the bottom of the page, showing an overview of the user's worked hours per month.
Now to my issue: there may be a lot of AJAX requests. Patient users might click the "create activity" button once and wait until the activity is created. Others, however, might just keep clicking until something happens.
The main issue here is updating the view, although I also noticed some failed calls because of concurrent database transactions (especially when choosing "save" and "delete" sequentially). Any feedback on that issue -- requests not "waiting" for the same row to be ready again -- will be appreciated as well, yet this is not my question.
I have an updateTemplate(data, day) function, which is invoked on success of the respective AJAX calls in each of my functions saveRecord(), deleteRecord(), pasteRecords(), makeEditable() (undo save). Here is the example AJAX call in jQuery:
$.ajax({
    type: "POST",
    url: "${g.createLink(controller:"controller", action:"action")}",
    data: requestJson,
    contentType: "application/json; charset=utf-8",
    async: true,
    success: function(data, textstatus) { updateTemplate(data["template"], tag); updateTable(data["table"]); }
});
In the controller action, a JSON object is rendered as a response, containing the keys template and table. Each key has a template rendered as a String assigned to it, using g.render.
Now, when I click on create repeatedly in very short intervals, some create (or other) actions are executed concurrently because of the asynchronous calls. The issue is that updateTemplate just renders data from the response; the data to render is collected in the create controller action. But the "last" request's action only finds the objects created by itself. I think this is because the create actions run concurrently.
I figure I'm either overcomplicating something or doing something essentially wrong when working with a page that refreshes dynamically. The only thing I found that helps is synchronous calls, which work, but the user experience was awful. What options do I have to make this work? Is this really it, or am I just looking for the wrong approach? How can I make this all more robust, so that impatient users are not able to break my code?
*********EDIT:********
I know that I could block buttons or keyboard shortcuts, use synchronous calls or similar things to avoid those issues. However, I want to know if it is possible to solve it with multiple AJAX requests being submitted. So the user should be able to keep adding new activities, although they won't appear immediately. There is a spinner for feedback anyway. I just want to somehow make sure that before the "last" AJAX request gets fired, the database is up to date so that the controller action will respond with the up-to-date gsp template with the right objects.
With the help of this Stack Overflow answer, I found a way to ensure that the AJAX call -- in the JavaScript function executed last -- always responds with an up-to-date model. Basically, I put the JavaScript functions containing AJAX calls in a waiting queue if a "critical" AJAX request has been initiated before but not yet completed.
For that I define a function doCallAjaxBusyAwareFunction(callable) that checks whether the global variable Global.busy is true prior to executing the callable function. If it is true, the check is retried until Global.busy is false, and only then is the callable executed -- collecting the data from the DOM and firing the AJAX request.
Definition of the global variable:
var Global = {
    busy: false//,
    //additional global scope variables
};
Definition of the function doCallAjaxBusyAwareFunction:
function doCallAjaxBusyAwareFunction(callable) {
    if (Global.busy == true) {
        console.log("Global.busy = " + Global.busy + ". Timeout set! Try again in 100ms!!");
        setTimeout(function(){ doCallAjaxBusyAwareFunction(callable); }, 100);
    }
    else {
        console.log("Global.busy = " + Global.busy + ". Call function!!");
        callable();
    }
}
To flag a function containing ajax as critical, I let it set Global.busy = true at the very start and Global.busy = false on AJAX complete. Example call:
function xyz() {
    Global.busy = true;
    //collect ajax request parameters from DOM
    $.ajax({
        //desired ajax settings
        complete: function(data, status){ Global.busy = false; }
    });
}
Since Global.busy is set to true at the very beginning, the DOM cannot be manipulated -- e.g. by deletes -- while the function xyz collects DOM data. And after the function has executed, Global.busy remains true until the AJAX call completes.
Fire an ajax call from a "busy-aware" function:
doCallAjaxBusyAwareFunction(function(){
//collect DOM data
$.ajax({/*AJAX settings*/});
});
....or fire an ajax call from a "busy-aware" function that is also marked critical itself (basically what I mainly use it for):
doCallAjaxBusyAwareFunction(function(){
Global.busy = true;
//collect DOM data
$.ajax({
//AJAX SETTINGS
complete: function(data, status){ Global.busy = false; }
});
});
Feedback is welcome and other options too, especially if this approach is bad practice. I really hope somebody finds this post and evaluates it, since I don't know if it should be done like that at all. I will leave this question unanswered for now.
I am trying to display data fetched from the database in a loop, and inside the loop I call a function that sends an AJAX request, but it's not working. Actually, it only displays the data if I use the alert command. If I use alert, the browser displays the div and then the alert; when I click OK, it displays the second div and then shows the alert again.
Here is the JS code:
function like(divid, id, session) {
    var orgnldiv = document.getElementById(divid);
    var ndiv = document.createElement('DIV');
    var idd = id + 5000;
    ndiv.id = idd;
    ndiv.className = "likeclass";
    orgnldiv.appendChild(ndiv);
    var dynamicdiv = document.getElementById(idd);
    var span = document.createElement('span');
    var spanid = idd + 5000;
    span.id = spanid;
    span.className = "spanclass";
    dynamicdiv.appendChild(span);
    var xmlhttp15;
    if (window.XMLHttpRequest) {
        xmlhttp15 = new XMLHttpRequest();
    } else {
        xmlhttp15 = new ActiveXObject("Microsoft.XMLHTTP");
    }
    xmlhttp15.onreadystatechange = function() {
        if (xmlhttp15.readyState == 4 && xmlhttp15.status == 200) {
            document.getElementById(spanid).innerHTML = xmlhttp15.responseText;
        }
    };
    xmlhttp15.open("GET", "spancount.php?postid=" + id + "&userid=" + session);
    xmlhttp15.send();
    // alert(spanid);
}
Please suggest what the reason for this problem could be; my code works well only if I use alert.
The reason your code works when you use alert is that whenever the alert function is called, the program flow is paused. In other words, your loop won't continue to make another Ajax call until you dismiss the alert. As a result, the request gets handled properly and the response data appears in the span div. That is why I had mentioned making your calls synchronous instead.
So to answer the question you asked in the comment: yes, at times too many Ajax calls can be a problem. Let's say the loop runs more than 15-20 times; that means 15-20 simultaneous requests. Now think about the number of times the same request is being handled by the PHP script. Definitely a problem here!
Even with jQuery Ajax, the chances of the loop completing successfully are still about 50-50, because it all boils down to the number of requests being made, the bandwidth being used and how the requests are processed at the server.
One possible way to fix this problem: rather than constantly requesting small pieces of data again and again from the server in the loop, make one Ajax call and get the entire data set as JSON. Then parse the JSON and append data to the spans, using each particular span id to extract the relevant data from the JSON object.
You might have to do a little bit of tweaking in both the above JavaScript and spancount.php, but it will definitely save you a lot of bandwidth. You have to consider the fact that more than one person could be using your site!
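As a rough sketch of that single-request idea (the endpoint name spancounts.php, its postids parameter and the JSON shape are assumptions for illustration, not part of your existing code):

// Hypothetical: one request for all counts, then distribute them to the spans.
// Assumes an endpoint that returns JSON keyed by post id, e.g. {"123": 5, "124": 2}
function loadAllCounts(postIds, session) {
    var xhr = new XMLHttpRequest();
    xhr.onreadystatechange = function () {
        if (xhr.readyState == 4 && xhr.status == 200) {
            var counts = JSON.parse(xhr.responseText);
            for (var i = 0; i < postIds.length; i++) {
                var id = postIds[i];
                var spanid = (id + 5000) + 5000;   // same id scheme as the like() function
                var span = document.getElementById(spanid);
                if (span && counts[id] !== undefined) {
                    span.innerHTML = counts[id];
                }
            }
        }
    };
    xhr.open("GET", "spancounts.php?userid=" + session + "&postids=" + postIds.join(","));
    xhr.send();
}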
Hope that cleared up things, all the best with your project :D
I am trying to write a JavaScript interface for an API, but I cannot figure out this issue. I have code that makes an AJAX request:
mooshark.request('userInfoFromID', {
userID : '20991'
});
That code creates an Ajax request. When it starts, it sets an internal variable to true (to indicate that it is running). Then on the next line I have this:
var data = mooshark.response();
alert(data);
The response function is as follows:
response: function () {
    if (this.running == false) {
        return "done";
    } else if (this.running == true) {
        alert("Running");
        setTimeout(this.response, 3000);
    }
}
It outputs (in this order): Running, undefined, (JSON response), Running. Not once does it output "done". Is there a way to return "done" once this.running becomes false? I would like to mention that the request will not always take the same amount of time. I know there is always the option of wrapping all my code inside the complete callback of the AJAX request, but I want to have that as a last resort.
Thanks!
This is not possible without freezing the browser.
Whenever your code is running, the browser UI will be completely frozen.
If you want the call to wait for the server to reply, the browser will need to be completely frozen (which is not a good idea).
Since most of the time you're calling response() through setTimeout(), a return value isn't really useful.
But -- you're likely never setting your 'running' variable to false. You might post more code here (the AJAX response handling code, for example). Also: what exactly are you trying to accomplish by returning / alerting "running" and "done"?
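If the goal is simply to run code once the response arrives, a callback-based shape avoids polling entirely. This is only a sketch -- since you are writing the mooshark interface yourself, the third callback argument here is an assumed extension, not existing API:

mooshark.request('userInfoFromID', {
    userID : '20991'
}, function (data) {
    // invoked by request()'s internal success/complete handler
    // once the response has arrived -- no polling, no freezing
    alert(data);
});

Inside request(), the AJAX completion handler would set running to false and then call that third argument with the response.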
I have a PHP script that outputs JSON data. For the purposes of testing, I've put sleep(2) at the start.
I have an HTML page that requests that data when you click a button, and does $('.dataarea').append(data.html).
(The PHP script returns a JSON-encoded array; data.html holds the HTML that I want to put at the end of <div class="dataarea">...HERE</div>.)
The trouble is, if I click the button too fast (i.e. more than once within two seconds, due to the sleep(2) in the PHP script), it requests the PHP file again.
How can I make it do only one request at a time?
I've tried this (edited down to show the important parts):
amibusy = false;
$('#next').click('get_next');

function get_next() {
    if (amibusy) {
        alert('requesting already');
    }
    else {
        amibusy = true;
        // do the request, then do the append()
        amibusy = false;
    }
}
But this doesn't seem to work. I've even tried replacing the amibusy=true|false assignments with set_busy() and set_not_busy() (and made a function am_i_busy() { return amibusy; }), but none of this seems to work. What am I missing?
If you're using jQuery, the amibusy equivalent would be jQuery.active, which contains a count of currently active AJAX requests, like this:
if(jQuery.active > 0) { //or $.active
alert('Request in Progress');
}
Keep in mind that in jQuery 1.4.3 this becomes jQuery.ajax.active.
Disable the button in the click event and enable it again when the request is finished. Note that the request is asynchronous (i.e. "send request" returns immediately), so you must register a function that is called when the answer comes in.
In jQuery, see the load() function and the success method plus the various AJAX events which you can tap into with ajax().
I'm wondering about your "do request" logic. Whenever I've done calls like this they've always been asynchronous, meaning I fire the request off and then, when the response comes, another function handles it. In this case your code sets up the callback handler, finishes running through the function, and sets amibusy back to false before the request actually comes back. You'd need to set that variable in the handler for your request's callback.
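Something along these lines (a sketch; the URL and response handling are placeholders for whatever your real request does):

var amibusy = false;

$('#next').click(get_next);   // pass the function reference rather than a string

function get_next() {
    if (amibusy) {
        alert('requesting already');
        return;
    }
    amibusy = true;
    // placeholder URL -- substitute your real request
    $.getJSON('getdata.php', function (data) {
        $('.dataarea').append(data.html);
        amibusy = false;   // clear the flag only after the response has arrived
    });
}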
Could you use the async variable?
http://api.jquery.com/jQuery.ajax/
async (Boolean) Default: true

By default, all requests are sent asynchronously (i.e. this is set to true by default). If you need synchronous requests, set this option to false. Cross-domain requests and dataType: "jsonp" requests do not support synchronous operation. Note that synchronous requests may temporarily lock the browser, disabling any actions while the request is active.
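If you can live with the browser locking up for the duration of the request, a sketch based on your setup (the URL is a placeholder) would look like this and guarantees only one request runs at a time:

$('#next').click(function () {
    $.ajax({
        url: 'getdata.php',        // placeholder URL for your PHP script
        dataType: 'json',
        async: false,              // blocks until the server responds
        success: function (data) {
            $('.dataarea').append(data.html);
        }
    });
});

Bear in mind the note above: with the sleep(2) on the server, the page will be unresponsive for those two seconds on every click.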