I am implementing a queue for user-entered data. When the queue reaches a certain size, I send an AJAX request with its contents. But the user can still be entering data while that request is in flight, and I don't want to lose it. So I was thinking I could use a $.Deferred/promise: store the data up to a certain point, fire the AJAX request, and only allow a new request once the previous deferred has resolved. If the data entered in the meantime reaches the send threshold again, I queue it.
I am having a hard time wrapping my brain around how to implement this.
===> capture data
=======> 'n' amount of data is entered
=============> move that data into the 'ready' bucket. (arbitrary example: let's say the user enters 10 input fields and I store them into an array; when the array reaches 10.. boom, send it).
=============> fire ajax with the 10 items
In the meantime the user can still be entering data. I want to make sure I still capture it and keep queueing and sending in batches of 10.
I was thinking of a queuing system with a deferred. Not sure if I am overthinking this.
Since the jqXHR object returned by $.ajax() is a Promise, it can be used for this.
var data = {
    // captured data goes here
};

function sendData( val ){
    // jqXHR object (which contains a promise)
    return $.ajax('/foo/', {
        data: { value: val },
        dataType: 'json',
        success: function( resp ){
            // do whatever needed
        }
    });
}

function when(){
    $.when(sendData(data)).done(function (resp) {
        when();
    });
}

when(); // use this within the if switch
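For the threshold check itself, a minimal sketch might look like this; the bucket array and the 10-item threshold come from the question, and capture is a hypothetical entry point for each piece of user input:

var bucket = [];

function capture(value) {
    bucket.push(value);
    if (bucket.length >= 10) {
        // move the ready batch out of the bucket and send it
        sendData(bucket.splice(0, 10));
    }
}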
Assuming your queue is the array dataQueue, you can do something like this:
var dataQueue = []; // sacrificial queue of items to be sent in batches via AJAX request
var batchSize = 10;
var requesting = false; // flag used to suppress further requests while a request is still being serviced

// addToQueue: a function called whenever an item is to be added to the queue
function addToQueue(item) {
    dataQueue.push(item);
    send(); // (conditional on queue length and no request currently being serviced)
}

function send() {
    if (dataQueue.length >= batchSize && !requesting) { // is the queue long enough for a batch to be sent, and is no AJAX request being serviced?
        $.ajax({
            url: '/path/to/server/side/script',
            data: JSON.stringify(dataQueue.splice(0, batchSize)), // .splice removes the items from the queue (FIFO)
            ... // further ajax options
        }).done(handleResponse).fail(handleFailure).always(resetSend);
        requesting = true;
    }
}

function handleResponse(data, textStatus, jqXHR) {
    // handle the server's response data here
}

function handleFailure(jqXHR, textStatus, errorThrown) {
    // handle failure here
}

function resetSend() {
    requesting = false; // lower the flag, to allow another batch to go whenever the queue is long enough
    send(); // call send again here in case the queue is already long enough for another batch
}
DEMO
Notes:
There's no particular reason to return the jqXHR (or anything else) from send, but by all means do so if your application would benefit.
resetSend need not necessarily be called as the .always handler. Calling it from the .done handler (and not the .fail handler) would have the effect of "die on failure".
To minimise the number of members in your namespace (global or whatever), you might choose to encapsulate the whole thing in a constructor function or a singleton namespace pattern, both of which are pretty trivial.
Encapsulating in a constructor would allow you to have two or more queues with the desired behaviour, each with its own settings (see the sketch after these notes).
The demo has a few extra lines of code to make the process observable.
In the demo, you can set the batch size to 15, add-add-add to get the queue length up to, say, 12, then reduce the batch size to 5 and add another item. You should see two sequential requests, and 3 residual items left in the queue.
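For illustration, here is a minimal sketch of that constructor encapsulation; the BatchQueue name and its options are my own invention, not part of the answer above:

function BatchQueue(options) {
    var queue = [];
    var requesting = false;
    var batchSize = options.batchSize || 10;

    function send() {
        if (queue.length >= batchSize && !requesting) {
            requesting = true;
            $.ajax({
                url: options.url, // each queue posts to its own endpoint
                data: JSON.stringify(queue.splice(0, batchSize))
            }).done(options.onResponse).always(function () {
                requesting = false;
                send(); // drain any further complete batches
            });
        }
    }

    this.add = function (item) {
        queue.push(item);
        send();
    };
}

// two independent queues, each with its own settings
var fieldQueue = new BatchQueue({ url: '/fields', batchSize: 10 });
var eventQueue = new BatchQueue({ url: '/events', batchSize: 25 });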
Related
I'm trying to figure out whether there's any way to receive the completion status of a task (triggered via an AJAX call) through multiple, periodic AJAX calls.
Basically, during the execution of something that could take long, I want to populate some variable and return its value when asked.
Server code looks like this:
function setTask($total, $current){
    $this->task['total'] = $total;
    $this->task['current'] = $current;
}

function setEmptyTask(){
    $this->task = [];
}

function getTaskPercentage(){
    return ($this->task['current'] * 100) / $this->task['total'];
}

function actionGetTask(){
    if (Yii::$app->request->isAjax) {
        \Yii::$app->response->format = \yii\web\Response::FORMAT_JSON;
        return [
            'percentage' => $this->getTaskPercentage(),
        ];
    }
}
Let's say I'm in a loop, and I know how many times I will iterate:
function actionExportAll(){
    $size = sizeof($array);
    $c = 0;
    foreach($array as $a){
        // do something that takes relatively long
        $this->setTask($size, $c++);
    }
}
On the client side I have this:
function exportAll(){
    var intervalId = setInterval(function(){
        $.ajax({
            url: '/get-task',
            type: 'post',
            success: function(data){
                console.log(data);
            }
        });
    }, 3000);
    $.ajax({
        url: '/export-all',
        type: 'post',
        success: function(data){
            clearInterval(intervalId); // cancel setInterval
            // ..
        }
    });
}
This looks like it should work, except that the AJAX calls made in the setInterval callback only complete after /export-all has finished and entered its success callback.
There's surely something that I'm missing in this logic.
Thanks
The problem is probably in the sessions.
Let's take a look at what is going on.
The request to /export-all is sent by the browser.
The app on the server calls session_start(), which opens the session file and locks access to it.
The app begins the expensive operations.
In the browser, the interval elapses and the browser sends the request to /get-task.
The app on the server tries to handle the /get-task request and calls session_start(). It is blocked and has to wait for the /export-all request to finish.
The expensive operations of /export-all finish and the response is sent to the browser.
The session file is unlocked and the /get-task request can finally continue past session_start(). Meanwhile the browser has received the /export-all response and executes its success callback.
The /get-task request finishes and its response is sent to the browser.
The browser receives the /get-task response and executes its success callback.
The best way to deal with this is to avoid running the expensive tasks directly from requests executed by the user's browser.
Your export-all action should only schedule the task for execution. The task itself can then be executed by some cron action or by a worker in the background, and /get-task can check its progress and trigger the final actions when the task is finished.
You should take a look at the yiisoft/yii2-queue extension. This extension allows you to create jobs, enqueue them, and run the jobs from the queue via a cron task or by running a daemon that listens for tasks and executes them as they come.
Without trying to dive into your code, which I don't have time to do, I'll say that the essential process looks like this:
Your first AJAX call is "to schedule the unit of work ... somehow." The result of this call is to indicate success and to hand back some kind of nonce, or token, which uniquely identifies the request. This does not necessarily indicate that processing has begun, only that the request to start it has been accepted.
Your next calls request "progress," and provide the nonce given in step #1 as the means to refer to it. The immediate response is the status at this time.
Presumably, you also have some kind of call to retrieve (and remove) the completed request. The same nonce is once again used to refer to it. The immediate response is that the results are returned to you and the nonce is cancelled.
Obviously, you must have some client-side way to remember the nonce(s). "Sessions" are the most common way to do that. "Local storage," in a suitably recent web browser, can also be used.
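Put together, a minimal client-side sketch of this scheme might look like the following; the endpoint names (/task/start, /task/status, /task/result) and the response fields (nonce, finished) are illustrative assumptions, not a known API:

function runTask() {
    // step 1: schedule the unit of work and receive a nonce for it
    $.post('/task/start').done(function (resp) {
        var nonce = resp.nonce;
        // step 2: poll for progress, referring to the work by its nonce
        var timer = setInterval(function () {
            $.get('/task/status', { nonce: nonce }).done(function (status) {
                if (status.finished) {
                    clearInterval(timer);
                    // step 3: retrieve (and remove) the completed request
                    $.get('/task/result', { nonce: nonce }).done(function (result) {
                        // use the results here; the nonce is now cancelled
                    });
                }
            });
        }, 3000);
    });
}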
Also note ... as an important clarification ... that the title of your post does not match what's happening: one AJAX call isn't happening "during" another AJAX call. All of the AJAX calls return immediately. But all of them refer (by means of nonces) to a long-running unit of work that is being carried out by some other appropriate means.
(By the way, there are many existing "workflow managers" and "batch processing systems" out there, open-source on Github, Sourceforge, and other such places. Be sure that you're not re-inventing what someone else has already perfected! "Actum Ne Agas: Do Not Do A Thing Already Done." Take a few minutes to look around and see if there's something already out there that you can just steal.)
So basically I found the solution to this problem myself.
What you need to do is replace the server-side code above with this:
function setTask($total, $current){
    $_SESSION['task']['total'] = $total;
    $_SESSION['task']['current'] = $current;
    session_write_close();
}

function setEmptyTask(){
    $_SESSION['task'] = [];
    session_write_close();
}

function getTaskPercentage(){
    return ($_SESSION['task']['current'] * 100) / $_SESSION['task']['total'];
}

function actionGetTask(){
    if (Yii::$app->request->isAjax) {
        \Yii::$app->response->format = \yii\web\Response::FORMAT_JSON;
        return [
            'percentage' => $this->getTaskPercentage(),
        ];
    }
}
This works, but I'm not completely sure whether it is good practice.
From what I can tell, it seems that session_write_close() frees access to the session data and makes it readable by another request (hence my actionGetTask()) during the execution of actionExportAll().
Maybe somebody could expand on this answer and tell more about it.
Thanks for the answers, I will certainly dig more into those approaches and maybe try to solve this same task in a better, more elegant and logical way.
I have a web application that has to perform the following task. For a chosen date range, it makes a GET request to a web service for each date in the range; this can take a while, and because I want to visualize the data afterwards, all the calls are synchronous (the result of each request gets stored into an array). This retrieval takes a while (several seconds), during which the main thread "freezes."
What would be a good way to avoid this? (E.g. doing the retrieval in a separate thread and getting notified once it's done.)
Consider using promises.
They enable you to perform non-blocking calls to API. It's basically what you are asking for.
EDIT: You can use $.when() specifically to be notified when all operations are done.
You should make your GET requests async and then visualize when all the requests have completed.
var get1 = $.get(url1);
var get2 = $.get(url2);
var get3 = $.get(url3);

$.when(get1, get2, get3).done(function (resp1, resp2, resp3) {
    // do something with the responses
    visualize();
});
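Since the question involves one request per date in a range rather than a fixed three, the same idea extends to an array of promises; in this sketch, dates is the array of dates in the chosen range and buildUrl is an assumed helper that maps a date to the service URL:

var requests = dates.map(function (date) {
    return $.get(buildUrl(date));
});

// $.when() expects individual promises, so the array has to be applied
$.when.apply($, requests).done(function () {
    // each argument is a [data, textStatus, jqXHR] array, one per request
    visualize($.makeArray(arguments));
});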
In fact there is a simple solution. Let's implement a function which needs to be executed when all responses have arrived:
function onFinished(responses) {
    // Do something
}
Now, let's suppose you have a function which returns the dates as an array:
function getDates(range) {
    // Do something
}
Also, we need a getURL function, like this:
function getURL(date) {
    // Do something
}
Finally, let's suppose you have a variable called dateRange which holds the range you will use as input to getDates. So we can do this:
var requestDates = getDates(dateRange);
var requestsPending = requestDates.length;
var responses = [];

for (var requestIndex = 0; requestIndex < requestDates.length; requestIndex++) {
    $.ajax({
        url: getURL(requestDates[requestIndex]),
        method: "GET"
        // You might pass data as well here if needed, in the form of
        // data: yourobject,
    }).done(function(data, textStatus, jqXHR) {
        // Handle the response, parse it and store the result using responses.push()
    }).fail(function(jqXHR, textStatus, errorThrown) {
        // Handle failed requests
    }).always(function(param1, param2, param3) {
        if (--requestsPending === 0) {
            onFinished(responses);
        }
    });
}
This will send AJAX requests for each date you need and wait for their responses asynchronously. You effectively do not wait for the sum of the pending times, but only for the longest single pending time, which is a great optimization. Solving this with multiple threads is not an option, as the JavaScript you write runs single-threaded, so you need to wait asynchronously for the answers; the requests won't wait for each other on the server. If you own the server as well, then you do not need to send a request for each date: you can implement a server-side API function that handles date ranges, so the client sends a single request and waits for the answer.
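For illustration, the single-request variant suggested at the end might look like this; the /api/data endpoint and the from/to parameters are assumptions about a server-side API you would have to implement:

$.ajax({
    url: '/api/data',
    method: 'GET',
    data: { from: '2017-01-01', to: '2017-01-31' } // the chosen date range
}).done(function (responses) {
    // one round trip returns the data for every date in the range
    onFinished(responses);
}).fail(function (jqXHR, textStatus, errorThrown) {
    // handle failure
});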
I am currently working on web-based time tracking software. I'm developing in Grails, but this question is solely related to JavaScript and asynchronous requests.
The time tracking tool shall enable users to choose a day of the current month, create one or multiple activities for each day and save the entire day. Each activity must be assigned to a project and a contract.
Upon choosing "save", the partial day is saved to the database, the hours are calculated and a table is updated at the bottom of the page, showing an overview of the user's worked hours per month.
Now to my issue: there may be a lot of AJAX requests. Patient users might click the "create activity" button just once and wait until the activity is created. Others, however, might just keep clicking until something happens.
The main issue here is updating the view, although I have also seen some failed calls because of concurrent database transactions (especially when choosing "save" and "delete" sequentially). Any feedback on that issue -- requests not "waiting" for the same row to be ready again -- will be appreciated as well, yet this is not my question.
I have an updateTemplate(data, day) function, which is invoked on success of the respective AJAX calls in each of my functions saveRecord(), deleteRecord(), pasteRecords() and makeEditable() (undo save). Here is an example AJAX call in jQuery:
$.ajax({
    type: "POST",
    url: "${g.createLink(controller:"controller", action:"action")}",
    data: requestJson,
    contentType: "application/json; charset=utf-8",
    async: true,
    success: function(data, textstatus) {
        updateTemplate(data["template"], tag);
        updateTable(data["table"]);
    }
});
In the controller action, a JSON object is rendered as a response, containing the keys template and table. Each key has a template rendered as a String assigned to it, using g.render.
Now, what happens when I click create repeatedly in very short intervals is that, due to the asynchronous calls, some create (or other) actions are executed concurrently. The issue is that updateTemplate just renders data from the response; the data to render is collected in the create controller action. But the "last" request action only finds the objects created by itself. I think this is because the create actions run concurrently.
I figure I'm either overcomplicating something or doing something essentially wrong when working with a page that refreshes dynamically. The only thing I found that helps is synchronous calls, which work, but the user experience was awful. What options do I have to make this work? Is this really it, or am I just looking at the wrong approach? How can I make this all more robust, so that impatient users are not able to break my code?
EDIT:
I know that I could block buttons or keyboard shortcuts, use synchronous calls or similar things to avoid these issues. However, I want to know whether it is possible to solve it with multiple AJAX requests being submitted. The user should be able to keep adding new activities, even though they won't appear immediately; there is a spinner for feedback anyway. I just want to somehow make sure that before the "last" AJAX request gets fired, the database is up to date, so that the controller action will respond with the up-to-date GSP template containing the right objects.
With the help of this Stack Overflow answer, I found a way to ensure that the AJAX call in the JavaScript function executed last always responds with an up-to-date model. Basically, I put the JavaScript functions containing AJAX calls into a waiting queue if a "critical" AJAX request has been initiated before but not completed yet.
For that I define the function doCallAjaxBusyAwareFunction(callable), which checks whether the global variable Global.busy is true prior to executing the callable function. If it is true, the check is retried every 100 ms until Global.busy is false; only then is the function executed, collecting the data from the DOM and firing the AJAX request.
Definition of the global variable:
var Global = {
    busy: false //,
    // additional global-scope variables
};
Definition of the function doCallAjaxBusyAwareFunction:
function doCallAjaxBusyAwareFunction(callable) {
    if (Global.busy == true) {
        console.log("Global.busy = " + Global.busy + ". Timeout set! Try again in 100ms!!");
        setTimeout(function(){ doCallAjaxBusyAwareFunction(callable); }, 100);
    }
    else {
        console.log("Global.busy = " + Global.busy + ". Call function!!");
        callable();
    }
}
To flag a function containing AJAX as critical, I let it set Global.busy = true at the very start and Global.busy = false on AJAX completion. Example call:
function xyz (){
    Global.busy = true;
    // collect ajax request parameters from the DOM
    $.ajax({
        // desired ajax settings
        complete: function(data, status){ Global.busy = false; }
    });
}
Since Global.busy is set to true at the very beginning, the DOM cannot be manipulated (e.g. by deletes) while the function xyz collects DOM data. And once the function has executed, Global.busy remains true until the AJAX call completes.
Fire an ajax call from a "busy-aware" function:
doCallAjaxBusyAwareFunction(function(){
    // collect DOM data
    $.ajax({ /* AJAX settings */ });
});
...or fire an AJAX call from a "busy-aware" function that is also marked critical itself (basically what I mainly use it for):
doCallAjaxBusyAwareFunction(function(){
    Global.busy = true;
    // collect DOM data
    $.ajax({
        // AJAX settings
        complete: function(data, status){ Global.busy = false; }
    });
});
Feedback is welcome, and so are other options, especially if this approach is bad practice. I really hope somebody finds this post and evaluates it, since I don't know if it should be done like this at all. I will leave this question unanswered for now.
I'm writing some JavaScript/AJAX code.
Is there any way to ensure that the server receives the XML requests in the order that they are sent?
If not with plain Ajax, do I get this guarantee if I send everything over a single WebSocket?
Thanks!
If it is of utmost importance that they're received in the proper order, and attaching an iterating id to the form isn't enough:
msg_number = 1; sendAJAX(msg_number); msg_number++;
Then I'd suggest building your own queue system, and sending each subsequent request as the callback of the previous one.
Rather than each element having its own AJAX-access, create one centralized spot in your application to handle that.
Your different AJAX-enabled sections don't even need to know that it is a queue:
AJAX.send({ url : "......", method : "post", success : function(){}, synchronous : true });
On the other side of that, you could have something like:
AJAX.send = function (obj) {
    if (obj.synchronous) {
        addToSyncQueue(obj);
        checkQueue();
    } else {
        fireRequest();
    }
};
Inside of your sync queue, all you'd need to do is wrap a new function around the old callback:
callback = (function (old_cb) {
    return function (response) {
        checkQueue();
        old_cb(response);
    };
}(obj.success));

obj.success = callback;
AJAX.call(obj);
Inside of checkQueue, you'd just need to see if the queue is empty, and if it isn't, use nextObj = queue.shift(); (if you're .push()-ing objects onto the queue -- so first-in, first-out, like you wanted).
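As a self-contained illustration of the same queue idea, here is a sketch that serializes requests by chaining each one onto the previous request's promise; the AjaxQueue name is mine, and jQuery 1.8+ .then() chaining behaviour is assumed:

var AjaxQueue = (function () {
    var tail = $.Deferred().resolve(); // start with an already-resolved promise

    return {
        send: function (settings) {
            var fire = function () { return $.ajax(settings); };
            // chain each request onto the previous one so they reach the server in order;
            // passing fire as the failure filter too keeps the queue moving after an error
            tail = tail.then(fire, fire);
            return tail;
        }
    };
}());

// usage: these reach the server strictly in submission order
AjaxQueue.send({ url: '/save', type: 'POST', data: { msg_number: 1 } });
AjaxQueue.send({ url: '/save', type: 'POST', data: { msg_number: 2 } });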
A couple of options come to mind:
Send them synchronously, by waiting for a successful response from the server after each XML request is received (i.e. make a queue).
If you know the number of requests you'll be sending beforehand, send the request number as a tag with each request, e.g. <requestNum>1</requestNum><numRequests>5</numRequests>. This doesn't guarantee the order that they're received in, but guarantees that they can be put back in order afterwards, and has the added benefit of being sure that you have all the data.
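A small sketch of the second option, tagging each payload so the server can reassemble the order; the requestNum/numRequests tags come from the suggestion above, while the /receive endpoint and the data element are assumptions:

var numRequests = 5;
for (var i = 1; i <= numRequests; i++) {
    // prepend the ordering tags to each payload
    var xml = '<requestNum>' + i + '</requestNum>' +
              '<numRequests>' + numRequests + '</numRequests>' +
              '<data>payload ' + i + '</data>';
    $.ajax({
        url: '/receive',
        type: 'POST',
        contentType: 'text/xml',
        data: xml
    });
}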
At my company we use this little ajaxQueue plugin, written by one of the core jQuery contributors:
http://gnarf.net/2011/06/21/jquery-ajaxqueue/
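As I recall from the linked post, the plugin exposes an $.ajaxQueue() function that accepts the same options as $.ajax() and runs queued requests one at a time; treat this usage sketch as an approximation rather than documented API:

$.ajaxQueue({
    url: '/save',
    type: 'POST',
    data: { id: 1 }
}).done(function (data) {
    // runs only after all previously queued requests have completed
});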
I'm creating a script that performs several functions and I want to update the user as the functions are completed. I have nested $.ajax() calls with each subsequent call in the previous call's success block.
There are a total of 4 calls made for each loop. Let's call them scan_1 through scan_4. The success block of scan_1 calls scan_2 and so on down the chain.
For example, let's say I'm looping over 3 objects. I want the process to go like this:
Loop 1
    scan_1
    scan_2
    scan_3
    scan_4
Loop 2
    scan_1
    scan_2
    scan_3
    scan_4
Loop 3
    scan_1
    scan_2
    scan_3
    scan_4
The problem is that it's running through all the scan_1 calls first. I must be missing something, but I can't seem to figure it out. Any advice would be much appreciated.
For reference, here is a snippet of scan_1 (irrelevant stuff snipped):
for(var i = 1; i <= 3; i++)
{
    $.ajax({
        type: 'GET',
        url: url,
        data: 'do=scan&step=1&' + string,
        dataType: 'json',
        success: function (result)
        {
            if(result.proceed == 'true')
            {
                $('#scan_progress').append(result.message);
                scan_2();
            }
            else
            {
                $('#scan_progress').append(result.message);
            }
        }
    });
}
Thoughts?
Thanks in advance.
Sounds like you need to use jQuery Deferred. It basically allows you to chain multiple event handlers to the jQuery Ajax object and gives you finer control over when the callbacks are invoked; a short sketch follows the links below.
Further reading:
http://msdn.microsoft.com/en-us/scriptjunkie/gg723713
http://www.erichynds.com/jquery/using-deferreds-in-jquery/
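To make that concrete, here is a sketch of what chaining with deferreds could look like in this case, assuming each scan_N function is changed to return its jqXHR and jQuery 1.8+ .then() behaviour:

function scanObject(i) {
    // run the four scans for one object strictly in sequence
    return scan_1(i)
        .then(function () { return scan_2(i); })
        .then(function () { return scan_3(i); })
        .then(function () { return scan_4(i); });
}

// run the three loops one after another
var done = $.Deferred().resolve();
[1, 2, 3].forEach(function (i) {
    done = done.then(function () { return scanObject(i); });
});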
It's asynchronous: the "success" callback fires sometime in the future, and the script does not wait for the response. Since you're firing off three requests in your loop, they will all be "scan_1".
"scan_2" will be called as each request completes.
Change the request to synchronous if you want to control the order of events.
You are starting by sending off three ajax calls at once.
Scan1 (loop 1)
Scan1 (loop 2)
Scan1 (loop 3)
When each scan_1 completes, its subsequent scan_2 (and then the rest of the chain) is called.
What did you actually want to happen? Scans 1 through 4 of loop 1, then scans 1 through 4 of loop 2, and then scans 1 through 4 of loop 3? That would require more nesting, or possibly deferred objects.
Instead of using the success callback of each $.ajax() call, you can store each set of AJAX requests (their jqXHR objects) in an array and wait for all of them to resolve:
function scan_1 () {
    // set up an array to store the jqXHR objects (deferred objects)
    var jqXHRs = [];
    for(var i = 1; i <= 3; i++)
    {
        // $.ajax() returns an object that will resolve when the response is returned
        jqXHRs.push($.ajax({
            type: 'GET',
            url: url,
            data: 'do=scan&step=1&' + string,
            dataType: 'json'
        }));
    }
    // wait for all three AJAX requests to resolve before running scan_2();
    // $.when() expects individual promises, so the array has to be applied
    $.when.apply($, jqXHRs).then(function () {
        // each argument is a [data, textStatus, jqXHR] array, one per request;
        // proceed only when every response says so
        var allProceed = $.makeArray(arguments).every(function (args) {
            return args[0].proceed == 'true';
        });
        if (allProceed) {
            scan_2();
        }
    });
}
I've had similar problems working heavily with SharePoint web services - you often need to pull data from multiple sources to generate input for a single process.
To solve it I embedded this kind of functionality into my AJAX abstraction library. You can easily define a request which will trigger a set of handlers when complete. However each request can be defined with multiple http calls. Here's the component (and detailed documentation):
DPAJAX at DepressedPress.com
This simple example creates one request with three calls and then passes that information, in the call order, to a single handler:
// The handler function
function AddUp(Nums) { alert(Nums[1] + Nums[2] + Nums[3]) };
// Create the pool
myPool = DP_AJAX.createPool();
// Create the request
myRequest = DP_AJAX.createRequest(AddUp);
// Add the calls to the request
myRequest.addCall("GET", "http://www.mysite.com/Add.htm", [5,10]);
myRequest.addCall("GET", "http://www.mysite.com/Add.htm", [4,6]);
myRequest.addCall("GET", "http://www.mysite.com/Add.htm", [7,13]);
// Add the request to the pool
myPool.addRequest(myRequest);
Note that, unlike many of the other solutions provided, this method does not force single-threading of the calls being made: each will still run as quickly (or as slowly) as the environment allows, but the single handler will only be called when all of them are complete. It also supports the setting of timeout values and retry attempts if your service is a little flaky.
I've found it insanely useful (and incredibly simple to understand from a code perspective). No more chaining, no more counting calls and saving output. Just "set it and forget it".