jQuery: Retrieve data in a separate thread - javascript

I have a web application that has to perform the following task: for a chosen date range, it makes a GET request to a web service for each date in the range. Because I want to visualize the data afterwards, all the calls are synchronous (the result of each request gets stored into an array). This retrieval takes a while (several seconds), during which the main thread "freezes."
What would be a good way to avoid this? (E.g. doing the retrieval in a separate thread and getting notified once it's done.)

Consider using promises.
They let you make non-blocking API calls, which is basically what you are asking for.
EDIT: You can use $.when() specifically to be notified when all operations are done.

You should make your GET requests async and then visualize when all the requests have completed.
var get1 = $.get(/* ... */);
var get2 = $.get(/* ... */);
var get3 = $.get(/* ... */);
$.when(get1, get2, get3).done(function () {
    // do something with the responses (available via `arguments`)
    visualize();
});
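Since the question involves a whole date range rather than a fixed number of calls, a variant of this (a sketch; `dates` and `makeUrl` are hypothetical placeholders for however you derive the request URLs) collects the jqXHR promises in an array and applies it to $.when:
// Sketch: one GET per date, with the promises collected into an array.
// `dates` and `makeUrl` are placeholders, not part of the original answer.
var requests = dates.map(function (date) {
    return $.get(makeUrl(date));
});
$.when.apply($, requests).done(function () {
    // with several requests, each entry of `arguments` is one request's
    // [data, textStatus, jqXHR] triple, in the order the requests were listed
    visualize();
});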

In fact there is a simple solution. Let's implement a function which needs to be executed when all responses have arrived:
function onFinished(responses) {
    //Do something
}
Now, let's suppose you have a function which returns the dates as an array:
function getDates(range) {
    //Do something
}
Also, we need a getURL, like this:
function getURL(date) {
    //Do something
}
Finally, let's suppose you have a variable called dateRange which has the range you will use as input in getDates. So we can do this:
var requestDates = getDates(dateRange);
var requestsPending = requestDates.length;
var responses = [];
requestDates.forEach(function (requestDate) {
    $.ajax({
        url: getURL(requestDate),
        method: "GET"
        //You might pass data as well here if needed, in the form of
        //data: yourobject,
    }).done(function (data, textStatus, jqXHR) {
        //Handle the response, parse it and store the result using responses.push()
        //(note: responses arrive in completion order, not request order)
    }).fail(function (jqXHR, textStatus, errorThrown) {
        //Handle failed requests
    }).always(function () {
        if (--requestsPending === 0) {
            onFinished(responses);
        }
    });
});
This will send an AJAX request for each date you need and wait for the responses asynchronously, so you effectively wait for the longest pending request rather than for the sum of all pending times, which is a great optimization. You cannot solve this with threads, as JavaScript is single-threaded; instead you wait asynchronously for the answers, and the requests do not wait for each other on the server. If you own the server as well, you do not need to send a request for each date: implement a server-side API function that handles date ranges, so the client sends a single request and waits for a single answer.
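For illustration, that single-request variant might look like this on the client (the /api/measurements endpoint and its from/to parameters are hypothetical, purely to make the suggestion concrete):
// Hypothetical ranged endpoint; one round-trip instead of one per date.
$.getJSON('/api/measurements', { from: '2017-01-01', to: '2017-01-31' })
    .done(function (responses) {
        onFinished(responses); // reuse the handler defined above
    })
    .fail(function (jqXHR, textStatus, errorThrown) {
        // handle the failed request
    });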

Related

Ajax calls DURING another Ajax call to receive server's task calculation status and display it to the client as a progression bar

I'm trying to figure out if there's any chance to receive the status of completion of a task (triggered via an Ajax call), via multiple (time-intervalled) Ajax calls.
Basically, during the execution of something that could take long, I want to populate some variable and return its value when asked.
Server code looks like this:
function setTask($total,$current){
    $this->task['total'] = $total;
    $this->task['current'] = $current;
}
function setEmptyTask(){
    $this->task = [];
}
function getTaskPercentage(){
    return ($this->task['current'] * 100) / $this->task['total'];
}
function actionGetTask(){
    if (Yii::$app->request->isAjax) {
        \Yii::$app->response->format = \yii\web\Response::FORMAT_JSON;
        return [
            'percentage' => $this->getTaskPercentage(),
        ];
    }
}
Let's say I'm in a loop, and I know how many times I will iterate:
function actionExportAll(){
    $size = sizeof($array);
    $c = 0;
    foreach($array as $a){
        // do something that takes relatively long
        $this->setTask($size,$c++);
    }
}
Meanwhile, on the client side, I have this:
function exportAll(){
    var intervalId = setInterval(function(){
        $.ajax({
            url: '/get-task',
            type: 'post',
            success: function(data){
                console.log(data);
            }
        });
    }, 3000);
    $.ajax({
        url: '/export-all',
        type: 'post',
        success: function(data){
            clearInterval(intervalId); // cancel setInterval
            // ..
        }
    });
}
This looks like it could work, except that the Ajax calls made in the setInterval function only complete after /export-all has finished and entered its success callback.
There's surely something that I'm missing in this logic.
Thanks
The problem is probably in sessions.
Let's take a look at what is going on:
1. The request to /export-all is sent by the browser.
2. The app on the server calls session_start(), which opens the session file and locks access to it.
3. The app begins the expensive operations.
4. In the browser the interval elapses and the browser sends a request to /get-task.
5. The app on the server tries to handle the /get-task request and calls session_start(). It is blocked and has to wait for the /export-all request to finish.
6. The expensive operations of /export-all finish and the response is sent to the browser.
7. The session file is unlocked and the /get-task request can finally continue past session_start(). Meanwhile the browser has received the /export-all response and executed its success callback.
8. The /get-task request finishes and its response is sent to the browser.
9. The browser receives the /get-task response and executes its success callback.
The best way to deal with this is to avoid running expensive tasks directly from requests executed by the user's browser.
Your export-all action should only schedule the task for execution. The task itself can then be executed by a cron action or a background worker, and /get-task can check its progress and trigger the final actions when the task is finished.
You should take a look at the yiisoft/yii2-queue extension. It allows you to create jobs, enqueue them, and run jobs from the queue via a cron task or a daemon that listens for tasks and executes them as they come.
Without trying to dive into your code, which I don't have time to do, I'll say that the essential process looks like this:
1. Your first AJAX call is "to schedule the unit of work ... somehow." The result of this call indicates success and hands back some kind of nonce, or token, which uniquely identifies the request. This does not necessarily indicate that processing has begun, only that the request to start it has been accepted.
2. Your next calls request "progress," and provide the nonce given in step #1 as the means to refer to it. The immediate response is the status at this time.
3. Presumably, you also have some kind of call to retrieve (and remove) the completed request. The same nonce is once again used to refer to it. The results are returned to you and the nonce is cancelled.
Obviously, you must have some client-side way to remember the nonce(s). "Sessions" are the most common way to do that. "Local storage," in a suitably recent web browser, can also be used.
Also note ... as an important clarification ... that the title to your post does not match what's happening: one AJAX call isn't happening "during" another AJAX call. All of the AJAX calls return immediately. But, all of them refer (by means of nonces) to a long-running unit of work that is being carried out by some other appropriate means.
(By the way, there are many existing "workflow managers" and "batch processing systems" out there, open-source on Github, Sourceforge, and other such places. Be sure that you're not re-inventing what someone else has already perfected! "Actum Ne Agas: Do Not Do A Thing Already Done." Take a few minutes to look around and see if there's something already out there that you can just steal.)
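A minimal client-side sketch of that schedule-then-poll pattern (every endpoint and helper name here is hypothetical, purely to make the flow concrete):
// 1. Schedule the unit of work; the server replies with a nonce/token.
$.post('/schedule-export').done(function (task) {
    // 2. Poll for progress, referring to the work by its nonce.
    var intervalId = setInterval(function () {
        $.get('/task-status', { nonce: task.nonce }).done(function (status) {
            showProgress(status.percentage); // hypothetical UI helper
            if (status.finished) {
                clearInterval(intervalId);
                // 3. Retrieve (and remove) the completed result.
                $.get('/task-result', { nonce: task.nonce }).done(showResult);
            }
        });
    }, 3000);
});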
So basically I found the solution to this problem myself.
What you need to do is replace the server-side code above with this:
function setTask($total,$current){
    $_SESSION['task']['total'] = $total;
    $_SESSION['task']['current'] = $current;
    session_write_close();
}
function setEmptyTask(){
    $_SESSION['task'] = [];
    session_write_close();
}
function getTaskPercentage(){
    return ($_SESSION['task']['current'] * 100) / $_SESSION['task']['total'];
}
function actionGetTask(){
    if (Yii::$app->request->isAjax) {
        \Yii::$app->response->format = \yii\web\Response::FORMAT_JSON;
        return [
            'percentage' => $this->getTaskPercentage(),
        ];
    }
}
This works, but I'm not completely sure whether it is good practice.
From what I can tell, session_write_close() releases the lock on the session data, so it becomes readable by another request (hence my actionGetTask()) while actionExportAll() is still executing.
Maybe somebody could expand on this answer and tell more about it.
Thanks for the answers; I will certainly dig more into those approaches and maybe try to do this same task in a better, more elegant and logical way.

Queuing/throttling jQuery ajax requests

I need to fire a number of ajax requests at a server and then run a callback when they are finished. Normally this would be easy using jQuery's deferred.done(). However, to avoid overwhelming the server, I'm queuing the requests and firing one every X milliseconds.
e.g.
var workflowQueue = []; // added for completeness: the queue of pending workflows
var promisesList = [];
var delay = 100; // X milliseconds between requests

var addToQueue = function(workflow) {
    workflowQueue.push(workflow);
};

var startWorkflow = function(workflow) {
    return $.ajax($endpointURL, {
        type: "POST",
        data: {
            action: workflow.id
        },
        success: function() {
        },
        error: function(jqXHR, textStatus, errorThrown) {
        }
    });
};

var startWorkflows = function() {
    if (workflowQueue.length > 0) {
        var workflow = workflowQueue.shift();
        promisesList.push(startWorkflow(workflow));
        setTimeout(startWorkflows, delay);
    }
};

startWorkflows();
$.when(promisesList).done(function(){
    //do stuff
});
The problem with this is that the promisesList array is initially empty, so the done() callback fires immediately, and only then do the ajax requests start getting sent by the setTimeout(). Is there an easy way to create the ajax requests initially and kind of "pause" them, then fire them using setTimeout()?
I've found various throttle/queue implementations that fire ajax requests sequentially, but I'm happy for them to be fired in parallel, just with a delay between them.
The first thing you're stumbling upon is that $.when() doesn't work this way with arrays. It accepts an arbitrary argument list of promises, so you can get around that by applying the array:
$.when.apply(null, promisesList).done(function(){
    // Do something
    // use the `arguments` magic property to get an ordered list of results
});
Secondly, jQuery's $.ajax has no built-in delay option, so for the throttling I've proposed a solution that sets up a new deferred, which is returned immediately (to keep the order) but resolved only after a timeout.
See http://jsfiddle.net/9Acb2/1/ for an interactive example
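One way to realize that delayed-deferred idea (a sketch; startWorkflowsStaggered is my own naming, reusing the question's startWorkflow):
// Start each workflow i * delay ms from now, but hand back a promise for
// every one of them immediately, so $.when can be applied up front.
function startWorkflowsStaggered(queue, delay) {
    return queue.map(function (workflow, i) {
        var deferred = $.Deferred();
        setTimeout(function () {
            startWorkflow(workflow)
                .done(deferred.resolve)
                .fail(deferred.reject);
        }, i * delay);
        return deferred.promise();
    });
}

var promises = startWorkflowsStaggered(workflowQueue, delay);
$.when.apply(null, promises).done(function () {
    // all workflows finished
});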

Deferred and ajax and queuing functionality

I am implementing a queue system for various pieces of information. When the queue reaches a certain size, I send an ajax request. The user inputs data; when it reaches a certain point, I send it. But the user can still be entering data, and I don't want to lose that. So I was thinking I could use a $.Deferred/promise while storing the data up to a certain point, firing the ajax, and only allowing a new request when the previous deferred has resolved; and if the data entered meanwhile again reaches the sending point, I queue it.
I am having a hard time wrapping my brain around how to implement this.
===> capture data
=======> 'n' amount of data is entered
=============> move that data into the 'ready' bucket (arbitrary; let's say the user enters 10 input fields and I store them into an array. When the array reaches 10.. boom, send it.)
=============> fire the ajax with the 10 items
In the meantime the user can still be entering data. I want to make sure I still capture it and keep queuing and sending in batches of 10.
I was thinking of a queuing system with a deferred. Not sure if I am overthinking this.
Since the jqXHR object returned by $.ajax() is a Promise, that can be used.
var data = {
    // captured data goes here
};

function sendData( val ){
    // jqXHR object (which contains a promise)
    return $.ajax('/foo/', {
        data: { value: val },
        dataType: 'json',
        success: function( resp ){
            // do whatever needed
        }
    });
}

function when(){
    $.when(sendData(data)).done(function (resp) {
        when();
    });
}

when(); // use this within the if switch
DEMO
Assuming your queue is the array dataQueue, you can do something like this:
var dataQueue = []; //sacrificial queue of items to be sent in batches via AJAX request
var batchSize = 10;
var requesting = false; //flag used to suppress further requests while a request is still being serviced

//addToQueue: a function called whenever an item is to be added to the queue.
function addToQueue(item) {
    dataQueue.push(item);
    send(); //(conditional on queue length and no request currently being serviced)
}

function send() {
    if (dataQueue.length >= batchSize && !requesting) { //is the queue long enough for a batch, and is no ajax request being serviced?
        $.ajax({
            url: '/path/to/server/side/script',
            data: JSON.stringify(dataQueue.splice(0, batchSize)), //.splice removes items from the queue (fifo)
            ... //further ajax options
        }).done(handleResponse).fail(handleFailure).always(resetSend);
        requesting = true;
    }
}

function handleResponse(data, textStatus, jqXHR) {
    //handle the server's response data here
}

function handleFailure(jqXHR, textStatus, errorThrown) {
    //handle failure here
}

function resetSend() {
    requesting = false; //Lower the flag, to allow another batch to go whenever the queue is long enough.
    send(); //Call send again here in case the queue is already long enough for another batch.
}
DEMO
Notes:
There's no particular reason to return the jqXHR (or anything else) from send, but by all means do so if your application would benefit.
resetSend need not necessarily be called as the .always handler. Calling it from the .done handler (and not the .fail handler) would have the effect of "die on failure".
To minimise the number of members in your namespace (global or whatever), you might choose to encapsulate the whole thing in a constructor function or a singleton namespace pattern, both of which are pretty trivial.
Encapsulating in a constructor would allow you to have two or more queues with the desired behaviour, each with its own settings; a rough sketch follows after these notes.
The demo has a few extra lines of code to make the process observable.
In the demo, you can set the batch size to 15, add-add-add to get the queue length up to, say, 12, then reduce the batch size to 5 and add another item. You should see two sequential requests, and 3 residual items in the queue.
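A rough sketch of that constructor encapsulation (the names BatchQueue, /log/clicks and /log/errors are illustrative, not part of the answer above):
// Each BatchQueue instance gets its own url, batch size and in-flight flag.
function BatchQueue(url, batchSize) {
    this.url = url;
    this.batchSize = batchSize;
    this.items = [];
    this.requesting = false;
}

BatchQueue.prototype.add = function (item) {
    this.items.push(item);
    this.send();
};

BatchQueue.prototype.send = function () {
    var self = this;
    if (this.items.length >= this.batchSize && !this.requesting) {
        this.requesting = true;
        $.ajax({
            url: this.url,
            method: 'POST',
            data: JSON.stringify(this.items.splice(0, this.batchSize))
        }).always(function () {
            self.requesting = false;
            self.send(); // another batch may already be waiting
        });
    }
};

// Two independent queues, each with its own settings:
var clickQueue = new BatchQueue('/log/clicks', 10);
var errorQueue = new BatchQueue('/log/errors', 5);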

Is there a better way I can measure the total time it takes to return data from an Ajax call?

I am looking at a method like this:
function aa() {
    var start = new Date().getTime();
    $.ajax({
        cache: false,
        url: href + params.param,
        dataType: 'html'
    })
    .done(function (responseText) {
        xx();
    })
    .fail(function (jqXHR, textStatus, errorThrown) {
        yy();
    })
    .always(function () {
        zz(start);
    });
}

function zz(start) {
    var end = new Date().getTime();
    var elapsed = end - start; // both values are already millisecond timestamps
}
First of all: is this a valid way to measure the time it takes? And is there another way I could do this? For example, is there something I could read from the Ajax headers, and if so, how would I read it?
Please note I am looking for a solution in code so I can report the times on an HTML page.
This is absolutely a valid way to measure the amount of time between when a request is issued and when data is returned. The goal is to measure the amount of time the client sees pass between issuing a request and receiving a response, so using the client's clock is ideal. (You usually want to avoid time comparisons that involve syncing the server's and client's clocks; using the Date: HTTP header would do just that.)
There are two things I would do to improve:
Bind the always handler first. Since Deferred callbacks are executed in the order they are added, binding always first means we won't have the measurement thrown off by computationally expensive operations in the done handler.
There's no need to instantiate Date objects. The static Date.now() returns the current time in ms as a Number.
You could even do the elapsed calculation in scope so it's available to your done/fail callbacks.
function aa() {
    var start = Date.now(), elapsed;
    $.ajax({
        cache: false,
        url: href + params.param,
        dataType: 'html'
    })
    .always(function () {
        elapsed = Date.now() - start;
    })
    .done(function (responseText) {
        xx(elapsed);
    })
    .fail(function (jqXHR, textStatus, errorThrown) {
        yy(elapsed);
    });
}
Firebug for Firefox will tell you how long a call takes
I would argue that your way is pretty clean. There is nothing built-in to the response that tells you how long a request takes unless you put it there yourself. If you try to do some sort of server-side logging though, that will not take into account the time taken for the request to get across the wire.
Perhaps run some kind of monitoring PC that just reloads the page every once in a while and keeps its own log ... sending out an alert if the transfer takes too long.
Under *nix, you could automate that with a simple script and wget and/or curl.
If you are using Ruby or .NET, you can take a look at MiniProfiler. It will do what you want, or you can check how it does the timing internally.

Trigger a function only after the completion of multiple AJAX requests

I've got a particular function I want to run once, and only after the completion of several AJAX requests.
My current solution looks a bit like this:
function doWork() {
    //This is the function to be run once after all the requests
}

//some tracking/counting variables
var ajaxDoneCounter = 0;
var numOfAjaxRequests = 5;
var workDone = false;

function doWorkTrigger() {
    ajaxDoneCounter++;
    if( !workDone && ajaxDoneCounter >= numOfAjaxRequests ) {
        workDone = true;
        doWork();
    }
}

// ...
//and a number of ajax requests (some hidden within functions, etc)
//they look something like this:
$.ajax({
    url: "http://www.example.com",
    dataType: "json",
    success: function( data ) {
        //load data in to variables, etc
        doWorkTrigger();
    }
});
One obvious pitfall in the above is that any AJAX call that is not successful will not increment ajaxDoneCounter, and so doWork() will probably never be called. I can get around that using the error callback inside any $.ajax, so that doesn't worry me too much.
What I want to know is whether the above is safe and/or good practice.
Is there a trick I've missed, or anything else that might work better?
Update: Since jQuery 1.5, deferred objects provide a cleaner solution; a short sketch follows.
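A minimal sketch of that deferred approach (the URLs are placeholders, not from the original answer):
// Every $.ajax call returns a jqXHR, which is a promise; $.when combines them.
var requests = [
    $.ajax({ url: "http://www.example.com/a", dataType: "json" }),
    $.ajax({ url: "http://www.example.com/b", dataType: "json" })
];
$.when.apply($, requests)
    .done(function () {
        doWork(); // runs once, after every request has succeeded
    })
    .fail(function () {
        // at least one request failed; $.when rejects immediately
    });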
I would use .ajaxComplete(); it will be triggered whenever an Ajax call completes (success or error):
var numOfAjaxRequests = 5;

$(document).ajaxComplete(function() {
    numOfAjaxRequests--;
    if(!numOfAjaxRequests) {
        doWork();
    }
});
Then you don't have to edit every Ajax request.
You could even use .ajaxSend() to get notified of starting Ajax requests, instead of hardcoding the count (but I am not sure whether this really works; maybe you will experience race conditions):
var numOfAjaxRequests = 0;

$(document).ajaxSend(function() {
    numOfAjaxRequests++;
});
I think you should use the complete(XMLHttpRequest, textStatus) Ajax event instead of success(data, textStatus, XMLHttpRequest).
According to jQuery help:
complete(XMLHttpRequest, textStatus)
A function to be called when the request finishes (after success and error callbacks are executed). The function gets passed two arguments: the XMLHttpRequest object and a string describing the status of the request. This is an Ajax Event.
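Applied to the question's snippet, that would look something like this sketch, counting completions rather than successes:
$.ajax({
    url: "http://www.example.com",
    dataType: "json",
    success: function( data ) {
        //load data in to variables, etc
    },
    complete: function( jqXHR, textStatus ) {
        // runs after success or error, so failed requests still count
        doWorkTrigger();
    }
});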
I don't know enough about JavaScript internals, but there is a danger that the operation ajaxDoneCounter++ is not atomic. If that were the case, this could be subject to a race condition. (In practice, browser JavaScript runs event handlers one at a time to completion, so the increment cannot be interrupted.)
