I'm writing some JavaScript/AJAX code.
Is there any way to ensure that the server receives the XML requests in the order that they are sent?
If not with plain Ajax, do I get this guarantee if I send everything over a single WebSocket?
Thanks!
If it is of utmost importance that they're received in the proper order, and attaching an iterating id to the form isn't enough:
msg_number = 1; sendAJAX(msg_number); msg_number++;
Then I'd suggest building your own queue-system, and send each subsequent file as the callback of the previous one.
Rather than each element having its own AJAX-access, create one centralized spot in your application to handle that.
Your different AJAX-enabled sections don't even need to know that it is a queue:
AJAX.send({ url : "......", method : "post", success : function () {}, synchronous : true });
On the other side of that, you could have something like:
AJAX.send = function (obj) {
    if (obj.synchronous) {
        addToSyncQueue(obj);
        checkQueue();
    } else {
        fireRequest(obj);
    }
};
Inside of your sync queue, all you'd need to do is wrap a new function around the old callback:
callback = (function (old_cb) {
    return function (response) {
        checkQueue();
        old_cb(response);
    };
}(obj.success));

obj.success = callback;
AJAX.call(obj);
Inside of checkQueue, you'd just need to see if it was empty, and if it wasn't, use
nextObj = queue.shift(); (if you're .push()-ing objects onto the queue -- so first-in, first-out, like you wanted).
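A minimal sketch of how that could fit together, reusing the names from above (queueBusy is an added flag, and this is only an assumption about the surrounding glue, not a real library):

var queue = [];
var queueBusy = false;

function addToSyncQueue(obj) {
    queue.push(obj);
}

function checkQueue() {
    if (queueBusy || queue.length === 0) { return; }
    queueBusy = true;
    var nextObj = queue.shift();               // first-in, first-out
    var old_cb = nextObj.success;
    nextObj.success = function (response) {    // wrap the old callback
        queueBusy = false;
        checkQueue();                          // kick off the next queued request
        if (old_cb) { old_cb(response); }
    };
    AJAX.call(nextObj);                        // the low-level request function from above
}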
A couple of options come to mind:
Send them sequentially, waiting for a successful response from the server after each XML request before sending the next (i.e. make a queue).
If you know the number of requests you'll be sending beforehand, send the request number as a tag with each request, e.g. <requestNum>1</requestNum><numRequests>5</numRequests>. This doesn't guarantee the order that they're received in, but guarantees that they can be put back in order afterwards, and has the added benefit of being sure that you have all the data.
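A rough sketch of that second option on the client side, using jQuery for brevity (the endpoint and tag names are just examples):

var numRequests = 5;
for (var i = 1; i <= numRequests; i++) {
    // tag each payload with its position in the sequence so the server can reorder them
    var xml = '<request>' +
              '<requestNum>' + i + '</requestNum>' +
              '<numRequests>' + numRequests + '</numRequests>' +
              // ... the actual data for this request ...
              '</request>';
    $.ajax({
        url: '/receive-xml',     // hypothetical endpoint
        type: 'post',
        contentType: 'text/xml',
        data: xml
    });
}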
At my company we use this little ajaxQueue plugin, written by one of the core jQuery contributors:
http://gnarf.net/2011/06/21/jquery-ajaxqueue/
Related
I'm trying to figure out if there's any way to receive the completion status of a task (triggered via an AJAX call) via multiple, time-intervalled AJAX calls.
Basically, during the execution of something that could take long, I want to populate some variable and return its value when asked.
Server code looks like this:
function setTask($total,$current){
$this->task['total'] = $total;
$this->task['current'] = $current;
}
function setEmptyTask(){
$this->task = [];
}
function getTaskPercentage(){
return ($this->task['current'] * 100) / $this->task['total'];
}
function actionGetTask(){
if (Yii::$app->request->isAjax) {
\Yii::$app->response->format = \yii\web\Response::FORMAT_JSON;
return [
'percentage' => $this->getTaskPercentage(),
];
}
}
Let's say I'm in a loop, and I know how many times I will iterate:
function actionExportAll(){
$size = sizeof($array);
$c = 0;
foreach($array as $a){
// do something that takes relatively long
$this->setTask($size,$c++);
}
}
On the client side I have this:
function exportAll(){
var intervalId = setInterval(function(){
$.ajax({
url: '/get-task',
type: 'post',
success: function(data){
console.log(data);
}
});
},3000);
$.ajax({
url: '/export-all',
type: 'post',
success: function(data){
clearInterval(intervalId); // cancel setInterval
// ..
}
});
}
This looks like it could work, except that the AJAX calls made from the setInterval function only complete after "export-all" has finished and entered its success callback.
There's surely something that I'm missing in this logic.
Thanks
The problem is probably in session locking.
Let's take a look at what is going on.
The request to /export-all is sent by the browser.
The app on the server calls session_start(), which opens the session file and locks access to it.
The app begins the expensive operations.
In the browser the interval elapses and the browser sends the request to /get-task.
The app on the server tries to handle the /get-task request and calls session_start(). It is blocked and has to wait for the /export-all request to finish.
The expensive operations of /export-all finish and the response is sent to the browser.
The session file is unlocked and the /get-task request can finally continue past session_start(). Meanwhile the browser has received the /export-all response and executes the success callback for it.
The /get-task request finishes and its response is sent to the browser.
The browser receives the /get-task response and executes its success callback.
The best way to deal with this is to avoid running expensive tasks directly from requests made by the user's browser.
Your export-all action should only schedule the task for execution. The task itself can then be executed by some cron action or by a worker running in the background. And /get-task can check its progress and trigger the final actions when the task is finished.
You should take a look at the yiisoft/yii2-queue extension. It allows you to create jobs, enqueue them, and run the jobs from the queue via a cron task or a daemon that listens for tasks and executes them as they come.
Without trying to dive into your code, which I don't have time to do, I'll say that the essential process looks like this:
Your first AJAX call is "to schedule the unit of work ... somehow." The result of this call is to indicate success and to hand back some kind of nonce, or token, which uniquely identifies the request. This does not necessarily indicate that processing has begun, only that the request to start it has been accepted.
Your next calls request "progress," and provide the nonce given in step #1 as the means to refer to it. The immediate response is the status at this time.
Presumably, you also have some kind of call to retrieve (and remove) the completed request. The same nonce is once again used to refer to it. The immediate response is that the results are returned to you and the nonce is cancelled.
Obviously, you must have some client-side way to remember the nonce(s). "Sessions" are the most-common way to do that. "Local storage," in a suitably-recent web browser, can also be used.
Also note ... as an important clarification ... that the title to your post does not match what's happening: one AJAX call isn't happening "during" another AJAX call. All of the AJAX calls return immediately. But, all of them refer (by means of nonces) to a long-running unit of work that is being carried out by some other appropriate means.
(By the way, there are many existing "workflow managers" and "batch processing systems" out there, open-source on Github, Sourceforge, and other such places. Be sure that you're not re-inventing what someone else has already perfected! "Actum Ne Agas: Do Not Do A Thing Already Done." Take a few minutes to look around and see if there's something already out there that you can just steal.)
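A rough client-side sketch of that schedule/poll/retrieve flow, assuming hypothetical endpoints (/schedule-task, /task-progress, /task-result) and response fields (nonce, percentage, finished):

function startWork() {
    // step 1: ask the server to schedule the unit of work; it hands back a nonce
    $.post('/schedule-task', function (resp) {
        pollProgress(resp.nonce);
    }, 'json');
}

function pollProgress(nonce) {
    // step 2: ask for progress, referring to the work by its nonce
    $.post('/task-progress', { nonce: nonce }, function (resp) {
        console.log(resp.percentage + '% done');
        if (resp.finished) {
            // step 3: retrieve (and remove) the completed result
            $.post('/task-result', { nonce: nonce }, function (result) {
                // use the result here
            }, 'json');
        } else {
            setTimeout(function () { pollProgress(nonce); }, 3000);
        }
    }, 'json');
}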
So basically I found the solution for this very problem by myself.
What you need to do is replace the server-side code above with this:
function setTask($total,$current){
$_SESSION['task']['total'] = $total;
$_SESSION['task']['current'] = $current;
session_write_close();
}
function setEmptyTask(){
$_SESSION['task'] = [];
session_write_close();
}
function getTaskPercentage(){
return ($_SESSION['task']['current'] * 100) / $_SESSION['task']['total'];
}
function actionGetTask(){
if (Yii::$app->request->isAjax) {
\Yii::$app->response->format = \yii\web\Response::FORMAT_JSON;
return [
'percentage' => $this->getTaskPercentage(),
];
}
}
This works, but I'm not completely sure if it is a good practice.
From what I can tell, it seems that session_write_close() frees the lock on the $_SESSION data and makes it readable by another request (hence my actionGetTask()) while actionExportAll() is still executing.
Maybe somebody could expand on this answer and tell more about it.
Thanks for the answers, I will certainly dig more in those approaches and maybe try to make this same task in a better, more elegant and logic way.
I am implementing a queue system for various pieces of information. When it reaches a certain number of items, I send an AJAX request. The user inputs data, and when it reaches a certain point I send it. BUT the user can still be entering data, and I don't want to lose that. So I was thinking I could use a $.Deferred/promise: store the data up to a certain point, fire the AJAX request, and only allow a new request when the previous deferred has resolved successfully. Also, if the data being entered then grows again to the point where I have to send it, I queue it.
I am having a hard time wrapping my brain around how to implement this.
===> capture data
=======> 'n' amount of data is entered
=============> move that data into the 'ready' bucket (arbitrary; let's say the user entered 10 input fields and I store them into an array. When the array reaches 10.. boom, send it).
=============> fire ajax with the 10 items
In the meantime the user can still be entering data. I want to make sure I still capture it and keep queueing and sending at 10.
I was thinking of a queuing system with a deferred. Not sure if I am overthinking this.
Since the jqXHR object returned by $.ajax() implements the Promise interface, it can be used for this.
var data = {
// captured data goes here
};
function sendData( val ){
// jqXHR object (which contains a promise)
return $.ajax('/foo/', {
data: { value: val },
dataType: 'json',
success: function( resp ){
// do whatever needed
}
});
}
function when(){
$.when(sendData(data)).done(function (resp) {
when();
});
}
when(); // use this within the if switch
DEMO
Assuming your queue is the array dataQueue, then you can do something like this :
var dataQueue = [];//sacrificial queue of items to be sent in batches via AJAX request
var batchSize = 10;
var requesting = false;//flag used to suppress further requests while a request is still being serviced
//addToQueue: a function called whenever an item is to be added to he queue.
function addToQueue(item) {
dataQueue.push(item);
send();//(conditional on queue length and no request currently being serviced)
}
function send() {
if(dataQueue.length >= batchSize && !requesting) {//is the queue long enough for a batch to be sent, and is no ajax request being serviced
$.ajax({
url: '/path/to/server/side/script',
data: JSON.stringify(dataQueue.splice(0, batchSize)),//.splice removes items from the queue (fifo)
... //further ajax options
}).done(handleResponse).fail(handleFailure).always(resetSend);
requesting = true;
}
}
function handleResponse(data, textStatus, jqXHR) {
//handle the server's response data here
}
function handleFailure(jqXHR, textStatus, errorThrown) {
//handle failure here
}
function resetSend() {
requesting = false;//Lower the flag, to allow another batch to go whenever the queue is long enough.
send();//Call send again here in case the queue is already long enough for another batch.
}
DEMO
Notes:
There's no particular reason to return the jqXHR (or anything else) from send but by all means do so if your application would benefit.
resetSend need not necessarily be called as the .always handler. Calling it from the .done handler (and not the .fail handler) would have the effect of "die on failure".
To minimise the number of members in your namespace (global or whatever), you might choose to encapsulate the whole thing in a constructor function or a singleton namespace pattern, both of which are pretty trivial.
Encapsulating in a constructor would allow you to have two or more queues with the desired behaviour, each with its own settings.
The demo has a few extra lines of code to make the process observable.
In the demo, you can set the batchsize to 15, add-add-add to get the queue length up to, say, 12, then reduce the batchsize to 5 and add another item. You should see two sequential requests, and 3 residual items in the queue.
When communicating with a server in javascript in my single page browser application, I would like to provide a callback function that is always called after the server replies, regardless of whether the result was a success or some kind of error.
Two cases where I need this:
1) I want to disable a "save" button while waiting for the server's response, and enable it again after the server responds with an error or a success.
2) I have a polling mechanism where I want to prevent stacking of calls when the server for some reason is being slow to respond - I want to wait for one poll call to finish before making the next.
One solution I have right now involves making sure that two functions (success and error) get passed along as options in a long method chain, which feels like a fragile and cumbersome solution (pseudo-ish code):
function doCall() {
    framework123.callit({ success: myCallback, error: myCallback });
}

framework123.callit = function (options) {
    options = options || {};
    if (options.error) {
        var oldError = options.error;
        options.error = function (errorStuff) {
            // callit error stuff
            oldError(errorStuff);
        };
    } else {
        options.error = function () {
            // callit error stuff
        };
    }
    generalCallFunction(options);
};

function generalCallFunction(options) {
    // ... checking success and error once again to get general success and error stuff in there, plus adding more options
    ajax(blah, blah, options);
}
I also have a backbone solution where I listen to the sync event plus an error callback, in similar ways as above.
I'm always scared that error or success functions get lost on the way, and the whole thing is hard to follow.
Any framework or pattern for making this stuff as easy as possible? Is it a weird thing to have general things that should always happen whether the result was an error or a success?
You can use jQuery.ajax({ /* details here... */ }).always(callback);
Or, in Backbone
// logic to create model here
model.fetch().always(callback);
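For instance, the save-button case (1) might look roughly like this; the selector, URL and collectFormData() are made up for the example:

$('#save-button').on('click', function () {
    var $btn = $(this).prop('disabled', true);        // disable while waiting for the server
    $.ajax({ url: '/save', type: 'post', data: collectFormData() })
        .done(function (resp) { /* handle success */ })
        .fail(function (jqXHR) { /* handle error */ })
        .always(function () {
            $btn.prop('disabled', false);             // runs after success and error alike
        });
});

The polling case (2) works the same way: start the next poll from the .always handler, so a new call is only made once the previous one has finished, whatever its outcome.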
We've all seen some examples in AJAX tutorials where some data is sent. They all (more or less) look like:
var http = createRequestObject(); // shared between printResult() and doAjax()
function createRequestObject() { /* if FF/Safari/Chrome/IE ... */ ... }
function printResult()
{
if (http.readyState == 4) { ... }
}
function doAjax() {
var request = 'SomeURL';
http.open('post', request);
http.onreadystatechange = printResult;
data = ...; // fill in the data
http.send(data);
}
// trigger doAjax() from HTML code, by pressing some button
Here is the scenario I don't understand completely: what if the button is pressed several times very quickly? Should doAjax() somehow re-initialize the http object? And if the object is re-initialized, what happens to the requests that are already in flight?
PS: to moderator: this question is probably more community-wiki related. As stated here (https://meta.stackexchange.com/questions/67581/community-wiki-checkbox-missing-in-action) - if I've got it right - please mark this question appropriately.
Since AJAX is asynchronous by nature, each button click raises an async event that GETs/POSTs some data from/to the server. You provide one callback, so it will be triggered each time the server finishes processing one of those requests.
This is the normal behaviour by default; you do not need to reinitialize the http object. If you want to prevent multiple send operations you have to do that manually (e.g. by disabling the button as soon as the first call is made).
I also suggest using jQuery's $.ajax because it encapsulates many of these details.
Sure, numerous libraries exist nowadays that do a decent job and should be used in a production environment. However, my question was about the under-the-hood details. So here I've found a lambda-calculus-like way to have a dedicated request object per request. That object is passed to the callback function which is called when the response arrives, etc.:
function printResult(http) {
if (http.readyState == 4) { ... }
...
}
function doAjax() {
var http = createRequestObject();
var request = 'SomeURL';
http.open('get', request);
http.onreadystatechange = function() { printResult(http); };
http.send(null);
return false;
}
Successfully tested under Chrome and IE9.
I've used a per-page request queue to deal with this scenario (to suppress duplicate requests and to ensure the sequential order of requests), but there may be a more standardized solution.
Since this is not provided by default, you would need to implement it in JavaScript within your page (or a linked script). Instead of starting an Ajax request, clicking a button would add a request to a queue. If the queue is empty, execute the Ajax request, with a callback that removes the queued entry and executes the next (if any).
See also: How to implement an ajax request queue using jQuery
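A minimal sketch of such a per-page queue (the enqueueRequest/runNext names are made up, and duplicate suppression is left out for brevity):

var pendingRequests = [];   // per-page queue of request descriptors

function enqueueRequest(req) {
    pendingRequests.push(req);
    if (pendingRequests.length === 1) {   // queue was empty, so nothing is in flight: start immediately
        runNext();
    }
}

function runNext() {
    var req = pendingRequests[0];
    $.ajax(req).always(function () {
        pendingRequests.shift();          // remove the queued entry that just finished
        if (pendingRequests.length) {
            runNext();                    // execute the next one, preserving order
        }
    });
}

// clicking the button only queues the request instead of firing it directly:
// enqueueRequest({ url: 'SomeURL', type: 'post', data: data });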
I want to retrieve the height and width of an image on a server by using an AJAX post call to a PHP file which returns a double-pipe-delimited string 'width||height'.
My JavaScript correctly alerts the requested string from the PHP file, so the info is now in my script, but I cannot seem to access it outside the $.post function.
This works:
var getImagesize = function(sFilename)
{
$.post("getImagesize.php", { filename: sFilename, time: "2pm" },
function(data){
alert(data.split('||'));
});
}
But retrieving is a different matter:
// this line calls the function in a loop through images:
var aOrgdimensions = getImagesize($(this, x).attr('src'));
alert(aOrgdimensions);
// the called function now looks like this:
var getImagesize = function(sFilename)
{
var aImagedims = new Array();
$.post("getImagesize.php", { filename: sFilename },
function(data){
aImagedims = data.split('||');
});
return "here it is" + aImagedims ;
}
Anyone able to tell me what I'm doing wrong?
You are misunderstanding the way that an AJAX call works. The first "A" in AJAX stands for asynchronous, which means that a request is made independent of the code thread you are running. That is the reason that callbacks are so big when it comes to AJAX, as you don't know when something is done until it is done. Your code, in the meantime, happily continues on.
In your code, you are trying to assign a variable, aOrgdimensions a value that you will not know until the request is done. There are two solutions to this:
Modify your logic to reconcile the concept of callbacks and perform your actions once the request is done (see the sketch after this list).
Less preferably, make your request synchronous. This means the code and page will "hang" at the point of the request and only proceed once it is over. This is done by adding async: false to the jQuery options.
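A minimal sketch of the callback approach, keeping the original names (the loop body is just an example):

var getImagesize = function (sFilename, callback) {
    $.post("getImagesize.php", { filename: sFilename }, function (data) {
        callback(data.split('||'));       // hand the dimensions to the caller once they arrive
    });
};

// usage inside the loop through images:
getImagesize($(this).attr('src'), function (aOrgdimensions) {
    alert(aOrgdimensions);                // the width/height are only known here
});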
Thanks for the asynchronous explanation. I did not realize that, but at least now I know why my vars aren't available.
Edit: Figured it out. Used the callback function as suggested, and all is well. :D