I'm building a Backbone application and I'm observing some behaviour I can't place. Consider the following Collection:
window.Pictures = Backbone.Collection.extend({
    model: Picture,
    url: 'latest.json',
    parse: function(response) {
        this.foobar = 1;
        return response; // parse should return the array of model attributes
    },
    fetchPage: function() {
        this.foobar = 2;
        return this;
    }
});
On a Chrome (or Firefox) console I've issued the following command:
> p = new Pictures(); p.fetch(); p.fetchPage();
> p.foobar
1
When I do:
> p = new Pictures(); p.fetch()
> p.fetchPage();
> p.foobar
2
I really don't understand this. Why is the first execution different from the second execution?
The fetch call is asynchronous because it involves an AJAX call to the server:
fetch collection.fetch([options])
Fetch the default set of models for this collection from the server, resetting the collection when they arrive.
And fetch will call parse:
parse collection.parse(response)
parse is called by Backbone whenever a collection's models are returned by the server, in fetch.
So p.parse() may be called before or after p.fetchPage() depending on timing issues that are beyond your control.
In the first case:
> p = new Pictures(); p.fetch(); p.fetchPage();
fetchPage is getting called before fetch gets its response from the server and gets around to calling parse, so the calling sequence ends up like this:
1. You call p.fetch().
2. The AJAX call is made.
3. You call p.fetchPage().
4. The AJAX response is received.
5. The AJAX success handler calls p.parse().
In the second case:
> p = new Pictures(); p.fetch()
> p.fetchPage();
Enough time passes between the two commands for the AJAX call to return before p.fetchPage() is called, so things happen in the order you expect purely by accident.
If you need things to happen in a certain order then you'll need to use the success (and possibly error) callback that fetch provides:
The options hash takes success and error callbacks which will be passed (collection, response) as arguments.
So this should give you a consistent result of 2:
p = new Pictures();
p.fetch({
    success: function(collection, response) {
        collection.fetchPage();
        console.log(collection.foobar);
    }
});
Of course, if fetchPage involves an AJAX call then you'd have to add yet another layer of callbacks to get a consistent foobar value.
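For example, if fetchPage returned the jqXHR from its own $.ajax() call, you could chain the two with promises instead of nesting callbacks. A minimal sketch, assuming fetchPage is rewritten to return its jqXHR and jQuery 1.8+ (where .then() chains promises):
var p = new Pictures();
// fetch() returns a jqXHR, which is a promise
p.fetch().then(function() {
    // assumes fetchPage has been changed to return its own jqXHR
    return p.fetchPage();
}).then(function() {
    console.log(p.foobar); // consistently 2, after both requests complete
});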
Related
I need to fire a number of ajax requests at a server and then run a callback when they are finished. Normally this would be easy using jQuery's deferred.done(). However, to avoid overwhelming the server, I'm queuing the requests and firing one every X milliseconds.
e.g.
var workflowQueue = []; // assumed: the queue of pending workflows
var promisesList = [];
var delay = 500; // X milliseconds between requests

var addToQueue = function(workflow) {
    workflowQueue.push(workflow);
};

var startWorkflow = function(workflow) {
    return $.ajax($endpointURL, {
        type: "POST",
        data: {
            action: workflow.id
        },
        success: function() {
        },
        error: function(jqXHR, textStatus, errorThrown) {
        }
    });
};

var startWorkflows = function() {
    if (workflowQueue.length > 0) {
        var workflow = workflowQueue.shift();
        promisesList.push(startWorkflow(workflow));
        setTimeout(startWorkflows, delay);
    }
};

startWorkflows();
$.when(promisesList).done(function(){
    //do stuff
});
The problem with this is that the promisesList array is initially empty, so the done() callback fires immediately, and only then do the ajax requests start getting sent by the setTimeout(). Is there an easy way to create the ajax requests initially and kind of "pause" them, then fire them using setTimeout()?
I've found various throttle/queue implementations to fire ajax requests sequentially, but I'm happy for them to be fired in parallel, just with a delay on them.
The first thing you're stumbling upon is that when() doesn't work this way with arrays. It accepts an arbitrary list of promises, so you can get around that by applying the array:
$.when.apply(null, promisesList).done(function(){
    // Do something
    // use the `arguments` magic property to get an ordered list of results
});
Secondly, $.ajax() has no built-in delay option, so for the throttling I've proposed a solution that sets up a new deferred which is returned immediately (to keep the order) but which is only resolved after the delayed request completes.
See http://jsfiddle.net/9Acb2/1/ for an interactive example
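A minimal sketch of that idea (the delayedAjax wrapper and the 500 ms spacing are my own illustration; the fiddle's code may differ in details):
function delayedAjax(options, wait) {
    var deferred = $.Deferred();
    // hand back a promise immediately, but only fire the real request after `wait` ms
    setTimeout(function() {
        $.ajax(options).done(deferred.resolve).fail(deferred.reject);
    }, wait);
    return deferred.promise();
}

// build the full list of promises up front, staggering the actual requests
var promisesList = workflowQueue.map(function(workflow, i) {
    return delayedAjax({
        url: $endpointURL,
        type: "POST",
        data: { action: workflow.id }
    }, i * 500);
});

$.when.apply(null, promisesList).done(function() {
    // all requests have completed
});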
I am implementing a queue system for various information. When it reaches a certain number of items, I send an ajax request. The user inputs data, and when it reaches a certain point I send it. BUT the user can still be entering data, and I don't want to lose that. So I was thinking I could use a $.Deferred/promise while storing the data up to a certain point, then fire the ajax, and only allow a new request when the previous deferred is successful. Also, if the data entered in the meantime grows to the point where I have to send again, I queue it.
I am having a hard time wrapping my brain around the methodology of how to implement.
===> capture data
=======> 'n' amount of data is entered
=============> move that data into the 'ready' bucket (arbitrary; let's say the user enters 10 input fields and I store them into an array; when the array reaches 10... boom, send it).
=============> fire ajax with the 10 items
In the meantime the user can still be entering data. I want to make sure I still capture it and keep queueing and sending at 10.
I was thinking of a queuing system with a deferred. Not sure if I am over thinking this.
Since the jqXHR object returned by $.ajax() is a Promise, that can be used.
var data = {
    // captured data goes here
};

function sendData( val ){
    // $.ajax returns a jqXHR object (which contains a promise)
    return $.ajax('/foo/', {
        data: { value: val },
        dataType: 'json',
        success: function( resp ){
            // do whatever needed
        }
    });
}

function when(){
    $.when(sendData(data)).done(function (resp) {
        when();
    });
}

when(); // use this within the if switch
DEMO
Assuming your queue is the array dataQueue, you can do something like this:
var dataQueue = []; //sacrificial queue of items to be sent in batches via AJAX request
var batchSize = 10;
var requesting = false; //flag used to suppress further requests while a request is still being serviced

//addToQueue: a function called whenever an item is to be added to the queue.
function addToQueue(item) {
    dataQueue.push(item);
    send(); //(conditional on queue length and no request currently being serviced)
}

function send() {
    if (dataQueue.length >= batchSize && !requesting) { //is the queue long enough for a batch to be sent, and is no ajax request being serviced
        requesting = true;
        $.ajax({
            url: '/path/to/server/side/script',
            data: JSON.stringify(dataQueue.splice(0, batchSize)), //.splice removes items from the queue (fifo)
            ... //further ajax options
        }).done(handleResponse).fail(handleFailure).always(resetSend);
    }
}

function handleResponse(data, textStatus, jqXHR) {
    //handle the server's response data here
}

function handleFailure(jqXHR, textStatus, errorThrown) {
    //handle failure here
}

function resetSend() {
    requesting = false; //Lower the flag, to allow another batch to go whenever the queue is long enough.
    send(); //Call send again here in case the queue is already long enough for another batch.
}
DEMO
Notes:
There's no particular reason to return the jqXHR (or anything else) from send, but by all means do so if your application would benefit.
resetSend need not necessarily be called as the .always handler. Calling it from the .done handler (and not the .fail handler) would have the effect of "die on failure".
To minimise the number of members in your namespace (global or whatever), you might choose to encapsulate the whole thing in a constructor function or a singleton namespace pattern, both of which are pretty trivial.
Encapsulating in a constructor would allow you to have two or more queues with the desired behaviour, each with its own settings; see the sketch after these notes.
The demo has a few extra lines of code to make the process observable.
In the demo, you can set the batchsize to 15, add-add-add to get the queue length up to, say, 12, then reduce the batchsize to 5 and add another item. You should see two sequential requests, and 3 residual items in the queue.
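A minimal sketch of the constructor approach (the BatchQueue name and its two settings are my own; the send/flag logic is the same as above):
function BatchQueue(url, batchSize) {
    var queue = [];
    var requesting = false;

    function send() {
        if (queue.length >= batchSize && !requesting) {
            requesting = true;
            $.ajax({
                url: url,
                data: JSON.stringify(queue.splice(0, batchSize))
            }).always(function () {
                requesting = false;
                send(); // drain any further full batches
            });
        }
    }

    // the only public member; everything else stays out of the enclosing namespace
    this.add = function (item) {
        queue.push(item);
        send();
    };
}

// two independent queues, each with its own settings
var clicks = new BatchQueue('/log/clicks', 10);
var errors = new BatchQueue('/log/errors', 5);
clicks.add({ x: 1, y: 2 });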
When communicating with a server in javascript in my single page browser application, I would like to provide a callback function that is always called after the server replies, regardless of whether the result was a success or some kind of error.
Two cases where I need this:
1) I want to disable a "save" button while waiting for the server's response, and enable it again after the server responds with an error or a success.
2) I have a polling mechanism where I want to prevent stacking of calls when the server for some reason is being slow to respond - I want to wait for one poll call to finish before making the next.
One solution I have right now involves making sure that two functions (success and error) get passed along as options in a long method chain, which feels like a fragile and cumbersome solution (pseudo-ish code):
function doCall() {
    framework123.callit({ success: myCallback, error: myCallback });
}

framework123.callit = function(options) {
    options = options || {};
    if (options.error) {
        var oldError = options.error;
        options.error = function(errorStuff) {
            // callit error stuff
            oldError(errorStuff);
        };
    } else {
        // callit error stuff
    }
    generalCallFunction(options);
};

function generalCallFunction(options) {
    options = // ... checking success and error once again to get general success and error stuff in there, plus adding more options
    ajax( blah, blah, options);
}
I also have a backbone solution where I listen to the sync event plus an error callback, in similar ways as above.
I'm always scared that error or success functions get lost on the way, and the whole thing is hard to follow.
Any framework or pattern for making this stuff as easy as possible? Is it a weird thing to have general things that should always happen whether the result was an error or a success?
You can use jQuery.ajax({ /* details here... */ }).always(callback);
Or, in Backbone
// logic to create model here
model.fetch().always(callback);
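For instance, the save-button case might look like this (the #save selector is illustrative, and it assumes model is in scope and save() passes validation, since Backbone's save() returns false on a validation failure):
$('#save').on('click', function () {
    var $btn = $(this).prop('disabled', true); // disable while the request is in flight
    model.save().always(function () {
        // runs on success AND error, so the button always comes back
        $btn.prop('disabled', false);
    });
});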
I'm writing some JavaScript/AJAX code.
Is there any way to ensure that the server receives the XML requests in the order that they are sent?
If not with plain Ajax, do I get this guarantee if I send everything over a single WebSocket?
Thanks!
If it is of utmost importance that they're received in the proper order, and attaching an iterating id to the form isn't enough:
msg_number = 1; sendAJAX(msg_number); msg_number++;
Then I'd suggest building your own queue system, and sending each subsequent request as the callback of the previous one.
Rather than each element having its own AJAX-access, create one centralized spot in your application to handle that.
Your different AJAX-enabled sections don't even need to know that it is a queue:
AJAX.send({ url : "......", method : "post", success : function () {}, synchronous : true });
On the other side of that, you could have something like:
AJAX.send = function (obj) {
    if (obj.synchronous) {
        addToSyncQueue(obj);
        checkQueue();
    } else {
        fireRequest(obj);
    }
};
Inside of your sync queue, all you'd need to do is wrap a new function around the old callback:
callback = (function (old_cb) {
    return function (response) {
        checkQueue();
        old_cb(response);
    };
}(obj.success));

obj.success = callback;
AJAX.call(obj);
Inside of checkQueue, you'd just need to see if it was empty, and if it wasn't, use
nextObj = queue.shift(); (if you're .push()-ing objects onto the queue -- so first-in, first-out, like you wanted).
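Putting those pieces together, a minimal self-contained sketch of the whole thing (the AJAX namespace, fireRequest, and addToSyncQueue names are carried over from the fragments above; the jQuery call inside fireRequest is my own stand-in):
var AJAX = (function () {
    var queue = [];
    var busy = false;

    function fireRequest(obj) {
        $.ajax({ url: obj.url, type: obj.method, success: obj.success });
    }

    function checkQueue() {
        if (queue.length === 0) { busy = false; return; }
        busy = true;
        var obj = queue.shift(); // first-in, first-out
        var old_cb = obj.success;
        obj.success = function (response) {
            checkQueue();     // fire the next queued request...
            old_cb(response); // ...then hand the response to the original callback
        };
        fireRequest(obj);
    }

    return {
        send: function (obj) {
            if (obj.synchronous) {
                queue.push(obj); // addToSyncQueue, inlined
                if (!busy) { checkQueue(); }
            } else {
                fireRequest(obj);
            }
        }
    };
}());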
A couple of options come to mind:
Send them synchronously, by waiting for a successful response from the server after each XML request is received (i.e. make a queue).
If you know the number of requests you'll be sending beforehand, send the request number as a tag with each request, e.g. <requestNum>1</requestNum><numRequests>5</numRequests>. This doesn't guarantee the order that they're received in, but guarantees that they can be put back in order afterwards, and has the added benefit of being sure that you have all the data.
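A minimal sketch of the second option, tagging each payload (the endpoint and the buildBody helper are illustrative stand-ins):
var numRequests = 5;
for (var i = 1; i <= numRequests; i++) {
    var payload = '<requestNum>' + i + '</requestNum>' +
                  '<numRequests>' + numRequests + '</numRequests>' +
                  buildBody(i); // buildBody is a stand-in for your own XML builder
    $.ajax({ url: '/endpoint', type: 'POST', contentType: 'text/xml', data: payload });
}
// the server can then buffer out-of-order requests and reassemble them by requestNum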
At my company we use this little ajaxQueue plugin, written by one of the core jQuery contributors:
http://gnarf.net/2011/06/21/jquery-ajaxqueue/
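From memory, the plugin is a drop-in replacement for $.ajax that serialises requests; check the post above for the exact API, but usage looks roughly like this:
// queued calls run one at a time, in the order they were made
$.ajaxQueue({ url: '/save', type: 'POST', data: xml1 });
$.ajaxQueue({ url: '/save', type: 'POST', data: xml2 }).done(function (resp) {
    // fires once the second request (and therefore the first) has completed
});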
I'm creating a script that performs several functions and I want to update the user as the functions are completed. I have nested $.ajax() calls with each subsequent call in the previous call's success block.
There are a total of 4 calls made for each loop. Let's call them scan_1 through scan_4. The success block of scan_1 calls scan_2 and so on down the chain.
For example, let's say I'm looping over 3 objects. I want the process to go like this:
Loop 1
scan_1
scan_2
scan_3
scan_4
Loop 2
scan_1
scan_2
scan_3
scan_4
Loop 3
scan_1
scan_2
scan_3
scan_4
The problem is that it's running through all the scan_1 calls first. I must be missing something, but I can't seem to figure it out. Any advice would be much appreciated.
For reference, here is a snippet of scan_1 (irrelevant stuff snipped):
for (var i = 1; i <= 3; i++)
{
    $.ajax({
        type: 'GET',
        url: url,
        data: 'do=scan&step=1&' + string,
        dataType: 'json',
        success: function (result)
        {
            if (result.proceed == 'true')
            {
                $('#scan_progress').append(result.message);
                scan_2();
            }
            else
            {
                $('#scan_progress').append(result.message);
            }
        }
    });
}
Thoughts?
Thanks in advance.
Sounds like you need to use jQuery deferred. It basically allows you to chain multiple event handlers to the jQuery Ajax object and gives you finer control over when the callbacks are invoked.
Further reading:
http://msdn.microsoft.com/en-us/scriptjunkie/gg723713
http://www.erichynds.com/jquery/using-deferreds-in-jquery/
It's asynchronous - the "success" handler fires sometime in the future, and the script does not wait for it. Since you're firing off three requests in your loop, all three scan_1 requests are sent before any success handler runs.
"scan_2" will be called as each request completes.
Change the request to synchronous if you want to control the order of events.
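For completeness, that means passing async: false, which blocks the browser UI until each response arrives, so it is rarely a good idea in practice:
$.ajax({
    type: 'GET',
    url: url,
    data: 'do=scan&step=1&' + string,
    dataType: 'json',
    async: false, // blocks until the response arrives, so the loop proceeds in order
    success: function (result) { /* ... */ }
});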
You are starting by sending off three ajax calls at once.
Scan1 (loop 1)
Scan1 (loop 2)
Scan1 (loop 3)
When each Scan 1 completes, its subsequent Scan 2, and then Scan 3, are called.
What did you actually want to happen? Scans 1 through 4 of loop 1, then of loop 2, then of loop 3? That would require more nesting, or possibly deferred objects.
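With deferreds, the loops can be chained so that each starts only when the previous one has finished. A minimal sketch, assuming each scan_N() is rewritten to return its jqXHR and jQuery 1.8+ (where .then() chains):
function runScans(i) {
    // chain the four scans of loop i, then move on to loop i + 1
    return scan_1(i)
        .then(function () { return scan_2(i); })
        .then(function () { return scan_3(i); })
        .then(function () { return scan_4(i); })
        .then(function () {
            if (i < 3) { return runScans(i + 1); }
        });
}

runScans(1);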
Instead of using the success callback for each $.ajax() call, you can store each set of AJAX requests (their jqXHR objects) in an array and wait for all of them to resolve:
function scan_1() {
    //setup an array to store the jqXHR objects (deferred objects)
    var jqXHRs = [];
    for (var i = 1; i <= 3; i++)
    {
        //push the object returned by `$.ajax()` onto the array; it resolves when the response is returned
        jqXHRs.push($.ajax({
            type: 'GET',
            url: url,
            data: 'do=scan&step=1&' + string,
            dataType: 'json'
        }));
    }
    //wait for all three AJAX requests to resolve before running `scan_2()`
    //note that `$.when` takes promises as separate arguments, hence the `.apply`
    $.when.apply($, jqXHRs).then(function () {
        //each argument is a [data, textStatus, jqXHR] array, one per request
        var allProceed = $.makeArray(arguments).every(function (args) {
            return args[0].proceed == 'true';
        });
        if (allProceed) {
            scan_2();
        }
    });
}
I've had similar problems working heavily with SharePoint web services - you often need to pull data from multiple sources to generate input for a single process.
To solve it I embedded this kind of functionality into my AJAX abstraction library. You can easily define a request which will trigger a set of handlers when complete. However, each request can be defined with multiple HTTP calls. Here's the component (and detailed documentation):
DPAJAX at DepressedPress.com
This simple example creates one request with three calls and then passes that information, in the call order, to a single handler:
// The handler function
function AddUp(Nums) { alert(Nums[1] + Nums[2] + Nums[3]) };
// Create the pool
myPool = DP_AJAX.createPool();
// Create the request
myRequest = DP_AJAX.createRequest(AddUp);
// Add the calls to the request
myRequest.addCall("GET", "http://www.mysite.com/Add.htm", [5,10]);
myRequest.addCall("GET", "http://www.mysite.com/Add.htm", [4,6]);
myRequest.addCall("GET", "http://www.mysite.com/Add.htm", [7,13]);
// Add the request to the pool
myPool.addRequest(myRequest);
Note that unlike many of the other solutions provided this method does not force single threading of the calls being made - each will still run as quickly (or as slowly) as the environment allows but the single handler will only be called when all are complete. It also supports the setting of timeout values and retry attempts if your service is a little flakey.
I've found it insanely useful (and incredibly simple to understand from a code perspective). No more chaining, no more counting calls and saving output. Just "set it and forget it".