I am trying to store an item in an array, newItemList, along with the present hour and an object that I get from a server as a JSON response and parse into a JS object. This means that a typical element of newItemList looks like [item, 13 (present hour), obj].
Here's the code,
var item;
//some code to obtain item
chrome.storage.local.get({'itemList': []}, function(data) { // param renamed from `item`: it shadowed the outer variable
    var newItemList = data.itemList;
    var d = new Date();
    if (isItemInArray(newItemList, item) == -1) {
        //api call function as callback
        apiCall(function(result) {
            newItemList.push([item, d.getHours(), result]);
        });
    } else {
        var indexOfitem = findItem(newItemList, item); //finding item in the array
        if (/* some condition */) {
            apiCall(function(result) {
                newItemList[indexOfitem][1] = d.getHours();
                newItemList[indexOfitem][2] = result;
            });
        } else {
            storedApiCall(newItemList[indexOfitem][2]); //sending the stored JSON response
        }
    }
    chrome.storage.local.set({itemList: newItemList});
})
function apiCall(callback) {
    //API call, to obtain the JSON response from the web server
    var xhttp = new XMLHttpRequest();
    xhttp.onreadystatechange = function() {
        if (this.readyState == 4 && this.status == 200) {
            var myObj = JSON.parse(this.responseText); //parsing the JSON response to a js object
            callback(myObj);
            storedApiCall(myObj); //this function just works fine
        }
    };
    xhttp.open("GET", "example.com", true);
    xhttp.send();
}
newItemList isn't getting stored in local storage. It only ever contains one element, [item, present hour, obj (from the present apiCall)]. That's why only the if branch runs each time, leading to an API call every time and rendering the else branch useless.
I read about callbacks in many well-known questions about asynchronous calls, but none of them connected local storage with callbacks. Before implementing the callback, newItemList did get stored in local storage, but I couldn't obtain the obj from the JSON response the first time around, which is the typical behaviour of asynchronous calls.
Suggest edits, if any.
This line is executed before the functions you pass to apiCall are invoked, i.e. while newItemList still holds its initial value, item.itemList:
chrome.storage.local.set({itemList: newItemList});
This is because callbacks are typically invoked after some asynchronous action completes. In your case, they will be invoked after whatever operation apiCall performs is complete, e.g. an HTTP response to an API is received.
So if you move that line into those callback functions, at their end after you have mutated newItemList, the set call on chrome.storage.local will be performed with the changes applied (e.g. newItemList.push([item,d.getHours(),result])):
chrome.storage.local.get({'itemList': []}, function(data) { // param renamed from `item`: it shadowed the outer variable
    var newItemList = data.itemList;
    var d = new Date();
    if (isItemInArray(newItemList, item) == -1) {
        //api call function as callback
        apiCall(function(result) {
            newItemList.push([item, d.getHours(), result]);
            chrome.storage.local.set({itemList: newItemList});
        });
    } else {
        var indexOfitem = findItem(newItemList, item); //finding item in the array
        if (/* some condition */) {
            apiCall(function(result) {
                newItemList[indexOfitem][1] = d.getHours();
                newItemList[indexOfitem][2] = result;
                chrome.storage.local.set({itemList: newItemList});
            });
        } else {
            storedApiCall(newItemList[indexOfitem][2]); //sending the stored JSON response
        }
    }
})
I think that when execution arrives at the chrome.storage.local.set({itemList: newItemList}); line, the variable is not set yet because the XMLHttpRequest is async. You have to add a chrome.storage.local.set call inside your callback function, or use a promise and then set your local storage value.
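For the promise variant mentioned above, a minimal sketch could look like this (apiCallAsPromise is a hypothetical wrapper around the question's apiCall, with newItemList, item and d assumed in scope):
// Hypothetical wrapper: resolves once apiCall's callback fires.
function apiCallAsPromise() {
    return new Promise(function(resolve) {
        apiCall(resolve);
    });
}
apiCallAsPromise().then(function(result) {
    newItemList.push([item, d.getHours(), result]);
    // the storage write happens only after the response has arrived
    chrome.storage.local.set({itemList: newItemList});
});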
I have a problem with a library that calls a function on each item. I have to check the state of each item via an ajax request, and I don't want to issue one request per item, but rather fetch a range of item states at once.
Because these items are dates, I can build such a range pretty easily - that's the good part :)
So, to give some code ...
var itemStates = {};
var libraryObj = {
    itemCallback: function(item) {
        return checkState(item);
    }
}
function checkState(item) {
    if (!itemStates.hasOwnProperty(item)) {
        $.get('...', function(result) {
            $.extend(true, itemStates, result);
        });
    }
    return itemStates[item];
}
The library now calls libraryObj.itemCallback() on each item, but I want to wait for the request made in checkState() before calling checkState() again (because the chance is extremely high that the next item's state was already requested within the previous request).
I read about deferreds, $.when(), then() and so on, but couldn't really figure out how to implement this.
Many thanks to everybody who could help me with this :)
You can achieve this by using jQuery.Deferred or Javascript Promise. In the following code, itemCallback() will wait for previous calls to finish before calling checkState().
var queue = [];
var itemStates = {};
var libraryObj = {
    itemCallback: function(item) {
        var def = $.Deferred();
        $.when.apply(null, queue)
            .then(function() {
                return checkState(item);
            })
            .then(function(result) {
                def.resolve(result);
            });
        queue.push(def.promise());
        return def.promise();
    }
}
function checkState(item) {
    var def = $.Deferred();
    if (!itemStates.hasOwnProperty(item)) {
        $.get('...', function(result) {
            $.extend(true, itemStates, result);
            def.resolve(itemStates[item]);
        });
    } else
        def.resolve(itemStates[item]);
    return def.promise();
}
//these will execute in order, waiting for the previous call
libraryObj.itemCallback(1).done(function(r) { console.log(r); });
libraryObj.itemCallback(2).done(function(r) { console.log(r); });
libraryObj.itemCallback(3).done(function(r) { console.log(r); });
libraryObj.itemCallback(4).done(function(r) { console.log(r); });
libraryObj.itemCallback(5).done(function(r) { console.log(r); });
Same example built with Javascript Promises
var queue = [];
var itemStates = {};
var libraryObj = {
    itemCallback: function(item) {
        var promise = new Promise(resolve => {
            Promise.all(queue)
                .then(() => checkState(item))
                .then((result) => resolve(result));
        });
        queue.push(promise);
        return promise;
    }
}
function checkState(item) {
    return new Promise(resolve => {
        if (item in itemStates)
            resolve(itemStates[item]);
        else {
            $.get('...', function(result) {
                $.extend(true, itemStates, result);
                resolve(itemStates[item]);
            });
        }
    });
}
//these will execute in order, waiting for the previous call
libraryObj.itemCallback(1).then(function(r) { console.log(r); });
libraryObj.itemCallback(2).then(function(r) { console.log(r); });
libraryObj.itemCallback(3).then(function(r) { console.log(r); });
libraryObj.itemCallback(4).then(function(r) { console.log(r); });
libraryObj.itemCallback(5).then(function(r) { console.log(r); });
One thing I can think of doing is making a caching function: depending on the last time the function was called, it returns the previous value or makes a new request.
var cached = function(self, cachingTime, fn) {
    var paramMap = {};
    return function() {
        var arr = Array.prototype.slice.call(arguments);
        var parameters = JSON.stringify(arr);
        var returning;
        if (!paramMap[parameters]) {
            returning = fn.apply(self, arr);
            paramMap[parameters] = {timeCalled: new Date(), value: returning};
        } else {
            var diffMs = Math.abs(paramMap[parameters].timeCalled - new Date());
            var diffMins = (diffMs / 1000) / 60;
            if (diffMins > cachingTime) {
                returning = fn.apply(self, arr);
                paramMap[parameters] = {timeCalled: new Date(), value: returning};
            } else {
                returning = paramMap[parameters].value;
            }
        }
        return returning;
    }
}
Then you'd wrap the ajax call into the function you've made
var fn = cached(null, 1 , function(item){
return $.get('...', function(result) {
$.extend(true, itemStates, result);
});
});
Executing the new function gets you either the promise from the last request made with those parameters within the last minute, or it makes a new request.
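For illustration, usage of the wrapped fn above might look like this (the date argument is a placeholder):
// Hypothetical usage: calling fn again with the same argument within the
// one-minute window reuses the stored promise instead of firing a new request.
fn('2017-01-01').done(function() {
    console.log(itemStates['2017-01-01']); // state merged in by the GET handler
});
fn('2017-01-01'); // served from paramMap; no second request is made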
The simplest (and dirty) way of taking control over the library is to override its methods.
But I don't really know the core problem here, so other hints are below.
If you have control over checkState, then just collect your data and change your controller on the server side to work with arrays; that's it.
If you don't know when the next checkState will be called, count your collection and, before making the request, use setTimeout to check the collection after some time, or setInterval to check it continuously.
If you don't want to fetch the same item multiple times, store your checked items in some variable like alreadyChecked and search it for the item before making a request.
To be notified when the library is using your item, use a getter,
and then collect your items.
When you have collected enough items, you can make the request;
when you don't have enough items yet, use setTimeout and wait for a while. If nothing changes, it means the library has finished its iteration for now, and you can make the request with the items that are left.
let collection = []; // collection for the request
let _items = {}; // real items for you, when you don't want to perform actions while getting values
let itemStates = {}; // items for the library
let timeoutId;
//instead of itemStates[someState]=someValue; use
function setItem(someState, someValue) {
    Object.defineProperty(itemStates, someState, { get: function () {
        if (typeof timeoutId == "number") clearTimeout(timeoutId);
        //here you can add someState to the collection for the request
        collection.push(_items[someState]);
        if (collection.length >= 10) {
            makeRequest();
        } else {
            timeoutId = setTimeout(() => { /* check the collection and make the request */ }, someTime);
        }
        return someValue;
    } });
}
I am learning node.js with learnyounode.
I am having a problem with JUGGLING ASYNC.
The problem is described as follows:
You are given three urls as command line arguments. You are supposed to make http.get() calls to get data from these urls and then print them in the same order as their order in the list of arguments.
Here is my code:
var http = require('http')
var truecount = 0;
var printlist = []
for (var i = 2; i < process.argv.length; i++) {
    http.get(process.argv[i], function(response) {
        var printdata = "";
        response.setEncoding('utf8');
        response.on('data', function(data) {
            printdata += data;
        })
        response.on('end', function() {
            truecount += 1
            printlist.push(printdata)
            if (truecount == 3) {
                printlist.forEach(function(item) {
                    console.log(item)
                })
            }
        })
    })
}
Here is what I do not understand:
I am trying to store the completed data in response.on('end', function(){}) for each url, using a dictionary. However, I do not know how to get the url for that http.get(). A local variable inside http.get() would be great, but whenever I declare a variable as var url, it always points to the last url, since it is shared across the loop iterations and keeps getting updated. What is the best way to store the completed data as the value, with the key equal to the url?
This is how I would go about solving the problem.
#!/usr/bin/env node
var http = require('http');

var argv = process.argv.splice(2),
    truecount = argv.length,
    pages = [];

function printUrls() {
    if (--truecount > 0)
        return;
    for (var i = 0; i < pages.length; i++) {
        console.log(pages[i].data + '\n\n');
    }
}

function HTMLPage(url) {
    var _page = this;
    _page.data = '### [URL](' + url + ')\n';
    http.get(url, function(res) {
        res.setEncoding('utf8');
        res.on('data', function(data) {
            _page.data += data;
        });
        res.on('end', printUrls);
    });
}

for (var i = 0; i < argv.length; i++)
    pages.push(new HTMLPage(argv[i]));
It adds the requests to an array at the start of each request; that way, once done, I can iterate nicely through the responses, knowing that they are in the correct order.
When dealing with asynchronous processing, I find it much easier to think about each process as something with a concrete beginning and end. If you require the order of the requests to be preserved then the entry must be made on creation of each process, and then you refer back to that record on completion. Only then can you guarantee that you have things in the right order.
If you were desperate to use your above method, you could define a variable inside your get callback closure and use that to store the urls; that way you wouldn't end up with the last url overwriting your variables (a sketch follows below). If you do go this way though, you'll dramatically increase your overhead when you have to use your urls from process.argv to access each response in that order. I wouldn't advise it.
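For illustration only, here is a minimal sketch of that closure idea, reusing the question's process.argv handling (the responses object and its keys are hypothetical):
// An IIFE fixes `url` per iteration, so each response is stored
// under its own key instead of the last url winning.
var responses = {};
for (var i = 2; i < process.argv.length; i++) {
    (function(url) {
        http.get(url, function(response) {
            var body = '';
            response.setEncoding('utf8');
            response.on('data', function(chunk) { body += chunk; });
            response.on('end', function() { responses[url] = body; });
        });
    })(process.argv[i]);
}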
I went about this challenge a little differently. I'm creating an array of functions that call http.get, and immediately invoking each of them with its specific context. The streams write to an object where the key is the port of the server that stream belongs to. When the end event is triggered, it marks that server as complete; when all are complete, it iterates through and echoes the responses in the original order the servers were given.
There's no right way but there are probably a dozen or more ways. Wanted to share mine.
var http = require('http'),
    request = [],
    dataStrings = {},
    orderOfServerInputs = [];

var completeResponses = [];

for (server in process.argv) {
    if (server >= 2) {
        orderOfServerInputs[orderOfServerInputs.length] = process.argv[server].substr(-4);
        request[request.length] = (function(thisServer) {
            http.get(process.argv[server], function(response) {
                response.on("data", function(data) {
                    dataStrings[thisServer.substr(-4)] = dataStrings[thisServer.substr(-4)] ? dataStrings[thisServer.substr(-4)] : ''; //if not set, set to ''
                    dataStrings[thisServer.substr(-4)] += data.toString();
                });
                response.on("end", function(data) {
                    completeResponses[completeResponses.length] = true;
                    if (completeResponses.length > 2) {
                        for (item in orderOfServerInputs) {
                            var serverNo = orderOfServerInputs[item].substr(-4);
                            console.log(dataStrings[serverNo]);
                        }
                    }
                });
            });
        })(process.argv[server]);
    }
}
An Immediately-Invoked Function Expression (IIFE) could be a solution to your problem. It lets us bind a specific value to each callback: in your case, the url whose response arrives. In the code below, I bind the variable i to index, so whichever url gets its response, the correct index of printlist is updated.
var http = require('http')
var truecount = 0;
var printlist = [];
for (var i = 2; i < process.argv.length; i++) {
    (function(index) {
        http.get(process.argv[index], function(response) {
            response.setEncoding('utf8');
            response.on('data', function(data) {
                if (printlist[index] == undefined)
                    printlist[index] = data;
                else
                    printlist[index] += data;
            })
            response.on('end', function() {
                truecount += 1
                if (truecount == 3) {
                    printlist.forEach(function(item) {
                        console.log(item)
                    })
                }
            })
        })
    })(i)
}
I find I sometimes need to iterate some collection and make an ajax call for each element. I want each call to return before moving to the next element so that I don't blast the server with requests - which often leads to other issues. And I don't want to set async to false and freeze the browser.
Usually this involves setting up some kind of iterator context that I step through upon each success callback; a sketch of that pattern follows below. I think there must be a cleaner, simpler way?
Does anyone have a clever design pattern for how to neatly work thru a collection making ajax calls for each item?
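For reference, a minimal sketch of the iterator-context pattern described above (processSequentially, loadItem, items and done are all hypothetical names):
// Each success callback advances the index and kicks off the next
// request, so only one call is in flight at a time.
function processSequentially(items, loadItem, done) {
    var index = 0;
    function next() {
        if (index >= items.length) return done();
        loadItem(items[index], function(result) {
            index++; // step the iterator forward on each success callback
            next();
        });
    }
    next();
}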
jQuery 1.5+
I developed an $.ajaxQueue() plugin that uses the $.Deferred, .queue(), and $.ajax() to also pass back a promise that is resolved when the request completes.
/*
 * jQuery.ajaxQueue - A queue for ajax requests
 *
 * (c) 2011 Corey Frang
 * Dual licensed under the MIT and GPL licenses.
 *
 * Requires jQuery 1.5+
 */
(function($) {

// jQuery on an empty object, we are going to use this as our Queue
var ajaxQueue = $({});

$.ajaxQueue = function( ajaxOpts ) {
    var jqXHR,
        dfd = $.Deferred(),
        promise = dfd.promise();

    // queue our ajax request
    ajaxQueue.queue( doRequest );

    // add the abort method
    promise.abort = function( statusText ) {

        // proxy abort to the jqXHR if it is active
        if ( jqXHR ) {
            return jqXHR.abort( statusText );
        }

        // if there wasn't already a jqXHR we need to remove from queue
        var queue = ajaxQueue.queue(),
            index = $.inArray( doRequest, queue );

        if ( index > -1 ) {
            queue.splice( index, 1 );
        }

        // and then reject the deferred
        dfd.rejectWith( ajaxOpts.context || ajaxOpts,
            [ promise, statusText, "" ] );

        return promise;
    };

    // run the actual query
    function doRequest( next ) {
        jqXHR = $.ajax( ajaxOpts )
            .done( dfd.resolve )
            .fail( dfd.reject )
            .then( next, next );
    }

    return promise;
};

})(jQuery);
jQuery 1.4
If you're using jQuery 1.4, you can utilize the animation queue on an empty object to create your own "queue" for your ajax requests for the elements.
You can even factor this into your own $.ajax() replacement. This plugin $.ajaxQueue() uses the standard 'fx' queue for jQuery, which will auto-start the first added element if the queue isn't already running.
(function($) {
    // jQuery on an empty object, we are going to use this as our Queue
    var ajaxQueue = $({});

    $.ajaxQueue = function(ajaxOpts) {
        // hold the original complete function
        var oldComplete = ajaxOpts.complete;

        // queue our ajax request
        ajaxQueue.queue(function(next) {
            // create a complete callback to fire the next event in the queue
            ajaxOpts.complete = function() {
                // fire the original complete if it was there
                if (oldComplete) oldComplete.apply(this, arguments);
                next(); // run the next query in the queue
            };

            // run the query
            $.ajax(ajaxOpts);
        });
    };
})(jQuery);
Example Usage
So, we have a <ul id="items"> which has some <li> that we want to copy (using ajax!) to the <ul id="output">
// get each item we want to copy
$("#items li").each(function(idx) {
    // queue up an ajax request
    $.ajaxQueue({
        url: '/echo/html/',
        data: {html: "[" + idx + "] " + $(this).html()},
        type: 'POST',
        success: function(data) {
            // Write to #output
            $("#output").append($("<li>", { html: data }));
        }
    });
});
jsfiddle demonstration - 1.4 version
A quick and small solution using deferred promises. Although this uses jQuery's $.Deferred, any other should do.
var Queue = function () {
    var previous = new $.Deferred().resolve();
    return function (fn, fail) {
        return previous = previous.then(fn, fail || fn);
    };
};
Usage: call Queue to create new queues:
var queue = Queue();

// Queue empty, will start immediately
queue(function () {
    return $.get('/first');
});

// Will begin when the first has finished
queue(function () {
    return $.get('/second');
});
See the example with a side-by-side comparison of asynchronous requests.
This works by creating a function that automatically chains promises together. The synchronous nature comes from the fact that we are wrapping $.get calls in a function and pushing them into a queue. The execution of these functions is deferred, and each is only called when it reaches the front of the queue.
A requirement for the code is that each of the functions you give must return a promise. This returned promise is then chained onto the latest promise in the queue. When you call the queue(...) function it chains onto the last promise, hence the previous = previous.then(...).
You can wrap all that complexity into a function to make a simple call that looks like this:
loadSequantially(['/a', '/a/b', 'a/b/c'], function() {alert('all loaded')});
Below is a rough sketch (a working example, except for the ajax call). This can be modified to use a queue-like structure instead of an array.
// load sequentially the given array of URLs and call 'funCallback' when all's done
function loadSequantially(arrUrls, funCallback) {
    var idx = 0;

    // callback function that is called when an individual ajax call is done;
    // internally calls the next ajax URL in the sequence, or if there aren't any left,
    // calls the final user-specified callback function
    var individualLoadCallback = function() {
        if (++idx >= arrUrls.length) {
            doCallback(arrUrls, funCallback);
        } else {
            loadInternal();
        }
    };

    // makes the ajax call
    var loadInternal = function() {
        if (arrUrls.length > 0) {
            ajaxCall(arrUrls[idx], individualLoadCallback);
        } else {
            doCallback(arrUrls, funCallback);
        }
    };
    loadInternal();
};

// dummy function, replace with the actual ajax call
function ajaxCall(url, funCallBack) {
    alert(url)
    funCallBack();
};

// final callback when everything's loaded
function doCallback(arrUrls, func) {
    try {
        func();
    } catch (err) {
        // handle errors
    }
};
Ideally, a coroutine with multiple entry points, so that every callback from the server can call into the same coroutine, would be neat. Damn, this is about to be implemented in JavaScript 1.7.
Let me try using closure...
function BlockingAjaxCall(URL, arr, AjaxCall, OriginalCallBack)
{
    // note the immediate invocation: nextindex() must return the next
    // index, not a fresh counter function as in the original code
    var nextindex = (function()
    {
        var i = 0;
        return function()
        {
            return i++;
        };
    })();
    var AjaxCallRecursive = function() {
        var currentindex = nextindex();
        AjaxCall
        (
            URL,
            arr[currentindex],
            function()
            {
                OriginalCallBack();
                if (currentindex + 1 < arr.length) // +1 so we never index past the end of arr
                {
                    AjaxCallRecursive();
                }
            }
        );
    };
    AjaxCallRecursive();
}
// suppose you always call Ajax like AjaxCall(URL, element, callback); then you will do it this way
BlockingAjaxCall(URL, myArray, AjaxCall, CallBack);
Yeah, while the other answers will work, they are a lot of code and look messy. Frame.js was designed to elegantly address this situation: https://github.com/bishopZ/Frame.js
For instance, this will cause most browsers to hang:
for (var i = 0; i < 1000; i++) {
    $.ajax('myserver.api', { data: i, type: 'post' });
}
While this will not:
for (var i = 0; i < 1000; i++) {
    Frame(function(callback) {
        $.ajax('myserver.api', { data: i, type: 'post', complete: callback });
    });
}
Frame.start();
Also, using Frame allows you to waterfall the response objects and deal with them all after the entire series of AJAX requests has completed (if you want to):
var listOfAjaxObjects = [{}, {}, ...]; // an array of option objects for $.ajax
$.each(listOfAjaxObjects, function(i, item) {
    Frame(function(nextFrame) {
        item.complete = function(response) {
            // do stuff with this response or wait until the end
            nextFrame(response); // ajax response objects will waterfall to the next Frame()
        };
        $.ajax(item);
    });
});
Frame(function(callback) { // runs after all the AJAX requests have returned
    var ajaxResponses = [];
    $.each(arguments, function(i, arg) {
        if (i !== 0) { // the first argument is always the callback function
            ajaxResponses.push(arg);
        }
    });
    // do stuff with the responses from your AJAX requests
    // if an AJAX request returned an error, the error object will be present in place of the response object
    callback();
});
Frame.start();
I am posting this answer thinking that it might help others who are looking for a simple solution in the same scenario in the future.
This is now possible also using the native promise support introduced in ES6. You can wrap the ajax call in a promise and return it to the handler of the element.
function ajaxPromise(elInfo) {
    return new Promise(function (resolve, reject) {
        //Do anything as desired with the elInfo passed as parameter

        $.ajax({
            type: "POST",
            url: '/someurl/',
            data: {data: "somedata" + elInfo},
            success: function (data) {
                //Do anything as desired with the data received from the server,
                //and then resolve the promise
                resolve();
            },
            error: function (err) {
                reject(err);
            },
            async: true
        });
    });
}
Now call the function recursively, from where you have the collection of the elements.
function callAjaxSynchronous(elCollection) {
    if (elCollection.length > 0) {
        var el = elCollection.shift();
        ajaxPromise(el)
            .then(function () {
                callAjaxSynchronous(elCollection);
            })
            .catch(function (err) {
                //Abort further ajax calls/continue with the rest
                //callAjaxSynchronous(elCollection);
            });
    }
    else {
        return false;
    }
}
I use http://developer.yahoo.com/yui/3/io/#queue to get that functionality.
The only solutions I can come up with are, as you say, maintaining a list of pending calls/callbacks, or nesting the next call in the previous callback, but that feels a bit messy (sketched below).
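For illustration, a minimal sketch of the nested-callback version (the URLs are placeholders):
// Each request starts only in the success callback of the previous one,
// so the nesting grows with every additional call.
$.get('/item/1', function(r1) {
    $.get('/item/2', function(r2) {
        $.get('/item/3', function(r3) {
            // all three responses are available here
        });
    });
});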
You can achieve the same thing using then.
var files = [
    'example.txt',
    'example2.txt',
    'example.txt',
    'example2.txt',
    'example.txt',
    'example2.txt',
    'example2.txt',
    'example.txt'
];

nextFile().done(function() {
    console.log("done", arguments)
});

function nextFile(text) {
    var file = files.shift();
    if (text)
        $('body').append(text + '<br/>');
    if (file)
        return $.get(file).then(nextFile);
}
http://plnkr.co/edit/meHQHU48zLTZZHMCtIHm?p=preview
I would suggest a somewhat more sophisticated approach that is reusable for different cases.
I am using it, for example, when I need to slow down a call sequence while the user is typing in a text editor.
But I am sure it should also work when iterating through a collection. In this case it can queue requests and send a single AJAX call instead of 12.
queueing = {
    callTimeout: undefined,
    callTimeoutDelayTime: 1000,
    callTimeoutMaxQueueSize: 12,
    callTimeoutCurrentQueueSize: 0,

    queueCall: function (theCall) {
        clearTimeout(this.callTimeout);
        if (this.callTimeoutCurrentQueueSize >= this.callTimeoutMaxQueueSize) {
            theCall();
            this.callTimeoutCurrentQueueSize = 0;
        } else {
            var _self = this;
            this.callTimeout = setTimeout(function () {
                theCall();
                _self.callTimeoutCurrentQueueSize = 0;
            }, this.callTimeoutDelayTime);
        }
        this.callTimeoutCurrentQueueSize++;
    }
}
There's a very simple way to achieve this: adding async: false as a property to the ajax call. This makes sure the ajax call has completed before the rest of the code runs. I have used this successfully in loops many times.
Eg.
$.ajax({
    url: "",
    type: "GET",
    async: false
    ...
I'd like to update a page based upon the results of multiple ajax/json requests. Using jQuery, I can "chain" the callbacks, like this very simple stripped down example:
$.getJSON("/values/1", function(data) {
    // data = {value: 1}
    var value_1 = data.value;
    $.getJSON("/values/2", function(data) {
        // data = {value: 42}
        var value_2 = data.value;
        var sum = value_1 + value_2;
        $('#mynode').html(sum);
    });
});
However, this results in the requests being made serially. I'd much rather a way to make the requests in parallel, and perform the page update after all are complete. Is there any way to do this?
jQuery's $.when(), together with .then() (or .done()), is exactly what you need:
$.when($.ajax("/page1.php"), $.ajax("/page2.php"))
    .then(myFunc, myFailure);
Try this solution, which can support any specific number of parallel queries:
var done = 5; // number of total requests
var sum = 0;

/* Normal loops don't create a new scope */
$([1, 2, 3, 4, 5]).each(function() {
    var number = this;
    $.getJSON("/values/" + number, function(data) {
        sum += data.value;
        done -= 1;
        if (done == 0) $("#mynode").html(sum);
    });
});
Run multiple AJAX requests in parallel
When working with APIs, you sometimes need to issue multiple AJAX requests to different endpoints. Instead of waiting for one request to complete before issuing the next, you can speed things up with jQuery by requesting the data in parallel, by using jQuery's $.when() function:
JS
$.when($.get('1.json'), $.get('2.json')).then(function(r1, r2) {
    console.log(r1[0].message + " " + r2[0].message);
});
The callback function is executed when both of these GET requests finish successfully. $.when() takes the promises returned by two $.get() calls, and constructs a new promise object. The r1 and r2 arguments of the callback are arrays, whose first elements contain the server responses.
Here's my attempt at directly addressing your question.
Basically, you just build up an AJAX call stack, execute them all, and a provided function is called upon completion of all the events - the provided argument being an array of the results from all the supplied ajax requests.
Clearly this is early code - you could make it more elaborate in terms of flexibility.
<script type="text/javascript" src="http://jqueryjs.googlecode.com/files/jquery-1.3.2.min.js"></script>
<script type="text/javascript">

var ParallelAjaxExecuter = function( onComplete )
{
    this.requests = [];
    this.results = [];
    this.onComplete = onComplete;
}

ParallelAjaxExecuter.prototype.addRequest = function( method, url, data, format )
{
    this.requests.push( {
        "method"    : method,
        "url"       : url,
        "data"      : data,
        "format"    : format,
        "completed" : false
    } )
}

ParallelAjaxExecuter.prototype.dispatchAll = function()
{
    var self = this;
    $.each( self.requests, function( i, request )
    {
        request.method( request.url, request.data, (function( r )
        {
            return function( data )
            {
                r.completed = true;
                self.results.push( data );
                self.checkAndComplete();
            }
        })( request ) )
    } )
}

ParallelAjaxExecuter.prototype.allRequestsCompleted = function()
{
    var i = 0;
    while ( request = this.requests[i++] )
    {
        if ( request.completed === false )
        {
            return false;
        }
    }
    return true;
}

ParallelAjaxExecuter.prototype.checkAndComplete = function()
{
    if ( this.allRequestsCompleted() )
    {
        this.onComplete( this.results );
    }
}

var pe = new ParallelAjaxExecuter( function( results )
{
    alert( eval( results.join( '+' ) ) );
} );

pe.addRequest( $.get, 'test.php', {n:1}, 'text' );
pe.addRequest( $.get, 'test.php', {n:2}, 'text' );
pe.addRequest( $.get, 'test.php', {n:3}, 'text' );
pe.addRequest( $.get, 'test.php', {n:4}, 'text' );

pe.dispatchAll();

</script>
here's test.php
<?php
echo pow( $_GET['n'], 2 );
?>
Update: Per the answer given by Yair Leviel, this answer is obsolete. Use a promise library, like jQuery.when() or Q.js.
I created a general purpose solution as a jQuery extension. Could use some fine tuning to make it more general, but it suited my needs. The advantage of this technique over the others in this posting as of the time of this writing was that any type of asynchronous processing with a callback can be used.
Note: I'd use Rx extensions for JavaScript instead of this if I thought my client would be okay with taking a dependency on yet-another-third-party-library :)
// jQuery extension for running multiple async methods in parallel
// and getting a callback with all results when all of them have completed.
//
// Each worker is a function that takes a callback as its only argument, and
// fires up an async process that calls this callback with its result.
//
// Example:
//     $.parallel(
//         function (callback) { $.get("form.htm", {}, callback, "html"); },
//         function (callback) { $.post("data.aspx", {}, callback, "json"); },
//         function (formHtml, dataJson) {
//             // Handle success; each argument to this function is
//             // the result of the corresponding ajax call above.
//         }
//     );

(function ($) {
    $.parallel = function (anyNumberOfWorkers, allDoneCallback) {

        var workers = [];
        var workersCompleteCallback = null;

        // To support any number of workers, use the "arguments" variable to
        // access function arguments rather than the names above.
        var lastArgIndex = arguments.length - 1;
        $.each(arguments, function (index) {
            if (index == lastArgIndex) {
                workersCompleteCallback = this;
            } else {
                workers.push({ fn: this, done: false, result: null });
            }
        });

        // Short circuit this edge case
        if (workers.length == 0) {
            workersCompleteCallback();
            return;
        }

        // Fire off each worker process, asking it to report back to onWorkerDone.
        $.each(workers, function (workerIndex) {
            var worker = this;
            var callback = function () { onWorkerDone(worker, arguments); };
            worker.fn(callback);
        });

        // Store results and update status as each item completes.
        // The [0] on workerResult below assumes the client only needs the first parameter
        // passed into the return callback. This simplifies the handling in allDoneCallback,
        // but may need to be removed if you need access to all parameters of the result.
        // For example, $.post calls back with success(data, textStatus, XMLHttpRequest). If
        // you need textStatus or XMLHttpRequest then pull off the [0] below.
        function onWorkerDone(worker, workerResult) {
            worker.done = true;
            worker.result = workerResult[0]; // this is the [0] ref'd above.
            var allResults = [];
            for (var i = 0; i < workers.length; i++) {
                if (!workers[i].done) return;
                else allResults.push(workers[i].result);
            }
            workersCompleteCallback.apply(this, allResults);
        }
    };
})(jQuery);
UPDATE And another two years later, this looks insane because the accepted answer has changed to something much better! (Though still not as good as Yair Leviel's answer using jQuery's when)
18 months later, I just hit something similar. I have a refresh button, and I want the old content to fadeOut and then the new content to fadeIn. But I also need to get the new content. The fadeOut and the get are asynchronous, but it would be a waste of time to run them serially.
What I do is really the same as the accepted answer, except in the form of a reusable function. Its primary virtue is that it is much shorter than the other suggestions here.
var parallel = function(actions, finished) {
    var finishedCount = 0; // `var` added: the original leaked a global
    var results = [];
    $.each(actions, function(i, action) {
        action(function(result) {
            results[i] = result;
            finishedCount++;
            if (finishedCount == actions.length) {
                finished(results);
            }
        });
    });
};
You pass it an array of functions to run in parallel. Each function should accept another function to which it passes its result (if any). parallel will supply that function.
You also pass it a function to be called when all the operations have completed. This will receive an array with all the results in. So my example was:
refreshButton.click(function() {
    parallel([
        function(f) {
            contentDiv.fadeOut(f);
        },
        function(f) {
            portlet.content(f);
        },
    ],
    function(results) {
        contentDiv.children().remove();
        contentDiv.append(results[1]);
        contentDiv.fadeIn();
    });
});
So when my refresh button is clicked, I launch jQuery's fadeOut effect and also my own portlet.content function (which does an async get, builds a new bit of content and passes it on), and then when both are complete I remove the old content, append the result of the second function (which is in results[1]) and fadeIn the new content.
As fadeOut doesn't pass anything to its completion function, results[0] presumably contains undefined, so I ignore it. But if you had three operations with useful results, they would each slot into the results array, in the same order you passed the functions.
You could do something like this:
var allData = []
$.getJSON("/values/1", function(data) {
    allData.push(data);
    if (allData.length == 2) {
        processData(allData) // where processData processes all the data
    }
});
$.getJSON("/values/2", function(data) {
    allData.push(data);
    if (allData.length == 2) {
        processData(allData) // where processData processes all the data
    }
});
var processData = function(data) {
    var sum = data[0].value + data[1].value // each stored response is a {value: ...} object
    $('#mynode').html(sum);
}
Here's an implementation using mbostock/queue:
queue()
    .defer(function(callback) {
        $.post('/echo/json/', {json: JSON.stringify({value: 1}), delay: 1}, function(data) {
            callback(null, data.value);
        });
    })
    .defer(function(callback) {
        $.post('/echo/json/', {json: JSON.stringify({value: 3}), delay: 2}, function(data) {
            callback(null, data.value);
        });
    })
    .awaitAll(function(err, results) {
        var result = results.reduce(function(acc, value) {
            return acc + value;
        }, 0);
        console.log(result);
    });
The associated fiddle: http://jsfiddle.net/MdbW2/
With the following extension of jQuery (which can also be written as a standalone function) you can do this:
$.whenAll({
    val1: $.getJSON('/values/1'),
    val2: $.getJSON('/values/2')
})
    .done(function (results) {
        var sum = results.val1.value + results.val2.value;
        $('#mynode').html(sum);
    });
The JQuery (1.x) extension whenAll():
$.whenAll = function (deferreds) {
    function isPromise(fn) {
        return fn && typeof fn.then === 'function' &&
            String($.Deferred().then) === String(fn.then);
    }
    var d = $.Deferred(),
        keys = Object.keys(deferreds),
        args = keys.map(function (k) {
            return $.Deferred(function (d) {
                var fn = deferreds[k];
                (isPromise(fn) ? fn : $.Deferred(fn))
                    .done(d.resolve)
                    .fail(function (err) { d.reject(err, k); });
            });
        });

    $.when.apply(this, args)
        .done(function () {
            var resObj = {},
                resArgs = Array.prototype.slice.call(arguments);
            resArgs.forEach(function (v, i) { resObj[keys[i]] = v; });
            d.resolve(resObj);
        })
        .fail(d.reject);

    return d;
};
See jsbin example:
http://jsbin.com/nuxuciwabu/edit?js,console
The most professional solution for me would be using async.js and Array.reduce, like so:
async.map([1, 2, 3, 4, 5], function (number, callback) {
    $.getJSON("/values/" + number, function (data) {
        callback(null, data.value);
    });
}, function (err, results) {
    $("#mynode").html(results.reduce(function (previousValue, currentValue) {
        return previousValue + currentValue;
    }));
});
If the result of one request depends on the other, you can't make them parallel; you have to chain them instead, as sketched below.
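A minimal sketch of such chaining (the endpoints and the next field are made up; jQuery 1.8+ promise chaining is assumed):
// The second request uses the first response, so it can only start
// after the first one completes.
$.getJSON("/values/1").then(function(first) {
    return $.getJSON("/values/" + first.next); // `next` is a hypothetical field
}).then(function(second) {
    $('#mynode').html(second.value);
});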
Building on Yair's answer, you can define the ajax promises dynamically.
var start = 1; // starting value
var len = 2;   // no. of requests

var promises = (new Array(len)).fill().map(function() {
    return $.ajax("/values/" + start++); // was `i++` in the original, but no `i` exists here
});

$.when.apply($, promises)
    .then(myFunc, myFailure);
Suppose you have an array of file names.
var templateNameArray = ["test.html", "test2.html", "test3.html"];
var htmlTemplatesLoadStateMap = {};
var deferreds = []; // was misspelled `deffereds`, which broke the push below
for (var i = 0; i < templateNameArray.length; i++)
{
    if (!htmlTemplatesLoadStateMap[templateNameArray[i]])
    {
        deferreds.push($.get("./Content/templates/" + templateNameArray[i],
            function (response, status, xhr) {
                if (status == "error") { }
                else {
                    $("body").append(response);
                }
            }));
        htmlTemplatesLoadStateMap[templateNameArray[i]] = true;
    }
}
// plain jQuery has no $.when.all(); apply the array of deferreds instead
$.when.apply($, deferreds).always(function () {
    yourfunctionTobeExecuted(yourPayload);
});
I needed multiple, parallel ajax calls, and the jQuery $.when syntax wasn't amenable to the full $.ajax format I am used to working with. So I just created a setInterval timer to periodically check whether each of the ajax calls had returned; once they all had, I could proceed from there (a sketch follows below).
I read there may be browser limitations as to how many simultaneous ajax calls you can have going at once (2?), but $.ajax is inherently asynchronous, so making the ajax calls one by one would still result in parallel execution (within the browser's possible limitation).
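For illustration, a minimal sketch of that polling approach (the URLs and the 100 ms interval are placeholders):
// Fire all requests, count completions, and poll with setInterval
// until every call has returned.
var pending = 0;
var responses = [];

['/a.json', '/b.json', '/c.json'].forEach(function(url, i) {
    pending++;
    $.ajax({
        url: url,
        complete: function(xhr) {
            responses[i] = xhr.responseText;
            pending--;
        }
    });
});

var timer = setInterval(function() {
    if (pending === 0) {
        clearInterval(timer);
        // all calls have returned; proceed with `responses`
        console.log(responses);
    }
}, 100);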