Is there a design pattern to manage parallel AJAX queries? - javascript

I am developing a web application that retrieves data from several web services (let's say only two, to simplify). What has to be retrieved from one service does not depend on what has been retrieved from the other, so I can launch the AJAX requests in parallel. I then need to perform some actions once both queries have returned their data. Since this seems like a very common situation, I am wondering if there is a well-formalised and accepted design pattern for it. What I am doing so far is this (using jQuery):
var data1 = null;
var data2 = null;

$.ajax({
    url: url1,
    success: function(data) {
        data1 = data;
        if (data2) perform();
    }
});

$.ajax({
    url: url2,
    success: function(data) {
        data2 = data;
        if (data1) perform();
    }
});

function perform() {
    // do interesting stuff on data1 and data2
}
Would you do it like that as well?

You can use jQuery's $.when() method (see https://api.jquery.com/jquery.when/), which takes a list of Deferred objects (all jQuery Ajax methods return Deferred objects) and provides a single callback once they have all resolved.
Syntax:
$.when(
    // Deferred object (probably Ajax request),
    // Deferred object (probably Ajax request),
    // Deferred object (probably Ajax request)
).then(function() {
    // All have been resolved (or rejected), do your thing
});
Example:
$.when($.ajax("/page1.php"), $.ajax("/page2.php"))
    .then(myFunc, myFailure);
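To sketch what those callbacks receive (myFunc and myFailure are just the placeholder names from the example above): with multiple Deferreds, the success callback gets one argument per request, each a [data, textStatus, jqXHR] triple.

function myFunc(page1Result, page2Result) {
    // each argument is a [data, textStatus, jqXHR] triple
    var data1 = page1Result[0];
    var data2 = page2Result[0];
    // do interesting stuff on data1 and data2
}

function myFailure(jqXHR, textStatus, errorThrown) {
    // called with the arguments of the first request that failed
    console.error('Request failed:', textStatus);
}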

When I have multiple ajax queries I usually like to keep a list of URLs. I create a list of promises and apply the $.when function to them. Something like this:
var urls = [url1, url2];
var endpoints = [];
for (var i = 0; i < urls.length; i += 1) {
    endpoints.push($.ajax(urls[i]));
}
$.when.apply($, endpoints).done(function () {
    // The shape of arguments differs depending on whether we have one or more endpoints.
    // With one endpoint, arguments is a single [data, textStatus, jqXHR] array.
    // With more than one endpoint, arguments is an array of such arrays: [[data, textStatus, jqXHR], ...].
    // Normalize the single-endpoint case to the generic list form.
    var args = endpoints.length > 1 ? arguments : [arguments];
});
Or, more concisely:
var urls = ['page1', 'page2'];
$.when.apply($, $.map(urls, $.ajax)).done(function () {
    console.log(arguments);
});
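If you only want the response payloads, here is a sketch for unpacking arguments into plain data (it assumes at least two URLs, per the caveat above, and uses Array.prototype.map rather than $.map because $.map flattens array responses):

$.when.apply($, $.map(urls, $.ajax)).done(function () {
    // Each argument is a [data, textStatus, jqXHR] triple; keep only the data
    var datasets = Array.prototype.map.call(arguments, function (args) {
        return args[0];
    });
    console.log(datasets);
});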

Related

Removing values from array and adding them to POST method request

I have a cart in my AngularJS app and I need to identify the items by their id in a POST request.
They are stored in an array called patent_ids and appended to the request like so:
var cartArr = [];
var cartItems = ngCart.getItems();
cartItems.forEach(function(currentValue, index, array) {
    cartArr.push(currentValue._id);
});

var obj = {
    patent_id: cartArr
};

fetchBasketPatents(obj);

function fetchBasketPatents(obj) {
    // load of code to call service and pass the ids
}

var REST_SERVICE_URI = 'http://localhost:8080/p3sweb/rest-basket/';

factory.fetchBasketPatents = function(ids) {
    var deferred = $q.defer();
    $http.post(REST_SERVICE_URI, ids) // REQUEST TO BACK END WITH IDS
        .then(
            function(response) {
                deferred.resolve(response.data);
            },
            function(errResponse) {
                deferred.reject(errResponse);
            }
        );
    return deferred.promise;
};
It works, but it would be easier if, rather than sending the ids in a named array, they could be sent as an anonymous array, or better still, each item extracted and inserted into an anonymous object.
Question
How do I send the ids in the POST request in either an anonymous array or object? Apologies if I am missing something obvious.
How do I send the ids in the POST request in either an anonymous array or object?
From the documentation:
$http.post('/someUrl', data, config).then(successCallback, errorCallback);
where data – {string|Object}
So the answer is: you cannot send an anonymous array, only an object (as you already did).

List of AJAX requests to call later with $.when gets called before time

I'm trying to build a list of ajax requests to be fed into the $.when() method, so I can do something when all requests are finished (like this answer exemplifies: https://stackoverflow.com/a/24050887/1952996 ).
I came up with this code (jsfiddle):
function getUserData(username) {
    return $.ajax({
        url: "/echo/json/",
        data: {
            json: $.toJSON({"username": username}),
            delay: 3
        },
        type: "POST",
        success: function(response) {
            console.log(response);
        }
    });
}
var userList = ["userA", "userB"];
var userRequests = [];
userList.forEach(function(u) {
    userRequests.push(getUserData(u));
});
When I run this, I see the data get logged to the console. Why is that? I thought that, by building the AJAX request this way, I was returning the AJAX deferred object and merely storing those objects in userRequests; they should not have been executed yet. What am I missing?
jQuery.ajax is designed so that it fires the AJAX request the moment it is called, and returns the AJAX deferred object. Your code is exactly equivalent to the following:
function getUserData(username) {
    var result = $.ajax({
        url: "/echo/json/",
        data: { json: $.toJSON({"username": username}), delay: 3 },
        type: "POST",
        success: function(response) {
            console.log(response);
        }
    });
    return result;
}
It clearly shows that a request is made at the moment of function call.
You can now use $.when to wait until all requests are completed.
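For instance, a minimal sketch using the userRequests array built above:

$.when.apply($, userRequests).done(function () {
    // One [data, textStatus, jqXHR] triple per request, in the same order as userRequests
    console.log('all user requests completed');
});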
P.S. By the way, you can replace your forEach:
var userList = ["userA", "userB"];
var userRequests = [];
userList.forEach(function(u) {
    userRequests.push(getUserData(u));
});
with a shorter version using map:
var userList = ["userA", "userB"];
var userRequests = userList.map(getUserData);

Multiple ajax calls fired simultaneously not working properly

I created a site which loads data from multiple sources via AJAX every few seconds. However, I am experiencing some strange behavior. Here is the code:
function worker1() {
    var currentUrl = 'aaa.php?var=1';
    $.ajax({
        cache: false,
        url: currentUrl,
        success: function(data) {
            alert(data);
        },
        complete: function() {
            setTimeout(worker1, 2000);
        }
    });
}

function worker2() {
    var currentUrl = 'aaa.php?var=2';
    $.ajax({
        cache: false,
        url: currentUrl,
        success: function(data) {
            alert(data);
        },
        complete: function() {
            setTimeout(worker2, 2000);
        }
    });
}
The problem is that one of the workers frequently returns NaN. If I change the delays to, let's say, 2000 and 1900 ms, then everything works fine and I get almost no NaN results. When the delays are the same, I get over 80% NaN results for one of the calls. It seems as though the browser cannot handle two requests fired at exactly the same time. I use only these two workers, so the browser shouldn't be overloaded with AJAX requests. Where is the problem?
Note that aaa.php works with a MySQL database and does some simple queries based on the parameters in the URL.
All you need is $.each and the two-parameter form of $.ajax:
var urls = ['/url/one', '/url/two' /* , ... */];
$.each(urls, function(i, u) {
    $.ajax(u, {
        type: 'POST',
        data: {
            answer_service: answer,
            expertise_service: expertise,
            email_service: email
        },
        success: function(data) {
            $(".anydivclass").text(data);
        }
    });
});
Note: the messages generated by the success callback will overwrite each other as shown. You'll probably want to use $('#divid').append() or similar in the success function.
Maybe don't use these workers, and use promises instead, as below? I can't say anything about the errors being returned without seeing the server code, but below is working code for what it looks like you are trying to do.
This is a simple example, but you could also use a different resolver for each URL with an object ({url: resolverFunc}) and iterate using Object.keys; a sketch of that variant follows the snippet.
var urls = [
    'http://jsonplaceholder.typicode.com/users/1',
    'http://jsonplaceholder.typicode.com/users/2',
    'http://jsonplaceholder.typicode.com/users/3',
    'http://jsonplaceholder.typicode.com/users/4',
    'http://jsonplaceholder.typicode.com/users/5',
    'http://jsonplaceholder.typicode.com/users/6',
    'http://jsonplaceholder.typicode.com/users/7'
];

function multiGet(arr) {
    var promises = [];
    for (var i = 0, len = arr.length; i < len; i++) {
        promises.push($.get(arr[i])
            .then(function(res) {
                // Do something with each response
                console.log(res);
            })
        );
    }
    // Spread the array into individual arguments;
    // $.when(promises) would resolve immediately with the array itself.
    return $.when.apply($, promises);
}

multiGet(urls);
<script src="https://ajax.googleapis.com/ajax/libs/jquery/2.1.1/jquery.min.js"></script>
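For completeness, a sketch of the resolver-map variant mentioned above (the names resolvers and multiGetWithResolvers are made up for illustration):

var resolvers = {
    'http://jsonplaceholder.typicode.com/users/1': function (user) { console.log(user.name); },
    'http://jsonplaceholder.typicode.com/users/2': function (user) { console.log(user.email); }
};

function multiGetWithResolvers(map) {
    var promises = Object.keys(map).map(function (url) {
        // each URL is handled by its own resolver
        return $.get(url).then(map[url]);
    });
    return $.when.apply($, promises);
}

multiGetWithResolvers(resolvers);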

Use data from two separate JSON files in one function

I have two local JSON files that I'm trying to access in a function in my main.js file. Everything works fine with just one JSON file, but I'm not sure how to incorporate the second. Ideally, something like data_set1 = $.getJSON("file1.json") would work perfectly, but I see similar questions have been asked repeatedly, and because the calls are asynchronous that's not necessarily possible (I don't completely understand all the answers to those questions).
This works as it is:
$.getJSON("data.json", function(json){
var data_points = [];
for (i = 0; i < json.length; i++){
data_points.push([json[i].name, json[i].age]);
}
$(function () {
//do stuff with data_points
but I don't know how to incorporate the second JSON call to make another list for the function at the end to use.
You can use jQuery deferred loading.
var xFile, yFile;

var requestX = $.getJSON("data1.json", function(json) {
    xFile = json;
});

var requestY = $.getJSON("data2.json", function(json) {
    yFile = json;
});

$.when(requestX, requestY).then(function() {
    // do something;
    // this function only gets called when both requestX & requestY complete.
});
Check out jQuery's $.when.
An ajax request is ONE request for ONE resource. You can't fetch two different resources with the same request. You'd either need TWO requests:
var completed = 0;

$.getJSON('file1.json', function(json) {
    completed++;
    if (completed == 2) { all_done(); }
});

$.getJSON('file2.json', function(json) {
    completed++;
    if (completed == 2) { all_done(); }
});

function all_done() { /* ... */ }
Or simply combine your two json files into a single one on the server:
{"file1":{data from file one here}, "file2":{data from file two here}}
and access them as data.file1 and data.file2 in your code.
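Client side, that would look something like this (combined.json is a made-up name for the merged file):

$.getJSON('combined.json', function(data) {
    var first = data.file1;   // data from file one
    var second = data.file2;  // data from file two
    // use both here
});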
You could just set a var to indicate the file is loaded and call the same function while checking the status. This might be a bad idea depending on the amount of data in the JSON.
var xFileLoaded = false, yFileLoaded = false;
var xFile, yFile;

$.getJSON("data1.json", function(json) {
    xFile = json;
    xFileLoaded = true;
    doSomething();
});

$.getJSON("data2.json", function(json) {
    yFile = json;
    yFileLoaded = true;
    doSomething();
});

function doSomething() {
    if (xFileLoaded && yFileLoaded) {
        // good to go
    }
}
Using something very similar to #MarcB's solution, I would keep the files in an array and generalize things slightly:
var files = ["file1.json", "file2.json"];
var results = {};
var completed = 0;
function afterEach(fileName, json) {
results[fileName] = json;
if (++completed >= files.length) {
afterAll();
}
}
files.forEach(function (file) {
$.getJSON(file, afterEach.bind(this, file));
});
This requests all the files in parallel, letting the browser choose how many connections to actually open. As each request finishes, its result (json) is stored in the results hash under the file name. After all files have completed (assuming files doesn't change), the afterAll function is called.
jQuery offers an elegant way to do this:
$.when(
    $.ajax({
        url: 'file1.json',
        success: function(data) {
            // Treatment
        }
    }),
    $.ajax({
        url: 'file2.json',
        success: function(data) {
            // Treatment
        }
    })
).then(function() {
    // Both calls have completed
});
Once the two calls have completed, you can do any manipulation you like in the "then" block.
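To actually read both results there, note that the then callback receives one [data, textStatus, jqXHR] triple per request; a sketch, reusing the two files from above:

$.when($.getJSON('file1.json'), $.getJSON('file2.json'))
    .then(function(res1, res2) {
        var data1 = res1[0];
        var data2 = res2[0];
        // combine data1 and data2 here
    });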

Read a json file and save the data (as json) in list

I am new to JavaScript and I'm trying to read data from a file and save it in a list so that I can retrieve it later. Something like:
sample.json
{
    "circles": [
        {"x": 14, "y": 2, "z": 4, "r": 50},
        {"x": 14, "y": 2, "z": 4, "r": 50}
    ]
}
What I am looking for is:
var circle_list = [];
circle_list = some_function_which_reads_json_from_file('sample.json');
// Now circle_list contains:
circle_list = [
    {"x": 14, "y": 2, "z": 4, "r": 50},
    {"x": 14, "y": 2, "z": 4, "r": 50}
]
Later I can just do something like:
for (var i = 0; i < circle_list.length; i++) {
    // do something
}
I was looking into
$.getJSON("sample.json" , function(data){
//
});
But then I learned that this call is asynchronous... and I need to maintain execution order.
Don't try to maintain execution order; the synchronous request below will lock up the browser while it fetches. Use it only as an absolute last resort:
var circle_list;
$.ajax({ dataType: 'json', url: 'sample.json', async: false })
    .done(function(json) {
        circle_list = json;
    });
Instead, you can do at least a couple of other things:
- Work with callbacks. It doesn't have to all be in one spot, thanks to deferreds; see the sketch below.
- Load the JSON along with the rest of the page and fetch it from the DOM.
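A minimal sketch of the callback approach, assuming sample.json has the shape shown in the question:

var circle_list = [];

$.getJSON('sample.json').done(function(json) {
    circle_list = json.circles;
    // anything that depends on circle_list goes here (or in further .done handlers)
    for (var i = 0; i < circle_list.length; i++) {
        // do something with circle_list[i]
    }
});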
