How to resolve http response before loading data-table in angularJS? - javascript

In the code below, I first call a service that makes an HTTP request.
Then, from the response, I build a map that I use later.
Next, while setting up the data-table, I make a second HTTP request and use the map above to do some work on the data before displaying it.
Problem: I know $http takes some time to respond. I am trying to use a promise but I am failing to do so. Please suggest how I can use a promise so that the first HTTP call is resolved and the map is created before the second HTTP call.
//Call to service to do an http call
MasterServices.getAllCustomers().then(function(result) {
    $scope.resultdata = result.data;
    $scope.resultdata.forEach(element => {
        //creating map holding id, name
        $scope.oumap.set(element.companyId, element.companyName);
    });
});
//Setting Data-Table
vm.dtOptions = DTOptionsBuilder.fromFnPromise(function() {
    var defer = $q.defer();
    //Calling http call to get some configuration data
    MasterServices.getCompConfig().then(function(result) {
        angular.forEach(result.data, function(val) {
            if ($scope.oumap.has(val.compId)) {
                val.companyName = $scope.oumap.get(val.compId);
            } else {
                val.companyName = " ";
            }
        });
        defer.resolve(result.data);
    });
    return defer.promise;
}).withPaginationType('full_numbers').withOption('createdRow', createdRow);

Depending on the ECMAScript version you are targeting, this would be much simpler written with async/await.
// You may need to use a try-catch to account for the http request failing
const result = await MasterServices.getAllCustomers();
$scope.resultdata = result.data;
$scope.resultdata.forEach(element => {
    //creating map holding id, name
    $scope.oumap.set(element.companyId, element.companyName);
});
// Http response is complete, continue doing other things.
If you cannot use the above for some reason, you need to place your "Setting Data-Table" section of code inside the success callback of the first promise.
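A minimal sketch of that chaining idea with plain Promises, using stand-ins for the $http-backed services (the service and field names mirror the question, but the stubbed data is illustrative):

```javascript
// Stand-ins for MasterServices.getAllCustomers / getCompConfig.
function getAllCustomers() {
  return Promise.resolve({ data: [{ companyId: 1, companyName: 'Acme' }] });
}
function getCompConfig() {
  return Promise.resolve({ data: [{ compId: 1 }, { compId: 2 }] });
}

const oumap = new Map();

// The second request is only issued inside the first .then, so the
// map is guaranteed to be populated before it is used.
const rows = getAllCustomers()
  .then(result => {
    result.data.forEach(el => oumap.set(el.companyId, el.companyName));
    return getCompConfig();
  })
  .then(result => {
    result.data.forEach(val => {
      val.companyName = oumap.has(val.compId) ? oumap.get(val.compId) : ' ';
    });
    return result.data;
  });
```

Returning a chain like this from fromFnPromise would also remove the need for the manual $q.defer(), since .then already produces a promise.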


Is there a way to trigger the done function on a jQuery AJAX call?

I have a function that should only continue after an AJAX call has been completed, but I want to be able to skip that AJAX call if I already have the data from last session in my localstorage.
My current code:
$.when(getAddresses()).done(function (data) {
    addresses = data.data;
    localStorage['addresses'] = JSON.stringify(addresses);
    // Rest of the code that should be executed after the AJAX call
});
Thanks in advance!
Do it the other way around.
Check for the data locally and don't even send the request if you already have it.
Wrap the data in a promise so it will always have the same API no matter where you fetch it from.
async function get_data() {
    let addresses = localStorage.getItem('addresses');
    if (addresses) {
        return JSON.parse(addresses);
    }
    let ajaxData = await getAddresses();
    addresses = ajaxData.data;
    localStorage.setItem('addresses', JSON.stringify(addresses));
    return addresses;
}

get_data().then(data => {
    // Rest of the code that should be executed after the AJAX call
});
Another approach would be to forget about localStorage and just have the web service set suitable caching headers. Then you can still make the HTTP request, but if the cache information shows that the browser cache contains up-to-date data, no network request is made at all.
You don't need to reinvent local caching of data. HTTP has it baked in.
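For illustration, a response carrying headers like these (the values here are made up) lets the browser reuse the payload for an hour without touching the network, and revalidate it cheaply afterwards:

```http
Cache-Control: max-age=3600
ETag: "v1-addresses"
```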

Chained jQuery AJAX with promise

I am currently working on a project where 4 GET requests are fired simultaneously. I am at the same time using fade effects, and the asynchronous nature of this intermittently results in empty data.
I have been looking into the method described in "Prefer way of doing multiple dependent ajax synchronous call" to replace how I am currently doing it:
$.get('ajax_call_1').then(function(value) {
    return $.get('ajax_call_2');
}).then(function(result) {
    // success with both here
}, function(err) {
    // error with one of them here
});
But my question is: how can I access the return value of each request individually with the above?
You've said the requests are sent simultaneously. The way you've written your code, they are sent sequentially though. Instead, with Promise.all, you can wait for all of the requests' promises and you'll be given an array with the results:
Promise.all([
    $.get('ajax_call_1'),
    $.get('ajax_call_2'),
    $.get('ajax_call_3'),
    $.get('ajax_call_4')
]).then(function(results) {
    var first = results[0];
    var second = results[1];
    // ...
}).catch(function(err) {
    // called if one of the requests fails
});
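One detail worth relying on: Promise.all keeps the results in the same order as the input array, no matter which request finishes first. A self-contained sketch with timers standing in for the GET requests:

```javascript
// delay(ms, value) stands in for an ajax call taking ms milliseconds.
const delay = (ms, value) =>
  new Promise(resolve => setTimeout(() => resolve(value), ms));

Promise.all([
  delay(30, 'first'),   // slowest "request"
  delay(10, 'second'),
  delay(20, 'third')
]).then(results => {
  // Input order is preserved: ['first', 'second', 'third']
  console.log(results);
});
```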

Execution architecture of a series of chained promises when page changes

I have a series of OData calls which I make from a JavaScript file (myfile.js) using Promises, as shown below. The main entry point is the function MakePreferredCustomer(AccountNo). This function is called on load of a web page, Page1.htm, which also loads the JavaScript file myfile.js. This series of OData calls (there are more than the 5 shown below) takes around 180-200 seconds to complete. It is async, so the page load of Page1.htm is not affected, and the user does not notice that these calls are happening in the background (which is a requirement: this has to be done without asking the user). But the problem is: what happens if the user switches to some other page before the 180-200 seconds are up? I am not sure whether the async tasks keep running in the background or stop partway through (depending on when the user moves away from Page1.htm).
Is there a definite way this will behave, or will it vary depending on the browser being used or some other external criteria? Please guide.
function MakePreferredCustomer(AccountNo)
{
    GetAccountDataFromServer() //This internally does "return Common.MakeWebApiRequest"
    .then(function HandleAccountDataResponse(request) {
        //Parse the response using JSON.parse() and get account no
        //Make WebApiRequest to get more details using account no
        return Common.MakeWebApiRequest("GET", uri, additionalAccountData, headers);
    })
    .then(function HandleAdditionalAccountDetails(request) {
        //Parse the response using JSON.parse() and get additional account details
        //Store these details in additional variables in this function scope
        //Make OData call to get product details
        return Common.MakeWebApiRequest("GET", uri, Productdata, headers);
    })
    .then(function HandleProductDetails(request) {
        //Parse the response using JSON.parse() and get product details
        //Check if this account had earlier purchased this product
        //Make OData call to get no. of times this product should be purchased to become a preferred customer
        return Common.MakeWebApiRequest("GET", uri, PolicyData, headers);
    })
    .then(function HandlePolicyDetails(request) {
        //Parse the response using JSON.parse() and get policy details
        //Check no. of times the product should be bought and how many times this account has bought it
        //If the condition is met, update preferred customer details with this account as per the OData call below
        Common.MakeWebApiRequest("POST", uri, ThisAccountIsNewPreferredCustomerData, headers);
    })
    .catch(function HandleException(e) {
    });
}
Each of the functions calls a common function, MakeWebApiRequest:
Common.MakeWebApiRequest = function (action, uri, data, headers)
{
    //Do basic checks on input arguments
    return new Promise(function (resolve, reject)
    {
        var request = new XMLHttpRequest();
        request.open(action, encodeURI(uri), true); //async call
        //Set OData-specific headers using request.setRequestHeader
        request.onreadystatechange = function ()
        {
            if (this.readyState === 4)
            {
                //Do handling
            }
        };
        if ("POST" === action && data === "")
        {
            request.send();
        }
        else if ("GET" === action)
        {
            request.send();
        }
        else
        {
            request.send(JSON.stringify(data));
        }
    });
}
When the user navigates away, the XMLHttpRequests will be aborted (or at least ignored on the client side), and the load handlers will be dropped and never executed. So no, JavaScript processes do not continue to work in the background.
The async requests, once initiated, will continue. As well, the promise handlers will also get invoked and run when the dependent Promise fulfills (or is rejected).
If these processes are independent of the content on Page1.html, then you should be okay. If there is a dependency (for example, one of your promise handlers kicks off and relies on data available within Page1), then you will have to find a way to put those dependencies within the handlers or preserve what you need through closures.
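The closure suggestion can be sketched like this: snapshot whatever the handlers need from the page synchronously, before the chain starts, so nothing inside the chain reads live page state (the names here are illustrative, not from the original code):

```javascript
// Simulated page state that may be torn down when the user navigates.
let pageState = { accountNo: '42' };

// Snapshot the value before any async work begins; the handler closes
// over the copy, not over pageState itself.
const accountNo = pageState.accountNo;

const done = Promise.resolve()            // stands in for the OData chain
  .then(() => {
    pageState = null;                     // page content disappears mid-flight
    return { accountNo: accountNo };      // still safe: uses only the closure
  });
```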

Multiple AJAX calls in loop

I need a way to send multiple AJAX calls at the same time in JavaScript/Angular.
After some searching I couldn't find an answer.
What I want to do is send all my requests as fast as possible.
If I execute my calls in a for loop or in a queue of promises with the $q library in Angular, a request gets sent, waits for its callback to execute, and only then is the next one sent.
This is some example code:
var array = [];
Page.get({id: 1}, function(result) {
    for (var i = 0; i < result.region[0].hotspots.length; i++) {
        var promise = Hotspot.get({id: result.region[0].hotspots[i].id});
        array.push(promise);
    }
    $q.all(array).then(function(data) {
        console.log(data);
    });
});
Page is an Angular resource with a get method which requires an ID.
What I want is for them all to be sent at the same time, each invoking its callback when ready. The order in which the calls return doesn't really matter.
Thanks
Think outside the box with Web Workers
An interesting approach to this question is to use web workers to execute the requests in a different thread. If you are not familiar with web workers, I advise you to start with this great tutorial by techsith. Basically, you will be able to execute multiple jobs at the same time. See also the W3Schools documentation.
This article from Html5Rocks teaches how to use Web Workers without a separate script file.
Have you tried using the Async.js module?
You can achieve the desired behavior using something like:
Page.get({id: 1}, function(result) {
    async.each(result.region[0].hotspots, callAsync, function(err, res) {
        console.log(res);
    });
});

function callAsync(hotspot, callback) {
    callback(null, Hotspot.get({id: hotspot.id}));
}
From the docs:
each(coll, iteratee, [callback])
Applies the function iteratee to each item in coll, in parallel. The iteratee is called with an item from the list, and a callback for when it has finished. If the iteratee passes an error to its callback, the main callback (for the each function) is immediately called with the error.
The $http service sends XHRs in parallel. The code below demonstrates 10 XHRs being sent to httpbin.org and subsequently being received in a different order.
angular.module('myApp').controller('myVm', function ($scope, $http) {
    var vm = $scope;
    vm.sentList = [];
    vm.rcvList = [];
    //XHRs with delays from 9 down to 1 seconds
    for (var n = 9; n > 0; n--) {
        var url = "https://httpbin.org/delay/" + n;
        vm.sentList.push(url);
        $http.get(url).then(function (response) {
            vm.rcvList.push(response.data.url);
        });
    }
    //XHR with 3 second delay
    var url = "https://httpbin.org/delay/3";
    vm.sentList.push(url);
    $http.get(url).then(function (response) {
        vm.rcvList.push(response.data.url);
    });
});
The DEMO on JSFiddle.
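The timing claim can be checked with plain Promises as well: three 50 ms "requests" started together complete in roughly 50 ms total rather than 150 ms (the delays are stand-ins for XHRs):

```javascript
const delay = ms => new Promise(resolve => setTimeout(resolve, ms));

const start = Date.now();
// All three timers are started before any of them is awaited.
Promise.all([delay(50), delay(50), delay(50)]).then(() => {
  const elapsed = Date.now() - start;
  // elapsed is close to 50, not 150, because the waits overlap.
  console.log(elapsed);
});
```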

Non-deterministic Backbone Collection property assignment in console

I'm building a Backbone application and I'm observing some behaviour I can't place. Consider the following Collection:
window.Pictures = Backbone.Collection.extend({
    model: Picture,
    url: 'latest.json',
    parse: function(response) {
        this.foobar = 1;
    },
    fetchPage: function() {
        this.foobar = 2;
        return this;
    }
});
On a Chrome (or Firefox) console I've issued the following command:
> p = new Pictures(); p.fetch(); p.fetchPage();
> p.foobar
1
When I do:
> p = new Pictures(); p.fetch()
> p.fetchPage();
> p.foobar
2
I really don't understand this. Why is the first execution different from the second execution?
The fetch call is asynchronous because it involves an AJAX call to the server:
fetch collection.fetch([options])
Fetch the default set of models for this collection from the server, resetting the collection when they arrive.
And fetch will call parse:
parse collection.parse(response)
parse is called by Backbone whenever a collection's models are returned by the server, in fetch.
So p.parse() may be called before or after p.fetchPage() depending on timing issues that are beyond your control.
In the first case:
> p = new Pictures(); p.fetch(); p.fetchPage();
fetchPage is getting called before fetch gets its response from the server and gets around to calling parse, so the calling sequence ends up like this:
1. You call p.fetch().
2. The AJAX call is made.
3. You call p.fetchPage().
4. The AJAX response is received.
5. The AJAX success handler calls p.parse().
In the second case:
> p = new Pictures(); p.fetch()
> p.fetchPage();
Enough time passes between the lines for the AJAX call to return before p.fetchPage() is called, so things happen in the order you expect purely by accident.
If you need things to happen in a certain order then you'll need to use the success (and possibly error) callback that fetch provides:
The options hash takes success and error callbacks which will be passed (collection, response) as arguments.
So this should give you a consistent result of 2:
p = new Pictures();
p.fetch({
    success: function(collection, response) {
        collection.fetchPage();
        console.log(collection.foobar);
    }
});
Of course, if fetchPage involves an AJAX call then you'd have to add yet another layer of callbacks to get a consistent foobar value.
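Since fetch returns the underlying jqXHR, which is then-able in recent jQuery versions, the same sequencing can also be written as a promise chain. A sketch with stubbed methods that mirror the timing (the stubs are illustrative, not Backbone itself):

```javascript
// Stub collection: fetch() resolves asynchronously and runs parse,
// mirroring the real timing; fetchPage() is synchronous.
const p = {
  foobar: 0,
  fetch() {
    return Promise.resolve().then(() => { this.foobar = 1; }); // parse runs here
  },
  fetchPage() {
    this.foobar = 2;
    return this;
  }
};

// Chaining guarantees fetchPage runs only after parse has finished,
// so foobar consistently ends up as 2.
const ready = p.fetch().then(() => p.fetchPage());
```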
