I am using AngularJS in my project.
I am able to fetch records from the database and bind them in an HTML page. Here I need to get data from 4 collections in the database, so I have to perform several server calls to get the data and assign everything to separate scope variables. My sample code is below:
var click = function () {
    $http.get('/CalBuildingget').then(function (response) {
        $scope.ViewBuildings = response.data;
    });
    for (var i = 0; i < $scope.ViewBuildings.length; i++) {
        $http.get('/CalBuildingFloorget/' + $scope.ViewBuildings[i]._id).then(function (response) {
            $scope.floorDetails = response.data;
        });
    }
};
Here I need to fetch the floors for each building by its id and store them on the building scope as an array, and then, by floor id, fetch the units, which again requires server calls and assigning the results to the scope.
How can I achieve that? At the moment the loop runs first, and only then does the server call for the buildings start.
You need to fetch the floors in the success callback of the first request.
So something like this:
var click = function () {
    $http.get('/CalBuildingget').then(function (response) {
        $scope.ViewBuildings = response.data;
        for (var i = 0; i < $scope.ViewBuildings.length; i++) {
            $http.get('/CalBuildingFloorget/' + $scope.ViewBuildings[i]._id).then(function (response) {
                $scope.floorDetails = response.data;
            });
        }
    });
};
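The question also mentions fetching units per floor. A minimal sketch of extending the same pattern one level deeper is below; the '/CalBuildingUnitget/' endpoint name and the floors/units properties are assumptions, not part of the original code:

var click = function () {
    $http.get('/CalBuildingget').then(function (response) {
        $scope.ViewBuildings = response.data;
        angular.forEach($scope.ViewBuildings, function (building) {
            // Fetch the floors for this building once the buildings have arrived
            $http.get('/CalBuildingFloorget/' + building._id).then(function (floorResponse) {
                building.floors = floorResponse.data;
                angular.forEach(building.floors, function (floor) {
                    // Fetch the units for this floor; '/CalBuildingUnitget/' is an assumed endpoint name
                    $http.get('/CalBuildingUnitget/' + floor._id).then(function (unitResponse) {
                        floor.units = unitResponse.data;
                    });
                });
            });
        });
    });
};

Storing the floors and units on the building and floor objects themselves avoids overwriting a single $scope.floorDetails on every iteration.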
You'll hurt the overall performance of your application with the approach you are using. Are you sure you want to send an HTTP call in a loop? Think of the case where you have around 1000 records: can you afford to send 1000 HTTP calls to the server? Instead, why don't you return the floor details as part of the /CalBuildingget/ response?
Never send HTTP calls in a loop; think of network bandwidth and application performance.
For multiple subsequent service calls you should always use promises. Conceptually it should look like the example below:
function callServiceForEachItem() {
    var promise;
    angular.forEach(items, function (item) {
        if (!promise) {
            // First time through so just call the service
            promise = fakeService.doSomething(item);
        } else {
            // Chain each subsequent request
            promise = promise.then(function () {
                return fakeService.doSomething(item);
            });
        }
    });
}
Use this link for best practices on performing chained service calls.
You can also check this discussion.
Related
I am pretty new to Ionic 1 and I am working on an application (with Ionic 1 and AngularJS) with multiple URLs, where each URL brings up a list of categories, followed by a list of items for each category, and each item has a document URL. How do I preload all these URLs on launch in the background without displaying them? Is there any way this can be achieved? A good code sample or tutorial would help greatly.
Also, please let me know whether pre-loading and pre-caching all content upon launch is the best approach, or whether it should be done category by category or some other way.
Thanks in advance!
You can make multiple asynchronous service calls in the background using $q.
Make a list of URLs in an array and call them at once using $q.all(listOfURLs).
Using promises, retrieve each response.
By making this asynchronous you can save a lot of time.
After getting the responses you can store them either in $rootScope or in localStorage/sessionStorage.
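A minimal sketch of this approach, assuming a service named preloadService and placeholder endpoint URLs (both are illustrative, not from the question):

app.service('preloadService', ['$http', '$q', function ($http, $q) {
    // List of URLs to preload; replace with the real endpoints of your app
    var listOfURLs = ['/api/categories', '/api/items', '/api/documents'];

    this.preloadAll = function () {
        // Fire all requests at once and wait for every one of them to finish
        var requests = listOfURLs.map(function (url) {
            return $http.get(url);
        });
        return $q.all(requests).then(function (responses) {
            // Cache each response body for later use
            responses.forEach(function (response, index) {
                window.localStorage.setItem(listOfURLs[index], JSON.stringify(response.data));
            });
        });
    };
}]);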
Update - As the OP is already aware of and using localStorage, here are additional suggestions :-
In that case, you could either call all of your service methods that fetch data at startup, or you could use a headless browser such as 'PhantomJS' to visit these URLs at startup and fetch the data.
Thus, your code would look something like :-
var webPage = require('webpage');
var page = webPage.create();
page.open('http://www.google.com/', function(status) {
console.log('Status: ' + status);
// Do other things here...
});
For more information, regarding PhantomJS, please refer to the following links :-
http://phantomjs.org/
http://phantomjs.org/api/webpage/method/open.html
Earlier Suggestions
Make an HTTP request in your service to fetch the data and store it to localStorage, as is shown below :-
$http.get('url').then(function (response) {
    var obj = response.data;
    window.localStorage.setItem('key', JSON.stringify(obj)); // Store data to localStorage for later use
});
For fetching data :-
var cachedData = JSON.parse(window.localStorage.getItem('key')); // Load cached data stored earlier
Please refer to the following link for detailed information regarding 'localStorage' :-
https://www.w3schools.com/html/html5_webstorage.asp
Hope this helps!
The best way to share data between different views in Angular is to use a service, as it is a singleton and can be used in other controllers.
In your main controller you can prefetch your lists of categories asynchronously through a service, which can then be shared with the next views. Below is a small demo you can refer to:
angular.module("test").service("testservice", function ($http, $q) {
    var lists = undefined;
    // fetch all lists using the deferred technique
    this.getLists = function () {
        // if the lists object is not defined yet, start the fetch
        if (!lists) {
            // create a deferred object using $q
            var deferred = $q.defer();
            // get the lists from the backend
            $http.get(URL)
                .then(function (result) {
                    // save the fetched lists to the local variable
                    lists = result.data;
                    // resolve the deferred
                    deferred.resolve(lists);
                }, function (error) {
                    // handle error
                    deferred.reject(error);
                });
            // set the lists object to be a promise until the result comes back
            lists = deferred.promise;
        }
        // in any case wrap the lists object with $q.when, which means:
        // the local lists object could be:
        //   - a promise
        //   - the real lists data
        // both cases are handled as a promise, because $q.when on real data resolves it immediately
        return $q.when(lists);
    };
    this.getLists2 = function () {
        // do it similarly as above
    };
}).controller("mainController", function (testservice, $scope) {
    testservice.getLists().then(function (lists) {
        $scope.lists1 = lists;
        // do something
    });
    testservice.getLists2().then(function (lists) {
        $scope.lists2 = lists;
        // do something
    });
}).controller("demoController1", function (testservice, $scope) {
    // the service is a singleton, so the data cached above is reused here
    testservice.getLists().then(function (lists) {
        $scope.lists1 = lists;
        // do something
    });
    testservice.getLists2().then(function (lists) {
        $scope.lists2 = lists;
        // do something
    });
});
I am assuming you don't want to load data on the next screens, to deliver the user a flawless experience.
Yes, you can start loading the URLs on your very first page, since you want them to fetch the data you will use on future screens.
In terms of storage:
In AngularJS, if you want something to persist throughout the application scope you should use $rootScope (beware: keeping a lot of data there may lead to memory issues, so you need to clear it regularly; a small sketch of this is shown below).
Or another option is to store it in localStorage and fetch it as per your need.
If you want, you can share those arrays between the different controllers of your screens.
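For illustration, a minimal sketch of caching data on $rootScope at startup and clearing it later; the '/api/categories' URL and the preloadedCategories property name are hypothetical:

app.run(['$rootScope', '$http', function ($rootScope, $http) {
    // Preload once at startup and keep the result on $rootScope
    $http.get('/api/categories').then(function (response) {
        $rootScope.preloadedCategories = response.data; // 'preloadedCategories' is a hypothetical name
    });

    // Clear the cached data when it is no longer needed (e.g. on logout),
    // so it does not keep growing in memory
    $rootScope.clearPreloadedData = function () {
        $rootScope.preloadedCategories = null;
    };
}]);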
While loading (getting the response from the server) you can do one of two things:
1. Get a single JSON response containing all the data.
2. Have multiple URLs and load them serially.
Loading the data for the 5th page (screen) in advance is not good practice, and it can even stop the user from seeing updates, but since it's your requirement, we have a couple of approaches:
Add all the categories and their respective details, as per your pastebin, like cardiac then its details, kidney then its details, and so on.
You can do this by managing hierarchies (categories), with the parent main group and its child sub-groups in a JSONArray and the details in a JSONObject. (This change would be on the sender side, i.e. the server.)
You then need to load only one URL to get all the data, so you don't need to load different URLs like you are doing now. But beware: this would be a big JSON. So when you store it, separate the categories and the required data (screen-wise requirements) and store them in local storage for easy access.
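A rough sketch of what such a single hierarchical response could look like; the category names, field names and URLs are assumptions based on the question, not a real API contract:

// Hypothetical shape of the single aggregated response from the server
var aggregatedResponse = {
    "categories": [
        {
            "name": "cardiac",
            "items": [
                { "title": "Item 1", "documentUrl": "http://example.com/doc1.pdf" }
            ]
        },
        {
            "name": "kidney",
            "items": [
                { "title": "Item 2", "documentUrl": "http://example.com/doc2.pdf" }
            ]
        }
    ]
};

// Each category can then be cached separately for screen-wise access
aggregatedResponse.categories.forEach(function (category) {
    window.localStorage.setItem('category_' + category.name, JSON.stringify(category));
});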
Another approach would be to provide the (category) subgroup names to load, so loading means firing the same URL with different category names to get the data and store it in local storage.
This may lead to firing around 10-15 URLs (depending on your categories), which may affect the responsiveness of the UI thread.
This approach won't need any changes to your server-side response.
**Programmatic approach to loading URLs sequentially:**
URL loading: This method gets the details of a particular category (by id or whatever works for you). It fires an HTTP request and returns the result.
function getCategoryDetails(category) {
    url = url + category;
    return $http({
        method: 'GET',
        url: url,
        headers: {} // set headers as needed
    }).then(function onSuccess(response) { //<--- `.then` transforms the promise here
        // You can either store it in local storage or return it
        return response;
    }, function onError(response) {
        throw customExceptionHandler.getErrorMsg(response.status, response.data);
    });
}
Parallel: This method does the loading in parallel; we just load the categories (ids), since we have all of them, and then use $q.all to wait for all of the URL loads to finish.
function loadUrlsParallel(urls) {
    var loadUrls = [];
    for (var i = 0; i < urls.length; i++) {
        loadUrls.push(getCategoryDetails(urls[i]));
    }
    return $q.all(loadUrls);
}
First API: This method loads the first URL; after that, load the remaining URLs in parallel by calling the method above.
function getListOfCategories() {
    url = url;
    return $http({
        method: 'GET',
        url: url,
        headers: {} // set headers as needed
    }).then(function onSuccess(response) { //<--- `.then` transforms the promise here
        // You can either store it in local storage or directly return the response
        return response;
    }, function onError(response) {
        throw customExceptionHandler.getErrorMsg(response.status, response.data);
    });
}
urls: you have to prepare the list of URLs by appending each category, after loading the first URL (expecting that it returns all the categories your app will need beforehand), and pass it to the loadUrlsParallel method, as in the sketch below.
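A small sketch of how these pieces could be tied together; the response shape (response.data.categories) and the cache keys are assumptions for illustration:

// First load the list of categories, then fetch each category's details in parallel.
getListOfCategories().then(function (response) {
    var categories = response.data.categories; // assumed response shape
    return loadUrlsParallel(categories);
}).then(function (detailResponses) {
    // Each entry corresponds to one category's details; cache them screen-wise
    detailResponses.forEach(function (detail, index) {
        window.localStorage.setItem('category_' + index, JSON.stringify(detail.data));
    });
});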
You can write the load methods as per your convenience; whatever is given here is for example purposes and may not run as-is.
You can then read the API responses anywhere from local storage, where you stored them after the API calls, so you won't have to execute API calls on every page (screen) load.
Hope this helps you and solves your problem.
Okay, this might be a long post, but please do not click away; you may know a simple answer.
The case:
Let's say you have built an Angular app where people log into the system, do some operations and then might log out again. The application collects data from an API using a factory and a service, and in order to make the application load even faster you save this data in variables like so:
app.factory("divisionService", function (api, $http, $q) {
    var division = {};
    var divisionArray = [];
    var mergedUserList = [];
    return {
        getList: function () {
            var d = $q.defer();
            if (divisionArray.length <= 0) {
                $http.get(api.getUrl('divisionWithUsers', null))
                    .success(function (response) {
                        divisionArray = response;
                        d.resolve(divisionArray);
                    });
            }
            if (divisionArray.length > 0) {
                d.resolve(divisionArray);
            }
            return d.promise;
        },
This makes sure that if the user uses a controller that depends on divisionService, that user instantly gets the data if it has already been fetched.
The issue:
Now the user logs out and another user logs in (without refreshing/reloading the page). Once a controller calls this factory, it already thinks it has the correct list, meaning the return value would be the same as for the previous user, but this data might be incorrect!
Since all Angular services are singletons, the service will not be destroyed upon logout even though it should be.
The obvious answer
An answer to this question might be: "Well then don't store the data in a variable", and while this would work, an enormous amount of data might make the content of the page load slowly.
So my question is: what do you do in the above situation? Do you really have to reload the data every time it is requested, or does Angular provide a smart way to solve this problem?
Create a clear() function
Add a clear() function to your divisionService factory which will be responsible for emptying the cached data structures (arrays, objects, ...):
app.factory("divisionService", function () {
    var division = {};
    var divisionArray = [];
    var mergedUserList = [];
    return {
        clear: function () {
            // Clear the cached data
            for (var key in division) {
                delete division[key];
            }
            divisionArray.length = 0;
            // ...
        },
        getList: ...
    }
});
And call this function when you log out:
function logout(){
divisionService.clear();
}
Refresh the application
You can also refresh the entire application on logout if you don't want to deal with clearing the cached data (i.e. calling divisionService.clear()):
function logout(){
$window.location.reload();
}
This will cause the entire application to be reloaded, and all of the temporary (variable-based) cached data will be cleared.
Marc,
My first thought is to just run
divisionArray = [];
on logout. Let me know if that works. If not, I'll look into it further.
You can also cache the user information and compare it to see whether the user has changed before deciding to refresh the data, as sketched below.
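A minimal sketch of that idea, assuming the service can ask some auth service for the current user id (authService and getCurrentUserId are hypothetical names, not part of the original code):

app.factory("divisionService", function (api, $http, $q, authService) {
    var divisionArray = [];
    var cachedForUserId = null; // the user the cached data belongs to

    return {
        getList: function () {
            var currentUserId = authService.getCurrentUserId(); // hypothetical helper
            // Invalidate the cache if a different user is now logged in
            if (cachedForUserId !== currentUserId) {
                divisionArray = [];
                cachedForUserId = currentUserId;
            }
            if (divisionArray.length > 0) {
                return $q.when(divisionArray);
            }
            return $http.get(api.getUrl('divisionWithUsers', null)).then(function (response) {
                divisionArray = response.data;
                return divisionArray;
            });
        }
    };
});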
Hi, I am new to Angular and JavaScript and need a bit of help.
I have a service that needs to aggregate data from various locations. I am building a sub-service to pull data from one of these locations. This sub-service needs to 1) retrieve data from a REST web service, 2) massage it a bit and 3) return the final data to the invoking service.
I have steps 1 and 2 working, however I am running into a problem on the third. In general, I am having a hard time understanding promises. Yes, I've read the documentation, googled around, even saw a cartoon on it, and still can't figure it out... Anyway, here is the relevant code:
app.service('advsolr', ['$http', function ($http) {
    var DEBUG = false;
    var conf = get_conf();
    var solr = 'serverurl';
    var res = {};
    var data = {};

    this.query = function (searchp) {
        // Run search
        query_solr(searchp);
        return data;
    };

    var query_solr = function (search) {
        var g = 'serverurl'; // works fine
        if (DEBUG) { console.log(g); }
        $http.get(g).then(function (response) {
            res = response.data; // this works
            parse_search_res(); // this massages the data and sticks it in the data object
            return data; // this does absolutely nothing here
        });
    };
}]);
The main query method is run by the other service. It queries a Solr instance, gets the results and massages them into the format I want. I know I could do this elsewhere, but I want to have this as a standalone service for portability, and plus I just want this to work, dammit.
So the query method runs; I had some other stuff in there, but I took it out for this example since it would not add value. It hits query_solr, which gets the data and massages it with parse_search_res, which sticks it into the data variable.
Now the issue is that the query method returns the empty data object before parse_search_res has had a chance to load the data into it. How can I prevent the query method from returning without the data?
Thanks
The idea of promises is that you initiate some asynchronous operation, like an AJAX request, then return the corresponding promise object, and the consumer code uses the promise's methods to provide callback functions for when the promise's state changes.
So to fix your code you need to make query_solr return a promise:
app.service('advsolr', ['$http', function ($http) {
    var DEBUG = false;
    var conf = get_conf();
    var solr = 'serverurl';
    var res = {};
    var data = {};

    var query_solr = function (search) {
        var g = 'serverurl'; // works fine
        if (DEBUG) { console.log(g); }
        return $http.get(g).then(function (response) {
            res = response.data; // this works
            return parse_search_res();
        });
    };

    this.query = function (searchp) {
        return query_solr(searchp);
    };
}]);
You'll also need to change parse_search_res() to return the massaged data instead of saving it into the "data" variable.
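For example, a hedged sketch of what that change to parse_search_res could look like; the massaging logic itself is not shown in the question, so the body below assumes a standard Solr select response shape and is only a placeholder:

var parse_search_res = function () {
    // Massage the raw Solr response in 'res' into the desired shape.
    // The actual transformation depends on your data; 'response.docs' and
    // 'response.numFound' assume a standard Solr response.
    var massaged = {
        docs: res.response ? res.response.docs : [],
        total: res.response ? res.response.numFound : 0
    };
    return massaged; // return the result instead of writing it to the shared 'data' variable
};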
And having set up the advsolr service like that, one could use it like this:
advsolr.query('something').then(function(data) {
console.log(data);
});
I'm fetching a collection called logCollection from parse.com in a Node.js script on my machine. It has 200 elements. Each log has a link to another table (a pointer) called start. I need to fetch this one too.
Here is my code:
Parse.User.logIn("user", "pass").then(function (user) {
    // Do stuff after successful login.
    console.log('successfully logged in');
    return logCollection.fetch();
}).then(function (content) {
    console.log('done fetching logs: ' + logCollection.length);
    var promises = [];
    _.each(logCollection.models, function (thisLog) {
        promises.push(thisLog.attributes.start.fetch());
    });
    // Return a new promise that is resolved when all of the fetches are finished.
    return Parse.Promise.when(promises);
});
The thing is, it will fire at least 200 (start) fetches per second, which will cause problems with the 30 requests per second limit at parse.com.
Is there a better way to do this? How can I slow down the way JS fires the requests?
Thanks
In a Parse query, you can get the fully-fetched objects that a result points to by using the include method on the query:
var query = new Parse.Query("SomeClass");
query.include('columnName');
query.find().then(function(results) {
// each result will have 'columnName' as a fully fetched parse object.
});
This also works with sub-sub objects:
query.include('columnName.nestedColumnName');
or as an array:
query.include(['columnName', 'anotherPointerColumn']);
I came up with this solution, which works very well. It was in the Parse documentation all this time:
https://www.parse.com/docs/js_guide#promises-series
The following code fires each request only after the previous one has finished. That way I can fire many requests without worrying about hitting the limit.
var query = new Parse.Query("Comments");
query.equalTo("post", 123);
query.find().then(function (results) {
    // Create a trivial resolved promise as a base case.
    var promise = Parse.Promise.as();
    _.each(results, function (result) {
        // For each item, extend the promise with a function to delete it.
        promise = promise.then(function () {
            // Return a promise that will be resolved when the delete is finished.
            return result.destroy();
        });
    });
    return promise;
}).then(function () {
    // Every comment was deleted.
});
I have an app that requests a list of possible items from a REST service. I use $http or $resource for that.
Now I want to cache those items in localStorage and only sync my local storage with the backend every now and then to check if anything has changed.
So before, I did this:
var getAllPlugs = function () {
var backend = $resource(getURLString() + '/getAllPlugsAvailable');
return backend.query();
};
but now I want the function to return my cached items right away, and once the asynchronous HTTP request is done, it should update the item list if something has changed. This of course should be directly reflected in the UI.
The problem is, if I do something like this:
var getAllPlugs = function () {
var backend = $resource(getURLString() + '/getAllPlugsAvailable');
var result = backend.query();
localStorage.setItem("plugs", JSON.stringify(result));
return result
};
I still only get the result of the HTTP request. But how can I get the cached items first and then have that object updated with the changes? Maybe a success callback passed from my controller to the service that calls the backend? I need some inspiration, sorry if it is trivial...
Return the array from local storage. Once the data arrives from HTTP, replace the contents of the array:
var getAllPlugs = function () {
    var results = JSON.parse(localStorage.getItem("plugs") || "[]");
    var backend = $resource(getURLString() + '/getAllPlugsAvailable');
    backend.query({}, function (data) {
        results.length = 0; // clear existing items, keeping the same array reference
        angular.forEach(data, function (plug) {
            results.push(plug);
        });
        localStorage.setItem("plugs", JSON.stringify(results));
    });
    return results;
};