I'm trying to get all data from a custom API (Flask with a PostgreSQL database) using the '$http' service in my controller, but the API and database are built with pagination, which means that if I want to access the data I need to create requests like this:
/*FIRST PAGE*/
$http.get("/api/test", testData)
.success(...)
/*SECOND PAGE*/
$http.get("/api/test?page=2", testData)
.success(...)
This is obviously not a good solution, but it works! So could you guide me on how to deal with this situation better? I know that this API contains over a thousand pages...
Cheers!
This is described in the official documentation.
Angular's $http service supports a config param (the second param of .get) which supports a params property and does all the concatenation with proper encoding etc. for you.
params – {Object.<string|Object>} – Map of strings or objects which
will be serialized with the paramSerializer and appended as GET
parameters.
So you can do
angular
.module('xxx')
.factory('getPagedData', function ($http) {
return function(page) {
return $http.get("/api/test", { params: { page: page } });
}
});
And use it like this:
function someController(getPagedData) {
getPagedData(2).then(...); // will do GET /api/test?page=2
}
Also note that the .success method is deprecated. It was even removed in AngularJS 1.6. Use .then instead.
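For reference, a minimal sketch of the equivalent .then-based call (purely illustrative):
$http.get("/api/test", { params: { page: 2 } })
    .then(function (response) {
        // .then hands you the full response object, so read .data explicitly
        console.log(response.data);
    }, function (response) {
        console.error("Request failed with status " + response.status);
    });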
You can simply pass a variable every time you want to page to the next data set. So you would have:
$http.get("/api/test?page="+ pageNum, testData).success(..)
Hope this helps!
Try creating a service that uses promises to access the data and break apart the data the way you want:
function getTestData(testData) {
    return $http.get("/api/test", testData)
        .then(function (response) {
            // Promise resolved: return the payload so callers can break the data apart as needed
            return response.data;
        }, testError);

    function testError(response) { // Handle any errors if the promise is rejected
        return response;
    }
}
While this seems like a logical solution...
/* SERVICE */
var baseUrl = "/api/test";
MyService.getTests = function(page, testData) {
var pageParams = "";
if (page > 1) {
pageParams = "?page="+page;
}
return $http.get(baseUrl+pageParams, testData).success().error();
}
...GET requests don't allow you to send any data. I'm not sure how you plan to send testData to a $http.get request. The second parameter of a GET request is the config object, which makes your service much simpler.
var baseUrl = "/api/test";
MyService.getTests = function(page, testData) {
return $http.get(baseUrl, {params: {page: page}}) //becomes /api/test?page=1
.success()
.error();
}
I have been working with JavaScript for some time but recently had to use Angular (v7). I am stuck in a situation where I have a session value set in an Express session, and the value is very dynamic. I am not able to display this value in my view in real time.
This is my first question on SO, and I know I can be very vague, but please let me know if you think you can help me. I can elaborate more if needed.
I have made an Observable variable in my component, and there is a service to call the Express controller, which in turn calls the Express model where the session value is read and returned.
Right now this call happens only once, but I want it to return the session value automatically whenever it changes.
app.component.html
<button *ngIf="sessionValue$ | async as sval">
{{sval.status}}
</button>
app.component.ts
sessionValue$: Observable<any>;
ngOnChanges(change) {
this.intialise();
this.sessionValue$ = this.sessionService.get();
}
sessionService.ts
url = environment.appUrl + 'session';
get(): Observable<any> {
return this.http.get<any>(this.url + '/status');
}
express.session.controller.js
exports.status = function (req, res) {
try {
sessioModel.get(req.session, function (result) {
res.status(200).send(result);
});
} catch (error) {
logger.error(error);
}
}
express.session.model.js
exports.get= function (session, handleResponse) {
var sessionValue= session['variableMe'];
const retVal= sessionValue['nowValue'];
if (!retVal) {
handleResponse({ status: "" });
} else {
handleResponse({ status: retVal });
}
};
I expect the value on the HTML page to change every time the session variable changes its value.
I would recommend using sockets; this would allow you to communicate in real time between the front end and the backend without having to make a service call every time. This is a pretty good library with good documentation on how to get set up.
The backend is a bit trickier to get set up, so if you have trouble be sure to reach out for help.
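As a rough sketch of that idea (assuming socket.io on the Express side; the route and event names below are made up for illustration), the server could push the new session value to connected clients whenever it changes:
const express = require('express');
const http = require('http');
const socketIo = require('socket.io');

const app = express();
const server = http.createServer(app);
const io = socketIo(server);

// Hypothetical route that updates the session value and then broadcasts it,
// so the Angular client no longer has to poll for it.
app.post('/session/update', (req, res) => {
    const newValue = req.query.value || '';
    // ...store newValue in the session here...
    io.emit('session-status', { status: newValue }); // push to every connected client
    res.sendStatus(200);
});

server.listen(3000);
On the Angular side you would subscribe to the 'session-status' event (for example with socket.io-client or a wrapper such as ngx-socket-io) and feed the emitted values into the observable the template already consumes.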
I am using Firebase with a Cloud Function to check a code against about a million pre-generated codes stored in the Firebase Realtime Database.
It will be used in a mobile application to verify whether a user has bought the bundle in real life.
I found 2 working solutions. In the first, I put the code directly in the name of the property. In the second, I put the code in a child property called "key".
In the second case, the key parameter is indexed.
I need fast (log n complexity) access to get the response.
Do you know if either of my solutions will work for about 1 million entries and 100 calls per second on Firebase?
(I am not familiar with NoSQL.)
In my sample, the codes are "ABCD-0000-000X"
(do not take the property called "user" into consideration)
First Solution : Use the code value as parent
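A rough sketch of what this first layout might look like (codes and counters are illustrative):
"Codes": {
    "ABCD-0000-0001": { "nb": 0, "user": "..." },
    "ABCD-0000-0002": { "nb": 3, "user": "..." }
}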
Cloud Function source code
exports.checkKey = functions.https.onRequest((req, res) => {
const code = req.query.code;
return admin.database().ref("Codes/" + code).once("value").then(snapshot => {
if (snapshot.val() === null) {
return res.send("Invalid Code");
}
const nb = snapshot.child("nb");
if (nb.val() > 4) {
return res.send("NO more code");
}
snapshot.ref.update({ "nb": nb.val() + 1 });
return res.send("OK");
    });
});
Second Solution : Use the code in child
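A rough sketch of this second layout, with the code stored under an indexed child property called "key" (push IDs and values are illustrative):
"Codes": {
    "-Labc0000001": { "key": "ABCD-0000-0001", "nb": 0, "user": "..." },
    "-Labc0000002": { "key": "ABCD-0000-0002", "nb": 3, "user": "..." }
}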
exports.getKey = functions.https.onRequest((req, res) => {
const code = req.query.code;
var ref = admin.database().ref("Codes");
ref.orderByChild("key").equalTo(code).on("child_added", function (snapshot) {
const nb = snapshot.child("nb");
if (nb.val() > 4) {
return res.send("NOK");
}
snapshot.child("nb").set(nb.val() + 1);
return res.send("OK");
}
});
Thanks for your help.
There is no way you're going to be able to query a list of one million items. So storing the keys as a property named key is not going to work.
But if you keep the codes as the key of each item, you can access each item directly by its path. And that scales really well.
So I'd go with your first approach.
That said: it's hard to recommend anything specific without knowing all use-cases, which nobody (including typically the project creator at an early stage) is likely to know. So I'd also recommend simply learning a bit more about NoSQL data modeling, by reading NoSQL data modeling, watching Firebase for SQL developers, and by experimenting with various approaches before committing to any specific one.
I am pretty new to Ionic 1 and I am working on an application (with Ionic 1 and AngularJS) with multiple URLs, where each URL brings up a list of categories, followed by a list of items for each category, and each item has a document URL. How do I preload all these URLs on launch in the background but not display them? Is there any way this can be achieved? A good code sample or tutorial will help greatly.
Also, please let me know if this will be the best approach, as in pre-loading and pre-caching all content upon launch or should it be done category by category or some other way.
Thanks in advance!
You can make multiple asynchronous service calls in the background using $q.
Make a list of URLs in an array and call them at once using $q.all(listOfURL).
Using promises, retrieve each response.
By making this asynchronous you can save a lot of time.
After getting the responses you can either store them in $rootScope or in localStorage/sessionStorage.
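A minimal sketch of that idea (the module, factory, and URL names are illustrative):
angular.module('app').factory('PreloadService', function ($http, $q) {
    return {
        preload: function (listOfURL) {
            // fire all requests in parallel and wait for every one to resolve
            var requests = listOfURL.map(function (url) {
                return $http.get(url);
            });
            return $q.all(requests).then(function (responses) {
                // cache each payload under its URL for the later screens
                responses.forEach(function (response, i) {
                    window.localStorage.setItem(listOfURL[i], JSON.stringify(response.data));
                });
                return responses;
            });
        }
    };
});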
Update - as the OP is already aware of and using localStorage, here are additional suggestions :-
In that case, you could either call all of your service methods for fetching data at startup, or you could use a headless browser such as 'PhantomJS' to visit these URLs at startup and fetch the data.
Thus, your code would look something like :-
var webPage = require('webpage');
var page = webPage.create();
page.open('http://www.google.com/', function(status) {
console.log('Status: ' + status);
// Do other things here...
});
For more information regarding PhantomJS, please refer to the following links :-
http://phantomjs.org/
http://phantomjs.org/api/webpage/method/open.html
Earlier Suggestions
Make an HTTP request in your service to fetch the data and store it to localStorage, as is shown below :-
$http.get('url').then(function(response) {
    var obj = response.data;
    window.localStorage.setItem('key', JSON.stringify(obj)); // Store data to localStorage for later use
});
For fetching data :-
var cachedData = JSON.parse(window.localStorage.getItem('key')); // Load cached data stored earlier
Please refer to the following link for detailed information regarding 'localStorage' :-
https://www.w3schools.com/html/html5_webstorage.asp
Hope this helps!
The best way to share data between different views in Angular is to use a service, as it is a singleton and can be used in other controllers.
In your main controller you can prefetch your lists of categories asynchronously through a service which can be shared with the next views. Below is a small demo you can refer to:
angular.module("test").service("testservice",function('$http',$q){
var lists = undefined;
// fetch all lists in deferred technique
this.getLists = function() {
// if lists object is not defined then start the new process for fetch it
if (!lists) {
// create deferred object using $q
var deferred = $q.defer();
// get lists form backend
$http.get(URL)
.then(function(result) {
// save fetched posts to the local variable
lists = result.data;
// resolve the deferred
deferred.resolve(lists);
}, function(error) {
//handle error
deferred.reject(error);
});
// set the posts object to be a promise until result comeback
lists = deferred.promise;
}
// in any way wrap the lists object with $q.when which means:
// local posts object could be:
// a promise
// a real lists data
// both cases will be handled as promise because $q.when on real data will resolve it immediately
return $q.when(lists);
};
this.getLists2=function(){
//do it similarly as above
};
}).controller("mainController",function(testservice,$scope){
$scope.lists1=testervice.getLists()
.then(function(lists) {
//do something
});
};
$scope.lists2=testervice.getLists2()
.then(function(lists) {
//do something
});
};
$scope.lists1();
$scope.lists2();
}).controller("demoController1",function(testservice,$scope){
$scope.lists1=testervice.getLists()
.then(function(lists) {
//do something
});
};
$scope.lists2=testervice.getLists2()
.then(function(lists) {
//do something
});
};
$scope.lists1();
$scope.lists2();
});
I am assuming you don't want to load data on the next screens, to deliver the user a flawless experience.
Yes, you can start loading URLs on your very first page, as you want them to get the data you will use on future screens.
In terms of storage
In AngularJS, if you want something to persist throughout the application scope you should use $rootScope [beware: keeping a lot of data there may lead to memory issues, so you need to clear it regularly].
Another option is to store it in localStorage and fetch it as per your need.
If you want, you can share those arrays between the different controllers of your screens.
While loading [getting the response from the server] you can do one of two things:
1. Get a single JSON response having all the data.
2. Have multiple URLs and load them serially.
As per your requirement of loading the 5th (page) screen's data in advance, it's not good practice and can even stop the user from seeing updates, but since it's your requirement, we have a couple of approaches:
Add all the categories and their respective details as per your pastebin, like cardiac then its details.. kidney then details..
You can do this by managing hierarchies [categories] like a parent main group and its child sub-groups in a JSONArray, with the details in a JSONObject. (This change would be on the sender side - the server.)
You need to load only one URL to get all the data.
So you don't need to load different URLs like you are doing now. But beware, this would be a big JSON. So when you store it, separate the categories and required data [screen-wise requirements] and store them in local storage for easy access.
Another approach would be to provide your [category] subgroup names to load, so the loading would be like firing the same URL with different category names to get the data and store it in local storage.
This may lead to firing around 10-15 URLs [depends on your categories], which may affect the UI thread response.
This won't need any changes to your server-side response.
Programmatic approach to load URLs sequentially:
URL Loading: This method will get the details of a particular category [id or anything that
works for you]. It will fire an HTTP request and return the result.
getCategoryDetails(category){
url = url+category;
return $http({
method: 'GET',
url: url,
headers: --
}).then(function onSuccess(response) { //<--- `.then` transforms the promise here
//You can either store it in local storage
return response
}, function onError(response) {
throw customExceptionHadnler.getErrorMsg(response.status, response.data);
});
}
Parallel: This method will do it in parallel; we just load the categories [ids], as we have all of them, and then use $q.all to wait for all the URL loading to finish.
function loadUrlsParallel(urls) {
var loadUrls = []
for(var i = 0; i < urls.length; i++) {
loadUrls.push(getCategoryDetails(urls[i]))
}
return $q.all(loadUrls)
}
First API: This method loads the first URL; then, to load the remaining URLs in
parallel, call the method above.
getListOfCategories(){
url = url;
return $http({
method: 'GET',
url: url,
headers: --
}).then(function onSuccess(response) { //<--- `.then` transforms the promise here
//You can either store it in local storage or directly send the response
return response
}, function onError(response) {
throw customExceptionHadnler.getErrorMsg(response.status, response.data);
});
}
urls: you have to prepare the list of URLs by appending the category to
load after loading the first URL [expecting that it returns all the
categories you will require in your app beforehand] and pass that list to the
loadUrlsParallel method.
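A possible way to wire the two methods above together might look like this (purely illustrative, following the same assumptions as the snippets above):
getListOfCategories()
    .then(function (response) {
        // assuming the first response contains the category names/ids needed for the detail calls
        var categories = response.data;
        return loadUrlsParallel(categories);
    })
    .then(function (detailResponses) {
        // all category details have arrived; store them screen-wise in local storage
        detailResponses.forEach(function (detail, index) {
            window.localStorage.setItem('category_' + index, JSON.stringify(detail.data));
        });
    });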
You can write the loadUrl methods as per your convenience; whatever is given here
is for example purposes, so it may not run as-is.
You can load API responses anywhere from local storage, where you've stored them after the API calls, so this will not require you to execute API calls on every loading of a page [screen].
Hope this helps you and solves your problem.
I'm in a confusing spot with AngularJS and my REST API (Java). I've created a tree-view drag-drop directive; it has a function to select its items and then delete them. But when I perform a DELETE action using $resource, AngularJS overrides or ignores the request body, which is where I send the items of my selection array. How can I solve this? Are there any other patterns I can use? Maybe some modification in the API... I don't know; I'd like some suggestions about this problem and how to solve it in the best way, both in the backend and the frontend.
UPDATE
JSFiddle: http://bit.ly/1QmG83Z
As far as I know, HTTP method DELETE doesn't take a body.
You would need an endpoint in your API to treat this "batch" request using an array in the body.
Or you could also launch a DELETE request on each resource via Angular, without any body.
Perhaps the best approach would be creating a POST request, where you would pass your array, and then you could treat the delete atomically:
Service
MyService.$inject = ['$resource'];
function MyService($resource) {
return $resource('/echo/json', {}, {
remove: {
method: 'POST'
}
    });
}
Call
MyService.remove(categoriesToDelete, function(response) {
console.log(response);
// do something after deletion
});
REST method example
#POST
#Path("/json")
#Consumes(MediaType.APPLICATION_JSON)
public Response delete(final String array) {
// You can convert your input to an array and then process it
JSONArray responseArray = new JSONArray(array);
System.out.println("Received input: " + responseArray);
JSONObject jsonObject = new JSONObject();
jsonObject.put("Array received", responseArray);
return Response.status(Status.OK).entity(jsonObject.toString()).build();
}
Also, take a look at this post for further enlightenment.
Well, all the ideas you gave me were great and helped me reach a solution, but after reading all the answers and other topics, the best solution I found was to group all the IDs into a single string separated by spaces, push it into the path variable, and make a DELETE request as a single resource. Then my endpoint splits the "combined resource" and retrieves each ID separately to perform the delete action.
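A rough sketch of what that could look like on the Angular side (module, resource path, and IDs are illustrative; assumes ngResource is loaded and that the backend splits the path segment on spaces and deletes each ID):
angular.module('app')
    .factory('MyBatchService', ['$resource', function ($resource) {
        return $resource('/api/categories/:ids', {}, {
            removeMany: { method: 'DELETE' }
        });
    }])
    .controller('TreeController', ['MyBatchService', function (MyBatchService) {
        // join the selected IDs into one path segment;
        // this issues DELETE /api/categories/3%205%208
        var ids = [3, 5, 8].join(' ');
        MyBatchService.removeMany({ ids: ids }, function (response) {
            console.log(response); // the backend has split the segment and deleted each ID
        });
    }]);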
I am using Restangular to handle my token/header authentication in a single page Angular web application.
Using addFullRequestInterceptor, I set the correct headers for each outgoing REST API call, using a personal key for encrypting data.
Restangular
.setBaseUrl(CONSTANTS.API_URL)
.setRequestSuffix('.json')
.setDefaultHeaders({'X-MyApp-ApiKey': CONSTANTS.API_KEY})
.addFullRequestInterceptor(requestInterceptor)
.addErrorInterceptor(errorInterceptor);
function requestInterceptor(element, operation, route, url, headers, params, httpConfig) {
var timeStamp = Helpers.generateTimestamp(),
//Condensed code for illustration purposes
authSign = Helpers.generateAuthenticationHash(hashIngredients, key, token),
allHeaders = angular.extend(headers, {
'X-MyApp-Timestamp': timeStamp,
'Authentication': authSign
});
return {
headers: allHeaders
}
}
Works great. There is one exception I need though: For a new visitor that has not logged in yet, a generic key/token pair is requested via REST. This key/token pair is used in the headers of the login authentication call.
So for this call, I create a separate Restangular sub-configuration. In this configuration I want to override the requestInterceptor. But this seems to be ignored (i.e. the original interceptor is still called). It doesn't matter if I pass null or a function that returns an empty object.
var specialRestInst = Restangular.withConfig(function(RestangularConfigurer) {
RestangularConfigurer.addFullRequestInterceptor(function() {return {}});
}),
timeStamp = Helpers.generateTimestamp(),
header = {'X-MyApp-Timestamp': timeStamp};
specialRestInst.one('initialise').get({id: 'app'}, header)
So, as documented by Restangular, withConfig takes the base configuration and extends it. I would like to know how to removeFullRequestInterceptor (this function does not exist), override it, or something like that.
I would take a different approach and try to pass a flag to the interceptor. If the flag exists, then the authSign is excluded. You can do this using withHttpConfig. It's better to exclude in special cases than to always have to tell the interceptor to include the authSign.
So you would update the interceptor like this.
function requestInterceptor(element, operation, route, url, headers, params, httpConfig) {
var timeStamp = Helpers.generateTimestamp();
var allHeaders = {'X-MyApp-Timestamp': timeStamp};
if(!httpConfig.excludeAuth) {
//Condensed code for illustration purposes
var authSign = Helpers.generateAuthenticationHash(hashIngredients, key, token);
allHeaders['Authentication'] = authSign;
}
return { headers: angular.extend(headers, allHeaders) }; // a fullRequestInterceptor returns an object with the pieces it modifies
}
When you need to exclude the authSign you would use restangular like this.
specialRestInst.one('initialise').withHttpConfig({excludeAuth: true}).get({id: 'app'});
You should be able to add any values to the http config you want, as long as they aren't already used.
I'm not sure if this will work as expected, but I can't see why it wouldn't work.