How do I prefetch URLs in Ionic/AngularJS? - javascript

I am pretty new to Ionic 1 and I am working on an application (with Ionic 1 and AngularJS) with multiple URLs, where each URL brings up a list of categories, followed by a list of items for each category, and each item has a document URL. How do I preload all these URLs in the background on launch without displaying them? Is there any way this can be achieved? A good code sample or tutorial would help greatly.
Also, please let me know whether this is the best approach, i.e. pre-loading and pre-caching all content upon launch, or whether it should be done category by category or some other way.
Thanks in advance!

You can make multiple asynchronous service calls in the background using $q.
Make a list of URLs in an array and request them all at once using $q.all(listOfURLs).
Retrieve each response through the returned promises.
Because the calls run asynchronously, you can save a lot of time.
After getting the responses you can store them either in $rootScope or in localStorage/sessionStorage.
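A minimal sketch of that idea (the module name, service name, URL list, and storage keys are hypothetical, not from the question):
angular.module('app').factory('prefetchService', function($http, $q) {
    // Hypothetical list of URLs to warm up at launch
    var urls = ['categories.json', 'items.json', 'documents.json'];
    return {
        prefetchAll: function() {
            // Fire all requests in parallel and wait for every one to finish
            var requests = urls.map(function(url) {
                return $http.get(url);
            });
            return $q.all(requests).then(function(responses) {
                // Cache each response body under its URL for later screens
                responses.forEach(function(response, i) {
                    window.localStorage.setItem(urls[i], JSON.stringify(response.data));
                });
            });
        }
    };
});
You would call prefetchService.prefetchAll() once from your app's run block or first controller, and later screens read from localStorage instead of hitting the server again.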

Update - As the OP is already aware of and using localStorage, here are additional suggestions:
In that case, you could either call all of your data-fetching service methods at startup, or you could use a headless browser such as PhantomJS to visit these URLs at startup and fetch the data.
Your code would then look something like:
var webPage = require('webpage');
var page = webPage.create();
page.open('http://www.google.com/', function(status) {
    console.log('Status: ' + status);
    // Do other things here...
});
For more information regarding PhantomJS, please refer to the following links:
http://phantomjs.org/
http://phantomjs.org/api/webpage/method/open.html
Earlier suggestions
Make an HTTP request in your service to fetch the data and store it in localStorage, as shown below:
$http.get('url').then(function(response) {
    var obj = response.data;
    window.localStorage.setItem('key', JSON.stringify(obj)); // Store the data in localStorage for later use
});
For reading the cached data back:
var cachedData = JSON.parse(window.localStorage.getItem('key')); // Load cached data stored earlier
Please refer to the following link for detailed information regarding localStorage:
https://www.w3schools.com/html/html5_webstorage.asp
Hope this helps!

The best way to share data between different views in Angular is to use a service, as it is a singleton and can be injected into other controllers.
In your main controller you can prefetch your lists of categories asynchronously through a service, which can then be shared with the next views. Below is a small demo you can refer to:
angular.module("test").service("testservice",function('$http',$q){
var lists = undefined;
// fetch all lists in deferred technique
this.getLists = function() {
// if lists object is not defined then start the new process for fetch it
if (!lists) {
// create deferred object using $q
var deferred = $q.defer();
// get lists form backend
$http.get(URL)
.then(function(result) {
// save fetched posts to the local variable
lists = result.data;
// resolve the deferred
deferred.resolve(lists);
}, function(error) {
//handle error
deferred.reject(error);
});
// set the posts object to be a promise until result comeback
lists = deferred.promise;
}
// in any way wrap the lists object with $q.when which means:
// local posts object could be:
// a promise
// a real lists data
// both cases will be handled as promise because $q.when on real data will resolve it immediately
return $q.when(lists);
};
this.getLists2=function(){
//do it similarly as above
};
}).controller("mainController",function(testservice,$scope){
$scope.lists1=testervice.getLists()
.then(function(lists) {
//do something
});
};
$scope.lists2=testervice.getLists2()
.then(function(lists) {
//do something
});
};
$scope.lists1();
$scope.lists2();
}).controller("demoController1",function(testservice,$scope){
$scope.lists1=testervice.getLists()
.then(function(lists) {
//do something
});
};
$scope.lists2=testervice.getLists2()
.then(function(lists) {
//do something
});
};
$scope.lists1();
$scope.lists2();
});

I am assuming you don't want to load data on the next screens, so the user gets a flawless experience.
Yes, you can start loading URLs on your very first page, since you want that data ready for future screens.
In terms of storage:
In AngularJS, if you want something to persist throughout the application scope you should use $rootScope (beware, keeping a lot of data there may lead to memory issues, so you need to clear it regularly).
Another option is to store it in localStorage and fetch it as per your need.
If you want, you can share those arrays between the controllers of different screens.
While loading (getting the responses from the server) you can do one of two things:
1. Get a single JSON response containing all the data.
2. Have multiple URLs, and load them one by one.
As per your requirement of loading the 5th screen's data in advance: it's not good practice, and it can even stop the user from seeing updates, but since it's your requirement, we have a couple of approaches:
Add all the categories and their respective details as per your pastebin, e.g. cardiac then its details, kidney then its details, and so on.
You can do this by managing hierarchies (categories) as a parent main group with its child subgroups in a JSONArray and the details in a JSONObject. (This change would be on the sender side, i.e. the server.)
You then need to load only one URL to get all the data.
So you don't need to load different URLs like you are doing now. But beware, this would be a big JSON. When you store it, separate the categories and the required data (screen-wise requirements) and keep them in localStorage for easy access.
Another approach would be to provide your (category) subgroup names to load, so loading would mean firing the same URL with different category names to get the data and storing it in localStorage.
This may lead to firing around 10-15 URLs (depending on your categories), which may affect the UI responsiveness.
However, this won't need any changes to your server-side response.
Programmatic approach to loading the URLs:
URL loading: this method gets the details of a particular category (by id or whatever works for you). It fires an HTTP request and returns the result.
getCategoryDetails(category) {
    url = url + category; // append the category to the base url
    return $http({
        method: 'GET',
        url: url,
        headers: {} // add your request headers here
    }).then(function onSuccess(response) { // <--- `.then` transforms the promise here
        // You can either store it in localStorage or return it directly
        return response;
    }, function onError(response) {
        throw customExceptionHandler.getErrorMsg(response.status, response.data);
    });
}
Parallel: this method loads the categories (ids) in parallel, since we already have all of them, and then uses $q.all to wait for all the URL loads to finish.
function loadUrlsParallel(urls) {
    var loadUrls = [];
    for (var i = 0; i < urls.length; i++) {
        loadUrls.push(getCategoryDetails(urls[i]));
    }
    return $q.all(loadUrls);
}
First API: this method loads the first URL; afterwards, the remaining URLs are loaded in parallel by calling the method above.
getListOfCategories() {
    // url is the base endpoint that returns the list of categories
    return $http({
        method: 'GET',
        url: url,
        headers: {} // add your request headers here
    }).then(function onSuccess(response) { // <--- `.then` transforms the promise here
        // You can either store it in localStorage or return the response directly
        return response;
    }, function onError(response) {
        throw customExceptionHandler.getErrorMsg(response.status, response.data);
    });
}
urls: you have to prepare the list of URLs by appending each category after loading the first URL (which is expected to return all the categories your app will need beforehand) and pass it to the loadUrlsParallel method.
You can write the load methods as per your convenience; what is given here is for example purposes, so it may not run as it is. Putting the two together looks roughly like the sketch below.
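A rough chaining sketch under those assumptions (treating the first response's data as the array of category names/ids is my assumption, not stated in the answer):
getListOfCategories()
    .then(function(response) {
        // assume the first call returns the array of category names/ids
        var categories = response.data;
        // load every category's details in parallel
        return loadUrlsParallel(categories);
    })
    .then(function(results) {
        // cache each category's details for later screens
        results.forEach(function(categoryResponse, i) {
            window.localStorage.setItem('category_' + i, JSON.stringify(categoryResponse.data));
        });
    });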
You can then read the API responses anywhere from localStorage, where you stored them after the API calls, so you will not have to execute the API calls on every load of a page (screen).
Hope this helps you and solves your problem.

Related

JavaScript - Promise fulfilled too early?

I created a small sample application using VueJS and a C# REST API to store and retrieve data in a SQL Server back end.
For testing, I created a simple web page with a form to create a "note". The note is stored by the following function, saveData():
saveData()
{
    let promiseStack = [];
    var jsondata = JSON.stringify(this.note);
    promiseStack.push(this.$http.post('REST_API/note', jsondata));
    Promise.all(promiseStack).then(data =>
    {
        this.$http.get('REST_API/note');
        this.$router.push({ name: 'viewnotes', params: { id: data[0].body.id }})
    }, error =>
    {
        console.log(error);
    });
}
I tried to use a promise to wait until the 'store' operation in the backend is complete, and then issue a GET request to retrieve all notes once the promise is fulfilled.
However, the GET request inside the promise doesn't return any data. If I issue the GET request manually later on, I retrieve the data that was stored previously.
So I had a look into the C# REST API. There are currently two functions: createNote(...) and getAllNotes(...). I used a StreamWriter to log to the filesystem when these functions are called, with millisecond precision. What I see is that createNote is called after getAllNotes. So I suspect that the API is working correctly, but something about the way I'm using promises seems to be awfully wrong.
Maybe somebody has a hint?
UPDATE
I know that the GET request doesn't return any data from the developer toolbar in Chromium; the response is empty.
The network tab of the developer toolbar shows that the requests are submitted in the correct order, so the POST request is issued first.
It seems I found the problem. I had an href attribute on my 'Save' link, which triggered an early routing. The intended POST and GET were fired correctly, but there was another GET in between somewhere because of the href attribute on the link, even though it was empty.
I removed the attribute, and now it works as intended.

How to query firebase for many to many relationship?

It is my first time developing a SPA, and I am not using JS frameworks like React, Vue or Angular. My project just uses the Firebase SDK and jQuery to access the DOM elements.
In my app, users can be associated with projects. Because of that, I have user-projects and project-users paths to represent that relationship.
When a user logs in to my app I request users/uid to get the user data. After that I have to fetch the projects associated with the user. I take the ids of the associated projects to finally request the data of each project.
I'm trying to use promises as described here, but I get nothing in the console.
function loadUserProjects() {
    // Authenticated user
    var user = firebase.auth().currentUser;
    // General reference to the real time db
    var ref = firebase.database().ref();
    // Request the user data
    ref.child('users/'+user.uid).on('value').then(function(snapshot) {
        var user_data = snapshot.val(); console.log(user_data);
        // Global variable to store the id of the selected project
        project_selected_key = user_data.project_selected;
        // Get the list of associated projects
        return ref.child('user-projects/'+user.uid).on('value').then(function(snapshot) {
            console.log(snapshot);
            return snapshot;
        });
    }).then(function (projectsSnapshot) {
        console.log(projectsSnapshot);
        // List associated projects
        var project_options = '';
        projectsSnapshot.forEach(function (e) {
            project_options += '<option value="'+e.key+'">'+e.val()+'</option>';
        });
        if (! project_options) {
            project_options = '<option disabled selected value>- Ningún proyecto -</option>';
        }
        $('#project_selected').html(project_options);
    }, function(error) {
        // Something went wrong.
        console.error(error);
    });
}
I know that I have to use one additional request, because at this point the <select> will be populated with the raw values (the additional request has to query the full data of each project). But I am not getting any messages in the console.
Thanks in advance.
After that, I need to define different levels of privilege in each project, and associate a level when a project is assigned to a specific user. Initially I was very excited about the real-time features, but it seems that Firebase is getting more complicated than I supposed.
A Firebase on() listener can respond to multiple events, while a promise can only resolve once; that's why a promise is only available when you use Firebase's once() operation:
return ref.child('user-projects/'+user.uid).once('value');
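Applied to the question's loadUserProjects(), the relevant part could look like this with once() throughout (same paths as in the question; a sketch, untested):
ref.child('users/' + user.uid).once('value').then(function(snapshot) {
    var user_data = snapshot.val();
    project_selected_key = user_data.project_selected;
    // once() returns a promise, so it can be returned directly into the chain
    return ref.child('user-projects/' + user.uid).once('value');
}).then(function(projectsSnapshot) {
    // build the <option> list exactly as in the question
    var project_options = '';
    projectsSnapshot.forEach(function(e) {
        project_options += '<option value="' + e.key + '">' + e.val() + '</option>';
    });
    $('#project_selected').html(project_options);
}).catch(function(error) {
    console.error(error);
});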

Best practice for multiple AJAX API calls that require a response from the previous call?

I'm working on an internal page that allows a user to upload a CSV with resources and dates, and have the page add all the scheduling information for these resources to our management software. There's a pretty decent API for doing this, and I have a working model, but it seems... kludgy.
For each resource I have to start a new session, then create a new reservation, then add resources, then confirm that the reservation isn't blocked, then submit the reservation. Most of the calls return a value I need for the next step in the process, so each relies on the previous AJAX call.
Currently I'm doing this via nested AJAX calls similar to this:
$.ajax('startnewsession').then($.ajax('createreservation').then('etcetc'))
While this works, I feel like there has to be an easier, or more "proper", way to do it, both for cleaner code and for adaptability.
What you're doing is correct, assuming you can't change the API you are communicating with.
There's really no way of getting around some form of nested AJAX calls if you need the response data of the previous one for the next one. Promises (.then), however, make it a bit prettier than plain callbacks.
The proper solution (if possible) would of course be to implement your API in such a way that it requires fewer round trips from the client to the server. Considering there's no user input between each of these steps in the negotiation process for creating a reservation, your API should be able to complete the entire flow for creating a reservation without having to contact the client until it needs more input from the user.
Just remember to do some error handling between each of the AJAX calls in case they fail - you don't want to build the follow-up API calls with corrupt data from a previously failed request.
var baseApiUrl = 'https://jsonplaceholder.typicode.com';
$.ajax(baseApiUrl + '/posts/1')
    .then(function(post) {
        $.ajax(baseApiUrl + '/users/' + post.userId)
            .then(function(user) {
                console.log('got name: ' + user.name);
            }, function(error) {
                console.log('error when calling /users/', error);
            });
    }, function(error) {
        console.log('error when calling /posts/', error);
    });
Short answer: as usual I'm trying to do some chains like this:
ajaxCall1.then(
response => ajaxCall2(response)
).then(
response => ajaxCall3(response)
)
I try to avoid using $.when. Usually I (and I bet you too) have one AJAX call (for a form submit), and sometimes two chained AJAX calls, for example when I need to get data for a table: first a query for the total row count, and if the count is greater than 0, another call for the data. In this case I'm using:
function getGridData() {
    var count;
    callForRowsCount().then(
        (response) => {
            count = response;
            if (count > 0) {
                return callForData();
            } else {
                return [];
            }
        }
    ).then(response => {
        pub.fireEvent({
            type: 'grid-data',
            count: count,
            data: response
        });
    });
}
The publisher fires the event, and all my components get updated.
In some really rare cases I do need $.when, but that is almost always bad design. It happens when I need to load a pack of additional data before the main request, or when the backend does not support bulk updates and I have to send a pack of AJAX calls to update many database entities. Something like this:
var whenArray = [];
if (require1) {
    whenArray.push(ajaxCall1);
}
if (require2) {
    whenArray.push(ajaxCall2);
}
if (require3) {
    whenArray.push(ajaxCall3);
}
// use a regular function here: `arguments` is not bound inside an arrow function
$.when.apply($, whenArray).then(function() {
    loadMyData(arguments);
});

How do I know my collection already has data using Backbone.JS?

I am developing a site using the JavaScript framework Backbone.js. In my site there is one category-selection drop down. Using a Backbone collection fetch, I have rendered my category drop down successfully. In my header I have three horizontal menu items [image shown below]. On menu click (page navigation is done using Backbone routers), my main content of the day changes. The user can filter the content based on the category. My category filter drop down options will not change frequently.
ALL = http://www.Site1.com
MOBILE = http://www.Site1.com/#all/mobile
DESKTOP = http://www.Site1.com/#all/desktop
My Router:
dealapp.AppRouter = Backbone.Router.extend({
    routes: {
        "": "home",
        "all/mobile": "mobile",
        "all/desktop": "desktop"
    },
    home: function () {},
    mobile: function () {},
    desktop: function () {}
});
Success case
I load my site using "http://www.Site1.com/". The home function gets called and does the listed actions. If I then navigate to another tab (mobile/desktop), my category drop down is displayed. [Note: I fetch my categories from the server in the home function.]
Scenario
I load my site using "http://www.Site1.com/#all/deal" directly. In this case my category drop down is not rendered; I get an empty drop down. I know that I haven't added my category fetch in the other two functions, mobile and desktop. If I include the category fetch in the mobile and desktop functions, then the category fetch goes to the server and fetches the data every time.
My doubt
How do I know if my collection already has data? I want to reuse the already downloaded data. If the data is not available locally then I need to fetch it from the server.
You can override fetch on the collection. Fetch returns a deferred object, which you can store on the collection itself. If the deferred is null you call the prototype fetch. The advantage is that in your code you always call fetch, and if the collection already has data you get the cached result back.
fetch: function(options) {
    if (this.deferred) {
        return this.deferred;
    }
    this.deferred = Backbone.Collection.prototype.fetch.call(this, options);
    return this.deferred;
}
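A hedged usage sketch of that override, assuming a hypothetical Categories collection with a made-up url:
var Categories = Backbone.Collection.extend({
    url: '/categories', // hypothetical endpoint
    fetch: function(options) {
        // return the cached deferred if a fetch has already been started
        if (this.deferred) {
            return this.deferred;
        }
        this.deferred = Backbone.Collection.prototype.fetch.call(this, options);
        return this.deferred;
    }
});

var categories = new Categories();
// the first call hits the server, later calls reuse the same deferred
categories.fetch().then(function() {
    // render the drop down from `categories`
});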
This specific problem has been dealt with by others, and a few plugins can be found.
The one I am currently using with success is the Thorax framework, which adds a few things on top of Backbone.
For Model and Collection they added isPopulated() and isEmpty() methods, as can be seen here in their documentation.
They will tell you whether there is data in the collection or not. If you don't want to use the entire framework, just copying the code from their Git repository here would do.
In a few words, they solve the problem by overriding fetch to set a property called _fetched to true when the data has been fetched.
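A rough sketch of that idea (this is not Thorax's actual code, just the flag-based pattern it describes):
var TrackedCollection = Backbone.Collection.extend({
    _fetched: false,
    // report whether data has ever been fetched or added
    isPopulated: function() {
        return this._fetched || this.length > 0;
    },
    fetch: function(options) {
        var self = this;
        options = options || {};
        var success = options.success;
        // wrap the success callback so the flag is set after a fetch completes
        options.success = function() {
            self._fetched = true;
            if (success) { success.apply(this, arguments); }
        };
        return Backbone.Collection.prototype.fetch.call(this, options);
    }
});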
Another way would be to cache the data. Most of the time this is a good idea, but it depends. In your scenario it could be a good idea to cache it in localStorage.
A plugin I found that seems to do its job is Backbone fetch cache.
Description:
This plugin intercepts calls to fetch and stores the results in a cache object (Backbone.fetchCache._cache). If fetch is called with { cache: true } in the options and the URL has already been cached, the AJAX call will be skipped.
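Based on that description, usage would look something like this (a sketch only, not tested against the plugin):
// pass { cache: true } so a repeated fetch of the same URL is served from the cache
categories.fetch({ cache: true }).then(function() {
    // render the drop down; a later call with the same URL skips the AJAX request
});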
Yet another version is mentioned in this answer: Caching collections in backbone.js?
As the answerer there said, you could do it similar to this:
var manager = (function() {
    var constructors = {
        'example': ExampleCollection
    };
    var collections = {};
    return {
        getCollection: function(name) {
            if (!collections[name]) {
                var collection = new constructors[name]();
                collection.fetch();
                collections[name] = collection;
            }
            return collections[name];
        }
    };
})();
Here the manager is responsible for instantiating collections and fetching them. When you call:
var exampleCollection = manager.getCollection('example');
you get an instance of the example collection with its data already fetched. Whenever you need this collection again you can call the method again. You will then get the exact same instance with no need to fetch it again.

Ext JS Store POST request filter params

I have a Store configured with a proxy to POST data to the server. I add records to this store dynamically. After calling the sync() method on the store, the data gets sent to the server. But looking at the network traffic I see that the whole record data is sent. How can I configure the store to send only specific data (like only the IDs of the records)?
I have tried setting the writeAllFields property to false on the JSON writer connected to the proxy, but this did not help.
I have also tried this approach: Ext.JS Prevent Proxy from sending extra fields, but then the request was not even performed.
var documentStore = Ext.getStore('Document');
var trashStore = Ext.getStore('TrashDocuments');
documentStore.each(function(record) {
    console.debug(record);
    //record.phantom = true;
    //record.setDirty();
    trashStore.add(record);
    documentStore.remove(record);
});
var newWriter = Ext.create('Ext.data.writer.Json', {
    getRecordData: function(record) {
        return { 'id': record.data.id };
    }
});
trashStore.getProxy().setWriter(newWriter);
trashStore.sync({
    success: function() {
        console.debug("success!!");
    },
    failure: function() {
        console.debug("failed...");
    },
    callback: function() {
        console.debug("calling callback");
    },
    scope: this
});
console.debug("END");
The writeAllFields config is not working because it is only meaningful for non-phantom records; all fields of a phantom (new) record are viewed as "changed" and therefore included in the request packet.
To exclude specific fields from being included in the request, add persist: false to the fields you don't want to include.
That being said, I don't quite understand why you'd only want to write the ids of newly added records. Unless you are explicitly having Ext JS create those ids for you, what are you actually going to be sending to the server? I don't know your use case, but typically it is desirable to have your persistence layer (e.g., a database) assign identifiers to your records.
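A minimal sketch of the persist: false suggestion, assuming a hypothetical Document model (the field names are made up):
Ext.define('MyApp.model.Document', {
    extend: 'Ext.data.Model',
    fields: [
        { name: 'id' },
        // persist: false keeps these fields out of create/update request payloads
        { name: 'title', persist: false },
        { name: 'content', persist: false }
    ]
});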
