REST service cache strategy with AngularJS - javascript

I have an AngularJS application and I want to cache the REST service responses. I found some libraries like angular-cached-resource which can do this by storing the data in the local storage of the web browser.
But sometimes I make POST / PUT / DELETE REST calls, after which some of the previously cached responses become stale and those calls need to be performed again. It seems possible to delete the affected cached responses, so that the next call is sent to the server.
But what if the server sends values like Expires or ETag in the HTTP headers? Do I have to read the headers and react myself, or is there an AngularJS library which can also handle this?
So whether I should hit the server or read from the local storage cache depends on the HTTP cache header fields, and on whether any PUT / POST / DELETE calls returned a response saying, for example, that a "reload of every user settings element" is needed. In that case I have to take this response and build a map which tells me that, for example, REST services A, C and F (the user-settings-related ones) need to hit the server again the next time they are executed, or when the cache expires according to the HTTP headers.
Is this possible with an AngularJS library, or do you have any other recommendations? I think this is similar to the Observer or Pub/Sub pattern, isn't it?
One more thing: is it also possible to have something like Pub/Sub without using a cache / local storage (so also no HTTP cache-control headers)? Then I could avoid calling the REST service, because calling it would hit the server, which I do not want in some circumstances (e.g. after a previous REST call returned the event "reload of every user settings element").

You can try something like this.
app.factory('requestService', ['$http', function ($http) {
    var cache = {};
    var service = {
        getCall: function (requestUrl, successCallback, failureCallback, getFromCache) {
            if (!getFromCache) {
                $http.get(requestUrl)
                    .success(function (response) {
                        cache[requestUrl] = response;
                        successCallback(response);
                    })
                    .error(function (response) {
                        failureCallback(response);
                    });
            } else {
                successCallback(cache[requestUrl]);
            }
        },
        postCall: function (requestUrl, paramToPass, successCallback, failureCallback, getFromCache) {
            if (!getFromCache) {
                $http.post(requestUrl, paramToPass)
                    .success(function (response) {
                        cache[requestUrl] = response;
                        successCallback(response);
                    })
                    .error(function (response) {
                        failureCallback(response);
                    });
            } else {
                successCallback(cache[requestUrl]);
            }
        }
    };
    return service;
}]);
This is just a simple sketch I wrote to illustrate your concept. I haven't tested it, and it's all yours.
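The "map which tells me that REST services A, C and F need to hit the server again" from the question is essentially a small pub/sub invalidation registry: mutating calls publish an event, and every cached URL subscribed to that event is marked stale. A minimal, framework-free sketch of that idea (all names here are illustrative, not from any library):

```javascript
// Sketch of a pub/sub cache-invalidation registry. Mutating REST calls
// publish an event; every cached URL subscribed to that event is dropped
// from the cache, so the next read hits the server again.
function CacheInvalidator() {
  this.cache = {};          // url -> cached response
  this.subscriptions = {};  // event name -> list of urls to invalidate
}

CacheInvalidator.prototype.put = function (url, response) {
  this.cache[url] = response;
};

CacheInvalidator.prototype.get = function (url) {
  return this.cache[url]; // undefined means "hit the server"
};

// Register that a cached URL must be refetched when `event` is published.
CacheInvalidator.prototype.subscribe = function (event, url) {
  (this.subscriptions[event] = this.subscriptions[event] || []).push(url);
};

// Called after a POST/PUT/DELETE whose response says, for example,
// "reload every user-settings element".
CacheInvalidator.prototype.publish = function (event) {
  var self = this;
  (this.subscriptions[event] || []).forEach(function (url) {
    delete self.cache[url];
  });
};
```

Usage: subscribe the user-settings-related services (A, C, F in the question) to a `'user-settings-changed'` event, and publish that event from the success handler of the relevant POST/PUT/DELETE calls.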

Related

NGINX JavaScript module with a session storage (like Redis)

I know there is a possibility to process each request via a JS script right inside the NGINX server.
I know there is the Lua Nginx module and the Lua Redis driver, and it's possible to write a script in Lua and use Redis right from the NGINX server.
However, I want to use standard NGINX functionality, and I prefer to code in JS. I wonder whether it is possible to use some session storage with njs, and how to do it? In particular, I would like to use Redis as the session storage.
If you want to avoid compiling and installing third-party modules for Nginx yourself, I suppose the best way to build session storage with njs and Redis is to use the built-in ngx_http_upstream_module module and set up something like this:
http {
    [...]
    upstream redis {
        server unix:/var/run/redis/nginx.sock;
    }
    [...]
    js_path conf.d/js/;
    js_import redismiddleware.js;
    [...]
    server {
        [...]
        location /redisadapter {
            internal;
            try_files #dummy #redis;
        }
        location /request-we-are-tracking-no-1/ {
            js_content redismiddleware.processRequestConditional;
        }
        [...]
        location /request-we-are-tracking-no-2/ {
            js_content redismiddleware.processRequestUnconditional;
        }
        [...]
    }
}
and the corresponding script
var queryString = require('querystring');

function processRequestConditional(request) {
    request.subrequest('/redisadapter', {
        method: 'POST',
        body: queryString.stringify({
            /* data to transfer to Redis */
        })
    }).then(function (response) {
        var reply = {};
        /**
         * Parse and check the feedback from Redis,
         * save some data from the feedback to the
         * reply object, do anything else as planned,
         * run any additional routines et cetera.
         */
        var redisReplyIsOk = true; /* replace with a real check of `response` */
        if (redisReplyIsOk) {
            return reply;
        } else {
            throw new Error('Because we don\'t like you!');
        }
    }).then(function (reply) {
        /**
         * Make one more subrequest to obtain the content
         * the client is waiting for.
         */
        request.subrequest('/secret-url-with-content').then(
            (response) => request.return(response.httpStatus, response.body)
        );
    }).catch(function (error) {
        /**
         * Reply to the client with the response "There
         * will be no response".
         */
        request.return(403, error.message);
    });
}

function processRequestUnconditional(request) {
    request.subrequest('/redisadapter', {
        method: 'POST',
        body: queryString.stringify({
            /* data to transfer to Redis */
        })
    }).then(function (response) {
        /**
         * Parse and check the feedback from Redis, do
         * some things, run some or other additional
         * routines depending on the reply.
         */
    });
    request.subrequest('/secret-url-with-content').then(
        (response) => request.return(response.httpStatus, response.body)
    );
}

export default { processRequestConditional, processRequestUnconditional };
Short summary
Redis is listening and replying on socket
/var/run/redis/nginx.sock
Virtual internal location /redisadapter receives the requests from the njs script, transfers them to Redis, and returns the replies back to the njs method which started the request sequence
To establish a data exchange with Redis for some location, we take control over Nginx's standard flow and serve these locations with custom njs methods
Thus, in addition to the Redis-related exchange routines, the code in the methods has to implement the complete subroutine of retrieving the requested content and delivering it back to the client, since we took this job over from Nginx
This is achieved by sets of local server-to-server subrequests, which are transparent to clients; the njs method completes the set and itself delivers the requested content back to the client
The above example contains two methods, processRequestConditional and processRequestUnconditional, whose goal is to show the two most common ways to structure such tasks
The first one, processRequestConditional, demonstrates a chain of subrequests: the content requested by the client won't be fetched while the njs method is still busy with its primary task of transferring the next piece of data into the Redis session storage. Moreover, if the script is not satisfied with the Redis feedback, the content request is skipped entirely, and the client gets a refusal message instead
The second method, processRequestUnconditional, transfers the data to the Redis storage the same way as the first one, but this time the fate of the client's request does not depend on the Redis feedback, so the secondary request for content is issued at the same time as the primary one and runs in parallel while the script continues the exchange round with the session storage
Of course, my brief explanation leaves a lot of details behind the scenes, but I hope the basic concept is now clear
Feel free to ask additional questions

Angular 5 InMemoryWebAPI - How to simulate CRUD with HTTP Verbs?

In working with the Angular docs / tutorial on the in-memory web api, I want to return some JSON values that indicate the success or failure of a request. i.e.:
{success:true, error:""}
or
{success:false, error:"Database error"}
But, the code in the example for the in-memory-data.service.ts file only has the one method: createDb().
How do I update that service code to respond to a PUT/POST/DELETE request differently than to a GET?
Note: In real-life / production, the backend will be PHP, and we can return these values any way we want (with the correct status codes). This question is specifically directed at making the In Memory Web API mock those responses.
Example:
Executing:
const result = this.http.post(url, someJsonData, httpHeaders);
I would want the response to be:
{success: true, id: 1234} with an HTTP status code of 200.
Later, to delete that record that was just created:
url = `/foo/` + id + '/'; // url = '/foo/1234/';
this.http.delete(url);
This wouldn't really need a JSON meta data response. An HTTP Status code of 200 is sufficient.
How do I update that service code to respond to a PUT/POST/DELETE
request differently than to a GET?
The server always responds to the requests you make. When you fire an Ajax request, it uses one of the known HTTP methods (like POST, GET, PUT...), and afterwards you wait for the answer.
/** POST: add a new hero to the database */
addHero(hero: Hero): Observable<Hero> {
    return this.http.post<Hero>(this.heroesUrl, hero, httpOptions)
        .pipe(
            catchError(this.handleError('addHero', hero))
        );
}

/** GET: get an existing hero from the database */
getHero(heroId: number): Observable<Hero> {
    return this.http.get<Hero>(this.heroesUrl + '/' + heroId);
}
I found the answer I was looking for: It's HttpInterceptors.
This blog post and the corresponding github repo demonstrate exactly how to simulate CRUD operations in Angular 2/5 without having to setup a testing server.
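To make the interceptor approach concrete: stripped of the Angular plumbing, a fake-backend interceptor just matches the method and URL of each outgoing request and fabricates a response. A hypothetical, framework-free sketch of that routing logic, using the /foo/ endpoint from the question (the function and its shape are illustrative, not the blog post's actual code):

```javascript
// Sketch of fake-backend routing: given a method, URL and body, return a
// simulated HTTP response. In Angular this logic would live inside an
// HttpInterceptor's intercept() method; `db` is an in-memory store.
function fakeBackend(method, url, body, db) {
  var match = url.match(/^\/foo\/(\d+)\/?$/);
  if (method === 'POST' && url === '/foo/') {
    var id = db.nextId++;
    db.records[id] = body;
    return { status: 200, body: { success: true, id: id } };
  }
  if (method === 'DELETE' && match) {
    if (!db.records[match[1]]) {
      return { status: 404, body: { success: false, error: 'Not found' } };
    }
    delete db.records[match[1]];
    return { status: 200, body: { success: true } };
  }
  return { status: 404, body: { success: false, error: 'Unknown route' } };
}
```

An Angular interceptor would wrap each returned object in an `HttpResponse` and hand it back as an observable instead of forwarding the request to the network.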

Refresh page after load on cache-first Service Worker

I'm currently considering adding service workers to a Web app I'm building.
This app is, essentially, a collection manager. You can CRUD items of various types and they are usually tightly linked together (e.g. A hasMany B hasMany C).
sw-toolbox offers a toolbox.fastest handler which goes to the cache and then to the network (in 99% of the cases, cache will be faster), updating the cache in the background. What I'm wondering is how you can be notified that there's a new version of the page available. My intent is to show the cached version and, then, if the network fetch got a newer version, to suggest to the user to refresh the page in order to see the latest edits. I saw something in a YouTube video a while ago but the presenter gives no clue of how to deal with this.
Is that possible? Is there some event handler or promise that I could bind to the request so that I know when the newer version is retrieved? I would then post a message to the page to show a notification.
If not, I know I can use toolbox.networkFirst along with a reasonable timeout to make the pages available even on Lie-Fi, but it's not as good.
I just stumbled across the Mozilla Service Worker Cookbook, which includes more or less what I wanted: https://serviceworke.rs/strategy-cache-update-and-refresh.html
Here are the relevant parts (not my code: copied here for convenience).
Fetch methods for the worker
// On fetch, use the cache but update the entry with the latest contents from the server.
self.addEventListener('fetch', function (evt) {
    console.log('The service worker is serving the asset.');
    // You can use respondWith() to answer ASAP…
    evt.respondWith(fromCache(evt.request));
    // ...and waitUntil() to prevent the worker from being killed until the cache is updated.
    evt.waitUntil(
        update(evt.request)
            // Finally, send a message to the client to inform it that the resource is up to date.
            .then(refresh)
    );
});
// Open the cache where the assets were stored and search for the requested resource. Notice that in case of no matching, the promise still resolves, but it does so with undefined as value.
function fromCache(request) {
    return caches.open(CACHE).then(function (cache) {
        return cache.match(request);
    });
}

// Update consists in opening the cache, performing a network request and storing the new response data.
function update(request) {
    return caches.open(CACHE).then(function (cache) {
        return fetch(request).then(function (response) {
            return cache.put(request, response.clone()).then(function () {
                return response;
            });
        });
    });
}

// Sends a message to the clients.
function refresh(response) {
    return self.clients.matchAll().then(function (clients) {
        clients.forEach(function (client) {
            // Encode which resource has been updated. By including the ETag the client can check if the content has changed.
            var message = {
                type: 'refresh',
                url: response.url,
                // Notice not all servers return the ETag header. If it is not provided you should use other cache headers or rely on your own means to check if the content has changed.
                eTag: response.headers.get('ETag')
            };
            // Tell the client about the update.
            client.postMessage(JSON.stringify(message));
        });
    });
}
Handling of the "resource was updated" message
navigator.serviceWorker.onmessage = function (evt) {
    var message = JSON.parse(evt.data);
    var isRefresh = message.type === 'refresh';
    var isAsset = message.url.includes('asset');
    var lastETag = localStorage.currentETag;
    // The ETag header usually contains the hash of the resource, so it is a very effective way of checking for fresh content.
    var isNew = lastETag !== message.eTag;
    if (isRefresh && isAsset && isNew) {
        // Skip the first time (when there is no ETag yet).
        if (lastETag) {
            // Inform the user about the update.
            notice.hidden = false;
        }
        // For teaching purposes: although this information is in the offline cache and could be retrieved from the service worker, keeping track of the header in localStorage keeps the implementation simple.
        localStorage.currentETag = message.eTag;
    }
};
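If you want to unit-test the notification decision, the checks inside the handler above can be isolated into a small pure function. This is only an illustrative refactor sketch, not part of the cookbook recipe:

```javascript
// Decide whether the user should be told to refresh, given the incoming
// service-worker message and the previously stored ETag. Mirrors the
// isRefresh / isAsset / isNew checks from the onmessage handler.
function shouldNotifyUser(message, lastETag) {
  var isRefresh = message.type === 'refresh';
  var isAsset = message.url.indexOf('asset') !== -1;
  var isNew = lastETag !== message.eTag;
  // Skip the very first load, when no ETag has been stored yet.
  return Boolean(isRefresh && isAsset && isNew && lastETag);
}
```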

Caching on frontend or on backend

Right now I send my requests via ajax to the backend server, which does some operations, and returns a response:
function getData() {
    new Ajax().getResponse()
        .then(function (response) {
            // handle response
        })
        .catch(function (error) {
            // handle error
        });
}
The thing is that each time a user refreshes the website, every request is sent again. I've been thinking about caching them inside the local storage:
function getData() {
    if (Cache.get('getResponse')) {
        let response = Cache.get('getResponse');
        // handle response
        return true;
    }
    new Ajax().getResponse()
        .then(function (response) {
            // handle response
        })
        .catch(function (error) {
            // handle error
        });
}
This way if a user already made a request, and the response is cached inside the localStorage, I don't have to fetch data from the server. If a user changes values from the getResponse, I would just clear the cache.
Is this a good approach? If it is, is there a better way to do this? Also, should I cache backend responses the same way? What's the difference between frontend and backend caching?
Is this a good approach? It depends on what kind of data you are storing.
Be aware that everything stored on the frontend can be changed by the user, so this is a potential security vulnerability.
That is the main difference between backend and frontend caching: a backend cache can't be edited by the user.
If you decide to do frontend caching, here is how to do it:
localStorage.setItem('getResponse', JSON.stringify(response));
And for retrieving the stored data from local storage:
var retrievedObject = JSON.parse(localStorage.getItem('getResponse'));
NOTE:
I assume that you are storing an object, not a string or a number. If you are storing a string, number, etc., just remove the JSON.stringify and JSON.parse calls.
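If you also want cached entries to expire after some time (for example, to roughly honour a server-side max-age), a thin wrapper can store a timestamp next to the value. This is only a sketch; the storage object and clock are injected so the example works with any getItem/setItem pair, including window.localStorage:

```javascript
// Sketch of a localStorage-style cache with expiry. `storage` is any
// object with getItem/setItem (pass window.localStorage in the browser);
// `now` is a clock function, injectable for testing.
function TtlCache(storage, now) {
  this.storage = storage;
  this.now = now || function () { return Date.now(); };
}

// Store a value together with its absolute expiry time.
TtlCache.prototype.set = function (key, value, ttlMs) {
  this.storage.setItem(key, JSON.stringify({
    value: value,
    expiresAt: this.now() + ttlMs
  }));
};

// Return the cached value, or null if missing or expired (null = refetch).
TtlCache.prototype.get = function (key) {
  var raw = this.storage.getItem(key);
  if (!raw) return null;
  var entry = JSON.parse(raw);
  return entry.expiresAt > this.now() ? entry.value : null;
};
```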
A better practice is to use the Cache API, "a system for storing and retrieving network requests and their corresponding responses".
The Cache API is available in all modern browsers. It is exposed via the global caches property, so you can test for the presence of the API with a simple feature detection.
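For illustration, a feature check along those lines might look like this (the helper name is made up; in the browser you would pass window or self as the scope):

```javascript
// Feature detection for the Cache API: supporting browsers expose a
// global `caches` property. The scope object is passed in so the check
// can be exercised outside the browser.
function supportsCacheApi(scope) {
  return typeof scope === 'object' && scope !== null && 'caches' in scope;
}

// In the browser:
// if (supportsCacheApi(window)) { /* use caches.open(...) */ }
```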

Send user details (session token) within every AJAX requests (Sencha Touch 2)

I am building a Sencha Touch 2 application with user-specific datasets.
Architecture of the App:
Sencha Touch App <=====> Java Server backend with REST Services
( many AJAX requests =) )
What I currently have:
Log in the user with username/password
The app gets initialized and the login form comes into play. After submitting the form as an AJAX request, the server backend checks the user data and calls the client callback function.
And what i want to do is:
The callback function should
create a cookie with the sessiontoken or
store the sessiontoken within the localstorage (http://docs.sencha.com/touch/2-0/#!/api/Ext.data.proxy.LocalStorage) or
store the sessiontoken within js variable
Okay, that shouldn't be a problem.
But how can I achieve the following:
Most of the data is specific to one user and should be returned by the REST service when needed (on navigation clicks, ...). How can I send the session token (see above) with every AJAX request, so the server can provide the suitable datasets (assuming the token is valid)?
Send cookies with AJAX requests
I have already read that cookies get added to the request automatically if the URL is in the same space, right? The Java server is on the same domain (localhost:8080), but the cookies aren't available there, unlike for requests to URLs like 'app.json'. I thought cross-domain restrictions were really domain-specific?
Send parameters with AJAX requests
Because the cookies aren't available, I thought about the possibility of 'manually' adding parameters to the AJAX requests. The app will contain many AJAX requests, and that's why I don't want to add the token by hand everywhere. I tried to override the request function of Ext.Ajax, but I failed ;-(:
(function () {
    var originalRequest = Ext.data.Connection.prototype.request;
    Ext.override(Ext.data.Connection, {
        request: function (options) {
            alert("Do sth... like add params");
            return originalRequest.apply(this, options);
        }
    });
})();
ERROR:
Uncaught Error: [ERROR][Ext.data.Connection#request] No URL specified
I also tried to add a listener
Ext.Ajax.add({
    listeners: {
        beforerequest: function (conn, options, eOpts) {
            alert("Do sth... like add params");
        }
    }
});
ERROR:
Uncaught TypeError: Object [object Object] has no method 'add'
Any idea how I can add the token?
Or any better way of handling this case?
Thanks!
Finally, I successfully used:
function addAjaxInterceptor(context) {
    Ext.Ajax.on('beforerequest', function (conn, options, eOptions) {
        // add the param to options...
    }, context);
}
Executed from the app (=> addAjaxInterceptor(this)).
But the following solution is more suitable for my situation, I think:
Ext.Ajax._defaultHeaders = {
    // params as json
};
(Because Ext.Ajax is a singleton object and I don't change the params for every request.)
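Stripped of the Ext plumbing, both variants boil down to intercepting every outgoing request and merging the token into its options before it is sent. A framework-free sketch of that interceptor idea (the function names are illustrative):

```javascript
// Wrap a request function so that a session token is merged into the
// params of every outgoing call. `request` is whatever function actually
// sends the AJAX call; `getToken` reads the stored session token.
function withSessionToken(request, getToken) {
  return function (options) {
    options = options || {};
    options.params = options.params || {};
    options.params.sessiontoken = getToken();
    return request(options);
  };
}
```

The same shape works for headers instead of params: merge into `options.headers` and read the token server-side from the request headers.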
