How to persistently cache data fetched from remote server with javascript

I am building a reactjs (with hooks) web app which presents the user some data fetched from remote server pages (using a simple proxy).
Data on the remote server changes about once per week, so I would like to persistently cache it on the client (for example using LocalStorage) until the server-side pages are updated, for a better user experience.
I'm using axios for data fetching, but I could also use a basic fetch.
The problem is that I can't understand how to cache data on the client until it is updated on the server. I see, for example, that axios supports caching mechanisms (also using interceptors), but I see no way to specify something like an ETag or If-None-Match request header; I can only specify a fixed amount of time before the cache is invalidated.
It is possible I'm missing something obvious, and that I'm asking something unfeasible...

I can think of a few ways of solving this:
1) using a DDP protocol https://en.wikipedia.org/wiki/Distributed_Data_Protocol or some polling mechanism (e.g. fetch every x minutes)
2) add some "expireAt" timestamp in the response itself and use that one on the client to figure out when to check again for updates
3) using websockets with some pub/sub mechanism
They all have their ups and downs and you'd need to check if they are feasible for your application.
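Option 2 could look roughly like this (a minimal sketch, assuming the response body carries an expireAt timestamp in milliseconds; fetchWithCache and isFresh are hypothetical helper names):

```javascript
// Sketch of option 2: cache responses in localStorage until a server-supplied
// "expireAt" timestamp (an assumed field in the response body) passes.
function isFresh(entry, now = Date.now()) {
  return Boolean(entry) && typeof entry.expireAt === "number" && now < entry.expireAt;
}

async function fetchWithCache(url, storage = localStorage) {
  const cached = JSON.parse(storage.getItem(url) || "null");
  if (isFresh(cached)) return cached.data; // still valid, skip the network

  const res = await fetch(url);
  const data = await res.json();
  // fall back to a one-hour lifetime if the server sends no expireAt
  const expireAt = data.expireAt || Date.now() + 60 * 60 * 1000;
  storage.setItem(url, JSON.stringify({ data, expireAt }));
  return data;
}
```

The nice part of this variant is that the server decides the lifetime per response, instead of the client hard-coding a fixed cache duration.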
Best,
Sebo

Related

Get real-time data only one way (server to client). Is it worth websockets?

I need my website to get real time data from the server (it is for a project in html5, css3, javascript, php, mysql).
Initially I thought about websockets, but maybe that is overkill, since I don't need two-way communication. I just need the page to capture (in real time, without the user doing anything) the values of a mysql field and, depending on them, do one thing or another in javascript.
My system could have about 1,000 users at a time.
What system would you recommend? Do you know of any examples?
So you basically have a website and you plan on receiving notifications from the server. That, to me, looks like a continuous flow of data from the server to you, the client, at irregular intervals, and the reasonable way to do it (unless you plan on keeping a table of your clients' IPs) is via websockets. This would be the push-based approach.
Once you have established the connection, it remains open and you can receive your data continuously.
Another option, as mentioned above, would be to continuously pull data from the server (query the server for changes), and this could be done with HTTP.
So if you choose the push-based option you could use Websockets, or you could look at Server-Sent Events.
For pulling (request-response) you could use HTTP or something lighter like gRPC.
For more information check the options here
The question is not whether the communication is one- or two-way, but which communication partner initiates a speech act. If the real-time data of your application changes asynchronously and not in a certain rhythm, then the server should send updates asynchronously to the web client. And this is actually one of the standard usages of websockets, which cannot be implemented well with HTTP request/response pairs (the client would have to poll).
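A minimal client for the push-based approach mentioned above, using Server-Sent Events ("/updates" is an assumed endpoint, and handleUpdate is a hypothetical helper kept pure so the browser wiring stays trivial):

```javascript
// Parse one SSE message payload into display text (pure, so it is testable).
function handleUpdate(raw) {
  const value = JSON.parse(raw);
  return "status: " + value.status;
}

// browser-only wiring; guarded so the sketch also loads outside a browser
if (typeof window !== "undefined") {
  const source = new EventSource("/updates"); // assumed server endpoint
  source.onmessage = function (event) {
    document.querySelector("#status").textContent = handleUpdate(event.data);
  };
}
```

EventSource reconnects automatically on connection loss, which makes it a lighter choice than websockets when the data only flows server-to-client.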

Is there a best practice to prevent the user of my Angular SPA from manipulating the form data in a request before it is sent to the server

My Angular (version 5) app is secured with a JWT token and AuthGuards. In theory, a user is able to manipulate form data with Chrome's Developer Tools before the aggregated form values are sent to the server. Are there nowadays any good practices to prevent this on the client side, so that I can assume the data sent over https to the server can be trusted?
This question addresses the problem: a server-side session in combination with validation is recommended there. But in a RESTful architecture there is no session anymore, and we can't prevent all combinations of manipulation attempts by using validations on the server side. Therefore I am looking for a convenient client-side solution that makes it uncomfortable for the normal user to manipulate things with developer tools. I also know that there can't be a 100% trustworthy client implementation. But complicating the manipulation attempts would be a nice trade-off.
Consider creating HttpClient interceptors (see https://angular.io/guide/http), which will be automatically invoked for each HTTP request (including the ones initiated by forms). In those interceptors, you can implement some business logic to ensure that the data was not being manipulated by the user.

Force Cache Update for AJAX Requests

I am working on a large Java EE based enterprise portal.
The user's navigation is retrieved via jQuery AJAX requests. Since the navigation is rather big and these AJAX requests to the server are a bit long-running, I use the
cache: true
option to let the browser store the responses in its cache and later on retrieve the results for repeated requests.
So far so good. In some cases, e.g. when navigation entries have changed or the user has changed the frontend language, I need to make the browser reload the requests freshly from the server.
I know I can use "cache: false", but instead of just bypassing the browser cache, I'd rather make the browser replace the obsolete cached responses with fresh data from the server.
Is there any option I can add from frontend or backend side to the requests or their results, to make the browser discard the obsolete results with newly retrieved values?
I have encountered this same issue. I found that by setting the "Last-Modified" HTTP header on the responses, the browsers automatically revalidated the cached data based on the age of the cached content.

How to update web application with data every n minutes

I want to create a web application that displays data from a public api. I will use d3 (a javascript data-visualization library). I want to retrieve data from the api every ten minutes and update my page (say it is traffic data, or something). I have not built many web applications; how do I get the updates?
Should the js on the client side use a timer to request updates from the server side of my application (perhaps the application is written in Rails or node.js). The server then makes the api call and sends a response asynchronously? Is this called a socket? I have read that HTML5 provides sockets.
Or, perhaps an AJAX request?
Or, does the server side of my application create a timer, make the api call, and then "push" updates to the view. This seems wrong to me, there could be other views in this application, and the server shouldn't have to keep track of which view is active.
Is there a standard pattern for this type of web application? Any examples or tutorials greatly appreciated.
An AJAX request (XMLHttpRequest) is probably the way to go.
I have a very simple example of an XMLHttpRequest (with Java as the backend) here: https://stackoverflow.com/a/18028943/1468130
You could recreate a backend to receive HTTP GET requests in any other server-side language. Just echo back whatever data you retrieved, and xmlhttp.onload() will catch it.
Depending on how complex your data is, you may want to find a JSON library for your server-side language of choice, and serialize your data to JSON before echoing it back to your JS. Then you can use JavaScript's JSON.parse() method to convert your server data to an object that can easily be used by the client script.
If you are using jQuery, it handles AJAX very smoothly, and using $.ajax() would probably be easier than plain-old XMLHttpRequest.
http://api.jquery.com/jQuery.ajax/
(There are examples throughout this page, mostly-concentrated at the bottom.)
It really annoys me how complicated so many of the AJAX tutorials are. At least with jQuery, it's pretty easy.
Basically, you just need to ask a script for something (initiate the request, send url parameters), and wait for the script to give you something back (trigger your onload() or jqxhr.done() functions, supplying those functions with a data parameter).
For your other questions:
Use JavaScript's setTimeout() or setInterval() to initiate an AJAX request every 600000 milliseconds. In the request's onload callback, handle your data and update the page appropriately.
The response will be asynchronous.
This isn't a socket.
"Pushing" probably isn't the way to go in this case.
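Put together, the polling approach sketched above might look like this ("/traffic" is an assumed endpoint, and redrawChart stands in for your d3 update code):

```javascript
function minutesToMs(minutes) {
  return minutes * 60 * 1000; // setInterval expects milliseconds
}

function redrawChart(data) {
  // placeholder: bind `data` to your d3 selections and re-render here
}

async function refresh() {
  const res = await fetch("/traffic");
  redrawChart(await res.json());
}

if (typeof window !== "undefined") { // browser-only wiring
  refresh();                         // initial load
  setInterval(refresh, minutesToMs(10));
}
```

The timer lives entirely on the client, so the server stays a plain request/response endpoint and does not need to track which views are active.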
If I understand correctly and this API is external, then your problem can be divided into two separate sub-problems:
1) Updating data at the server. The server should download data once per N minutes, so it should not be tied to customers' AJAX calls. If two customers visit the website at the same time, your server would make two API calls, which is not correct.
Instead, you should create a CRON job on the server that calls the API and stores its result. That way your server always makes one call at a time and has reasonably fresh information cached.
2) Updating data at clients. If the data in customers' browsers should be updated without refreshing the page, then you should use some sort of Ajax. It can make a request to your server once per X minutes to get fresh data, or use so-called long-polling.
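Sub-problem 1 could be sketched like this (a single server-side refresh loop, so simultaneous visitors never trigger duplicate API calls; the external URL is an assumption, and fetchFn is injectable only to keep the function testable):

```javascript
let cached = { data: null, fetchedAt: 0 };

// refresh the server's own copy of the external API's data
async function refreshCache(fetchFn = fetch) {
  const res = await fetchFn("https://api.example.com/data"); // assumed external API
  cached = { data: await res.json(), fetchedAt: Date.now() };
  return cached;
}

// call once at server boot; unref() (Node) so the timer never blocks shutdown
function startRefreshLoop(minutes) {
  return setInterval(function () { refreshCache(); }, minutes * 60 * 1000).unref();
}

// client-facing handlers then only ever read `cached`, never the external API
```

A real CRON job works just as well; the important property is that the refresh cadence is decoupled from client traffic.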
I think the most effective way to implement a real-time web application is to use websockets to push changes from the server rather than polling from the client side. This way users see changes instantaneously, as soon as the server notifies them that new data is available. You can read more on the similar post.
I have tried using the nodejs package socket.io to make a real-time virtual classroom. It works quite well for me.

Is it bad to use socket.io for most of the requests or should I only use it to push data to client?

Is it bad to replace AJAX routes (called with $.Ajax from jquery) such as:
GET /animals
GET /animals/[id]
POST /animals
With socket.io events (events bound on both client and server so the client can get a response):
emit("animals:read")
emit("animals:read", {id:asdasd})
emit("animals:write", animalData)
or should I "only" use socket.io to push data to the client?
[EDIT]
I can see one problem if I don't use socket.io for my POST route: I can't easily use the client's socket to broadcast data:
Server:
socket.on("animals:write", function(data){
  var writtenAnimal = saveAnimal(data)
  socket.broadcast.emit("animals:write", writtenAnimal)
  socket.emit("animals:write", writtenAnimal)
})
VS
app.post("/animals", function(req, res){
  var writtenAnimal = saveAnimal(req.body)
  // can't broadcast :(
  res.status(201).send(writtenAnimal)
})
I will push data to clients in some other requests for sure, so all clients will have at least 1 socket.
IMHO socket.io should be used if you want real-time data on your website. Take Stack Overflow, for example: it uses websockets to update your scores and your notifications in real time.
But if you really want to create an application that is SEO-friendly (I mean one that serves its pages over plain HTTP), and, more importantly, if you are aware of the difficulties of managing sessions and permissions in socket.io, I think you'll prefer AJAX for your pages and socket.io for other realtime data.
I would use ajax for this; these are HTTP-based data requests, nothing realtime.
If you don't want to push data to the client, I don't see why you would use socket.io instead of AJAX. I mean, with AJAX you don't need to maintain a session with the client, and it would probably scale better.
With socket.io, for every connected client you need some sort of object on the server paired up with that one client. It will use more memory on the server for no reason if permanent connections are unnecessary or unwanted.
Also, AJAX will be better if you want to re-use your code with other systems, it is working with a huge existing ecosystem of tools.
That being said, if you need WebSocket features like pushing the data to the client or doing some kind of broadcast, you might consider using socket.io instead of AJAX since it will be hard to achieve this with AJAX and socket.io offers those features.
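The "can't broadcast" problem from the question also has a middle ground: keep the REST route for writes, but let the handler close over the socket.io server object (io) so it can still broadcast. A sketch, where io, saveAnimal, and the route shape are assumptions taken from the question:

```javascript
// Build an Express-style POST handler that saves via REST and
// broadcasts the result to every connected socket.io client.
function makeAnimalPostRoute(io, saveAnimal) {
  return function (req, res) {
    const written = saveAnimal(req.body);
    io.emit("animals:write", written); // broadcast to all connected clients
    res.status(201).send(written);     // normal HTTP response to the writer
  };
}

// wiring (assumed): app.post("/animals", makeAnimalPostRoute(io, saveAnimal));
```

This keeps reads and writes cacheable, testable HTTP while still giving every client a real-time notification of the change.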
