How to check if database has been updated - javascript

I have this React app that periodically makes a fetch request to a database to update a list.
Is there a way to do this without using a timing loop?

It depends on the technologies you are using.
The first thing to consider is when the notification is created. If it really is created after a database change, the DB may need to trigger an event for it. Some databases offer this service, including Postgres, MySQL, and Firebase Realtime Database.
Once you have identified the event, the best case is a bidirectional connection between client and server, where the server can emit an event to the client. Websockets are the common approach here, but sometimes you just need to emit a notification, and then, using Firebase Cloud Messaging, you can skip running a websocket server to handle this.
Otherwise, the only way is long polling.
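As a sketch of the long-polling fallback (the endpoint name `/api/updates` and its `{ version, items }` response shape are assumptions, not a real API), the client keeps one request in flight and re-issues it as soon as the server answers:

```javascript
// Long-poll loop: the server is expected to hold the request open until
// there is new data (or a timeout), then answer with { version, items }.
async function pollUpdates(onUpdate, fetchFn = fetch) {
  let since = 0;
  for (;;) {
    try {
      const res = await fetchFn(`/api/updates?since=${since}`);
      const { version, items } = await res.json();
      if (version > since) {
        since = version;
        // Returning false from the callback stops the loop (handy for cleanup).
        if (onUpdate(items) === false) return since;
      }
    } catch {
      // Network hiccup: back off briefly before re-polling.
      await new Promise(r => setTimeout(r, 2000));
    }
  }
}
```

Compared with a plain timing loop, the server decides when to answer, so there is at most one open request per client instead of one request per tick.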

Related

Can I force react to rerender when other user is inserting data into database?

I would like to create a simple forum where users can add/delete/modify posts stored in a database. I have a simple auth app: client (React app) - server (Express). Is it possible to force the client side to re-render for one user when a change is made by another user while both are logged in?
It should be simple enough. When a logged in user is on a page that you want to keep synced, open a websocket to the server. Then, when any user makes a change, the server can update its database, and then once that's finished, it can send a websocket message to all users with connected sockets on that page, containing the new data to render.
In the client-side React, create the socket and subscribe to its message event. When a message is sent from the server, call the state setter with the new data.
This mechanism is somewhat similar to how Stack Overflow pushes "An edit has been made to this post" notifications when a post gets edited - anyone looking at the page will see such server-sent notifications through the websocket connection to Stack Exchange.
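The server-side "send to all connected sockets on that page" step can be sketched like this (the `page` tag on each socket and the message shape are my own assumptions; `readyState === 1` is the standard OPEN state for both `ws` and browser WebSocket objects):

```javascript
// Broadcast new data to every open socket watching the given page.
// Each socket is assumed to have been tagged with a `page` when it connected.
function broadcastToPage(sockets, page, payload) {
  const message = JSON.stringify({ type: 'update', page, payload });
  let sent = 0;
  for (const socket of sockets) {
    if (socket.page === page && socket.readyState === 1 /* OPEN */) {
      socket.send(message);
      sent += 1;
    }
  }
  return sent; // how many clients were notified
}
```

On the React side, the client would open the socket in a `useEffect`, listen for `message` events, parse the JSON, and call the state setter with `payload` to trigger the re-render.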
Websockets are one approach you can follow. If you think this would be complex to implement, you can poll for the data every minute or so.
This is a very useful library tailored for React:
useQuery- https://tanstack.com/query/v4/docs/reference/useQuery?from=reactQueryV3&original=https://react-query-v3.tanstack.com/reference/useQuery
You can use it for polling/caching and making regular network calls as well. Lots of utility is provided that you can leverage, especially for the use case you seem to be tackling.
There is a slight learning curve, but it's worth it if you're a React developer.
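With React Query this polling is just the refetchInterval option on useQuery; a hand-rolled sketch of the same polling-plus-cache idea (the names here are made up for illustration, not React Query's API) looks like:

```javascript
// Minimal polling + cache, similar in spirit to useQuery's refetchInterval:
// the latest result is cached and pushed to every subscriber on each tick.
function createPoller(queryFn, { intervalMs = 60_000 } = {}) {
  let cache;
  let timer;
  const subscribers = new Set();
  async function tick() {
    cache = await queryFn();
    subscribers.forEach(fn => fn(cache));
  }
  return {
    async start() { await tick(); timer = setInterval(tick, intervalMs); },
    stop() { clearInterval(timer); },
    subscribe(fn) { subscribers.add(fn); if (cache !== undefined) fn(cache); },
    get data() { return cache; },
  };
}
```

React Query adds retries, deduplication, and stale-while-revalidate on top of this, which is why the library is usually worth it over rolling your own.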

How to reduce amount of data being transmitted in Firebase Realtime Database?

I am using the Google Cloud Firebase Realtime database to store messages. The messages are saved in
users/$userID/messages/$topic[0..n]/message[0..n]
I am using the official JS library AngularFire. I am listening on new messages via the following code:
this.observable = this.db.list(`users/${user.uid}/topics/${topic}/`).valueChanges();
I can now subscribe to the observable. Imagine the user has 1 million messages in a given topic. Whenever I add a new message, I receive the 1 million messages in the callback.
My question is, how much data is actually transferred behind the scenes if I modify or add a new message? I know the library keeps a local copy of the database.
On top, how do I find out which message got modified? Or do I need to figure this out myself?
If you have an existing listener, and a new message is added, only that new message is sent to that client. You can easily verify this for yourself by looking at the web socket traffic in the network tab of your browser's developer tools.
But I would recommend using a query with a limit to reduce the number of messages retrieved, as it seems unlikely any user will read 1 million messages, and it's wasteful to retrieve (much) more data than the user will see.
Based on the AngularFire documentation on querying lists, that should be something like:
this.db.list(`users/${user.uid}/topics/${topic}/`,
  ref => ref.orderByKey().limitToLast(20)
).valueChanges();
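As for finding out which message was modified: valueChanges() only gives you the whole array, so you would have to diff it yourself, as in this plain-JS sketch (it assumes each message object carries a stable id field; AngularFire's snapshotChanges()/stateChanges() can give you per-child events instead, so treat this as a fallback):

```javascript
// Diff two snapshots of the message list, assuming each message has a stable `id`.
function diffMessages(prevList, nextList) {
  const prev = new Map(prevList.map(m => [m.id, m]));
  const next = new Map(nextList.map(m => [m.id, m]));
  const added = [], changed = [], removed = [];
  for (const [id, msg] of next) {
    if (!prev.has(id)) added.push(id);
    else if (JSON.stringify(prev.get(id)) !== JSON.stringify(msg)) changed.push(id);
  }
  for (const id of prev.keys()) {
    if (!next.has(id)) removed.push(id);
  }
  return { added, changed, removed };
}
```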

How to handle offline messages in React-Native using NodeJS and SocketIO

I am currently using SocketIO and NodeJS to handle messages. However, the problem is that when a user goes offline, there is no way for the other user to receive the message.
The solution I came up with was to store the message in the database.
But a new problem arises when fetching the messages and pushing notifications.
If I fetch every n minutes from the server while the app is in the background/inactive, there will be a lot of requests to the server, which I personally think is inefficient, and it also drains the battery.
What is the proper way to fetch messages from the database or push notifications in the app without making too many requests every n minutes and draining too much power?
You need to save the last sync time in the app. Whenever the app comes back from the background/inactive state, call an API with this time. The API returns all the messages and push notifications that have arrived since the last sync. This way, with one API call, you get all the messages and push notifications. I used this approach to sync data in one of my apps.
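A sketch of that sync endpoint (the store interface and field names are assumptions for illustration): the client sends its last sync time, and the server returns only what arrived after it, plus the new sync time to store:

```javascript
// Express-style handler: GET /sync?userId=...&since=<ms timestamp>
// Returns everything (messages and notifications) created after `since`.
function makeSyncHandler(store) {
  return async (req, res) => {
    const since = Number(req.query.since) || 0;
    const all = await store.itemsForUser(req.query.userId);
    res.json({
      items: all.filter(item => item.createdAt > since),
      syncedAt: Date.now(), // client stores this as its new last-sync time
    });
  };
}
```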
My suggestion is to implement a system of background jobs in the API, checking when there is a new notification to be dispatched, or keeping the already-prepared notification waiting in a queue. You can look at queue managers like Bull or Bee-Queue.
To launch push notification in the closed/inactive app, you can use a service like OneSignal or Firebase.
I implemented this a few weeks ago and did it this way.
API = Node.js, Bull Queue
App = React Native, OneSignal
Coming back to this question in case somebody stumbles upon it.
The best way to handle offline messages, regardless of whether you are using NodeJS/MongoDB, etc., is to store them in the server's database. Then call an API that fetches the messages for the user's ID whenever the mobile app comes to the foreground.
If your problem is that you need notifications and you are using
react-native-push-notifications / react-native-push-notification-ios
then you should use a data notification to include the message in the notification parameters on the server side (assuming you are using Firebase Cloud Messaging). This way you can save the message directly to the mobile device's database.
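Building that data notification might look like this; the field names are made up for illustration, but the token/data shape matches what firebase-admin's messaging().send() expects, and FCM requires every data value to be a string:

```javascript
// Build a data-only FCM message carrying the chat message itself, so the
// app (foreground or background) can persist it to the local database.
function buildDataMessage(deviceToken, message) {
  return {
    token: deviceToken,
    data: {
      type: 'chat_message',
      messageId: String(message.id),
      from: String(message.from),
      body: String(message.body),
      sentAt: String(message.sentAt),
    },
  };
}
// Server side: admin.messaging().send(buildDataMessage(token, msg));
```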

Sending HTTP response after consuming a Kafka topic

I’m currently writing a web application that has a bunch of microservices. I’m currently exploring how to properly communicate between all these services and I’ve decided to stick with a message bus, or more specifically Apache Kafka.
However, I have a few questions that I’m not sure how to conceptually get around.
I’m using an API Gateway-service as the main entry to the application. It acts as the main proxy to forward operations to the applicable microservices.
Consider the following scenario:
User sends a POST-request to the API Gateway with some information.
The Gateway produces a new message and publishes it to a Kafka topic.
Subscribed microservices pick up the message in the topic and processes the data.
So, how am I now supposed to respond to the client from the Gateway? What if I need some data from that microservice? It feels like that HTTP request could time out. Should I stick with websockets between the client and the API Gateway instead?
And also, if the client sends a GET request to fetch some data, how am I supposed to approach that using Kafka?
Thanks.
Let's say you're going to create an order. This is how it should work:
Traditionally we used to have an auto-increment field or a sequence in the RDBMS table to create an order id. However, this means the order id is not generated until we save the order in the DB. When writing to Kafka, we are not immediately writing to the DB, and Kafka cannot generate the order id. Hence you need a scalable id-generation utility like Twitter Snowflake, or something with a similar architecture, so that you can generate an order id even before writing the order to Kafka.
Once you have the order id, write a single event message to a Kafka topic atomically (all-or-nothing). Once this succeeds, you can send a success response back to the client. Do not write to multiple topics at this stage, as you would lose atomicity. You can always have multiple consumer groups that propagate the event to other topics; one consumer group should write the data to a persistent DB for querying.
You now need to address read-your-own-writes: immediately after receiving the success response, the user will want to see the order, but your DB is probably not yet updated with the order data. To achieve this, write the order data to a distributed cache like Redis or Memcached immediately after writing it to Kafka and before returning the success response. When the user reads the order, the cached data is returned.
Now you need to keep the cache updated with the latest order status. You can always do that with a Kafka consumer reading the order status from a Kafka topic.
To avoid keeping all orders in cache memory, you can evict data based on LRU. If, while reading an order, the data is not in the cache, it is read from the DB and written to the cache for future requests.
Finally, if you want to ensure that the ordered item is reserved for the order so that no one else can take it, like booking a flight seat or the last copy of a book, you need a consensus mechanism. You can use Apache Zookeeper for that and create a distributed lock on the item.
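The id-generation step above can be sketched as a Snowflake-style generator; the bit layout (41-bit timestamp, 10-bit worker id, 12-bit sequence) follows the original Twitter design, while the epoch constant is arbitrary:

```javascript
// Snowflake-style ids: time-ordered and unique across workers, generated
// without touching the database. 41 bits of millisecond timestamp,
// 10 bits of worker id, 12 bits of per-millisecond sequence.
function createSnowflake(workerId, epochMs = 1_600_000_000_000) {
  let lastTs = -1;
  let seq = 0;
  return function nextId() {
    let ts = Date.now();
    if (ts === lastTs) {
      seq = (seq + 1) & 0xfff;                 // 12-bit sequence
      if (seq === 0) {
        while (ts <= lastTs) ts = Date.now();  // sequence exhausted: wait for next ms
      }
    } else {
      seq = 0;
    }
    lastTs = ts;
    return (BigInt(ts - epochMs) << 22n) | (BigInt(workerId & 0x3ff) << 12n) | BigInt(seq);
  };
}
```

Each gateway instance gets its own workerId, so ids never collide across instances and still sort roughly by creation time.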
Do you have an option to create more endpoints in the gateway?
I would have the POST endpoint dedicated just to producing the message to the Kafka topic, which the other microservice consumes. As the returned object from the endpoint, it would contain some sort of reference or id for checking the status of the message.
Then create another GET endpoint in the gateway where you can retrieve the status of the message using the reference you got when you created it.

AngularJS and MySQL real-time communication

I have built a web application using AngularJS (front-end) and PHP/MySQL (back-end).
I was wondering if there is a way to "watch" the MySQL database (without Node.js), so if one user adds some data to it, the changes are synced to other users too.
E.g. I know Firebase does that, but it's object oriented database and I am unable to do the advanced queries there like I do with SQL.
I was thinking of using $interval and $http to do AJAX requests, so that way I could detect changes in the database. Well, that's possible, but it would then make thousands of HTTP requests to the server every day, plus interpret PHP on each request.
I believe nothing is impossible, I just need an idea to do this, which I don't have, so that's why I am asking for a help here.
If you want a form of "real-time communication" you'll likely have to incorporate some form of long-polling from the client, unless you use web sockets, but that's a big topic covering a bunch of different things. You're right to be concerned about bandwidth and load on the DB, though. So here's my suggestion:
If you don't have experience with web sockets, then log your events in a separate table/view and use the pub/sub method: subscribe entities to an event and broadcast that event to the table. Then long-poll against the watcher view to see whether changes may have occurred. If one did, query for the exact value.
Another option would be to use some queue system with "deciders" that hold messages. Take a look at Amazon's SQS platform for a better explanation of how this could work. Basically you have a queue that holds messages, and a decider chooses where to store each message using some hash or sorting method (to reduce run time). When the client requests an update, the decider finds any messages that apply based on the hash/sort and returns them. Then you just have to decide how and when to destroy the messages.
The second option would require a lot more tinkering, though, so it's really about your preference. I think the difficulty you'll find is that most solutions have to deal with the fact that the message has to be delivered one or more times, and you'll need to track when someone received the message and whether it can now be deleted from the queue/event table or whether you still need to wait. Otherwise you'll consume a lot of memory.
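The "long-poll against the watcher view" part can be sketched as a request handler that holds the connection until the event log moves past the client's last-seen id (latestEventId here stands in for a cheap query against the events table; the timeouts are illustrative):

```javascript
// Hold the request open until a newer event id appears, or time out.
// `latestEventId` would be a query like SELECT MAX(id) FROM events.
async function waitForChange(latestEventId, since, { timeoutMs = 25_000, checkMs = 500 } = {}) {
  const deadline = Date.now() + timeoutMs;
  for (;;) {
    const latest = await latestEventId();
    if (latest > since) return latest;        // client now fetches the exact rows
    if (Date.now() >= deadline) return since; // timed out: client simply re-polls
    await new Promise(r => setTimeout(r, checkMs));
  }
}
```

This keeps the per-client cost to one open request and one cheap query every checkMs, instead of a full data fetch on every $interval tick.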
