How to reduce the amount of data being transmitted in the Firebase Realtime Database? - javascript

I am using the Google Firebase Realtime Database to store messages. The messages are saved in
users/$userID/messages/$topic[0..n]/message[0..n]
I am using the official JS library AngularFire. I am listening for new messages with the following code:
this.observable = this.db.list(`users/${user.uid}/topics/${topic}/`).valueChanges();
I can now subscribe to the observable. Imagine the user has 1 million messages in a given topic. Whenever I add a new message, I receive all 1 million messages in the callback.
My question is: how much data is actually transferred behind the scenes if I modify or add a new message? I know the library keeps a local copy of the database.
On top of that, how do I find out which message was modified? Or do I need to figure this out myself?

If you have an existing listener and a new message is added, only that new message is sent to the client. You can easily verify this for yourself by looking at the WebSocket traffic in the network tab of your browser's developer tools.
But I would recommend using a query with a limit to reduce the number of messages retrieved, as it seems unlikely any user will read 1 million messages, and it's wasteful to retrieve (much) more data than the user will see.
Based on the AngularFire documentation on querying lists, that should be something like:
this.db.list(`users/${user.uid}/topics/${topic}/`,
  ref => ref.orderByKey().limitToLast(20)
).valueChanges();
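As for finding out which message changed: AngularFire can tell you that too. Instead of valueChanges(), which re-emits the whole array, stateChanges() emits one SnapshotAction per child event. A minimal sketch, using the same path as above:

// Emits one action per child event instead of re-emitting the whole list.
this.db.list(`users/${user.uid}/topics/${topic}/`)
  .stateChanges()
  .subscribe(action => {
    console.log(action.type);          // 'child_added', 'child_changed', 'child_removed'
    console.log(action.key);           // key of the affected message
    console.log(action.payload.val()); // the message data itself
  });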

Related

Can I force React to re-render when another user inserts data into the database?

I would like to create a simple forum where users can add/delete/modify posts stored in a database. I have a simple auth app: client (React app) - server (Express). Is it possible to force the client side to re-render for one user when a change is made by another user while both are logged in?
It should be simple enough. When a logged-in user is on a page that you want to keep synced, open a WebSocket to the server. Then, when any user makes a change, the server can update its database and, once that's finished, send a WebSocket message to all users with connected sockets on that page, containing the new data to render.
In the client-side React code, create the socket and subscribe to its message event. When a message arrives from the server, call the state setter with the new data, as sketched below.
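A minimal sketch of the client side, assuming a plain browser WebSocket and that the server broadcasts the updated post list as JSON (the ws://localhost:3000 URL and the post shape are illustrative, not from the question):

import { useEffect, useState } from 'react';

function Forum() {
  const [posts, setPosts] = useState([]);

  useEffect(() => {
    // Hypothetical endpoint; point this at your Express server's WebSocket.
    const socket = new WebSocket('ws://localhost:3000');
    socket.addEventListener('message', event => {
      // The server sends the updated post list as JSON after any change.
      setPosts(JSON.parse(event.data));
    });
    return () => socket.close(); // clean up on unmount
  }, []);

  return <ul>{posts.map(p => <li key={p.id}>{p.text}</li>)}</ul>;
}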
This mechanism is somewhat similar to how Stack Overflow pushes "An edit has been made to this post" notifications when a post gets edited: anyone looking at the page receives such server-sent notifications through the WebSocket connection to Stack Exchange.
WebSockets are one approach you can follow. If you think this would be complex to implement, you can poll for the data every minute or so.
There is a very useful library tailored for React:
useQuery - https://tanstack.com/query/v4/docs/reference/useQuery?from=reactQueryV3&original=https://react-query-v3.tanstack.com/reference/useQuery
You can use it for polling/caching as well as for making regular network calls. It provides lots of utility that you can leverage, especially for the use case you seem to be tackling.
There is a slight learning curve, but it's worth it if you're a React developer.
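For instance, polling every minute with useQuery's refetchInterval option. A sketch assuming React Query v4, an app already wrapped in a QueryClientProvider, and an illustrative /api/posts endpoint:

import { useQuery } from '@tanstack/react-query';

const fetchPosts = () => fetch('/api/posts').then(res => res.json());

function Posts() {
  // Refetch every 60 seconds; React Query also caches and dedupes requests.
  const { data, isLoading } = useQuery(['posts'], fetchPosts, {
    refetchInterval: 60000,
  });

  if (isLoading) return <p>Loading...</p>;
  return <ul>{data.map(p => <li key={p.id}>{p.text}</li>)}</ul>;
}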

Discord Bot SQL Query Efficiency

Long story short, I have been developing a Discord bot that queries the database every time a message is sent in a server. It then performs an action depending on the message, etc. The query is asynchronous, so it does not block other messages from being handled.
However, in terms of scalability, I do not believe querying the database every time a message is sent is very fast, and it could become a problem. Is there a better solution? I am unaware of a way to store data within a particular Discord server, which would likely solve my issue.
My main idea is to keep data on the heap: the most recently active servers (i.e. those that sent messages recently) have their data loaded into memory, and when they become inactive it is removed. Is this a good solution? Or is it better to just keep querying every time?
You could create a cache, and every time you fetch something from or insert something into your database, write it into the cache as well.
Then, whenever you need some data, check whether it's in the cache first; if not, get it from the database and store it in the cache right after.
This prevents unnecessary database access, because the database is only hit when your bot does not have the required data stored locally.
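A minimal sketch of that read-through pattern, assuming hypothetical getGuildConfigFromDb()/saveGuildConfigToDb() database helpers:

const cache = new Map();

// Return the guild's data from the cache, falling back to the database.
async function getGuildConfig(guildId) {
  if (cache.has(guildId)) return cache.get(guildId);
  const config = await getGuildConfigFromDb(guildId); // hypothetical DB read
  cache.set(guildId, config);
  return config;
}

// Keep the cache in sync whenever the bot writes to the database.
async function updateGuildConfig(guildId, config) {
  await saveGuildConfigToDb(guildId, config); // hypothetical DB write
  cache.set(guildId, config);
}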
Note:
The cache will only be cleared when you restart the bot, but of course you can also evict entries after a certain amount of time or on other triggers.
If you need an example, you can take a look at my guildMemberAdd event and the corresponding config command.

How to schedule push notifications for React Native Expo?

I am trying to send push notifications to a user at a scheduled time. Say they set the date for an event and want to be notified 30 minutes before; that is when I would like to send them a notification. I am using Firebase as my backend, and the project is built with Expo.
I am curious how I would use Expo's notification system if I am using Firebase Cloud Messaging, because it says I need separate permission from Firebase (I already have the Expo token for each user). I have looked into node-cron/node-schedule and also react-native-push-notification, but I am unsure which would be the best solution and where I would deploy it (such as running a cloud function).
I assume I need some type of function that takes the token, message body, title, and date, and then schedules the notification either on the server or locally. That function would be called when the user presses the button to receive the notification. They can also change the date of the event, so the schedule would need to be updated if they do that.
Any advice would be greatly appreciated, as I have been researching this for days and am still unsure of the best approach.
One possible approach:
In your backend, schedule a cron job that runs every minute (or every 15 seconds) and checks the database for events that start within the next 30 minutes.
Once you have the events, find the users registered for those events and collect their user ids.
Since you mention you have already stored the tokens, I assume they exist in some table keyed by user id (e.g. a mapping of user id to tokens). Look up this table to fetch the tokens of those users.
Prepare the notification payload and call Firebase messaging to send the notification to those tokens. For example, at this point you can call the sendToDevice() function from the Firebase Admin SDK: firebase.messaging().sendToDevice(tokens, payload);
You can implement these steps in your backend (e.g. Node.js), or you can deploy a cloud function for them and set up scheduling for that cloud function. A sketch of the cron job follows.
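A minimal sketch of those steps, assuming node-cron, the Firebase Admin SDK, and hypothetical getEventsStartingWithin()/getTokensForUsers() database helpers:

const cron = require('node-cron');
const admin = require('firebase-admin');

admin.initializeApp();

// Run every minute.
cron.schedule('* * * * *', async () => {
  // Hypothetical helper: events starting within the next 30 minutes that
  // have not been notified yet (track a flag so you don't notify twice).
  const events = await getEventsStartingWithin(30);
  for (const event of events) {
    const tokens = await getTokensForUsers(event.userIds); // hypothetical helper
    const payload = {
      notification: {
        title: event.title,
        body: 'Your event starts in 30 minutes',
      },
    };
    await admin.messaging().sendToDevice(tokens, payload);
  }
});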
Let me know if you need any further help!

Sending HTTP response after consuming a Kafka topic

I’m currently writing a web application that has a bunch of microservices. I’m exploring how to properly communicate between all these services, and I’ve decided to stick with a message bus, or more specifically Apache Kafka.
However, I have a few questions that I’m not sure how to conceptually get around.
I’m using an API Gateway service as the main entry point to the application. It acts as the main proxy that forwards operations to the applicable microservices.
Consider the following scenario:
User sends a POST request to the API Gateway with some information.
The Gateway produces a new message and publishes it to a Kafka topic.
Subscribed microservices pick up the message from the topic and process the data.
So, how am I now supposed to respond to the client from the Gateway? What if I need some data from that microservice? It feels like the HTTP request could time out. Should I stick with WebSockets between the client and the API Gateway instead?
Also, if the client sends a GET request to fetch some data, how am I supposed to handle that using Kafka?
Thanks.
Let's say you're going to create an order. This is how it should work:
Traditionally we would use an auto-increment field or a sequence in the RDBMS table to create an order id. However, this means the order id is not generated until we save the order in the DB. When writing data to Kafka, we're not immediately writing to the DB, and Kafka cannot generate the order id. Hence you need a scalable id-generation utility like Twitter Snowflake, or something with a similar architecture, so that you can generate an order id even before writing the order to Kafka.
Once you have the order id, write a single event message to a Kafka topic atomically (all-or-nothing). Once this succeeds, you can send a success response back to the client. Do not write to multiple topics at this stage, as you'll lose atomicity. You can always have multiple consumer groups that propagate the event to other topics; one consumer group should write the data to some persistent DB for querying.
You now need to address read-your-own-writes: immediately after receiving the success response, the user will want to see the order, but your DB is probably not yet updated with the order data. To achieve this, write the order data to a distributed cache like Redis or Memcached immediately after writing it to Kafka and before returning the success response. When the user reads the order, the cached data is returned (steps 1-3 are sketched in code after this list).
Now you need to keep the cache updated with the latest order status. You can do that with a Kafka consumer that reads the order status from a Kafka topic.
You don't need to keep all orders in cache memory; you can evict data based on LRU. If, while reading an order, the data is not in the cache, it is read from the DB and written to the cache for future requests.
Finally, if you want to ensure that the ordered item is reserved for the order so that no one else can take it, like booking a flight seat or the last copy of a book, you need a consensus algorithm. You can use Apache ZooKeeper for that and create a distributed lock on the item.
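A minimal sketch of steps 1-3 in a Node gateway handler, assuming kafkajs, ioredis, and a hypothetical generateSnowflakeId() helper:

const { Kafka } = require('kafkajs');
const Redis = require('ioredis');

const kafka = new Kafka({ brokers: ['localhost:9092'] });
const producer = kafka.producer();
const redis = new Redis();

producer.connect(); // connect once at startup (await this in real code)

async function createOrder(req, res) {
  // 1. Generate the id up front (hypothetical Snowflake-style generator).
  const orderId = generateSnowflakeId();
  const order = { orderId, ...req.body, status: 'PENDING' };

  // 2. Write a single event atomically to one topic.
  await producer.send({
    topic: 'orders',
    messages: [{ key: orderId, value: JSON.stringify(order) }],
  });

  // 3. Cache the order before responding, so the user can read their own write.
  await redis.set(`order:${orderId}`, JSON.stringify(order));

  res.status(202).json({ orderId });
}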
Do you have the option to create more endpoints in the gateway?
I would have the POST endpoint dedicated just to producing the message to the Kafka topic that the other microservice will consume. As the object returned from the endpoint, include some sort of reference or id for getting the status of the message.
Then create another GET endpoint in the gateway where you can retrieve the status of the message using the reference you got when you created it. A sketch of both endpoints follows.
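A sketch of those two gateway endpoints with Express, reusing the producer and redis handles from the previous snippet (a consumer elsewhere would flip the status once the message is processed):

const express = require('express');
const { randomUUID } = require('crypto');

const app = express();
app.use(express.json());

// POST: produce the message and hand back a reference id immediately.
app.post('/messages', async (req, res) => {
  const id = randomUUID();
  await redis.set(`status:${id}`, 'PENDING');
  await producer.send({
    topic: 'messages',
    messages: [{ key: id, value: JSON.stringify(req.body) }],
  });
  res.status(202).json({ id });
});

// GET: report the current status for that reference id.
app.get('/messages/:id/status', async (req, res) => {
  const status = await redis.get(`status:${req.params.id}`);
  if (!status) return res.sendStatus(404);
  res.json({ id: req.params.id, status });
});

app.listen(3000);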

How to show real-time data to all users using React and Firebase?

I am building a messaging app that updates in real time. So far I can log in with Google, post a message, and see that message displayed on screen. However, if I log in via another Google account (the app is hosted on Heroku) and post a message as userB, then userA won't see this message on their screen until they refresh the page. What is the best way to update all screens in real time so people can actually have a conversation?
Every message is posted and stored in Firebase. My only solution so far uses the JavaScript setInterval method to pull from the database every 3-5 seconds. This worked, but it made the app very slow and laggy: a poor experience. Any pointers/tips are welcome.
You are using Firebase, and one of its main features is the realtime database. Firebase will automatically let you know when there is any change in your JSON database; you do not need to send requests on an interval.
You can refer to Zero to App: Develop with Firebase - Google I/O 2016, a messaging app demo by the Google team.
You can find the sample source code on GitHub for sending and receiving messages in real time.
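With the modern modular JS SDK, this boils down to attaching a listener once; Firebase then pushes every new message to every connected client (the messages path is illustrative):

import { getDatabase, ref, onChildAdded } from 'firebase/database';

const db = getDatabase();

// Fires once per existing message, then again for every new message,
// on every connected client - no polling needed.
onChildAdded(ref(db, 'messages'), snapshot => {
  const message = snapshot.val();
  // In React, append to state: setMessages(prev => [...prev, message]);
});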
There are a lot of ways to do this. Generally, you will want the server to notify you once a new message has come in, rather than pinging the server every X seconds.
You could look at these:
socket.io, and learn about WebSockets in general
A nice list of existing chat apps that utilize React
Google's Cloud Messaging; as you already use Firebase, this might be the way to go for you here.
This should lead you in the right direction.
