Node async race condition - javascript

I have a route in Node that:
Accepts a user ID
Gets the user from redis
Updates a property on the user
Saves the user back to redis
As redis uses async methods to get and save, if another request comes in for the same user, I get stale results.
What is the best pattern to make sure the second request doesn't process until the first is finished? Using sync versions of get and set seems wrong since they block, although I don't think it would have any noticeable effect in my application.
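One common pattern (when a single Node process is enough) is to serialize the get/update/save steps per user with an in-memory promise chain. Below is a minimal sketch, assuming an Express app, a promise-based redis client (redisClient), and an illustrative user:<id> key layout; across multiple Node processes you'd need something like Redis WATCH/MULTI or a distributed lock instead.
const chains = new Map(); // userId -> tail of that user's promise chain

function withUserLock(userId, task) {
  const tail = chains.get(userId) || Promise.resolve();
  const run = tail.then(task, task);      // start the task only after the previous one settles
  const newTail = run.catch(() => {});    // keep the chain alive even if the task throws
  chains.set(userId, newTail);
  newTail.then(() => {
    if (chains.get(userId) === newTail) chains.delete(userId); // clean up once the user is idle
  });
  return run;
}

app.post('/users/:id', (req, res) => {
  withUserLock(req.params.id, async () => {
    const user = JSON.parse(await redisClient.get('user:' + req.params.id));
    user.lastSeen = Date.now();            // update a property
    await redisClient.set('user:' + req.params.id, JSON.stringify(user));
    return user;
  }).then(
    user => res.json(user),
    err => res.status(500).send(err.message)
  );
});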

Related

Which event is triggered when we use ref.push to write in firebase database javascript

When working in Firebase using JavaScript, which event is triggered after we insert data using ref.push or ref.set?
I want to know whether my data was inserted or not.
I also want to throw an error when the user has disconnected from the internet while inserting data into Firebase.
I haven't seen any function or method on the internet that tells me whether the data was successfully inserted or not.
These functions are Promise-based, so you can use try/catch with await (inside an async function):
try {
  await ref.push(data) // or ref.set(data)
} catch (error) {
  console.log(error) // a rejected promise ends up here
}
The Firebase Realtime Database doesn't consider a lack of internet connection an error condition. Instead it continues to work to its best ability in the given conditions.
When you perform a write operation (with set, push, update, or remove) while there is no internet connectivity:
The client first fires local events immediately, so that your app can update the UI for the new/updated data.
It then queues the write operation for delivery once the connection is restored.
Once the connection is restored, the client sends any pending write operations it has in the order in which the client performed them.
It then handles the response from the server, which (if the server rejects the operation because of security rules) may lead to firing more local events so that the app can put the UI back into the correct state.
And it then finally calls any completion listeners, and resolves or rejects the promise for the set(), push(), update(), or remove() method.
You'll note that there is no error raised at any point for a lack of an internet connection.
If you don't want to send any data to the local queue when the app has no internet connection, it's best to detect whether the Firebase client is connected to the server. You can do this by listening to the .info/connected pseudo-node. This covers more than just having an internet connection, by the way: it also covers cases where the internet connection works but the client can't reach Firebase. The best practice here is to use a "global" listener for this status, and disable the relevant UI elements if the client is not connected.
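With the namespaced (v8-style) web SDK, that listener could look roughly like this:
var connectedRef = firebase.database().ref('.info/connected');
connectedRef.on('value', function (snap) {
  if (snap.val() === true) {
    // connected to the Firebase backend: enable the save/submit UI
  } else {
    // not connected: disable the relevant UI elements
  }
});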

Right way of passing consistent data from DB to user without repeatedly querying

The database stores some data about the user which almost never changes. Sometimes the information might change, for example if the user wants to edit his name.
The data consists of each user's name, username, and his company data.
The first two are shown in his navigation bar at all times using EJS (e.g. "User_1 is logged in"); his company profile data is used when he needs to create an invoice.
My current way is to fetch the user data through middleware using router.use, so the extracted information is always available in all routes/views, for example:
router.use(function (req, res, next) { // this block of code runs as middleware on every route
  req.getConnection(function (err, conn) {
    if (err) {
      console.log(err);
      return next(new Error('MySQL error, check your query'));
    }
    var uid = req.user.id;
    conn.query('SELECT * FROM user_profile WHERE uid = ?', [uid], function (err, rows) {
      if (err) {
        console.log(err);
        return next(err); // next() takes a single error argument
      }
      res.locals.userData = rows; // expose the profile to every route/view
      return next();
    });
  });
});
I understand that this is not an optimal way of passing user profile data to every route/view, since it makes a new DB query every time the user navigates through the application.
What would be a better way of having this data available without repeating the same query in each route, yet having it re-fetched once the user changes a portion of this data, like his full name?
You've just stumbled into the world of "caching", welcome! Caching is a very popular choice for use cases like this, as well as many others. A cache is essentially somewhere to store data that you can get back much quicker than making a full DB query, or a file read, etc.
Before we go any further, it's worth considering your use case. If you're serving only a few users and have a low load on your service, caching might be over-engineering, and in fact making a DB request might be the simplest idea. Adding caching can add a lot of complexity to your code as things move forward: not enough to scare you, but enough to cause hard-to-trace bugs. So consider your service load for a moment. If it's not very high (say an internal application for somewhere you work, with only a few requests every few minutes), then just reading from the DB is probably not going to slow down a request too much, and reading from the DB is the simplest and probably best solution. However, if you're noticing that this DB request is slowing down your application's requests or making it harder to scale up, then caching might be for you.
A really popular approach for this would be to get something like "redis" which is a key-value database that holds everything in memory (RAM). Redis can sit as a service like MySQL and has a very basic query language. It is blindingly fast and can scale to enormous loads. If you're using Express, there are a number of NPM modules that help you access a redis instance. Simply push in your credentials and you can then make GET and SET requests (to get data or to set data).
In your example, you may wish to store a user's profile in JSON format against their user ID or username in redis. Then create a function called getUserProfile which takes the ID or username. It can look the profile up in redis; if it finds the record, it returns it to your main controller logic. If it does not, it looks it up in your MySQL database, saves it in redis, and then returns it to the controller logic (so it'll be able to get it from the cache next time).
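A rough sketch of that read-through pattern, assuming a promise-based redis client (node-redis v4 style) and a hypothetical queryUserProfileFromMysql helper:
async function getUserProfile(uid) {
  const cached = await redisClient.get('user_profile:' + uid);
  if (cached) return JSON.parse(cached);                 // cache hit

  const profile = await queryUserProfileFromMysql(uid);  // cache miss: fall back to MySQL
  await redisClient.set('user_profile:' + uid, JSON.stringify(profile), { EX: 3600 }); // optional 1h TTL
  return profile;
}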
Your next problem is a famously pesky one in computer science: "cache invalidation". In this case, when the user's profile updates, you want to "invalidate" your cache. One way of doing this is to update your cached version whenever the user updates their profile (or any other cached data). Alternatively, you could simply remove the cached version from redis; the next time it's requested through getUserProfile, it will be fetched fresh from the DB and then put into redis for next time.
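The removal variant is only a couple of lines wherever you handle the profile update (updateUserProfileInMysql is a hypothetical helper):
async function updateUserProfile(uid, changes) {
  await updateUserProfileInMysql(uid, changes);   // write the change to MySQL first
  await redisClient.del('user_profile:' + uid);   // drop the cached copy; getUserProfile refills it on the next request
}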
There are many other ways to approach this, but this will most likely solve your problem in the simplest way without too much overhead. It will also be easy to expand in the future!

firebase Is it possible to update a transaction on disconnect()

I have to maintain a maxCountOfConcurrentUsers value per day.
For this I was thinking of adding a transaction.
Currently I use something like this to remove the username from the online users:
rootScope.userPresenceRef.onDisconnect().remove();
Is it possible to have something like this?
rootScope.userPresenceRef.onDisconnect().transaction(function(count) {});
An onDisconnect() handler is implemented as a single write operation on the server, when it detects that the client has disconnected. At this point there is no way for the server to talk to the client anymore, so the write operation must consist purely of data that can be determined at the time the onDisconnect() handler is registered.
Since a transaction in Firebase requires communication between the client and the server, there is no way to run a transaction on disconnect. You will have to find a way to model the data without requiring it to be a transaction.
You can use Cloud Functions.
Structure your data as:
usersData -> uid -> status (online/offline)
Listen for changes to status in a function:
if the status changes to online, run a transaction to increase the count, otherwise decrease it.
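A rough sketch of that approach with Cloud Functions for Firebase (the usersData and stats paths are just illustrative):
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();

exports.trackConcurrentUsers = functions.database
  .ref('/usersData/{uid}/status')
  .onUpdate((change, context) => {
    const before = change.before.val();
    const after = change.after.val();
    if (before === after) return null;                 // status did not actually change

    const delta = after === 'online' ? 1 : -1;
    const day = new Date().toISOString().slice(0, 10); // e.g. "2024-05-01"
    return admin.database()
      .ref('/stats/' + day + '/concurrentUsers')
      .transaction(count => (count || 0) + delta);     // current count; keep a separate node if you need the daily peak
  });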

Changing a Meteor collection subscription for all clients

I am developing a webapp in which I'd need one client, associated with the admin, to trigger an event (e.g., a new value selected in a dropdown list) which in turn will tell all the other connected clients to change their subscription, possibly using a parameter, i.e., the newly selected value.
Something along the lines of
Template.bid.events
  "change .roles": (e, tpl) ->
    e.preventDefault()
    role = tpl.$("select[name='role']").val()
    Meteor.subscribe role
Of course this works for the current client only.
One way I thought of would be keeping a separate collection that points at the current collection to be used, so the clients can programmatically act on that. It feels cumbersome, though.
Is there a Meteor-way to achieve this?
Thanks
In meteor, whenever you have a problem that sounds like: "I need to synchronize data across clients", you should use a collection. I realize it seems like overkill just to send one piece of data, but I assure you it's currently the path of least resistance.
There are ways you can expose pseudo-collections which don't actually write to mongo, but for your use case that really sounds like overkill - new Mongo.Collection is the way to go.
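A minimal sketch of that collection-based approach in plain JavaScript (the Settings collection and the activeRole document are just illustrative names):
// shared code: a tiny collection that holds the admin's current selection
Settings = new Mongo.Collection('settings');

if (Meteor.isServer) {
  Meteor.publish('settings', function () {
    return Settings.find();
  });
}

function setActiveRole(role) {
  // call this from the admin's "change .roles" handler; in real code wrap it in a
  // Meteor method so the server can verify the caller really is the admin
  Settings.upsert('activeRole', { $set: { role: role } });
}

if (Meteor.isClient) {
  Meteor.subscribe('settings');

  // subscriptions started inside an autorun are stopped and replaced
  // automatically whenever the computation reruns
  Tracker.autorun(function () {
    var doc = Settings.findOne('activeRole');
    if (doc && doc.role) Meteor.subscribe(doc.role);
  });
}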
You can use streams to set up a simple line of communication between connected clients and the server. It doesn't store data in MongoDB. Just let all connected clients listen to a stream and switch subscriptions when a new message comes in with the subscription name. Make sure only the client associated with your admin can push messages to the stream.
Available package: https://atmospherejs.com/lepozepo/streams
Examples: http://arunoda.github.io/meteor-streams/

Callback which fires on every user action

I would like to implement a timeout functionality in an AngularJS web app. Whenever a user does anything, a callback sets the timer to 10 minutes. When this timer expires on the client-side, a timeout request is sent to the server.
How can I implement a callback which fires on every user action? Maybe register onclick and onkeydown listeners on the whole page/window?
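In plain DOM terms, that suggestion might look roughly like this (notifyServerOfTimeout stands in for the actual timeout request):
var idleTimer;

function resetIdleTimer() {
  clearTimeout(idleTimer);
  idleTimer = setTimeout(notifyServerOfTimeout, 10 * 60 * 1000); // 10 minutes
}

// capture phase, so handlers lower in the tree can't stop these from firing
['click', 'keydown', 'mousemove', 'scroll'].forEach(function (evt) {
  window.addEventListener(evt, resetIdleTimer, true);
});

resetIdleTimer(); // start the first countdown on page load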
According to the angular docs for $rootScope:
If you want to be notified whenever $digest() is called, you can
register a watchExpression function with $watch() with no listener.
I am not sure if $digest is called on every user action but it might be a good place to start.
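As a rough sketch of that idea (assuming an existing app module, and with notifyServerOfTimeout as a placeholder for the actual request):
angular.module('app').run(function ($rootScope, $timeout) {
  var timer;

  // a watchExpression registered with no listener runs on every $digest cycle
  $rootScope.$watch(function () {
    if (timer) $timeout.cancel(timer);
    timer = $timeout(notifyServerOfTimeout, 10 * 60 * 1000); // 10 minutes
  });
});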
FYI - How I implement a User time-out is as follows:
Authenticate the user via a server-side API
Have the API store the user in a cache with sliding expiration and pass back a session token
Have each secured API endpoint require a session token and use it to fetch the user from the cache, thus resetting the expiration timer
If the user does not exist in the cache, return a 403 Forbidden error which your client code handles, presumably by sending the user back to the login page. Actually I return a custom 403 that has a specific 'User Timeout' code and message so my client can handle the time-out gracefully.
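On the Node side, the last three steps might look roughly like this as Express middleware, assuming some cache client with per-key TTLs (the cache.get/cache.expire helpers, the sessions: prefix, and the header name are illustrative; with redis they would map to GET and EXPIRE):
async function requireSession(req, res, next) {
  const token = req.get('X-Session-Token');              // however your client sends the session token
  const user = token && await cache.get('sessions:' + token);
  if (!user) {
    // custom 403 so the client can tell a time-out apart from other auth failures
    return res.status(403).json({ code: 'USER_TIMEOUT', message: 'Your session has expired' });
  }
  await cache.expire('sessions:' + token, 10 * 60);       // sliding expiration: reset the TTL on every request
  req.user = user;
  next();
}

app.get('/api/secure-stuff', requireSession, (req, res) => {
  res.json({ user: req.user.name });
});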
This should work fine for most Single Page Apps, because pretty much anything the user does causes a state change, and most state changes involve a call to the server to fetch or save something. It misses out on minor user actions, but I have found that real-world use of this technique has sufficient resolution to be effective.
Looks like hackedbychinese.github.io/ng-idle does what I need
