How should I manage in-memory data in Node? - javascript

I have a simple app built with Node, Express, and Socket.io on the server side. My page queries my API when it needs data that will not change, and uses WebSockets to get live updates from the server for dynamic data. The app allows a single person, the "Supervisor", to send questions to any number of unauthenticated "Users" and view their answers as they trickle in. The Users send their data to the server with a POST request, and it is streamed to the Supervisor over a WebSocket. The server stores the users in a simple array, and uses an ES6 Map from each user in that array to an object holding that user's questions and answers, like this:
class User {
  constructor(id) {
    this.id = id; // keep the socket id so the user can be found later
  }
}

let users = [], qa = new Map();

io.on('connection', socket => {
  let user = new User(socket.id);
  users.push(user);
  qa.set(user, {}); // one answers object per user

  socket.on('question-answered', ({id, answer}) => {
    let questionData = qa.get(user);
    questionData[id] = answer; // the object is mutated in place, so re-setting it is optional
    qa.set(user, questionData);
  });
});
This is obviously a very primitive way of handling data, but I don't see the need for additional complexity. The data doesn't need to persist across server crashes or restarts (the user's questions and answers are also stored in localStorage), and MongoDB and even Redis just seem like overkill for this kind of data.
So my question is, am I going about this the right way? Are there any points I'm missing? I just want a simple way to store data in memory and be able to access it through client-side GET requests and socket.io. Thank you for any help.

If an array and a Map give you the kind of access you need to the data, you don't need crash persistence, and you have enough memory to hold the amount of data involved, then you're done.
There is no need for more than that unless your needs (query, persistence, performance, multi-user, crash recovery, backup, etc...) require something more complicated. A simple cliche applies here: if it ain't broke, it don't need fixing.
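If it helps to see what "done" looks like, here is a minimal sketch of exposing that in-memory data through an Express GET route, assuming an app Express instance and that each User keeps its socket id as in the snippet above (the route path is made up for illustration):

// Hypothetical read-only endpoint over the in-memory users array and qa Map.
app.get('/api/answers/:socketId', (req, res) => {
  const user = users.find(u => u.id === req.params.socketId);
  if (!user) return res.status(404).json({ error: 'unknown user' });
  res.json(qa.get(user)); // plain object of question id -> answer
});

The Supervisor still gets live updates over the socket as before; the route just serves whatever is currently in memory.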

Related

How to process tracked information in Application Insights

I am using Application Insights to track events in my web pages:
appInsights.trackEvent("my-event", { test: true });
However, I can see that each entry in the log collects some info regarding several other things, like:
User Id
Session Id
Operation name
The last one is sensitive, as it can contain the name of the computer or other identifying details. In order to comply with the GDPR, I want to strip that information out of my log.
How do I tell Application Insights, to process the data before logging them? In my case, I would like to get access to the object which will be sent out by trackEvent and modify it before it is transmitted.
You can use TelemetryInitializers for that. They allow you to modify items before they are sent to Application Insights.
In your case it could be as simple as:
appInsights.queue.push(function () {
  appInsights.context.addTelemetryInitializer(function (envelope) {
    envelope.tags['ai.operation.name'] = 'xxx'; // overwrite the sensitive tag before it is sent
  });
});
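If the user and session identifiers mentioned in the question also need scrubbing, the same initializer can clear those tags as well. A sketch, assuming the classic ai.user.id / ai.session.id context tag names (verify the exact keys against the telemetry your SDK version actually emits):

appInsights.queue.push(function () {
  appInsights.context.addTelemetryInitializer(function (envelope) {
    // Assumed tag names; confirm them against your telemetry in the portal.
    delete envelope.tags['ai.user.id'];
    delete envelope.tags['ai.session.id'];
    envelope.tags['ai.operation.name'] = 'redacted';
  });
});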

Right way of passing consistent data from DB to user without repeatedly querying

The database stores some data about the user which almost never changes. Occasionally it might change, for example if the user wants to edit his name.
The data in question is each user's name, username, and his company details.
The first two are shown in his navigation bar at all times using EJS (e.g. "User_1 is logged in"), and the company profile data is shown when he needs to create an invoice.
My current approach is to fetch the user data in middleware registered with router.use, so the extracted information is available in every route/view, for example:
router.use(function(req, res, next) { // runs as middleware before every route
  req.getConnection(function(err, conn) {
    if (err) {
      console.log(err);
      return next(err); // next() accepts a single error argument
    }
    var uid = req.user.id; // declare uid locally instead of leaking a global
    conn.query('SELECT * FROM user_profile WHERE uid = ?', uid, function(err, rows) {
      if (err) {
        console.log(err);
        return next(err);
      }
      res.locals.userData = rows; // expose the profile to later routes and views
      return next();
    });
  });
});
I understand this is not an optimal way of passing user profile data to every route/view, since it runs a new DB query every time the user navigates through the application.
What would be a better way of having this data available without repeating the same query in each route, while still re-fetching it once the user changes a portion of this data, like his full name?
You've just stumbled into the world of "caching", welcome! Caching is a very popular choice for use cases like this, as well as many others. A cache is essentially somewhere to store data that you can get back much quicker than making a full DB query, or a file read, etc.
Before we go any further, it's worth considering your use case. If you're serving only a few users and have a low load on your service, caching might be over-engineering, and making a DB request might in fact be the simplest idea. Adding caching can add a lot of complexity to your code as things move forward; not enough to scare you, but enough to cause hard-to-trace bugs. So consider your service load for a moment: if it's not very high (say, an internal application for somewhere you work, with only a few requests every few minutes), then reading from the DB is probably not going to slow down a request much, and it is the simplest and probably best solution. However, if you're noticing that this DB request is slowing down your application or making it harder to scale up, then caching might be for you.
A really popular approach for this would be to use something like "redis", which is a key-value database that holds everything in memory (RAM). Redis runs as a service alongside MySQL and has a very basic query language. It is blindingly fast and can scale to enormous loads. If you're using Express, there are a number of NPM modules that help you access a redis instance: plug in your credentials and you can then make GET and SET requests (to read or write data).
In your example, you may wish to store a user's profile as JSON against their user id or username in redis. Then create a function called getUserProfile which takes the ID or username, looks it up in redis, and, if it finds the record, returns it to your main controller logic. If it does not, it looks it up in your MySQL database, saves it in redis, and then returns it to the controller logic (so it will come from the cache next time).
Your next issue is known for being a very pesky problem in computer science: "cache invalidation". In this case, if the user profile updates, you want to "invalidate" your cache. One way of doing this is to update the cached version when the user updates their profile (or any other saved data). Alternatively, you could just remove the cached version from redis; the next time it's requested through getUserProfile it will be fetched fresh from the DB and put back into redis for next time.
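To make that concrete, here is a minimal sketch of the getUserProfile / invalidation idea, assuming the node "redis" v4 client and a hypothetical fetchUserProfileFromDb(uid) helper that wraps the MySQL query from the question:

const { createClient } = require('redis'); // assumes the "redis" npm package, v4+

const cache = createClient();
cache.connect(); // in real code, await this once at startup before serving requests

// Hypothetical helper wrapping the MySQL query from the question.
async function fetchUserProfileFromDb(uid) { /* SELECT * FROM user_profile WHERE uid = ? */ }

async function getUserProfile(uid) {
  const key = 'user_profile:' + uid;
  const cached = await cache.get(key);
  if (cached) return JSON.parse(cached);             // cache hit: no MySQL round trip

  const profile = await fetchUserProfileFromDb(uid); // cache miss: fall back to the DB
  await cache.set(key, JSON.stringify(profile));     // store it for next time
  return profile;
}

// Call this whenever the user edits their profile, so the next read is fresh.
function invalidateUserProfile(uid) {
  return cache.del('user_profile:' + uid);
}

If even a redis service feels like too much for your load, the same pattern works with an in-process object or Map; you just lose the cache when the process restarts.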
There are many other ways to approach this, but this will most likely solve your problem in the simplest way without too much overhead. It will also be easy to expand in the future!

Persistent object in node.js

I am fairly new to Node and backend work in general. We are using Express at work for a pretty large monolith application that houses all of our endpoints and services. My task is to grab a property that comes in on the request object (i.e. request.someObject) and use it in several services. In a DOM environment I would use something like localStorage or sessionStorage to store that data, and then reuse it where needed across the application. I'm going to try and explain a little further with code:
Endpoint
router.route('/:cartId/generatePaymentLink')
  .post(jsonParser, urlParser, function(request, response, next) {
    var theObject = request.someObject;
    // need to pass theObject to several services
  });
Services are stored within separate files, here is an example of one
var paypalInfo = new Paypal({
  userId: theObject.user_id,   // theObject is the value from the request; it isn't in scope in this file
  password: theObject.password
});
I can pass it through parameters to the services that use the data, but it's used in several different places and would have to be passed into each individual service. Is there a way to create a persistent object/config file that I can just import and have that data available, or something like sessionStorage in the DOM?
EDIT: Not a database; we are avoiding that for performance reasons and looking for another solution.

Changing a Meteor collection subscription for all clients

I am developing a webapp in which I'd need one client, associated with the admin, to trigger an event (e.g., a new value selected in a dropdown list) which in turn will tell all the other connected clients to change their subscription, possibly with a parameter, i.e., the newly selected value.
Something along the lines of
Template.bid.events
  "change .roles": (e, tpl) ->
    e.preventDefault()
    role = tpl.$("select[name='role']").val()
    Meteor.subscribe role
Of course this works for the current client only.
One way I thought of would be to keep a separate collection that points at the current collection to be used, so the clients can programmatically act on that. It feels cumbersome, though.
Is there a Meteor-way to achieve this?
Thanks
In Meteor, whenever you have a problem that sounds like "I need to synchronize data across clients", you should use a collection. I realize it seems like overkill just to send one piece of data, but I assure you it's currently the path of least resistance.
There are ways you can expose pseudo-collections which don't actually write to mongo, but for your use case that really sounds like overkill - new Mongo.Collection is the way to go.
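A hedged sketch of that approach, in JavaScript rather than the question's CoffeeScript (the Settings collection, the 'active' document, the publication, and the method name are all made up for illustration):

// Shared between client and server (e.g. in a common file).
const Settings = new Mongo.Collection('settings');

if (Meteor.isServer) {
  Meteor.methods({
    // Only the admin should be allowed to call this; add your own check here.
    setActiveRole(role) {
      Settings.upsert({ _id: 'active' }, { $set: { role: role } });
    }
  });
  Meteor.publish('settings', function () {
    return Settings.find({ _id: 'active' });
  });
}

if (Meteor.isClient) {
  Meteor.subscribe('settings');
  // Re-runs reactively whenever the admin changes the value;
  // the old role subscription is dropped automatically on re-run.
  Tracker.autorun(() => {
    const active = Settings.findOne('active');
    if (active) Meteor.subscribe(active.role);
  });
}

The admin's "change .roles" handler then calls Meteor.call('setActiveRole', role) instead of subscribing directly.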
You can use streams to set up a simple line of communication between connected clients and the server; nothing is stored in MongoDB. Just let all connected clients listen to a stream and switch subscriptions when a new message comes in with the subscription name. Make sure only the client associated with your admin can push messages to the stream.
Available package: https://atmospherejs.com/lepozepo/streams
Examples: http://arunoda.github.io/meteor-streams/

Socket.io: Emit to specified Clients

I have a Node.js/Socket.io webapp that currently works correctly: it polls an API and populates the HTML page with the emitted results.
My problem is that multiple people need to use this, and I would like to separate their instances so that each person only receives the results of their own query.
Right now, when anyone uses the site, it will return results belonging to another user who may also be using the site.
I have tried to get around this using this method:
var clients = {};
io.sockets.on('connection', function(socket){
  console.log("this is " + socket.id);
  clients.id = socket.id; // overwritten by every new connection
});
io.sockets.socket(clients.id).emit('progress', {info: listing});
Of course this gets replaced with each new user that logs into the site, so everything that was being emitted to the original user is now emitted to the new user instead.
What I want to know is if there is any built-in function to get around this or if I should proceed with another persistent store.
Any help would be greatly appreciated.
Edit
I solved it by storing the socket object in the express.sessionStore instead of just in the program:
io.sockets.on('connection', function(socket){
  request.sessionStore.socket = socket;
});
The above code now works and only emits to the event originator.
This looks like it's been answered in another thread. The idea is to create an array of clients and associate it with some type of client/user identification, like an ID or name.
Sending a message to a client via its socket.id
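For reference, a minimal sketch of that idea, assuming each client sends some userId in a 'register' event after connecting (both names are made up for illustration). io.to(socketId) relies on Socket.IO 1.x+, where every socket automatically joins a room named after its own id; on 0.9 the io.sockets.socket(id) call from the question is the equivalent:

var clients = {}; // userId -> socket.id

io.sockets.on('connection', function(socket){
  // The client identifies itself right after connecting.
  socket.on('register', function(userId){
    clients[userId] = socket.id;
  });

  socket.on('disconnect', function(){
    // Drop any mapping that points at this socket.
    Object.keys(clients).forEach(function(userId){
      if (clients[userId] === socket.id) delete clients[userId];
    });
  });
});

// Later, emit only to the user who made the query.
function sendProgress(userId, listing){
  var socketId = clients[userId];
  if (socketId) io.to(socketId).emit('progress', { info: listing });
}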
