Should I store all database data in memory in Node.js? - javascript

I have two options when it comes to a server I am developing.
Option a) when the server starts, extract all data from the database and store it in memory as objects.
Option b) every time data is required, extract it from the database, transform it into an object, and return that. Don't store all the data in memory.
I'm somewhat afraid that if there is too much data stored in memory the app will crash, but I'm not sure how this is supposed to be handled.
Which is the right way to handle this?
Feel free to suggest another option.

You can do a bit of both.
You can store some data in memory. For example, if your database holds settings that are essentially constants and don't run to thousands of rows, you can load them into memory. If you have data that is accessed very frequently and not updated very often, you can also load it at startup and keep it in memory.
If you have data that you expect to change frequently and/or grow extensively, such as user data or content, load it from the database every time it is required.
If your application is a large-scale app with high traffic, you might want to add a caching system that stores some data in memory or on disk and updates it when it changes or when it expires via a time-to-live property.
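The "cache with a time-to-live" idea above can be sketched in a few lines. This is a minimal illustration, not a production cache; `loadSettingsFromDb` is a hypothetical database call, and the names are illustrative:

```javascript
// Minimal in-memory cache with a time-to-live (TTL).
// Entries are reloaded from the database once they expire.
const cache = new Map();

function getCached(key, ttlMs, loadFn) {
  const entry = cache.get(key);
  if (entry && Date.now() - entry.storedAt < ttlMs) {
    return entry.value; // still fresh: serve from memory
  }
  const value = loadFn(); // missing or expired: reload from the database
  cache.set(key, { value, storedAt: Date.now() });
  return value;
}

// Usage (loadSettingsFromDb is assumed, not a real API):
// const settings = getCached('settings', 60000, loadSettingsFromDb);
```

For a high-traffic app you would typically reach for an existing cache (in-process or Redis) rather than rolling your own, but the principle is the same.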

Related

Advice for lots of data clientside in a browser session

I need some advice.
I'm looking for a way to work with a huge amount of data (800+ MB) in the browser. This data should not be persistent. Users upload a CSV file which they play with (filter) in the browser before doing anything with it. The column names in those CSV files can vary wildly.
I was wondering what method to use here. Storing the data in memory in JavaScript variables does not work, as it crashes the page. I've been looking at IndexedDB, which gives me the opportunity to keep the data in a database and even gives me filtering options. However, this system is actually designed to store persistent data for offline use.
Preferably I would like to delete the database when loading a new file, create a new database with its own object store, and fill it with the new data. But I found that the database is not deleted as long as I don't refresh the page, and I cannot change it.
My questions: is using IndexedDB a viable approach for this specific case? If yes, does anyone have any pointers on how to use it in this case? Are there other possibilities that I have overlooked?

Caching objects in memory in Javascript

I'm writing a web app which fetches a list of files from the server and displays it. The user can click on a folder to descend into it (without actually leaving the page). Retrieving the list can take a while (~10ms per file, which is a lot when you have 2000 files), so I want to cache the results when possible to avoid having to re-fetch it if the user goes into a subdirectory and then back out.
However if I just store the results in some global variable, they'll quickly fill up all the user's memory. Is there some way to tell the browser "feel free to delete this object if you're low on memory", or to be notified when memory is low?
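Modern JavaScript does offer one primitive close to the "feel free to delete this if you're low on memory" ask: `WeakRef`, which lets the garbage collector reclaim a cached object once nothing else holds it. A minimal sketch (the cache shape and names are illustrative, and `WeakRef` requires a modern browser or Node 14+):

```javascript
// Cache that holds file listings only weakly, so the engine may
// reclaim them under memory pressure.
const weakCache = new Map();

function putWeak(key, obj) {
  weakCache.set(key, new WeakRef(obj));
}

function getWeak(key) {
  const ref = weakCache.get(key);
  const obj = ref && ref.deref(); // undefined if already collected
  if (ref && !obj) weakCache.delete(key); // prune dead entries
  return obj;
}
```

Note there is still no portable "memory is low" notification, so a `getWeak` miss simply means you re-fetch from the server.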
If you'd like to store those objects on the user's computer, to avoid requesting them from the server again, you'd probably want to use something like localStorage.
Store.js provides a nice API around local storage solutions.
The hard part for you now will be checking which files belong to a certain folder so you can store them. A tree data structure might be a nice way to give shape to these folders; paired with an ID, you might be able to map them to a place in localStorage.
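The per-folder caching part can be sketched independently of the storage backend. Here a plain `Map` stands in for localStorage/Store.js, and `fetchListing` is a hypothetical call to the server (the ~10ms-per-file request from the question):

```javascript
// Cache directory listings keyed by folder path, so re-entering a
// folder does not trigger another server round trip.
const listingCache = new Map();

async function getListing(path, fetchListing) {
  if (listingCache.has(path)) {
    return listingCache.get(path); // cached: no server round trip
  }
  const files = await fetchListing(path); // e.g. an AJAX call
  listingCache.set(path, files);
  return files;
}
```

Swapping the `Map` for localStorage mainly means serializing the listing with `JSON.stringify` and using the path as the storage key.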

Where to best store frequently changed data?

I am creating a gaming service in which players will be paired and compete against one another in real time. I am building this with Node/WebSockets and React. My question is very high level:
Where should I store data like the number of users online, the list of online users, etc.? I am sure I can store it in a DB, but I know it will change frequently. Is this best suited to one of Node's memory stores like "data-store", or just a set of variables on the server accessible to the WebSockets (what I have now)?
Or should I just put it in a DB anyway?
If you just have a single server and the data is all of the temporal type you've listed (it doesn't need to survive a server restart), then just using a set of variables and keeping the data in memory is perfectly fine, is the simplest to implement, and will perform best. There is really no reason to take on the additional overhead of a traditional disk-based store unless the data is overly large (which it does not seem to be).
If you cluster your server, then frequently changed data that does not need to persist across server restarts can be kept in an in-memory database running in its own process, such as Redis. Each server in your cluster can then query Redis any time it needs the latest copy of the data. Because it's an in-memory data store, it's efficient for data that changes a lot while also being available to multiple processes.
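For the single-server case, the "set of variables" approach is as simple as it sounds. A sketch (function names are illustrative; you would call them from your WebSocket connect/disconnect handlers):

```javascript
// Presence data kept in plain process memory: cheap, fast, and lost
// on restart, which is fine for "who is online right now".
const onlineUsers = new Set();

function addUser(userId) {
  onlineUsers.add(userId); // call on websocket connect
}

function removeUser(userId) {
  onlineUsers.delete(userId); // call on websocket disconnect
}

function onlineCount() {
  return onlineUsers.size;
}
```

Using a `Set` also dedupes for free if the same user opens a second connection with the same ID.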

Are node.js data structures volatile/when to use a database over data structure?

I am considering using Node.js to build an API-like service but am having trouble understanding whether to use a data structure vs. storing information in a database/text file.
Basically, the program would allow a user to come online and would collect that user's geocoded location. The service would then store that information either in a JavaScript data structure or in a database or text file. When another user logs on, I would connect them with the user who is closest to them.
My question is: if I have a data structure (some sort of custom-implemented sorted list based on geocodes), would all of that information be volatile, and would I lose it if the program crashed?
Would it be preferable to store the information in a text file or database even though reading and writing that information would take longer?
Also, if I was using the data-structure approach, would that make it more difficult to scale the application if I needed to expand to additional servers?
Any thoughts?
My question is, if I have a datastructure (some sort of custom
implemented sorted list based off of geo-codes) would all of that
information be volatile and I would lose it if the program crashed?
Yes, it would be volatile and you would lose it if the program crashed. All JavaScript data is kept in RAM.
Would it be more preferable to store the information in a text file or
database even though the access and write of that information would
take longer?
When exactly to save data to a persistent store is highly dependent upon the details of the situation. You could have only a disk store or you could have a RAM store that is periodically stored to disk or you could have a combination (a RAM cache in front of a persistent store). Using a database will generally do a lot of caching for you.
Also, if I was using the data structure approach, would that make it
more difficult to scale the application if I needed to expand to
additional servers?
If you want to share a live data store among multiple servers, then you have to use something other than just Javascript data stored in node.js memory. The usual solution is to go with some sort of external database which itself can be either in-memory (like Redis) or disk-store (like Mongo or Couchbase) which all the different servers can then access.

Meteor reactive publish based on client session variable

I'm making a Meteor.js web app that presents the client with an HTML range slider tied to a session variable.
I want the server to publish only data with values less than the current value of the slider, sorted from newest to oldest. I have a lot of database entries (2000+). If I publish everything within the slider's maximum, my browser is too slow. If I limit the publish to 100 entries or so, I miss out on a lot of data with small values (which happen to be older) when I bring the slider down.
What are the best practices for staying scalable (not sending too much data to the client)? Is a reactive publish function the key (using onchange with the slider value)? That sounds like a lot of server round trips. Help!
Would pagination be acceptable from a UX standpoint? If so, there are packages that may help, for instance alethes:pages.
Otherwise, Adam is on the right track by suggesting to use Tracker.autorun (Tracker has replaced Deps).
As with any other publication, make sure your publish function only returns the fields that you need on the client, in order to minimize the data transferred and the memory consumption.
