Caching objects in memory in JavaScript - javascript

I'm writing a web app which fetches a list of files from the server and displays it. The user can click on a folder to descend into it (without actually leaving the page). Retrieving the list can take a while (~10ms per file, which is a lot when you have 2000 files), so I want to cache the results when possible to avoid having to re-fetch it if the user goes into a subdirectory and then back out.
However if I just store the results in some global variable, they'll quickly fill up all the user's memory. Is there some way to tell the browser "feel free to delete this object if you're low on memory", or to be notified when memory is low?

If you'd like to store those objects on the user's computer to avoid requesting them from the server again, you'd probably want to use something like localStorage.
Store.js provides a nice API around local-storage solutions.
The hard part will be checking which files belong to a given folder so you can store them. A tree data structure might be a nice way to give shape to these folders; paired with an ID, you could map each folder to a place in localStorage.
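A minimal sketch of that idea: cache each folder's listing in localStorage, keyed by path, and fall back to the network on a miss. The `/api/list` endpoint and the response shape are assumptions, not part of the question.

```javascript
// Sketch: cache directory listings in localStorage, keyed by path.
// The "/api/list" endpoint and its JSON response shape are assumptions.
function cacheKey(path) {
  return "dirCache:" + path;
}

async function getListing(path) {
  const cached = localStorage.getItem(cacheKey(path));
  if (cached !== null) {
    return JSON.parse(cached); // cache hit: no network round trip
  }
  const res = await fetch("/api/list?path=" + encodeURIComponent(path));
  const listing = await res.json();
  try {
    localStorage.setItem(cacheKey(path), JSON.stringify(listing));
  } catch (e) {
    // Quota exceeded: drop the whole cache rather than fail the request.
    Object.keys(localStorage)
      .filter((k) => k.startsWith("dirCache:"))
      .forEach((k) => localStorage.removeItem(k));
  }
  return listing;
}
```

Note that `setItem` can throw when the quota is full, which is also the closest thing the platform offers to the "delete this when memory is low" signal the question asks for.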

Related

How to secure and validate data uploads to GitHub Pages site?

I have a GitHub Pages site where I'm hosting a project. It allows users to export their data from localStorage as a stringified JSON object into a .txt file. They can then import their data back from that .txt file, which stores its contents into localStorage.
Having them paste the file contents into a text box is an option, but since I'm intending this to be a single-page application for use on mobile devices, that's not an ideal user experience.
How do I make sure they aren't uploading malicious, incorrect, or unusable data?
As far as security goes, I'm not sure how much of a risk this even is, since GitHub Pages only hosts static pages, and I'm not dealing with any sensitive data in any way. Still, it feels like I should be doing something other than just accepting plaintext files.
The first thing that comes to mind for validating the data is to use regex or another formulaic way to check object contents. The data is organized as an object of objects; all child objects will have the same keys with different values, and the number of objects can vary. I also plan to build in a way to handle empty file uploads, where it defaults to setting localStorage to {}.
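A hedged sketch of that validation idea: parse the file, confirm the top level is a plain object of objects, and check each child for the expected keys. The `REQUIRED_KEYS` list here is a placeholder for whatever the real child-object schema is.

```javascript
// Sketch of validating an imported backup before writing it to localStorage.
// REQUIRED_KEYS is a hypothetical schema; substitute the real child keys.
const REQUIRED_KEYS = ["name", "value"];

function parseImport(text) {
  if (text.trim() === "") return {}; // empty file: default to {}
  let data;
  try {
    data = JSON.parse(text);
  } catch (e) {
    throw new Error("File is not valid JSON");
  }
  if (typeof data !== "object" || data === null || Array.isArray(data)) {
    throw new Error("Top level must be a plain object");
  }
  for (const [id, child] of Object.entries(data)) {
    if (typeof child !== "object" || child === null ||
        !REQUIRED_KEYS.every((k) => k in child)) {
      throw new Error("Entry " + id + " has the wrong shape");
    }
  }
  return data;
}
```

Anything that fails validation is rejected before it ever touches localStorage, which also covers the "malicious or unusable data" concern: the file is only ever treated as data, never executed.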

large JSON data persist across pages

I have a 40-50MB JSON object that I need to persist across a transition to a different page.
This only needs to happen once (one transition) but I'm still way over HTML5 LocalStorage limits, what other options do I have?
Unfortunately, that is too much data to store for most browsers. Even combining sessionStorage and localStorage will not get you close.
There are a few options you can try though:
You can store the data on your own server. This will depend on what web server/environment you are using.
You can use someone else's server to store the data. For example, you could use Google Drive's API. This does mean that your user needs a Google account. You could also pay for a service like Amazon S3 to store it.
You could create a 'container' page, which loads and displays the pages, but keeps the session going. How exactly this works depends again on your environment.
40-50MB is too large for a browser, and it's worse still if mobile is involved. What you can do is split the data into chunks and keep some in sessionStorage, some in localStorage, and the remainder on your server, so that the server-held part loads quickly enough; you then join the chunks once everything is loaded. I wouldn't recommend this method, though.
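Purely to illustrate the chunk-splitting step mentioned above (not an endorsement of the approach), here is a sketch. The 1 MB chunk size is an arbitrary assumption; real per-key quotas vary by browser.

```javascript
// Sketch: split a large JSON string into fixed-size pieces so each
// piece fits under a per-key storage limit, then rejoin them later.
const CHUNK_SIZE = 1024 * 1024; // 1 MB per chunk (arbitrary assumption)

function toChunks(str) {
  const chunks = [];
  for (let i = 0; i < str.length; i += CHUNK_SIZE) {
    chunks.push(str.slice(i, i + CHUNK_SIZE));
  }
  return chunks;
}

function fromChunks(chunks) {
  return chunks.join("");
}
```

Each chunk would then be written under its own key (e.g. `data:0`, `data:1`, …) in whichever store has room, with a small index recording how many chunks exist and where each one lives.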

Should I store all database data in memory node.js?

I have two options when it comes to a server I am developing.
Option a) when the server starts, extract all data from the database and store it in memory as objects.
Option b) every time data is required, extract it from the database, transform it into an object, and return that. Don't store all the data in memory.
I'm kind of afraid that if there is too much data stored in memory the app will crash, but I'm not sure how this is supposed to be handled.
Which is the right way to handle this?
Feel free to suggest another option.
You can do a bit of both.
You can store some data in memory. For example, if your database holds settings that are effectively constants and don't run to thousands of rows, you can load them into memory. If you have data that is accessed very frequently and not updated very often, you can also load it at runtime and keep it in memory.
If you have data that you expect to change frequently and/or grow extensively, such as user data or content, load it from the database every time it is required.
If your application is a large-scale app with high traffic, you might want to add a caching system that stores some data in memory or in a file and updates it when it changes, or when it expires according to a time-to-live property.
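A minimal sketch of the time-to-live idea, assuming a plain in-memory Map rather than any particular caching library:

```javascript
// Sketch: an in-memory cache where each entry expires after ttlMs
// milliseconds; on a miss the caller falls back to the database.
class TtlCache {
  constructor(ttlMs) {
    this.ttlMs = ttlMs;
    this.entries = new Map(); // key -> { value, expiresAt }
  }

  get(key) {
    const entry = this.entries.get(key);
    if (!entry || Date.now() >= entry.expiresAt) {
      this.entries.delete(key); // expired or absent: evict and miss
      return undefined;
    }
    return entry.value;
  }

  set(key, value) {
    this.entries.set(key, { value, expiresAt: Date.now() + this.ttlMs });
  }
}
```

On a write path you would also call `set` (or delete the key) so the cache never serves a stale row for longer than the TTL.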

Are node.js data structures volatile/when to use a database over data structure?

I am considering using node.js to build an API-like service but am having trouble understanding whether to keep information in a data structure vs. storing it in a database/text file.
Basically, the program would allow a user to come online and would collect that user's geocoded location. The service would then store that information in either a JavaScript data structure, a database, or a text file. When another user logs on, I would connect them with the user who is closest to them.
My question is, if I have a data structure (some sort of custom-implemented sorted list based on geocodes), would all of that information be volatile, and would I lose it if the program crashed?
Would it be preferable to store the information in a text file or database even though accessing and writing that information would take longer?
Also, if I was using the data structure approach, would that make it more difficult to scale the application if I needed to expand to additional servers?
Any thoughts?
My question is, if I have a data structure (some sort of custom-implemented sorted list based on geocodes), would all of that information be volatile, and would I lose it if the program crashed?
Yes, it would be volatile and you would lose it if the program crashed. All JavaScript data is kept in RAM.
Would it be preferable to store the information in a text file or database even though accessing and writing that information would take longer?
When exactly to save data to a persistent store is highly dependent upon the details of the situation. You could have only a disk store or you could have a RAM store that is periodically stored to disk or you could have a combination (a RAM cache in front of a persistent store). Using a database will generally do a lot of caching for you.
Also, if I was using the data structure approach, would that make it more difficult to scale the application if I needed to expand to additional servers?
If you want to share a live data store among multiple servers, then you have to use something other than just JavaScript data stored in node.js memory. The usual solution is to go with some sort of external database, which can itself be either in-memory (like Redis) or a disk store (like Mongo or Couchbase), and which all the different servers can then access.

Store/Backup a Database into a file: differences between IndexedDB, WebSQL and SQLite?

My question is about IndexedDB vs. WebSQL vs. SQLite. There is no need to explain that they are different; what I would like to know is:
Do those three "database solutions" allow for storing all their data to a file?
(and, of course, to do the reverse: initialize all their data from a backup file?)
Background
Since I already have done some research, which partly answers this question, allow me to provide this background info to the question:
SQLite
(yes, it does allow storage and retrieval of the database to and from a file)
I have already done some work with SQLite, so I know that it starts the database right away via a reference to a file. Backup is simply copying the file; restoring is rewriting the file.
IndexedDB and WebSQL
??? To my understanding, these are database solutions that "live their lives in the browser's JavaScript land", where we do not deal much with files. This is part of where the question lies: if I wanted to export the data from either of the two solutions to a flat file, or say a single string representation, would that be possible?
These are some SO questions that I think relate to it:
SO Question: Exporting WebSQL Data
SO Question: Import and Export Indexeddb data
which indicate that there is no easy toString() (store the database) method, nor a fromString() counterpart, in either IndexedDB or WebSQL.
If it is indeed true (and it is affirmed in an answer here) that there is no easy backup and retrieval for those databases, that would be very sad, and I think a gap. Databases without a backup function, really?!
At present, there is no way to back up and restore browser databases. The only way you can achieve this is by continuously syncing your back-end database with the browser database, thus keeping track of changes in the data generated in the browser.
Old habits die hard, whether appropriate or not. Why on earth would a browser database need a backup? Data on the client is just a cached copy of a slice of the server's data; there is no need to back it up. If you think IndexedDB (or Web SQL Database) data is durable, you should know that IndexedDB data belongs to a temporary class of browser data, meaning the UA can delete it at its discretion without prompting the user or the app.
If your app treats browser data as more than a cached copy, you are doing it wrong.
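That said, even without a built-in dump, a manual export of one object store is possible with standard IndexedDB calls: read every record with getAll() and serialize the result. A hedged sketch; the database and store names in the usage note are placeholders.

```javascript
// Sketch: a manual "toString()" for one IndexedDB object store.
function serializeRecords(records) {
  return JSON.stringify(records);
}

function exportStore(dbName, storeName) {
  return new Promise((resolve, reject) => {
    const open = indexedDB.open(dbName);
    open.onerror = () => reject(open.error);
    open.onsuccess = () => {
      const req = open.result
        .transaction(storeName, "readonly")
        .objectStore(storeName)
        .getAll(); // reads every record in the store
      req.onsuccess = () => resolve(serializeRecords(req.result));
      req.onerror = () => reject(req.error);
    };
  });
}

// Restore: JSON.parse the backup string and put() each record back
// inside a "readwrite" transaction.
```

Usage would look like `exportStore("myDb", "files").then(json => ...)`, where `"myDb"` and `"files"` stand in for your actual database and store names. Note this only captures record values, not indexes or keys generated by a keyPath-less store.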
