We are creating a tree structure with the help of a custom tool developed in JavaScript/jQuery.
It works great; now we have to create that tree from a feed file (a CSV file).
I am working on a POC to understand the behavior of the JS tool with 25k nodes.
The problem is how to insert such a volume of data into my database so I can check the behavior in the browser.
Let me brief you on our approach for inserting the tree into the DB. We compute left/right values using the nested set model (NSM), then insert the data into two tables: one with the collection of node names, the other with the left/right values and some other attributes. So I need to insert at least 10K nodes along with their left/right values.
We supply a JSON object for rendering the tree on the client side, then recursively call the function to redraw the structure.
The question is not entirely clear, but whenever I need to insert a large amount of data into SQL Server I use BCP. Since your data is already in CSV format, it should be easy:
http://msdn.microsoft.com/en-us/library/ms162802.aspx
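If you do not yet have a feed with left/right values, a small Node.js script can generate one: build a synthetic tree, compute the nested set values with a depth-first walk, and write a CSV that bcp (or BULK INSERT) can then load. This is only a sketch; the node names and CSV columns are invented and would need to match your actual schema.

```javascript
// Sketch: generate a synthetic tree, compute nested set (left/right) values,
// and write a CSV that bcp / BULK INSERT can load. Column layout is hypothetical.
const fs = require('fs');

// Build ~10k nodes: every node gets a random earlier node as its parent.
const TOTAL = 10000;
const children = new Map(); // parentId -> [childIds]
for (let id = 2; id <= TOTAL; id++) {
  const parent = 1 + Math.floor(Math.random() * (id - 1));
  if (!children.has(parent)) children.set(parent, []);
  children.get(parent).push(id);
}

// Iterative depth-first walk (recursion could overflow the stack on deep trees).
const rows = [];
let counter = 0;
const stack = [{ id: 1, lft: null }];
while (stack.length) {
  const node = stack.pop();
  if (node.lft === null) {
    node.lft = ++counter;               // entering: assign the left value
    stack.push(node);                   // revisit later to assign the right value
    const kids = children.get(node.id) || [];
    for (let i = kids.length - 1; i >= 0; i--) stack.push({ id: kids[i], lft: null });
  } else {
    rows.push({ id: node.id, lft: node.lft, rgt: ++counter });
  }
}

// node_id,node_name,lft,rgt -- one line per node, ready for a bulk loader.
fs.writeFileSync(
  'tree_nodes.csv',
  rows.map(r => `${r.id},node_${r.id},${r.lft},${r.rgt}`).join('\n')
);
console.log(`Wrote ${rows.length} nodes`);
```

Changing TOTAL to 25000 gives you the larger POC file; bcp in character mode (-c, with -t, as the field terminator) can then load it in one shot.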
I have a form like the one below, which contains dynamic columns and rows that can be added and deleted by the user.
With around 50 to 60 rows and up to 10 columns, there are a lot of calculations taking place in the JavaScript.
I am using a MySQL database here and PHP (Laravel) as the backend.
Currently, my table structure for storing the above is:
Table 1 - row_headers (row_id, row_name)
Table 2 - column_headers (column_id, column_name)
Table 3 - book_data (book_id, row_id, column_id, data_value)
The above set of tables suffices for storing the data, but it is extremely slow for both the store call and the get call. When fetching the complete data back to the UI there is a lot of load on the database, as well as on the HTML to render the data properly (the nested for loops eat up all the time), and the whole process is tedious.
I want to understand how to optimize this. What table structure should be used instead, and what is the best way to reduce the load on both the backend and the frontend?
Help appreciated.
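Independent of the table layout, one thing that usually helps on the get side is to fetch book_data in a single call and index it once, instead of running nested for loops per cell. A rough sketch, assuming the backend returns the three result sets as one JSON payload (the response shape here is an assumption):

```javascript
// Sketch: build the grid in one pass over book_data instead of nested loops per cell.
// The response shape (rows, columns, data) is an assumption about your API.
function buildGrid({ rows, columns, data }) {
  // Index cell values once: "rowId:columnId" -> data_value
  const cellIndex = new Map();
  for (const d of data) {
    cellIndex.set(`${d.row_id}:${d.column_id}`, d.data_value);
  }

  // Produce one array per row; missing cells default to null.
  return rows.map(row => ({
    row_id: row.row_id,
    row_name: row.row_name,
    cells: columns.map(col => cellIndex.get(`${row.row_id}:${col.column_id}`) ?? null),
  }));
}
```

On the store side, sending the whole grid as one JSON request and writing it with a single multi-row insert usually helps far more than changing the three-table design itself.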
I need help/tips.
I have a huge amount of JSON data that needs to be merged, sorted, and filtered. Right now, the files are separated into different folders: almost 2 GB of JSON files.
What I'm doing right now is:
reading all files inside each folder
appending the parsed JSON data to an array variable inside my script
sorting the array variable
filtering
saving it to one file
I'm rethinking this: instead of appending the parsed data to a variable, maybe I should store it in a file? What do you think?
What approach is better when dealing with this kind of situation?
By the way, I'm running into a "JavaScript heap out of memory" error.
You could use some kind of database, e.g. MySQL with the MEMORY table engine, so the data is kept in RAM only; it would be blazing fast and erased after a reboot, but you should truncate the table anyway after the operation since it is all temporary. Once the data is in the table, it is easy to filter/sort the required bits and grab the data incrementally, say 1000 rows at a time, and parse it as needed. You will not have to hold 2 GB of data inside JS.
2 GB of data will probably block your JS thread during the loops, and the app will freeze anyway.
If you use a file to save the temporary data in order to avoid a database, I recommend a temporary disk mounted in RAM, so you get much better I/O speed.
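If you stay in Node.js without a database, the key change is to never hold all 2 GB in one array: process one file at a time, keep only the records that survive the filter, and sort just that smaller subset at the end. A rough sketch, assuming each .json file parses to an array of records; the filter and sort keys are hypothetical:

```javascript
// Sketch: merge + filter ~2 GB of JSON files without holding everything in memory.
// Folder layout, record shape, and the filter/sort keys are assumptions.
const fs = require('fs');
const path = require('path');

const ROOT = './data';                       // folder containing the sub-folders
const keep = rec => rec.status === 'active'; // hypothetical filter
const byDate = (a, b) => new Date(a.createdAt) - new Date(b.createdAt); // hypothetical sort key

const filtered = [];
for (const folder of fs.readdirSync(ROOT)) {
  const dir = path.join(ROOT, folder);
  if (!fs.statSync(dir).isDirectory()) continue;

  for (const file of fs.readdirSync(dir)) {
    if (!file.endsWith('.json')) continue;
    // Parse one file, keep only what passes the filter, then let the rest be GC'd.
    const records = JSON.parse(fs.readFileSync(path.join(dir, file), 'utf8'));
    for (const rec of records) {
      if (keep(rec)) filtered.push(rec);
    }
  }
}

// Sort only the filtered subset, then write the result as one file.
filtered.sort(byDate);
fs.writeFileSync('merged.json', JSON.stringify(filtered));
console.log(`Kept ${filtered.length} records`);
```

If even the filtered subset is too large to sort in memory, that is the point where the database (or RAM-mounted disk) approach above becomes the better option.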
I am trying to use Fixed Data Table in my web application, and I am dealing with a large amount of data: hundreds of thousands of records. I am trying to load all the data at once to make the best use of the search and sort functionality of the data table.
Here is the link to the data table which I am using.
It takes a huge amount of time to load the data, which is expected, but after the data has loaded there are glitches in the browser; it keeps getting stuck.
How do I handle a huge amount of data in data tables while keeping the full functionality?
The main advantage of using Fixed Data Table is that you can render the entire table from an array or an object.
The official example for Fixed Data Table is at:
http://schrodinger.github.io/fixed-data-table-2/example-object-data.html
The example above renders the table from JSON data. Additional features such as client-side sorting and filtering can also be added, which matters since, as you mentioned, your data is huge.
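For hundreds of thousands of records, it also helps to filter and sort an array of row indexes rather than copying the row objects, and to let the table look rows up by view index. A small sketch of that idea, independent of the table component; the field names are hypothetical:

```javascript
// Sketch: client-side filter + sort over a large dataset using an index array,
// so the original rows are never copied. Field names are hypothetical.
function buildView(rows, { filterText = '', sortKey = null, descending = false } = {}) {
  // 1. Filter: keep only the indexes of matching rows.
  const indexes = [];
  const needle = filterText.toLowerCase();
  for (let i = 0; i < rows.length; i++) {
    if (!needle || String(rows[i].name).toLowerCase().includes(needle)) {
      indexes.push(i);
    }
  }

  // 2. Sort the index array by the requested column.
  if (sortKey) {
    indexes.sort((a, b) => {
      const va = rows[a][sortKey], vb = rows[b][sortKey];
      const cmp = va < vb ? -1 : va > vb ? 1 : 0;
      return descending ? -cmp : cmp;
    });
  }

  // 3. The table then asks for row data by view index.
  return {
    size: indexes.length,
    getRowAt: viewIndex => rows[indexes[viewIndex]],
  };
}
```

Because only the index array changes, the underlying data is never copied, and a virtualized table only asks for the handful of rows currently visible.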
I want to process some 48,000 rows to build a dashboard and show some stats based on the data in those rows. One particular field, 30 characters long, also carries some data in the form of substrings. How do I parse all of this data, row by row, to come up with the end result? There are plenty of examples out there, but I couldn't relate them to my case.
I'm using the "js-xlsx" library in one of my applications, and the performance seems to be quite good.
Here is the GitHub URL:
https://github.com/SheetJS/js-xlsx
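As a starting point, here is a minimal sketch of reading the sheet with js-xlsx and splitting the 30-character field row by row; the file name, column name, and substring offsets are placeholders for your actual layout:

```javascript
// Sketch: load ~48k rows with js-xlsx and split a fixed-width 30-char field.
// File name, column name, and offsets are placeholders for the real layout.
const XLSX = require('xlsx');

const workbook = XLSX.readFile('dashboard-data.xlsx');
const sheet = workbook.Sheets[workbook.SheetNames[0]];
const rows = XLSX.utils.sheet_to_json(sheet); // one object per row, keyed by header

const stats = { total: 0, byRegion: {} };
for (const row of rows) {
  const code = String(row.Code || '').padEnd(30);   // the 30-character field
  const region = code.slice(0, 3).trim();           // hypothetical substring layout
  const amount = Number(code.slice(20, 30)) || 0;

  stats.total += amount;
  stats.byRegion[region] = (stats.byRegion[region] || 0) + amount;
}
console.log(stats);
```

The offsets above are just an example of the row-by-row parsing; replace them with whatever your 30-character field actually encodes.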
I have some report data available which I am fetching from, say, a web service.
The data is in JSON format, of the form:
Year -> Region -> Items, where each item holds the data for 12 months (a simple array)
(3) -> (3) -> (10)
i.e. 3 years, each containing 5 regions, and each region has, say, 10 products.
What is an efficient way to generate the various representations of this data in JavaScript?
Or, what methodologies should be adopted for this kind of report generation?
I am building a BI solution using JavaScript.
My application uses the data in different formats for the respective components.
Switching between these components constantly involves regenerating the data.
Hence, I either need the datasets for these components to be generated in advance, or I need an efficient way to do it on the fly.
"I need the datasets for these components to be generated in advance"
I believe this is a simple, straightforward approach: aggregate and store your data in separate tables beforehand.
This would require a cron job or some sort of middle process. Instead of processing and generating reports in JavaScript on each page load, the data would be retrieved, aggregated, and stored in a separate table.
JavaScript would then only need to retrieve and present the data from the aggregated table.
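If part of the aggregation has to stay in JavaScript, the same idea applies there: flatten the Year -> Region -> Items structure once when the web service responds, build the per-component datasets from the flat rows, and cache them so switching components does not regenerate anything. A rough sketch (the exact JSON shape is an assumption):

```javascript
// Sketch: flatten Year -> Region -> Items (12 monthly values each) once,
// then derive the per-component datasets from the flat rows. Shape is assumed.
function flatten(report) {
  const rows = [];
  for (const [year, regions] of Object.entries(report)) {
    for (const [region, items] of Object.entries(regions)) {
      for (const [item, monthly] of Object.entries(items)) {
        monthly.forEach((value, monthIndex) => {
          rows.push({ year, region, item, month: monthIndex + 1, value });
        });
      }
    }
  }
  return rows;
}

// Derived datasets, built once and cached for the components that need them.
function buildDatasets(report) {
  const rows = flatten(report);
  const sumBy = key => rows.reduce((acc, r) => {
    acc[r[key]] = (acc[r[key]] || 0) + r.value;
    return acc;
  }, {});
  return {
    rows,                            // detail grid
    totalsByYear: sumBy('year'),     // e.g. a bar chart
    totalsByRegion: sumBy('region'), // e.g. a pie chart
  };
}
```

The cron/middle process can persist the same buildDatasets output to the aggregated table, so the browser only ever fetches the pre-computed shapes.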