DataTables in YUI3 refreshing on every update/change of data - javascript

I have been playing with DataTables in YUI3 3.5pre5 and noticed a big difference in implementation from YUI2.
When using addRow(), modifyRow(), and set(), the whole table is reloaded/redrawn/refreshed instead of only the affected elements.
For example, when a row is added, the entire table is redrawn instead of only the new row (as in YUI2).
This is a real problem if you have a lot of data and everything is redrawn because a single cell was updated, or if you need to update the data every x seconds: the table is constantly refreshing, which makes it harder to work with.
I hope I'm doing something wrong and that there is a proper way or a workaround.
Please let me know if there is a way to make the new DataTables behave correctly.
Thanks!

You're not doing it wrong; that's the current state of the code. I'll be optimizing data mutation -> UI in 3.6.0, and like I did during 3.5.0, I'll be maintaining a preview module in gallery that has the latest updates and features.
There were plenty of performance improvements that I wanted to get into 3.5.0 that I just couldn't fit in due to the architecture and feature migration from 3.4.1.
In the meantime, here's a patch that should help: https://gist.github.com/2295032
Note that it's not compatible with nodeFormatters and may have other edge cases.

Related

Limitation of Angular and browsers with regards to data loaded

I would like to know the maximum amount of data the Angular framework can handle. Say I am displaying a chart using Angular and some charting framework like Chart.js. I'd like to know how much data the browser can display properly, how much it can display with slowness, and at what point it crashes.
Your question has no simple answer, but I will try to flatten it and give a simple answer, or at least simple things to consider...
Angular (at runtime), like many other frameworks, is simply JavaScript,
so let us reduce the question to "Limitation of JavaScript and browsers with regards to data loaded".
JavaScript has no practical upper limit on the amount of memory or data it can handle;
I've seen JavaScript applications that require more than 15 GB of RAM,
and they performed well too.
So assume data size itself is not an issue (unless your application is poorly implemented, leaking memory, or just not very efficient, of course).
The main challenge, as I see it, is displaying and manipulating the information
without causing unnecessary delays or unresponsiveness.
Displaying the information - let's say you have a list (or a table) containing 1,000,000 possible gifts which you then want to display for the user to select.
Adding the list items to the document one by one is tempting, but it forces the browser to repaint after each addition (causing delays or full unresponsiveness until it finishes). A better way is to create a DOM element (call it N) that is kept only in memory, append all of the elements corresponding to the list items to N (still just an in-memory operation), and finally add N to the document so the entire list is inserted at once. That is a much better way to display a large amount of data.
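A minimal sketch of that idea, using a DocumentFragment as the in-memory node N (the gifts array and container id are illustrative, not from the question):

var gifts = [{ name: 'Book' }, { name: 'Mug' }]; // imagine 1,000,000 of these
var container = document.getElementById('gift-list'); // assumed to exist in the page

// Build everything in a detached fragment (the in-memory "N"), then touch the
// live document only once, so the browser lays out and paints a single time.
var fragment = document.createDocumentFragment();
gifts.forEach(function (gift) {
  var li = document.createElement('li');
  li.textContent = gift.name;
  fragment.appendChild(li);
});
container.appendChild(fragment); // single insertion into the document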
Manipulating the information - displaying is not enough; you will also want to move, drag, sort, and filter the data being displayed. As mentioned before, it is a bad idea to remove many elements directly from the DOM. You should instead detach the container from the document into memory, manipulate the data in it, and then add the container right back to the document. Angular does this kind of magic for you.
(Toggling the 'display: none/block' CSS property on many elements has a similar blocking effect, as I recall.)
A good practice is to implement an application/page that shows only the amount of data a human can process at a single glance. The rest should live in the application's data layer, in memory, and should be loaded for display only when actually needed or requested.
To conclude, you can deal with huge amounts of data as long as you provide a mechanism that efficiently filters the displayed information.
I hope it helps...
For further reading:
Slow and fast ways of adding elements to the DOM
A question emphasizing that JS has no fixed memory limit
CSS display attribute performance
A good discussion about the reasons for slow DOM
About using HTML5 correctly - old but still true
Once the DOM creation procedure is understood, it is much easier to display data without hurting performance or the user experience

ReactJS handling a lot of data in state

I have a React application that needs to handle a lot of data. Overall it's a simple application:
Header with a few links
Search bar
Table
The catch is that, depending on what is being searched for, the table needs to display up to 1,000 rows by about 100 columns. When the HTTP request returns this data and it is set in state, the application pretty much becomes unusable. Any other attempt to update state after it contains that data either takes forever or crashes the browser. Even when I limit the table to 20 rows x 100 columns, the state update is significantly faster, but the lag is still noticeable.
I have scoured the web for a good solution and come up short, so any ideas/help/suggestions are welcome. If Redux would help, I have no issue implementing it; I just don't want to waste time if there is no payoff.
You're probably re-rendering the entire table along with the search bar whenever you type in it. Maybe try putting the table into a subcomponent which implements shouldComponentUpdate(). Is the table data supposed to update dynamically whenever you type a character into the input? That might be expensive; you could also consider re-rendering the table only when the user clicks the "search" button next to the input.
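A minimal sketch of that suggestion, assuming a React 15-style class component whose only input is a rows prop (the component and prop names are illustrative):

import React from 'react';

class ResultsTable extends React.Component {
  shouldComponentUpdate(nextProps) {
    // Skip re-rendering while the user types in the search bar:
    // only re-render when the row data itself changes.
    return nextProps.rows !== this.props.rows;
  }
  render() {
    return (
      <table>
        <tbody>
          {this.props.rows.map((row, i) => (
            <tr key={i}>
              {row.map((cell, j) => <td key={j}>{cell}</td>)}
            </tr>
          ))}
        </tbody>
      </table>
    );
  }
}

export default ResultsTable;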
Rather than beat around the bush trying to figure out if it's a rendering issue, though, I recommend using the React performance tools*. Benchling has a solid blog post which walks through the process of performance debugging in React, but the magic is Perf.printWasted(). It will show you how much time was spent re-rendering components which were the same after the rendering (e.g. your table when you're only changing the text input). Test steps:
From the console, run: Perf.start()
Type into your input, ideally just one keystroke
From the console, run: Perf.stop()
From the console, run: Perf.printWasted()
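If Perf isn't already exposed in your build, a minimal setup for those steps (assuming React 15.x and the react-addons-perf package) looks like this:

import Perf from 'react-addons-perf'; // deprecated as of React 16
window.Perf = Perf; // expose it so the console steps above work

// In the browser console:
// Perf.start();
// ...type one keystroke into the search input...
// Perf.stop();
// Perf.printWasted(); // components that re-rendered but produced identical output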
This method is super painless, but you can also use the Chrome profiling tools to find out the same info.
*I just found out that they've been deprecated in hot-off-the-press React 16, but you're likely still using a 15.x variant which works fine.
Also, +1 for react-virtualized! It's excellent, supports many columns and layouts.
If I understood you correctly, your problem is not the amount of data but DOM rendering.
I think the best solution for you is to render only the visible part of the table. For example, try this library or something similar.
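For instance, a minimal sketch using react-virtualized's List (mentioned above): only the rows inside the viewport are actually rendered. The dimensions and the rows prop are illustrative assumptions:

import React from 'react';
import { List } from 'react-virtualized';

function ResultsList({ rows }) {
  return (
    <List
      width={800}
      height={600}
      rowCount={rows.length}
      rowHeight={30}
      rowRenderer={({ index, key, style }) => (
        <div key={key} style={style}>
          {rows[index].join(' | ')}
        </div>
      )}
    />
  );
}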

Best Table Sorter Not Jquery?

Which table sorter is the best choice? I want to do:
Default filter
Sorting
Pagination
Filter
Which one is the best choice for a table that does all of the above? And, most importantly, not a jQuery one!
This is what I am using: Table_Sort
May I suggest the dojo DataGrid.
The DataGrid (and all related classes/widgets) is separated from the actual data, so you can filter the data source any way you want. There is a wide choice of data stores, such as a REST store or a simple CSV store.
Sorting can be done in the grid itself, delegated to the store, or delegated to the server. Depending on your needs, it might be wise to sort the set on the server to keep performance high in the browser.
The grid does do paging, but not in the conventional flick-through-the-pages way. There might be a plug-in component for the grid that makes that possible, but I don't know. The current implementation just renders the pages that are visible, and when you scroll it renders additional pages.
For more info and examples:
http://docs.dojocampus.org/dojox/grid/index?action=show&redirect=dojox/grid
In my experience with the dojo grid, you will be best off using the enhanced grid (dojox.grid.EnhancedGrid). It is a bit faster than the DataGrid, and new plug-ins and features are constantly added. With a bit of reading and some time in #dojo, it is quite possible to write your own plug-in.

JavaScript data grid for millions of rows [closed]

Closed. This question is opinion-based. It is not currently accepting answers.
Closed 8 years ago.
I need to present a large number of rows of data (i.e., millions of rows) to the user in a grid using JavaScript.
The user shouldn't see pages or view only finite amounts of data at a time.
Rather, it should appear that all of the data are available.
Instead of downloading the data all at once, small chunks are downloaded as the user comes to them (i.e., by scrolling through the grid).
The rows will not be edited through this front end, so read-only grids are acceptable.
What data grids, written in JavaScript, exist for this kind of seamless paging?
(Disclaimer: I am the author of SlickGrid)
UPDATE
This has now been implemented in SlickGrid.
Please see http://github.com/mleibman/SlickGrid/issues#issue/22 for an ongoing discussion on making SlickGrid work with larger numbers of rows.
The problem is that SlickGrid does not virtualize the scrollbar itself - the scrollable area's height is set to the total height of all the rows. The rows are still being added and removed as the user is scrolling, but the scrolling itself is done by the browser. That allows it to be very fast yet smooth (onscroll events are notoriously slow). The caveat is that there are bugs/limits in the browsers' CSS engines that limit the potential height of an element. For IE, that happens to be 0x123456 or 1193046 pixels. For other browsers it is higher.
There is an experimental workaround in the "largenum-fix" branch that raises that limit significantly by populating the scrollable area with "pages" set to 1M pixels height and then using relative positioning within those pages. Since the height limit in the CSS engine seems to be different and significantly lower than in the actual layout engine, this gives us a much higher upper limit.
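A rough sketch of that paging idea (not SlickGrid's actual code; the page height, row height, and container id are assumptions):

var PAGE_HEIGHT = 1000000; // px per "page" div, kept below the per-element CSS height limit
var ROW_HEIGHT = 25;
var TOTAL_ROWS = 10000000;
var totalHeight = ROW_HEIGHT * TOTAL_ROWS;
var numPages = Math.ceil(totalHeight / PAGE_HEIGHT);

var viewport = document.getElementById('viewport'); // scrollable container, assumed to exist
for (var p = 0; p < numPages; p++) {
  var page = document.createElement('div');
  page.style.position = 'relative';
  page.style.height = Math.min(PAGE_HEIGHT, totalHeight - p * PAGE_HEIGHT) + 'px';
  viewport.appendChild(page);
}

// Rows are positioned relative to their page, so no single element
// ever has to be taller than PAGE_HEIGHT.
function placeRow(rowIndex, rowEl) {
  var top = rowIndex * ROW_HEIGHT;
  var pageIndex = Math.floor(top / PAGE_HEIGHT);
  rowEl.style.position = 'absolute';
  rowEl.style.top = (top - pageIndex * PAGE_HEIGHT) + 'px';
  viewport.children[pageIndex].appendChild(rowEl);
}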
I am still looking for a way to get to unlimited number of rows without giving up the performance edge that SlickGrid currently holds over other implementations.
Rudiger, can you elaborate on how you solved this?
https://github.com/mleibman/SlickGrid/wiki
"SlickGrid utilizes virtual rendering to enable you to easily work with hundreds of thousands of items without any drop in performance. In fact, there is no difference in performance between working with a grid with 10 rows versus a 100’000 rows."
Some highlights:
Adaptive virtual scrolling (handle hundreds of thousands of rows)
Extremely fast rendering speed
Background post-rendering for richer cells
Configurable & customizable
Full keyboard navigation
Column resize/reorder/show/hide
Column autosizing & force-fit
Pluggable cell formatters & editors
Support for editing and creating new rows
by mleibman
It's free (MIT license).
It uses jQuery.
The best Grids in my opinion are below:
Flexigrid: http://flexigrid.info/
jQuery Grid: http://www.trirand.com/blog/
jqGridView: http://plugins.jquery.com/project/jqGridView
jqxGrid: https://www.jqwidgets.com/
Ingrid: http://reconstrukt.com/ingrid/
SlickGrid http://github.com/mleibman/SlickGrid
DataTables http://www.datatables.net/index
ShieldUI http://demos.shieldui.com/web/grid-virtualization/performance-1mil-rows
Smart.Grid https://www.htmlelements.com/demos/grid/overview/
My best 3 options are jqGrid, jqxGrid and DataTables. They can work with thousands of rows and support virtualization.
I don't mean to start a flame war, but assuming your researchers are human, you don't know them as well as you think. Just because they have petabytes of data doesn't make them capable of viewing even millions of records in any meaningful way. They might say they want to see millions of records, but that's just silly. Have your smartest researchers do some basic math: assume they spend 1 second viewing each record. At that rate, it will take 1,000,000 seconds, which works out to more than six weeks of 40-hour work-weeks with no breaks for food or lavatory.
Do they (or you) seriously think one person (the one looking at the grid) can muster that kind of concentration? Are they really getting much done in that 1 second, or are they (more likely) filtering out the stuff they don't want? I suspect that after viewing a "reasonably-sized" subset, they could describe a filter to you that would automatically filter out those records.
As paxdiablo and Sleeper Smith and Lasse V Karlsen also implied, you (and they) have not thought through the requirements. On the up side, now that you've found SlickGrid, I'm sure the need for those filters became immediately obvious.
I can say with pretty good certainty that you seriously do not need to show millions of rows of data to the user.
There is no user in the world that will be able to comprehend or manage that data set so even if you technically manage to pull it off, you won't solve any known problem for that user.
Instead I would focus on why the user wants to see the data. The user does not want to see the data just to see the data, there is usually a question being asked. If you focus on answering those questions instead, then you would be much closer to something that solves an actual problem.
I recommend the Ext JS Grid with the Buffered View feature.
http://www.extjs.com/deploy/dev/examples/grid/buffer.html
(Disclaimer: I am the author of w2ui)
I have recently written an article on how to implement a JavaScript grid with 1 million records (http://w2ui.com/web/blog/7/JavaScript-Grid-with-One-Million-Records). I found that ultimately there are 3 restrictions that prevent taking it higher:
Height of the div has a limit (can be overcome by virtual scrolling)
Operations such as sort and search start being slow after 1 million records or so
RAM is limited because the data is stored in a JavaScript array
I have tested the grid with 1 million records in all browsers except IE, and it performs well. See the article for demos and examples.
dojox.grid.DataGrid offers a JS abstraction for data so you can hook it up to various backends with provided dojo.data stores or write your own. You'll obviously need one that supports random access for this many records. DataGrid also provides full accessibility.
Edit: here's a link to Matthew Russell's article that should provide the example you need: viewing millions of records with dojox.grid. Note that it uses the old version of the grid, but the concepts are the same; there were just some incompatible API improvements.
Oh, and it's totally free open source.
I used jQuery Grid Plugin, it was nice.
Demos
Here are a couple of optimizations you can apply to speed things up. Just thinking out loud.
Since the number of rows can be in the millions, you will want a caching system just for the JSON data from the server. I can't imagine anybody wanting to download all X million items, but if they did, it would be a problem. This little test on Chrome, with an array of 20M+ integers, crashes constantly on my machine.
// Building an array of 20M+ integers: this alone can exhaust memory and crash the page.
var data = [];
for (var i = 0; i < 20000000; i++) {
    data.push(i);
}
console.log(data.length);
You could use LRU or some other caching algorithm and have an upper bound on how much data you're willing to cache.
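A minimal LRU sketch for caching row chunks, keyed by chunk index (a Map keeps insertion order, so the oldest key is the least recently used; the names are illustrative):

function RowCache(maxChunks) {
  this.maxChunks = maxChunks;
  this.chunks = new Map();
}
RowCache.prototype.get = function (chunkIndex) {
  if (!this.chunks.has(chunkIndex)) return null;
  var rows = this.chunks.get(chunkIndex);
  this.chunks.delete(chunkIndex); // re-insert to mark as most recently used
  this.chunks.set(chunkIndex, rows);
  return rows;
};
RowCache.prototype.put = function (chunkIndex, rows) {
  if (this.chunks.has(chunkIndex)) this.chunks.delete(chunkIndex);
  this.chunks.set(chunkIndex, rows);
  if (this.chunks.size > this.maxChunks) {
    this.chunks.delete(this.chunks.keys().next().value); // evict the least recently used chunk
  }
};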
For the table cells themselves, constructing/destroying DOM nodes can be expensive. Instead, you could just pre-define X number of cells, and whenever the user scrolls to a new position, inject the JSON data into those cells. The scrollbar would then have no direct relationship to how much space (height) is required to represent the entire dataset. You could arbitrarily set the table container's height, say 5000px, and map that to the total number of rows. For example, if the container's height is 5000px and there are 10M rows in total, then the starting row ≈ (scroll.top / 5000) * 10M, where scroll.top is the scroll distance from the top of the container. Small demo here.
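A minimal sketch of that mapping, using the numbers from the paragraph above (the container id and the renderRows stub are assumptions):

var CONTAINER_HEIGHT = 5000; // px, the arbitrary fixed height of the scroll area
var TOTAL_ROWS = 10000000;
var VISIBLE_ROWS = 50; // number of pre-defined rows whose cells get reused

var container = document.getElementById('grid-container');
container.addEventListener('scroll', function () {
  var startRow = Math.floor((container.scrollTop / CONTAINER_HEIGHT) * TOTAL_ROWS);
  renderRows(startRow, VISIBLE_ROWS);
});

function renderRows(startRow, count) {
  // Fetch rows [startRow, startRow + count) from the cache/server and
  // write them into the existing cells (not shown here).
}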
To detect when to request more data, ideally an object should act as a mediator that listens to scroll events. This object keeps track of how fast the user is scrolling, and when it looks like the user is slowing down or has completely stopped, makes a data request for the corresponding rows. Retrieving data in this fashion means your data is going to be fragmented, so the cache should be designed with that in mind.
Also, the browser's limit on maximum outgoing connections can play an important part. A user may scroll to a certain position, firing an AJAX request, but before it finishes they can scroll to some other portion. If the server is not responsive enough, the requests queue up and the application looks unresponsive. You could route all requests through a request manager that can cancel pending requests to make room.
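A modern sketch of such a request manager using fetch and AbortController (the original answer predates both; the /rows endpoint and its parameters are assumptions):

var pending = null;

function requestRows(startRow, count, onData) {
  if (pending) pending.abort(); // cancel the request for a position the user has already scrolled past
  pending = new AbortController();
  fetch('/rows?start=' + startRow + '&count=' + count, { signal: pending.signal })
    .then(function (res) { return res.json(); })
    .then(onData)
    .catch(function (err) {
      if (err.name !== 'AbortError') throw err; // ignore cancelled requests
    });
}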
I know it's an old question, but still: there is also dhtmlxGrid, which can handle millions of rows. There is a demo with 50,000 rows, but the number of rows that can be loaded/processed in the grid is unlimited.
Disclaimer: I'm from DHTMLX team.
I suggest you read this
http://www.sitepen.com/blog/2008/11/21/effective-use-of-jsonreststore-referencing-lazy-loading-and-more/
Disclaimer: I have used YUI DataTable heavily, without any headaches, for a long time. It is powerful and stable. For your needs, you can use a ScrollingDataTable, which supports:
x-scrolling
y-scrolling
xy-scrolling
A powerful Event mechanism
For what you need, I think what you want is the tableScrollEvent. Its API says:
Fired when a fixed scrolling DataTable has a scroll.
As each DataTable uses a DataSource, you can monitor its data through the tableScrollEvent, together with the render loop size, in order to populate your ScrollingDataTable according to your needs.
The render loop size documentation says:
In cases where your DataTable needs to display the entirety of a very large set of data, the renderLoopSize config can help manage browser DOM rendering so that the UI thread does not get locked up on very large tables. Any value greater than 0 will cause the DOM rendering to be executed in setTimeout() chains that render the specified number of rows in each loop. The ideal value should be determined per implementation since there are no hard and fast rules, only general guidelines:
By default renderLoopSize is 0, so all rows are rendered in a single loop. A renderLoopSize > 0 adds overhead so use thoughtfully.
If your set of data is large enough (number of rows X number of Columns X formatting complexity) that users experience latency in the visual rendering and/or it causes the script to hang, consider setting a renderLoopSize.
A renderLoopSize under 50 probably isn't worth it. A renderLoopSize > 100 is probably better.
A data set is probably not considered large enough unless it has hundreds and hundreds of rows.
Having a renderLoopSize > 0 and < total rows does cause the table to be rendered in one loop (same as renderLoopSize = 0) but it also triggers functionality such as post-render row striping to be handled from a separate setTimeout thread.
For instance
// Render 100 rows per loop
var dt = new YAHOO.widget.DataTable(<WHICH_DIV_WILL_STORE_YOUR_DATATABLE>, <HOW_YOUR_TABLE_IS_STRUCTURED>, <WHERE_DOES_THE_DATA_COME_FROM>, {
    renderLoopSize: 100
});
<WHERE_DOES_THE_DATA_COME_FROM> is just a single DataSource. It can be JSON, a JavaScript function, XML, or even a single HTML element.
Here you can see a simple tutorial, provided by me. Be aware that no other DataTable plugin supports single and double click at the same time; YUI DataTable does. What's more, you can use it alongside jQuery without any headaches.
Feel free to ask anything else you want to know about YUI DataTable.
regards,
I kind of fail to see the point; for jqGrid you can use the virtual scrolling functionality:
http://www.trirand.net/aspnetmvc/grid/performancevirtualscrolling
but then again, millions of rows with filtering can be done:
http://www.trirand.net/aspnetmvc/grid/performancelinq
I really fail to see the point of "as if there are no pages", though. I mean, there is no way to display 1,000,000 rows at once in the browser - that's 10 MB of raw HTML - so I don't see why users would not want to see pages.
Anyway...
The best approach I can think of is to load a chunk of data in JSON format on every scroll, or once some limit is reached before the scrolling ends. JSON can easily be converted to objects, so table rows can be constructed easily and unobtrusively.
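A rough sketch of that approach: fetch the next JSON chunk shortly before the user reaches the bottom of the scrollable table (the endpoint, element ids, and chunk size are assumptions):

var CHUNK_SIZE = 200;
var nextOffset = 0;
var loading = false;

var wrapper = document.getElementById('table-wrapper'); // the scrollable div
var tbody = document.querySelector('#data-table tbody');

wrapper.addEventListener('scroll', function () {
  var nearBottom = wrapper.scrollTop + wrapper.clientHeight >= wrapper.scrollHeight - 200;
  if (nearBottom && !loading) loadChunk();
});

function loadChunk() {
  loading = true;
  fetch('/rows?offset=' + nextOffset + '&limit=' + CHUNK_SIZE)
    .then(function (res) { return res.json(); })
    .then(function (rows) { // rows: an array of arrays of cell values
      var frag = document.createDocumentFragment();
      rows.forEach(function (row) {
        var tr = document.createElement('tr');
        row.forEach(function (cell) {
          var td = document.createElement('td');
          td.textContent = cell;
          tr.appendChild(td);
        });
        frag.appendChild(tr);
      });
      tbody.appendChild(frag); // append the whole chunk in one DOM operation
      nextOffset += rows.length;
      loading = false;
    });
}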
I would highly recommend Open Rico.
It is difficult to implement in the beginning, but once you grasp it you will never look back.
I know this question is a few years old, but jqgrid now supports virtual scrolling:
http://www.trirand.com/blog/phpjqgrid/examples/paging/scrollbar/default.php
but with pagination disabled
I suggest Sigma Grid; it has built-in paging features which can support millions of rows. You may also need remote paging to do it.
See the demo:
http://www.sigmawidgets.com/products/sigma_grid2/demos/example_master_details.html
Take a look at dGrid:
https://dgrid.io/
I agree that users will NEVER, EVER need to view millions of rows of data all at once, but dGrid can display them quickly (a screenful at a time).
Don't boil the ocean to make a cup of tea.

What's a good JavaScript grid with tabs?

I have 3 sets of tabular data I want to display with a JavaScript framework in ASP.NET MVC. I know I can embed a separate grid in a tab, but this seems inefficient, especially when large datasets are involved, since I imagine 3 separate grids would be created. I haven't found a JavaScript datagrid which emulates what a spreadsheet does with multiple tabs. This example from YUI might come close though:
http://developer.yahoo.com/yui/examples/datatable/dt_dynamicfilter_source.html
I'm a little familiar with jQuery, but would be willing to switch to any framework which makes this easy. I don't really need to edit the data. Any suggestions?
EDIT: I didn't mean this to be about jQuery. Maybe some details about my scenario would help, as suggested in one of the comments. I want to display tabular data from an ordering system containing thousands of records. I'd like 3 tabs:
All orders entered in the system which haven't been paid for yet.
All orders from a specific vendor.
All orders which have been paid for.
Since each category has thousands of rows, I only want to load data if the user starts paging.
I thought having 3 tabs with 3 separate grids (one within each tab) wouldn't be performant. I haven't actually tried though, so I'm probably guilty of prematurely optimizing. I'm looking for a grid with tab support built-in. I don't think there's one for jQuery. Perhaps ExtJS?
Since you tagged this with Ext JS, I'll mention that it's quite simple to render grids into tabs using Ext JS. It also supports deferred load/render, so that only the first tab/grid would load initially, then the others would be loaded on first access. Without knowing your specific requirements it's hard to comment further.
EDIT (based on edited question): Ext grids don't directly support tabbing, but they can be embedded within a TabPanel as I mentioned for the same effect. However, based on your description, it sounds more like a filtering scenario to me. I don't see the point in having the overhead of multiple grids when only one will ever be visible, and each one's purpose is to show a specific view (i.e. filter) of the same data. I would just have a single grid with a toolbar or some other method of providing your toggle between filters, and use Ext's built-in store filtering/querying to create your views on demand. The Ext grid supports paging out of the box (client or server, in your case it would be server for thousands of records). There is also a very popular plugin called LiveGrid that provides for virtual scroll-paging of large data sets.
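A minimal sketch of that single-grid-plus-filters approach, assuming an Ext JS 3-style API (the store fields, endpoint, container id, and button labels are illustrative, not from the question):

var store = new Ext.data.JsonStore({
  url: '/orders', // server endpoint returning the order records (assumption)
  root: 'rows',
  fields: ['id', 'vendor', 'paid'],
  autoLoad: true
});

var grid = new Ext.grid.GridPanel({
  store: store,
  renderTo: 'grid-container',
  height: 400,
  columns: [
    { header: 'Order', dataIndex: 'id' },
    { header: 'Vendor', dataIndex: 'vendor' },
    { header: 'Paid', dataIndex: 'paid' }
  ],
  // One toolbar toggles between the three "tabs" by filtering the same store.
  tbar: [
    { text: 'Unpaid', handler: function () { store.filterBy(function (r) { return !r.get('paid'); }); } },
    { text: 'Vendor X', handler: function () { store.filterBy(function (r) { return r.get('vendor') === 'Vendor X'; }); } },
    { text: 'Paid', handler: function () { store.filterBy(function (r) { return r.get('paid'); }); } },
    { text: 'All', handler: function () { store.clearFilter(); } }
  ]
});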
I'm not necessarily advocating Ext over any other framework -- I just happen to be most familiar with it and I think it could solve your problem quite nicely. I would suggest trying it out for yourself to be sure.
jQuery Grid is kinda what people use a lot. I use it and it's pretty good.
jqGrid Link
I wouldn't draw a grid with three tabs. I'd use a single grid with a tab control and then load data via jQuery as required.
Or maybe have three PartialViews that you can load dynamically when you hit a tab.
You could also use dhtmlx grid.
You could use JS tab object to create tabs.
And use javascript grid framework to create grids and populate data into grids.
