virtual scrolling, drag and drop in angular 7 - javascript

I searched the internet for these new features of Angular 7 but didn't fully understand them.
I went through drag and drop and virtual scrolling.
Could someone please shed some light on these?

Consider a case where you have to display a huge chunk of data. Either you implement pagination, which means an API call per page (if the data changes frequently), or you load everything at once, which will slow down or kill the UI process.
Virtual scroll is about loading a huge chunk of data into the DOM without hampering performance.
Its key features are:
Data is displayed according to the size of the viewport, i.e. if your container div is 500px it will show around 10-15 rows at a time.
As you scroll, these rows are swapped out, but the number of elements in the DOM stays constant.
This is handy when you have to show a huge chunk of data without implementing pagination.
Thus it improves UI performance.
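For reference, here is a minimal sketch of this using the Angular CDK's ScrollingModule (available since Angular 7); the 500px container and 50px row height are illustrative assumptions:

// Minimal Angular CDK virtual scrolling sketch.
// ScrollingModule from '@angular/cdk/scrolling' must be imported in your NgModule.
import { Component } from '@angular/core';

@Component({
  selector: 'app-virtual-list',
  template: `
    <cdk-virtual-scroll-viewport itemSize="50" style="height: 500px">
      <!-- Only the rows visible inside the 500px viewport exist in the DOM -->
      <div *cdkVirtualFor="let item of items" style="height: 50px">{{ item }}</div>
    </cdk-virtual-scroll-viewport>
  `,
})
export class VirtualListComponent {
  // One million rows of data; the viewport renders only 10-15 at a time.
  items = Array.from({ length: 1000000 }, (_, i) => `Row #${i}`);
}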
I implemented a virtual list displaying multiple columns where the array length was 1 million, which is a huge amount of data to display at a time.
The virtual list is implemented on top of virtual scroll and supports multiple columns.
Check out a detailed explanation and the code here:
https://www.codeproject.com/Articles/5260356/Virtual-List-in-Angular

Related

Angular 6 performance issue with large table [duplicate]

I have an Angular 6 performance issue. The page contains a large table with 100 rows, and each row has 100 columns. The page gets laggy when I try to use libraries like ng-select or the ng-bootstrap datepicker, even though those libraries exchange no data with the table. Even when the ng-select is just embedded in the HTML with no data filled in, opening and closing its dropdown is laggy and takes about 0.5 s. The same goes for other libraries. When I reduce the table to 10 rows, the lag improves significantly. Why does this happen?
Another observation: when I use native HTML tags such as select/option, there is no lag at all; they react instantly. How can I improve performance in my situation? Thanks!
Code is basically something like this.
app-component:
<ng-select></ng-select>
<row-component *ngFor="let basket of baskets"></row-component>
row-component:
<div *ngFor="let apple of apples">
blah blah blah
</div>
It could also just be an issue of how many DOM elements are being created and displayed on the page. You could try row virtualization, which only renders the rows that are visible on screen.
The fact that ng-select makes the page laggier than native HTML tags makes me think the additional event listeners from the Angular components have also decreased the performance of your webpage.
Ag-Grid has a great article about how they optimized displaying entries in a table. https://www.ag-grid.com/ag-grid-performance-hacks/
Do you use any library for the table like Angular Material?
One possible (and very common) solution is virtual scrolling:
https://material.angular.io/cdk/scrolling/overview#virtual-scrolling
There are a couple of things you may want to consider:
First, do you need to display all 100 columns at once? Could you break the columns up into more manageable chunks, perhaps with a tabbed interface (something like the Angular Bootstrap Tabset: https://ng-bootstrap.github.io/#/components/tabset/examples), grouping related columns into tabs to reduce the number of columns you have to display on one page.
Second, there will be a performance issue once you exceed a certain number of rows, which is where paging the data is a good option (again, maybe look at Bootstrap paging: https://ng-bootstrap.github.io/#/components/pagination/examples). You can set a hard limit, say 40 rows per page, or set the row limit dynamically: get the browser window size, subtract the space you need for menus etc., and divide the rest by the row height to determine how many rows fit in the available space, and use that as your page size (see the sketch below). This way you only ever display a subset of the columns, and only as many rows as fit on screen without scrolling, and you should find your performance improves dramatically.
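A rough sketch of that dynamic page-size calculation (the row height and reserved space are assumptions you would measure in your own layout):

const ROW_HEIGHT = 40;   // assumed height of one table row, in px
const RESERVED_PX = 250; // assumed space taken by menus, header, pager

function computePageSize(): number {
  const available = window.innerHeight - RESERVED_PX;
  // Never go below one row, even in a very small window.
  return Math.max(1, Math.floor(available / ROW_HEIGHT));
}

// e.g. feed the result to your pagination component as the page size
const pageSize = computePageSize();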

HTML Large Table scrolling very slowly

I have a table with thousands upon thousands of rows with 4 cells in each.
https://streamable.com/pywk9
You can see above that my table is very big (no idea if the video shows the laggy scrolling well).
I understand this is because the browser has to actually render everything, and with large datasets that can take time.
Is there a way WITHOUT pagination to make it more bearable?
Perhaps there is some way to render only the content shown in the browser's view, like a certain AngularJS component does (without needing AngularJS)?
A table that size is useful to nobody. You say you don't want to paginate, but why? Pagination is a usability feature for a reason. Just implement pagination and searching; a grid with more than a few thousand rows is not going to be useful to anyone.
If you really need the table to be that big and scrollable, the technique is to remove the items that are not in view from the DOM, as sketched below. As you scroll, you create DOM objects for items as they come into view and remove them as they move out of view. This way the browser never has to deal with thousands of DOM objects at once.
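A minimal sketch of that technique, assuming a fixed row height, a scrollable 'viewport' div, and a relatively positioned 'spacer' div inside it (the element ids and the 30px row height are assumptions):

const ROW_HEIGHT = 30; // px, assumed constant for every row

const viewport = document.getElementById('viewport') as HTMLElement; // overflow-y: auto
const spacer = document.getElementById('spacer') as HTMLElement;     // position: relative
const rows: string[] = Array.from({ length: 100000 }, (_, i) => `Row ${i}`);

// The spacer gives the scrollbar the full virtual height.
spacer.style.height = `${rows.length * ROW_HEIGHT}px`;

function render(): void {
  const first = Math.floor(viewport.scrollTop / ROW_HEIGHT);
  const count = Math.ceil(viewport.clientHeight / ROW_HEIGHT) + 1;
  // Keep only the visible slice in the DOM, absolutely positioned
  // at each row's virtual offset inside the spacer.
  spacer.innerHTML = rows
    .slice(first, first + count)
    .map((text, i) =>
      `<div style="position:absolute; left:0; right:0;` +
      ` top:${(first + i) * ROW_HEIGHT}px; height:${ROW_HEIGHT}px;">${text}</div>`)
    .join('');
}

viewport.addEventListener('scroll', render);
render();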

Better to load javascript navigation hierarchy all at once or level by level?

I am working on some custom jQuery/javascript navigation for a site and I am curious about the performance implications of a design decision.
The way it works is for every option there are up to 8 child options, this hierarchy can go 4 levels deep. I believe this makes for 8^4 or 4096 possible navigation items (probably less but this is the max). These relationships are defined on the server side.
Currently I am working with test data, so there are only about 50 navigation items. When the page loads, I create every navigation item and then only display what is needed for the current selection.
Should I consider rewriting this to only load the items that are needed when a selection is made via an AJAX call or something? I am concerned that my current approach may not scale well if it goes up to 4096 navigation items.
If having 4096 navigation items is a real possibility then you'll have to do something like what you're describing. Simply loading the items into the DOM will take considerable time and further processing will cause greater delays and a poor experience.
For a small number of items, it probably isn't worth your while to over-engineer the solution. However, the performance gains on a large number of items would be expected to be significant.
Here is an example of on-demand loading in a Telerik Treeview. I'm not advocating purchasing the controls (great controls but expensive) however it is an excellent example of what is possible. Coding this on your own wouldn't be difficult to do and, as you can see, makes for a great user experience.
My two cents: if you have the time, do it now before things get even more complicated/difficult to do later.
Downloading them all at the same time is definitely an option, though loading them into the DOM is another story. If you really reach the 4096-item limit, you could be pushing down 1-2 megabytes per page load (not too much considering image sizes). Unless you are looking at more data (say 16 nodes, 8 levels deep: 16^8), it isn't a major concern.
You could always load 2 levels deep (8^2 = 64 items), then when the user opens a panel, load everything for that panel; the second layer they have to click through should give you enough time to load the rest of the values (see the sketch below).
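A sketch of that two-levels-up-front, fetch-on-open approach (the /nav/children endpoint and its response shape are assumptions):

interface NavItem {
  id: string;
  label: string;
  children?: NavItem[]; // undefined until fetched from the server
}

async function loadChildren(item: NavItem): Promise<NavItem[]> {
  if (item.children) return item.children; // already loaded
  // Fetch this panel's subtree the first time it is opened; the
  // click-through on the second level hides most of the latency.
  const res = await fetch(`/nav/children?parent=${item.id}`);
  item.children = (await res.json()) as NavItem[];
  return item.children;
}

// e.g. wired to a panel-open handler:
// panel.addEventListener('click', () => loadChildren(item).then(renderPanel));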

I have a couple thousand javascript objects which I need to display and scroll through. What are my options?

I'm working off of designs which show a scrollable box containing a list of a user's "contacts".
Users may have up to 10,000 contacts.
For now assume that all contacts are already in memory, and I'm simply trying to draw them. If you want to comment on the question of how wise it is to load 10k items of data in a browser, please do it here.
There are two techniques I've seen for managing a huge list like this inside a scrollable box.
Just Load Them All
This seems to be how gmail approaches displaying contacts. I currently have 2k contacts in gmail. If I click "all contacts", I get a short delay, then the scrollable box at the right begins to fill with contacts. It looks like they're breaking the task into chunks, probably separating the DOM additions into smaller steps and putting those steps into timeouts in order to not freeze the entire interface while the process completes.
pros:
Simple to implement
Uses native UI elements the way they were designed to be used
Google does it, it must be ok
cons:
Not totally snappy -- there is some delay involved, even on my development machine running Firefox. There will probably be quite a lot of delay for a user on a slower machine running IE6.
I don't know what limits there are on how large I can allow the DOM to grow, but it seems to me there must be some limit to how many nodes I can add. How will an older browser on an older machine react if I ask it to hold 10k nodes in the DOM?
Draw As Needed
This seems to be how Yahoo deals with displaying contact lists. They create a scrollable box, put a super-tall, empty placeholder inside it, and draw contacts only when the user scrolls to reveal them.
pros:
DOM nodes are drawn only as needed, so there's very little delay while loading, and much less risk of overloading the browser with too many DOM nodes
cons:
Trickier to implement, and more opportunity for bugs. For example, if I scroll quickly in the yahoo mail contact manager as soon as the page loads, I'm able to get contacts to load on top of one another. Of course, bugs can be worked out, but obviously this approach will introduce more bugs.
There's still the potential to add a huge number of DOM nodes. If the user scrolls slowly through the entire list, every item will get drawn, and we'll still end up with an enormous DOM
Are there other approaches in common use for displaying a huge list? Any more pros or cons with each approach to add? Any experience/problems/success using either of these approaches?
I would chunk the DOM writing into manageable amounts (say, 25 or 50 rows), then draw the chunks on demand. I wouldn't worry about removing the old DOM elements until the number drawn gets quite large.
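A sketch of that chunked writing (the 50-row chunk size is arbitrary; a DocumentFragment keeps each batch down to a single DOM insertion):

function appendInChunks(container: HTMLElement, items: string[], chunkSize = 50): void {
  let index = 0;

  function writeChunk(): void {
    // Build one chunk off-DOM, then attach it in a single operation.
    const fragment = document.createDocumentFragment();
    for (const text of items.slice(index, index + chunkSize)) {
      const row = document.createElement('div');
      row.textContent = text;
      fragment.appendChild(row);
    }
    container.appendChild(fragment);
    index += chunkSize;
    if (index < items.length) {
      setTimeout(writeChunk, 0); // yield so clicks and scrolls stay responsive
    }
  }

  writeChunk();
}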
I would divide the contacts into chunks, and keep a sort of view buffer alive that changes which chunks are written to the DOM as the user scrolls through the list. That way the total number of dom elements never rises above a certain threshold. Could be fairly tricky to implement, however.
Using this method you can also dynamically modify the size of chunks and the size of the buffer, depending on the browser's performance (dynamic performance optimization), which could help out quite a bit.
It's definitely non-trivial to implement, however.
The bug you see in Yahoo may be due to absolutely positioned elements: if you keep your CSS simple and avoid absolutely/relatively positioning your contact entries, it shouldn't happen.

JavaScript data grid for millions of rows [closed]

I need to present a large number of rows of data (i.e. millions of rows) to the user in a grid using JavaScript.
The user shouldn't see pages or view only finite amounts of data at a time.
Rather, it should appear that all of the data are available.
Instead of downloading the data all at once, small chunks are downloaded as the user comes to them (i.e. by scrolling through the grid).
The rows will not be edited through this front end, so read-only grids are acceptable.
What data grids, written in JavaScript, exist for this kind of seamless paging?
(Disclaimer: I am the author of SlickGrid)
UPDATE
This has now been implemented in SlickGrid.
Please see http://github.com/mleibman/SlickGrid/issues#issue/22 for an ongoing discussion on making SlickGrid work with larger numbers of rows.
The problem is that SlickGrid does not virtualize the scrollbar itself - the scrollable area's height is set to the total height of all the rows. The rows are still being added and removed as the user is scrolling, but the scrolling itself is done by the browser. That allows it to be very fast yet smooth (onscroll events are notoriously slow). The caveat is that there are bugs/limits in the browsers' CSS engines that limit the potential height of an element. For IE, that happens to be 0x123456 or 1193046 pixels. For other browsers it is higher.
There is an experimental workaround in the "largenum-fix" branch that raises that limit significantly by populating the scrollable area with "pages" set to 1M pixels height and then using relative positioning within those pages. Since the height limit in the CSS engine seems to be different and significantly lower than in the actual layout engine, this gives us a much higher upper limit.
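A sketch of the arithmetic behind that paging workaround (the page and row heights are illustrative values, not SlickGrid's actual implementation):

const PAGE_HEIGHT = 1000000; // px per virtual 'page' (assumed)
const ROW_HEIGHT = 25;       // px per row (assumed)

// Map a row index to a page element plus a relative offset inside it,
// so no single element ever exceeds the browser's height limit.
function positionForRow(row: number): { page: number; topWithinPage: number } {
  const virtualTop = row * ROW_HEIGHT;
  return {
    page: Math.floor(virtualTop / PAGE_HEIGHT),
    topWithinPage: virtualTop % PAGE_HEIGHT,
  };
}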
I am still looking for a way to get to unlimited number of rows without giving up the performance edge that SlickGrid currently holds over other implementations.
https://github.com/mleibman/SlickGrid/wiki
"SlickGrid utilizes virtual rendering to enable you to easily work with hundreds of thousands of items without any drop in performance. In fact, there is no difference in performance between working with a grid with 10 rows versus a 100’000 rows."
Some highlights:
Adaptive virtual scrolling (handle hundreds of thousands of rows)
Extremely fast rendering speed
Background post-rendering for richer cells
Configurable & customizable
Full keyboard navigation
Column resize/reorder/show/hide
Column autosizing & force-fit
Pluggable cell formatters & editors
Support for editing and creating new rows.
by mleibman
It's free (MIT license).
It uses jQuery.
The best Grids in my opinion are below:
Flexigrid: http://flexigrid.info/
jQuery Grid: http://www.trirand.com/blog/
jqGridView: http://plugins.jquery.com/project/jqGridView
jqxGrid: https://www.jqwidgets.com/
Ingrid: http://reconstrukt.com/ingrid/
SlickGrid http://github.com/mleibman/SlickGrid
DataTables http://www.datatables.net/index
ShieldUI http://demos.shieldui.com/web/grid-virtualization/performance-1mil-rows
Smart.Grid https://www.htmlelements.com/demos/grid/overview/
My best 3 options are jqGrid, jqxGrid and DataTables. They can work with thousands of rows and support virtualization.
I don't mean to start a flame war, but assuming your researchers are human, you don't know them as well as you think. Just because they have petabytes of data doesn't make them capable of viewing even millions of records in any meaningful way. They might say they want to see millions of records, but that's just silly. Have your smartest researchers do some basic math: assume they spend 1 second viewing each record. At that rate, a million records takes 1,000,000 seconds, which works out to more than six weeks of 40-hour work-weeks with no breaks for food or lavatory.
Do they (or you) seriously think one person (the one looking at the grid) can muster that kind of concentration? Are they really getting much done in that 1 second, or are they (more likely) filtering out the stuff they don't want? I suspect that after viewing a "reasonably-sized" subset, they could describe a filter to you that would automatically filter out those records.
As paxdiablo and Sleeper Smith and Lasse V Karlsen also implied, you (and they) have not thought through the requirements. On the up side, now that you've found SlickGrid, I'm sure the need for those filters became immediately obvious.
I can say with pretty good certainty that you seriously do not need to show millions of rows of data to the user.
There is no user in the world that will be able to comprehend or manage that data set so even if you technically manage to pull it off, you won't solve any known problem for that user.
Instead I would focus on why the user wants to see the data. The user does not want to see the data just to see the data, there is usually a question being asked. If you focus on answering those questions instead, then you would be much closer to something that solves an actual problem.
I recommend the Ext JS Grid with the Buffered View feature.
http://www.extjs.com/deploy/dev/examples/grid/buffer.html
(Disclaimer: I am the author of w2ui)
I have recently written an article on how to implement a JavaScript grid with 1 million records (http://w2ui.com/web/blog/7/JavaScript-Grid-with-One-Million-Records). I discovered that there are ultimately 3 restrictions that prevent taking it higher:
Height of the div has a limit (can be overcome by virtual scrolling)
Operations such as sort and search start being slow after 1 million records or so
RAM is limited because data is stored in JavaScript array
I have tested the grid with 1 million records in every browser except IE, and it performs well. See the article for demos and examples.
dojox.grid.DataGrid offers a JS abstraction for data so you can hook it up to various backends with provided dojo.data stores or write your own. You'll obviously need one that supports random access for this many records. DataGrid also provides full accessibility.
Edit: here's a link to Matthew Russell's article that should provide the example you need, viewing millions of records with dojox.grid. Note that it uses the old version of the grid, but the concepts are the same; there were just some incompatible API improvements.
Oh, and it's totally free open source.
I used the jQuery Grid Plugin; it was nice.
Demos
Here are a couple of optimizations you can apply to speed things up. Just thinking out loud.
Since the number of rows can be in the millions, you will want a caching system just for the JSON data from the server. I can't imagine anybody wanting to download all X million items, but if they did, it would be a problem. This little test on Chrome, for an array of 20M+ integers, crashes constantly on my machine:
var data = [];
for (var i = 0; i < 20000000; i++) {
  data.push(i);
}
console.log(data.length);
You could use LRU or some other caching algorithm and have an upper bound on how much data you're willing to cache.
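A minimal LRU sketch along those lines (JavaScript's Map preserves insertion order, so the first key is always the least recently used; the 200-chunk bound is arbitrary):

class LruCache<K, V> {
  private map = new Map<K, V>();
  constructor(private maxEntries: number) {}

  get(key: K): V | undefined {
    const value = this.map.get(key);
    if (value !== undefined) {
      // Refresh recency by re-inserting the entry at the end.
      this.map.delete(key);
      this.map.set(key, value);
    }
    return value;
  }

  set(key: K, value: V): void {
    if (this.map.has(key)) this.map.delete(key);
    this.map.set(key, value);
    if (this.map.size > this.maxEntries) {
      // Evict the least recently used entry (the Map's first key).
      this.map.delete(this.map.keys().next().value as K);
    }
  }
}

// e.g. cache up to 200 chunks of rows keyed by chunk index
const chunkCache = new LruCache<number, unknown[]>(200);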
For the table cells themselves, I think constructing/destroying DOM nodes can be expensive. Instead, you could just pre-define X number of cells, and whenever the user scrolls to a new position, inject the JSON data into those cells. The scrollbar would have virtually no direct relationship to how much space (height) is required to represent the entire dataset. You could arbitrarily set the table container's height, say 5000px, and map that to the total number of rows. For example, if the container's height is 5000px and there are 10M rows in total, then the starting row ≈ (scroll.top/5000) * 10M, where scroll.top represents the scroll distance from the top of the container. Small demo here.
To detect when to request more data, an object should ideally act as a mediator that listens to scroll events. This object keeps track of how fast the user is scrolling, and when it looks like the user is slowing down or has completely stopped, it makes a data request for the corresponding rows (see the sketch below). Retrieving data in this fashion means your data is going to be fragmented, so the cache should be designed with that in mind.
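A sketch of that mediator: wait until scroll events pause before requesting rows, so fast fling-scrolls don't flood the server (the 150ms threshold is an assumption):

function onScrollSettled(viewport: HTMLElement, requestRows: (top: number) => void): void {
  let timer: ReturnType<typeof setTimeout> | undefined;
  viewport.addEventListener('scroll', () => {
    if (timer !== undefined) clearTimeout(timer);
    // 150ms with no scroll events counts as 'slowed down or stopped'.
    timer = setTimeout(() => requestRows(viewport.scrollTop), 150);
  });
}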
Also the browser limits on maximum outgoing connections can play an important part. A user may scroll to a certain position which will fire an AJAX request, but before that finishes the user can scroll to some other portion. If the server is not responsive enough the requests would get queued up and the application will look unresponsive. You could use a request manager through which all requests are routed, and it can cancel pending requests to make space.
I know it's an old question, but still: there is also dhtmlxGrid, which can handle millions of rows. There is a demo with 50,000 rows, but the number of rows that can be loaded/processed in the grid is unlimited.
Disclaimer: I'm from DHTMLX team.
I suggest you read this
http://www.sitepen.com/blog/2008/11/21/effective-use-of-jsonreststore-referencing-lazy-loading-and-more/
Disclaimer: I have used YUI DataTable heavily for a long time without a headache. It is powerful and stable. For your needs, you can use a ScrollingDataTable, which supports:
x-scrolling
y-scrolling
xy-scrolling
A powerful Event mechanism
For what you need, I think you want the tableScrollEvent. Its API says:
Fired when a fixed scrolling DataTable has a scroll.
Since each DataTable uses a DataSource, you can monitor its data through tableScrollEvent along with the render loop size in order to populate your ScrollingDataTable according to your needs.
The render loop size documentation says:
In cases where your DataTable needs to display the entirety of a very large set of data, the renderLoopSize config can help manage browser DOM rendering so that the UI thread does not get locked up on very large tables. Any value greater than 0 will cause the DOM rendering to be executed in setTimeout() chains that render the specified number of rows in each loop. The ideal value should be determined per implementation since there are no hard and fast rules, only general guidelines:
By default renderLoopSize is 0, so all rows are rendered in a single loop. A renderLoopSize > 0 adds overhead so use thoughtfully.
If your set of data is large enough (number of rows X number of Columns X formatting complexity) that users experience latency in the visual rendering and/or it causes the script to hang, consider setting a renderLoopSize.
A renderLoopSize under 50 probably isn't worth it. A renderLoopSize > 100 is probably better.
A data set is probably not considered large enough unless it has hundreds and hundreds of rows.
Having a renderLoopSize > 0 and < total rows does cause the table to be rendered in one loop (same as renderLoopSize = 0) but it also triggers functionality such as post-render row striping to be handled from a separate setTimeout thread.
For instance
// Render 100 rows per loop
var dt = new YAHOO.widget.DataTable(
  <WHICH_DIV_WILL_STORE_YOUR_DATATABLE>,
  <HOW_YOUR_TABLE_IS_STRUCTURED>,
  <WHERE_DOES_THE_DATA_COME_FROM>,
  { renderLoopSize: 100 }
);
<WHERE_DOES_THE_DATA_COME_FROM> is just a single DataSource. It can be JSON, a JS function, XML, or even a single HTML element.
Here you can see a simple tutorial, provided by me. Be aware that no other data-table plugin supports single and double click at the same time; YUI DataTable does. What's more, you can use it even with jQuery without any headache.
There are some more examples you can look at.
Feel free to ask anything else you want about YUI DataTable.
regards,
I kind of fail to see the point; for jqGrid you can use the virtual scrolling functionality:
http://www.trirand.net/aspnetmvc/grid/performancevirtualscrolling
but then again, millions of rows with filtering can be done:
http://www.trirand.net/aspnetmvc/grid/performancelinq
I really fail to see the point of "as if there are no pages" though. There is no way to display 1,000,000 rows at once in the browser; that is 10MB of raw HTML. I kind of fail to see why users would not want to see pages.
Anyway...
The best approach I can think of is loading chunks of data in JSON format on every scroll, or shortly before the user reaches the end of the scrollbar. JSON can easily be converted to objects, so table rows can be constructed easily and unobtrusively.
I would highly recommend Open rico.
It is difficult to implement in the beginning, but once you get the hang of it you will never look back.
I know this question is a few years old, but jqGrid now supports virtual scrolling:
http://www.trirand.com/blog/phpjqgrid/examples/paging/scrollbar/default.php
but with pagination disabled
I suggest Sigma Grid; it has embedded paging features which can support millions of rows. You may also need remote paging to do it.
See the demo:
http://www.sigmawidgets.com/products/sigma_grid2/demos/example_master_details.html
Take a look at dGrid:
https://dgrid.io/
I agree that users will NEVER, EVER need to view millions of rows of data all at once, but dGrid can display them quickly (a screenful at a time).
Don't boil the ocean to make a cup of tea.
