For Loop Hanging Browser: How to Populate a DataTable Efficiently - javascript

I am trying to populate a DataTable (DataTables.net) from a JavaScript array of objects. I don't have a problem doing so when the data is relatively small, say fewer than 6,000 rows, but the data can be as large as 20,000 rows. When the data is that large, Google Chrome hangs and I get the message to wait or exit the page. First, my current code:
var data = [];
for (var i = 0; i < arr_Members_in_Radius.length; i++) {
    data.push([arr_Members_in_Radius[i].record_id]); // plus the other column values
}
var search_results_table_2 = $('#tbl_search_results').DataTable({
    destroy: true,
    data: data
});
The above code hangs the browser when the data is large. So, following this link, How to rewrite forEach to use Promises to stop "freezing" browsers?, I implemented a Promise-based approach:
function PopulateDataTable(current_row) {
    search_results_table.row.add([current_row.record_id, current_row.JOA_Enrollment, current_row.mbr_full_name,
        current_row.member_address, current_row.channel_subchannel, current_row.joa_clinic, current_row.serviceaddress]).draw();
}

var wait = ms => new Promise(resolve => setTimeout(resolve, ms));

arr_Members_in_Radius.reduce((p, i) => p.then(() => PopulateDataTable(i)).then(() => wait(1)), Promise.resolve());
While that seems to solve the browser hang, the DataTable gets updated one row at a time, which is very time consuming and leaves the table unusable (scrolling, searching, sorting, etc.) until all, or at least most, of the 6,000 rows are loaded. It would be nice if the table were loaded at least 100 rows at a time in the PopulateDataTable() call. How would I do that? Or please suggest a different approach.
Thank you!

It turns out that the DataTable was indeed able to handle 20,000 rows of data without any issue. The problem was, and to some extent still is, that a subsequent call to another function that loads the same large data set onto a map (a Leaflet.js map) was causing the browser to hang; too much data processing scheduled too close together. Here is how I fixed it: the function that displays the data on the map now waits for 3 seconds before being called. I will tweak my code to increase or reduce the timeout based on the amount of data. Not an elegant solution, but the browser no longer hangs and I get to display all the data per the user's selections.
setTimeout(function () {
    createRouteOnMap(arr_Members_in_Radius);
}, 3000);
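For completeness, batching the inserts as the question asked is also workable. A minimal sketch (the batch size and helper name are mine, not from the post); rows.add() accepts an array of rows, so each batch costs a single draw instead of one draw per row:

function populateInBatches(rows, batchSize) {
    var index = 0;
    function addBatch() {
        // map this slice of source objects to DataTables row arrays
        var batch = rows.slice(index, index + batchSize).map(function (row) {
            return [row.record_id, row.JOA_Enrollment, row.mbr_full_name,
                row.member_address, row.channel_subchannel, row.joa_clinic, row.serviceaddress];
        });
        search_results_table.rows.add(batch).draw(false); // one draw per batch, keep scroll/paging position
        index += batchSize;
        if (index < rows.length) {
            setTimeout(addBatch, 0); // yield to the event loop so the UI stays responsive
        }
    }
    addBatch();
}
populateInBatches(arr_Members_in_Radius, 100);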

Related

Angular: Increase Query Loading Time in Firebase Database

I have an Angular app where I am querying my Firebase database as shown below:
constructor() {
    this.getData();
}

getData() {
    this.projectSubscription$ = this.dataService.getAllProjects()
        .pipe(
            // map each project snapshot to a plain object carrying its key
            map((projects: any) =>
                projects.map(sc => ({ key: sc.key, ...sc.payload.val() }))
            ),
            // then fetch all app users and join each project to its admin user
            switchMap(projects => this.dataService.getAllAppUsers()
                .pipe(
                    map((admins: any) =>
                        projects.map(proj => {
                            const match: any = admins.find(admin => admin.key === proj.admin);
                            return { ...proj, imgArr: this.mapObjectToArray(proj.images), adminUser: match.payload.val() };
                        })
                    )
                )
            )
        ).subscribe(res => {
            this.loadingState = false;
            this.projects = res.reverse();
        });
}

mapObjectToArray = (obj: any) => {
    const mappedDatas = [];
    for (const key in obj) {
        if (Object.prototype.hasOwnProperty.call(obj, key)) {
            mappedDatas.push({ ...obj[key], id: key });
        }
    }
    return mappedDatas;
};
And here is what I am querying inside dataService:
getAllProjects() {
    return this.afDatabase.list('/projects/', ref => ref.orderByChild('createdAt')).snapshotChanges();
}

getAllAppUsers() {
    return this.afDatabase.list('/appUsers/', ref => ref.orderByChild('name')).snapshotChanges();
}
The problem I am facing is that I have 400 rows of data to load, and loading takes around 30 seconds, which is insanely high. Any idea how I can make this query faster?
We have no way to know whether the 30s is reasonable, as that depends on the amount of data loaded, the connection latency and bandwidth of the client, and more factors we can't know/control.
But one thing to keep in mind is that you're performing 400 queries to get the users of each individual app, which is likely not great for performance.
Things you could consider:
Pre-load all the users once, and then use that list for each project.
Duplicate the name of each user into each project, so that you don't need to join any data at all.
If you come from a background in relational databases the latter may be counterintuitive, but it is actually very common in NoSQL data modeling and is one of the reasons NoSQL databases scale so well.
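A minimal sketch of the first option, pre-loading the users once with RxJS combineLatest and joining in memory (names follow the question's code; the exact shape of the snapshotChanges() payloads is an assumption):

import { combineLatest } from 'rxjs';
import { map } from 'rxjs/operators';

getData() {
    this.projectSubscription$ = combineLatest([
        this.dataService.getAllProjects(),
        this.dataService.getAllAppUsers()
    ]).pipe(
        map(([projectSnaps, userSnaps]) => {
            // build a key -> user lookup once, instead of one lookup pass per project
            const usersByKey = {};
            userSnaps.forEach(u => usersByKey[u.key] = u.payload.val());
            return projectSnaps.map(sc => {
                const proj = { key: sc.key, ...sc.payload.val() };
                return { ...proj, imgArr: this.mapObjectToArray(proj.images), adminUser: usersByKey[proj.admin] };
            });
        })
    ).subscribe(res => {
        this.loadingState = false;
        this.projects = res.reverse();
    });
}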
I propose 3 solutions.
1. Pagination
Instead of returning all those documents on app load, limit them to just 10 (or any arbitrary base number) and keep a record of the last one displayed.
Then build the UI so that when the user clicks next, or scrolls, you fetch the next set based on a field of the previous page's last document.
I'm supposing you need to display all the fetched data in some table or list, so having the UI paginate the data should make sense; see the sketch below.
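A minimal sketch of such a page query against the Realtime Database (field names follow the question; note that startAt is inclusive, so each subsequent page should skip its duplicated first item):

// Fetch one page of projects; pass the previous page's final createdAt value
// to get the next page. Fetches pageSize + 1 so you can tell if more pages exist.
getProjectsPage(lastCreatedAt, pageSize = 10) {
    return this.afDatabase.list('/projects/', ref => {
        let q = ref.orderByChild('createdAt');
        if (lastCreatedAt != null) {
            q = q.startAt(lastCreatedAt); // inclusive: drop the duplicate first row client-side
        }
        return q.limitToFirst(pageSize + 1);
    }).snapshotChanges();
}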
2. Loader
Show some loader UI when the website loads. Then, when all the documents have been fetched, hide the loader and show the data as you want. You can build something custom for the loader, choose from any of the abundant libraries out there, or use mat-progress-spinner from Angular Material.
3. onCall Cloud Function
What if you try getting them through an onCall Cloud Function? It might be faster because it's just one request that the app makes, and Firebase's Cloud Functions are very fast within Google's data centers.
The user's network might be slow iterating the documents one by one, but the Cloud Function will return them all at once, which might give you what you want.
I guess you should go for this option only if you really, really need to display all that data at once on website load.
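A minimal sketch of such a function (the function name is hypothetical; Node.js Admin SDK on the server side):

const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();

// Returns every project in what is, from the client's perspective, a single round trip.
exports.getAllProjects = functions.https.onCall(async () => {
    const snapshot = await admin.database()
        .ref('/projects')
        .orderByChild('createdAt')
        .once('value');
    return snapshot.val();
});

The Angular client would then invoke it once, e.g. via AngularFireFunctions' httpsCallable('getAllProjects'), instead of subscribing to the raw lists.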
Note on cost
Fetching 400 or more documents every time a given page loads might be expensive, especially if the website is visited frequently by many users, since Firebase charges per document read.
Check whether you could optimise the data structure to avoid fetching this much.
This doesn't apply if this is some admin dashboard, or if fetching all the users like this happens rarely, making the cost low in that case.

Refreshing UI parts on each loop of an AJAX sync call

Okay, it might be a simple question, but so far I didn't find anything helpful by searching, so I am giving it a try here.
I am using plain old javascript/jquery on asp.net core on some project I am working on.
I am currently performing some actions on some employees in a foreach loop.
For each employee I am calling an API synchronously via AJAX.
What I want is the UI to be updated, showing the current employee being processed in a progress bar.
While debugging, the process seems to work fine, but during a normal run the UI thread is only updated after all the work has been done. As a result, as soon as I start processing the employees, the screen is stuck, and it closes once the work is done. No progress bar is shown.
I only managed to show the progress animation and the first employee using the trick below:
$('#applyYes').click(function (e) {
    e.preventDefault();
    var year = $('#yearCombo').val();
    $('#applyInfo').hide();
    $('#applyLoad').show();
    $('#applyAction').prop('innerText', 'Calculating...');
    setTimeout(function () {
        var employeeIDs = multipleEmployees_Array();
        for (var i = 1; i <= employeeIDs.length; i++) {
            employeeID = employeeIDs[i - 1];
            applyAmount(employeeID, year); // 1. Updates progress bar 2. Ajax sync call
        }
    }, 0);
});
As far as I understand, the e.preventDefault plus setTimeout combination seems to let the UI thread finish its updates before the loop is processed.
What is the optimal way of achieving what I want?
PS: No external libraries if possible. I am working on an third-party platform, that makes it difficult to add external libraries (policies, permissions etc.)
Synchronous HTTP requests are deprecated for precisely this reason. Don't use them.
Use Asynchronous requests instead.
If you want to run them sequentially then either:
Trigger request i+1 in request i's success handler, or
Use a Promise-based API and await the results, as in the sketch below.
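A minimal sketch of the second option applied to the question's loop (the endpoint URL and the updateProgressBar helper are hypothetical); each await yields to the event loop, so the progress bar can actually repaint between requests:

$('#applyYes').click(async function (e) {
    e.preventDefault();
    var year = $('#yearCombo').val();
    var employeeIDs = multipleEmployees_Array();
    for (var i = 0; i < employeeIDs.length; i++) {
        updateProgressBar(i + 1, employeeIDs.length); // hypothetical UI helper
        // $.ajax returns a thenable, so it can be awaited directly
        await $.ajax({
            method: 'POST',
            url: '/api/applyAmount', // hypothetical endpoint
            data: { employeeID: employeeIDs[i], year: year }
        });
    }
});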

Handling large data sets on client side

I'm trying to build an application that uses Server-Sent Events in order to fetch and show some tweets (the latest 50-100) on the UI.
Url for SSE:
https://tweet-service.herokuapp.com/stream
Problem(s):
My UI is becoming unresponsive because a huge amount of data is coming in!
How do I make sure my UI stays responsive? What strategies should I adopt for handling the data?
Current Setup: (For better clarity on what I'm trying to achieve)
Currently I have a max-heap with a custom comparator to show the latest 50 tweets.
Every time there's a change, I re-render the page with the new max-heap data.
We should not keep the EventSource open indefinitely, since this will block the main thread if too many messages are sent in a short amount of time. Instead, we should only keep the event source open for as long as it takes to get 50-100 tweets. For example:
function getLatestTweets(limit) {
    return new Promise((resolve, reject) => {
        let items = [];
        let source = new EventSource('https://tweet-service.herokuapp.com/stream');
        source.onmessage = ({ data }) => {
            if (limit-- > 0) {
                items.push(JSON.parse(data));
            } else {
                // resolve this promise once we have reached the specified limit
                resolve(items);
                source.close();
            }
        };
    });
}
getLatestTweets(100).then(e => console.log(e))
You can then compare these tweets to the previously fetched tweets to figure out which ones are new, and update the UI accordingly. You can use setInterval to call this function periodically to fetch the latest tweets, as in the sketch below.
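A minimal sketch of that polling-and-diffing approach (it assumes each tweet carries a unique id field, and renderTweets is a hypothetical UI helper):

let seen = new Set();

setInterval(() => {
    getLatestTweets(100).then(tweets => {
        // keep only tweets we haven't rendered yet
        const fresh = tweets.filter(t => !seen.has(t.id));
        fresh.forEach(t => seen.add(t.id));
        if (fresh.length) {
            renderTweets(fresh); // hypothetical UI update
        }
    });
}, 10000); // poll every 10 seconds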

Non blocking Javascript and concurrency

I have code in a web worker, and because I can't post an object with methods (functions) to it, I don't know how to stop this code from blocking the UI:
if (data != 'null') {
    obj['backupData'] = obj.tbl.data().toArray();
    obj['backupAllData'] = data[0];
}
obj.tbl.clear();
obj.tbl.rows.add(obj['backupAllData']);
var ext = config.extension.substring(1);
$.fn.dataTable.ext.buttons[ext + 'Html5'].action(e, dt, button, config);
obj.tbl.clear();
obj.tbl.rows.add(obj['backupData']);
This code exports records from an HTML table. data is an array returned from a web worker and can sometimes contain 50k or more objects.
Since obj and all the methods it contains are not transferable to the web worker, the UI blocks when the data length is 30k, 40k, 50k or even more.
What is the best way to do this?
Thanks in advance.
You could try wrapping the heavy work in an asynchronous callback such as a timeout, to allow the engine to queue the whole block of logic and run it as soon as it has time:
setTimeout(function () {
    if (data != 'null') {
        obj['backupData'] = obj.tbl.data().toArray();
        obj['backupAllData'] = data[0];
    }
    // heavy stuff
}, 0)
Or, if the code takes extremely long, you can try to work out a strategy that splits it into chunks of operations and executes each chunk in a separate asynchronous callback (timeout), as in the sketch below.
Best way to iterate over an array without blocking the UI
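A minimal sketch of that chunking strategy (the chunk size and the processChunk/done callbacks are placeholders):

function processInChunks(items, chunkSize, processChunk, done) {
    let index = 0;
    function next() {
        // handle one slice of the array per timer tick
        processChunk(items.slice(index, index + chunkSize));
        index += chunkSize;
        if (index < items.length) {
            setTimeout(next, 0); // yield so the UI can repaint between chunks
        } else if (done) {
            done();
        }
    }
    next();
}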
Update:
Sadly, ImmutableJS doesn't work across web workers at the moment. You should be able to transfer an ArrayBuffer, though, so you don't need to parse the data back into an array. Also read this article. If your workload is that heavy, it may be best to send back one item at a time from the worker.
Previously:
The code is converting all the data into an array, which is immediately costly. Try returning an immutable data structure from the web worker if possible. This guarantees it doesn't change when references change, and you can keep iterating over it slowly in batches.
The next thing you can do is use requestIdleCallback to schedule small batches of items to be processed, as in the sketch below.
This way you should be able to let the UI breathe a bit.
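A minimal sketch of batching with requestIdleCallback, processing items only while the browser reports spare time (the handleItem callback is a placeholder):

function drainQueue(queue, handleItem) {
    requestIdleCallback(function step(deadline) {
        // process items until the browser needs the main thread back
        while (deadline.timeRemaining() > 0 && queue.length) {
            handleItem(queue.shift());
        }
        if (queue.length) {
            requestIdleCallback(step); // continue in the next idle period
        }
    });
}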

How do I get the currently shown rows from the DataTables Scroller extension?

When using the Scroller extension for DataTables, you don't have pagination; all rows are in one scrollable area. I would like to know what event is fired after scrolling down the table, to, e.g., see rows 50-60 of 100. I also need to know how to get those 10 rows out of the DataTable. I'm using the latest versions. Thanks a lot.
This is how you get the rows of the current page in DataTables.net without the Scroller extension:
drawCallback: function (settings) {
    var api = new $.fn.dataTable.Api(settings);
    // Output the number of visible rows to the browser's console
    console.log(api.rows({ page: 'current' }).data().length);
}
Update 1: My init of the Scroller table:
initScrollerTable = function ($table, url, inclFilter, dataTableOptionsSpecific) {
$.ajax({
method: "POST",
url: url,
data: dataParameterHelper.getCommonData(),
dataType: "json"
})
.done(function (rows) {
var dataTableOptions = $.extend(
{},
{ data: rows },
dataTableOptionsSpecific
);
initTable($table, inclFilter, dataTableOptions);
});
};
Update 2: Elaborating more deeply on the original reason for the question, to clarify.
Okay, long story short: my table contained 26,000 rows, and it took 7 minutes to load. It involved a LOT of DB calls, and the JSON was 21MB in size! I wanted to optimize it.
First attempt: I trimmed my JSON down to the absolute minimum, bringing it to 1.5MB, but loading still took almost 7 minutes. In a second test, the table contained all the HTML with hardcoded numbers/strings and made 0 DB calls; it took only 3.5 seconds! Yesterday I didn't know I only had to focus on optimizing my calls to the DB.
Yesterday, when I posted the original question, my idea was to populate the table without any data pulled from the DB, and instead load the data in a kind of lazy-loading way. Let's say I showed the first 10 rows, with 3 columns each, where I need to call the DB for each cell; that's 30 calls in total. The idea was to make 30 requests for the (10) current rows and replace the placeholders with the actual values, if you follow. That would still be better than 26,000 * 3 DB calls :)
And for that I needed to hook into an event that gives me the 10 current rows' IDs, so I could loop through them making the 30 AJAX requests. So maybe it's a Scroller event I need, if there is one like that.
But I don't know if it's a good idea. Usually a "good" idea is only "good" before/until you learn the best practice :)
I think I will start by focusing on reducing the DB calls with some inner joins and what have you, retrieving a big result set I can loop through to populate all my 26,000 (and later 50,000) rows in under 15 seconds!
SOLUTION
Use xhr event to handle Ajax requests and page.info() to retrieve information about the table as shown below.
Please note that the event handler needs to be attached before you initialize your table, so that it handles the initial Ajax request.
$('#example').on('xhr.dt', function (e, settings, json, xhr) {
    var api = new $.fn.dataTable.Api(settings);
    var info = api.page.info();
    console.log('Data has been loaded', info);
});
DEMO
See this jsFiddle for code and demonstration.
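If you specifically need the visible rows after each scroll, one option is to listen for the table's draw event, since Scroller redraws the table as you scroll. A minimal sketch (the selector is a placeholder, and the per-cell lookups from Update 2 are only hinted at in a comment):

$('#example').on('draw.dt', function () {
    var api = $('#example').DataTable(); // returns the existing API instance
    // iterate only the rows currently rendered on screen
    api.rows({ page: 'current' }).every(function () {
        var rowData = this.data();
        // e.g. fire the per-cell AJAX lookups for this row here
    });
});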
