Building a table with JavaScript

I'm trying to understand why it takes so long to rebuild a table using JavaScript on Firefox 43.0.2.
A simplified version of my code is below. Both the "real" code and the simplified version use publishTable() to add rows to a table. publishTable() deletes the table body element if it exists, creates a new one, adds about 9000 rows to it, and attaches the completed body to the table.
publishTable() runs on load, and again when the user clicks a "go" button. Therefore, I expect performance to be similar on load and on rebuild.
When the "real" page first loads, Firefox takes about 300ms to construct the table [according to performance.now()]. When the alert() announcing this result is closed, I can immediately scroll the page up and down.
But if I click the "go" button to rebuild the table, Firefox spins its wheels for tens of seconds (or more) after I close the alert() dialog. A "stop script?" dialog can appear more than once. The simplified version behaves similarly.
So: why is the performance so radically different between the initial build and a rebuild? It is clearly possible to build the table in 300ms! Is there anything I can do about it?
Some further observations:
IE's performance is much worse on initial load, and just as bad on rebuild. Chrome's performance is pretty good: about 2 seconds to build or rebuild. If I use innerHTML rather than insertRow, appendChild, etc., the results are similar.
If I remove the line attaching the table body to the table, the wheel-spinning symptom does not occur.
In the "waterfall" chart (in the Firefox performance tool), the "DOM event" takes up much more time than the "event handler" (which I think covers the run time of my code), and I don't know what that means. What is happening between the time the JS stops running and the DOM event ends that doesn't fall into one of the other waterfall categories?
The DOM event is followed by a brief time to recalculate style, a time to paint, and then a sequence of many "cycle collection" periods, then "incremental gc", "cc graph reduction", "cycle collection", "graph reduction", and so on, for tens of seconds. In one case, the performance call tree allocated 49 seconds to "Gecko" (which seems to be idle time) and another 25 seconds to "graphics" (and within that, a mere 1 second to publishTable()). Is there something here I can act on?
I'm out of reasonable ideas for further analysis, or for how I might modify the JS. I don't understand enough about the performance information to act on it. (And now, after timing with IE and Chrome, I'm not even sure to whom the question should be addressed.)
Is there a fundamental error in the code? A better table construction strategy? A way to use the performance tool to understand the problem? A bug in Firefox? (And now I'm going to do the thing on the server side. But I'm still curious about what's going on.)
<!DOCTYPE html>
<html>
<head>
  <meta charset="UTF-8">
</head>
<body>
  <div id='a'>
    <button type="button" disabled id="btnGo">Go</button><br />
    <button type="button" id="btnQ">?</button><br />
    <table id="tideTable" style="width:40%;margin-left:auto;margin-right:auto;">
    </table>
  </div>
  <div id="b">
    hello
  </div>
  <script>
    (function() {
      var mmm = ['one', 'two', 'three', 'four', 'five', 'six', 'seven'];

      function publishTable() {
        // The user may run this several times, varying some parameters each time.
        var tStart = performance.now();
        var table = document.getElementById('tideTable');
        // var tableBody = table.getElementsByTagName('tbody')[0];
        var tableBody = table.tBodies[0];
        if (tableBody) {
          tableBody.parentNode.removeChild(tableBody);
        }
        showHeight();
        tableBody = document.createElement('tbody');
        for (var i = 0; i < 8500; i++) {
          appendTableRow(tableBody, mmm);
        }
        table.appendChild(tableBody);
        document.getElementById("btnGo").disabled = false;
        alert("Time: " + (performance.now() - tStart) + "ms");
        showHeight();
      }

      function appendTableRow(tableBody, columns) {
        var cell;
        var textNode;
        var row = tableBody.insertRow(-1);
        for (var i = 0; i < columns.length; i++) {
          cell = row.insertCell(i);
          textNode = document.createTextNode(columns[i]);
          cell.appendChild(textNode);
        }
      }

      function showHeight() {
        var el = document.getElementById('b');
        var topPos = el.offsetTop;
        alert("position: " + topPos);
      }

      document.getElementById("btnGo").addEventListener("click", publishTable);
      document.getElementById("btnQ").addEventListener("click", showHeight);
      publishTable();
    })();
  </script>
</body>
</html>

I guess it could be because of the removal of existing items before inserting the new ones. You could try the following:
Measure what happens to the performance if you just extend the table, without deletion.
Build the table before inserting it, e.g. make a variable tableContent, put the rows in it, and then add tableContent to the table. That should be faster, because otherwise your browser has to re-render the page on every insert.
And I would advise you to consider using AngularJS if you plan to make the table dynamic.
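The second suggestion can be sketched without a framework: build all of the markup first, then touch the DOM once. This is a minimal sketch, not the asker's original code; the helper name buildRowsHTML is hypothetical, and the id 'tideTable' is taken from the question.

```javascript
// Build all row markup as one string first, then touch the DOM once.
// buildRowsHTML is pure, so it can run (and be tested) outside a browser.
function buildRowsHTML(rowCount, columns) {
  var parts = [];
  for (var i = 0; i < rowCount; i++) {
    var cells = '';
    for (var j = 0; j < columns.length; j++) {
      cells += '<td>' + columns[j] + '</td>';
    }
    parts.push('<tr>' + cells + '</tr>');
  }
  return parts.join('');
}

// In the browser: one innerHTML assignment instead of thousands of
// insertRow/insertCell calls. (Guarded so the helper above can also run
// under Node.)
if (typeof document !== 'undefined') {
  var table = document.getElementById('tideTable');
  var tbody = document.createElement('tbody');
  tbody.innerHTML = buildRowsHTML(8500, ['one', 'two', 'three']);
  table.appendChild(tbody); // single layout-triggering insertion
}
```

Setting innerHTML on a detached tbody and appending it once means the browser lays the table out a single time instead of once per row (note that very old IE versions did not allow writing innerHTML on table sections).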

I tried swapping out the line:
var tableBody = table.getElementsByTagName('tbody')[0];
with the built-in getter:
var tableBody = table.tBodies[0];
and this seems to stabilize the build time. Still a bit slow, but near-consistent times reported for initial build and rebuilds.
This could be coincidental, but it's something you may want to experiment with.

Your JS is minified and served from a CloudFront CDN.
The first demo uses async and the second demo uses defer.
Async
https://jsfiddle.net/zer00ne/6m9f24j5/
Defer
https://jsfiddle.net/zer00ne/fcpy9z0c/
Results
Same times.
142ms on Firefox loading.
Avg of 230ms on each click event.
846ms on Chrome loading.
Avg of 930ms on each click event.
Put your <script> tags before the closing </body> tag
https://jsfiddle.net/zer00ne/y7mguyry/
(function() {
  var mmm = ['one', 'two', 'three', 'four', 'five', 'six', 'seven'];

  function publishTable() {
    // The user may run this several times, varying some parameters each time.
    var tStart = performance.now();
    var table = document.getElementById('tideTable');
    var tableBody = table.getElementsByTagName('tbody')[0];
    if (tableBody) {
      tableBody.parentNode.removeChild(tableBody);
    }
    tableBody = document.createElement('tbody');
    for (var i = 0; i < 8500; i++) {
      appendTableRow(tableBody, mmm);
    }
    table.appendChild(tableBody);
    document.getElementById("btnGo").disabled = false;
    alert("Time: " + (performance.now() - tStart) + "ms");
  }

  function appendTableRow(tableBody, columns) {
    var cell;
    var textNode;
    var row = tableBody.insertRow(-1);
    for (var i = 0; i < columns.length; i++) {
      cell = row.insertCell(i);
      textNode = document.createTextNode(columns[i]);
      cell.appendChild(textNode);
    }
  }

  document.getElementById("btnGo").addEventListener("click", publishTable);
  publishTable();
})();

<button type="button" disabled id="btnGo">Go</button>
<br />
<table id="tideTable" style="width:40%;margin-left:auto;margin-right:auto;">
</table>

Related

Javascript: Dynamically generated functions, referring to an external included JS File with GPT rendering

Thanks for looking at this. I think I am making a conceptual mistake somewhere, so I will describe my scenario first:
I have 1 or x DIVs where I display DFP AdUnits and will use these dynamically generated functions on. The function triggers as soon as the DIV is in a visible area:
I generate the links dynamically
function scriptincluder(divid) {
  var gbgscript = document.createElement('script');
  gbgscript.async = true;
  gbgscript.type = 'text/javascript';
  gbgscript.src = 'https://anyscript.net/script.js?function=myfmydiv1&div=mydiv1 ';
  var node = document.getElementsByTagName('script')[0];
  node.parentNode.insertBefore(gbgscript, node);
}
With this function I dynamically create the link and this works so far. So I generate links for myfmydiv1/div1, myfmydiv2/div2, myfmydiv3/div3… so on. And add them to the parentNode.
I generate the AdSlots dynamically
googletag.cmd.push(function() {
  for (var slot in divslots) {
    window['slot_'.concat(slot.toString())] = googletag.defineSlot('/Adslot/Adslot/Adslot/Adslot/Adslot/Adslot', slotsize[slot], slot.toString()).addService(googletag.pubads());
    // generate external link pixel from #1:
    scriptincluder(slot.toString());
  }
  googletag.pubads().enableSingleRequest();
  googletag.pubads().disableInitialLoad(); // ad unit will not render yet
  googletag.enableServices();
});
In this part I generate the ad units and add each one to a global variable, window['slot_'.concat(slot.toString())]. (I have seen this on the web and I am curious whether that's the right way to go. At least I can see it in the GCR dev. tool.)
I generate the functions referring to the link at #1 dynamically.
for (var slot in divslots) {
  window['myf' + escape(slot)] = function() {
    alert("I am: " + slot);
    googletag.cmd.push(function() {
      googletag.pubads().refresh([window['slot_'.concat(slot.toString())]]);
    });
  };
}
The function is triggered once the DIV slot is in a visible area and refreshes the Ad Unit.
It always triggers the wrong function. For example, div1 triggers the function of div2, and div1 doesn't actually load, but div2 does. Any ideas/help?
I figured out the solution with an experienced programmer colleague of mine.
He suggested using const variables for the dynamically named function in the last piece of code.
for (var slot in divslots) {
  const myFunction = 'myf' + escape(slot);
  const mySlot = 'slot_'.concat(slot.toString());
  window[myFunction] = function() {
    alert("I am: " + slot);
    googletag.cmd.push(function() {
      googletag.pubads().refresh([window[mySlot]]);
    });
  };
}
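The underlying issue is classic var scoping: every callback created in the loop closes over the same slot binding, so by the time any callback runs, slot holds the last key, and all of them refresh the same ad unit. A const (or let) declared inside the loop body creates a fresh binding per iteration, which is why the colleague's fix works. A minimal standalone sketch of the difference (no GPT involved):

```javascript
// With `var`, all three functions share one binding and see the last value.
var byVar = [];
for (var k in { a: 1, b: 2, c: 3 }) {
  byVar.push(function() { return k; });
}

// With a per-iteration `const`, each function keeps its own copy.
var byConst = [];
for (var k2 in { a: 1, b: 2, c: 3 }) {
  const captured = k2;
  byConst.push(function() { return captured; });
}
```

Calling each function in byVar returns 'c' every time, while the functions in byConst return 'a', 'b', and 'c' respectively.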

How can I dynamically add new content to my page each time my array length increases?

I have a function that is adding new strings to an array at random intervals. How can I display with javascript and/or jquery each new string on my page every time the length of the array increases?
You can set a recursive timer function that will update your array display container every time it is called (adapted from Javascript dynamic array of strings):
<html>
<body>
<script type="text/javascript">
  var arr = [];

  function add() {
    arr.push("String " + Math.random());
  }

  function show() {
    var html = '';
    for (var i = 0; i < arr.length; i++) {
      html += '<div>' + arr[i] + '</div>';
    }
    var con = document.getElementById('container');
    con.innerHTML = html;
  }

  function start() {
    setTimeout(function() {
      add();
      // Note: you can call show in an independent timeout
      show();
      start();
    }, 1000);
  }
</script>
<input type="button" onclick="start();" value="start" />
<br />
<div id="container"></div>
</body>
</html>
Or you can make it smarter and update the container only if the array's length changed.
Yet another way is to pass a display container update callback to your array update function, so that whenever you update your array - you just go and re-display your array.
<html>
<body>
<script type="text/javascript">
  var arr = [];
  var lastDisplayed = 0;

  function add() {
    arr.push("String #" + lastDisplayed + ": " + Math.random());
    show(); // Update display container
  }

  function show() {
    var node;
    var textnode;
    var container = document.getElementById('container'); // Get parent container
    for (; lastDisplayed < arr.length; lastDisplayed++) {
      node = document.createElement("li"); // Create a <li> node
      textnode = document.createTextNode(arr[lastDisplayed]); // Create a text node
      node.appendChild(textnode);
      container.appendChild(node);
    }
  }

  function start() {
    setTimeout(function() {
      add();
      start();
    }, 1000);
  }
</script>
<input type="button" onclick="start();" value="start" />
<br />
<ul id="container"></ul>
</body>
</html>
Internally, Angular and other frameworks implement a combination of these approaches.
Important note: depending on your application, you might want to explore different approaches to updating your page to preserve responsiveness and performance of your interface. For example, you might want to space your GUI updates in time if array elements are added too often. You may also want to keep adding elements to your DOM model (see the second example) instead of rewriting it (like in the first example) if the existing elements of your array remain unchanged. Similar issues might need to be considered if using a dedicated framework like Angular.
I would recommend using a library that handles property subscriptions, like Knockout or Angular, but since that wasn't mentioned in the question I will give this example.
var someArray = [];
var standardPush = someArray.push;
someArray.push = function() {
  // depending on your requirements you can switch these next two lines around
  for (var i = 0; i < arguments.length; i++) {
    updateUI(arguments[i]); // call code to update UI
    standardPush.call(this, arguments[i]); // actually add the record to the array
  }
};
function updateUI(record){
// Some code that updates your UI
// example
$("#container").append($("<div>").text(record));
}
then just call it like a normal array.push
someArray.push(someRecord);
// or
someArray.push(record1, record2, record3, ...);
This code is more fun than practical; again, I would recommend a library that handles property subscriptions.
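On modern engines the same idea can be expressed without monkey-patching push, by wrapping the array in a Proxy whose set trap notifies a callback. This is a sketch of an alternative, not the answer's original approach; observableArray and onAdd are hypothetical names.

```javascript
// Wrap an array so that any element assignment (including ones made by
// push) notifies a callback with the new value.
function observableArray(onAdd) {
  return new Proxy([], {
    set: function(target, prop, value) {
      target[prop] = value;
      // push also writes 'length' after each element; only report
      // element slots, not the length bookkeeping.
      if (prop !== 'length') {
        onAdd(value);
      }
      return true;
    }
  });
}

var added = [];
var arr = observableArray(function(v) { added.push(v); });
arr.push('a');
arr.push('b', 'c'); // added is now ['a', 'b', 'c']
```

The UI-update call would go where onAdd is invoked; unlike the push override, this also catches direct index assignments like arr[5] = 'x'.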

Unresponsive Browser during JavaScript Execution

I have to show a progressbar/status indicator using pure JavaScript, no jQuery please.
My code is:
<script type="text/javascript">
  function processObjects()
  {
    var selectedRows = {}; // array of selected rows from table
    var count = selectedRows.length; // count value exceeds 100
    var myDiv = document.getElementById("myDiv");
    for (var i = 0; i < count; i++)
    {
      myDiv.innerHTML = (i + 1) + "/" + count;
      // Process each object from array
      // no Ajax call
      // takes almost 0.1 sec for each object <- this is not an issue
    }
  }
</script>
<div id="myDiv"></div>
<input type="button" onclick="processObjects()" value="Process Objects" />
<table>
  <!-- Table with lots of rows with checkboxes -->
</table>
Problem:
When I run this script in any browser, the page becomes unresponsive and does not update the status via innerHTML as 1/100...2/100...3/100 and so on.
What could be a possible solution to stop the browser from becoming unresponsive?
JS is single-threaded, and it has the browser's full attention while it is inside a function.
You need to break long processes up with setTimeout() calls if you want to give the browser a chance to breathe while processing something long.
See how I do this in the following example:
function doProgress(count) {
  if (count == 100)
    return;
  document.getElementById("myDiv").innerHTML = count;
  count++;
  // Call the same function with the new count. "0" is the delay in
  // milliseconds; "count" is the argument to pass to doProgress.
  setTimeout(doProgress, 0, count);
}
This only demonstrates the technique; there are a lot of best practices to follow once you master it.
JavaScript locks the view while code is executing (unless you are using a canvas), so you must end the execution of your code before you can see results in your DOM.
Even though this article is about Angular, the intro explains quite well how JavaScript works and why it freezes a browser: http://jimhoskins.com/2012/12/17/angularjs-and-apply.html
If you want to keep it simple, you can do this:
<script type="text/javascript">
  var start = 0;
  var selectedRows = {}; // array of selected rows from table
  var count = selectedRows.length; // count value exceeds 100
  var myDiv = document.getElementById("myDiv");

  function processObject() {
    myDiv.innerHTML = (++start) + "/" + count;
    // Process one object from array using "start" as index
    if (start < count) {
      setTimeout(processObject, 100);
    }
  }

  function processObjects() {
    // eventually update values
    selectedRows = []; // adds items to array
    count = selectedRows.length;
    myDiv = document.getElementById("myDiv");
    processObject();
  }
</script>
<div id="myDiv"></div>
<input type="button" onclick="processObjects()" value="Process Objects" />
<table>
  <!-- Table with lots of rows with checkboxes -->
</table>
If you don't want to use global variables, you can do this:
function processObject() {
  processObject.myDiv.innerHTML = (++processObject.start) + "/" + processObject.count;
  // Process one object from array using "start" as index
  if (processObject.start < processObject.count) {
    setTimeout(processObject, 100);
  }
}

function processObjects() {
  processObject.selectedRows = []; // array with rows to process
  processObject.count = processObject.selectedRows.length;
  processObject.start = 0;
  processObject.myDiv = document.getElementById("myDiv");
  processObject();
}
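A variation on the same technique is to process the work in fixed-size batches rather than one item per timer tick, which keeps the total run time down while still yielding to the browser between batches so it can repaint. This is a generic sketch, not the answer's code; the names splitIntoChunks and processInBatches are hypothetical, and the chunk size is an arbitrary choice.

```javascript
// Split a list of work items into fixed-size batches. Pure helper, so it
// can be tested outside the browser.
function splitIntoChunks(items, chunkSize) {
  var chunks = [];
  for (var i = 0; i < items.length; i += chunkSize) {
    chunks.push(items.slice(i, i + chunkSize));
  }
  return chunks;
}

// Driver: process one batch per timer tick so the browser can repaint
// between batches. processItem and onDone are caller-supplied callbacks.
function processInBatches(items, chunkSize, processItem, onDone) {
  var chunks = splitIntoChunks(items, chunkSize);
  var index = 0;
  function step() {
    if (index >= chunks.length) {
      if (onDone) onDone();
      return;
    }
    chunks[index].forEach(processItem);
    index++;
    setTimeout(step, 0); // yield to the event loop before the next batch
  }
  step();
}
```

Tuning the chunk size trades responsiveness (smaller batches repaint more often) against total run time (each yield adds timer overhead).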

Reload a content almost invisibly and no blinking effect using JAVASCRIPT

I'm writing a small program where I get data using $.get and then display it. So far so good. Then there's a part where, when I click a certain link, it refreshes the content, but with a blinking effect. Is there a way to reload the content, get the new updated data, and replace the previously loaded data without that?
NOTE: I didn't use the setInterval or setTimeout functions because they slow down my website. Any answer that does not use those functions is really appreciated.
Here's the code
function EmployeeIssues() {
  $('#initial_left').css({'display': 'none'});
  var table = $('#table_er');
  $.get('admin/emp_with_issues', function(result) {
    var record = $.parseJSON(result);
    var data = record.data,
        employees = data.employees,
        pages = data.pages;
    if (employees) {
      $('#er_tab_label').html('<b>Employees with Issues</b>');
      for (var i = 0; i < employees.length; i++) {
        $('#table_er').fadeIn('slow');
        table.append(write_link(employees[i])); // function that displays the data
      }
      if (pages) {
        $('#pagination').html(pages);
      }
    } else {
      $('#er_tab_label').html('<b>No employees with issues yet.</b>');
    }
  });
  table.html('');
}
then this part calls the function and display another updated content
$('#refresh_btn').on('click', function(e){
e.preventDefault();
var tab = $('#tab').val();
if(tab == 'er'){
EmployeeIssues();
}
});
What should I do to display the content without any blinking effect?
thanks :-)
This section might be the issue :
if (employees) {
  $('#er_tab_label').html('<b>Employees with Issues</b>');
  for (var i = 0; i < employees.length; i++) {
    $('#table_er').fadeIn('slow');
    table.append(write_link(employees[i])); // function that displays the data
  }
  if (pages) {
    $('#pagination').html(pages);
  }
} else ...
It seems you're asking table_er to fade in once per run of the loop, whereas there can only be one such table, so you only need to do it once.
First, try rearranging it like this:
if (employees) {
  $('#er_tab_label').html('<b>Employees with Issues</b>');
  $('#table_er').hide(); // hide it while we add the html
  for (var i = 0; i < employees.length; i++) {
    table.append(write_link(employees[i])); // function that displays the data
  }
  $('#table_er').fadeIn('slow'); // only do this after the table has all its html
  if (pages) {
    $('#pagination').html(pages);
  }
} else ....
Another possibility is that you're running through a loop and asking jQuery to do stuff while the loop is running. It might be better to work out the whole HTML for the new page data in a string and then have the screen render it in one go. I can't do this for you as I don't know what's in write_link etc., but something like this:
if (employees) {
  $('#er_tab_label').html('<b>Employees with Issues</b>');
  var sHTML = "";
  $('#table_er').hide(); // hide it while we add the html
  for (var i = 0; i < employees.length; i++) {
    sHTML += write_link(employees[i]); // maybe this is right? if write_link returns an HTML string?
  }
  table.append(sHTML); // add the HTML from the string in one go - stops the page rendering while the code is running
  $('#table_er').fadeIn('slow'); // now show the table.
  if (pages) {
    $('#pagination').html(pages);
  }
} else ...

What is the fastest JSON parser for JavaScript?

I want to show a list with 1000 rows using JSON, as supported by the Struts2 JSON plug-in. I use Flexigrid (jQuery) to parse the 1000 rows for display. But it's so slow, and sometimes my browser crashes (Firefox & IE).
So, what is the fastest JavaScript framework to parse about 1000 rows?
What is the fastest JSON parser for JavaScript?
eval, or, when available, the native JSON parser: at least in Chrome, Safari, Firefox 3.something, Opera 10.50, and even IE8 (only in IE8 mode).
Show the user what they want to see.
Show 50 rows, add a filter or a search.
If you really think that data should be reachable in a single page, maybe what you want is to fetch data while the user scrolls (and thus pick up smaller portions at a time).
I don't think you'll get acceptable performance from just about any grid component showing 1,000 at the same time, especially not on IE (even IE8). But most grids should be able to support having 1,000 in memory (well, depending on how big they are) and displaying a window into them (say, 20 rows, 40 rows, etc.) with paging and filtering options, without a significant performance problem. That would be a better user experience as well, I would think.
Edit
I got curious enough to check, and yeah, JSON parse time is not the problem; it'll be the rendering. Below is an example of very, very simple (not production) paging entirely client-side. On my netbook, IE7 parses the 1,000 rows of simple JSON objects in 36ms, so even complex objects shouldn't be an issue. That's using Prototype's evalJSON, which (even now) just defers to eval and puts the data in parentheses (they'll be changing that).
1000rows.html
<!DOCTYPE HTML>
<html>
<head>
  <meta http-equiv="Content-type" content="text/html;charset=UTF-8">
  <title>1,000 Row Test Page</title>
  <style type='text/css'>
    body {
      font-family: sans-serif;
    }
    #log p {
      margin: 0;
      padding: 0;
    }
  </style>
  <script type='text/javascript' src='http://ajax.googleapis.com/ajax/libs/prototype/1.6.1.0/prototype.js'></script>
  <script type='text/javascript' src='1000rows.js'></script>
</head>
<body><div>
  <input type='button' id='btnLoadData' value='Load Data'>
  <input type='button' id='btnNext' value='Next'>
  <input type='button' id='btnPrevious' value='Previous'>
  <table>
    <thead>
      <tr><th>Name</th><th>Description</th><th>Count</th></tr>
    </thead>
    <tfoot>
      <tr><th colspan='3' id='theLabel'></th></tr>
    </tfoot>
    <tbody id='theData'>
      <tr><td colspan='3'></td></tr>
    </tbody>
  </table>
  <hr>
  <div id='log'></div>
</div></body>
</html>
1000rows.js (using Prototype; jQuery would be different but similar)
(function() {
  var data, windowTop, WINDOW_SIZE;

  // "Constant" for the size of our window into the data
  WINDOW_SIZE = 20; // Rows

  // No data yet
  clearData();

  // Hook up our observers when we can
  document.observe('dom:loaded', function() {
    $('btnLoadData').observe('click', loadData);
    $('btnNext').observe('click', function(event) {
      event.stop();
      updateWindow(WINDOW_SIZE);
    });
    $('btnPrevious').observe('click', function(event) {
      event.stop();
      updateWindow(-WINDOW_SIZE);
    });
  });

  // Clear our data to a known state
  function clearData() {
    data = [];
    windowTop = 0;
  }

  // Click handler for the load data button
  function loadData() {
    var success;
    log("Loading data..");
    clearData();
    updateWindow();
    success = false;
    // Note: Using text/plain rather than application/json so
    // Prototype doesn't parse the data for me, so I can measure
    // how long it takes to do it.
    new Ajax.Request("data.txt", {
      onSuccess: function(response) {
        var start, duration;
        success = true;
        log("Got data, parsing");
        start = new Date().getTime();
        data = response.responseText.evalJSON();
        duration = new Date().getTime() - start;
        log("Data parsed in " + duration + "ms");
        updateWindow.defer();
      }
    });
  }

  function updateWindow(offset) {
    var dataElement, labelElement, markup, index, template;
    // Get the target element
    dataElement = $('theData');
    labelElement = $('theLabel');
    if (!dataElement || !labelElement) {
      return;
    }
    // If no data, simply say that
    if (!data || data.length <= 0) {
      dataElement.update("");
      labelElement.update("No information");
      return;
    }
    // Ensure that windowTop is rational
    if (WINDOW_SIZE > data.length) {
      windowTop = 0;
    }
    else {
      if (typeof offset == 'number') {
        windowTop += offset;
      }
      if (windowTop + WINDOW_SIZE > data.length) {
        windowTop = data.length - WINDOW_SIZE;
      }
      if (windowTop < 0) {
        windowTop = 0;
      }
    }
    template = new Template(
      "<tr><td>#{name}</td><td>#{description}</td><td>#{count}</td></tr>"
    );
    markup = "";
    index = windowTop + WINDOW_SIZE - 1;
    if (index >= data.length) {
      index = data.length - 1;
    }
    $('theLabel').update('Showing rows ' + windowTop + ' through ' + index);
    while (index >= windowTop) {
      markup = template.evaluate(data[index]) + markup;
      --index;
    }
    dataElement.update(markup);
  }

  // Log a message
  function log(msg) {
    $('log').appendChild(new Element('p').update(msg));
  }
})();
data.txt (quite boring, of course)
[
{"name": "Name #0001", "description": "Description #0001", "count": 1},
{"name": "Name #0002", "description": "Description #0002", "count": 2},
{"name": "Name #0003", "description": "Description #0003", "count": 3},
...
{"name": "Name #1000", "description": "Description #1000", "count": 1000}
]
...a full copy of data.txt can be found here.
1,000 rows of what? jQuery is actually pretty quick, especially since the performance upgrades in version 1.4 (released just days ago). If you're experiencing problems showing 1,000 rows, I would first ask why you're showing that many; no human ought to have to scroll that much. Second, is all of the information crucial, and are you passing only crucial information into the JSON value? And lastly, are you making your DOM unnecessarily complicated with the way you're adding the data?
Again, if you're querying only what you need to show, and you're showing a reasonable amount of data (putting 1,000 rows on the screen isn't reasonable), jQuery will be more than sufficient for your needs.
If you really want speed, the JavaScript eval("...") function is the fastest. Unfortunately it's not safe, as it can execute malicious JavaScript.
There's also the JavaScript JSON parser (found here) from JSON.org. They've written JavaScript to parse JSON strings into a JSON object. (I've heard that debugging with Firebug, a Firefox add-on, creates an array of JSON objects, but I've never tried it.)
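For anyone reading this today: the question is largely moot on current engines, because the built-in JSON.parse is native, fast, and, unlike eval, never executes code. A minimal sketch using row data shaped like the example above:

```javascript
// Parse with the native JSON parser; unlike eval, this never executes code
// embedded in the input.
var text = '[{"name": "Name #0001", "description": "Description #0001", "count": 1},' +
           ' {"name": "Name #0002", "description": "Description #0002", "count": 2}]';
var rows = JSON.parse(text);

// An optional reviver callback transforms values as they are parsed;
// here it scales every "count" field by 10 (purely illustrative).
var scaled = JSON.parse(text, function(key, value) {
  return key === 'count' ? value * 10 : value;
});
```

JSON.parse also throws a SyntaxError on malformed input rather than silently evaluating it, which makes failures much easier to diagnose than with the eval approach.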
