I have a simple HTML page containing a large table with more than 2000 rows, plus jQuery code for searching and sorting that table. Both operations take quite some time, which is understandable.
What I want is to put a screen blocker in place while the script is searching or sorting the table. This behavior can be seen during AJAX calls on many websites, where it is achieved by handling jQuery's global ajaxStart and ajaxComplete events.
Is there any such method that can be used to put up a screen blocker for a long-running script? If not, what is the alternative?
I would recommend breaking the work up and iterating with setTimeout.
For example, instead of:
function example1() {
    for (var i = 0; i < 1000; i++) {
        // SOME CODE
    }
}
You could write:
function example2() {
    var i = 0;
    helper();
    function helper() {
        // SOME CODE
        if (++i < 1000) {
            setTimeout(helper, 0);
        }
    }
}
You don't have to run every iteration in a separate callback. You could convert 1000 iterations in one function call into 100 function calls of 10 iterations each, or whatever split suits your case. The idea is to never block the user interface long enough for the user to notice.
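For instance, a minimal sketch of that batching idea (batchSize and the processRow helper are illustrative, not from the question):

function processInBatches(rows) {
    var batchSize = 50; // tune so each chunk stays comfortably short
    var i = 0;
    function doChunk() {
        var end = Math.min(i + batchSize, rows.length);
        for (; i < end; i++) {
            processRow(rows[i]); // your per-row search/sort work
        }
        if (i < rows.length) {
            setTimeout(doChunk, 0); // yield to the browser, then continue
        }
    }
    doChunk();
}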
Another idea would be to use Web Workers if you can, though they will not work in older browsers (which may or may not be a problem for you, e.g. if you're writing a browser extension or otherwise know what your users will run).
If you do it the way you described in your question, you will make the browser completely unresponsive during the calculation and will most likely trigger a "slow script - do you want to kill it?" kind of warning.
jQuery blockUI can block individual elements or the entire page, and it is very customizable.
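A minimal sketch of how that could look, assuming the blockUI plugin is loaded (sortTable is a hypothetical stand-in for the actual table work):

$.blockUI({ message: '<h3>Working...</h3>' }); // overlay the page
setTimeout(function () {
    sortTable();   // long-running search/sort goes here
    $.unblockUI(); // remove the overlay when done
}, 0);

The setTimeout gives the browser a chance to paint the overlay before the blocking work starts.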
I'm prerendering my HTML pages for search-engine bots via PhantomJS through Selenium, so that they can see the fully loaded content. Currently, after PhantomJS reaches the page, I wait 5 seconds to be sure everything has loaded.
Instead of waiting those 5 seconds every time, one solution I'm contemplating is to wait until an html-ready attribute on the <body /> tag is set to true:
<html ng-app>
    <head>...</head>
    <body html-ready="{{htmlReady}}">
        ...
    </body>
</html>
.controller("AnyController", function($scope, $rootScope, AnyService) {
    $rootScope.htmlReady = false;
    AnyService.anyLongAction(function(anyData) {
        $scope.anyData = anyData;
        $rootScope.htmlReady = true;
    });
})
The question is: will the html-ready attribute always be set to true only after every view update has been applied (e.g. after anyData is displayed)? In other words, is it possible that for a lapse of time the html-ready attribute is true while the page is not yet fully rendered? If so, how can that be handled?
Setting the flag should be deferred until after the digest cycle; that way it has a better chance of working as expected.
AnyService.anyLongAction(function(anyData) {
    $scope.anyData = anyData;
    $timeout(function () {
        $rootScope.htmlReady = true;
    }, 0, false);
});
But it is useless at the scale of the app: you would have to set the flag after changes in every single place, and Angular doesn't offer anything to make that task easier.
Fortunately, you are free to abstract away from Angular and keep it simple.
var ignoredElements = [];
ignoredElements = ignoredElements.concat($('.continuously-updating-widget').toArray());
var delay = 200; // add to taste
var timeout;
var timeoutLimit;
var ready = function () {
    $('body').off('DOMSubtreeModified'); // stop listening once the DOM has settled
    clearTimeout(timeoutLimit);
    alert('ready');
};
$('body').on('DOMSubtreeModified', function (e) {
    if (ignoredElements.indexOf(e.target) < 0) {
        clearTimeout(timeout);
        timeout = setTimeout(ready, delay); // restart the quiet-period timer on every mutation
    }
});
timeoutLimit = setTimeout(ready, 5000); // hard limit: declare ready after 5 s no matter what
Feel free to Angularify it if needed, though this isn't production code anyway.
It is a good idea to wrap the handler in a throttling function (the event fires constantly). If the page makes remote requests that could outlast the timeout delay, it may be better to combine this approach with promises from the relevant async services and resolve them together with $q.all, as sketched below. Still, this is much better than chasing after every single directive and service.
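A rough sketch of that combination (the two services and their load() methods are purely illustrative, and $q and $rootScope are assumed to be injected):

$q.all([UserService.load(), CatalogService.load()]).then(function () {
    // every known async dependency has resolved; the DOM-quiet
    // check above catches whatever rendering remains
    $rootScope.htmlReady = true;
});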
DOMSubtreeModified is considered obsolete (it was never really standardized; MutationObserver is recommended instead), but current versions of Firefox and Chrome support it, and it should be fine for Selenium.
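For reference, a minimal sketch of the same quiet-period check rewritten with MutationObserver (the 200 ms delay and the html-ready attribute are the same illustrative values as above):

var delay = 200;
var quietTimer;
var observer = new MutationObserver(function () {
    clearTimeout(quietTimer);
    quietTimer = setTimeout(function () {
        observer.disconnect(); // stop watching once the DOM has settled
        document.body.setAttribute('html-ready', 'true');
    }, delay);
});
observer.observe(document.body, {
    childList: true,
    subtree: true,
    attributes: true,
    characterData: true
});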
Short answer
No. It isn't guaranteed that your markup will be completely rendered when html-ready is set.
Long answer
To the best of my knowledge, it's not possible to accurately determine when Angular has finished updating the DOM after a model change. In general it happens very quickly, taking no more than a few digest cycles, but that's not always the case.
Correctly detecting when a page has finished loading/rendering is actually quite a challenge, and if you look at the source code of specialized tools like prerender, you'll see that they use several different checks to decide whether a page is ready. Even so, it doesn't work 100% of the time (Phantom may crash, a request may take longer than usual to complete, and so on).
If you really want to come up with your own solution for this problem, I suggest that you take a look at prerender's source code (or another similar project) to get some inspiration.
This is a very simple use case: show an element (a loader), run some heavy calculations that tie up the thread, and hide the loader when done. I am unable to get the loader to actually show up before the long-running process starts; it ends up showing and hiding after the process has finished. Is adding CSS classes an asynchronous operation?
See my jsbin here:
http://jsbin.com/voreximapewo/12/edit?html,css,js,output
To expand on what a few others have pointed out: this is due to how the browser queues the things it needs to do (run JS, respond to UI events, update/repaint the page, etc.). While a JS function runs, it prevents all those other things from happening until the function returns.
Take for example:
function work() {
    var arr = [];
    for (var i = 0; i < 10000; i++) {
        arr.push(i);
        arr.join(','); // deliberately expensive: rebuilds the string every iteration
    }
    document.getElementsByTagName('div')[0].innerHTML = "done";
}
document.getElementsByTagName('button')[0].onclick = function() {
    document.getElementsByTagName('div')[0].innerHTML = "thinking...";
    work();
};
(http://jsfiddle.net/7bpzuLmp/)
Clicking the button here changes the div's innerHTML and then calls work, which takes a second or two. Although the div's innerHTML has changed, the browser doesn't get a chance to update the actual page until the event handler returns, which means waiting for work to finish. By that time the div's innerHTML has changed again, so when the browser finally does repaint, it simply displays 'done' without ever showing 'thinking...'.
We can, however, do this:
document.getElementsByTagName('button')[0].onclick = function() {
    document.getElementsByTagName('div')[0].innerHTML = "thinking...";
    setTimeout(work, 1);
};
(http://jsfiddle.net/7bpzuLmp/1/)
setTimeout works by putting a call to the given function at the back of the browser's queue once the given time has elapsed. Being at the back of the queue means it runs after the browser has had a chance to repaint the page (the earlier HTML-changing statement left a repaint pending before setTimeout queued work), so the browser can display 'thinking...' before the time-consuming work begins.
So, basically, use setTimeout.
Let the current frame render and start the process afterwards with setTimeout(work, 1).
Alternatively, you could read a layout property to force a reflow, like this: element.clientWidth.
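A sketch of that trick (the #loader id is illustrative); note that reading clientWidth only guarantees a synchronous layout, and whether the browser also paints at that point varies, so the setTimeout approach is the safer bet:

var loader = document.getElementById('loader');
loader.style.display = 'block';
loader.clientWidth;  // force a synchronous reflow before the heavy work
work();              // the heavy calculation
loader.style.display = 'none';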
More of a "what is possible" answer: you can run your calculations on a new thread using HTML5 Web Workers.
This will not only let your loading icon show up, it will keep animating too.
More info about Web Workers: http://www.html5rocks.com/en/tutorials/workers/basics/
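A minimal sketch of that approach, assuming the heavy loop is moved into a separate worker.js file (all names are illustrative):

// main.js
var worker = new Worker('worker.js');
document.getElementById('loader').style.display = 'block';
worker.onmessage = function (e) {
    document.getElementById('loader').style.display = 'none';
    console.log('result:', e.data); // use the computed result
};
worker.postMessage({ items: 100000 }); // kick off the calculation

// worker.js
onmessage = function (e) {
    var total = 0;
    for (var i = 0; i < e.data.items; i++) {
        total += i; // stand-in for the heavy calculation
    }
    postMessage(total); // send the result back to the page
};

Because the loop runs off the main thread, the loader keeps animating; the trade-off is that workers have no DOM access, so all rendering still happens in the page.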
I have a very heavy graphical task to perform, and I need to show an on-screen progress bar while preventing the browser from freezing.
I understand that tight looping blocks the UI and that JavaScript is single-threaded, so I am using setTimeout to drive the drawing, as follows:
function FG_ShowHM(y) {
    for (var x = 0; x < 100; x++) {
        if (FG_TreeH[y*100+x] == "") {
            FG_hmctx.fillStyle = "rgba(255,255,255,1)";
        } else {
            var col = DegToCol(FG_min, FG_max, FG_TreeH[y*100+x]);
            FG_hmctx.fillStyle = "rgba(" + col.r + "," + col.g + ",0,1)";
        }
        FG_hmctx.fillRect(x*3, y*3, 3, 3);
    }
    ProgBT += 0.5;
    y++;
    if (y < 100) {
        window.setTimeout(FG_ShowHM(y), 100); // move on
    } else {
        XPW();
    }
}
And a call to that function from within another function:
window.setTimeout(FG_ShowHM(0));
NOTE: PW() is just a shortcut to jQuery functions that create the "please wait" element, and XPW() is just a shortcut to remove the "please wait" window.
For some reason the UI is still stuck, showing no progress at all, and worse, after a few seconds the browser freezes completely.
I have tried many ways to solve this issue, without success. I would like to know the best way to show progress during such a long operation, or at least how to prevent the browser from freezing.
Thanks in advance.
The problem is the way you use window.setTimeout. It expects a function as its first argument and a delay in milliseconds as its second. When you write window.setTimeout(FG_ShowHM(y), 100); you don't pass a function as a parameter; you execute FG_ShowHM(y) immediately and pass its result to window.setTimeout. The whole chain therefore runs synchronously, recursing without ever yielding, which blocks the UI.
To fix it, change the window.setTimeout call to:
window.setTimeout(function() { FG_ShowHM(y) }, 100);
Note: there are several places in your code with this window.setTimeout usage, so check them all.
Read more about window.setTimeout in the documentation.
When looking to improve a page's performance, one technique I haven't heard mentioned before is using setTimeout to prevent JavaScript from holding up the rendering of a page.
For example, imagine we have a particularly time-consuming piece of jQuery inline with the html:
$('input').click(function () {
    // Do stuff
});
If this code is inline, we hold up the perceived completion of the page while jQuery is busy attaching a click handler to every input on the page.
Would it be wise to spawn a new thread instead:
setTimeout(function() {
    $('input').click(function () {
        // Do stuff
    });
}, 100);
The only downside I can see is that there is now a greater chance the user clicks an element before the handler is attached. However, that risk may be acceptable, and we run a degree of it anyway, even without setTimeout.
Am I right, or am I wrong?
The actual technique is to use setTimeout with a time of 0.
This works because JavaScript is single-threaded. A timeout doesn't cause the browser to spawn another thread, nor does it guarantee that the code will execute in the specified time. However, the code will be executed when both:
The specified time has elapsed.
Execution control is handed back to the browser.
Therefore calling setTimeout with a time of 0 can be considered as temporarily yielding to the browser.
This means if you have long running code, you can simulate multi-threading by regularly yielding with a setTimeout. Your code may look something like this:
var batches = [...]; // Some array
var currentBatch = 0;

// Start long-running code, whenever browser is ready
setTimeout(doBatch, 0);

function doBatch() {
    if (currentBatch < batches.length) {
        // Do stuff with batches[currentBatch]
        currentBatch++;
        setTimeout(doBatch, 0);
    }
}
Note: While it's useful to know this technique in some scenarios, I highly doubt you will need it in the situation you describe (assigning event handlers on DOM ready). If performance is indeed an issue, I would suggest looking into ways of improving the real performance by tweaking the selector.
For example, if you only have one form on the page which contains <input>s, give the <form> an ID and use $('#someId input').
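Alternatively, a single delegated handler avoids touching every <input> at all; a sketch using the same illustrative #someId (delegation via .on() requires jQuery 1.7+):

$('#someId').on('click', 'input', function () {
    // Do stuff; one handler covers every current and future input in the form
});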
setTimeout() can be used to improve the perceived load time, but not the way you've shown it. Using setTimeout() does not cause your code to run in a separate thread. Instead, setTimeout() simply yields the thread back to the browser for (approximately) the specified amount of time; when it's time for your function to run, the browser yields the thread back to the JavaScript engine. In JavaScript there is never more than one thread (unless you're using something like Web Workers).
So, if you want to use setTimeout() to improve performance during a computation-intensive task, you must break that task into smaller chunks and execute them in order, chaining them together with setTimeout(). Something like this works well:
function runTasks(tasks, idx) {
    idx = idx || 0;
    tasks[idx++]();
    if (idx < tasks.length) {
        setTimeout(function() { runTasks(tasks, idx); }, 1);
    }
}

runTasks([
    function() {
        /* do first part */
    },
    function() {
        /* do next part */
    },
    function() {
        /* do final part */
    }
]);
Note:
The functions are executed in order. There can be as many as you need.
When the first function returns, the next one is called via setTimeout().
The timeout value I've used is 1. This is sufficient to cause a yield, and the browser will take the thread if it needs it, or allow the next task to proceed if there's time. You can experiment with other values if you feel the need, but usually 1 is what you want for these purposes.
You are correct that there is a greater chance of a "missed" click, but with a low timeout value it's pretty unlikely.
Are you able to halt JavaScript execution without locking up the browser? The way you would normally halt execution is with an infinite while() loop, but in Firefox that locks up the browser until the loop has ended.
What's your take on this?
I am trying to override window.confirm() to implement my own dialog using HTML. I am doing this so I don't have to change existing code (it's a pretty big code base).
I need to be able to halt execution to allow user input, and in turn return a boolean the way the standard confirm function does:
if (confirm("...")) {
    // user pressed "OK"
} else {
    // user pressed "Cancel"
}
Update
To my knowledge, this cannot be done using setTimeout() or setInterval(), since those functions execute the code given to them asynchronously.
confirm(), prompt() and alert() are special functions: they call out of the JavaScript sandbox into the browser, and the browser suspends JavaScript execution. You can't do the same thing yourself, since you need to build your functionality in JavaScript.
I don't think there's a great way to drop in a replacement without doing some restructuring along the lines of:
myconfirmfunction(function() {
    /* OK callback */
}, function() {
    /* cancel callback */
});
Either use callbacks or make your code Firefox-only. In Firefox, with support for JavaScript 1.7 and higher, you can use the yield statement to simulate your desired effect. I have created a library for this purpose called async.js. Its standard library includes a confirm method, which can be used like this:
if (yield to.confirm("...")) {
    // user pressed OK
} else {
    // user pressed Cancel
}
You cannot stop the event thread in JavaScript, so instead you have to work around the problem, usually by using callback functions: functions that are run at a later time but can be passed around like any other object in JavaScript. You may be familiar with them from AJAX programming. For example, this:
doSomeThing();
var result = confirm("some important question");
doSomeThingElse(result);
Would be converted into:
doSomeThing();
customConfirm("some important question", function(result) {
    doSomeThingElse(result);
});
where customConfirm now takes the question and passes the result to the function it receives as an argument. If you implement a DOM dialog with buttons, attach event listeners to the OK and CANCEL buttons and call the callback when the user clicks one of them.
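A bare-bones sketch of such a customConfirm, assuming the page markup contains a hidden #dialog element with a .question span and #ok/#cancel buttons (all of these names are illustrative):

function customConfirm(question, callback) {
    var dialog = document.getElementById('dialog');
    dialog.querySelector('.question').textContent = question;
    dialog.style.display = 'block'; // show the dialog overlay
    document.getElementById('ok').onclick = function () {
        dialog.style.display = 'none';
        callback(true);
    };
    document.getElementById('cancel').onclick = function () {
        dialog.style.display = 'none';
        callback(false);
    };
}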
There is an extension to the JavaScript language called StratifiedJS. It runs in every browser, and it allows you to do just that: halt one line of JavaScript code without freezing the browser.
You can enable StratifiedJS by including Oni Apollo (http://onilabs.com/docs) in your webpage:
<script src="http://code.onilabs.com/latest/oni-apollo.js"></script>
<script type="text/sjs"> your StratifiedJS code here </script>
Your code would look like this:
var dom = require("dom");
displayYourHtmlDialog();
waitfor {
    dom.waitforEvent("okbutton", "click");
    // do something when the user pressed OK
} or {
    dom.waitforEvent("cancelbutton", "click");
}
hideYourHtmlDialog();
// go on with your application
The way you normally halt execution should hardly ever be an infinite while loop.
Break your work up into parts that you call with setTimeout.
Change this:
DoSomeWork();
Wait(1000);
var a = DoSomeMoreWork();
Wait(1000);
DoEvenMoreWork(a);
to this:
DoSomeWork();
setTimeout(function() {
    var a = DoSomeMoreWork();
    setTimeout(function() {
        DoEvenMoreWork(a);
    }, 1000);
}, 1000);
I don't think there's any way to reasonably re-create the functionality of confirm() or prompt() in your own JavaScript. They're "special" in the sense of being implemented as calls into the native browser library. You can't really make a modal dialog of that sort in pure JavaScript.
I have seen various UI libraries that simulate the effect by putting an element on top of the page, that looks & acts like a modal dialog, but those are implemented using async callbacks.
You will have to modify the existing library, rather than replacing window.confirm.
I tried using tight looping for this. I needed to slow down a native event (which, AFAIK, is the only use case for a synchronous wait that can't be re-architected asynchronously). There are lots of example loops out there that claim not to lock up the browser, but none of them worked for me (the browser didn't lock up, but they prevented it from doing the very thing I was waiting for), so I abandoned the idea.
Next I tried storing and replaying the event, which also seems to be impossible cross-browser. However, depending on the event and how flexible you need to be, you can get close.
In the end I gave up, and feel much better for it; I found a way to make my code work without having to slow down the native event at all.