In the UI of an application that I've been developing, I have been trying to show a "loader" div before a specific (synchronous) processing step starts and hide it after the processing has ended. However, the loader never appears during the processing.
I found out that browsers optimise the rendering process by eliminating needless reflows where possible. There are some ways to manually trigger a reflow (for cases like mine), but I haven't managed to make any of them work. You can find below a minimal example, where the calculation of offsetHeight is used to trigger a reflow, but the TEST div does not appear at all.
<html>
<head>
    <script src="https://code.jquery.com/jquery-2.2.4.js"></script>
</head>
<body>
    <div id="test" style="display:none; background-color:red; height:500px">TEST</div>
    <script>
        function reflow() {
            var t = $("#test")[0].offsetHeight;
            $(window).trigger('resize');
            window.getComputedStyle($("#test")[0], null);
            $("#test").focus();
        }
        $("#test").show();
        reflow();
        console.log("Processing started");
        var sum = 0;
        for (var i = 0; i < 100000000; i++) {
            sum += i;
        }
        console.log("Processing ended");
        $("#test").hide();
    </script>
</body>
</html>
Working JSFiddle also here
Some of the ways I attempted to force a reflow are the following (as you can see in the script):
offsetHeight's calculation of element
window resize
window.getComputedStyle()
focus of element
NOTE: I know it can be solved by adding a timeout. However, I want to find a way to force a reflow, since a timeout is a "hack" and is not appropriate for this functionality.
Use setTimeout with a "0" parameter for the wait; this will force execution to pause just long enough for your show/hide to work.
https://jsfiddle.net/o3y60e0L/3/
$("#test").show();
var t = $("#test")[0].offsetHeight;
console.log("Processing started");
setTimeout(function() {
var sum = 0;
for(i = 0; i < 1000000000; i++) {
sum += i;
}
console.log("Processing ended");
$("#test").hide();
}, 0);
It finally seems that even when you trigger a reflow manually, as shown in my code, the visual update (the repaint) is queued as an event. Since there is only one JavaScript thread per window, these events are handled only after the currently executing code has finished.
Even though I could not find official documentation of this behaviour, I was also unable to trigger a synchronous repaint (while some code is executing) with any of the proposed techniques. So the only solution seems to be to add a small timeout, so that the thread is idle for a short period and can process the pending repaint before executing the remaining code (as suggested by @Dan Smolinske).
To avoid mixing the different concerns, one can create a new method executeWithLoader, as below:
function executeWithLoader(callback) {
    $("#test").show();
    setTimeout(function() {
        callback();
        $("#test").hide();
    }, 50);
}
And call this function from the code that executes the processing, in the following way:
function processing() {
    console.log("Processing started");
    var sum = 0;
    for (var i = 0; i < 100000000; i++) {
        sum += i;
    }
    console.log("Processing ended");
}
executeWithLoader(processing);
I leave this as the accepted answer for now, but if a future answer describes a successful way to trigger a reflow, I will switch to it.
Related
I have a block of code that executes when a button is clicked. The code uses a loop that sometimes takes a while to complete. When the user clicks the button, I want the cursor to change to a "wait" cursor before the loop begins. Once the loop is finished, the cursor should return to normal.
What is actually happening (in Chrome for Windows at least) is that the style doesn't get updated until after the loop. It seems to be a quirk of how buttons work. I really don't know. I'm out of guesses!
A sample fiddle: http://jsfiddle.net/ra51npjr/1/ (it just uses console.log to execute "something"... you might need to change how many times the loop runs depending on how zippy or slow your machine is).
Sample HTML:
<div class="fakebody">
<button id="foo">Foo</button>
</div>
Sample CSS:
.fakebody {
    height: 1000px;
    width: 100%;
}
.wait {
    cursor: wait !important;
}
Sample JavaScript:
$('#foo').on('click', function (e) {
    $('.fakebody').addClass('wait');
    for (var i = 0; i < 10000; i++) {
        console.log(i);
    }
    $('.fakebody').removeClass('wait');
});
--
Here are my ASSUMPTIONS on how the script should work:
The click happens, which fires up the code. Indeed, if I log "started!" inside the code block, it will correctly log that it has started
The cursor should be a wait cursor so long as it is hovering anywhere over "fakebody".
The for loop is just a simple way to kill a few seconds to see the effect. Feel free to substitute any other loop that takes a while to complete
At the end of the loop, the cursor is no longer a wait cursor
What is actually happening:
The loop executes
At the end of the loop, the cursor turns to a "wait" cursor and then instantly back to a regular cursor. The change doesn't happen until the loop is complete
Does anybody know a technique or workaround to get the cursor to change before the loop starts instead of only after it is finished? Is this known behaviour that I need to educate myself about (and if so, do you know where I should start looking?)
This is a common issue in JavaScript. This question may provide some deeper insight, but essentially the point is that synchronous JavaScript execution must finish before the browser can perform other actions (like updating the view).
Because .addClass, the for loop, and .removeClass all occur synchronously, the browser doesn't get a chance to redraw anything. A technique that is often used in these cases is to use setTimeout with a timeout of 0, which essentially just "yields" control back to the browser.
$('.fakebody').addClass('wait');
setTimeout(function() {
    for (var i = 0; i < 10000; i++) {
        console.log(i);
    }
    $('.fakebody').removeClass('wait');
}, 0);
If this is a common pattern, you could potentially extract it out to a function (which would also help improve readability) that wraps the async setTimeout. Here's a simple example:
/**
 * Wraps a long-running JavaScript process in a setTimeout
 * which yields to allow the browser to process events, e.g. redraw
 */
function yieldLongRunning(preFn, fn, postFn, ctx) {
    if (arguments.length <= 2) {
        ctx = fn; fn = preFn;
        preFn = postFn = function() {};
    }
    preFn.call(ctx);
    setTimeout(function() {
        fn.call(ctx);
        postFn.call(ctx);
    }, 0);
}
And use it like so:
yieldLongRunning(function() {
    $('.fakebody').addClass('wait');
},
function() {
    for (var i = 0; i < 10000; i++) {
        console.log(i);
    }
},
function() {
    $('.fakebody').removeClass('wait');
});
As a side point, note that setTimeout(..., 0) simply queues the function in the browser's event loop, alongside other queued JavaScript functions as well as other types of events (like redraws). Thus, no setTimeout call is guaranteed to run precisely at the given time - the timeout argument is simply a lower bound (and, in fact, there is a minimum timeout of 4ms specified by the HTML5 spec, which browsers use to prevent runaway timeout loops; you can still pass 0, though, and the browser will add the callback to the event queue after the minimum delay).
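To make the queueing behaviour concrete, here is a small illustrative sketch (not part of the original answer) showing that a zero-delay callback still runs only after all of the currently executing synchronous code has finished:
console.log("before setTimeout");
setTimeout(function () {
    // Runs only after the current synchronous block has finished
    // (and after the browser has had a chance to process other queued
    // work, such as a repaint).
    console.log("inside setTimeout callback");
}, 0);
// This loop blocks the thread; the callback above cannot run until it ends.
for (var i = 0; i < 10000000; i++) {}
console.log("after the blocking loop");
// Logged order: "before setTimeout", "after the blocking loop",
// "inside setTimeout callback".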
I think you should try to force a redraw by hiding + showing the parent element.
Try this:
document.getElementById('fakebody').style.display = 'none';
document.getElementById('fakebody').style.display = 'block';
Before and after the loop (i.e. when you want the child element "foo" to refresh).
EDIT: Since you're using jquery you could do this:
$('#fakebody').hide().show(0);
Demo - Use queue & dequeue to control the order in which things happen in jQuery.
$('#foo').on('click', function (e) {
    $('.fakebody').addClass('wait').queue(function(n) {
        for (var i = 0; i < 10000; i++) { console.log(i); }
    }).removeClass('wait').dequeue();
});
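For completeness, a variant of this queue-based idea that actually gives the browser time to paint the cursor change needs a delay in front of the queued work. A rough sketch (not from the answer above; the 50ms figure is an arbitrary choice):
$('#foo').on('click', function (e) {
    $('.fakebody').addClass('wait').delay(50).queue(function (next) {
        // The heavy loop runs after the 50ms delay in the fx queue,
        // so the 'wait' cursor has already been painted.
        for (var i = 0; i < 10000; i++) { console.log(i); }
        $(this).removeClass('wait');
        next(); // let the queue continue
    });
});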
I have some code like the following:
LoadingImage.show("#contentpage", urlStk.LoadImg);
var errors = 0;
var ComponentToUpdate = new Array();
var storedItems = JSON.parse(localStorage.getItem("Components"));
$(".myDataGridRow").each(function () {
    errors += validateInput(this);
    var component = {
        FamilyCode: $.trim($("td[name=FamilyCode]", this).text())
    };
    ComponentToUpdate.push(component);
});
LoadingImage.hide("#contentpage");
The $(".myDataGridRow").each()) loop can be a little bit slow. So I try to display some waiting animated gif that overlays on the data grid and its rows (myDataGridRow).
LoadingImage.show() and LoadingImage.hide() methods do work fine when the executed code between is some ajax call to a remote server.
The problem is that the animated gif never appears in this case (the each() loop is only going through HTML elements and performing simpls validations), nor its parent DIV container...
After many tests, it seems that any javascript code written before the each() loop seems to be executed after (I have not tried the alert() case, but any css changes on other elements are blocked till the each() loop finishes, timers declared before are triggered after... ) ??
Forcing the display of the waiting image inside the each loop does not work.
Any help idea will be welcome.
At a guess, your animation isn't playing because the JavaScript is running on the main thread. If you use setTimeout() (or setImmediate(), where supported), you can defer your expensive code until after the browser has had a chance to update the display, which allows the animation to appear.
Example below:
LoadingImage.show("#contentpage", urlStk.LoadImg);
setImmediate(function () {
    var errors = 0;
    var ComponentToUpdate = new Array();
    var storedItems = JSON.parse(localStorage.getItem("Components"));
    $(".myDataGridRow").each(function () {
        errors += validateInput(this);
        var component = {
            FamilyCode: $.trim($("td[name=FamilyCode]", this).text())
        };
        ComponentToUpdate.push(component);
    });
    LoadingImage.hide("#contentpage");
});
I want to refresh a span element before my calculations start, to inform the user that the calculation has started.
The following code never displays the 'calculating' message:
<script>
    document.getElementById('text').innerHTML = 'calculating';
    for (var i = 0; i < 9999; i++) {
        var y = Math.pow(i, i);
        console.log(y);
    }
    document.getElementById('text').innerHTML = 'done';
</script>
</body>
</html>
How can I fix that?
Guys, setTimeout is not an option for me. Besides, it looks ugly.
Place your code inside a setTimeout() call, and add a very minimal delay. I usually put 0ms, which just waits for the frame to render, then calls the function. Example:
document.getElementById('text').innerHTML = 'calculating';
setTimeout(function() {
    for (var i = 0; i < 9999; i++) {
        var y = Math.pow(i, i);
        console.log(y);
    }
    document.getElementById('text').innerHTML = 'done';
}, 0);
The first argument is a function to be called after the time passes; the second argument is the time to wait before calling the function. Note that this may still hang the browser if the calculation is too big (e.g. 999999 loop steps on a 2.5 GHz i5).
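If the calculation really is big enough to freeze the page, one option (a sketch only, not part of the answer above) is to split the loop into chunks and yield between them so the browser can keep repainting:
document.getElementById('text').innerHTML = 'calculating';

var i = 0;
(function chunk() {
    // Process a slice of the work, then yield back to the browser.
    var end = Math.min(i + 1000, 9999);
    for (; i < end; i++) {
        console.log(Math.pow(i, i));
    }
    if (i < 9999) {
        setTimeout(chunk, 0);
    } else {
        document.getElementById('text').innerHTML = 'done';
    }
})();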
I have the following code which demonstrates the difference in calling a long-running function directly from an event trigger, vs. using setTimeout().
Intended behavior:
When the first button is pressed, it stays visually pressed while the calculation runs for several seconds; when the calculation finishes, the button is released again and the second column changes from "not calculating yet" to "calculation done". (I won't elaborate on why that is supposed to happen; it's explained in the linked answer.)
When the second button is pressed, the button press registers and releases immediately; the second column immediately changes to the "calculating..." text. When the calculation finishes several seconds later, the second column changes from "calculating..." to "calculation done".
What actually happens:
This works perfectly in Chrome (both buttons behave as expected)
This works perfectly in Internet Explorer 8
This does NOT work in Firefox (v.25) as-is. Specifically, the second button behaves 100% as the first one.
Changing the timeout in setTimeout() from 0 to 1 has no effect
Changing the timeout in setTimeout() from 0 to 500 works
Which leaves me with a big conundrum.
According to the whole reasoning behind why setTimeout() works where the lack of one doesn't, the delay value should have zero effect on the behaviour, since setTimeout()'s main purpose here is to change the queuing order, NOT to delay things.
So, why is it not working with delay 0 or 1 on Firefox, but works as expected with delay 500 (and works with any delay on Internet Explorer 8/Chrome)?
UPDATE: In addition to source code below, I also made a JSFiddle. But for some reason JSFiddle refuses to even load on my Internet Explorer 8, so for that testing, the code below is required.
UPDATE2: Someone raised the possibility of there being an issue with configuration setting dom.min_timeout_value in Firefox. I have edited it from 4 to 0, restarted the browser, and nothing was fixed. It still fails with a timeout of 0 or 1 and succeeds with 500.
Here is my source code - I simply saved it to an HTML file on the C: drive and opened it in all three browsers:
<html><body>
<script src="http://code.jquery.com/jquery-1.9.1.js"></script>
<table border=1>
    <tr><td><button id='do'>Do long calc - bad status!</button></td>
        <td><div id='status'>Not Calculating yet.</div></td></tr>
    <tr><td><button id='do_ok'>Do long calc - good status!</button></td>
        <td><div id='status_ok'>Not Calculating yet.</div></td></tr>
</table>
<script>
    function long_running(status_div) {
        var result = 0;
        for (var i = 0; i < 1000; i++) {
            for (var j = 0; j < 700; j++) {
                for (var k = 0; k < 200; k++) {
                    result = result + i + j + k;
                }
            }
        }
        $(status_div).text('calculation done');
    }

    // Assign events to buttons
    $('#do').on('click', function () {
        $('#status').text('calculating....');
        long_running('#status');
    });

    $('#do_ok').on('click', function () {
        $('#status_ok').text('calculating....');
        window.setTimeout(function (){ long_running('#status_ok') }, 0);
    });
</script>
</body></html>
To test, you will need to change the nested loop boundaries to 300/100/100 for Internet Explorer 8; or to 1000/1000/500 for Chrome, due to different sensitivity of "this JS is taking too long" error coupled with JavaScript engine speed.
There is a copy of the current (Jun 28, 2016) implementation of window.setTimeout() in Ubuntu.
As we can see, the timer gets inserted by this line of code:
nsAutoPtr<TimeoutInfo>* insertedInfo =
mTimeouts.InsertElementSorted(newInfo.forget(), GetAutoPtrComparator(mTimeouts));
Then a few lines below you have an if() statement:
if (insertedInfo == mTimeouts.Elements() && !mRunningExpiredTimeouts) {
...
The insertedInfo == mTimeouts.Elements() checks whether the timer that was just inserted already timed out. The following block does NOT execute the attached function, but the main loop will immediately notice that a timer timed out and thus it will skip the IDLE state (a yield of the CPU) that you are expecting.
This clearly (at least to me) explains the behavior you are experiencing. The rendering on the screen is another process (task/thread) and the CPU needs to be relinquished for that other process to get a chance to re-paint the screen. For that to happen, you need to wait long enough so your timer function does not get executed immediately and a yield happens.
As you've noticed, a pause of 500ms does the trick. You can probably use a smaller number, such as 50ms. Either way it is not going to guarantee that a yield happens, but chances are it will happen if the computer on which that code is running is not currently swamped (i.e. an anti-virus is not running at full speed in the background...).
The complete SetTimeout() function from Firefox:
(location of the file in the source: dom/workers/WorkerPrivate.cpp)
int32_t
WorkerPrivate::SetTimeout(JSContext* aCx,
dom::Function* aHandler,
const nsAString& aStringHandler,
int32_t aTimeout,
const Sequence<JS::Value>& aArguments,
bool aIsInterval,
ErrorResult& aRv)
{
AssertIsOnWorkerThread();
const int32_t timerId = mNextTimeoutId++;
Status currentStatus;
{
MutexAutoLock lock(mMutex);
currentStatus = mStatus;
}
// It's a script bug if setTimeout/setInterval are called from a close handler
// so throw an exception.
if (currentStatus == Closing) {
JS_ReportError(aCx, "Cannot schedule timeouts from the close handler!");
}
// If the worker is trying to call setTimeout/setInterval and the parent
// thread has initiated the close process then just silently fail.
if (currentStatus >= Closing) {
aRv.Throw(NS_ERROR_FAILURE);
return 0;
}
nsAutoPtr<TimeoutInfo> newInfo(new TimeoutInfo());
newInfo->mIsInterval = aIsInterval;
newInfo->mId = timerId;
if (MOZ_UNLIKELY(timerId == INT32_MAX)) {
NS_WARNING("Timeout ids overflowed!");
mNextTimeoutId = 1;
}
// Take care of the main argument.
if (aHandler) {
newInfo->mTimeoutCallable = JS::ObjectValue(*aHandler->Callable());
}
else if (!aStringHandler.IsEmpty()) {
newInfo->mTimeoutString = aStringHandler;
}
else {
JS_ReportError(aCx, "Useless %s call (missing quotes around argument?)",
aIsInterval ? "setInterval" : "setTimeout");
return 0;
}
// See if any of the optional arguments were passed.
aTimeout = std::max(0, aTimeout);
newInfo->mInterval = TimeDuration::FromMilliseconds(aTimeout);
uint32_t argc = aArguments.Length();
if (argc && !newInfo->mTimeoutCallable.isUndefined()) {
nsTArray<JS::Heap<JS::Value>> extraArgVals(argc);
for (uint32_t index = 0; index < argc; index++) {
extraArgVals.AppendElement(aArguments[index]);
}
newInfo->mExtraArgVals.SwapElements(extraArgVals);
}
newInfo->mTargetTime = TimeStamp::Now() + newInfo->mInterval;
if (!newInfo->mTimeoutString.IsEmpty()) {
if (!nsJSUtils::GetCallingLocation(aCx, newInfo->mFilename, &newInfo->mLineNumber)) {
NS_WARNING("Failed to get calling location!");
}
}
nsAutoPtr<TimeoutInfo>* insertedInfo =
mTimeouts.InsertElementSorted(newInfo.forget(), GetAutoPtrComparator(mTimeouts));
LOG(TimeoutsLog(), ("Worker %p has new timeout: delay=%d interval=%s\n",
this, aTimeout, aIsInterval ? "yes" : "no"));
// If the timeout we just made is set to fire next then we need to update the
// timer, unless we're currently running timeouts.
if (insertedInfo == mTimeouts.Elements() && !mRunningExpiredTimeouts) {
nsresult rv;
if (!mTimer) {
mTimer = do_CreateInstance(NS_TIMER_CONTRACTID, &rv);
if (NS_FAILED(rv)) {
aRv.Throw(rv);
return 0;
}
mTimerRunnable = new TimerRunnable(this);
}
if (!mTimerRunning) {
if (!ModifyBusyCountFromWorker(true)) {
aRv.Throw(NS_ERROR_FAILURE);
return 0;
}
mTimerRunning = true;
}
if (!RescheduleTimeoutTimer(aCx)) {
aRv.Throw(NS_ERROR_FAILURE);
return 0;
}
}
return timerId;
}
IMPORTANT NOTE: The JavaScript yield instruction has nothing to do with what I am talking about. I am talking about the sched_yield() functionality, which happens when a binary process calls certain functions, such as sched_yield() itself, poll(), select(), etc.
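Coming back to the original question: an approach not mentioned in this answer, but aimed at the same goal (making sure a repaint has happened before the heavy work starts), is to wait for a frame with requestAnimationFrame and only then schedule the work. A rough sketch, reusing long_running() and the #status_ok element from the question:
$('#do_ok').on('click', function () {
    $('#status_ok').text('calculating....');
    requestAnimationFrame(function () {
        // This callback fires just before the next paint; a zero-delay
        // timeout queued from here runs after that paint has been committed.
        setTimeout(function () {
            long_running('#status_ok');
        }, 0);
    });
});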
I faced this issue with Firefox while toggling CSS classes using jQuery to control a CSS transition.
Increasing the duration of setTimeout to 50 from 0 helped, but as Alexis suggested this wasn’t 100% reliable.
The best (if longwinded) solution I found was to combine an interval timer with an IF statement to actually check whether the necessary styles had been applied before triggering the transition, rather than using setTimeout and assuming execution had taken place in the intended order, e.g.
var firefox_pause = setInterval(function() {
    // Test whether page is ready for next step - in this case the div must have a max height applied
    if ($('div').css('max-height') != "none") {
        clear_firefox_pause();
        // Add next step in queue here
    }
}, 10);

function clear_firefox_pause() {
    clearInterval(firefox_pause);
}
In my case at least, this seems to work every time in Firefox.
In Firefox, the minimum value for setTimeout() calls is configurable and defaults to 4 in current versions:
dom.min_timeout_value The minimum length of time, in milliseconds,
that the window.setTimeout() function can set a timeout delay for.
This defaults to 4 ms (before 10 ms). Calls to setTimeout() with a
delay smaller than this will be clamped to this minimum value.
Values like 0 or 1 should behave like 4; I have no idea whether that will merely delay your code or actually break it.
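A quick way to observe the clamping (an illustrative sketch only; exact numbers depend on the browser, its nesting rules and the current load):
var start = performance.now();
var count = 0;
(function tick() {
    if (++count < 20) {
        setTimeout(tick, 0);
    } else {
        // With a minimum timeout in effect, 20 chained zero-delay timeouts
        // add up to something noticeably larger than 0ms.
        console.log('20 chained setTimeout(0) calls took',
            performance.now() - start, 'ms');
    }
})();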
When this function is called, the style change on the "gif" element does not show up until "lotsOfProcessing()" finishes. However, when I uncomment the alert("test"), the style change is shown before the alert pops up.
What I am trying to do is have an animated gif displayed while lotsOfProcessing is running. This seemed like a pretty straightforward solution, but it is clearly not working. Any suggestions / solutions?
function nameOfFuntion()
{
    document.getElementById("gif").style.display = "inline";
    //alert("test");
    lotsOfProcessing();
}
JavaScript code executes on the same thread as the browser's rendering. Everything that needs to be drawn waits for JavaScript execution to complete - including the next frame of any GIF animation.
The only solution is to break your long processing code down into smaller parts and delay each part using timers.
For example:
function nameOfFuntion() {
    document.getElementById("gif").style.display = "inline";
    //alert("test");
    lotsOfProcessing();
}

function lotsOfProcessing() {
    var i = 0;
    window.setTimeout(function step() {
        partOfIntenseProcessing();
        if (i < 1000000) {
            i++;
            // named function expression instead of the deprecated arguments.callee
            window.setTimeout(step, 10);
        }
    }, 10);
}
This will delay how long it will take for your processing to complete, but between timer execution the GIF can continue to animate.
You can also take a look at Web Workers, which allow you to run JavaScript operations in a background thread. However, they are not widely implemented yet (read: not available in Internet Explorer).
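For reference, a minimal Web Worker sketch (the worker.js file name and the message shape are illustrative assumptions, not something from this answer):
// main page: the GIF keeps animating while the worker is busy
var worker = new Worker('worker.js');
document.getElementById("gif").style.display = "inline";
worker.postMessage({ iterations: 1000000 });
worker.onmessage = function (e) {
    document.getElementById("gif").style.display = "none";
    console.log('result from worker:', e.data);
};

// worker.js: runs on a background thread
onmessage = function (e) {
    var sum = 0;
    for (var i = 0; i < e.data.iterations; i++) {
        sum += i;
    }
    postMessage(sum);
};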
Perform your heavy processing in a delayed function with window.setTimeout():
function nameOfFunction()
{
    document.getElementById("gif").style.display = "inline";
    window.setTimeout(lotsOfProcessing, 10);
}
That's strange indeed. It seems like lotsOfProcessing gets hold of JavaScript's single thread before the DOM has time to refresh, but it's the first time I've heard of something like that.
You might try this (note that it is not an ideal solution):
function nameOfFuntion()
{
    document.getElementById("gif").style.display = "inline";
    setTimeout(lotsOfProcessing, 100);
}
This is a vaguely educated guess, but it may be worth trying to put document.getElementById("gif").style.display = "inline"; into a function, e.g.
function nameOfFuntion()
{
    showGif();
    //alert("test");
    lotsOfProcessing();
}

function showGif() {
    document.getElementById("gif").style.display = "inline";
}
My thinking is that perhaps lotsOfProcessing() is getting hoisted to the top of nameOfFunction() because it's a function, and is therefore getting processed first.