I was under the impression that all DOM manipulations were synchronous.
However, this code is not running as I expect it to.
RecordManager.prototype._instantiateNewRecord = function(node) {
  this.beginLoad();
  var new_record = new Record(node.data.fields, this);
  this.endLoad();
};
RecordManager.prototype.beginLoad = function() {
  $(this.loader).removeClass('hidden');
};

RecordManager.prototype.endLoad = function() {
  $(this.loader).addClass('hidden');
};
The Record constructor function is very large: it instantiates a whole bunch of Field objects, each of which instantiates some other objects of its own.
This results in a 1-2 second delay, and I want to show a loading icon during that delay so it doesn't just look like the page froze.
I expect the flow of events to be:
show loading icon
perform record instantiation operation
hide loading icon
Except the flow ends up being:
perform record instantiation operation
show loading icon
hide loading icon
So you never even see the loading icon at all; I only know it appears briefly because the updates in the Chrome developer tools DOM viewer lag behind a little.
Should I be expecting this behavior from my code? If so, why?
Yes, this is to be expected. Although the DOM may have been updated, you won't see the change until the browser has a chance to repaint. The repaint gets queued the same way as everything else in the browser (i.e. it won't happen until the current block of JavaScript has finished executing), though pausing in a debugger will generally allow it to happen.
In your case, you can fix it using setTimeout with an immediate timeout:
RecordManager.prototype._instantiateNewRecord = function(node) {
  var self = this; // keep a reference; inside the setTimeout callback, "this" would no longer be the RecordManager
  this.beginLoad();
  setTimeout(function() {
    var new_record = new Record(node.data.fields, self);
    self.endLoad();
  }, 0);
};
This will allow the repaint to happen before executing the next part of your code.
JavaScript is always synchronous. It mimics multi-threaded behavior when it comes to Ajax calls and timers, but once a callback actually runs, it blocks as usual.
That said, you most likely have a setTimeout in that constructor somewhere (or a method you're using does). Even if it's setTimeout(fnc, 0).
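A minimal sketch (not from either answer) of that blocking behavior: once a timer callback starts running, it occupies the single JavaScript thread just like any other code.

setTimeout(function() {
  // Busy-wait for two seconds: no other timer, click handler, or repaint
  // can happen until this callback returns.
  var end = Date.now() + 2000;
  while (Date.now() < end) {}
  console.log('first callback done');
}, 0);

setTimeout(function() {
  console.log('second callback runs only after the first one has returned');
}, 0);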
Related
I'm working on an end-to-end test using Protractor. The part of the application I'm testing uses ng-switch statements to show/hide questions in the registration process, one at a time. The animation between questions gave me the hardest time. For example, attempting to load the page, go to the next question, and assert that an element exists was tough, among other things. The script would load the page, click the next button, then make the assertion before the next slide was on screen.
What's worse, for about half a second between questions, both the old question and the new one existed in the DOM. The best non-sleep wait mechanism I could come up with was a browser.wait() that first waited for there to be two questions in the DOM, chained to another browser.wait() that waited for there to be only one question again, and then proceeded from there. (This entire operation is wrapped into registerPage.waitForTransition() in the code.)
The browser.wait()s were not always blocking long enough, so I ended up writing code that looks like this:
it('moves to previous question after clicking previous link', function() {
  var title;
  // Get the current slide title, then click previous, wait for the transition,
  // then check the title again to make sure it changed
  registerPage.slideTitle.getText()
    .then(function(text) {
      title = text;
    })
    .then(registerPage.prevLink.click())
    .then(registerPage.waitForTransition())
    .then(function() {
      expect(registerPage.slideTitle.getText()).not.toBe(title);
    });
});
in order to ensure that each wait completed before the next command executed. This now works perfectly. What was happening before was that the tests would succeed 95% of the time, but would occasionally fire off the asserts or the next click action before the transition was actually 100% complete. That doesn't happen anymore, but I feel like this is almost OVERusing .then() on promises. At the same time, it makes sense to force everything to occur sequentially, since that's how interacting with a site actually works: load the page, wait for the next button to slide in, make a selection, click the next button, wait for the next slide, and so on.
Am I doing this in a completely bad-practice style or is this acceptable use of promises when using Protractor on an app with heavy animations?
I like these kinds of code-review-like questions, so thanks for posting.
I do think some of your .thens are unnecessary. The .click() and expect shouldn't need them, as they should be added to the controlFlow for you. The expect should also handle the promise for your getText().
The problem you're having seems to be within your waitForTransition() method, which is operating outside the controlFlow. Depending on how you're handling the waits within that method, you may need to add it to the controlFlow yourself. E.g., are you calling non-WebDriver commands? I've also had good luck using Expected Conditions (e.g. elementToBeClickable()) in cases like these.
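For instance, a minimal sketch of a controlFlow-friendly waitForTransition() (this assumes Protractor's ExpectedConditions helpers are available; the 5000 ms timeout and the choice of slideTitle as the element to wait on are purely illustrative):

var EC = protractor.ExpectedConditions;

// Inside the page object: returns a promise that resolves once the slide
// title is clickable, and is queued on the WebDriver controlFlow so later
// commands wait behind it.
this.waitForTransition = function() {
  return browser.wait(EC.elementToBeClickable(this.slideTitle), 5000,
      'Slide transition did not finish in time');
};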
Additionally, I would also offload much of this code to your page object, especially when waiting is required. For example, if you add something like this to your page object:
registerPage:
this.getSlideTitleText = function() {
  return this.slideTitle.getText().then(function(text) {
    return text;
  });
};

this.clickPrevLink = function() {
  this.prevLink.click();
  return this.waitForTransition(); // fix this and the rest should work
};
then your test could be...
it('moves to previous question after clicking previous link', function() {
  var title = registerPage.getSlideTitleText();
  registerPage.clickPrevLink();
  expect(registerPage.getSlideTitleText()).not.toBe(title);
});
This is a very simple use case: show an element (a loader), run some heavy calculations that eat up the thread, and hide the loader when done. I am unable to get the loader to actually show up before the long-running process starts; it ends up showing and hiding after the long-running process. Is adding CSS classes an async process?
See my jsbin here:
http://jsbin.com/voreximapewo/12/edit?html,css,js,output
To explain what a few others have pointed out: This is due to how the browser queues the things that it needs to do (i.e. run JS, respond to UI events, update/repaint how the page looks etc.). When a JS function runs, it prevents all those other things from happening until the function returns.
Take for example:
function work() {
  var arr = [];
  for (var i = 0; i < 10000; i++) {
    arr.push(i);
    arr.join(',');
  }
  document.getElementsByTagName('div')[0].innerHTML = "done";
}

document.getElementsByTagName('button')[0].onclick = function() {
  document.getElementsByTagName('div')[0].innerHTML = "thinking...";
  work();
};
(http://jsfiddle.net/7bpzuLmp/)
Clicking the button here will change the innerHTML of the div and then call work, which should take a second or two. And although the div's innerHTML has changed, the browser doesn't get a chance to update how the actual page looks until the event handler has returned, which means waiting for work to finish. But by that time, the div's innerHTML has changed again, so that when the browser does get a chance to repaint the page, it simply displays 'done' without ever displaying 'thinking...'.
We can, however, do this:
document.getElementsByTagName('button')[0].onclick = function() {
  document.getElementsByTagName('div')[0].innerHTML = "thinking...";
  setTimeout(work, 1);
};
(http://jsfiddle.net/7bpzuLmp/1/)
setTimeout works by putting a call to the given function at the back of the browser's queue once the given time has elapsed. Because the call is placed at the back of the queue, it runs after the browser has repainted the page (the earlier HTML-changing statement queued up a repaint before setTimeout added work to the queue), so the browser has had a chance to display 'thinking...' before starting the time-consuming work.
So, basically, use setTimeout.
Let the current frame render and start the process after a 1 ms setTimeout.
Alternatively, you could query a property and force a repaint, like this: element.clientWidth.
More as a "what is possible" answer: you can run your calculations on a separate thread using HTML5 Web Workers.
This will not only make your loading icon appear, but also keep it animating while the work runs.
More info about Web Workers: http://www.html5rocks.com/en/tutorials/workers/basics/
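A minimal sketch of the idea (not from the original answer; worker.js, inputData, and heavyCalculation are hypothetical names). Note that the worker cannot touch the DOM, so the loader is shown and hidden from the main thread:

// main.js (assumes the page contains an element with the classes "loader hidden")
var loader = document.querySelector('.loader');
loader.classList.remove('hidden');      // show the loading icon

var worker = new Worker('worker.js');   // worker.js is a separate file
worker.postMessage(inputData);          // inputData: whatever the calculation needs
worker.onmessage = function(e) {
  loader.classList.add('hidden');       // hide the icon once the result arrives
  console.log('result from worker:', e.data);
};

// worker.js (no DOM access here; only computation and messaging)
self.onmessage = function(e) {
  var result = heavyCalculation(e.data); // hypothetical CPU-bound function
  self.postMessage(result);
};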
I have the following code (Backbone view, rendering using Handlebars):
_this.$el.addClass("loading");
_this.el.innerHTML = _this.template({
some: data
});
_this.otherCPUConsumingRenderingFunctions();
_this.$el.removeClass("loading");
The CSS class displays a "Loading" message on screen to warn the user, since rendering takes time due to the large amount of data and the complexity of the rendering.
My problem is that the CSS class is correctly applied (I see it in the inspector) but nothing is displayed on screen.
If I put breakpoints and go step-by-step, it will work perfectly.
The issue occurs both with Chrome and Firefox.
No rendering function in browsers is synchronous, so your otherCPUConsumingRenderingFunctions is most probably returning as soon as you call it and doing its thing later, asynchronously.
That is why your loading class gets removed as soon as it is added.
Most likely you'll need to use a callback that runs after the rendering function completes. Also remember that expensive rendering operations, depending on their design, can be blocking, meaning the DOM does not get a chance to re-render until all the work is done. In that case it will add the loading class and remove it again before the DOM redraws. Stepping through your code gives the browser time to re-render, which is why you see it working when debugging.
Perhaps something like this:
_this.otherCPUConsumingRenderingFunctions = function(callback) {
  // do work here
  callback();
};

_this.$el.addClass("loading");
_this.el.innerHTML = _this.template({
  some: data
});

// You can use a timeout to "schedule" this work on the next tick.
// This will allow your DOM to get updated before the expensive work begins.
window.setTimeout(function() {
  _this.otherCPUConsumingRenderingFunctions(function() {
    // Ensure this only runs after the rendering completes.
    _this.$el.removeClass("loading");
  });
}, 1);
The backburner.js project was created to help mitigate this kind of problem. It works well with Backbone too.
To see the problem in action, see this jsbin. Clicking on the button triggers the buttonHandler(), which looks like this:
function buttonHandler() {
  var elm = document.getElementById("progress");
  elm.innerHTML = "thinking";
  longPrimeCalc();
}
You would expect that this code changes the text of the div to "thinking", and then runs longPrimeCalc(), an arithmetic function that takes a few seconds to complete. However, this is not what happens. Instead, "longPrimeCalc" completes first, and then the text is updated to "thinking" after it's done running, as if the order of the two lines of code were reversed.
It appears that the browser does not run "innerHTML" code synchronously, but instead creates a new thread for it that executes at its own leisure.
My questions:
What is happening under the hood that is leading to this behavior?
How can I get the browser to behave the way I would expect, that is, force it to update the "innerHTML" before it executes "longPrimeCalc()"?
I tested this in the latest version of chrome.
Your surmise is incorrect. The .innerHTML update does complete synchronously (and the browser most definitely does not create a new thread). The browser simply does not bother to update the window until your code is finished. If you were to interrogate the DOM in some way that required the view to be updated, then the browser would have no choice.
For example, right after you set the innerHTML, add this line:
var sz = elm.clientHeight; // whoops that's not it; hold on ...
edit — I might figure out a way to trick the browser, or it might be impossible; it's certainly true that launching your long computation in a separate event loop will make it work:
setTimeout(longPrimeCalc, 10); // not 0, at least not with Firefox!
A good lesson here is that browsers try hard not to do pointless re-flows of the page layout. If your code had gone off on a prime number vacation and then come back and updated the innerHTML again, the browser would have saved some pointless work. Even if it's not painting an updated layout, browsers still have to figure out what's happened to the DOM in order to provide consistent answers when things like element sizes and positions are interrogated.
I think the way it works is that the currently running code completes first, then all the page updates are done. In this case, calling longPrimeCalc causes more code to be executed, and only when it is done does the page update change.
To fix this you have to have the currently running code terminate, then start the calculation in another context. You can do that with setTimeout. I'm not sure if there's any other way besides that.
Here is a jsfiddle showing the behavior. You don't have to pass a callback to longPrimeCalc, you just have to create another function which does what you want with the return value. Essentially you want to defer the calculation to another "thread" of execution. Writing the code this way makes it obvious what you're doing (Updated again to make it potentially nicer):
function defer(f, callback) {
  var proc = function() {
    var result = f(); // declared with var so we don't leak a global
    if (callback) {
      callback(result);
    }
  };
  setTimeout(proc, 50);
}
function buttonHandler() {
  var elm = document.getElementById("progress");
  elm.innerHTML = "thinking...";
  defer(longPrimeCalc, function(isPrime) {
    if (isPrime) {
      elm.innerHTML = "It was a prime!";
    } else {
      elm.innerHTML = "It was not a prime =(";
    }
  });
}
When looking to improve a page's performance, one technique I haven't heard mentioned before is using setTimeout to prevent JavaScript from holding up the rendering of a page.
For example, imagine we have a particularly time-consuming piece of jQuery inline with the html:
$('input').click(function () {
  // Do stuff
});
If this code is inline, we are holding up the perceived completion of the page while this piece of jQuery is busy attaching a click handler to every input on the page.
Would it be wise to spawn a new thread instead:
setTimeout(function() {
  $('input').click(function () {
    // Do stuff
  });
}, 100);
The only downside I can see is that there is now a greater chance the user clicks on an element before the click handler is attached. However, this risk may be acceptable and we have a degree of this risk anyway, even without setTimeout.
Am I right, or am I wrong?
The actual technique is to use setTimeout with a time of 0.
This works because JavaScript is single-threaded. A timeout doesn't cause the browser to spawn another thread, nor does it guarantee that the code will execute in the specified time. However, the code will be executed when both:
The specified time has elapsed.
Execution control is handed back to the browser.
Therefore calling setTimeout with a time of 0 can be considered as temporarily yielding to the browser.
This means if you have long running code, you can simulate multi-threading by regularly yielding with a setTimeout. Your code may look something like this:
var batches = [...]; // Some array
var currentBatch = 0;

// Start long-running code, whenever browser is ready
setTimeout(doBatch, 0);

function doBatch() {
  if (currentBatch < batches.length) {
    // Do stuff with batches[currentBatch]
    currentBatch++;
    setTimeout(doBatch, 0);
  }
}
Note: While it's useful to know this technique in some scenarios, I highly doubt you will need it in the situation you describe (assigning event handlers on DOM ready). If performance is indeed an issue, I would suggest looking into ways of improving the real performance by tweaking the selector.
For example if you only have one form on the page which contains <input>s, then give the <form> an ID, and use $('#someId input').
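For instance (a minimal sketch, reusing the placeholder id from above):

// Scoping the selector to one form avoids scanning every <input> on the page.
$('#someId input').click(function () {
  // Do stuff
});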
setTimeout() can be used to improve the "perceived" load time, but not the way you've shown it. Using setTimeout() does not cause your code to run in a separate thread. Instead setTimeout() simply yields the thread back to the browser for (approximately) the specified amount of time. When it's time for your function to run, the browser will yield the thread back to the JavaScript engine. In JavaScript there is never more than one thread (unless you're using something like Web Workers).
So, if you want to use setTimeout() to improve performance during a computation-intensive task, you must break that task into smaller chunks, and execute them in-order, chaining them together using setTimeout(). Something like this works well:
function runTasks(tasks, idx) {
  idx = idx || 0;
  tasks[idx++]();
  if (idx < tasks.length) {
    setTimeout(function() { runTasks(tasks, idx); }, 1);
  }
}

runTasks([
  function() {
    /* do first part */
  },
  function() {
    /* do next part */
  },
  function() {
    /* do final part */
  }
]);
Note:
The functions are executed in order. There can be as many as you need.
When the first function returns, the next one is called via setTimeout().
The timeout value I've used is 1. This is sufficient to cause a yield, and the browser will take the thread if it needs it, or allow the next task to proceed if there's time. You can experiment with other values if you feel the need, but usually 1 is what you want for these purposes.
You are correct that there is a greater chance of a "missed" click, but with a low timeout value it's pretty unlikely.
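As a side note (not from the original answers, and assuming jQuery 1.7+), delegated event binding can shrink that window further: a single handler is attached to the document immediately and matches the inputs at click time, so no per-element binding has to finish first.

// One delegated handler on the document catches clicks on any <input>,
// including inputs added to the page later.
$(document).on('click', 'input', function () {
  // Do stuff
});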