Why is the asyncQueue processed before watchers? - javascript

I have a situation in my AngularJS-based app where the visibility of element A results in element B being bigger or smaller in width (just due to how the element CSS styles are set up). The visibility of element A is toggled by binding to a boolean on the scope using ng-show="showRail". For reasons I don't need to go into here, when the visibility of element A is toggled, I need to get element B's new width.

If I use $timeout to evaluate element B's width I get an accurate reading, but it's too late (the next frame) and causes a flicker due to some rendering that must be done as a result. I understand that $evalAsync is intended for executing logic after the DOM has been updated but before the browser has rendered. So in the handler where I'm toggling the showRail boolean I'm running scope.$evalAsync, but it appears to be too early: element B has still not received its new width.
I went searching through Angular's $digest method and found that the asyncQueue is processed before watchers. This seems backward to me and seems to explain why element A's visibility hasn't changed before I try to retrieve element B's new width. I'm hoping someone can explain why this is the case and maybe it will lead me to a solution to my specific problem. Thanks.

Take a look at setImmediate:
https://github.com/YuzuJS/setImmediate
It's almost tailor-made for this kind of thing. There's nothing in Angular's core that lets you control the order in which these operations are performed, and the answer to "why" isn't going to help you solve your problem. But setImmediate can help you avoid the flicker and could be a 5-minute solution if it works...
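A minimal sketch of how that might look in the toggle handler, assuming the polyfill above is loaded and elementB is a plain DOM reference to the element being measured (both names are placeholders):
function onToggleRail() {
    scope.showRail = !scope.showRail;   // ng-show updates element A during the digest
    setImmediate(function () {
        // Queued behind the current task (including Angular's digest) and, in
        // practice, usually run before the browser paints, so element B should
        // already have its new width here without the $timeout flicker.
        var newWidth = elementB.offsetWidth;
        // ... do the rendering that depends on newWidth
    });
}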

Related

When exactly does the browser repaint and reflow?

I was trying to understand better what happens when DOM manipulations happen in a browser. Something I could not find many resources on was when exactly repaints and reflows are executed by the browser (there is a lot of literature on what a repaint or reflow is, but almost nothing about when it happens).
Very specifically, I was wondering about code like this:
function clickHandler() {
    domElement1.textContent = 'Hello';
    domElement2.textContent = 'World';
}
Assume that there are two DOM elements stored in the variables domElement1 and domElement2. The clickHandler function is attached to a button, and should update the content of both DOM elements when it is clicked.
Now the question: is it possible that this causes two repaints? That is, could the browser update only domElement1, show that change on the screen (causing a repaint/reflow), and only afterwards update domElement2, causing another repaint/reflow? Or is there a guarantee that the button's event handler will be completely executed before anything is painted to the screen?
My guess would have been that the browser tries to keep 60 frames per second, and that theoretically these two changes might land in two different frames. Is that correct?
Browser reflow and browser repaint are two different, but related, ideas.
Reflow is essentially the computation of the offsets and dimensions of elements, to work out what goes where, how much space it takes, and so on.
Repaint is the next stage after reflow. It is when the computation made in the reflow stage is actually used to paint pixels on the screen. This process is sometimes also known as rasterization.
First things first: a repaint only happens when the main thread is free, i.e. when no event listener, no callback, simply nothing at all is in line for execution. And, as far as I can tell, a repaint only happens when a reflow was triggered previously.
In contrast to this, a reflow might be triggered with every single DOM mutation, depending on the underlying code.
As you can reason, the most naive approach for any browser is to trigger reflow after every single DOM mutation (e.g. the domElement1.textContent = 'Hello'; statement that you shared above). But as you might agree, this would be extremely inefficient.
What if there is a whole bunch of such statements in a line? Clearly, in this case, it would be much much better for the browser to do reflow just once at the very end of the whole block of statements and thus keep itself from wasting computing resources.
Now here's one thing to keep in mind. As you may know, browser engines, especially for JavaScript, have become extremely complicated, sophisticated and intelligent over the past few years. These days, they can make very good guesses about what a piece of code is asking for and then apply numerous kinds of optimizations in order to run it at top speed.
These optimizations include holding off on running the reflow algorithm when there is no need for it. Exactly how each browser does this is beyond the scope of this answer and is obviously implementation-dependent: Chrome might rely on one method, Firefox on another, Safari on yet another, and so on.
There isn't really any point in digging into each engine's code base to find out exactly how it works.
However, there is one thing all browsers have in common these days: they try to do the least amount of unnecessary work. That's one of the reasons for the success of modern-day JavaScript: robust engines are right at its back.
So, to end this answer, I would say that no, there is almost no possibility that the following code would trigger reflow twice:
function clickHandler() {
    domElement1.textContent = 'Hello';
    domElement2.textContent = 'World';
    // Reflow should happen at the end of this function.
}
However, the following might trigger it twice, because the second statement wants to know something (the distance of domElement1 from the top edge of the viewport) that can only be obtained once the reflow algorithm has been run by the browser:
function clickHandler() {
    domElement1.textContent = 'Hello';
    console.log(domElement1.getBoundingClientRect().top); // Should trigger reflow.
    domElement2.textContent = 'World';
    // Reflow should happen at the end of this function.
}
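A common way to avoid that second reflow, sketched here with the same elements, is to batch the read before any writes, while the layout is still clean:
function clickHandler() {
    // Read first: nothing has been mutated yet, so this should not force
    // an extra reflow.
    var top = domElement1.getBoundingClientRect().top;
    domElement1.textContent = 'Hello';
    domElement2.textContent = 'World';
    console.log(top);
    // A single reflow should still happen at the end of this function.
}
Note that this reads the position from before the text change; if you need the value as it is after the mutation, the extra reflow is unavoidable.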
Note the use of the words 'almost' and 'might' in the sentences above. I've used them to emphasize that I am not 100% sure. What I describe most likely happens in every browser, but the probability is still not 100%. As I said before, engines work differently, and I can't blindly assert something I've never inspected myself.

Angularjs performance issues in $apply, but bindings are fast

I have a medium-sized Angular app which uses angular-1.2.10 and ui-router-0.2.8. When I transition to a particular state I get frame-rate issues on the animation, regardless of whether I use $animate on an ng-show or animate it manually.
When I dig into the profiler I can see that the $apply after the XHR takes up to 200ms, which I presume to be the cause of the lag. When I remove the code in the state I am transitioning to, the problem goes away, as expected.
There is no large ng-repeat, and the bindings are fast.
This is leaving me a bit stuck as I can't see where the issue is originating from. If anyone can see something to point me in the right direction that would be great.
UPDATE
I have gone into incognito mode and run the same tests with the $digest counter. The $digest runs 40 times. Lots of things seem to take a long time (30ms+), but I still can't find a cause.
UPDATE
Looking at the timeline, there seem to be a lot of DOMSubtreeModified events.
Angular uses $digest cycles to see if anything needs updating. The sheer number you've counted is probably just another symptom that there's room for optimization. The true problem lies in the time they take, and in the processing bottleneck, since it's slowing down your animations. So there are a couple of things to try:
Make sure you are not deep-watching anything, which means you shouldn't be passing 'true' for objectEquality. This process is more intensive and uses more memory as well.
Use an isolate scope if directives are involved, if you can. An isolate scope reduces $digest chatter compared with an inherited scope, which re-digests all shared scopes whenever the parent controller changes.
Try replacing $watch statements with an event handler where the results are rendered in the DOM. The reason for this is that you can reduce the number of times the DOM is re-rendered by $broadcasting an event once the data has been processed (at the end of the XHR call) instead of re-rendering each time you modify a property (see the sketch after this list).
Can you animate via CSS using hardware-accelerated attributes to smooth it out?
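As a hedged sketch of the $watch-to-event suggestion above (the URL, the event name 'data:processed', and the process/render helpers are all hypothetical):
// Where the XHR happens:
$http.get('/api/data').then(function (response) {
    $scope.data = process(response.data);        // finish all model work first
    $rootScope.$broadcast('data:processed');     // then notify the view once
});

// In the directive/controller that renders the DOM,
// instead of $scope.$watch('data', render):
$scope.$on('data:processed', function () {
    render();   // one re-render per XHR instead of one per property change
});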
Multiple $digests mean you have cascading model changes, where changing a triggers a $watch('a') that in turn changes b, which triggers another digest that fires a $watch('b'), which changes something else, until eventually something (heaven forbid) changes a again and re-triggers $watch('a').
That cascade can take a long time even if each individual $watch evaluation is fast. If you can make all of your changes in one go, without letting them propagate between watches, you'll cut down your digest count.
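For illustration, a cascade like that might look as follows (the scope properties are made up):
// Each watch dirties another property, so every change costs extra digest passes:
$scope.$watch('a', function (a) { $scope.b = a * 2; });
$scope.$watch('b', function (b) { $scope.c = b + 1; });

// Computing everything in one place keeps the digest count down:
$scope.setA = function (a) {
    $scope.a = a;
    $scope.b = a * 2;
    $scope.c = a * 2 + 1;
};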
It's hard to help without the code and the bindings in your markup. If you have parts of the UI that are read-only and don't depend on multiple digest cycles, try using bindonce: https://github.com/Pasvaz/bindonce. It might help you reduce the number of watchers and unexpected digest cycles.

jQuery: how to keep a condition from running only once

I have a little problem with a condition and how it's triggered. I have two objects in my HTML (a div and an img) that I am trying to keep constantly aligned with JS. By constantly I mean that when the window size changes, they realign (one of them is center-aligned, and no, I can't simply center-align the second one as well, because I also need to match its size, which definitely requires JS).
I made a little function that aligns them and sets the proper dimensions, and I trigger it on every window.onresize event (as well as on document ready). But I found out that it does not fire on zoom, and besides that it would suit me better not to depend on window.onresize.
So I thought I could write a condition like
if (div.width() != img.width()) {
    // do something to match it again
}
But it turned out that this condition only runs on the ready event (or rather the load event, since I have a picture). So my question is whether there is any way for the condition to be checked continuously. I know I can probably set an interval to take care of that, but a) I guess that about 99% of all executions would be pointless, and b) unless I set it to a very quick repetition, it would not fix the div and img mismatch immediately.
Thank you very much.
You can certainly define your own custom event and execute the aligning code when it occurs, but you still need a way to fire the event at the appropriate time.
That can only happen during the ordinary execution flow of the program (not an option here) or in the handler for one of the existing events: if none of those events is consistently fired when the trigger condition occurs, then you're only left with timers.
I'd be happy to be wrong on this, though.
Consider requestAnimationFrame as an alternative to setInterval.
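A rough sketch of that, reusing div and img from the question:
function checkAlignment() {
    if (div.width() !== img.width()) {
        // realign and resize here
    }
    requestAnimationFrame(checkAlignment);   // re-check once per rendered frame
}
requestAnimationFrame(checkAlignment);
Unlike setInterval, this runs at most once per frame and pauses while the tab is in the background, so although most iterations are still no-ops, the wasted work is cheaper and the fix lands on the very next frame.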

Optimize JS/jQuery performance (getBoundingClientRect) and eliminating layout redraw

So I have a project where I'm trying to optimize a fairly complex JavaScript function to the max, partly because it's supposed to run on smartphones (WebKit) and every little bit counts.
I've been using various debugging and timing techniques to go through my code and rewrite anything that might be slow, like jQuery-based parts where native DOM access might do better, and so on. What the function does is basically take a string of HTML text and cut it up to fit exactly into 3 DIVs that do not have a fixed position or size (a client templating mechanism).
At the moment the entire function takes around 100ms to execute in the iPad's browser (but in the production environment I ideally need to execute it 200 times), and the problem is that out of those 100ms, at least 20ms are due to this single line of code (in 3 loops):
var maxTop = $(cur).offset().top + $(cur).outerHeight();
"cur" is just a reference to a container DIV element and the line above is calculating its bottom position (so where my text should break). From looking at the offset jQuery code I understand it uses getBoundingClientRect and even eliminating jQuery offset/sizing and calling it directly does nothing to speed it up - so its getBoundingClientRect fault (at least in Webkit). I did a bit of research on it and I understand it causes layout redraw.
But still - can't believe that I do multiple DOM clears/clones/appends and all of those are much faster than a simple element position lookup? Any ideas out there? Maybe something webkit specific? Or something that doesn't cause redraw?
Would much appreciate it!
Did you try the following?
var maxTop = cur.offsetTop + cur.offsetHeight;
The point is that offsetTop and offsetHeight are native DOM properties, so accessing them should be faster than going through a function call.
I also ran into a similar problem: I had a loop in which I was fixing a series (sometimes 1000+) of DOM elements, changing them from float to absolute positioning. I immediately applied the new styling to each element, which was a big mistake: every time something is written to the DOM, the style has to be recalculated the next time your script asks for an element's position. So do all your reading first, and then all your writing, even if that means two separate loops (in the meantime you can safely write to the dataset property of your DOM elements).
See also: http://gent.ilcore.com/2011/03/how-not-to-trigger-layout-in-webkit.html
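Applied to the loop from the question, that read-then-write split might look roughly like this (containers and cutTextToFit are hypothetical stand-ins for the real code):
// Pass 1: read every layout value while the layout is still clean.
var maxTops = $(containers).map(function () {
    return $(this).offset().top + $(this).outerHeight();
}).get();

// Pass 2: write only, using the cached values, so the browser can batch the
// mutations into a single layout pass instead of one per iteration.
$(containers).each(function (i) {
    cutTextToFit(this, maxTops[i]);
});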

Performance of setting img src to unchanged value?

If I have an img tag like
<img src="example.png" />
and I set it via
myImg.src = "example.png";
to the same value again, will this be a no-op, or will browsers unnecessarily redraw the image? (I'm mainly interested in the behaviour of IE6-8, FF3.x, Safari 4-5 and Chrome.)
I need to change many (hundreds of) images at once, and manually comparing the src attribute might be a little superfluous, as I assume the browser already does this for me?
Don't assume the browser will do it for you. I am working on a project of similar scale which requires hundreds of (dynamic-loading) images, with speed as the top priority.
Caching the 'src' property of every element is highly recommended. It is expensive to read and set hundreds of DOM element properties, and even setting src to the same value may cause reflow or paint events.
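A minimal sketch of that caching (imgs and the calling code are placeholders):
var srcCache = [];   // last URL written, per image index

function setImageSrc(i, url) {
    if (srcCache[i] === url) return;   // identical value: skip the DOM write
    srcCache[i] = url;
    imgs[i].src = url;
}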
[Edit] The majority of sluggishness in the interface was due to all my loops and processes. Once those were optimized, the UI was very snappy, even when continuously loading hundreds of images.
[Edit 2] In light of your additional information (the images are all small status icons), perhaps you should consider simply declaring a class for each status in your CSS. Also, you might want to look into using cloneNode and replaceNode for a very quick and efficient swap.
[Edit 3] Try absolutely-positioning your image elements. It will limit the amount of reflow that needs to happen, since absolutely-positioned elements are outside of the flow.
When you change a bunch of elements at once, you're usually blocking the UI thread anyway, so only one redraw happens after the JavaScript completes, meaning the per-image redraw really isn't a factor.
I wouldn't double-check anything here; let the browser take care of it. The new ones are smart enough to do this efficiently (and it's never really been that much of a problem anyway).
The expensive case here is new images loading and reflowing the page as they load; existing images are very minor compared to that cost.
I recommend using the CSS sprite technique. More info at: http://www.alistapart.com/articles/
You can use an image that contains all the icons. Then instead of changing the src attribute, you update the background property.
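As an illustrative sketch (the class names and offsets are made up):
// CSS, for reference:
//   .icon       { background-image: url(sprite.png); width: 16px; height: 16px; }
//   .icon-ok    { background-position: 0 0; }
//   .icon-error { background-position: -16px 0; }

function setStatus(el, status) {
    el.className = 'icon icon-' + status;   // swaps the visible sprite region;
                                            // no src write, no new request
}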
