When is Element.getBoundingClientRect guaranteed to be updated / accurate? - javascript

I am working on some code that uses Element.getBoundingClientRect (gBCR), coupled with inline style updates, to perform calculations. This is not for a general website, and I am not interested in whether there are "better CSS ways" of doing this task.
The JavaScript is run synchronously and performs these steps:
1. The parent's gBCR is fetched
2. Calculations are performed, and
3. A child element of the parent has inline CSS styles (e.g. size and margins) updated
4. The parent's gBCR is fetched again
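For concreteness, a minimal sketch of that sequence (parent and child stand for already-obtained element references; the specific style values are placeholders):
var parentRect = parent.getBoundingClientRect();  // step 1
// step 2: calculations based on parentRect ...
child.style.width = "200px";                      // step 3: inline style updates
child.style.marginLeft = "10px";
var updatedRect = parent.getBoundingClientRect(); // step 4: re-measure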
Am I guaranteed that the computed client bounds will reflect the new bounding rectangle of the parent at step 4?
If not guaranteed by a specification, is this "guaranteed" by modern¹ browser implementations? If "mostly guaranteed", what notable exceptions are there?
Elements are not being added to or removed from the DOM and the elements being modified are direct children of the parent node; if such restrictions / information is relevant.
1"Modern": UIWebView (iOS 6+), WebView (Android 2+), and the usual Chrome/WebKit, FF, IE9+ suspects - including mobile versions.

I'm stuck on exactly this kind of gBCR unreliability on iOS 8.4.1 / Safari 8.0.
Prepare a large div at the top of the body (its gBCR top is 0) and scroll to the bottom (the gBCR top becomes negative). Resize the div to 1x1: window.scrollY automatically goes back to 0, so the gBCR top should also be 0, but it still reports the old negative value. With a setTimeout of about 200 ms, you can confirm the correct value of 0.
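A hedged sketch of that repro and the timeout workaround (el stands for the large div):
el.style.width = "1px";
el.style.height = "1px";
var rect = el.getBoundingClientRect();
console.log(rect.top); // on iOS 8 Safari this can still be the stale negative value
setTimeout(function() {
    console.log(el.getBoundingClientRect().top); // ~200ms later: the correct 0
}, 200);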

Old question, but the problem still puzzled me and in my searches I stumbled on this question. It might help others.
The best guarantee that I could find to make getBoundingClientRect() work reliably is to force a scroll to the top of the window, calculate the positions, and then go back to wherever the user was.
Code would look something like:
var scroll_pos = document.documentElement.scrollTop; // save current position
window.scrollTo(0, 0); // go up
var v_align = parseInt(el.getBoundingClientRect().top); // example of gBCR for vert. alignment
// ... whatever other code you might need
window.scrollTo(0, scroll_pos); // get back to the starting position
Usually the operation is lightning fast, so the user should not notice it.

Related

Does 'display:none' improve or worsen performance?

I have a page with a lot of vertical scrolling and thousands of DOM elements. For improving performance, I thought about setting display: none; to the content of the divs above and below the viewport, that is, the divs which are not visible (while keeping their heights, obviously):
In order to check if my idea makes any sense I searched SO and I found this question. According to the comments and to the accepted answer, the best strategy is do nothing, since display: none; triggers reflow and may have the opposite effect:
Setting display to none triggers reflow, which is the complete opposite of what you want if what you want is to avoid reflow. Not doing anything doesn't trigger reflow. Setting visibility to hidden will also not trigger reflow. However, not doing anything is much easier.
However, there is a recent answer (which unfortunately seems more like a comment or even a question) that claims that display: none; is the current strategy used by sites like Facebook, where the vertical scroll is almost infinite.
It's worth mentioning that, unlike the OP's description in that question, each visible div in my site is interactive: the user can click, drag, and do other stuff with the div's contents (which, I believe, makes the browser repaint the page).
Given all this information, my question is: does display: none; applied to the divs above/below the viewport improve performance, worsen it, or have no effect?
The "display: none" property of an Element removes that element from the document flow.
Redefining that element display property from none to any other dynamically, and vice versa, will again force the change in document flow.
Each time requiring a recalculation of all elements under the stream cascade for new rendering.
So yes, a "display: none" property applied to a nonzero dimensional and free flow or relatively positioned element, will be a costly operation and therefore will worsen the performance!
This will not be the case for say position: absolute or otherwise, removed elements form the natural and free document flow who's display property may be set to none and back without triggering e re-flow on the body of the document.
Now in your specific case [see edited graph] as you move/scroll down bringing the 'display: block' back to the following div will not cause a re-flow to the rest of the upper part of the document. So it is safe to make them displayable as you go. Therefore will not impact the page performance. Also display: none of tail elements as you move up, as this will free more display memory. And therefore may improve the performance.
Which is never the case when adding or removing elements from and within the upper part of the HTML stream!
The answer is, like just about everything, it depends. I think you're going to have to benchmark it yourself to understand the specific situation. Here's how to run a "smoothness" benchmark since the perception of speed is likely more important than actual system performance for you.
As others have stated, display: none leaves the DOM in memory. Typically the rendering is the expensive part, but that depends on how many elements have to be rendered when things change. If the repaint operation still has to check every element, you may not see a huge performance increase. Here are some other options to consider.
Use Virtual DOM
This is why frameworks like React & Vue use a Virtual DOM. The goal is to take over the browser's job of deciding what to update and only making smaller changes.
Fully Add/Remove elements
You could replicate something similar by using an Intersection Observer to figure out what's in/out of the viewport and actually add/remove content from the DOM instead of relying on display: none alone, since running a little JavaScript is generally cheaper than large paints.
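A minimal sketch of that idea, assuming each row keeps its height via CSS so the scroll position is unaffected; #feed and .row are hypothetical names:
var contents = new WeakMap(); // row -> its detached content node

var observer = new IntersectionObserver(function(entries) {
    entries.forEach(function(entry) {
        var row = entry.target;
        if (entry.isIntersecting) {
            var content = contents.get(row);
            if (content) row.appendChild(content); // restore the real content
        } else if (row.firstChild) {
            contents.set(row, row.firstChild);     // detach but keep for later
            row.removeChild(row.firstChild);
        }
    });
}, { rootMargin: "200px 0px" }); // start the work shortly before rows scroll in

document.querySelectorAll("#feed .row").forEach(function(row) {
    observer.observe(row);
});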
Add GPU Acceleration
On the flip side, if the GPU takes over rendering, the paint might not be a performance drain, but that only holds on some devices. You can try it by adding transform: translate3d(0,0,0); to force GPU acceleration.
Give the Browser Hints
You may also see an improvement by utilizing the CSS will-change property. One of its intended inputs is content being outside the viewport, so try will-change: scroll-position; on the elements.
CSS Content-Visibility (The bleeding edge)
The CSS Working group at W3C has the CSS containment module in draft form. The idea is to allow the developer to tell the browser what to paint and when. This includes paint & layout containment. content-visibility:auto is a super helpful property designed for just this type of problem. Here's more background.
Edit (April 2021): this is now available in Chrome 85+, Edge (Chromium) 85+, and Opera 71+. We're still waiting on Firefox support, but Can I Use puts it at 65% coverage.
It's worth a look as the demos I saw made a massive difference in performance and Lighthouse scores.
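A hedged sketch of how it can be applied (the .section selector and the 500px size estimate are assumptions):
// Let the browser skip rendering work for off-screen sections.
document.querySelectorAll(".section").forEach(function(el) {
    el.style.contentVisibility = "auto";
    // Reserve an estimated size so the scrollbar doesn't jump while
    // skipped sections have no rendered layout.
    el.style.containIntrinsicSize = "0 500px";
});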
Two tips for improving performance when you have thousands of DOM elements that need to be scrolled through, interacted with, etc.:
1. Try to manage the bindings provided by front-end frameworks manually. Frameworks may do a lot of additional processing for the simple data binding you need. They are fine up to a certain number of DOM elements, but if your case is special and exceeds the average, the way to go is to bind manually for your circumstances. This can certainly remove lag.
2. Buffer the DOM elements in and around the viewport. If your DOM elements represent data in a table (or tables), fetch that data with a limit and only render what you fetched. The user's scrolling should drive the fetching and rendering upwards or downwards, as in the sketch after this list.
Just hiding the elements is definitely not going to solve your performance problem with thousands of DOM elements. Even though you can't see them, they still occupy the DOM tree and memory; the browser merely doesn't have to paint them.
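A minimal sketch of the buffering idea, assuming fixed-height rows and a hypothetical fetchRows(offset, limit) data layer that returns a Promise:
var ROW_HEIGHT = 30; // px, assumed fixed
var BUFFER = 20;     // extra rows above/below the viewport

window.addEventListener("scroll", function() {
    var firstVisible = Math.floor(window.scrollY / ROW_HEIGHT);
    var offset = Math.max(0, firstVisible - BUFFER);
    var limit = Math.ceil(window.innerHeight / ROW_HEIGHT) + 2 * BUFFER;
    // fetchRows and renderRows are placeholders for your own data layer.
    fetchRows(offset, limit).then(function(rows) {
        renderRows(rows, offset);
    });
});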
Here are some articles:
https://codeburst.io/taming-huge-collections-of-dom-nodes-bebafdba332
https://areknawo.com/dom-performance-case-study/
To add to the already posted answers.
The key takeaways from my testing:
setting an element to display: none; decreases RAM usage
elements that are not displayed are not affected by layout shifts and therefore have no (or very little) performance cost in this regard
Firefox handles large element counts far better (roughly 50x more elements)
Also try to minimize layout changes.
Here is my test setup:
<!DOCTYPE html>
<html lang="en">
<head>
    <meta charset="UTF-8">
    <meta name="viewport" content="width=device-width, initial-scale=1.0">
    <title>Document</title>
    <style>
        body {
            margin: 0;
        }
        .testEl {
            width: 100%;
            height: 10px;
        }
    </style>
</head>
<body>
    <main id="main"></main>
    <script>
        // Firefox max: const elementCount = 12200000;
        // Chrome max:
        const elementCount = 231000;
        const main = document.getElementById("main");
        const onlyShowFirst1000 = true;

        let _content = "";
        for (let i = 0; i < elementCount; i++) {
            _content += `<div class="testEl" style="background-color: hsl(${(Math.random() * 360) | 0}, 100%, 50%); display: ${!onlyShowFirst1000 || i < 1000 ? "block" : "none"}"></div>`;
        }
        main.innerHTML = _content;

        const addOneEnd = () => {
            const newEl = document.createElement("div");
            newEl.classList.add("testEl");
            newEl.style.backgroundColor = `hsl(${(Math.random() * 360) | 0}, 100%, 50%)`;
            requestAnimationFrame(() => {
                main.appendChild(newEl);
            });
        };

        const addOneBeginning = () => {
            const newEl = document.createElement("div");
            newEl.classList.add("testEl");
            newEl.style.backgroundColor = `hsl(${(Math.random() * 360) | 0}, 100%, 50%)`;
            requestAnimationFrame(() => {
                main.insertBefore(newEl, main.firstChild);
            });
        };

        // Call loop(true) / loop(false) from the console to keep inserting
        // elements at the beginning or end and observe the layout cost.
        const loop = (front = true) => {
            front ? addOneBeginning() : addOneEnd();
            setTimeout(() => loop(front), 100);
        };
    </script>
</body>
</html>
I create a lot of elements and have the option to display only the first 1000 of them using the onlyShowFirst1000 flag. When displaying all elements, Firefox allowed up to ~12,200,000 elements (using 10 GB of my RAM) and Chrome up to ~231,000.
Memory usage at 231,000 elements, with onlyShowFirst1000 set to false vs. true:
+---------+----------+----------+-------------+
|         | false    | true     | reduction % |
+---------+----------+----------+-------------+
| Chrome  | 415,764k | 243,096k | 42%         |
+---------+----------+----------+-------------+
| Firefox | 169.9MB  | 105.7MB  | 38%         |
+---------+----------+----------+-------------+
Changing the display property of an element to or from none causes its area to be repainted, but the area of your element will usually be relatively small, so the performance cost will also be small.
But depending on your layout, the display change might also cause a layout shift, which could be quite costly since it would cause a big part of your page to repaint.
In the future (e.g. Chrome 85) you will also be able to use the content-visibility property to tell the browser which elements don't have to be rendered.
Also, you can set the browser to show repaints using the dev tools: for Chrome, open the Rendering tab and check "Paint flashing".
The strategy of "virtual scrolling" is to remove HTML elements when they are out of the viewport. This improves performance because it reduces the number of elements in the DOM and so reduces the time needed to repaint/reflow the whole document.
display: none doesn't reduce the size of the DOM; it just makes the element invisible, like visibility: hidden, except that it doesn't occupy visual space.
display: none doesn't improve performance here, because the goal of virtual scrolling is to reduce the number of elements in the DOM.
Setting an element to display: none and removing an element both trigger a reflow, but with display: none you make performance worse because you don't get the benefit of a smaller DOM.
Performance-wise, display: none is just like visibility: hidden.
Google Lighthouse flags as bad performance pages with DOM trees that:
Have more than 1,500 nodes total
Have a depth greater than 32 nodes
Have a parent node with more than 60 child nodes
In general, look for ways to create DOM nodes only when needed, and destroy nodes when they're no longer needed.
The only benefit of display: none is that an element hidden this way will not cause a repaint or reflow when its contents change.
Source:
https://web.dev/dom-size/
https://developers.google.com/speed/docs/insights/browser-reflow

Keeping scroll position when adding elements on top works in Firefox but not Chrome

I have a Meteor app (source code) which has a stream of entries, and new entries are constantly being added on top. I am trying to make it so that if a user scrolls down to a particular entry, that entry stays visible and does not move even when more entries are added on top. Adding and removing entries is animated using Velocity.
I have made code which does that, but it works only in Firefox, while in Chrome it quickly starts jumping around as more entries come in. Why is that, and how could I fix it?
I'm going to post this since it took me a while to figure out. For me it had to do with the Scroll Anchoring feature, which became the default in Chrome 56.
The overflow-anchor property enables us to opt out of Scroll Anchoring, which is a browser feature intended to allow content to load above the user's current DOM location without changing the user's location once that content has been fully loaded. Source
You might want to try setting overflow-anchor to none, to opt out of the Scroll Anchoring functionality:
body {
overflow-anchor: none;
}
You can find a demo here, showcasing the difference with and without scroll anchoring.
After you insert the elements at the top, you need to manually re-scroll to the correct position:
function insertNewElementAtTop(parent, elem) {
    var scrollTopBeforeInsert = parent.scrollTop;
    parent.insertBefore(elem, parent.firstChild); // was "eParent.firstChild", a typo
    parent.scrollTop = scrollTopBeforeInsert + elem.offsetHeight;
}

What is the most efficient way to modify DOM elements and limit reflow?

When working with a very dynamic UI (think Single Page App) with potentially large JS libraries, view templates, validation, ajax, animations, etc... what are some strategies that will help minimize or reduce the time the browser spends on reflow?
For example, we know there are many ways to accomplish a DIV size change but are there techniques that should be avoided (from a reflow standpoint) and how do the results differ between browsers?
Here is a concrete example:
Given a simple example of 3 different ways to control the size of a DIV when the window is resized, which of these should be used to minimize reflows?
http://jsfiddle.net/xDaevax/v7ex7m6v/
//Method 1: Pure JavaScript
function resize(width, height) {
    var target = document.getElementById("method1");
    // Set both dimensions in a single style write; calling
    // setAttribute("style", ...) twice would overwrite the width.
    target.setAttribute("style", "width:" + width + "px; height:" + height + "px");
} // end function

window.onresize = function() {
    var height = (window.innerHeight / 4);
    var width = (window.innerWidth / 4);
    resize(width, height); // pass arguments in the order the function expects
};
//Method #3: jQuery animate
$(function() {
    $(window).on("resize", function(e, data) {
        $("#method3").animate({ height: window.innerHeight / 4, width: window.innerWidth / 4 }, 600);
    });
});
It's best to avoid changing DOM elements whenever possible. Sometimes you can prevent reflow entirely by sticking to CSS properties or, if required, using CSS transforms, so that the element itself is not affected at all and only its visual state changes. Paul Lewis and Paul Irish go into detail about why this is the case in this article.
This approach will not work in all cases, because sometimes it's required to change the actual DOM element, but for many animations and the like, transforms bring the best performance.
If your operations do require reflow, you can minimize the effect it has by:
Keeping the DOM depth small
Keeping your CSS selectors simple (and caching complicated lookups in a JavaScript variable)
Avoiding inline styles
Avoiding tables for layout
Avoiding JavaScript whenever possible
Nicole Sullivan posted a pretty good article on the subject that goes into more details of browser reflows and repaints.
If you're actually changing the DOM, not DOM properties, it's best to do it in larger chunks rather than smaller ones like this Stack Overflow post suggests.
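For instance, a minimal sketch of batching insertions with a DocumentFragment (the #list id is hypothetical), so the document reflows once rather than once per element:
var fragment = document.createDocumentFragment();
for (var i = 0; i < 100; i++) {
    var item = document.createElement("div");
    item.textContent = "Row " + i;
    fragment.appendChild(item);
}
// One append, one reflow, instead of 100 separate insertions.
document.getElementById("list").appendChild(fragment);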
In the example you provided, the second method is the best because it uses CSS properties without needing JavaScript. Browsers are pretty good at rendering elements whose dimensions and position are determined solely by CSS. However, it's not always possible to get the element where we need it with pure CSS.
The worst method is by far the third, because jQuery's animate is slow to begin with, and firing it on resize makes the animations stack on top of each other, so it lags far behind if you resize much at all. You can prevent this either by setting a timeout with a boolean flag to check whether it has already fired or, preferably, by not using jQuery's animate for this at all: use jQuery's .css() instead, since the resize event fires so often that the change will look animated anyway.
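A hedged sketch of that suggestion, debouncing the handler with a timeout (the 100ms delay is an arbitrary choice):
$(function() {
    var resizeTimer = null;
    $(window).on("resize", function() {
        clearTimeout(resizeTimer); // restart the delay on every resize event
        resizeTimer = setTimeout(function() {
            $("#method3").css({
                height: window.innerHeight / 4,
                width: window.innerWidth / 4
            });
        }, 100);
    });
});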

How to get updated width of content after a css change?

I'm trying to modify the css class of my body element.
Before modifying the class, I check the scroll width of my content:
$(window.document.body).prop('scrollWidth'); // 800px
Now I modify the css class and check the scroll width again:
$(window.document.body).prop('class', someCssClassName);
$(window.document.body).prop('scrollWidth'); // still reports 800px
I know the scroll width should not be 800px after this particular change. I start a timer and keep printing the scroll width, and after a few ms I see it change to 600px.
So it seems like I can't immediately get the updated content width (or I'm misinterpreting what's going on).
Is there a way to get notified when the re-flow is complete, so that I might get the updated width?
I don't want to set a timer and keep checking, if possible.
I'm trying this in an Android WebView, so I'm not sure if this behavior will be the same in a desktop browser.
Thank you
To answer the question: accessing the scrollWidth property automatically flushes any style change (forces a reflow) and then returns the computed value. This happens synchronously, hence you don't need to "wait" for the reflow to complete -- the JS will simply freeze while the reflow happens and then return the correct scrollWidth value.
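In other words (a minimal sketch; el stands for any element reference):
el.style.width = "600px"; // the style change is recorded but layout is deferred
var w = el.scrollWidth;   // forces a synchronous reflow ("layout flush")
console.log(w);           // already reflects the new layout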
You are actually facing a very specific Blink/WebKit bug in their scrollWidth implementation regarding the body element.
I've simplified your code a bit by removing some unnecessary jQuery abstraction (fiddle):
document.body.className = 'w600px';
console.log(
document.body.scrollWidth, // Firefox: 600, Chrome: viewport width
$(document.body).width() // 600 in both browsers
);
.w600px {
width: 600px;
}
From the CSSOM element.scrollWidth spec:
3. If the element is the HTML body element, the Document is in quirks mode and the element has no associated scrolling box, return max(viewport scrolling area width, viewport width).
It seems like Chrome is not checking whether the document is in quirks mode and returns the viewport (scrolling) width regardless of the document mode.
You should open a Chromium issue in these cases.
Workarounds
It really depends on your use case. $(document.body).width() is usually fine, unless the content overflows the body element's width.
Alternatively, wrap all the page's contents inside a div and use that div to apply the class and to retrieve the scrollWidth.
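A sketch of that wrapper workaround, reusing the .w600px class from above (the #page-wrapper id is hypothetical):
var wrapper = document.getElementById("page-wrapper"); // wraps all body content
wrapper.className = "w600px";
// As long as the content doesn't overflow, this reports 600 in both browsers.
console.log(wrapper.scrollWidth);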
Try:
function getprop(el, prop) {
    var props = window.getComputedStyle($(el).get(0)).getPropertyValue(prop);
    return String(props);
}
console.log(getprop("body", "width"));
http://jsfiddle.net/guest271314/6uKH6/5/

How to detect when the height of your page changes?

I have a javascript heavy app which has widgets like autocomplete dropdowns and tabs and so forth. Sometimes when dropdowns appear and disappear, or when you switch between tabs, it changes the height of the document. This can cause annoyances if the scrollbar appears and disappears rapidly, because it shifts the page. I would like to detect when a page changes its height, so I can fix the height to the maximum so far, so that if the scrollbar appears it won't disappear only a second later. Any suggestions?
Update: onresize won't work because that's for changes in the size of the viewport/window - I want changes in the length of the document. I hadn't known about the watch function, looks like it will work at least for FF, but IE doesn't support it.
I believe this question has already been answered on Stack Overflow here:
Detect Document Height Change
Basically you have to store the current document height and keep checking for a change via a timeout call.
The property to watch here is document.body.clientHeight (or, in jQuery, $(document).height()).
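A minimal polling sketch along those lines (the 200ms interval is an arbitrary choice):
var lastHeight = document.body.clientHeight;
setInterval(function() {
    var newHeight = document.body.clientHeight;
    if (newHeight !== lastHeight) {
        lastHeight = newHeight;
        // the document height changed; e.g. lock in the maximum so far here
    }
}, 200);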
I think you can trap "onresize" events.
Here is a link to the w3schools.com description.
You can use the resize event to trap changes to the window size using jQuery, as follows:
$(window).resize(function() {
    // your code to check sizes and take action
});
Alternatively you can track changes to the document (not tested):
$(document).resize(function() {
    // your code to check sizes and take action
});
One idea would be to use the watch() method on the clientHeight property:
document.body.watch("clientHeight", function(property, oldHeight, newHeight) {
// what you want to do when the height changes
});
The function you specify will be executed whenever that property changes. At any point you can use the unwatch() method to make it stop. (As noted in the question, watch() is a non-standard Gecko/Firefox-only feature.)
