Should ResizeObserver trigger for hidden elements? [duplicate] - javascript

It seems that when I pass a node to ResizeObserver.observe() in a DOMContentLoaded handler, the callback is invoked immediately. Is this considered normal behavior?

Yes, this behavior is per spec. They do have this note:
1. Observation will fire when watched Element is inserted/removed from DOM.
2. Observation will fire when watched Element display gets set to none.
3. Observations do not fire for non-replaced inline Elements.
4. Observations will not be triggered by CSS transforms.
5. Observation will fire when observation starts if Element is being rendered, and Element's size is not 0,0.
So in your case, either the element was not yet in the DOM and case 1 made it fire, or it was already there and case 5 did (though at `DOMContentLoaded`, it should be case 5 ;).
But following the actual normative specs, it appears that in any case the observation should fire initially. Note that Chrome changed its behavior fairly recently to do exactly this (CRBUG 1128016); before that change, it did not fire initially for hidden elements.
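If you only care about subsequent resizes, one common pattern is to ignore the initial notification that fires when observation starts. A minimal sketch; the `skipInitial` helper name is our own, not part of any API:

```javascript
// Wrap a ResizeObserver callback so the initial observation (fired when
// observe() is called on a rendered element) is ignored per element.
function skipInitial(callback) {
  const seen = new WeakSet();
  return (entries, observer) => {
    const fresh = entries.filter((entry) => {
      if (seen.has(entry.target)) return true;
      seen.add(entry.target); // first notification for this element: skip it
      return false;
    });
    if (fresh.length > 0) callback(fresh, observer);
  };
}

// Browser usage (assumes an element with id "panel" exists):
// const ro = new ResizeObserver(skipInitial((entries) => {
//   for (const e of entries) console.log(e.contentRect.width);
// }));
// ro.observe(document.getElementById("panel"));
```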

Related

JS: event listener on hidden element

Do I need to removeEventListener when I change element's style display:none?
Do events affect page performance?
No, you don't actually need to, especially if that element is going to be displayed again; it wouldn't make sense to add and remove the listener every single time. Event listeners are asynchronous and only impact performance when they are executed. Since you're hiding the element (and the user can't interact with it), your listener will not be called as long as it stays hidden, so there will be no performance impact.
Plus: even if you were removing the element entirely with parentEl.removeChild(childEl), you still wouldn't need to remove the listeners, because an element's listeners are garbage-collected along with it once nothing else references them.
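That said, if you do want deterministic cleanup (say, in a single-page app), the `signal` option of `addEventListener` detaches every listener registered with one `AbortSignal` in a single call. A sketch using `EventTarget` as a stand-in for a DOM element; the same API works on elements in the browser:

```javascript
// All listeners registered with one AbortSignal are removed by a
// single abort() call; no need to keep references to each handler.
const target = new EventTarget(); // stand-in for a DOM element
const controller = new AbortController();
let clicks = 0;
target.addEventListener("click", () => { clicks++; }, { signal: controller.signal });

target.dispatchEvent(new Event("click")); // the listener runs
controller.abort();                       // detaches every listener on this signal
target.dispatchEvent(new Event("click")); // no listener left; clicks unchanged
```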

Does adding too many event listeners affect performance?

I have a general question about javascript (jQuery) events/listeners.
Is there any limit on the number of click listeners before getting performance problems?
In terms of performance, the number of elements the event is bound to is where you'd see any issues.
Here is a jsperf test. You'll see that binding to many elements is much slower, even though only one event is being bound in each case.
The 3rd test in the jsperf shows how you'd bind the event to a parent element and use delegation to listen for the elements you want. (In this case .many)
n.b. The test shows that the 3rd option is the fastest, but that's because it's targeting the element with an id and not a class.
Update: Here's another perf test showing 3 events bound to an id vs one event bound using a class
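The delegation approach from the third test can be sketched as a single parent listener that inspects the event target instead of binding to every child. The `delegate` helper name is our own; jQuery's delegated `.on()` works on the same principle:

```javascript
// Event delegation: one listener on a parent handles events for all
// current and future children matching a selector.
function delegate(selector, handler) {
  return (event) => {
    // In the browser, closest() walks up from the clicked element
    // looking for an ancestor (or self) matching the selector.
    const match = event.target.closest(selector);
    if (match) handler.call(match, event);
  };
}

// Browser usage (assumes a container #list with .many children):
// document.getElementById("list").addEventListener(
//   "click",
//   delegate(".many", function (event) { console.log("clicked", this); })
// );
```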
Though this is an old question, I do not feel that it's completely answered yet.
As atmd pointed out, it matters a great deal which elements you attach the event handlers to.
But the original question seems to be more concerned about the performance impact of triggering those event handlers (e.g. click or scroll events).
And yes, adding additional event handlers to an element DOES decrease performance.
Here is a performance comparison to test the following cases:
https://jsbench.me/ztknuth40j/1
The results
One <div> has 10 click handlers, and the click event is triggered by jQuery.
→ 72,000 clicks/sec
One <div> has 100 click handlers, and the click event is triggered by jQuery.
→ 59,000 clicks/sec (~19% slower than the first case)
This shows that additional event handlers can slow down execution.
One <div> has 10 click handlers, and the click event is triggered by plain JS.
→ 84,000 clicks/sec (~6% faster than the first case)
Using plain JS is a little faster than using jQuery.
One <div> has 100 click handlers, and the click event is triggered by plain JS.
→ 14,000 clicks/sec (~77% slower than the second case)
This is interesting: with native events, the number of listeners seems to degrade performance faster than with jQuery.
(These results vary on every run and depend largely on your hardware and browser.)
Keep in mind that these tests are done with an empty function. With a real function that performs additional work, performance will drop even further.
Here is a second test that changes the contents of a div on each click:
https://jsbench.me/ztknuth40j/2
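The shape of these benchmarks can be sketched in a few lines: many independent handlers on one target all run on every dispatch, so cost grows with the handler count. Here `EventTarget` stands in for the benchmarked `<div>`:

```javascript
// Attach 100 no-op click handlers to one target; every dispatch
// invokes all of them, which is what the benchmark measures.
const benchTarget = new EventTarget(); // stand-in for the <div>
let handlerRuns = 0;
for (let i = 0; i < 100; i++) {
  // Each arrow function is a distinct listener, so all 100 register.
  benchTarget.addEventListener("click", () => { handlerRuns++; });
}
benchTarget.dispatchEvent(new Event("click")); // all 100 handlers run once
```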
Is it slow?
On the other hand: even 100 operations per second is super fast (it means that every event handler is executed 100 times in a single second), and no user will notice the delay.
I think you will not run into problems with user-action events like click or mouseenter handlers, but need to watch out when using events that fire rapidly, like scroll or mouseover.
Also, as computers get faster and browsers apply more and more optimizations, there is no hard limit for how many event handlers are "too many". It depends not only on the function that's called and the event that's observed but also on the device and browser of the user.

Javascript: Atomicity / Interactions between browser view and DOM

I have two specific Javascript questions that are probably answered by one general answer. Please feel free to also submit the corresponding general question--I have difficulties expressing myself.
When I manipulate multiple DOM elements in a single Javascript callback, is the view possibly updated "live" with each individual manipulation, or atomically after the callback returns?
When a user clicks an HTML element twice in a short timeframe, and the corresponding click handler disables the HTML element, is there a guarantee that the handler won't be executed twice?
Preemptively, I do not have a standards citation for this. This is strictly in my experience.
I have never noticed the visible pixels update in real time while Javascript is executing. I suspect that they will not during the standard operation of the browser; it is certainly possible that debugging presents an exception. I have, however, observed synchronous reflow calculations occurring on DOM elements between the top and bottom of a single function call, though these reflow calculations never made it to the pixel buffer (that I noticed). These appear to occur synchronously.
function foo() {
    $('#myElement').width();           // reads 100
    $('#myElement').parent().width();  // reads 150
    $('#myElement').css('width', 200); // write: invalidates layout
    $('#myElement').width();           // reads 200 (this read forces a synchronous reflow)
    $('#myElement').parent().width();  // reads 250
}
Regarding multiple clicks on an element that is disabled within the click handler, I suspect that the second click will not fire. I believe when the operating system receives a click event it passes it to the browser and it is placed in a queue. This queue is serviced by the same thread that executes Javascript. The OS click event will remain in the queue until Javascript completes execution at which time it will be removed, wrapped as a browser click event, and bubbled through the DOM. At this point the button will already be disabled and the click event will not activate it.
I'm guessing the pixel buffer is painted on-screen as another operation of this same thread though I may be mistaken.
This is based on my vague recollection of standards that I have seen quoted and referenced elsewhere. I don't have any links.
All script executions happen within the same thread. Therefore you can never have simultaneous actions and don't have to worry about concurrent modification of elements. This also means you don't need to worry about a click handler being fired while one is currently executing. However, this doesn't mean the browser can't fire it again immediately after your script finishes; the execution may be so fast that it's indistinguishable.
First Bullet: The updates will be live. For example, attach the following function to an onclick handler:
function(){
    var d = document.getElementById("myelement");
    d.setAttribute("align", "center");
    d.setAttribute("data-foo", "bar");
    d.setAttribute("data-bar", "baz");
}
Now load this in your browser, set a breakpoint on the first line, trigger the event, and step through line-by-line while watching the DOM. The updates happen live; they do not happen all at once.
If you want them to happen atomically, you'll want to clone the DOM element in question, make the changes on the clone, then replace the original element with the clone. The cloned element is still being updated in realtime, but the user-visible effect is atomic.
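The clone-and-swap pattern described above can be sketched as follows; the `updateAtomically` helper name is our own:

```javascript
// Apply several attribute changes "atomically" from the viewer's
// perspective: mutate a detached clone, then swap it in with one
// visible DOM change.
function updateAtomically(el, attrs) {
  const clone = el.cloneNode(true); // detached copy: edits are invisible
  for (const [name, value] of Object.entries(attrs)) {
    clone.setAttribute(name, value);
  }
  el.replaceWith(clone);            // single visible update
  return clone;
}

// Browser usage:
// updateAtomically(document.getElementById("myelement"),
//   { align: "center", "data-foo": "bar", "data-bar": "baz" });
```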
Second Bullet: If the second click event comes in after the element has been disabled, then yes, you won't get a second callback. But if there is any delay between the first click and the disable call (for example, some lengthy check needs to be performed to determine whether the element should be disabled) and the second click occurs in that delay, it will fire the callback a second time. The browser has no way to know that multiple click events aren't acceptable behavior in a given script.
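To avoid that race, disable the element synchronously at the top of the handler, before any slow work, so a queued second click finds it already disabled. A sketch with `EventTarget` standing in for a DOM button; in the browser this would be the element itself:

```javascript
// Disable first, work second: a second queued click hits the guard.
const button = new EventTarget(); // stand-in for a DOM button
button.disabled = false;
let clickRuns = 0;
button.addEventListener("click", () => {
  if (button.disabled) return; // a queued second click lands here
  button.disabled = true;      // disable before any lengthy work
  clickRuns++;
  // ... lengthy check / network request would go here ...
});

button.dispatchEvent(new Event("click"));
button.dispatchEvent(new Event("click")); // ignored: already disabled
```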

Unbind inline javascript events from HTML elements in memory

How do I completely unbind inline javascript events from their HTML elements?
I've tried:
undelegating the event from the body element
unbinding the event from the element
and even removing the event attribute from the HTML element
To my surprise at least, only removing the onchange attribute (.removeAttr('onchange')) was able to prevent the event from firing again.
<input type="text" onchange="validateString(this)">
I know this is possible with delegates and that's probably the best way to go, but just play along here. This example is purely hypothetical just for the sake of proposing the question.
So the hypothetical situation is this:
I'm writing a javascript validation library that has javascript events tied to input fields via inline HTML attributes like so:
<input type="text" onchange="validateString(this)">
But, I'd like to make the library a little better by unbinding my events, so that people working with this library in a single-page application don't have to manage my event handlers and so that they don't have to clutter their code at all by wiring up input events to functions in my hypothetical validation library... whatever. None of that's true, but it seems like a decent usecase.
Here's the "sample" code of Hypothetical Validation Library.js:
http://jsfiddle.net/CoryDanielson/jwTTf/
To test, just type in the textbox and then click elsewhere to fire the change event. Do this with the web inspector open and recording on the Timeline tab. Highlight the region of the timeline that corresponds to the change events (fire the change event multiple times) and you'll see the event listener count (in the window below) increase by 100 on each change event. If managed and removed properly, each event listener would be removed before a new input is rendered, but I have not found a way to do that properly with inline javascript events.
What that code does is this:
onChange, the input element triggers a validation function
That function validates the input and colors the border if successful
Then, after 1 second (to demonstrate the memory leak), the input element is replaced with identical HTML 100 times in a row without unbinding the change event (because I don't know how to do that; that's the problem here). This simulates changing the view within a single-page app, and it creates 100 new eventListeners in the DOM, which is visible through the web inspector.
Interesting note: $('input').removeAttr('onchange'); will actually prevent the onchange event from firing in the future, but it does not garbage-collect the eventListener/DOM entries visible in the web inspector.
This screenshot is after the change event fires 3 times. Each time, 100 new DOM nodes are rendered with identical HTML, and I've attempted to unbind the onchange event from each node before replacing the HTML.
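For completeness, the attribute-removal approach noted above can be paired with nulling the reflected property, so both the attribute-defined handler and the compiled handler reference are cleared. The `disableInlineHandler` helper name is hypothetical:

```javascript
// Fully detach an inline handler such as onchange="validateString(this)":
// remove the attribute and null the corresponding on* property.
function disableInlineHandler(el, eventName) {
  el.removeAttribute("on" + eventName); // e.g. removes the onchange attribute
  el["on" + eventName] = null;          // clears the compiled handler function
}

// Browser usage:
// disableInlineHandler(document.querySelector("input"), "change");
```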
Update: I came back to this question and just did a quick little test using the JSFiddle to make sure that the answer was valid. I ran the 'test' dozens of times and then waited -- sure enough, the GC came through and took care of business.
I don't think you have anything to worry about. Although the memory can no longer be referenced and will eventually be garbage collected, it still shows up in the Web Inspector memory window. The memory will be garbage collected when the GC decides to collect it (e.g., when the browser is low on memory or after some fixed time). The details are up to the GC implementer. You can verify this by clicking the "Collect Garbage" button at the bottom of the Web Inspector window. I'm running Chrome 23, and after I enter text in your validation box about 5 or 6 times, the memory usage comes crashing down, apparently due to garbage collection.
This phenomenon is not specific to inline events. I saw a similar pattern just by repeatedly allocating a large array and then overwriting the reference to that large array, leaving lots of orphaned memory for GC. Memory ramps up for a while, then the GC kicks in and does its job.
My first suggestion would have been to use off('change'), but it seems you've already tried that. It's possible that it's not working because the handler wasn't attached with .on('change'). I don't know too much about how jQuery handles listeners like this internally, but try attaching with .on('change', function () {...}) or .bind('change', function () {...}) instead.

jquery lost events

I would like to know if there is some known jQuery behaviour that causes the loss of event handlers (in particular in iframes)?
I've a strange kind of problem.
I've built a webapp composed of two iframe.
First I load content in the first iframe. I add some event handlers using jQuery to the first iframe's content DOM.
Everything works.
On user input i load a page in the second iframe. Here too, I add some event handlers using jquery.
Then the strange thing happens: jQuery loses the event handlers in the first iframe.
I say 'jQuery loses' because if I add an event listener the old way, it is still present.
Problem solved.
The problem was caused by accessing iframe2.contentWindow or iframe2.contentDocument on the second iframe after its src was changed (the first time everything worked; from the second time onward it caused problems), while the second frame was statically coded in the HTML.
To solve the problem, I now always remove the second iframe, then recreate it and append it to the DOM dynamically via javascript.
The problem occurs only on Opera 9.7 embedded for MIPS (not sure of the exact version).
You might want to use live to bind events. This way, new elements added later that match the same selector will have the event bound to them as well.
$("p").live("click", function(){
$(this).after("<p>Another paragraph!</p>");
});
Every subsequent p added to the page will have the event bound too. (Note that .live() was removed in jQuery 1.9; the delegated form, $(document).on("click", "p", handler), is the modern equivalent.)
