I'm trying to make a starry night with twinkling stars in CSS3 + JavaScript. However, my animation is consuming a lot of CPU. The main animation:
@for $i from 0 through 400 {
.star:nth-child(#{$i}) {
$star-size: (random() * (1-4) +4) + px;
top: (random(100)) + vh;
left: (random(100)) + vw;
width: $star-size;
height: $star-size;
animation: blinker 1.2s alternate infinite ease-in-out;
animation-delay: (random(30) / 10) + s;
transform: scale(0.2);
}
}
@keyframes blinker {
100% {
transform: scale(1);
}
}
the full code: https://jsfiddle.net/sam7krx0/
is there any way to make this code perform better?
Edit:
Tried with translateZ(0) and with will-change: transform, but the animation is still being rendered by the CPU.
https://jsfiddle.net/8hn97kcx/2/
Edit 2:
It seems that Firefox might be the problem; while testing on Chrome the animation uses way less CPU.
Edit 3:
Profile of the fiddle above running on Firefox Developer Edition 69.0b4:
[Firefox profiler capture and CPU usage screenshot omitted]
Have you tried using the will-change property? It lets the browser know about the upcoming change and offload the work to the compositor if possible.
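For example, a minimal sketch (assuming the .star elements from the fiddle already exist in the DOM) of adding that hint from JavaScript:
// Hint the compositor that each star's transform is going to animate, so the
// browser can promote the element to its own layer before the animation starts.
document.querySelectorAll('.star').forEach(function (star) {
  star.style.willChange = 'transform';
});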
The OP code was horrendously inefficient in that it uses 400+ uniquely generated selectors. So the bulk of the processing time involves maintaining the CSS animation loop and looking up 400+ classes on each alternation of that animation. This is a rare case in which class selectors are a burden and not useful. Since each s.star needs these unique styles, it takes less computing power to generate the CSS property values in a template literal and then assign them to the tag as an inline style. (See Demo)
Besides doing away with ridiculously huge .class lists on a bloated stylesheet, the demo makes full use of a documentFragment. DOM operations are expensive on resources (imagine 400+ tags being appended to one location). Doing everything on the fragment and then appending the documentFragment to the DOM just once puts all 400 .star into the DOM in a single operation. The OP code, on the other hand, appends 400 s.star one at a time... that's 400+ DOM operations.
Also, the OP code is deceiving as to the size of the actual CSS. SCSS, a preprocessor, is used, so what looks like 8 lines of odd-looking CSS is actually 3200 lines of CSS after it has been compiled and cached by the browser. The CSS in the demo is what it appears to be: 9 lines for the .star selector.
/**| documentFragment
- The only global variable points to a documentFragment not attached to the DOM.
- Use fragment as you would a document object when moving, creating, destroying,
appending, detaching, etc... HTML element tags from and to the DOM. Doing so will
greatly improve processing times when adding 400+ uniquely styled tags.
- When all .star tags have been created, modified, and appended to the fragment --
only the fragment itself needs to be appended to the DOM instead of 400 tags.
*/
var fragment = document.createDocumentFragment();
/**| randomRange(min, max, integer = false)
#Params: min [number].....: The minimum
max [number].....: The maximum
integer [boolean]: default is false, which means results will be floats.
If true then results will be integers.
Utility function that will return a random number from a given range of consecutive
numbers.
*/
const randomRange = (min, max, integer = false) => {
  if (integer) {
    // Integer mode: inclusive of both min and max.
    min = Math.ceil(min);
    max = Math.floor(max);
    return Math.floor(Math.random() * (max - min + 1)) + min;
  }
  // Float mode: a float from min up to (but not including) max.
  return Math.random() * (max - min) + min;
};
/**| starGenerator(limit)
#Params: limit [number]: The number of s.star to generate.
A generator function that creates s.star tags. Assigning individual tag properties
and setting randomly determined values would involve a ton of unique selectors.
To avoid a ton of lookups in a CSS stylesheet a mile long, it's easier to create and
maintain one template literal of the CSS properties interpolated with random values.
Each s.star would be assigned an inline-style of five CSS properties/values by one
statement via `.cssText` property.
*/
function* starGenerator(limit) {
let iteration = 0;
while (iteration < limit) {
iteration++;
const star = document.createElement("s");
star.classList.add("star");
let properties = `
width: ${randomRange(1, 4)}px;
height: ${randomRange(1, 4)}px;
top: ${randomRange(0, 100, true)}vh;
left: ${randomRange(0, 100, true)}vw;
animation-delay: ${randomRange(1, 30, true) / 10}s`;
star.style.cssText = properties;
yield star;
}
return fragment;
}
/**| nightfall(selector, limit = 400)
#Params: selector [string]: Target parent tag
limit [number].. : The maximum number of s.star to generate.
Interface function that facilitates DOM procedures with minimal presence in DOM.
*/
const nightfall = (selector, limit = 400) => {
const base = document.querySelector(selector);
base.classList.add('sky');
for (let star of starGenerator(limit)) {
fragment.appendChild(star);
}
return base.appendChild(fragment);
};
// Call nightfall() passing the selector "main"
nightfall("main");
.sky {
position: relative;
background: #000;
height: 100vh;
overflow: hidden;
}
.star {
display: block;
position: absolute;
animation: twinkle 1.2s alternate infinite ease-in-out;
transform: scale(0.2);
border-radius: 50%;
background: #fff;
box-shadow: 0 0 6px 1px #fff;
z-index: 2;
text-decoration: none;
}
@keyframes twinkle {
100% {
transform: scale(1);
}
}
<main></main>
That's because the rendering is done by the CPU, which can be a loss in performance. There is an option in CSS to run such an animation on the GPU.
Your snippet adjusted
@for $i from 0 through 400 {
.star:nth-child(#{$i}) {
$star-size: (random() * (1-4) +4) + px;
transform: translateY((random(100)) + vh) translateX((random(100)) + vw) translateZ(0);
width: $star-size;
height: $star-size;
animation: blinker 1.2s alternate infinite ease-in-out;
animation-delay: (random(30) / 10) + s;
transform: scale(0.2);
}
}
@keyframes blinker {
100% {
transform: scale(1);
}
}
It's very important to add translateZ because, in most browsers, only 3D transforms are promoted to the GPU.
Doing animations on the GPU is also called hardware-accelerated animation; please check this helpful article for more information: https://www.sitepoint.com/introduction-to-hardware-acceleration-css-animations/
It's not only a problem with your code.
It's also down to your CPU's capability; try upgrading your CPU and RAM to get better performance.
Sometimes you can't run a mid-to-high-end animation on a low-spec computer.
This question already has an answer here:
How to change @keyframes values using JavaScript?
(1 answer)
Closed 1 year ago.
I have the following CSS Keyframe:
@keyframes progressStatus
{
to
{
stroke-dashoffset: 165;
}
}
I am trying to change the value 165 to something else with Javascript.
For this particular example you could use a CSS variable.
This simple snippet alters a variable called --strokeDashoffset every time the div is clicked (and toggles between having an animation and not, just for the demo).
div {
--strokeDashoffset: 165px; /* initial position */
background-color: magenta;
width: var(--strokeDashoffset);
height: 5vmin;
animation: none;
animation-fill-mode: forwards;
animation-duration: 1s;
animation-iteration-count: 1;
}
.strokeDashoffset {
animation-name: progressStatus;
}
@keyframes progressStatus
{
to
{
stroke-dashoffset: var(--strokeDashoffset);
}
}
<div onclick="this.classList.toggle('strokeDashoffset'); this.style.setProperty('--strokeDashoffset', Math.random(0,1)*100 + 'vmin');" >CLICK ME</div>
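If you'd rather not use an inline handler, a minimal sketch of the same idea from a script (the element lookup here is assumed for illustration):
// Update the custom property, then toggle the class to (re)start the animation.
const el = document.querySelector('div'); // hypothetical target element
el.style.setProperty('--strokeDashoffset', '80px');
el.classList.toggle('strokeDashoffset');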
While the CSS variable approach in the other answer is nicer, here is one that modifies the relevant CSS directly since I already typed it out. This might be useful if you can't rely on CSS custom properties (variables) or a polyfill.
In the CSS Object Model, a @keyframes rule has its own array of child rules corresponding to the keyframes themselves (from, to and/or percentages). So, if your example were the first stylesheet in the document (and had no other rules):
const stylesheet = document.styleSheets[0];
const progressStatus = stylesheet.cssRules[0];
const toKeyframe = progressStatus.cssRules[0];
toKeyframe.style.strokeDashoffset = '170 80'; // or whatever desired value
Picking stylesheets and nested rules by array indexes is cumbersome, error-prone and easily breaks with changes. In production code, you'll at least want to locate the rule by iterating through all rules with various tests, like rule.type === CSSRule.KEYFRAMES_RULE && rule.name === 'progressStatus'.
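A sketch of that lookup (skipping cross-origin stylesheets, whose cssRules can throw; the rule name follows the question):
// Walk every stylesheet and return the @keyframes rule with the given name.
function findKeyframesRule(name) {
  for (const sheet of document.styleSheets) {
    let rules;
    try {
      rules = sheet.cssRules;
    } catch (e) {
      continue; // cross-origin stylesheet, not readable
    }
    for (const rule of rules) {
      if (rule.type === CSSRule.KEYFRAMES_RULE && rule.name === name) {
        return rule;
      }
    }
  }
  return null;
}
const progressStatus = findKeyframesRule('progressStatus');
if (progressStatus) {
  // The only child rule here is the "to" keyframe.
  progressStatus.cssRules[0].style.strokeDashoffset = '170 80';
}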
I have created a basic animation of a circle using CSS @keyframes.
I'm using JavaScript to trigger animation start/stop by clicking inside the circle.
The animation itself could be divided into 5 (looped) phases:
pause-expand-pause-shrink-pause (see the @keyframes CSS section below)
The goal I want to achieve is, eventually, to be able to set the animation duration and change the values of keyframes (say by having input fields for pause and expand/shrink durations - the details don't really matter for the scope of this question). I have put together a JavaScript function to perform this task, and have set it to run on onload just to test how it works.
My HTML:
<!doctype html>
<html>
<head>
<meta content="text/html;charset=utf-8" http-equiv="Content-Type">
<meta content="utf-8" http-equiv="encoding">
<link rel="stylesheet" href="style.css">
<script src = "animation.js"></script>
</head>
<body onload=setAnimationDuration(1,1)>
<div id="circle" class='circle-paused' onclick=cssAnimation()></div>
</body>
</html>
My CSS:
#circle {
position: absolute;
top: 50%;
left: 50%;
transform: translateX(-50%) translateY(-50%);
}
.circle-paused {
width: 9%;
padding-top: 9%;
border-radius: 50%;
background-color: #800080;
margin: auto;
}
.circle-animated {
/* 2 sec pause 4 sec expand_shrink*/
width: 9%;
padding-top: 9%;
-webkit-animation-name: my-circle; /* Safari 4.0 - 8.0 */
-webkit-animation-duration: 12s; /* Safari 4.0 - 8.0 */
animation-name: my-circle;
animation-duration: 12s;
animation-iteration-count: infinite;
animation-timing-function: linear;
border-radius: 50%;
margin: auto;
}
@keyframes my-circle {
0% {background-color: #800080; width: 9%; padding-top: 9%;}
33.3% {background-color: #D8BFD8; width: 28%; padding-top: 28%;}
50% {background-color: #D8BFD8; width: 28%; padding-top: 28%;}
83.3% {background-color: #800080; width: 9%; padding-top: 9%;}
100% {background-color: #800080; width: 9%; padding-top: 9%;}
}
My JavaScript:
function cssAnimation() {
if (document.getElementById('circle').className == 'circle-paused') {
document.getElementById('circle').className = 'circle-animated'
} else {
document.getElementById('circle').className = 'circle-paused'
}
}
function findKeyframes(animation_name) {
// get list of current keyframe rules
var style_sheet = document.styleSheets;
for (var i = 0; i < style_sheet.length; ++i) {
for (var j = 0; j < style_sheet[i].cssRules.length; ++j) {
// type 7 correspond to CSSRule.KEYFRAMES_RULE, for more info see https://developer.mozilla.org/en-US/docs/Web/API/CSSRule
if (style_sheet[i].cssRules[j].type == 7 && style_sheet[i].cssRules[j].name == animation_name) {
return style_sheet[i].cssRules[j];
}
}
}
// keyframe rules were not found for given animation_name
return null;
}
function getPercentage(total, fraction) {
// Returns what percentage the fraction is from total
// The result is rounded to 1 decimal place
return Math.round(((100 / total) * fraction) * 10) / 10;
}
function setAnimationDuration(pause, expand_shrink) {
var total_animation_duration = (pause * 2) + (expand_shrink * 2)
var pause_percentage = getPercentage(total_animation_duration, pause)
var expand_shrink_percentage = getPercentage(total_animation_duration, expand_shrink)
var pause1 = pause_percentage + expand_shrink_percentage;
var shrink = pause1 + expand_shrink_percentage;
var frame_percentage_list = [0, expand_shrink_percentage, pause1, shrink, 100]
var key_frame_list = findKeyframes('my-circle')
var new_rule_list = []
var to_be_removed_key_list = []
//create array of new rules to be inserted
//collecting old keys of rules to be deleted
for(var i = 0; i < key_frame_list.cssRules.length; i++) {
var current_rule = key_frame_list.cssRules[i].cssText
to_be_removed_key_list.push(key_frame_list.cssRules[i].keyText)
new_rule_list.push(current_rule.replace(/[+-]?([0-9]*[.])?[0-9]+%/, frame_percentage_list[i] + '%'))
}
// delete old rules
for(var i = 0; i < to_be_removed_key_list.length; i++) {
key_frame_list.deleteRule(to_be_removed_key_list[i])
}
// populate new rules
for(var i = 0; i < new_rule_list.length; i++) {
key_frame_list.appendRule(new_rule_list[i])
}
document.getElementById('circle').style.animationDuration = total_animation_duration + "s"
}
Code above, on JSFiddle
The problem itself:
The code is working as expected in Firefox (55.0.3), Chrome (61.0) and Safari (11.0). However, when I started testing it in IE11, I found that key_frame_list.deleteRule('rule_key') throws an Invalid argument error. While researching the issue, I found (and went through) this article (though it does not tackle the IE problem, it improved my overall understanding of CSS animations). On MSDN I found two references concerning deleteRule: one and two. Though I didn't really understand what was meant, in the second one, by:
The key must resolve to a number between 0 and 1, or the rule is ignored.
I assumed that in IE you have to pass an index to deleteRule instead of a string key. So I tried to check my assumption in the IE console. Here is what I found (given my JS code runs in onload):
var key_frame_list = findKeyframes('my-circle')
key_frame_list.cssRules.length => 5
key_frame_list.deleteRule(0)
key_frame_list.cssRules.length => 4
key_frame_list.deleteRule(1)
key_frame_list.cssRules.length => 3
key_frame_list.deleteRule(0)
key_frame_list.deleteRule(1)
key_frame_list.deleteRule(2)
...
key_frame_list.cssRules.length => 3
What is happening is:
key_frame_list.deleteRule(0) - removes first rule (which is 0%)
key_frame_list.deleteRule(1) - removes the last rule (which is 100%)
After that no matter which index I pass to key_frame_list.deleteRule() the key_frame_list.cssRules.length remains 3.
My expectation was that I would be able to call key_frame_list.deleteRule(0) repeatedly and remove all the rules (as I expected that the indexes would shift after each rule deletion).
Now, I would like to understand:
What is the proper way to use deleteRule in IE (basically, 'Am I doing something wrong?'), or should another method be used?
Why am I not able to delete more than two rules out of five?
Is there a method suitable for this purpose, that I am not aware of, that will work with the same arguments in Firefox, Chrome and IE11?
What is the proper way to use deleteRule in IE (basically, 'Am I doing something wrong?'), or should another method be used?
Why am I not able to delete more than two rules out of five?
The first MSDN link does not apply; that deleteRule() method applies to top-level rules, not keyframe rules.
The text "The key must resolve to a number between 0 and 1, or the rule is ignored." from the second link is actually taken from the 2013 WD of css-animations, and means that instead of a string containing the 0%-100% keyframe selector, Internet Explorer expects a decimal number representing the percentage. The argument does not represent an index.
So for a 0% keyframe rule, IE expects the value 0; for a 100% keyframe rule, IE expects the value 1; and for a 33.3% keyframe rule, IE expects the floating-point value 0.333:
key_frame_list.deleteRule(0) // Deletes the 0% keyframe rule
key_frame_list.deleteRule(0.333) // Deletes the 33.3% keyframe rule
key_frame_list.deleteRule(1) // Deletes the 100% keyframe rule
Once a 0% rule has been deleted, if no 0% rules remain then additional calls to deleteRule(0) will do nothing.
And since keyframes cannot exceed 100%, deleteRule(2) is meaningless since it would mean deleting a 200% keyframe rule, which cannot exist.
Is there a method suitable for this purpose, that I am not aware of, that will work with the same arguments in Firefox, Chrome and IE11?
No; Internet Explorer 11 follows the 2013 WD (itself having been developed between 2012 and 2013 following the release of Internet Explorer 10), meaning its implementation is not consistent with the current standard, in which the deleteRule() method has been changed to accept a string argument instead of a numeric argument.
This means the API is incompatible, and so there is no clean workaround. You'll just have to attempt both arguments. I changed the following statement in your fiddle:
// delete old rules
for(var i = 0; i < to_be_removed_key_list.length; i++) {
key_frame_list.deleteRule(to_be_removed_key_list[i])
}
to:
// delete old rules
for(var i = 0; i < to_be_removed_key_list.length; i++) {
try {
key_frame_list.deleteRule(to_be_removed_key_list[i])
} catch (e) {
key_frame_list.deleteRule(+(parseFloat(to_be_removed_key_list[i]) / 100).toFixed(3))
}
}
The +(parseFloat(to_be_removed_key_list[i]) / 100).toFixed(3) bit converts a percentage string to a numeric value, taking rounding errors into account. The rounding errors inherent in IEEE-754 floating-point numbers are the reason the API was changed in the first place (that, and consistency with the appendRule() method, which has always expected a string). But since it was only changed some time after Internet Explorer 11 was released, and since IE11 will no longer receive platform updates, IE11 is stuck with its old WD implementation (which, I must emphasize, was current at the time of its development).
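Wrapped up as a small helper, the same fallback looks like this (a sketch, not part of the original fiddle):
// Delete a keyframe by its key text, falling back to IE11's numeric key.
function deleteKeyframe(keyframesRule, keyText) {
  try {
    keyframesRule.deleteRule(keyText); // standard: e.g. '33.3%'
  } catch (e) {
    keyframesRule.deleteRule(+(parseFloat(keyText) / 100).toFixed(3)); // IE11: e.g. 0.333
  }
}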
So I have a Polymer app that I am writing. I have written a non-Polymer web component for a loading overlay that I can show whilst Polymer is loading and when the app's WebSocket is connecting/reconnecting.
Here is an excerpt of some of the CSS I have, to give an indication of what I am doing:
.overlay {
background: #000;
bottom: 0;
height: 100%;
left: 0;
opacity: 0;
pointer-events: none;
position: fixed;
right: 0;
transition: opacity 0.2s;
top: 0;
width: 100%;
z-index: 9999999;
}
.overlay[opened] {
opacity: 0.8;
pointer-events: auto;
}
.loader {
display: none;
}
.overlay[opened] .loader {
display: block;
}
Now, realistically, this overlay and the CSS-based loader animation are only used when I load the application; however, if the WebSocket were to disconnect, it would be shown too.
My question is: for performance reasons, should I be removing the element from the DOM entirely and adding it back if it's required? Does the fact that the overlay is completely transparent when not in use, and the loader animation is hidden, mean they have no impact on drawing performance?
Note: I am looking to avoid the "don't micro-optimise" answer if possible ;)
TL;DR:
In general, a rendered element affects page performance when changes to it trigger repaint on subsequent elements in the DOM, or when it triggers resize on its parent(s), as resize can get expensive, being fired up to 100 times/second depending on the device.
As long as changes to your element do not trigger repaint on subsequent elements in the DOM tree, the difference between having it rendered but hidden behind some opaque element (or above the content, with opacity:0 and pointer-events:none) and having it not displayed at all is insignificant.
Changes to your element will not trigger repaint on anything but itself, because it has position:fixed. The same would be true if it had position:absolute or if the changes to it would be made through properties that do not trigger repaint on subsequent siblings, like transform and opacity.
Unless the loader is really heavy on the rendering engine (which is rarely the case; think WebGL loaders with 3D scenes, materials and light mapping; in which case it would be better to not display it when not shown to the user), the difference would be so small that the real challenge is to measure this difference, performance-wise.
In fact, I would not be surprised if having it rendered and only changing its opacity and pointer-events properties is, overall, less expensive than toggling its display property, because the browser doesn't have to add/remove it from the render tree each time you turn it on/off. But, again, the real question is: how do we measure it?
Edit: Actually, I made a small testing tool, with 10k modals. I got the following results, in Chrome, on Linux:
`opacity` average: 110.71340000000076ms | count: 100
`display` average: 155.47145000000017ms | count: 100
... so my assumption was correct: display is more expensive overall.
The opacity changes are mostly around 110ms with few exceptions, while the display changes are faster when nodes are removed but slower when added.
Feel free to test it yourself, in different browsers, on different systems:
$(window).on('load', function () {
let displayAvg = 0, displayCount = 0,
opacityAvg = 0, opacityCount = 0;
for (let i = 0; i < 10000; i++) {
$('body').append($('<div />', {
class: 'modal',
html: '10k × modal instances'
}))
}
$(document)
.on('click', '#display', function () {
$('.modal').removeClass('opacity');
let t0 = performance.now();
$('.modal').toggleClass('display');
setTimeout(function () {
let t1 = performance.now();
displayAvg += (t1 - t0);
console.log(
'`display` toggle took ' +
(t1 - t0) +
'ms \n`display` average: ' +
(displayAvg / ++displayCount) +
'ms | count: ' +
displayCount
);
})
})
.on('click', '#opacity', function () {
$('.modal').removeClass('display');
let t0 = performance.now();
$('.modal').toggleClass('opacity');
setTimeout(function () {
let t1 = performance.now();
opacityAvg += (t1 - t0);
console.log(
'`opacity` + `pointer-events` toggle took ' +
(t1 - t0) +
'ms \n`opacity` average: ' +
(opacityAvg / ++opacityCount) +
'ms | count: ' +
opacityCount
);
});
})
});
body {
margin: 0;
}
.buttons-wrapper {
position: relative;
z-index: 1;
margin-top: 3rem;
}
.modal {
height: 100vh;
width: 100vw;
position: fixed;
top: 0;
left: 0;
padding: 1rem;
}
.modal.display {
display: none;
}
.modal.opacity {
opacity: 0;
pointer-events: none;
}
.as-console-wrapper {
z-index: 2;
}
<script src="https://ajax.googleapis.com/ajax/libs/jquery/2.1.1/jquery.min.js"></script>
<div class="buttons-wrapper">
<button id="display">Toggle `display`</button>
<button id="opacity">Toggle `opacity` + `pointer-events`</button>
</div>
But this average is for 10k elements. Divide it by 10k and it's virtually no difference at all: we're talking less than 0.45% of a millisecond.
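For reference, the per-element arithmetic behind that claim, using the averages reported above:
// 10k modals, so divide the totals by 10,000 to get the per-element cost.
const perElementOpacity = 110.7134 / 10000; // ≈ 0.011 ms
const perElementDisplay = 155.4715 / 10000; // ≈ 0.016 ms
console.log(perElementDisplay - perElementOpacity); // ≈ 0.0045 ms per element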
If an element is animated with the animation property and its duration is infinite, the browser will continuously repaint the page, which will affect performance and lower the FPS.
However, hiding the element with properties such as opacity: 0; will not do the trick, because the element is still in the CSSOM render-tree queue.
visibility: hidden; and display: none; should do the trick: based on how the CSSOM is constructed, the browser doesn't render elements hidden with display: none; or visibility: hidden;.
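A hedged sketch of that suggestion applied to the overlay from the question above (class names follow the question's CSS): keep the element in the DOM, but switch it to display: none while idle so the infinite loader animation isn't rendered at all.
const overlay = document.querySelector('.overlay');
function setOverlayVisible(visible) {
  // display:none removes the overlay (and its loader animation) from rendering
  // without removing it from the DOM.
  overlay.style.display = visible ? 'block' : 'none';
  if (visible) {
    overlay.setAttribute('opened', '');
  } else {
    overlay.removeAttribute('opened');
  }
}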
How can we use JavaScript to get the declared CSS value (like the value that shows up when inspecting an element using Chrome's inspector)? Everything I try and search for on the web seems to only give the computed value.
For my specific use case I want to know what the width will be at the end of a CSS transition, which is set in the stylesheet, not inline. When I check the width using JavaScript, I get the computed width, which at the time of retrieval in the script is at the beginning of the transition, so it shows me 0 even though in 300ms it will be a declared width like 320px.
Take a look at the transitionend event. Can it be used for your use case?
EDIT
Since this answer got upvoted, I'm pasting some code below.
element.addEventListener("transitionend", function() {
// get CSS property here
}, false);
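For instance, a slightly fuller sketch of that idea (the .example selector is assumed for illustration), reading the final width once the transition has finished:
const el = document.querySelector('.example'); // hypothetical element
el.addEventListener('transitionend', function (event) {
  if (event.propertyName === 'width') {
    // At this point the computed width equals the declared end width.
    console.log('final width:', getComputedStyle(el).width);
  }
}, false);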
I think this is what you're after:
$('#widen').on('click', function(e){
$('.example').addClass('wider');
$('#prefetch').text('The div will be: ' + getWidth('wider'));
});
function getWidth(className){
for (var s = 0; s < document.styleSheets.length; s++){
var rules = document.styleSheets[s].rules || document.styleSheets[s].cssRules; console.log(rules);
for (var r = 0; r < rules.length; r++){
var rule = rules[r]; console.log(rule);
if (rule.selectorText == '.'+className){
console.log('match!');
return rule.style.width;
}
}
}
return undefined;
}
.example {
width: 100px;
background-color: #ccc;
border: 1px solid black;
}
.wider {
width: 320px;
-webkit-transition: width 5s;
-moz-transition: width 5s;
-o-transition: width 5s;
transition: width 5s;
}
<script src="https://ajax.googleapis.com/ajax/libs/jquery/1.11.1/jquery.min.js"></script>
<div class="example">This is a simple container.</div>
<button id="widen" type="button">Widen</button>
<span id="prefetch"></span>
Keep in mind, I believe this still falls victim to cross-domain restrictions (if the CSS is hosted on a CDN/other domain, you won't be able to retrieve the styleSheet and, therefore, won't be able to access the eventual width).
I selected the most helpful answer to my question, but it appears that the solution I was looking for does not exist. What I needed was to get what the width of an element would be in pixels BEFORE the transition actually completed. This width is percent-based, so the actual pixel width would vary based on a number of factors. What I ended up doing in reality was:
1. Making a jQuery clone of the item of which I needed the "end" transition measurement.
2. Positioning the clone off screen.
3. Adding inline styles to the clone to remove the CSS-inherited transition properties, so that it immediately gets the final/end width.
4. Using JS to save the ending width to a variable.
5. Removing the clone with jQuery's .remove().
Doing this, I now know what the ending width in pixels would be at the moment the element begins to transition, rather than having to wait until the end of the transition to capture the ending width.
Here is a function that does what I described above.
var getTransitionEndWidth = function($el) {
$('body').append('<div id="CopyElWrap" style="position:absolute; top:-9999px; opacity:0;"></div>');
var copyEl = document.createElement('div');
var $copy = $(copyEl);
var copyElClasses = $el.attr('class');
$copy.attr('class', copyElClasses).css({
WebkitTransition: 'all 0 0',
MozTransition: 'all 0 0',
MsTransition: 'all 0 0',
OTransition: 'all 0 0',
transition: 'all 0 0'
}).appendTo($('#CopyElWrap'));
var postWidth = $copy.width();
$('#CopyElWrap').remove();
return postWidth;
};
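Hypothetical usage of the function above, measuring the end width at the moment the transition is kicked off:
// Start the transition, then immediately read what its final width will be.
$('.example').addClass('wider');
var endWidth = getTransitionEndWidth($('.example')); // ending width in pixels (a number)
console.log(endWidth);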
As this question observes, immediate CSS transitions on newly-appended elements are somehow ignored - the end state of the transition is rendered immediately.
For example, given this CSS (prefixes omitted here):
.box {
opacity: 0;
transition: all 2s;
background-color: red;
height: 100px;
width: 100px;
}
.box.in { opacity: 1; }
The opacity of this element will be set immediately to 1:
// Does not animate
var $a = $('<div>')
.addClass('box a')
.appendTo('#wrapper');
$a.addClass('in');
I have seen several ways of triggering the transition to get the expected behaviour:
// Does animate
var $b = $('<div>')
.addClass('box b')
.appendTo('#wrapper');
setTimeout(function() {
$('.b').addClass('in');
},0);
// Does animate
var $c = $('<div>')
.addClass('box c')
.appendTo('#wrapper');
$c[0].offsetWidth = $c[0].offsetWidth;
$c.addClass('in');
// Does animate
var $d = $('<div>')
.addClass('box d')
.appendTo('#wrapper');
$d.focus().addClass('in');
The same methods apply to vanilla JS DOM manipulation - this is not jQuery-specific behaviour.
Edit - I am using Chrome 35.
JSFiddle (includes vanilla JS example).
Why are immediate CSS animations on appended elements ignored?
How and why do these methods work?
Are there other ways of doing it?
Which, if any, is the preferred solution?
The cause of the newly added element not animating is that browsers batch reflows.
When an element is added, a reflow is needed. The same applies to adding the class. However, when you do both in a single JavaScript round, the browser takes its chance to optimize the first one away. In that case, there is only a single (initial and final at the same time) style value, so no transition is going to happen.
The setTimeout trick works because it delays the class addition to another JavaScript round, so there are two values present to the rendering engine that need to be calculated, as there is a point in time when the first one is presented to the user.
There is another exception to the batching rule. The browser needs to calculate the immediate value if you are trying to access it. One of these values is offsetWidth. When you access it, a reflow is triggered. Another one is done separately during the actual display. Again, we have two different style values, so we can interpolate them in time.
This is really one of the very few occasions when this behaviour is desirable. Most of the time, accessing reflow-causing properties in between DOM modifications can cause serious slowdowns.
The preferred solution may vary from person to person, but for me, accessing offsetWidth (or getComputedStyle()) is the best. There are cases when setTimeout fires without a style recalculation in between. This is a rare case, mostly on loaded sites, but it happens. Then you won't get your animation. By accessing any calculated style, you are forcing the browser to actually calculate it.
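A minimal vanilla-JS sketch of what this answer describes, using the #wrapper element and .box/.in classes from the question:
const box = document.createElement('div');
box.className = 'box';
document.getElementById('wrapper').appendChild(box);
// Reading offsetWidth (or calling getComputedStyle) forces the pending styles
// to be calculated, so the initial opacity:0 state actually exists.
box.offsetWidth;
box.classList.add('in'); // now the opacity transition runs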
Using jQuery, try this (an example here):
var $a = $('<div>')
.addClass('box a')
.appendTo('#wrapper');
$a.css('opacity'); // added
$a.addClass('in');
Using vanilla JavaScript, try this:
var e = document.createElement('div');
e.className = 'box e';
document.getElementById('wrapper').appendChild(e);
window.getComputedStyle(e).opacity; // added
e.className += ' in';
Brief idea:
getComputedStyle() flushes all pending style changes and forces the layout engine to compute the element's current state; hence .css() works in a similar way.
About css(), from the jQuery site:
The .css() method is a convenient way to get a style property from the
first matched element, especially in light of the different ways
browsers access most of those properties (the getComputedStyle()
method in standards-based browsers versus the currentStyle and
runtimeStyle properties in Internet Explorer) and the different terms
browsers use for certain properties.
You may use getComputedStyle()/css() instead of setTimeout. You may also read this article for more detailed information and examples.
Please use the code below, which uses focus():
jQuery
var $a = $('<div>')
.addClass('box a')
.appendTo('#wrapper');
$a.focus(); // focus Added
$a.addClass('in');
JavaScript
var e = document.createElement('div');
e.className = 'box e';
document.getElementById('wrapper').appendChild(e).focus(); // focus Added
e.className += ' in';
I prefer requestAnimationFrame + setTimeout (see this post).
const child = document.createElement("div");
child.style.backgroundColor = "blue";
child.style.width = "100px";
child.style.height = "100px";
child.style.transition = "1s";
parent.appendChild(child);
requestAnimationFrame(() =>
setTimeout(() => {
child.style.width = "200px";
})
);
Try it here.
@Frizi's solution works, but at times I've found that getComputedStyle has not worked when I change certain properties on an element. If that doesn't work, you can try getBoundingClientRect() as follows, which I've found to be bulletproof:
Let's assume we have an element el, on which we want to transition opacity, but el is display:none; opacity: 0:
el.style.display = 'block';
el.style.transition = 'opacity .5s linear';
// reflow
el.getBoundingClientRect();
// it transitions!
el.style.opacity = 1;
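If you need the trick in several places, it can be wrapped in a tiny helper (a sketch, not from the original answer; el is the same element as above):
// Any layout read flushes pending style changes; the return value is ignored.
function forceReflow(el) {
  return el.getBoundingClientRect(); // or: el.offsetWidth
}
forceReflow(el);
el.style.opacity = 1;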
Anything fundamentally wrong with using keyframes for "animate on create"?
(If you strictly don't want those animations on the initial nodes, add another class .initial inhibiting the animation.)
function addNode() {
var node = document.createElement("div");
var textnode = document.createTextNode("Hello");
node.appendChild(textnode);
document.getElementById("here").appendChild(node);
}
setTimeout( addNode, 500);
setTimeout( addNode, 1000);
body, html { background: #444; display: flex; min-height: 100vh; align-items: center; justify-content: center; }
button { font-size: 4em; border-radius: 20px; margin-left: 60px;}
div {
width: 200px; height: 100px; border: 12px solid white; border-radius: 20px; margin: 10px;
background: gray;
animation: bouncy .5s linear forwards;
}
/* suppres for initial elements */
div.initial {
animation: none;
}
@keyframes bouncy {
0% { transform: scale(.1); opacity: 0 }
80% { transform: scale(1.15); opacity: 1 }
90% { transform: scale(.9); }
100% { transform: scale(1); }
}
<section id="here">
<div class="target initial"></div>
</section>
Rather than trying to force an immediate repaint or style calculation, I tried using requestAnimationFrame() to allow the browser to paint on its next available frame.
In Chrome + Firefox, the browser optimizes rendering too much, so this alone still doesn't help (it works in Safari).
I settled on manually forcing a delay with setTimeout(), then using requestAnimationFrame() to responsibly let the browser paint. If the append hasn't painted before the timeout ends, the animation might be ignored, but it seems to work reliably.
setTimeout(function () {
requestAnimationFrame(function () {
// trigger the animation
});
}, 20);
I chose 20ms because it's larger than 1 frame at 60fps (16.7ms) and some browsers won't register timeouts <5ms.
Fingers crossed that should force the animation start into the next frame and then start it responsibly when the browser is ready to paint again.
setTimeout() works only due to race conditions; requestAnimationFrame() should be used instead. But the offsetWidth trick works the best out of all the options.
Here is an example situation. We have a series of boxes that each need to be animated downward in sequence. To get everything to work, we need to get an animation frame twice per element; here I put one before the animation and one after, but it also seems to work if you just put them one after another.
Using requestAnimationFrame twice works:
Works regardless of how exactly the 2 getFrame()s and single set-class-name step are ordered.
const delay = (d) => new Promise(resolve => setTimeout(resolve, d));
const getFrame = () => new Promise(resolve => window.requestAnimationFrame(resolve));
async function run() {
for (let i = 0; i < 100; i++) {
const box = document.createElement('div');
document.body.appendChild(box);
// BEFORE
await getFrame();
//await delay(1);
box.className = 'move';
// AFTER
await getFrame();
//await delay(1);
}
}
run();
div {
display: inline-block;
background-color: red;
width: 20px;
height: 20px;
transition: transform 1s;
}
.move {
transform: translate(0px, 100px);
}
Using setTimeout twice fails:
Since this is race condition-based, exact results will vary a lot depending on your browser and computer. Increasing the setTimeout delay helps the animation win the race more often, but guarantees nothing.
With Firefox on my Surfacebook 1, and with a delay of 2ms / el, I see about 50% of the boxes failing. With a delay of 20ms / el I see about 10% of the boxes failing.
const delay = (d) => new Promise(resolve => setTimeout(resolve, d));
const getFrame = () => new Promise(resolve => window.requestAnimationFrame(resolve));
async function run() {
for (let i = 0; i < 100; i++) {
const box = document.createElement('div');
document.body.appendChild(box);
// BEFORE
//await getFrame();
await delay(1);
box.className = 'move';
// AFTER
//await getFrame();
await delay(1);
}
}
run();
div {
display: inline-block;
background-color: red;
width: 20px;
height: 20px;
transition: transform 1s;
}
.move {
transform: translate(0px, 100px);
}
Using requestAnimationFrame once and setTimeout usually works:
This is Brendan's solution (setTimeout first) or pomber's solution (requestAnimationFrame first).
# works:
getFrame()
delay(0)
ANIMATE
# works:
delay(0)
getFrame()
ANIMATE
# works:
delay(0)
ANIMATE
getFrame()
# fails:
getFrame()
ANIMATE
delay(0)
The one case where it doesn't work (for me) is when getting a frame, then animating, then delaying. I do not have an explanation why.
const delay = (d) => new Promise(resolve => setTimeout(resolve, d));
const getFrame = () => new Promise(resolve => window.requestAnimationFrame(resolve));
async function run() {
for (let i = 0; i < 100; i++) {
const box = document.createElement('div');
document.body.appendChild(box);
// BEFORE
await getFrame();
await delay(1);
box.className = 'move';
// AFTER
//await getFrame();
//await delay(1);
}
}
run();
div {
display: inline-block;
background-color: red;
width: 20px;
height: 20px;
transition: transform 1s;
}
.move {
transform: translate(0px, 100px);
}
Edit: the technique used in the original answer, below the horizontal rule, does not work 100% of the time, as noted in the comments by mindplay.dk.
Currently, if using requestAnimationFrame(), pomber's approach is probably the best, as can be seen in the article linked to in pomber's answer. The article has been updated since pomber answered, and it now mentions requestPostAnimationFrame(), currently available behind the Chrome flag --enable-experimental-web-platform-features.
When requestPostAnimationFrame() reaches a stable state in all major browsers, this will presumably work reliably:
const div = document.createElement("div");
document.body.appendChild(div);
requestPostAnimationFrame(() => div.className = "fade");
div {
height: 100px;
width: 100px;
background-color: red;
}
.fade {
opacity: 0;
transition: opacity 2s;
}
For the time being, however, there is a polyfill called AfterFrame, which is also referenced in the aforementioned article. Example:
const div = document.createElement("div");
document.body.appendChild(div);
window.afterFrame(() => div.className = "fade");
div {
height: 100px;
width: 100px;
background-color: red;
}
.fade {
opacity: 0;
transition: opacity 2s;
}
<script src="https://unpkg.com/afterframe/dist/afterframe.umd.js"></script>
Original answer:
Unlike Brendan, I found that requestAnimationFrame() worked in Chrome 63, Firefox 57, IE11 and Edge.
var div = document.createElement("div");
document.body.appendChild(div);
requestAnimationFrame(function () {
div.className = "fade";
});
div {
height: 100px;
width: 100px;
background-color: red;
}
.fade {
opacity: 0;
transition: opacity 2s;
}