I would like to use requestAnimationFrame to play an HTML <video> element. This is useful because it offers greater control over playback (e.g. playing only certain sections, controlling the speed, etc.). However, I'm running into an issue with the following approach:
function playAnimation() {
    window.cancelAnimationFrame(animationFrame);
    var duration = video.seekable.end(0);
    var start = null;
    var step = function(timestamp) {
        if (!start) start = timestamp;
        const progress = timestamp - start;
        const time = progress / 1000;
        video.currentTime = time;
        console.log(video.currentTime);
        if (time > duration) {
            start = null;
        }
        animationFrame = window.requestAnimationFrame(step);
    };
    animationFrame = window.requestAnimationFrame(step);
}
In Google Chrome, the video plays a little bit but then freezes. In Firefox it freezes even more. The console shows that the video's currentTime is being updated as expected, but it's not rendering the new time. Additionally, in the instances when the video is frozen, the ontimeupdate event does not fire, even though the currentTime is being updated.
A simple demo can be found here: https://codepen.io/TGordon18/pen/bGVQaXM
Any idea what's breaking?
Update:
Interestingly, controlling/throttling the animationFrame actually helps in Firefox.
setTimeout(() => {
    animationFrame = window.requestAnimationFrame(step);
}, 1000 / FPS);
This doesn't seem like the right approach though
Seeking a video is usually slower than one frame of requestAnimationFrame. One ideal frame of requestAnimationFrame is about 16.6 ms (60 FPS), but the duration of a seek depends on how the video is encoded and where in the video you want to seek. When the step function sets video.currentTime and then does the same thing on the next frame, the previous seek operation most likely has not finished yet. As you keep setting video.currentTime over and over, the browser still tries to execute the old seek tasks until it starts freezing because it is overwhelmed by the number of tasks. That likely also affects how it fires events like timeupdate.
The solution might be to explicitly wait for the seek to finish and only then request the next animation frame.
video.onseeked = () => {
    window.requestAnimationFrame(step);
};
https://codepen.io/mradionov/pen/vYNvyym?editors=0010
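A minimal sketch of that idea (not from the original pen; playBySeeking, targetTime, the speed parameter, and the looping behavior are illustrative assumptions): each 'seeked' event schedules exactly one new seek, computed from wall-clock time, so seeks can never pile up.

```javascript
// Sketch: issue the next seek only after the previous one has completed.
// `playBySeeking`, the looping behavior, and the `speed` parameter are
// illustrative assumptions, not part of the original post.
function playBySeeking(video, speed = 1) {
  const duration = video.seekable.end(0);
  const startWallTime = performance.now();

  // Map elapsed wall-clock time (ms) to a video timestamp (s), looping at the end.
  function targetTime(nowMs) {
    return ((nowMs - startWallTime) / 1000 * speed) % duration;
  }

  video.onseeked = () => {
    requestAnimationFrame((ts) => {
      video.currentTime = targetTime(ts); // triggers the next 'seeked'
    });
  };

  video.currentTime = targetTime(performance.now()); // kick off the first seek
}
```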
Nevertheless, you most likely won't be able to achieve the same smooth real-time playback as the native <video> playback, because of how the seek operation works, unless you are willing to drop frames when the previous seek has not yet finished.
Basically, storing an entire image for each video frame is very expensive. One of the core video compression techniques is to store a full video frame only at certain intervals, e.g. every 1 second (these are called key-frames or I-frames). The frames in between store only the difference from the previous frame (P-frames), which is small compared to an entire image. When the video plays normally, it already has the previous frame in its buffer; the only thing it needs to do is apply the difference for the next frame. But when you seek, there is no previous frame to calculate the difference from. The video decoder has to find the nearest key-frame with the full image and then apply the differences for all of the following frames until it finally reaches the frame you wanted to seek to.
If you use my suggestion to wait for the previous seek operation to complete before requesting the next one, you will see that the video starts smooth, but as it gets closer to 2.5 seconds it will stutter more and more, until it passes 2.5 s and becomes smooth again. Then it will again start stuttering up to the 5 s mark, and become smooth again after 5 s. That's because the key-frame interval for this video is 2.5 seconds, and the farther the timestamp you want to seek to is from a key-frame, the longer the seek takes, because more frames need to be decoded.
Related
I've been facing a strange issue that I don't understand how to fix.
I'm trying to create the following multiplayer game structure:
The server running at 20-30 fps
The client logic loop at the same FPS as the server
The client render loop
I'm using PixiJS for UI and here is where I got stuck.
( I've opened a thread here as well )
And I have a demo here: https://playcode.io/1045459
Ok, now let's explain the issue!
private update = () => {
    let elapsed = PIXI.Ticker.shared.elapsedMS
    if (elapsed > 1000) elapsed = this.frameDuration
    this.lag += elapsed

    // Update the frame if the lag counter is greater than or
    // equal to the frame duration
    while (this.lag >= this.frameDuration) {
        // Update the logic
        console.log(`[Update] FPS ${Math.round(PIXI.Ticker.shared.FPS)}`)
        this.updateInputs(now())
        // Reduce the lag counter by the frame duration
        this.lag -= this.frameDuration
    }

    // Render elements in-between states
    const lagOffset = this.lag / this.frameDuration
    this.interpolateSpaceships(lagOffset)
}
In the client loop I keep track of both the logic and render parts, limiting the logic to 20 FPS. It all works "cookies and clouds" until the browser has a sudden frame-rate drop from 120 fps to 60 fps. Based on my investigation, and a nice & confusing spreadsheet I've put together, when the frame rate drops the "player" moves 2x as far (e.g. 3.3 instead of 1.66). On paper that's normal and the math is correct, BUT it creates a small bounce / jitter / stutter or whatever this thing is called.
In the demo that I've created in playcode it's not visible. My assumption is that the code is too basic and the framerate never drops.
Considering that the math and the algorithm are correct (which I'm not yet sure of), I've turned my eyes to other parts that might affect this. I'm using pixi-viewport to follow the character. Could it be that the following part creates this bounce?
Does anyone have experience writing such a game loop?
Update:
Okkkk, mindblowing result. I just found out that this happens even with the simplest version of a game loop ever, just multiplying x = x + speed * delta every frame.
For the same reason. Sudden drops in FPS.
Ok, I've found the solution. Will post it here as there is not a lot of info about it. The solution is to smooth out sudden fps drops over multiple frames. Easy right? 😅
const ticker = new PIXI.Ticker();

// The number of frames to smooth over
const smoothingFrames = 10;

// The smoothed frame duration
let smoothedFrameDuration = 0;

ticker.add((deltaTime) => {
    // Fold the current frame duration into the running (exponential) average
    smoothedFrameDuration = (smoothedFrameDuration * (smoothingFrames - 1) + deltaTime) / smoothingFrames;

    // Update the game logic here,
    // using the smoothed frame duration instead of the raw deltaTime value
});

ticker.start();
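The same exponential moving average, factored into a pure helper (smoothDelta is my own name, not from the original code) so the behavior is easier to see: each new frame contributes only 1/smoothingFrames of the result, so a single 2x frame spike barely moves the smoothed delta.

```javascript
// Exponential moving average using the same formula as the ticker callback.
// `smoothDelta` is an illustrative name, not part of the original code.
function smoothDelta(previous, current, smoothingFrames = 10) {
  return (previous * (smoothingFrames - 1) + current) / smoothingFrames;
}

// A steady 1.0 delta with one 2.0 spike only shifts the average by a tenth:
// smoothDelta(1.0, 2.0) === 1.1
```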
Chrome/Firefox are deciding to fire requestAnimationFrame at 30 FPS. The profiler shows most of the time is spent idling, so it's not CPU cycles being burned up.
I tried writing delta timing to "catch up" / stay synced. The FPS is correct according to my FPS meter, but the result is very choppy animation. I also tried drawing immediately after each update; still choppy.
I am running a very vanilla rAF loop and notice the window size affects the FPS, but the profiler doesn't seem to explain why the FPS drops at full screen.
let start = null;
let step = (timestamp) => {
    if (thisTimerID !== scope._timerID) {
        return;
    }
    start = start || timestamp + delay;
    let current = timestamp;
    let delta = current - start;
    if (delta >= delay) {
        this.Update();
        this.Draw();
        start = timestamp;
    }
    requestAnimationFrame(step);
};
thisTimerID = this._timerID = requestAnimationFrame(step);
Should I try requestAnimationFrame and, if I detect a low FPS when initializing, fall back to setInterval? Should I just use requestAnimationFrame anyway? Maybe there's a flaw in my delta-timing logic? I know there's a huge flaw: if you switch tabs and then come back, it will try to catch up because it thinks it missed thousands of frames!
Thank you
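One common mitigation for that tab-switch flaw (a sketch; MAX_DELTA and clampedDelta are my own names, not from the code above) is to clamp the measured delta, so one long gap counts as a single slow frame instead of thousands of missed ones:

```javascript
const MAX_DELTA = 100; // ms; anything longer is treated as one slow frame

// `clampedDelta` is an illustrative helper, not part of the original code.
function clampedDelta(lastTs, nowTs) {
  return Math.min(nowTs - lastTs, MAX_DELTA);
}
```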
EDIT:
requestAnimationFrame IS firing at 60 fps under minimal load. It is the Draw method that is causing the frame rate to drop. But why? I have 2 profiles showing that a smaller resolution runs at 60 fps while full screen does not.
60FPS 350x321 Resolution:
38FPS 1680x921 Resolution:
What's really strange to me is that the full-screen version is actually doing LESS work according to the profiler.
I want to make a JavaScript animation take 5 seconds to complete using requestAnimationFrame().
I don't want strict and precise timing, so anything close to 5 seconds is OK, and I want my code to be simple and readable, so solutions like this won't work for me.
My question is: is it safe to assume most browsers render the page at 60 fps? I.e., if I want my animation to take 5 seconds to complete, I'll divide it into 60 * 5 = 300 steps and, on each call of my draw() function via requestAnimationFrame(), draw the next step of the animation. (The animation is pretty simple, just moving a colored div around.)
By the way, I can't use jQuery.
Edit: Let me rephrase the question this way: Do all browsers 'normally' try to render the page at 60 fps? I want to know if Chrome for example renders at 75 fps or Firefox renders at 70 fps.
(Normal condition: CPU isn't highly loaded, RAM is not full, there are no storage failures, room is properly ventilated and nobody tries to throw my laptop out the window.)
Relying on 60 frames per second is very unsafe, because the browser isn't always in the same conditions, and even if it tries to render the page at the maximum fps possible, there's always a chance of the processor/cpu/gpu being busy doing something else, causing the FPS to drop down.
If you want to rely on FPS (although I wouldn't suggest it), you should first detect the current fps and adjust the speed of your animation frame by frame. Here's an example:
var lastCall, fps;

function detectFps() {
    var delta;
    if (lastCall) {
        delta = (Date.now() - lastCall) / 1000;
        lastCall = Date.now();
        fps = 1 / delta;
    } else {
        lastCall = Date.now();
        fps = 0;
    }
}

function myFunc() {
    detectFps();
    // Calculate the speed using the var fps
    // Animate...
    requestAnimationFrame(myFunc);
}

detectFps(); // Initialize fps
requestAnimationFrame(myFunc); // Start your animation
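An alternative that avoids relying on FPS altogether (a sketch; progressAt, animate, and DURATION_MS are my own names): derive progress from the timestamp requestAnimationFrame already passes to its callback, so the animation takes roughly 5 seconds at any frame rate.

```javascript
const DURATION_MS = 5000; // target length of the animation

// Clamp elapsed time to the duration and normalize to [0, 1].
function progressAt(startTs, nowTs) {
  return Math.min((nowTs - startTs) / DURATION_MS, 1);
}

// `animate` is an illustrative wrapper, not a standard API.
function animate(draw) {
  let start = null;
  function frame(ts) {
    if (start === null) start = ts;
    const p = progressAt(start, ts);
    draw(p); // e.g. move the div by p * totalDistance
    if (p < 1) requestAnimationFrame(frame);
  }
  requestAnimationFrame(frame);
}
```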
It depends on the GPU and monitor combination. I have a good GPU and a 120 hertz monitor, so it renders at 120 fps. If I move the window to a 60 hertz monitor during the render, it will max out at 60 fps.
Another factor, in some browsers/OSes, is the iGPU being used instead of the discrete GPU.
As already stated by others, it isn't.
But if you need your animation to end in approximately 5 seconds, and it's not crucial to render every frame, you can use the old setTimeout() way. You may miss the target by a few milliseconds, and some frames of your animation will be skipped (not rendered) because of the fps mismatch, but this can be a "good enough" solution, especially if your animation is as simple as you state; there's a chance users won't even see the glitch.
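A sketch of that setTimeout() approach (animateWithTimeout and the injectable schedule parameter are my own names, added so the sketch is self-contained): a fixed number of steps spaced evenly over the target duration.

```javascript
// Sketch of the setTimeout() approach: 300 steps spaced durationMs / steps
// apart gives roughly a 5-second animation. `animateWithTimeout` and the
// `schedule` parameter are illustrative, not from the original answer.
function animateWithTimeout(draw, durationMs = 5000, steps = 300, schedule = setTimeout) {
  let i = 0;
  function tick() {
    draw(i / steps); // progress in [0, 1]
    if (++i <= steps) schedule(tick, durationMs / steps);
  }
  tick();
}

// Usage: animateWithTimeout((p) => { div.style.left = (p * 200) + 'px'; });
```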
It's not safe to assume everyone can handle animation.
People will have different needs.
A lot of common animations, and common web design practices, give me awful migraines, so I set my browser to 1 frame per second to kill the animation without causing too much fast flashing.
I have a project that displays something at every quarter of the video.
To do this it takes the video duration and divides it by 4, then multiplies that by 2 for the half and by 3 for three quarters.
It also listens for the timeupdate event:
if currentTime equals videoduration/4, then display the content.
This is fine in Firefox, because timeupdate ticks every frame. But in Chrome timeupdate ticks every 250 ms, which means the function to display the content never runs, because currentTime is never exactly equal to videoduration/4.
I thought rounding to a whole number would fix it, but because the function runs when the rounded currentTime equals videoduration/4, and timeupdate ticks every 250 ms in Chrome, the content is then displayed 4 times per quarter.
Any ideas how I can fix this?
Edit - Code for reference:
video.addEventListener('loadedmetadata', function() {
    perc = video.duration / 4;
});

// display the current and remaining times
video.addEventListener("timeupdate", function() {
    // Current time
    var vTime = video.currentTime;
    perc1 = perc;
    vTime1 = vTime;

    if (vTime1 == perc1) {
        $('div.track').html('25%');
    }
    if (vTime1 == perc1 + perc1) {
        $('div.track').html('50%');
    }
    if (vTime1 == perc1 + perc1 + perc1) {
        $('div.track').html('75%');
    }
}, false);
setTimeout is not guaranteed to run exactly when you ask it to. It should never run earlier, but it may run later. You simply can't use it as a real-time counter.
In this case, the best thing you can do is keep track of how many times it has run... rather than trying to match exact percentages, match run times.
The other way is to run your timer MUCH more often (every 50 ms or so), check whether you have passed a time mark, and if so, update your HTML then.
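A sketch of that time-mark approach (quarterReached and lastQuarter are my own names): instead of testing for exact equality, track the last quarter already reported and fire whenever playback has crossed into a new one.

```javascript
// Returns the new quarter (1-3) when playback has crossed a quarter
// boundary not yet reported, otherwise null. Illustrative helper,
// not from the original answer.
function quarterReached(currentTime, duration, lastQuarter) {
  const quarter = Math.min(Math.floor((currentTime / duration) * 4), 3);
  return quarter > lastQuarter ? quarter : null;
}

// In the timeupdate handler:
// const q = quarterReached(video.currentTime, video.duration, lastQuarter);
// if (q !== null) { lastQuarter = q; $('div.track').html(q * 25 + '%'); }
```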
I am using the Web Audio API to display a visualization of the audio being played. I have an <audio> element that is controlling the playback; I then hook it up to the Web Audio API by creating a MediaElementSource node from the <audio> element. That is then connected to a GainNode and an AnalyserNode. The AnalyserNode's smoothingTimeConstant is set to 0.6. The GainNode is then connected to the AudioContext.destination.
I then call my audio processing function: onAudioProcess(). That function will continually call itself using:
audioAnimation = requestAnimationFrame(onAudioProcess);
The function uses the AnalyserNode to getByteFrequencyData from the audio, then loops through the (now populated) Uint8Array and draws each frequency magnitude on the <canvas> element's 2d context. This all works fine.
My issue is that when you pause the <audio> element, my onAudioProcess function continues to loop (by requesting animation frames on itself), which needlessly eats up CPU cycles. I can cancelAnimationFrame(audioAnimation), but that leaves the last-drawn frequencies on the canvas. I can resolve that by also calling clearRect on the canvas's 2d context, but it looks very odd compared to just letting the audio-processing loop continue (which slowly lowers each bar to the bottom of the canvas because of the smoothingTimeConstant).
So what I ended up doing was setting a timeout when the <audio> is paused, prior to canceling the animation frame. Doing this I was able to save CPU cycles when no audio was playing AND I was still able to maintain the smooth lowering of the frequency bars drawn on the <canvas>.
MY QUESTION: How do I accurately calculate the number of milliseconds it takes for a frequency magnitude of 255 to hit 0 (the range is 0-255) based on the AnalyserNode's smoothingTimeConstant value so that I can properly set the timeout to cancel the animation frame?
Based on my reading of the spec, I'd think you'd figure it out like this:
var val = 255,
    smooth = 0.6,
    sampl = 48000,
    i = 0,
    ms;

for ( ; val > 0.001; i++ ) {
    val = ( val + val * smooth ) / 2;
}

ms = ( i / sampl * 1000 );
The problem is that with this kind of averaging, you never really get all the way down to zero - so the loop condition is kind of arbitrary. You can make that number smaller and as you'd expect, the value for ms gets larger.
Anyway, I could be completely off-base here. But a quick look through the actual Chromium source code seems to sort of confirm that this is how it works. Although I'll be the first to admit my C++ is pretty bad.
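For comparison, the Web Audio spec defines the smoothing as X̂[k] = τ·X̂[k−1] + (1−τ)·X[k], so with silent input the smoothed value simply shrinks by a factor τ per analysis. Under that reading, and assuming one analysis per ~16.7 ms animation frame (an assumption about this visualizer's loop, not something the spec mandates), the timeout can be estimated in closed form:

```javascript
// Closed-form decay estimate, assuming the smoothed value shrinks by a
// factor `tau` per analysis and one analysis per ~60 fps animation frame.
// `decayMs` and the default frame length are illustrative assumptions.
function decayMs(tau, from = 255, to = 1, frameMs = 1000 / 60) {
  const frames = Math.ceil(Math.log(to / from) / Math.log(tau));
  return frames * frameMs;
}

// With tau = 0.6, 255 decays below 1 in about 11 frames (~183 ms).
```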