WebKit browsers slow to update canvas - javascript

I've created a simple binary clock in JavaScript using canvas, and in Chrome/Safari, even though I'm drawing to the canvas many times every second, the screen only updates exactly once per second.
Firefox updates the instant I draw to the canvas, and I think my code is efficient (Activity Monitor says the browser is using 5% to 10% of a single CPU core while the animation is running).
How can I make WebKit browsers update more often? My actual code is at jsFiddle:
http://jsfiddle.net/mqGKR/
But basically this is what I'm doing:
function updateCanvas()
{
    // schedule the next tick up front, so the loop keeps running
    // even when nothing needed redrawing this time around
    window.setTimeout(updateCanvas, 10);

    if (!canvasNeedsUpdating()) {
        return;
    }

    var ctx = blockView.getContext("2d");
    ctx.clearRect(0, 0, width, height);

    if (canvasNeedsFill()) {
        ctx.fillStyle = "rgba(255,255,255,1.0)";
        ctx.fillRect(0, 0, width, height);
    }
}

Wow. This gets weird.
This has nothing to do with canvas. It has to do with your BinaryTime class. There's some difference in how Date objects behave between at least Chrome and Firefox.
The beginning and end are 1370318400000 and 1370404800000 in Firefox, every time. Judging by your comments, this is what you want.
In Chrome they change every single time, which means they are definitely not representing midnight this morning and midnight tonight as your comments suggest.
In other words, the Date object in Chrome/WebKit appears broken, but it is actually the more accurate one; Firefox is wrong in a more subtle way. For now, let's focus on a fix. (Later tonight, while crying into a tub of ice cream, I'll submit some bug reports.)
Chrome is doing the right thing here, because you are not calling setMilliseconds and Chrome is respecting that. Firefox gets weird and does the wrong thing, but it just so happens to be what you want.
So anyway, the easy way that works off the bat is to use setHours with all four arguments:
// init "beginning" timestamp as midnight this morning
var beginning = new Date();
beginning.setHours(0, 0, 0, 0);
beginning = beginning.getTime();
// init "end" timestamp as as midnight tonight
var end = new Date(date);
end.setHours(0, 0, 0, 0);
end.setDate(end.getDate() + 1);
end = end.getTime();
I'd just do that for now. Working example:
http://jsfiddle.net/wvR6H/
The slightly-more-drawn-out problem is that in Chrome/WebKit you need to also set the milliseconds:
blah.setMilliseconds(0);
You need to set it in Firefox too; you're exploiting a sort-of Firefox bug as your code is written right now. It would be "broken" in Firefox too if you had used beginning = new Date(), in other words an empty constructor. See here for instance: http://jsfiddle.net/VbWnk/
It just so happens that new Date(new Date()) in Firefox trims off the milliseconds for you. Actually, to be fair, IE works the same way, so Chrome/WebKit is the odd one out. The ECMAScript specification is not clear on who's right (Firefox/IE seem to be right, but talk around ECMAScript 6 indicates that new Date(Date) may get special-cased). A Date object is not technically an acceptable argument to the Date constructor, but a string is, and the date string does not contain milliseconds. This suggests Firefox/IE are more correct, but WebKit's way is understandable too, and may become the standard in the future. (As it turned out, ES2015 did special-case a Date argument: new Date(dateObj) now copies the exact time value, milliseconds included.)
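A quick way to see the difference at the console (my sketch; the output depends on the engine's vintage, per the note above):

var d = new Date();
d.setMilliseconds(789);

var copy = new Date(d); // pre-ES2015 Firefox/IE went through the date string, dropping the ms

console.log(d.getMilliseconds());    // 789
console.log(copy.getMilliseconds()); // 789 in ES2015+ engines; 0 in old Firefox/IE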
...But anyway, setHours(a,b,c,d) sets the hours, minutes, seconds, milliseconds as shorthand, so that's easier to write.
Hope your project goes well.

Javascript Date.now() function [duplicate]

I got this code over here:
var date = new Date();
setTimeout(function(e) {
    var currentDate = new Date();
    if (currentDate - date >= 1000) {
        console.log(currentDate, date);
        console.log(currentDate - date);
    } else {
        console.log("It was less than a second!");
        console.log(currentDate - date);
    }
}, 1000);
On my computer it always executes correctly, logging 1000 to the console. Interestingly, on another computer the same code fires the timeout callback in less than a second, and the difference currentDate - date is between 980 and 998.
I know there are libraries that work around this inaccuracy (for example, Tock).
Basically, my question is: why does setTimeout not fire at exactly the given delay? Could it be that the computer is too slow and the browser automatically tries to adapt to the slowness and fires the event early?
It's not supposed to be particularly accurate. There are a number of factors limiting how soon the browser can execute the code; quoting from MDN:
In addition to "clamping", the timeout can also fire later when the page (or the OS/browser itself) is busy with other tasks.
In other words, the way setTimeout is usually implemented, it is only meant to execute the callback after the given delay has passed, and once the browser's thread is free to run it.
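To see that "thread is busy" effect directly, here's a small sketch (my own, not from the answer) that schedules a timeout and then blocks the main thread:

var scheduled = Date.now();
setTimeout(function () {
    // asked for 100 ms, but the callback cannot run until the thread is free
    console.log('fired after ' + (Date.now() - scheduled) + ' ms'); // ~500, not 100
}, 100);

// busy-wait for ~500 ms, keeping the JavaScript thread occupied
var start = Date.now();
while (Date.now() - start < 500) { /* spin */ }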
However, different browsers may implement it in different ways. Here are some tests I did:
var date = new Date();
setTimeout(function(e) {
    var currentDate = new Date();
    console.log(currentDate - date);
}, 1000);

// Browser   Test1   Test2   Test3   Test4
// Chrome     998    1014     998     998
// Firefox   1000    1001    1047    1000
// IE 11     1006    1013    1007    1005
Perhaps the < 1000 times from Chrome could be attributed to inaccuracy in the Date type, or perhaps Chrome uses a different strategy for deciding when to execute the code: maybe it's trying to fit it into the nearest time slot, even if the timeout delay hasn't quite completed yet.
In short, you shouldn't use setTimeout if you expect reliable, consistent, millisecond-scale timing.
In general, computer programs are highly unreliable when trying to execute things with higher precision than 50 ms. The reason for this is that even on an octa-core hyperthreaded processor the OS is usually juggling several hundred processes and threads, sometimes thousands or more. The OS makes all that multitasking work by scheduling them to get a slice of CPU time one after another, meaning each gets a few milliseconds at most to do its thing.
Implicitly this means that if you set a timeout for 1000 ms, chances are far from small that the current browser process won't even be running at that point in time, so it's perfectly normal for the browser not to notice until 1005, 1010 or even 1050 milliseconds that it should be executing the given callback.
Usually this is not a problem; it happens, and it's rarely of utmost importance. If it is, all operating systems supply kernel-level timers that are far more precise than 1 ms and allow a developer to execute code at precisely the correct point in time. JavaScript, however, as a heavily sandboxed environment, doesn't have access to kernel objects like that, and browsers refrain from using them, since it could theoretically allow someone to attack OS stability from inside a web page by carefully constructing code that starves other threads with a swarm of dangerous timers.
As for why the test yields 980 I'm not sure; that would depend on exactly which browser you're using and which JavaScript engine. I can however fully understand it if the browser corrects a bit downwards for system load and/or speed, ensuring that "on average the delay is still about the correct time". It would make a lot of sense from the sandboxing principle to just approximate the amount of time required without potentially burdening the rest of the system.
Someone please correct me if I am misinterpreting this information:
According to a post from John Resig regarding the inaccuracy of performance tests across platforms (emphasis mine):
With the system times constantly being rounded down to the last queried time (each about 15 ms apart) the quality of performance results is seriously compromised.
So there is up to a 15 ms fudge on either end when comparing to the system time.
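A rough way to observe that granularity yourself (my sketch, not from Resig's post): sample the clock in a tight loop and count how many distinct values it produces.

var distinct = 0, last = -1;
var start = Date.now();
while (Date.now() - start < 100) {
    var t = Date.now();
    if (t !== last) { distinct++; last = t; }
}
// roughly 100 distinct values on a 1 ms clock; far fewer on a coarse 15 ms clock
console.log('distinct clock values in 100 ms: ' + distinct);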
I had a similar experience.
I was using something like this:
var iMillSecondsTillNextWholeSecond = (1000 - (new Date().getTime() % 1000));

setTimeout(function ()
{
    CountDownClock(ElementID, RelativeTime);
}, iMillSecondsTillNextWholeSecond); // wait until the next whole second to start
I noticed it would skip a second every couple of seconds, and sometimes it would go longer without skipping.
However, I'd still catch it skipping after 10 or 20 seconds, and it just looked rickety.
I thought, "Maybe the timeout is too slow, or waiting for something else?"
Then I realized, "Maybe it's too fast, and the timers the browser is managing are off by a few milliseconds?"
After adding +1 millisecond to my variable, I only saw it skip once.
I ended up adding +50 ms, just to be on the safe side.
var iMillSecondsTillNextWholeSecond = (1000 - (new Date().getTime() % 1000) + 50);
I know it's a bit hacky, but my timer is running smoothly now. :)
JavaScript does have a way of dealing with exact time frames. Here's one approach:
Save Date.now() when you start waiting, create an interval with a short update period, and check the difference between the two timestamps on each tick.
Example:
const startDate = Date.now()

const intervalId = setInterval(() => {
    const currentDate = Date.now()
    if (currentDate - startDate >= 1000) {
        // at least a second has passed
        clearInterval(intervalId)
        return
    }
    // less than a second so far
}, 50)

Sound fades out, but does not fade in -- why?

I think I understand the main concepts of the Web Audio API, as well as how sound works in general. But even though I managed to make the sound "fade out", I cannot figure out why it is not "fading in" in the following snippet, which I wrote to reproduce the problem:
(function ()
{
    'use strict';

    var context = new AudioContext(),
        wave = context.createOscillator(),
        gain = context.createGain(),
        ZERO = 0.000001;

    wave.connect(gain);
    gain.connect(context.destination);

    wave.type = 'sine';
    wave.frequency.value = 200;
    gain.gain.value = ZERO;

    wave.start(context.currentTime);
    gain.gain.exponentialRampToValueAtTime(1.00, 1.0);
    gain.gain.exponentialRampToValueAtTime(ZERO, 3.0);
})();
NOTE: The same problem appeared on Firefox (Linux) and Chrome (Windows) too
Replacing your gain.gain.value = ZERO line with:
gain.gain.setValueAtTime(ZERO, 0);
will fix the problem.
The rationale is in the actual specification of the exponentialRampToValueAtTime() function:
Schedules an exponential continuous change in parameter value from the previous scheduled parameter value to the given value
So, if there's no previous scheduled parameter value (only a fixed value) then the function cannot interpolate. The same applies to the linearRampToValueAtTime function.
This may also be useful from the MDN documentation:
AudioParam.value ... Though it can be set, any modifications happening while there are automation events scheduled — that is events scheduled using the methods of the AudioParam — are ignored, without raising any exception
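Putting the fix into the question's snippet looks like this (a sketch; the ramp end times stay absolute, as in the original):

(function ()
{
    'use strict';

    var context = new AudioContext(),
        wave = context.createOscillator(),
        gain = context.createGain(),
        ZERO = 0.000001; // exponential ramps cannot reach (or start from) exactly 0

    wave.connect(gain);
    gain.connect(context.destination);

    wave.type = 'sine';
    wave.frequency.value = 200;

    // Schedule an automation event instead of assigning .value,
    // so the ramps have a previous event to interpolate from:
    gain.gain.setValueAtTime(ZERO, 0);

    wave.start(context.currentTime);
    gain.gain.exponentialRampToValueAtTime(1.00, 1.0); // fade in by t = 1 s
    gain.gain.exponentialRampToValueAtTime(ZERO, 3.0); // fade out by t = 3 s
})();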
You need to
gain.gain.setValueAtTime(ZERO, 0);
because just setting
gain.gain.value = ZERO;
does not set a schedule point in the AudioParam scheduler, so the ramp runs from the last known schedule point (which is the default value of 1 at time = 0). Mixing .value assignments with scheduling does not tend to work well; I've had an article 75% written about this for a long time, and just haven't gotten it released.

Safari and requestAnimationFrame gets DOMHighResTimestamp; window.performance not available

I created an animation using requestAnimationFrame. It works fine in Windows Chrome and IE; Safari (tested Safari 6 and 7) breaks. It turns out that rAF gets a DOMHighResTimestamp rather than a Date timestamp. That's all fine and good, and what I expected, as it's now part of the spec. However, as far as I've been able to find, there's no way to get the current DOMHighResTimestamp (i.e. window.performance is not available, even with a prefix). So if I create the start time as a Date timestamp, things go radically wrong when I try to determine progress within the rAF callback (I get very small negative numbers).
If you look at this JSBin on Safari, it won't animate at all.
In this JSBin, I've made a change to "skip" the first frame (where the time parameter is undefined), so that startTime gets set to the time parameter on the next frame. That seems to work, but skipping a frame feels a bit crappy.
Is there some way to get the current DOMHighResTimestamp in Safari, given the lack of window.performance? Or, alternatively, is there a way to force rAF into some sort of legacy mode so that it gets a Date timestamp instead?
Does anyone know why Safari has this inconsistency, where it provides the parameter in a format that you can't get at any other way?
Performance.now() is only a recommendation as of now; see https://developer.mozilla.org/en-US/docs/Web/API/Performance.now(). I can only assume it's a matter of time before it becomes official, seeing as how everyone but Safari has it built in.
Besides that, use this to your advantage: since you know requestAnimationFrame passes a DOMHighResTimestamp to its callback, use that for your timing.
Game.start = function (timestamp) {
    if (timestamp) {
        // use Game.time rather than this.time: rAF calls this function
        // without a receiver, so `this` would not be Game here
        Game.time = timestamp;
        requestAnimationFrame(Game.loop);
    } else {
        // no valid timestamp yet; try again on the next frame
        requestAnimationFrame(Game.start);
    }
};

Game.loop = function (timestamp) {
    Game.tick(timestamp);
    // ... your code ...
    requestAnimationFrame(Game.loop);
};

Game.tick = function (timestamp) {
    this.delta = timestamp - this.time;
    this.time = timestamp;
};
What I do here is call Game.start, which begins the loop (I've run into situations where the timestamp was undefined, so we retry until we get something valid). Once we have that, we have our base time, and since rAF passes a timestamp into its callback, the tick function never has to call Date.now or performance.now; we just keep passing along the timestamp that requestAnimationFrame gives us.

Better way of getting time in milliseconds in javascript?

Is there an alternative in JavaScript for getting the time in milliseconds besides the Date object, or at least a way to reuse such an object without having to instantiate a new one every time I need the value? I am asking because I am trying to make a simple game engine in JavaScript, and when calculating the "delta frame time" I have to create a new Date object every frame. While I am not too worried about the performance implications of this, I am having some problems with the reliability of the exact time returned by this object.
I get some strange "jumping" in the animation every second or so, and I am not sure whether this is related to JavaScript's garbage collection or to a limitation of the Date object when updating so fast. If I set the delta value to some constant, the animation is perfectly smooth, so I am fairly sure this "jumping" is related to the way I get the time.
The only relevant code I can give is the way I calculate the delta time:
prevTime = curTime;
curTime = (new Date()).getTime();
deltaTime = curTime - prevTime;
When calculating movement / animation I multiply a constant value by the delta time.
If there is no way to avoid getting the time in milliseconds via the Date object, would a function that increments a variable (the elapsed time in milliseconds since the game started), called via setInterval once every millisecond, be an efficient and reliable alternative?
Edit: I have now tested my code in different browsers, and it seems this "jump" only appears in Chrome, not in Firefox. It would still be nice to have a method that works well in both browsers.
Try Date.now().
The skipping is most likely due to garbage collection. Typically garbage collection can be avoided by reusing variables as much as possible, but I can't say specifically what methods you can use to reduce garbage collection pauses.
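As a small, hypothetical illustration of that advice (the names here are made up), compare a function that allocates a fresh result object every call with one that writes into a reusable scratch object:

// Allocating version: returns a new object every call, creating garbage each frame
function addAlloc(a, b) {
    return { x: a.x + b.x, y: a.y + b.y };
}

// Reusing version: writes into a caller-supplied object, creating no garbage
function addInto(a, b, out) {
    out.x = a.x + b.x;
    out.y = a.y + b.y;
    return out;
}

var tmp = { x: 0, y: 0 }; // one long-lived scratch object, reused every frame
// per frame: addInto(position, velocity, tmp);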
As far as I know you can only get the time with Date.
Date.now is the solution, but it is not available everywhere: https://developer.mozilla.org/en/JavaScript/Reference/Global_Objects/Date/now.
var currentTime = +new Date();
This gives you the current time in milliseconds.
As for your jumps: if you compute interpolations correctly according to the delta frame time and you don't have a rounding error somewhere, my bet is the garbage collector (GC).
If a lot of temporary objects are created in your loop, garbage collection has to lock the thread to do some cleanup and memory reorganization.
With Chrome you can see how much time the GC is spending in the Timeline panel.
EDIT: Since my answer, Date.now() should be considered the best option, as it is supported everywhere and on IE >= 9.
I know this is a pretty old thread, but to keep things up to date and more relevant, you can use the more accurate performance.now() to get finer-grained timing in JavaScript.
window.performance = window.performance || {};
performance.now = (function () {
    return performance.now ||
        performance.mozNow ||
        performance.msNow ||
        performance.oNow ||
        performance.webkitNow ||
        Date.now; /* none found - fall back to the browser default */
})();
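One caveat with this shim (my note, not part of the original answer): with the Date.now fallback, the returned values are Unix-epoch milliseconds rather than time since page load, so only differences between two readings are meaningful:

var t0 = performance.now();
// ... work to be measured ...
var elapsed = performance.now() - t0; // valid with either the native clock or the fallback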
If you have a Date object like
var date = new Date('2017/12/03');
then there is a built-in JavaScript method for getting the date in milliseconds format, which is valueOf():
date.valueOf(); // 1512239400000, in milliseconds
This is a very old question, but still, for reference if others are looking at it: requestAnimationFrame() is the right way to handle animation in modern browsers.
UPDATE: The Mozilla link shows how to do this; I didn't feel like repeating the text behind the link ;)
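For reference, a minimal sketch of rAF-driven delta timing (my example, not from the linked page; it assumes the browser passes a DOMHighResTimestamp to the callback, as current browsers do):

var lastTime = null;

function step(timestamp) {
    if (lastTime !== null) {
        var delta = timestamp - lastTime; // ms elapsed since the previous frame
        // advance animation state by `delta` here
    }
    lastTime = timestamp;
    requestAnimationFrame(step);
}

requestAnimationFrame(step);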

How to tell what's causing slow HTML5 Canvas performance?

How can I tell whether the canvas's slow performance is caused by the drawing itself, or by the underlying logic that calculates what should be drawn and where?
The second part of my question: how do I calculate the canvas's FPS? Here's how I did it; it seems logical to me, but I could be absolutely wrong too. Is this the right way to do it?
var fps = 0;
setInterval(draw, 1000 / 30);
setInterval(checkFps, 1000);

function draw() {
    //...
    fps++;
}

function checkFps() {
    $("#fps").html(fps);
    fps = 0;
}
Edit:
I replaced the above with the following, according to Nathan's comments:
var lastTimeStamp = new Date().getTime();

function draw() {
    //...
    var now = new Date().getTime();
    $("#fps").html(Math.floor(1000 / (now - lastTimeStamp)));
    lastTimeStamp = now;
}
So how's this one? You could also display just the difference in ms since the last update; performance differences can be seen that way too. By the way, I also did a side-by-side comparison of the two, and they usually moved pretty much together (a difference of 2 at most), though the latter one had bigger spikes when performance was extraordinarily low.
Your FPS code is definitely wrong
setInterval(checkFps, 1000);
Nothing guarantees that this function will be called exactly every second (it could be more than 1000 ms, or less, but probably more), so
function checkFps() {
    $("#fps").html(fps);
    fps = 0;
}
is wrong: if fps is 32 at that moment, it is possible (in an extreme case) that those 32 frames actually took 1.5 seconds.
Better is to check how much real time has actually passed since the last update and calculate frames / realTimePassed; see the sketch at the end of this answer. (JavaScript does have functions to get the time; whether they are accurate enough, i.e. millisecond resolution or better, is another question.)
fps is, by the way, not a good name: it holds the number of frames since the last update, not the number of frames per second, so frames would be a better name.
In the same way
setInterval(draw, 1000/30);
is wrong too: you want to achieve an FPS of 30, but setInterval is not very accurate (it will probably wait longer than you ask), so you will end up with a lower FPS even if the CPU is able to handle the load.
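Here is a minimal sketch of the frames-over-real-time idea described above (my code, reusing the question's #fps element and Date.now() for broad compatibility):

var frames = 0;
var lastFpsUpdate = Date.now();

function draw() {
    //... draw the scene ...
    frames++;

    var now = Date.now();
    var elapsed = now - lastFpsUpdate; // real ms since the last readout
    if (elapsed >= 1000) {
        // frames divided by real seconds passed, not by an assumed 1.0 s
        $("#fps").html(Math.round(frames * 1000 / elapsed));
        frames = 0;
        lastFpsUpdate = now;
    }
}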
WebKit's inspector and Firebug both provide profiling tools to see where CPU cycles are being spent in your JavaScript code. I'd recommend starting there.
For the FPS calculation, I don't think your code is going to work, but I don't have any good recommendation :(
The reason: most (all?) browsers use a dedicated thread for running JavaScript and a different thread for running UI updates. If the JavaScript thread is busy, the UI thread won't be triggered.
So, you can run some JavaScript looping code that "updates" the UI 1000 times in succession (for instance, setting the color of some text), but unless you add a setTimeout to allow the UI thread to paint the change, you won't see any changes until the 1000 iterations are finished.
That said, I don't know if you can assertively increment your fps counter at the end of the draw() routine. Sure, your JavaScript function has finished, but did the browser actually draw?
Check whether you are using some innerHTML method to debug your project. That can slow your project down more than you might imagine, especially if you concatenate like innerHTML += newDebugValues;
Or, as desau said, profile your CPU usage with Firebug or WebKit's built-in debugger.
