Is there an alternative in JavaScript for getting the time in milliseconds from the Date object, or at least a way to reuse that object, without having to instantiate a new object every time I need this value? I am asking because I am trying to make a simple game engine in JavaScript, and when calculating the "delta frame time" I have to create a new Date object every frame. While I am not too worried about the performance implications of this, I am having some problems with the reliability of the exact time returned by this object.
I get some strange "jumping" in the animation every second or so, and I am not sure whether this is related to JavaScript's garbage collection or to a limitation of the Date object when updating so fast. If I set the delta value to some constant, the animation is perfectly smooth, so I am fairly sure this "jumping" is related to the way I get the time.
The only relevant code I can give is the way I calculate the delta time:
prevTime = curTime;
curTime = (new Date()).getTime();
deltaTime = curTime - prevTime;
When calculating movement / animation I multiply a constant value with the delta time.
If there is no way to avoid getting the time in milliseconds from the Date object, would a function that increments a variable (the elapsed time in milliseconds since the game started), called via setInterval at a rate of once every millisecond, be an efficient and reliable alternative?
Edit: I have now tested my code in different browsers, and it seems this "jump" only appears in Chrome, not in Firefox. It would still be nice if there were a method that worked in both browsers.
Try Date.now().
The skipping is most likely due to garbage collection. Typically garbage collection can be avoided by reusing variables as much as possible, but I can't say specifically what methods you can use to reduce garbage collection pauses.
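For the delta-time use case, a minimal sketch using Date.now() (update and the setTimeout scheduling are placeholders for the engine's own loop, not part of the original code):
var prevTime = Date.now();

function tick() {
  var curTime = Date.now();
  var deltaTime = curTime - prevTime; // elapsed ms since the last frame
  prevTime = curTime;
  update(deltaTime); // hypothetical update function; scale movement by deltaTime
  setTimeout(tick, 16); // placeholder scheduling at roughly 60 fps
}

tick();
No Date object is ever allocated, which also removes one source of per-frame garbage-collection churn.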
As far as I know, you can only get the time with Date.
Date.now() is the solution, but it is not available everywhere: https://developer.mozilla.org/en/JavaScript/Reference/Global_Objects/Date/now.
var currentTime = +new Date();
This gives you the current time in milliseconds.
As for your jumps: if you compute interpolations correctly from the delta frame time and don't have a rounding error, my bet is on the garbage collector (GC).
If a lot of temporary objects are created in your loop, the garbage collector has to lock the thread to do some cleanup and memory reorganization.
With Chrome you can see how much time the GC is spending in the Timeline panel.
EDIT: Since I wrote this answer, Date.now() should be considered the best option, as it is supported everywhere, including IE >= 9.
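For anything older than that, a common hedge is a one-line shim that falls back to the +new Date() trick shown above:
var now = Date.now || function () { return +new Date(); };
var currentTime = now(); // milliseconds since the Unix epoch either way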
I know this is a pretty old thread, but to keep things up to date and more relevant: you can use the more accurate performance.now() to get finer-grained timing in JavaScript.
window.performance = window.performance || {};
performance.now = (function() {
  return performance.now ||
    performance.mozNow ||
    performance.msNow ||
    performance.oNow ||
    performance.webkitNow ||
    Date.now; /* none found - fall back to the browser default */
})();
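With the shim in place, frame deltas can be taken from performance.now(); a minimal sketch (note the value is relative to the page's time origin when a native high-resolution clock is found, and to the Unix epoch when the Date.now fallback is used, but deltas are comparable either way):
var last = performance.now();

function frame() {
  var now = performance.now();
  var delta = now - last; // fractional ms since the previous frame
  last = now;
  // ...update the animation using delta...
}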
If you have a date object like
var date = new Date('2017/12/03');
then there is a built-in method in JavaScript for getting the date in milliseconds: valueOf()
date.valueOf(); // 1512239400000 milliseconds since the Unix epoch
This is a very old question, but for reference if others are looking at it: requestAnimationFrame() is the right way to handle animation in modern browsers.
UPDATE: The mozilla link shows how to do this - I didn't feel like repeating the text behind the link ;)
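For completeness, the basic shape of such a loop (a minimal sketch along the lines of the MDN pattern; the browser passes the current timestamp to the callback, so no Date object is needed at all):
var previous; // timestamp of the previous frame

function step(timestamp) {
  if (previous === undefined) previous = timestamp;
  var delta = timestamp - previous; // ms elapsed since the last frame
  previous = timestamp;
  // ...update and draw using delta...
  requestAnimationFrame(step);
}

requestAnimationFrame(step);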
Related
If you create a very simple program that has a setInterval with 1 second delay, and you log the times its function is called, you will notice that the interval 'drifts'.
Basically, it actually takes (1,000ms + some amount of time) between each call.
For this program, it actually takes ~1,005ms between each call.
What causes the drift?
Is it taking 5ms to requeue setInterval?
Is it the length of time it takes to run the function? (I doubt this, but I'm having trouble ruling it out.)
Why does setInterval behave this way, and not just base itself on some clock time? (e.g. if you have 1,000ms delay and you started at time 3... just check if 1,003 then 2,003 and so on has elapsed?)
Example:
const startTime = new Date().valueOf();
function printElapsedTime(startTime) {
  console.log(new Date().valueOf() - startTime);
}
let intervalObj = setInterval(printElapsedTime, 1000, startTime);
Output:
1005
2010
3015
4020
So you are no longer synced to 1 second. Since it drifts by about 5 ms per call, after 100 runs it will be running about half a second 'later' than expected.
This question discusses how to avoid the drift, but does not explain WHY it happens. (For instance, it does not say that setInterval recursively adds itself to the event queue after each call and that this takes 3 ms - which is just a guess at the cause of the drift.)
While no JavaScript running in a standard browser claims to be real-time (as pointed out in several comments), there are steps you can take to keep things from getting as out of hand as they appear to in the question's example (where the errors are cumulative).
To start with an experiment, I ran this in Chrome on my Windows 10 machine:
const startTime = new Date().valueOf();
function printElapsedTime(startTime) {
  let curTime = new Date().valueOf();
  console.log(curTime - startTime);
}
let intervalObj = setInterval(printElapsedTime, 1000, startTime);
This gave a fairly consistent error each second, and around the one-minute mark there was no cumulative drift.
However, using Firefox on the same system there was cumulative drift, and it was pretty significant by the one-minute mark.
So the question is, can anything be done to make it a bit better across browsers?
This snippet ditches setInterval and instead uses setTimeout on each invocation:
const startTime = new Date().valueOf();
let nextExpected = startTime + 1000;
function printElapsedTime(startTime) {
  let curTime = new Date().valueOf();
  console.log(curTime - startTime);
  let nextInterval = 1000 + nextExpected - curTime;
  setTimeout(printElapsedTime, nextInterval, startTime);
  nextExpected = curTime + nextInterval;
}
let intervalObj = setTimeout(printElapsedTime, 1000, startTime);
On Firefox this gave no cumulative drift, and the error around the one-minute mark was no worse than before.
So, in attempt to actually answer the question:
Computers do have other duties to attend to and cannot guarantee to run a timeout callback at an exact time (though the spec requires them not to run it before the interval has elapsed). In the given code, console.log takes time, and setting up a new timeout (in the final example) takes time, but the laptop/phone is also dealing with lots of other things at the same time: housekeeping in the background, listening for interrupts, and so on.
Different browsers seem to treat setInterval differently - the spec doesn't seem to say what, if anything, they should do about cumulative drift. From the experiments here it seems that Chrome/Edge, at least on my Windows 10 laptop, does some mitigation so the drift isn't cumulative, whereas Firefox doesn't seem to adjust and the drift can be significant.
It would be interesting to know if others on different systems get equivalent results. Anyway, the basic message is: don't rely on such timeouts; it is not a real-time system.
Long story short, none of the desktop operating systems is a real-time OS:
https://en.m.wikipedia.org/wiki/Real-time_operating_system
Thus, executing a task like calling the callback function at an exact time is not guaranteed. The OS does its best to juggle all the tasks and take care of power/resource constraints to optimize performance as a whole. As a result, timings float around a little.
Interestingly, you get a consistent 5 ms shift. I have no explanation for that.
I got this code over here:
var date = new Date();
setTimeout(function(e) {
  var currentDate = new Date();
  if (currentDate - date >= 1000) {
    console.log(currentDate, date);
    console.log(currentDate - date);
  }
  else {
    console.log("It was less than a second!");
    console.log(currentDate - date);
  }
}, 1000);
On my computer, it always executes correctly, with 1000 in the console output. Interestingly, on another computer running the same code, the timeout callback starts in less than a second, and the difference currentDate - date is between 980 and 998.
I know the existence of libraries that solve this inaccuracy (for example, Tock).
Basically, my question is: why does setTimeout not fire at exactly the given delay? Could it be that the computer is too slow and the browser automatically tries to adapt to the slowness and fires the event early?
It's not supposed to be particularly accurate. There are a number of factors limiting how soon the browser can execute the code; quoting from MDN:
In addition to "clamping", the timeout can also fire later when the page (or the OS/browser itself) is busy with other tasks.
In other words, the way that setTimeout is usually implemented, it is just meant to execute after a given delay, and once the browser's thread is free to execute it.
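You can see the "once the thread is free" part directly by blocking the thread on purpose; the callback cannot run until the loop finishes, no matter what delay was requested (a quick sketch):
var t0 = Date.now();
setTimeout(function () {
  console.log('fired after ' + (Date.now() - t0) + ' ms'); // ~500, not 100
}, 100);

// Keep the single JS thread busy for ~500 ms.
while (Date.now() - t0 < 500) {}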
However, different browsers may implement it in different ways. Here are some tests I did:
var date = new Date();
setTimeout(function(e) {
  var currentDate = new Date();
  console.log(currentDate - date);
}, 1000);
// Browser   Test1  Test2  Test3  Test4
// Chrome      998   1014    998    998
// Firefox    1000   1001   1047   1000
// IE 11      1006   1013   1007   1005
Perhaps the < 1000 times from Chrome could be attributed to inaccuracy in the Date type, or perhaps Chrome uses a different strategy for deciding when to execute the code - maybe it tries to fit it into the nearest time slot, even if the timeout delay hasn't completed yet.
In short, you shouldn't use setTimeout if you expect reliable, consistent, millisecond-scale timing.
In general, computer programs are highly unreliable when trying to execute things with higher precision than 50 ms. The reason is that even on an octacore hyperthreaded processor the OS is usually juggling several hundred processes and threads, sometimes thousands or more. The OS makes all that multitasking work by scheduling them to get a slice of CPU time one after another, meaning they get 'a few milliseconds of time at most to do their thing'.
Implicitly this means that if you set a timeout for 1000 ms, chances are far from small that the current browser process won't even be running at that point in time, so it's perfectly normal for the browser not to notice until 1005, 1010 or even 1050 milliseconds that it should be executing the given callback.
Usually this is not a problem; it happens, and it's rarely of the utmost importance. If it is, all operating systems supply kernel-level timers that are far more precise than 1 ms and allow a developer to execute code at precisely the correct point in time. JavaScript, however, as a heavily sandboxed environment, doesn't have access to kernel objects like that, and browsers refrain from using them, since it could theoretically allow someone to attack OS stability from inside a web page by carefully constructing code that starves other threads by swamping the system with a lot of dangerous timers.
As for why the test yields 980, I'm not sure - that would depend on exactly which browser and JavaScript engine you're using. I can, however, fully understand it if the browser corrects a bit downwards for system load and/or speed, ensuring that "on average the delay is still about the correct time" - it would make a lot of sense, given the sandboxing principle, to approximate the required time without potentially burdening the rest of the system.
Someone please correct me if I am misinterpreting this information:
According to a post from John Resig regarding the inaccuracy of performance tests across platforms (emphasis mine)
With the system times constantly being rounded down to the last queried time (each about 15 ms apart) the quality of performance results is seriously compromised.
So there is up to a 15 ms fudge on either end when comparing to the system time.
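You can estimate the clock's granularity yourself by sampling Date.now() until it changes and recording the step sizes (a rough sketch; expect steps of about 1 ms on modern browsers, and coarser steps like the ~15 ms Resig describes on older systems):
var last = Date.now();
var steps = [];
while (steps.length < 10) {
  var now = Date.now();
  if (now !== last) {
    steps.push(now - last); // size of one clock tick in ms
    last = now;
  }
}
console.log(steps); // e.g. [1, 1, 1, ...] or [15, 16, 15, ...]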
I had a similar experience.
I was using something like this:
var iMillSecondsTillNextWholeSecond = (1000 - (new Date().getTime() % 1000));
setTimeout(function () {
  CountDownClock(ElementID, RelativeTime);
}, iMillSecondsTillNextWholeSecond); // Wait until the next whole second to start.
I noticed it would skip a second every couple of seconds, and sometimes it would go longer between skips.
Still, I'd catch it skipping after 10 or 20 seconds, and it just looked rickety.
I thought, "Maybe the timeout is too slow, or waiting for something else?"
Then I realized, "Maybe it's too fast, and the timers the browser is managing are off by a few milliseconds?"
After adding +1 millisecond to my variable, I only saw it skip once.
I ended up adding +50 ms, just to be on the safe side.
var iMillSecondsTillNextWholeSecond = (1000 - (new Date().getTime() % 1000) + 50);
I know it's a bit hacky, but my timer is running smoothly now. :)
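A less hacky variant (a sketch, not the original code, reusing the snippet's hypothetical CountDownClock callback) recomputes the remainder on every tick, so each timeout re-aligns itself to the next whole second instead of relying on a fixed fudge:
function onWholeSecond() {
  CountDownClock(ElementID, RelativeTime);
  // Recompute the wait so each tick re-syncs to the next whole second.
  setTimeout(onWholeSecond, 1000 - (Date.now() % 1000));
}

setTimeout(onWholeSecond, 1000 - (Date.now() % 1000));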
JavaScript has a way of dealing with exact time frames. Here's one approach:
You can save Date.now() when you start waiting, create an interval with a short update period, and calculate the difference between the dates.
Example:
const startDate = Date.now()

const intervalId = setInterval(() => {
  const currentDate = Date.now()
  if (currentDate - startDate >= 1000) {
    // at least a second has passed
    clearInterval(intervalId) // stop checking once the second is up
    return
  }
  // it was not a second yet
}, 50)
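One design note: because the interval fires only about every 50 ms and timers are not exact, the elapsed time will rarely equal exactly 1000 ms, so the check tests for "at least a second" (>=) rather than strict equality, and the interval id is kept so clearInterval can stop the polling.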
Let's start with a note from www.w3.org that includes two important links to compare.
The PerformanceTiming interface was defined in [NAVIGATION-TIMING] and
is now considered obsolete. The use of names from the
PerformanceTiming interface is supported to remain backwards
compatible, but there are no plans to extend this functionality to
names in the PerformanceNavigationTiming interface defined in
[NAVIGATION-TIMING-2] (or other interfaces) in the future.
I have made a function to get a Navigation Timing value that should be both backward and forward compatible, because we are in the middle of the transition to Level 2. This function, which gets a time from an event name, works in Chrome but not Firefox:
function nav(eventName) {
  var lev1 = performance.timing; // deprecated: Unix epoch time in ms since 1970
  var lev2 = performance.getEntriesByType("navigation")[0]; // ms since the page started to load (i.e. since performance.timing.navigationStart)
  var nav = lev2 || lev1; // if lev2 is undefined, use lev1
  return nav[eventName];
}
Explanation: When there is no "navigation" entry, this falls back to the deprecated way of doing navigation timing, based on Unix epoch time in milliseconds since 1970 (lev1). The new way (lev2) is HR time in milliseconds since the current document navigation started to load, which is useful together with User Timing, which has always used the HR time format.
How can we get the function to return HR time in all cases?
When I see a number with more than 10 digits and no decimal point, I know it is a time from the deprecated Navigation Timing Level 1. All other test cases give numbers with a decimal point, meaning they are HR times with higher precision. The biggest issue is that the two have different time origins.
I have gone through confusion, trial and error, and frustrated searching (MDN has not been updated to Level 2) to confirm and state that:
Navigation Timing Level 1 uses Unix epoch time; the rest use HR time:
Navigation Timing Level 2 uses HR time
User Timing Level 1 uses HR time
User Timing Level 2 uses HR time
Also, performance.now() returns HR time in both Chrome and Firefox.
How to convert unix epoch time to HR time?
SOLVED:
The code was corrected with help from Amadan.
See the comments in the accepted answer.
function nav(eventName, fallback) {
  var lev1 = performance.timing; // deprecated: Unix epoch time in ms since 1970
  var lev2 = performance.getEntriesByType("navigation")[0]; // ms since the page started to load
  var nav = lev2 || lev1; // if lev2 is undefined, use lev1
  if (!nav[eventName] && fallback) eventName = fallback;

  // approximate t, the microseconds it takes to execute performance.now()
  var i = 10000, t = performance.now();
  while (--i) performance.now();
  t = (performance.now() - t) / 10000; // < 10 microseconds

  var oldTime = new Date().getTime(),
      newTime = performance.now(),
      timeOrigin = performance.timeOrigin ?
        performance.timeOrigin :
        oldTime - newTime - t; // approximate

  return nav[eventName] - (lev2 ? 0 : timeOrigin);
  // return nav[eventName] - (lev2 ? 0 : lev1.navigationStart); // alternative?
}
performance.timeOrigin is subtracted in the case where the old Level 1 timing is used.
If the browser does not have it, approximate timeOrigin by subtracting performance.now() (the time since timeOrigin) from new Date().getTime() (the time since the Unix epoch), which yields the time of timeOrigin measured from the Unix epoch. That appears to be the definition, though the link was a bit vague about it. I confirmed it by testing, and I trust the answer. Hopefully the w3c will come up with a better definition of timeOrigin than: "the high resolution timestamp of the start time of the performance measurement".
The function's returned value represents the time elapsed since the time origin.
It may be insignificant in most cases, but the measured time t it takes to execute performance.now() is subtracted to approximate simultaneous execution of the two clock reads.
I measured t at almost 10 microseconds on my Raspberry Pi, and it was fairly stable across various loop sizes. But my Lenovo was not as precise, rounding off decimals and giving shorter times for t with bigger loop sizes.
An alternative solution is commented away in the last line of code.
The deprecated performance.timing.navigationStart:
representing the moment, in milliseconds since the Unix epoch, right
after the prompt for unload terminates on the previous document in the
same browsing context. If there is no previous document, this value
will be the same as PerformanceTiming.fetchStart
So, to check the current document (ignoring any previous one), use the deprecated performance.timing.fetchStart:
representing the moment, in milliseconds since the Unix epoch, the
browser is ready to fetch the document using an HTTP request. This
moment is before the check to any application cache.
It is of course correct to use a deprecated property if it is the only one the browser understands. It is used when "navigation" is not available from getEntriesByType, which otherwise has good browser support.
A quick check confirmed that they agree, via this line just before the return:
console.log(performance.timeOrigin + '\n' + lev1.navigationStart + '\n' + lev1.fetchStart)
With a result that looks like this in my Chrome:
1560807558225.1611
1560807558225
1560807558241
It is only possible if the browser supports HR time 2:
let unixTime = hrTime + performance.timeOrigin;
let hrTime = unixTime - performance.timeOrigin;
However, performance is generally used for time diffs, which do not care what the origin of absolute timestamps is.
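For example, a typical measurement looks like this (doWork is a hypothetical stand-in for the code being measured), and the origin cancels out in the subtraction:
const t0 = performance.now();
doWork();
const elapsed = performance.now() - t0; // duration in fractional ms, regardless of origin
console.log(elapsed);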
For the browsers that do not support HR time 2, or those that "support" it half-heartedly, you can fake it this way:
const hrSyncPoint = performance.now();
const unixSyncPoint = new Date().getTime();
const timeOrigin = unixSyncPoint - hrSyncPoint;
It's not super-exact, but should be good enough for most purposes (on my system, performance.timeOrigin - timeOrigin is sub-millisecond).
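With that synthetic timeOrigin you can move values between the two scales, e.g. bringing a deprecated performance.timing value onto the HR-time scale and back (a sketch using the formulas above; fetchStart is just one example event):
// Unix epoch ms -> HR time relative to the page's time origin
const fetchStartHr = performance.timing.fetchStart - timeOrigin;

// HR time -> Unix epoch ms
const nowUnix = performance.now() + timeOrigin;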
I created an animation using requestAnimationFrame. It works fine in Chrome and IE on Windows; Safari (tested Safari 6 and 7) breaks. It turns out that rAF gets a DOMHighResTimestamp rather than a Date timestamp. That's all fine and good, and what I expected, as it's now part of the spec. However, as far as I've been able to find, there's no way to get the current DOMHighResTimestamp (i.e. window.performance is not available, even with a prefix). So if I create the start time as a Date timestamp, the animation behaves radically wrong when I try to determine progress within the rAF callback (very small negative numbers).
If you look at this JSBin on Safari, it won't animate at all.
In this JSBin, I've made a change to "skip" the first frame (where the time parameter is undefined), so that startTime gets set to the time parameter on the next frame. It seems to work, but skipping a frame feels a bit crappy.
Is there some way to get the current DOMHighResTimestamp in Safari, given the lack of window.performance? Or alternately, force rAF into some sort of legacy mode that forces it to get a Date timestamp instead?
Does anyone know why Safari has this inconsistency, where it provides the parameter in a format that you can't get at any other way?
Performance.now() is only a recommendation as of now. https://developer.mozilla.org/en-US/docs/Web/API/Performance.now() I can only assume it's a matter of time before it's official, seeing as how everyone but Safari has it built in.
Besides that fact, use this to your advantage: since you know requestAnimationFrame passes a DOMHighResTimestamp to its callback, use that for your timing.
Game.start = function(timestamp) {
  if (timestamp) {
    Game.time = timestamp; // rAF invokes this unbound, so use Game rather than this
    requestAnimationFrame(Game.loop);
  } else {
    requestAnimationFrame(Game.start);
  }
};

Game.loop = function(timestamp) {
  Game.tick(timestamp);
  // ... your code ...
  requestAnimationFrame(Game.loop);
};

Game.tick = function(timestamp) {
  this.delta = timestamp - this.time;
  this.time = timestamp;
};
What I do here is call Game.start, which begins the loop (I've run into situations where the timestamp was undefined, so we retry until we get something valid). Once we have that, we have our base time, and since rAF passes a timestamp to its callback, our tick function never has to call Date.now or performance.now, as long as we pass it the timestamp provided by requestAnimationFrame.
So I've been a good net citizen, using feature detection to see whether the browser supports requestAnimationFrame, and only falling back to a setTimeout-based solution otherwise (something along the lines of Paul Irish's famous post).
var NOW = Date.now || function () { return new Date().getTime(); };

var reqAnimFrame =
  window.requestAnimationFrame ||
  window.webkitRequestAnimationFrame ||
  /* ... || */
  function (callback) {
    setTimeout(function () { callback(NOW()); }, 1000 / 60);
  };
var previousTime = NOW();

function animStep(time) {
  var timePassed = time - previousTime;
  myCharacter.move(myCharacter.speed * timePassed);
  previousTime = time;
  reqAnimFrame(animStep);
}

// start the animation
reqAnimFrame(animStep);
This worked great everywhere until Internet Explorer 10 came along. In IE10, the time parameter passed doesn't seem to have anything to do with the current time, screwing up the calculation of timePassed.
What's going on?
All (as far as I know) other browsers that implement requestAnimationFrame go by the specification in the (at the time of writing) current Working Draft:
Let time be [the redraw time] expressed as the number of milliseconds since 1970-01-01T00:00:00Z.
That's representing time precisely like your NOW() does.
IE10 however goes by the spec in the current Editor's Draft:
Let time be the result of invoking the now method of the Performance interface within this context.
Which essentially means the number of milliseconds since the browser loaded this page (it also means the measurement is more precise, since performance.now returns fractional milliseconds).
And thus when you calculate timePassed for the first time in IE10, you are getting something like negative 43 years.
Fortunately, because the value passed to the requestAnimationFrame callback has the same unit in either case, just a different point of reference, you can easily adjust to this.
There are three possibilities:
You could just throw away the very first animation step, using it only to set previousTime, but not doing anything else.
You could ignore the passed parameter and use your NOW() (or performance.now) every time, thus always having the same point of reference.
Or you could change the start of the animation to this:
// start the animation
reqAnimFrame(function (t) {
  previousTime = t - (NOW() - previousTime);
  animStep(t);
});
This will make the calculation (including the first one) of timePassed correct no matter which spec the browser follows. And since it changes only the very first invocation, you don't have any additional overhead in the long run either.