I am trying to get the time since epoch in milliseconds of an event in JavaScript. However, when I do
alert('event at exactly ' + event.timeStamp);
It prints numbers in the realm of 8000 to 10000 or a bit more.
Milliseconds since the epoch is usually a number like 1519211809934.
What am I missing here? How do I properly get the milliseconds since epoch for the exact time an event fired?
Well, event.timeStamp measures something completely different:
This value is the number of milliseconds elapsed from the beginning
of the current document's lifetime till the event was created.
If you want the exact time at which an event originated, you should add its timeStamp value to performance.timeOrigin instead. Note that the latter is not supported by Safari.
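For example, a minimal sketch of that approach (assuming a browser that supports performance.timeOrigin):

// performance.timeOrigin is the epoch time (in milliseconds) at which the
// document's lifetime began; event.timeStamp is measured relative to it.
document.addEventListener('click', (event) => {
  const epochMs = performance.timeOrigin + event.timeStamp;
  alert('event at exactly ' + epochMs);
});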
Alternatively, you can trigger event on your own in the initialization phase of your application, then just compare its timeStamp with Date.now() and use the difference as a document's starting point. That might be useful if there's a significant event loop delay causing the discrepancy between those measurements you worry about.
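A hedged sketch of that fallback (the 'calibrate' event name is just an illustration):

// dispatchEvent runs the listener synchronously, so Date.now() minus the
// synthetic event's timeStamp approximates the epoch time at which the
// document's lifetime began.
let timeOriginEstimate;
document.addEventListener('calibrate', (e) => {
  timeOriginEstimate = Date.now() - e.timeStamp;
}, { once: true });
document.dispatchEvent(new Event('calibrate'));

// Later, for any event:
//   const epochMs = timeOriginEstimate + event.timeStamp;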
Note that (judging by the comments) you might still get some fluctuations, as some browsers tend to be less precise with that data:
To offer protection against timing attacks and fingerprinting, the
precision of Event.timeStamp might get rounded depending on browser
settings.
In Firefox, the privacy.reduceTimerPrecision preference is enabled by
default and defaults to 20us in Firefox 59; in 60 it will be 2ms. [...]
If you also enable privacy.resistFingerprinting, the precision will be 100ms or the value of privacy.resistFingerprinting.reduceTimerPrecision.microseconds, whichever is larger.
I need to code myself a mini, locally running HTML5 + JavaScript app, which I will use as a timer to time a person performing squats.
The idea is simple: when I press A on the keyboard, it will store the current time, with seconds and milliseconds, into a local table as a repetition start. When I press B, it will store the current time as a repetition end.
What I'm not 100% sure about is how reliable the JavaScript timestamp really is. What is my best bet here? Here are a few ideas:
run it on the latest version of Chrome
disable the internet connection, so that the OS will not sync/change its current time
Is there anything else I should be careful about?
I don't need the time to be absolutely exact, only relatively; meaning that the last timestamp minus the first timestamp will yield the real time taken to perform the whole session. I don't care to know exactly at what time it started.
If you're retrieving the system time in JavaScript with something like Date.now() in order to measure the time between two events, then that will be exactly as accurate as the system time is on the local computer. Exactly how accurate that is depends entirely on the clock in the local system and on whether the system time changes during the measurement period.
If there are no changes to the system time (such as a clock sync with an external source), then most system clocks are pretty darn accurate these days. Measuring an event that takes minutes would likely be accurate to within a few milliseconds, which is more accuracy than you can achieve by marking start and stop with a keypress anyway, since the precision of exactly when the key is pressed relative to the start and stop of the event is certainly no better than several hundred milliseconds.
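As a concrete sketch of that setup (the reps array is just an illustration; the A/B keys are from the question), a monotonic clock such as performance.now() sidesteps system-time changes entirely:

const reps = [];
let repStart = null;

document.addEventListener('keydown', (e) => {
  const key = e.key.toLowerCase();
  if (key === 'a') {
    repStart = performance.now();             // repetition start
  } else if (key === 'b' && repStart !== null) {
    reps.push(performance.now() - repStart);  // duration in milliseconds
    repStart = null;
  }
});

Because performance.now() is monotonic, a clock sync during the session cannot skew the measured intervals the way Date.now() could.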
In JavaScript, is it possible to call a function that plays 10 different WAV sounds at 44.1 kHz, and then call that same function again (1/44100)*(128/60)*16 seconds later, with 1/44.1 millisecond precision, preferably in Chrome/Safari? And if so, how?
I'm looking at making a music loop machine that plays a few simultaneous loops. The precision is needed because otherwise there will be audible problems with the sounds (phasing).
Robert,
It's possible to measure time with high accuracy - via performance.now() - but you cannot get a callback with that kind of precision. In fact, given layout passes and JavaScript execution in the main thread, and the ever-looming threat of garbage collection happening in the main thread, you can't get anywhere NEAR even millisecond precision; you generally ought to plan for potential interruptions in the tens of milliseconds for robustness.
The answer to this is to use scheduling, particularly in the Web Audio API - I see that you saw the article I wrote about this a year ago on HTML5Rocks (http://www.html5rocks.com/en/tutorials/audio/scheduling/), but you missed the significant piece - you shouldn't be calling
audioSource2.noteOn(0, 0.1190, 1.875);
you need the time offset to schedule it ahead appropriately:
audioSource2.noteOn(time, 0.1190, 1.875);
If you look at my original code, that's how I'm scheduling the oscillator ahead of time. The scheduler runs in a "slow" callback loop - being called only every 100ms or so - but schedules ahead a few beats. If you truly need to mute notes that may already be scheduled in the next 1/10th of a second, then you can keep a node in the middle to disconnect().
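A condensed sketch of that lookahead pattern (someDecodedBuffer and the beat length here are placeholders; start() is the modern name for noteOn()):

const ctx = new AudioContext();
const lookahead = 0.2;                 // schedule this far ahead, in seconds
const secondsPerBeat = (1 / 44100) * (128 / 60) * 16; // from the question
let nextNoteTime = ctx.currentTime;

function playNoteAt(time) {
  const src = ctx.createBufferSource();
  src.buffer = someDecodedBuffer;      // assumed: a pre-decoded AudioBuffer
  src.connect(ctx.destination);
  src.start(time);                     // sample-accurate start on the audio clock
}

// The "slow" callback loop: wakes up every 100 ms, schedules a few notes ahead.
setInterval(() => {
  while (nextNoteTime < ctx.currentTime + lookahead) {
    playNoteAt(nextNoteTime);
    nextNoteTime += secondsPerBeat;
  }
}, 100);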
I would take a look at either the DOM High Resolution timestamp, which can be accessed with window.performance.now(), or requestAnimationFrame (window.requestAnimationFrame).
You can use this library which I have written : https://github.com/sebpiq/WAAClock
It lets you schedule things precisely and easily, and also provides useful functionality such as cancelling events, changing the tempo, and so on - everything necessary for a loop machine. Under the hood, it implements the tricks explained in this article (already linked by other people): http://www.html5rocks.com/en/tutorials/audio/scheduling/
If by loop machine you mean continuously looping a few samples (and not a drum machine, where you just play a sample at a point in time), you might also want to look into this: https://github.com/sebpiq/WAATableNode
Are there any timing functions in JavaScript with microsecond resolution?
I am aware of timer.js for Chrome, and am hoping there will be a solution for other friendly browsers, like Firefox, Safari, Opera, Epiphany, Konqueror, etc. I'm not interested in supporting any IE, but answers including IE are welcome.
(Given the poor accuracy of millisecond timing in JS, I'm not holding my breath on this one!)
Update: timer.js advertises microsecond resolution, but it simply multiplies the millisecond reading by 1,000. Verified by testing and code inspection. Disappointed. :[
As alluded to in Mark Rejhon's answer, there is an API available in modern browsers that exposes sub-millisecond resolution timing data to script: the W3C High Resolution Timer, aka window.performance.now().
now() is better than the traditional Date.getTime() in two important ways:
now() is a double with sub-millisecond resolution that represents the number of milliseconds since the start of the page's navigation. The fractional part carries the microseconds (e.g. a value of 1000.123 is 1 second and 123 microseconds).
now() is monotonically increasing. This is important as Date.getTime() can possibly jump forward or even backward on subsequent calls. Notably, if the OS's system time is updated (e.g. atomic clock synchronization), Date.getTime() is also updated. now() is guaranteed to always be monotonically increasing, so it is not affected by the OS's system time -- it will always be wall-clock time (assuming your wall clock is not atomic...).
now() can be used in almost every place that new Date().getTime(), + new Date, and Date.now() are. The exception is that Date and now() times don't mix, as Date is based on the Unix epoch (the number of milliseconds since 1970), while now() is the number of milliseconds since your page navigation started (so it will be much smaller than Date).
now() is supported in Chrome stable, Firefox 15+, and IE10. There are also several polyfills available.
Note: When using Web Workers, the window variable isn't available, but you can still use performance.now().
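A small sketch contrasting the two clocks described above:

const t0 = performance.now();    // ms since navigation start, sub-ms resolution
for (let i = 0; i < 1e6; i++) {} // some work to measure
const t1 = performance.now();
console.log(`elapsed: ${(t1 - t0).toFixed(3)} ms`); // monotonic: safe for intervals

console.log(Date.now());         // ms since the Unix epoch; can jump on clock sync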
There's now a new method of measuring microseconds in javascript:
http://gent.ilcore.com/2012/06/better-timer-for-javascript.html
However, in the past, I found a crude method of getting 0.1 millisecond precision in JavaScript out of a millisecond timer. Impossible? Nope. Keep reading:
I'm doing some high-precision experiments that require self-checked timer accuracy, and I found I was able to reliably get 0.1 millisecond precision with certain browsers on certain systems.
I have found that in modern GPU-accelerated web browsers on fast systems (e.g. an i7 quad-core with several idle cores and only the browser window open), I can now trust the timers to be millisecond-accurate. In fact, on an idle i7 system it has become so accurate that I've been able to reliably get the exact same millisecond over more than 1,000 attempts. The millisecond accuracy degrades only when I try to do things like load an extra web page. (And I'm able to successfully catch my own degraded accuracy by doing a before-and-after time check, to see if my processing time suddenly lengthened to 1 or more milliseconds; this helps me invalidate results that have probably been too adversely affected by CPU fluctuations.)
It has become so accurate in some GPU-accelerated browsers on i7 quad-core systems (when the browser window is the only window) that I found myself wishing for a 0.1 ms precision timer in JavaScript. The accuracy is finally there on some high-end browsing systems to make such timer precision worthwhile for certain niche applications that require high precision and that can self-verify for accuracy deviations.
Obviously, if you can run several passes, you can simply do multiple passes (e.g. 10) and then divide by 10 to get 0.1 millisecond precision. That is a common method of getting better precision: do multiple passes and divide the total time by the number of passes.
HOWEVER... if I can only do a single benchmark pass of a specific test due to an unusually unique situation, I found that I can get 0.1 ms (and sometimes 0.01 ms) precision by doing this:
Initialization/Calibration:
1. Run a busy loop to wait until the timer increments to the next millisecond (aligning the timer to the beginning of the next millisecond interval). This busy loop lasts less than a millisecond.
2. Run another busy loop that increments a counter while waiting for the timer to increment again. The counter tells you how many counter increments occurred in one millisecond. This busy loop lasts one full millisecond.
3. Repeat the above until the numbers become ultra-stable (load time, JIT compiler, etc.).
4. NOTE: The stability of the number gives you your attainable precision on an idle system. You can calculate the variance if you need to self-check the precision. The variances are bigger in some browsers and smaller in others, and smaller on fast, idle systems than on slow or busy ones. Consistency varies too; you can tell which browsers are more consistent/accurate than others. Slower systems and busy systems will lead to bigger variances between initialization passes, which gives you an opportunity to display a warning message if the browser is not giving you enough precision to allow 0.1 ms or 0.01 ms measurements. Timer skew can be a problem, but some integer millisecond timers on some systems increment quite accurately (right on the dot), which results in very consistent calibration values that you can trust.
5. Save the final counter value (or the average of the last few calibration passes).
Benchmarking one pass to sub-millisecond precision:
1. Run a busy loop to wait until the timer increments to the next millisecond (aligning the timer to the beginning of the next millisecond interval). This busy loop lasts less than a millisecond.
2. Execute the task you want to precisely benchmark.
3. Check the timer. This gives you the integer milliseconds.
4. Run a final busy loop that increments a counter while waiting for the timer to increment. This busy loop lasts less than a millisecond.
5. Divide this counter value by the original counter value from initialization.
6. Now you have the decimal part of the milliseconds!
WARNING: Busy loops are NOT recommended in web browsers, but fortunately these busy loops each run for less than 1 millisecond and are run only a few times.
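Here is a rough sketch of that calibration and single-pass measurement (my own illustration, assuming a millisecond-resolution Date.now(); I take the fractional part as 1 minus the remaining portion of the current millisecond):

// Align to the start of the next millisecond (lasts < 1 ms).
function waitForNextMs() {
  const start = Date.now();
  while (Date.now() === start) {}
}

// Count how many loop iterations fit into one full millisecond.
function countsPerMs() {
  waitForNextMs();
  let count = 0;
  const ms = Date.now();
  while (Date.now() === ms) count++;
  return count;
}

// Calibrate: repeat until the JIT has warmed up, keep the last value.
let calibration;
for (let i = 0; i < 20; i++) calibration = countsPerMs();

// Benchmark a single pass to sub-millisecond precision.
function benchmark(task) {
  waitForNextMs();                      // start on a millisecond boundary
  const startMs = Date.now();
  task();
  const wholeMs = Date.now() - startMs; // integer milliseconds elapsed
  let count = 0;
  const ms = Date.now();
  while (Date.now() === ms) count++;    // count out the rest of the current ms
  const fraction = 1 - count / calibration;
  return wholeMs + fraction;            // milliseconds, with a sub-ms estimate
}

// Example: console.log(benchmark(() => { for (let i = 0; i < 1e5; i++) {} }));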
Variables such as JIT compilation and CPU fluctuations add massive inaccuracies, but if you run several initialization passes, you'll get a full dynamic recompilation, and eventually the counter settles to something very accurate. Make sure that all busy loops are exactly the same function in all cases, so that differences between busy loops do not lead to differences in results. Make sure all lines of code are executed several times before you begin to trust the results, to allow the JIT compiler to stabilize to a full dynamic recompilation (dynarec).
In fact, I witnessed precision approaching microseconds on certain systems, but I wouldn't trust it yet. The 0.1 millisecond precision, however, appears to work quite reliably on an idle quad-core system where mine is the only browser page. I came to a scientific test case where I could only do one-off passes (due to unique variables occurring) and needed to precisely time each pass, rather than averaging multiple repeated passes, so that's why I did this.
I did several pre-passes and dummy passes (also to settle the dynarec) to verify the reliability of 0.1 ms precision (it stayed solid for several seconds), kept my hands off the keyboard/mouse while the benchmark occurred, then did several post-passes to verify the reliability of 0.1 ms precision (it stayed solid again). This also verifies that things such as power-state changes didn't occur between the before and after measurements and interfere with results. Repeat the pre-test and post-test between every single benchmark pass. With this, I was virtually certain the results in between were accurate. There is no guarantee, of course, but it shows that accurate <0.1 ms precision is possible in some cases in a web browser.
This method is only useful in very, very niche cases, and it will never be 100% guaranteeable. Even so, you can gain very trustworthy accuracy, and even scientific accuracy, when it's combined with several layers of internal and external verification.
Here is an example showing my high-resolution timer for Node.js:
function startTimer() {
  // process.hrtime() returns a [seconds, nanoseconds] tuple
  return process.hrtime();
}

function endTimer(start) {
  function roundTo(decimalPlaces, numberToRound) {
    // exponent-shift rounding trick to avoid floating-point artifacts
    return +(Math.round(numberToRound + `e+${decimalPlaces}`) + `e-${decimalPlaces}`);
  }
  const diff = process.hrtime(start); // elapsed time relative to `start`
  const NS_PER_SEC = 1e9;
  const nanoseconds = diff[0] * NS_PER_SEC + diff[1]; // result in nanoseconds
  return roundTo(6, nanoseconds / 1e6); // result in milliseconds
}
Usage:
const start = startTimer();
console.log('test');
console.log(`Time since start: ${endTimer(start)} ms`);
Normally, you might be able to use:
console.time('Time since start');
console.log('test');
console.timeEnd('Time since start');
If you are timing sections of code that involve looping, you cannot easily get at the value of console.timeEnd() in order to add your timer results together. You can, but it gets nasty, because you have to inject the value of your iterating variable, such as i, and set a condition to detect when the loop is done.
Here is an example because it can be useful:
const num = 10;
console.time(`Time til ${num}`);
for (let i = 0; i < num; i++) {
  console.log('test');
  if ((i + 1) === num) { console.timeEnd(`Time til ${num}`); }
  console.log('...additional steps');
}
Cite: https://nodejs.org/api/process.html#process_process_hrtime_time
The answer is "no", in general. If you're using JavaScript in some server-side environment (that is, not in a browser), then all bets are off and you can try to do anything you want.
edit — this answer is old; the standards have progressed and newer facilities are available as solutions to the problem of accurate time. Even so, it should be remembered that outside the domain of a true real-time operating system, ordinary non-privileged code has limited control over its access to compute resources. Measuring performance is not the same (necessarily) as predicting performance.
editing again — For a while we had performance.now(), but at present (2022) browsers have degraded the accuracy of that API for security reasons.
I'm trying to create a countdown timer that's based on a time on the server.
I originally had the server set the time left, and just did a setTimeout for 1 second that would decrement the time.
I found 2 problems with this:
There is a lag from the server setting the time until the client's page is rendered and the JavaScript begins to run. The amount of lag depends on the speed of the user's internet connection and computer/JavaScript engine.
I think setTimeout of 1 second may have been getting behind a little on slower computers.
I changed it so the server would set the ending time, and the JavaScript on the client would take the current time (in UTC) and calculate the remaining time. It would then do this in every setTimeout callback. This makes the time and countdown perfect. If the client has a fast computer/JavaScript engine, the timer stays on pace. If the computer/JavaScript engine is slower, you may see a second be skipped here and there, but the time is never off.
I found 1 problem with this method so far:
Every client's clock may be different.
So the time left may be a couple of seconds, or 30 seconds, or even days off if the client's computer clock is not correct.
Is there a way that I can have the time left be exact based on the server's ending date?
I don't know what kind of resolution you need, but given network and page-rendering latencies, it's going to be impossible to get client-server agreement to much better than a second. I would suggest you do an Ajax poll every 5 or 10 seconds and adjust your timer accordingly. There is also Comet, which is essentially "reverse" Ajax and can push the times to the client. Either way, you still have network and rendering latencies to contend with.
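A hedged sketch of that polling approach (the /servertime endpoint returning epoch milliseconds is an assumption, not a real API):

let serverOffsetMs = 0;

async function syncWithServer() {
  const t0 = Date.now();
  const res = await fetch('/servertime');    // assumed endpoint
  const serverNow = Number(await res.text());
  const t1 = Date.now();
  // Assume the server stamped the response roughly mid round trip.
  serverOffsetMs = serverNow + (t1 - t0) / 2 - t1;
}

syncWithServer();
setInterval(syncWithServer, 10000);          // re-sync every 10 seconds

function remainingMs(serverEndTimeMs) {
  return serverEndTimeMs - (Date.now() + serverOffsetMs);
}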